
AFIPS CONFERENCE PROCEEDINGS

VOLUME 25

1964 SPRING JOINT COMPUTER CONFERENCE


those of the authors and are not necessarily representative of or endorsed by the 1964 Spring Joint Computer Conference Committee or the American Federation of Information Processing Societies.

Library of Congress Catalog Card Number: 55-44701

Copyright © 1964 by American Federation of Information Processing Societies, P. O. Box 1196, Santa Monica, California. Printed in the United States of America. All rights reserved. This book, or parts thereof, may not be reproduced in any form without permission of the publishers.

Sole Distributors in Great Britain, the British Commonwealth and the Continent of Europe:

CLEAVER-HUME PRESS
10-15 St. Martin's Street
London W. C. 2


1. 1951 Joint AIEE-IRE Computer Conference, Philadelphia, December 1951

2. 1952 Joint AIEE-IRE-ACM Computer Conference, New York, December 1952

3. 1953 Western Computer Conference, Los Angeles, February 1953

4. 1953 Eastern Joint Computer Conference, Washington, December 1953

5. 1954 Western Computer Conference, Los Angeles, February 1954

6. 1954 Eastern Joint Computer Conference, Philadelphia, December 1954

7. 1955 Western Joint Computer Conference, Los Angeles, March 1955

8. 1955 Eastern Joint Computer Conference, Boston, November 1955

9. 1956 Western Joint Computer Conference, San Francisco, February 1956

10. 1956 Eastern Joint Computer Conference, New York, December 1956

11. 1957 Western Joint Computer Conference, Los Angeles, February 1957

12. 1957 Eastern Joint Computer Conference, Washington, December 1957

13. 1958 Western Joint Computer Conference, Los Angeles, May 1958

14. 1958 Eastern Joint Computer Conference, Philadelphia, December 1958

15. 1959 Western Joint Computer Conference, San Francisco, March 1959

16. 1959 Eastern Joint Computer Conference, Boston, December 1959

17. 1960 Western Joint Computer Conference, San Francisco, May 1960

18. 1960 Eastern Joint Computer Conference, New York, December 1960

19. 1961 Western Joint Computer Conference, Los Angeles, May 1961

20. 1961 Eastern Joint Computer Conference, Washington, December 1961

21. 1962 Spring Joint Computer Conference, San Francisco, May 1962

22. 1962 Fall Joint Computer Conference, Philadelphia, December 1962

23. 1963 Spring Joint Computer Conference, Detroit, May 1963

24. 1963 Fall Joint Computer Conference, Las Vegas, November 1963

25. 1964 Spring Joint Computer Conference, Washington, April 1964

Conferences 1 to 19 were sponsored by the National Joint Computer Committee, predecessor of AFIPS. Back copies of the proceedings of these conferences may be obtained, if available, from:

• Association for Computing Machinery, 14 E. 69th St., New York 21, N. Y.

• American Institute of Electrical Engineers, 345 E. 47th St., New York 17, N. Y.

• Institute of Radio Engineers, 1 E. 79th St., New York 21, N. Y.

Conferences 20 and up are sponsored by AFIPS. Copies of AFIPS Conference Proceedings may be ordered from the publishers as available at the prices indicated below. Members of societies affiliated with AFIPS may obtain copies at the special "Member Price" shown.

Volume   List Price   Member Price   Publisher
  20       $12.00        $7.00       Macmillan Co., 60 Fifth Ave., New York 11, N. Y.
  21         6.00         6.00       National Press, 850 Hansen Way, Palo Alto, Calif.
  22                       4.00      Spartan Books, Inc., 301 N. Charles St., Baltimore 1, Md.
  23        10.00         5.00       Spartan Books, Inc.
  24        16.50         8.25       Spartan Books, Inc.
  25                                 Spartan Books, Inc.

NOTICE TO LIBRARIANS

This volume (25) continues the Joint Computer Conference Proceedings (LC55-44701) as indicated in the above table. It is suggested that the series be filed under AFIPS and cross referenced as necessary to the Eastern, Western, Spring, and Fall Joint Computer Conferences.


Preface

COMPILERS: TUTORIAL
Programming Systems and Languages: A Historical Survey . . . S. ROSEN . . . 1
Bounded Context Translation . . . R. M. GRAHAM . . . 17
Syntax-Directed Compiling . . . T. E. CHEATHAM, JR., K. SATTLEY . . . 31

TECHNICAL SESSION
A General-Purpose Table-Driven Compiler . . . S. WARSHALL, R. M. SHAPIRO . . . 59

APPLICATIONS
A Computer Technique for Producing Animated Movies . . . K. C. KNOWLTON . . . 67
Simulation of Biological Cells by Systems Composed of String-Processing Finite Automata . . . W. R. STAHL, R. W. COFFIN, H. E. GOHEEN . . . 89
Computer Simulation of Human Interaction in Small Groups . . . J. T. GULLAHORN, J. E. GULLAHORN . . . 103
Real-Time Computer Studies of Bargaining Behavior: The Effects of Threat Upon Bargaining . . . R. J. MEEKER, G. H. SHURE, W. H. MOORE, JR. . . . 115
Real-Time Quick-Look Analysis for the OGO Satellites . . . R. J. COYLE, J. K. STEWART . . . 125

SOCIAL IMPLICATIONS OF DATA PROCESSING
An Ethos for the Age of Cyberculture . . . A. M. HILTON . . . 139
Information Processing and Some Implications for More Rational Economic Activities . . . H. E. STRINER . . . 155
The Computer Revolution and the Spirit of Man . . . R. H. DAVIS . . . 161

NUMERICAL ANALYSIS
New Difference Equation Technique for Solving Non-Linear Differential Equations . . . J. M. HURT . . . 169
Discontinuous System Variables in the Optimum Control of Second Order Oscillatory Systems With Zeros . . . W. NEVIUS, H. TITUS . . . 181
Two New Direct Minimum Search Procedures for Functions of Several Variables . . . B. WITTE . . . 195

COMMAND AND CONTROL
On the Evaluation of the Cost-Effectiveness of Command and Control Systems . . . N. P. EDWARDS . . . 211
Fractionization of the Military Context . . . F. B. THOMPSON . . . 219
Some Problems Associated With Large Programming Efforts . . . A. E. DANIELS . . . 231
Some Cost Contributors to Large-Scale Programs . . . B. NANUS, L. FARR . . . 239

HYBRID SYSTEMS: TUTORIAL
Hybrid Computation . . . What is it? . . . Who Needs it? . . . T. D. TRUITT . . . 249

HYBRID SYSTEMS: TECHNICAL
A Hybrid Analog-Digital Parameter Optimizer for Astrac II . . . B. A. MITCHELL, JR. . . . 271
A Hybrid Analog-Digital Pseudo-Random Noise Generator . . . R. L. T. HAMPTON . . . 287
A 2MC Bit-Rate Delta-Sigma Modulation System for Analog Function Storage . . . H. HANDLER, R. H. MANGELS . . . 303

ARTIFICIAL INTELLIGENCE
A Computer-Simulated On-Line Learning Control System . . . J. D. HILL, G. McMURTRY, K. S. FU . . . 315
A Heuristic Program to Solve Geometric-Analogy Problems . . . T. G. EVANS . . . 327
Experiments With a Theorem-Utilizing Program . . . L. E. TRAVIS . . . 339

EVALUATING COMPUTER SYSTEMS
Analytical Technique for Automatic Data Processing Equipment Acquisition . . . S. ROSENTHAL . . . 359
Cost-Value Technique for Evaluation of Computer System Proposals . . . E. O. JOSLIN . . . 367
The Use of a Computer to Evaluate Computers . . . D. J. HERMAN . . . 383

MULTI-PROGRAMMING
A General-Purpose Time-Sharing System . . . E. G. COFFMAN, JR., J. I. SCHWARTZ, C. WEISSMAN . . . 397
Remote Computing: An Experimental System, Part 1: External Specifications . . . T. M. DUNN, J. H. MORRISSEY . . . 413
Remote Computing: An Experimental System, Part 2: Internal Design . . . J. M. KELLER, E. C. STRUM, G. H. YANG . . . 425
Multi-Computer Programming for a Large-Scale, Real-Time Data Processing System . . . G. E. PICKERING, G. A. ERICKSON, E. G. MUTSCHLER . . . 445

LOGIC, LAYOUT AND ASSOCIATIVE MEMORIES
On the Analysis and Synthesis of Three-Valued Digital Systems . . . J. SANTOS, H. ARANGO . . . 463
An Algorithm for Placement of Interconnected Elements Based on Minimum Wire Length . . . R. A. RUTMAN . . . 477
Studies of an Associatively Addressed Distributed Memory . . . G. J. SIMMONS . . . 493
Design of an Experimental Multiple Instantaneous Response File . . . E. L. YOUNKER, C. H. HECKLER, JR., D. P. MASHER, J. M. YARBOROUGH . . . 515

INFORMATION RETRIEVAL: TUTORIAL
Research in Automatic Generation of Classification Systems . . . H. BORKO . . . 529
Information Storage and Retrieval: Analysis of the State-of-the-Art . . . G. ARNOVICK, J. A. LILES, A. H. ROSEN, J. S. WOOD . . . 537

INFORMATION RETRIEVAL: TECHNICAL
Training a Computer to Assign Descriptors to Documents: Experiments in Automatic Indexing . . . M. E. STEVENS, G. H. URBAN . . . 563
Experiments in Information Correlation . . . J. L. KUHNS, C. A. MONTGOMERY . . . 577
Some Flexible Information Retrieval Systems Using Structure Matching Procedures . . . G. SALTON, E. H. SUSSENGUTH, JR. . . . 587

BUSINESS DATA PROCESSING
Two New Improvements in Sorting Techniques . . . M. A. GOETZ . . . 599
Conceptual Models for Determining Information Requirements . . . J. C. MILLER . . . 609

Preface

The following pages contain the papers which were presented and discussed in the formal technical and tutorial sessions of the 1964 Spring Joint Computer Conference. The Conference theme, "Computers '64: Problem-Solving in a Changing World," is intended to suggest the significance of this year as the mid-point between the infancy of electronic digital computation (the first elements of ENIAC began operating in June 1944) and the Orwellian year 1984, which symbolizes to some critics an unhappy potential which might be achieved if we do not wisely guide our rapidly advancing technology. Society is increasingly concerned with the broad adjustments that must take place if man is to gain the maximum long-term advantage from the computer. Reflections of this concern and of the increasing uses to which computers are being applied are, among others, the papers dealing with social implications and the tutorial papers on hybrid systems, compilers, and information retrieval.

This volume, a product of the 25th in the series of Joint Computer Conferences, for the first time includes an index. Thanks for this useful addition are due to the authors and session chairmen, who were most cooperative in getting "final copy" submitted on schedule, and to our publisher, who assumed the burden of preparing the index. Appreciation must also be expressed for the contributions of many other persons, who participated as conference planners, panel members, and reviewers. A special acknowledgement is made to the members of the Technical Program Committee, who willingly assumed the heavy burden of assembling the formal program.

HERBERT R. KOLLER

General Chairman

1964 Spring Joint Computer Conference


Programming Systems and Languages: A Historical Survey

Saul Rosen
Professor of Mathematics and Computer Sciences
Purdue University
West Lafayette, Indiana

1. Introduction. Twenty years ago, in 1943, there were no Electronic computers. Ten years ago, in 1953, a large number of Electronic calculators were in use, but the general purpose stored program electronic computer was still quite rare. The Coincident Current Magnetic Core memory which finally provided both reliability and speed at reasonable cost had only just been developed, and was still a laboratory device. A number of specially designed, mostly one of a kind, computers were in operation at Universities and government research centers.

Commercially, a few Univac I computers had been delivered and were operating with great reliability at rather low speed. A few IBM 701's provided high speed but with very poor reliability. In 1953 most computing was being done by the Card-Programmed Calculator, an ingenious mating of an Electromechanical Accounting Machine with an Electronic Calculating Punch.

Between 1954 and 1958 many hundreds of Electronic computers were installed. This was the era of the Vacuum Tube Computer, with Magnetic Drum storage on lower priced machines, and Magnetic Core storage on the larger more expensive ones. By 1959 the first transistorized computers had been delivered, and the production of vacuum tube computers ceased almost immediately. Low cost magnetic core memories made the magnetic drum almost obsolete except as auxiliary storage. Since 1959 thousands of computers have been delivered and installed, including many hundreds of very large computers. Electronic Computing and Data-processing have become part of the everyday industrial and commercial environment, and Electronic Computer manufacturing and programming has become a multi-billion dollar industry.

Because of this rapid, almost explosive pattern of growth, many systems, both in the hardware and software area, could not be adequately planned, and it was often not possible to make adequate evaluations of old systems in time to use such evaluations in the design of new ones.

2. Developments up to 1957. The first programming systems were systems of subroutines. Subroutines were in use on the pre-electronic Mark I, a large electromechanical computer built as a joint effort by IBM and Harvard University in the early forties.1 The EDSAC at the University of Cambridge was probably the first stored program Electronic Computer in operation (1949). The classic text on programming the EDSAC, by Wilkes, Wheeler and Gill2 makes the subroutine library the basis of programming, and stresses the automatic relocation feature of the EDSAC, which makes such libraries easy to use.

The IBM Card-Programmed Calculator, developed between 1947 and 1949, was not very fast by modern standards, but was an extremely versatile device. Operation codes were determined by the wiring of control boards. Ingenuity could be expended on designing boards to optimize performance on particular problems that taxed the capacity of the computer, or it could be expended on the design of general purpose boards for use in large classes of problems.

A set of general purpose boards represented a language for the CPC, and a number of languages were tried and used.3 Most Scientific installations finally settled on a wiring that made the CPC appear to be a floating point machine with three-address logic, and with a standard vocabulary of built-in routines like Square Root, Sine, Exponential, etc. Experience with the CPC systems had many influences on the programming systems that were designed for the stored-program computers that followed.

One of the most important of the early automatic programming groups was associated with the Whirlwind project at MIT. Whirlwind, which was built between 1947 and 1951, was a fast but very basic computer. With only a 16 bit word it had a very limited instruction code, and very limited high speed storage. Even relatively simple calculations required the use of multi-precision techniques. The very difficulty of using the machine in its own language provided great incentive toward the development of programming languages. The Summer Session Computer at MIT was one of the early interpretive systems, designed to make the Whirlwind computer available to students at a summer course in computing at MIT. These early developments led to the design of a quite elaborate "Comprehensive System" for Whirlwind.4 At the associated Lincoln Laboratories the groundwork was laid for the very large programming systems that were developed in connection with Sage and other military projects.5

The first large scale electronic computer available commercially was the Univac I (1951). The first Automatic Programming group associated with a commercial computer effort was the group set up by Dr. Grace Hopper in what was then the Eckert-Mauchly Computer Corp., and which later became the Univac Division of Sperry Rand. The Univac had been designed so as to be relatively easy to program in its own code. It was a decimal, alphanumeric machine, with mnemonic instructions that were easy to remember and use. The 12 character word made scaling of many fixed-point calculations fairly easy. It was not always easy to see the advantage of assembly systems and compilers that were often slow and clumsy on a machine with only 12,000 characters of high speed storage (200 microseconds average access time per 12 character word). In spite of occasional setbacks, Dr. Hopper persevered in her belief that programming should and would be done in problem-oriented languages. Her group embarked on the development of a whole series of languages, of which the most used was probably A2, a compiler that provided a three address floating point system by compiling calls on floating point subroutines stored in main memory.6,7 The Algebraic Translator AT3 (Math-Matic8) contributed a number of ideas to Algol and other compiler efforts, but its own usefulness was very much limited by the fact that Univac had become obsolete as a scientific computer before AT3 was finished. The B-0 (Flow-Matic8,9) compiler was one of the major influences on the COBOL language development which will be discussed at greater length later.

The first sort generators10 were produced by the Univac programming group in 1951. They also produced what was probably the first large scale symbol manipulation program, a program that performed differentiation of formulas submitted to it in symbolic form.11

Another and quite independent group at Univac concerned itself with an area that would now be called computer-oriented compilers. Anatol Holt and William Turanski developed a compiler and a concept that they called GP (Generalized Programming12). Their system assumed the existence of a very general subroutine library. All programs would be written as if they were to be library programs, and the library and system would grow together. A program was assembled at compile time by the selection and modification of particular library programs and parts of library programs. The program as written by the programmer would provide parameters and specifications according to which the very general library programs would be made specific to a particular problem.

Subroutines in the library were organized in hierarchies, in which subroutines at one level could call on others at the next level. Specifications and parameters could be passed from one level to the next.

The system was extended and elaborated in the GPX system that they developed for Univac II. They were one of the early groups to give serious attention to some difficult problems relative to the structure of programs, in particular the problem of segmenting of programs, and the related problem of storage allocation.

Perhaps the most important contribution of this group was the emphasis that they placed on the programming system rather than on the programming language. In their terms, the machine that the programmer uses is not the hardware machine, but rather an extended machine consisting of the hardware enhanced by a programming system that performs all kinds of service and library maintenance functions in addition to the translation and running of programs.

The IBM 701 (1952) was the first commercially marketed large scale binary computer. The best known programming language on the 701 was Speedcode,13,14 a language that made the one address, binary, fixed point 701 appear to be a three address decimal floating point machine with index registers. More than almost any other language, Speedcode demonstrated the extent to which users were willing to sacrifice computer speed for the sake of programming convenience.

The PACT15,16 system on the 701 set a precedent as the first programming system designed by a committee of computer users. It also set a precedent for a situation which unfortunately has been quite common in the computer field. The computer for which the programming system was developed was already obsolete before the programming system itself was completed. PACT ideas had a great deal of influence on the later developments that took place under the auspices of the SHARE organization.

Delivery of the smaller, medium scale magnetic-drum computers started in 1953, and by 1955-56 they were a very important factor in the computer field. The IBM 650 was by far the most popular of the early drum computers. The 650 was quite easy to program in its own language, and was programmed that way in many applications, especially in the data-processing area. As a scientific computer it lacked floating point hardware, a feature that was later made available. A number of interpretive floating point systems were developed, of which the most popular was the one designed at the Bell Telephone Laboratories.17 This was a three address floating point system with automatic looping and with built in Mathematical subroutines. It was a logical continuation of the line of systems that had started with the general purpose CPC boards, and had been continued in 701 Speedcode. It proved that on the right kind of computer an interpretive system can provide an efficient effective tool. Interpretive systems fell into disrepute for a number of years. They are making a very strong comeback at the present time in connection with a number of so-called micro-programmed computers that have recently appeared on the market.
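
In modern terms the general shape of such a three address interpretive system can be sketched briefly. The fragment below is only an illustration, not the Bell Laboratories code; the operation names, the cell layout, and the use of Python are assumptions made for the sake of the example. Each pseudo-instruction carries an operation code, two operand addresses, and a result address, and a small loop dispatches them to floating point and library routines.

    # Illustrative sketch of a three-address interpretive floating point system.
    # Each pseudo-instruction names an operation, two operand cells, and a
    # result cell; the interpreter fetches and dispatches them in turn.
    import math

    memory = [0.0] * 100                     # data cells of the pseudo-machine

    OPS = {
        "ADD":  lambda a, b: a + b,
        "SUB":  lambda a, b: a - b,
        "MUL":  lambda a, b: a * b,
        "DIV":  lambda a, b: a / b,
        "SQRT": lambda a, _b: math.sqrt(a),  # a built-in mathematical subroutine
    }

    def run(program):
        """Interpret a list of (op, addr_a, addr_b, addr_result) tuples."""
        for op, a, b, r in program:
            memory[r] = OPS[op](memory[a], memory[b])

    # Example: compute sqrt(x*x + y*y) with x in cell 0 and y in cell 1.
    memory[0], memory[1] = 3.0, 4.0
    run([("MUL", 0, 0, 2), ("MUL", 1, 1, 3), ("ADD", 2, 3, 4), ("SQRT", 4, 4, 5)])
    print(memory[5])                         # prints 5.0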

The 650 permitted optimization of programs by means of proper placement of instructions on the drum. Optimization was a very tedious job for the programmer, but could produce a very considerable improvement in program running time. A program called SOAP, a Symbolic Optimizer and Assembly Program, combined the features of symbolic assembly and automatic optimization. There is some doubt as to whether a symbolic assembly system would have received very general acceptance on the 650 at the time SOAP was introduced. The optimization feature was obviously valuable. Symbolic assembly by itself on a decimal machine without magnetic tape did not present advantages that were nearly as obvious.
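
The placement rule behind such an optimizing assembler can be suggested with a small sketch. This is only an illustration of the minimum-latency idea; the drum size, the timing figures, and the search rule are assumptions chosen for the example, not the actual IBM 650 parameters or the SOAP algorithm itself. While an instruction executes, the drum keeps rotating, so the best home for the next instruction is the first free location at or just past the address that will be passing under the heads when execution finishes.

    # Illustrative sketch of minimum-latency placement on a magnetic drum.
    DRUM_SIZE = 2000                 # word locations on the drum (assumed)
    occupied = set()

    def place_next(addr, exec_words):
        """Place the instruction that follows the one at `addr`.

        `exec_words` is the execution time expressed in word-times, i.e. how
        far the drum rotates before the next instruction is needed."""
        ideal = (addr + exec_words) % DRUM_SIZE
        for offset in range(DRUM_SIZE):              # search forward around the drum
            candidate = (ideal + offset) % DRUM_SIZE
            if candidate not in occupied:
                occupied.add(candidate)
                return candidate, offset             # offset = extra latency paid
        raise MemoryError("drum is full")

    # Example: a chain of three instructions, each taking 5 word-times.
    addr = 0
    occupied.add(addr)
    for _ in range(3):
        addr, waited = place_next(addr, 5)
        print(addr, waited)                          # 5 0, then 10 0, then 15 0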

The major competitor to the 650 among the early magnetic drum computers was the Datatron 205, which eventually became a Burroughs product. It featured a 4000 word magnetic drum storage plus a small 80 word fast access memory in high speed loops on the drum. Efficient programs had to make very frequent use of block transfer instructions, moving both data and programs to high speed storage. A number of interpretive and assembly systems were built to provide programmed floating point instructions and some measure of automatic use of the high speed loops. The eventual introduction of floating point hardware removed one of the principal advantages of most of these systems.

The Datatron was the first commercial system to provide an index register and automatic relocation of subroutines, features provided by programming systems on other computers. For these reasons among others the use of machine code programming persisted through most of the productive lifetime of the Datatron 205.

One of the first Datatron computers was installed at Purdue University. One of the first Algebraic compilers was designed for the Datatron by a group at Purdue headed by Dr. Al Perlis. This is another example of a compiler effort based on a firm belief that programming should be done in problem-oriented languages, even if the computer immediately available may not lend itself too well to the use of such languages. A big problem in the early Datatron systems was the complete lack of alphanumeric input. The computer would recognize pairs of numbers as representing characters for printing on the flexowriter, but there was no way to produce the same pair of numbers by a single key stroke on any input preparation device. Until the much later development of new input-output devices, the input to the Purdue compiler was prepared by manually transcribing the source language into pairs of numbers.

When Dr. Perlis moved to Carnegie Tech the same compiler was written for the 650, and was named IT.18 IT made use of the alphanumeric card input of the 650, and translated from a simplified algebraic language into SOAP language as output. IT and languages derived from it became quite popular on the 650, and on other computers, and have had great influence on the later development of programming languages. A language Fortransit provided translation from a subset of Fortran into IT, whence a program would be translated into SOAP, and after two or more passes through SOAP it would finally emerge as a machine language program. The language would probably have been more popular if its translation were not such an involved and time-consuming process. Eventually other, more direct translators were built that avoided many of the intermediate passes.

The 701 used a rather unreliable electrostatic tube storage system. When Magnetic core storage became available there was some talk about a 701M computer that would be an advanced 701 with core storage. The idea of a 701M was soon dropped in favor of a completely new computer, the 704. The 704 was going to incorporate into hardware many of the features for which programming systems had been developed in the past. Automatic floating point hardware and index registers would make interpretive systems like Speedcode unnecessary.

Along with the development of the 704 hardware IBM set up a project headed by John Backus to develop a suitable compiler for the new computer. After the expenditure of about 25 man years of effort they produced the first Fortran compiler.19,20 Fortran is in many ways the most important and most impressive development in the early history of automatic programming.

Like most of the early hardware and software systems, Fortran was late in delivery, and didn't really work when it was delivered. At first people thought it would never be done. Then when it was in field test, with many bugs, and with some of the most important parts unfinished, many thought it would never work. It gradually got to the point where a program in Fortran had a reasonable expectancy of compiling all the way through and maybe even of running. This gradual change of status from an experiment to a working system was true of most compilers. It is stressed here in the case of Fortran only because Fortran is now almost taken for granted, as if it were built into the computer hardware.

In the early days of automatic programming, the most important criterion on which a compiler was judged was the efficiency of the object code. "You only compile once, you run the object program many times," was a statement often quoted to justify a compiler design philosophy that permitted the compiler to take as long as necessary, within reason, to produce good object code. The Fortran compiler on the 704 applied a number of difficult and ingenious techniques in an attempt to produce object coding that would be as good as that produced by a good programmer programming in machine code. For many types of programs the coding produced is very good. Of course there are some for which it is not so good. In order to make effective use of index registers a very complicated index register assignment algorithm was used that involved a complete analysis of the flow of the program and a simulation of the running of the program using information obtained from frequency statements and from the flow analysis. This was very time consuming, especially on the relatively small initial 704 configuration. Part of the index register optimization fell into disuse quite early but much of it was carried along into Fortran II and is still in use on the 704/9/90. In many programs it still contributes to the production of better code than can be achieved on the new Fortran IV compiler.

Experience led to a gradual change of philosophy with respect to compilers. During debugging, compiling is done over and over again. One of the major reasons for using a problem oriented language is to make it easy to modify programs frequently on the basis of experience gained in running the programs. In many cases the total compile time used by a project is much greater than the total time spent running object codes. More recent compilers on many computers have emphasized compiling time rather than run time efficiency. Some may have gone too far in that direction.

It was the development of Fortran II that made it possible to use Fortran for large problems without using excessive compiling time. Fortran II permitted a program to be broken down into subprograms which could be tested and debugged separately. With Fortran II in full operation, the use of Fortran spread very rapidly. Many 704 installations started to use nothing but Fortran. A revolution was taking place in the scientific computing field, but many of the spokesmen for the computer field were unaware of it. A number of major projects that were at crucial points in their development in 1957-1959 might have proceeded quite differently if there was more general awareness of the extent to which the use of Fortran had been accepted in many major 704 installations. Among these are the Algol project and the SOS project which are discussed below.

3. Algol and its Dialects. Until quite recently, large scale computers have been mainly an American phenomenon. Smaller computers were almost worldwide right from the beginning. An active computer organization GAMM had been set up in Europe, and in 1957 a number of members of this organization were actively interested in the design of Algebraic compilers for a number of machines. They decided to try to reach agreement on a common language for various machines, and made considerable progress toward the design of such a language. There are many obvious advantages to having generally accepted computer independent problem oriented languages. It was clear that a really international effort in this direction could only be achieved with United States participation. The President of GAMM wrote a letter to John Carr who was then President of the ACM, suggesting that representatives of ACM and of GAMM meet together for the purpose of specifying an international language for the description of computing procedures.

The ACM up to that time had served as a forum for the presentation of ideas in all aspects of the computer field. It had never engaged in actual design of languages or systems.

In response to the letter from GAMM, Dr. Carr appointed Dr. Perlis as chairman of a committee on programming languages. The committee set out to specify an Algebraic compiler language that would represent the American proposal at a meeting with representatives of GAMM at which an attempt would be made to reach agreement on an internationally accepted language. The ACM committee consisted of representatives of the major computer manufacturers, and representatives of several Universities and research agencies that had done work in the compiler field. Probably the most active member of the committee was John Backus of IBM. He was probably the only member of the committee whose position permitted him to spend full time on the language design project, and a good part of the "American Proposal" was based on his work.

The ACM committee had a number of meetings without any very great sense of urgency. Subcommittees worked on various parts of the language and reported back to the full committee, and in general there was little argument or disagreement. There is after all very general agreement about the really basic elements of an Algebraic language. Much of the language is determined by the desirability of remaining as close as possible to Mathematical notation. This is tempered by experience in the use of computers and in the design of compilers which indicates some compromises between the demands of desirable notation and those of practical implementation.

At one meeting of the committee Dr. Bauer, one of the leaders of the GAMM effort, presented a report on the proposed European language. Among other things they proposed that English language key words, like begin, end, for, do, be used as a world-wide standard. Of course this is something the American committee would never have proposed, but it seemed quite reasonable to go along with the Europeans in this matter. Although some of the notations seemed strange, there were very few basic disagreements between what GAMM was proposing, and what the ACM committee was developing. Dr. Bauer remarked that the GAMM organization felt somewhat like the Russians who were meeting with constant rebuffs in an effort to set up a summit meeting. With such wide areas of agreement why couldn't the ACM-GAMM meeting take place?

Although there is quite general agreement about the basic elements of an Algebraic language, there is quite considerable disagreement about how far such a language should go, and about how some of the more advanced and more difficult concepts should be specified in the language. Manipulation of strings of symbols, direct handling of vectors, matrices, and multiple precision quantities, ways to specify segmentation of problems, and the allocation and sharing of storage; these were some of the topics which could lead to long and lively discussion. The ACM language committee decided that it was unreasonable to expect to reach an agreement on an international language embodying features of this kind at that time. It was decided to set up two subcommittees. One would deal with the specification of a language which included those features on which it was reasonable to expect a wide range of agreement. The other was to work toward the future, toward the specification of a language that would really represent the most advanced thinking in the computer field.

The short-range committee was to set up a meeting in Europe with representatives of GAMM. Volunteers for work on this committee would have to arrange for the trip to Europe and back, and were therefore limited to those who worked for an organization that would be willing to sponsor such a trip. The ACM was asked to underwrite the trip for Dr. Perlis.

The meeting of the ACM and GAMM subcommittees was held in Zurich in the spring of 1958, and the result was a Preliminary report on an International Algebraic Language, which has since become popularly known as Algol 58.21

With the use of Fortran already well established in 1958, one may wonder why the American committee did not recommend that the international language be an extension of, or at least in some sense compatible with Fortran. There were a number of reasons. The most obvious has to do with the nature and the limitations of the Fortran language itself. A few features of the Fortran language are clumsy because of the very limited experience with compiler languages that existed when Fortran was designed. Most of Fortran's most serious limitations occur because Fortran was not designed to provide a completely computer independent language; it was designed as a compiler language for the 704. The handling of a number of statement types, in particular the Do and If statements, reflects the hardware constraints of the 704, and the design philosophy which kept these statements simple and therefore restricted in order to simplify optimization of object coding.

Another and perhaps more important reason for the fact that the ACM committee almost ignored the existence of Fortran has to do with the predominant position of IBM in the large scale computer field in 1957-1958 when the Algol development started. Much more so than now, there were no serious competitors. In the data processing field the Univac II was much too late to give any serious competition to the IBM 705. RCA's Bizmac never really had a chance, and Honeywell's Datamatic 1000, with its 3 inch wide tapes, had only very few specialized customers. In the Scientific field there were those who felt that the Univac 1103/1103a/1105 series was as good or better than the IBM 701/704/709. Univac's record of late delivery and poor service and support seemed calculated to discourage sales to the extent that the 704 had the field almost completely to itself. The first Algebraic compiler produced by the manufacturer for the Univac Scientific computer, the 1103a, was Unicode, a compiler with many interesting features, that was not completed until after 1960, for computers that were already obsolete. There were no other large scale scientific computers. There was a feeling on the part of a number of persons highly placed in the ACM that Fortran represented part of the IBM empire, and that any enhancement of the status of Fortran by accepting it as the basis of an international standard would also enhance IBM's monopoly in the large scale scientific computer field.

The year 1958, in which the first Algol report was published, also marked the emergence of large scale high speed transistorized computers, competitive in price and superior in performance to the vacuum tube computers in general use. At the time I was in charge of Programming systems for the new model 2000 computers that Philco was preparing to market. An Algebraic compiler was an absolute necessity, and there was never really any serious doubt that the language had to be Fortran.22,22A The very first sales contracts for the 2000 specified that the computer had to be equipped with a compiler that would accept 704 Fortran source decks essentially without change. Other manufacturers, Honeywell, Control Data, Bendix, faced with the same problems, came to the same conclusion. Without any formal recognition, in spite of the attitude of the professional committees, Fortran became the standard scientific computing language. Incidentally, the emergence of Fortran as a standard helped rather than hindered the development of a competitive situation in the scientific computer field.

To go on with the Algol development, the years 1958-1959 were years in which many new computers were introduced. The time was ripe for experimentation in new languages. As mentioned earlier there are many elements in common in all Algebraic languages, and everyone who introduced a new language in those years called it Algol, or a dialect of Algol. The initial result of this first attempt at the standardization of Algebraic languages was the proliferation of such languages in great variety.

A very bright young programmer at Burroughs had some ideas about writing a very fast one pass compiler for Burroughs new 220 computer. The compiler has come to be known as Balgol.

A compiler called ALGO was written for the Bendix G-15 computer. At Systems Development Corporation, programming systems had to be developed for a large command and control system based on the IBM military computer (ANFSQ32). The resulting Algebraic language with fairly elaborate data description facilities was JOVIAL23 (Jules Schwartz' own Version of the International Algebraic Language). By now compilers for JOVIAL have been written for the IBM 7090, the Control Data 1604, the Philco 2000, the Burroughs D825, and for several versions of IBM military computers.

The Naval Electronics Laboratory at San Diego was getting a new Sperry Rand Computer, the Countess. With a variety of other computers installed and expected they stressed the description of a compiler in its own language to make it easy, among other things, to produce a compiler on one computer using a compiler on another. They also stressed very fast compiling times, at the expense of object code running times, if necessary. The language was called Neliac,24,25 a dialect of Algol. Compilers for Neliac are available on at least as great a variety of computers as for JOVIAL.

The University of Michigan developed a compiler for a language called Mad, the Michigan Algorithmic Decoder.26,27 They were quite unhappy at the slow compiling times of Fortran, especially in connection with short problems typical of student use of a computer at a University. Mad was originally programmed for the 704 and has been adapted for the 7090. It too was based on the 1958 version of Algol.

All of these languages derived from Algol 58 are well established, in spite of the fact that the ACM GAMM committee continued its work and issued its now well known report defining Algol 60.28

Algol 60, known simply as Algol, went considerably further than was anticipated in some of the early committee meetings. The language did not limit itself to those areas in which there exists almost universal agreement. Concepts like recursive subroutines, dynamic storage allocation, block structure, own variables and arrays, were introduced which require the inclusion of rather complex structures in the running programs produced by the compiler. Without attempting any serious evaluation of these concepts here, I think it is fair to say that they are difficult, and their inclusion in an Algebraic language that is intended to be universal is controversial. They led to much debate about the difficult areas and tended to obscure some of the more fundamental accomplishments of the Algol committee. Algol set an important precedent in language definition by presenting a rigorous definition of its syntax in the Backus normal form.29 As compared to Fortran it contains a much more general treatment of iterative loops. It provides a good systematic handling of Boolean expressions and variables and of conditional statements. The most serious deficiency in Algol results from its complete lack of input-output specifications. The handling of input-output is one of the most important services provided by a compiler, and a general purpose Algebraic compiler language is not completely specified until its input-output language has been defined.

Algol compilers have been written for many different computers, but with the exception of Burroughs no computer manufacturer has pushed it very strongly. It is very popular among University and mathematically oriented computer people especially in Europe. For some time in the United States it will probably remain in its status as another available computer language.

4. Data Processing Compilers. The largest user by far of data-processing equipment is the United States government. The government, by law and by design, avoids giving preferential treatment to any one computer manufacturer. More than any other computer user, the government is plagued by the problems caused by the lack of compatibility between different kinds of computing equipment, whether manufactured by the same or by different suppliers.

In the spring of 1959, the office of the Secretary of Defense summoned representatives of the major manufacturers and users of data-processing equipment to a meeting in Washington, to discuss the problem associated with the lack of standard programming languages in the data processing area. This was the start of the Committee on Data Systems Languages (CODASYL), that went on to produce COBOL, the common business oriented language. From the beginning their effort was marked by missionary zeal for the cause of English language coding.

Actually, there had been very little previous experience with Data processing compilers. Univac's B-0 or Flow-Matic,8,9 which was running in 1956, was probably the first true Data-Processing compiler. It introduced the idea of file descriptions, consisting of detailed record and item descriptions, separate from the description of program procedures. It also introduced the idea of using the English language as a programming language.

It is remarkable to note that the Univac I on which Flow-Matic was implemented did not have the data-processing capabilities of a good sized 1401 installation. To add to the problems caused by the inadequacy of the computer, the implementation of the compiler was poor, and compilation was very very slow. There were installations that tried it and dropped it. Others used it, with the philosophy that even with compiling times measured in hours the total output of the installation was greater using the compiler than without it. Experience with Flow-Matic was almost the only experience available on Data Processing compilers prior to the launching of the COBOL project.

A group at IBM had been working for some time on the Commercial Translator,30 and some early experience on that system was also available.

At the original Defense Department meeting there were two points of view. One group felt that the need was so urgent that it was necessary to work within the state of the art as it then existed and to specify a common language on that basis as soon as possible. The other group felt that a better understanding of the problems of Data-Processing programming was needed before a standard language could be proposed. They suggested that a longer range approach looking toward the specification of a language in the course of two or three years might produce better results. As a result two committees were set up, a short range committee, and an intermediate range committee. The original charter of the short range committee was to examine existing techniques and languages, and to report back to CODASYL with recommendations as to how these could be used to produce an acceptable language. The committee set to work with a great sense of urgency. A number of companies represented had commitments to produce Data-processing compilers, and representatives of some of these became part of the driving force behind the committee effort. The short range committee decided that the only way it could satisfy its obligations was to start immediately on the design of a new language. The committee became known as the COBOL committee, and their language was COBOL.

Preliminary specifications for the new language were released by the end of 1959, and several companies, Sylvania, RCA, and Univac, started almost immediately on implementation on the MOBIDIC, 501, and Univac II respectively.

There then occurred the famous battle of the committees. The intermediate range committee had been meeting occasionally, and on one of these occasions they evaluated the early COBOL specifications and found them wanting. The preliminary specifications for Honeywell's FACT30 compiler had become available, and the intermediate range committee indicated their feeling that Fact would be a better basis for a Common Business Oriented Language than Cobol.

The COBOL committee had no intention of letting their work up to that time go to waste. With some interesting rhetoric about the course of history having made it impossible to consider any other action, and with the support of the Codasyl executive board, they affirmed Cobol as the Cobol. Of course it needed improvements, but the basic structure would remain. The charter of the Cobol committee was revised to eliminate any reference to short term goals and its effort has continued at an almost unbelievable rate from that time to the present. Computer manufacturers assigned programming Systems people to the committee, essentially on a full time basis. Cobol 60, the first official description of the language, was followed by 61,31 and more recently by 61 extended.32

Some manufacturers dragged their feet with respect to Cobol implementation. Cobol was an incomplete and developing language, and some manufacturers, especially Honeywell and IBM, were implementing quite sophisticated data processing compilers of their own which would become obsolete if Cobol were really to achieve its goal. In 1960 the United States government put the full weight of its prestige and purchasing power behind Cobol, and all resistance disappeared. This was accomplished by a simple announcement that the United States government would not purchase or lease computing equipment from any manufacturer unless a Cobol language compiler was available, or unless the manufacturer could prove that the performance of his equipment would not be enhanced by the availability of such a compiler. No such proof was ever attempted for large scale electronic computers.

To evaluate Cobol in this short talk is out of the question. A number of quite good Cobol compilers have been written. The one on the 7090 with which I have had some experience may be typical. It implements only a fraction, less than half I would guess, of the language described in the manual for Cobol 61 extended. No announcement has been made as to whether or when the rest, some of which has only been published very recently, will be implemented. What is there is well done, and does many useful things, but the remaining features are important, as are some that have not yet been put into the manual and which may appear in Cobol 63.

The language is rather clumsy to use; for example, long words like synchronized and computational must be written out all too frequently; but many programmers are willing to put up with this clumsiness because, within its range of applicability the compiler performs many important functions that would otherwise have to be spelled out in great detail. It is hard to believe that this is the last, or even very close to the last word in data processing languages.

Before leaving Data Processing compilers I wish to say a few words about the development of the FACT compiler.

In 1958 Honeywell, after almost leaving the computer business because of the failure of their Datamatic 1000 computer, decided to make an all out effort to capture part of the medium priced computer market with their Honeywell 800 computer. The computer itself is very interesting but that is part of another talk.

They started a trend, now well established, of contracting out their programming systems development, contracting with Computer Usage Co. for a Fortran language compiler.

Most interesting from our point of view was their effort in the Data Processing field. On the basis of a contract with Honeywell, the Computer Sciences Corporation was organized. Their contract called for the design and production of a Data processing compiler they called FACT.30,33

Fact combined the ideas of data processing generators as developed by Univac, GE Hanford,34 Surge,35 and 9PAC with the concepts of English language data processing compilers that had been developed in connection with Univac's Flow-Matic and IBM's commercial translator.

The result was a very powerful and very interesting compiler. When completed it contained over 250,000 three address instructions. Designed to work on configurations as small as 4096 words of storage and 4 tape units it was not as fast as some more recent compilers on larger machines.

The Fact design went far beyond the original COBOL specifications,30 and has had considerable influence on the later COBOL development.

Like all other manufacturers Honeywell has decided to go along with the COBOL language, and Fact will probably fall into disuse.

5. Assemblers and Operating Systems. Symbolic assembly language has become an almost universal form for addressing a computer in a computer oriented language.

After the first 704's were delivered in 1956 a number of users produced assembly routines for use on the computer. One of the early standardization efforts involved a choice of a standard assembly program to be used by Share, the 704 users group. It is a sign of some of the thinking that was current then that the standard chosen was UASAP.36 The first SAP was a very basic assembly system. It did practically nothing but one-to-one translation, and left the programmer in complete control of all of the parameters of the program. In the early days many users felt that this was all an assembly system should do. Some still feel that way, but on most computers the simple assembly system has been replaced by the full scale computer oriented compiler in which one-to-one code translation is augmented by extensive libraries of subroutines and generators and by allocation, segmentation, and other program-organization features.37

The word macro-instruction apparently was coined in connection with the symbolic assembly systems that were developed for IBM's 702/705 computers. These Autocoder38 systems with their large macro-instruction libraries have been used for huge amounts of data processing programming on a number of machines.

Assembly systems gradually grew into or became incorporated into operating systems.39,40 Perhaps the earliest monitor system on the 704 was put into operation at the General Motors Research center.41,42 The idea of automatic sequencing of batches of jobs spread rapidly until it was almost universally used in connection with large computers. It made it possible for large computers to handle small jobs with reasonable efficiency and greatly extended their range of application. The idea of such systems was to run the computer without any stops, and to relegate the operator to occasional mounting of tapes, and otherwise to responding to very simple requests presented to him on the on-line printer. Under such a system debugging becomes a completely off-line process. The only response to trouble in a program is to dump and get on with the next program.

At the end of 1956 IBM announced its new 709 computer. The 709 was essentially a 704 with internally buffered input and output.

As mentioned earlier, IBM was at its peak of penetration of the large scale scientific computer market at that time, and the rest of the industry watched with great interest as many of the best programming systems people representing many leading scientific computer installations met as a Share committee to design the very complex programming system which was eventually called SOS (Share Operating System).

The design of programming systems by large committees representing many companies and institutions has almost invariably led to disappointing results. SOS was no exception.

Planned mainly by personnel of West Coast aircraft and research companies, it was to be written according to their specifications by the IBM programming systems activity on the East Coast. Separation of design and implementation responsibility by 3000 miles is almost enough in itself to guarantee great difficulty, if not complete failure. In 1958 the chairman of the Share 709 system committee wrote,43 "The fundamental procedures used throughout the system will undoubtedly be retained in every installation." This has not been the case. The SOS system is now in use at only a very few installations. There are many reasons, of which I would like to suggest just a few. SOS put all of its emphasis on the computer oriented programming system. The time during which SOS was being designed and implemented was the time during which the attitude toward Fortran was changing from polite skepticism to very general acceptance. By the time SOS was in nearly full operation some installations were using almost nothing but Fortran. Apparently little or no effort had been expended on the problem of compatibility between SOS and Fortran. It was only in 1962 that an SOS system which handles Fortran was distributed by the Rand Corporation.44 Their system accepts Fortran source programs, and produces the binary symbolic or squoze decks that can be combined with other programs produced by the SOS system. IBM boasted of over 50 man years of effort on the SOS system for the 709. They spent almost no effort on Fortran for the 709, on the theory that Fortran as developed for the 704 would be adequate. The Fortran II system that was originally distributed for the 709 took no advantage of the fact that the 709 hardware permitted buffered input and output. The SOS system provided a very elaborate buffering system.

SOS proposed to provide a system in which the programmer would need to know and use only one language, the compiler source language. One of its major achievements was the provision of source language modification of programs at load time without full recompilation. A very versatile debugging system was built around this feature. While this and other features of the system are extremely attractive, there is a serious question as to whether they are worth the price paid in system complexity, and in increased loading time. I think it is interesting to point out that a relatively simple assembly system, FAP, and a very basic operating system, the Fortran Monitor System, both originated at customer installations and not by the manufacturer, became the most widely used systems on the 709/90 computers. Quite similar systems were introduced on competitive equipment, the Philco 2000 and the CDC 1604. Complexity and system rigidity no doubt contributed to the fact that SOS was not generally accepted. It will be interesting to follow the history of a new and very complicated system, the IBSYS/IBJOB complex that has recently been introduced by IBM on the 7090 and related machines. A critique of these systems is far beyond the scope of this discussion. A few comments may be in order.

IBJOB presents a very elaborate assembly system MAP, and translators from two languages, FORTRAN IV and Cobol, into Map. They are then translated into a language that is close to machine language, with the final steps of the translation left to a very complicated loader. The design, which calls for the translation of problem oriented source languages into an intermediate computer oriented source language, is very attractive. By having the assembly system do most of the work of the compiler it is possible to have many of the features of the problem oriented language available by means of subroutine calls to those who prefer to write in assembly language. This design philosophy is attractive, but I think that it is wrong. Attractiveness and elegance should not be the determining design criteria for production compiling systems. Design of a system is a compromise between many design criteria. One of the most important is keeping the system overhead cost low on the many problems that do not require the use of very sophisticated system features. The code produced by a compiler like Fortran is straightforward simple code. The assembler for such code can be simple and straightforward. The loading program can be designed to permit easy combination with programs produced by other systems. An assembler designed to aid in the production of large programming systems contains many features that are seldom used except in the coding of such systems. A wasteful mismatch may occur when the output of Fortran is fed through such an assembler.

Not so very many years ago there was quite a bit of discussion as to whether general purpose operating systems should be designed and supplied by the manufacturers. Some users felt that the very great difference in the job mix and operating philosophy at the various installations called for specially designed and tailored system programs. For a time the argument seemed to be settled by the almost universal assumption that operating systems, and computer software in general, were as much an obligation of the manufacturer as was the building of the computers themselves. I wonder if this assumption will be able to stand up in face of the rapid developments in the large computer field that will lead to computing systems that are very much more diverse, and very much more complex than those that are in general use today.

In the large computer field multiprocessing and multiprogramming systems will soon become the rule rather than the exception. Many experiments in these directions are being tried with computers of the generation that is now coming to an end. Systems combining a 7094, a 1301 Disc unit and a 7040 will soon be commonplace. There are a number of military systems involving several computers and much peripheral equipment all working together under a common operating system.

Among newer computers already delivered to customers there are several models that have been designed to make it possible and practical to run peripheral equipment on-line while simultaneously carrying out independent computer processing tasks. The distinction between off-line and on-line tends to disappear on such systems, and the operating systems must be able to control equipment in many different modes.

Systems already delivered that have some features that permit multiprogramming and multiprocessing include the Honeywell 800, the Ferranti Atlas, the Burroughs 5000 and D825. There is some very interesting recent literature about programming systems for these computers.45,46,47

In the next generation of large computers it may be possible to implement true demand processing systems. Demand systems have been advocated by many in opposition to batching systems. In a demand system problems are submitted as they arise. The system controls the input of jobs and the scheduling of jobs by stacking jobs in queues according to length, priority, etc. A demand system requires multiprogramming facilities, but also requires much more elaborate decision making on the part of an executive system than is present in most monitors today.
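
The scheduling decision such an executive must make can be suggested with a small sketch. This is only an illustration of the queueing idea; the priority rule, the field names, and the use of Python are assumptions chosen for the example, not a description of any particular system of the period. Jobs are queued as they are submitted, and the executive picks the next one by priority, breaking ties in favor of shorter jobs.

    # Illustrative sketch of an executive that stacks jobs in queues by
    # priority and length and selects the next job to run.
    import heapq
    import itertools

    class Executive:
        def __init__(self):
            self._queue = []
            self._seq = itertools.count()    # preserves submission order on ties

        def submit(self, name, priority, est_minutes):
            # Lower tuples sort first: high priority, then short jobs, then FIFO.
            heapq.heappush(self._queue, (-priority, est_minutes, next(self._seq), name))

        def next_job(self):
            if not self._queue:
                return None
            _, _, _, name = heapq.heappop(self._queue)
            return name

    exec_system = Executive()
    exec_system.submit("payroll run", priority=1, est_minutes=45)
    exec_system.submit("student problem deck", priority=1, est_minutes=2)
    exec_system.submit("real-time patch", priority=5, est_minutes=10)
    print(exec_system.next_job())            # real-time patch
    print(exec_system.next_job())            # student problem deck (shorter job)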

The complexity required in some of these operating systems may seem to require that they be uniform systems designed and produced by the manufacturer. But another feature that is being stressed more and more is modularity, which permits an almost unlimited variety in system configurations. It is very difficult to design a single operating system that is appropriate for a computing system based on Disc storage, and also for one based on tapes or drums, and also for any combination of auxiliary devices. The problem will get more complicated when high speed storage at different levels is available in various quantities. It is quite reasonable to anticipate a system in the next few years that will have a very high speed film memory, backed up by a fast core memory, backed up by a large and somewhat slower core memory, backed up by high speed drums, then discs and tapes. It will be a real challenge to design programming systems that are valid for all combinations in such systems.

In the early days one of the aims of the operating system was to get the human being out of the system as much as possible. In a multi-programming system it is possible to allow human
