PROSE modeling language
PROSE[1][2][3][4] was the mathematical 4GL virtual machine which established the holistic modeling paradigm known as Synthetic Calculus[5][6][7] (also known as MetaCalculus). A successor to the SLANG[8]/CUE[9] simulation and optimization language developed at TRW Systems, it was introduced in 1974 on Control Data supercomputers. It was the first commercial language[10][11][12][13] to employ automatic differentiation (AD), which was optimized to loop in the instruction stack of the CDC 6600 CPU.
Although PROSE was a rich block-structured procedural language, its focus was the blending of simultaneous-variable mathematical systems such as:
- implicit non-linear equations systems,
- ordinary differential-equations systems, and
- multidimensional optimization.
Each of these kinds of system model was distinct, and operator templates were added to the procedural syntax to automate and solve it. These automated system problems were considered "holistic" because their unknowns were simultaneous: they could not be reduced in formulation and solved piecewise, or by algebraic manipulation (e.g. substitution), but had to be solved as wholes. Wholeness also pertained to algorithmic determinacy, or mathematical "closure", which made solution convergence possible and, in principle, certain, unless corrupted by numerical instability.
Holarchies of Differential Propagation
Because this closure allowed these holistic problem models to be automated and solved independently, they could be blended into higher wholes by nesting one inside another, in the manner of subroutines, and users could regard them as if they were ordinary subroutines.
Yet semantically this mathematical blending was considerably more complex than the mechanics of subroutines, because an iterative solution engine was attached to each problem model by its calling operator template above it in the program hierarchy. In its numerical solution process, this engine would take control and call the problem-model subroutine iteratively, not returning to the calling template until its system problem was solved. During some or all of the iterative model-subroutine calls, the engine would invoke automatic differentiation of the formulas in the model holarchy with respect to the model's input unknowns (arguments) defined in the calling template. Additional semantic mechanisms accommodated the ubiquitous nesting of these holistic models.
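In modern terms, this corresponds to forward-mode automatic differentiation by operator overloading, driven from inside an iterative solver. The following Python sketch is illustrative only (it is not PROSE code, and the class and function names are hypothetical): a solver engine repeatedly calls a model subroutine, and dual-number arithmetic propagates exact derivatives of the model's formulas with respect to the input unknown.

```python
import math

class Dual:
    """Minimal dual number: a value paired with its derivative (forward-mode AD)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)
    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __sub__(self, other):
        other = self._wrap(other)
        return Dual(self.val - other.val, self.der - other.der)
    def __mul__(self, other):
        other = self._wrap(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def cos(x):
    """cos extended to dual numbers via the chain rule."""
    return Dual(math.cos(x.val), -math.sin(x.val) * x.der)

def model(x):
    """Model 'subroutine': residual of the implicit equation x - cos(x) = 0."""
    return x - cos(x)

def newton_engine(model, x0, tol=1e-12, max_iter=50):
    """Solver 'engine': repeatedly calls the model, using AD-supplied derivatives."""
    x = x0
    for _ in range(max_iter):
        r = model(Dual(x, 1.0))      # seed the unknown's derivative with 1
        if abs(r.val) < tol:
            break
        x -= r.val / r.der           # Newton step from the propagated derivative
    return x

print(newton_engine(model, 1.0))     # about 0.7390851332151607
```

Here the model knows nothing about iteration or differentiation; both are supplied by the engine and the overloaded arithmetic, which mirrors the separation that the operator templates enforced.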
Differentiation of Prediction Processes
If the nested solution was a prediction (e.g. numerical integration), then its solution algorithm, in addition to the model formulas, would also be automatically differentiated. Since this differentiation propagated (via the chain rule) throughout the integration from initial conditions to boundary conditions, derivatives of boundary conditions with respect to initial conditions (so-called Fréchet derivatives) were obtained. This enabled the routine solution of boundary-value problems by iterative "shooting" methods using Newton-method engines. At the same time, this propagated differentiation could also be performed with respect to arbitrary parameters of the differential equations, to further shape the integrated functions. And these parameters could be solved for as unknowns of any nest in the holarchy above the integration process, a significant convenience in overall problem formulation.
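A minimal Python sketch of that shooting idea follows (a toy equation and illustrative names, not PROSE): the Euler update rule itself is differentiated with respect to the unknown initial slope, so the boundary value and its exact sensitivity emerge together, and a Newton iteration adjusts the initial condition.

```python
import math

# Toy boundary-value problem y'' = -y with y(0) = 0 and y(pi/2) = 1, solved by
# "shooting": guess the initial slope s = y'(0), integrate, and Newton-correct s.
# The sensitivity d y(end)/d s is obtained by differentiating the Euler update
# rule itself, i.e. chain-rule propagation through the integration algorithm.

def integrate(s, t_end=math.pi / 2, n=20000):
    y, v = 0.0, s            # state: y(0) = 0, y'(0) = s
    dy, dv = 0.0, 1.0        # sensitivities dy/ds, dv/ds of the state
    h = t_end / n
    for _ in range(n):
        a, da = -y, -dy      # acceleration y'' = -y and its derivative w.r.t. s
        y, dy = y + h * v, dy + h * dv   # Euler step for the state ...
        v, dv = v + h * a, dv + h * da   # ... and its differentiated counterpart
    return y, dy             # boundary value y(t_end) and d y(t_end)/d s

s, target = 0.5, 1.0
for _ in range(20):
    yb, dyb = integrate(s)
    if abs(yb - target) < 1e-10:
        break
    s -= (yb - target) / dyb           # Newton step on the initial condition
print(s)   # close to 1.0, the exact slope for y = sin t, up to Euler error
```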
Differentiation of Search Processes
If the inner nested problem was a search, and the outer problem was also a search (e.g. optimization), then the partial derivatives produced with respect to the inner-search unknowns had to be converted into partial derivatives of the outer search via a differential-geometry coordinate transformation. This was also an iterative process, involving higher-order differentiation and sometimes different independent variables.
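In modern sensitivity-analysis terms, one way to express the required transformation (a generic statement of the mathematics, not necessarily the exact PROSE algorithm) uses the implicit function theorem. Here g denotes the inner matching conditions, x the inner-search unknowns, p the outer-search unknowns, x*(p) the converged inner solution, and F any downstream quantity the outer engine differentiates; all symbols are introduced only for this illustration:

```latex
\frac{dF}{dp} \;=\; \frac{\partial F}{\partial p} \;+\; \frac{\partial F}{\partial x}\,\frac{\partial x^{*}}{\partial p},
\qquad
\frac{\partial x^{*}}{\partial p} \;=\; -\left(\frac{\partial g}{\partial x}\right)^{-1}\frac{\partial g}{\partial p}.
```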
Yet these extended and iterative differential-arithmetic processes were totally hidden from the user, and were hardly more significant in the modeling task than if only ordinary subroutines and their calls were involved. Because they were iterative, and the number and kind of iterations were indefinite (a whole sub-problem was being solved that was also part of a higher problem), it was natural to call each problem nest a "holon", as this dynamic entity fitted the theory of Arthur Koestler, who coined that term. The term was not used in the original PROSE documentation, because in those years Koestler's theory was new and somewhat controversial; it was adopted later, after Ken Wilber had ratified Koestler's holon concepts.
Automation Operator Templates
The complete modeling paradigm consisted of only three classes of holons, as distinguished by their operator templates below.
Optimization
- FIND simultaneous-unknowns IN model-subroutine BY solver-engine
- [HOLDING inequality-constraint-variables]
- [MATCHING equality-constraint-variables]
- TO MAXIMIZE|MINIMIZE objective-variable
Correlation
- FIND simultaneous-unknowns IN model-subroutine BY solver-engine
- TO MATCH equality-constraint-variables
Simulation
- INITIATE solver-engine FOR model-subroutine EQUATIONS rate-variables/level-variables
- OF independent-variable STEP increment-variable TO limit-variable
- INTEGRATE model-subroutine BY solver-engine
These three operator templates created dynamic holons encapsulating an equations model subroutine hierarchy which could contain other nested holons, because the model subroutines could contain any of the operator templates encapsulating sub-problems. Each holon in the holarchy had a solver algorithm engine, which could be interchanged with others in its holon class. The extended arithmetic of automatic differentiation and its ability to dynamically differentiate numerical integration gave rise to the unique mode of holarchy modeling illustrated in Figure 1.

This example problem was originally a FORTRAN application from a RAND report about an algorithm for optimizing boundary-value problem applications. The report, also published as a textbook,[14] described quasilinearization, an alternative to "dynamic programming", which the same author, Richard Bellman, had invented. The FORTRAN program in Appendix Two of the textbook contains more than five times as much code as the 25-line PROSE program, which is entirely embedded in the white boxes (visible syntax) of Figure 1. More significant in this modeling-versus-programming discussion is that the FORTRAN program contains 14 DO loops, whereas the PROSE program contains no loops. Another point of program simplification is that dynamic memory management could be taken for granted by the user: at the return from a holon to its calling operator template, the holon was destroyed and its memory was freed for other use.
This application is actually trivial in the amount of code required to state the problem. That is why the PROSE program is so small. All of its iterative solution methodology is under the hood in the solver engines (ellipses in Figure 1). Models seldom need loops. This is why spreadsheets, which are modeling tools, don't even have them.
This example problem encapsulates the full holon paradigm in a single application. All three of its holon types are employed: optimization searching at the highest level of the holarchy, correlation searching (a restricted subset of optimization searching) as the middle holon, and system-dynamics simulation as the innermost holon. Another PROSE program with this same anatomy is illustrated in Figure 2: a somewhat larger application, the optimization of a cantilevered wing structure to maximize lift, subject to structural and weight constraints. In this case there are ten coordinate dimensions of optimization unknowns being searched by the outer holon solver.
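Purely as an illustration of this three-level anatomy (not PROSE syntax), the same nesting can be sketched in Python with off-the-shelf SciPy solvers standing in for the PROSE engines; the toy problem, variable names, and engine choices below are hypothetical.

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq, minimize_scalar

def simulate(k, v0, t_end=2.0):
    """Inner 'simulation holon': integrate a damped oscillator x'' = -k x - 0.2 x'."""
    def rhs(t, y):
        return [y[1], -k * y[0] - 0.2 * y[1]]
    sol = solve_ivp(rhs, (0.0, t_end), [1.0, v0], rtol=1e-8, atol=1e-10)
    return sol.y[0, -1]                    # end-point value x(t_end)

def correlate(k):
    """Middle 'correlation holon': find v0 so the end point matches x(t_end) = 0."""
    return brentq(lambda v0: simulate(k, v0), -10.0, 10.0)

def objective(k):
    """Outer 'optimization holon': a made-up merit function of the matched v0."""
    return (correlate(k) + 1.0) ** 2

best = minimize_scalar(objective, bounds=(3.0, 5.0), method="bounded")
print(best.x, correlate(best.x))           # stiffness k and its matched initial velocity
```

Each level simply calls the one below as a function while its own solver iterates; in PROSE the operator templates expressed the same nesting declaratively and supplied the differential propagation between levels automatically.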

The two outer holons each have a hidden coordinate system of unknowns that their search engines solve for. These engines require partial derivatives of all downstream variables dependent upon those unknowns, which are evaluated by automatic-differentiation arithmetic. The derivatives of the outer coordinate system must be computed from the derivatives of the inner coordinate system, after the inner search engine has converged (found a local solution). This is where a differential-geometry coordinate transformation is applied. The wing problem of Figure 2 has more downstream subprograms, not shown, including an integral quadrature function.
As these subprograms include the numerical integration of the system-dynamics (differential equations) model, the automatic-differentiation arithmetic includes differentiation of the integration algorithm of the simulation engine (and the quadrature solver), to evaluate derivatives of the boundary conditions (end points of the integrated curves) with respect to the initial conditions. This calculation is not possible via formal symbolic differentiation, nor is it feasible with finite-difference approximation. Only automatic differentiation, with its exact chain-rule propagation, is feasible.
Automated Holon Architecture

Figure 3 shows the generalized architecture of a holon in profile: the visible modeling syntax and the invisible semantic architecture with its characteristic 5-step iteration process. A holon is a calculus problem-solving unit, mathematically associated with a coordinate system dynamically created by the operator template. Its operator is a solver engine, either a numerical predictor in the case of simulation or a search engine in the case of correlation and optimization. Its operand is a model procedure (which may itself be a holarchy of subordinate holons).
In essence a holon is a metaphoric computation container like a spreadsheet, but one that allows procedural looping like an ordinary algebraic language. Yet its purpose is to frame algebraic formulas that represent higher mathematics (e.g. differential equations are algebraic formulas in which some of the variables are rates).
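For instance, a system-dynamics model can be framed as plain algebraic rate formulas, with the time-stepping loop owned by the solver engine rather than by the model. The Python sketch below is a hypothetical illustration of that separation (the model and its coefficients are invented for the example, and a crude Euler stepper stands in for a simulation engine).

```python
def population_model(levels):
    """Algebraic rate formulas: rates written in terms of the current levels."""
    prey, predator = levels
    prey_rate = 0.8 * prey - 0.4 * prey * predator          # rate variable
    predator_rate = 0.1 * prey * predator - 0.3 * predator  # rate variable
    return [prey_rate, predator_rate]

# The simulation engine owns the time loop, not the model; here a crude Euler
# stepper plays that role.
levels, h = [10.0, 2.0], 0.01
for _ in range(1000):
    rates = population_model(levels)
    levels = [x + h * r for x, r in zip(levels, rates)]
print(levels)
```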
Figures 4-7 show how the different holon classes of simulation, correlation, and optimization reflect this architecture, separating modeling (science equations) from algorithmic solver engines of the art of numerical approximation mathematics.
Holons are formula-system solution processes
As mentioned above, a holon is a computation container like a spreadsheet that encapsulates a set of input algebraic formulas. But unlike in a spreadsheet, these formulas are parts of an irreducible whole which can only be solved together as a unit, through a succession of approximations (iterations). A spreadsheet, which involves only a single pass of formula calculations, may therefore be thought of as a "degenerate" or "reduced" holon.
A holon model elevates an encapsulated system of algebraic formulas to a higher problem archetype relating simultaneous unknowns to a definable solution condition other than a single pass through the set of formulas. Iterative calculus "under the hood" is required to "converge" multiple-pass approximations to the solution condition.
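The contrast can be made concrete with a small Python illustration (not PROSE): a spreadsheet-style single pass evaluates each formula once in order, whereas a system of simultaneous formulas must be iterated to a fixed point before its values are mutually consistent. The two coupled formulas below are invented for the example.

```python
import math

# Two formulas whose unknowns depend on each other: x = cos(y) and y = x / 2.

x, y = 0.0, 0.0

# Spreadsheet-style single pass: y is computed from a stale x, so the pair is
# not yet a consistent solution of the simultaneous formulas.
x = math.cos(y)
y = x / 2.0
print("one pass: ", x, y, "residual:", x - math.cos(y))

# Holon-style iteration: repeat the same formulas until the values stop moving
# (a crude fixed-point stand-in for a correlation engine's convergence test).
for _ in range(100):
    x_new = math.cos(y)
    y_new = x_new / 2.0
    if abs(x_new - x) < 1e-12 and abs(y_new - y) < 1e-12:
        break
    x, y = x_new, y_new
print("converged:", x, y, "residual:", x - math.cos(y))
```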
Metaphoric Problem Archetypes
Each holon automates one of the three system-problem archetypes that have emerged from higher math with a distinct class of solution methods, applicable as interchangeable operators. These methods operate on the input formulas and their calculations, to guide the successive approximations to the holon solution. These problem archetypes easily precipitate out of the formula collections that represent modeling of natural phenomena, and can be used as building blocks to synthesize whole computation programs as holarchies of sequential or nested holons. Used together as an alphabet these archetypal problems become a topology of higher math modeling within an algebraic programming language containing special semantic "glue" methodologies which propagate calculus "influence" through the collections of holons.
As the holons combine to form greater wholes via alphabetic combination, such holarchies also tend to become problem archetypes, which often precipitate out of the modeling of natural phenomena. An example is the boundary-value problem, solved by the combination of correlation and simulation holons.
PROSE Pantheon
PROSE introduced a pantheon of interchangeable solvers named for mythical gods in the three engine categories:
Optimization
- HERA - an advanced version of Newton's second-order gradient method with special logic to recognize and avoid unwanted extrema during its search process;
- HERCULES - a special constrained optimization solver for linear, integer, and mixed-integer problems;
- JOVE - a sequential unconstrained optimization technique applying Newton's second-order gradient search;
- JUPITER - a moving exterior truncations penalty-function method applying a Davidon-Fletcher-Powell (DFP) variable-metric search;
- THOR - a "sectionally linearized" linear programming technique; and
- ZEUS - a sequential unconstrained optimization technique applying a Davidon-Fletcher-Powell (DFP) variable-metric search.
Correlation
- AJAX - a damped Newton-Raphson and Newton-Gauss pseudo-inverse root finder; and
- MARS - a damped Newton-Raphson and Newton-Householder pseudo-inverse root finder.
System-Dynamics Simulation
- ATHENA - multi-order Runge-Kutta with differential propagation and optional limiting of any output dependent variables;
- GEMINI - self-starting technique of rational-function extrapolation from Gragg, Bulirsch, and Stoer, with or without differential propagation according to context;
- ISIS - Runge-Kutta-Gill with differential propagation;
- JANISIS - ISIS or JANUS, depending on differential or non-differential-propagating contexts;
- JANUS - Adams-Moulton predictor-corrector for non-differential-propagating contexts;
- MERCURY - Gear's rate/state differentiating stiff and step-size optimizing method for non-differential-propagating contexts;
- MERLIN - self-starting technique of rational function extrapolation from Gragg, Bulirsch, and Stoer with differential propagation;
- MINERVA - multi-order Runge-Kutta without differential propagation and optional limiting of any output dependent variables;
- NEPTUNE - self-starting technique of rational function extrapolation from Gragg, Bulirsch, and Stoer without differential propagation; and
- PEGASUS - a special 5th order Runge-Kutta technique known as Sarafyan Embedding, in which a 4th order result is obtained at the same time plus optional limiting of any output dependent variables in non-differential propagating contexts.
Nesting Contexts
These solvers applied different numerical methods in the three engine categories, depending upon the nesting context in which they were applied. Some simulation solvers (JANUS, MERCURY, MINERVA, MERLIN and PEGASUS) could not be nested in the automatic-differentiation contexts of correlation and optimization because they were not overloaded for automatic-differentiation arithmetic. Thus hybrid versions, JANISIS (ISIS or JANUS) and GEMINI (MERLIN or NEPTUNE), were introduced, which would work efficiently in automatic-differentiation mode or in ordinary arithmetic mode (differentiation internally turned off). This greatly sped up the iterative searches of solvers like AJAX, MARS, JOVE, ZEUS, and JUPITER, which called their models many more times in non-differentiation mode, when various kinds of non-derivative search sub-steps were applied.
Automated Prototyping of Application Models
Modeling is a fundamentally different discipline from programming as it is practiced today, where it is largely intermediated. A programmer is presumed to be designing code to fit a pre-defined specification, or simply porting an old program to a new purpose (e.g. fitting it with a GUI), where the functionality of the application has already reached a level of completeness beyond original formulation and experimentation. As a result, many programmers are "finishers" of programs as end items, and therefore disdain prototyping as a waste of time or a source of "inefficient" code.
Rapid Iterative Model Prototyping
R&D applications, however, are hardly ever end items; they are essentially dynamic lab apparatus for exploring science. Thus they are constantly evolving, not primarily in style but in functionality. Many PROSE users only partially understood the problems they were trying to solve when they started. They often just had a set of equations they had put together by adapting general science formulations (e.g. laws of physics or chemistry) and empirical relationships to model a specific design. They iteratively adapted these formulations in response to the output feedback of trial solutions, often to the point where their initial conception of the problem was totally transformed several times. In design optimization, for example, knowing which independent variables to solve for is often a great challenge, especially if there are no data sources (e.g. old experiments or simulations) from which to characterize the function being optimized and its dependencies. Many trials were often required to explore functions, often starting with a few unknowns and then adding more for higher fidelity as their behavior became better understood through feedback from initial runs.
Infeasible Intermediation
This self-learning of the problem by its originator would usually have been prohibitive in cost and time if programming had been intermediated: a specification would have had to be written to teach the programmer what to program, in a lower-level language like C that the originator could not understand, so that he would not recognize the problem as his own formulation after it was programmed. In PROSE, by contrast, the holon structure self-organized his formulation as a problem statement, not as a solution method or algorithm. All of the solution methodology was under the hood, and it was interchangeable simply by substituting the name of a different engine.
Each of the intermediated prototyping trials alone would have taken longer than the entire DIY problem-solving process from start to finish with PROSE. The feedback from automated prototyping and testing crude initial models was therefore invaluable in comprehending what the best formulation of the application was. This was where the very abstract problem archetypes of automated holons paid off. One could start with a simulation holon in an open loop experimentation mode, to gain experience with the problem, and then nest this simulation holon under a correlation holon to match some test data. Finally, one might nest this holarchy in a higher optimization holon experimenting with different objective functions, and different sets of unknowns.
All the while one could rely on a conceptual understanding of the underlying mathematical behavior of the holarchy and its automation of under-the-hood mathematical processes, namely the propagation of the differential influence of one holon upon another. One did not need to understand the specific design of these processes, any more than physicists understand the actual atomic and quantum structure they are abstractly exploring as mathematical paths in bubble chambers.
And the agility of automation enabled users to explore many different problem formulations and find one or more in time to meet the narrow opportunity windows for proof of concept in proposal deadlines or time-to-market constraints which limit R&D markets.
Disintermediation and Peer Collaboration
PROSE essentially reinstated a kind of modeling/programming collaboration that was highly agile and diversified, because it had disintermediated the waterfall specification practice established by assembly-language programming, which could not be shared between modelers and programmers. FORTRAN, on the other hand, could be used as a shared medium for collaboration between problem modelers and algorithm programmers, and this became a standard means of collaboration before and during the Apollo program. It was essentially a peer "driver/mechanic" race posture, in which the driver in the race defined the equation models by crude prototyping and simulation, while the solution engines for higher math (correlation and optimization), to be called as subroutines by these modelers, were developed in FORTRAN by professional programmers working with applied mathematicians between the races. The programmers and mathematicians were essentially the race mechanics, and their responsibility included all of the media and languages as well as the solution engines. So very soon after the invention of FORTRAN, there appeared problem-oriented languages such as DYANA (1958), which converted spring-mass-damper diagrams into FORTRAN programs used to compute mechanical-vibration simulations for automotive design.
A plethora of such languages emerged from collaborative development during those early years to deal with the burden of application diversity which the waterfall could not cope with. Millions of diverse one-of-a-kind scientific and engineering applications emerged in a very short time, essentially between 1957 and 1970. PROSE was merely the most advanced of this genre of scientific and engineering languages that emerged in that era before there was a notion of "the productivity paradox", because this "driver/mechanic" division of labor was so prolific.
Waterfall Reinstated by Software Recycling
After 1970, however, the waterfall was essentially brought back in a frenzy of software recycling spurred by the advent of the IBM 360. This began an era focused on converting old software to new computers, which predominates today, primarily with the recycling of the Unix operating system into hundreds of different strains (including Linux and Android), all mainly differing in style. The focus long-ago shifted from highly diversified one-of-a-kind custom applications in R&D to mass-application software, whose recycling development could support a large labor force of programmers. The major languages which emerged (C and Pascal) were surrogates for assembly language, having pointers and other techniques to build device drivers. This was the ascendancy of computer science and the rapid demise of one-of-a-kind application modeling, especially in R&D. Operating systems and systems programming became the computer-science ideal.
The waterfall resurged because the specification part of development was replaced by old operating code. End users were no longer needed to write specifications, so Information Technology became autonomous and the industry became highly stratified. Collaboration between end-user scientists and engineers and professional programmers and mathematicians became rare, and instead of building more problem-oriented languages like PROSE, we began to see OOP languages like C++, which further isolated end users from programmers.
With the emergence of the web, an extremely complex yet session-limited browser-server markup protocol (compared to the already emergent X client-server session protocol), intermediated programming, mainly in professional website development, surged once again, and C++ led to Java and JavaScript. These presentation languages emphasized, rather than diminished, the gap between client and server, unlike the X protocol, which tended to hide it. As a result, interactive computing became frightfully complex compared to the time-sharing era, when PROSE enabled interactive optimization searching.
Now almost all applications are written in a hodgepodge of different languages, and the agility concepts of extreme and agile software development do not include code sharing between modelers and programmers, as happened during the Apollo era. They presume that programming is still done exclusively by programmers in computer-science languages like C++, Java, and JavaScript.
Productivity Paradox or R&D Renaissance?
Meanwhile, nobody has been able to explain why we still have a productivity paradox. There have been many explanations over the years since Solow's quip in 1987, but none of them touch upon the real cause: intermediated programming. If there is to be any reform in this industry, looking at PROSE may be the best clue as to how to achieve it and bring back the driver/mechanic race posture that FORTRAN triggered in 1957, becoming essentially "our Sputnik".
As economist Robert J. Gordon recently stated, the U.S. entered the "dismal age" of slow productivity growth between 1972 and 1996, and he showed a figure to illustrate his point that "if the 1948-72 productivity trend had continued, the level of productivity would have been 69% above what would have occurred if the 1972-96 trend had continued." The actual outcome shown in his figure indicated that the productivity benefits of the internet revolution closed only 9% of that 69% gap.
The important thing to ask ourselves today is why IBM, "a medium-sized maker of tabulating equipment and typewriters", tripled its gross income and doubled its size from 1955 to 1960, dwarfing all of its competitors, at a time when the growth in computing was primarily in R&D and programming was predominantly DIY with FORTRAN and BASIC. Productivity in those years continued the 1948-72 growth trend without interruption. Yet after 1972, when the IBM 360/370 and later the IBM PC and its clones dominated computing, productivity growth was dismal. This is when programming became intermediated again, as it had been before FORTRAN changed the game in 1957, and intermediated languages like PL/I, Pascal, C, C++, Java, and JavaScript prevailed.
The little-known growth difference between DIY programming and intermediated programming is that the former explodes the diversity of application demand and production, because it transduces the burden of diversification to the diverse user pool. Any application need, no matter how few people have it, can be accommodated by user-applied automation without incurring extra labor cost above that of the user, which would be expended anyway without automation. In R&D this is critical for growth, because the demand density of any given need is extremely low. All applications are essentially one of a kind and only benefit the user having that unique need. Yet users don't experience diversity at all, because they are operating within their own specialty. So demand-pull burgeons because diverse needs are rampant, but the burden of diversity is not felt by anybody.
Intermediated programming, on the other hand, is hobbled by diversity, because programmers don't know the application specialties, unless they are staples like word processing, known to most everyone. They have to learn each application specialty in order to know what to program, so someone has to write a specification to teach them. As a consequence, the whole profession of programming is boggled by application diversity, and can only support staple needs that have inherent economies of scale, namely mass applications like office tools, which can amortize enormous labor costs over mass usage of a single application or a few applications.
Now, with programming languages like C++ and Java, which are understandable only by dedicated professionals, users don't know how to program in them. So we have a catch-22: programmers don't know what to program, and users don't know how. The result is a stalemate in R&D.
The development of languages like PROSE could produce a DIY renaissance of the "driver/mechanic" race posture of the Apollo era. Instead of professional programmers continuing to create a Babel of style-diversified computer-science languages for their own use, they could collaborate with end users to once again build functionally diversified problem-oriented languages, like DYANA, which optimize rather than merely simulate new application designs. Considering the plethora of old simulation applications containing extant models that could be converted in this manner, without being built from scratch as they originally had to be, the speed of new industry growth, in comparison to the early industry growth around FORTRAN, could be quite phenomenal, especially with the attraction of new manufacturing with 3D printing.
References
- ^ PROSE – A General Purpose Higher Level Language, Procedure Manual, Control Data Corp. Pub No. 840003000 Rev. B (Jan 1977)
- ^ PROSE – A General Purpose Higher Level Language, Calculus Operations Manual, Control Data Corp. Pub. No. 840003200 Rev. B (Jan. 1977)
- ^ PROSE – A General Purpose Higher Level Language, Calculus Applications Guide, Control Data Corp. Pub. No. 84000170 Rev. A (Jan. 1977)
- ^ PROSE – A General Purpose Higher Level Language, Time Sharing System Guide, Control Data Corp. Pub. No. 84000160 Rev. A (Jan. 1977)
- ^ J.M. Thames, The Evolution of Synthetic Calculus: A Mathematical Technology for Advanced Architecture, in Proc. of the International Workshop on High-Level Language Computer Architecture, University of Maryland, 1982
- ^ B. Krinsky and J. Thames, The Structure of Synthetic Calculus, A Programming Paradigm of Mathematical Design, in Proc. of the International Workshop on High Level Computer Architecture, University of Maryland, 1984
- ^ J.M. Thames, Synthetic Calculus – A Paradigm of Mathematical Program Synthesis, in A. Griewank and G.F. Corliss, eds., Automatic Differentiation of Algorithms: Theory, Implementations, and Applications, SIAM, Philadelphia (1991)
- ^ J.M. Thames, “SLANG—A Problem-Solving Language of Continuous Model Simulation and Optimization", ACM National Conference, San Francisco, 1969.
- ^ J.D. McCully, “The Q Approach to Problem Solving”, Proceedings of the Fall Joint Computer Conference, 1969.
- ^ R.N. Nilsen and W.J. Karplus, "Continuous-System Simulation Languages: State of the Art Survey" Annales de l'Association Internationale pour le Calcul analogique - No 1, Jan, 1974, p. 20
- ^ J.M. Thames, Computing in calculus, Research/Development, (1975), pp. 24–30
- ^ F.W. Pfeiffer, Some Advances Related to Nonlinear Programming, ACM Sigmap Bulletin, Issue 28, Jan 1980, pp. 15-21
- ^ F.W. Pfeiffer, Automatic differentiation in PROSE, ACM SIGNUM Newsletter, 22 (1987), pp. 1–8
- ^ R.E. Bellman and R.E. Kalaba, Quasilinearization and Nonlinear Boundary-Value Problems, The RAND Corporation, American Elsevier Publishing Co., New York, 1965, p. 125, p. 168