This post is a mashup of two draft book sections. What was originally the first section has been moved to the end, because it is a bit disorienting in the context of this blog and perhaps more palatable in the following order.
The arrival of the DEC PDP-10 marked a pivotal shift away from the IBM paradigm of computing. Where the IBM 7094 had perfected batch processing, the PDP-10 pioneered interactive, time-sharing systems. This innovation fundamentally altered the relationship between the user and the computer, moving from an indirect, asynchronous process to a direct, real-time conversation. The philosophical differences between the two machines were stark, a clash between an established empire and an agile challenger. The PDP-10 was both an heir to the 36-bit mainframe tradition and a radical departure from its batch-processing legacy: it emerged in the late 1960s as a platform for interactive, multi-user computing, offering a glimpse of a future in which users could engage with the machine in real time. Its architecture was an evolutionary leap that reshaped the relationship between user, software, and hardware.
DEC's market strategy was as sophisticated as its technology. The company positioned the PDP-10 to serve the same market niche as the 7094, large-scale scientific and engineering computation, but camouflaged its direct challenge to IBM by emphasizing its next-generation interactive and time-sharing capabilities. This approach was a deliberate effort to avoid direct competition with IBM, a priority of DEC’s Ken Olsen. [1] The result was a platform that preserved the 36-bit architectural lineage while introducing revolutionary features that would foster a new and unique software culture.
Fortran IV code naturally migrated over along with the 36-bit legacy. In some sense Fortran IV, the 36-bit word, and the punch card era are bound up together. Critically, the PDP-10 maintained this essential architectural link to the past and thus ensured the portability of Fortran IV numerical codebases, easing the migration of applications such as the Simplex algorithm and, more generally, Operations Research to this new, interactive environment. As part of that legacy, the implicit contract between the Fortran IV and Assembly language layers was maintained. The DEC Fortran IV compiler and the MACRO-10 assembler were tightly coupled, and both had 6-bit character data in their genes, so the overall system preserved continuity with the sixties.
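To make that 6-bit heritage concrete, here is a minimal sketch in Python, purely for illustration, of packing text the way DEC's SIXBIT convention did: each character code is the ASCII value minus 32 (covering space through underscore, uppercase only), and six such codes fit exactly into one 36-bit word. The function names `sixbit_pack` and `sixbit_unpack` are invented for this sketch, not part of any DEC software.

```python
def sixbit_pack(text):
    """Pack up to six characters into a single 36-bit word using the DEC SIXBIT
    convention: character code = ASCII value minus 32, valid for ' ' through '_'."""
    text = text.upper().ljust(6)[:6]           # a SIXBIT word holds exactly six characters
    word = 0
    for ch in text:
        code = ord(ch) - 32
        if not 0 <= code < 64:
            raise ValueError(f"{ch!r} has no SIXBIT representation")
        word = (word << 6) | code              # leftmost character lands in the high-order bits
    return word                                # 6 characters x 6 bits = 36 bits

def sixbit_unpack(word):
    """Recover the six characters packed into a 36-bit SIXBIT word."""
    return "".join(chr(((word >> shift) & 0o77) + 32) for shift in range(30, -1, -6))

word = sixbit_pack("DECWAR")
print(oct(word), sixbit_unpack(word))          # one 36-bit word and its six characters
```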
Decwar is a late flower of that legacy, genetically a member of the 36-bit Fortran IV and Assembly language family. Here lies the ancestral link with Simplex, Operations Research, and the IBM 7090 series. Its culture and spirit are a direct continuation of the pioneering large software systems of the fifties and sixties, such as Simplex: somewhat freed from physical punch cards, tapes, and batch processing by the introduction of video terminals, disks, and time-sharing, but preserving the family traits from a software perspective.
For industry, the Simplex algorithm was not an academic curiosity; it was a tool of strategic significance. Applied to the complexities of refinery operations, it produced a deterministic, optimal solution for resource allocation. By implementing Simplex, Exxon could systematically improve its profit margins, turning abstract mathematics into tangible financial gains. This direct line from computation to profitability provided the economic imperative for the entire mainframe ecosystem. Within the oil industry the application was direct and transformative: by determining the most efficient processing pathways and product mixtures, the algorithm delivered a clear and massive return on investment, easily justifying the expense of purchasing and operating mainframe computers.
Computationally, the Simplex algorithm is pure numerical computing, grounded entirely in linear algebra. Its implementation consists of a series of vector and matrix operations designed to find a deterministic, optimal solution. The history of Simplex implementations mirrored the evolution of programming itself. The earliest versions were coded directly in machine code, and all through the fifties and well into the sixties, implementation in Assembly language or machine code remained standard. Fortran entered the picture for this type of high-end computing gradually. In fact, the Simplex algorithm's industrial deployment paralleled the maturation of Fortran, which IBM's John Backus began developing in the early fifties, the same period in which Dantzig was refining Simplex at RAND.
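To give a flavor of the linear algebra involved, here is a minimal sketch of the tableau form of the Simplex method, written in Python with NumPy rather than any historical Fortran or Assembly code, and applied to an invented two-product blending problem of the kind a refinery planner might pose. The numbers and the function name `simplex_max` are purely illustrative; each iteration is a pivot, one row normalization plus a row elimination across the tableau, exactly the kind of vector and matrix work described above.

```python
import numpy as np

def simplex_max(c, A, b):
    """Maximize c @ x subject to A @ x <= b and x >= 0, assuming b >= 0
    so that the slack variables form an initial basic feasible solution."""
    m, n = A.shape
    # Tableau layout: [ A | I | b ] with the reduced-cost row [ -c | 0 | 0 ] at the bottom.
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n] = A
    T[:m, n:n + m] = np.eye(m)
    T[:m, -1] = b
    T[-1, :n] = -c
    basis = list(range(n, n + m))              # slack variables start in the basis
    while True:
        j = int(np.argmin(T[-1, :-1]))         # entering column: most negative reduced cost
        if T[-1, j] >= -1e-12:
            break                              # optimal: no improving direction remains
        col = T[:m, j]
        ratios = np.full(m, np.inf)
        positive = col > 1e-12
        ratios[positive] = T[:m, -1][positive] / col[positive]
        i = int(np.argmin(ratios))             # leaving row: minimum ratio test
        if not np.isfinite(ratios[i]):
            raise ValueError("problem is unbounded")
        T[i, :] /= T[i, j]                     # pivot: normalize the pivot row,
        for r in range(m + 1):                 # then eliminate the column everywhere else,
            if r != i:
                T[r, :] -= T[r, j] * T[i, :]   # one vector operation per tableau row
        basis[i] = j
    x = np.zeros(n + m)
    x[basis] = T[:m, -1]
    return x[:n], T[-1, -1]                    # optimal point and objective value

# Toy two-product blend: maximize profit subject to crude supply and unit capacity.
profit_per_bbl = np.array([5.0, 4.0])
resource_use = np.array([[6.0, 4.0],           # resource consumed per barrel of each product
                         [1.0, 2.0]])
resource_limit = np.array([24.0, 6.0])
mix, profit = simplex_max(profit_per_bbl, resource_use, resource_limit)
print(mix, profit)                             # -> [3.  1.5] 21.0
```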
IBM's development of Fortran was a strategic masterstroke. The language was not merely a piece of software but a critical component of a complete system, engineered to make the formidable computational power of machines like the 7094 more accessible, practical, and valuable to its industrial clients. The core design philosophy of Fortran IV was explicit and unwavering: it was built for numerical computation. Its strengths lay in vector-matrix operations and linear algebra, precisely the kind of mathematics required by Simplex and the oil and aerospace industries.
IBM's business strategy was aligned with this parallel evolution of Simplex and Fortran. The company did not merely sell hardware; it engineered and marketed a holistic computational solution, a packaged system for industrial optimization. The IBM 7090 series of mainframes, Assembly code, and the Fortran IV compiler were designed to work together as a seamless platform for numerical computation. IBM understood its market precisely: it was selling FLOPS (floating-point operations per second) to corporations like Exxon for the express purpose of running Simplex and similar algorithms. The value proposition was unambiguous: invest in our hardware and software, and we will give you the computational power to make your operations more profitable.
Fortran IV, Assembly code, and the 7094 were, in essence, designed to work as a symbiotic system to dominate the burgeoning field of numerical computing. In the early 1960s the 7094 stood as the archetypal supercomputer of its time. For major corporations like Exxon it was more than a piece of advanced machinery; it was a strategic asset that translated computational power into tangible industrial results by solving the largest and most complex numerical problems of industry. The 7094 was the engine of a new era of data-driven optimization, its entire design geared toward numerical work at an unprecedented scale.
In a world where every machine cycle was a valuable commodity, programmers adopted a pragmatic, hybrid approach that blended high-level logic with direct, unmediated access to the hardware. Expert programmers viewed Fortran not as a final tool but as a potential impediment to achieving maximum speed. Standard practice was to write the high-level structure and logic of a program in Fortran, then push the performance-critical inner loops down into Assembly language or even pure machine code. This gave direct access to the hardware and ensured that the machine's floating-point capabilities were exploited to their absolute fullest. The more frequently a section of code was executed, the greater the pressure to move it from Fortran into Assembly.
Recognizing the cultural realities of the Assembly tradition, IBM anticipated and facilitated this hybrid programming model. The company was fully aware that this was the standard workflow for its most demanding customers, and it made every effort to ensure that the integration of Fortran and Assembly was as smooth as possible. The compilers and linkers were explicitly designed to support the hybrid model, acknowledging that true performance lay in the fusion of high-level structure and low-level optimization. This pragmatic, two-tiered programming model remained common through the sixties and seventies.
The system’s core architectural and operational paradigm was built around batch processing. A programmer's primary input method was a physical stack of punch cards, a card deck. Decks were submitted to operators, who loaded them into a card reader. The contents of card decks were recorded onto magnetic tape, which formed the heart of the 7094's data-handling system. The role of magnetic tape was paramount and fundamentally different from modern storage. An IBM 7094 was flanked by a row of ten tape drive cabinets. These constituted the machine's online file system, with each tape effectively a file. This simple, direct mapping obviated the need for a hierarchical file system with directories and subdirectories. The machine had ten online files, the ten physical tapes mounted on the drives. The user experience was defined by this batch-oriented workflow. It was an asynchronous, indirect, and entirely non-interactive process.
[1] Rifkin, Glenn, and George Harrar. The Ultimate Entrepreneur: The Story of Ken Olsen and Digital Equipment Corporation. Chicago: Contemporary Books, 1988.