Saturday, April 25, 2026

DEC and the Dawn of Interactive Worlds

This post is a mashup of two draft book sections. What was originally the first section has been moved to the end here, because it's a bit disorienting in the context of this blog and reads more naturally in the following order.

The arrival of the DEC PDP-10 marked a pivotal shift away from the IBM paradigm of computing. While the IBM 7094 had perfected batch processing, the PDP-10, which emerged in the late 1960s, pioneered interactive, multi-user time-sharing. This innovation fundamentally altered the relationship between the user and the computer, moving from an indirect, asynchronous process to a direct, real-time conversation. The philosophical differences between the two machines were stark, a clash between an established empire and an agile challenger. The PDP-10 was both an heir to the 36-bit mainframe tradition and a radical departure from its batch-processing legacy, offering a glimpse of a future where users could engage with the machine in real time. Its architecture was an evolutionary leap that reshaped the relationship between user, software, and hardware.

DEC's market strategy was as sophisticated as its technology. The company positioned the PDP-10 to serve the same market niche as the 7094, large-scale scientific and engineering computation, but camouflaged its direct challenge to IBM by emphasizing its next-generation interactive and time-sharing capabilities. This approach was a deliberate effort to avoid direct competition with IBM, a priority of DEC’s Ken Olsen. [1] The result was a platform that preserved the 36-bit architectural lineage while introducing revolutionary features that would foster a new and unique software culture.

Fortran IV code naturally migrated over along with the 36-bit legacy. In some sense Fortran IV, 36 bits, and the punch card era are bound up together. Critically, the PDP-10 maintained this essential architectural link to the past and thus ensured the portability of Fortran IV numerical codebases, easing the migration of applications such as the Simplex algorithm, and Operations Research more generally, to this new, interactive environment. As part of that legacy, the implicit contract between the Fortran IV and Assembly language layers was maintained. The DEC Fortran IV compiler and MACRO-10 assembler were tightly coupled, and both had 6-bit character data in their genes, so the overall system preserved continuity with the sixties.

Decwar is a late flower of that legacy, genetically a member of the 36-bit Fortran IV and Assembly language family. Here is the ancestral link with Simplex, Operations Research, and the IBM 7090 series. The culture and spirit are a direct continuation of the pioneering large software systems of the fifties and sixties, such as Simplex, now somewhat freed from the presence of actual physical punch cards, tapes, and batch processing by the introduction of video terminals, disks, and time-sharing, but preserving the family traits from a software perspective.

In the post-WWII industrial landscape, Operations Research emerged as a strategic discipline, transforming complex logistical challenges into solvable mathematical problems. The capital investment required for early mainframes was partly justified by the clear and substantial return on investment promised by solving large-scale optimization problems. These machines were purchased to answer critical industrial questions. How can we use our resources with maximum efficiency to generate maximum profit? A key to unlocking the potential of Operations Research was the Simplex algorithm, an automated method for solving complex linear optimization problems. It stands as one of the most important algorithms of the 20th century and was a quintessential application for the early IBM commercial digital computers. The first practical implementation was led by Dantzig using IBM hardware at the RAND Corporation in the early 1950s.

For industry, this was not an academic curiosity; it was a tool of strategic significance. When applied to the complexities of refinery operations, it could produce a deterministic, optimal solution for resource allocation. By implementing the Simplex algorithm, Exxon could systematically improve its profit margins, turning abstract mathematics into tangible financial gains. This direct line from computation to profitability provided the economic imperative for the entire mainframe ecosystem. Within the oil industry, the application was direct and transformative. By determining the most efficient processing pathways and product mixtures, the algorithm delivered a clear and massive return on investment, easily justifying the expense of purchasing and operating mainframe computers.

Computationally, the Simplex algorithm is pure numerical computing, grounded entirely in linear algebra. Its implementation consists of a series of vector and matrix operations designed to find a deterministic, optimal solution. The historical implementation of Simplex mirrored the evolution of programming itself. The earliest versions were coded directly in machine code, and all through the fifties, well into the sixties, implementation in Assembly language or directly in machine code was standard. Fortran entered the picture for this type of high-end computing gradually. In fact, the Simplex algorithm's industrial deployment paralleled the maturation of Fortran, which John Backus's team at IBM began developing in the mid-fifties, the same period in which Dantzig was refining and implementing Simplex at RAND.
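The tableau mechanics those early machine-code versions carried out can be sketched in a few dozen lines of modern code. The following Python function is purely illustrative, not any historical implementation: a minimal dense-tableau Simplex for maximizing c·x subject to Ax ≤ b with x ≥ 0 and b ≥ 0, with all names and tolerances invented for this sketch.

```python
def simplex(c, A, b):
    """Maximize c.x subject to A.x <= b, x >= 0 (all b >= 0).
    Returns (optimal value, solution vector). Illustrative sketch only."""
    m, n = len(A), len(c)
    # Tableau: each constraint row gets a slack variable, plus a RHS column.
    tab = [row[:] + [1.0 if i == j else 0.0 for j in range(m)] + [float(b[i])]
           for i, row in enumerate(A)]
    # Objective row holds the negated cost coefficients.
    tab.append([-float(ci) for ci in c] + [0.0] * m + [0.0])
    basis = list(range(n, n + m))          # the slacks start in the basis
    while True:
        # Entering variable: most negative objective-row coefficient.
        col = min(range(n + m), key=lambda j: tab[-1][j])
        if tab[-1][col] >= -1e-9:
            break                          # no improving direction: optimal
        # Leaving variable: minimum ratio test over positive pivot entries.
        ratios = [(tab[i][-1] / tab[i][col], i)
                  for i in range(m) if tab[i][col] > 1e-9]
        if not ratios:
            raise ValueError("problem is unbounded")
        _, row = min(ratios)
        basis[row] = col
        # Pivot: normalize the pivot row, then eliminate the column elsewhere.
        piv = tab[row][col]
        tab[row] = [v / piv for v in tab[row]]
        for r in range(m + 1):
            if r != row and abs(tab[r][col]) > 1e-12:
                f = tab[r][col]
                tab[r] = [v - f * w for v, w in zip(tab[r], tab[row])]
    x = [0.0] * n
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = tab[i][-1]
    return tab[-1][-1], x
```

On a toy two-product blending problem, say maximize 3x + 2y subject to x + y ≤ 4 and x + 3y ≤ 6, the routine pivots to the optimal vertex just as the fifties implementations did, only on a machine millions of times faster.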

IBM's development of Fortran was a strategic masterstroke. The language was not merely a piece of software but a critical component of a complete system, engineered to make the formidable computational power of machines like the 7094 more accessible, practical, and valuable to its industrial clients. The core design philosophy of Fortran IV was explicit and unwavering: it was built for numerical computation. Its strengths lay in vector-matrix operations and linear algebra, precisely the kind of mathematics required by Simplex and the oil and aerospace industries.

IBM's business strategy was aligned with the parallel evolution of Simplex and Fortran. The company did not merely sell computers; it sold a packaged solution for industrial optimization. The IBM 7090 series of mainframes, Assembly code, and the Fortran IV compiler were designed to work together as a seamless platform for numerical computation. IBM understood its market precisely and was selling FLOPS (floating-point operations per second) to corporations like Exxon for the express purpose of running Simplex and similar algorithms. The value proposition was unambiguous. Invest in our hardware and software, and we will give you the computational power to make your operations more profitable. IBM did not merely sell hardware; it engineered and marketed a holistic computational solution.

Fortran IV, Assembly code, and the 7094 were, in essence, designed to work as a symbiotic system to dominate the burgeoning field of numerical computing. In the early 1960s, the 7094 stood as the archetypal supercomputer of its time. For major corporations like Exxon, it represented more than just a piece of advanced machinery; it was a strategic asset that could translate computational power into tangible industrial results by solving the largest and most complex numerical problems of industry. The 7094 was the engine of a new era of data-driven optimization, and its entire design was geared toward solving complex numerical problems at an unprecedented scale.

In a world where every machine cycle was a valuable commodity, programmers adopted a pragmatic, hybrid approach that blended high-level logic with direct, unmediated access to the hardware. Expert programmers viewed Fortran not as a final tool, but as a potential impediment to achieving maximum speed. Standard practice involved writing the high-level structure and logic of a program in Fortran, but pushing the performance-critical inner loops down into Assembly language or even pure machine code. This allowed direct access to the hardware, ensuring the machine's floating-point capabilities were exploited to their absolute fullest. The more frequently a section of code was executed, the greater the pressure to move it from Fortran into Assembly.

Recognizing the cultural realities of the Assembly tradition, where seasoned programmers viewed high-level languages as an impediment, IBM anticipated and facilitated a hybrid programming model. IBM was fully aware that this was the standard workflow for their most demanding customers. Consequently, they made every effort to ensure that the integration of Fortran and Assembly was as smooth as possible. The compilers and linkers were explicitly designed to facilitate this hybrid model, acknowledging that true performance lay in the fusion of high-level structure and low-level optimization. This pragmatic, two-tiered programming model became common in the sixties and seventies.

The system’s core architectural and operational paradigm was built around batch processing. A programmer's primary input method was a physical stack of punch cards, a card deck. Decks were submitted to operators, who loaded them into a card reader. The contents of card decks were recorded onto magnetic tape, which formed the heart of the 7094's data-handling system. The role of magnetic tape was paramount and fundamentally different from modern storage. An IBM 7094 was flanked by a row of ten tape drive cabinets. These constituted the machine's online file system, with each tape effectively a file. This simple, direct mapping obviated the need for a hierarchical file system with directories and subdirectories. The machine had ten online files, the ten physical tapes mounted on the drives. The user experience was defined by this batch-oriented workflow. It was an asynchronous, indirect, and entirely non-interactive process.

[1] Rifkin, Glenn, and George Harrar. The Ultimate Entrepreneur: The Story of Ken Olsen and Digital Equipment Corporation. Chicago: Contemporary Books, 1988.

Wednesday, April 22, 2026

Casscam Image Dissector Star Tracker

This is an experimental post. It's the first to expand the scope to cover the history of computing at UT's McDonald Observatory and Center for Space Research, as discussed in the About. And it's related to an excellent new post on Ken Shirriff's Blog. Ken's blog is a direct inspiration, much like the TCHC Blog. Ken's post explores a historical star tracker. Star trackers are a fascinating topic and played a role in the history of computing at UT. This post will focus just on the image dissector star tracker used in the McDonald Observatory 82-inch telescope Cassegrain Camera. Later posts will discuss the CCD star trackers used by the Center for Space Research for the NASA ICESat and ICESat-2 missions, and the associated computational modeling and data analysis. It's a long and complex story, covering different eras of computing at UT, and best explored gradually over time. There are also three good books for background reading on these topics [1], [2], and [3].

Around 1990, the 82-inch telescope at McDonald Observatory was used to make glass plate photographs of asteroids for the Texas Minor Planet Project (TMPP) and the Hubble Space Telescope Astrometry Team. It was the Indian Summer of traditional analog glass plate imaging; digital CCD imagers had not yet taken over this niche. The heart of the TMPP system was the suitcase-sized Casscam and its integrated star tracker. The Casscam was among the last and most advanced plate cameras. It's possible that the star tracking control loop was entirely analog electronics. Based on operational experience and the environment at McDonald, it's also possible that at least the encoders and logic were digital. The Casscam was directly descended from the first instruments used on the 82-inch. Below is an old photo of a direct ancestor, probably from the thirties. [4] The knife-edge focus frame and glass lens were still used in 1990, as discussed below.

An 82-inch telescope instrument and direct ancestor of the Casscam. The knife-edge focus frame lies to the left and has the basic outer dimensions of a glass photographic plate. The cone mounted on top is a heavy glass lens. Peering through the lens, celestial objects appeared much as in a long-exposure full-color astrophotograph.

Even the largest asteroids were small and faint, requiring a relatively long exposure to build up an adequate spot in the photographic emulsion on the glass plate. While building up an asteroid image spot, the Casscam had to track the asteroid’s apparent motion to hold the spot still on the glass plate. This apparent motion was principally from the Earth’s own motion and parallax effects. Against the background of effectively fixed stars, an asteroid moved appreciably when viewed from the Earth, especially when imaged with the magnification of the 82-inch telescope. The Casscam had to nullify this apparent motion during asteroid tracking. The photographic plate holder was rotated to align the asteroid's motion along the Casscam's primary axis. Then, during asteroid tracking, the Casscam moved the plate at the same speed as the asteroid's apparent motion, nullifying it. This was open-loop tracking, without feedback or active error correction.

Star tracking was also needed, separately from asteroid tracking. The stars in the image near the asteroid were also faint, and it was essential to build up adequate spots in the photographic emulsion for them as well, as they were the means to computationally tie the asteroid to the celestial reference frame. The computational modeling and data analysis aspects of this are subjects for later, dedicated posts. In this post, the focus is purely on the Casscam’s capability to track the stars and hold their image spots still on the glass plate. It did this using a combined image intensifier and image dissector tube star tracker locked onto a guide star. The image intensifier was a close relative of a photomultiplier tube. Incoming photons initiated a cascade of electrons down a cylindrical tube, roughly twelve inches long and a few inches in diameter. The circular end of the tube was a glass phosphorescent screen with a cross-hair etched on it. In normal operation the green fuzzy ball of a star image was kept centered in the cross-hair. Below is an example with a much lower magnification and wider field of view. Imagine this zoomed in on the central bright star, with a cross-hair on it.

Stars in an image intensifier. This one has a much lower magnification and wider field of view than the one on the Casscam.

Image dissector tubes seem to have been named to suggest dissecting or taking apart an image, in other words sampling an image. Image sampler may be a more suggestive name to modern eyes. An image is formed using electrons, and that image is then sampled, all within the tube. In the picture below, the lens on the right forms an electron image on the photo-electric plate while the aperture and valve samples the electron image.

An image dissector tube in its early role as a television camera. [9]

The Casscam's combined image intensifier and dissector sampled the electron image just at the center of the field of view, around the cross-hair. Once a star was placed in the cross-hair, the sampling would output an error signal whenever the star began to drift away. The control loop would then move the plate holder to zero the error signal and correct the drift. The control loop ran at about 1 Hz and produced a loud clicking noise every second. This soon became a familiar sound in the darkness of the 82-inch dome, a steady click click click while the star tracker control loop was active.
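The principle of that once-per-second correction can be illustrated with a short simulation. This is only a sketch: the real loop lived in the tracker electronics, and the drift rate, gain, and units below are invented for illustration. At each one-second "click," the dissector measures the star's offset from the cross-hair (the error signal) and the plate holder is moved to drive that error back toward zero.

```python
def track_star(drift_rate, gain=0.8, steps=10, dt=1.0):
    """Toy 1 Hz star-tracker loop: proportional correction of image drift.
    drift_rate and gain are hypothetical illustration values, not Casscam data."""
    star = 0.0    # star image position on the plate (arbitrary units)
    plate = 0.0   # plate holder position
    errors = []
    for _ in range(steps):
        star += drift_rate * dt    # the image drifts between samples
        error = star - plate       # dissector's error signal at the cross-hair
        plate += gain * error      # one "click": move the plate to cancel drift
        errors.append(abs(star - plate))
    return errors
```

With the loop closed, the residual error settles at a small fraction of the per-second drift instead of accumulating; with the loop open (gain zero), the star would walk steadily off the cross-hair, and the spot would smear across the emulsion.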

In note [5] below there's mention of a 64x64 image dissector at McDonald in the seventies, so clearly something like sampling of a pixel grid was possible and a viable technology until solid-state CCD imagers became available in the eighties and nineties. The transition to CCD star trackers will be explored in a future post about the Center for Space Research and the NASA ICESat star trackers.

Since this post includes a photo showing the knife-edge focus frame, its use can also be described. At the beginning of an observing run, early steps included preparing the Casscam and focusing the telescope. The Casscam was a heavy instrument, about the size of a suitcase. The McDonald operations staff would mount it onto the back of the 82-inch using a lift and heavy bolts. The combined system then needed to be focused by moving the secondary mirror, which was roughly fifty feet overhead. The secondary mirror was moved by an electric motor controlled from a control paddle on the observing platform. Focus was achieved using a knife-edge technique within the focal plane. By placing a metal frame with a straight knife-edge into the Casscam’s plate holder, the observer could adjust the secondary mirror until the light from a star was cut off instantaneously rather than gradually. On occasions when time permitted, the knife-edge focus frame could be replaced by the massive glass eyepiece also shown in the photo. Needless to say, star-gazing through the 82-inch telescope was something very special. 

Notes and photos

[1] MacKenzie, Donald. Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance. Cambridge, MA: MIT Press, 1990.

[2] Spinardi, Graham. From Polaris to Trident: The Development of US Fleet Ballistic Missile Technology. Cambridge: Cambridge University Press, 1994.

[3] Grewal, Mohinder S., Angus P. Andrews, and Chris G. Bartone. Global Navigation Satellite Systems, Inertial Navigation, and Integration. 4th ed. Hoboken, NJ: John Wiley & Sons, 2020.

[4] Evans, David S., and J. Derral Mulholland. Big and Bright: A History of the McDonald Observatory. Austin: University of Texas Press, 1986.

Direct ancestor of the star tracker discussed in Ken's post. [1] 
Another direct ancestor.

[5] Though the image dissector is capable of scanning a two-dimensional image, it is difficult to do so before the phosphor of the last intensifier has decayed substantially. McDonald observatory did indeed build an area photometer which scanned a 64x64 two-dimensional array (P. M. Rybski, G. W. Van Citters & G. F. Benedict, IAU Coll. 40 Astronomical Applications of Image Detectors with Linear Response, 1976). Bull Astr Soc India, 406-423 December, 1985. This could very well have been related to the Casscam star tracker. Fritz Benedict was a member of the Hubble Space Telescope Astrometry Team into the nineties.

[6] Image dissector tubes have found widespread use in astronomy, beginning with the pioneering work of L. Robinson & J. Wampler in the early seventies. Though occasionally used as imaging devices for either recording extended fields or guiding in automatic/remote-manual mode, the more popular usage has been in intensified scanning spectrometers. Such a system was first developed at Lick Observatory (Robinson & Wampler, Publ. Astr. Soc. Pacific 84, 16 1972), who subsequently duplicated it at the Anglo-Australian Observatory. Kitt Peak National Observatory, European Southern Observatory and Ohio State University have subsequently built similar instruments, some of which are still maximally used. The introduction of more sensitive detectors like the image photon counting systems and charge-coupled devices, and resultant shift in the emphasis of observing programs to fainter limits, have rendered the image dissectors less popular in recent years. However, the image dissector remains the most useful detector at intermediate light levels where avenues remain open for astronomical research. Ibid.

[7] The Intensified Image Dissector Scanner has been in routine use at Kitt Peak National Observatory for two years ... In this instrument, the output phosphor of a three-stage image intensifier is used as a temporary storage medium for incoming photon events. An image dissector tube is used to rapidly scan this output phosphor. Instrumentation in astronomy III, Proceedings of the Society of Photo-optical Instrumentation Engineers, v172, p86, 1979.

[8] An image dissector, also called a dissector tube, is a video camera tube in which photocathode emissions create an electron image which is then swept up, down and across an anode to produce an electrical signal representing the visual image. It employs magnetic fields to keep the electron image in focus, and later models used an electron multiplier to pick up the electrons ... they continued to be used for imaging in early weather satellites and the Lunar lander, and for star tracking in the Space Shuttle and the International Space Station. Wikipedia

[9] https://www.earlytelevision.org/baird_and_farnsworth.html

Thursday, April 16, 2026

Machines in Austin and San Marcos

Thank You to Clive Dawson for this fantastic info! What Clive is discussing here fits together perfectly with info from the PDP-10 serial numbers webpage, as discussed below in the Notes.

I am not aware of a dual KL DEC-10 on the UT Austin campus. The KI DEC-10 was in the Humanities Research Center (HRC) (Now known as the Harry Ransom Center). The Research DEC-20, named "UTEXAS-20" on the early ARPANET and R20.UTEXAS.EDU after the domain system came about in 1983, was in Painter Hall. The Academic DEC-20, named the A20, was housed in the Graduate School of Business (GSB) which was, incidentally, built on the site of the old Pearce Hall.

Any report of DECWAR being played on a TOPS-10 Dual KL would not have occurred on the UT Austin campus.  The Texas State Board of Control, which later became Texas State Purchasing and General Services Commission, had at least one KL, possibly a dual KL.  I believe that Tommy Loomis, a systems programmer who worked with me on the KI DEC-10, moved over there sometime after the demise of the KI, so maybe he took DECWAR with him?!  Another nearby KL (possibly dual?) was at Southwest Texas State University in San Marcos (now Texas State University).  I’m pretty positive that DECWAR was running on that system.

An intriguing question is whether DECWAR was ever ported to TOPS-20.  It never ran on the Research 20 (I would’ve known about it) but perhaps it found its way to the Academic 20?   DEC supplied a piece of software called PA1050 (aka “the compatibility package”) which allowed most TOPS-10 programs to run on TOPS-20.  But the memory organization of both operating systems was very different, so I’m guessing that porting DECWAR would have required a bunch of extra effort (i.e. a complete rewrite of the MACRO code) considering the tricks it employed with the shareable high segment.

Notes

We also have the following from the PDP-10 serial numbers webpage

690 TOPS-10 University of Texas Austin, TX KI10 ------> HRC!:)
1114 TOPS-10 Texas St. Purc. Austin, Texas KL10B CPU1? -------> AHHA!!! [1]
1360 TOPS-10 Texas St. Purc., Austin, TX KL1099D CPU0?
2403 TOPS-10 (MARS::) Southwest Texas University, San Marcos, TX KL 1095 --> [2]
2908 TOPS-10 (SATURN::) Southwest Texas University, San Marcos, TX KL1095

[1] Now Texas St. Purc makes sense!!! Thank you Clive:)

[2] This is where teenage me played DECWAR, circa 83 to 85!:)

[3] History and Functions of the Texas Board of Control. Published: 1976. The Board of Control was established by the Texas legislature in 1919 and was composed of three members appointed by the governor for six-year, overlapping terms. The major duties of the board were to purchase supplies for the departments and eleemosynary and educational institutions of the state; control the state's public buildings and grounds; rent extra buildings and offices for state agencies; prepare the biennial appropriation budget and submit it to the governor; and control the state historical parks. ... In the 1970s the board's responsibilities included managing a system of telecommunications services for state agencies. It maintained a central office-supply store, messenger service, and telephone service, as well as an office-machine repair service. The agency was organized into six divisions: Central Purchasing, Centralized Services, Automated Services, Building and Property Services, Security, and Telecommunications Services. In 1979 the Board of Control was abolished and replaced by the State Purchasing and General Services Commission.

Tommy Loomis. Thank you to Rich Denney, with the antennae, for these photos!

Sam Houston State Office Building on the Capitol's grounds.
Could be that the dual KL was in here?

Saturday, April 11, 2026

Fall 2025 Event Pictures and Videos

This post is an early experiment with sharing pictures and videos, in this case from the Fall 2025 Event. And also an early collaboration with the Travis County Historical Commission blog and Richard Denney in particular. Rich is a historian, one of the original UTCC staff working with the DEC-10 in HRC, and a photographer. Many of the photos here in the DecwarOrg blog and on the TCHC blog are by Rich, and we will repeatedly say Thank You for Rich's work: photos, articles, discussion, and feedback. TCHC blog and Rich are the role models for DecwarOrg blog. Here are two TCHC posts of special historical computing interest. First the new TCHC post



Mk Haley and Eric Freeman are faculty in the UT School of Design and Creative Technologies, and Noah Smith is a UT Aerospace Engineering alumnus. Mk, Eric, and Noah collaborated on the Fall 2025 Event, spontaneously forming DecwarOrg along the way. The website and this blog are building on that. Roughly speaking, DecwarOrg began taking shape around the nucleus of the Event, and things have been growing from there.

Below are a few pictures of attendees experiencing a bit of living history, thanks to Eric's DecwarJS reconstruction. Eric's work brings the seventies technologies from UTCC all the way up to the present day, making them completely accessible. Eric's code and workflows are cutting-edge and provide a great experience for everyone interested in the history. At the same time, anyone interested can also get hands-on with the original fifty-year-old code and environment. Both the modern and the vintage are available side by side, open for use together, separately, or in ways we haven't foreseen yet. In particular, the two approaches together are an ideal foundation for future AI applications, with AI players bringing to life the sights and sounds of eighteen players battling across the galaxy back in 1982. In short, the Org is working with something very special here, and is sharing that, and welcoming everyone to participate in the next chapters of this surprising, thought-provoking, and historic fifty-year story.
In the entrance to the event, arcade-like terminals with DecwarJS running.
Old-timers from the eighties were also playing remotely, from other states and countries.
Eric watches over his DecwarJS work in action.
Here's a video walk through of the event.
And two videos that were created for and shown during the event. The first is an intro to some of the history, and the second was designed for use with a projector to help establish the vibe during the event.

Wednesday, April 8, 2026

The Computer Science Department Locations On The UT Campus


We received a fantastic note from Clive Dawson this morning that we have to share right away. This explains and clarifies mysteries that have confused even those with decades of experience on campus!:)

From Clive Dawson

This is what I remember about the buildings that the Computer Sciences Dept. occupied over the years.  When I arrived in the Fall of 1971, CS was housed in Waggener Hall together with Classics and Philosophy.  Pearce Hall (Building) was to the south, across Inner Campus Drive, just west of the Business Building (BEB).  It housed an RJE (Remote Job Entry) terminal which is where Dave Matuszek remembers going to submit card decks and retrieve printouts.  These functions could also be performed in the Computation Center proper.

A year later ('72), CS moved to Painter Hall, north of the Main Building, where it stayed for many years.  We shared it with the Physics Department, and it also housed the Painter Hall Telescope.  If any building can be dubbed the “Home of Star Trek”, it would be Painter, just as the HRC was the “Home of Decwar”.

In the early ’80’s  there was talk about constructing a new dedicated building for CS.  A “slot” was even reserved in the UT construction schedule for this, but much to the dismay of many CS faculty, this slot was “stolen” by the new MCC building at Braker and Mopac around 1986-88.

The Taylor Hall Annex on the corner of 24th and Speedway housed another of the Comp. Center’s RJE sites, together with several CC user services (consulting) offices.  It was torn down in the early ’90’s to make room for the ACES building.  By then, the CS Department had moved from Painter into Taylor Hall proper. 

In 2010, the plan was to tear down Taylor Hall to make room for the Gates Computer Science Complex and Dell Hall, commonly known as the Gates-Dell Complex (GDC).  So the CS Department was scattered to various places on campus, including the ACES building as well as a temporary building in the parking lot on the northeast corner of 24th and Speedway.  The CS Department Office was moved to this temporary building.

Finally, in 2013, the GDC was completed.  Some CS people who had a significant affiliation with ACES (now known as the O’Donnell Building) stayed there, but the bulk of the CS Department moved into GDC where it remains today.

Waggener
Pearce Hall (Old Law Building)
MCC

Saturday, April 4, 2026

Dave Matuszek Oral History Interview

Oral History Interview 
Dave Matuszek
Noah Smith Interviewer
February 15 and 16, 2026

Noah: We're talking with Dave Matuszek, co-creator with Paul Reynolds of the 1974 Fortran Star Trek game on the CDC 6600, and we're going to learn about your experiences and your history in 1973 and 1974. So we'll start off with background. So tell me a little bit about yourself. 1973, 1974, how did you end up at UT?

Dave: Okay. I did my undergraduate work at Michigan State University. At that time there basically was no such thing as a computer science program. Nonetheless, a lot of people were interested in computers, and my wife and I took courses where we could find them, like from sociology and psychology and mathematics, and I just got hooked. It was a lot of fun. After I graduated, I wanted to do something more serious with my life, so I went into psychology. And after a year of that, and just deciding it wasn't for me, the ACM at that time posted a little pamphlet of all the computer science degrees in this country and Canada. And we looked them over. I selected five of them and applied. Decided to end up going out to the University of Texas. University of Wisconsin was a strong contender at that point. But that was at the point where people were talking about possibly blowing up the computer there. And in fact, after we moved to Texas, they did try to do that. So I ended up at Texas as a teaching assistant. I really love teaching. I really love programming. And we got into this Star Trek program.

Noah: Well, we'll spend a lot of time on that. We're going to get into that, but let's get a little bit more background. Where did you actually grow up? Were you from Michigan? Dave: Yeah, I was born and grew up in Michigan. Noah: Okay, so moving to Texas was a big move. Dave: Yes. My wife and I kept looking at each other and saying, "Texas?". Okay, but it turns out Austin is a pretty nice cosmopolitan city.

Noah: Agreed, agreed. What was your official role? So you said teaching assistant in what? Dave: In computer science. Noah: In computer science, okay. So you came in as a computer science graduate student, is that right? Dave: Pardon? Noah: You were a graduate student. Dave: Yes, in computer science. Noah: Okay. And did you have any direct affiliation with the computation center? Dave: I'd have to say no.

Noah: Where was the computer science department at the time, 1973, 1974? Do you happen to remember what building it was located in or where on campus it was? Dave: I can't bring the name of the building to mind right away. It was right next to the speech building. Noah: Okay. I mean, I'm thinking it might have been Taylor Hall. Dave: Yes. Noah: Okay, yeah, we know roughly what part of campus. So there was Taylor Hall, and this wasn't far from the actual computation center, right?

Dave: Correct. Noah: Can you maybe just describe a little bit what the computation center was like, the actual building, if you can call it that? Dave: Basically, I can't. Okay, what I can tell you is that there was a building quite a ways across campus called Pearce Hall, which is where we typed up our card decks and submitted them, and they were entered into the system and run there. That was physically separate from the computation center. Okay, and then of course after a couple of years we aren't using card decks anymore. Noah: We'll definitely check in on that, like how you physically were interacting with the machine. Let's get a little bit more background before we dive into it though. So, what was your life like outside of UT when you were in Austin? What part of town were you living in in Austin?


Dave:  Well, let's see. At first, we were living in someone's rented basement. And let's see, after a while, we got into married student housing. Noah: Okay, was this down by the river, by the lake? Dave: Yes, it was. Noah: Yeah, that's still married student housing today, so I know exactly what you mean. Dave:  And it was what, Colorado Apartments, I think? Noah: Uh-huh, something like that, right. It's still there. What would you say Austin was like for you? And were you a Star Trek fan, by the way?

Dave: Less so than a lot of people. I'm a science fiction fan and Star Trek was sort of science fiction, and yes, I enjoyed it. Noah: But not a hardcore Trekkie. Dave: Not a hardcore Trekkie, that's right. Noah: Did you appreciate any other things about Austin like the music or anything special that you remember about Austin at the time? Dave: We were never really into music. My wife was also in a graduate program there. We spent most of our time eating at some of the nearby restaurants. After a few years, we started having children and found a very nice Mexican babysitter for them. Once we had children, we spent quite a bit of time going to the nearby library and getting out books. I was there for quite a long time. I was actually in the graduate program for 10 years. And the reason for that was we were getting by okay with the money that we were making, and they kept coming up with new courses, and there was always something new and interesting, and you know of course doing the final dissertation is work. But I eventually got around to it.

Noah: So you were there until the early 80s basically in Austin. Dave: Yeah, not good at dates, but that's about right. Noah: It's about right. So let's go ahead and dive in to actually dealing with the machine and then we'll get into Star Trek. Let's explore a little bit what working with the machine was like. Describe the physical situation. So you say Pearce Hall you were submitting card decks. How did that work?

Dave: Well, in Pearce Hall, they had the card punch machines, and you would write up your program, go to a card punch machine, punch it out into cards, put it into a tray, and they would take the tray, and sometime, probably the next day, you'd come back and get a printed result. Now obviously that was not an interactive system, and after a couple of years, don't ask me how many, we had terminals and we could interact directly with the 6600 rather than this baroque and Stone Age way of doing things. The 6600, although being quite powerful, is really still pretty small, and all of my material was stored on a magnetic tape. The magnetic tape was kept in the center, and when I started a session, I would log in and request the tape and then they would mount that and I'd have my programs and things available.

Noah: So the 6600, yeah, my picture is that it was actually in the computation center which is next to the main building. This kind of subterranean bunker style building and you could walk down into the hallway and behind the glass you could see the machine, right? Dave: Correct. Noah: Was your tape stored there inside the machine room? Okay. And you would be in a different building and you would have to request your tape to be mounted, is that exactly right? How long did that take usually for that to happen?

Dave: A minute or two. Noah: Really? That's nice. Dave: And so you see, although the computation center was right there, there was hardly any reason for me ever to visit it. Noah: Right. Because all you could do would be peer through the glass windows and look at the machine like everybody else. What we can say is because it's in this kind of basement underground bunker, there wasn't a lot of space. There certainly wasn't room for terminal rooms down there. There was just the machine room and maybe some offices, right?

Dave: There was a machine room, as to offices possibly. But you know, as I say, why would I ever go there? Noah: Yeah, there wasn't much space down there. So the machine rooms were in other buildings and people should picture that the users were in other buildings. Let's jump ahead just a little bit to talk about Star Trek. When you were working on Star Trek, did you have a terminal already?

Dave: Yes. Noah: Okay, that's key. Dave: It was interactive. 80 characters, 24 lines, I think. Green screen. Noah: The name Pearce Hall sounds kind of familiar to me. Can you orient me a little bit, like where it was on campus? Was it near the main building? Dave: Okay, there is a, and I think it's still there, a major business building. Sort of on the south end, southwest corner. And it was a bit north of that.

Noah: Okay. Dave: From the westmost corner. And I don't know how long it's been gone, but I'm sure it's been gone for quite a while. Noah: Yep, could be. The name sounds familiar to me, but not. So the terminal room, it was shared with other departments? Was it purely a computer science terminal room or was it shared? Dave: It was shared. If you want a bit of history, it was on the main floor occupying I think pretty much the main floor, although I don't recall. Down below in the basement was anthropology. And it was full of skeletons and things. And there was one office right at the back where if you go down and go through the anthropology room, there's an office. And that was my office for a while.

Noah: Nice. Dave: And I did not get many visitors. Noah: But you were right next to the terminal room, which is nice. It was on the floor above you, right? Dave: Yep. Noah: Was there any possibility of having a terminal in your office? Dave: No, we're still talking about the first year or two. Noah: Actually, we skipped over something I missed. How did you meet Paul Reynolds? Dave: He and I were both new. I think he came in the year after I did. In fact, I think, I can't swear to this, but I think he took the Fortran course that I was teaching.

Noah: Okay. Dave: And we got together, did a little bit of programming together, became friends. Noah: So he actually might have been in your class. He was learning Fortran. Did you work together in the terminal room? Did you do coding sessions together? Dave: Okay, now, going back to the terminal room. Again, that was with the punch cards the first year or two. And that was you walk in, you use a machine, you hand in your cards, you walk out again. Except for a period when instead of having an office downstairs, I had an office in what was basically a broom closet in the terminal room, which was a very fascinating place because people thought I was a consultant and they would come to me with all sorts of problems.

Noah: Oh, that's funny. And you were trying to work though. This was your office, and hey I'm willing to help. Dave: I don't care if I don't know the language. You make the same kind of mistakes in every language. Noah: This is curious. Obviously people were using Fortran. Were there people using COBOL? Maybe business students were there as well using COBOL? Dave: Oh God. Okay, here's the truth. COBOL was invented for an IBM 605. Its layout was designed to work perfectly with that machine. That machine was a character-oriented computer where the arithmetic was, you specified how many digits you wanted in each number, and it did the arithmetic from the end like a human would and had a multiplication table stored. 6600 had 60-bit words. It didn't have characters. So fitting COBOL onto that machine was sort of putting a motor on top of a camel or something. And I recall this very well because at one point I got stuck teaching a one-credit course in COBOL, and many is the time I would go in and say to the students, "I tried to prepare for this lecture, but I just couldn't. It's so terrible."

Noah: Understood, understood. How did you feel about Fortran? Dave: It's what I learned. It was my first language. Fortran II, actually. What we were using on the 6600 was Fortran IV. And I think it was locally developed, their version of it. So let me point out that back in those days, you can't assume that Fortran on one machine was anything like Fortran on another. Noah: Okay, understood. Dave: So we wrote in 6600 Fortran and that was it.

Noah: So, let's start looking at what happened with the Star Trek game. First of all, it sounds like you were quite busy with graduate studies. You were spending a lot of time in the terminal room. Were you aware of games on the CDC? Dave: There was a Space War-type game that the people in the computation center played, but the first and I guess maybe the only game that I was aware of that was terminal-based was Star Trek. And I should recall who wrote that, but I don't.

Noah: And we have a few names. I don't have them at hand right now, but Jim Corp and Brady Hicks. So, the notes we have are that they implemented Star Trek in BASIC on the 6600. Dave: Okay, it was in BASIC, it was on the 6600. Noah: Did you play that game? Dave: A few times, which is why I decided to write it. Okay, there were two basic problems. One is that this is still on a terminal. You're getting text across character by character and it's fairly slow, and it was full of quotes from Marcus Aurelius, who has to be the most boring philosopher that was ever born. And you know you'd be playing and then you'd sit there and twiddle your thumbs for a while while you got some stupid quote. So there were a number of other minor irritants. The computation center really hated it because this is a 6600. Now this is a supercomputer. The BASIC program took 20 seconds to load. And that's an incredible amount of time. And that's not the fault of the programmers, that's the fault of BASIC, but we decided to write our own version in Fortran minus quotes from Marcus Aurelius. And although the official position of the computation center was no games, this is for work, this is for study, you don't play games on it, they loved our Star Trek. First of all, there were a lot of Star Trek nerds. And secondly, instead of taking 20 seconds, it took 1/20th of a second to load. So the pressure on the machine practically vanished with that game.

Noah: Understood. So the 6600 was running in a time-sharing mode. So there were multiple users. So I wanted to ask you about how the computation center felt about games and I think you've made clear that they weren't too happy with it at first. I wanted to ask you, you mentioned that this computation center had a Space War-type game. Dave: I believe so. If you look into the history, as I'm sure you have, there are a number of games floating around at that point that you had to be at the computer in order to play. And it was my impression that they did do that. But of course, nobody told us.

Noah: Over your career, did you ever see the original Space War on a PDP-1? Dave: No, I have not. Noah: Okay. I think that this was a lucky small group of people that actually saw it on a PDP-1 because there just weren't that many PDP-1s. Dave: Right. For a while I used a PDP-8, but that's another story entirely. Noah: So, is there anything else about what we've discussed so far today? Is there anything else that you would like to say about all of that?

Dave: Yeah, let me mention one thing. The game got quite popular among students there at Texas, of course, and because Paul and I had written it, they assumed I was good at playing it. The fact was that I hardly ever finished a game because whenever I started playing, I'd find something that needed polishing or something that could be added. And I had more fun programming it than playing it. Noah: Understood. So I get the impression that you weren't playing a lot of games. In any case, it was more of a programming challenge.


Dave: Yes, that's correct. And in fact, I still don't play many games. Noah: Understood. Can we get a little more information about Paul Reynolds? So, were you working together? When you were working on the game, would you code together or was it more you shared your work but asynchronous? Dave: Well, I actually wrote most of the game. Paul Reynolds and another person, Rich Cohen, spent a lot of time talking about it and planning what to do. Paul's main contribution was writing the photon torpedoes code. At one point, I decided I didn't like that code and tried to rewrite it myself and discovered what a great job Paul had done with it.

Noah: So, was there anyone else there that were hands on with the code? Or it was mostly you? Dave: Just Paul and myself. Noah: Okay. You said that it became quite popular. Did you have any idea that it would have a history and be talked about 40 years later? Dave: No. So, if I had to choose one thing that I'd be famous for, I'm not sure this would be it. But on the other hand, I can't think of anything else I might ever be famous for.

Noah: So on this topic, Eric Raymond, his website and Git repo kind of records the history of the Star Trek games, and you know we learned a lot about you from that. When you interacted with Eric, did you have any idea that what he was doing would grow into kind of an archive of a type? Dave: Oh, yeah. I met Eric at a science fiction convention in Philadelphia. We were both science fiction fans. Became friends, discovered he lived about a mile from me. And so we were friends for many years. We sort of still are. When I say sort of still, we don't talk politics. He's a libertarian and I'm very much a liberal. Noah: Understood. And Eric has strong opinions. Dave: He does have strong opinions. He's somewhat controversial. My editor asked me not to use a quote from him in my data structures book, which I can respect. But you know, he can still be an interesting person and a great programmer and all that stuff even if his politics are not anything I would want to touch with a 10-foot pole.

Noah: Understood. Dave: We used to be a lot closer, but things changed. Noah: What years was this that you were living nearby? Dave: 80s or 90s. We moved here in '85. We probably met him somewhere around '90. I don't know. Paula, when did we first meet Eric? Yeah, my wife says 1990s. 

Noah: All right. So we're going to start with talking about the 6600 and any other machines, any other platforms that you'd like to discuss, right? So, we don't have to stick just to the 6600. So I'm really curious, you were there for around 10 years whether you saw any kind of change over that period. So if you remember anything, maybe you noticed the CDC Cyber come in. I'm quite interested in the history of the Cyber that kind of replaced the 6600, and then we'll go from there into the Fortran, into the game. So let's just start with tell me a little bit about the 6600. How do you feel about it?

Dave: It was a great machine. It was of course a supercomputer in its time. As for changes as I mentioned, started with punched cards and by the time I left we were playing Star Trek online and doing most of our editing online. Let me tell you just a little about the editing process because it wasn't like today when you pull up everything you've got and start editing it. What we did for Star Trek was print the whole thing out every week or two, go through and mark our changes on the printout, and then go to the computer and enter them.

Noah: Would you say that it was kind of pre-screen editor? Dave: Oh, yes. Well, okay, as I remember, now bear in mind that this was a long time ago, I don't remember exactly how the editing process worked, but I know that, yes, we did do it on a relatively limited terminal, 80 characters by 24 lines as I recall. Maybe it wasn't 80 characters. Noah: I'm thinking of some of the early editors I'm familiar with like TECO on the DEC system. So, these were I think they were even called character editors rather than screen editors. You know, this started with editing on paper tape even, right? Dave: Do I remember what editor I used? Absolutely not. Noah: I'm just trying to picture, you know, you would be looking at your printout. Would you maybe enter the line number like I want to edit line number something? Dave: I don't remember line numbers. And as I say, I don't remember exactly how we did the editing on a limited text-only terminal.

Noah: But you, I guess the printout was important. This is a good hint, right, the printout was important. Dave: Oh, yes. That was by far the easiest way to look at the code. Noah: What was your workflow like? You say you print it out once a week roughly speaking and then? Dave: Well I'm a bit amused by calling it workflow. My work was my courses. But whenever I thought of a change I wanted to make or an addition I wanted to make, I would figure out where in the code it went, write it out, and then go type it in. And then, as I recall, I could try it right away. Noah: Okay. I mean you had to compile it and run it but that wasn't a long process at that point. How about a debugger? Was it kind of just run it and see what happens? Dave: Yeah. And I'd go out and club a cave bear and bring it home for dinner. Noah: Right. Dave: Yeah, things have changed a great deal believe me. We're talking what, 55 years ago something like that. Yeah, and this is not a field that has ever stood still.

Noah: Understood. So, it's interesting that you mention that the 6600 was a supercomputer. And you know, it's common to see this that it was the first commercial supercomputer. So did you remember hearing that back at the time? Dave: Oh, yes. I mean, nobody today would call it a supercomputer. Noah: Did you know or have much context on the company? So CDC, this is I guess Minneapolis right? So kind of up north?

Dave: I think so. Noah: Did you know the name Seymour Cray for instance, and that he was involved with the design? Dave: Oh yes. As far as I know, he did the complete design of that and the follow-up machine. I never used it. I don't remember what the next machine was that he designed. I remember one of his comments was that he was done messing around with small computers and was going to build something impressive. I remember pictures of it that showed basically a pillar with what looked like seats all the way around it. So probably tonight in bed I'll remember what that computer was called.

Noah: This is interesting. It might just be the Cray-1. My picture is that UT had kind of all three generations from the 6600. So the next CDC machine, big machine was the Cyber, which was a transistorized version or brought in integrated circuits. That's the second generation and then the Cray-1. And eventually all three generations were there in Austin.

Dave: My undergraduate work was at Michigan State University. They had a Control Data 3600. I spent some time at Indiana University and I think at that point they had a 6600 fairly sure. And then I went to Texas and they had a 6600. By the time the Cray-1 came out, certainly by the time Texas got one, I had been gone from there. But it probably came out well before I left. Noah: So it sounds like you had a lot of experience actually with CDC hardware and CDC Fortran.

Dave: And very little with IBM. Noah: Okay, understood. So yesterday you mentioned that the 6600 was a 60-bit machine. So tell us a little bit about what you think about when you think of 60 bits rather than say 36 bits or 32 bits. Dave: Sure. Two things come immediately to mind actually. One is that there were six bits allowed per character which meant that we didn't have lower case. The second one was that it had 24 registers in groups of three: the index registers, the accumulators, and the third group. And when you got into the assembly language programming for efficiency, you had to keep track of what was in each register. So you didn't have to go up to memory any more often than you had to. And doing some very careful, very efficient programming on that is what decided me that I was done with writing in assembly language.
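To make Dave's point about characters concrete: with 6-bit codes, a 60-bit word holds exactly ten characters, and 64 possible codes leave no room for lower case. Here is a minimal Python sketch of that packing; the letter codes happen to match CDC display code for A through Z, but the blank handling is a simplified stand-in, not the real character set.

```python
# Sketch: packing ten 6-bit characters into one 60-bit CDC-style word.
# Python ints stand in for the 6600's 60-bit machine words.
# Assumption: A=0o01 .. Z=0o32 (as in CDC display code); 0o00 is used
# here as a stand-in for blank, which is a simplification.

DISPLAY_CODE = {ch: i + 1 for i, ch in enumerate("ABCDEFGHIJKLMNOPQRSTUVWXYZ")}

def pack_word(text):
    """Pack up to 10 uppercase letters into a single 60-bit integer."""
    assert len(text) <= 10
    word = 0
    for ch in text.ljust(10):            # pad to 10 characters with blanks
        word = (word << 6) | DISPLAY_CODE.get(ch, 0)
    return word

def unpack_word(word):
    """Recover the 10 characters from a 60-bit word."""
    chars = []
    for shift in range(54, -6, -6):      # ten 6-bit fields, high to low
        code = (word >> shift) & 0o77
        match = [ch for ch, c in DISPLAY_CODE.items() if c == code]
        chars.append(match[0] if match else " ")
    return "".join(chars)
```

Everything character-oriented on the 6600, from Fortran `A` formats to the workarounds Dave mentions later, ultimately reduced to shifting and masking fields like this.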

Noah: Okay understood. Did you do any let's say major projects in assembly? Dave: I did some projects in assembly because I definitely remember struggling with trying to use those registers as efficiently as possible. But I don't remember what it was and I wouldn't call it major in any case. Noah: Okay. So the picture I have is that the 6600 and the 60-bit words were really for floating point and you essentially get the effect of double precision on most machines, but it was single precision on the 6600 because it had the 60 bits. So you got effectively a double precision effect at single precision speed. So this was big for engineers that were doing early numerical simulations, right?

Dave: Yes, I don't recall any distinction between single and double precision. Noah: So what I wonder is, were you aware of any of these power users that were doing numerical simulations on the 6600? Were there people that would hog the machine with these big simulations? Dave: I'm sure people were getting into that. It was time shared and there were not any periods where we as students couldn't use it. So nobody hogged the machine to death.

Noah: Okay, in that sense. The reason I ask is I've got some documentation that describes that there were actually two machines. This is the picture that I'm learning about that there was the 6600 and there was a 6400 and they called it the dual dinosaur. So there were two, they had shared memory and the 6400 was handling the time sharing and the 6600 was for the heavy batch oriented big jobs. I was curious if you saw any signs of that. It might have been a different period that this was happening.


Dave: I vaguely recall hearing about something like that, but I don't think that was at UT at the time. Noah: Okay. And this might be later. What I see often is they call it UT2D for the UT dual dinosaur and this suggests these two CPUs with the shared memory. Dave: Well, it's a cute name, but I'm pretty sure I didn't deal with that directly. It could have come in later. Noah: And there's also a lot of documentation about how the UT system was using kind of a custom operating system. So there was evidently a standard CDC setup and then some oddball sites and UT was one of those oddball sites.

Dave: True. Noah: Something about the character handling. There was something about the line terminations. Does this sound familiar at all? Dave: I think you understand when I say that I don't recall and I doubt that I knew at the time what was being used as a line terminator. And because at no point were we doing any character searches or anything like that. It was just writing code. Every line of code was on a separate line and that's pretty much all we knew.

Noah: What I've seen signs of in documentation is that these kind of odd sites, they had a hard time sharing software and code with the normal CDC sites. It's like code had to be rewritten a bit. Dave: Yes, the word I was trying to think of was standardization. That was in the future.

Noah: That's right. For the languages, for the operating systems, for everything. So you know, I thought about this earlier that let's imagine that we actually recovered your Fortran code. We do have a 6600 simulator, the SIMH type simulator. What I wonder is because your code was on a UT machine would this be a challenge? And I have no idea. It's just something to explore.

Dave: Okay I'm guessing here. The operating system I know UT pretty much did their own. The language I don't think they messed with particularly, the Fortran language. Noah: I think that's the hope that because it's in Fortran, and as you said yesterday Fortran was not standardized but for the 6600 we could hope that it would be a bit standard, so I think we would have hope that it would run. Dave: Yeah. I'd probably give you at least 60/40 odds on that.

Noah: Yeah, understood. That's about what I would guess too. So, before we leave the hardware and move into the Fortran, what would you say was the best part and the most painful part with the 6600? Just to sum it up for people, what did you love the most and what did you find kind of the hardest with the 6600? Dave: Well 6600 was what there was. I mean what am I supposed to compare it to? I very much enjoyed programming on it. There were quite a few workarounds to use characters on it since it was designed as an engineering machine, but it was what we programmed in.

Noah: How many years do you think of 6600 programming did you get? Close to 10 years? Dave: Oh, I'd say so. I've really been into programming languages and although Fortran was my first language, Fortran for the 3600 was different from for the 6600. I've gotten into a lot of other languages and I can't say how much toward the end of that I was actually using Fortran anymore. I think I probably was because I think all the introductory courses were still being taught in Fortran. And that means the upper level courses that I was teaching at that time would also assume Fortran.

Noah: That's important. So this was kind of the academic language in that environment right? Dave: Yep. Noah: So, let's move into the game itself. I think you did an excellent job of describing how you and Paul Reynolds kind of decided to redo this game that was written in BASIC. How long did you spend creating the Fortran version? Dave: I have no idea. I'm guessing probably a month or two to get the initial version running and then probably a few more years of tinkering with it.

Noah: Did you start from the BASIC code of the previous version or just start? Dave: Oh, no. God, that would mean I'd have to look at the BASIC code. And are you familiar with BASIC at all? Noah: A little bit. Dave: Okay. Let me briefly just point out that it depends on line numbers. All the lines are numbered, they have to be in numerical order, and you refer to other parts of the program by line number. So in terms of high level concepts, no. I played the game and I played the game enough to know how it went. I think I changed 8x8 sectors into 10x10 sectors. Nothing fundamental.

Noah: That was all recreating it from what you knew it was supposed to do. When you first saw the Star Trek game, even the BASIC Star Trek game, do you remember seeing this ASCII art of when you did the scan and you see a sector grid? Do you remember seeing this ASCII art and did that impress you at the time? Dave: I don't think I know what you're talking about. I mean, there was certainly ASCII art, like Snoopy calendars and things like that, but I don't remember seeing any artwork in the game itself.

Noah: Okay. So, I'm not sure that the early Star Trek games did this, but what I'm thinking of is when you typed scan, it would show you a map. That's what I mean by art is this map. Dave: Oh, okay. Noah: Short range scan. Long range scan. The first Star Trek that you saw already had that. And do you think that intrigued you that seeing this kind of map was?

Dave: Well frankly my basic impression of that is that it was useful and it was nice to have the range, nice to be able to type out the map when you needed it. But this was at 33 baud, which is like 10 characters a second or something like that. And so I mean, you'd ask for a long range and go make a new cup of coffee. Noah: Understood. How big do you think the code base was? Was it a large code base that you were working on? Dave: Oh, I'm guessing about 30 or 40 pages. Noah: I would say that's pretty big. Dave: We weren't counting bytes, we were counting printed pages. Noah: I would say that's pretty complex and sophisticated in some way. Do you think that the game was a bit sophisticated for its time?

Dave: No. I mean there was already the BASIC version of it and there were a few other games that I was vaguely aware of. I mean, you could play things like Hangman or Nim. Noah: So let's ask this. What part of the implementation was kind of the most fun or interesting for you and what were you proudest of in some way? Dave: Oh, I most enjoyed, I think that would be The Thing. It occurred to me one night when I went in and added to the code and set it up so that when you saved the game, it wasn't saved. So you would see The Thing appearing maybe once every 20 or 25 games. And you either dealt with it then or it was gone. Do you know what I mean by The Thing?

Noah: I'm guessing some kind of space monster? Dave: Yep. It appeared as a question mark. And Spock would say, "Fascinating." And about the only thing you could do with it was shoot it with a photon torpedo in which case it screamed and disappeared. Noah: So was there a Romulan ship in those early Star Treks? Dave: No, I'm afraid that what happened was that, as I say, we listed out our code and used it for a while and threw it away and then apparently someone stole our code from the garbage can. Now, we did not have Romulans or dilithium crystals in it. But someone stole our code, created Super Star Trek, and added it. And that was annoying, of course, but if they had simply come to us and said, "Hey, can we get involved in this?" I'm sure we would have.

Noah: Well, Dave, this is fascinating. I did not know this twist to the history that someone got your code and separately created Super Star Trek. Dave: Correct. Noah: So, you know what happened then was within two years there was the two-player Star Trek on the 6600. Dave: I'm not aware of that. I wasn't paying a lot of attention. I mean, it wasn't my game anymore. Noah: So, then within four years there was this later 18-player game and in it the Romulans appeared as question marks. So, kind of like The Thing.


Dave: Now, I know nothing about that. Noah: Yeah, so in the 18-player game this AI controlled enemy, the Romulan ship, comes in and it appears as a question mark and it acts somewhat like you described The Thing. So what I'm thinking is you might have in fact invented what many people know as the Romulan when you invented The Thing. Dave: It could be. Let me also mention one of the things that kept us occupied in talking about the game was trying to get everything balanced so that there was not a best way to play. Initially the impulse engines were worthless. So we made it possible so that if you used the impulse engines to enter a sector you weren't detected right away. And that meant you had first shot. But then that meant that the way to win the game was to go every place at a crawl using only the impulse engines. So at that point we decided that impulse engines took time and every once in a while a star would go nova. So if you depended entirely on the impulse engines you'd run out of stars or star bases.
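The balancing rule Dave describes (impulse entry buys surprise, but impulse travel burns time, and with time stars can go nova) can be caricatured in a few lines of Python. All the numbers and names here are invented for illustration; nothing below comes from the original game's source.

```python
import random

# Toy sketch of the trade-off: warp is fast but detected on arrival;
# impulse is slow but undetected (you get first shot), and every unit
# of elapsed time gives each star a small chance of going nova.

NOVA_CHANCE_PER_TICK = 0.001  # invented value, not from the real game

def travel(mode, distance, stars, rng=random.random):
    """Return (time_spent, undetected, surviving_stars)."""
    if mode == "warp":
        time_spent, undetected = 1, False        # fast, but enemies see you
    else:                                        # "impulse"
        time_spent, undetected = distance, True  # slow, but you fire first
    # Each star independently survives every tick of elapsed time.
    surviving = [s for s in stars
                 if all(rng() >= NOVA_CHANCE_PER_TICK
                        for _ in range(time_spent))]
    return time_spent, undetected, surviving
```

Crawling everywhere on impulse maximizes surprise but also maximizes elapsed time, so eventually the stars and star bases you depend on burn out; that is exactly the counter-pressure that closed off "impulse everywhere" as the one best way to play.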

Noah: So this is another thing, stars going nova, that I'm realizing you might have invented several of the things that later became quite familiar and famous to people in your version. This is entirely possible. Dave: Could be. The thing in particular I know while I was at UT had quite a following because anybody that could claim that they had seen The Thing got a certain cachet. Noah: That's right, yeah, this is fascinating. You know we've got one minute left, but what I'm thinking is I'm going to hope that we can explore some of these stories more in the future. I had no idea that Super Star Trek was a separate production from your code.

Dave: Well, you can't call it a separate production the way that Star Trek was separate from BASIC because that was our code. They just took it and added to it. Noah: Yep, okay. So, I'm glad that we got to learn this, I had no idea. So this is something to follow up. You know, a lot happened at UT with Star Trek, let's put it that way. And we're just starting to explore all of that history. So okay I think we're going to run out of time for right now, but I hope that we get to discuss this more in the future. And thank you very much for the last two days.

Dave: It's been fun.
