There's more context. Creative Computing magazine was the vehicle of David Ahl, who published the single-player Star Trek BASIC code repeatedly from around 1973 onward and is in some sense at the heart of the Star Trek story that led to Decwar. In other words, Ahl published an article about the UT DEC-10 a few years before it was used to create an ultimate version of the Star Trek game he was championing.

In the early days of the microcomputer revolution, Creative Computing and BYTE were the two pillars of the industry. They were friendly competitors serving different niches, and even collaborated occasionally before being absorbed by larger corporate entities. Creative Computing, founded in 1974, is widely considered the first personal computer magazine; Ahl, a former DEC employee, launched it to focus on the educational and playful side of computing. BYTE, launched about a year later in 1975, quickly became the journal of record for the industry, known for its technical depth and massive, brick-like monthly issues.
Friday, February 27, 2026
1976 Photo of the UT Austin DEC-10
Saturday, February 21, 2026
Pursuing Authenticity and Efficiency with Tape Images
The project's second stage, which began at the end of 2024, represented a strategic and philosophical shift. The motivation was not merely to escape the awkwardness of client-server file transfers, but to pursue an ideal: a complete Decwar PDP-10 system built from scratch using only SIMH tape images of authentic original DEC tapes. The ambition was to build TOPS-10 from DEC source tape using MONGEN (MONitor GENeration, roughly analogous to building UNIX or Windows from source code), install the appropriate DEC Fortran IV compiler (DEC FORTRAN-10 V6) from DEC source tape, then install Decwar from a reconstructed UT Austin SDT, build, and play.
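In SIMH terms, booting an installation tape on an emulated PDP-10 comes down to attaching image files to emulated devices and booting one of them. The fragment below is a hypothetical illustration only: the `set`, `attach`, and `boot` commands are standard SIMH, but the device and file names are assumptions, not taken from the project's actual configuration, and device naming varies between SIMH variants.

```
; Hypothetical SIMH session for booting a distribution tape (illustrative
; device and file names; details vary by SIMH variant and configuration)
attach rp0 tops10.rp06        ; system disk image on an emulated RP06
attach tu0 distribution.tap   ; DEC distribution tape image
boot tu0                      ; boot from tape to begin installation
```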
The transition to a tape-based workflow marked a major jump in efficiency and realism, with immediate and substantial improvements in cycle time and development ergonomics that rendered the Kermit-based approach obsolete for active development. The new process involved editing source files locally, creating a new SIMH tape image (an automated step taking less than a second), and simply restarting the SIMH PDP-10. The entire cycle, end to end, takes a matter of seconds and is invoked with a single command or push of a button.
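The tape-image step is fast because the SIMH `.tap` container is a very simple framing format: each record is a 4-byte little-endian length, the data padded to an even byte count, and the length again; four zero bytes form a tape mark. The sketch below, which is not the project's actual tooling, shows a minimal writer under those assumptions, with illustrative function names and record sizes.

```python
# Minimal sketch of a SIMH .tap tape-image writer, assuming the standard
# SIMH magtape container framing. Function names, blocksize, and file
# layout are illustrative, not the project's actual build tooling.
import struct

TAPE_MARK = struct.pack("<I", 0)  # four zero bytes = tape mark

def write_record(out, data: bytes) -> None:
    """Write one tape record: length header, data (even-padded), length trailer."""
    header = struct.pack("<I", len(data))  # length is the unpadded record size
    if len(data) % 2:
        data += b"\x00"                    # records are padded to an even length
    out.write(header + data + header)

def make_tape(path: str, files: list[bytes], blocksize: int = 512) -> None:
    """Pack each input blob into fixed-size records, one tape file per blob,
    separated by tape marks and ended with a double tape mark."""
    with open(path, "wb") as out:
        for blob in files:
            for i in range(0, len(blob), blocksize):
                write_record(out, blob[i:i + blocksize])
            out.write(TAPE_MARK)       # end of this tape file
        out.write(TAPE_MARK)           # double mark = logical end of tape
```

A build script can regenerate such an image from the edited source tree in well under a second, which is what collapses the edit-build-test cycle described above.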
The impact of centering the workflow on SIMH tape images was profound, creating a much smoother and more flexible process. The paradigm shift was so complete that Kermit and client-server file transfer have vanished in practice, retained as possibilities mostly for historical reasons, much like interactive file editing on the PDP-10 with SOS or TECO: possible and historically interesting, but not an effective everyday workflow.
Beyond efficiency, this shift held deep cultural significance. By using the reconstructed SDT for every build, the project was "eating its own dog food." This practice is the ultimate validation of the archeological work, proving the integrity of the artifact by using it as the foundation for further progress. It also makes the entire process feel "super realistic." The technical elegance of this approach lies in its fidelity to the original hardware paradigm: the SIMH PDP-10 interacts with the tape image with no awareness that it is not physical hardware. This commitment to authenticity is demonstrated with every build from the reconstructed SDT, ensuring a clean and consistent environment. While this Tape Era perfected the local build process, it remained tethered to a specific host machine's configuration, setting the stage for the next evolutionary step: total environment abstraction.
Saturday, February 7, 2026
An Initial Pragmatic Bridge to the Past
Beginning in the fall of 2024, the project's first stage was defined by a fundamental logistical problem: how to transfer newly edited source code from a modern host computer onto the SIMH PDP-10. The initial solution was Kermit, a venerable client-server protocol that characterized the project's first development workflow. This approach required running Kermit on both ends, a modern host and the PDP-10, to establish a communications bridge for file transfers. Originally created in 1981 at Columbia University to let users move files between smaller computers and campus mainframes, Kermit requires the user to manually run executables on both the local host and the mainframe. In some sense Kermit is a serial-terminal-style ASCII connection between two user-level executables, piggybacking its own file-transfer protocol on top of that connection. This contrasts with the early-1970s ARPANET protocols such as TELNET and FTP, which are lower-level and more deeply integrated into the operating system.
The concept for Decwar was to edit the Fortran source files locally, then use Kermit to transfer them to the PDP-10. An apparently simple alternative was to edit the Fortran source interactively on the PDP-10 over a telnet session using vintage DEC tools such as SOS and TECO. While entertaining, that approach very quickly proved too slow and cumbersome.
The Kermit workflow, though an indirect way to code and test, was at least somewhat automatable using Kermit's scripting capabilities. One would edit Fortran source files on the local machine, and an automated script would then use Kermit to move those files over to the PDP-10 in order to rebuild Decwar and prepare it for testing. This workflow had a dual nature. Kermit's familiarity and relative ease made it a worthwhile learning tool that enabled the project to get started quickly. At the same time, the method imposed significant limitations: the process was comparatively slow and more than a little awkward, highlighting the friction between the modern and historical systems and ultimately serving as the catalyst for a more integrated methodology.

In retrospect, this workflow, while functional for initial system exploration, presented significant efficiency bottlenecks and likely slowed progress compared with what better approaches would have allowed. The reliance on manual, multi-step file transfers created an indirect and cumbersome development cycle that was ripe for optimization.
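The kind of scripted transfer described above can be sketched as a C-Kermit command file. The fragment below is a hypothetical illustration: `set host`, `input`, `output`, and `send` are real C-Kermit script commands, but the port, prompts, remote command names, and file names are assumptions, not the project's actual configuration.

```
; Hypothetical C-Kermit script automating a transfer to the emulated
; PDP-10 over SIMH's telnet listener (port, prompts, and names assumed)
set host localhost 2025     ; connect to the emulator's telnet port
input 10 .                  ; wait up to 10s for the TOPS-10 "." prompt
output r kermit\13          ; start Kermit on the remote side
input 10 >                  ; wait for the remote Kermit prompt
output receive\13           ; put the remote Kermit in receive mode
send decwar.for             ; push the edited Fortran source across
exit
```

Scripts along these lines made the transfer repeatable, but each run still paid the full cost of a byte-by-byte protocol exchange, which is the slowness the paragraph above describes.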