Programmers Stack Exchange is a question and answer site for professional programmers interested in conceptual questions about software development.

What features of old computers helped you learn to be a better programmer -- but don't seem to be available on new computers?

I imagine there are features that, while educational, you are really glad are gone, such as:

  • programs ran so slowly that I could almost see each pixel being plotted, so I got a visceral feel for the effect of various optimizations.

I imagine there are other features you may be a little nostalgic for, such as:

  • I could turn on the computer, and write a short program that printed "Hello, World" on the printer, before ever "booting" a "disk".

(I'm hoping that this is constructive enough to avoid the fate of the "What have we lost from computers 20 years ago?" question).

I think your fate is in the hands of the people who answer you now... lol. – Kenneth Mar 20 '11 at 5:40
frankly I do not see how this question differs from the other. I mean what could I possibly learn from this that will make me a better programmer... – Newtopian Mar 20 '11 at 8:45
All of them...? [Spent last night using an old BBC Emulator] – Orbling Mar 20 '11 at 9:19

12 Answers

Accepted answer

The manuals that come with machines are not nearly as helpful to programmers as they used to be. The market is different. Take, for instance, the manual for the Commodore 64. It was a separate purchase, but with it you had all of the following essential information:

  • A map of memory, RAM/ROM, special registers, etc.
  • Instructions for how to bank out or bank in ROM (i.e. turn off the BASIC interpreter so you could use that space for your program)
  • Instructions on how to manipulate the graphics and sound cards by changing memory locations (they were memory mapped)
  • Instructions on how to safely add your own interrupts
  • The complete instruction set for the 6502/6510 processor, combined with the cost in cycles, memory, etc. for each instruction
  • The machine language API for the C-64 kernel.

This all came from one resource. All these things are so complicated now that most of us don't dare to venture into them anymore. No one can have an intimate knowledge of the complete Intel x86 instruction set, because it is so vast and a large portion just isn't needed. We don't manipulate memory locations directly to control our graphics or our sound anymore--that's all done through C APIs.

Basically, we've lost a sort of intimacy we used to have with our machines. Some of that is for the better, and some of that is for the worse. For the better, we can write code to one API and have it work on a number of different compatible machines. For the worse, we are not as cognizant of the abuses we put our computer through.

Certainly a different market, back then, they were tools for you to control and adapt. Now they are items to be used, tools have become finished goods. Particularly if you look at the Apple sector. – Orbling Mar 20 '11 at 14:11
Computers used to be tools, like saws, screwdrivers, etc. Something to be used by creative types to do something. They're now appliances, like a toaster. Anyone can walk over, push a button, something magic happens, and they get a result (typically mass media entertainment). Toaster owners only get annoyed if you give them the schematics. :-) – Brian Knoblauch Mar 22 '11 at 13:34

I think that working on computers with minimal memory, hard drive space and bandwidth has made me more aware of the issues involved in all of those. Too often I see a developer surprised that copying a large array around several times might take up all their memory.
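The array-copy surprise is easy to demonstrate. Here is a small Python sketch (the sizes are CPython-specific and only indicative; `sys.getsizeof` counts only each list's own pointer array, not the element objects, which the copies share):

```python
import sys

# Each full copy of a large list costs roughly the same memory again,
# so a few careless copies multiply the program's footprint.
original = list(range(1_000_000))            # ~8 MB of pointers on CPython
copies = [original[:] for _ in range(3)]     # three independent full copies

one = sys.getsizeof(original)
total = one + sum(sys.getsizeof(c) for c in copies)
print(f"one list: {one / 1e6:.1f} MB; with 3 copies: {total / 1e6:.1f} MB")
```

On a 64-bit CPython this reports roughly four times the single-list size, which is exactly the kind of multiplication that exhausts memory on a constrained machine.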

On the other hand, I think having learned in a time when compile times were ridiculously long (measured in hours instead of seconds) has made me too careful. I see developers compile, fail, fix, compile, fail, fix, compile, fail, fix and I cringe, though I know intellectually that the cost is minimal - probably less than me instinctively trying to catch what I can before I hit compile.

+1: compile/fail/fix cycles are an excuse for thinking. If I see that happening I know there is no thinking going on and I get quite angry. – quickly_now Mar 20 '11 at 9:14
It depends... TDD requires quite a bit of good thinking. – Levinaris Mar 21 '11 at 13:59
+1 for minimal memory and storage. – msvb60 Mar 21 '11 at 14:37
What do you want to spend time thinking about? I like to think about what the next test case should do. Then think about making it pass. I'm not going to pick through code doing work the computer will do instantaneously. – kevin cline Mar 21 '11 at 17:21

ZX81 - lousy games. I'm sure I wouldn't have spent so much time tinkering around learning to program in BASIC if the games were as good as on today's consoles.

Oric 1. Same story. Not sure it made me a better programmer, but it made me a tinkerer which led to me being a programmer – pdr Mar 20 '11 at 6:18
I wonder if there's going to be a programmer shortage now... – Rob Agar Mar 20 '11 at 6:25
No, because now it is considered a desirable career path. People learn programming to get a job rather than because they start tinkering. – pdr Mar 20 '11 at 6:34
And I'm firmly of the opinion that good tinkering gives a better understanding, leading to a better programmer. – quickly_now Mar 20 '11 at 9:15
Spent last night tinkering with some old games on a BBC Emulator, and a bit of time on the Speccy emulator too, good times. – Orbling Mar 20 '11 at 9:21

It has to be DOS's interrupt service routines and the fancy things you could do with them. They gave a pretty good start on machine internals, and that heady feeling of "WOW, am I a geek or what!"

You really can not say you're a true programmer till you've typed INT 21h a few times and got back something useful. ;-) – Orbling Mar 20 '11 at 9:22

My first personal computer was an IBM 360/20. Put a card in the card reader, press "LOAD", and it would read the card image into 0x0080 and jump to it. By multi-punching cards I learned to program in machine language, the bottom level of machine code. In DOS you could use Debug to write machine language and even assembler and run it, real time, without ever saving it to a file.

Our 360/20 had NO operating system. To read a card you had to issue a CPU instruction with the target address. To write a print line you used a different instruction. So a loop would be:

0080: read to 0101
0084: write from 0100   (byte 0100 being the line spacing character)
0088: jump to 0080

which is unbuffered. Then I learned how to detect end of card deck and to buffer input/output myself.
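In a modern high-level language the same unbuffered read-print loop might be sketched like this (Python; the `/*` end-of-deck marker and the leading line-spacing character are illustrative assumptions, not the machine's actual conventions):

```python
# A loose sketch of the 360/20's unbuffered loop: read one card image,
# print it, jump back -- stopping when the end-of-deck marker appears.
def run_deck(cards):
    printed = []
    for card in cards:                 # "read to 0101"
        if card.startswith("/*"):      # hypothetical end-of-deck marker
            break
        printed.append(" " + card)     # " " stands in for the byte at 0100
    return printed

print(run_deck(["HELLO", "WORLD", "/*"]))   # [' HELLO', ' WORLD']
```

The loop body reads, writes, and jumps back, nothing else; buffering and end-of-deck detection are exactly the refinements you had to add yourself.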

The lowest level of code, but operating systems have never been a mystery to me since.

A personal computer? – user1249 Mar 20 '11 at 10:12
Yes, a personal computer. The company only had one programmer, me, and only one operator, me. Had ladies to keypunch data as necessary but I did all my own keypunching of programs etc. 8 kilobytes of RAM, multi-function card reader/punch, cpu, and printer, the size of three desks. It rented for 1,500 dollars a month in 1969. – Andy Canfield Mar 21 '11 at 0:49
Ok, a personal computer for rich persons. It is hard these days to understand the drive for getting more value out of such a computer, with multi-user operating systems. – user1249 Mar 21 '11 at 3:03
The first computer I used required one to enter the address of the boot routine into the program counter and toggle the run switch. The cool thing about this setup was that one could also enter machine code via the front panel. – bit-twiddler Mar 21 '11 at 14:15
Heh, when I was in college I was a lab monitor, and frequently had to start up a diskless PDP-11/20 that was used as an RJE terminal. I had to toggle in a primary boot routine that loaded the RJE code from a card deck. Had to do something similar with a different machine that had a hex keypad instead of toggles, not nearly as much fun (although a bit faster). – TMN Mar 21 '11 at 14:48

The most evident thing that is missing today is restrictions. There is simply enough

  • memory
  • CPU horsepower
  • disk space

for most things. On eight-bit machines you had 64 KB of RAM - if you were LUCKY - so compact code was at a premium. 486s were barely fast enough to do MP3 decoding, so you had to use a special MP3 player that worked with integers, as integer math was faster. The common distribution medium was the diskette - hence your program should not span more diskettes than absolutely necessary, to avoid extra cost and installation tedium.
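The integer-only MP3 trick is fixed-point arithmetic: representing fractions as scaled integers so that all the math runs on integer hardware. A minimal sketch (Python; the Q16 scaling factor is an arbitrary choice for illustration):

```python
# Fixed-point arithmetic: store value * 2**Q as an integer, so
# multiplication needs only integer hardware, no FPU.
Q = 16  # number of fractional bits (illustrative choice)

def to_fix(x: float) -> int:
    """Convert a float to Q16 fixed-point."""
    return int(round(x * (1 << Q)))

def fix_mul(a: int, b: int) -> int:
    """Multiply two Q16 values; the shift rescales the result."""
    return (a * b) >> Q

a, b = to_fix(0.5), to_fix(0.25)
print(fix_mul(a, b) / (1 << Q))   # prints 0.125
```

Decoders written this way traded a little precision for a large speed win on CPUs where floating-point was slow or absent.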

If you do Java you might get some of the same experience by entering the 4k competition:

http://www.java4k.com/index.php?action=home

Write a Java program doing fun stuff, where the distribution must not be larger than 4 KB.

And still we have lots of people asking "What is the fastest code int I = 0; or int I; I = 0; ?". – Bo Persson Mar 20 '11 at 12:53
@Bo, well, you have to start somewhere. – user1249 Mar 20 '11 at 17:30
The 486 was an absolute speed demon compared to the first computer system on which I worked. It had ferrite core memory. Memory access time was four microseconds. – bit-twiddler Mar 21 '11 at 15:07
Once I read "if you were LUCKY" the rest of it followed in a Yorkshire accent: youtube.com/watch?v=Xe1a1wHxTyo :) – StuperUser Mar 21 '11 at 15:09
@StuperUser, Can we have your liver then? – user1249 Mar 21 '11 at 16:32

TI 99/4A

  • no debugger, so I had to learn to find bugs by reading the code and following the control flow

  • not much memory and a slow CPU, so optimisation was highly important


Security

During the '60s and '70s, Multics proved that real security was possible without being intrusive. It is amazing that, with all the advances since then, security is no better and perhaps even worse.

http://www.multicians.org/general.html

I read your off-topic posting the other day. I assume that you started out in the seventies as a DP. I got my start in this field in the late seventies/early eighties as a DP. From reading your posting, it looks like you missed out on the joy having the deal with the old UNIVAC 1218 CPU that was used aboard ships. That thing was a trip to use. I went from using that antique computer to working with supercomputers at the National Security Agency (NSA). I also landed a full-time software development gig in NSA's R Group as a DP3; therefore, I know what it meant to you to get that big break. – bit-twiddler Mar 21 '11 at 14:37
There has been some capability-based OS research lately, I'm hopeful that something may come of it eventually. I haven't looked at HP's WebOS, but I doubt if they have anything like that in there. – TMN Mar 21 '11 at 14:40
PL/I was the first language that I truly loved. I wrote a ton of PL/I code when I was stationed at NSA (I also wrote a ton of FORTRAN and C code). My love for Pascal grew out of my love for PL/I, as both languages were heavily influenced by the work of John Backus. Years later, I found myself being called in whenever someone wanted to port an application from MULTICS to another platform. – bit-twiddler Mar 21 '11 at 14:52
After the Navy I went to work for Honeywell, and had to go to NSA one time. I was escorted to and from the mainframe and only allowed to look at the console, no terminals. There was a Multics system at NSA for quite some time, and it was one of the highest Orange Book rated computers of its day. I was trained on a system in "A" school that I think was Univac, but it had an AN designator. When I first heard of an OS being written in a HLL I was skeptical, but over the years I was convinced. – dbasnett Mar 21 '11 at 21:14
Well, yeah, security was great on my DOS box... It didn't connect to anything. :-) – Brian Knoblauch Mar 22 '11 at 13:35

I didn't start that long ago, but one huge change in the last decade has been the rise of the effectively searchable internet (Google). Before that, you had very few instantaneous, always-on resources for assistance. You had to be a better programmer, because help was hours or days away at best.


one word: C O N S O L E !!

And no lame localization...


A decent architecture. I learned assembly language on a Z-80, then learned the 6502, but it wasn't until I started learning it on the PDP-11 that I really appreciated how nice it could be. You had enough registers that you could keep your frequently-used variables in them, you didn't have instructions that only worked with particular registers, and you had those great addressing modes like "register indirect with displacement" (?), which let you directly access elements of a complex data structure.

I was disappointed with the 68000 and its separate "address" and "data" register files, but then I saw the 8086 and just swore off assembly language altogether. Only four registers? And segmented memory? And that stupid REP prefix? And I personally think that you LOAD a register with something, not to something; i.e., LD A,#$30 should load 48 into the A register, not move the contents of A to memory location 48. Again, probably just me, but it was just another turn-off to the x86.

And yeah, I know they've "fixed" a lot of these things, but I've already written my own CPU emulator for my own CPU architecture, so I really haven't had any desire to check it out. I'm hopeful that maybe there'll be some opportunities to do some ARM programming coming up, though.


My first computer was a ZX Spectrum. It was (is?) good for learning because of:

Simplicity

Even a child can learn how the Spectrum works in days. It was shipped with a hardware and software manual, including a chapter on binary arithmetic.

Built-in programming language

You could start programming in Spectrum BASIC without any custom software, and you had the same environment on every box.

Emotional appeal

Commodore 64 vs ZX Spectrum rivalry, rubber keys...

