
Life before Demos (or, Hobbyist Programming in the 1980's) by Jim Leonard
This document recounts the early days of fun computer stuff on the IBM PC, so whenever I refer to "the computer" or a similar generic term, I'm talking about an IBM. If I mention the "Tandy sound chip", I'm referring to the sound chip that was present in both the IBM PC Jr. and the Tandy clones (it's the same chip).
Introduction

Many demosceners go through changes in life that affect them profoundly--changes that either bring them into or force them out of the demo scene. I find myself in the latter, which saddens me a bit, as I have not yet given back fully to the scene that has given me so much. I also welcome the change, as it is time for me to pass the baton onward to the newer generation of people who will redefine what demos are. I look forward to what tomorrow's enterprising youth with a love of computers can bring to the demo scene.

But the new blood probably has no idea what was interesting before demos came around in 1990. Sure, everyone can download GR8 / Future Crew and have a laugh or two, but that's the tail end of what I call "the golden age of PC fun." How did anything become interesting or cool on the PC--a machine with no dedicated graphics or music hardware? What held our attention before demos came along?

Good question. Well, this article will attempt to describe the changes in my life that brought me into the scene--what was interesting to me as I grew up with a PC, starting out in 1984 with a new 4.77 MHz processor and ending in 1990, the proper birth of the PC scene. It will be fascinating to some, and only mildly interesting to others, but it needs to be remembered in the sea of Pentium Pros, Windows 95, and accelerated hardware. Trust me, it used to be nothing like it is now. :-)

Also, I will try to cover some of the more eclectic technical information, at least where I remember it clearly; if you want more info on how something was done, email me and there's a good chance I can find it somewhere. Hope you can parse BASIC code. :-)

So, if you've ever wondered what was interesting or "demo-like" on the PC before demos came around, this article will probably explain it. Let's journey backward, shall we?

The Wonderful World of Games

Let's start briefly with something that held my fascination long before demos existed (and probably what contributed to their birth): Games. It's not too surprising that the most impressive stuff on any home computer is a game. Games are, in fact, some of the hardest things in the world to program, and in the early days, nothing was more true.

Think about it: You had to do many or all of the following: Accept input, display graphics (quickly), play sound effects, process artificial intelligence and other game elements, and do it all quickly enough so that the user didn't feel cheated out of a decent gaming experience. Of course, there was a multitude of problems in doing this on the original PC:

  • The CPU, an Intel 8088, was a 16-bit processor (8-bit bus) running at 4.77 MHz.
  • The sound hardware was a single-voice tone generator with no volume control.
  • The stock graphics hardware (CGA) was limited to 4 colors, and had an interlaced memory organization (not nice and linear like Mode 13h).
  • The bus was so slow that video memory and port writes were time-consuming; simply reading the joystick port took up to 15% of the CPU's time.

So it's obvious that a well-written game that was fast and fun to play was incredibly impressive, since things just simply weren't fast on the PC. I will mention games liberally in my musings. Just thought you'd like to know. ;-)

Teaching An Old PC New Tricks

How do you make a bland, boring "business" computer do something impressive? Good question. While newer (1984) versions of the PC (IBM's PCjr, and Radio Shack's clone, the Tandy 1000) had more sound and graphics capabilities, the older PC did not, and that was the machine with the largest installed user base. So, you had a few areas to improve on: Sound, graphics, and anything that would speed up the system. Since a program's first impression was usually how it looked, many chose to improve the CGA graphics.

CGA (Color Graphics Adapter) always struck me as finicky. The video memory layout was interlaced, which means that the first 8K held every even-numbered scanline and the last 8K held every odd-numbered scanline. Plus, there were 192 bytes unused at the end of both 8K sections, so those became annoying to skip as well. Finally, you had one of two fixed palettes of nasty, ugly colors (cyan, magenta, and white; or red, green, and yellow) to choose from. While some of the best artists could deal with four colors adequately, the crappy color set usually disgusted the user.
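If you want to see the interlace in code, here's a sketch of the address arithmetic in modern Python (the original would have been assembler, of course; the function name is mine):

```python
def cga_offset(x, y):
    """Byte offset into the CGA framebuffer (B800:0000) for pixel (x, y)
    in 320x200 4-color mode.  Even scanlines live in the first 8K bank,
    odd scanlines in the second bank at 0x2000; each byte packs four
    2-bit pixels, so a row is 320 / 4 = 80 bytes."""
    bank = (y & 1) * 0x2000   # odd lines start 8K into the buffer
    row = (y >> 1) * 80       # 100 rows of 80 bytes per bank
    return bank + row + (x >> 2)

# 100 rows x 80 bytes = 8000 bytes per 8192-byte bank, leaving the
# 192 unused bytes per section mentioned above.
unused_per_bank = 8192 - 100 * 80
```

The bank split is why scrolling or blitting on CGA meant doing everything twice, once per 8K half.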

Well, I quickly found out that by fiddling with the video registers, I could get an unadvertised (in my manuals, anyway) color palette that seemed to combine the two palettes. The resulting 4-color palette had black, white, cyan, and red--much nicer. I'm sure this palette was advertised somewhere because some game I had used it, which is what got me sending random bytes to the video ports in the first place.

Sending random bytes to the video registers, by the way, was extremely dangerous! If I had known that then, I wouldn't have even tried at all; I was lucky I didn't break anything. The monitors back then (1987 and earlier) were fixed frequency monitors, and ran at low frequencies. If you told the video card to do something outside of those frequencies, you stood an excellent chance of damaging the monitor. A friend of mine was trying to get the rumored CGA 16-color mode to work (more on this later) and destroyed his Compaq's built-in monitor, a small monochrome monitor that emulated color with different shades of gray. He went to the nearest Compaq dealership to get it replaced, and when they refused to fix it, he answered with something like, "Well, I could demonstrate how easy it is to break the monitor to your customers by typing in a three-line BASIC program on, oh... this machine over here," and pointed to their top-of-the-line machine that some customers were huddled around. They quickly replaced his monitor to shut him up. :-)

Let's face it: CGA was crap. The C64 blew it away in terms of speed, and even an Apple ][+ had more colors to choose from. Anything you could do above what it was normally used for, quickly or differently, was neat. So, a bit of experimentation revealed that you could change the background color, usually black, to one of the 16 text colors. This allowed for a quick flash (possibly for an explosion in a game), which was slightly impressive since nothing about CGA was quick. This trick was also commonly used to change the black background to blue, which made it easier to draw bright and colorful pictures if you didn't need the color black. This same technique could be used to change the foreground and background color of the monochrome graphics in CGA's 640x200 2-color mode. (Love that 640x200 mode's 2.4:1 pixel aspect ratio--NOT!) This wasn't useful for games, but was used to make black-and-white pictures look less... ugly.

Other palette fun could be had by changing the border color, which was largely an overscan hack. It was cute to have the border change colors and not everything else. I didn't even know the border could be a different color for a while--nothing bothered to change it. Another cute hack was to write to the ports to change from 320x200x4 to 640x200x2 on the fly without erasing video memory. This way, you essentially faked a color-to-mono toggle switch.

The only game I ever ran that used some of the above tricks to do some primitive copper effects was California Games (1987). Its video selection menu had the normal CGA 4-color mode, but it also had a mysterious CGA "more-color" mode. The "more-color" mode was only used in two places: The title screen and the hackeysack portion. Why? Because those screens had a clear horizontal division between a graphic on the top half of the screen and one on the bottom half. The "more-color" mode would switch palettes at a certain scanline to display one set of colors on the top graphic, and a different set on the bottom. As you can imagine, this was unnervingly time-critical, and programming your own vertical-retrace interrupts took too long (hey, you needed all the speed you could get) on a 4.77MHz machine, so this trick only worked on the one machine they could hard-code the values for: The original IBM with an original CGA. I always used to think that this mode didn't do anything until I brought it over to my friend's house and saw it work. It worked pretty well, oddly enough. Maybe I was just easy to impress. :-)

Here's a bit of trivia: Checking for Vertical Retrace has been the same procedure since CGA on up--you simply monitor port 3dah. Until I discovered that port, I always wondered why delaying 13 or 14 ms before updating the screen made it nice and smooth, but that's because 1000 ms / 70 Hz (the screen refresh rate) is about 14.2 ms. ;-) Nobody ever bothered to monitor the vertical retrace port unless they were trying to avoid snow on CGA screens.
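The arithmetic behind that magic delay is trivial, but worth spelling out. A sketch (the Python function name is mine; the assembly fragment in the comment shows the real technique, polling bit 3 of status port 3DAh):

```python
def frame_period_ms(refresh_hz):
    """Milliseconds between vertical retraces at a given refresh rate."""
    return 1000.0 / refresh_hz

# A 70 Hz screen retraces every ~14.3 ms, which is why a blind 13-14 ms
# delay happened to land near the retrace.  Doing it properly meant
# spinning on the status port instead of sleeping:
#
#   mov dx, 3DAh
#   wait_vr:  in   al, dx
#             test al, 8       ; bit 3 = vertical retrace in progress
#             jz   wait_vr
```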

Snow, you ask? What, you don't know about snow? CGA boards, for a reason I can't remember, displayed "snow" in text mode whenever you wrote to the screen memory directly (the BIOS writes avoided snow, but were terribly slow). Since I'm typing this article on a CGA PC right now (I'm not kidding--my word processor runs in 384k! ;-), I'll describe what it looks like: Every time video memory is touched, small horizontal white lines one character cell wide appear and disappear all over the screen. It can get really annoying after a while, so many people waited for vertical retrace before writing to the screen. This was much slower, but reduced snow a great deal, and was still faster than using the BIOS to output text.

ph@nocom.se had the following extra information to offer:

"If my memory serves me right, the snow on the CGA was because when the CPU and the video card both tried to access the one-ported video memory, the CPU would lock out the video card's access until the read was completed."

CyberTaco clarified this a bit:

"I believe you were wondering what causes CGA snow (wow, it's been a long time!)? It goes like this:

"The ram in the standard IBM CGA card was what was called single-ported, meaning that it could not be written to and read from at the same time. If you were writing to the ram, it simply ignored read requests and you got a random result from it instead. Single-port ram was used both because at the time it was cheaper than dual-port ram (which can read and write at the same time, like we're all used to), and because IBM was (is?) staffed by a bunch of idiots. :-)

"End result: When scrolling the screen up a line, every single piece of character memory has to be written to the memory area corresponding to one line above where it was. While this is happening (sloooowly), every time the physical picture-generating hardware goes to read the ram to figure out what dot to put where on the monitor, it keeps getting random results over and over because the ram is being written to during the scroll. The end result? Random dots all over the screen instead of text, resembling static, or.... snow. :-)"

Low Resolution equals More Color

16 colors in graphics mode with a stock CGA board? Believe it or not, there were not one but two legitimate ways to get more than 4 colors on a CGA card: An undocumented "low res" mode (which I'll talk about later), and CGA's Composite Color mode. Both had drawbacks, unfortunately:

  • Since you only had 16K of video memory, your effective resolution went down to accommodate the extra colors--160x200 in Composite mode, and 160x100 in "lowres" mode.
  • Not all PCs had a CGA card with a composite color jack on the back. The few PC owners who did have one most likely didn't know how to turn it on if it needed special activation before the program ran.
  • Computer owners that had real monitors had little reason to keep a TV close to the computer to hook it up for the extra video mode provided by Composite Color.

But many game companies wanted every edge they could get over the competition, so many decided to use it. (Starflight from Electronic Arts was the first mainstream game I can remember (1986) that used it well.) As with any sort of compatibility back then, the drawback was speed--if all your graphics were in 16 colors, then you had a choice of two evils: You either had to convert the 16-color pictures down to 4 on the fly, which was slow, or you had to provide both sets of converted pictures on the disk, which took up too much room and cost more money. (Remember that back in the first half of the 1980s, disks were still fairly expensive--I remember the best price I could get on a 360K floppy in 1984 was a dollar a disk.)

Still, some companies saw the light and did it, and the result was colorful graphics on almost any system. Besides, you could use a lookup table to quickly plot the low-res sprites, because a word (two bytes) plotted 8 pixels in CGA or 4 pixels in EGA/Tandy (16 colors). (Plus, this gave Tandy/PCjr owners some extra speed because they had a native 160x200 mode, so they got more color without slowing down.) California Games from Epyx supported probably the most graphics options of any game in 1987 using this technique, including CGA, EGA, CGA "more-color" (see above copper trick), CGA Composite color, Hercules Mono, MCGA/VGA 16-color, and Tandy/PCjr. All the graphics and sprites (except for the font) were in 160x200. Mindscape also did the same thing with Bop'n'Wrestle, Uridium, and Bad Street Brawler (hi Lord Blix!).
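The lookup-table trick can be sketched like this: precompute, for every byte holding two packed 4-bit low-res pixels, the CGA byte that draws both pixels doubled horizontally. (The 16-to-4 color remap below is a placeholder; a real game would use a hand-tuned table, and the whole thing would have been assembler.)

```python
# Hypothetical 16->4 color reduction: keep the low two bits here.
REMAP_16_TO_4 = [c & 3 for c in range(16)]

def build_cga_table():
    """256-entry lookup: a byte of two packed 4-bit low-res pixels maps
    to one CGA byte of four 2-bit pixels (each fat pixel doubled)."""
    table = []
    for packed in range(256):
        left, right = packed >> 4, packed & 0x0F
        l2, r2 = REMAP_16_TO_4[left], REMAP_16_TO_4[right]
        # left pixel fills the two high bit-pairs, right pixel the low two
        table.append((l2 << 6) | (l2 << 4) | (r2 << 2) | r2)
    return table
```

With the table built, plotting a sprite row is just indexed copies instead of per-pixel color math, which is where the speed came from.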

Many companies used the 160x200 trick even if they didn't have 16 colors to store, simply because the sprites took up half the space. All the "Maxx Out" games from Epyx like Rad Warrior used this even though the graphics were mostly 4 colors.

Finally, some adventure game companies found a unique way to store 200 or more full-screen pictures on a single 360K disk: Vectors. (By "Vectors", I'm referring to a series of points that define the beginning and end of a line, outline of a polygon, etc.) Regardless of how large or complex the picture was, you could usually fit a decent-looking scene into about 2K, because all you were storing were polygon outlines/definitions. A blue circle with a radius of 100 pixels located at (100,130) could take up over 10K as a raw sprite/bitmap, but it only took up enough bytes to describe "circle, at (100,130), radius 100, color blue" when you stored it as its vector definition. The previous example could take up as little as 7 bytes, if you did it haphazardly:

Data                 Size
Draw Element         byte
X Coordinate         word
Y Coordinate         word
Radius               byte
Fill Color/Pattern   byte

Come to think of it, I'm sure the game programmers were much more frugal than that; this was when games had to fit on a single 360K disk. The previous structure would've probably been optimized even further:

Data                 Size                        Comments
Draw Element         first 3 bits of first byte  Holds 8 different types: Square, Circle, Line, Pixel, Fill Point, etc.
Fill Color/Pattern   last 5 bits of first byte   Allows for 32 different colors or patterns
X Coordinate         word                        Must be a word, since the maximum (320) wouldn't fit in a byte (255)
Y Coordinate         byte                        Could be a byte, since the maximum (200) fits into a byte (255)
Radius               byte

So, even though 7 of the 16 bits in the x-coordinate word are wasted and could be used for even more things, we're down to five bytes of information required to draw a large circle on the screen. You can pack this down even further, which I'm sure they did, but I included the above merely as an example of the thought process involved.
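The five-byte record above is easy to demonstrate in code. A sketch (the element-type code is an invented value, just for illustration):

```python
import struct

CIRCLE = 2  # hypothetical 3-bit element-type code

def pack_circle(x, y, radius, color):
    """Pack the five-byte circle record described above: a 3-bit element
    type and 5-bit color share the first byte, then a word X coordinate,
    a byte Y coordinate, and a byte radius."""
    first = (CIRCLE << 5) | (color & 0x1F)
    return struct.pack('<BHBB', first, x, y, radius)

# "Circle, at (100,130), radius 100, color 1" in five bytes:
record = pack_circle(100, 130, 100, 1)
```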

Another nice advantage to this system was that you could compose your pictures with 16 colors and let the drawing algorithm pick the closest color or dither pattern when it drew the vector objects on the screen at runtime. But, as with all computer programs, there was the classic speed vs. size tradeoff: Drawing the polygons took a lot of time, sometimes as much as six seconds for the picture to assemble, which looked sloppy. There were other drawbacks to this system as well; the artist was forced to use an uncommon vector-editing program, which usually had to be developed in-house, and the nature of the whole vector format procedure made it hard to produce pictures with a lot of fine detail. Still, Sierra used it in King's Quest I (1983), and the classic text/graphics adventure Transylvania (1983) from Polarware used it as well. (In fact, when Polarware was still called Penguin Software, they marketed a vector drawing package specifically built for making small graphics for games called The Graphics Magician.)

Graphics Forged From Text Mode

Ah, now the fun part: 16-color graphics on a standard CGA card. IBM actually announced this mode in January of 1982, when the second version of the IBM PC came out; it's mentioned in their technical documentation, but they evidently saw no need to provide any real documentation for it unless asked. I guess nobody asked. :-) By the time I finally figured out how to program the mode, it was 1990 and I had a VGA card, so I never had a chance to use it in anything. Some games did use it, though, including a good shareware shoot-'em-up called Round 42 and two commercial adventure games, one from Sir Tech (remember Wizardry?) and another from Macrocom called ICON (1984). Here's how you program the mode:

The 160x100 "graphics" mode was actually a text mode. The video registers were changed so that normal 80x25 color text mode squeezed four times as many rows of text characters onto the screen, producing an 80x100 text screen with the bottom 75% of each text character chopped off. Then, the screen was filled with ASCII 221, which is the left vertical bar. Each character on the screen represented a pair of horizontally adjacent pixels, set by adjusting the foreground and background color of that character. The "blink" option on the video board had to be turned off so that pixels didn't blink when the right-half pixels (which use the text background color) had a color value greater than 7. (Text characters normally blink if the background color is in the 8...15 range, but who uses blinking any more?)

Okay, cute trick. You could make it work in today's world by modifying it for EGA and VGA boards as well; on VGA boards, the text character height is normally 16, so it would be changed to 4. EGA boards, however, normally use a text height of 14 pixels. There is no way to change the text height to exactly one quarter of 14 pixels, so a text height of 3 would have to be used, which is slightly too small, but that's what you get for using EGA after 1990. :-)

Technical details: Blink is suppressed on CGA boards by setting bit 5 of port 3d8 hex to 0. On EGA and VGA cards, blink is suppressed by using subfunction 3h, function 10h of interrupt 10h and setting BL to 0. Trivia: The 160x100 mode uses 16000 bytes at b800:0000 hex to store 16000 4-bit pixels, which is twice the amount of memory that would be required in a normal graphics mode--so each even byte of video memory is wasted to store the ASCII character 221. Moving blocks of video memory around was both fast and annoying, since only the odd bytes (storing the text foreground and background colors) had to be modified.
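Putting the pieces together, a pixel plot in this mode looks something like the following sketch (a bytearray stands in for the 16000 bytes at B800:0000; the function name is mine):

```python
def plot_160x100(vram, x, y, color):
    """Set one pixel in the 160x100 text-trick mode.  Even bytes hold
    ASCII 221 (the left vertical bar), odd bytes hold the attribute.
    The left pixel of a pair is drawn by the character's foreground
    (low nibble), the right pixel by its background (high nibble)."""
    cell = (y * 80 + (x >> 1)) * 2   # 80 character cells per row
    vram[cell] = 221                 # the half-block glyph
    attr = vram[cell + 1]
    if x & 1:                        # right pixel -> background bits
        vram[cell + 1] = (attr & 0x0F) | (color << 4)
    else:                            # left pixel -> foreground bits
        vram[cell + 1] = (attr & 0xF0) | color

vram = bytearray(16000)
plot_160x100(vram, 0, 0, 15)   # white left pixel in the corner
plot_160x100(vram, 1, 0, 4)    # red right pixel beside it
```

Note how only the odd (attribute) bytes ever carry pixel data, exactly the annoyance described above.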

Well, that's enough about CGA's 16-color "graphics" mode. I was just so happy when I figured it out that I just had to spew somewhere. :-) And ICON? They added even more depth by using the entire ASCII character set--or, at least, the top four scanlines of the entire ASCII character set. (The bat's wings were the top four scanlines of ASCII #2, which was the smiley!) Some of the results were fantastic, but that's ANSI from hell that I don't think anyone would want to ever attempt again.

If You Want Something Done, You Have To Do It Yourself

Demosceners are familiar with speeding up graphics with cute math and hardware tricks. But what do you do when the machine is simply too slow? Speed up your hardware. It sounds silly, but there were actually several ways to make the machine run faster--via software--by modifying certain bits of hardware. For example, the CPU spent time refreshing early DRAM so that the RAM wouldn't lose its contents. This refresh rate was adjustable and usually done way too often, so you could usually lower it a bit and gain more speed for your programs. Speed increases of 5% to 10% were not uncommon; I got a 15% increase by lowering the DRAM refresh rate gradually until the computer locked up. :-) I then used the last setting that didn't lock up the computer. Handy thing to have in your autoexec.bat, that.

But probably the most common method that game programmers used to speed up the computer was to go for the guts and bypass the operating system in a big way--by creating self-booting programs. This solved several problems at the same time:

  • Disk access and loading times were much faster. If you wanted data and knew where you put it, you could seek directly to the track and load the entire track to somewhere in memory. If you were a real hotshot, you could trackload "manually" by not using the BIOS and instead programming the DMA controller yourself. Very quick.
  • You had more RAM. The operating system took up at least 32K of RAM, which your program could have used. (Hey, PCs came with only 128K of RAM until around 1985, so you really needed to save RAM.) So, make your entire program fit into 64K and write a little 446-byte bootloader and you gain that RAM back.
  • The machine was empty. There was no operating system to get in your way, so you could reprogram timers, write obscene self-modifying code, store data in unused portions of the BIOS (or vector table--ACK!), whatever you wanted to do. If the computer were a parking lot, it would be an empty parking lot--you could drive like a maniac for hours and not hit a thing.
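About that 446-byte bootloader figure: assuming it refers to the code area of a standard master boot record, the arithmetic is simply the 512-byte sector minus the partition table and boot signature:

```python
SECTOR = 512
PARTITION_TABLE = 4 * 16   # four 16-byte partition entries
SIGNATURE = 2              # the 0xAA55 boot signature word

# Bytes left over for the loader's code in an MBR-style boot sector
boot_code_bytes = SECTOR - PARTITION_TABLE - SIGNATURE
```

(A self-booting floppy without a partition table could use nearly the full sector, so 446 bytes was a conservative budget either way.)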

Of course, you had to be a damn fine coder to do this. Being merely proficient in assembler programming wasn't enough--remember all those friendly DOS services you use? Gone. So, you'd better be prepared to write your own mini-DOS when you needed one.

Sometimes self-booting programs were mandatory; it was the only way to have any decent form of copy-protection. Stuff like that was really hard to crack. In fact, I don't think anyone in today's world would have the stomach to attempt it. Not only does the game boot (hel-LO!), but it then proceeds to stomp all over memory, usually obliterating the debugger. Your only chance is to dump the boot sector and attempt to disassemble it manually. (While the game programmers were tough back then, so were the crackers, so it got done somehow.) The most common programs that come to mind are the early Accolade and Electronic Arts games; Pinball Construction Set and Music Construction Set (forever classics), for example. Of course, the classic Wizardry did this as well, although it took them two tries to get it right: The first version of Wizardry was not very friendly to non-perfect-100% IBM compatibles, so the more you played the game, the less successful the disk reads were. Odd... The second version played just fine, however. They even took the time to improve the graphics.

Don't Copy That Floppy

Let's sidetrack for a minute or so on copy-protection schemes. Although they had nothing to do with graphics or sound explicitly, I still found them absolutely fascinating, since they were also very hardware-intensive.

Copy-protection, for those too young to remember, was a method of doing something to the diskette the game came on that made writing a perfect duplicate impossible, which in turn prevented you from copying the disk and giving it to all your friends, robbing the software company of potential sales. A typical method of checking went something like this: When the game started, it checked to see if it was running on the original diskette by looking for a specific piece of data stored in an extra sector hidden in a specially formatted track. If it didn't find that data, it aborted. This way, you could only make the program run by running it off of its original diskette. (Nowadays, "copy-protection" is usually as simple as the program asking you to look up a word on a certain page in your software manual and type it in, or for CDROM games, checking to see if it is running on the CDROM device.)

Copy-protection used to be included on everything, from games to business software, simply because software was extremely expensive back then, and a couple hundred copies of a program could actually make or break a software company (no, I'm not kidding). People just didn't buy software all the time because a typical game (for example) was $50. That's normal in 1996, but very expensive for 1984. (Think about it: $50 in 1984 is like $95 today due to inflation. I don't know about you, but I'm not willing to pay $95 for a computer game.)

I wasn't any good at cracking back in 1984 (I barely knew general programming, let alone assembler), so I had to become good at figuring out the copy protection scheme instead if I wanted to duplicate the program. While many crackers learned DEBUG inside and out, I learned protection schemes and how to duplicate diskettes. Here are some of the more interesting methods that programmers used to prevent you from copying that floppy with DOS's standard DISKCOPY:

  • Simple trickery. Format a diskette as single-sided, but then store the secret copy-protection information on a track formatted on the second side. DISKCOPY would read the boot sector, determine that the disk was single-sided, and only bother copying the first side.
  • User stupidity. The program would simply try writing a dummy file to the disk. Since most commercial software came on write-protected disks, the write would fail, and the program would continue. But if you'd just made a copy, chances were high that you'd forget to write-protect the disk when you were finished, and the write would succeed, which then aborted the program.
  • Secret hardware information. Most disk drives could actually seek beyond track 40; usually to 41. Some software companies formatted that extra track (DISKCOPY didn't go that high) and stored secret info in it for the program to check.
  • Wacko disk formats. This is when you go slightly beyond the obvious, like formatting a normal 9-sector track with 10 or 11 sectors (or fewer, like 4 or 5), or by writing an incorrect track ID (track 20 says it's track 30, etc.), or something similar. DISKCOPY didn't know how to handle stuff outside the norm like this, so you usually needed a special program like CopyIIPC or CopyWrite to analyze the diskette thoroughly, then attempt to duplicate the format. This was about two-thirds successful; the other third you had to do yourself, usually (I personally used the Ultra Utilities). Electronic Arts had one of the best schemes in the early 80's; I couldn't figure it out until about 1985. They formatted track 15 with over 90 sectors! :-D
  • Weak Bits. Some bits on the disk were recorded as halfway between 1 and 0, which made the disk difficult to copy. This wasn't used too often because it was unreliable, even on the original disk!
  • Damage. The most expensive method of copy protection was also the most effective: Physically damage the disk. Using a laser, it was possible to burn a small hole in the disk surface, and then all the program had to do was check to see if there was a read error in that particular sector, and if so, continue running the program. If you turned the disk surface manually by grabbing the inside ring, you could actually see a tiny hole in the disk surface!

If it was so easy (relatively speaking) to figure out these formats, then why didn't everybody just write bitcopy or nibbler programs to analyze the diskette and make perfect copies of everything (except the laser hole, of course)? It wasn't quite that easy: While the IBM floppy controller could read all of these formats, it did not have the ability to write all of them. A third-party company (usually the diskette duplication facility itself) specially prepared the diskettes using custom floppy controllers. To this day I don't know if the read-all-but-not-write-all phenomenon was a decision made on purpose by IBM's engineers or just a hardware glitch that software companies took advantage of. (Probably a hardware glitch.) Either way, I eventually broke down in 1987 and bought a Copy ][ PC Option Board, which went between the floppy drive and controller, allowing me to write those special formats. Trivia: To this day, there is only one diskette I have never been able to duplicate, even with the help of my Option Board: A Cops Copylock ][ demo diskette that I sent away for (Cops was a third-party copy-protection library you could purchase to copy-protect your own programs). I never found any programs that actually used Cops as the copy protection scheme, which was fortunate, since I couldn't copy it. :-(

Your Computer Is Too Damn Loud

Ah, music. Certainly one of the most interesting things done with computers today in the demoscene; in fact, at least twenty times more MOD/S3M/IT/XM music is being put out per year than demos, and that figure is steadily increasing. Until Sami came out with Scream Tracker in 1990, there was no native digital/multi-channel composition program. Heck, until 1986, there wasn't even any sound hardware you could hook up to your PC. Sure, the PCjr and Tandy had their own 3-voice sound chip built in, but I didn't have a Tandy or PCjr.

That didn't stop me. :-) Or anyone else, for that matter. People did the best with what they had, with surprising results for the time and hardware. Ladies and Gentlemen, I present to you the evolution of composed/tracked personal computer music hardware and techniques, from the point of view of a fledgling demoscener (which means I'll conveniently ignore MIDI):

1981-1982
The PC speaker, driven by a chip that could only produce a simple tone at a fixed volume, was the only thing that kept us company. If it weren't for BASICA, we'd have lived in silence. BASICA had a PLAY statement that took real notes and octaves; you could bang out a melody relatively quickly, although it was loud and harsh. You could fake a chord by quickly alternating between the notes of the chord (an arpeggio), but this sounded artificial and bubbly. (If you didn't have a love for computer music, it would quickly drive you crazy.) Pianoman by Neil J. Rubenking was a music composition program that did this; you could compose each voice separately, and then combine them into an arpeggio. A gentler trick was to adjust the pitch up and down very finely, simulating vibrato. One voice, but at least it wasn't so harsh.
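Under BASICA's PLAY and SOUND statements sat simple divisor math: the speaker's tone generator was timer channel 2 of the 8253, clocked at 1.193182 MHz, and the chip output the clock divided by whatever count you loaded. A sketch (function name mine):

```python
PIT_CLOCK = 1193182  # 8253 timer input clock, in Hz

def speaker_divisor(freq_hz):
    """Count to load into timer channel 2 so the PC speaker's tone
    generator outputs (approximately) the requested frequency."""
    return PIT_CLOCK // freq_hz

# An arpeggio faked a chord by cycling through divisors like these
# every few milliseconds (C4, E4, G4, rounded to whole Hz):
c_major = [speaker_divisor(f) for f in (261, 329, 392)]
```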

1983
The PCjr is released, and Tandy follows suit a year later with the Tandy 1000, which was a clone of the PCjr. One of the enhancements in the PCjr was the addition of a 3-voice sound chip that gave multiple channels, noise generation, tone envelopes, and volume control to the built-in speaker. Now we had something to play with. The BASICA that came with the PCjr and Tandy supported a 3-voice PLAY statement, which, if you played your cards right, could produce some fairly nice sound. One thing I discovered was that the Tandy chip had a hidden strength in low chord layering.

1984
Music Construction Set, programmed by Will Harvey, came out for the PC in 1984 from Electronic Arts. It had a real staff, with treble and bass clefs, and had a neato "construction set" motif--you could drag'n'drop notes onto the staff before "drag'n'drop" was a common catch phrase. Best of all, not only did it support the native sound chip of the PCjr/Tandy, but it could play four voices through the normal built-in speaker! (Granted, it was difficult to discern between the voices, but it was possible to hear the overall chord you were going for.) You could even print out the staff on your printer, although it was one long staff down the side of the page, and not nicely formatted sheet music. :-)

1986
Mindscape publishes Bank Street Music Writer, the first program I ever bought that came with its own hardware if you didn't own a Tandy or PCjr. The "Mindscape Music Board" was a 6-voice sound card which turned out to be a sine or square wave generator with simple attack, decay, and sustain parameters. Not exactly FastTracker 2 envelopes, but it was a start. :-) Plus, it attempted to print out real sheet music, and you could follow your voices on-screen as they played. I went nuts with this board, sometimes spending hours arranging the tunes my school choir was practicing. Although it was very good at producing solid chords (it was a tone generator, right?), it never took off, because the price was a bit high ($110) and it did sound a bit... "plinky". (Come to think of it, Music Construction Set for the Apple supported a similar board called the Mockingboard, but that never took off either.)

I'm fairly certain that I saw the Covox Speech Thing around this time as well. The Speech Thing was a simple digital-to-analog converter that you could connect to your parallel port to hear digitized sound. It sold for about $70, even though the parts cost about $15--including the speaker. :-)
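Those $15 of parts were essentially a resistor-ladder DAC hanging off the printer port, so "playing sound" meant nothing more than writing one unsigned 8-bit sample to the LPT data register per tick. A minimal sketch, assuming the typical LPT1 data port of 0x378 (here we only generate the samples; real code would OUT them in a timed loop):

```python
import math

LPT_DATA_PORT = 0x378  # typical LPT1 data register; the DAC sits on these 8 pins

def sine_samples(freq_hz, rate_hz, n):
    """Unsigned 8-bit samples centered at 128, as a parallel-port DAC expects."""
    return [int(128 + 127 * math.sin(2 * math.pi * freq_hz * i / rate_hz))
            for i in range(n)]

# Playback on a Speech Thing is just a paced loop of port writes:
#   for s in sine_samples(440, 8000, 8000):
#       outb(LPT_DATA_PORT, s)   # one OUT per sample, timed to 8 kHz
samples = sine_samples(440, 8000, 8)
print(samples)
```

Pacing that loop was the hard part: with no DMA or interrupt support on the port, the CPU had to sit there feeding samples at a steady rate.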

1987
Adlib. :-) This famous board used a chip from Yamaha that produced Frequency-Modulated (FM) sound synthesis through 2 operators and a variety of parameters. You could utilize 9 melodic voices, or 6 melodic and 5 percussion. Armed with the odd Visual Composer, you could compose on a piano roll instead of a musical staff. It wasn't bad at all; in fact, it sounded pretty damn good. If programmed correctly, it could layer voices well, produce decent bass, and deliver fairly full sounds. I still believe that the Adlib was (and still is) underused by the majority of people who composed for it.
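To give a taste of what "programmed correctly" involved: the Adlib's Yamaha chip (the YM3812, a.k.a. OPL2) doesn't take a frequency in Hz. You hand it a 10-bit frequency number plus a 3-bit block (roughly an octave), derived from the chip's internal sample rate of about 49716 Hz. A sketch of that conversion, using the commonly documented register layout (treat the register numbers as an assumption if you're writing real code):

```python
OPL_RATE = 49716  # YM3812 internal sample rate: 14.31818 MHz / 288

def fnum_block(freq_hz):
    """Return the (fnum, block) pair the OPL2 needs for a pitch in Hz.

    freq = fnum * OPL_RATE / 2^(20 - block); pick the lowest block whose
    10-bit fnum still fits, for the best pitch resolution.
    """
    for block in range(8):
        fnum = round(freq_hz * (1 << (20 - block)) / OPL_RATE)
        if fnum < 1024:
            return fnum, block
    raise ValueError("pitch too high for the chip")

fnum, block = fnum_block(440.0)
# Keying the note on voice 0 would then be two register writes (index port
# 0x388, data port 0x389): reg 0xA0 = fnum low byte, reg 0xB0 = key-on
# bit | block | fnum high bits.
print(fnum, block)
```

All the tone color lives in other registers (operator multipliers, ADSR rates, waveform select); this is only the pitch half of the job.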

While I didn't purchase my Adlib until 1989, the board was actually selling in 1987, and games started supporting it in 1988. Taito's arcade conversions done by Banana Development supported it passably, but it wasn't until 1990 that I heard simply beautiful music through it from a game called Continuum from Infogrames. The game consisted of jumping from platform to platform to reach a certain object, but the music was so good that I booted it up just for the music. (It also supported the Tandy sound chip, but since the music was composed for the Adlib, it was nowhere near the same quality.)

The great former C64 demogroup Vibrants also did some excellent music composed specifically for the Adlib board, but this wasn't until much later, in 1993, when they composed music for a few games. Their composition program, Edlib, is still freely available. Ever hear techno on an Adlib? :-) (Their true strength was jazz, which is what they usually composed.)

IBM Music Feature Card. IBM released this board in 1987 in an effort to draw MIDI musicians and game programmers over to the IBM. It cost about $250, and played 8 FM voices. The quality of the FM was much better than the Adlib, presumably since it used a 4-operator FM chip (also from Yamaha) instead of Adlib's 2-operator chip, and had over 100 built-in instrument parameters. It also had a MIDI port, of course. Trivia: You could put two of these boards in your PC at the same time to get a total of 16 simultaneous voices.

1988
Creative Music Systems (the name they had before they changed it to Creative Labs) came out with the Game Blaster around this time, and it offered 12 channels, with each channel producing either a single sine wave of a given frequency and magnitude (in stereo), or noise. The sound quality was obviously worse than the Adlib--the board simply couldn't do much of anything. You can still purchase CMS chips to put inside your Sound Blaster 1.x and 2.0, but it's really not worth it. The only game I know of that supported the CMS Game Blaster with decent music was Times of Lore by Origin.

Digitized sound! Around this time, game companies had finally started to use digitized sound for music. (It had been used on the PC for sound effects as early as 1983, in Castle Wolfenstein/Beyond Castle Wolfenstein, and in 1987, in the PC version of Dark Castle.) While I had speculated that you could record bits of music and then rearrange them cleverly, a French game company called Loriciels beat me to it, with the excellent games Mach 3 (1987) and Space Racer (1988) (they also did a then-popular Pong/Breakout/Arkanoid clone called PopCorn). These games had a really cool (for a PC at the time) musical intro at the beginning, which was pieced together from small sound snippets that were arranged on the fly to form a longer piece of music. (You can think of this as a .MOD file with only one channel and all instruments/samples played at C#3.) And it played through the PC speaker! Coming from a PC that had a simple tone generator as a sound device, this just blew me away. Crazy Cars by Titus also had a snippet of digitized sound at the beginning, but this was just a 64K sample that looped once. The same went for Wizball by Mindscape. (Wizball was a fabulous game, IMHO.) Finally, games like Bop'n'Wrestle used it for the counts and body-slamming noises.
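The Loriciels trick--assembling a longer piece of music on the fly from a handful of short digitized snippets--comes down to splicing sample buffers in a pattern order, exactly like a one-channel .MOD. A sketch of the idea (the function and snippet names are hypothetical, not Loriciels' actual engine):

```python
def render_song(snippets, pattern):
    """Splice short digitized snippets into one long sample stream.

    snippets: dict of name -> list of 8-bit samples (the stored recordings)
    pattern:  ordered snippet names to play, like a 1-channel pattern list
    """
    out = []
    for name in pattern:
        out.extend(snippets[name])   # play each snippet back-to-back
    return out

# Toy 4-sample "recordings"; a real intro would store a second or two each.
snips = {"riff": [1, 2, 3, 4], "fill": [9, 8, 7, 6]}
song = render_song(snips, ["riff", "riff", "fill", "riff"])
print(len(song))
```

The payoff is storage: by repeating and reordering snippets, a minute of music could be built from only a few seconds of stored samples--crucial when the whole game shipped on a 360K floppy.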

1989
The Sound Blaster hits the scene, and game companies start supporting it. It's essentially an Adlib clone, but it has the ability to record and play back digitized sound, allowing for speech and decent sound effects. (This information was essentially provided for people who don't know what a Sound Blaster is. I probably shouldn't have even written this paragraph, since the (in)famous Sound Blaster doesn't need mentioning, but I've done it already, so... whatever. :-)

1990
True Mixing. While many remember TrakBlaster being the first program to play MOD files on the PC, a couple of people were mixing before then, most notably PSI / Future Crew. He had created and marketed Scream Tracker in 1990 as shareware, and it could mix and play up to four voices in real-time on an 8MHz (or faster) computer. The early versions of Scream Tracker supported a mode of operation similar to the old SoundTracker on Amiga--you could save the song data and instruments separately. This allowed you to compose over 20 songs and fit them on the same disk, because you could reuse the same set of instruments with each song. EGA Megademo / SpacePigs did this--they had four different songs that used the same instruments, so the whole thing fit onto a single 360K disk.
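The core of software mixing is simple to state, even if squeezing it into an 8 MHz inner loop wasn't: for every output sample, sum the current sample of each voice and scale the result back into 8-bit range. A minimal sketch of that principle (not Scream Tracker's actual code, which also had to handle per-voice pitch stepping and volume):

```python
def mix_frame(voices):
    """Mix one output sample from several signed 8-bit voice samples.

    Summing four voices can reach +/-512, so divide by the voice count
    to stay within signed 8-bit range -- the per-sample arithmetic a 1990
    player had to perform thousands of times per second.
    """
    return sum(voices) // len(voices)

def mix(channels):
    """channels: equal-length lists of signed 8-bit samples, one per voice."""
    return [mix_frame(frame) for frame in zip(*channels)]

out = mix([[100, -100], [100, -100], [100, -100], [100, -100]])
print(out)
```

In practice, players of the era replaced the division with shifts or precomputed volume lookup tables, since a divide per sample was far too slow for the CPUs of the day.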
A Brief Early History of Demos

This has been covered many times before, so I won't rehash The Beginning Of Demos excessively; however, a few tangible examples would probably serve to help you picture the transition.

Pirating games led to the birth of the demo scene. It's time everyone who is in deep denial about this fact comes out and accepts it. You can see this in the early loaders for pirated games; as early as 1988 you can see some demo-like effects in small loaders. The loader for Bad Street Brawler is a perfect example of this: The screen starts as static; then, like a pirate broadcast breaking into a TV channel, the screen flickers with the title graphic, until finally the title graphic is fully displayed. The entire program was only 128 bytes, and was tacked onto the front of a CGA screen dump.

This "loader mentality" took a while to grow, but it eventually did. Many early intros were simply one effect, but to be impressive, it had to be an effect never before seen on the PC. "Never before seen" meant one of two things, actually:

  • The effect was an Amiga or C64 effect that was translated to the PC (common), or
  • The effect was an entirely new effect (rare).
Many of the early Brain Slayer intros were good examples of the Amiga-effect-to-PC trend, and until .MOD playing on the PC became popular, most early intros had no sound at all.
Rose-colored Glasses

Well, our look back has ended, and not a moment too soon, since I've taken up valuable time that you could've used to write the newest demos. All I ask is that the next time you take a look at the demos today that contain incredibly complex 3D objects, particle systems, and multi-channel digital music, or the next time you compose music with 64 digital channels in Inertia Tracker, you think about what it used to be like in the old days. Who knows... Maybe some of you will be as innovative as we old-timers needed to be in dealing with such limited, slow hardware.

Then again, the driving force behind demos is that the hardware is only as limited as you think it is. ;-)

Epilogue

It just wouldn't be fair for me to end this article without a list of old demos that you should check out if you want to see the best of the best back in the old days (from 1990 to 1992). Keep in mind that these demos run at the full framerate on a 386 running at 16 MHz:

Demo              Group        Video  Sound Support
Chronologia       Cascada      VGA    Sound Blaster, Internal speaker, LPT DAC
Dragnet           DCE          VGA    Sound Blaster
Putre Faction     Skull        VGA    Sound Blaster
Unreal vers. 1.1  Future Crew  VGA    Sound Blaster, Gravis Ultrasound, LPT DAC
Vectdemo          Ultraforce   VGA    Sound Blaster
Coldcut           Ultraforce   EGA    Sound Blaster
EGA Megademo      SpacePigs    EGA    Internal speaker, LPT DAC

And, as always, if you're interested about demos in general, please feel free to check out PC Demos Explained or download the PC Demos FAQ list.


Written by Jim Leonard. Completed September 25th, 1996. Special thanks to Geoffrey Silverton for some low-level "lowres" information.
This file was last modified on Monday, 20-Mar-2000 21:25:26 CST.

