Monday, November 24, 2008

Bill Budge and his Pinball Construction Set

I've never played with it myself, but Pinball Construction Set, by Bill Budge, was big with my Apple-owning friends back in 1983. The concept was revolutionary - a software package that not only let you play pinball on your computer screen (something Budge pioneered with Raster Blaster in 1981) but actually let you create your own virtual pinball tables and share them with others - even those who didn't own the construction set! An entire genre of construction sets followed, and Budge became one of the computer industry's early "rock stars." The application preceded - even if it didn't directly inspire - games like SimCity, first developed in 1985 on the Commodore 64 and finally released in 1989 for all the platforms of the day.

PCS used the joystick and keyboard to let the user drag and drop playfield elements and even tweak graphics pixel by pixel. It was inspired by the ultra-influential work at Xerox PARC, according to Budge in various interviews. He has also revealed that he's a "terrible pinball player" - but he is still proud of PCS, because he feels it's a solid piece of programming. Certainly he took the Apple II far beyond what anyone at the time believed it could do.

A brief chronology of home computer pinball simulations:

Raster Blaster, Bill Budge, 1981 (based on the pinball table Firepower) - the first pinball simulation for computers.
Midnight Magic, David Snider, 1982 (based on Black Knight)
Night Mission, subLOGIC, 1982

PCS in the GameSpy Hall of Fame (many screenshots)
Episode of The Computer Chronicles featuring Budge (13 minutes in)
More info at pcpinball.com

Tuesday, November 11, 2008

The First Paint Program

The program known as "Paint," in Windows 1.0 and Windows 6 (aka Vista).

"Photoshopping" became a household verb some time ago. But much simpler software did much simpler things on a smaller and more primitive scale. What was the very first program to offer a graphical user interface for a would be artist to create graphics? It wasn't Microsoft Paint, which has been on almost every version of Windows since the beginning. It wasn't MacPaint, which was heavily vaunted in a 1983 magazine advertisement showing how it could be used to create illustrations of sneakers. According to the information I've found, it was SuperPaint, conceived by Richard Shoup in late 1972 at the Xerox PARC, the same facility that brought us windowed operating systems, mouse driven interfaces, and other technologies we take for background today. And Superpaint, like Photoshop, could start with a digitized image (gleaned in this case from video input) and work from there.


Photon Paint (on the Amiga, 1987) and GEM Paint, circa 1988 (GEM was a Mac-like graphical environment for IBM PCs).
Paint programs may seem like a foregone conclusion (I programmed a simple one myself in BASIC in 1983), but some pretty impressive artwork can be created with them. The process is sometimes referred to as "pixeling". Pictured are screenshots of several of these programs; a future entry will illustrate the artworks themselves.

The famous "Geisha woodcut" MacPaint drawing, and the Overlord* himself in 1984 with MacPaint front and center (according to folklore, he isn't wearing any pants!)

*That's Steve Jobs, to you non-Mac owners.

Monday, November 10, 2008

November 10, 1983 - Windows is late for the first time

Thanks go out to Wired for this story about a premature prediction of 90% market saturation, and a walk down memory lane with an eye-opening visual retrospective of Windowses Thru the Years.

See which ones you remember, and how you feel as you remember them! (Windows ME, anyone?)

There's also good news for those who like Windows XP and dread switching to Vista: XP will be supported (though not sold) through at least 2014 - an unprecedented 13-year span since its inception in 2001. This perhaps makes it the most successful version ever, surpassing even Windows 95.

Tuesday, September 9, 2008

A Brief History of Computer Music

Those of us who were little kids in the 1970s and were interested in technology were impressed by gizmos that are archaic today. I remember my first wristwatch, an all-black model that displayed the time in a red digital readout only when a button was pressed. My second watch was a quantum leap forward in technology - it played SEVEN songs in glorious monophonic beeps! I could even add tremolo by pressing it rhythmically against my wrist!

The first computers were about as audiophonically advanced as my watch. Steve Wozniak designed the Apple II in his usual manner, elegantly and uber-minimalistically, using as few chips as possible. It didn't even have a tone generator - it just made sounds at various pitches depending on how fast software clicked the speaker.

The IBM PC had a tone generator, but it could only make one beep at a time - and it wasn't a particularly mellifluous beep. Still, one could easily access the tone generator from BASIC: the PLAY command let you specify the notes to be played and their durations (even flats and sharps). Neil J. Rubenking released a shareware program called PIANOMAN that approximated harmonies by slicing and dicing the notes and weaving them together as arpeggios.
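
If you're curious what that "slicing and dicing" amounts to, here's a minimal sketch of the idea - not PIANOMAN's actual code - in modern Python, using the standard winsound module (Windows only), which, like the old PC speaker, can only sound one pitch at a time:

    # Fake a C major chord on a one-voice beeper by cycling its notes
    # as a fast arpeggio. (Illustration only; PIANOMAN worked differently
    # under the hood, but the principle is the same.)
    import winsound

    C, E, G = 262, 330, 392   # approximate note frequencies in Hz

    def arpeggiated_chord(notes, total_ms=1000, slice_ms=40):
        """Slice the chord into short single notes and weave them together."""
        elapsed = 0
        while elapsed < total_ms:
            for freq in notes:
                winsound.Beep(freq, slice_ms)   # blocks until the slice ends
                elapsed += slice_ms
                if elapsed >= total_ms:
                    return

    arpeggiated_chord([C, E, G])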

The early Ataris were known for some fancy music. Here's one game that got raves for its soundtrack: The Tail of Beta Lyrae.

The Commodore 64, with its SID chip, spawned a musician's "Scene" that is still active today. Check out SLAY radio for a 24/7 "chiptunes" internet radio station.

The Amiga scene was big as well, as the Amiga had true 4-voice polyphony and digital sound samples. The one quirk was that there was no "panning" of the sound when all four voices were in use... there was total separation between right and left, so, for instance, the drums and bass would be completely in the left speaker and the guitar and vocal samples would be completely in the right. A bit disconcerting in headphones, but still impressive hooked up to a good stereo system. Even "homebrew" downloaded games could impress, with their nifty sampled explosions, voices, etc. Karsten Obarski invented the first "Tracker" program in 1986, and revolutionized computer-based music composition. But the first Amiga song to really get noticed was John Molloy's title music for the adventure game "The Pawn":

When the game was released, the Amiga was the only home computer which had hardware support for digitized samples. However, there was probably no other Amiga game released in 1986 that utilized the capabilities of Amiga's Paula sound chip like The Pawn did. This means that The Pawn was a pioneer release in the field of digitized computer game music. The peaceful title music was composed by John Molloy and it features guitar and flute sounds, among others. [wikipedia]


Here are some tunes fondly remembered from the old days; one PC and five Amiga songs. The first three Amiga songs are from games of the time. All the Amiga tracks have been converted to MP3 for your convenience by a chap called Stone Oakvalley, but can be found on the net in much smaller file sizes if you download them in their "module" form.

- CIRCUS.EXE, composed with "Pianoman" circa 1983 (will make your PC speaker beep in DOS.)
- The Pawn, by John Molloy, 1986.
- Crystal Hammer, by Karsten Obarski, 1987.
- Battle Squadron hi score tune, by Ron Klaren.
- Sleepwalk, by Karsten Obarski.
- Bridge to the Universe, by Bjorn Lynne (aka Dr. Awesome).

...and a summary of each machine's capabilities, courtesy of Wikipedia (accessed 9/9/2008):

1977: Apple II
Rather than having a dedicated sound-synthesis chip, the Apple II had a toggle circuit that could only emit a click through a built-in speaker or a line out jack; all other sounds (including two, three and, eventually, four-voice music and playback of audio samples and speech synthesis) were generated entirely by clever software that clicked the speaker at just the right times. Not for nearly a decade would an Apple II be released with a dedicated sound chip.
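
As an aside, the arithmetic behind "clicking the speaker at just the right times" is simple: a square wave of frequency f means toggling the speaker every 1/(2f) seconds. A rough back-of-the-envelope sketch (my own illustration, not Apple's firmware):

    # How many ~1 MHz CPU cycles a software loop must burn between
    # speaker toggles to approximate a given pitch on an Apple II.
    APPLE_II_CLOCK_HZ = 1_023_000   # the 6502 ran at roughly 1.023 MHz

    def cycles_between_toggles(pitch_hz):
        # One full wave = two toggles (out and back in), so toggle at 2 * pitch.
        return round(APPLE_II_CLOCK_HZ / (2 * pitch_hz))

    for note, hz in [("A4", 440), ("C5", 523), ("A5", 880)]:
        print(f"{note} ({hz} Hz): toggle every {cycles_between_toggles(hz)} cycles")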

1979: Atari 8-bit (Models "400" and "800")
The third custom support chip, named POKEY, is responsible for reading the keyboard, generating sound and serial communications (in conjunction with the PIA). It also provides timers, a random number generator (for generating acoustic noise as well as random numbers), and maskable interrupts. POKEY has four semi-independent audio channels, each with its own frequency, noise and volume control. Each 8-bit channel has its own audio control register which selected the noise content and volume. For higher sound resolution (quality), two of the audio channels can be combined for more accurate sound (16-bit). The name POKEY comes from the words "POtentiometer" and "KEYboard", which are two of the I/O devices that POKEY interfaces with (the potentiometer is the mechanism used by the paddle). This chip is actually used in several Atari arcade machines of the 80s, including Missile Command and Asteroids Deluxe, among others.[10]

1981: IBM PC
The PC speaker is best characterized by its inability to play more than one tone at once, the waveform being generated by the Programmable Interval Timer. Because of this, it was often nicknamed a PC beeper or PC squeaker, especially when sound cards became widely available. In spite of its limited nature, the PC speaker was often used in very innovative ways to create the impression of polyphonic music or sound effects within computer games of its era, such as the LucasArts series of adventure games from the mid-1990s, using swift arpeggios.

1982: Commodore 64
The sound chip, SID, with its three channels - each with its own ADSR envelope generator, several different waveforms, ring modulation and filter capabilities - was very advanced for its time. It was designed by Bob Yannes, who would later co-found synthesizer company Ensoniq. Yannes criticized other contemporary computer sound chips as "primitive, obviously . . . designed by people who knew nothing about music." Often the game music became a hit of its own among C64 users. Well-known composers and programmers of game music on the C64 are Rob Hubbard, David Whittaker, Chris Hülsbeck, Ben Daglish, Martin Galway and David Dunn, among many others. Due to the chip's limitation to three channels, chords are typically played as arpeggios, coining the C64's characteristic lively sound.

1985: Commodore Amiga
The Amiga's sound chip, named Paula, supports four sound channels (two for the left speaker and two for the right) with 8-bit resolution for each channel and a 6-bit volume control per channel. The analog output is connected to a low-pass filter, which filters out high-frequency aliases when the Amiga is using a lower sampling rate (see Nyquist limit). The brightness of the Amiga's power LED is used to indicate the status of the low-pass filter: the filter is active when the LED is at normal brightness, and deactivated when dimmed (or off on older A500 Amigas). On the Amiga 1000, the power LED had no relation to the filter's status; a wire needed to be manually soldered between pins on the sound chip to disable the filter. Paula can read directly from the system's RAM, using direct memory access (DMA), making sound playback without CPU intervention possible.

Although the hardware is limited to four separate sound channels, software such as OctaMED uses software mixing to allow eight or more virtual channels, and it is even possible to combine two hardware channels into a single 14-bit-resolution channel by setting their volumes so that one source channel contributes the most significant bits and the other the least significant ones.

The quality of the Amiga's sound output, and the fact that the hardware is ubiquitous and easily addressed by software, were standout features of Amiga hardware unavailable on PC platforms for years.
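
That 14-bit trick mentioned above is easier to see in code than in prose. Here's a simplified sketch of the arithmetic (my own illustration, not actual Amiga playback code): pair one channel at the maximum volume of 64 with another at volume 1, so their outputs effectively sum to hi + lo/64.

    def split_14bit(sample):
        """Split a value in the range 0..16383 into (hi_byte, lo_byte)."""
        hi = sample >> 6      # top 8 bits -> channel played at volume 64
        lo = sample & 0x3F    # bottom 6 bits -> channel played at volume 1
        return hi, lo

    def recombine(hi, lo):
        # What the two channels add up to, scaled by the maximum volume (64).
        return hi * 64 + lo

    for v in (0, 1000, 8191, 16383):
        hi, lo = split_14bit(v)
        assert recombine(hi, lo) == v
        print(f"{v:5d} -> hi={hi:3d} (vol 64), lo={lo:2d} (vol 1)")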

Links of note:

Exotica, an Amiga music wiki.

Friday, August 22, 2008

Meet Jim Maxey

Back in the early days of computing, almost all work was done in "text" mode - screens of text characters, with maybe some line graphics or other ASCII art thrown in. Apps like games and pie charts used "graphics mode," which was exciting and exotic. There were only a few colors at first - seven kind-of-blurry ones on the Apple II, four garish, tasteless ones on the IBM PC, then just black and white when the Mac came out. In the dark ages we actually moved at first towards fewer colors, not more!

In 1983 a fellow named James Maxey started up a BBS called Event Horizons. At first he themed his board, and the graphic images it offered for download, around astronomy, but entrepreneur that he was, he soon realized he could get more callers (virtually all computer enthusiasts were male at this point) by offering digitized photographs of women. He was soon raking in over $3 million a year - his was the most lucrative BBS of all time. BBSDays.com has a bio and list of his accomplishments.

This success was not lost on a certain men's magazine, which noticed some of its own images, scanned into GIF files, being offered up on Mr. Maxey's system. Maxey claimed he could not be held responsible for what his users uploaded to his computer - he received massive amounts of files each day - but the fact that he tended to "sign" his images as "MaxiPics" and put his name and BBS phone number on them rather put the kibosh on this defense. Maxey eventually settled with Playboy by simply writing a check for half a million dollars.

A similar situation occurred a few years later with a system known as "Rusty n Edie's." Having survived an FBI probe aimed at stopping piracy of commercial software offered online for free, Rusty's board apparently did not withstand its second challenge, this time from Playboy. It disappeared, Playboy started its own web enterprise, and the Web became (among other things, thankfully) a sort of global peepshow.

Below you can view some G-rated MaxiPics from the mid-1980s.

Tuesday, August 19, 2008

Demise of the Disc

Physical music formats have always seemed ultimately disposable. Buying a record, or 8-track, or cassette, you would too often find the rest of the songs were much worse than the one you'd heard on the radio, and probably only listen to your purchase a few times. Even if you did like it and give it repeat plays, you'd be required to purchase it again in the next format to come along, and let's face it, these things spend so much more time on the shelf than spinning in a player, it's like we never listen to most of them at all.

In today's greener world, people like the switch to digital. Digital music and other media don't need to be delivered in an exhaust-belching, gas-gulping truck, they don't contain any petroleum-based plastics, and they are as light and space-saving as the devices you put them on. As long as you don't accidentally delete your collection, you're good (but the industry will take a dim view of your backups, most likely - remember, these are the guys who fought against the VCR, media lending, and even the legality of purchasing used CDs.) The digital solution that will please the common music lover, golden-eared audio snobs, musicians, and the media cartels has yet to be found. But on their 26th birthday, CDs' days are numbered, and I won't miss them - I already own more than enough.

Wired: Happy birthday, Compact Disc. Now go away (also has interesting comments submitted by users, at bottom.)

Friday, August 15, 2008

Annual versity

A couple of anniversaries are happening this year. (Well that's silly. Of course all anniversaries happen every year. But still.) A few days ago CNet published a brief piece on the advent of the IBM PC. It's one of those "Don't you feel old now?" gags. But I don't feel old when I think about getting my PC... of course I feel nostalgic. I still remember the feel of the thick, slightly rough metal, the super-clicky keyboard, the way the two floppy drives popped open, the sweet tobacco smell of the room (technically it was my father's computer.) For days after it arrived I would once in a while just go look at it. There was so much I imagined I could do with it... and there was much that I did.

The IBM PC is 27 this year.

CNet: Do you remember where you were when this happened?

PS: Computer bulletin boards are 30.

PPS: Ralph Macchio turns 47 this year. Ok, now I feel old.

Friday, August 8, 2008

Goodbye old friend... the twilight of the music cassette

The New York Times has a good article on current sales of cassettes and their not-so-bright future. It mentions one of the classic uses for cassettes - making a mix tape for your girlfriend (who hasn't done that! Though these days I guess it's an mp3 playlist). Apparently, CDs will be next down the tubes. A good friend of mine who's been a live audio trader for years laments the stampede to MP3s. He fears "lossless" formats - the uncompressed files on CDs or carefully compressed digital files that use algorithms that don't throw away part of the audio information - will become a thing of the past. I tend to doubt that, although "Digital Rights Management" is a concern on both sides of its debate.

Mix tapes will always have a place in my heart. They were the first way we could customize and listen to our music the way we wanted.... and contrary to the industry's cries, home taping never did kill the music business....

See also the mix tape that changed my life.

Saturday, July 26, 2008

Three Viddys

- An interesting stop-motion animated cartoon featuring some of the classic games. The sounds are the same, but various real-world objects stand in for the on-screen ones. What strikes me about this is how great the sound was on those ancient games. Not just cold beeps, but sounds with real character.

- An Asian kid plays a "hit the buttons at exactly the right time to play a song" game from a few years ago. Sort of Dance Dance Revolution, but with (a whole lot of!) manual dexterity instead of foot dexterity.

- Soviet video game artifacts from the Cold War era.

Friday, July 25, 2008

Shall we play a game?

In 1983, aged fourteen, I'd owned a computer for less than a year but was already completely smitten with using it to dial into whatever free online systems I could find. A neighbor's father had a couple of Bulletin Board numbers and that was where it started - the first system my friend David and I connected to was called "PBC", or Play by Computer. This was a nascent game BBS system put together by a computer enthusiast in the DC area named Porter Venn. PBC never took off, but watching those white-on-black text characters crawl across the monitor - at 300 baud they appeared in about the time it took to read them - was a unique sort of thrill, especially since one kind of information you could get on a bulletin board was phone numbers of other bulletin boards. Our online life had begun.

The 1983 film WarGames - celebrating its 25th anniversary this year - is the subject of a well-done retrospective in Wired, and it was a perfect fit for David and me as we watched it with our big buckets of popcorn back then. But then, wouldn't anyone identify with the idea of seeking out interesting connections - finding out little-known or exclusive information, maybe a new or unreleased game? Not to mention logging in remotely to the school computer to change a grade or two. One of the film's most accurate observations: security was as lax then as now... the protagonist (also named David) simply noted the password written down for reference as he sat near a secretary's desk waiting to be disciplined.

The retrospective is worth a read if you're interested in one of the first, and perhaps still the best, cinematic interpretations of the early online experience (with, of course, a healthy dose of Cold War paranoia and speculation on artificial intelligence. If a computer's whole world is a wargame, don't be surprised if it doesn't know the difference between a game and reality...)

Tuesday, July 15, 2008

Online Images Old and New

In December 1984, a small and innovative company called Thunderware released a product called Thunderscan for the Macintosh. Thunderscan was a clever way of making a dot matrix printer work in reverse - rather than outputting dots on paper, the printer would scan an existing document left to right, slowly but surely. The user would install the sensor in place of the print ribbon cartridge, put the sheet to be scanned in the printer, and wait a long time - and after the printer had scanned the whole page line by line, the image was in the computer. In order to imagine how this worked you need to remember dot matrix printers, those slow and noisy beasts we all used to use before inkjet printers were dirt cheap (to the point they started to be given away free with the purchase of a computer in the mid-90s).

I thought some might find it interesting to compare one of the old Mac Thunderscans to a random image from the web today. Above, you see a picture of Christie Brinkley, probably from a magazine, that was "Thunderscanned" (fed thru a printer and scanned into a Mac) in September 1985. These were common trades on the online dial-up bulletin boards. Below is a picture of cute Zooey Deschanel taken from a Web page discussing M. Night Shyamalan's latest disaster (of a) movie, The Happening. Look closely at these two image files and compare and contrast - we've gone from one color (black pixels on white, mathematically arranged using a kind of "visual noise" called dithering to create the illusion of shades of grey) to 70,308 colors in the Deschanel image, according to my image software. (Modern image formats can actually display millions of colors - more than the human eye can perceive.) The detail on the more recent picture is amazing as well - zoom in and you can even see the peach fuzz on her chin!
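
For anyone who hasn't seen dithering up close, here's a toy sketch of the idea in Python (my own illustration, not Thunderscan's actual algorithm): pure black-and-white pixels, arranged by an ordered "Bayer" threshold pattern so that their local density fakes shades of grey.

    BAYER_4x4 = [
        [ 0,  8,  2, 10],
        [12,  4, 14,  6],
        [ 3, 11,  1,  9],
        [15,  7, 13,  5],
    ]

    def dither(grey):
        """grey: 2-D list of values 0..255. Returns 0 (black) / 1 (white) pixels."""
        out = []
        for y, row in enumerate(grey):
            out_row = []
            for x, value in enumerate(row):
                threshold = (BAYER_4x4[y % 4][x % 4] + 0.5) * 16   # spread over 0..255
                out_row.append(1 if value > threshold else 0)
            out.append(out_row)
        return out

    # A horizontal ramp from black to white, rendered with '#' and ' ':
    ramp = [[int(x * 255 / 31) for x in range(32)] for _ in range(8)]
    for row in dither(ramp):
        print("".join(" " if px else "#" for px in row))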


I wonder what online media will look like in another 20 years or so?

Monday, July 14, 2008

New for 89 (hundred bucks)

This one's made the rounds before, but I am including it here as an example of how the old days aren't always the good old days. Note that this includes the CPU and keyboard only. It looks pricey even for the time... I got complete cutting-edge systems in '83 and '88 that cost less than half of this. What makes this system so good, allegedly? The VGA graphics? (Which were a big leap up from 16-color EGA, but still.)

For comparison, here's a complete 2008 system, WITH mouse, for $179. Of course, all the specs are hundreds to thousands of times more advanced, even in this basic setup.

The lower cost aspect is the up side to the appliance-ization of personal computers. Because everyone essentially "needs" to have one, they have lost their mystique... yet the demand has made them affordable as well as ubiquitous.

Tuesday, July 8, 2008

Sanctum-onious Adventure


"Encounter the forces of black magic as you roam around an old 18th century monastery. See all the evil locations in this spooky adventure in full hi-res detail. If you like suspense, you'll love searching out and destroying the evil in this classic tale."
In the mid-1980s, a small company called Mark Data Products released six parser-based graphic adventure games similar to early Polarware and Scott Adams titles. "The Black Sanctum" in particular was written as a text-only adventure in 1983, released with graphics for the TRS-80 Color Computer the next year, then ported to the IBM PC, where it did not sell nearly as well. While the graphics are good for the time, the parser is quite primitive - only simple verb+noun constructs such as GET HAT are understood, and some verbs that are common in IF games, such as SEARCH, are not always recognized. Fortunately, the games make up for the primitive parser with well-written plots, fun characters, and some clever puzzles.
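
To give a flavor of what "verb+noun only" means in practice, here is a toy two-word parser in modern Python - my own sketch, not Mark Data's code:

    KNOWN_VERBS = {"get", "drop", "look", "open"}
    KNOWN_NOUNS = {"hat", "door", "candle", "lamp"}

    def parse(command):
        words = command.lower().split()
        if not words or words[0] not in KNOWN_VERBS:
            return "I don't know how to do that."
        if len(words) == 1:
            return f"You {words[0]}."
        if len(words) == 2 and words[1] in KNOWN_NOUNS:
            return f"You {words[0]} the {words[1]}."
        return "Please use simple VERB NOUN commands."

    print(parse("GET HAT"))        # understood
    print(parse("SEARCH DESK"))    # rejected, much like in the games of the era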

I first encountered The Black Sanctum via a local Bulletin Board system, and as sometimes happened in those days, I downloaded it and played it never knowing it was a commercial release. Though I never bothered to finish the game, it occupies a place in my memory to this day. The authors have since given permission to use the game and source code freely, so now you can download it without guilt.


Many thanks are due The Home of the Underdogs.

See also, an interview with one of the game designers.

Monday, July 7, 2008

Huge Euge

"The only legitimate use of a computer is to play games."
- Eugene Jarvis (creator of Defender)


In videogame companies in the early 1980s, a single programmer would often create an entire game - the graphic design, the sound, the game concept and of course the program code to make it all work together. Not only that, but companies often rose or fell on the strength (or weakness) of a single game. In spite of all this, programmers almost always got a raw deal from the companies. There were only a few star programmers, but even they got small salaries and little to no royalties from their games - simply because it was a new industry and nothing had been negotiated yet.

One such star programmer is Eugene Jarvis, known to arcade geeks as the genius behind a holy trinity of arcade games: Defender, Stargate, and Robotron: 2084. Jarvis' three-day (!) career at boring, gameless Hewlett-Packard was probably the impetus for the quote above. A chess expert from a young age and an avid mainframe Spacewar player in college, he saw games as the logical killer app for computers, and when Atari finally called him back after three months of waiting, he jumped at the chance.

Starting out in the industry, Jarvis designed pinball games, but when Atari's pinball section went under, he moved to Chicago to work on pinballs at Williams Electronics. Williams had released a video arcade game years before (Paddle Ball, a clone of "Pong"), but pinball was their bread and butter. All that changed when Jarvis came up with Defender, the first smash-hit sideways-scrolling video game. Defender used a unique thrust/reverse propulsion mechanic, impressive visuals for the time, and truly groundbreaking sound effects. Not to mention the smart bomb, a new addition that allowed the player to maximize points (or get out of a tough spot) by eliminating all the enemies on the screen at once. Another innovation was the scanner, a sort of radar display at the top of the screen that allowed the player to track what was going on in the area of the "game world" that was currently off-screen. Many of these concepts are still commonly used today. After Williams' offer of a bonus was rejected, Jarvis left the company and started his own game development concern, which quickly concocted an updated, even more complex sequel to Defender, "Stargate", and licensed it to Williams.

In 1995, someone at the sitcom "NewsRadio" was clearly a Jarvis fan, because Jarvis himself was featured on the show. He played "Delivery Guy #3," dropping off a Stargate machine (which, sadly, has its side art and marquee obscured, apparently for legal reasons having to do with the name), and humor ensues as the main character of the show faces the game to which he gave so much of his young adulthood.

You can watch the NewsRadio episode "Arcade" here for free, with limited commercial interruption.

You can also read about the episode or download a WAV file with an audio clip.

And, here's a story from folklore.org about the Macintosh development team and Defender (I recall a magazine article from the early 80s about Wozniak - with a picture of him playing Stargate at home.)

At left is a screen shot of Mr. Jarvis on the show, and below is a 1980 promo shot of Defender and "demo dolly." Hand on the joystick, of course.

Tuesday, June 10, 2008

Chronology: Breakout

Breakout was one of the first truly fascinating, influential and compelling video game concepts. The premise is simple - avoid missing the ball. Keep it in play until you have broken all the bricks on the screen.

Breakout is closely tied to two members of its video game family tree: Pong and Arkanoid.
  • 1966. Ralph Baer begins work on the Brown Box, later to become the Magnavox Odyssey home video game system. One of the games developed is Table Tennis, a simple game of two paddles deflecting a ball horizontally on a screen. It's seen by Nolan Bushnell at a demo, and helps inspire the similar Pong in 1972 and Breakout in 1976.
  • 1976. Atari releases Breakout to arcades. It's conceptualized by Bushnell, and ostensibly programmed by Steve Jobs; however, Jobs actually gets his friend Steve Wozniak to do the work.
  • 1986. Nothing much happens Breakout-related for a decade, then there are, in this order, three Breakout-style games released in 1986: Gigas (Sega), Arkanoid (Taito), and Gigas Mark II (Sega). Arkanoid is the one to set the stage for brick-breaking games to follow, with its many enhancements in the form of falling "power-ups" that can be caught and influence the player's deflection device.
  • 1987 to present. An innumerable slew of Breakout games are created - far too many to list. Wikipedia has a partial list.

The sheer simplicity of the game concept, the fact that one can play solo for a short or a long session, and the addictive and satisfying gameplay have made Breakout a winner for decades. Here is a listing of Breakout games available today at JUST ONE game publisher:

Aquaball
Aztec Ball
Aztec Bricks
Ballistik
Boom Voyage
Break Quest
Brickquest
Bricks of Atlantis
Bricks of Camelot
Bricks of Egypt
Bricks of Egypt 2: Tears of the Pharaohs
Chak's Temple
Egyptoid
Fairy Treasure
Fizzball
Gem Ball
Hyperballoid 2
Hyperballoid Golden Pack
LEGO Bricktopia
Magic Ball 3
Meteor
Nuclear Ball
Nuclear Ball 2
Reaxxion
Ricochet Infinity
Ricochet Lost Worlds
Ricochet Lost Worlds: Recharged
Ricochet Xtreme
Roboball
Temple of Bricks
Treasures of the Deep
Z-Ball

It looks like brick breaking is here to stay.


Monday, June 9, 2008

Gooey Interfaces, Then and Now


Here we see an interesting specimen from the twilight of the BBS age (1997, to be precise). In today's point and click, ultra-graphical world, many users are unaware that going online used to be a textual affair. The "hotkeys" are highlighted in yellow - Pressing "J" would let you apply to Join the system, "P" would send a plea for help regarding a lost Password, etc. To make things pretty, an ASCII art butterfly adorns the screen. The communications program (in this case, WinComm) keeps a readout of connection speed and time online at the bottom of the window. This was because connections were slow and long distance was expensive - so you had to be mindful of how long you stayed on. Seems archaic now, with fast broadband and ALWAYS being online....

In the 1990s an interesting thing happened. BBSs and the Web started to merge. Here we see the Web interface of the same BBS pictured above. Note that this was before "usability" became a buzzword - this Web page seems hardly useful, at least the part visible "above the fold" - all the user can do is send an email, see how many hits the page has gotten, and look at a more detailed butterfly! Note the dial-up number at the bottom of the window, in case you want to go old school and dial in the old fashioned way. Also, check out the miserable state of the browser (in this case, AOL, looking like a diseased version of Netscape.)

Which is not to say minimalist Web design is always bad. Google.com has done very well for itself with just a simple graphic, a text entry box and a few well-chosen links.

Wednesday, June 4, 2008

Video Game Pitch Meeting, ca. 1979

Video via my longstanding friend Lapsed Cannibal, of the great Glass Maze.

Friday, May 30, 2008

The First Phonebook

The only known edition of the world's first telephone book has just surfaced in Connecticut. November 1878 is about a hundred years earlier than the usual focus of this blog, but it's old technology all the same. Some notes:

- The whole document is only 20 pages.
- Users were asked to limit calls to three minutes, so other subscribers would have a chance to use the line.
- "Hulloa!" is the appointed greeting when you initiate a call. (I had thought it had been "Ahoy-hoy").
- "Never take the telephone off the hook unless you wish to use it."
- "When you are done talking say, 'That is all,' and the person spoken to should say, 'O.K.'"

The telephone was a source of uneasy paranoia when it was first introduced. If it can transmit our voice to others when we wish it to - how do we know it's not "listening" at other times?

Article at Discovery News.

P.S.: The first computer bulletin board system went online 100 years later. These, you recall, were how people outside governmental, educational or military organizations traded electronic mail and files and used message forums before the Internet became widespread for casual use. Bulletin Board Systems used phone lines too - and in the 21st century dial-up online connections, pay phones, and to some extent land lines in general, are now in decline.

Thursday, May 22, 2008

Ebonstar


There are games you pause to remember fondly once in a while, and then there are games you and your friends still play 20 years later. Ebonstar, designed by the Dreamers' Guild in 1988 and released only for the Commodore Amiga, is both of these. It's a straight-up one-to-four-player space arcade extravaganza wherein you battle the computer ships, your friends, or both. How it came to be is a story that's never been told on the Web until now.

Robert McNally, at the tender age of 17, was the youngest programmer Sega had ever hired. The Sega workplace was a friendly one, with a regular group gathering during the lunch hour to wolf down sandwiches then make a beeline for the dimly-lit back room where sat one of the few four-player Eliminator machines ever made. (see "Chronology: Spacewar," below.) The group was more than four in number, so at the end of a game, only the player with the highest score would continue in the next game - the other three would be replaced, sort of a "king of the hill" series. Young Robert would play so hard his palms would sweat profusely - something that never happened in any other context - and once, when his ship was destroyed, he spontaneously lost control of his body and fell over backwards (luckily there was a pile of empty boxes and packing material behind him which broke his fall.) He also shared this anecdote:
I remember when one of the engineers who had originally worked on Eliminator snuck in a re-coded ROM with a cheat: Between rounds, if a player continuously held down a particular combination of the four control buttons, the machine would reduce the size of that player's "hit radius" for the duration of the round. This would result in the player not being quite as easy to hit as the other players. Used intermittently so as not to give it away, that engineer had quite a bit of evil fun for several months before the other engineers got sufficiently suspicious to pull out the ROM and compare it to the production master.
When Robert left Sega he couldn't stand to see Eliminator go, so he and his brother Michael created Ebonstar.

A few fun notes about Ebonstar:
  • The best way to play, by far, is in Tournament mode with four players of about equal skill. Tournament means you're concentrating on your human opponents rather than trying to get a high score or advance to the higher levels of the game. The game regularly provides fireballs, homing shots, and a lightning bolt that zaps-out-of-existence any ships near you when you trigger it - your mission is to pick up as many of these items as possible and put them to good use.
  • Team play is preferred - two on two. You can play every man for himself, but the game takes on a new dimension when you have one ally and two opponents. Especially when your ally accidentally kills you - "Ca-raaaaaaaap."
  • A computer keyboard was not designed to support the energetic pressing and/or holding down of a dozen different keys at once, arcade-stylee. Occasionally you will experience "keylock," the sudden nonresponsiveness of the keyboard - unless you're the one guy who only uses the joystick - or the mouse (not recommended)
  • Very rarely, gravity will affect the entire play area, and your little ship will resemble a salmon desperately swimming upstream instead of a little bee buzzing around fancy free. No one - not even the programmers - knows exactly what triggers this, but some believe that a secret key combination does it - and that I know what it is.
  • There is a great debate that has raged for decades about what constitutes "winning" a Tournament game. It's customary for the last player standing to quit the game with Amiga-Q when he's defeated the last opponent - everyone's eager to play again, and doesn't want to sit around while he plays against the computer ships that once again start coming out. The game itself displays "Winner" in the color of the ship with the highest score, presumably in an homage to those early Eliminator lunches at Sega. But the winner must necessarily be considered to be the sole survivor - who, obviously, could rack up big points on his own, by putting the non-human-piloted computer ships in what is known as "das position" (say it with an evil German accent) and pummel them till the cows come home - or extra ships are awarded. Because your standard shots don't destroy other ships - only bounce them away and possibly into a wall - it's possible to dribble another ship like a basketball for a good while, racking up points in the process.
Some other enjoyable moments to experience:
    • When you've got no weaponry whatsoever and another guy has collected a ton of fireballs, homing shots, AND the lightning, destroy the black hole satellite while he's on the other side of the screen - the game will reset everyone to no weapons for the next level - he'll have nothing (and you'll like it).
    • If the big yellow Nemesis ship emerges from the black hole and comes at you, do all you can to divert it to another player and force him to deal with it instead. As you do this, be sure to say "Take 'im."
    • Too many more to list.
To play Ebonstar on your PC, just download WinUAE, an Amiga kickstart ROM, and Ebonstar.ADF, get three friends, and fire it up!

Also, at the top of this post is a completely clueless negative review of the game (Amiga User International, June 1988). The reviewer clearly didn't experience the multiplayer/tournament/team aspect of the game.

Thursday, May 15, 2008

Chronology: Spacewar


(Image above is "Computer Space")

"Hackers" get a bad rap.

"Hacking" today tends to mean breaking into a computer system. This is a corruption of the term - the original hackers didn't break into computers - because they were the only computer users and already had access. At MIT at the beginning of the 1960s, the first hackers prized efficient, elegant, beautiful solutions - the noun "hack" has been said to be undefinable, but has been characterized as "an appropriate application of ingenuity." So it's a shame that the mass media has been ruining the word since the 80s.

Especially because it's hackers we have to thank for making computers fun. For a long time, computers were just for doing work. Sure, there was Tennis for Two back in 1958, and other trinkets here and there, but things didn't go serious arcade-style until 1961, when those hackers at MIT dreamed up Spacewar. It was at once complex and mind-bogglingly simple - with game physics including gravity, thrust and hyperspace, not to mention an astronomically accurate background star field courtesy of a subroutine dubbed "Expensive Planetarium" - yet at its heart, it was a simple two-player hot-seat duel. Spacewar was, by necessity, a shared experience.
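
To make "gravity, thrust and hyperspace" a little more concrete, here's a bare-bones sketch of that kind of game physics in modern Python - my own illustration, not the original PDP-1 code:

    import math, random

    W, H = 1024, 1024          # toy playfield size
    GRAVITY = 50000.0          # strength of the central star's pull (arbitrary units)

    class Ship:
        def __init__(self, x, y):
            self.x, self.y = x, y
            self.vx = self.vy = 0.0
            self.heading = 0.0     # radians

        def step(self, dt, thrusting=False):
            # Gravity: accelerate toward the star at the center of the screen.
            dx, dy = W / 2 - self.x, H / 2 - self.y
            r = math.hypot(dx, dy)
            self.vx += GRAVITY * dx / r**3 * dt
            self.vy += GRAVITY * dy / r**3 * dt
            if thrusting:          # thrust pushes along the ship's heading
                self.vx += math.cos(self.heading) * 20 * dt
                self.vy += math.sin(self.heading) * 20 * dt
            self.x = (self.x + self.vx * dt) % W   # wrap at the screen edges
            self.y = (self.y + self.vy * dt) % H

        def hyperspace(self):
            # Teleport to a random spot - classically at some risk of blowing up.
            self.x, self.y = random.uniform(0, W), random.uniform(0, H)
            self.vx = self.vy = 0.0

    ship = Ship(100, 512)
    for _ in range(10):
        ship.step(dt=0.1, thrusting=True)
    print(round(ship.x, 1), round(ship.y, 1))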

Take a look at the prodigious progeny of this game:
  • 1961-62 - Spacewar! (Steve Russell et al, PDP minicomputer)
  • September 1971 - Galaxy Game (Installed at the student union at Stanford, a version of Spacewar that cost 10 cents or three plays for a quarter.)
  • November 1971 - Nolan Bushnell (later the father of Atari) mass produces Spacewar as "Computer Space" - the first video game in arcades.
  • 1977 - Space Wars, the first vector graphics arcade game, is released by Cinematronics - it's a very accurate arcade version of Spacewar. "Vector" graphics are made up of very bright, clear lines, which made the game more appealing than the blocky, low-resolution graphics on the black-and-white television-style monitors more common in arcade games of the day.
  • 1979 - Atari releases Asteroids - along with Space Invaders, one of the biggest arcade smash hits of the 70s. Asteroids uses the same left/right/thrust/fire control scheme as Spacewar, as well as the "Hyperspace" concept that - as everyone knows - randomly teleports your ship to another location on the screen.
  • 1980 - Cinematronics releases Star Castle, a one-player-at-a-time space battle against a circular fortress in the middle of the screen.
  • 1981 - Sega comes out with Eliminator, a vector game for one to four players, not unlike a Star Castle with the enemy installation mobile and floating around the screen. [video]
  • 1982 - a version of Spacewar is ported to the Vectrex game console, the only vector graphics game console ever made. It's a very faithful rendering, and even includes a one-player version - though the intelligence of the opposing ship is not what you'd face from a human adversary.
  • 1988 - The Dreamers Guild creates Ebonstar, a version of Eliminator for the Amiga home computer.
This is only a small sampling. Steve Russell, when asked, was extremely modest about his contribution, saying if he hadn't done it, someone else would have. Maybe. What we do know for sure is he came up with one very influential computer program - and since his nickname was "Slug" due to his tendency to be slow to start anything, and he was the first, who's to say we might not still be waiting for Spacewar if he hadn't come through?

Wednesday, May 14, 2008

Stick Around

Whatever happened to the joystick?

If you're about my age, you've probably used one of these. Back in junior high we all complained about the standard issue Atari model, but looking back, it was one of the best - maybe not always the most comfortable, but it provided good control for years - until you'd beaten the life out of it, for instance, playing Decathlon, which required you to jerk your stick left and right as fast as possible - for minutes on end.

The joystick may be passe, but it's still bubbling under the surface of our pop culture. This guy hacked one into a TV remote, and this woman made a giant joystick work of art.

Newsweek has a good overview of game control from the Atari era to today. An excerpt, if I may:

*1977: the Atari 2600 controller. One joystick, one button. 2 inputs.
*1980: the Intellivision controller featured a 12-key keypad and two action buttons on each side, and included a “control disc” that essentially functioned as a joystick input. Function overlays were included for most of the games and fit over the keypad. All told, it was 17 inputs.
*1982: the Atari 5200 was the gold standard for the early complexity era. A joystick, a 12-key keypad, four action buttons, plus start, pause, and reset buttons. 20 inputs. Incredibly, this controller had as many inputs as the PS3 controller—twenty-five years sooner.
*1985: the Nintendo Entertainment System reduced the 20 inputs on the Atari 5200 controller to a d-pad, two action buttons, plus select and start buttons. 5 inputs. The NES did, um, pretty well, and the NES controller marked a permanent break from the complexity of only a few years earlier.
*1990: the Super Nintendo controller added a third and fourth button, as well as two shoulder buttons. Both would become standards. 9 inputs total.
*1995: the Sony Playstation controller added a third and fourth shoulder button. They also made each d-pad direction a separate button. 14 inputs total.
*1998: in response to the analog stick of the Nintendo 64 controller, Sony introduced the Dual Shock controller, which featured two analog sticks in addition to all the buttons of the original Playstation controller. The analog sticks were also clickable, thus potentially functioning as two additional buttons. We're up to 18 inputs now, if you don't count the "analog" button (which really couldn't be used as an input in games).
*2006: the Sony PS3 controller, which we’ve already mentioned, had 20 inputs.


For more on sticks, check out Kotaku or just try a Google image search.

Friday, May 9, 2008

Basically, the Classics

Apropos of my past post, I just wanted to throw out this cool BASIC interpreter that lets one run BASIC programs from the Windows desktop.

Classic BASIC Games

Cool. Classic. BASIC.

Thursday, May 8, 2008

Digital Archaeology Turns Up an Artifact from 1984

I was searching for a "shareware" BBS-downloaded game I played on my IBM PC circa 1985 and used the search term GRIME.ZIP. I found the game, and something else that surprised me even more - a program I wrote on the same PC, the previous year. It's a BASIC program that draws the Ghostbusters logo on the screen. (Ghostbusters was absolutely HUGE in the summer/fall of 1984, you will remember.) I found a couple other Ghostbusters programs in the same filebase.

It's kind of neat what you can find on the Web. Even things that were created long before there WAS a Web. My file, the Grime game, and many others were uploaded to this BBS in 1989 it seems, and burned onto CD maybe a year later - eventually, the BBS became a dialup/Web hybrid, and here we are. Here's a list of some of the files on the BBS.

Here's a shot of the screen and some text from the program.

I hope you enjoyed this presentation!!

THIS HAS BEEN A CATRON SOFTWARE PRODUCTION.
Copyright (c)October 16th, 1984

Friday, May 2, 2008

ASCII and Ye Shall Receive

Not so long ago, in a relative sense, computer graphics were almost an afterthought - the earliest computers didn't have a monitor screen at all. Programmers looked at a wavy line on an oscilloscope, a few blinky lights, or a printout to see if their program worked. Nowadays, of course, the graphics are the goal, whether it's the latest slick user interface or the most photo-realistic game environment. Back in the halcyon days, computer users made the best of what they had - which, initially, was text characters. The ASCII character set is a standard that most computers and printers use to keep track of letters, numbers and symbols - and computer enthusiasts were quick to use those characters as a kind of graphics when pixels were not available.

In 1966, a man named Kenneth Knowlton at Bell Labs created a photomosaic by scanning a series of photographs - you can see the result above, Studies in Perception I. But just as interesting, and possibly even more creative, are the uses ASCII has been put to by those who manipulate the characters by hand. A great overview is available at the Wikipedia ASCII Art article.

The colon and parenthesis "smiley faces" we still use in our emails today were first proposed back in 1982, and enthusiasts were making good use of text characters in games and simple animation then; the IBM PC initially did not come with a graphics card as standard. Dial-up Bulletin Board Systems - the only way for most people to go online before the Internet became accessible to the general public - were completely text-based for years, and an entire "Art Scene" sprang up around doing the fanciest things possible with whatever color, sound and motion came available. People were trading BASIC programs displaying animated ASCII "Cartoons" in the early 80s, the direct descendant of which you can see in "Star Wars ASCIImation" - or here, with sound added. An entire feature film has also been converted to ASCII.
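
If you've never thought about how character art gets made programmatically, here's a tiny sketch of one common approach (my own toy example, not any of the programs mentioned above): map darkness values onto denser or sparser characters.

    import math

    PALETTE = " .:-=+*#%@"          # from lightest to darkest

    def render(rows):
        """rows: 2-D list of darkness values from 0.0 (white) to 1.0 (black)."""
        for row in rows:
            line = ""
            for d in row:
                index = min(int(d * len(PALETTE)), len(PALETTE) - 1)
                line += PALETTE[index]
            print(line)

    # Render a simple radial blob, darkest in the middle:
    size = 20
    image = [[max(0.0, 1.0 - math.hypot(x - size/2, y - size/2) / (size/2))
              for x in range(size)] for y in range(size)]
    render(image)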

If ASCII art interests you, be sure to check out Jason Scott's nostalgic collection at textfiles.com (his stated favorite is Angela.art), as well as Joan Stark's site.

Wednesday, April 30, 2008

A Diction Comes Pulsin'

I read today that Grand Theft Auto IV is predicted to break video game sales records... and that it's already being lambasted by critics who haven't even seen it yet. Although I don't buy, play, or even like today's violent games, I have to shake my head at the notion that games are sending our culture down the garden path. It's a chicken and egg scenario; are games making people like violence - or is a market for violence boosting the sales of games? I find it difficult to believe a game can single-handedly raise a nation of carjackers. Much has been made of the individuals who have played games compulsively until they dropped dead, or murderers who enjoyed computer games and expressed such freely. I submit to you that there was almost certainly something else wrong with those people beyond their chosen game pastime. For proof, I give you the hundreds of millions of mentally healthy people playing the same games with no ill effect.

Back in my day, games were simpler, but there was still often shooting and destruction. The first game to cause a real stir was Death Race (Exidy, 1976). In this game, you drive your little on-screen car around to try to catch up with "gremlins" (i.e., pedestrians). The game was inspired by a silly B-movie released the year before. The outrage over the game was greater than the outrage over the (much more graphic) film - presumably because the game was 1) interactive and 2) found in arcades sometimes frequented by children and families.

Here's a video of the game in action:


Doesn't seem particularly shocking by today's standards, does it? Yet as a symbol, to some it represents possibly losing control of ourselves, getting sucked into a virtual world that has the power to corrupt us, to influence our thinking and our actions in the real world.

Whether Death Race or GTA IV is the potential influence in question, I hope we are not that feckless, and I hope those of us who are parents will take seriously our responsibility for our charges. Our media are a mirror of our culture; they are not our society itself. Don't blame the mirror for how you look - these games sell like they do because people want to play them.

There was another new medium that caused a panic in some circles when it was introduced - it offered a leisure-time activity that caused young people to withdraw for hours at a time, fired their imaginations, separated them from reality, etc. - this new medium was the novel.

Tuesday, April 22, 2008

and They Named it "Female Friend"

It seems I have always been a fan of the obscure. For me the choice was never "Beatles or Elvis" or "Beatles or Stones"... Pink Floyd was the clear choice; anything else is a false dichotomy. Given the choice, I didn't go for Coke, nor Pepsi; RC was my drink (yes... I really can taste a difference). And when it was time to buy my very own first personal computer back in the 1980s, I didn't gravitate towards the IBM PC platform (even though I'd grown up with it) nor the Macintosh (though it was supposedly a life-changing niche machine). I opted instead for a little-known machine made by Commodore, who'd previously come out with the best-selling computer of all time. This newer, state-of-the-art machine did not enjoy such sales figures - for one thing, it always had an identity crisis. The creators originally designed it as the ultimate games machine... it was soon expanded and marketed as a "do-anything" machine, called "the computer for the creative mind" (and who wouldn't want to think of themselves as creative?)... and it ended up as a top choice for video editing and production.

They wanted a friendly, easy-to-remember name for their creation - they settled on "Amiga," and the fact that it came alphabetically before Apple and Atari didn't hurt. Later given the model number 1000 to differentiate it from its siblings, the home-aimed Amiga 500 and the more expandable Amiga 2000, the Amiga cost less than the IBM and Mac computers of the day, yet did much more. When it launched in 1985, the Amiga could display 4096 colors on screen at once, while the IBM of the day could display sixteen and the Macintosh only two. The Amiga also featured stereophonic digital sound and music (up to four sound channels playing simultaneously) when the other guys mostly just beeped, and it could do true preemptive multitasking - running multiple applications simultaneously in its various windows. All things we take for granted today, of course; but at the time, revolutionary. The Amiga had custom chips for sound, graphics, and so forth, just as computers today do - but other computers didn't begin to have custom chipsets until years after the Amiga was gone.

So given all this, why have most people never heard of Amiga? It's a tragic tale of squandered opportunity and corporate infighting. After a big splash debut in New York City, featuring appearances by Deborah Harry and Andy Warhol (both Amiga users, it seems), Commodore didn't do much to promote or market their new computer line; and what was done, was often done badly - I remember commercials showing a kid hijacking a TV broadcast, helping out various celebrities, or even levitating his house. Not things most people have a need for - but it showed the common confused conception of computers at that time: they were something that could do anything you wanted, if you just learned how to tap their power. A far cry from the common household appliance they are today.

I have linked below to an article in the August 1985 issue of Personal Computing, previewing the first Amiga (warning: the files are large in size, about 1 MB each). I like the sense of wonder it evokes... it really did seem like anything was possible back then.
Many thanks are due Gerd Frank, of the German site AmiATLAS, who so kindly scanned this for me after winning it on eBay.

More information: Ars Technica did a piece on the history of the Amiga. And here's a page with some info on how Warhol used the Amiga.

Thursday, April 17, 2008

The Mix Tape that Changed My Life


Almost 20 years ago as of this writing, I started college. That freshman year was a high water mark in many ways; I'm still friends with a lot of the people I met then; I got my first taste of independence and self-reliance; and I started discovering really interesting music I'd never heard on the radio - which, in effect, meant that I became a music listener when I'd never really seen any reason to be one before.

One of the musical artists I discovered that autumn is Robyn Hitchcock - a songwriter from England who has had a lot of labels stuck on him over the years, such as "quirky", "eccentric", "cult figure" and even "God." He's been called the Monty Python, Douglas Adams, and Rod Serling of pop. The venerable Trouser Press record guide called his "entire body of work ...one of the great undiscovered treasures of modern pop music." And it was Creem magazine that proclaimed "God walks among us." I wonder if his obscure status makes people like him more; I always get a warm feeling from discovering something unique and special that it seems almost no one else knows about. Not that I don't try and spread the word - you're reading this, right?

My introduction to Mr. Hitchcock's oeuvre was a common one for the time - a "mix tape" cassette that a friend of a friend had compiled from vinyl records and passed on to an interested party (my new school friend, Dan) in order to spread the gospel. Dan was a great source of new music in those days - he had an immense collection of store-bought cassettes as well, and was generous about lending them out. He gave me the Hitchcock mix tape outright and I was instantly hooked. It was cued up to side B when I first played it, in the room of one of the girls on our floor. "In St. Petersburg... In the night... Where the light shines down on the snow..." The voice, slightly nasal, almost self-mocking, full of character, crooned about an object of desire who was - in typical fashion for that time in that musical career - beautiful, mysterious, and no longer alive: brought back to memory by a dark vision of death, with dark musical accompaniment to match.

Since that day, I've met still more friends online through the Hitchcock listserv, and I'm working on a third tribute songs collection. In honor of 20 years of music fandom, and in honor of Dan's birthday today, I enshrine here a scan of the mix tape that changed my life.

Happy birthday, Dan!