
+1 for describing the "Apple II+ with a Z80 Softcard" as "the ultimate mullet machine, all the business software upfront, all the gaming party in the back."

I agree with your point: the bar IBM was shooting for was set by the existing popular microcomputers of circa 1979. The only significant consideration for future growth/competition was seemingly that the established trend of RAM size growth would probably continue. At the time there wasn't really any established trend of progressive growth in graphics resolution or colors. Pre-Apple II examples like the Cromemco Dazzler for the Altair weren't fundamentally different from the Apple II, and probably weren't even on their radar due to being barely out of the kit/hobbyist level.

I'll add that when considering the 5150's initial design, the "IBM" we're talking about isn't really "The IBM" but rather a lone skunkworks project in a backwater division down in Boca Raton, Florida, intended as an experiment to learn more about these new microcomputers. Most of the rest of the traditional IBM management structure barely knew about it during development, and those parts that did mostly ignored it. If 'mainstream IBM' had approached the PC as a real IBM project, it would certainly have been very different and probably unsuccessful (if it had managed to ship at all). As it was, the 5150 was only able to use off-the-shelf components (including the CPU) because it was considered a one-off experiment, initially given a month for the design and a year to ship.



> RAM size growth would probably continue.

True. But note that for a very long time RAM grew by periodically doubling the capacity of a single chip, and early chips had no controller on the die, so they required very short traces to the bus chip or CPU.

And usually the old chip would only get about 10% cheaper, while the new double-capacity chip was priced roughly 50% higher than the old one, and adopting the new chips required a new memory controller with additional address pins.

> At the time there wasn't really any established trend of progressive growth in graphics resolution or colors

Unfortunately, only partially true.

You may have seen the RAMDAC mentioned in video hardware discussions. It is partly a palette lookup, but it is also the video signal generator, reading from RAM very fast.

The problem is that early "fast page" DRAM had a very slow interface, so while larger chips kept becoming available (at a cheaper price per kilobyte than the older ones, which was the real logic of semiconductor progress), RAM speed did not grow with capacity. Once that became the bottleneck, it limited pixel-rate growth, so even with twice the RAM you could not get twice the resolution.

In the past, I calculated a few times the RAM speed needed to deliver the classic 60 FPS, and at least up to (and including) the first SDRAM machines, just displaying the screen was enough to eat a significant share of main RAM throughput, so integrated graphics could even affect CPU performance.
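A rough back-of-the-envelope version of that calculation (my own illustrative numbers, not the commenter's; the PC-66 figure is a theoretical peak for a 64-bit bus at 66 MHz):

```python
# Scan-out bandwidth: bytes/s the display controller must read just to
# refresh the screen, compared to an approximate peak for early PC-66
# SDRAM (64-bit bus at 66 MHz ~= 528 MB/s). Numbers are illustrative.

def scanout_bandwidth(width, height, bytes_per_pixel, fps=60):
    """Bytes per second consumed purely by screen refresh."""
    return width * height * bytes_per_pixel * fps

PC66_PEAK = (64 // 8) * 66_000_000  # 528 MB/s theoretical peak

for mode in [(640, 480, 1), (800, 600, 2), (1024, 768, 4)]:
    bw = scanout_bandwidth(*mode)
    share = 100 * bw / PC66_PEAK
    print(f"{mode[0]}x{mode[1]} @ {8 * mode[2]} bpp: "
          f"{bw / 1e6:.0f} MB/s ({share:.0f}% of PC-66 peak)")
```

Real sustained throughput is well below the theoretical peak (row activation, refresh cycles, CPU contention), so the effective share eaten by screen refresh was even larger than these figures suggest.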

On consoles the problem was less harmful because of the limited resolution of consumer TVs, though a few consoles used an expensive frame buffer embedded inside the graphics chip.

On modern GPUs the RAM-throughput problem is solved with VRAM chips designed for very high clocks plus an extremely wide RAM bus, so that chips run in parallel: a typical PC memory bus is ~64 bits per channel, while GPUs start at 128 bits and top models reach 512 or even 1024 bits.
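A minimal sketch of why bus width matters (the data rates below are ballpark figures I've chosen for illustration, not any specific product):

```python
# Peak memory bandwidth scales as bus width (bits) times data rate per pin.

def bandwidth_gbs(bus_bits, gigatransfers_per_pin):
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits * gigatransfers_per_pin / 8

print(bandwidth_gbs(128, 6.4))   # dual-channel desktop DDR5-class: 102.4 GB/s
print(bandwidth_gbs(384, 21.0))  # wide GDDR6X-class GPU bus: 1008.0 GB/s
```

Same per-pin speed class, roughly an order of magnitude more bandwidth, purely from running many chips in parallel across a wider bus.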


I once read that IBM had contacted Atari about licensing their chipset, so they did actually care about gaming to some degree.

Also, a lot of Apple users gamed on a monochrome monitor, so the number of colors maybe wasn't the biggest concern, just 'has some'. The resolution was largely fixed by the tube technology.


Interesting. I hadn't heard that about Atari. The odd thing is that the Atari 400/800 chipset couldn't display 80 column text, which seems to have been a 'must have' for IBM due to word processing and terminal emulation being considered essential.

I wonder if maybe it was when IBM was working on the PCjr.


Yeah, the impression I have is the talks went nowhere, but Atari was obviously on top of the market on that point, so no surprise they made a call. Maybe IBM wanted to contract something out, but IIRC Jay Miner had already quit.


Atari's Sunnyvale Research Lab (SRL), run by Alan Kay, was working on a graphics chipset at that time. Probably why IBM came knocking.



