The Anachronistic Computer Project

It’s time to blow the lid off my next major project. Projects, really. I’ve been working on them for quite some time already. It will take some effort to describe them all, so let’s dig in!

An exercise in alternate history

Close your eyes and transport yourself back to the early 1980s! The age of President Reagan, MTV and rather regrettable hairdos. Oh, and the IBM PC.

It was an exciting time to be a computer nerd. Almost every year a new, groundbreaking computer got released:

’81: the IBM 5150, aka. the IBM PC
’82: the Commodore 64
’83: the Compaq Portable
’84: the Apple Macintosh
’85: the Amiga 1000
’87: the Acorn Archimedes

The list goes on. It was nothing like the current PC monoculture or the barren landscape of the ’70s. It seemed as if, for a short time, anything was possible.

We of course all know what happened: the PC won out and swept away almost all alternatives. But, back then, it didn’t seem inevitable.

Imagine an alternate course of events! Imagine a company with the coffers of IBM, the visionary drive of Apple and the ruthless execution tactics of Commodore! Imagine what a product such a company could have put on the market!

I invite you on that journey. Let’s design the baddest, meanest micro of the early ’80s. What would it look like?

This is the goal of this project: design a computer from the ground up, using early ’80s technology. The rules of the game are:

  • Custom chips are OK as long as they use process technology available in that time period
  • Use of period-accurate memory technologies and densities
  • The end result must be affordable by the standards of the day

Reality check 1: custom chips

What, you might say? Custom chips, designed in 2024, using processes of the early ’80s? That surely is impossible. Indeed it is. The point is not that the actual implementation follows what the ’80s allowed for, but rather that the chosen complexity is congruent with what was possible back then. The same goes for the clock frequencies involved.

The actual implementation – and here I have to introduce another project-in-the-works – would be based on an FPGA, more specifically the UnIC concept.

Very briefly, the UnIC project aims to create universal ICs in a DIP-40-compatible package, based on an FPGA. These “chips” can impersonate existing designs (such as a 6502 or a Z80 processor, even – hopefully – a VIC-II chip) or be the backbone of new, never-before-seen designs.

To measure the complexity, one can use a modern silicon toolchain (such as OpenROAD with the SKY130 PDK) and Dennard scaling to back-project to era-correct nodes: 1.5, 2 or 3 micron processes.

This technique allows for estimating chip area as well as clock speed with some accuracy.
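
As an illustration, here is a minimal sketch of that back-projection, assuming classic Dennard scaling (area grows with the square of the feature size, achievable clock drops linearly with it). The SKY130 area and clock figures below are placeholders, not real synthesis results.

```python
# Rough back-projection from a modern node to early-'80s nodes, assuming
# classic Dennard scaling: area scales with (feature size)^2, gate delay
# scales with feature size (so max clock scales with 1/size).
# The "modern" figures are placeholders, not actual synthesis results.

MODERN_NODE_UM = 0.13          # SKY130: 130 nm
modern_area_mm2 = 0.5          # hypothetical synthesized core area on SKY130
modern_fmax_mhz = 100.0        # hypothetical achievable clock on SKY130

for node_um in (1.5, 2.0, 3.0):
    scale = node_um / MODERN_NODE_UM
    area_mm2 = modern_area_mm2 * scale ** 2   # area grows quadratically
    fmax_mhz = modern_fmax_mhz / scale        # clock drops linearly
    print(f"{node_um:>3} um: ~{area_mm2:6.1f} mm^2, ~{fmax_mhz:4.1f} MHz")
```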

As a rule of thumb, it seems to me that something that fits in about 5k LUTs of a modern FPGA would have been feasible and price-competitive back in the day. Larger would have been possible but expensive; smaller would have allowed the use of older (cheaper) process nodes.

Reality check 2: I/O

This is a place where I have to make exceptions. Sure, it would be possible to design a system driving television sets, using floppy drives and 9-pin joystick ports, but what would be the point? Those peripherals have long gone extinct. Implementing interfaces to them would be a waste of effort, because one would immediately start looking for emulation options and interface gadgets.

Instead, I decided to implement modern I/O options: HDMI (maybe VGA) for video, USB for keyboard/mice/joysticks/controllers/thumb drives and SD card for ‘hard drives’.

Some of these decisions, specifically HDMI, have some cascading effects, as we shall see…

The curious case of video

Home computers in the early ’80s almost exclusively targeted TV sets. Either NTSC or PAL, depending on which side of the pond you happened to find yourself on. Even machines that did support RGB monitors stuck to timings pretty close to the TV standards. This means:

  • Around 15kHz horizontal sync frequency
  • 50 or 60Hz refresh rate
  • 200 or 240 scan-lines
  • 320 or 640 horizontal pixels per line
  • Higher resolution only in interlaced modes

What this boils down to is a memory bandwidth requirement of about 8Mpixel/s for ‘high-resolution’ and 4Mpixel/s for ‘low-resolution’ modes.

This is important: memory bandwidth is a very precious resource even today. More so in these early systems. We have access times on the order of 120-150ns for DRAM, so an 8-bit system can’t really sustain more than 6.6 to 8.3 MBps. I’ll come back to this in a later post, but these numbers should immediately show that sharing memory between the graphics and the CPU subsystems is a great challenge. Not surprisingly most systems didn’t do it (exceptions here are the C64 and the Mac).
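
Just to make the arithmetic explicit, here is the back-of-the-envelope version of both numbers; the video rates are averages over a frame and ignore blanking intervals.

```python
# Back-of-the-envelope check of the figures above (averages over a frame,
# blanking ignored; the peak rate during the active line is higher).

def avg_pixel_rate(h_pixels, v_lines, refresh_hz):
    return h_pixels * v_lines * refresh_hz

hi_res = avg_pixel_rate(640, 240, 60)   # ~9.2 Mpixel/s (640x200 at 50Hz gives ~6.4)
lo_res = avg_pixel_rate(320, 240, 60)   # ~4.6 Mpixel/s
print(f"hi-res: ~{hi_res / 1e6:.1f} Mpixel/s, lo-res: ~{lo_res / 1e6:.1f} Mpixel/s")

# Sustained throughput of an 8-bit-wide DRAM array, one byte per access:
for cycle_ns in (150, 120):
    print(f"{cycle_ns} ns per access: ~{1e9 / cycle_ns / 1e6:.1f} MB/s")
```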

Now, HDMI (or VGA for that matter) doesn’t really support resolutions as low as the modes above. Yes, VGA technically supports 320×200 pixels, but in reality it’s just 320×400 with every scan-line repeated twice. This helps with frame-buffer size, but not with memory bandwidth. DVI did have a mode spec for 320×240, but I’m not sure how widely it was supported, or whether HDMI carried it over at all.

VGA monitors technically supported EGA resolution (640×350), but at a 70Hz refresh rate, which – in terms of pixel clock – is the same as VGA resolution (640×480 at 60Hz). DVI is just a digital transport for VGA, and HDMI is an evolution of that. All in all, the lowest supported pixel clock is about 25MHz, that is, 25Mpixel/second: the visible 640×480 pixels sit inside an 800×525 total raster, and 800×525×60Hz works out to just over 25 million clocks per second. Compare that to the ~8Mpixel/second above – and we said that even that was a challenge in old systems.

My current plan – and, yes, this is just a plan at the moment – is to include a crude up-scaler in the video controller. This is something that would not have existed in the old days and thus would not count against my budget of about 5k LUTs. This up-scaler would support scan-line doubling without reading the pixels out of video memory twice (in effect, keeping two lines’ worth of buffer within the FPGA), as well as pixel doubling. These techniques combined would allow the frame-buffer reads to progress at a – comparatively glacial – 6.25Mpixel/s for a 320×240 pixel resolution, while presenting a regular 640×480 pixel resolution at a 60Hz refresh rate to the monitor.
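
To make the idea concrete, here is a toy software model of what the up-scaler would do. The real thing would of course be logic inside the video controller; the function names here are purely illustrative.

```python
# Toy model of the planned up-scaler: each low-resolution line is fetched from
# video memory exactly once, pixel-doubled horizontally, and then emitted twice
# from an internal line buffer - so memory is only read at the low-resolution
# rate while the monitor sees a full 640x480 picture.

def upscale_frame(fetch_line, emit_line, src_lines=240):
    for y in range(src_lines):
        line = fetch_line(y)                               # one memory read per source line
        doubled = [p for pix in line for p in (pix, pix)]  # 320 -> 640 pixels
        emit_line(doubled)                                 # each scan-line is shown twice:
        emit_line(doubled)                                 # 240 -> 480 lines

# Quick check with a dummy frame buffer:
frame = [[(x + y) & 0xFF for x in range(320)] for y in range(240)]
out = []
upscale_frame(lambda y: frame[y], out.append)
assert len(out) == 480 and len(out[0]) == 640
```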

The specs!

So, what would this machine look like?

Main processor: custom RISC instruction set with a core frequency of 8MHz. Internal 32-bit architecture, external 8-bit double-pumped data-bus; 16MBps peak external transfer rate.

Main memory: 128kB base memory, expandable up to 8MByte. The 128k base memory is going to be very limiting. Due to the intricacies of the memory architecture, the next step up would have been 512k, something that would have been possible, though rather expensive in those days.

ROM: well, I’m not sure. I included support for up to 512kB of ROM, but that would have been just a pipe-dream in those days. Maybe 32-64kB would have been more realistic, and even that on the high end. ROM access is significantly slower than DRAM access in this machine, so large ROMs would have been counter-productive in terms of performance anyway. Boot times of course are a different story.

Video memory: the computer has dedicated video memory (similar in concept to the chip-RAM of Amigas: it’s not impossible to use it as main memory, it’s just that it’s slow). This memory would have started off at 128kB, expandable to 512kB.

Video modes: there’s quite a bit of uncertainty here still, but the main supported modes would be 320×240 with 256 simultaneous colors and 640×480 with 16 colors. These colors are selectable from a 15-bit palette (32,768 colors), that is to say, each of the R/G/B channels has 5-bit resolution.
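
For a feel of what these modes cost in memory and refresh bandwidth, here is a quick calculation, assuming plain packed-pixel frame buffers (which may not be how the final design stores them).

```python
# Frame-buffer footprint and average 60Hz refresh bandwidth of the two main
# modes, assuming simple packed-pixel frame buffers (no compression, no overscan).

modes = [
    ("320x240, 256 colors", 320, 240, 8),   # 8 bits per pixel
    ("640x480, 16 colors",  640, 480, 4),   # 4 bits per pixel
]

for name, width, height, bpp in modes:
    size_bytes = width * height * bpp // 8
    bandwidth  = size_bytes * 60              # one full read-out per 60Hz frame
    print(f"{name}: {size_bytes / 1024:.0f} kB frame buffer, "
          f"~{bandwidth / 1e6:.1f} MB/s refresh bandwidth")
```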

Audio: I’m working through what would have been possible here. In terms of what the actual HW is capable of, it’ll include a modern 16-bit CODEC with support for stereo audio at up to a 48kHz sampling rate. The interface supports both input and output, technically simultaneously. Audio shares both memory and bandwidth with video, so high sampling rates would not have been practical, especially with the 128kB option.
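
To put numbers on that, here is a quick look at the data rates for 16-bit stereo audio; the sampling rates below are just illustrative choices.

```python
# Data rate of 16-bit stereo audio at a few sampling rates, and how much of it
# fits into 128 kB - a feel for why high rates strain the shared budget.

BYTES_PER_SAMPLE = 2   # 16-bit samples
CHANNELS = 2           # stereo

for rate_hz in (8000, 22050, 48000):
    bytes_per_s = rate_hz * CHANNELS * BYTES_PER_SAMPLE
    fits_s = 128 * 1024 / bytes_per_s
    print(f"{rate_hz:5d} Hz: ~{bytes_per_s / 1024:5.1f} kB/s, "
          f"~{fits_s:3.1f} s of audio in 128 kB")
```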

USB connectivity: there will be a single USB 1.1 full-speed host controller in the system. This limits the total bandwidth to 12Mbps, minus overhead. This controller is connected to a 6-port hub, providing 6 USB ports on the machine. Interrupt and bulk transfers will be supported; isochronous transfers probably won’t be. This means that HID (keyboard, mice, game controllers) and mass storage class devices (thumb drives) should be supported given enough SW effort, but things such as webcams or external sound cards will not be.

Networking: I’ve included a 10/100Mbps Ethernet controller in the design. Reaching actual, sustained transfer rates of even 10Mbps is going to be problematic though, so while the controller will negotiate 100Mbps links, the speed advantage is unlikely to materialize.

Expandability: I’m including an 8-bit ISA bus interface for expansion purposes. Support for it should be fairly complete, with some limitations in terms of power draw and the missing REFRESH signal. I’m debating whether to provide 16-bit ISA slots. These would not support 16-bit transfers, but would at least terminate the unused signals so as not to confuse 16-bit cards.

Form-factor: The first design would simply be a micro-ATX motherboard. Power however would be provided by an external 9-12V power brick instead of a regular ATX power supply. At least at the moment, that’s how it’s designed. Later on, I’m thinking about a wedge form-factor variant, something similar to an Amiga 500. That’s in the far, far future though.

The goods

The design is published on GitHub. The schematic entry is pretty complete at this point; the PCB design needs a lot of optimization, but is also pretty close to being done.

Technology-wise the board is a 2-layer one with (mostly) through-hole components. The only exceptions are the HDMI connector and the SD-card socket, which are not manufactured in through-hole format. There are also common-mode chokes on the high-speed signals (USB and HDMI) that are surface-mount, for the same reason: they are not available in any other format.

Reality check 3: Ethernet, USB, through-hole?

Try finding a through-hole packaged USB hub controller chip. Or a 10/100 Ethernet controller, for that matter. They don’t exist. So how did I manage to create a through-hole board with them?

The idea is to use – essentially – break-out boards: small, however-many-pin PCBs that carry the surface-mount component and break it out into a DIP-compatible package. While I was at it, I threw in level shifters and voltage regulators to mimic a single-supply, 5V-compatible component.

You might call it cheating, I call it re-creating period-accurate components.

Call to action

If you are interested in making this design real, please reach out! I’m just one person and it’s a lot of work. Plus, collaboration makes projects better: new ideas and different perspectives help. To be more specific, though:

  • If you can help with mechanical design for cases, that would make a huge difference
  • If you have industrial design experience and would like to drive that aspect, it would be fantastic
  • If you want to sink your teeth into SW design, that would help get the project off the ground as well
  • If you have RTL and HW design experience, you can help out with the design, verification and bring-up of the various FPGA-based components

A word from our sponsor

I’m really grateful for the support from PCBWay. They helped me with the first PCB for this project, and I have to say, the results were pretty! Granted, I’m not using the most cutting-edge manufacturing processes, but, hey, quality is still quality. PCBWay of course doesn’t stop at just the capabilities that I need. They provide all manner of PCB fabrication services, FPC and rigid-flex capabilities, even PCBA assembly and 3D-printing services if those strike your fancy. Check them out!
