The DIY-CPU Project – The Clock Generator

After reading about how a simple CPU works in J. Clark Scott's book and other sources, I've been toying with the thought of building a CPU myself. It's a bit of a project and it will take a while, but the journey is the reward. As one has to start somewhere, I decided to start with the clock, as the details of how one is built are a bit vague in the book.

As the CPU is for educational purposes I want it to run at a clock frequency of around one hertz, so one can actually see how things are working. In addition to a long pulse that drives an 'enable' signal to put something on the data and address bus that interconnects everything, a shorter pulse in the middle of the overall clock cycle is required to drive the set inputs of a number of components such as registers, the memory etc. to tell them to latch what's currently on the bus. These two signals can be generated from the clock signal and a delayed copy of itself. The book describes how to use AND and OR gates to generate the enable and set pulses out of those two signals, but not how to create the clock signal itself or how to derive the delayed clock from the original clock.
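To convince myself that the gate logic produces the right pulses before wiring anything up, I sketched it in a few lines of Python. The quarter-cycle delay and the mapping of the OR gate to the enable pulse and the AND gate to the set pulse are my reading of the scheme, so treat this as a sketch rather than a transcript of the book's circuit:

```python
# Minimal sketch of how the enable and set pulses can be derived from the
# clock and a delayed copy of it. The quarter-cycle delay and the mapping
# (enable = clk OR clk_delayed, set = clk AND clk_delayed) are assumptions
# for illustration, not taken verbatim from the book.

STEPS = 16          # time steps per full clock cycle
DELAY = STEPS // 4  # delay of the second signal, here a quarter cycle

def clk(t):
    """50% duty cycle clock: high during the first half of each cycle."""
    return 1 if (t % STEPS) < STEPS // 2 else 0

def clk_delayed(t):
    """The same clock, delayed by DELAY time steps."""
    return clk(t - DELAY)

def enable(t):
    return clk(t) | clk_delayed(t)   # long pulse: OR of clock and delayed clock

def set_pulse(t):
    return clk(t) & clk_delayed(t)   # short pulse that sits inside the enable pulse

def waveform(signal, cycles=2):
    return ''.join('#' if signal(t) else '.' for t in range(cycles * STEPS))

for name, sig in [('clk', clk), ('clk_d', clk_delayed),
                  ('enable', enable), ('set', set_pulse)]:
    print(f'{name:7s} {waveform(sig)}')
```

Printing the waveforms makes it easy to see that the set pulse always falls entirely within the enable pulse, which is the whole point of the arrangement.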

So I improvised a bit and used two inverters (NOT gates) with a resistor and capacitor for a 1 Hz clock generator (in the middle of the picture), a transistor, some resistors and a capacitor for the delayed but inverted signal (the right part of the picture), and another NOT gate to invert the delayed pulse back. The circuit with the AND and NOT gates described in the book to generate the two output signals is shown on the left, together with two LEDs to visualize the final signals.

The result looks a bit complicated but it's actually not, because it consists of three distinct building blocks that work independently of each other. One thing that makes it look more complicated than it is, is the use of one AND gate of the chip on the left and three NOT gates of the chip in the middle to create a single OR gate (by De Morgan's law, A OR B is the same as NOT(NOT A AND NOT B)).

Using parts of the electronics kits I got as a teenager and parts of kits I recently bought to have a more complete setup was ideal for prototyping the circuit. I'm sure there are a million ways to build this more efficiently and with fewer parts, but efficiency was not the point of the exercise. There we go, the first step towards my own CPU is done.

Anti-Noise Headset for the Mobile Traveler

I spend a lot of time commuting and traveling to far-away places, so I spend a lot of time in trains, cars and planes. Especially in cars and planes I usually make good use of the time by reading or writing something, such as this blog entry for example. But there's one thing usually in the way, and that's the noise made by the vehicle itself, frequent (useless) announcements and other travelers as well. Up to a certain level I can ignore it and get on with whatever I do. But at some point, especially when people close to where I am start talking, my concentration is usually gone. Earplugs help somewhat, but only to a certain extent. I've long wished for noise canceling headsets to go further. I had some in the past, but they had limited effect, and when I lost the plastic ear plugs and couldn't get replacements I never ventured into this area again. Then recently, I read a number of glowing reports in several places about the new Bose QC20 in-ear noise canceling headsets. To say they were positive would be an understatement, so I couldn't wait for them to become generally available (looks like the Bose PR department has done their job well).

What's definitely not an understatement is the price. 300 euros is a tough number, but for really good noise suppression I was willing to spend the money. So I got myself a QC20 and swallowed hard when swiping the credit card through the reader, ah, no, actually when clicking on the "One Click To Buy" button online.

Needless to say I couldn't wait for them to arrive so I could give them an instant test. Amazing: when pressing the silence button, the external environment in trains, train stations and the office just goes away. If a person nearby speaks loudly, a little extra music in addition to the noise suppression makes that sound go away, too. Incredible.

The other thing that always bothered me about in-ear headsets is that they get uncomfortable after a while. The QC20, however, is not a classic in-ear headset, as it's not held in place by pressing something into the ear canal. Instead, it fixes itself to the ear with a plastic holder that rests in the outer ear. Perfect. I've now worn them for several hours a day over several days and they never hurt a bit.

And finally, when not suppressing the noise, the headset still analyzes the sound environment and compensates for the isolation of the plastic tips in the ear. This is great because without it, just like with other in-ear headsets, the external environment sounds artificial and I get an uncomfortable feeling when I speak myself, as that has a strange effect on a blocked ear canal. The compensation works great and it almost feels like not having earplugs in at all when switching to "listen to the outside world" mode.

I have high hopes for my next plane trips as well. On intercontinental flights, 'over ear' headsets have been of little use to me, as one can't wear them when trying to sleep on one's side. With the QC20 in-ear, or rather on-ear, headset it might just be possible now.

Despite the super high price for the headset I am still full of praise for it: traveling and working in noisy office environments has become a very different experience. Let's see how this story develops and what I think about the headset in a couple of months.

The Computer – Levels of Understanding – And Building My Own CPU

Looking at historical educational computing kits that explain how computers work rather than 'only' how to work with and program computers, I started thinking a bit about the different levels of abstraction on which people understand computers. Here's what I came up with:

The Working Level Understanding: This is how most people understand computers today. They use them as tools and know how to work with programs that serve their needs, such as word processors, spreadsheets, web browsers, etc. Most people on this level, however, know little about what's inside that notebook or smartphone and cannot explain the difference between, let's say, a hard drive and RAM, or even know that such things exist.

Hardware Understanding: The next level, from what I can tell, is knowing about the components a computer consists of, such as the processor, RAM, the hard drive, etc., and what they do.

Programming: The next level is programming. One can certainly learn programming without knowing about the hardware but I guess learning about that would come in the process of learning how to program anyway.

Understanding how the individual components work: The next level is to understand how the different parts of a computer work and what they are based on, i.e. logic gates, bits and bytes, to simplify it a bit. There are certainly different depths one can go into on this level, as on pretty much all other levels as well. The "But How Do It Know" book I reviewed some time ago is one of the best ways to really feel comfortable on this level.

The physics behind the gates: Next in line is to understand how gates are built, i.e. understand how transistors work and how they are implemented in silicon. I liked this video on Youtube, which gives a good introduction from a non-technical point of view. Obviously one can go much further here, down to the quantum level and beyond, but I think the basics of this level are still understandable for somebody interested in the topic without a deep technical background.

Personally I think I have a pretty good grasp of most of these levels, at least from a high-level point of view. But I decided to go a bit deeper into understanding how the individual components work. As I said in a previous post, I learned early in my career how a CPU works and what is inside. However, the control part of it always remained a bit mysterious. I wouldn't have thought it possible to build my own CPU before, but after reading the "But How Do It Know" book plus some extra material I am sure I can pull it off, given some time and dedication. So there we go, I have my new quality-time project: building my own CPU. I'll call it the Do It Yourself (DIY) CPU and will of course blog about it as things develop 🙂

Why Open Source Has Become A Must For Me

While the Internet is doubtlessly a great invention and I wouldn't want to miss it in my daily life anymore, there are certainly downsides to it. Last year I summarized them in a post titled "The Anti-Freedom Side Of The Internet". While I have found solutions for some of the issues I discussed there, such as privacy issues around remotely hosted cloud services, I touched one topic too lightly that has become much more apparent to me since then: the changing business models and the way software companies interact with their customers, which, compared to pre-Internet times, is not necessarily always to the customers' advantage.

In pre-Internet times software was bought on disks or CDs and installed on a computer. For most commercial software you got a usage license of unlimited duration, and the user was in control of the software and the installation process. Fast forward to today and the model has changed significantly. Software is now downloaded over the Internet and installed. The user's control over the process and his privacy are largely gone, because most software now requires Internet connectivity to communicate with an activation server of some sort before it installs. While I can understand such a move from the software companies' point of view, I find it highly controversial from a user's point of view, because there is no control over what kind of information is transmitted to the software company. Also, most software today frequently 'calls home' to ask for security and feature updates and perhaps also for other purposes. While this is good on the one hand to protect users, it is again a privacy issue, because the computer frequently connects to other computers on the Internet in the background, without the user's knowledge, without his consent and without his insight into what is transmitted. Again, no control over what kind of data is transmitted.

And with some software empires on the decline, an interesting new license model, not thought of in pre-Internet times, is the annual subscription model. Adobe is going down that path with Photoshop and Microsoft wants to do the same thing with their Office suite: instead of selling a time-unlimited license once, they now want to sell time-limited licenses that have to be renewed once a year. Again, understandable from the software companies' point of view, as that ensures a steady income over the years. From a user's point of view I am not really sure, as it means there are yearly maintenance costs for software on computers at home that simply were not there before.

I wonder if that will actually accelerate the decline of those companies. If you buy software once, you are inclined to use it as long as possible and perhaps buy an update every now and then. But if you are faced with a subscription model where you have to pay once a year to keep the software activated, I wonder if at some point people will be willing to try out other alternatives. And alternatives there are, such as Gimp for graphics and of course LibreOffice.

Already today I see a lot of people using LibreOffice on their PCs and Macs, so that trend is definitely well underway. Perhaps it is also triggered by people no longer using just a single device, which would require more than one paid license. Also, the growing number of different file formats and versions makes sending a document to someone else for review and getting back a revision that is still formatted as before a gamble, so why stick to a particular program or version of a word processor?

In other words, Open Source is the solution in a world where the Internet allows software companies to assert more control over their customers than many of them are likely to want. Good riddance.

Historical Computing And The Busch 2090 – Simulating A Newer CPU Architecture On An Old Processor

It took a while to get hold of one, but I finally managed to get a 1980s Busch 2090 microcomputer I mused about in this and other previous blog posts. What I could only read about in the manual before, I could now finally try out myself on this 30-year-old machine, and true to the saying that when you read something you remember it but when you do something yourself you understand it, I found out quite a number of things I missed when only reading about it. So here's the tale of working with and programming a 30-year-old machine that was there to teach kids and adults how computers work rather than how to work with computers:

The 2090 is programmed on a hexadecimal keyboard (see figure on the left) in a slightly abstracted pseudo machine code. It makes a number of things easier, such as querying the keyboard or displaying something on the six-digit 7-segment display, but otherwise it looks like machine code. After doing some more research into the TMS 4-bit processor used in the 2090, I found out that it is a direct descendant of the first Texas Instruments microprocessor, with a few more input/output lines, more RAM and more ROM added. Otherwise the processor works like its predecessor, the TMS 1000 from 1972. In other words, when the 2090 appeared in 1981 the processor architecture was rather dated already and much more sophisticated processors such as the Intel 8080, the Zilog Z80, the Motorola 6800 and the MOS 6502 were available. While microcomputer learning kits appearing on the market a year or two later used these or other 8-bit processors, Busch decided to use an old 4-bit architecture. I can only speculate why, but pricing was perhaps the deciding factor.

Some research on the net revealed some more material about the CPU and other chips used. The manuals of the TMS 1000 architecture, which also cover later versions such as the 1400 and 1600, can be found here and here. These documents are quite fascinating from a number of perspectives, as they go into the details of the architecture and the instruction set and also give an interesting impression of how what we call 'embedded computing systems' today were programmed in the 1970s. Simulators were used to test the program, which was then put into the ROM on the CPU chip as part of the production run. No way to change it later on, so it had better be perfect before production started.

What surprised me most when studying the hardware architecture and instruction set is that it is very different from the pseudo machine code presented to the user. My impression is that the pseudo machine code was very much inspired by newer processor architectures with a lot of registers and a combined instruction and data RAM residing in a separate chip. The TMS 1600, however, has nothing of the sort. Instructions and data are separate in the chip, all ‘real’ machine code instructions are in a ROM that is accessed via an instruction bus that is separate from the bus over which the built-in memory is accessed.
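To make the difference concrete, here is a tiny toy model of such a split (Harvard-style) design: instructions come out of one memory over their own path while data lives in a separate one. The instruction format and the operations below are invented for illustration, they are not the TMS 1600 instruction set:

```python
# Toy model of a split instruction/data memory. The 'opcodes' are made up
# for illustration and do not correspond to the TMS 1600 instruction set.

INSTRUCTION_ROM = [            # fixed at production time, read-only
    ('LOAD_IMM', 5),           # acc = 5
    ('STORE', 3),              # data_ram[3] = acc
    ('LOAD', 3),               # acc = data_ram[3]
    ('HALT', 0),
]

data_ram = [0] * 64            # separate read/write memory (64 x 4 bit on the real chip)
acc = 0
pc = 0                         # the program counter only ever addresses the ROM

while True:
    op, arg = INSTRUCTION_ROM[pc]   # instruction fetch: its own memory, its own path
    pc += 1
    if op == 'LOAD_IMM':
        acc = arg
    elif op == 'STORE':
        data_ram[arg] = acc         # data access: a different memory entirely
    elif op == 'LOAD':
        acc = data_ram[arg]
    elif op == 'HALT':
        break

print(acc, data_ram[3])   # 5 5
```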

While the pseudo machine code uses 16 registers, the processor itself only has an accumulator register. The 16 registers are simulated using the 64 x 4 bit RAM of the TMS 1600, which, on the real machine, is accessed as RAM over an address bus and not as registers. In addition, the processor chip has no external bus to connect to external memory. There are input and output lines, but their primary purpose is not to act as a bus system. The 2090, however, uses an external 1 kbyte RAM that is accessed via those input/output lines. In effect, the small operating system simulates an external bus to that memory chip, in which the pseudo machine code the user typed in resides. Very impressive!
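Here is a rough sketch of what 'simulating a bus in software' means: the monitor program drives address and data values onto general purpose output lines one access at a time, instead of a real hardware bus doing it. The class and method names, the memory size and the protocol below are made up for illustration and don't reflect the actual 2090 monitor code:

```python
# Toy model of bit-banging an external memory over general purpose I/O lines.
# All names and the protocol are invented; the real 2090 firmware and RAM
# chip interface differ.

class FakeRam:
    """Stand-in for the external RAM chip (modeled as 1024 cells here)."""
    def __init__(self):
        self.cells = [0] * 1024

class BitBangedBus:
    """Drives 'address lines' and 'data lines' one access at a time,
    the way a CPU without a real external bus has to do it in software."""
    def __init__(self, ram):
        self.ram = ram
        self.address_lines = 0   # value currently driven on the address pins
        self.data_lines = 0      # value currently driven on the data pins

    def write(self, address, value):
        self.address_lines = address       # put the address on the output lines
        self.data_lines = value            # put the data on the output lines
        self._pulse_write_strobe()         # latch it into the RAM chip

    def read(self, address):
        self.address_lines = address       # put the address on the output lines
        return self.ram.cells[address]     # RAM answers on the input lines

    def _pulse_write_strobe(self):
        self.ram.cells[self.address_lines] = self.data_lines

bus = BitBangedBus(FakeRam())
bus.write(0x10, 0x5A)
print(hex(bus.read(0x10)))   # 0x5a
```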

There are a number of support chips on the board, used for purposes such as multiplexing different hardware units (the keyboard, the display, LEDs, input connectors and the memory) on the input/output lines. As the input/output lines of the chip are separate and do not work like a bi-directional bus, one of the support chips offers tri-state capability for some hardware so it can be removed from the bus.

The TMS 1600 also has no stack as we know it today. Instead, it has 3 subroutine return registers, so up to three nested subroutines can be called at any one time. This is already an improvement over the original TMS 1000, which only had one such register. Another interesting fact is that the TMS 1600 doesn't have instructions for integer multiplication and division.

Apart from the accumulator register there are the x- and y-registers. These registers, however, are used to address the data RAM. A separate 6-bit program counter is used to address the ROM. While the pseudo machine code uses a zero flag and a carry flag, something that is part of all popular 8-bit microprocessor architectures even today, there are no such flags in the TMS 1600. Instead, there's only a status register that acts as either a carry or a zero flag, depending on the operation performed. Also, the processor doesn't have the capability for indirect or indexed addressing.

Also quite surprising was that there are no binary logic instructions such as AND, OR, XOR etc. in the CPU's instruction set. Therefore, these have to be simulated for the pseudo machine code, which contains such commands, again resembling the instruction set of other 'real' CPUs of the time.
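As an illustration of what such a simulation has to do, here is a sketch of a bitwise AND for 4-bit values built only from comparisons and subtraction, i.e. without any native logic instructions. It shows the principle only and is not the actual routine in the 2090 firmware:

```python
# Emulating a bitwise AND on 4-bit values using only comparison, subtraction
# and addition, bit by bit. My own illustration of the principle, not the
# routine actually used by the 2090.

def bitwise_and_4bit(a, b):
    result = 0
    weight = 8                            # start with the most significant of 4 bits
    while weight > 0:
        a_bit = 1 if a >= weight else 0   # test the bit by comparing...
        b_bit = 1 if b >= weight else 0
        if a_bit:
            a -= weight                   # ...and subtracting it off again
        if b_bit:
            b -= weight
        if a_bit and b_bit:               # both bits set -> set the bit in the result
            result += weight
        weight //= 2                      # next lower bit: 8, 4, 2, 1
    return result

assert bitwise_and_4bit(0b1100, 0b1010) == 0b1000
```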

Another nifty detail is the two different kinds of output lines. There are 13 R-output lines that can be freely programmed; some of them are used in the 2090 e.g. for addressing the RAM chip (address bus) and some for writing 4-bit values to the RAM (data bus). In addition there are 8 O-outputs that can't be set freely. Instead, they are driven via a 5-bit to 8-bit code converter whose conversion table was part of the custom CPU programming. From today's perspective it's incredible to see to what lengths they went to reduce circuit complexity. So, a 5-bit to 8-bit code converter, what could that be good for? One quite practical application is to illuminate the segments of a 7-segment display. And as only one digit of the six-digit display can be driven at a time, it's likely that the R-outputs are not only used for addressing the RAM but also to select one of the six digits of the display.
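To illustrate what such a fixed code converter buys you: instead of computing seven segment lines in software, the CPU outputs a small code and a hard-wired table turns it into the segment pattern. The table below is a generic 7-segment font, not the actual converter contents of the 2090's CPU:

```python
# Generic 7-segment font (segments a..g as bits 0..6). These are standard
# display patterns, not the actual code converter table of the 2090's CPU.

SEGMENTS = {
    0x0: 0b0111111, 0x1: 0b0000110, 0x2: 0b1011011, 0x3: 0b1001111,
    0x4: 0b1100110, 0x5: 0b1101101, 0x6: 0b1111101, 0x7: 0b0000111,
    0x8: 0b1111111, 0x9: 0b1101111, 0xA: 0b1110111, 0xB: 0b1111100,
    0xC: 0b0111001, 0xD: 0b1011110, 0xE: 0b1111001, 0xF: 0b1110001,
}

def o_outputs_for(digit):
    """Return the 8 output bits (7 segments plus one spare bit here) for a hex digit."""
    return SEGMENTS[digit & 0xF]

print(f'{o_outputs_for(0x3):08b}')  # segment pattern for the digit 3
```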

Virtually traveling back in time and seeing a CPU like this in action rather than just reading about it is incredible. I now understand much better how the CPU architecture we still use today came to be and how limitations were lifted over time. A fascinating journey that has led me to a number of other ideas and experiments, as you shall read here soon.

Have Turned Off Auto-Approval For Comments For The Moment

If you have commented in the past couple of days you have probably noticed that the comments are not published immediately anymore. Unfortunately I am getting a lot of spam comments at the moment that are not filtered out automatically. As it is less work to approve valid comments for the moment than to remove the spam I've decided to turn off auto-approval of comments. Sorry for the inconvenience, I'll turn it on again as soon as Typepad can handle the spamming…

Retiring the Dongle Dock

Being a frequent traveler, I was one of the first to wish for a product with which I could share a 3G connection over Wi-Fi; my first article about it dates back to 2006. It took another two years, however, until one of the first easy-to-set-up devices, the Huawei D100, a Wi-Fi access point designed to establish an Internet connection over a separate 3G USB stick, appeared on the market in 2008. Fortunately I was in Austria at the time and could buy an unlocked version for a few euros. I've used it frequently since then and it has become a mandatory travel accessory for me. Now in 2013, i.e. 5 years later, I am finally about to retire it.

Thanks to Android, Wi-Fi tethering has now become a standard feature of most smartphones, and despite limits such as the number of concurrent Wi-Fi connections it supports, it is sufficient for my use. The range of the Wi-Fi chip in a smartphone is perhaps not as good as that of the D100, but in practice the distances I need to cover in hotel rooms and meeting rooms are no problem for a smartphone. Five years of use before a wireless device is retired is quite a thing. Back in 2008, the N95 was the latest and greatest in terms of technology, just to give you an idea of the timeframe we are talking about.

Impact of Virtual Machines on Idle Mode Power Consumption

Ever since I discovered the benefits of running virtual machines on my notebook for a variety of things and how easy it is in practice, I usually have three of them running at the same time. Yes, three of them at the same time, and with 8 GB of RAM and Ubuntu as the host operating system the experience is quite seamless.

A second Ubuntu is usually running in one virtual machine so I can quickly try out things, install programs I only need for a short time and don't want lingering around on my system, and run a TORified Firefox against unfriendly eavesdropping by half the world's security services. Also, disabling the virtual network adapter and mapping a 3G USB stick or USB Wi-Fi stick directly into the virtual machine gives me a completely separate and independent second computer, great for networking experiments. The other two machines usually run an instance of Windows XP or Windows 7 for programs that aren't natively available under Linux. There aren't a lot of those, but they do exist. As the VMs are not in the way, I usually start them but never terminate them unless I need to reboot the host. The only thing I noticed is that there is a power consumption impact.

When I recently took a long train trip, I noticed that the remaining operating time indicated in the status bar was about one hour longer than usual. I was puzzled at first, but soon realized the difference: I had rebooted just the day before and hadn't needed a VM since. It's obvious that VMs have an idle power consumption impact, because instead of one operating system there are usually four performing their background operations during idle times on my notebook. So while I was surprised, I really shouldn't have been. But the takeaway is that I now know a good way to increase battery life in case I need it.

The Paper Map In The Car Is On Its Way Out

I always like to have a backup plan in place in case something goes wrong. For that reason I have kept a paper map of Europe in the car, just in case there's a problem with the maps and navigation app on my smartphone du jour. But recently I noticed that I can't remember when I last took it out!?

Honestly, I can't, and it must be close to 10 years since I last used it. This, the fact that the map must be pretty out of date by now anyway, and usually having more than one device with me these days that can run a navigation app make me think that the map is about to be discarded. Or perhaps I should keep it for historical reasons? The last paper map I bought…

Like telephone booths and coins that are fading away it's one of these things which mobile devices, mobile voice and mobile Internet access have made superfluous. Can you remember the last time you've used a paper map for navigation or orientation?

The ‘Must Read’ Book If You Want To Understand How A Processor Works

When it comes to computers I've always had something of a blind spot: I know how memory works, what Boolean logic is, how a computer adds and subtracts, I know what a bus is, what registers are, how to program in machine language, etc. etc. However, I never really quite figured out how the CPU makes data go from RAM to the ALU and, after processing, back to RAM. I always had a vague idea of how it works, but the control unit, whether with fixed control paths or driven by what is called microcode, pretty much remained a black box. Recently I started looking into this topic again and found a number of sources that explain in simple words how a processor works, including the control unit.

An incredible resource I found is a book called "But How Do It Know – The Basic Principles of Computers for Everyone" by J. Clark Scott. I wouldn't have thought it possible, but within 30 minutes with this book I understood how the control unit in a CPU works (based on my previous understanding of how all the other parts work). And I didn't just understand sort of how it works, but how it really works. The book describes how a CPU and memory work in less than 150 pages, and although that might be considered short, it goes into the details down to the gate level. And it does so in a language that can be understood by anyone, even without prior knowledge of electronics. For decades I tried to understand how this works and always had to abort my efforts at some point. And then the mystery is solved by this book in 30 minutes. It's almost shocking, as is the price of only 16 euros for the paperback version.

There's a 20 minute video on Youtube that is based on the book, also highly recommended. While the video is great, you should keep in mind, however, that the book goes into much more detail without becoming complicated or boring. Yes, I am very enthusiastic about the book, it has been a real eye opener.

While the book describes a traditional 'wired' control unit built from gates, some processors use a "microcode" based control unit instead. That sounds even more complicated, but if you have good prior knowledge of how a CPU works (e.g. from reading the book above) and then have a look at this project, which shows how to build your own CPU with a microcode based control unit, you'll see that a microcode based control unit is actually simpler to understand than a traditional control unit built from gates. Another revelation for me!
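In essence, a microcode based control unit is just a table lookup: for every instruction and every step of that instruction, a small ROM lists which control signals to assert in that clock cycle. The toy sketch below shows the idea; the instruction names, control signals and steps are invented and don't correspond to any particular CPU or to the project linked above:

```python
# Toy sketch of a microcode based control unit: a table maps (opcode, step)
# to the set of control signals asserted in that cycle. All names are
# invented for illustration.

MICROCODE = {
    ('LOAD', 0): {'pc_enable', 'mar_set'},        # put the PC on the bus, latch it into MAR
    ('LOAD', 1): {'ram_enable', 'ir_set'},        # fetch the instruction into IR
    ('LOAD', 2): {'ram_enable', 'reg_a_set'},     # move the operand from RAM into register A
    ('ADD',  0): {'pc_enable', 'mar_set'},
    ('ADD',  1): {'ram_enable', 'ir_set'},
    ('ADD',  2): {'reg_b_enable', 'alu_add', 'reg_a_set'},
}

def run_instruction(opcode):
    """Step through the microcode table for one instruction and show the signals."""
    step = 0
    while (opcode, step) in MICROCODE:
        signals = MICROCODE[(opcode, step)]
        print(f'{opcode} step {step}: assert {sorted(signals)}')
        step += 1

run_instruction('ADD')
```

Seen like this, the 'mysterious' control unit is little more than a lookup table plus a step counter, which is exactly why it felt simpler to understand than a hard-wired design.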