Surveillance State: Lavabit, Silent Circle, Groklaw Cease Their Services – Who’s Next?

Three months into Edward Snowden's revelations of PRISM and other government programs to monitor pretty much everything that flows through the Internet today, the news still gets worse by the day. Now the first services are shutting down because they can no longer offer privacy within the bounds of the laws of an open, free and democratic society. Instead, secret court orders they can't even talk about would force them to reveal private information on an unprecedented scale. I find that very disturbing, and I feel that we need to speak up against this now, as politicians worldwide are still not willing to have a public discussion on the right balance between security, privacy and personal freedom.

Don't get me wrong, I am not at all against court-sanctioned wiretapping when there is evidence that someone is preparing to commit a serious crime. What I'm against is monitoring everything. Lavabit's owner, Ladar Levison, who used to run a secure email service that encrypted all data stored on his servers and could only decrypt it while the user was logged in, seems to be of the same opinion. When asked by reporters, he stated that in the past he had always complied with court orders to hand over information to government agencies on a case-by-case basis. But it seems the US government now wants to go much further, and hence he has decided to shut down the service. He can't say exactly what is going on because he is bound to secrecy by law and threatened with imprisonment if he breaks it. But from public knowledge of how his email service works, and from his statement that he complied with earlier court orders to surrender information from and about specific accounts, it's pretty easy to discern that the latest order went much further, likely a tap for security agencies directly into his system. From a privacy and civil liberty point of view that is absolutely not acceptable.

Next in line was Silent Circle, another secure email provider, who shut down their email service without any notice because it suffered from the same shortcoming: there is no end-to-end encryption in email by design. No matter how secure you make the transmission, at some point the message exists in the clear on the server before it can be encrypted for transmission or storage. And finally Groklaw, a popular law website, has shut down because the owner feared that the privacy and confidentiality of her sources could no longer be ensured with the current practice of security agencies monitoring the whole Internet rather than only the traffic of persons for whom they have a court order for surveillance.

All of these services could shut down because they are privately owned. That of course does not shed a good light on the big service providers who have not spoken out against this and keep running their services without being able or willing to tell their customers how their private communication is monitored. Society needs trust in order to function. Where's the trust in this? This makes me wonder about the future of Internet companies in the US. The current state of affairs simply means that it's impossible for customers to trust US companies, or US-owned companies abroad, to handle their data securely and privately. Secret court orders can force them to reveal sensitive data to governments, and what is once out of their hands can then easily be used by governments for many purposes. If I owned a non-US company today, the last thing I would do is store or process any data I didn't encrypt on my own premises on the servers of such companies. Money is usually a strong argument, and losing business because of run-away anti-terror laws is perhaps a strong incentive to press for change. But trust is lost, and it will take a lot to restore it. So perhaps we'll see an exodus of tech companies from a country to which people once fled because they wanted freedom. It would be ironic in the extreme.

Sure, I'm trying everything to better protect my privacy. Those of you who follow my ongoing 'Raising the Shields' series know that I go far beyond what a normal user would do. But a lot of my communication is still exposed to mass surveillance, and some of it always will be. Raising the shields is treating the symptoms, not the cure. We need governments to clearly define what security agencies are allowed to do, what they are not allowed to do, and to communicate that openly. Otherwise, a significant part of our civil liberties will remain lost.

Were There Computer Learning Kits for Teenagers In Your Country?

There's one thing I have been wondering about while enjoying my recent trips back in time to the 1980s and computer learning kits: were there similar kits in other countries around the globe, tailored to teaching teenagers how computers work (vs. how to work with computers)?

In German-speaking countries there were three different kits (the Philips 6400, the Busch 2090 and the Kosmos CP1), sold mainly via toy stores (yes!). So I searched the net a bit in the languages I can understand to see if similar kits were sold in other countries, but came up pretty much empty-handed. Perhaps this is due to the way I searched and the keywords I used, but it's difficult to believe this was only something that happened in a few German-speaking countries in Europe.

As many of you must also have grown up in the 1980s, perhaps you still remember something that could point me in the right direction. If so, please consider leaving a comment!

How To Assign Special Characters To Keys In Ubuntu and Linux

Having friends and business partners around the world, I frequently type texts in different languages and have so far always struggled with non-standard Roman characters on my German keyboard. At some point I was so fed up that I spent a couple of hours finding a solution to make the process a bit less cumbersome.

While some non-standard Roman characters can be typed even on a German keyboard by using an "accent" modifier key, others, such as the 'ç', are not directly reachable this way. As I need such characters quite frequently, however, I was looking for a way to assign them to standard Roman alphabet keys together with the ALT or ALT-GR modifier key. The 'ç', for example, should be reachable with the ALT or ALT-GR key (the latter is not available on a standard US keyboard) + the standard 'c' key.

On Ubuntu, and I guess on many other Linux-based GUIs, the ALT key is already used for other purposes by the GUI, so I focused on a solution with the ALT-GR key. As this key is not available in the standard US keyboard layout, I am not sure if the following also works for that layout. But for all layouts that have an ALT-GR key, here's the command to put the 'ç' on ALT-GR + c:

xmodmap -e "keycode 54 = c C c C ccedilla Ccedilla ccedilla Ccedilla"

54 is the keycode of the key on which the standard Roman "c" is located on a German keyboard.

The current assignment of all keys is queried with the following command:

xmodmap -pk

and

xmodmap -pk | grep "(c)"

with brackets around the character, filters the output down to the line containing the keycode of the key to which a specific character is currently assigned.

Non-standard Roman characters have a name that can be used in the assignment command above. The 'ç' character, as in 'François' for example, is called ccedilla in its lowercase variant and Ccedilla in its uppercase variant. For a list of other special characters have a look here.

For each special character assignment a separate xmodmap -e… command is required. The changes are not persistent, however, i.e. a reboot returns the computer to the standard keyboard layout. To make the assignments persistent one can, for example, put all xmodmap commands in a shell script and execute it automatically during the login process, as sketched below.
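As a concrete example (the file name is just a placeholder, and the only mapping shown is the one from this post), such a script could look like this:

#!/bin/sh
# keymap.sh - reapply my special character assignments after login
# (example only - adapt the keycodes and characters to your own layout)
xmodmap -e "keycode 54 = c C c C ccedilla Ccedilla ccedilla Ccedilla"
# add further assignments here, one xmodmap -e call per key

On Ubuntu such a script can then, for example, be added to the 'Startup Applications' list so that it is executed automatically when the desktop session starts.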

Learning How Computers Work – Comparing Approaches of the 1980s

In a previous post I mused about how, in the 1980s, microcomputer learning kits appeared on the market to teach kids and adults how computers work. Note that this is different from learning how to work with computers or even how to program them in a higher-level language, as in both cases the inner workings of the computer are abstracted away. As I'm fascinated by the approach, I also wondered in that post whether such kits are still available today or whether learning how computers work has effectively become a topic for university courses only. Here's what I've found out so far:

Even after some more research I still came up empty-handed, and it seems such learning and tinkering kits have disappeared today. What I have found, however, is a number of 1980s microcomputer learning kits, each with a slightly different approach to the topic. As I am not only interested in the historical perspective but also wonder how this approach could be revived with current hardware and perhaps some self-written software, I had a closer look at the tutorials, which can be found in PDF format on the websites of vintage computing enthusiasts, to see how the different kits went about teaching computing basics. Interestingly enough, the approaches are quite different:

The Busch 2090 4-bit Microcomputer Kit

Released in 1981, the Busch 2090 is the oldest of the three microcomputer learning kits I could find. It is based on a 4-bit microcomputer. Like the other two, the kit is part of a larger electronics experiment program, so it works well together with the company's other electronics kits to expand experiments into the physical world by triggering outputs to LEDs and loudspeakers and receiving input from different kinds of self-assembled circuits. I've blogged about it in a previous post and you can find some more details there. (Sorry for this and many other links in this post being in German, but it seems these kits were mainly popular in the German-speaking world. Google Translate is a helpful tool to translate the pages into your language on the fly.)

Programs are written in a 4-bit pseudo machine code. Each pseudo instruction is always 3 hexadecimal digits long, and all operations are explained in the decimal, binary and hexadecimal systems. While most pseudo instructions closely resemble actual machine instructions, some perform more complex tasks such as displaying the contents of the accumulator or other registers on the display or reading key input. What I liked best about the teaching approach of the tutorial is the jovial style and a mascot that demonstrates important principles in a funny and easily understandable way, as shown in the first image on the left.

Another great feature of the tutorial is that almost all program listings contain the memory address of each instruction, the instruction code, its mnemonic and a description of each operation, as shown in the third image on the left. This makes understanding and memorizing the codes and mnemonics very easy.

I find that the writing style and level of detail make this kit appealing for kids from 12 to 99. The descriptions of how a microprocessor works and how it communicates with memory and peripherals are very simple but still precise. The tutorial has around 80 pages, so it's not a daunting task to go through it even with the attention span of a 12-year-old, and I found it to strike a perfect balance between being too simple and too complex for people who have never touched the subject before.

The Philips 6400 Microcomputer Master Lab

Two years later, in 1983, Philips released its microcomputer learning kit, the Philips 6400 Microcomputer Masterlab. While the general approach is the same as in the Busch 2090, there are some significant differences in the practical implementation. Philips used an 8-bit processor for its kit and, instead of pseudo machine code instructions, real machine code is used for user programs. I very much like this approach as it is as close to the bare metal as possible. It makes programming a bit more complicated, but from the tutorial I take it that the kit was aimed at a somewhat older audience than the Busch computer, perhaps 16 years and up, so this is not much of an issue. Less humor and more facts, I would say.

The level of detail goes much deeper than in the Busch kit. At 160 pages the tutorial is twice the size, and the technical descriptions go much further. There are good explanations of how two numbers are added in the hexadecimal system, how this is done in the binary system and, finally, how this is implemented in hardware with Boolean logic. How subtraction is done by adding the two's complement of one number to another is also introduced, as well as multiplication and division. The kit further introduces the reader to function calls, the stack and the stack pointer, and discusses Boolean logic (AND, OR, NAND, NOR, XOR), bit masking, jump conditions and quite a number of other bare metal operations. For somebody in the final years of high school with a knack for mathematics, this level of detail and complexity is probably just right.
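To make the two's complement trick a bit more concrete (this quick illustration is my own, not an excerpt from the Philips tutorial): in 8 bits, a subtraction is performed by adding 256 minus the subtrahend and keeping only the lowest 8 bits, which can be tried out directly with shell arithmetic:

echo $(( (5 + (256 - 3)) & 0xFF ))   # 5 - 3: add the two's complement of 3 (253), keep 8 bits -> prints 2
echo $(( (3 + (256 - 5)) & 0xFF ))   # 3 - 5: prints 254, which is how -2 is represented in 8-bit two's complement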

My only gripe with the tutorial is that at the beginning the program listings are not explained at all, and the user has to type in numbers without really knowing what they do or what they stand for. This gets better as more details are introduced, as shown in the image on the left, but when reading the tutorial I still felt that the program listings should have contained more explanations on an instruction-by-instruction basis for easier understanding.

The Kosmos CP1

The tutorial I have for this kit is from 1983, so it also seems to have been brought to market two years after the Busch 2090. Like the Busch computer, the Kosmos CP1 seems to address a younger audience, as the introduction to computing is kept at quite a high level of abstraction. Like the Busch kit, it uses pseudo-machine instructions, which are, however, much more abstracted than the Busch pseudo-instructions, which stick much closer to real machine instructions.

Having gone through the other two tutorials before I focused on this one, I have to admit that I found the pseudo-machine code of this approach a bit difficult to understand, and I wonder why they ventured so far away from bare metal instructions. Perhaps this was done to get around the hexadecimal system: there is not a single hex number in the whole tutorial; the computer is programmed entirely with decimal numbers. Discovering the hexadecimal system later on must be somewhat of a culture shock. The image on the left shows what a program for the CP1 looks like. Each command is well described, but the commands themselves are a bit too abstracted from the real hardware for my taste. The tutorial has 145 pages and is split into two parts. The first 60 pages are dedicated to introducing the commands and how a computer works, and the second part is dedicated to program examples that are discussed in detail. Quite a number of the examples let the computer interact with the real world via electronic input and output circuits that can be plugged together on an extension board. While the other two kits include such extension boards, some resistors, transistors, LEDs, etc., I am not sure whether they were included in the CP1 kit or whether it was assumed that the reader had already bought other electronics experiment kits from Kosmos before.

What They All Have in Common And Why Simplicity Trumps The Full Keyboard + Screen Approach

One thing all three kits have in common is a 7-segment display with six digits (Busch 2090 and Kosmos CP1) or eight digits (Philips 6400). As can be seen in the images, the kits are operated and programmed entirely via a numerical input pad and a number of special keys for operations such as performing a reset, selecting and running a program, entering programming mode, etc. This might seem a bit archaic from today's point of view, but since these kits are there to introduce the bare metal innards of computers, I would still do it like that today. It might be too cumbersome to write long programs with only a numerical keypad to type in raw machine instruction codes, but the point of these kits is to teach how a computer works on a low level. Therefore, interacting with it on this level is very appropriate and serves the 'bare metal' approach rather than being a restriction.

In addition, all kits are very hardware-centric, so experiments are not limited to generating output on a 7-segment display and receiving input from the numeric keypad. Instead, experiments can make use of digital input and output ports to which electronic circuits, put together by the user on an extension board, can either deliver input or receive output to trigger lights or generate a sound.

My Preferences For A Modern Microcomputer Learning Kit

If I were to create such a kit again today for kids aged 12 and up as well as an adult audience, I think I would take an approach between the Busch 2090 and the CP1 on the one hand and the Philips approach on the other. I like the simple and fun approach of the Busch 2090 tutorial, but instead of pseudo machine instructions I would use real machine instructions, as the Philips kit does. That would mean introducing function calls a bit earlier in the tutorial to do more complicated things such as showing register contents on the 7-segment display or receiving keyboard input, but I think that is a compromise worth making. One could even have two tutorials then, one for a younger and one for a somewhat more technically knowledgeable audience. Or one could create an easy-to-understand tutorial for both audiences and a second one with more technical details that builds on the first for those who have been hooked.

While I think that the audience for such kits was small back in the 1980s and might be even smaller today, it would still be fun to work on this purely for my own enjoyment and to give something to those who'd like to find out how a computer really works. Too bad the day only has 24 hours…

I Just Noticed We Had 40 MHz LTE Channels in Europe, Counting the American Way

So far, pretty much everyone in the industry has measured channel bandwidth by what either the downlink or the uplink channel alone provides. A UMTS channel thus always had a bandwidth of 5 MHz despite twice the spectrum being used, 5 MHz for the downlink and another 5 MHz for the uplink. Sometimes this was also written as 2 x 5 MHz. But it seems some in the US are now adding uplink and downlink together, as for example in this Gigaom article. A small 5 MHz LTE carrier now has a bandwidth of 10 MHz. Sounds nice when comparing it to other US network operators that, in their words, also use 10 MHz LTE carriers, which are of course 2 x 10 MHz. By those standards, network operators in Europe using the 1800 or 2600 MHz bands already have 40 MHz of LTE deployed! That sounds nice! After 4G, Real 4G, True 4G and LTE-Advanced, it's the latest smoke bomb to confuse the world and make yourself look better than you really are. Sigh…

Raising the Shields – Part 7: Auto-Delete Cookies When The Browser Closes

Most users today are very happy that web services recognize them when they come back to a service over many weeks or even months, as they don't have to identify themselves each time they visit a site. While this is undoubtedly convenient, it creates a number of severe privacy issues:

  • On sites like my favorite online shopping portal I like to browse for things anonymously. If I keep being recognized, the shopping portal can record all my searches and all the results I have clicked on. To me that feels as if I were being observed by a dozen cameras in a store, with the store then analyzing the recordings and keeping the results indefinitely. No thanks.
  • With the exception of Safari, and perhaps Firefox in the future, browsers accept third-party cookies by default. This way, advertisers can track a user's path through the Internet: each time a website is visited that embeds content from the advertiser, the same cookie is sent back, creating a trail for the advertising company of where the user went. 
  • The third-party cookie mechanism also allows popular social networking services to keep track of where their users go when they leave their website. 

All these things are totally unacceptable to me. Fortunately there's an easy fix for this. In the web browser a few simple settings protect users from such schemes:

  • Disallow third-party cookies ("Accept 3rd party cookies = never"). This is the default in Safari, and I've used the setting for many months now without any bad side effects.
  • Configure the browser to delete all cookies when it is closed ("Keep until: I close Firefox"). This way, Amazon and other web services I use do not recognize me again once I restart the browser. The downside is of course that I have to log in again for personalized services. But with the browser's autocomplete feature for usernames and passwords it is only a minor inconvenience.
  • On a few websites, such as my blogging service, I would like to remain logged in despite browser restarts. For this there's a cookie whitelist in Firefox (click on the "Exceptions" button in the Privacy tab of the Preferences). Only cookies on the whitelist survive browser restarts if the "Keep until" option is set as described in the previous bullet point.

The images below show the settings I have made in Firefox to ensure as much privacy for myself as possible for normal web surfing. For advanced privacy needs I then use a TORified browser as described in a previous post.

Firefox privacy  
Firefox exceptions
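As a side note for those who prefer configuration files over dialogs: if I remember the preference names correctly, the same two cookie settings can also be put into the user.js file of the Firefox profile, for example with a small shell snippet like the one below (the profile directory name is a placeholder you have to replace with your own). The per-site whitelist, as far as I know, still has to be maintained via the Exceptions dialog.

#!/bin/sh
# Append session-only cookie settings to a Firefox profile's user.js (sketch only)
PROFILE=~/.mozilla/firefox/your-profile-directory   # placeholder - use your real profile folder
# network.cookie.cookieBehavior = 1: do not accept third-party cookies
# network.cookie.lifetimePolicy = 2: keep cookies only until Firefox is closed
cat >> "$PROFILE/user.js" << 'EOF'
user_pref("network.cookie.cookieBehavior", 1);
user_pref("network.cookie.lifetimePolicy", 2);
EOF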

This has already been part 7 of my "Raising the Shields" series and it will probably not be the last. In case you have missed some of them and are interested, here's a link that shows them all in sequential order.

Bluetooth Revival With PC Connectivity

Over the years my use of Bluetooth for various purposes has diminished significantly. With the advent of Wi-Fi tethering, which I wished for back in 2006 when the first phones with Wi-Fi connectivity were pioneered by Nokia (see here and here) and which was realized a few years later on the Android platform, my Bluetooth use became occasional at best. I tried Bluetooth headsets every now and then but was always disappointed with their range, and I just don't listen to music on my smartphone often enough to make a Bluetooth-enabled stereo headset worthwhile. Also, for transferring pictures or files between mobile devices and my PC, Bluetooth was mostly not an option because I always needed a dongle on the PC side. The time this took was usually the same as removing the SD card from the smartphone and inserting it into the PC, or plugging in a cable to access the pictures on the simulated SD card that Android's generic SD card driver provides. My new PC, however, comes with a built-in Bluetooth transmitter for the first time, and since I no longer have to plug anything in, I have found myself using Bluetooth again for quickly moving files and images back and forth when only one or two files or images are involved. A late renaissance of the technology, as Wi-Fi Direct still doesn't seem to take hold (see here, here and here).

Probing Layer 1 – Part 5: DVB-T Signals

Venturing down the frequency scale a bit, I've now also taken a closer look at DVB-T television signals with my DVB-T USB receiver stick and SDR-Sharp. The image on the left shows the right edge of an 8 MHz DVB-T signal that carries 4 television stations. As in the LTE signals I wrote about in a previous post, vertical stripes can be seen in the waterfall diagram. The stripes are much narrower, however, likely due to the roughly 1 or 4 kHz carrier spacing used in the C-OFDM modulation of DVB-T compared to the 15 kHz OFDM sub-carrier spacing used on the LTE downlink. It is also interesting to note that some parts of the channel along the frequency axis are broadcast with more power than others. I'm not sure why; my understanding of DVB-T signals is very limited.

I was also quite baffled by how little of the spectrum assigned to terrestrial television is actually used today. In Germany, DVB-T can be broadcast in the 177.5–226.5 MHz range (i.e. a total bandwidth of 49 MHz) and between 474 and 786 MHz (i.e. a bandwidth of 312 MHz). That's 361 MHz in total, or enough for 45 DVB-T channels of 8 MHz each. Even though DVB-T is a single-frequency network technology in which neighboring transmitters can use the same channel, not all channels can be used everywhere, as different TV stations are broadcast in different parts of the country. But even if only every second channel were usable, that would still be over 22 multiplexes, and at 4 stations per multiplex an impressive 90 TV programs. In practice much less is broadcast today, and when scanning through the spectrum in Cologne, most channels were empty.

361 MHz is quite a sizable chunk of spectrum, and as the popularity of DVB-T and terrestrial TV broadcasting is on the decline, I can see why there are moves to re-assign the 694–790 MHz range to wireless Internet connectivity, i.e. for use with LTE and perhaps other wireless technologies in the future. The Wikipedia article linked above indicates that the additional band could be brought into operation by 2026. This would give network operators access to an additional 96 MHz of spectrum which, with a duplex gap of 11 MHz, would offer 42 MHz for downlink and 42 MHz for uplink data transmission. That's a bit more than the current digital dividend spectrum in the 800 MHz band, bundled in LTE band 20, where 30 MHz of bandwidth is available in each direction and used by three network operators in Germany today.

The New Nexus 7 Tablet (2013) Supports AT&T, Verizon and T-Mobile LTE In One Device

While in Europe GSM, UMTS and LTE are used by all network operators, the US wireless landscape has always been much more diverse. This meant, and still means, that several device variants have to be produced to support the different networks. But with the advent of LTE and advances in chip technology this may be about to change.

When Google recently introduced the 2013 version of the Nexus 7 tablet, it upgraded the cellular hardware to support 7 LTE bands. For details see Google's page on the Nexus 7 and AnandTech's mini review over here, which has a somewhat different frequency listing. Apart from the stunning number of supported LTE bands, the device also supports the five major UMTS frequency bands.

The LTE band combination for the US is especially interesting, as band 13 is included for Verizon's LTE, band 17 for AT&T's LTE and band 4 for T-Mobile's LTE. This might very well become the future trend and would finally give US consumers the same flexibility as in Europe to buy a device independently of the network operator, or even to change network operators over the life cycle of the device.

I'm not quite sure how Sprint fits into this equation. From what I can tell they have LTE up and running in a 1900 MHz band and have taken over Clearwire's 2500 MHz TDD assets. While the 2500 MHz TDD band is not supported by the device, I have no information about which band Sprint uses for its 1900 MHz assets. If you have more information on that, please leave a comment.

Also interesting is the absence of CDMA support in the tablet. This is probably not surprising for a tablet, as mobile telephony, if implemented in the user interface at all, is not a prime use case. Also, unlike UMTS with its great data rates, CDMA EV-DO only offers limited speeds, which is undesirable in a data-heavy product as well. So why bother?

40% of UMTS Band 1 is Unused

While network operators in the US were perhaps struggling with the amount of spectrum they had for their 3G services and thus rushed to jump on the LTE bandwagon, Europe continues to enjoy very good data rates over 3G UMTS to this day (for details have a look here, for example), in addition to the massive additional capacity now available in other frequency bands with LTE. So I was wondering how much of the UMTS 2100 MHz band 1 spectrum is actually used today.

Cologne is one of the bigger cities in Germany, so it is fair to assume that it is also a place where the highest number of UMTS carriers is needed to satisfy demand. In total, band 1 can host 12 individual 5 MHz UMTS carriers. In practice, however, SDR-Sharp and my DVB-T stick show quite clearly that only 7 of them are in use today. In other words, around 40% of the bandwidth available for UMTS in this band is still unused.

It's interesting to also look at how many 5 MHz slots each of the four network operators has in Germany in the prime UMTS band. The distribution is as follows:

  • Operator 1: 2 channels
  • Operator 2: 3 channels
  • Operator 3: 3 channels
  • Operator 4: 4 channels

Operator 1 has both of its channels on air, so LTE in other bands is the only way for it to increase available capacity.

Operators 2 and 3 also have two channels each on air and have additionally deployed 10 MHz in band 20 for LTE. If necessary, they could still extend their UMTS capacity by one extra channel.

Operator 4 is only using one of its four channels so far! That's in line with that operator always trailing all other operators in speed tests by quite a bit. As that operator does not have spectrum for LTE in the 800 MHz band I would not be surprised if they started with LTE in the 2100 MHz band with a 10 or 15 MHz carrier.