Equivalent PLMNs in Germany – Telefonica O2 and E-Plus Network Integration

After acquiring the E-Plus mobile network in Germany, Telefonica O2 has started integrating the two physical networks. For their customers, the first visible change is that Telefonica has enabled national roaming between the two networks. There are several ways to do this, so I had a closer look at which option Telefonica has chosen in Germany.

As reported by the press, national roaming has been enabled for the two formerly separate GSM and UMTS networks but not yet for LTE. In Cologne, there's advertising in the streets now to make customers aware of it. And indeed, the two UMTS networks are now open for national roaming, while the LTE networks are still separate and registration attempts with an E-Plus SIM card are rejected on a Telefonica LTE cell. At this point in time in Cologne, however, the GSM networks are not yet shared either, and my E-Plus SIM card is rejected in Telefonica's GSM network. From a policy point of view that's quite interesting, as it shows that mobile Internet access, rather than 2G voice telephony, is now the focus.

To make devices aware that they can roam between the networks, 3GPP defines the Equivalent PLMN (ePLMN, Public Land Mobile Network) mechanism. The ePLMN indication can be put into the GSM, UMTS and LTE broadcast information, which is, for example, what is done in Austria to indicate national roaming between the former Orange network, 3AT and T-Mobile. Telefonica in Germany, however, has chosen not to go down this route, as I could observe no ePLMN information in the broadcast information of any network technology.

Instead, ePLMN information is sent to the device at the end of the registration process in the Location Area Update Accept and Routing Area Update Accept messages. I played around a bit and this works just as well: as designed, my mobile moved from an E-Plus cell to an O2 cell despite having manually selected the E-Plus network.
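To illustrate the principle, here's a minimal sketch (Python, not real modem code) of how a device could treat an equivalent PLMN list received in a Location or Routing Area Update Accept when deciding whether a cell is suitable. The PLMN codes 262-03 (E-Plus) and 262-07 (O2 Germany) are used purely as illustrative examples.

```python
# Minimal sketch (not real modem code): how a device could use an
# equivalent PLMN list delivered in a Location/Routing Area Update Accept.
# PLMN codes 262-03 (E-Plus) and 262-07 (O2 Germany) are example values.

from dataclasses import dataclass, field


@dataclass
class PlmnSelector:
    home_plmn: str                              # PLMN of the SIM, e.g. "262-03"
    equivalent_plmns: set = field(default_factory=set)

    def on_update_accept(self, eplmn_list):
        """Store the equivalent PLMN list from the LAU/RAU Accept message.
        It stays valid until it is replaced by the next update procedure."""
        self.equivalent_plmns = set(eplmn_list)

    def cell_is_suitable(self, cell_plmn: str) -> bool:
        """A cell is suitable if it belongs to the home PLMN or to any
        PLMN currently marked as equivalent."""
        return cell_plmn == self.home_plmn or cell_plmn in self.equivalent_plmns


selector = PlmnSelector(home_plmn="262-03")     # E-Plus SIM
selector.on_update_accept(["262-07"])           # network declares O2 equivalent
print(selector.cell_is_suitable("262-07"))      # True  -> reselection to O2 allowed
print(selector.cell_is_suitable("262-01"))      # False -> other networks still rejected
```

Once such a list is stored, cells of the equivalent network are treated like home cells for reselection, which matches the behavior observed above; the list is replaced (or cleared) with the next update procedure.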

While this is 3GPP standards compliant, not all customers are happy with the result in practice. As reported in this article, some Telefonica/O2 customers are unhappy to find their mobile on the E-Plus side of the combined network, which at times and in some locations seems to be more congested than the O2 network that is also present. But once in the E-Plus UMTS network, the only way to automatically get back to the O2 UMTS network is to lose E-Plus UMTS and GSM coverage. It would of course also be possible to define neighbor cell relationships between the two networks so the mobile could find cells of the 'other side', but I guess that's too much trouble for an interim solution.

Network sharing is also bad news for subscribers in densely populated areas where only one of the two UMTS networks has been available so far. Due to the sharing, that network now has to cope with twice the number of subscribers. According to the article above, that has some rather unpleasant slow-down effects at times.

A quirk caused by sharing the two UMTS networks but not the LTE networks is that once a device is in the 'partner' UMTS network it is unable to go back to the LTE network of its home operator until coverage of the partner's UMTS/GSM network has been lost. As the LTE networks are not shared, it would not even be possible to announce neighbor cell relationships. I can see why that will make some customers with patchy LTE coverage quite unhappy.

So while national roaming is a good idea to extend coverage for customers, and probably a requirement for shutting down the E-Plus network over time, it will be interesting to observe how many customers are affected by the negative consequences of a move that the marketing departments try to give a positive spin.

98% Wireless Broadband Coverage A Requirement After The German Spectrum Auction

As reported in the previous post, another spectrum auction has started in Germany this week, this time with only three companies allowed to bid for the spectrum. Two things make this spectrum auction especially interesting for me. First, all of the GSM 900 spectrum and quite a bit of the GSM 1800 spectrum is being re-auctioned, as the licenses awarded a decade or two ago are expiring. So it's going to be interesting to see who wants to acquire how much spectrum in the rather narrow GSM 900 band, which is not well suited for broadband Internet services because it will have to carry the GSM narrowband service for the foreseeable future. The 1800 MHz band is a different beast, as it is wide enough for high-speed Internet services from several network operators and is already used in Europe for that purpose in addition to GSM.

The second interesting thing for me in this auction is that the German regulator (BNetzA) requires each company that acquires spectrum in the new 700 MHz (Digital Dividend 2) band to cover 98% of the population with mobile broadband Internet service at a speed of at least 10 Mbit/s per customer on average and 50 Mbit/s per sector of a base station. The latter requirement means that at least 10 MHz of spectrum has to be used per sector. For the details, have a look at the 250+ page rules and requirements document for the auction.
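The 10 MHz figure follows from a simple back-of-the-envelope calculation. Here's a rough check; the assumed spectral efficiency of 5 bit/s/Hz is my own ballpark figure for LTE under good conditions, not a number taken from the auction document:

```python
# Rough sanity check of the per-sector spectrum requirement.
# The spectral efficiency value is an assumption, not from the BNetzA rules.

required_sector_rate_mbps = 50        # 50 Mbit/s per sector, per the auction rules
spectral_efficiency_bps_per_hz = 5    # ballpark LTE downlink figure (assumption)

required_bandwidth_mhz = required_sector_rate_mbps / spectral_efficiency_bps_per_hz
print(f"Spectrum needed per sector: ~{required_bandwidth_mhz:.0f} MHz")   # ~10 MHz
```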

Today, we are still quite a bit away from that goal. According to the regulator's report for 2014 that was published a couple of days ago, 92% of the population is now covered by at least one LTE network and the market leader's LTE network covers 80% of the population. According to the auction rules, however, EACH network has to cover 98% of the population in three years' time, so from that point of view there's still some work to be done.

And finally, I also find it quite interesting that the rules go into the details of which statistics each network operator has to deliver to the regulator annually, including the requirement to supply SIM cards and methods so the regulator can make its own assessment of how well each network is deployed.

German Spectrum Auction 2015 Started – Online Sources

Yesterday, the 2015 spectrum auction for wireless network operators started in Germany. In addition to the re-auctioning of spectrum in the 900 and 1800 MHz bands due to decades-old licenses expiring, new spectrum is being auctioned in the 700 MHz band (Digital Dividend 2), along with some extra uni-directional spectrum in the 1500 MHz band. Hopefully, having only three companies bidding for the spectrum will not drive up the auction results to unreasonable levels as in the past. Anyway, Teltarif has published a good report about the results of the first day here, and they'll probably follow the proceedings and comment on a regular basis. Worth watching in case you are interested in spectrum auctions. Their posts are in German but Google can help with the translation… And in case you are wondering about the T&Cs of the auction, they can be found here, again, unfortunately in German only.

We Are Past The “Human Subscription Peak” As Well

A couple of days ago I started my analysis of this year's report of the German telecoms regulator (BNetzA) with a first post on how we are past "peak telephony". The report also clearly lays out that we are past the "human subscription peak" as well.

Ten years ago, in 2004, there were 89 million mobile subscriptions in Germany. The peak was reached in 2011 with 142 million subscriptions. Since then, the number of subscriptions has gone up and down by a few million year on year, and in 2014, 139 million subscriptions were counted. In other words, there is no growth anymore, despite the push for mobile devices in addition to smartphones, such as tablets that also have cellular connectivity.

So should we see growth in this area again in the future, it will probably come from other segments, machine-to-machine communication for example. In other words, the number of SIM cards might from now on be a good indicator of how much and how fast non-human machine communication gains traction.

Past the “Peak Telephony” In Germany

Recently, Dean Bubley has written an interesting blog post about how most industrial nations are beyond “peak telephony”, i.e. the combined number of voice minutes in fixed line and mobile networks is decreasing. When the German regulator published its report for 2014 a couple of days ago, I had a closer look here as well to see what the situation is in Germany. And indeed, we are clearly past peak telephony as well.

And here are the numbers:

In 2014, fixed line networks in Germany saw 154 billion outgoing minutes, which is 9 billion minutes less than in the previous year. On the mobile side, an increase of 1 billion minutes was observed. In total that's 8 billion minutes less than the previous year, or about -3%. The trend has been going on for quite a while now: in 2010, combined fixed and mobile outgoing voice minutes stood at 295 billion, compared to 265 billion in 2014. That's roughly 10% less over that time frame.
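For those who want to check, here's a quick calculation that simply reproduces the percentages from the minute counts quoted above:

```python
# Reproducing the percentage figures from the minute counts quoted above.

fixed_2014 = 154      # billion outgoing fixed line minutes in 2014
fixed_drop = 9        # billion minutes less than the year before
mobile_gain = 1       # billion minutes more than the year before

total_2014 = 265      # billion combined fixed + mobile minutes in 2014
total_2013 = total_2014 + fixed_drop - mobile_gain   # 273 billion

year_on_year = (total_2013 - total_2014) / total_2013
print(f"Year-on-year change: -{year_on_year:.1%}")    # about -3%

total_2010 = 295      # billion combined minutes in 2010
since_2010 = (total_2010 - total_2014) / total_2010
print(f"Change since 2010:   -{since_2010:.1%}")      # about -10%
```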

A question the numbers can't answer is where those voice minutes have gone. Have they been replaced by the ever-growing traffic of instant messaging apps such as Whatsapp, or have they been replaced by Internet based IP voice and video telephony such as Skype? I'd speculate that it's probably both to a similar degree.

Skype Still Supports Linux – But I Got Rid Of It On The PC Anyway

Despite my fears last year that Skype, which is owned by Microsoft these days, might cease to support Linux on the PC at some point and leave me stranded, it hasn't happened yet. Last year I speculated that should this happen, I would probably just move Skype to an Android tablet and be done with it. As I remarked at the time, this would have the additional benefit of reducing the exposure of my private data to non-open source programs. Between then and now I went ahead and tried using Skype on a tablet and a smartphone despite its ongoing support for Linux on PCs and found that it's even nicer to use on these platforms than on the PC. During video calls I can even walk around now without cutting multiple cords first. And there is no exposure of my private information to a non-open source program anymore, as I otherwise only use that tablet for ebook reading. I'm glad tablets have become so cheap that one can have several of them, each dedicated to a few specific purposes. That ties in well with my thoughts on the Macbook 2015 becoming the link between Mobiles and Notebooks.

My Gigabit/s Fiber in Paris Is Already Outdated – Say Hello to 10 and 40 Gbit/s PON

Since I know how a gigabit GPON fiber link feels and performs, and since it's deployed at significant scale in some countries, I can't help but wonder when telecom operators in other countries will stop praising DSL vectoring, with its 100 Mbit/s downlink and a few Mbit/s in the uplink, as the technology of the future and become serious about fixed line optical network deployment. Having said that, I recently noticed that the Gigabit Passive Optical Network (GPON) I have in Paris with a line rate of 1 Gbit/s is actually quite out of date already.

10G-PON, already specified back in 2010, is the successor technology and, as the name suggests, offers a line rate of 10 Gbit/s. According to the Wikipedia article, that line rate can be shared by up to 128 users. Thankfully, PON networks are upgradable to 10G-PON because the fiber cable itself can be reused. Backwards compatibility is ensured as 10G-PON uses a different wavelength than GPON, so both can coexist on the same fiber strand. This allows a gradual migration of subscribers by first changing the optical equipment in the distribution cabinet and subsequently the ONTs, i.e. the fiber devices in people's homes.
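To put the shared line rate into perspective, here's a quick worst-case calculation based on the numbers above. In practice traffic is bursty, so individual subscribers will see far higher rates most of the time:

```python
# Worst-case per-subscriber share of the 10G-PON line rate if all
# subscribers on one fiber were transmitting at the same time.

line_rate_gbps = 10
max_users_per_fiber = 128   # maximum split quoted above

worst_case_mbps = line_rate_gbps * 1000 / max_users_per_fiber
print(f"Worst-case share: {worst_case_mbps:.0f} Mbit/s per subscriber")   # ~78 Mbit/s
```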

But that's not all, as standardization of the successor to the successor is already in full swing. NG-PON2 is the new kid on the block and will offer 40 Gbit/s downlink speeds over several wavelengths on a single fiber and 10 Gbit/s in the uplink direction. For details have a look at ITU G.989.1, which contains the requirements specification, and G.989.2 for the physical layer specification.

So who's still talking about a measly 100 Mbit/s in the downlink?

5G – Separating Use Cases From Technical Debates

For first and second generation mobile networks the use case was pretty simple: make voice telephony wireless and mobile. For third and fourth generation networks the use case was to mobilize the Internet. Perhaps it's only in retrospect, but these use cases are pretty easy to grasp. On the other hand, I can still remember the 'search for a killer app' for 3G networks that went on for many years. I'm not sure it was ever found, as that search was conducted with the mindset that the killer app should come from network operators, when in reality the 'killer app', as far as I'm concerned, was to mobilize the Internet as a whole. So what about 5G then?

Compared to the discussion that took place around 3G (UMTS) and 4G (LTE) at the time, the discussion on what 5G will be and why we need it is too hazy for me, as lots of more or less realistic use cases are discussed while the discussion of how 5G will actually work happens more or less in the background. Stephen Temple over on his web site suggests splitting the 5G discussion into a use case debate and a technical debate. A good idea in light of the fact that most of the network operator centric use cases discussed at the time for 3G and 4G were never realized the way they were envisaged (e.g. IMS as a universal service platform). He has a number of very interesting thoughts on the technical side, including the potential non-regulation of spectrum above 5 GHz and close range wireless-fiber networks as technical cornerstones of 5G.

C64 History: Chuck Peddle Amp Hour Podcast

Being a bit of a history buff (e.g. see my article on 'C64 Vintage and Virtual Hardware For Exploring The Past'), I stumbled over a recent podcast the Amp Hour did with Chuck Peddle. If the name doesn't sound familiar, it could be because mainstream media often portray the 1980s and 1990s as an epic struggle between Apple, Microsoft and IBM. This is perhaps because all three companies still exist today, but the story is a lot bigger than that, as companies such as Commodore and Atari and home computers like the C64 played a big part in that revolution as well. In the Amp Hour interview, Chuck Peddle, the leader of the team that designed the 6502 processor that made home computing affordable for the masses in the 1980s, goes back to the times before and after the C64 and tells the story from his point of view.

Peddle says that while Apple built for style and IBM built for business, Commodore built for the masses. I more than agree with this statement, as the C64 was the only home computer my parents could afford to buy me as a kid. Both Apple and IBM played in a totally different league from a pricing point of view. So if you want to spend a good time hearing about history, lean back and enjoy that podcast. And if you want to learn more, Brian Bagnall's 'Commodore – A Company On The Edge' is a great source for additional details and stories about computing in the 1980s and 90s.

The 2015 Macbook Is The Link Between Mobiles And Notebooks

The 2015 Macbook is certainly not a product that could replace my productivity notebook. With only a single connector for power and connectivity, it is utterly unsuitable for my usage scenario, where even the 3 USB ports, the external screen connector and the Ethernet port of my current notebook are often not enough. But even so, the device is a first of its kind, because it bridges the gap between smartphones/tablets on the one side and notebooks/PCs on the other.

Have a look at the iFixit teardown and you'll see what I mean. The motherboard is only slightly bigger than what you find in a tablet today and certainly doesn't look like a traditional notebook motherboard anymore. But the whole setup is powerful enough to run a "full" operating system rather than a stripped down one such as Android or iOS. If the screen were touch sensitive, the device would effectively be a tablet with a built-in keyboard running a full desktop operating system rather than a notebook.

So what began in the mobile space in 2007/2008, when the first Android and iOS devices replaced operating systems that had been developed far away from the desktop world with the Linux/BSD kernel, has now extended to the full operating system itself.