The Symbian Foundation: Will It Make A Difference For Developers?

A lot has been written lately about Nokia buying the Symbian shares of Sony Ericsson and others and creating the Symbian Foundation to release the OS as open source in the future. A lot of people become ecstatic when they hear 'Open Source', as it seems to be a synonym for success and the only way to go. However, there are different kinds of open source approaches and usage licenses, so it is worth considering what developers will be able to do with an open Symbian that they can't do today.

The big difference to Linux, which is also open source and has attracted many individuals and companies to start their own distributions, is the hardware. In the PC world, hardware is well standardized, so people can easily modify the kernel, compile it and run it on their machines. In the mobile world, however, hardware is very proprietary, so I think it is unlikely that the same will happen with an open Symbian, no matter how open the OS becomes. An open Symbian is therefore mainly interesting for hardware manufacturers, as they will have easier access to the OS and can customize it more easily for their hardware. That's a long way from 'I don't like the current OS distribution on my mobile, so I download a different one from the Internet and install it on my phone'. But maybe we are lucky and open sourcing the OS will allow application programmers to use the OS more effectively and extend it in ways not possible today due to the lack of transparency.

For more thoughts on what the Symbian Foundation might or might not change in practice, head over to Michael Mace and AllAboutSymbian; their blogs offer great insights from a lot of different angles.

Blackberry Impressions

Location: a restaurant in Miami Beach, and I am surrounded by Blackberry users and a couple of Danger Hiptop users! And no, these people are not the typical business users who carried the Berries almost exclusively only a short while ago.

I noticed a similar trend at the conference I attended in Orlando last week. Most people had a Blackberry with them and nothing else, a bit of American monoculture. About half of them had a company Berry, while the others had bought the devices themselves because they see the usefulness of having mobile eMail. Everyone I asked also used the device for mobile web access, and most of them used Facebook. And we are not talking about the teens and twentysomethings of Miami Beach here but about mothers and fathers in their thirties and forties.

Two very different and very interesting directions for the Berries and the mobilization of the Internet!

Can 300 Telecom Engineers Share a 1 Mbit/s Backhaul Link?

I am sitting in a Starbucks in Miami after an intensive conference week, starting to reflect on what I have seen and noted. One straightforward lesson from last week is that conference organizers, especially in the high-tech sector, have to ask about the details of the Internet connectivity of the venue they want to use. Just having Wi-Fi is not enough; the capacity of the backhaul link is even more important. In our case, 300 people were left without a usable Internet connection for the week because the backhaul was hopelessly underdimensioned for the load. When I arrived among the first on Sunday, the best I got was about half a megabit per second. During the week it was a few kbit/s at best. eMails just trickled in, and using the Internet connection for voice calls was impossible.
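
A quick back-of-the-envelope calculation shows why it was hopeless. The concurrency factor below, i.e. what fraction of attendees is online at any one moment, is my own assumption, but even a generous value doesn't save a 1 Mbit/s link shared by 300 engineers:

```python
# Rough per-user share of a shared backhaul link. The concurrency
# factor is an assumption for illustration.

ATTENDEES = 300
BACKHAUL_KBITS = 1000  # a 1 Mbit/s backhaul link

def per_user_share(backhaul_kbits, users, concurrency):
    """Average share per active user at a given concurrency."""
    active_users = max(1, round(users * concurrency))
    return backhaul_kbits / active_users

for concurrency in (1.0, 0.5, 0.2):
    share = per_user_share(BACKHAUL_KBITS, ATTENDEES, concurrency)
    print(f"{concurrency:4.0%} online -> {share:5.1f} kbit/s per user")
```

Even if only a fifth of the attendees are online at any moment, roughly 17 kbit/s per head is nowhere near enough for a voice-over-IP call, which needs several tens of kbit/s once packet overhead is included.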

While some might see this as just an inconvenience and argue that you should concentrate on the conference anyway, there are others, like me, who need to answer a couple of eMails and call people throughout the day to keep normal business going. So instead of making free calls over the Internet, many of us had to fall back on our mobile phones and paid a dollar or euro or more per minute in roaming charges. That extra cost to the company, multiplied by 300, is significant. Last year, at the same conference in a different venue, there was an 8 Mbit/s backhaul link and things ran a lot more smoothly. But I guess by next year even that will not be good enough to keep things going when 300 engineers arrive.

P.S.: Good thing I had my AT&T prepaid SIM card. With the MediaNet add-on I could access the net and get to my eMails via AT&T's EDGE network. Definitely not at multi-megabit speed, but a lot faster than over the hotel's Wi-Fi.

How Many Gold Subscribers Can You Handle?

While well-dimensioned 3G networks offer fast Internet access today, some underdimensioned networks are showing the first signs of overload. Some industry observers argue that the answer is to introduce tiered subscriptions, i.e. the user gets a guaranteed or higher bandwidth if he pays more. I am not sure this will work well in practice, for two reasons. First, when some users are preferred over others in already overloaded cells, the experience for the majority gets even worse. Second, if such higher-priced subscriptions become popular because the standard service is no good, at some point it won't be possible to satisfy even these subscribers. Gold subscriptions just push the problem out a bit in time but otherwise don't help a lot. There is no way around sufficient capacity, or your subscribers will migrate to network operators who have done their homework. So instead of only investing in QoS subscription management, I would rather also invest in analysis software that reports which cells are frequently overloaded. That gives the operator the ability to react quickly and increase the bandwidth in the areas covered by such cells. Having said all of this, what do you think?
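
To make the second point concrete, here is a toy model of strict-priority scheduling in one overloaded cell; all the numbers (cell capacity, guaranteed rate, user count) are made up purely for illustration:

```python
# Toy model: 'gold' users are served first up to a guaranteed rate,
# everyone else shares the remainder. All figures are illustrative.

CELL_CAPACITY = 7200  # kbit/s shared in the cell
GOLD_RATE = 1000      # kbit/s guaranteed per gold user
USERS = 30            # active users in the cell

def cell_shares(gold_users):
    """Return (kbit/s per gold user, kbit/s per standard user)."""
    served_gold = min(CELL_CAPACITY, gold_users * GOLD_RATE)
    gold_each = served_gold / gold_users if gold_users else 0
    standard = USERS - gold_users
    std_each = (CELL_CAPACITY - served_gold) / standard if standard else 0
    return gold_each, std_each

for gold in (1, 3, 7, 10):
    g, s = cell_shares(gold)
    print(f"{gold:2d} gold users: gold {g:6.0f} kbit/s, standard {s:5.0f} kbit/s")
```

With a few gold users the standard users merely suffer; once too many people buy the gold tier, its guarantee cannot be met either, which is exactly why tiering only defers the capacity problem.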

3 UK Data Roaming Performance

Here I am, back in Italy and again using my 3 UK SIM for Internet access, since there are no data roaming charges between 3 networks in different countries. Very interesting, and also a bit depressing, to see the performance throughout the day. Sunday morning at 8 am seems to be a pretty quiet time, and I easily got 1.5 Mbit/s in the downlink. Evenings are clearly peak time, with data rates dropping to less than 300 kbit/s and long page loading times due to lots of lost packets. I am pretty sure it is not cell overload, since the Wind UMTS network in the same locations easily gives me 1 Mbit/s and no packet loss in a direct comparison. So the bottleneck is either the link back to the home network or the GGSN in the UK.
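
For such comparisons, a minimal approach is to time a fixed-size download and derive the effective throughput. A sketch, with the test URL being a placeholder for any large file on a well-connected server:

```python
# Time a fixed-size download to estimate effective throughput.
# TEST_URL is a hypothetical placeholder, not a real test server.

import time
import urllib.request

TEST_URL = "http://example.com/1MB.bin"  # hypothetical test file

def measure_kbits(url):
    """Download the file once and return the effective rate in kbit/s."""
    start = time.monotonic()
    data = urllib.request.urlopen(url, timeout=60).read()
    elapsed = time.monotonic() - start
    return len(data) * 8 / 1000 / elapsed

print(f"Effective throughput: {measure_kbits(TEST_URL):.0f} kbit/s")
```

Running the same test over both SIMs at the same spot and time separates a radio problem from a core network problem: if one network is fast and the other slow under identical radio conditions, the bottleneck sits behind the air interface.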

While it’s good to see the networks being used and affordable data roaming in place, I’d appreciate sufficient capacity in the core network.

Putting The Hotel TFT To Good Use

[Picture: hotel TFT TV used as a second notebook screen]
Since I travel a lot, I often stay in hotels. One thing in hotel rooms I could live without is the TV set, as I never have the desire or the time to watch anything anyway. In recent years, however, I've noticed that the good old cathode ray tube TVs are giving way to TFT TVs. Last week I took a closer look and noticed that these TVs usually also have a VGA or DVI input. Excellent, now I can finally put them to good use and connect them to my notebook as a second screen. All that is needed is a VGA or DVI cable, which I will take with me from now on. The picture on the left shows what my typical hotel setup looks like: screen 1, screen 2 and a 3.5G HSPA modem for Internet connectivity. Almost as good as at home 🙂

S60 Power Measurements

[Picture: power consumption graph during mobile web browsing]
In a comment to a recent blog entry, somebody left a link to an interesting S60 utility that records power consumption and other technical information while Nokia Nseries and other S60-based devices execute programs. With the utility running in the background, one can perform actions and see their impact on power consumption. The picture on the left, for example, shows power consumption during the different states of mobile web browsing.

The left side of the graph (click to enlarge) shows power consumption while a web page is loaded, i.e. while the device is in the Cell-DCH / HSPA state. Power consumption is at almost 2 watts during this time. Once no more data is transferred, the mobile is set into the Cell-FACH state by the network, which requires much less power. However, 0.8 watts is still significant. After about 30 seconds of inactivity, the network releases the physical bearer and only maintains a logical connection. In this state, power requirements drop to about 0.2 watts, which is mostly used for driving the display and the backlight. When the device is locked and the backlight is switched off, the graph almost drops to the bottom, i.e. to less than 0.1 watts.
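
These numbers translate directly into battery life. A rough sketch, assuming a typical smartphone battery of around 3.7 Wh (1000 mAh at 3.7 V), a figure I've picked for illustration:

```python
# Rough battery-life estimate per power state. The 3.7 Wh battery
# capacity is an assumption; per-state draws are read off the graph.

BATTERY_WH = 3.7  # 1000 mAh at 3.7 V

STATES = {
    "Cell-DCH (HSPA transfer)": 2.0,
    "Cell-FACH (bearer idle)":  0.8,
    "Idle, display on":         0.2,
    "Locked, backlight off":    0.1,
}

for state, watts in STATES.items():
    hours = BATTERY_WH / watts
    print(f"{state:26s} {watts:3.1f} W -> battery empty in ~{hours:4.1f} h")
```

Less than two hours of continuous Cell-DCH operation explains why networks move idle devices to Cell-FACH and release the bearer so quickly.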

An excellent tool to gain a better understanding of power requirements of different actions and processes!

Wireless and Mission Critical

I am on the road quite often and, as most of you have figured out in the meantime, I am a heavy user of 3G networks for Internet access. While I generally like the experience, an outage like the recent two-and-a-half-day nationwide Internet access blackout in the Vodafone Germany network sends shivers down my spine. After all, we are not talking about a third-class operator but about one that claims to be a technology leader in the sector. As I use Vodafone Internet access a lot, I was glad I was only impacted for half a day, having been in a DSL safe haven for the rest of the time. If I had been on the road, however, this would have been a major disaster for me.

I wonder if the company that delivered the equipment that paralyzed the Vodafone network for two and a half days has to pay for the damage caused!? If it's a small company, then such a prolonged outage, with millions of euros in lost revenue, can easily put it out of business. And that doesn't even consider the loss of image for all parties involved and the financial losses of companies relying on Vodafone to provide Internet access. The name of the culprit was not released to the press, but those working in the industry know very well what happened. Hard times for certain marketing people on the horizon…

Vodafone is certainly not alone in facing such issues, as I observe occasional connection problems with other network operators as well. These, however, are usually short, ranging from a couple of minutes to an hour or so. Bad enough.

To me, this shows several things:

  • There is not a lot of redundancy built into the network.
  • Disaster recovery and upgrade procedures are not very well thought through, as otherwise such prolonged outages would not happen.
  • Short outages might be caused by software bugs and devices being reset.
  • We might have reached a point where the capacity of core network nodes is so high that the failure of a single device triggers a nationwide outage.

So maybe operators should start thinking in earnest about reversing the trend a bit and consider decentralization again to reduce the effect of fatal equipment failures. And maybe price should not be the only criterion considered in the future. Higher reliability and credible disaster recovery mechanisms that do not only work on paper might be worth something as well. An opportunity for network vendors to distinguish themselves?
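
The decentralization argument is easy to quantify. A small sketch, with a subscriber count that is purely my own illustrative assumption, of how the blast radius of a single equipment failure shrinks with the number of independent core nodes:

```python
# Blast radius of a single core node failure. The subscriber count is
# an illustrative assumption, not an actual operator figure.

SUBSCRIBERS = 30_000_000
OUTAGE_HOURS = 60  # roughly two and a half days

for nodes in (1, 4, 16):
    affected = SUBSCRIBERS // nodes
    print(f"{nodes:2d} core node(s): one failure hits {affected:>10,} "
          f"subscribers for up to {OUTAGE_HOURS} h")
```

The total amount of downtime may not change, but with sixteen regional nodes a single failure becomes a regional nuisance instead of a national headline.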

Antenna Stuff

Recently I spoke to a sales engineer of Kathrein, a prominent German antenna maker that develops and produces all sorts of antenna equipment, from TV antennas to sophisticated cellular network antennas. I can still remember how simple antennas were 10 years ago when GSM was first deployed: in many places simple dipole antennas were used, and sometimes funny-looking trident antennas (that's what I call them anyway, I am sure they have a more scientific name…). In the meantime we have mostly moved to dual-polarized antennas that offer a main and a diversity output to the base station. On top of that, lots of other things have been developed which are either being deployed now or waiting for that 4G bandwidth push that requires sophisticated antenna features:

Dual band antennas, e.g. 900 + 2100 MHz in one standard casing for sites with GSM and UMTS base stations: Such antennas give themselves away with four connectors at the bottom.

Wideband antennas, e.g. 2.1 – 2.5 GHz to support UMTS and LTE with a single antenna: I am sure those will be in high demand once LTE is deployed in the 2.5 GHz range.

Cable reduction: To reduce the number of expensive coax copper cables between the base station and the antennas, combiners have been developed to combine the signals of several base stations, send them through a single cable and then separate them again before they go into the different antennas.

Remote Electrical Tilt (RET): The size of a cell mostly depends on the angle of the antenna on the rooftop. The more it is tilted towards the ground, the smaller the coverage area. When a new base station is installed to increase the available capacity in an area, it is necessary to change the tilt of neighboring antennas to reduce interference. Also, as capacity in the network increases, it is sometimes necessary to change the tilt of antennas to improve the overall coverage and bandwidth distribution. Manually changing the tilt of an antenna for these scenarios is expensive and sometimes simply not possible. This is where RET comes in. Instead of physically changing the angle of the antenna, RET changes it by altering the electrical lengths of the different antenna elements inside the antenna casing. This way, the antenna can be electrically tilted by around 10 degrees. In practice, changing the electrical lengths is done with an electrical motor that drives a spindle inside the antenna casing. The motor is an add-on module at the bottom of the antenna.
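
How much those 10 degrees matter can be seen from simple geometry: with the antenna mounted at height h, the center of the main beam hits the ground at roughly d = h / tan(tilt). A sketch, with the 30 m rooftop height being my own assumption:

```python
# Where the main beam center hits the ground as a function of downtilt.
# The 30 m antenna height is an assumption; real cell footprints also
# depend on the antenna's vertical beam pattern and the terrain.

import math

ANTENNA_HEIGHT_M = 30.0

def beam_ground_distance_m(tilt_degrees):
    """Distance at which the beam center reaches the ground."""
    return ANTENNA_HEIGHT_M / math.tan(math.radians(tilt_degrees))

for tilt in (2, 4, 6, 8, 10):
    print(f"downtilt {tilt:2d} deg -> beam center on the ground "
          f"at ~{beam_ground_distance_m(tilt):4.0f} m")
```

Going from 2 to 10 degrees of downtilt pulls the beam center from roughly 860 m in to about 170 m, a factor of five in reach, which is exactly the knob an operator wants to turn remotely when adding capacity in an area.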

Antenna auto-adjust: One can also imagine RET being used in the future to automatically adjust antenna angles, for example based on the time of day. This could help to increase coverage in certain areas at certain times of day by decreasing the cell size. Things could be pushed even further by linking the RET mechanism to the load of the cell and increasing the tilt when the cell gets busy, to offload some of the cell-border traffic to neighboring cells.

MIMO and Beamforming: And then there is Multiple Input Multiple Output (MIMO) and beamforming for further bandwidth increases, which require several antennas on the rooftop. In practice, these are again included in a single casing so they look like a single antenna from the outside. The first 2×2 MIMO systems will use small crossed antenna elements that transmit the two MIMO channels with vertical and horizontal polarization.
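
Why the extra antennas pay off can be illustrated with the standard MIMO channel capacity formula, C = log2 det(I + (SNR/Nt)·HH^H). The sketch below uses a random Rayleigh-fading channel matrix as a stand-in for the real radio channel, and the 20 dB SNR is a value I've picked for illustration:

```python
# Average Shannon capacity of an n x n MIMO channel with Rayleigh fading.
# The 20 dB SNR and the random channel model are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
SNR = 100.0  # linear scale, i.e. 20 dB

def mean_capacity(n_tx, n_rx, trials=2000):
    """Mean of C = log2 det(I + (SNR / n_tx) * H H^H) over random channels."""
    caps = []
    for _ in range(trials):
        H = (rng.standard_normal((n_rx, n_tx))
             + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
        M = np.eye(n_rx) + (SNR / n_tx) * (H @ H.conj().T)
        caps.append(np.log2(np.linalg.det(M).real))
    return float(np.mean(caps))

print(f"1x1: {mean_capacity(1, 1):.1f} bit/s/Hz")
print(f"2x2: {mean_capacity(2, 2):.1f} bit/s/Hz")
```

At a reasonable SNR the 2×2 configuration roughly doubles the spectral efficiency, which is why the crossed dual-polarized elements are worth the extra feed cables.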

All of this is highly interesting and shows how important antennas have become for increasing bandwidth in the future. Thanks to Kathrein for the interesting information!

A Day In Rome With Nokia Maps and Wikipedia

I've been in Rome many times before, and the main tourist attractions have lost a bit of their appeal for me. So I decided to discover some of the more hidden gems on my current trip. Discovery, though, is the difficult part: it's one thing to read about a sight in a tourist guide but quite another to know how far it is from the hotel or from wherever I happen to be at the time. So I decided to try a Nokia Maps City Guide plugin. For Rome, three were available from different sources. As each is usable for 10 minutes before one has to pay for it, I downloaded all three and finally settled on the WCities Guide.

When the plugin is started, one can choose from a number of categories. I was interested in sightseeing, so I chose that. Next, a list is presented with the sights closest to the current position at the top. They are also shown in normal map mode, so one can easily see where the different sights are in relation to the current position and get some initial info by clicking on the icons. Very nice! That makes it very simple to decide where to go next and how to get there. Only a little textual information is provided for each location, but with the help of the Internet, OperaMini and Wikipedia it is quite easy to get full background information and pictures on almost anything.
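
At its core, the "closest sights first" list is just a sort by great-circle distance from the current GPS fix. A minimal sketch of the idea; the coordinates are real Rome landmarks, rounded for illustration:

```python
# Sort points of interest by great-circle (haversine) distance from the
# current position. Coordinates are real Rome landmarks, rounded.

from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

here = (41.8902, 12.4922)  # standing at the Colosseum
sights = [
    ("San Clemente",   41.8894, 12.4975),
    ("Pantheon",       41.8986, 12.4769),
    ("Trevi Fountain", 41.9009, 12.4833),
]

for name, lat, lon in sorted(sights, key=lambda s: haversine_km(*here, s[1], s[2])):
    print(f"{name:15s} {haversine_km(*here, lat, lon):5.2f} km away")
```

Standing at the Colosseum, San Clemente correctly comes out on top, only a few hundred meters away.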

Here's an example: I decided to visit San Clemente, a basilica close to the Colosseo, which is built on top of a fourth-century church, which in turn stands on a house dating back to the Roman Empire. So after finding the place with the City Guide and Nokia Maps, I started OperaMini and used Google to find the relevant Wikipedia entry on the San Clemente basilica in Rome. Fantastic, lots of background information in there, and since it is all linked you can venture out and discover the lives of San Clemens, San Cyril and other people connected to the location. Better than any general tourist guide book! The rest of the day continued in the same fashion.

From the above it is quite obvious that I really liked the experience; I had a great day walking through Rome, discovering things this way that I had not seen before. Naturally, there are some things that could still be improved. Here are a few:

The WCities Guide doesn't precisely pinpoint the locations of many of the sights it features. That sometimes made it a bit difficult to find a place, as in a city even an inaccuracy of a hundred meters or so puts you in an entirely different spot.

I'd really like a clickable link in the City Guide description of a location to Wikipedia or another encyclopedia for further information. Ideally, this would open the web browser. I could also live with a copy/paste feature for the URL, which would work well in combination with OperaMini anyway.

Price: 8 euros for a guide to a single city is a bit steep. After all, no book has to be printed and there are no shipping and storage costs, so I don't quite see why I should pay almost as much as for a book.

Altogether, a great solution for a different sightseeing experience and I will surely use it again. Also nice to see where things could go with this in the future. Lots of potential!