FON on Femtos

Here's an interesting press release from FON, the company behind the FON hotspots that let you share your home Wi-Fi Internet connection in exchange for free access whenever you encounter another FON hotspot. In their quest to increase the number of FON hotspots, they have in the past partnered with DSL providers in France and the UK to include their software in the providers' home gateways. Now, FON has entered an additional partnership with 3G femtocell maker Ubiquisys.

With this deal, future femtocells from Ubiquisys can also include the FON software, so a single home gateway can enhance 3G coverage, provide private Wi-Fi connectivity at home or in an office, and broadcast a public Wi-Fi signal. I think this setup could also make a lot of sense in hotels, airports, shopping areas and other places with lots of people who could benefit from better 3G coverage or Wi-Fi.

Now FON and Ubiquisys probably have to work out how to sell the concept to a fixed line / wireless network operator. I imagine that the incentive for a mobile network operator to include the FON software is similar to that for the fixed line / wireless operators in France and the UK. I'd say it is likely that they are splitting the revenue.

An interesting proposition that stands and falls with the popularity of femtocells. A winning cooperation? What do you think?

HSPA+ No Substitute for LTE

A follow-up to this earlier post on LTE and capacity: Every now and then I see a debate about whether it is better to upgrade from HSPA to HSPA+ or to go straight to LTE. From a capacity point of view I think it's not an "or" decision, it's rather an "and" decision.

I don't see a reason why operators using LTE as a capacity extension in the 2.6 GHz band should not also upgrade to HSPA+ (before, after or at the same time) to make the best out of the 2.1 GHz spectrum as well and to support as many HSPA users as possible. True, not all mobile devices will be HSPA+ capable for quite some time, but HSPA+ is backwards compatible, thus taking everyone forward.

Especially once current equipment reaches end of life and is replaced by multi-technology, multi-band base stations, potentially with new MIMO antennas that can be used simultaneously in the 2.1 and 2.6 GHz bands, it seems a rather natural thing to do to me.

As always, thoughts are welcome.

LTE as an HSPA Capacity Extension

Thought of the day: I keep hearing that LTE is great because you can go beyond the capacity of HSPA(+). Well, from an air interface perspective (Bits/s/Hz), that's going to be tough to achieve as HSPA(+) is going in the same direction capacity wise given the same amount of spectrum as LTE.

BUT, and that's the angle I hadn't thought about yet: LTE can be used in the new 2.6 GHz band, something for which HSPA is not specified and for which current base stations and many antennas are also not designed. So once you reach the limit of your HSPA capacity, i.e. once you have used all the assigned spectrum in the 2.1 GHz band, you can use LTE to increase capacity with an overlay 2.6 GHz deployment. Where a lot of capacity is needed, range, which decreases with the higher frequency, is not a big issue either.
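To put a rough number on the "same bit/s/Hz, more spectrum" argument, here is a back-of-the-envelope sketch. The spectral efficiency and carrier bandwidths are assumed illustrative values, not figures from any real network:

```python
# Illustrative capacity comparison: capacity = bandwidth * spectral efficiency.
# The efficiency figure below is an assumed example value.

def cell_capacity_mbps(bandwidth_mhz: float, efficiency_bps_per_hz: float) -> float:
    """Cell capacity in Mbit/s for a given bandwidth and spectral efficiency."""
    return bandwidth_mhz * efficiency_bps_per_hz  # MHz * bit/s/Hz = Mbit/s

# Assume HSPA+ and LTE reach a similar average spectral efficiency,
# which is exactly the point of the post:
efficiency = 1.5  # bit/s/Hz, assumed average value

hspa_2100 = cell_capacity_mbps(5.0, efficiency)   # one 5 MHz HSPA carrier at 2.1 GHz
lte_2600 = cell_capacity_mbps(20.0, efficiency)   # one 20 MHz LTE carrier at 2.6 GHz

# The LTE overlay adds capacity mostly through extra spectrum, not extra
# efficiency: 20 MHz at the same bit/s/Hz yields four times the 5 MHz carrier.
```

Under these assumptions, the 2.6 GHz overlay quadruples the capacity of a single 5 MHz carrier simply by using more spectrum, which is why the "and" rather than "or" view makes sense.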

Maybe you are lucky as an operator and your 2G/3G equipment at this location is end of life and needs to be replaced just when you run out of capacity anyway. It might then be possible to replace the old base station with a multi-technology, multi-band base station. If you are even luckier, you manage to get a backhaul fiber to the site, as we are talking triple-digit Mbit/s that have to be transported.

And at the same time you give your customers an incentive to upgrade to LTE: Higher speeds for them as they don't have to share the network with the masses anymore. At least for a while.

Internet Access on the Flying Scotsman

Despite its name, the Flying Scotsman is not a plane but actually a train with lots of history, running daily between London and Edinburgh. From a wireless perspective, a recent trip with it was interesting because free Internet access is offered to all passengers during the ride. A bit of background research revealed that the service has been realized together with Swedish company Icomera and some very high level information about it can be found here.

It looks like for most of the trip, a satellite based connection is used to backhaul the data. However, compared to the Internet access on Thalys trains between France, Belgium, The Netherlands and Germany, the connection is very slow and I could not get more than a couple of kbit/s over the link at any time. Round trip times varied greatly between a couple of hundred milliseconds to several seconds. Yes, several seconds (!), no idea where those packets went in the meantime…

It seems the connection is pretty much congested all the time, which might be because it is free for all passengers, because the link capacity is limited, or a combination of both. So when web browsing, it usually takes quite some time for pages to come up. After a while I adopted a "better than nothing" approach, but I wonder if some people would rather be willing to pay extra for the privilege of being "fully" connected and to shorten the wait!?
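To illustrate why a link of only a few kbit/s makes browsing so painful, here is a small sketch; the page size and link rates are assumed example values, not measurements from the trip:

```python
# Back-of-the-envelope page transfer times; sizes and rates are assumed examples.

def transfer_time_s(size_kbytes: float, rate_kbit_s: float) -> float:
    """Seconds to transfer size_kbytes over a rate_kbit_s link, ignoring RTT."""
    return size_kbytes * 8 / rate_kbit_s

# A 200 KB web page over a congested link of roughly 5 kbit/s:
slow = transfer_time_s(200, 5)     # 320 seconds, i.e. over five minutes
# The same page over an assumed 500 kbit/s link of the Thalys kind:
fast = transfer_time_s(200, 500)   # 3.2 seconds
```

The multi-second round trip times of the satellite link come on top of this, so interactive browsing suffers even more than the raw numbers suggest.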

It would be interesting to know what kind of satellite system they use as I didn't see any dome like structures on top of the train such as those on the Thalys. Also, I wonder if they have similar uplink / downlink capacity on the link. No way of telling from the outside.

To summarize I'd say I got the work done I wanted to do during the trip, which was mainly e-mail, IM and a little bit of research on the web. I'd be quite unhappy to be stuck with such a slow connection for more than a couple of hours but for the train ride it was all right. That doesn't mean, though, that National Express East Coast should sit on their hands and do nothing, the service could be much improved as the Thalys example shows.

Mobile Application Stores Conference At CTIA With A Great Lineup of Speakers

Ajit Jaokar, friend and co-presenter of our LTE course at the University of Oxford, has a couple of interesting events coming up over the next few months. Needless to say, I am more than happy to help spread the word:

At the upcoming CTIA, Ajit's company Futuretext is the producer of the Mobile Application Stores, Strategy and Deployment conference and is proud to announce an all-star speaker lineup for this unique event. Mobile Application Stores is a partner seminar of International CTIA WIRELESS I.T. and Entertainment on October 8th in San Diego.

Mobile Application Stores is the only conference to focus exclusively on the business of mobile applications and the tremendous opportunities in the mobile app store ecosystem. The event is designed to give a complete understanding of how to capitalize on this dynamic market.

Featured speakers for the event include:

  • Dr. Jin-Sung Choi, Ph.D., Senior Vice President, Head of MC Global Product Planning Team, LG Electronics Korea
  • George Linardos, Vice President, Product Management, Media, Nokia
  • Ilja Laurs, Founder & CEO, GetJar
  • Tim Haysom, Chief Marketing Officer, OMTP
  • Mike Merril, CEO, Smart Phone Technologies
  • Ajit Jaokar, President, futuretext
  • Chetan Sharma, CEO, Chetan Sharma Consulting
  • Jouko Ahvenainen, Founder, Grow VC International
  • William Volk, CEO, PlayScreen
  • Sena Gbeckor-Kove, Chief Technology Officer, imKon

The timing is perfect for an event like this. The Apple App Store announced its billionth download less than 9 months after opening, and the recent launches of LG's Application Store and Windows Marketplace for Mobile, as well as Android and the BlackBerry App World, are making a tremendous impact in the mobile marketplace.

Mobile Application Stores is co-located with the largest wireless event in the U.S., International CTIA Wireless I.T. and Entertainment. Registration is only $295 at http://www.mobileappevent.com/

About Mobile Application Stores: The Mobile Application Stores event is produced by futuretext, a London based research, consulting and publishing company. For additional information, please contact Larry Lockhart at NextVision Media at 727-388-9849 or Larry@nextvisionmedia.com or Ajit Jaokar Ajit.jaokar at futuretext.com. The web site again for registration is: http://www.mobileappevent.com/

Still no Cellular in the London Tube

I really like London and do come here often, but as soon as I go down an escalator to take the tube, my cellular signal fades away and I feel like my hands are bound until I surface again at the other end of the trip.

That feeling is of course heightened by the good underground coverage in most other cities I usually travel to, and I keep wondering why on earth London, the capital of a nation with fierce competition amongst network operators and good network coverage, hasn't come around on this issue yet!?

Even in mobile markets that aren't known for their competitive environment, such as France, the metro is fully covered, even between the underground stations. Worries about terrorists using the network for their purposes are in my opinion no good reason for holding back either: previous attacks unfortunately worked despite no cellular network being present to trigger anything.

Also, the argument that it's difficult to find space for cables doesn't count either, as Transport for London has just finished deploying their underground TETRA network. Signaling equipment being sensitive to GSM or UMTS? Unlikely, if TETRA (probably on 450 MHz) doesn't confuse the equipment…

And finally, to those who argue that people using their phones would disturb others: take a Eurostar to Paris and convince yourself of the opposite. Most people in the metro use their phones for texting, emailing and web browsing, as it's just too noisy for lengthy phone calls. So no worries here either.

So please, everyone involved, give yourself a push to finally do it and join the rest of the developed world!

Vodafone Websessions Keep Timing-Out

Sometimes you have to look at the good side of things going wrong. Every now and then, I use Vodafone's Websession offer in Germany, as it is a convenient way to go online occasionally. If it works… Over the summer, unfortunately, I have observed continuous glitches. Everyone has glitches now and then, and I wouldn't have mentioned it at all if it had only been a singular occurrence. But this one keeps dragging on.

So here's what's going on: In most failure cases, the automatic initial redirect to the payment web portal works and the money is deducted from the account, but anything else results either in nothing or in a technical error message on the portal. In most cases, disconnecting from and re-connecting to the network a couple of times helps the service come back to its senses. Not so, however, when I recently needed it: while I was charged and the service kept insisting that it would forward me to the requested page after connecting to the network, it never did, no matter how often I tried to re-connect.

So what's the good side of this you might wonder!? Well, the good side is that Wireshark revealed a bit of how the service works and I learnt a couple of things:

First, the service uses a transparent web proxy that answers every HTTP GET request with a "moved temporarily" redirect to the landing page until the session is paid for, and again directly after re-establishing a network connection, in order to display the remaining time of the session.
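The redirect behaviour described above can be sketched in a few lines. The portal URL and the exact status line are my assumptions for illustration; the trace only shows that a "moved temporarily" (HTTP 302) answer is returned until payment:

```python
# Minimal sketch of the transparent proxy's redirect logic as observed in the
# Wireshark trace. LANDING_PAGE is a hypothetical URL, not Vodafone's actual one.

LANDING_PAGE = "http://pay.example.net/websession"  # assumed portal address

def proxy_response(request_url: str, session_paid: bool) -> str:
    """Return the status line the proxy would send for a plain HTTP GET."""
    if not session_paid:
        # Before payment, every request is redirected to the payment portal.
        return f"HTTP/1.1 302 Moved Temporarily\r\nLocation: {LANDING_PAGE}"
    # After payment, the request is passed through to the real server.
    return "HTTP/1.1 200 OK"
```

A client asking for any page before paying would thus see a 302 pointing at the portal, which is exactly why the browser lands there first.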

Second, the web proxy also responds to web page requests itself if there is a problem with the service. In the most recent incident, it answered web page requests after a while with a "504 Gateway Timeout". The picture on the left shows how this looks in practice. The Server header of the message is set to "WebProxy/5.0". From that I assume that each and every requested web page flows through the transparent proxy. This, by the way, is also supported by the network-based picture compression that can be deactivated via the Vodafone dashboard software or by including a special HTTP header parameter in every request.
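Two small helpers illustrate these observations. Note an assumption here: the post does not name the "special header parameter", and `Cache-Control: no-transform` is simply the standard HTTP directive for asking intermediaries not to transform content, so I use it as a plausible stand-in:

```python
# Helpers matching the observations above. Whether Vodafone actually honours
# the standard no-transform directive is an assumption; the Server header
# value "WebProxy/5.0" is taken from the trace described in the post.

def is_behind_webproxy(response_headers: dict) -> bool:
    """Guess from the Server header whether the transparent proxy answered."""
    return response_headers.get("Server", "").startswith("WebProxy/")

def request_headers_without_compression() -> dict:
    """Request headers asking proxies not to transform (e.g. recompress) content."""
    return {"Cache-Control": "no-transform"}
```

Checking the Server header of an error response is a quick way to tell whether the answer came from the proxy rather than from the requested site.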

O.k., I've learnt enough this way now, so please Vodafone, fix this service!

The Hollow Operator – From Operator to Owner

LightReading Insider has recently published a paper on the trend of mobile network operators outsourcing operational tasks to external service companies. This includes maintenance of existing core and radio networks, network monitoring and operation, performance monitoring, capacity management, new network rollouts, introducing and running services and many other things. Very interesting to me, and since LightReading was kind enough to send me a copy, I took a closer look.

LightReading chose the term “Hollow Operator” for their paper. Interesting wording and it makes me wonder about how many operational tasks can be outsourced before the term “operator” no longer applies!? What is a network operator that doesn't operate its network anymore? A network owner?

One of the main reasons for companies to outsource work that is not at the core of their business is to reduce cost, which, if it works out, makes them more competitive. Global services companies taking over operational tasks, such as Alcatel-Lucent, Ericsson, Nokia Siemens Networks, Huawei and others, add that their global structure allows them to grow the business by improving net subscriber revenue, making the network more operationally efficient, and developing the right plan to evolve a network for future consumer demands. And a global structure they truly have: Ericsson states, for example […], that they have 37,000 people working in their service division, 28,000 close to the customer and an extra 9,000 globally. Many of those people are likely to have come from network management deals where network operator staff joined the service company when the contract was signed. From a technical point of view, a big advantage is that information and knowledge on how a network can be operated efficiently can be shared, for example by continuously streamlining company-wide processes based on what is learned from operating many networks.

Skeptics argue that outsourcing does not improve anything on its own. In the end, funding has a lot to do with how things develop: if not funded appropriately, outsourcing can quickly lead to degrading standards. Also, outsourcing is difficult to reverse in case the network owner one day wants to take back operation. Once employees are gone and operational procedures are integrated into a different company, it's difficult to get them back in house, unless, of course, the outsourcing contract contains clauses for such a circumstance. Outsourcing also increases complexity: any extras that would previously have been handled down the internal hierarchy are now an external business matter between two companies. That means such requests are answered with a quotation, as they weren't included in the original calculation in the contract, and that doesn't help to speed things up. On the technical side, no network is like any other: each network operator has different components in different configurations and uses different software versions. In short, while processes for network management inside a service company are probably similar, each network requires dedicated experts for its particular configuration, which in turn limits the ability to port the lessons learned from one network to another.

Many more things can be added to support and counter network operation outsourcing and I can highly recommend Lightreading's paper for further reading. I'll leave it at that for the moment but please feel free to add your thoughts by leaving a comment below.

The Cloud and Denial of Services

I am not a huge fan of services in the cloud, as discussed here towards the end of the presentation. Instead, I much prefer to have my services and data securely running and stored behind a firewall in my own home network readily accessible over a secure connection over the Internet from my wireless devices.

In the past weeks, lots of denial of service attacks have been launched on Twitter, and some mashups and front-ends were blocked for days because their traffic could not be distinguished from the DoS attack. In effect, no Twitter service for me on a number of occasions when I wanted to use it.

These incidents made me think about my views on connected home services again. Dabr, the Twitter front-end I use on my mobile device, can also be installed on a different server, e.g. at home, and would thus probably not have been an indirect victim of the DoS attack. Another thought was what would happen if, in the future, a very popular web-based service were the target of a DoS attack and you couldn't use your critical applications for days. Not from home and not from the mobile either. Not a nice thought.

Sure, it's possible to launch a DoS attack on connected home services, too but it's unlikely that anyone would go through all the pain and expense just to make life difficult for one person.

I guess that, too, speaks for distributing private services rather than concentrating them in the cloud.

The Nokia N900 – Escape from the Cloud and Jail?

Next week, Nokia World 2009 will be held in Stuttgart, and while I wait for the press to deliver the details of the event as I roam the Scottish Highlands, I've been thinking a bit about what Nokia's recent announcements around the new Nokia N900 could mean for the future of mobile devices.

To me, the current smartphone market by and large looks as follows:

S60

For the moment I am still stuck with my S60-driven Nokia N95. The OS so far is closed source, but anyone can develop programs for it without depending on Nokia to allow or certify anything, provided the developer thinks the user can handle a couple of warnings during program installation. To get rid of them, programs can be certified by Nokia / S60, which takes a bit of time, but unless a program does something really malicious, Nokia / S60 have no preconceptions about what should be allowed. While this all sounds great and a lot of applications are available, S60 has lost a lot of mindshare in the past 18 months. Many developers now prefer the iPhone OS or Android when it comes to new and cool stuff. On top of that, Nokia has decided to strip a couple of cool features such as VoIP out of its latest phones, a killer argument for me against buying another S60 phone in the future.

The Jail

And then, of course, there is the iPhone. Great marketing, great user interface, very easy to use. Unfortunately, it has no multitasking, and Apple is pretty opinionated about what should run on the iPhone and what should not. The latest Apple / Google quarrel is a good example. No, thank you, not my cup of tea either. When I buy a device, I should be the one to decide what I want to run on it.

The Cloud

Next, there's Android. Based on Linux and undoubtedly very innovative, it is most useful if the user shares his private data with Google's servers in the cloud. From a usage perspective that's great, as e-mail, address book, documents, etc. are available and synchronized across all of a user's devices. I like that a lot, but I don't like sharing my private data with Google, or with anyone else for that matter. Private synchronization or connected home services are the way forward for me. For details, see here.

The Rest: And then there are OSes like Windows Mobile and the Palm Pre's WebOS, which either fall into one of the categories above or in between.

The N900 – To The Rescue?

So what I want from a mobile device is quite simply described:

  • An OS for which new and innovative programs are developed
  • Multitasking
  • My private data should be treated privately
  • I decide which programs I want to use on my device, no one else

Or in short: The same experience as I have on my PC and my netbook: I decide!

With Nokia announcing the Linux / Maemo based N900 smartphone, I am getting my hopes up again a bit. Maemo has been around for a number of years now on Nokia's Internet tablets, so I have a fair idea of what it is and what it is not. While I've so far not been very impressed by it due to lacking 2G / 3G support, the wrong form factor and slow speed, Nokia seems to have an answer to all of that with the N900: a 600 MHz processor should take care of the speed issues, 2G / 3G network support has been added, and the physical dimensions of the device are in the same ballpark as my current N95. Also, Nokia says it will be VoIP capable.

On top of that, Maemo, at least up until now, has been a very open platform from various angles. First, it's based on Linux, so it's very well known in the developer community. Second, unlike Android, where application developers have to work with a Java framework and have no direct access to the OS, Maemo works just like a PC-based Linux distribution: (almost) everything is fully open to developers, existing programs can easily be ported, and there is no lengthy certification process. In other words, while Android is based on Linux but doesn't give applications access to it, Maemo fully does, unless Nokia decides to remove that openness in the new version of Maemo. Let's hope not.

So if Nokia plays it right, they will make developers happy, they will make users like me happy, and they will have sold their first smartphone in two years to me!

Diversity Rules

But don't get me wrong here, I am not arguing that a fully open Linux phone is the one and only answer. I think there's also a place for devices that do fewer things, that are not as configurable and expandable, and that are more tightly controlled. The reasons are plenty: ease of use, better support from manufacturers or network operators, and so on. While many users might want that in the PC world and thus might prefer it in the mobile world too, there are many, like me, who thrive on openness!