I’ve just returned from the Mobile World Congress, and a fairly clear theme this year was the alleged imminent arrival of 5G, with companies promoting the current status variously as 4.9G, the Bridge to 5G or pre-5G. The only problem is that no-one seemed to be very clear about what 5G is going to be.
Up until now, it’s been pretty clear what the “G”s stand for – it’s been the main user application area over and above the basics of voice and text. For 2G they were pretty much confined to Games and Gambling. In other words, applications which relied on timely, but minimal data. 3G gave us Girls, as the porn industry realised that, with higher data rates, they could charge for sending pictures to the most private of our devices. The increased bandwidth of 4G resulted in Gossip – the net-curtain twitching of Twitter and Facebook which has glued millions to their smartphones. But the potential killer app for 5G is proving remarkably elusive. Participants at the Global 5G Test Summit event kept on emphasising the importance of early testing for exploring new usage models and applications. That appeared to be because nobody had any idea of what they might be. Judging from the reticence of many network operators at the show, who are obviously struggling to see how they are going to make any money from investing in 5G infrastructure, the fifth “G” may end up bringing little other than Grief and Gloom.
At which point I’d like to highlight a recent book by William Webb, entitled “The Myth of 5G”. In it, he argues that not only does no-one know what 5G is, but there’s no need for it. After which, I’ll tell you about Qualkia.
As the old adage goes, “while the cat’s away, the mice will play”. In the case of NB-IoT, “when the spec’s delayed, LPWAN will play”, which is exactly what’s happening in the Internet of Things market today. The problem is that 3GPP (the 3rd Generation Partnership Project), the standards body which has been responsible for the 3G, 4G and 5G mobile standards, dropped the ball as far as the Internet of Things is concerned. Seduced by the slabs of black glass which suck up both our attention and the mobile networks’ spectrum, the 3GPP engineers totally forgot to design something to replace the old 2G workhorse of GPRS, which is responsible for most of today’s machine-to-machine communications. Instead, they spent all of their time designing high power, high speed, expensive variants of 4G to support an ongoing dynasty of iPhones, Galaxys and Pixels, none of which were any use for the Internet of Things.
Noticing this hole, a number of companies who had been developing proprietary, low cost, low speed, low power communication options saw an opportunity and created the Low Power WAN market. Whilst many perceived them as a group of Emperors with no clothes, the network operators were so desperate to have something to offer for upcoming IoT applications that they started engaging with them, rolling out LPWAN infrastructure. Whether they believed the LPWAN story, or just hoped it would fill a hole is difficult to ascertain, but no-one can deny that LPWAN is now firmly on the map, in the form of Sigfox, LoRa, Ingenu and a raft of others. To address that challenge to their hegemony, the GSM Association (GSMA) directed the 3GPP to assemble their own suit of imperial clothing which would be called the Narrow Band Internet of Things, or NB-IoT.
This is the story of why NB-IOT was too late, why it will fail in the short term, why it will win in the long term, and why the industry will struggle to make any money from it.
It’s a New Year, which means it’s time for the annual week of madness in Las Vegas which is the Consumer Electronics Show. For four days, the electronics industry comes together to tell consumers what they ought to be buying, whilst analysts and the media try to predict what will really be the hot product sector for the coming year.
Over the last few years, as PCs, tablets and phones have lost their wow factor, that’s proven to be a little more difficult than it used to be. In 2014, the consensus was that wearables would be the next big thing. They have definitely made strides beyond basic step counting, but are still smouldering rather than setting the world on fire. Instead, the innovation which caught the public imagination at CES in 2014 was the selfie stick.
In 2015, the smart money was on smart homes. But with a few exceptions, consumers felt the smart thing to do with their money was to buy more selfie sticks. This year, the pundits will probably predict that 2016 will be the year of the drone. My guess is that most consumers will still prefer to buy selfie sticks. Unless someone comes up with cheap drones that take selfies*.
Of course, like all good works of fiction, the CES show contains a number of interesting subplots, one of which will be the battle for mesh.
All of a sudden, there’s a lot of activity in the Long Range wireless network community. In France, Orange has just announced that they’re going to follow in Bouygues’ footsteps in deploying a LoRa network for M2M which will cover the whole of metropolitan France. That in turn follows on from a similar announcement from KPN that they are planning to do the same thing in Holland, while Proximus are going to cover Belgium and Luxembourg. It’s a bit like a rerun of the Sigfox PR offensive, after they managed to sign up operators in France, Holland, Portugal, Belgium, Luxembourg, Denmark, Spain, the U.K. and San Francisco. Nor is it just a European phenomenon. In the U.S., Ingenu, the company which was formerly known as On Ramp, has raised $100 million to roll out its own similar, proprietary network. It seems that there’s a new announcement almost every day. So it’s interesting to look at why mobile operators are desperately announcing new network technologies to support M2M and IoT applications, when just a few months ago they gave the impression that they would rule the IoT with their 4G networks.
When you stop and look behind all of this activity, you see something that should be worrying the M2M and IoT industry (which is not the same as the cellular industry). For the last fifteen years they’ve not had to worry much about how they make their data connections – they just embedded a GPRS module and bought a data contract. But look forward a few years and there’s a worrying hole in the air as networks start to switch off their GPRS networks. That’s just beginning to dawn on network operators, who see an unexpectedly unpleasant vision of the future, in which their anticipated IoT revenues could disappear into thin air.
In the last blog I wrote about the immense damage that could be done to the market for connected personal devices and the Internet of Things by licensing the 2.3GHz spectrum to mobile networks. As OFCOM is still asking for consultation responses prior to their auction I thought it timely to list some of the reasons that I believe justify a delay in releasing this spectrum. If you agree that it should be postponed, you have until June 26th to send OFCOM your views. Please do, as I believe this could cost the industry billions of pounds and push back innovation.
The battle pits mobile network operators, who want more spectrum, against the ongoing survival of the 2.4GHz band. The 2.4GHz spectrum is unlicensed, and used by the wireless standards in most consumer devices, including Bluetooth, Wi-Fi, ZigBee and others. If mobile phones start to use frequencies close to 2.4GHz, it will degrade the performance of these products. Your Internet access may slow down, audio bars and Sonos systems may get noisy, hearing aids will perform poorly, and the response of smart home systems could get sluggish or stop. Everything that uses the 2.4GHz band may work less well and have a reduced range, to the point where they’re no longer compelling devices. If that happens, users will stop buying products, businesses may close, investors will lose their money and the current Internet of Things bubble will be firmly burst.
There are a lot of “mays” in that. That’s because we can’t be sure. To their credit, OFCOM have commissioned some tests which show that there is a problem, but they didn’t test enough products, or new enough ones, to determine the true extent of the problem. OFCOM’s response is to say that manufacturers need to redesign their products to be more resistant to interference. However, that adds cost, the technology is not yet available for small products and it can’t be retrofitted to the billions of existing products already on the market. For that reason I believe any auction should be delayed to give the industry time to test and see if it can develop solutions. Otherwise the costs could be enormous.
Don’t worry – it’s not a blog about Tinder or Grindr. The connections we’re talking about here are mobile subscriptions and the men are those at this year’s Mobile World Congress in Barcelona. It is still mostly men. Despite the best efforts of the GSMA with sub-events like the Connected Women’s Summit and France’s promotion of its exhibiting companies as “La France Tech” (which must have had the members of the Académie Française heading to their graves for some early turning), MWC remained defiantly male. In the opening keynotes around 85% of the audience were men. Telecoms, for all of its populist marketing, is still largely a suited profession.
What was exercising the males of the species this year was numbers. Back in 2009, Ericsson predicted that there would be 50 billion mobile connections by 2020. At the time it seemed possible; phone usage was growing and everyone expected that the things around us would follow suit by getting their own mobile connections, leading us to that kind of number. It’s now beginning to strike the CEOs within the industry that five and a half years have passed and we’re half-way to that deadline. Yet we’ve still only connected a few tens of millions of machines. That’s why they’re getting so excited about wearables and the Internet of Things as the only way to make those predictions come true.
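To see why those CEOs are worried, it’s worth doing the back-of-envelope sums. The sketch below is purely illustrative – the 30 million figure is my own placeholder for “a few tens of millions”, not an industry statistic – but it shows the scale of growth that would be needed to rescue the 50 billion prediction in the five and a half years remaining.

```python
# Hypothetical back-of-envelope check: what annual growth factor would be
# needed to get from ~30 million connected machines (an assumed figure for
# "a few tens of millions" in 2015) to Ericsson's 50 billion by 2020?

def required_annual_growth(current, target, years):
    """Compound annual growth factor needed to go from current to target."""
    return (target / current) ** (1 / years)

factor = required_annual_growth(current=30e6, target=50e9, years=5.5)
print(f"Connections would need to multiply roughly {factor:.1f}x every year")
```

On those assumptions the installed base would have to nearly quadruple every year for the rest of the decade – which is why wearables and the IoT suddenly look less like an opportunity and more like a lifeline for the forecast.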