One of the perks of working in technology standards groups is that you get to go to meetings in nice places around the world. A lesser perk is that the standards group tends to provide gifts for the participants. They’re not generally much more than a T-shirt saying you’ve been there, or a packet of the local equivalent of popcorn or haggis, but they’re something to remember it by.
Last year, Covid put an end to international travel and we’ve been having to make do with virtual conferences. As every standards group is discovering, they’re OK, but they don’t really work as well. It’s far more difficult to have a good argument when you’re not face to face and there’s no substitute for a fight for the whiteboard markers or the reconciliations and wild flights of fancy that take place over a beer or a coffee. For most standards, the even greater casualty is in testing, where prototype implementations normally come together to check that the specifications actually work. Few companies are happy to let their prototypes out of their sight and running tests remotely, especially for wireless standards, is incredibly difficult. Every standards group is suffering from that at the moment, with the result that we’re seeing release dates pushed back and features cut down.
In the previous article I looked at the tools the UK Government has available to deal with the coronavirus pandemic. Essentially, they have two. The first is to increase the number of ventilators and ICU beds, which gives more people with severe respiratory infections a chance to recover. That means that doctors and politicians can avoid the unpleasant choice of deciding who gets treated and who does not, but only if the number of infections is curtailed in the first place, so that we don’t run out of ventilators.
The second is the lockdown tool. It is currently a crude On/Off switch, which limits infections by keeping everyone at home. At the moment, it’s not flexible – you’re either locked down, or you’re not, unless you’re a key worker or in an essential industry. The hope is that few key workers will be infected, either because they have sufficient Personal Protective Equipment, or they’re able to social distance whilst doing their jobs. Everyone else has to stay at home. A lucky few can continue to work, but most are either furloughed or become unemployed, putting the economy in stasis.
The Government, quite rightly, is desperate to find ways to ease the lockdown. The question is how to do that without immediately seeing infection rates rise.
The flavour of the day is to roll out smartphone apps which can trace whether you have come into contact with someone else who is infected. The theory goes that if you do, you can be alerted and stay at home until you’re tested. If you have coronavirus, you self-isolate. If you don’t, you’re free to go back to work. Like many proposals for phone apps, it sounds simple, which is why it’s so appealing. Particularly to people like Matt Hancock, who has always had a bit of a penchant for phone apps, which he believes will save the NHS. What nobody is mentioning is that for contact-tracing to work, we will need the ability to provide at least half a million additional tests that can be administered at home every day.
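The decision flow the theory relies on is simple enough to sketch as a small state machine. This is purely illustrative – the states and function names below are my own invention, not anything from an actual tracing app – but it shows why the whole scheme hinges on the testing step in the middle.

```python
from enum import Enum, auto

class Status(Enum):
    """Hypothetical states for a person using a contact-tracing app."""
    AT_WORK = auto()
    AWAITING_TEST = auto()   # alerted; stay at home until tested
    SELF_ISOLATING = auto()

def on_exposure_alert(current: Status) -> Status:
    # Contact with an infected person triggers an alert:
    # whatever you were doing, you stay at home until tested.
    return Status.AWAITING_TEST

def on_test_result(positive: bool) -> Status:
    # A positive test means self-isolation;
    # a negative test frees you to go back to work.
    return Status.SELF_ISOLATING if positive else Status.AT_WORK
```

Note that every exposure alert funnels into `AWAITING_TEST` – there is no path back to work that doesn’t pass through a test, which is exactly where the half-million-tests-a-day requirement comes from.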
I’ve just returned from the Mobile World Congress, and a fairly clear theme this year was the alleged imminent arrival of 5G, with companies promoting the current status variously as 4.9G, the Bridge to 5G or pre-5G. The only problem is that no-one seemed to be very clear about what 5G is going to be.
Up until now, it’s been pretty clear what the “G”s stand for – it’s been the main user application area over and above the basics of voice and text. For 2G they were pretty much confined to Games and Gambling. In other words, applications which relied on timely, but minimal data. 3G gave us Girls, as the porn industry realised that, with higher data rates, they could charge for sending pictures to the most private of our devices. The increased bandwidth of 4G resulted in Gossip – the net curtain twitching of Twitter and Facebook which has glued millions to their smartphones. But the potential killer app for 5G is proving remarkably elusive. Participants at the Global 5G Test Summit event kept on emphasising the importance of early testing for exploring new usage models and applications. That appeared to be because nobody had any idea of what they might be. Judging from the reticence of many network operators at the show, who are obviously struggling to see how they are going to make any money from investing in 5G infrastructure, the fifth “G” may end up bringing little other than Grief and Gloom.
At which point I’d like to highlight a recent book by William Webb, entitled “The Myth of 5G”. In it, he argues that not only does no-one know what 5G is, but there’s no need for it. After which, I’ll tell you about Qualkia.
As the old adage goes, “while the cat’s away, the mice will play”. In the case of NB-IoT, “when the spec’s delayed, LPWAN will play”, which is exactly what’s happening in the Internet of Things market today. The problem is that 3GPP (the 3rd Generation Partnership Project), the standards body which has been responsible for the 3G, 4G and 5G mobile standards, dropped the ball as far as the Internet of Things is concerned. Seduced by the slabs of black glass which suck up both our attention and the mobile networks’ spectrum, the 3GPP engineers totally forgot to design something to replace the old 2G workhorse of GPRS, which is responsible for most of today’s machine to machine communications. Instead, they spent all of their time designing high power, high speed, expensive variants of 4G to support an ongoing dynasty of iPhones, Galaxys and Pixels, none of which were any use for the Internet of Things.
Noticing this hole, a number of companies who had been developing proprietary, low cost, low speed, low power communication options saw an opportunity and created the Low Power WAN market. Whilst many perceived them as a group of Emperors with no clothes, the network operators were so desperate to have something to offer for upcoming IoT applications that they started engaging with them, rolling out LPWAN infrastructure. Whether they believed the LPWAN story, or just hoped it would fill a hole is difficult to ascertain, but no-one can deny that LPWAN is now firmly on the map, in the form of Sigfox, LoRa, Ingenu and a raft of others. To address that challenge to their hegemony, the GSM Association (GSMA) directed the 3GPP to assemble their own suit of imperial clothing which would be called the Narrow Band Internet of Things, or NB-IoT.
This is the story of why NB-IoT was too late, why it will fail in the short term, why it will win in the long term, and why the industry will struggle to make any money from it.
It’s a New Year, which means it’s time for the annual week of madness in Las Vegas which is the Consumer Electronics Show. For four days, the electronics industry comes together to tell consumers what they ought to be buying, whilst analysts and the media try to predict what will really be the hot product sector for the coming year.
Over the last few years, as PCs, tablets and phones have lost their wow factor, that’s proven to be a little more difficult than it used to be. In 2014, the consensus was that wearables would be the next big thing. They have definitely made strides beyond basic step counting, but are still smouldering rather than setting the world on fire. Instead, the innovation which caught the public imagination at CES in 2014 was the selfie stick.
In 2015, the smart money was on smart homes. But with a few exceptions, consumers felt the smart thing to do with their money was to buy more selfie sticks. This year, the pundits will probably predict that 2016 will be the year of the drone. My guess is that most consumers will still prefer to buy selfie sticks. Unless someone comes up with cheap drones that take selfies*.
Of course, like all good works of fiction, the CES show contains a number of interesting subplots, one of which will be the battle for mesh.
All of a sudden, there’s a lot of activity in the Long Range wireless network community. In France, Orange has just announced that they’re going to follow in Bouygues’ footsteps in deploying a LoRa network for M2M which will cover the whole of metropolitan France. That in turn follows on from a similar announcement from KPN that they are planning to do the same thing in Holland, while Proximus are going to cover Belgium and Luxembourg. It’s a bit like a rerun of the Sigfox PR offensive, after they managed to sign up operators in France, Holland, Portugal, Belgium, Luxembourg, Denmark, Spain, the U.K. and San Francisco. Nor is it just a European phenomenon. In the U.S., Ingenu, the company which was formerly known as On Ramp, has raised $100 million to roll out its own similar, proprietary network. It seems that there’s a new announcement almost every day. So it’s interesting to look at why mobile operators are desperately announcing new network technologies to support M2M and IoT applications, when just a few months ago they gave the impression that they would rule the IoT with their 4G networks.
When you stop and look behind all of this activity, you see something that should be worrying the M2M and IoT industry (which is not the same as the cellular industry). For the last fifteen years they’ve not had to worry much about how they make their data connections – they just embedded a GPRS module and bought a data contract. But look forward a few years and there’s a worrying hole in the air as networks start to switch off their GPRS networks. That’s just beginning to dawn on network operators, who see an unexpectedly unpleasant vision of the future, in which their anticipated IoT revenues could disappear into thin air.