Qualkia – Constructing the 5G Myth

I’ve just returned from the Mobile World Congress, and a fairly clear theme this year was the alleged imminent arrival of 5G, with companies promoting the current status variously as 4.9G, the Bridge to 5G or pre-5G.  The only problem is that no-one seemed to be very clear about what 5G is going to be.

Up until now, it’s been pretty clear what the “G”s stand for – it’s been the main user application area over and above the basics of voice and text. For 2G they were pretty much confined to Games and Gambling. In other words, applications which relied on timely, but minimal data. 3G gave us Girls, as the porn industry realised that, with higher data rates, they could charge for sending pictures to the most private of our devices. The increased bandwidth of 4G resulted in Gossip – the net curtain twitching of Twitter and Facebook which has glued millions to their smartphones. But the potential killer app for 5G is proving remarkably elusive. Participants at the Global 5G Test Summit kept on emphasising the importance of early testing for exploring new usage models and applications. That appeared to be because nobody had any idea of what they might be. Judging from the reticence of many network operators at the show, who are obviously struggling to see how they are going to make any money from investing in 5G infrastructure, the fifth “G” may end up bringing little other than Grief and Gloom.

At which point I’d like to highlight a recent book by William Webb, entitled “The 5G Myth”. In it, he argues not only that no-one knows what 5G is, but that there’s no need for it. After which, I’ll tell you about Qualkia.

I would urge anyone involved with any aspect of 5G to read William’s book. I’ve known him for many years, through his time working for the UK telecoms regulator OFCOM, as president of the IET, and more recently with Neul and the Weightless Foundation. He’s seen the industry from many sides and presents some cogent arguments about the fallacies which are driving 5G.

If I reflect on the 5G messages I heard at MWC, the most common was that it would bring higher data rates. But do we need them? William argues that, with very few exceptions, we don’t. There’s only so much data that you can usefully stream to a phone or laptop. Once you have enough for decent resolution video, there’s no obvious application which will benefit from higher data rates. What we have will suffice. Even as mobile video takes off, the industry is developing more efficient codecs, so the likelihood is that data demand on individual devices will go down.
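To put some rough numbers behind that argument, here’s a minimal back-of-the-envelope sketch in Python; the video bitrates and the LTE throughput figure are my own illustrative assumptions, not measurements from the show floor.

# A rough sketch of the headroom argument: how much of a typical LTE link
# a single video stream actually uses. All figures are illustrative assumptions.

video_bitrates_mbps = {
    "720p stream": 5,
    "1080p stream": 8,
    "4K stream": 25,
}

assumed_lte_downlink_mbps = 40  # a plausible real-world LTE / LTE-A figure

for name, rate in video_bitrates_mbps.items():
    headroom = assumed_lte_downlink_mbps / rate
    print(f"{name}: {rate} Mbit/s, roughly {headroom:.1f}x headroom on a {assumed_lte_downlink_mbps} Mbit/s link")

On those assumptions, even a 4K stream fits comfortably within today’s cellular rates, which is the nub of the “what we have will suffice” case.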

What most people don’t realise is that we don’t really use cellular for delivering data to mobile devices. More and more of the time, laptops and phones will preferentially connect to a Wi-Fi network, typically one which the user has installed themselves, although increasingly it could be a shared network. In the UK, BT, our dominant domestic broadband supplier, has incorporated FON’s Wi-Fi access technology into every domestic router, so that a portion of the bandwidth is reserved for other BT users to access, creating something approaching nationwide Wi-Fi coverage for its customers. Similar initiatives are happening elsewhere, making it far easier for mobile devices to access Wi-Fi.

Operators like the way we utilise Wi-Fi. It costs them nothing, since users install it themselves, and it removes their need to support data streaming inside buildings, which is difficult. Their attempts to remedy poor indoor coverage with femtocells have mostly been a failure, not least because they’re expensive and bind all of the household or office occupants to a single network supplier. However, providing in-building coverage will be even more difficult with 5G, because it’s likely to use higher frequencies, which are less effective at getting through walls. Poor indoor coverage is the reason most smartphones are designed to look for known Wi-Fi networks and automatically route their data over them, rather than burdening the cellular connection.

William argues that network operators and regulators need to understand that circumstances and technology limitations have conspired to produce a “Wi-Fi First” world, where Wi-Fi is set to be the primary method by which mobile devices and smartphones access data, wherever it is available. They will only revert to cellular data when they have no Wi-Fi coverage. As Wi-Fi coverage becomes more and more widespread, rather than attempting to move devices from Wi-Fi to 5G, it would make more sense for the industry to concentrate on integrating more services over Wi-Fi, allowing operators to focus cellular spectrum where and when it’s most needed. As some operators look to a 4G-only network future, which does not natively support circuit-switched voice but requires VoLTE, that would seem a very sensible way forward.
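To make the “Wi-Fi First” idea concrete, here’s a minimal, hypothetical sketch of the kind of preference logic a handset’s connection manager applies; the function name and the signal threshold are my own inventions for illustration, not any vendor’s actual code.

# Hypothetical sketch of "Wi-Fi First" data routing on a mobile device.
# Names and thresholds are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Network:
    name: str
    is_wifi: bool
    known: bool        # previously joined and trusted
    signal_dbm: int    # received signal strength

MIN_USABLE_WIFI_DBM = -75  # assumed threshold for a usable Wi-Fi link

def choose_data_path(available):
    # Prefer any known Wi-Fi network with a usable signal...
    for net in available:
        if net.is_wifi and net.known and net.signal_dbm >= MIN_USABLE_WIFI_DBM:
            return net
    # ...and only fall back to cellular when no suitable Wi-Fi is found.
    for net in available:
        if not net.is_wifi:
            return net
    return None

chosen = choose_data_path([
    Network("HomeRouter", True, True, -60),
    Network("MacroCell4G", False, True, -95),
])
print(chosen.name)  # -> HomeRouter

The point is simply that cellular is already the fallback, not the default, whenever known Wi-Fi is in range.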

Despite the common sense of that approach, which has seen the Wi-Fi Alliance call for more spectrum to be allocated to Wi-Fi, the opposite appears to be happening, with the FCC approving LTE-U devices which share the 5 GHz Wi-Fi spectrum. T-Mobile appears to be the first to take advantage of this, proudly announcing that it’s “building on its track record of launching advanced technologies first”, although in this case, that could be synonymous with shooting itself in the foot.

The second marketing argument for 5G at MWC was capacity, either as the ability to support billions of yet to be identified Internet of Things devices, or tens of thousands of sports fans in a stadium. The first is probably spurious. The whole point of the recent NB-IoT standard is to support this quantity of devices within an existing LTE framework. (Incidentally, I saw no evidence that we’re moving towards billions of IoT devices. Everything I saw under that heading looked like existing M2M applications and volumes, which are very vertical and rarely get above tens of millions. But I’ll cover that in a forthcoming IoT post.) The stadium scenario does appear to be a real concern for a few operators, predominantly in the Far East, who believe they need to allow spectators at sporting events to watch their smartphones rather than the sporting event itself, or at least augment the experience of watching people running around in front of them. The 2018 Winter Olympics in PyeongChang is regularly cited as the first demonstration of this, which feels unrealistic, as we neither have a definition of what 5G is, nor any handsets to support it. The bigger event is the 2020 Olympics in Tokyo, where Japan is keen to show off its 5G expertise, along with robots and driverless cars. There’s even a helpful white paper from some vested interests on how the Olympics will shape 5G, just in case anyone needs any further persuasion.

The only problem here is that there are a limited number of these events. Whilst they might be good PR for the industry, they have nothing to do with global network operators recouping their infrastructure costs. As I’ve mentioned above, any such application could be covered by Wi-Fi. The Wi-Fi cell management would be complex, but much cheaper than a 5G network. It’s also a cost that could be borne by the stadium or the hosting Olympic country, not the mobile operator.

The other 5G theme at MWC was low latency, allowing data to get to and from a device with less delay. This generally hinged on connected cars, where the argument goes that low latency is vital to allow them to be controlled over the cellular network. I’d question whether this is going to be real in our lifetimes. When we get to self-driving cars, they will need to be autonomous, as no regulator would allow a ton of metal to hurtle down a road purely under the control of a wireless network. That removes the need for very low latency. Once more it feels like a solution looking for an application.
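A quick illustrative calculation shows why the control has to sit in the car rather than in the network; the speed and latency figures below are my own assumptions for the sake of the example, not anything taken from the 5G proposals.

# A sketch of how far a car travels while a command is still in flight.
# The speed and latency values are illustrative assumptions only.

speed_kmh = 110
speed_m_per_s = speed_kmh * 1000 / 3600

for latency_ms in (100, 50, 10):  # assumed round trips over congested, average and good networks
    distance_m = speed_m_per_s * latency_ms / 1000
    print(f"{latency_ms} ms round trip: the car has already moved about {distance_m:.1f} m")

# During a coverage gap the delay is unbounded, which is why every
# safety-critical decision has to be made on board anyway.

Once the car can make those decisions itself, shaving the network round trip from tens of milliseconds to one adds little.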

Which brings me back to why 5G is being pushed. William draws his conclusions through regulatory eyes, considering the ways in which regulators have shaped the market and constrained the way networks operate, particularly within Europe. The situation in the US could be about to change. Ajit Pai, President Trump’s recent appointment as FCC Chairman, used his speech at MWC to highlight a new, lighter touch approach, which may affect the way in which 5G develops there, giving operators more freedom in how they use spectrum.

Regardless of that, William’s regulatory viewpoint concentrates on telling us why we should disregard the myth of 5G.  However, it doesn’t tell us enough about the myth makers, and why the 5G myth has been created.  In Barcelona, the myth makers were very much in evidence, selling their story not just to operators, but to the media, Governments, regulators and anyone else who might be able to help perpetuate their vision.  Because they know that if the myth loses its power, their future may disappear. 

Which brings me to Qualkia. What, you are probably asking, is Qualkia? I’ve coined it as the mobile equivalent of what the PC industry called Wintel – a symbiotic “partnership” between Microsoft and Intel, which drove a cycle of continual development of PCs and Windows software. In a marketing brochure Intel produced in 2009, they claimed that this partnership enabled the two companies to give customers the benefit of “a seemingly unending spiral of falling prices and rising performance”. Customers, particularly corporate customers, would probably have described it differently. Microsoft kept producing more complex and bloated versions of Windows, which needed more powerful processors, which Intel duly supplied. Those chips provided the additional MIPS for the next version of Windows and Office, perpetuating the Wintel cycle. Which was the chicken and which was the egg was as unanswerable for the Wintel partners as it is in any other version of the proverb. The end result was that users, both companies and individuals, were forced into a constant spiral of software upgrades and hardware replacement, often simply to allow them to stand still and run the same applications at the same speed. To take my own example, the version of Word 2016 I’m using to write this article differs little in functionality from what I could achieve with Ami Pro in 1989. But I’ve probably been through a dozen or more PC upgrades to retain that same level of productivity.

The game wasn’t just about selling more products and software licences.  By adding more and more complexity, it constantly raised the technology barrier, making it increasingly difficult for other companies to threaten either Microsoft’s or Intel’s position, making them effective monopolies.  New companies came in and prospered or died depending on how well they played the Wintel platform, but Microsoft and Intel managed to remain largely unchallenged.  The Wintel partnership worked for almost twenty years, only falling apart as smartphones and cloud software began to change the rules of the game for both of them.

It’s a very clever business model. Both companies act separately, in different business areas, but regularly raise the technology bar to protect themselves from competition. The Qualkia imperative acknowledges that the same game is being played in the mobile industry, this time to the benefit of Qualcomm as the major silicon supplier, with the infrastructure giants supplying the network hardware and systems. That’s predominantly Nokia, along with Ericsson and Huawei. Hence Qualkia. These companies learnt to use standards as barriers to entry, developing and contributing IP that constantly moved the mobile standards forward and squeezed out any competition. By ensuring that there would always be a next release of 3G or 4G, they could be confident that they always had a lead on any new upstart. Even if a pretender came along, they would probably be at a disadvantage, because they’d need to pay licence fees to Qualkia.

Within the mobile networking space, there is an interesting extra twist. Regulators generally limit the number of operators in any country, assuming it should be around four. That seems to be the sweet spot for a Government, where they can maximise the money from spectrum licences, whilst claiming that there should be sufficient competitive pressure on consumer prices. Up until now, most 3GPP upgrades have provided customer benefits, so there has been customer pressure for the latest network features, not least because they’re advertised by the phone manufacturers (most of which have Qualcomm chips inside and simply paraphrase the silicon company’s PR). Mobile operators are worried that if they don’t constantly upgrade their network, users will move to an operator who has upgraded. After twenty years at this game, they’re all paralysed by the Fear Of Missing Out and pay for their network upgrades. As margins get thinner and thinner in the operator world, churn becomes the bogeyman forcing them all to dance to the Qualkia tune, just as corporates used to slavishly quickstep to each measure of the Wintel dance.

But there’s a cold wind of change. It’s becoming clearer that 5G isn’t providing new applications. Instead, it’s a story that is being conceived and propagated for the benefit of the 5G myth makers. They need to keep selling major infrastructure upgrades, as without that they have a very bleak future. Without new infrastructure and applications, Qualcomm’s market dominance in silicon could come under threat, as users put off their next smartphone purchase. I suspect that alarm bell is already ringing, given their recent investments in the alternative markets of automotive and IoT. Collectively, the biggest fear in the industry is that LTE evolves to the point where it is good enough for consumers. We may now be within one or two releases of reaching that point. Operators are no longer the goldmines they were. They may not yet have reached the point where they are simply mobile pipes, but that day is coming. At which point, like any other utility, they will need to concentrate on maximising their return from existing assets, rather than constantly investing in new ones.

So expect to hear a lot more of the Qualkia myth over the next few years.  Of course, it could be Qualsson or Qualwei instead of Qualkia.  But the Qual is always there.  Which is an interesting reflection on which company, largely unknown to consumers, has been most effective in driving our mobile industry to where it is today.  They’ve realised that there are better ways to succeed than spending billions of marketing dollars on “Intel Inside” stickers.  All you need to do is keep making up a better story.  Until the day your audience stops believing.