In the last blog I wrote about the immense damage that could be done to the market for connected personal devices and the Internet of Things by licensing the 2.3GHz spectrum to mobile networks. As OFCOM is still asking for consultation responses prior to their auction I thought it timely to list some of the reasons that I believe justify a delay in releasing this spectrum. If you agree that it should be postponed, you have until June 26th to send OFCOM your views. Please do, as I believe this could cost the industry billions of pounds and push back innovation.
The battle pits mobile network operators, who want more spectrum, against the ongoing survival of the 2.4GHz band. The 2.4GHz spectrum is unlicensed, and used by the wireless standards in most consumer devices, including Bluetooth, Wi-Fi, ZigBee and others. If mobile phones start to use frequencies close to 2.4GHz, it will degrade the performance of these products. Your Internet access may slow down, sound bars and Sonos systems may get noisy, hearing aids will perform poorly, and the response of smart home systems could get sluggish or stop altogether. Everything that uses the 2.4GHz band may work less well and have a reduced range, to the point where these are no longer compelling devices. If that happens, users will stop buying products, businesses may close, investors will lose their money and the current Internet of Things bubble will be firmly burst.
There are a lot of “mays” in that. That’s because we can’t be sure. To their credit, OFCOM have commissioned some tests which show that there is a problem, but they didn’t test enough products, or new enough ones, to determine the true extent of that problem. OFCOM’s response is to say that manufacturers need to redesign their products to be more resistant to interference. However, that adds cost, the technology is not yet available for small products, and it can’t be retrofitted to the billions of existing products already on the market. For that reason I believe any auction should be delayed to give the industry time to test and to see whether it can develop solutions. Otherwise the costs could be enormous.
Don’t worry – it’s not a blog about Tinder or Grindr. The connections we’re talking about here are mobile subscriptions and the men are those at this year’s Mobile World Congress in Barcelona. It is still mostly men. Despite the best efforts of the GSMA with sub-events like the Connected Women’s Summit and France’s promotion of its exhibiting companies as “La France Tech” (which must have had the members of the Académie Française heading to their graves for some early turning), MWC remained defiantly male. In the opening keynotes around 85% of the audience were men. Telecoms, for all of its populist marketing, is still largely a suited profession.
What was exercising the males of the species this year was numbers. Back in 2009, Ericsson predicted that there would be 50 billion mobile connections by 2020. At the time it seemed possible; phone usage was growing and everyone expected that the things around us would follow suit by getting their own mobile connections, leading us to that kind of number. It’s now beginning to strike the CEOs within the industry that five and a half years have passed and we’re half-way to that deadline. Yet we’ve still only connected a few tens of millions of machines. That’s why they’re getting so excited about wearables and the Internet of Things as the only way to make those predictions come true.
The mobile industry loves hype. Now that 4G phones have reached the market, suppliers are keen to promote the next dollop of “jam tomorrow” by offering the world 5G – something that’s still rather nebulous, but as always in this industry, allegedly better than what we have today. Most users have still to experience 4G, but that’s par for the course. The industry loves something new, preferably with a bigger number. That raises the question of whether we need it – and even what it is. To try and answer these questions it’s instructive to look back at the history of mobile to see just what the “G”s mean.
Today Google and Nest launched the Thread Group – a new wireless network for home automation. It’s not the first and it won’t be the last, but it has some important names behind it. The big two are Google and Nest, not least because Nest’s products may already be using it. But others in the consortium are interesting. ARM is there. Today they power most of our mobile phones, providing the IP behind the processors in billions of chips. But they have a vision of being the microprocessor architecture of choice for the Internet of Things. These processors will be smaller, cheaper and lower powered, but will provide the first opportunity for chip vendors to think about trillions. ARM’s inclusion in the group is an obvious step in their process of acquisition and investment in IoT companies.
Samsung are there (aren’t they always), but so are some very large names in home automation, such as Big Ass Fans and Chubb. And what must be worrying the ZigBee community is that Freescale and Silicon Labs complete the list of founder members.
The important point here is that Thread is not ZigBee. It works in the same spectrum and can use the same chips. It is also a mesh network. But it is not compatible. As the Thread technology backgrounder says, they looked at other radio standards and found them lacking, so they started working on a new wireless mesh protocol. To put it more crudely, it’s Google and Nest saying “ZigBee doesn’t work”.
Investing in a new wireless standard can be an expensive experiment. The investment can be vast, as I’ve described in a previous article on the cost of wireless standards. It’s not unusual for the combined cost of writing and bringing a standard to market to run into billions of dollars. When a standard loses out to a competing one, it’s a heavy loss both for the VCs who have invested in it as well as the companies who have worked on it. The problem is that there’s not been a good way of determining in advance which standards will succeed and which will fail.
Up until this point, the only real yardstick has been the Intel test. That’s the principle that if Intel invests heavily in a wireless standard (think HomeRF, WiMedia or WiMax), then the standard will fail spectacularly. Conversely, if Intel withdraws its development effort from a wireless standard, as they did in the early days of Bluetooth back in 2002, then the standard will be a roaring success. The Intel test isn’t a perfect one – it failed to predict the acceptance of Wi-Fi – but with a track record of four predictions out of five, it’s a lot better than just flipping a coin.
What the industry needs is a new test. I’m going to suggest the Byron test. It’s a more literary approach, suited to the alphabet soup of the 802.11 family of wireless standards and inspired by the popular description of the romantic poet as “mad, bad and dangerous to know”.
Back in 2010, Mark Thomas, the head of PA Consulting’s Strategy and Market practice published a book called The Zombie Economy. In it he defined a Zombie company as one which is generating just about enough cash to service its debt, so the bank is not obliged to pull the plug on the loan. The issue with such companies is that they can limp along, and just about survive, but as they don’t have enough money to invest, they fall over once the economy picks up, as they become uncompetitive. The problem they pose is that by continuing to exist in this Zombie state they threaten the development of other companies, acting as a damper to more sustainable businesses.
It struck me that there’s a close analogy in the area of wireless standards, where we have what are effectively Zombie wireless standards. There’s not necessarily anything fundamentally wrong with these individual standards, other than that they have failed to get traction and so limp along. Here, the problem is that they tend to jealously claim a particular application sector or market segment, blocking other more successful standards from entering. That has a damping effect on product development, creating silos which put off innovation in the hope that one day the standard will gain traction, constantly delaying growth and interoperability. Because they’re not being incorporated into enough products, they have effectively lost their ability to function and have become half-dead, half-alive ‘Zombies’.
I think it’s time to recognise the damage that this is doing. Rather than pursuing multiple parallel paths, the industry needs to concentrate on a far smaller number of short range wireless standards. They in turn need to embrace the requirements of a wider range of sectors.