It’s been customary for Azenby to write a year-end review of the mobile world, but just for a change we are leaping forward to December 31st, 2029 to review the last decade. No, we haven’t found a way to time travel. We just thought it would be stimulating to gaze into our Crystal Ball for a change, instead of looking in the rear-view mirror, and have a go at predicting what the mobile world might look like come 2030. Of course, it is pure fiction. Maybe!
Standards Development Organisations
Geopolitical fallout at the beginning of the decade continued with the US imposing ever more restrictions on the use of technology from China in critical communication infrastructure and all underlying technology. An already fractious relationship between the two super-powers was intensified by the outbreak of the coronavirus and the Covid-19 disease, which irrevocably changed the blueprint for the decade to come. An argument that started with digital security concerns became a full-blown new cold war over health and human security, and that ramped up the stakes considerably.
The US persuaded more allies in the West, beyond just the Five Eyes, to follow suit and similarly ban Chinese vendors from their national infrastructure. The ban extended to any product from any OEM containing software code written in China, and any component from China met the same restriction. Indeed, the restriction extended beyond China to other countries thought hostile to western interests, including Russia, North Korea, and Iran.
The inevitable fallout from this was a realisation in the West that it had fallen behind China in this area of technology, and that there was a gap to bridge if it was to deliver to consumers and to industry the sort of communication services that 5G had promised at the dawn of the decade. It also brought home that what was missing was not only manufacturing capability, for too long blamed solely on the ‘China price’ for 4G and 5G equipment eradicating incumbent vendors’ market share, but also a gap in expertise and Intellectual Property.
Western governments pondered this and concluded that their major Communication Service Providers (CSPs) and home-based OEMs had more or less abandoned serious participation in the standards body behind 5G (and 3G and 4G before that) and left the development of the crucial standards – and the ownership of critical IP – to Chinese manufacturers, who had swamped 3GPP for a number of years and completed a peaceful ‘coup d’état’. Operators and manufacturers alike had sleepwalked into this position. Regulators and governments hadn’t seen it coming; indeed, some had been actively courting better relationships and inward investment from China since the start of the new millennium. The unintended consequences had now come home to roost.
The answer, they decided, was to start a new standards body for communication standards to replace 3GPP. Led by hawks in the USA, determined to ensure that US-based manufacturing could rebuild and gain a dominant position, they created the new 6th Generation SDO, called ATIS6G, and invited partners in the West to join. The first thing it did, which in hindsight was a master stroke, was to end backward compatibility. Controversial as this seemed at the time, it took a lot of complexity out of the standard, reduced development cost and had a dramatic effect on the time to bring the standard to market. It also meant the engineers could focus on the major goals and benefits that had been identified, rather than worry about hard-to-resolve backward compatibility issues. The goals were identified as ever faster data speeds, lower latency, spectral efficiency, dispersed data storage, broadcast, one Operating System for device and network, and enhanced cyber security built in. An IoT variant of the standard followed, but the 5G URLLC option was ignored, believed at the time to be fairly pointless in practice. Removing the complexity caused by the need for backward compatibility resulted in lower-cost wireless access points and, together with the merging of mobile and fixed cores, led inevitably to the amalgamation of the new standards body with the IEEE and the consolidation of WiFi and cellular standards.
Meanwhile the EU bloc, sensing it could be vulnerable to the new standard, was also aware of the need to create an industrial advantage for its own manufacturing base. Hankering after the glory era of GSM, it was determined to recreate the environment that had made GSM a global success story and turned Nokia, Ericsson, Siemens, and Alcatel into dominant suppliers to Europe and beyond. A schism between France and Germany was inevitable, as the French were determined that ETSI, which they still regarded as their baby, should be the SDO responsible for a new standard. But with ETSI at the time still being a part of 3GPP, before its disintegration, that just wasn’t possible.
Germany and the Nordic countries saw the need for a new dedicated European mobile standard and created the EU7G Programme. They agreed that backward compatibility would end at 6G and that ETSI would remain the standards body for 2G through 5G but have no involvement beyond that. One thing both the new European and US standards bodies agreed upon was that no IP from 2G-5G would be carried forward into the new standards. This was to encourage the industry to get radical about technology and be more inventive. They went even further and – taking a leaf out of the Chinese OEMs’ own early development playbook – ignored any Chinese claims of IP infringement. This, of course, inevitably meant there was no hope for western vendors’ products in the Far East. In this way, the world’s telecoms standards fragmented into regional systems.
With 3GPP now falling apart as the organisational partners split, it was inevitable that the Chinese body, CCSA, would break away and focus on its own standards development. That left TTA (Korea) and TTC (Japan) with a dilemma over where to go. Many doubted they could work in unison and expected them to seek partnership in both ATIS6G and EU7G, but the realisation that creating industrial capability was paramount, and that we were at the beginning of new standards wars, led them to conclude that they would instead have to work together, independently of the other new and existing SDOs. That just left TSDSI, and all the world focused on what India would do. India was seen as a vital jigsaw piece needed by all the new SDOs, being the world’s second largest market for telecoms.
The new SDOs were courting, persuading, even bribing India to join with them. Political pressure was being exerted. India pondered long and hard about going it alone and creating an Indian sub-continent sphere of influence of its own, but as that would have needed to include Pakistan, it was seen as a step too far. So, given assurances from the USA about pooling and sharing IP, India decided to join forces with ATIS6G, with the agreement that at least 30% of the standards development work would be done in India. And so the stage was set for the battle to commence between four standards bodies, all determined to move fastest and provide the most advanced and economically efficient communications infrastructure.
It took time for other countries and regions to decide what to do. Russia decided it was too weak to go it alone, despite failed attempts to do so, and threw in its lot with China to create what came to be known as the Red Standard. Latin America decided against tying itself to its historic relationships, and most Lat Am countries, but not all, joined the Red Standard. Africa went more or less with the Red bloc as well, making it the largest of the new SDOs, at least in terms of addressable market.
The new SDOs did at least agree to work together on interoperability and interworking standards. This was driven by a forlorn hope from the dominant device and operating system owners that a single global market could be retained. Interoperability, however, was at the very heart of why the 3rd Generation Partnership Project was conceived in the first place, bringing the leading SDOs together to create 3G. Consequently, it always looked impossible to achieve any significant degree of interoperability while the four new standards bodies were internally focused and felt the need to move fast, certainly faster than the other SDOs. They didn’t want the handbrake effect of sorting out interworking. This would have seemed inconceivable in 2019, but in 2020 the world had stopped travelling and didn’t know when it would start again. A mobile device that worked everywhere in the world didn’t seem a priority at that time.
We know now that this fragmentation ended the existence of a global standard for communications, and device compatibility between different zones became limited to a small sub-set of functions and capability. Can we say that this was offset by the enhancements made in telecommunication technologies as the various standards bodies competed to produce the best in class? At the beginning of this decade a 5G device cost $1k; at the end of it a 7G device has been launched in three markets for less than $200 (at 2020 equivalent pricing).
Device incompatibility has been overcome by separating content and data from access, to the extent that any device can access a user’s data. This, with the advent of cheap devices, meant that people no longer travel with their own devices and simply use freely available devices wherever they go, inserting their profiles with the scan of a QR code. How we view devices now is so different from when the decade began, and nobody really sees the way it all happens today as a problem.
We now see that effective competition between standards bodies increases the pace of technological change, and that new technologies overcame the old interworking restrictions which had led the drive for a single global standard. 90% of compatibility is now provided by OTT apps, which let inter-standard interworking happen at a very low ‘access’ level, something that proved relatively simple.
The dichotomy between the cheapness of a Wi-Fi access point (which users bought and installed themselves) and the cost of cellular access was suddenly front of mind for the industry at the beginning of the decade. Different deployment models, underpinned by different business models, were becoming increasingly difficult for customers to fathom. Other factors were also at play in why the two ways of accessing wireless communications were so different. One was the ‘gift’ of an abundance of licensed spectrum at 3.5 GHz, which was turning out to be more of a headache for mobile operators. Getting devices to behave consistently over such a wide spectrum range, from 700 MHz to 3.5 GHz, was one issue, but the real issue MNOs were grappling with was indoor coverage. Penetration of 5G and then 6G signals in the higher frequency bands into buildings was causing a whole new wave of coverage issues for Communication Providers, issues they thought belonged to a distant past.
It became clear that cellular radio inside buildings had to be provided by radio access points on the inside. In fact, the industry had known this for some time but had chosen to ignore it. That could no longer be the case, because something happened in the early part of the decade that it was just not expecting. Fibre in the ground was becoming prevalent, with a number of players exercising new business models and investing in putting fibre direct to homes and businesses in urban and suburban environments. The new economy of scale meant that broadband over fibre was competitive, and it certainly had better performance characteristics. Consumers and businesses were quite prepared to pay for wireless distribution inside the building and, with WiFi sharing becoming commonplace, were becoming less dependent on cellular macro coverage outdoors. The emergence of very cheap Wi-Fi-only devices was fuelling this.
The market was changing, and for CPs, selling 6G indoor access points to customers who then sorted out their own backhaul had become the way to go. Self-provision of cellular access points is now as common as buying a WiFi wireless router was at the start of the decade.
So, the new era of CP mergers had begun, and it would reshape the industry for the decade to come. MNOs were buying into or merging with fixed broadband players, and fixed broadband players were equally keen to buy cellular properties. As we have now seen, the most successful of this new breed of CPs were the ones that also knew they had to include content in their product offering. Content was where customers perceived the most value, and content was what they were most likely to spend their money on, rather than access. The way they accessed the content they wanted had become commoditised; they just wanted a low price, high bandwidth and no latency.
This still left an issue for rural coverage. The problem was that although rural territory often made up as much as 80% of the landmass of a market, it equated to only 10% of the population. It was a commercial problem for CPs and also a political problem for administrators and regulators, who felt compelled to act to ensure it was addressed. Spectrum made available at 400 MHz helped enormously, as range was increased so much that a single 400 MHz base station could replace eight 800/1900 MHz base stations, but the amount of spectrum in this band was not enough to provide the capacity needed for the demand.
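The one-for-eight substitution is consistent with simple link-budget arithmetic. As a back-of-envelope sketch of our own (an illustration under a log-distance path-loss model with an assumed suburban path-loss exponent of 3, not a figure from any standard), moving from 1900 MHz down to 400 MHz roughly octuples the area each base station can cover:

```python
import math

def coverage_radius_ratio(f_high_mhz: float, f_low_mhz: float, path_loss_exp: float) -> float:
    """Ratio of cell radii when moving from f_high to f_low, assuming the
    same link budget and a log-distance path-loss model:
    L = 20*log10(f) + 10*n*log10(d) + const.
    Equating maximum path loss gives d_low/d_high = (f_high/f_low)**(2/n)."""
    return (f_high_mhz / f_low_mhz) ** (2.0 / path_loss_exp)

# 1900 MHz -> 400 MHz with path-loss exponent n = 3:
r = coverage_radius_ratio(1900, 400, 3.0)
print(round(r, 2), round(r ** 2, 1))  # radius ratio ~2.83, area ratio ~8.0
```

Under these assumptions the covered area scales as the square of the radius ratio, which is why one low-band site can stand in for roughly eight higher-band sites, while the thinner spectrum at 400 MHz still caps capacity.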
Local ‘private’ 5G and 6G networks were built using newly deregulated frequency bands. These, then, became more widely accessible through neutral hosting in some cases and through national roaming in others. Nonetheless, the economics were still not working for effective competition in rural areas and regulatory intervention in the marketplace was going to be necessary to resolve this. Many differing yet similar schemes emerged around the world from having a state-owned infrastructure company provide the infrastructure through to dividing up areas between MNOs and enforcing national roaming in these areas.
The bold experiment to use commercial networks for blue light services has been successful, though not without early growing pains. Reserving and dedicating a relatively small amount of spectrum for blue light services, together with more sophisticated prioritisation mechanisms, proved to be an effective solution in combination with commercial networks.
From embedded SIM to software accreditation
The evolution of the SIM has been closely coupled with developments in personal data privacy and security.
Our accreditation to use services is no longer controlled by the many organisations we subscribe to or register with; it is controlled by the personal keys we keep in our trusted digital vaults and allow third parties to access at our discretion. What used to be the SIM, supplied and owned by the CP, became subsumed by these personal keys. This means we have gained access to services from all devices with one single authentication. Devices have become shareware and commoditised.
Not only did this allow us to switch service providers quickly and simply but also to use our keys as our personal authentication to many other services, utilities, apps, financial services, and web sites.
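The mechanics of vault-held keys can be sketched as an ordinary challenge-response exchange, in which the key never leaves the vault and only a signed response is released. The sketch below is purely illustrative: the class names (`Vault`, `Service`) and the use of an HMAC over a shared secret are our own assumptions, not any real vault API.

```python
import hashlib
import hmac
import os
import secrets

class Vault:
    """Holds the user's secret keys; releases responses, never the keys."""
    def __init__(self):
        self._keys = {}

    def enroll(self, service_id: str) -> bytes:
        # Generate a per-service key; share it once at registration.
        key = secrets.token_bytes(32)
        self._keys[service_id] = key
        return key

    def respond(self, service_id: str, challenge: bytes) -> bytes:
        # Only the HMAC of the challenge leaves the vault.
        return hmac.new(self._keys[service_id], challenge, hashlib.sha256).digest()

class Service:
    def __init__(self, service_id: str, shared_key: bytes):
        self.service_id = service_id
        self._key = shared_key

    def issue_challenge(self) -> bytes:
        return os.urandom(16)

    def verify(self, challenge: bytes, response: bytes) -> bool:
        expected = hmac.new(self._key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

vault = Vault()
svc = Service("bank", vault.enroll("bank"))
challenge = svc.issue_challenge()
print(svc.verify(challenge, vault.respond("bank", challenge)))  # True
```

The design point is that the same vault can enroll with any number of services, which is what makes one set of personal keys usable as authentication for utilities, apps, financial services and web sites alike.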
Software SIMs enabled a high degree of interoperability across the four SDO domains and networks deployed on different standards. Our profiles and personal keys are securely programmed into the software SIMs which one selects from a library depending on where in the world one is.
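The profile ‘library’ idea can be pictured as a simple lookup: the device maps its current location to a standards domain and activates the matching software-SIM profile. Everything here is a hypothetical illustration of our own; the profile fields, domain names and `vault://` key references are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class SoftSimProfile:
    domain: str    # which SDO's networks the profile can attach to
    imsi: str      # illustrative subscriber identity
    keys_ref: str  # pointer into the user's vault, never the raw keys

# The user's library of profiles, one per standards domain.
LIBRARY = [
    SoftSimProfile("ATIS6G", "310-150-0000000001", "vault://keys/atis6g"),
    SoftSimProfile("EU7G",   "262-010-0000000002", "vault://keys/eu7g"),
    SoftSimProfile("RED",    "460-000-0000000003", "vault://keys/red"),
]

# Illustrative mapping from country to standards domain.
COUNTRY_TO_DOMAIN = {"US": "ATIS6G", "IN": "ATIS6G", "DE": "EU7G", "CN": "RED"}

def select_profile(country_code: str) -> SoftSimProfile:
    """Pick the software-SIM profile matching the local standards domain."""
    domain = COUNTRY_TO_DOMAIN[country_code]
    return next(p for p in LIBRARY if p.domain == domain)

print(select_profile("DE").domain)  # EU7G
```

A borrowed device would run the same selection after the user's profile is loaded, which is what makes roaming across the four domains feel seamless despite there being no common air interface.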
The evolution of MNOs to Content Service Providers
As we mentioned earlier, the convergence of fixed and mobile kickstarted the search for the third element: content. Ever since 2G, the operator and service provider roles had sat together, at times uncomfortably. 2G embedded a set of services into the standard: voice calls, SMS, call forwarding and so on. Nothing spectacular, but a first set of essential services that 2G should support. Here the roles of the operator and the service provider were clear and simple: both the access to the services and the services themselves were provided by the operator of the network.
One of the founding principles of 3GPP was that the standards body should only define the technology required to enable services. The operators were very slow to see the long-term consequences of this, and by the time they did, the services ship called OTT had long since sailed. No services were defined in the emerging 3G standard. White papers and examples of services were produced, but it was decided that the operators would devise their own services: the provision of services was where competition should begin, and services would be determined by the operator.
This meant that the operators now had two functions: to operate the network and to develop the services. The first was part of their heritage and core competence. The latter proved to be beyond them. The other crucial change that 3GPP brought about was the opening up of the network interfaces to allow third parties to provide services, following the internet business model of the time. Operators thought that they would be able to control who offered services on their network, but this was a fallacy. The cat was out of the bag and the slippery slope to dumb-pipe provider had begun.
Customer money followed the services, predominantly through the application layer, which was totally beyond the control of the network operator. This, for the first time, broke the dependency between revenues from services (mainly subscriptions) and the investment required to build the networks and create the capacity for the services being carried. Between 2000 and 2020 we saw the impact this had on MNOs: as the conventional business model broke, the need to dramatically reduce costs drove MNOs to share base station sites, sell towers, share RAN infrastructure and participate in a round of consolidation and mergers.
In later stages of 4G and through 5G, technology evolution with software defined networks, network function virtualisation and network slicing led to most of the network being implemented in software and located in data centres and in the cloud. Apart from antennas, only RF generation and amplification remained at the physical cell sites.
New spectrum bands up to millimetre waves, including shared and deregulated use of some of that spectrum in 5G, coupled with network virtualisation and slicing, drove a considerable level of diversity in the types of networks built and also a large degree of diversity in who operated them. There came a surge of private 5G networks for the Enterprise, as well as neutrally hosted local networks in rural areas.
One thing was common to the 6G networks which emerged in the four SDO domains. Artificial Intelligence (AI) in network technology drove a huge change in network architectures, whereby the physical boundaries of networks disappeared and were replaced by logical boundaries. Conventional base stations disappeared, replaced by self-managing devices. Content is now obtained and delivered locally. Devices negotiate amongst themselves the configuration and control of the ‘virtual’ 6G networks and the routing of content from source to destination, a bit like how internet traffic is routed. This was the case for human users as well as for the ‘things’ of the 6G world shaped by the Internet of Everything (IoE).
6G development, the self-provisioning of connectivity, and the fixed-mobile convergence of infrastructure changed the economics of the operator to be much more aligned with those of ISPs in the 1990s. Investment priorities moved to fibre and backhaul. This also resulted in the operator becoming more involved in the service experience and buying into content, which became the major revenue generator, whether as part of the subscription bundle or sold as pay-per-view on top of it. This forced the second great shake-out in the MNO sector: those with access to content thrived, and those without wilted and died. Markets got used to three or four dominant service providers supplying their mobile, broadband, internet and content. Independent content providers began to realise that to provide their own streaming services they were going to have to cooperate and work with carriers, unless they wanted to invest in infrastructure themselves.
The App Market Revolution
Why Android and iOS were unsustainable
Once the US prevented Google and Apple from licensing Android and iOS to Chinese device manufacturers, China had no option but to develop its own operating system. In fact, the development of an OS became one of the key objectives of all the new SDOs. The OS was more than just the necessary firmware to allow a device to function; it was the gateway to the app ecosystem, and the app ecosystem was where the money was generated.
The unintended consequence of the US decision on iOS and Android was that China did something it had previously only dabbled with: it built its own OS. That OS not only supported a very Far East-centric app ecosystem, including payment methods, but, through China’s soft power and political influence, was adopted across much of Africa and Lat Am. The other clever thing China did was to couple the device OS with the network OS, and it was thus able to dominate both the infrastructure and device markets.
The new OS also meant China could produce 6G devices for $100, which gave it extreme dominance in the developing and emerging markets. This badly wounded Google and Apple and broke their global supremacy in the app market. In Europe, Symbian-2 failed badly: it was seen as slow, cumbersome and costly, and was eventually abandoned.
Privacy and Data
The decade brought the advent of personal vaults, in which we keep our data and make it available to others at a price, i.e. a complete reversal of the situation at the start of the decade.
Solid was viewed by many as a pipe dream, and some said that vested interests would lobby hard to destroy it. But Tim Berners-Lee had one vital advantage: the trust of two generations who had grown up in the internet age. The concept of us all deciding where to store our data and who to allow access to it was probably the biggest game changer of the decade. As a new generation of users took much more interest in who had access to their data, where they were browsing, what they were searching and how they were being profiled, and then decided that they should be the guardians of all this, the very business models that had made the Tech Giants, well, giants, were crumbling.
Users chose what data to make available to third parties, on what conditions and at what price. Tech Giants and social media service providers were no longer able to mine data for free and sell it on at great profit. A new balance existed between the data owner and the data seller. The biggest users of personal data, advertisers, then discovered that people truly opting in to what data they could access actually gave them better results: advertising expenditure was better targeted and yielded better responses and more conversions to purchase.
Thus the business model of tech giants offering hitherto free use of search engines, blog sites, messaging and social media, in return for unfettered access to customer data, was seriously under threat. It now depended on their users agreeing to make available to them, at prices they could afford, the very data on which they had built their businesses.
The New Internet Age
It would have seemed implausible, even incredible, at the beginning of the decade that by the end of it we would see a regulated internet. The uncontrolled, no-rules, no-holds-barred approach under which the internet existed and flourished became unsustainable. Partly due to interventions, attacks and destabilisation by sovereign actors, and partly due to the need for increased quality-of-service controls, a ‘clean’ and safe internet became a necessity.
The early moves by Roskomnadzor (the Russian regulator) to successfully close down the internet in Russia, except for internal communications, were the first step towards a segregated internet. The official rationale was that Russia needed to ‘switch off’ the global internet to defend itself against cyber-attacks from foreign agencies, with all traffic routed to its ‘eye’ for evaluation.
Others saw this as an experiment: after cutting off the outside world, would web pages that are often made up of content drawn from scores of different sources and places, perhaps even from other jurisdictions, still work? Would crucial financial services, aviation systems and defence systems stop working in this scenario? Some say this paved the way for those who believed the only sustainable way forward for internet services was, in effect, two internets: the wild west version it had become, and a new regulated internet.
And so, WWW 2.0 came into existence, usable only by registered people and organisations. This was the much-needed safe haven in which financial transactions, health care systems, and traffic and aviation systems could be deployed. Users are now safe from hackers and from manipulation, and we all feel confident that our data is not compromised.
The wild west internet (v1) still exists, though it is smaller and very unreliable, as those who do try to do business there are frequently subject to DDoS attacks, which make it feel like ‘the shops are closed one day out of three’.
30th December 2029
We’ll give ourselves marks out of ten at the end of the decade; whether any of this comes to pass, only time will tell. One thing we at Azenby are pretty sure about is that we will see a lot of change over the next ten years: the pace of change in the ICT world is accelerating fast, and maybe there will be new disrupters we have not even envisaged yet. One hope we have above all else is that Mobile Operators and other Communication Service Providers re-engage with the standards bodies’ work and exert more influence, making sure we focus on technology changes with firm financial business cases attached and end the mad rush for the next ‘G’ before we ever get near to payback on the last one. Without a doubt, Service Providers also need to be a lot more careful about what they wish for, as our industry has a tendency to throw up a lot of unintended consequences along the way.
We hope you have enjoyed reading our review of the 2020s before they happened as much as we have enjoyed writing it. Get in touch and give us your thoughts in the comments section here.