Lately, wireless operators have begun wondering what to include in fifth-generation (5G) networks. There is a feeling of urgency as outside heavyweights like Google and Facebook threaten to upset their cosy business. If the mobile carriers can agree among themselves, they hope to have their fifth-generation networks in place by 2020.
That may be a bit ambitious. Years of haggling lie ahead as policymakers and standards bodies lobby for technologies their national carriers and telecoms firms deem vital for their own wellbeing. However, the hope burns bright that, unlike previous generations of mobile technology, 5G will be a true global standard—allowing travellers to use their personal phones anywhere in the world, without the hassle of swapping their SIM cards for local ones bought on arrival.
What to expect from 5G? At this stage, one of the few things that can be said about 5G with certainty is that—if it is to meet society’s growing demands for ubiquitous and instantaneous connectivity—such networks will need to have a “latency” (ie, response time) of about one millisecond. The time it takes two devices to begin communicating with one another over today’s 4G networks is about 50 milliseconds, and around 500 milliseconds for the still widely used 3G services.
Even 4G is nowhere near fast enough for, say, cloud-based systems to transmit emergency instructions to driverless cars threading their way through traffic. Nor is it good enough to provide seamless language translation between participants sharing a teleconference, let alone to guide a scalpel while a surgeon is performing a life-saving operation remotely. Many real-time wireless applications will need latencies of a millisecond at most.
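To see why a millisecond matters, consider a simple back-of-the-envelope calculation. The latencies are those quoted above; the vehicle speed is an illustrative assumption:

```python
# How far does a vehicle travel before a network instruction arrives?
# Latencies are the article's figures; the motorway speed is an
# illustrative assumption.

def distance_travelled(speed_kmh: float, latency_ms: float) -> float:
    """Metres covered while waiting out one network latency period."""
    speed_ms = speed_kmh * 1000 / 3600   # km/h -> metres per second
    return speed_ms * latency_ms / 1000  # milliseconds -> seconds

SPEED = 100  # km/h, assumed motorway speed
for label, latency in [("3G", 500), ("4G", 50), ("5G target", 1)]:
    metres = distance_travelled(SPEED, latency)
    print(f"{label:9s} {latency:4d} ms -> {metres:6.2f} m")
```

At the assumed 100km/h, a 50-millisecond delay means the car has already moved nearly a metre and a half before an instruction even arrives; at one millisecond, barely three centimetres.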
Another cornerstone requirement will be a data rate of at least one gigabit per second (1Gbps) to start with, and multiple gigabits per second thereafter. Mobile users will need such speeds if they are to stream ultra-high-definition (ie, 4K and soon 8K) video formats to their phones and tablets.
Today, 4G networks based on LTE (long-term evolution) technology can manage between 10 and 100 megabits per second (Mbps), depending on the setup and amount of traffic. Most mobile carriers are still rolling out their LTE services, while a few have started to install the latest LTE-Advanced equipment (ie, true 4G as opposed to the half-baked versions carriers have been pretending are the real thing). The peak bit rate of LTE-A is claimed to be 1Gbps. In the real world, however, it is more like 250Mbps.
So, how much of an improvement will 5G offer over the best of 4G? That is difficult to say. But given the ten-fold improvement seen with each previous generation, an average 5G download speed of 1Gbps seems realistic—with the possibility of up to 10Gbps as the technology matures. Such wireless bit rates exceed even those of the optical-fibre connections currently used to deliver internet access and high-definition television to the home.
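Simple arithmetic shows what those bit rates mean in practice. The 5-gigabyte film size below is an illustrative assumption:

```python
# Time to download an ultra-high-definition film at the rates mentioned
# in the article. The 5-gigabyte file size is an illustrative assumption.

FILM_GB = 5  # gigabytes, decimal units

def download_seconds(rate_mbps: float, size_gb: float = FILM_GB) -> float:
    """Seconds to transfer size_gb gigabytes at rate_mbps megabits/second."""
    megabits = size_gb * 1000 * 8  # gigabytes -> megabits
    return megabits / rate_mbps

for label, rate in [("typical 4G (100Mbps)", 100),
                    ("LTE-A real-world (250Mbps)", 250),
                    ("5G average (1Gbps)", 1_000),
                    ("5G peak (10Gbps)", 10_000)]:
    print(f"{label:27s} {download_seconds(rate):7.1f} s")
```

A film that takes nearly seven minutes over a typical 4G link would arrive in four seconds at the hoped-for 5G peak rate.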
Two technical features—carrier aggregation and MIMO antennas—are responsible for giving LTE-A its big boost over earlier iterations. Neither technique is particularly new, but both are expected to play a big role in helping 5G fulfil its promise.
For its part, carrier aggregation is a way of boosting download speeds by plucking signals from a number of local base stations, instead of simply the most powerful one in the vicinity. These different channels—often with different frequencies from different bands in the spectrum—are combined into what is effectively a single fat pipe capable of delivering data at a far higher rate than would otherwise be possible. In LTE-A, up to five component carriers, each offering up to 20 megahertz of bandwidth, can be aggregated into a single carrier 100MHz wide.
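The arithmetic behind aggregation is simple addition; a minimal sketch using the LTE-A limits quoted above (the particular carrier widths are made up for illustration):

```python
# LTE-A carrier aggregation: up to five component carriers, each up to
# 20MHz wide, combine into one logical carrier of up to 100MHz.

def aggregate_bandwidth(carriers_mhz):
    """Total bandwidth of the combined 'fat pipe', in MHz."""
    assert len(carriers_mhz) <= 5, "LTE-A caps aggregation at five carriers"
    assert all(c <= 20 for c in carriers_mhz), "each component is at most 20MHz"
    return sum(carriers_mhz)

# Non-contiguous chunks from different bands, as carriers typically hold them.
chunks = [20, 20, 15, 10, 10]
print(aggregate_bandwidth(chunks), "MHz")  # 75MHz from scattered spectrum
```

The point is that the component carriers need not be adjacent, or even in the same band, for their widths to add up.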
Given the global shortage of spectrum, most mobile telecoms firms have snapped up frequencies wherever they can. As a result, few of their chunks of spectrum are contiguous. Fortunately, carrier aggregation not only allows mobile operators to boost their data rates, but it also permits them to patch together their disparate blocks of spectrum. This is going to be even more important when 5G enters service in the more crowded wireless world of five or more years hence.
Much the same goes for MIMO (multiple input/multiple output). This works by transmitting two or more data streams via two or more antennas, and having the receiving antennas process all the incoming signals instead of just the strongest one. It has been likened to replacing a country road with a single lane for traffic with a multi-lane highway. Today’s MIMO implementations tend to have three or four antennas on both the transmitting and the receiving ends. But what if each end had tens of antennas or even hundreds? That would translate into a significant increase in download speed, and a far more efficient use of the available spectrum.
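The capacity gain can be sketched with a textbook Shannon-style model, in which throughput scales with the number of parallel spatial streams, itself bounded by the smaller antenna count at either end. The channel bandwidth and signal-to-noise ratio below are assumptions, and real links fall well short of this ideal:

```python
import math

def mimo_capacity_mbps(n_tx: int, n_rx: int, bandwidth_mhz: float,
                       snr_db: float) -> float:
    """Idealised MIMO capacity: streams * bandwidth * log2(1 + SNR).
    Assumes a rich-scattering channel with equal SNR on every stream."""
    streams = min(n_tx, n_rx)   # parallel spatial streams
    snr = 10 ** (snr_db / 10)   # decibels -> linear ratio
    return streams * bandwidth_mhz * math.log2(1 + snr)

# Today's 4x4 arrays versus a hypothetical 64-antenna array,
# over a 20MHz channel at an assumed 20dB SNR.
print(f"4x4:   {mimo_capacity_mbps(4, 4, 20, 20):7.0f} Mbps")
print(f"64x64: {mimo_capacity_mbps(64, 64, 20, 20):7.0f} Mbps")
```

In this idealised model, capacity grows linearly with the stream count: sixteen times the antennas yields sixteen times the throughput over the same spectrum.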
Which spectrum that will be, though, has still to be decided. Today’s wireless devices operate in the crowded 700MHz to 2.6GHz part of the radio-frequency spectrum. It is not as though, once 5G hits the airwaves, chunks of spectrum used today by 4G and even 3G networks will suddenly become vacant. Mobile carriers will still have to continue their legacy services for the millions of subscribers who do not immediately upgrade to the latest devices—and may not do so for years to come.
The obvious answer for 5G is to migrate from today’s UHF frequencies to either the SHF (super high frequency) band between 3GHz and 30GHz, or even to the EHF (extremely high frequency) band from 30GHz to 300GHz. The EHF frequencies are also known as “millimetre waves”, because their wavelengths range from ten millimetres down to one. Current occupants of these rarefied frequencies include satellite television, microwave relay links, air-traffic radar, radio astronomy and amateur radio.
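The “millimetre wave” nickname follows directly from the relation wavelength = speed of light / frequency:

```python
C = 299_792_458  # speed of light in a vacuum, metres per second

def wavelength_mm(freq_ghz: float) -> float:
    """Wavelength in millimetres for a given frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1000  # metres -> millimetres

for f in [2.6, 30.0, 60.0, 300.0]:
    print(f"{f:6.1f} GHz -> {wavelength_mm(f):7.2f} mm")
```

At 2.6GHz a wave is over a tenth of a metre long; across the 30GHz-to-300GHz EHF band it shrinks from ten millimetres to one, hence the name.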
In most regions of the world, a chunk of spectrum around 60GHz has been designated for public use. With their new 802.11ad standard, the WiFi community plans to exploit the unlicensed 60GHz band for streaming ultra-high-definition video around the home. In typical configurations, 802.11ad can beam more than 6Gbps over modest distances.
As always, there are drawbacks. One is that such extreme frequencies are easily blocked by walls and even by people moving around. They are also absorbed by the atmosphere, as they cause oxygen molecules in the air to resonate—though the effect only becomes significant at distances greater than 100 metres or so. Above 70GHz, however, oxygen absorption largely disappears. Nokia, a Finnish network-infrastructure firm, is said to have achieved speeds of 115Gbps over short distances in laboratory trials at 70GHz.
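Oxygen resonance aside, higher frequencies also suffer more free-space path loss over the same distance, which is another reason ranges shrink. A sketch using the standard path-loss formula (the 100-metre link is an illustrative choice):

```python
import math

def fspl_db(distance_m: float, freq_ghz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_ghz * 1e9 / c)

# The same 100-metre link at a 4G frequency versus a 60GHz microcell.
print(f"2.6GHz: {fspl_db(100, 2.6):5.1f} dB")
print(f"60GHz:  {fspl_db(100, 60):5.1f} dB")  # roughly 27dB more loss
```

Moving from 2.6GHz to 60GHz costs about 27 decibels over the same distance, which is why millimetre-wave cells must be small and densely packed.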
All of which suggests that 5G will need base stations closer to users than current cellular towers. As it so happens, that is already a trend. So far, microcells—no bigger than a WiFi modem—have been used mainly inside buildings, to overcome poor mobile reception. To handle 5G’s needs, hundreds of microcellular access points will be required to fill the gaps between existing cellular base stations. With the tiny antenna boxes attached to lamp-posts and the sides of buildings, few people will ever notice them, let alone object to their presence—as is so often the case when new cellular towers are erected these days.
It is tempting to think that, even when the “internet of things” adds billions more digital devices chatting over the airwaves, the technological underpinnings of 5G will offer so much potential bandwidth as to render future generations of mobile networking unnecessary. Indeed, some network architects expect 5G to be the end of the line; everything thereafter, they suggest, will be merely an evolutionary improvement. A nice thought. But the past teaches otherwise, and the future always finds ways to thwart even the smartest of prognosticators.
Thanks to The Economist blog!