There is growing appetite for standards-based 5G LEO satellite cellular.

When most of us think about widespread technology adoption, we have a pronounced urban bias. Tech growth pitches around 5G are based on emerging applications in Western cities and progressive adoption in urban areas of developing countries. Beyond that, most pitches get a bit fuzzy. On-the-ground IoT applications are clear (soil humidity and temperature sensing, for example), but how these devices communicate with a datacenter is less clear.

Line-of-sight 5G base stations are available only where there is enough demand (generally a nearby town) to justify the cost of a base station, and even then only in relatively clear areas; forested or mountainous regions, oceans, and deserts need not apply. Mesh networks and fixed wireless access can extend the reach to IoT devices a little way around a base station, but under the same restrictions. In contrast, low earth orbit (LEO) satellite options like Starlink are visible everywhere they are deployed but depend on proprietary protocols and hardware link support. It is not surprising, therefore, that there is growing appetite for standards-based 5G LEO satellite cellular to truly open up this market.

Satellite options 

Satellite support for phone communication is not a new idea; Motorola introduced its Iridium system in the 1990s. The first generation was not successful, probably because we weren't yet ready for that level of coverage and emerging cellular solutions were more effective and lighter weight (in fairness, a newer version of Iridium is active today). Now that cellular is ubiquitous (at least where coverage is available), many feel it is time to revisit the satellite option.

There are three orbit options: geostationary orbit (GEO), middle earth orbit (MEO), and low earth orbit (LEO). GEO sits at ~35,786km and offers the advantage that any given satellite is always at the same position in the sky. This works well for high-bandwidth, home-based TV and internet service in remote areas: a dish can be aligned once with a satellite and needs no tracking support. HughesNet is one service that offers this option.

MEO satellite orbits lie between 2,000km and ~35,786km and are mostly used for positioning systems such as GPS and other GNSS constellations, and also for moderate-bandwidth support in remote areas. LEO satellite orbits lie between 160km and 2,000km and are the hot option for 5G (and beyond) communication. However, both MEO and LEO satellites move relative to a user, requiring added support to maintain a link.
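
For intuition, orbital speed and period follow directly from altitude under a circular-orbit approximation. Here is a minimal Python sketch; the specific altitudes are representative examples (a typical LEO shell, GPS altitude, and GEO), not values taken from any particular deployment:

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3     # mean Earth radius, m

def circular_orbit(altitude_m: float):
    """Return (speed m/s, period s) for a circular orbit at the given altitude."""
    r = R_EARTH + altitude_m
    v = math.sqrt(MU / r)    # circular-orbit speed from the vis-viva equation
    t = 2 * math.pi * r / v  # orbital period
    return v, t

for name, alt_km in [("LEO, 550km", 550), ("MEO, 20,200km", 20_200), ("GEO, 35,786km", 35_786)]:
    v, t = circular_orbit(alt_km * 1e3)
    print(f"{name}: {v / 1e3:.2f} km/s, period {t / 3600:.1f} h")
```

The output (roughly 7.6km/s and a 96-minute period for a 550km LEO orbit versus a ~24-hour period for GEO) explains why GEO satellites appear fixed in the sky while LEO satellites sweep across it.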

GEO satellites are big, power hungry (they must communicate over a long distance), and expensive to build and launch. MEO and LEO satellites are progressively smaller, less power hungry, and cheaper to build and launch, especially with the advent of small satellites (SmallSats). MEO and LEO satellites can also provide better coverage at high latitudes. Each orbit offers unique advantages and disadvantages, suggesting a blend of options may be ideal for satellite-based communication.

Market opportunity  

This is an early-stage market, but it is backed by some heavy hitters. Starlink stimulated early visibility, Amazon has its Kuiper project, T-Mobile is working with Starlink and with OneWeb, and there is talk of Google collaborating with Verizon. Market forecasts are very encouraging ($29B by 2030 at a CAGR of 29%), which makes sense. How else would we build a truly worldwide communication infrastructure, not just "worldwide as long as you are in/near a city, not in the mountains, not in a forest, not at sea far from a port, not…"?

In our modern and riskier world, those limitations are no longer acceptable: wildfires, hurricanes, flooding, and crop failures pose risks that 5G satellite-based IoT technologies could help mitigate if deployed widely. Critical services, such as emergency communications for individuals or emergency services, should continue to function even if power is lost. This becomes possible if emergency communications from user equipment (UE) or from a dish/feeder station can connect directly to a satellite.

Of course, we want competition to drive prices down and accessibility, quality, and capabilities up. That can’t happen if we are locked into Starlink or a similar proprietary service. This is why we need a 3GPP-ratified standard with which all infrastructure and UEs will play nicely. 

Technology challenges 

LEO satellites present a terrific new communication opportunity, both commercially and for society in general. Nevertheless, these satellite-based systems bring new challenges. One is latency. Latency to a geostationary satellite can be 600ms, versus around 30ms for a cable connection. That figure is not just simple round-trip time; there is a lot of other processing going on too. LEO satellite latencies fall in the range of 180ms, better than the geostationary option but still quite a bit longer than ground-based communication. These latencies are not currently suitable for ultra-low-latency applications, but for many IoT uses they may not be a problem.
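
To separate physics from processing, here is a back-of-the-envelope propagation calculation for a bent-pipe round trip (UE to satellite to ground station and back), assuming the satellite is directly overhead and ignoring all processing, queuing, and scheduling delays:

```python
C = 299_792_458  # speed of light, m/s

def bent_pipe_rtt_ms(altitude_km: float) -> float:
    """Round-trip propagation: UE -> satellite -> ground station and back,
    satellite assumed directly overhead, so four traversals of the altitude."""
    return 4 * altitude_km * 1e3 / C * 1e3

print(f"GEO, 35,786km: {bent_pipe_rtt_ms(35_786):.0f} ms")  # ~477 ms
print(f"LEO, 550km:    {bent_pipe_rtt_ms(550):.1f} ms")     # ~7 ms
```

Propagation alone accounts for nearly 480ms of the GEO figure; for LEO it is almost negligible, so the ~180ms quoted above is dominated by processing, scheduling, and routing rather than distance.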

While a LEO satellite orbit improves latency, each satellite covers a smaller area at any one time and is moving (completing an orbit roughly every 90 to 120 minutes), demanding frequent handovers between satellites even when the ground UE is stationary. Handovers add to communication latency. Link controllers may hand over to another LEO satellite, or perhaps to a nearby MEO satellite offering a longer period before the next handover. Unsurprisingly, algorithms in this area continue to evolve.
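
The handover cadence falls out of simple geometry. A rough sketch of the longest possible single-satellite pass for a stationary user, assuming a 550km orbit and a 25° minimum elevation mask (both illustrative values, not from any particular system):

```python
import math

MU, R_E = 3.986004418e14, 6371e3  # gravitational parameter (m^3/s^2), Earth radius (m)

def max_pass_seconds(alt_m: float, min_elev_deg: float) -> float:
    """Upper bound on single-satellite visibility for a stationary user:
    a pass straight through the zenith, circular orbit, spherical Earth."""
    r = R_E + alt_m
    v = math.sqrt(MU / r)  # orbital speed
    eps = math.radians(min_elev_deg)
    # Earth-central half-angle of the coverage circle at this elevation mask
    beta = math.acos((R_E / r) * math.cos(eps)) - eps
    return 2 * beta / (v / r)  # time to sweep 2*beta at angular rate v/r

print(f"{max_pass_seconds(550e3, 25) / 60:.1f} min per pass")  # roughly 4-5 minutes
```

At these parameters a satellite is usable for only about four to five minutes per pass, so even a completely stationary UE may be handed over every few minutes.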

An additional problem comes from Doppler effects. To stay in orbit, LEO and MEO satellites must move quickly: MEO satellites at 3.1km per second (km/s) or more, LEO satellites at up to 7.8km/s. Doppler shifts at these speeds can be severe, especially for LEO satellites, and can significantly degrade link reliability. Corrections based on the known ephemeris of a satellite are effective only if the ground terminal is stationary or at least moves predictably, which is typically not the case.
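
The scale of the problem is easy to bound with the narrowband Doppler approximation f_d = f_c · v/c. A short sketch, assuming a 2GHz carrier (an illustrative S-band value) and taking the full orbital velocity as the worst-case line-of-sight component:

```python
C = 299_792_458.0  # speed of light, m/s

def max_doppler_hz(carrier_hz: float, sat_speed_ms: float) -> float:
    """Worst-case narrowband Doppler shift, f_d = f_c * v / c, taking the
    full orbital velocity as the line-of-sight component (an upper bound)."""
    return carrier_hz * sat_speed_ms / C

for label, v in [("LEO, 7.8 km/s", 7.8e3), ("MEO, 3.1 km/s", 3.1e3)]:
    print(f"{label}: up to {max_doppler_hz(2e9, v) / 1e3:.0f} kHz at a 2 GHz carrier")
```

Tens of kilohertz of shift, changing sign as the satellite passes overhead, is far beyond what terrestrial 5G receivers are designed to track, hence the interest in the compensation techniques below.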

Alternative modulation schemes have been proposed to compensate for Doppler, such as OTFS (orthogonal time frequency space) in place of the more conventional OFDM (orthogonal frequency division multiplexing). Other proposals use deep learning for compensation (when connecting to dishes rather than a movable UE, I would guess). Once again, this is an area where algorithms are evolving quickly.
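
For a feel of how OTFS differs from OFDM, here is a minimal numpy sketch of an OTFS modulator and demodulator: data symbols are placed on a delay-Doppler grid, carried through an inverse symplectic finite Fourier transform (ISFFT) to the time-frequency domain, then through an OFDM-style modulator. Rectangular pulses are assumed, and no channel, cyclic prefix, or equalizer is modeled:

```python
import numpy as np

def otfs_modulate(x_dd: np.ndarray) -> np.ndarray:
    """Delay-Doppler grid (M delay bins x N Doppler bins) -> time-domain frame."""
    M, N = x_dd.shape
    # ISFFT: FFT along the delay axis, IFFT along the Doppler axis
    x_tf = np.fft.fft(np.fft.ifft(x_dd, axis=1), axis=0) * np.sqrt(N / M)
    # Heisenberg transform with rectangular pulses = per-symbol IFFT
    s = np.fft.ifft(x_tf, axis=0) * np.sqrt(M)
    return s.flatten(order="F")  # serialize one multicarrier symbol at a time

def otfs_demodulate(s: np.ndarray, M: int, N: int) -> np.ndarray:
    """Inverse path: per-symbol FFT back to time-frequency, then SFFT."""
    y_tf = np.fft.fft(s.reshape(M, N, order="F"), axis=0) / np.sqrt(M)
    return np.fft.ifft(np.fft.fft(y_tf, axis=1), axis=0) * np.sqrt(M / N)

# Sanity check: the two transforms are exact inverses on random QPSK symbols
M, N = 8, 4
bits = np.random.choice([-1.0, 1.0], (2, M, N))
x = (bits[0] + 1j * bits[1]) / np.sqrt(2)
assert np.allclose(otfs_demodulate(otfs_modulate(x), M, N), x)
```

Because each symbol is spread across the whole time-frequency frame, a Doppler shift appears as a compact displacement in the delay-Doppler domain rather than as inter-carrier interference, which is what makes equalization tractable at LEO velocities.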

Network implementation 

Start with the 5G network architecture. Different architectures have been proposed for different use cases. In one, an IoT device connects directly to a 5G LEO satellite over the service link, while the satellite connects to the (terrestrial) core network through a feeder link provided by a satellite dish. Another architecture keeps the same feeder link, but the service link from the satellite terminates at a dish, which in turn connects to an edge network, say for 5G fixed wireless access.
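
Expressed as hop sequences, the two topologies described above look like this (a descriptive sketch with hypothetical labels; both assume a transparent, bent-pipe satellite payload):

```python
# Architecture 1: direct-to-device. The IoT UE talks to the satellite itself.
DIRECT_TO_DEVICE = [
    "IoT UE", "service link", "LEO satellite",
    "feeder link", "ground station dish", "terrestrial 5G core",
]

# Architecture 2: satellite backhaul. The service link ends at a local dish,
# which feeds an edge network (e.g. 5G fixed wireless access) serving UEs.
SATELLITE_BACKHAUL = [
    "UE", "edge network (5G FWA)", "local dish", "service link",
    "LEO satellite", "feeder link", "ground station dish", "terrestrial 5G core",
]
```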

Standardization through 3GPP is still underway, though its studies already suggest the possibility of tens to hundreds of Mbps of downlink bandwidth (with a dish) and round-trip delays on the order of a few tens of milliseconds, all subject to multiple factors of course.

Clearly, between handover management, Doppler mitigation, and methods to minimize latency for different classes of service, new hardware and software will be required. UEs supporting direct satellite links will need enhancements, service and feeder link hardware must be provided, and the satellites themselves must support the protocol, all while those protocols continue to evolve.

Demands on HW/SW development 

Already with Open RAN there has been an accelerating trend away from off-the-shelf CPUs, FPGAs, and DSPs. Infrastructure and user equipment makers want to maximize differentiation and minimize capital and operational costs. This trend will be further amplified for satellite-based networks: off-the-shelf products are too costly and power hungry, with limited differentiation options, to appeal to OEMs. Here, OEMs will turn even more to ASIC solutions with software-defined radio (SDR) architectures. These will be essential for future proofing in anticipation of the algorithm upgrades discussed earlier, as well as for supporting a wide range of existing and future service opportunities to further differentiate products.

From my perspective, these requirements suggest that any competitive solution must offer the following characteristics as an embeddable IP with strong hardware acceleration options yet significant software configurability: 

For Open RAN implementations, the full range of Open RAN support in the base station (supporting both Macro DU and virtual DU as well as Small Cell capability) and in the radio (supporting Open RAN Low-PHY, Massive MIMO and Beamforming). 

For the UE, a baseband platform with a modem supporting the full range of 5G eMBB, URLLC, Sidelink, and RedCap use cases, for both mmWave and sub-6GHz, as well as legacy LTE and Cat 1 technologies. This embedded IP should also offer configurations to support high-end use cases (such as V2X automotive applications) as well as low-margin, ultra-low-power use cases (such as agricultural monitors).

On top of these use cases, as the 3GPP definition moves toward standardization, the solution must offer significant software-based flexibility to evolve the SDR algorithms.

At CEVA, we have considerable experience offering a wide range of wireless platforms, especially cellular options already proven in 5G: our Open RAN platform for infrastructure and our 5G Modems platform for baseband applications. We are tracking this domain very closely and would be happy to share our ideas.

www.ceva-dsp.com