Telecom analytics firm OpenSignal released a report last week analyzing the connection experience of 5G users on ten different providers around the world. Unfortunately—and typically for 5G—the source data is so muddled that it’s difficult to draw meaningful conclusions from the results.
In the USA, Verizon is the only carrier that has deployed a significant millimeter-wave (5G FR2, various bands from 24GHz to 40GHz) network—and in fact, at the moment Verizon is only deploying 5G FR2, which is why its average 5G download speed bar leaps off the chart at 506Mbps. 5G is a protocol, not a wavelength—and the extremely high speeds and low latencies that carriers and device vendors promote so heavily come from the high-frequency, short-wavelength FR2 spectrum, not from the protocol itself.
The other carriers in the chart are deploying 5G in the FR1 range—the same frequencies already in use for 2G, 3G, and 4G connections. FR1 spectrum runs from 600MHz up to roughly 6GHz, and it’s commonly split informally into “low band”—1GHz and below, with excellent range but poor throughput and latency—and “mid band”, from 1GHz to 6GHz, with improved throughput and latency but less range.
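To make those ranges concrete, here’s a minimal Python sketch that buckets a carrier frequency into the informal bands described above. The cutoffs (1GHz, 6GHz, 24GHz) follow this article’s rough splits, not a formal 3GPP band table.

```python
# Minimal sketch: bucket a carrier frequency into the informal bands above.
# The cutoffs mirror this article's rough splits, not a formal 3GPP table.

def classify_band(freq_ghz: float) -> str:
    """Return an informal band label for a carrier frequency in GHz."""
    if freq_ghz < 1.0:
        return "low-band FR1: excellent range, modest throughput"
    if freq_ghz <= 6.0:
        return "mid-band FR1: better throughput and latency, less range"
    if freq_ghz >= 24.0:
        return "FR2 / millimeter wave: very high throughput, very short range"
    return "between FR1 and FR2 (not discussed here)"

# Examples: T-Mobile's 600MHz, Sprint's 2.5GHz, and Verizon's 28GHz deployments
for freq in (0.6, 2.5, 28.0):
    print(f"{freq:>5} GHz -> {classify_band(freq)}")
```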
Carriers deploying mid-band 5G FR1 (such as Sprint) are currently showing average download speeds of 100Mbps-250Mbps, and low-band 5G FR1 carriers (AT&T, T-Mobile) show average speeds of around 50Mbps.
But it’s faster than 4G, right?
OpenSignal has another chart showing that every carrier’s users get much faster downloads on 5G than they do on 4G—AT&T users average 32.7Mbps on 4G and 62.7Mbps on 5G, for example. Unfortunately, this says more about how lightly loaded today’s 5G networks are than it does about the protocol itself—very few users have 5G-capable phones right now, so those who do benefit from far less spectrum congestion.
The 4G standard already specifies download speeds of 100Mbps for moving clients and 1Gbps for stationary clients. Even Verizon’s millimeter-wave-only speed test results are considerably slower than the maximums the 4G standard has defined for a decade.
For most people, the real bottleneck to cellular data throughput isn’t the protocol at all—it’s the number of other wireless customers they must share the spectrum with. The fix for that isn’t necessarily a protocol change; it’s more towers, and frequencies with shorter range that allow smaller broadcast and collision domains. Fewer connections per tower mean more throughput, and lower latency, for each of the remaining connections.
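A rough way to see this is to treat one tower sector as a fixed pool of capacity split among its active users. The capacity and user counts below are illustrative assumptions, not OpenSignal figures, but the shape of the result is the point: cut the number of users per cell and per-user throughput rises, with no protocol change at all.

```python
# Back-of-envelope sketch: a sector's capacity shared evenly among active users.
# The 300Mbps capacity and the user counts are assumptions for illustration only.

def per_user_mbps(sector_capacity_mbps: float, active_users: int) -> float:
    """Evenly divide one sector's capacity among its active users."""
    return sector_capacity_mbps / max(active_users, 1)

sector_capacity = 300.0  # assumed aggregate downlink capacity of one sector, in Mbps

for users in (5, 20, 100):
    print(f"{users:>3} active users -> {per_user_mbps(sector_capacity, users):6.1f} Mbps each")
```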
This isn’t to say that 5G is worthless—even on the same spectrum, the 5G protocol offers lower latency than earlier protocols. With air latency in the 10ms range and total latency of ~30ms, most 5G FR1 networks will have a 30-50% latency advantage over existing 4G networks. Lower latency means snappier web page loading, better gaming, and generally better scaling in dense environments.
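As a quick sanity check on that range: assuming a typical 4G round trip of roughly 45-60ms (our assumption, not a figure from the report), a ~30ms 5G total lands right in the 30-50% window.

```python
# Illustrative arithmetic only: the 4G baselines are assumed; the 30ms 5G total
# is the figure cited above.
nr_total_ms = 30.0
for lte_total_ms in (45.0, 60.0):
    improvement = (lte_total_ms - nr_total_ms) / lte_total_ms
    print(f"4G at {lte_total_ms:.0f}ms -> {improvement:.0%} latency improvement")
```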
5G is still hard to find
The final, ugly truth about 5G is that very few people can actually use it consistently. According to OpenSignal, T-Mobile leads the world in 5G availability—at less than 20% availability for its 5G-capable users.
Verizon—which, you’ll remember, is only deploying millimeter-wave 5G—trails the pack at only 0.5% availability. These numbers hammer home the point we made earlier about speeds: we have no way of knowing what 5G speeds will really look like yet, because so few people are able to use the service. As more people get 5G-enabled phones, and the carriers shift more of their traffic over to 5G, we’ll see average speed per user drop accordingly.
Exactly how far those per-user speeds will drop, we don’t know yet—but we can be certain that low-band and mid-band 5G connections won’t continue to offer double the throughput of 4G connections on similar bands, once 5G becomes the norm.
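To illustrate with the same kind of made-up numbers as the sketch above (not OpenSignal data): a fixed-capacity cell that looks spectacular when only a handful of early adopters are on 5G looks far more ordinary once most of a carrier’s customers move over.

```python
# Illustration only: measured 5G speed falls as adoption grows, assuming one
# sector's capacity is shared evenly among whichever of its users are on 5G.
sector_capacity_mbps = 300.0  # assumed sector capacity, as in the earlier sketch
total_active_users = 100      # assumed active users on the tower

for adoption in (0.05, 0.25, 0.75):
    users_on_5g = max(int(total_active_users * adoption), 1)
    print(f"{adoption:>4.0%} adoption -> {sector_capacity_mbps / users_on_5g:6.1f} Mbps per 5G user")
```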
https://arstechnica.com/?p=1678676