Thursday, October 24, 2019

Qualcomm Publishes Half-Baked Study to Make 5G Look Good

Cellular companies in the United States have a significant problem where 5G is concerned. In other countries, 5G is being deployed in lower frequency bands (sub-6GHz), where the problems with range are much less significant. In the United States, all of the 5G rollouts to date have been in the so-called “millimeter wave” or mmWave bands at 28GHz or above. Non-coincidentally, the 5G rollout in the US has also been something of a joke, with low performance, terrible line-of-sight issues, and devices that overheated in hot weather and had to be carried around in coolers for various publications to complete their real-world testing.

None of those facts mesh very well with Qualcomm’s fervent desire to declare 2019 “the year of 5G.” To make this argument, Qualcomm relied on a third-party analysis performed by the Signal Research Group, which prepared a 41-page report and a multi-page presentation. Both are excellent examples of how companies can distort data to make arguments that suit their purposes.

Qualcomm sent out a PR blast concerning this report, so this is a document with the company’s weight behind it.

Let’s dive in. First, the Signal Research Group (SRG) declares that Qualcomm hired it to perform this work because everyone else has been getting it wrong.

Qualcomm reached out to us mid-summer and asked us to write a paper which highlights 5G network performance. One reason, we suspect, is that our testing and analysis provide credible information that we can back up with supporting data. Frequently, casual “testers,” such as media and bloggers, publish results and analysis from their experiences in a 5G network that misrepresents how the networks are really performing.

Page 7 offers important information about how Signal Research frames its report. “We generally prefer to look at other performance parameters involving signal strength (RSRP) and signal quality (SINR) or the efficiency of data transfers (MCS), however, in order to appeal to a larger audience, we are focusing the results of this paper on well-understood performance metrics.” (emphasis added)

The sections of the paper focused on Seoul, London, and Bern do indeed frame things in terms of Mbps. But not Minneapolis. Here, the goalposts change. Once the focus shifts to Minneapolis, Signal Research writes: “Signal quality and signal strength, in our view, are a better indicator of network performance since these parameters exclude extraneous factors which can influence data speeds, and which do not reflect the full capabilities of the network.” The company does not explain why it has contradicted the position it staked out just five pages earlier.

At this point, the company introduces a new ratio, BSINR. It never defines BSINR or states how BSINR is different from SINR. Googling “BSINR” and phrases like “BSINR 5G” turned up nothing useful. The only thing I know about BSINR is that it must be different from SINR, because Signal Research uses both terms in the same whitepaper, clearly indicating that they are two different things.

But Wait, There’s More!

Here’s an image showing the results of throughput testing in Central London. Black means no throughput, hot pink is up to 250Mbps, and purplish is >250Mbps.

[Image: 5G-Presentation-5]

Here’s an image showing the results of throughput testing in Bern, Switzerland. Black means no throughput, hot pink is up to 250Mbps, and purplish is >250Mbps.

[Image: 5G-Presentation-4]

And finally, here’s an image showing the results of BSINR signal quality testing in Minneapolis. Black means no data, red means BSINR < 0, and hot pink means BSINR >= 15. Instead of data on channel bandwidth, network capability, total mapped area, and total data transferred, we’re told when the testing took place and what phone was tested. The total data transferred (274GB) is given elsewhere in the report.

There are a lot of really good reasons for publications to present data in different ways, but consistency is critical. Switching between data sets and presenting different data without a careful, clear explanation for why you are doing so is, at best, an excellent way to confuse your audience. For Seoul, Bern, and Central London, SRG provides a metric for the total distance its testers traveled while evaluating each 5G network. For Minneapolis, no such figure is provided, and the map SRG includes with its slide appears to be drawn at a different and significantly smaller scale.

The US map also uses color somewhat differently than the European and Korean maps. In one set of maps, black is the lowest color and purple-pink is the highest. In the US map, red is the null/worst color, and hot pink is the best. Regarding the US map, the Signal Research report states: “higher BSINR results in faster data speeds, although data speeds of several hundred Mbps are possible with a BSINR of only a few dB. Gigabit data speeds generally require a BSINR closer to 10 dB or higher – much also depends on the channel bandwidth of the 5G transmission.”

I do not know what “BSINR” is, but SINR stands for Signal-to-Interference-plus-Noise Ratio. There are a number of sophisticated metrics that cellular engineers use, and Signal Research isn’t wrong that these may be better ways to capture 5G network performance. But declaring that you are using one set of metrics at the beginning of a report and then switching partway through, with a self-contradictory explanation and a derived metric you never even bother to define, isn’t a great look.
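
For readers unfamiliar with the metric, here’s a minimal sketch of how SINR is calculated. The function name and the example power values below are mine, invented purely for illustration; none of it comes from the Signal Research report:

```python
import math

# SINR compares the power of the desired signal against the combined
# power of interference from other transmitters plus background noise.
def sinr_db(signal_mw: float, interference_mw: float, noise_mw: float) -> float:
    """Return the signal-to-interference-plus-noise ratio in decibels."""
    return 10 * math.log10(signal_mw / (interference_mw + noise_mw))

# Hypothetical received-power values, in milliwatts, purely for illustration:
print(f"{sinr_db(2.0, 0.05, 0.05):.1f} dB")  # -> 13.0 dB
```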

Beyond this point, we have other questions. What’s the channel bandwidth on the Verizon network, since Signal Research provided it for each of the other providers, and “much also depends” on it? The company doesn’t say.

dB is a logarithmic scale, so “closer to 10 dB or higher” for gigabit isn’t a helpful statement, particularly when “closer to 10 dB or higher” corresponds to three different measurements in the graph. If all of those points were actually capable of gigabit performance, the average 5G performance on the network would be higher than the 383Mbps Signal claims for 5G. Note that 5G only hits 383Mbps if you remove all of the areas where BSINR < 0. If you keep the BSINR < 0 data, data rates for the entire network averaged 174Mbps.
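
To make both of those points concrete, here’s a quick sketch. All of the numbers below are hypothetical, chosen only to illustrate the arithmetic; the report’s per-sample data isn’t public:

```python
import math

# First: dB is logarithmic, so small dB differences correspond to
# large differences in the underlying linear ratio.
def db_to_linear(db: float) -> float:
    return 10 ** (db / 10)

for db in (3, 7, 10):  # all arguably "closer to 10 dB"
    print(f"{db} dB -> {db_to_linear(db):.1f}x linear ratio")
# 3 dB is roughly a 2x ratio; 10 dB is a 10x ratio -- a 5x spread
# hiding inside one phrase.

# Second: dropping the worst measurements before averaging inflates the result.
speeds_mbps = [0, 0, 0, 500, 600, 700]        # invented samples
filtered = [s for s in speeds_mbps if s > 0]  # keep only the "good" areas
print(sum(speeds_mbps) / len(speeds_mbps))    # 300.0 -- whole-network average
print(sum(filtered) / len(filtered))          # 600.0 -- looks twice as fast
```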

If 5G can maintain speeds of several hundred Mbps with a BSINR of “a few” dB, the implication is that everywhere we don’t see a red dot supports network connections several times faster than the fastest LTE networks, which this report indicates are typically under 100Mbps in real-world conditions. The only problem with this claim is that literally every real-world report on 5G performance in the United States suggests it’s not true.

It’s possible that the reason these Verizon charts are entirely out of step with the rest of the document is that they were originally published in a separate Signals Research report. The figures are listed as coming from a report priced at $1,650, entitled “Vikings vs. Bears.” We were unable to examine that report to determine how it differs from the document Qualcomm provided.

Signal Research Group didn’t have a problem providing other details, though:

[Image: What-The-Hell]

Sponsored by the Minneapolis Tourism Department? What does this have to do with 5G?

If You Can’t Challenge Good Reporting, Just Misrepresent It

A little farther on, Signal claims: “Figure 13, along with Figure 14, highlights another very important observation and a key reason why casual 5G testers frequently mischaracterize millimeter wave performance. Figure 13 shows several instances when there wasn’t 5G connectivity and the smartphone used the LTE network instead.”

The figures aren’t actually very important to what I have to say next, but here’s Figure 13, just for completeness’ sake:

[Image: Figure 13]

Casual testers (excuse me, casual “testers”) were defined earlier in the document: that’s “media and bloggers.” Signal makes no distinction between various media outlets. It’s the kind of statement people tend to nod past, unless you happened to report on some of the original 5G network testing in the first place.

But I did. And I remember Sascha Segan writing an entire section of his PCMag article specifically discussing how the 5G network icon flickered during his testing and whether his tests were run on 5G or LTE. A quick Google for “5G network flicker” shows multiple publications raising this issue. It wasn’t some unknown issue skewing results, and it certainly did not pass by unremarked.

All of this adds up to a report that was either remarkably badly written or issued in remarkably bad faith. The data in the US section of this report is presented using different metrics than the data for the other cities it discusses. The US section takes potshots at the supposedly poor execution of early 5G testing while falsely representing the overall state of that coverage. It does not read like an honest assessment of 5G deployments in the United States. It reads like an attempt to paper over exactly how poor 5G service in the United States currently is.

The State of 5G in the United States Is Anything but Good

The reason why I’ve bothered to spend so much time deconstructing this document isn’t that I expect Qualcomm to put out a report on how 5G sucks. It’s because you deserve to see how these arguments are actually being made in sophisticated-looking documents, and how well they stand up to analysis. There’s no way to do that in 500 words when dealing with a document this long.

Carriers in Europe and Korea are using fundamentally different networks with better range characteristics and far fewer coverage problems, which may explain why Signal Research chose to present the US data with entirely different metrics. The good news is, deploying 5G in the 600-800MHz band makes it a lot more like LTE. The bad news, if you’re a carrier who wants to charge US consumers more money and sell them a new cellphone at a higher price, is that deploying 5G in the 600-800MHz band makes it a lot more like LTE.
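
The physics here is worth a quick illustration. Below is a sketch using the textbook free-space path loss formula; it’s a deliberate simplification (real propagation is messier, and mmWave also struggles with walls, foliage, and even hands), but free-space loss alone shows the gap between low-band and mmWave spectrum:

```python
import math

# Free-space path loss (FSPL) in dB, for distance in km and frequency in MHz.
# Standard formula: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for freq in (700, 28_000):  # 700MHz low-band vs 28GHz mmWave
    print(f"{freq} MHz at 1 km: {fspl_db(1.0, freq):.1f} dB")
# 700 MHz:    ~89.3 dB
# 28,000 MHz: ~121.4 dB -- roughly 32 dB (over 1,000x) more loss at 28GHz
```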

Signal Research Group spends a significant amount of time crowing about 5G’s superior stadium signal penetration, going so far as to write:

In the last figure shown in this section, we show the application layer throughput that we observed while walking behind the stadium seating and where rabid Vikings fans purchase their brats and beer. These results are impressive because the network wasn’t designed to provide coverage in this area. However, the RF energy from the 5G radios was able to radiate through the narrow gaps in the stadium seats and find its way into the concourse.

That sounds really impressive — until you remember that Verizon has already admitted that its 5G network can’t cover an entire NFL stadium. Verizon’s full quote on the subject is actually rather ominous.

Verizon 5G Ultra Wideband service will be available in areas of the above stadiums. Service will be concentrated in parts of the seating areas but could be available in other locations in and around the stadium as well. When customers with 5G enabled smartphones move outside Verizon’s 5G Ultra Wideband coverage area, the 5G-enabled devices will transition the signal to Verizon’s 4G network. In some cities the stadium will be the only place with Verizon 5G Ultra Wideband service, offering fans a unique 5G experience they can’t get anywhere else in their local area. (Emphasis added)

It’s not just a question of guaranteeing service within the stadium. Verizon is straight-up telling people that only parts of the seating area will be covered and everything else is a total crapshoot.

Here’s the funny thing. The report by Signal states: “Over the course of a day, we walked the entire stadium, walking through up to 3 rows in each section on all levels, including the end zone sections.” It also states: “Although we can’t definitively state that every single seat in the 66,655-seat stadium has 5G coverage, we are confident that virtually all seats have good, if not great, 5G RF connectivity.” It does not define the meaning of “5G RF connectivity” as opposed to “5G coverage,” despite indicating that these are two distinct and particular things. Googling the phrase “5G RF connectivity” turned up nothing useful. And the actual map in the Signal Research report only shows signal strength in part of the stadium, even though the text refers to a complete stadium map.

[Image: 5G-Presentation-7]

A New Metric For Power Efficiency: Time Spent in Solitaire per 30GB of Data Downloaded Over 5G

Confused? Don’t worry. I’m going to help. The full quote that I’m referencing with the above is:

Although this scenario doesn’t conclusively prove that a 5G smartphone can go an entire workday on a single charge, it does provide strong evidence that it should last a full day. Furthermore, to the extent a battery doesn’t last a full day, it is most likely due to factors other than 5G data connectivity. According to our calculations and results from our field measurements, a 30-minute gaming session of Solitaire can be the equivalent of downloading more than 30 GB of data with a good (not great) 5G connection.

This is an amazing metric.

No, really. It only gets better the longer you think about it. Soon, you too will be cataloging previously indescribable phenomena in your life with all the accuracy sixth-grade English adjectives can muster.

The worst part about the “30 minutes of Solitaire per 30GB of data downloaded” benchmark is that comparing the amount of energy used to perform basic tasks actually isn’t a terrible idea. But it only works if you provide additional context: how much data a typical phone is likely to use per day, or how much CPU, GPU, and modem power (for ad-supported gameplay!) the SoC draws in this specific title. There’s a right way to do the kind of testing Signals Research is handwaving at, and tossing a metric out without context or explanation isn’t it.
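
For what it’s worth, here is the shape such a comparison would need to take. Every power and throughput figure below is invented, which is precisely the problem: without the report publishing its own assumptions, the metric can’t be checked:

```python
# Energy = power x time. Compare a gaming session against a big download.
def energy_joules(power_watts: float, seconds: float) -> float:
    return power_watts * seconds

# Hypothetical: SoC + display drawing 2.5 W during a 30-minute Solitaire session.
gaming_j = energy_joules(2.5, 30 * 60)

# Hypothetical: modem drawing 3 W while downloading 30 GB at 500 Mbps.
download_seconds = (30 * 8 * 1000) / 500  # 30 GB = 240,000 megabits
modem_j = energy_joules(3.0, download_seconds)

print(f"Gaming session: {gaming_j:.0f} J")  # 4500 J
print(f"30 GB download: {modem_j:.0f} J")   # 1440 J
# Whether these are "equivalent" depends entirely on the power and throughput
# numbers you assume -- which is why publishing the metric without them is useless.
```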

I suspect this report and the blog post behind it were an attempt to change the narrative around 5G. But the truth about 5G is leaking out around the seams, despite every single US carrier’s desperate attempts to pretend otherwise. Verizon can’t cover an entire NBA stadium with 5G signal, either. Verizon had to admit 5G is actually going to look a lot like “good 4G” if you don’t live downtown in a major metropolitan area. And there are major questions about whether Verizon’s fixed 5G broadband has a future.

Right now, mmWave 5G is the RDRAM of cellular broadband networks. In highly specific and idealized circumstances, it’s absolutely great. Once you start departing from those circumstances, however, the negatives stack up thick and fast. Lower-frequency 5G has the advantage of better range, but doesn’t offer the same performance. This is physics. Physics, in this case, is highly inconvenient to marketing departments. 5G networks are going to improve over time, but those improvements (and general availability) are going to take years. Right now, it’s no kind of deal.

2019 is not the year of 5G. 2020 is not going to be the year of 5G either, though 5G adoption will certainly be much higher next year, simply because more companies will be shipping 5G phones. The “year of 5G,” realistically, is probably somewhere between 2022 and 2025. That’s assuming cell phone companies are able to improve their networks enough to articulate a reason why anyone outside a major city center should care about 5G at all. As someone who lives in an area where 3G is still widely deployed, I hope you’ll forgive me if I don’t hold my breath waiting for 5G deployments.

Signal Research Group has a reputation for doing stellar work. This report is not an example of it. You don’t need to be an RF engineer to understand the importance of properly defining terms or presenting data in a consistent, clear, and easy-to-understand fashion. I’ve heard it’s even possible to make an argument in favor of a new technology without coming off as profoundly arrogant and completely unprofessional toward everyone else engaged in good-faith evaluation.

Source: ExtremeTech https://ift.tt/2pMe2fg
