The Road to WiFi 6E Part 3: Low Latency

Published on April 5, 2022

This blog post is the third in a series of articles we’ll be presenting over the course of 2022 about WiFi 6 (and 6E) – the current and future top-of-the-line WiFi standards championed by the WiFi Alliance. We’ll be discussing the history of the evolving standard of WiFi, new features introduced in WiFi 6 and 6E, applications beyond the commonly known consumer use cases, and much more. Additionally, we’ll be releasing a companion series of video interviews with Ezurio (formerly Laird Connectivity) experts, which you can find here.

Speedy Delivery: How Low Latency Impacts Everything

If you’ve been following our series of blog posts on WiFi 6 and 6E, you’ll have started to notice a pattern emerging among the many changes and updates in the new standard: Efficiency. Whether it’s packing more bits into each transmission with higher-order QAM, smarter channel management, or new spectrum that spreads devices farther apart across the available frequencies, many of the changes in WiFi 6 and 6E are focused on improving the efficiency of the network. That means more devices, each communicating more effectively and according to its requirements.

In this post we’re talking about the low latency achieved in WiFi 6 and 6E with orthogonal frequency-division multiple access, or OFDMA. OFDMA is new to WiFi, but it’s well established in other wireless technologies such as LTE. And as with many of the other features we’re discussing here, its advantages are multipurpose; that is to say, OFDMA is about more than getting data to its destination rapidly. It has far-reaching effects on device battery usage, data prioritization, and other concerns for every kind of WiFi device.

Understanding OFDMA – Packing WiFi Frames With Multiple Users

To begin to understand OFDMA, let’s look at a common analogy that is fairly effective at explaining the nature of WiFi traffic, its many channels, and the delivery mechanism of data in a link: comparing WiFi traffic to parcels loaded on trucks and delivered up and down the freeway. And much as with shipping logistics, OFDMA is about delivering those parcels (packets of data, in this analogy) as efficiently as possible, while also making sure that small shippers have access to the same reliability and consistent delivery as the big shippers.

For this analogy, think of each truck as a frame in a WiFi burst signal. Frames are chunks of the burst split up in time, like trucks moving down the freeway one at a time. There is only one lane of traffic, and each truck can only carry cargo from one sender, at a fixed rate of speed (the speed of light, in fact, but let’s not complicate the analogy too much). That means one truck might be half-packed with cargo from big shipper A, the next might have a single box from small shipper B, and the next two might be full with cargo from big shipper C. In this arrangement, lots of potential data sits surrounded by empty space in a truck, waiting its turn to reach its destination. There’s no carpool lane or bypass to expedite its delivery. Everyone waits the same.

With OFDMA, the idea is to further subdivide WiFi frames into subcarriers. What that means is that each frame can be further cut into portions of spectrum, and divided and sorted at the receiving end to allow multiple concurrent users a chance to send data in each frame. In the truck analogy this might be expressed as: big shippers load out the left of the truck, a small shipper gets access to the middle for a few small parcels, and so on until the truck is full. Never again does a full truck need to be wasted on a single parcel. Efficiency jumps dramatically. And it’s not a zero-sum game that disadvantages the big shippers: There was usually room for one more box. Everyone wins.

In the example below, we see the previous arrangement (OFDM) against the new scheme available in OFDMA. Consider the red voice client: in OFDM, the voice client waits several frames to deliver its payload, introducing latency as it waits for its turn. In OFDMA, the voice client has access to portions of frames more frequently, an important consideration for voice communications and a much better outcome in terms of latency.

[Figure: OFDM vs. OFDMA frame allocation]

So, to step out of the analogy for a moment: what does this practically look like in a WiFi transmission? The genius of OFDMA lies in giving the infrastructure the ability to plot and plan these frames based on the needs of the clients and the information available to the infrastructure. In WiFi 6, an entire 20 MHz channel is built from 256 subcarriers (234 of which carry data when a single client uses the whole channel), and those subcarriers are grouped into resource units that can be doled out to multiple clients as needed, then decoded separately to sort out whose parcels are whose. To briefly return to our analogy, that means 256 equal-size slots per truck, arranged in the way that makes the most sense for the overall network based on need.
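To make those numbers concrete, here’s a minimal Python sketch of the arithmetic. The subcarrier spacing, subcarrier count, and resource unit sizes reflect the standard WiFi 6 numerology described above; the example splits at the end are just illustrative ways a scheduler might carve one frame among clients, not output from any real access point.

```python
# Back-of-the-envelope numbers for a WiFi 6 (802.11ax) 20 MHz channel.
# This is an illustration of the numerology, not a scheduler.

SUBCARRIER_SPACING_KHZ = 78.125   # 802.11ax subcarrier spacing
SUBCARRIERS_20MHZ = 256           # total subcarriers in a 20 MHz channel

# Recover the channel width from the numerology:
channel_width_mhz = SUBCARRIERS_20MHZ * SUBCARRIER_SPACING_KHZ / 1000
print(f"{SUBCARRIERS_20MHZ} subcarriers x {SUBCARRIER_SPACING_KHZ} kHz "
      f"= {channel_width_mhz:.0f} MHz")

# Resource unit (RU) sizes available inside a 20 MHz channel, in subcarriers.
RU_SIZES = {26, 52, 106, 242}

# A few hypothetical ways one frame could be carved up among clients.
# Real allocations follow the RU layout tables in the standard.
example_splits = [
    [242],              # one client gets the whole channel
    [106, 106, 26],     # two medium clients plus one small one
    [26] * 9,           # nine small clients share a single frame
]
for split in example_splits:
    assert all(ru in RU_SIZES for ru in split)
    print(f"{len(split)} client(s) in one frame, RUs: {split}")
```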

This analogy isn’t perfect (for one, you should never load out just the left-hand side of a truck), but it expresses the key benefit of OFDMA: better management and utilization of the empty space inside the average WiFi frame. The “MA” in OFDMA (multiple access) is the really transformative development: the previous method, OFDM (Orthogonal Frequency-Division Multiplexing), split each transmission across many subcarriers (creating the frame / truck structure in the first place) and kept those subcarriers from interfering with each other, but each frame still belonged to a single sender. OFDMA is a further advancement in the same direction.
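If you’d like to see that utilization benefit spelled out, the toy Python sketch below packs “trucks” two ways: one shipper per truck versus mixing parcels until each truck is full. The truck capacity and parcel sizes are entirely made up, and the greedy packing is only a stand-in for what a real scheduler does, but the utilization gap it prints is the whole point of the “MA.”

```python
# Toy illustration of the truck-packing analogy. Capacities and parcel sizes
# are invented; this is not a real WiFi scheduler, just a utilization comparison.

TRUCK_CAPACITY = 100  # arbitrary units of cargo per truck (per frame)

# Each shipper (client) has parcels (payload sizes) waiting to go out.
parcels = {
    "big_shipper_A": [55],
    "small_shipper_B": [4],
    "big_shipper_C": [100, 80],
    "small_shipper_D": [7],
}

def one_shipper_per_truck(parcels):
    """OFDM-style: every truck carries cargo from exactly one shipper."""
    return [{shipper: size} for shipper, sizes in parcels.items() for size in sizes]

def mixed_trucks(parcels):
    """OFDMA-style: keep adding parcels from any shipper until trucks are full."""
    flat = sorted(((s, size) for s, sizes in parcels.items() for size in sizes),
                  key=lambda item: -item[1])          # largest parcels first
    trucks = []
    for shipper, size in flat:
        for truck in trucks:                          # first truck with room wins
            if sum(truck.values()) + size <= TRUCK_CAPACITY:
                truck[shipper] = truck.get(shipper, 0) + size
                break
        else:
            trucks.append({shipper: size})            # no room anywhere: new truck
    return trucks

def utilization(trucks):
    return sum(sum(t.values()) for t in trucks) / (TRUCK_CAPACITY * len(trucks))

for label, plan in [("OFDM-style ", one_shipper_per_truck(parcels)),
                    ("OFDMA-style", mixed_trucks(parcels))]:
    print(f"{label}: {len(plan)} trucks, {utilization(plan):.0%} full on average")
```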

Impact on Throughput: How Much More Can We Send?

As you can probably imagine when you envision that fully-loaded truck, more data is getting to the delivery point, and faster. But how much more, and how much faster? That depends on what kind of shipper you are and what your demands are. The big shippers that were already packing out most of the truck correspond to high-volume data devices: those streaming high-quality video or moving other payload-intensive traffic. And, unsurprisingly, they’re unlikely to see much of a change in their delivery schedules. It’s still a lot of data, and it still has to wait in line.

The big advantages are for the lightweight shippers, those with a box or two. These correspond to sensor-type devices, remote controls, and other devices whose communication is infrequent and ultimately data-light when it happens. By many estimates, the average improvement is as much as a 3x reduction in latency. All of this is managed by the infrastructure, based on what it knows about each client: what kind of device it is, what traffic is characteristic of its connection, and what spectrum is available.

It’s also important to note that the improvement in latency is not universal. It’s entirely dependent on the kinds of devices on the network and what kinds of efficiencies can be wrung out of their traffic. It’s impossible to give a solid percentage decrease in latency, which is why 3x is an estimate, not a rule. The important point is that the overall network gets closer to 100% utilization of its bandwidth, cutting out empty space as much as possible and giving important smaller data packets the opportunity to be delivered much faster than before. Again, everyone wins.
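As a thought experiment (and only that), the sketch below models a small sensor stuck behind a backlog of large video payloads. Every number in it is invented: the frame airtime, the backlog depth, the payload sizes, and the rule for when the sensor can ride along in a partially full frame. Change those assumptions and the printed improvement factor changes with them, which is exactly why 3x is an estimate rather than a rule.

```python
import random

# Toy queueing model of a small sensor waiting behind big video payloads.
# All sizes, arrival patterns, and frame timings are hypothetical placeholders;
# this illustrates the idea, it does not measure 802.11ax.
random.seed(7)

FRAME_US = 200   # assumed airtime per frame, in microseconds
TRIALS = 5_000

def trial():
    # Big payloads already queued ahead of our sensor, each sized as a
    # fraction of one frame's capacity (so frames are often not quite full).
    backlog = [random.uniform(0.8, 1.0) for _ in range(random.randint(1, 8))]
    sensor_size = 0.1

    # OFDM-style: one payload per frame, so the sensor waits out the whole backlog.
    ofdm_wait = len(backlog) * FRAME_US

    # OFDMA-style: the backlog is still served in order, but the sensor's small
    # payload rides in the first frame that has spare room for it.
    ofdma_wait = 0
    for payload in backlog:
        if payload + sensor_size <= 1.0:
            break                       # spare room: the sensor rides along here
        ofdma_wait += FRAME_US          # frame was full, keep waiting
    else:
        ofdma_wait = ofdm_wait          # no spare room anywhere: no better than OFDM
    return ofdm_wait, ofdma_wait

results = [trial() for _ in range(TRIALS)]
avg_ofdm = sum(r[0] for r in results) / TRIALS
avg_ofdma = sum(r[1] for r in results) / TRIALS
print(f"one user per frame : ~{avg_ofdm:.0f} us average wait")
print(f"shared frames      : ~{avg_ofdma:.0f} us average wait")
print(f"improvement        : ~{avg_ofdm / max(avg_ofdma, 1e-9):.1f}x")
```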

…But It’s Not Just About Data

A last thought on latency, but an important one, brings us back to where we started in this blog post. Latency is important for time-sensitive data, it’s important for wireless applications generally, and it’s obviously even more important for devices and traffic that are health- or safety-focused. But lowering latency is about more than that. It can have a huge impact on battery-operated devices in particular, the kinds of devices that are on the go and can’t afford to spend more battery than necessary to carry out their tasks.

With wireless devices, power is wasted when poor latency or transmission retries compromise the link. That’s part of why OFDM was designed as it was, with separate frames and buffer time in between to protect each from the next. Limited interference means a cleaner link, fewer retries, and a better-performing application.

OFDMA takes the next logical step forward, and in so doing further reduces the time devices must wait to send their data, receive acknowledgement, and drop back to a lower-power state. Bigger devices with bigger batteries, or devices powered by AC power, have less concern in this regard. But small, battery-conscious devices can last for much, much longer on a single charge when their battery isn’t being monopolized by the wireless link.
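A rough, hedged calculation shows why that matters for battery life. Every figure below is a hypothetical placeholder (the current draws, the battery capacity, the report rate, and the awake times); real numbers vary widely by radio module and data rate. The shape of the result is the point: energy per report scales with how long the radio must sit in its high-power state waiting for, and making, its transmission.

```python
# Back-of-the-envelope battery estimate for a small WiFi sensor.
# Every value here is a hypothetical placeholder, not a measured figure.

TX_CURRENT_MA = 250       # assumed radio current while awake and transmitting
SLEEP_CURRENT_MA = 0.02   # assumed deep-sleep current
SUPPLY_V = 3.3            # assumed supply voltage
BATTERY_MWH = 3_000       # assumed battery capacity, milliwatt-hours
REPORTS_PER_HOUR = 60     # assumed reporting rate

def battery_life_days(awake_ms_per_report):
    """Rough battery life given how long the radio stays awake per report."""
    awake_h_per_h = REPORTS_PER_HOUR * awake_ms_per_report / 3_600_000  # ms -> h
    sleep_h_per_h = 1 - awake_h_per_h
    mwh_per_hour = SUPPLY_V * (TX_CURRENT_MA * awake_h_per_h
                               + SLEEP_CURRENT_MA * sleep_h_per_h)
    return BATTERY_MWH / mwh_per_hour / 24

# Hypothetical awake times: waiting through a congested channel (and retrying)
# keeps the radio up longer per report than getting a prompt OFDMA slot does.
for label, awake_ms in [("long waits and retries", 50), ("prompt OFDMA slot     ", 15)]:
    print(f"{label}: ~{battery_life_days(awake_ms):.0f} days on one charge")
```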

WiFi 6 and 6E represent a major improvement in how battery-operated devices manage their available power, minimizing the time these devices must spend in their higher-power states to transmit data, and that makes for a better user experience overall. And it couldn’t come at a better time, as the number of WiFi devices continues to explode and the infrastructure’s management of all its clients becomes more and more complex.

Examples

There are many WiFi applications for which low latency is crucial. A delay in operation can cause problems for many of them, including desynchronization between connected devices that jeopardizes the expected behavior.

  • Machine Control / Robotics: In automated machine control, multiple machines or robotics systems often operate in collaboration with each other, synchronizing their movements to each other within tight tolerances. For example, one machine may pick up an assembly component, apply an adhesive, and then pass it off to the next station where another machine picks and installs the part. It’s critical that these parts work together correctly, as well as stop simultaneously in the event of an emergency stop. Low latency in WiFi 6 gives us better results in terms of coordination and safety for machine control.
  • Lighting Control: In indoor venues, lighting can be connected to Wi-Fi to achieve centralized and synchronized control over lights across a large area. In the case of something like theater and stage lighting, that control needs to be very low latency in order to look and feel correct: When bringing down the house lights, if some fall out of sync, the visual effect is uncoordinated and breaks the experience. The same goes for decorative lighting effects that are part of the show itself which are highly timing-sensitive. By giving these light controls a route to ultra-low-latency, the engineering booth creates a better, more cohesive experience.

Conclusion

OFDMA is a major step forward that brings WiFi in line with other technologies, notably LTE, that use smarter spectrum management to serve a multitude of devices efficiently and effectively. That’s our next topic of discussion in the upcoming part 4 of this series: Device Density. We’ll return to how OFDMA supports much higher device counts, along with other technologies involved, such as MU-MIMO and BSS coloring, which help support environments with dozens of tightly co-located devices. Consider a hospital room, which can easily contain dozens of devices. Then consider the wing of a hospital, with two dozen rooms. Then consider that there are floors full of these wings.

This is why the time for WiFi 6 and 6E is now. The challenges are great – but these new standards are well-poised to support the fully wireless world of the near future.

Subscribe to our blog series to get updates on our next post and many more as we continue our 2022 series on WiFi 6 and 6E.
