Technology has been a boon to our lives in so many ways. At dinner with friends and can’t agree who Jennifer Aniston is currently married to? A couple of taps on your smartphone and Wikipedia will settle the debate for you. Have a craving for cream puffs? Send out an order on Deliveroo and you’ll get your Beard Papa’s in under 30 minutes. Want to find out what happens next on the Korean hospital drama you’re watching? Just click ‘Next episode.’
Things are so darn convenient today, we hardly think twice about it. The downside to all of this, however, is that we’ve been conditioned to expect immediacy. We’re impatient and we hate to wait. Which is why when we stream a YouTube video or Netflix film and find that it stalls or seems to take forever to buffer, we get frustrated and annoyed.
Streaming — the way in which we view videos or listen to music online — emerged as a crucial service during the pandemic, powering our Zoom meetings and providing entertainment to help us escape from our lockdown lives.
The process involves transferring content as small packets of data from a server, such as one owned by YouTube, to a laptop or phone. The speed of this data transfer, measured in units of data sent per second and known as the bitrate, directly corresponds to the video quality you see. The higher the bitrate, the better the quality, and vice versa.
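To see why the bitrate matters, consider a rough back-of-the-envelope calculation, sketched below in Python with made-up numbers (they are illustrative only, not figures from the article): if a video needs more data per second than the network can deliver, the player’s stored video steadily drains away until playback stalls.

```python
# Illustrative numbers only: a video encoded at 8 megabits per second (Mbps)
# streamed over a connection that delivers 5 Mbps.
video_bitrate_mbps = 8.0        # data consumed per second of playback
network_throughput_mbps = 5.0   # data the network actually delivers per second

# Every second of viewing uses 8 Mb of video but only 5 Mb arrives,
# so the player loses 3 Mb of stored video per second.
deficit_mbps = video_bitrate_mbps - network_throughput_mbps

buffered_seconds = 20.0         # suppose 20 seconds of video are already buffered
buffered_megabits = buffered_seconds * video_bitrate_mbps

# How long until the buffer runs dry and the dreaded buffering wheel appears.
seconds_until_stall = buffered_megabits / deficit_mbps
print(f"Playback stalls after roughly {seconds_until_stall:.0f} seconds")  # ~53 seconds
```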
While it’s natural to assume we would always want the highest-quality videos possible to maximise viewing pleasure, things aren’t always that simple, says Wei Tsang Ooi, an associate professor at NUS Computing who studies multimedia systems.
“A high bitrate gives you high-quality videos, but the data transferred may be too much for the network to handle, and you end up entering a buffering phase,” he says, referring to the lag users encounter while waiting for their videos to load.
In order to decide which bitrate to use, streaming services have to consider a number of factors. These include the network’s speed (throughput), how much data has already been downloaded (buffer occupancy), the maximum amount of data the player can download ahead of playback (buffer capacity), how fast the video is being played (playback rate), and the quality levels available on the server (video bitrates).
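Put together, these factors form the small bundle of state an adaptive player consults before fetching each new chunk of video. The sketch below is our own illustration; the field names are hypothetical and do not come from any particular player’s code.

```python
from dataclasses import dataclass, field

@dataclass
class StreamingState:
    """Hypothetical snapshot of what a player knows when choosing the next bitrate."""
    throughput_mbps: float      # measured network speed
    buffer_occupancy_s: float   # seconds of video already downloaded and waiting to play
    buffer_capacity_s: float    # maximum seconds of video the player will hold ahead of playback
    playback_rate: float        # 1.0 means normal playback speed
    available_bitrates_mbps: list[float] = field(default_factory=list)  # quality levels on the server

# Example values, purely for illustration.
state = StreamingState(
    throughput_mbps=6.0,
    buffer_occupancy_s=12.0,
    buffer_capacity_s=30.0,
    playback_rate=1.0,
    available_bitrates_mbps=[1.0, 2.5, 5.0, 8.0],
)
```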
“They want to balance everything nicely,” says Ooi. “But it’s a fine line to tread.”
The situation is even more challenging when it comes to services such as Facebook Live and Twitch, where the margin of error is even smaller and streaming delays can be no more than a few seconds.
In their attempts to find such a balance, streaming services use rate adaptation algorithms to find the ideal bitrate. But these algorithms are often flawed, says Ooi. “They don’t take into account all these different factors, which are interdependent and interlinked, when deciding how to stream something. Or if they do, then they usually combine them in an ad hoc manner.”
The result: streaming that isn’t consistent across all scenarios.
Forming a queue
The key to fixing this problem, Ooi realised, was to create a mathematical model that would combine all these different factors. “We came up with an equation to relate them all,” he says. “So using the equation, I can plug in all the parameters and ask: ‘What if I download a video of this quality? What will it do to my buffer?’”
“The model helps us decide what the video quality should be, so we don’t download quality that is too low or too high,” says Ooi.
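The article does not reproduce the equation, but the kind of relation a queuing-theory model builds on can be written down from a textbook result. As an illustration (not the exact formula from the QUETRA paper), treat the buffer as an M/M/1/K queue: let K be the buffer capacity in segments and rho the ratio of the measured throughput to the candidate bitrate, so that rho greater than 1 means video arrives faster than it is played. The expected buffer occupancy is then:

```latex
% Textbook expected occupancy of an M/M/1/K queue, shown as an illustration of the
% kind of equation that relates these parameters (not the paper's exact form).
% rho = throughput / bitrate; K = buffer capacity in segments.
\[
\mathbb{E}[N] \;=\;
\begin{cases}
\dfrac{\rho}{1-\rho} \;-\; \dfrac{(K+1)\,\rho^{K+1}}{1-\rho^{K+1}}, & \rho \neq 1,\\[2ex]
\dfrac{K}{2}, & \rho = 1.
\end{cases}
\]
```

Plugging in the buffer capacity, the measured throughput, and each candidate bitrate yields a prediction of how full the buffer will end up, which is exactly the ‘what will it do to my buffer?’ question Ooi describes.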
The advantage of QUETRA, short for QUEueing Theory-based Rate Adaptation — the name he and his then PhD students, Praveen Yadav and Arash Shafiei, gave their model — is that it is much simpler than existing rate adaptation algorithms. Users don’t have to configure or tune the parameters in any way, or undergo complex trial-and-error processes to figure out which ones work best for a given condition. “This means that our model is applicable to a lot of different scenarios and buffer capacities,” says Ooi.
A key aspect of creating QUETRA involved taking a novel approach to estimating a video player’s buffer occupancy. To do this, the researchers used a branch of mathematics called queuing theory to model how downloaded video data enters a virtual queue, the player’s buffer, while it waits to be played back.
Think of a supermarket checkout queue, says Ooi. “People arrive, they queue up, and the supermarket management needs to decide how many counters to open so that the queue doesn’t get too long.”
QUETRA does a similar thing, but for the player’s buffer. “So we don’t want the buffer to get too full because this means that the quality of the video we download will be low,” he explains. “But we also don’t want the buffer to get too empty because this means that we don’t have any more video data to play back.”
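In code, that balancing act might look something like the sketch below, which is our own illustration rather than the authors’ implementation. It reuses the textbook formula above to predict the buffer occupancy each candidate bitrate would produce, then picks the one that keeps the buffer closest to half full.

```python
def expected_occupancy(throughput_mbps: float, bitrate_mbps: float, capacity_segments: int) -> float:
    """Predicted buffer occupancy (in segments) from the textbook M/M/1/K formula.
    rho > 1 means downloads outpace playback and the buffer tends to fill up."""
    rho = throughput_mbps / bitrate_mbps
    k = capacity_segments
    if abs(rho - 1.0) < 1e-9:
        return k / 2
    return rho / (1 - rho) - (k + 1) * rho ** (k + 1) / (1 - rho ** (k + 1))


def choose_bitrate(throughput_mbps: float, capacity_segments: int,
                   bitrates_mbps: list[float]) -> float:
    """Pick the bitrate whose predicted occupancy sits closest to a half-full buffer:
    full enough to ride out network hiccups, empty enough to leave room for higher quality."""
    target = capacity_segments / 2
    return min(bitrates_mbps,
               key=lambda r: abs(expected_occupancy(throughput_mbps, r, capacity_segments) - target))


# Example: a 6 Mbps connection, a buffer holding up to 10 segments, four quality levels.
print(choose_bitrate(6.0, 10, [1.0, 2.5, 5.0, 8.0]))  # picks 5.0 Mbps under these assumptions
```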
Making video mirror real life
To test their model, Ooi and his team evaluated its streaming performance against several state-of-the-art rate adaptation algorithms, using seven video samples.
The results were very encouraging, he says. For instance, when compared with the algorithm used by Netflix, QUETRA provided a quality of experience that was 7% better. This figure shot up to 140% when compared with the default algorithm the industry uses to stream video files today.
Since publishing their work in 2017, Ooi and his team have continued to adapt and improve QUETRA. They have now used QUETRA to stream immersive 360-degree videos, and recently applied the model to scenarios that demand very low streaming delay, such as webinars, Facebook Live, and Twitch streams. Their work on the latter, which was published in June 2021, earned second place in the DASH Industry Forum Excellence in DASH Awards at this year’s ACM Multimedia Systems Conference. Praveen Yadav, the lead student on the project, has since co-founded a Singapore-based start-up, Atlastream, that uses QUETRA to deliver smooth, high-quality, low-latency video streaming solutions.
Reflecting on his work, Ooi says: “My overall end goal is to make interacting over video as natural as it is in real life.”
Paper: QUETRA – A Queuing Theory Approach to DASH Rate Adaptation