ECE Researchers win Best Paper Award at the 2020 ACM Multimedia Systems Conference (MMSys)

A team of researchers led by ECE Assistant Professor Sheng Wei has won the Best Paper Award at the ACM Multimedia Systems Conference (MMSys 2020) for their paper titled "QuRate: Power-Efficient Mobile Immersive Video Streaming". ACM MMSys is one of the major conferences of the ACM Special Interest Group on Multimedia (SIGMM), providing a forum for researchers to present and share their latest findings in multimedia systems. The work was conducted by Prof. Wei and his PhD student Nan Jiang, in collaboration with researchers from both academia and industry (SUNY Binghamton, WPI, SUNY Buffalo, Adobe Research, and UNL).

In addition to the Best Paper Award, the paper won the DASH-IF Excellence in DASH Award (3rd place), presented at ACM MMSys 2020. This award recognizes papers that substantially address MPEG-DASH (i.e., the international standard for adaptive video streaming over HTTP) as the presentation format and are selected for presentation at ACM MMSys 2020. Preference is given to practical enhancements and developments that can sustain the future commercial usefulness of MPEG-DASH.

Congratulations to Sheng and his team on this recognition!

The abstract of the award-winning paper is below:

Smartphones have recently become a popular platform for deploying computation-intensive virtual reality (VR) applications, such as immersive video streaming (a.k.a. 360-degree video streaming). One specific challenge involving the smartphone-based head-mounted display (HMD) is to reduce the potentially huge power consumption caused by the immersive video. To address this challenge, we first conduct an empirical power measurement study on a typical smartphone immersive streaming system, which identifies the major power consumption sources. Then, we develop QuRate, a quality-aware and user-centric frame rate adaptation mechanism to tackle the power consumption issue in immersive video streaming. QuRate optimizes the immersive video power consumption by modeling the correlation between the perceivable video quality and the user behavior. Specifically, QuRate builds on top of the user's reduced level of concentration on the video frames during view switching and dynamically adjusts the frame rate without impacting the perceivable video quality. We evaluate QuRate with a comprehensive set of experiments involving 5 smartphones, 21 users, and 6 immersive videos using empirical user head movement traces. Our experimental results demonstrate that QuRate is capable of extending the smartphone battery life by up to 1.24X while maintaining the perceivable video quality during immersive video streaming. Also, we conduct an Institutional Review Board (IRB)-approved subjective user study to further validate that QuRate causes minimal impact on video quality.