
IEEE CTN
Written By:

Ning Wang, Peng Qian, Vu San Ha Huynh, Sweta Anmulwar, and Rahim Tafazolli, 5G/6G Innovation Centre, University of Surrey

Published: 6 Jun 2022

CTN Issue: May 2022

A note from the editor:

Holographic teleportation? Star Trek, anyone? Not quite, but certainly an exciting concept nonetheless. When people started talking about 5G, holographic teleportation was considered a killer use case. Now that a few years have gone by, we don't quite hear the same buzz: are we there yet? This month's article by researchers from the University of Surrey introduces the technical issues they have tackled and the challenges we still face in achieving a truly immersive mixed-reality experience.

Yingzhen Qu, CTN Editor

Enabling Holographic Teleportation – What Can 5G Edge Computing Do?


Introduction

Ever since humans started communicating, there has been a desire to convey our thoughts and ideas as realistically as possible. From writing amazingly detailed letters in centuries past, to the transmission of voice in its many forms and, most recently, video communication, we have always strived to feel as close as possible to the recipients of our messages. In recent years, holographic teleportation has been seen as the next generation of “Zoom-like” applications, allowing people at different Internet locations to interact with each other in real time with increasingly immersive experiences. In particular, support for six degrees of freedom (6DoF) will enable end users to move freely within the virtual scene, in contrast to traditional 2D/3D video applications. Beyond immersive inter-person communication services, there is no doubt that holographic teleportation will become a key enabler for a wide range of vertical sectors such as gaming/entertainment, education, healthcare, tourism and many others. In fact, basic holographic teleportation was demonstrated as a potential killer app for 5G back in 2019, when the telecom industry started to showcase 5G capabilities. Despite its attractive features, it will take time for the relevant services and applications to enter consumer markets widely, due to a number of challenges that still need to be tackled. In this article we highlight two of the technical issues that have been tackled in the context of 5G, in particular through edge computing. Beyond the technical challenges, and to incentivize consumers and enterprises to adopt this type of technology, cost reduction and large-scale performance requirements also need to be fully addressed.

Holographic Teleportation and the 5G Edge

First, we introduce our proposed solution for enabling remote production of live holographic teleportation content with 5G edge computing. The key idea is to offload complex holographic content production functions from end-user premises to the 5G edge cloud, in order to substantially reduce the cost of running such applications on the user side. Figure 1 presents a basic illustration of the solution for 5G edge computing enabled teleportation, where teleportation content production is implemented as a virtual function hosted by the 5G multi-access edge computing (MEC) node. We carried out real-life testing based on off-the-shelf Azure Kinect DK cameras and significantly adapted a teleportation platform based on the original open-source point-cloud system [1]. Point-cloud media are typically generated by synthesizing the videos captured simultaneously by multiple RGB-depth cameras placed at different angles around the object, and in general a denser camera setup results in higher point-cloud video quality [2].

Figure 1: 5G MEC for real-time remote production of holographic teleportation
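
For readers who want a more concrete picture of this fusion step, the short Python sketch below merges per-camera point clouds into a single frame by applying each camera's extrinsic calibration. It is an illustrative outline only, not the production function running on our MEC, and the camera names and calibration values are hypothetical.

```python
import numpy as np

def to_world(points_cam: np.ndarray, rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Transform an (N, 3) point cloud from camera coordinates into a shared world frame."""
    return points_cam @ rotation.T + translation

def fuse_frame(per_camera_points: dict, extrinsics: dict) -> np.ndarray:
    """Concatenate the calibrated point clouds of all cameras into one fused frame."""
    fused = [to_world(pts, *extrinsics[cam]) for cam, pts in per_camera_points.items()]
    return np.concatenate(fused, axis=0)

# Hypothetical two-camera rig: identity rotations, second camera offset 1.5 m along x.
extrinsics = {
    "cam0": (np.eye(3), np.array([0.0, 0.0, 0.0])),
    "cam1": (np.eye(3), np.array([1.5, 0.0, 0.0])),
}
frame = fuse_frame(
    {"cam0": np.random.rand(1000, 3), "cam1": np.random.rand(1000, 3)},
    extrinsics,
)
print(frame.shape)  # (2000, 3) points in the fused frame
```

In a real deployment the extrinsics would come from multi-camera calibration, and colour attributes would be carried alongside the xyz coordinates.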

The cameras attached to the platform support a wide range of resolutions, with bandwidth requirements ranging from tens of Mbps all the way up to the Gbps level. Given these data rates, even today's 5G New Radio (NR) uplink may not be able to support live streaming at the highest resolution levels from the camera side to the 5G MEC for remote production. This observation indicates that, while 5G is able to support early holographic teleportation applications, we may still need to wait for future network capabilities (such as 6G) to facilitate truly immersive experiences, which may require Gbps-level user-experienced data rates [3]. We also analysed a wide variety of network and application contexts to pinpoint the readiness of 5G MEC for supporting holographic teleportation applications; details can be found in our recent publication [4].
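
To put these bandwidth figures in perspective, the back-of-the-envelope calculation below estimates the raw (uncompressed) uplink rate of a single point-cloud stream. The points-per-frame counts, the bytes-per-point figure and the frame rate are illustrative assumptions rather than measurements from the testbed described above; compression would reduce the actual rates considerably.

```python
def raw_point_cloud_bitrate_mbps(points_per_frame: int, bytes_per_point: int, fps: int) -> float:
    """Uncompressed bitrate in Mbps for a stream of per-frame point clouds."""
    return points_per_frame * bytes_per_point * 8 * fps / 1e6

# Illustrative assumptions: 15 bytes per point (xyz as 32-bit floats plus RGB), 30 fps.
for points in (100_000, 500_000, 1_000_000):
    rate = raw_point_cloud_bitrate_mbps(points, bytes_per_point=15, fps=30)
    print(f"{points:>9} points/frame -> {rate:8.1f} Mbps")
# ~360 Mbps at 100k points/frame, ~1.8 Gbps at 500k, ~3.6 Gbps at 1M (before compression).
```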

On the receiver side, the 5G MEC can handle different functions, one of which is the synchronization of holographic content frames originating from multiple teleportation sources at different network locations on the Internet. This is the generic scenario for multi-source teleportation applications such as distributed virtual conferences or music orchestration performances spanning multiple Internet locations. From the receiver's point of view, depending on the network distance to each source, frames generated at the same time may arrive at the user equipment (UE, i.e., head-mounted devices such as the Microsoft HoloLens) at different time points, thus introducing receiver-perceived motion misalignment across the teleported objects during ongoing sessions [5].
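
The effect can be illustrated with a small numerical example: for frames sharing a common capture timestamp, compare the times at which each source's frame actually reaches the receiver. The source names and timing values below are purely illustrative.

```python
# Capture timestamp (ms) shared by frames from three hypothetical sources,
# and the times (ms) at which those frames arrive at the receiving edge.
capture_ts = 10_000
arrival_ms = {"source_A": 10_045, "source_B": 10_120, "source_C": 10_980}

# Per-source one-way delay and the worst-case motion misalignment across sources.
delays = {src: t - capture_ts for src, t in arrival_ms.items()}
misalignment_ms = max(delays.values()) - min(delays.values())
print(delays)           # {'source_A': 45, 'source_B': 120, 'source_C': 980}
print(misalignment_ms)  # 935 ms: without synchronization, the objects appear out of step
```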

One solution to this issue is to apply frame synchronization/buffering directly on the UE side; however, this option has two limitations: (1) head-mounted devices, which are mainly used for content display, normally do not have the processing power and storage to handle such high-end teleportation media, and (2) when a large audience receives a common stream, it is inefficient for every single UE to perform its own dedicated frame synchronization operations. Instead, we designed and implemented a solution that performs teleportation frame synchronization at the 5G mobile edge covering the audience behind the MEC (Figure 2). Once the incoming frames have been synchronized at the 5G MEC, they are streamed seamlessly to the local UEs in the session, and thanks to the deterministic data transmission capability of the 5G NR downlink, further disruption to frame synchronization over the last wireless mile to the UE is very low. Our solution brings motion misalignment down from the order of seconds to below 100 ms.

Figure 2: 5G MEC for real-time frame synchronization across multiple sources
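
The sketch below outlines one way such edge-side synchronization could be structured, assuming frames carry a common capture timestamp: the MEC holds frames per timestamp and releases an aligned set to the local UEs once every source has contributed, or once a hold deadline expires. It is a simplified illustration rather than our exact implementation, and the function for streaming over the 5G downlink is a hypothetical placeholder.

```python
import time
from collections import defaultdict

class EdgeFrameSynchronizer:
    """Buffer frames per capture timestamp and release aligned sets to local receivers."""

    def __init__(self, sources, max_hold_s=0.1):
        self.sources = set(sources)
        self.max_hold_s = max_hold_s     # deadline before releasing an incomplete set
        self.buffer = defaultdict(dict)  # capture_ts -> {source: frame}
        self.first_seen = {}             # capture_ts -> local time the first frame arrived

    def on_frame(self, source, capture_ts, frame):
        """Register an incoming frame; release the set if all sources are present."""
        self.buffer[capture_ts][source] = frame
        self.first_seen.setdefault(capture_ts, time.monotonic())
        if set(self.buffer[capture_ts]) == self.sources:
            self._release(capture_ts)

    def poll_deadlines(self):
        """Release any timestamp whose hold deadline has expired, even if incomplete."""
        now = time.monotonic()
        for ts in sorted(list(self.first_seen)):
            if now - self.first_seen[ts] >= self.max_hold_s:
                self._release(ts)

    def _release(self, capture_ts):
        frames = self.buffer.pop(capture_ts)
        self.first_seen.pop(capture_ts, None)
        stream_to_local_ues(capture_ts, frames)  # hypothetical 5G downlink send

def stream_to_local_ues(capture_ts, frames):
    """Placeholder for streaming a synchronized frame set over the 5G NR downlink."""
    print(f"releasing ts={capture_ts} with {len(frames)} source frame(s)")
```

Bounding the hold time trades a small amount of added latency for alignment: frames from nearby sources wait briefly for slower ones before the whole set is forwarded.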

The complete end-to-end framework in the general case is shown in Figure 3. On the source side, depending on the network locations of the teleported objects, co-located sources covered by the same 5G MEC can share that MEC for remote frame production before streaming the produced frames to the 5G MEC on the receiver side. The MEC on the receiver side is responsible for performing frame synchronization on behalf of all the attached local receivers before streaming the synchronized teleportation frames to them over the 5G downlink channel.

Figure 3: General solution framework
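
Purely as an illustration, the roles in this framework could be described with a simple configuration structure such as the one below; the class names, site names and device identifiers are hypothetical and do not come from our deployed system.

```python
from dataclasses import dataclass

@dataclass
class SourceEdge:
    """Source-side MEC: hosts remote production for its co-located capture sources."""
    site: str
    capture_sources: list
    production_function: str = "point_cloud_production"

@dataclass
class ReceiverEdge:
    """Receiver-side MEC: synchronizes incoming streams on behalf of its local UEs."""
    site: str
    local_ues: list
    sync_function: str = "frame_synchronizer"

@dataclass
class TeleportationSession:
    source_edges: list
    receiver_edge: ReceiverEdge

session = TeleportationSession(
    source_edges=[
        SourceEdge("studio-a", ["cam0", "cam1", "cam2"]),
        SourceEdge("studio-b", ["cam0", "cam1"]),
    ],
    receiver_edge=ReceiverEdge("receiver-site", ["hmd-01", "hmd-02"]),
)
print(session.receiver_edge.sync_function)  # frame_synchronizer
```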

Challenges Ahead and Summary

Holographic teleportation is an exciting new way of communicating, made possible in part by advances in wireless communications. Many issues remain to be solved, but demonstrations are already taking place, with many academic and industrial organizations working to address the outstanding challenges. Dealing with the multi-source synchronization issue is just one element of the 5G MEC functions needed to assure end-to-end user experiences in holographic teleportation applications. Compared to traditional video applications, managing user quality of experience (QoE) for immersive teleportation applications is much more challenging. This is not only a matter of increased bandwidth requirements and heavier media processing tasks, but also of more complex user experience models involving a wide range of performance metrics [6]. Taking point-cloud based systems as a representative example, the visual quality of each teleportation frame can be influenced by the RGB resolution and point size configurations; in addition, other aspects such as frame rate, playback latency and synchronization may jointly affect user QoE.
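
As a purely illustrative example of how such metrics might be combined, the toy scoring function below weighs visual fidelity and frame rate against playback latency and synchronization error. The weights and normalization constants are assumptions for the sake of the example, not a validated QoE model from the cited literature.

```python
def illustrative_qoe_score(rgb_resolution_px: int, points_per_frame: int, frame_rate_fps: float,
                           playback_latency_ms: float, sync_error_ms: float) -> float:
    """Toy QoE score (higher is better): fidelity and fluidity help, latency and
    inter-source synchronization error hurt. All weights are arbitrary assumptions."""
    visual = min(rgb_resolution_px / (1920 * 1080), 1.0) * min(points_per_frame / 500_000, 1.0)
    fluidity = min(frame_rate_fps / 30.0, 1.0)
    latency_penalty = min(playback_latency_ms / 1000.0, 1.0)
    sync_penalty = min(sync_error_ms / 500.0, 1.0)
    return max(0.0, 0.4 * visual + 0.3 * fluidity - 0.2 * latency_penalty - 0.1 * sync_penalty)

# Example: full-HD colour, 500k points, 30 fps, 150 ms playback latency, 80 ms sync error.
print(round(illustrative_qoe_score(1920 * 1080, 500_000, 30, 150, 80), 3))  # ~0.654
```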

In addition to teleporting people, it is also possible to teleport other types of non-human objects, including entire virtual scenes. In this case, motion-to-photon delay also becomes essential for end users to interact in real time with the live teleported virtual objects/scenes. This type of delay refers to the time gap between a user taking an action and the moment he/she perceives the reaction from the virtual environment. For instance, if an end user starts to walk “closer” to the object, he/she should seamlessly perceive his/her own movement in the virtual space as if actually present in that physical space. Such a feature in holographic teleportation is effectively a natural extension of the sub-20 ms motion-to-photon delay in traditional VR applications, but it requires more comprehensive support for 6DoF movement in live teleportation streaming.
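
One simple way this delay could be instrumented, sketched below with hypothetical hooks rather than an actual measurement pipeline, is to timestamp the moment a user's motion is sampled and subtract it from the timestamp of the first displayed frame that reflects that motion.

```python
import time
from typing import Optional

class MotionToPhotonProbe:
    """Track the gap between a sampled user motion and the first frame that reflects it."""

    def __init__(self):
        self.pending = {}  # motion_id -> time at which the motion was sampled

    def on_motion_sampled(self, motion_id: int) -> None:
        """Call from the tracking loop when a new head/body pose sample is taken."""
        self.pending[motion_id] = time.monotonic()

    def on_frame_displayed(self, motion_id: int) -> Optional[float]:
        """Call when the first frame rendered with this motion's pose reaches the display;
        returns the motion-to-photon delay in milliseconds."""
        start = self.pending.pop(motion_id, None)
        if start is None:
            return None
        return (time.monotonic() - start) * 1000.0

probe = MotionToPhotonProbe()
probe.on_motion_sampled(motion_id=1)
time.sleep(0.015)  # stand-in for tracking, rendering, streaming and display latency
print(probe.on_frame_displayed(1))  # ~15 ms in this toy run; VR targets roughly sub-20 ms
```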

In this article we have only scratched the surface of the capabilities needed to achieve a fully immersive holographic teleportation experience. While we have shed some light on how the 5G MEC can be used to support basic holographic teleportation applications, there are still many other unresolved technical issues that require advances in emerging technologies such as 6G in the wireless space, and in computing architectures on the cloud/MEC side. The challenges ahead make for an exciting journey towards the commercial realization of this captivating technology.

References

  1. M. Kowalski, J. Naruniec, and M. Daniluk, “Livescan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors,” Proc. International Conference on 3D Vision, 2015, https://github.com/MarekKowalski/LiveScan3D
  2. Z. Liu, Q. Li, X. Chen, C. Wu, S. Ishihara, J. Li, and Y. Ji, “Point Cloud Video Streaming: Challenges and Solutions,” IEEE Network, Vol. 35, Issue 5, 2021
  3. M. Giordani, M. Polese, M. Mezzavilla, S. Rangan, and M. Zorzi, “Toward 6G Networks: Use Cases and Technologies,” IEEE Communications Magazine, Vol. 58, Issue 3, 2020
  4. P. Qian, V. S. H. Huynh, N. Wang, S. Anmulwar, D. Mi, and R. Tafazolli, “Remote Production for Live Holographic Teleportation Applications in 5G Networks,” IEEE Transactions on Broadcasting, DOI: 10.1109/TBC.2022.3161745, online access at https://ieeexplore.ieee.org/abstract/document/9745991
  5. S. Anmulwar, N. Wang, A. Pack, V. S. H. Huynh, J. Yang, and R. Tafazolli, “Frame Synchronisation for Multi-Source Holographic Teleportation Applications - An Edge Computing Based Approach,” Proc. IEEE PIMRC 2021
  6. A. Clemm, M. Torres Vega, H. K. Ravuri, T. Wauters, and F. De Turck, “Toward Truly Immersive Holographic-Type Communication: Challenges and Solutions,” IEEE Communications Magazine, Vol. 58, Issue 1, 2020

Statements and opinions given in a work published by the IEEE or the IEEE Communications Society are the expressions of the author(s). Responsibility for the content of published articles rests upon the author(s), not the IEEE nor the IEEE Communications Society.
