The Cloud and High-Performance Computing Enable Modern Workflows and High-Resolution 360-Degree Virtual Reality Video

By Tom Coughlin, Coughlin Associates

This piece explores insights on digital storage for media and entertainment applications, based largely on the 2020 SMPTE Technical Conference.  It looks at the increasing use of cloud storage and other cloud services for media and entertainment applications, as well as VR/AR content including 360-degree and volumetric imaging.

Growth of the Cloud for Media Workflows

With the impact of the Covid-19 pandemic on the industry, there has been a significant shift to using the cloud in media and entertainment workflows.  In his Azure talk, Hanno Basse showed results from an interview survey of 46 top creatives in the film and TV industry.  The chart below shows the breakdown of key needs in the M&E industry; note that working with the cloud was the third-highest need.

The figure below plots the annual historical and projected spend on cloud storage capacity for media and entertainment, including cloud storage for archiving, post-production, and various content distribution technologies.  Archiving in cloud object storage has been the major revenue source into 2020 and continues to grow in future years.  Because the Covid-19 pandemic had many people working at home through most of 2020 and likely part of 2021, we project a significant bump in the use of cloud storage for post-production, from 8% of spending in 2019 to 20% in 2020, continuing to increase through 2025.

By 2025 we project that spending on cloud storage for post-production (including collaborative workflows as well as post-production work done directly in cloud-based applications) will reach parity with spending on cloud storage for archiving.

Emerging 360-Degree VR Workflows

In addition to its use for collaborative workflows, which the Covid-19 pandemic has accelerated, the cloud also provides data analysis and processing services.  This includes video rendering as well as processing for emerging video services, such as stitching of 360-degree video for virtual reality, including content captured as volumetric video.
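Cloud stitching services automate what is, at its core, panorama stitching of overlapping camera views.  The minimal sketch below, in Python with OpenCV, shows only that core step; the input file names are hypothetical placeholders, and a real 360-degree pipeline would add rig calibration, seam blending, and frame synchronization.

```python
import cv2

# Stitch overlapping frames from a multi-camera rig into one panorama.
# File names are placeholders for synchronized frames from each camera.
images = [cv2.imread(p) for p in ["cam0.jpg", "cam1.jpg", "cam2.jpg"]]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```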

Virtual reality is finding increasing use in M&E, such as in TV applications, particularly live sports events.  Mauricio Aracena from the VR Group spoke about live VR workflows for sporting events.  He compared linear TV production and content distribution to VR live workflows, as shown below.

At the event location, linear TV requires taking audio from microphones for mixing and combining the data from multiple cameras, including cameras with a remote wireless connection.  For a VR event, multiple VR camera outputs are multiplexed together, rendered to a flat rectilinear image, and encoded for wireless transmission with embedded local audio.
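The flat image referred to here is typically an equirectangular projection of the camera sphere.  As a simple illustration (not the specific mapping used in the talk), the sketch below converts a viewing direction to a pixel position in an equirectangular frame; the 2:1 frame size is an assumed example.

```python
import math

def equirect_pixel(yaw, pitch, width=7680, height=3840):
    # yaw in [-pi, pi] (left/right), pitch in [-pi/2, pi/2] (down/up).
    # Longitude maps linearly to x, latitude linearly to y.
    u = (yaw / (2 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - pitch / math.pi) * (height - 1)
    return u, v

print(equirect_pixel(0.0, 0.0))          # center of the frame
print(equirect_pixel(math.pi / 2, 0.0))  # 90 degrees to the right
```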

As shown in the image above, the VR live content goes out through various distribution modes, including a game engine model.  Customers can receive VR live content via a linear/VR simulcast or as a separate VR live stream from the game engine model.  Ethernet-based production using SMPTE 2110 makes VR workflows easier to produce and enables special use cases, such as a personal point of view or following a favorite player.

At the 2019 IBC, global streaming of live 8K 360-degree VR was said to deliver 8K-equivalent quality to 4K devices.  The image below shows the content capture requirements, distribution formats, and content delivery characteristics of this demonstration.

Creating many VR effects will benefit significantly from volumetric image capture, where samples of all the wavefronts in a scene are captured, allowing additional image processing and display options.  Intel talked about its camera-array volumetric image capture technology, TrueView; viewing this content as virtual reality is what it calls TrueVR.  An image showing this technology applied to immersive sports is shown below.  Volumetric content is also associated with point clouds, and emerging standards address streaming of volumetric content and compression of point clouds.
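To see why point-cloud compression standards matter for storage, consider the raw footprint of a simple uncompressed point-cloud frame.  The point count and attribute layout below are illustrative assumptions, not figures from the Intel talk.

```python
import numpy as np

# One volumetric frame: N points, each with an xyz position and RGB color.
n_points = 1_000_000
positions = np.zeros((n_points, 3), dtype=np.float32)  # 12 bytes per point
colors = np.zeros((n_points, 3), dtype=np.uint8)       # 3 bytes per point

bytes_per_frame = positions.nbytes + colors.nbytes
print(f"{bytes_per_frame / 1e6:.0f} MB per frame")         # ~15 MB
print(f"{bytes_per_frame * 30 / 1e9:.2f} GB/s at 30 fps")  # ~0.45 GB/s
```

Even this modest frame streams at nearly half a gigabyte per second uncompressed, which is why point-cloud compression is an active area of standardization for volumetric workflows.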

The Omnidirectional MediA Format (OMAF) is a virtual reality system standard from the Moving Picture Experts Group (MPEG).  The OMAF 2nd edition provides for multiple viewpoints, overlays, and advanced tiling profiles that can be used for ad placement and other uses.

Ozgur Oyman from Intel, chair of the Virtual Reality Industry Forum (VRIF) guidelines working group, spoke about 8K 360-degree virtual reality in the VRIF guidelines.  He pointed out that VR delivery has evolved from initially being 4K viewpoint independent (VPI) to 4K viewpoint dependent (VPD), which can be used to double the resolution per eye and provides a better-quality VR experience.  VPI delivers the entire 360-degree video to the VR headset; it is simple to deploy but not bandwidth efficient.  VPD delivers only a portion of the 360-degree video in high quality, which reduces bitrate requirements roughly by half, but requires low-latency interaction through the network.  The VRIF has adopted MPEG OMAF profiles for both VPI and VPD.  The image below shows the operations needed to implement VPD video.
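The rough halving can be reproduced with back-of-envelope arithmetic.  In the sketch below, the full-sphere bitrate and the quality split are illustrative assumptions, not figures from the talk: only the tiles covering the viewport are sent at high quality, with a heavily downscaled fallback layer for the rest of the sphere.

```python
# Viewpoint-independent (VPI): the entire sphere at uniform high quality.
vpi_mbps = 80.0                # assumed full-sphere bitrate

# Viewpoint-dependent (VPD): high-quality tiles only where the user looks.
viewport_fraction = 0.25       # ~90-degree horizontal FOV of a 360 sphere
fallback_scale = 0.25          # low-quality layer for the rest of the sphere

vpd_mbps = (vpi_mbps * viewport_fraction
            + vpi_mbps * (1 - viewport_fraction) * fallback_scale)
print(f"VPI: {vpi_mbps:.0f} Mbps, VPD: {vpd_mbps:.0f} Mbps")  # 80 vs 35
```

With these assumptions, VPD comes in at a bit under half the VPI bitrate, consistent with the roughly 50% savings cited above.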

Note that 5G mobile devices are expected to support 8K video decoding, and with new 8K-capable devices the full 8K sphere can be sent to and decoded on the device, providing the same resolution in the viewpoint as the best 4K viewpoint-dependent solutions at up to 60 fps.  Cloud-based content-aware encoding with HEVC compression may require only a 30-35 Mbps encoded bitrate for content delivery.
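For archive planning, an encoded bitrate translates directly into storage capacity.  A quick calculation, assuming a constant 35 Mbps at the top of the range cited above:

```python
# One hour of an 8K VR stream encoded at a constant 35 Mbps.
mbps = 35
gb_per_hour = mbps * 3600 / 8 / 1000     # megabits -> gigabytes
print(f"{gb_per_hour:.1f} GB per hour")  # ~15.8 GB/hour
```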

HPC for Media Workflows

8K and higher-resolution content at higher frame rates and with high dynamic range, especially when combined with 360-degree and volumetric content capture, results in very large data sets.  Raji Krishnaswami and Reona Yanagishita from Rohde and Schwarz talked about using high-performance computing (HPC) technologies for high-resolution file-based production.  HPC is commonly used for weather modeling, geologic modeling and simulation, aerodynamics, and cryptology, among other uses.  Video rendering for animation and special effects is a type of HPC.
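A quick calculation shows why the data sets get so large: even a single uncompressed 8K stream, before any 360-degree or volumetric capture, is enormous.  The format parameters below (10-bit 4:2:2 at 60 fps) are an assumed example.

```python
# Uncompressed data rate for one 8K production video stream.
width, height, fps = 7680, 4320, 60
bits_per_pixel = 10 * 2   # 10-bit samples, two samples/pixel in 4:2:2

bits_per_sec = width * height * bits_per_pixel * fps
print(f"{bits_per_sec / 1e9:.1f} Gbps")                     # ~39.8 Gbps
print(f"{bits_per_sec / 8 * 3600 / 1e12:.1f} TB per hour")  # ~17.9 TB/hour
```

At nearly 40 Gbps per stream, it is easy to see why HPC-style storage and interconnects become relevant for file-based 8K production.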

The Rohde and Schwarz speakers pointed out that HPC commonly uses commercial off-the-shelf (COTS) technology.  HPC makes use of NVMe and NVMe over Fabrics (NVMe-oF) with solid-state storage and supports remote direct memory access (RDMA), which allows memory to be shared between compute nodes.  HPC also makes use of co-processors (or domain-specific processors) such as GPUs, along with direct graphics memory access (DirectGMA), which allows a direct data path between storage and GPUs.  In HPC, commercial vendors focus on software and functionality running on commodity hardware.

RDMA over Converged Ethernet (RoCE) allows remote memory access over standard Ethernet interconnects.  This vastly reduces latency by bypassing the CPU for data movement, and it reduces infrastructure costs and management complexity.  Most 8K workflows today are bespoke systems based upon working with lower-resolution proxies, an approach that results in several memory and storage access bottlenecks.  A next-generation, high-resolution workflow should use SMPTE 2110 (including PTP) and RDMA, and should leverage solid-state storage with NVMe and NVMe-oF.  It should maximize the use of CPUs, domain-specific processors, and memory.  The figure below shows a representative model for such a higher-performing, HPC-inspired storage and memory system to support high-bandwidth rich content.

Summary

The 2020 SMPTE virtual technical conference featured many presentations on advances in the industry, including the use of the cloud for remote collaborative workflows.  There were also many talks about the data rate and storage needs of high-resolution, high-frame-rate, and high-dynamic-range video content, including 360-degree and volumetric content supporting more immersive experiences such as VR and AR.

The 2020 Digital Storage for Media and Entertainment Report

The 2020 Digital Storage for Media and Entertainment Report, from Coughlin Associates, provides 251 pages of in-depth analysis of the role of digital storage in all aspects of professional media and entertainment.  Projections out to 2025 for digital storage demand in content capture, post-production, content distribution, and content archiving are provided in 62 tables and 129 figures.

The report benefited from input from many experts in the industry, including end users and storage suppliers, which, along with economic analysis and industry publications and announcements, was used to create the data included in the report.  As a result of changes in the economics of storage devices, higher-performance solid-state storage will play a bigger role in the future.  The cloud, and hybrid storage incorporating the cloud, have assumed new importance for many workflows during the Covid-19 pandemic.  Even after the pandemic passes, use of cloud storage will continue to grow in the media and entertainment storage market.

You can find out more about the report at https://tomcoughlin.com/product/digital-storage-for-media-and-entertainment-report/.

 
