AWS for M&E Blog

Immersive viewing of the NASA Artemis 1 launch with Futuralis, Felix & Paul Studios, and AWS

This blog was co-authored by P. Andrew Keenan and Donavan Prieur, Felix & Paul Studios; Nadeem Shaik, Futuralis; and Hamdy Eed, AWS.

Live streaming in multiple formats to multiple device types can be challenging and time consuming when building a global content delivery system. In this post, we look at how to reduce the complexity and time involved in live streaming with AWS Media Services, a suite of purpose-built services for video processing, packaging, and delivery from Amazon Web Services (AWS). Further, we will see how AWS Partner Futuralis and customer Felix & Paul Studios experienced cloud workflow efficiencies in designing a first-of-its-kind immersive virtual reality (VR) streaming event for the launch of NASA’s Artemis I mission to the Moon.

This post is part one of a two-part blog series focused on live streaming to users with Meta Quest VR headsets and subscribers to Meta’s Horizon Worlds platform (specifically the ‘Venues’ space within Worlds). In an upcoming blog post, we will continue the discussion with a look at how the same architecture supported concurrent delivery to social media platforms and various domes and planetarium venues worldwide.

About Futuralis: Futuralis is an AWS Advanced Tier Services Partner and global technology consulting firm helping customers implement solutions with AWS cloud services and modern application development. In the process of understanding what’s important to their customers, Futuralis implements solutions focused on the six pillars of the AWS Well-Architected Framework.

About Felix & Paul Studios: Felix & Paul Studios is an EMMY® Award-winning creator of immersive entertainment experiences, creating unparalleled, highly engaging, and inspired virtual reality, augmented reality, and mixed reality experiences for audiences worldwide.

Use case introduction

Felix & Paul Studios partnered with Futuralis to build a live VR streaming experience to process and deliver 8K UHD 360° live video to a variety of end users and locations via the cloud. The team turned to AWS to implement the solution with AWS Media Services and Amazon CloudFront as core services for real-time video processing and content delivery, allowing viewers to experience an immersive event. In this post, we discuss the portion of the workflow responsible for 180° VR streaming to viewers using Meta Quest VR headsets via Meta’s Horizon Venues platform.


Figure 1-1: High-level system architecture diagram

Prerequisites

As we walk through the use case and implementation, it may be helpful to familiarize yourself with the services shown in the following architecture diagram.

  • AWS Elemental Link: an encoding device that connects to a live video source, such as a camera or video production equipment, to securely and reliably transfer video to AWS Elemental MediaLive.
  • AWS Elemental MediaLive: a video processing service used for video transcoding and packaging.
  • AWS Elemental MediaPackage: a service that prepares, protects, and distributes video content to a broad range of connected devices.
  • Amazon CloudFront: a content delivery network (CDN) service that delivers content with low latency and high performance using globally distributed edge locations.

Figure 1-2: Live streaming architecture

Use case implementation

Real-time video streaming has been around for over a decade, so what makes this architecture interesting and unique? The source content in this case is formatted with 180°/360° viewing in mind, allowing each individual user a unique experience. The design starts with the acquisition of content in 8K UHD, the highest resolution supported by the camera systems used to live stream the event. As content is captured on premises in its native 8K UHD format, it is also scaled to 4K UHD for contribution to AWS Media Services. While professional production equipment and video editing software can often work in higher “native” resolutions such as 8K (7680×4320), most consumer devices are still geared toward 4K or HD resolutions (3840×2160 or 1920×1080). As 4K UHD is the highest resolution distributed to end users in Horizon Venues, the solution also implements adaptive bitrate (ABR) streaming to support lower resolutions and bitrates and to ensure video plays without interruption for a consistent viewing experience. The following sections step through system-level requirements as well as supporting services.

Key requirements 

  • Broadcast-grade delivery: Stream up to 4K UHD to Oculus/VR headsets and Meta Horizon Venues
  • Live-to-VOD capture: Maintain a recorded backup of the live stream
  • Stream routing: Routing/switching between live content and static content
  • Redundancy: Ensure high availability and system-level redundancy
  • High quality: Provide a high-quality user experience with ABR streaming
  • Low latency: Maintain low-latency performance without compromising video quality

Step one – contribution encoding

In this step, AWS Elemental Link UHD devices are used for signal acquisition. Link UHD devices provide high-quality, broadcast-grade encoding using HEVC/H.265 compression and support up to 4K UHD image resolution at 60 frames per second (FPS). HEVC is a key requirement in not only maintaining the highest video quality, but doing so in the most bit-efficient manner for transport to the cloud. Importantly, the cloud workflow begins with on-premises AWS Elemental Link devices, providing secure and reliable video ingest from ground to cloud. Each of the two AWS Elemental Link devices is assigned to a different AWS Availability Zone (AZ) in the US East Region to provide dual path diversity to, and within, the cloud. The separation of Link devices over two different AZs is a design consideration to ensure high availability and redundancy in the workflow. In the unlikely event of a disruption in one AZ, the Link device assigned to the other AZ continues to operate and pass traffic to AWS Elemental MediaLive. Once started, each Link device forwards video to its associated AWS Elemental MediaLive channel to complete the video contribution from ground to cloud.
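To illustrate this step, the following is a minimal sketch using the AWS SDK for Python (boto3) of how two Link devices can be attached to a single MediaLive input so the channel ingests over two redundant pipelines. The input name and device IDs are hypothetical placeholders, not values from the actual event.

```python
import boto3

medialive = boto3.client("medialive", region_name="us-east-1")

# Attach both on-premises Link devices to a single MediaLive input.
# Two devices on one input give a standard-channel (dual-pipeline)
# channel a redundant ingest path across Availability Zones.
response = medialive.create_input(
    Name="artemis-link-input",  # hypothetical input name
    Type="INPUT_DEVICE",
    InputDevices=[
        {"Id": "hd-0000aaaa1111bbbb"},  # placeholder ID, Link device A
        {"Id": "hd-2222cccc3333dddd"},  # placeholder ID, Link device B
    ],
)
print("Created MediaLive input:", response["Input"]["Id"])
```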

Figure 2-1: AWS Elemental Link UHD device (front and back panels)

Step two – distribution encoding

As previously mentioned, AWS Elemental Link works as an extension of AWS Elemental MediaLive and operates as a cloud-managed, on-premises contribution encoder. The second step in the workflow involves setting up the distribution streaming formats to end-user devices with MediaLive. While AWS Elemental Link performs contribution-style encoding using HEVC/H.265, most end-user devices and platforms, Meta’s Horizon Venues in particular, still consume video as AVC/H.264. For this reason, the workflow with MediaLive includes conversion from HEVC to AVC.

During this transcode, we also introduce ABR to the workflow. ABR encoding is a technique that generates multiple renditions of the same content, similar to a ladder with many steps from top to bottom. For video, the steps are different combinations of video resolutions and bitrates, encoded and frame-aligned. This allows client devices that join a stream to dynamically switch between available encoding steps based on their bandwidth and/or supported viewing device resolution. Ultimately, the use of ABR encoding optimizes the streaming experience for viewers by creating a dynamic video distribution with various bitrates and resolutions to choose from (e.g. 2160p, 1440p, 1080p, 720p). MediaLive also extends high availability and redundancy options by operating in standard-channel mode across two AZs in a region. This live-live dual path diversity continues all the way to the origination service, where AWS Elemental MediaPackage integrates with the Amazon CloudFront CDN for final egress.
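As a concrete illustration, here is a hedged sketch of how an ABR ladder might be expressed as MediaLive video descriptions using boto3 data shapes. The rung names and bitrates are illustrative assumptions, not the production encoding values used for the event.

```python
# Illustrative ABR ladder: resolution/bitrate pairs (assumed values).
ABR_LADDER = [
    {"name": "2160p", "width": 3840, "height": 2160, "bitrate": 16_000_000},
    {"name": "1440p", "width": 2560, "height": 1440, "bitrate": 9_000_000},
    {"name": "1080p", "width": 1920, "height": 1080, "bitrate": 5_000_000},
    {"name": "720p",  "width": 1280, "height": 720,  "bitrate": 3_000_000},
]

def build_video_descriptions(ladder):
    """Build the VideoDescriptions portion of MediaLive encoder settings,
    one AVC/H.264 rendition per ladder rung."""
    return [
        {
            "Name": rung["name"],
            "Width": rung["width"],
            "Height": rung["height"],
            "CodecSettings": {
                "H264Settings": {
                    "Bitrate": rung["bitrate"],
                    "RateControlMode": "CBR",
                }
            },
        }
        for rung in ladder
    ]
```

Each video description is then paired with an output in the channel’s HLS output group, which is how the frame-aligned renditions end up as selectable steps for the player.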

Step three – origination and distribution

The next step in the workflow is connecting the standard-channel (live-live) output from AWS Elemental MediaLive to the next set of services in line: AWS Elemental MediaPackage and Amazon CloudFront. AWS Elemental MediaPackage offers just-in-time packaging/re-packaging of a video stream into various formats including HLS, MPEG-DASH, CMAF, and MSS. The main format distributed for NASA’s Artemis I launch was HLS. Media segments are stored on the packager as the origin service, while the connection to Meta’s Horizon Worlds is via the Amazon CloudFront CDN, which uses edge locations around the world to accelerate content to end-user devices. Amazon CloudFront is tightly integrated with MediaPackage, making it quick and easy to add global delivery in just a few clicks in the AWS Management Console. To improve CDN performance, the team enabled Origin Shield, providing an additional layer of caching that helps reduce the number of simultaneous requests back to the origin and improves the cache hit ratio.
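For reference, a minimal boto3 sketch of this origination step might look like the following. The channel and endpoint IDs are hypothetical, and the segment settings are assumed values rather than the event’s actual configuration; a CloudFront distribution would then be configured with the returned endpoint URL as its origin.

```python
import boto3

mediapackage = boto3.client("mediapackage", region_name="us-east-1")

# The MediaPackage channel receives the redundant feed from MediaLive.
channel = mediapackage.create_channel(Id="artemis-venues-channel")

# A just-in-time packaged HLS endpoint; a CloudFront distribution is
# then pointed at this endpoint URL as its origin.
endpoint = mediapackage.create_origin_endpoint(
    ChannelId="artemis-venues-channel",
    Id="artemis-venues-hls",
    HlsPackage={
        "SegmentDurationSeconds": 6,   # assumed value
        "PlaylistWindowSeconds": 60,   # assumed value
    },
)
print("HLS playback URL:", endpoint["Url"])
```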

Step four – deployment and management

Other important considerations around the workflow involved security, deployment management, and overall system-level monitoring. Futuralis solutions architects leveraged AWS cloud services for each of these associated subtasks. Starting with security, the solution aligned with best practices outlined by the AWS Shared Responsibility Model and managed access roles with AWS Identity and Access Management (IAM). To simplify architecture deployment and updates, the solution was templated using AWS CloudFormation, enabling an Infrastructure as Code (IaC) configuration that could easily be reused later. During the event, the team also relied on Amazon CloudWatch, which gathered metrics for health and performance monitoring (e.g. input loss, bitrate, network output, and 4xx/5xx errors).
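As one example of that monitoring, the sketch below shows a boto3 call, under assumed names and thresholds, that raises a CloudWatch alarm whenever a MediaLive channel pipeline reports any active alerts. The alarm name and channel ID are placeholders.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Alarm if a MediaLive channel pipeline reports any active alerts.
cloudwatch.put_metric_alarm(
    AlarmName="artemis-medialive-active-alerts",  # hypothetical name
    Namespace="AWS/MediaLive",
    MetricName="ActiveAlerts",
    Dimensions=[
        {"Name": "ChannelId", "Value": "1234567"},  # placeholder channel ID
        {"Name": "Pipeline", "Value": "0"},
    ],
    Statistic="Maximum",
    Period=60,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
)
```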

Figure 2-2: Live streaming architecture detail

Observations

Observations and lessons learned during the event follow.

  • AWS Elemental MediaLive supported video formats pertaining to VR distribution, using settings such as a square (1:1) aspect ratio.
  • AWS Media Services are flexible in that they support a number of video codecs (e.g. HEVC, AVC, MPEG-2), resolutions (e.g. UHD, HD, SD), and formats (e.g. SRT, HLS, DASH), making it easy to transition content from collection systems to end-user devices.
  • AWS Elemental MediaLive offers a built-in input switcher, allowing many live/static inputs to be mapped to a single output channel and switched using API- or console-level controls (see the sketch after this list).
  • AWS Media Services were simple to integrate and automate, with embedded features such as the AWS Elemental MediaLive workflow wizard reducing the time and complexity to stand up a channel end to end.
  • AWS Elemental Link and AWS Elemental MediaLive offer up to 4K UHD distribution with the ability to generate and associate other ABR formats to satisfy downstream conditions.
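To make the input switcher observation concrete, here is a hedged boto3 sketch that immediately switches a running channel to another attached input (for example, from a static holding slate to the live feed). The channel ID and input attachment name are hypothetical.

```python
import boto3

medialive = boto3.client("medialive", region_name="us-east-1")

# Immediately switch the running channel to another attached input,
# e.g. from a static holding slate back to the live feed.
medialive.batch_update_schedule(
    ChannelId="1234567",  # placeholder channel ID
    Creates={
        "ScheduleActions": [
            {
                "ActionName": "switch-to-live-feed",
                "ScheduleActionStartSettings": {
                    "ImmediateModeScheduleActionStartSettings": {}
                },
                "ScheduleActionSettings": {
                    "InputSwitchSettings": {
                        # Hypothetical input attachment name on the channel.
                        "InputAttachmentNameReference": "live-input"
                    }
                },
            }
        ]
    },
)
```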

Conclusion

In this blog post, we reviewed how to process and deliver live video reliably and at scale using AWS Media Services for the Artemis I launch event managed by Felix & Paul Studios and Futuralis. We touched on several key features behind AWS Media Services and Amazon CloudFront, and reviewed how to implement these architectures using IaC configurations. We hope this post provided valuable insights, tips, and ideas on how you might deploy similar VR experiences, and AWS Partners like Futuralis are available through the AWS Partner Network to help accelerate your cloud journey. In the next post, we will continue the discussion with a look at how the architecture connected live content to social media platforms and planetariums worldwide for different viewing experiences.

 

Futuralis is an AWS Partner and global technology consulting firm raising standards in AWS Cloud services and modern application development. For more information, visit https://partners.amazonaws.com/partners/0010L00001v0233QAA/Futuralis

Hamdy Eed

Hamdy Eed is a Sr. Partner Solutions Architect with AWS and a member of the Serverless Technical Field Community. With over 20 years of industry experience, he is a well-recognized Subject Matter Expert in serverless and enterprise integration. Prior to AWS, he worked at NIC Inc. and held several positions at IBM. In his spare time, he coaches soccer for elementary school students and enjoys international travel. You can reach him on LinkedIn at linkedin.com/in/hamdyeed.

Matt Carter

Matt Carter serves as a Principal Solutions Architect with AWS Elemental, leading media solutions for the Public Sector. With over 20 years of industry experience, a patent in video metadata processing, and contributions to the Motion Imagery Standards Board (MISB), he is a well-recognized Subject Matter Expert in video technologies for government applications. Matt obtained his degree in Applied Science from Miami University and is a veteran of the United States Army Signal Corps.