Untitled

  • Under the Hood: Delivering the First Free Global Live Stream of an NFL Game on Yahoo

    yahooeng:

    P.P.S. Narayan, VP of Engineering

    On Sunday, October 25, Yahoo delivered the first-ever, global live stream of a regular season NFL game to football fans around the world, for free, across devices. Our goal was to distribute the game over the Internet and provide a broadcast-quality experience. Leveraging our focus on consumer products, we worked to identify features and experiences that would be unique for users enjoying a live stream for the first time. In other words, we wanted to make you feel like you were watching on TV, but make the experience even better.

    For us, success was twofold: provide the best quality viewing experience and deliver that quality at global scale. In this blog, we will talk about some key technology innovations that helped us achieve this for over 15M unique viewers in 185 countries across the world.

    On the technical side, the HD video signal was shipped from London to our encoders in Dallas and Sunnyvale, where it was converted into Internet video. The streams were transcoded (re-encoded and compressed for efficient network transmission) into nine bitrates ranging from 300Kbps to 6Mbps. We also provided a framerate of 60 frames per second (fps), in addition to 30fps, allowing for the smooth video playback suited to a sport like NFL football. Having a max bitrate of 6Mbps at 60fps gave a “wow” factor to the viewing experience, and was a first for NFL and sports audiences.
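
    As a rough illustration, a bitrate ladder like the one described can be sketched as below. The nine bitrates match the stated range, but the intermediate rungs, resolutions and the 80% safety headroom are assumptions for illustration, not Yahoo’s actual encoder configuration.

```python
# Hypothetical nine-rung bitrate ladder, 300 kbps to 6 Mbps, with the
# top rungs at 60fps. Intermediate values are illustrative assumptions.
LADDER = [
    {"bitrate_kbps": 300,  "resolution": "320x180",   "fps": 30},
    {"bitrate_kbps": 500,  "resolution": "480x270",   "fps": 30},
    {"bitrate_kbps": 800,  "resolution": "640x360",   "fps": 30},
    {"bitrate_kbps": 1200, "resolution": "768x432",   "fps": 30},
    {"bitrate_kbps": 1800, "resolution": "960x540",   "fps": 30},
    {"bitrate_kbps": 2500, "resolution": "1280x720",  "fps": 30},
    {"bitrate_kbps": 3500, "resolution": "1280x720",  "fps": 60},
    {"bitrate_kbps": 4500, "resolution": "1920x1080", "fps": 60},
    {"bitrate_kbps": 6000, "resolution": "1920x1080", "fps": 60},
]

def best_rung(bandwidth_kbps, max_fps=60, headroom=0.8):
    """Pick the highest rung whose bitrate fits within a safety
    fraction of the measured bandwidth and the device's fps cap."""
    candidates = [r for r in LADDER
                  if r["bitrate_kbps"] <= bandwidth_kbps * headroom
                  and r["fps"] <= max_fps]
    return candidates[-1] if candidates else LADDER[0]
```

    A fast connection on a 60fps-capable device lands on the 6Mbps rung; capping the device at 30fps drops it to the highest 30fps rendition.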

    One special Yahoo addition to the programming was an overlaid audio commentary from our Yahoo Studio in Sunnyvale. It was as if you were watching the game alongside our Yahoo Sports experts on your couch. This unique Yahoo take gave NFL viewers a whole new way to experience the game.

    image


    Quality Viewing Experience

    Our goal was to deliver a premium streaming quality that would bring users a best-in-class viewing experience, similar to TV–one that was extremely smooth and uninterrupted. This meant partnering with multiple CDNs to get the video bits as close to the viewer as possible, optimizing bandwidth usage, and making the video player resilient to problems on the Internet or the user’s network.


    Multiple CDNs

    In addition to Yahoo’s own Content Delivery Network (CDN) and network infrastructure, which are capable of delivering video around the world, we partnered with six CDNs and Internet Service Providers (ISPs).

    The NFL game streams were available across all seven CDNs; however, we wanted to route the viewer to the most suitable CDN server based on multiple factors – device, resolution, user-agent, geography, app or site, cable/DSL network, and so on. We built sophisticated capabilities in our platform to be able to define routing and quality policy decisions. The policy engine served more than 80M requests during the game.

    Routing policy rules were adjusted based on CDN performance and geography. For example, we were able to reduce the international traffic to one underperforming CDN during the game, and the changes were propagated to viewers in under six seconds. Such capabilities delivered a near-flawless viewing experience.
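
    A toy version of such a policy engine can be sketched as an ordered list of rules plus an operator-controlled set of disabled CDNs. The rule fields, CDN names and ordering here are illustrative assumptions; the real engine supported many more dimensions and propagated changes globally in seconds.

```python
# Minimal sketch of a rule-based CDN routing policy engine.
# Each rule pairs a predicate over request attributes with an
# ordered CDN preference list; names and rules are made up.
RULES = [
    (lambda req: req.get("geo") == "EU",     ["cdn-b", "cdn-a"]),
    (lambda req: req.get("device") == "ctv", ["cdn-c", "cdn-a"]),
    (lambda req: True,                       ["cdn-a", "cdn-b"]),  # default
]

DISABLED = set()  # CDNs pulled out of rotation by operators

def route(request):
    """Return the first healthy CDN from the first matching rule."""
    for predicate, cdns in RULES:
        if predicate(request):
            for cdn in cdns:
                if cdn not in DISABLED:
                    return cdn
    return "cdn-a"
```

    Pulling an underperforming CDN is then a one-entry change to the disabled set; every subsequent routing decision reflects it immediately.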

    During the game, we served on average about 5Tbps across the CDNs, and at peak we were serving 7Tbps of video to viewers globally.


    Bitrates and Adaptation

    Viewers of video streaming on the Internet are all too familiar with the visual aspects of poor quality: the infamous “spinner,” technically termed re-buffering; the blockiness of the video that represents low bitrates; jerkiness of the video, which could be due to frame drops; and plain old errors that show up on the screen.

    Since we had nine bitrates available, our Yahoo player could use sophisticated techniques to measure the bandwidth (or network capacity) on a user’s home or service provider network, and pick the best bitrate to minimize buffering or errors. Such adaptive bitrate (ABR) algorithms make the viewing experience smooth. Since we supported 60fps streams, the algorithm also monitored frame drops to decide if the device was capable of supporting the high frame rate. It then reacted appropriately by switching to the 30fps stream if necessary.
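
    The two adaptation loops described above can be sketched as follows: an exponentially weighted bandwidth estimate drives bitrate selection, while sustained frame drops trigger the fallback from 60fps to 30fps. The smoothing factor, drop thresholds and ladder values are assumed for illustration, not the player’s actual tuning.

```python
# Sketch of an ABR controller with a frame-drop monitor.
class AbrController:
    BITRATES_KBPS = [300, 500, 800, 1200, 1800, 2500, 3500, 4500, 6000]

    def __init__(self):
        self.bw_estimate = 0.0   # kbps, exponentially weighted
        self.target_fps = 60
        self.drop_streak = 0

    def on_segment_downloaded(self, measured_kbps):
        # Smooth raw measurements so one slow segment doesn't
        # cause a drastic downswitch.
        alpha = 0.3
        self.bw_estimate = (alpha * measured_kbps
                            + (1 - alpha) * (self.bw_estimate or measured_kbps))

    def on_frame_stats(self, dropped, rendered):
        # Three consecutive windows with >10% drops: the device
        # can't keep up, so fall back to the 30fps rendition.
        if rendered and dropped / (dropped + rendered) > 0.10:
            self.drop_streak += 1
        else:
            self.drop_streak = 0
        if self.drop_streak >= 3:
            self.target_fps = 30

    def pick_bitrate(self, headroom=0.8):
        # Highest rung that fits within a safety fraction of bandwidth.
        usable = self.bw_estimate * headroom
        fits = [b for b in self.BITRATES_KBPS if b <= usable]
        return fits[-1] if fits else self.BITRATES_KBPS[0]
```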

    image


    Testing and simulation

    Manually testing adaptive video playback is very difficult, subjective and time consuming. So we built a network and device simulation framework called “Adaptive Playground” that brings automation, integration and a more scientific approach to testing and measuring video playback performance.
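
    Conceptually, such a framework replays scripted network conditions and measures the resulting playback behavior. The sketch below simulates constant-bitrate playback over a per-segment bandwidth trace and counts buffer underruns (re-buffers); the segment duration and buffer sizes are made-up parameters, not Adaptive Playground’s actual model.

```python
# Toy playback simulation: for each segment, compute the download
# time under the scripted bandwidth and track the playout buffer.
def simulate(trace_kbps, bitrate_kbps, segment_sec=6, start_buffer_sec=12):
    """Return (rebuffer_events, final_buffer_sec) for constant-bitrate
    playback over the given per-segment bandwidth trace."""
    buffer_sec = start_buffer_sec
    rebuffers = 0
    for bw in trace_kbps:
        download_sec = segment_sec * bitrate_kbps / bw
        buffer_sec -= download_sec          # playback drains the buffer
        if buffer_sec < 0:                  # stall: the dreaded spinner
            rebuffers += 1
            buffer_sec = 0
        buffer_sec += segment_sec           # new segment arrives
    return rebuffers, buffer_sec
```

    With headroom the buffer grows and there are no stalls; a bitrate well above the scripted bandwidth stalls on every segment, which is exactly the behavior an ABR algorithm under test should avoid.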

    image


    Another tool we developed is a “Stream Monitor” that constantly monitored all the streams across CDNs, checked the validity and correctness of the streams, and ultimately identified ingestion or delivery problems. During the game, the tool detected issues, helping us pinpoint the exact problem and take action.
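
    A stripped-down version of such a monitor might poll each variant playlist and check two things: that all expected renditions are present and fresh, and that each playlist keeps advancing between polls. The field names, ladder size and staleness threshold below are assumptions, not the tool’s actual checks.

```python
# Sketch of a live-stream health check across variant playlists.
EXPECTED_BITRATES = 9
STALE_AFTER_SEC = 30

def check_stream(variants, prev_sequences, now):
    """variants: dicts with 'bitrate', 'media_sequence', 'last_updated'
    (epoch seconds). prev_sequences maps bitrate to the sequence number
    seen on the previous poll. Returns a list of problem strings."""
    problems = []
    if len(variants) != EXPECTED_BITRATES:
        problems.append(f"expected {EXPECTED_BITRATES} variants, got {len(variants)}")
    for v in variants:
        if now - v["last_updated"] > STALE_AFTER_SEC:
            problems.append(f"{v['bitrate']}kbps: playlist stale")
        if v["media_sequence"] <= prev_sequences.get(v["bitrate"], -1):
            problems.append(f"{v['bitrate']}kbps: not advancing")
    return problems
```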

    Yahoo regularly broadcasts live events, news segments and concerts, so these tools are used continuously to measure, test and analyze our infrastructure and partner systems.


    Player Recovery

    The video playback must be smooth even if the connection to the streaming server is lost or if there are Internet connectivity issues. So, we introduced seamless recovery in the Yahoo video player. Under problematic conditions, the recovery mechanism is automatically activated, and the player reconnects to our backend API servers to fetch from another CDN. In essence, this replaces a user reloading the page or clicking the player when problems occur–an otherwise manual action that is incredibly frustrating.
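
    In essence, the failover logic amounts to retrying a download across an ordered list of CDNs before surfacing an error. The retry count and the `download` callback below are illustrative stand-ins for the player’s actual segment fetcher and its call back to our backend APIs.

```python
# Sketch of seamless recovery: try each CDN in turn, retrying a
# bounded number of times, before giving up.
def fetch_with_recovery(url, cdns, download, max_retries_per_cdn=2):
    """`download(cdn, url)` is assumed to return bytes or raise
    IOError. Raises IOError only if every CDN fails."""
    last_error = None
    for cdn in cdns:
        for _ in range(max_retries_per_cdn):
            try:
                return download(cdn, url)
            except IOError as e:
                last_error = e
    raise IOError(f"all CDNs failed for {url}") from last_error
```

    The user never sees the intermediate failures; playback simply continues from the next CDN, replacing the manual page reload.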

    During the game on Sunday, thanks to the seamless recovery of our player, many viewers automatically switched CDNs when their current CDN or ISP had issues. This resulted in a smooth watching experience. In one severe case, we had up to 100K viewers automatically switching CDNs in less than 30 seconds, as seen in the graph below.

    image

    Broad Audience Reach

    We wanted to make sure that our global audience could watch this stream anywhere in the world, on any device. So we delivered it on laptops and desktops, and on phones and tablets; and finally, we reached the ardent fans on big-screen TVs, game consoles, and other connected devices.

    Our destination page, which provided a full screen experience of the game on web and mobile web, was built on node.js and React, and extensively optimized for page load and startup latency. In addition, we decided to launch the NFL experience on our key mobile apps: Yahoo, Tumblr, Yahoo Sports and Yahoo Sports Fantasy. 


    Pure HTML5 on Safari

    We brought users pure HTML5 video delivery on the Safari web browser. There is currently an industry-wide move away from Flash, and Yahoo is no exception. As the first step toward achieving this goal, we deployed a “pure” HTML5 player on Safari for the NFL live stream. Making this leap had a positive impact for millions of viewers during the game.


    Connected Devices & TV Experience

    Our objective was to create a connected TV video experience better than cable/satellite TV. In just a few months, we were able to develop and deploy on nine different connected device platforms and on 60+ different TV models.

    We wanted a large percentage of our big screen viewers to experience the 60fps streams. However, we soon realized that even on modern devices this was not easily feasible due to memory, CPU and firmware limitations. So we conducted hundreds of hours of testing to come up with the right stream configuration for each device. We developed automation tools to quickly validate stream configurations from various CDNs, as well as created a computer vision (camera automation) test tool to monitor and verify video quality and stability across devices.

    Chromecast

    Because NFL games are traditionally viewed on television, we wanted to provide viewers an easy way to watch the NFL/Yahoo Live Stream on their big screens. In addition to connected TV apps, we built Chromecast support into apps for iOS and Android, allowing viewers to cast the game on big screen TVs from their mobile devices.

    To ensure a high-quality, uninterrupted cast, we also built a custom Chromecast receiver app with the same improved resiliency through robust recovery algorithms. Judging by the engagement on our Chromecast streams, we consistently matched or surpassed the viewing times on other experiences.

    Global Scale

    Yahoo operates multiple data centers across the US and the world for service reliability and capacity. We also have dozens of smaller points of presence (POPs) located close to all major population centers to provide a low-latency connection to Yahoo’s infrastructure. Our data centers and POPs are connected via a highly redundant private backbone network. For the NFL game, we upgraded our network and POPs to handle the extra load. We also worked with the CDN vendors to set up new peering points to efficiently route traffic to their networks.

    As part of running “Internet” scale applications, we always build our software to take advantage of Yahoo’s multiple data centers. Every system has a backup, and in most cases, each backup has another backup. Our architecture and contingency plans account for multiple simultaneous failures.

    During an NFL game, which typically lasts just under four hours, there is a very small margin of error for detecting and fixing streaming issues. Real-time metrics as well as detailed data from our backend systems provide a high fidelity understanding of the stream quality that viewers are experiencing.

    Yahoo is a world leader in data, analytics and real-time data processing. So, we extensively used our data infrastructure, including Hadoop, to provide industry leading operational metrics during the game.


    Player Instrumentation

    The Yahoo video player has extensive instrumentation to track everything happening during video playback. And, this data is regularly beaconed back to our data centers. The data includes service quality metrics like first video frame latency, bitrate, bandwidth observed, buffering and frame drops.  

    The beacons are processed in real time, and we have dashboards showing KPIs like the number of concurrent viewers, total video starts, and re-buffering ratio, broken down by dimensions such as CDN, device, OS and geo. These real-time dashboards enabled our operations team to make decisions about routing policies and switching CDNs in real time based on quality metrics.
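
    As an example of one such KPI, the re-buffering ratio per dimension is simply total buffering time divided by total watch time, aggregated over beacons. The beacon field names below are assumptions about the schema, not the actual beacon format.

```python
# Aggregate re-buffering ratio from playback beacons, keyed by an
# arbitrary dimension (CDN here; could be device, OS, geo, ...).
from collections import defaultdict

def rebuffer_ratio_by(beacons, dimension="cdn"):
    watch = defaultdict(float)
    stall = defaultdict(float)
    for b in beacons:
        key = b[dimension]
        watch[key] += b["watch_sec"]
        stall[key] += b["buffering_sec"]
    return {k: stall[k] / watch[k] for k in watch if watch[k] > 0}
```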

    In terms of scale, our beacon servers peaked at more than 225K events/sec, handling about two billion events in total, which equaled about 4TB of data during the game.


    Backend APIs

    Prior to the NFL streaming event, we had designed the backend APIs to deliver at scale, with low latency and high availability. During the game, we served 216 million API calls, with a median latency of 11ms, and a 95th percentile latency of 16ms. The APIs showed six 9s of availability during this time period.
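
    For reference, median and 95th-percentile figures like these are typically computed from raw per-request latency samples; a common approach is the nearest-rank method sketched here (one of several standard percentile definitions, not necessarily the one our dashboards used).

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest value with at least
    p% of the samples at or below it."""
    xs = sorted(samples)
    k = math.ceil(p / 100 * len(xs)) - 1
    return xs[max(0, k)]
```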

    Our systems are instrumented exhaustively to obtain real-time feedback on performance. These metrics were available for monitoring through dashboards, and were also used for alerting when performance breached acceptable thresholds. 


    The Take-Away

    Pioneering the delivery of a smooth 60fps live video experience to millions of users around the world was a significant undertaking. Huge thanks to the team for executing against our vision - it was a coordinated effort across Yahoo.

    Much of our technology and infrastructure was already set up to handle the scale and load, which was one of the reasons the NFL chose us. In preparation for the main event, we designed a new destination page and enhanced our mobile applications. We also enhanced the control and recovery mechanisms, and expanded our infrastructure to handle the huge traffic of the game. We worked hard to ensure that the experience was available on every Internet-connected device. We tuned our video players to deliver the optimal video stream, taking into account device, connectivity, location and ISP. Behind everything was our massive analytics system that measured and aggregated all aspects of quality and engagement. We conducted comprehensive tests with our partners so that game day would be successful. In the end, the game exceeded our high expectations, setting a bar for quality and scale for live Internet broadcasts to come. We’re thrilled and proud of the experience we delivered, and the reception and accolades from our community of users have been gratifying.

    Looking to the future, we expect live sporting events to be routinely streamed over the Internet to massive global audiences. People will expect these broadcasts to be flawless, with better than HD quality. October 25th 2015 was a significant step towards this vision. Yahoo, as a leading technology company and a top destination for sports, is proud of our role in setting a new standard for sports programming. We look forward to making other global scale broadcasts like the NFL game happen in the future.

    Want to help? Email me at ppsn@yahoo-inc.com and we can talk about opportunities on our team.

    Source: yahooeng