In this paper, Charter’s Emerging Technology group shares its findings from a field trial of a low-latency virtual reality (VR) gaming service built on its edge computing infrastructure. High-powered virtualized servers were configured to stream interactive VR content to Spectrum Internet customers, and key metrics (latency, jitter, packet loss) were used to evaluate and benchmark network performance for high-bandwidth, low-latency services deployed at the Charter edge.

Recent trends in connectivity have accelerated use cases in which customers utilize high bandwidth but also require low latency. Emerging immersive use cases, such as virtual reality and augmented reality (AR), are inherently latency-sensitive because the display is tethered to the user’s head. This demand for low latency presents a significant challenge for hardware, especially in form factors that are consumer-friendly and affordable to a mass market. Today, devices priced in line with gaming consoles suffer from low fidelity, while high-fidelity devices require tethering to a gaming computer that typically costs thousands of dollars. The cable industry is poised to accelerate these technologies by offloading compute to high-powered servers on the network edge. With the industry’s commitment to 10G service, higher levels of bandwidth can be used to enable experiences through the network that traditionally require substantial local compute.

To better understand the latency demands of these emerging use cases, Charter launched a field trial designed to stress our network beyond typical existing use cases. Charter’s Cloud Virtual Reality field trial is a system in which high-powered virtualized servers stream high-fidelity interactive VR content from a regional data center (RDC) to customers on the Charter network. The aim of this field trial was to understand how an immersive use case that demands high bandwidth and low latency would perform on our existing network.
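Of the key metrics named above, jitter is the least self-explanatory. One common way to estimate it is the interarrival-jitter smoothing formula from RFC 3550; the sketch below illustrates that calculation with hypothetical per-packet transit times, and is not a description of the trial’s actual measurement pipeline.

```python
def interarrival_jitter(transit_times_ms):
    """RFC 3550 running jitter estimate from per-packet transit times (ms).

    Each step moves the estimate 1/16 of the way toward the absolute
    difference between consecutive transit times, smoothing out noise.
    """
    jitter = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(cur - prev)
        jitter += (d - jitter) / 16.0  # exponential smoothing per RFC 3550
    return jitter

# Hypothetical transit times for five packets; small variation -> low jitter.
print(round(interarrival_jitter([20.0, 21.0, 19.5, 20.5, 20.0]), 3))
```

A steadier stream (smaller swings in transit time) drives the estimate toward zero, which is why jitter complements raw latency as a benchmark for interactive services.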
Network and performance metrics were collected to draw conclusions about the latency, packet loss and jitter our edge delivers. Additional metrics of the client’s connection were collected, such as WiFi signal strength (RSSI), WiFi frequency and WiFi band. Metrics were also collected on the server to understand how the infrastructure required to serve this experience performed, including render latency, encode latency, GPU utilization and CPU utilization. Altogether, these metrics establish key parameters and expectations for enabling low-latency services through the network.

Any game streaming experience must consider network latency as a key component of overall latency. Total input latency (i.e., the latency between a user action and that action being reflected on the display) is made up of client-side latencies, network latency and server-side latencies. Compared with a traditional gaming system, streaming a game also adds encode and decode latencies. A major challenge service providers face is how to use high-powered compute to offset the additional latencies introduced by streaming. Techniques that utilize additional bandwidth were explored to hide and minimize the impact of latency, and these tradeoffs can be adjusted to allow for different levels of latency and fidelity depending on the use case and the distance to users.

Learnings from this field trial help Charter understand how low-latency services can perform on the network today and how this can change with future network enhancements such as Low Latency DOCSIS (LLD), Active Queue Management (AQM) and Low Latency, Low Loss, and Scalable Throughput (L4S).
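The total-input-latency decomposition above can be made concrete with a simple budget calculation. The sketch below uses entirely hypothetical stage values (the paper does not report these numbers); it only illustrates how streaming adds encode, decode and network stages on top of a locally rendered pipeline.

```python
# Hypothetical latency budget for the streaming pipeline described above.
# All millisecond values are illustrative placeholders, not trial results.

def total_input_latency_ms(client_ms, encode_ms, network_ms, decode_ms, server_ms):
    """Sum the pipeline stages between a user action and its on-screen result."""
    return client_ms + encode_ms + network_ms + decode_ms + server_ms

# A locally rendered game has no encode/decode stages and no network hop.
local = total_input_latency_ms(client_ms=10, encode_ms=0, network_ms=0,
                               decode_ms=0, server_ms=25)

# Streaming the same game adds encode, decode and round-trip network latency.
streamed = total_input_latency_ms(client_ms=10, encode_ms=8, network_ms=12,
                                  decode_ms=6, server_ms=25)

print(local, streamed)  # the difference is exactly the added streaming stages
```

The gap between the two totals is the overhead that high-powered edge compute and bandwidth-trading techniques aim to offset.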