Charter Communications’ Network:1 (N:1) strategic network architecture creates a transformed, unified network for product and application services. It enables strategic increases in capacity and performance; creates adaptable, scalable, reliable, and secure network design patterns; and supports network modeling, abstraction, and orchestration. It leverages programmability and the disaggregation of traditional telecommunications offerings while facilitating the commoditization of network and system functions.
Foundational to N:1 are its advanced network analytics services, which rest upon a modern data-plane infrastructure producing vast amounts of data that drive intelligence and decisions for both humans and machines. Numerous and disparate devices, equipment, software technologies, and applications create data in disparate formats for near-real-time analytic model execution and decisions. In addition, various consumers have unique needs in making use of these data. Applications involve joining customer experience, network quality of service, traffic engineering, and several other data sources, creating extraordinary challenges and opportunities for unified network intelligence. These demands and complexities raise the question of whether it is even possible to govern data in this environment. The question is sharpened by the skepticism that surrounds data governance today and its frequent inability to deliver the benefits organizations expect of it.
We examine the need for new, at-scale data governance for near-real-time streaming and event data that supports human- and machine-actionable analytics within N:1. We also provide an overview and common framework for current data governance while addressing its shortcomings. While data governance encompasses a broad array of processes and governing bodies, we focus on the technical aspects critical for success within N:1.
We assert that current data governance methods must evolve to handle the complexity of near-real-time data streams from multiple sources. Drawing on Total Quality Management (TQM), we define the new technical data governance components necessary to maintain data integrity, control, and value for intended consumers as data move through this environment. Finally, we show how a curated, collaborative Data Product Catalog helps address today’s governance challenges, enabling responsible data production, consumption, and joins across big and small data.