Achieving end-to-end insights requires seamless ingestion, unification and correlation of data from many sources.
Data is, of course, foundational to our analytics and assurance programs. If we want to obtain service quality metrics (SQMs) with a true 360° view, or to perform accurate Root Cause Analysis (RCA) end-to-end across our networks, we need data. And there's certainly plenty of data available.
But current approaches have typically created silos of data that are viewed separately. Rich silos, to be sure, but nonetheless, most operators have individual stacks of data from different systems and domains, such as network probes, the RAN, transport, the core and so on. They exist side by side but cannot easily be related, which means we can't explore the data as a unified resource.
This causes problems. For example, if a customer has trouble streaming video on YouTube, the fault could lie in the operator's network, or it could lie with the streaming provider. If the former proves to be the case, there could be an issue in the RAN that has already been detected but which has not been directly related to the issue raised by the user.
A problem might also have occurred in the transport domain that impacts the experience in some way. Each issue could result in a separate, unrelated trouble ticket, so there may be a chain of events that are actually related but never correlated.
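To make that concrete, here is a minimal sketch of cross-domain correlation: events from the RAN, transport and ticketing silos are grouped into candidate incidents when they share a topology key and fall close together in time. The field names (cell_id, ts) and the five-minute window are illustrative assumptions, not a description of any product interface.

```python
# A minimal sketch of cross-domain event correlation, assuming each silo
# can export events with a timestamp and a shared topology key (here,
# an illustrative cell_id). The 5-minute window is also an assumption.
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)

ran_alarms = [
    {"source": "RAN", "cell_id": "C-1042",
     "ts": datetime(2025, 6, 3, 10, 1), "event": "cell degraded"},
]
transport_events = [
    {"source": "transport", "cell_id": "C-1042",
     "ts": datetime(2025, 6, 3, 10, 3), "event": "link congestion"},
]
trouble_tickets = [
    {"source": "ticketing", "cell_id": "C-1042",
     "ts": datetime(2025, 6, 3, 10, 4), "event": "YouTube buffering complaint"},
]

def correlate(*streams):
    """Group events that share a cell_id and arrive within WINDOW of the
    previous event into one candidate incident."""
    events = sorted((e for s in streams for e in s), key=lambda e: e["ts"])
    incidents, current = [], []
    for e in events:
        if current and (e["cell_id"] != current[-1]["cell_id"]
                        or e["ts"] - current[-1]["ts"] > WINDOW):
            incidents.append(current)
            current = []
        current.append(e)
    if current:
        incidents.append(current)
    return incidents

for incident in correlate(ran_alarms, transport_events, trouble_tickets):
    print([f'{e["source"]}: {e["event"]}' for e in incident])
```

Even this naive grouping turns three apparently unrelated records into one candidate incident, which is exactly the kind of chain that separate trouble tickets hide.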
Similarly, RAN solutions typically follow energy management cycles. While these may appear to be functioning as expected, that information doesn't tell us anything about the impact on users. To understand that, we need SQMs from the user and device level, which we can then align with the cycles in progress.
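As a simple illustration of that alignment, the sketch below joins user-level quality samples to energy-saving windows reported by the RAN, so that service quality during the cycles can be compared with quality outside them. The cycle intervals and the video_mos metric are invented for the example.

```python
# A minimal sketch of aligning user-level SQM samples with RAN
# energy-saving cycles. The cycle windows and the video_mos metric
# are illustrative assumptions.
from datetime import datetime

# Energy-saving windows reported by the RAN domain (start, end).
cycles = [
    (datetime(2025, 6, 3, 2, 0), datetime(2025, 6, 3, 5, 0)),
]

# Per-session quality samples from the user/device level.
sqm_samples = [
    {"ts": datetime(2025, 6, 3, 2, 30), "video_mos": 3.1},
    {"ts": datetime(2025, 6, 3, 3, 15), "video_mos": 3.4},
    {"ts": datetime(2025, 6, 3, 9, 0), "video_mos": 4.4},
]

def in_any_cycle(ts):
    return any(start <= ts < end for start, end in cycles)

def mean(xs):
    return sum(xs) / len(xs) if xs else float("nan")

during = [s["video_mos"] for s in sqm_samples if in_any_cycle(s["ts"])]
outside = [s["video_mos"] for s in sqm_samples if not in_any_cycle(s["ts"])]

print(f"mean MOS during energy saving:  {mean(during):.2f}")
print(f"mean MOS outside energy saving: {mean(outside):.2f}")
```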
So, we know that these silos exist, and it has become imperative to close them and to secure better visibility across the different data sets, because while they tell us different things, they are intimately related to each other.
What we need to do is combine different sources of data and parse them quickly, so that we can determine how different events and captured transactions relate to each other and to service-impacting issues.
We have discussed the task of combining multiple data sources at length, but we also need to focus on bringing these silos together in practice. That is not just a matter of combining data sources: we must organize the data so that it is accessible to analytics and search functions. Essentially, we need a common data format that is independent of the source of the data.
On its own, that task sounds simple. We know we can sort and organize data from different sources, because we have the techniques of DataOps to achieve this. DataOps allows us to unify data into a format that can be consumed downstream by different users, stakeholders and platforms.
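A minimal sketch of what such a common format might look like, together with DataOps-style normalizers that map raw, source-specific payloads into it, is shown below. The schema fields and sample payloads are assumptions made for illustration.

```python
# A minimal sketch of a source-independent record format plus DataOps-style
# normalizers. The UnifiedRecord fields and the raw payload shapes are
# illustrative assumptions, not a real product schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UnifiedRecord:
    ts: datetime      # event time, normalized to UTC
    source: str       # originating domain: "probe", "ran_pm", ...
    entity: str       # topology key, e.g. a cell or link identifier
    metric: str       # canonical metric name
    value: float

def from_probe(raw: dict) -> UnifiedRecord:
    # Assume probe exports use epoch seconds and their own field names.
    return UnifiedRecord(
        ts=datetime.fromtimestamp(raw["timestamp"], tz=timezone.utc),
        source="probe",
        entity=raw["cellId"],
        metric="video_throughput_mbps",
        value=raw["tput"] / 1e6,
    )

def from_ran_pm(raw: dict) -> UnifiedRecord:
    # Assume PM counters arrive with ISO-8601 timestamps.
    return UnifiedRecord(
        ts=datetime.fromisoformat(raw["beginTime"]),
        source="ran_pm",
        entity=raw["cell"],
        metric=raw["counter"],
        value=float(raw["value"]),
    )

records = [
    from_probe({"timestamp": 1717400000, "cellId": "C-1042", "tput": 3_200_000}),
    from_ran_pm({"beginTime": "2025-06-03T10:00:00+00:00",
                 "cell": "C-1042", "counter": "pmCellDowntimeAuto", "value": "4"}),
]
for r in records:
    print(r)
```

The point of the design is that everything downstream, whether an SQM dashboard or an RCA query, only ever sees the unified record, never the quirks of the individual feeds.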
However, there's a further problem. Data transformation requires significant effort from data engineers: configuration, mapping and more. There may also be knowledge gaps that require the attention of Subject Matter Experts (SMEs), who are, by definition, a scarce resource.
Data engineering, then, takes time, resources and expertise. That makes it not just difficult but also costly, because it delays the insights we seek.
This is where we need to add Artificial Intelligence to the picture. AI allows us to change the way in which we ingest data, so that we can make use of it faster. In fact, by using AI, we've been able to cut the time it takes to process data and make it available for analytics and other processing from weeks to minutes.
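To give a flavour of how automation helps here, the toy sketch below proposes mappings from unknown source columns to a unified schema. Simple fuzzy name matching stands in for the model-driven mapping a production system would use, and all field names and aliases are invented for the example.

```python
# A toy sketch of automated field mapping during ingestion: given the
# columns of an unknown source feed, propose the closest field in the
# unified schema. difflib's fuzzy matching is a stand-in for a real
# model-driven mapper; every name below is an assumption.
import difflib

# Synonyms a mapper might learn or be seeded with, per unified field.
ALIASES = {
    "ts": ["timestamp", "begintime", "event_time", "ts"],
    "entity": ["cell", "cellid", "site", "link_id"],
    "value": ["value", "val", "measurement", "counter_value"],
    "metric": ["metric", "counter", "kpi", "kpi_name"],
    "source": ["source", "domain", "system"],
}

def propose_mapping(source_columns):
    """Suggest a unified-schema field for each source column, or None."""
    mapping = {}
    for col in source_columns:
        best, score = None, 0.0
        for field, aliases in ALIASES.items():
            match = difflib.get_close_matches(col.lower(), aliases, n=1, cutoff=0.6)
            if match:
                s = difflib.SequenceMatcher(None, col.lower(), match[0]).ratio()
                if s > score:
                    best, score = field, s
        mapping[col] = best
    return mapping

print(propose_mapping(["beginTime", "cellId", "counterValue", "kpiName"]))
```

A human, or an SME, still reviews the proposals, but starting from a machine-generated mapping is far faster than building one from scratch.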
At Elisa Polystar, we've already achieved this for traditional network probe data, as well as 3GPP Performance Management data, and for different data formats and feeds, such as XML, CSV files and Kafka streams. By doing so, we're building a complete data layer that breaks out of those traditional silos and provides a unified foundation for future analytics, enhancing our SQMs and accelerating RCA.
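As a rough illustration of that unification, the sketch below pulls two of the formats mentioned, CSV and XML, into a single record shape; a Kafka stream would feed the same normalizers message by message. The payloads and field names are invented for the example.

```python
# A minimal sketch of normalizing CSV and XML feeds into one record shape.
# The payloads and keys are illustrative assumptions.
import csv
import io
import xml.etree.ElementTree as ET

csv_feed = "ts,cell,metric,value\n2025-06-03T10:00:00,C-1042,dl_tput,3.2\n"
xml_feed = ('<measurements>'
            '<m ts="2025-06-03T10:05:00" cell="C-1042" metric="rtt_ms" value="48"/>'
            '</measurements>')

records = []

# CSV rows already carry headers, so each row reads as a dict.
for row in csv.DictReader(io.StringIO(csv_feed)):
    records.append({**row, "value": float(row["value"])})

# XML attributes map onto the same keys.
for m in ET.fromstring(xml_feed).iter("m"):
    records.append({"ts": m.get("ts"), "cell": m.get("cell"),
                    "metric": m.get("metric"), "value": float(m.get("value"))})

for r in records:
    print(r)
```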
In time, we expect to add more data sources – but the bottom line is that, by leveraging AI to assist with the ingestion phase, we’re getting closer and closer to securing that end-to-end view we all know we need – and helping you get insights, faster.
We discussed these innovations at DTW in Copenhagen this June – if you’d like to learn more about how we can help you close silos, unify data – and accelerate insights for true end-to-end service assurance, get in touch.
This article was written by three co-authors: Asparuh Rashid, Michal Kubicki and Mohammad Al-Shouha.
Polystar delivers innovation by turning any telco data into smart decisions and actions.
Our Telco Analytics software supports operators to: