
    Massive network automation might be years away, but there’s plenty of scope now


    Steve Jarrett, Global Head of Data and AI at Orange, talks about the operator’s ambitious network automation strategy, timing and progress.

    Jarrett charts the path from the test and learn approach that is being applied today, to the recent transformational tie-up with Google Cloud and a future of “massive network automation”.

    This article first appeared on FutureNet World and is reproduced with kind permission.

    Orange unveiled its Engage2025 five-year strategic plan in December 2019. One of its four main strands is reaching a new level of digital transformation by positioning data and AI at the heart of its innovation model, more specifically to achieve three tightly linked goals: smarter networks, greater operating efficiency and a reinvented customer experience.

    Telcos have been trying to exploit big data for years, mostly with little or mixed success. Jarrett says that artificial intelligence (AI) has greatly accelerated progress in the last two years, but adds, “We’re in this environment where there’s lots of new tools, most of which are not very mature, and the environment’s extremely dynamic. That’s what led us to the ‘test and learn’ approach [with AI], because it’s just a very dynamic situation.”

    He stresses that Orange is “very focused on impact and use cases to help the business” right across the organisation. He is also keen to emphasise that in no sense is Orange waiting for the 5G non-standalone core to progress.

    Jarrett says, “The vast majority of our investment is in physical infrastructure and will continue to be so. Think about how much it costs to lay fibre and deploy base stations, even if they’re virtualised. There is still the power element, the antenna and the compute, however it’s structured at the base station, and those towers, not to mention the spectrum.”

    He says that AI and automation, via the test and learn approach, can be applied across that physical infrastructure, from network planning to predictive maintenance. Early experiments are already saving the company millions of euros as well as improving customer service: identifying the most profitable base station sites, preventing truck rolls for fibre broadband problems, and saving energy by predicting idle nodes in the RAN.
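
    As a concrete illustration of the idle-node idea (a minimal Python sketch, not Orange’s actual system; the cell names, traffic figures, threshold and naive moving-average forecast are all assumptions), such a check might flag cells whose predicted load falls below a sleep threshold:

        # Hypothetical sketch: flag RAN cells predicted to be idle so they can
        # be moved into an energy-saving state. Data and threshold are made up.
        from statistics import mean

        # Recent hourly traffic per cell, in Mbps (illustrative numbers).
        traffic_history = {
            "cell-A": [220.0, 180.5, 150.2, 140.8],
            "cell-B": [4.1, 2.8, 1.9, 0.7],  # trending towards idle
            "cell-C": [60.3, 58.1, 61.9, 59.4],
        }

        IDLE_THRESHOLD_MBPS = 5.0  # below this, the cell is a sleep candidate

        def predict_next_load(history: list[float], window: int = 3) -> float:
            """Naive forecast: moving average of the most recent samples."""
            return mean(history[-window:])

        sleep_candidates = [
            cell for cell, history in traffic_history.items()
            if predict_next_load(history) < IDLE_THRESHOLD_MBPS
        ]
        print("Sleep candidates:", sleep_candidates)  # -> ['cell-B']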

    On the starting blocks

    However, Jarrett thinks that massive network automation “will be more like a ten-year time horizon, because we’re going to have a radical shift in network architectures that we’re just beginning to see now. They will begin to function well over the next two to three years, but it’ll take a couple more for them to be deployed and then adopted across the world.”

    If, on a scale of one to ten, the massive network automation envisioned in ten years’ time is a ten, Jarrett puts Orange’s progress at two or three today. He continues, “Even somebody like Rakuten [Mobile] I would put at three to four on that scale.” This is not dismissive of what has been achieved, but an indication of the “enormous opportunity” the industry has “to transform the way that we run our businesses through automation”.

    He adds, “If you look at, for example, how Google runs their data centres, how they run operations internally at their own company, and their capital efficiency, I think you see a model there for what the opportunity is for all businesses, not just telcos.”

    “As you disaggregate the software from the hardware, you have good interfaces to access the data and to act on the data, which enables a much more dynamic kind of environment. I think we’re going to see an explosion of that. We already see it in the containerisation of how services work on cloud providers today. You see an explosion of ways that enable you to, for example, control your costs, or provide security and so on. We’re going to see an enormous amount of innovation.”

    Partnership with Google Cloud

    Little wonder then that in July, Orange and Google Cloud announced a strategic partnership to accelerate the transformation of Orange’s IT infrastructure and the development of future cloud services, in particular edge computing.

    Jarrett says, “A big part of my job is to make sure that we make large partnership decisions that allow us to improve the advantage of all [that] external investment,” pointing out that the hyperscale cloud companies have been dealing with data problems similar to those of the operators for years, but at much greater scale.

    Still, while the cloud hyperscalers were created to be data driven, the operators were not; hence the travails of digital transformation. Jarrett is nothing daunted, comparing the situation to the early years of the internet, when the pace of change was extraordinary and companies had to fundamentally rethink how their businesses would operate.

    He says, “We need to think about data as being a common wealth, which is the ability to share data between teams and break down the silos. It enables everyone to take business benefit from using the data for different purposes. Historically, the team that generated a particular data set felt like they owned it but, for example, network data is…extremely useful to many different teams. That’s the biggest, the hardest, problem, plus the willingness to change, and that also relates to training and skills.”

    As part of Engage2025, Orange has committed to investing more than €1.5 billion in a skills-building programme, open to all employees, that will train 20,000 staff in network virtualisation, AI, data, cloud computing, code and cybersecurity.

    Data governance

    In the meantime, Jarrett explains, “Extremely heterogeneous network data requires really good data governance, which is the methodology to structure the data, to understand where the data comes from and what actions have been taken on it. Then additionally [you need] really good tools to allow you to ingest the data and look for anomalies.”
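
    To make that concrete, a minimal Python sketch (not Orange’s tooling; the class, field names and example actions are hypothetical) could attach provenance metadata to each data set, recording its source and every action taken on it:

        # Hypothetical sketch: lineage metadata for data governance, recording
        # where a data set comes from and each action applied to it.
        from dataclasses import dataclass, field
        from datetime import datetime, timezone

        @dataclass
        class DatasetLineage:
            name: str
            source: str  # where the data comes from
            actions: list[str] = field(default_factory=list)

            def record(self, action: str) -> None:
                """Append a timestamped entry for an action taken on the data."""
                stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
                self.actions.append(f"{stamp}  {action}")

        ran_counters = DatasetLineage(name="ran_counters_daily",
                                      source="OSS counter export, cell level")
        ran_counters.record("ingested raw CSV batch")
        ran_counters.record("dropped rows with missing cell IDs")
        ran_counters.record("aggregated to hourly averages")
        print("\n".join(ran_counters.actions))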

    He says very high-scale data systems should not always simply use a pipeline to extract the data and prepare it for the next step, as pipelines can go wrong for many reasons. Consequently, systems must look not only for network anomalies, but for anomalies in the data itself, to avoid acting on bad or skewed data created by a software glitch or some other source of inaccuracy.
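
    As a hedged illustration of that point (a sketch, not Orange’s pipeline; the statistics and the three-sigma threshold are assumptions), one simple guard compares each incoming batch against historical batches and quarantines anything statistically implausible instead of passing it downstream:

        # Hypothetical sketch: sanity-check a pipeline batch before acting on
        # it. A batch whose mean sits more than three standard deviations from
        # the historical mean is quarantined rather than processed.
        from statistics import mean, stdev

        historical_batch_means = [101.2, 99.8, 100.5, 98.9, 100.1, 99.4]

        def batch_is_plausible(batch: list[float], history: list[float],
                               max_z: float = 3.0) -> bool:
            """Flag batches whose mean is a statistical outlier vs. history."""
            mu, sigma = mean(history), stdev(history)
            return abs(mean(batch) - mu) / sigma <= max_z

        good_batch = [100.3, 99.1, 101.0]
        glitched_batch = [0.0, 0.0, 0.1]  # e.g. a collector started writing zeros

        for batch in (good_batch, glitched_batch):
            if batch_is_plausible(batch, historical_batch_means):
                print("acting on batch", batch)
            else:
                print("quarantining suspect batch", batch)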

    He states, “I think we have a really good understanding of those problems and we’ve done very nice work, including… this deal with Google, which is a really fundamental shift for the company. And I think they also have a lot to bring to us regarding these kinds of problems.”

    Healthy market

    This is because, Jarrett says, there is a rapidly growing awareness and understanding of the value of data and potential problems with it. “As a result, there’s so many new startups, as well as established players, that are really invested in addressing these kinds of problems, and there is enormous innovation. I probably spend an hour or two, every day, just reading and trying to keep up with the industry.”

    He says the likes of Amazon, Google and Microsoft Azure not only invest in cloud infrastructure, but provide a platform for these companies to sell their specialist value-added services – for example, fixing the labelling of complex, poorly labelled data.
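
    To illustrate the label-fixing example (a toy Python sketch, not any vendor’s product; the synthetic data, model choice and injected error are assumptions), one common heuristic flags training examples whose out-of-fold prediction disagrees with the given label:

        # Toy sketch: flag suspect labels by comparing each example's label
        # with its out-of-fold prediction (a common label-noise heuristic).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        # Two well-separated clusters, one per class (synthetic data).
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        y[3] = 1  # deliberately corrupt one label

        # Each example is predicted by a model that never saw it in training.
        preds = cross_val_predict(LogisticRegression(), X, y, cv=5)

        suspects = np.flatnonzero(preds != y)
        print("indices with suspect labels:", suspects)  # likely [3]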

    Jarrett says, “They have a really strong business model,” and continues, “There’s enormous venture capital investment and acquisitions, so that’s a very, very healthy market and it’s really helping us dramatically in our ability to be efficient.”