
UK universities demo first long-distance ultra-secure link via quantum network

For the first time, two kinds of quantum key distribution and ordinary data were transmitted simultaneously between the Universities of Bristol and Cambridge

Researchers have demonstrated the UK’s first long-distance ultra-secure transfer of data over a quantum communications network, including the UK’s first long-distance quantum-secured video call.

A team from the Universities of Bristol and Cambridge created the quantum communications network using standard fibre infrastructure and two types of quantum key distribution (QKD) schemes.

The schemes are described as “unhackable encryption keys” hidden inside particles of light, and distributed entanglement – a phenomenon that causes quantum particles to be intrinsically linked.
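The idea behind photon-borne keys can be illustrated with a toy simulation of BB84-style key sifting (a generic textbook sketch, not the scheme the Bristol–Cambridge team actually deployed): Alice encodes random bits in randomly chosen bases, Bob measures in his own random bases, and only the rounds where the bases happen to match contribute to the shared key.

```python
import secrets

def bb84_sift(n_photons: int) -> list[int]:
    """Toy BB84 sketch: Alice encodes random bits in random bases;
    Bob measures in random bases; only matching-basis rounds are kept."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_photons)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_photons)]

    # Keep only the rounds where Alice's and Bob's bases agree.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

key = bb84_sift(1000)  # on average roughly half the rounds survive sifting
```

In a real system, an eavesdropper measuring the photons in transit would disturb a fraction of the surviving rounds, showing up as an elevated error rate when Alice and Bob compare a sample of the sifted key.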

The researchers demonstrated the capabilities via a live, quantum-secure video conference link, the transfer of encrypted medical data and secure remote access to a distributed data centre. The data was successfully transmitted between Bristol and Cambridge – a fibre distance of over 410 kilometres.

This is the first time that a long-distance network, encompassing different quantum-secure technologies such as entanglement distribution, has been successfully demonstrated. The researchers presented their results at the 2025 Optical Fiber Communications Conference (OFC) in San Francisco.

Why is this such a big deal?

The universities say that quantum communications offer unparalleled security compared to ordinary telecoms solutions.

In the past few years, researchers globally have been working to build and use quantum communication networks. China recently set up a massive network that covers 4,600 kilometres by connecting five cities using both fibre and satellites.

In Madrid, researchers created a smaller network with nine connection points that use different types of QKD to securely share information.

In 2019, researchers at Cambridge and Toshiba demonstrated a metro-scale quantum network operating at record key rates of millions of key bits per second. In 2020, researchers in Bristol built a network that could share entanglement between multiple users. Similar quantum network trials have been demonstrated in Singapore, Italy and the US.

Despite this progress, until now, there has not been a large, long-distance network that can handle both types of QKD, entanglement distribution and ordinary data transmission simultaneously.

Details of the experiment

The experiment demonstrates the potential of quantum networks to accommodate different quantum-secure approaches simultaneously with classical communications infrastructure. It was carried out using the UK’s Quantum Network (UKQN), established over the last decade by the same team, supported by funding from the Engineering and Physical Sciences Research Council (EPSRC), and as part of the Quantum Communications Hub project.

The current UKQN covers two metropolitan quantum networks around Bristol and Cambridge, which are connected via a ‘backbone’ of four long-distance optical fibre links spanning 410 kilometres with three intermediate nodes.

The network uses single-mode fibre over the EPSRC National Dark Fibre Facility (which provides dedicated fibre for research purposes), and low-loss optical switches allowing network reconfiguration of both classical and quantum signal traffic.
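Why a 410-kilometre backbone needs intermediate nodes becomes clear from a simple loss budget: quantum signals are carried by single photons that cannot be amplified, so fibre attenuation directly reduces the key rate. A rough sketch, assuming a typical 0.2 dB/km attenuation for single-mode fibre at 1550 nm (a textbook figure, not the UKQN's measured value):

```python
def fibre_loss_db(length_km: float, atten_db_per_km: float = 0.2) -> float:
    """Total attenuation of a fibre span; 0.2 dB/km is a typical figure
    for single-mode fibre at 1550 nm (assumption, not a UKQN measurement)."""
    return length_km * atten_db_per_km

def surviving_fraction(loss_db: float) -> float:
    """Fraction of photons surviving a given loss: 10^(-dB/10)."""
    return 10 ** (-loss_db / 10)

total = fibre_loss_db(410)  # 82 dB over the full route
print(total, surviving_fraction(total))
```

At that level of attenuation only a tiny fraction of photons would survive a single end-to-end span, which is why long-distance quantum links are built as chains of shorter segments joined at intermediate nodes.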

Next steps

The team will pursue this work further through a newly funded EPSRC project, the Integrated Quantum Networks Hub, whose vision is to establish quantum networks at all distance scales, from local networking of quantum processors to national-scale entanglement networks for quantum-safe communication, distributed computing and sensing, all the way to intercontinental networking via low-earth orbit satellites.

“This is a crucial step toward building a quantum-secured future for our communities and society,” said co-author Dr Rui Wang, Lecturer for Future Optical Networks in the Smart Internet Lab‘s High Performance Network Research Group at the University of Bristol. “More importantly, it lays the foundation for a large-scale quantum internet – connecting quantum nodes and devices through entanglement and teleportation on a global scale.”

Co-author Adrian Wonfor from Cambridge’s Department of Engineering added, “This marks the culmination of more than ten years of work to design and build the UK Quantum Network. Not only does it demonstrate the use of multiple quantum communications technologies, but also the secure key management systems required to allow seamless end-to-end encryption between us.”

Telcos: Strategic partners in sovereignty and national security | White paper by intersec


In an era of ubiquitous connectivity, growing threats and increasing demands for public safety, CSPs are expected to help support public safety and protect national interests, becoming trusted sovereign partners for governments’ all-hazards and all-threat approach.

  • How to provide real-time population density data in the event of a natural disaster?
  • How to ensure the most accurate location when someone calls for help?
  • How to support criminal investigations using ultra-high-definition positioning techniques while maintaining privacy and compliance?
  • How is the role of telecoms regulators evolving to orchestrate CSP data at a national level? 
  • How can CSPs truly benefit from advanced geolocation tools from a commercial perspective?

In this white paper, you will gain practical insights to enable emergency response, law enforcement, civil protection and true innovation, while ensuring data sovereignty, security and compliance.


Zayo, Nokia trial 800Gbps alien wavelength between Marseille and Paris


Introducing a new WDM channel into existing infrastructure boosts capacity, which the parties claim will be needed for AI and 6G

Nokia and network infrastructure provider Zayo Europe completed an 800Gbps alien wavelength trial between Paris and Marseille – a distance of more than 1,000km. In wave division multiplexing (WDM), an alien wavelength is a coloured optical signal originating from equipment not directly controlled by the network operator, but which is transmitted over its existing infrastructure.

Zayo Europe connects more than 600 data centres with a network that spans more than 2.3 million kilometres of fibre and seven subsea cables. It links 17 metro markets across 11 European countries.

Alien integration

Nokia says the trial proves its alien wave technology can be seamlessly integrated into Zayo Europe’s existing infrastructure. The live field trial used Zayo Europe’s optical line system and Nokia’s PSE-6s transponders. It increased capacity by transmitting more data per WDM channel than a standard 150GHz WDM channel, thereby improving efficiency and performance.

Michael Katz, Chief Product Officer at Zayo Europe, says: “With the EU pushing to supercharge AI capabilities and Europe’s 6G vision, there’s an urgent need for networks that can support these advancements…we have always been at the forefront of innovation, being the first provider to upgrade its Pan-European network to support 400G client interfaces.

“This trial with Nokia sets a new industry standard as data centres increasingly mix and match equipment to enhance flexibility, reduce costs, and improve performance.”

Era of 6G and AI

Paul Alexander, Vice President and Country General Manager of UK & Ireland at Nokia, said “Our collaboration with Zayo Europe in this live field trial showcases how our PSE-6s technology can seamlessly integrate into existing infrastructures, enabling service providers to meet the evolving demands of a digital economy.

“With the growing need for high-speed, high-capacity networks to support advancements in AI and 6G, this trial exemplifies the critical role of innovation in helping businesses and data centres stay ahead of the curve.” 

European operators demand EU, NATO and UK act to secure subsea cables


There is no doubt action is needed, but exactly what and how are thorny issues, practically and legally

Nine telecom organisations have written to authorities in the European Union, NATO and the UK demanding action to secure Europe’s subsea ecosystem. The operators are GlobalConnect, Orange, Proximus, Sparkle (Telecom Italia), Telefónica, Telenor, and Vodafone; the manufacturers are Alcatel Submarine Networks and NKT.

They wrote, 

“At this crucial time for Europe’s security and resilience, we commend your efforts to strengthen collective defence and protect critical infrastructure. Subsea cables play a vital role in Europe’s connectivity, competitiveness, defence readiness, and economic stability. We recommend the EU/EEA and UK authorities as well as NATO renew their collaboration to address this situation effectively, together with the industry stakeholders from the EU and from the UK.

With the rise in hybrid threats, including incidents affecting subsea cables in the Baltic and North Sea, we emphasize the importance of enhanced, coordinated action to safeguard Europe’s cross-border networks. The EU Action Plan on Cable Security [published in February] provides a clear approach to further increase the resilience and security of subsea cables.

We welcome in particular the reference made to the instrumental aspect of the Connected Europe Facility (CEF), and the willingness of the European Commission to launch a dedicated dialogue with industry notably on the definition of the list of CPEIs, and the need for the deployment of surveillance and protection technologies for submarine cable infrastructure. Instruments of the UK authorities and of NATO could strengthen the momentum if coordinated effectively.

The signatories of this letter are strongly committed to contributing to this dialogue with the EC, the UK and the NATO. Globally to deliver on the Action Plan, it is crucial to engage industry stakeholders and establish a clear roadmap for implementation.

The challenge now is to ensure a high level of security and resilience across Europe building on existing frameworks such as the NIS2 Directive and promoting best practices. The entire subsea cable ecosystem must be regarded as critical infrastructure. It is essential to collectively ensure the appropriate level of security screening, protection, and investments into resilience. The repercussions of damage to subsea cables extend far beyond Europe, potentially affecting global internet and power infrastructure, international communications, financial transactions, and critical services worldwide.

We urge the EU/EEA, UK, and NATO decision-makers to work together closely with EU/EEA and UK industry stakeholders. Harmonised approaches must be developed for the subsea cables ecosystem, aiming to align security objectives with operational feasibility as well as a viable business model and based on proportionate and risk-based best practices, developed in close consultation with industry. These best practices on security could be spread through public procurement, which can be mirrored across likeminded partners and NATO allies.

The EU/EEA, the UK, and NATO must invest in the robust exchange of knowledge and intelligence, as well as shared monitoring and surveillance initiatives, involving all relevant stakeholders to pursue collective and enduring solutions. Stronger public-private cooperation with trusted partners is essential for our effectiveness at managing threats and at developing shared resilience strategies.

Investment in advanced technologies to detect and mitigate damage to subsea cables is critical. Such efforts should be supported by funding instruments such as the CEF, or the European Defence Fund. It is therefore of utmost importance to confirm and increase the budget allocated to digital, including the CEF, in the future MFF, as well as the instruments of the UK government and of NATO.

In parallel, the development of additional routes, both terrestrial and subsea, will enhance redundancy and reduce vulnerability to single points of failure. Such aspects of resilience should be enhanced in the objectives of the CEF.

By partnering with industry, Europe can leverage advanced technologies and expertise to improve situational awareness, enable rapid response and strengthen repair capabilities. Simplifying the permitting process and governance structures will further expedite these necessary security measures.

Subsea cable security must be a cornerstone of broader infrastructure protection efforts. By acting now, we can safeguard the networks that underpin our shared future.”

Geopolitics ups the ante

At least 11 subsea cables have been damaged in the Baltic Sea since October 2023, and more than 50 Russian ships have been observed there, where there is a dense network of subsea cables. There have been a number of further incidents in the North Sea.

The Telegraph yesterday reported that the UK is monitoring the Russian spy ship Yantar which it suspects is mapping underwater infrastructure.

There have also been attacks on subsea cables in the Red Sea, which have had a profound effect on Europe and elsewhere.

The ITU’s International Advisory Body for Submarine Cable Resilience held its first meeting last December. It spelled out that while there is no doubt that subsea cables are being sabotaged the world over, most damage to cables is caused by human error or natural activity on the seabed, and it can be hard to prove otherwise.

In addition, there are many grey areas regarding legal responsibilities in international waters, which with so many parties involved could be difficult to resolve.

Telia, Finnish Defense Forces and Nokia complete first cross-border 5G slice handover

The companies were keen to showcase the potential of 5G tech in critical communications for defence units “within coalition environments” on live commercial networks

Nokia, Telia and the Finnish Defense Forces conducted what they say is the world’s first seamless 5G Standalone (SA) slice handover between multiple countries in a live network. This “groundbreaking trial” was carried out in Finland last month. The parties say it is a milestone in advancing critical 5G capabilities for defence and other mission-critical industries.

The test was conducted as part of a Nordic exercise with the Finnish Defense Forces. It demonstrated a continuous and secure data connection over a 5G SA slice while moving across three separate networks in three different countries. The parties say this capability is crucial for modern defence forces, as military personnel increasingly operate in coalitions beyond their national territories and need uninterrupted access to mission-critical applications and services.

Possibilities of dual use

The trial was achieved using Nokia’s 5G Core Software-as-a-Service (SaaS) and AirScale 5G base stations powered by ReefShark System-on-Chip technology connected to Telia’s commercial network. Nokia’s intelligent network management system, MantaRay NM, provided a consolidated network view, optimising monitoring and management.

Jarmo Vähätiitto, Major General, Finnish Defense Command, Chief of C5, stated, “This trial marks a…milestone in showcasing the dual-use possibilities of 5G for defense while also enhancing communication capabilities within the NATO domain. We are delighted to have partnered with Nokia and Telia on this project and are eager to explore further opportunities for integrating 5G into our operations.”

Jari Collin, CTO at Telia Finland, added, “5G and network slicing enable secure, mission-critical communications. In collaboration with the Finnish Defense Forces and Nokia, we are pioneering in using commercial technology for critical defense communications. This trial meets the Defense Forces’ needs and proves that commercial 5G networks can be utilized also in this domain.”

Telco to Techco: How much progress has cloud native tech made in telecoms?

Philippe Ensarguet, VP of Software Engineering at Orange group, is among those best qualified to provide a comprehensive update

This article is bonus content – a longer, more detailed version of the keynote conversation that took place between Ensarguet and Mobile Europe’s editor, Annie Turner, at our recent Telco to techco virtual conference.

You can also watch the video from the conference here.

AT: Philippe, four years ago we had a big conversation at MWC about how telecoms was beginning to adopt cloud native technologies, for instance with CI/CD. What are the key challenges telcos face regarding cloud native technologies?

PE: I’d say three main things. To start with, a skills gap: a network function (NF) remains an NF, but everything around it has changed, and massively – for example, the use of microservices, and new approaches to resiliency and scalability. Then there is network softwarisation and network automation, which means, most importantly, that the software skills we need are quite different.

Then there’s the complexity of integration and operational challenges – for instance, implementing microservices-based architecture, containers and Kubernetes orchestration, plus monitoring and observability based on OSS technology. And let’s not forget all this must co-exist with legacy systems.

Thirdly, we are dealing with massively distributed infrastructures and systems – distributed and modular microservices architecture. Honestly, telco system are really complex, with real-time, synchronous workloads next to asynchronous ones, and all of this massively distributed across a huge infrastructure.

AT: Could you summarise what progress has been made since then? How mature is telcos’ implementation of cloud native tech?

PE: Being cloud native means three things: having a cloud-native runtime; having automated deployment, lifecycle and operations management that is cloud native; and having truly cloud-native NFs.

My feeling is that globally, over the past four years, the telecom industry has made notable strides in adopting cloud native technologies, particularly with the rollout of 5G core Standalone. Many communication service providers (CSPs) are transitioning to cloud native architectures, but the shift is still incomplete, even if the 5G core is where things are happening right now.

On broad adoption, I’d say there is growing recognition of the necessity of cloud native solutions to meet scalability, elasticity and efficiency demands – that’s where interest in cloud native is rising.

Note that the C in CNF does not stand for container but for cloud native – cloud-native network functions. There is now de facto adoption of container technology by network-function vendors, but the software for cloud-native network functions is still developing. The full transition requires technological, process, people and cultural changes within telcos, and notably among network vendors, whose states of maturity differ widely.

There are other challenges. We’ve mentioned the industry’s skills gap in cloud native expertise; there is also the operational complexity of hybrid and multi-cloud environments, and the transformation from virtual network functions (VNFs) to CNFs.

Another factor is that open source initiatives are crucial in establishing standards for cloud-native frameworks, enhancing open portability and interoperability, and addressing the complexities of multi-cloud deployments.

AT: Has progress been faster or slower than expected?

When you are committed and impatient like I used to be, things never move fast enough, but a huge ecosystem shift is happening that involves telecom operators, NF vendors, orchestration and operation providers, and AIOps providers. That takes time.

I think things moved relatively quickly on the infrastructure side. I know several telco peers who set up very robust and highly automated infrastructure, so we are already able to bring in industrial-grade, cloud-native infrastructure from OSS, IT and network vendors. We are moving towards a horizontal implementation strategy.

Regarding lifecycle, the GitOps operating model is moving forward, but it can work only if all the parts are moving into the cloud native space. At Orange we implemented a NIF TZ to help us converge our NF deployment and lifecycle management. I know peers at DT, Swisscom and some others are very seriously in this play too.

The NF part is certainly where more progress needs to be made. Some vendors are quite advanced, while others keep putting bloated virtual machines (VMs) into containers and adding a cloud native sticker. This is a most important area. We need network vendors to become true software vendors if we are to succeed in this ecosystem transformation.

The world of orchestration is also about to be somewhat disrupted, because once you have Kubernetes in place, you can expect proper cloud-native orchestration based on the right operators and custom resource definitions (CRDs), which are quite different from the traditional service order management (SOM) approach.
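The intent-based idea behind CRDs can be sketched in a few lines: you declare the desired state, and a controller loop continually compares it with the observed state and emits the actions needed to converge them. A hypothetical illustration of the reconciliation pattern (the resource names are invented, and this is not Orange’s or Kubernetes’ actual code):

```python
def reconcile(desired: dict, actual: dict) -> list[str]:
    """Minimal sketch of a Kubernetes-style reconciliation loop:
    compare declared intent (desired) with observed state (actual)
    and return the actions needed to converge them."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(f"create {name}")       # declared but absent
        elif actual[name] != spec:
            actions.append(f"update {name}")       # present but drifted
    for name in actual:
        if name not in desired:
            actions.append(f"delete {name}")       # present but no longer declared
    return actions

# Intent: two hypothetical NF deployments; observed: one drifted, one missing.
desired = {"upf": {"replicas": 3}, "smf": {"replicas": 2}}
actual  = {"upf": {"replicas": 1}}
print(reconcile(desired, actual))
```

The contrast with a SOM approach is that nothing here is an imperative workflow: the controller rederives the actions from the declared intent on every pass, so drift is corrected automatically.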

AT: Where are the big benefits that cloud-native tech has brought to telecoms?

I think there are several areas of benefit, covering almost the full lifecycle, starting with scalability and elasticity from dynamic resource allocation. Cloud-native architectures allow operators to scale resources up or down based on demand, which is crucial for managing traffic spikes, especially in regions experiencing rapid digital transformation with automation. We can have very scalable infrastructure with cloud native.

Next we gain greater cost efficiency by reducing operational costs. By leveraging cloud-native solutions, we can optimise infrastructure and reduce costs associated with hardware and maintenance, leading to more efficient operations.

Agility and speed come from faster deployment. Cloud-native technologies facilitate quicker deployment of services and applications, enabling telecoms to respond rapidly to market changes and customers’ needs.

Better service delivery and reliability – improved performance – result from adopting an intent-based approach to managing cloud and network functions. In particular, this significantly enhances the level of control operators have over deployments: they can rely on a single source of truth in Git, with automated orchestration provided by GitOps and Kubernetes to manage cloud infrastructure and network functions.

Moreover, the integration of AI and machine learning in cloud-native applications enhances service delivery, allowing for better customer experiences and more reliable services.

Another advantage is in operational flexibility due to multi-cloud capabilities. Cloud-native frameworks support multi-cloud environments, giving telcos the flexibility to choose the best services and tools for their needs, while avoiding vendor lock-in.

Today, by leveraging the work of the Sylva project, hosted by the Linux Foundation, we can set up infrastructure from bare metal to VMs, from private networks to the edge and to public cloud, that is 100% intent-based, using Kubernetes as the cornerstone of our management strategy.

The last thing I want to mention in terms of benefits here is innovation through open source and the power of collaboration and standards: Open source projects are driving innovation and establishing standards that improve interoperability and portability, making it easier for telcos to adopt new technologies.

AT: Open source is tightly linked to the adoption of cloud native technologies in telecoms – can you tell us more about its importance and role?

Yes. Open source is integral to the successful adoption of cloud-native technologies in telecom, driving standardisation, collaboration, cost savings, and innovation while enhancing flexibility and the development of skills within the industry.

In terms of standardisation, open source projects establish common frameworks and help to create standards that facilitate interoperability and portability across different cloud environments. This is essential for multi-cloud deployments.

Community-driven development drives collaboration and innovation. The open source community fosters collaboration among telecom operators, developers and vendors, leading to innovative solutions that address industry-specific challenges.

Open source brings cost effectiveness through reduced licensing fees, which can be lower than those for proprietary software licences, so telcos can allocate resources more effectively. It’s important to understand that open source doesn’t automatically cost less; rather, it’s a different way to distribute cost across the end-to-end chain, and to get more leverage from your team and their efforts.

Open source can result in accelerated deployment, meaning faster time to market, because open source tools and frameworks can streamline development and deployment processes. In turn, this enables quicker rollout of services and applications.

The management of critical services is super important in this time of shifting geopolitics, and open source is a way to manage sovereignty – for example, through Europe-based players.

AT: I know you are involved in many cloud native projects and bodies that are working on various aspects of open-source, cloud-native standards and implementation models. For example, you are heavily involved with the Cloud Native Computing Foundation’s (CNCF’s) KubeCon and CloudNativeCon event this week.

Please could you outline the main bodies and projects you are involved with and give us a summary of what they are working on, their goals and why they matter?

PE: OK, to start with, as I just discussed, open source is a great way for telecom operators to manage synergies and collaboration. It allows us to share common resources and focus our energy on what makes us unique.

First, I want to put the focus on Project Sylva, hosted by the Linux Foundation Europe, which aims to build an industrial-grade, cloud-native telco stack covering core, RAN and edge network functions.

At Orange, we are building our Orange Telco Cloud CaaS based on Sylva. While this runtime covers only 10% of our infrastructure today, it will be more than 60% by 2030, mainly to host 5G core services and the RAN. You can also see that we are leveraging CNCF projects like CAPI (for managing cluster APIs) and Flux (an open and extensible continuous delivery solution for Kubernetes) for an intent-based GitOps model with different open source Kubernetes flavours.

Orange is also active in Anuket [formed from the merger of OPNFV and the Cloud iNfrastructure Telco Taskforce (CNTT), whose mission is to create a common understanding and new capability for infrastructures across the telecom industry and to plot a collective future], in particular on the Reference Architecture for Kubernetes and its conformance testing.

We actively participate in the LFN CNTi [that is, the Linux Foundation Networking Cloud Native Telecom Initiative], which is a true catalyst for best practices in cloud-native networking, as well as for improving an open-source CNF testing framework and developing a vendor-neutral CNF conformance programme.

OpenSSF [Open Source Security Foundation] is working on the SLSA framework that we use to manage the security of our infrastructure supply chain.

Nephio is a north star in the automation and management of CNFs. Release 4 was announced recently, with major improvements.

Finally, as telecom operators embark on a new journey to monetise their assets with telco APIs, we are contributing to CAMARA, which is bringing the Service APIs specification to market and giving developers super powers to build unique features using our networks.

It’s also an opportunity to give tons of kudos to the cloud native community. For the last decade, the CNCF has managed the key core projects that underpin our foundations from a technical standpoint.

From this, you can see how much the open source ecosystem is connected to the most important transformation, the move from telcos to techcos.

AT: Is there a danger of fragmentation with so much work going on in so many different places in parallel?

PE: It’s a very good question, and this is why telecom operators need to participate in the open source projects they trust and that are cornerstones of their future. We’ve got the CNCF providing the foundation. We’ve got Linux Foundation Networking managing connectivity, edge and networking projects. 3GPP is the home of standards specification, and the GSMA brings architecture and implementation orientations. After the manifesto, NGMN is now addressing the assessment scope.

This may seem like a complex ecosystem, and it is, but the boundaries are defined. At Orange, we know the right contacts when necessary, and we engage with all of them.

AT: Looking forward, how is all this work that is underway now going to feed into 6G – and maybe before we dig into that, we should talk briefly about why we need 6G when there is still such a long way to go with 5G, for instance, in terms of monetising it?

PE: This looks like a very nice last question!

PE: 6G has so far been widely used to refer to the mobile technology evolutions to be deployed from 2030 onwards. We call on the industry to reassess the benefits of using generation-based terminology for the evolution or new versions of technologies. The industry needs to focus marketing and communication on the value that new innovations enable for customers and society, rather than on the enabling technology itself.

So, at Orange we pledge to deliver a software-based 6G, without massive hardware replacement.

The way we envisage 6G is, first, to respect the carbon trajectory – to use less hardware where possible. Even if new hardware is necessary for some additional frequencies, overall 6G should not force us to renew all our hardware.

Secondly, our investments must be in line with our revenues, which again leads us to limit hardware deployment as much as possible.

Thirdly, we need to size according to the use cases we see. There’s no point in rolling out 10Gbps without use cases; it’s not sustainable.

In terms of timing, this is the perfect moment to envisage a truly cloud native 6G, as we have just had 3GPP’s kick-off launching the initial studies. We know we won’t have a near-final spec before March 2029, so there will be no real product-grade 6G implementations before then. We could expect initial commercial deployments by the end of 2030.

In between those milestones, most telcos have their initial 5G deployments to renew or reconsider, and it will be a massive learning period for operators, and for vendors, to push further along the cloud native path. We need cloud-native experts from operators and vendors sitting together to manage the right non-functional requirements.

I’m always surprised that, year after year at MWC, we still have sessions about how to monetise 5G. When we come to 6G, it would be nice to have a business focus rather than what could become a pure techno-push position. In other words, now is the right time to put cloud native in the 6G trajectory.

Watch the video on demand here.

Siemens to sell solutions to German watercos based on O2’s slices

The vendor claims the jointly developed offer is a “breakthrough for 5G network slicing in critical, distributed industrial applications in Germany”

Technology company Siemens and O2 Telefónica in Germany are partnering to offer a new, 5G-based solution to the country’s water industry. The partners have developed and tested the solution, which is based on Siemens’ 5G routers and 5G slices from O2 Telefónica.

It will be marketed by Siemens to thousands of water and wastewater utilities in Germany. Unlike manufacturing, which runs localised private 5G networks on campus, the water industry’s facilities are distributed across a large area so the solution needs to provide long distance connections.

According to Siemens, the 5G Slice for the Water Industry enables water utilities to monitor and control their entire system using automation technology over a 5G slice. The networks are optimised “to reliably deliver defined Quality of Service…for automation applications,” the firm says. It adds, “This is critical for processes like pressure control, flow measurement and automated emergency response at water utilities’ mostly distributed sites.”

Slice-based solutions

Siemens states that 5G slices can be optimised for specific use cases in terms of their speed, response time and security. Also, they are separated, end to end, from traffic on the public internet with “a high level of cybersecurity and data protection for water companies which are subject to critical infrastructure regulations”.

The solution is being tested by a water utility in the state of North Rhine-Westphalia. According to the press release, it “addresses one of the biggest challenges in the water industry: a secure and efficient central orchestration of distributed infrastructure, from pumping stations and reservoirs to water towers and water treatment plants”.

Until now, water utilities have typically expended great effort on monitoring, control and connectivity, which is time-consuming and costly, requiring complex individual solutions. Siemens says the new solution means the utilities can “easily and securely connect their entire infrastructure via 5G network slicing technology and integrate additional sites into their network faster”.

Alfons Lösing, Chief Partner and Wholesale Officer at O2 Telefónica, said, “As a leading technology company, Siemens is the ideal partner for the practical application of 5G network slicing. For the first time, this technology is driving industrial applications via mobile communications with defined performance parameters. Our jointly developed solution marks the breakthrough for 5G network slicing in critical, distributed industrial applications in Germany.”

S Africa 2024 revenues reach €12.8bn – affordability and accessibility still issues

According to the regulator Icasa’s annual report, mobile data services grew 10% year on year and fixed internet income rose 15% – social media was key driver

The latest annual report from the Independent Communications Authority of South Africa (Icasa) found the country’s telecom sector generated R272 billion (€12.839 billion) in revenue in 2024, an increase of around 17% from 2023’s total of R232 billion.

Social media engagement was one of the main drivers of broadband connectivity last year, via mobile and fibre connections: revenues from mobile data services grew 10% and revenues from fibre broadband grew 15%.

Icasa’s The State of the ICT Sector of South Africa report notes that as data consumption increased, revenue from mobile operators’ voice calls continued to fall, dropping by nearly 8% in 2024.

The report found that revenue from text and multimedia services rose 20% year on year, however. It commented, “The substantial growth in mobile data services reflected a higher demand for mobile internet, while the decline in voice services and roaming revenue indicated a shift in consumer preference towards data-based services.”

Annual snapshot in tenth year

Each year Icasa requires the telecoms, broadcasting and postal services sectors to submit information about key data points from the previous 12 months, up to the cut-off date of 30 September. This is Icasa’s tenth annual report on ICT in South Africa.

Icasa’s annual research reveals that between 2020 and 2024, the telecoms sector grew at a compound annual growth rate (CAGR) of 4%, although mobile outstripped this average, growing at a CAGR of 9% during that time.
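For readers unfamiliar with the metric, a CAGR is the constant yearly growth rate that would carry a starting value to an ending value over a given period. A minimal sketch in Python (the R233bn 2020 baseline below is an illustrative placeholder, not a figure from Icasa’s report):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that
    turns start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative only: a sector growing from an assumed R233bn in 2020
# to R272bn in 2024 corresponds to a CAGR of roughly 4%.
growth = cagr(233, 272, 4)
print(f"CAGR: {growth:.1%}")
```

Note that a 4% CAGR sustained over four years compounds to about 17% total growth, which is why the sector-wide figure can look modest next to the single-year jumps reported for mobile data and fibre.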

Icasa said this is due to the “rapid adoption of mobile technologies, increased internet penetration, and the roll-out of 4G and 5G networks to support digital services”.

The regulator also highlighted e-commerce and remote working as drivers of broadband connectivity.

Two regulatory factors

Icasa also said that two major regulatory changes it had made had an impact on the 2024 figures. The first was the publication of new call termination rates, explained here by TechCentral, which were intended to lower the cost of communications. Icasa acknowledged more effort is needed to drive prices down further, making communications affordable to more people.

Network coverage must also expand, Icasa says, making services more widely available, and it pointed to its second regulatory change – starting work on a new licensing framework for satellite services, which is still in progress.

Brussels outpaces Britain on 5G – but better policy could fix it

Britain needs to ditch its failing price-oriented regulatory model, not compound it by implementing Part 2 of the Product Security and Telecommunications Infrastructure (PSTI) Act

Britain has no shortage of ambition when it comes to digital infrastructure. Successive governments have championed the importance of fast, reliable 5G as the foundation for economic growth, productivity gains, and technological competitiveness. Yet despite this, the UK’s record on delivery continues to lag behind.

Recent analysis by MedUX placed London at the bottom of the 5G table among 15 major European capitals, with substandard performance on speed, reliability, and availability. Nationally, the picture is just as bleak: the Social Market Foundation ranks the UK 30th out of 39 countries for 5G availability, and 37th for quality. For a G7 economy with advanced telecoms infrastructure and high urban density, the scale of underperformance is striking.

Government policy is a blocker

Despite successive governments’ vocal support, it is not the UK’s geography, demand, or technological capacity that is holding 5G back – it is government policy.

The reforms introduced in 2017 to the Electronic Communications Code (ECC), though well-intentioned, have had unintended consequences. The central change was a shift in how telecoms companies pay landowners to host mobile infrastructure. Instead of being based on market value, rents are now calculated using a ‘no-scheme valuation’ model – essentially quasi-compulsory purchase rules – meant to lower costs for operators and accelerate roll-out.

But over the past decade, this approach has disrupted the commercial relationship between telecoms firms and landowners, who now receive a fraction of previous rents, and in many cases, far below what they could expect from alternative uses such as renewable energy. This disruption has been further compounded by the evolution of the market structure itself.

The emergence of intermediary tower companies and wireless infrastructure providers (WIPs), who benefit from the Code’s valuation model but bear no network coverage obligations under Ofcom, has introduced a layer of complexity that further undermines landowners’ confidence and incentives.

Increased reluctance

Unsurprisingly, this has made landowners more reluctant to host masts, while also triggering a sharp rise in litigation. Since 2017, more than 1,000 court cases have been filed over telecoms infrastructure, compared to just 33 in the three decades prior. This surge in legal disputes has delayed roll-out and created higher costs and uncertainty for both investors and public bodies.

Local authorities, NHS Trusts and public landowners are increasingly caught in the crossfire, as seen in the case of Hillingdon Hospital in Greater London. It was forced to repay over £300,000 to a mobile operator after losing a rent dispute at the height of the Covid-19 pandemic when healthcare systems needed every resource they could get. 

A tougher act to follow

Despite mounting evidence that the current model is failing, the UK Government is preparing to proceed with implementing Part 2 of the Product Security and Telecommunications Infrastructure (PSTI) Act – extending the existing regime to a further 15,000 sites. Though intended to accelerate roll-out, this approach risks compounding past mistakes rather than correcting course.

This is in stark contrast to the European Union, which has taken a different and arguably more effective approach. The EU’s Gigabit Infrastructure Act, passed in 2024, provides a more balanced system that encourages voluntary agreements between landowners and operators, underpinned by good-faith negotiation and market-reflective pricing.

It actively supports deployment through a mix of streamlined planning processes, co-financing opportunities and targeted subsidies, particularly in under-served areas. Crucially, no other country in Europe has adopted the UK’s price-oriented regulatory model for access to land, a decisive factor in explaining why the UK is falling behind.

The European Union’s framework is designed not just to reduce friction but to attract capital and accelerate delivery, and it will widen an already existing gap, as reflected in the recent Analysys Mason white paper Access to Land under the GIA: Considerations for Regulation. European countries such as France, Germany, and Spain are pulling ahead in both availability and quality of 5G coverage. Denmark now delivers 5G access to over 80% of users; in the UK, it remains under 45%.

Regulatory freedom

What makes this comparison more pressing is that the UK now has the regulatory freedom to do things differently. No longer bound by EU frameworks, the UK has an opportunity to build a system tailored to its own growth ambitions – one that reflects both investors’ realities and local needs. Britain’s finance minister says removing barriers to investment is central to the government’s economic strategy. In the case of telecoms infrastructure, the barrier is regulatory, and it is entirely within our power to address.

There is a moment here for reflection and course correction. A pause on the implementation of Part 2 of the PSTI Act would allow policymakers to assess whether the current framework is fit for purpose. The weight of evidence suggests it is not.

Rather than speeding up deployment, the post-2017 regime has increased conflict, deterred investment and undermined the willingness of landowners – especially in the public and third sectors – to participate in the network expansion effort. Continuing to expand this system without review risks locking in underperformance at a time when infrastructure delivery has never been more important.

The UK could take the lead

The UK’s ambition to lead in digital infrastructure is still achievable. But delivering on that ambition will require policy that enables, rather than obstructs, the capital, cooperation and certainty needed to get Britain’s 5G roll-out back on track. That means using the freedoms we now have to rethink what works – and acting quickly before today’s problem becomes tomorrow’s entrenched failure.

Securing Lawful Interception: Protecting Networks, Data, and Compliance | White paper by SS8


The data involved in lawful interception operations is highly sensitive. A compromise can jeopardize lives, justice, and national security.

Yet the rapid evolution of communications technologies, the ever-growing number of connected devices and the distributed nature of modern networks all challenge the defence of such systems. Industry best practices and standards can help satisfy compliance requirements and ensure integrity.

In this whitepaper you will learn:

  • How lawful interception targets are provisioned
  • Lawful intercept security elements and best practices
  • How emerging technologies will influence security