
Friday, July 5, 2024

Readout: Ericsson's Mobility Report June 2024

 


For a few years now, Ericsson has published a yearly report presenting its view of the evolution of connectivity. Like Cisco's annual internet report, it provides interesting data points on the maturity of telecom technology and services, but focuses on cellular technology, lately embracing fixed wireless access and non-terrestrial networks as well.

In this year's edition, a few elements caught my attention:

  • Devices supporting network slicing are few and far between. Only iOS 17 and Android 13 offer some capability to indicate slicing parameters to their applications. These are the latest higher-end smartphones, so it is no wonder that 5G Stand Alone is late in delivering on its promises, if end-to-end slicing is only possible for a small fraction of customers. It is still possible to deploy slicing without device support, but with limitations: most notably, slicing per content / service is not achievable, while slicing per device or subscriber profile is.

  • RedCap (5G Reduced Capability) for IoT, wearables, sensors, etc. is making its appearance on networks, mostly as demos and trials at this stage. The first devices are unlikely to reach mass-market availability until the end of next year.

  • Unsurprisingly, mobile data traffic is still growing, albeit at a lower rate than previously reported: a 25% yearly growth rate, or just over 6% quarterly. The growth is mostly due to smartphone and 5G penetration and to video consumption, which accounts for about 73% of the traffic. This traffic data includes Fixed Wireless Access, although it is not broken down. The rollout of 5G, particularly in mid-band, together with carrier aggregation, has allowed mobile network operators to compete efficiently with fixed broadband operators through FWA. FWA's growth is, in my mind, the first successful application of 5G as a differentiated connectivity product. As devices and modems supporting slicing appear, more sophisticated connectivity and pricing models can be implemented. FWA price packages differ markedly from mobile data plans: the former are mostly speed-based, emulating cable and fibre offerings, whereas the latter are usually all-you-can-eat best-effort connectivity.

  • Where the traffic growth projections become murky is with the impact of XR services. Mixed, augmented and virtual reality services haven't really taken off yet, but their possible impact on traffic mix and network load can be immense. XR requires a number of technologies to reach maturity at the same time (bendable / transparent screens; low-power, portable, heat-efficient batteries; low latency and high compute on device and at the edge; high downlink / uplink capabilities; deterministic mesh latency over an area...) to reach mass market, and we are still some way away from it in my opinion.

  • Differentiated connectivity for cellular services is a long-standing subject of interest of mine. My opinion remains the same: "The promise and business case of 5G was supposed to revolve around new connectivity services. Until now, essentially, whether you have a smartphone, a tablet, a laptop, a connected car or an industrial robot, and whether you are a work-from-home or road-warrior professional, all connectivity products are really the same. The only variables are price and coverage.

    5G was supposed to offer connectivity products that could be adapted to different device types, verticals and industries, geographies, vehicles, drones... The 5G business case hinges on enterprises, verticals and governments adopting and being willing to pay for enhanced connectivity services. By and large, this hasn't happened yet. There are several reasons for this, the main one being that to enable these services, a network overhaul is necessary.

    First, a service-based architecture is necessary, comprising 5G Stand Alone, telco cloud, Multi-Access Edge Computing (MEC) and Service Management and Orchestration. Then, cloud-native RAN, either cloud RAN or Open RAN (and particularly the RAN Intelligent Controllers, RICs), would be useful. All this "plumbing" enables end-to-end slicing, which in turn creates the capability to serve distinct and configurable connectivity products.

    But that's not all... A second issue is that although it is accepted wisdom that slicing will create connectivity products that enterprises and governments will be ready to pay for, there is little evidence of it today. One of the key differentiators of the "real" 5G and slicing will be deterministic speed and latency. While most market actors are ready to recognize that, in principle, controllable latency would be valuable, no one really knows the incremental value of going from variable best effort to a deterministic 100, 10 or 5 milliseconds of latency.

    The last hurdle is the realization by network operators that Mercedes, Walmart, 3M, Airbus... have a better understanding of their connectivity needs than any carrier, and that they have skilled people able to design networks and connectivity services across WAN, cloud, private and cellular networks. All they need is access and a platform with APIs. A means to discover, reserve and design connectivity services on the operator's network will be necessary, and the successful operators will understand that their network skillset might be useful for consumers and small / medium enterprises, but less so for large verticals, governments and companies." Ericsson is keen to promote and sell the "plumbing" that enables this vision to MNOs, but will this be sufficient to fulfill the promise?

  • Network APIs are a possible first step to open up connectivity to third parties willing to program it. Network APIs are notably absent from the report, maybe due to the fact that the company announced a second impairment charge of $1.1B (after an initial $2.9B write-off) in less than a year on its $6.2B acquisition of Vonage.

  • Private networks are another trend highlighted in the report, with a convincing example of an implementation with the Northstar innovation program, in collaboration with Telia and AstaZero. The implementation focuses on automotive applications: autonomous vehicles, V2X connectivity, remote control... On paper, it delivers everything operators dream about when thinking of differentiated connectivity for verticals and industries. One has to wonder how much it costs and whether it is sustainable if most of the technology is provided by a single vendor.

  • Open RAN and programmable networks are showcased in the AT&T deal that I have previously reported on and commented. There is no doubt that single-vendor automation, programmability and Open RAN can be implemented at scale. The terms of the deal seem to indicate a great cost benefit for AT&T. We will have to measure the benefits as the changes are rolled out in the coming years.

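A quick sanity check on the traffic growth figures quoted above: a quarterly rate compounds into a yearly one, so the report's ~25% yearly and "just over 6%" quarterly numbers are two views of the same trend. A minimal sketch (the rates come from the report; the simple-compounding assumption is mine):

```python
# Check that the reported yearly and quarterly growth rates are consistent,
# assuming simple quarterly compounding (my assumption, not the report's
# stated methodology).
def yearly_from_quarterly(q: float) -> float:
    """Compound a quarterly growth rate into the equivalent yearly rate."""
    return (1 + q) ** 4 - 1

def quarterly_from_yearly(y: float) -> float:
    """Infer the equivalent quarterly rate from a yearly rate."""
    return (1 + y) ** 0.25 - 1

print(f"6% quarterly compounds to {yearly_from_quarterly(0.06):.1%} yearly")
print(f"25% yearly implies {quarterly_from_yearly(0.25):.1%} per quarter")
```

The two figures land within a point of each other (~26% yearly vs ~5.7% quarterly), so the report's numbers hold together.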

Monday, December 4, 2023

Is this the Open RAN tipping point: AT&T, Ericsson, Fujitsu, Nokia, Mavenir


The latest publications around Open RAN deliver a mixed bag of progress and skepticism. How should we interpret this conflicting information?

A short retrospective of the most recent news:

On the surface, Open RAN seems to be benefiting from strong momentum and delivering on its promise of disrupting traditional RAN with the introduction of new suppliers, together with the opening of the traditional architecture to a more disaggregated, multi-vendor model. The latest announcement from AT&T and Ericsson would even suggest that the promised TCO reduction for brownfield deployments is achievable:
AT&T's yearly CAPEX guidance is supposed to decline from a high of ~$24B to about $20B per year starting in 2024. If the $14B spent over 5 years on Ericsson RAN yields the announced 70% of traffic on Open RAN infrastructure, AT&T may have dramatically improved its RAN CAPEX with this deal.
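
As a rough check on those numbers (the figures are from public guidance; treating the deal spend as evenly spread over its five-year term is my simplification):

```python
# Back-of-envelope view of the AT&T / Ericsson figures quoted above.
# Assumption: the $14B deal is spread evenly over its 5-year term.
prior_capex_b = 24.0   # ~$24B/year peak CAPEX guidance
new_capex_b = 20.0     # ~$20B/year guidance starting 2024
deal_total_b = 14.0    # announced Ericsson RAN deal value, in $B
deal_years = 5

avg_deal_spend_b = deal_total_b / deal_years      # average yearly RAN deal spend
yearly_reduction_b = prior_capex_b - new_capex_b  # guided yearly CAPEX reduction

print(f"Average RAN deal spend: ${avg_deal_spend_b:.1f}B/year")
print(f"Guided CAPEX reduction: ${yearly_reduction_b:.0f}B/year")
```

In other words, ~$2.8B/year of RAN spend sits inside a guidance that drops total CAPEX by ~$4B/year, which is what makes the deal look favorable on its face.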

What is driving these announcements?

For network operators, Open RAN has been about strategic supply chain diversification. The coalescence of the market into an oligopoly, and then a duopoly after the exclusion of Chinese vendors from a large number of Western networks, has created an unfavorable negotiating position for the carriers. The business case of 5G relies heavily on declining costs, or rather a change in the cost structure of deploying and operating networks. Open RAN is one element of it, together with edge computing and telco clouds.

For operators

The decision to move to Open RAN is, for most operators, no longer up for debate. While the large majority of brownfield networks will not completely transition to Open RAN, they will introduce the technology alongside the traditional architecture to foster cloud-native network implementations. It is not a matter of if but a matter of when.
The when varies for each market and operator. Operators do not roll out a new technology just because it makes sense, even if the business case is favorable. A window of opportunity has to present itself to facilitate the introduction of the new technology. In the case of Open RAN, the windows can be:
  • Generational changes: 4G to 5G, NSA to SA, 5G to 6G
  • Network obsolescence: the RAN contracts are up for renewal, the infrastructure is aging or needs a refresh. 
  • New services: private networks, network slicing...
  • Internal strategy: transition to cloud native, personnel training, operating models refresh
  • Vendor weakness: nothing beats an end-of-quarter / end-of-year big infrastructure bundle discount to secure the deal and alleviate the risks of introducing new technologies

For traditional vendors

For traditional vendors, the innovator's dilemma has been at play. Nokia endorsed Open RAN early on, with little to show for it until recently, when it convincingly demonstrated multi-vendor integration and live trials. Ericsson, as market leader, has been slower to endorse Open RAN and has so far adopted it selectively, for understandable reasons.

For emerging vendors

Emerging vendors have had mixed fortunes with Open RAN. The early market leader, Altiostar, was absorbed by Rakuten, which gave the market pause for ~3 years while other vendors caught up. Mavenir, Samsung, Fujitsu and others now offer credible products and services, with possible multi-vendor permutations.
Disruptors, emerging and traditional vendors are all battling in the RAN intelligence and orchestration market segment, which promises to deliver additional Open RAN benefits (see link).


Open RAN still has many challenges to overcome before it can be adopted in any network, but the latest momentum seems to show progress toward implementation of the technology at scale.
More details can be found through my workshops and advisory services.



Thursday, November 23, 2023

Announcing Private Networks 2024


Telecoms cellular networks, delivered by network operators, have traditionally been designed to provide coverage and best effort performance for consumers' general use. This design prioritizes high population density areas, emphasizing cost-effective delivery of coverage solutions with a network architecture treating all connections uniformly, effectively sharing available bandwidth. In some markets, net neutrality provisions further restrict the prioritization of devices, applications, or services over others.

Enterprises, governments, and organizations often turn to private networks due to two primary reasons. First, there may be no commercial network coverage in their operational areas. Second, even when commercial networks are present, they may fail to meet the performance requirements of these entities. Private networks offer a tailored solution, allowing organizations to have dedicated, secure, and high-performance connectivity, overcoming limitations posed by commercial networks.

Enterprise, industries, and government IT departments have developed a deep understanding of their unique connectivity requirements over the years. Recognizing the critical role that connectivity plays in their operations, these entities have sought solutions that align closely with their specific needs. Before the advent of 5G technology, Wi-Fi emerged as a rudimentary form of private networks, offering a more localized and controlled connectivity option compared to traditional cellular networks. However, there were certain limitations and challenges associated with Wi-Fi, and the costs of establishing and operating fully-fledged private networks were often prohibitive.

Enterprises, industries, and government organizations operate in diverse and complex environments, each with its own set of challenges and requirements. These entities understand that a one-size-fits-all approach to connectivity is often inadequate. Different sectors demand varied levels of performance, security, and reliability to support their specific applications and processes. This understanding has driven the search for connectivity solutions that can be tailored to meet the exacting standards of these organizations.

Wi-Fi technology emerged as an early solution that provided a degree of autonomy and control over connectivity. Enterprises and organizations adopted Wi-Fi to create local networks within their premises, enabling wireless connectivity for devices and facilitating communication within a confined area. Wi-Fi allowed for the segmentation of networks, offering a level of privacy and control that was not as pronounced in traditional cellular networks.

However, Wi-Fi also came with its limitations. Coverage areas were confined, and the performance could be affected by interference and congestion, especially in densely populated areas. Moreover, the security protocols of Wi-Fi, while evolving, were not initially designed to meet the stringent requirements of certain industries, such as finance, healthcare, or defense.

Establishing and operating private networks before the advent of 5G technology posed significant financial challenges. The infrastructure required for a dedicated private network, including base stations, networking equipment, and spectrum allocation, incurred substantial upfront costs. Maintenance and operational expenses added to the financial burden, making it cost-prohibitive for many enterprises and organizations to invest in private network infrastructure.

Moreover, the complexity of managing and maintaining a private network, along with the need for specialized expertise, further elevated the costs. These challenges made it difficult for organizations to justify the investment in a private network, especially when commercial networks, despite their limitations, were more readily available and appeared more economically feasible.

The arrival of 5G technology has acted as a game-changer in the landscape of private networks. 5G offers the potential for enhanced performance, ultra-low latency, and significantly increased capacity. These capabilities address many of the limitations that were associated with Wi-Fi and earlier generations of cellular networks. The promise of 5G has prompted enterprises, industries, and government entities to reassess the feasibility of private networks, considering the potential benefits in terms of performance, security, and customization.

The growing trend of private networks can be attributed to several key factors:

  • Performance Customization: Private networks enable enterprises and organizations to customize their network performance according to specific needs. Unlike commercial networks that provide best effort performance for a diverse consumer base, private networks allow for tailored configurations that meet the unique demands of various industries
  • Security and Reliability: Security is paramount for many enterprises and government entities. Private networks offer a higher level of security compared to public networks, reducing the risk of cyber threats and unauthorized access. Additionally, the reliability of private networks ensures uninterrupted operations critical for sectors like finance, healthcare, and defense.
  • Critical IoT and Industry 4.0 Requirements: The increasing adoption of Industrial IoT (IIoT) and Industry 4.0 technologies necessitates reliable and low-latency connectivity. Private networks provide the infrastructure required for seamless integration of IoT devices, automation, and real-time data analytics crucial for modern industrial processes.
  • Capacity and Bandwidth Management: In sectors with high data demands, such as smart manufacturing, logistics, and utilities, private networks offer superior capacity and bandwidth management. This ensures that enterprises can handle large volumes of data efficiently, supporting data-intensive applications without compromising on performance.
  • Flexibility in Deployment: Private networks offer flexibility in deployment, allowing organizations to establish networks in remote or challenging environments where commercial networks may not be feasible. This flexibility is particularly valuable for industries such as mining, agriculture, and construction.
  • Compliance and Control: Enterprises often operate in regulated environments, and private networks provide greater control over compliance with industry-specific regulations. Organizations can implement and enforce their own policies regarding data privacy, network access, and usage.
  • Edge Computing Integration: With the rise of edge computing, private networks seamlessly integrate with distributed computing resources, reducing latency and enhancing the performance of applications that require real-time processing. This is particularly advantageous for sectors like healthcare, where quick data analysis is critical for patient care.

As a result of these factors, the adoption of private networks is rapidly becoming a prominent industry trend. Organizations across various sectors recognize the value of tailored, secure, and high-performance connectivity that private networks offer, leading to an increasing shift away from traditional reliance on commercial cellular networks. This trend is expected to continue as technology advances and industries increasingly prioritize efficiency, security, and customized network solutions tailored to their specific operational requirements.

With the transformative potential of 5G, these entities are now reevaluating the role of private networks, anticipating that the advancements in technology will make these networks more accessible, cost-effective, and aligned with their specific operational requirements.

Terms and conditions available on demand: patrick.lopez@coreanalysis.ca  

Monday, November 13, 2023

RAN Intelligence leaders 2023


RAN intelligence is an emerging market segment composed of RAN Intelligent Controllers (RICs) and their associated Apps. I have been researching this field for the last two years and after an exhaustive analysis of the vendors and operators offerings and strategies, I am glad to publish here an extract of my findings. A complete review of the findings and rankings can be found through the associated report or workshop (commercial products).

The companies that participated in this study are AccelleRAN, AIRA, Airhop, Airspan, Cap Gemini, Cohere Technologies, Ericsson, Fujitsu, I-S Wireless, Juniper, Mavenir, Nokia, Northeastern, NTT Docomo, Parallel Wireless, Radisys, Rakuten Symphony, Rimedo Labs, Samsung, Viavi and VMware.

They were separated in two overall categories:

  • Generalists: companies offering both RIC(s) and Apps implementations
  • Specialists: companies offering only Apps

The Generalist ranking is:



#1 Mavenir
#2 ex aequo Juniper and VMware
#4 Cap Gemini



The Specialists ranking is:



#1 Airhop
#2 Rimedo Labs
#3 Cohere Technologies



The study features a review of a variety of established and emerging vendors in the RAN space. RAN intelligence is composed of:

  • Non Real Time RIC - a platform for RAN intelligence whose processing and feedback loops to the underlying infrastructure take more than 1 second. This platform is an evolution of SON (Self Organizing Networks) systems, RAN EMS (Element Management Systems) and OSS (Operations Support Systems). The Non RT RIC is part of the larger SMO (Service Management and Orchestration) framework.
  • rApps - Applications built on top of the Non RT RIC platform.
  • Near Real Time RIC - a platform for RAN intelligence whose processing and feedback loops to the underlying infrastructure take less than 1 second. This platform is a collection of capabilities today embedded within the RUs (Radio Units), DUs (Distributed Units) and CUs (Centralized Units).
  • xApps - Applications built on top of the Near RT RIC platform.
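
To make the Non-RT / Near-RT split above concrete, here is a toy sketch of routing a control action by its feedback-loop latency budget. The sub-10 ms "real-time" floor follows the usual O-RAN convention; the function and labels are illustrative, not any vendor's API:

```python
# Illustrative only: route a RAN control action to the appropriate loop based
# on its required feedback latency. Thresholds follow the O-RAN convention
# (Near-RT: roughly 10 ms to 1 s; Non-RT: above 1 s); names are hypothetical.
def select_ric(loop_latency_s: float) -> str:
    """Pick the control loop for a given feedback-latency budget (seconds)."""
    if loop_latency_s < 0.01:
        # Below ~10 ms: handled in the DU/RU scheduler itself, outside RIC scope
        return "real-time (DU/RU internal, below RIC scope)"
    if loop_latency_s < 1.0:
        # 10 ms - 1 s: Near-RT RIC territory, implemented as xApps
        return "Near-RT RIC (xApp)"
    # Above 1 s: Non-RT RIC territory, implemented as rApps within the SMO
    return "Non-RT RIC (rApp)"

print(select_ric(0.05))   # Near-RT RIC (xApp)
print(select_ric(30.0))   # Non-RT RIC (rApp)
```

A handover-optimization loop reacting within tens of milliseconds would land in an xApp, while an energy-saving policy recomputed every few minutes belongs in an rApp.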
The vendors and operators were ranked on their strategy, vision and implementation across six dimensions, based on primary research from interviews, publicly available information, Plugfests participation and deployments observation:
  • Platform - the ability to create a platform and a collection of processes facilitating the developers' capability to create Apps that can be ported from one vendor to the other with minimum adaptation. Considerations were given to Apps lifecycle management, maturity of APIs / SDK, capability to create enabling apps / processes for hosted Apps.
  • Integrations / partnerships - one of the key tenets of Open RAN is multi-vendor, vendor-agnostic implementation. From this perspective, companies that have demonstrated their integration capabilities in multi-vendor environments, or the hosting of third-party applications, were ranked higher.
  • Non Real Time RIC - ranking the vision, implementation and maturity of the Non RT RIC capabilities.
  • Near Real Time RIC - ranking the vision, implementation and maturity of the Near RT RIC capabilities.
  • rApps - ranking the vision, implementation and maturity of the rApps offering
  • xApps - ranking the vision, implementation and maturity of the xApps offering

Friday, November 3, 2023

Telco edge compute, RAN and AI


In recent years, the telecommunications industry has witnessed a profound transformation, driven by the rapid penetration of cloud technologies. Cloud Native Functions (CNFs) have become common in the packet core, OSS / BSS and transport, and are making their way into the access domain, both fixed and mobile. With CNFs, virtual infrastructure management and data centers have become an important part of network CAPEX strategies.

While edge computing in telecoms, with the emergence of MEC (Multi-Access Edge Computing), has been mostly confined to telco network functions (UPF, RAN CU/DU...), network operators should now explore the opportunities for retail and wholesale edge computing services. My workshop examines in detail the strategies, technologies and challenges associated with this opportunity.

Traditional centralized cloud infrastructure is being augmented with edge computing, effectively bringing computation and data storage closer to the point of data generation and consumption.

What are the benefits of edge computing for telecom networks?

  • Low Latency: One of the key advantages of edge computing is its ability to minimize latency. This is of paramount importance in telecoms, especially in applications like autonomous vehicles, autonomous robots / manufacturing, and remote-controlled machinery.
  • Bandwidth Efficiency: Edge computing reduces the need for transmitting massive volumes of data over long distances, which can strain network bandwidth. Instead, data processing and storage take place at the edge, significantly reducing the burden on core networks. This is particularly relevant for machine vision, video processing and AI use cases.
  • Enhanced Security: Edge computing offers improved security by allowing sensitive data to be processed locally. This minimizes the exposure of critical information to potential threats in the cloud. Additionally, privacy, data sovereignty and residency concerns can be efficiently addressed by local storage / computing.
  • Scalability: Edge computing enables telecom operators to scale resources as needed, making it easier to manage fluctuating workloads effectively.
  • Simpler, cheaper devices: Edge computing allows devices to be cheaper and simpler while retaining sophisticated functionalities, as storage, processing can be offloaded to a nearby edge compute facility.

Current Trends in Edge Computing for Telecoms

The adoption of edge computing in telecoms is rapidly evolving, with several trends driving the industry forward:

  • 5G and private networks Integration: The deployment of 5G networks is closely intertwined with edge computing. 5G's high data transfer rates and low latency requirements demand edge infrastructure to deliver on its promises effectively. The cloud RAN and service based architecture packet core functions drive demand in edge computing for the colocation of UPF and CU/DU functions, particularly for private networks.
  • Network Slicing: Network operators are increasingly using network slicing to create virtualized network segments, allowing them to allocate resources and customize services for different applications and use cases.
  • Ecosystem Partnerships: Telcos are forging partnerships with cloud providers, hardware manufacturers, and application developers to explore retail and wholesale edge compute services.

Future Prospects

The future of edge computing in telecoms offers several exciting possibilities:
  • Edge-AI Synergy: As artificial intelligence becomes more pervasive, edge computing will play a pivotal role in real-time AI processing, enhancing applications such as facial recognition, autonomous drones, and predictive maintenance. Additionally, AI/ML is emerging as a key value proposition in a number of telco CNFs, particularly in the access domain, where RAN intelligence is key to optimize spectrum and energy usage, while tailoring user experience.
  • Industry-Specific Edge Solutions: Different industries will customize edge computing solutions to cater to their unique requirements. This could result in the development of specialized edge solutions for healthcare, manufacturing, transportation, and more.
  • Edge-as-a-Service: Telecom operators are likely to offer edge services as a part of their portfolio, allowing enterprises to deploy and manage edge resources with ease.
  • Regulatory Challenges: As edge computing becomes more integral to telecoms, regulatory challenges may arise, particularly regarding data privacy, security, and jurisdictional concerns.

New revenues streams can also be captured with the deployment of edge computing.

  • For consumers, the lowest-hanging fruit in the short term is likely gaming. While hyperscalers and gaming companies have launched their own cloud gaming services, their success has been limited due to poor online experience. The most successful game franchises are Massive Multiplayer Online titles. They pit dozens of players against each other and require very controlled latency between all players for fair and enjoyable gameplay. Only operators can provide controlled latency, if they deploy gaming servers at the edge. Even without a full-blown gaming service, providing game caching at the edge can drastically reduce download times for games, updates and patches, which dramatically increases players' satisfaction with the service.
  • For enterprise users, edge computing has dozens of use cases that can be implemented today that are proven to provide superior experience compared to the cloud. These services range from high performance cloud storage, to remote desktop, video surveillance and recognition.
  • Beyond operators-owned services, the largest opportunity is certainly the enablement of edge as a service (EaaS), allowing cloud developers to use edge resources as specific cloud availability zones.
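
One way to see why edge placement matters for the use cases above is raw propagation delay: light in fibre covers roughly 200 km per millisecond, so round-trip time grows with distance regardless of how fast the servers are. The distances below are illustrative assumptions, not measurements:

```python
# Rough, illustrative latency budget. One-way propagation over fibre is about
# 5 microseconds per km (speed of light in glass ~ 200,000 km/s); queuing and
# processing delays are ignored here.
def rtt_ms(distance_km: float, per_km_us: float = 5.0) -> float:
    """Round-trip propagation delay in milliseconds over the given distance."""
    return 2 * distance_km * per_km_us / 1000

print(f"Edge site   ~50 km: {rtt_ms(50):.1f} ms RTT")    # 0.5 ms
print(f"Regional ~1000 km: {rtt_ms(1000):.1f} ms RTT")   # 10.0 ms
```

Propagation alone puts a distant regional data center an order of magnitude behind a metro edge site, before any queuing or processing delay is counted.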
Edge computing is rapidly maturing in the telecom industry by enabling low-latency, high-performance, and secure services that meet the demands of new use cases. As we move forward, the integration of edge computing with 5G and the continuous development of innovative applications will shape the industry's future. Telecom operators that invest in edge computing infrastructure and capabilities will be well-positioned to capitalize on the opportunities presented by this transformative technology. 


Tuesday, October 3, 2023

Should regulators forfeit spectrum auctions if they can't resolve Net Neutrality / Fair Share?

I have been writing about Net Neutrality and Fair Share broadband usage for nearly 10 years. Both sides of the argument have merit, and it is difficult to find a balanced view represented in the media these days. Absolutists would lead you to believe that internet usage should be unregulated, with everyone able to stream, download and post anything anywhere, without respect for intellectual property or fair usage; while on the other side of the fence, service provider dogmatists would like to control, apportion, prioritize and charge based on their interests.

Of course, the reality is a little more nuanced. A better understanding of the nature and evolution of traffic, as well as the cost structure of networks help to appreciate the respective parties' stance and offer a better view on what could be done to reduce the chasm.

  1. From a costs structure's perspective first, our networks grow and accommodate demand differently whether we are looking at fixed line / cable / fibre broadband or mobile. 
    1. In the first case, capacity growth is function of technology and civil works. 
      1. On the technology front, the evolution from dial-up / PSTN to copper and then fiber dramatically increases the network's capacity and has followed ~20-year cycles. The investments are enormous and require the deployment and management of central offices and their evolution into edge compute data centers. These investments happen in waves within a relatively short time frame (~5 years). Once in operation, the return on investment is a function of the number of users and the utilisation rate of the asset, which in this case means filling the network with traffic.
      2. On the civil works front, throughout the technology evolution, continuous work is ongoing to lay transport fiber along new housing developments, while replacing antiquated and aging copper or cable connectivity. This is a continuous burn and its run rate is a function of the operator's financial capacity.
    2. In mobile networks, you can find similar categories but with a much different balance and impact on ROI.
      1. From a technology standpoint, the evolution from 1G to 5G has taken roughly 10 years per cycle. A large part of the investment for each generation is a spectrum license leased from the regulator / government. In addition, most network elements, from the access to the core and OSS / BSS, need to be changed. The transport part relies in large part on the fixed network above. Until 5G, most of these elements were proprietary servers and software, which meant a generational change induced a complete forklift upgrade of the infrastructure. With 5G, the separation of software and hardware, the extensive use of COTS hardware and the cloud-based separation of user and control planes should mean that the next generational upgrade will be less expensive, with only software and part of the hardware necessitating a complete refresh.
      2. The civil work for mobile network is comparable to the fixed network for new coverage, but follows the same cycles as the technology timeframe with respect to upgrades and changes necessary to the radio access. Unlike the fixed network, though, there is an obligation of backwards compatibility, with many networks still running 2G, 3G, 4G while deploying 5G. The real estate being essentially antennas and cell sites, this becomes a very competitive environment with limited capacity for growth in space, pushing service providers to share assets (antennas, spectrum, radios...) and to deploy, whenever possible, multi technology radios.
The conclusion here is that fixed networks have long investment cycles and ROI, low margins, and rely on the number of connections and traffic growth. Mobile networks have shorter investment cycles, with bursty margin growth and contraction with each new generation.

What does this have to do with Net Neutrality / Fair Share? I am coming to it, but first we need to examine the evolution of traffic and prices to understand where the issue resides.

Now, in the past, we had to pay for every single minute, text or kilobyte received or sent. Network operators were making money off traffic growth and were pushing users and content providers to fill their networks. Video somewhat changed that. A user watching a 30-second video doesn't really care or perceive whether the video is at 720p, 1080p or 4K, 30 or 60 fps. It is essentially the same experience. That same video, though, can vary 20x in size depending on its resolution. To compound the issue, operators foolishly transitioned to all-you-can-eat data plans with 4G to acquire new consumers, a self-inflicted wound that has essentially killed their 5G business case.
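To put numbers on that elasticity, here is a back-of-the-envelope sketch. The bitrates are rough, assumed values for a typical modern codec, not measurements; real encodes vary widely with content and settings:

```python
# Rough, assumed per-resolution bitrates (kbit/s) for a typical codec.
TYPICAL_BITRATE_KBPS = {
    "480p": 1_000,
    "720p": 2_500,
    "1080p": 5_000,
    "4K": 20_000,
}

def video_size_mb(resolution: str, duration_s: int = 30) -> float:
    """Size in megabytes: bitrate (kbit/s) x duration / 8 bits per byte."""
    return TYPICAL_BITRATE_KBPS[resolution] * duration_s / 8 / 1000

for res in TYPICAL_BITRATE_KBPS:
    print(f"{res}: {video_size_mb(res):.2f} MB for a 30-second clip")

# Same clip, same perceived experience on a phone, 20x the bytes on the network.
print(video_size_mb("4K") / video_size_mb("480p"))  # 20.0
```

The exact figures matter less than the ratio: the operator carries an order of magnitude more traffic for a marginal gain in perceived quality on a small screen.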

I have written at length about the erroneous assumptions that are underlying some of the discourses of net neutrality advocates. 

In order to understand net neutrality and traffic management, one has to understand the different perspectives involved.
  • Network operators compete against each other on price, coverage and, more importantly, network quality. In many cases, they have identified that improving or maintaining quality of experience is the single most important success factor for acquiring and retaining customers. We have seen it time and again with voice services (call drops, voice quality…), messaging (texting capacity, reliability…) and data services (video start, stalls, page loading time…). These KPIs are at the heart of the operator's business. As a result, operators tend to improve or control user experience by deploying an array of traffic management functions.
  • Content providers assume that highest quality of content (8K UHD for video for instance) equals maximum experience for subscriber and therefore try and capture as much network resource as possible to deliver it. Browser / apps / phone manufacturers also assume that more speed equals better user experience, therefore try to commandeer as much capacity as possible. 
The flaw here is the assumption that the optimum is the product of many maxima self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

This behavior leads to a network where resources can be in contention and all end-points vie for priority and maximum resource allocation. From this perspective one can understand that there is no such thing as "net neutrality" at least not in wireless networks. 

When network resources are over-subscribed, decisions are taken as to who gets more capacity, priority, speed... The question becomes who should be in a position to make these decisions. Right now, the laissez-faire approach to net neutrality means that the network is not managed; it is subjected to traffic. When in contention, resources manage traffic based on obscure rules in load balancers, routers, base stations, traffic management engines... This approach is the result of lazy, surface-level thinking. Net neutrality should be the opposite of non-intervention. Its rules should be applied equally to networks, devices / apps / browsers and content providers if what we want to enable is fair and equal access to resources.

As we are contemplating 6G, with hints of metaverse, augmented / mixed reality and hyper-connectivity, the cost structure of network infrastructure hasn't yet been sufficiently decoupled from traffic growth and, as we have seen, video is elastic and XR will be a heavy burden on the networks. Network operators have so far essentially failed to offer attractive digital services that would monetize their network investments. Video and digital service providers already pay for their on-premise and cloud infrastructure as well as transport; there is little chance they would finance telco operators' capacity growth.

Where does this leave us? It might be time for regulators / governments to either take an active and balanced role in Net Neutrality and Fair Share, to ensure that both sides can find a sustainable business model, or to forfeit spectrum auctions for next generations.

Wednesday, November 22, 2017

Video of the presentation at TIP 2017: Telefonica's Internet para todos


This is the video describing the project "internet para todos", connecting the unconnected in LatAm.
I present the industry trends and constraints that force telcos to reexamine their model, and the necessary changes in the value chain and the technology to enable ultra-low-cost versatile networks to connect the unconnected.





Internet Para Todos: Connecting the Unconnected in LATAM

Patrick Lopez, VP, Networks Innovation, Telefonica



Monday, June 13, 2016

Time to get out of consumer market for MNOs?

I was delivering a workshop on SDN / NFV in wireless last week at a major pan-European tier-one operator group, and the questions of encryption and net neutrality were put on the table again.

How much clever, elastic, agile software-defined traffic management can we really expect when "best effort" dictates the extent of traffic management, and encryption makes even understanding traffic composition and velocity difficult?

There is no easy answer. I have spoken at length on both subjects (here and here, for instance) and the challenges have not changed much. Encryption is still a large part of traffic and, although it is not growing as fast as initially expected after Google, Netflix, Snapchat or Facebook's announcements, it is still a dominant part of data traffic. Many start to think that HTTPS / SSL is a first-world solution, as many small and medium-scale content or service providers that live on freemium or ad-sponsored models can't afford the additional cost and latency unless they are forced to. Some think that encryption levels will hover around 50-60% of the total until mass adoption of HTTP/2, which could take 5+ years. We have seen, with T-Mobile's Binge On, a first service launch that actively manages traffic, even encrypted, to an agreed-upon quality level. Net neutrality activists cried foul at the launch of the service, but quickly retreated when they saw its popularity and the first tangible signs of collaboration between content providers, aggregators and operators for customers' benefit.

As mentioned in the past, the problem is not technical, moral or academic. Encryption and net neutrality are just symptoms of an evolving value chain where the players are attempting to position themselves for dominance. The solution will be commercial and will involve collaboration in the form of content metadata exchange, to monitor, control and manage traffic. Mobile Edge Computing can be a good enabler in this. Mobile advertising, which is still missing over $20B in investment in the US alone when compared to other media and time spent / eyeball engagement, will likely be part of the equation as well.

...but what happens in the meantime, until the value chain realigns? We have seen consumer postpaid ARPU declining in most mature markets for the last few years, while engagement and usage of so-called OTT services have exploded. Many operators continue to keep their heads in the sand, thinking "business as usual" while timidly investigating potential new "revenue streams".

I think that the time has come for many to wake up and take hard decisions. In many cases, operators are not equipped organizationally or culturally for the transition that is necessary to flourish in a fluid environment where consumers flock to services that are free, freemium, or ad-sponsored. What operators know best, subscription services, is under intense price pressure because OTTs look at usage and penetration at a global level, rather than per country. For those operators who understand the situation and are changing their ways, the road is still long and strewn with obstacles, particularly on the regulatory front, where they do not play by the same rules as their OTT competition.

I suggest here that for many operators, it is time to get out. You had a good run, made lots of money on consumer services through 2G, 3G and early 4G; the next dollars or euros are going to be tremendously more expensive to earn than the earlier ones.
At this point, I think there are emerging and underdeveloped verticals (such as enterprise and IoT) that are easier to penetrate (less regulatory barriers, more need for managed network capabilities and at least in the case of enterprise, more investment possibilities).
I think that at this stage, any operator who derives most of its revenue from consumer services should assume that these will likely dwindle to nothing unless drastic operational, organizational and cultural changes occur.
Some operators see the writing on the wall and have started the effort. There is no guarantee that it will work, but having a software-defined, virtualized, elastic network will certainly help if they are betting the farm on service agility. Others are looking at new technologies, open source and standards as they have done in the past, aligning little boxes from industry vendors in neat PowerPoint roadmap presentations, hiring a head of network transformation or virtualization... For them, I am afraid, reality will come hard and fast. You don't invest in technologies to build services. You build services first and then look at whether you need more or new technologies to enable them.

Monday, April 4, 2016

MEC 2016 Executive Summary

2016 sees a sea change in the fabric of the mobile value chain. Google is reporting that mobile search revenue now exceeds desktop, whereas 47% of Facebook members are now exclusively on mobile, generating 78% of the company's revenue. It has taken time, but most OTT services that were initially geared towards the internet are rapidly transitioning towards mobile.

The impact is still to be felt across the value chain.

OTT providers have a fundamentally different view of services and value different things than mobile network operators. While mobile networks have been built on the premise of coverage, reliability and ubiquitous access to metered network-based services, OTTs rely on free, freemium, ad-sponsored or subscription-based services where fast access and speed are paramount. Increases in latency impact page load and search times and can cost OTTs billions in revenue.

The reconciliation of these views and the emergence of a new coherent business model will be painful but necessary and will lead to new network architectures.

Traditional mobile networks were originally designed to deliver content and services that were hosted on the network itself. The first mobile data applications (WAP, multimedia messaging…) were deployed in the core network, as a means to be both as close as possible to the user but also centralized to avoid replication and synchronization issues.
3G and 4G networks still bear the design associated with this antiquated distribution model. As technology and user behaviours have evolved, a large majority of content and services accessed on cellular networks today originate outside the mobile network. Although content is now stored and accessed from clouds, caches, CDNs and the internet, a mobile user still has to go through the internet, the core network, the backhaul and the radio network to get to it. Each of these steps sees a substantial decrease in throughput capacity, from hundreds of Gbps down to Mbps or less. Additionally, each hop adds latency to the process. This is why networks continue to invest in increasing throughput and capacity. Streaming a large video or downloading a large file from a cloud or the internet is a little bit like trying to suck ice cream through a 3-foot bending straw.

Throughput and capacity will certainly grow tremendously with the promise of 5G networks, but latency remains an issue. Reducing latency requires reducing the distance between the consumer and where content and services are served. CDNs and commercial specialized caches (Google, Netflix…) have been helping reduce latency in fixed networks by caching content as close as possible to where it is consumed, through the propagation and synchronization of content across Points of Presence (PoPs). Mobile networks' equivalent of PoPs are the eNodeBs, RNCs or cell aggregation points. These network elements, part of the Radio Access Network (RAN), are highly proprietary, purpose-built platforms that route and manage mobile radio traffic. Topologically, they are the closest elements mobile users interact with when accessing mobile content. Positioning content and services there, right at the edge of the network, would substantially reduce latency.
For the first time, there is an opportunity for network operators to offer OTTs what they will value most: ultra-low latency, which will translate into a premium user experience and increased revenue. This will come at a cost, as physical and virtual real estate at the edge of the network will be scarce. Net neutrality will not work at the scale of an eNodeB, as commercial law will dictate the few application and service providers that will be able to pre-position their content.

Mobile Edge Computing provides the ability to deploy commercial-off-the-shelf (COTS) IT systems right at the edge of the cellular network, enabling ultra-low latency, geo-targeted delivery of innovative content and services. More importantly, MEC is designed to create a unique competitive advantage for network operators derived from their best assets, the network and the customers’ behaviour. This report reviews the opportunity and timeframe associated with the emergence of this nascent technology and its potential impact on mobile networks and the mobile value chain.

Friday, March 18, 2016

For or against Adaptive Bit Rate? part V: centralized control

I have seen much speculation and many claims over the last few weeks around T-Mobile's Binge On service launch, and these have accelerated with yesterday's announcement of Google Play and YouTube joining the service. As usual, many are getting on their net neutrality battle horse, using fraught assumptions and misconceptions to reject the initiative.

I have written at length about what ABR is and what its pros and cons are; you can find some extracts in the links at the end of this post. I'll try here to share my views and expose some facts to enable a more pragmatic approach.

I think we can safely assume that every actor in the mobile video delivery chain wants to enable the best user experience for users, whenever possible.
As I have written in the past, in the current state of affairs, adaptive bit rate is oftentimes corrupted in order to seize as much network bandwidth as possible, which results in devices and service providers aggressively competing for bits and bytes.
Content providers assume that highest quality of content (1080p HD video for instance) equals maximum experience for subscriber and therefore try and capture as much network resource as possible to deliver it. Browser / apps / phone manufacturers also assume that more speed equals better user experience, therefore try to commandeer as much capacity as possible. The flaw here is the assumption that the optimum is the product of many maxima self regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

An OTT cannot know why a user's session downstream speed is degrading; it can only report it. Knowing why is important because it enables better decisions in terms of the corrective actions that need to be undertaken to preserve the user's experience. For instance, a reduction of bandwidth for a particular user can be the result of a handover (4G to 3G, or between cells with different capacity), congestion in a given cell, the distance between the phone and the antenna, the user entering a building or an elevator, or the user reaching her data cap and being throttled, etc. Reasons can be multiple and, for each of them, a corrective action can have a positive or a negative effect on the user's experience. For instance, in a video streaming scenario, you can have a group of people in a given cell streaming Netflix and others streaming YouTube. Naturally, the video streamed is in progressive-download adaptive bit rate format, which means that each stream will try to climb to the highest available download bit rate to deliver the highest video definition possible. All sessions will theoretically increase the delivered definition up to the highest available, or the highest delivery bit rate available, whichever comes first. In a network with much capacity, everyone ramps up to 1080p and everyone has a great user experience.

More often than not, though, that particular cell cannot accommodate everyone's stream at the highest definition at the same time. Adaptive bit rate is supposed to help there again, by stepping down definition until it fits within the available delivery bit rate. It unfortunately can't work like that when we are looking at multiple sessions from multiple OTTs. Specifically, as soon as one player starts reducing its definition to meet a lower delivery bit rate, that freed-up bandwidth is grabbed by the other players, which can now increase their definition even further. There is no incentive for a content provider to reduce bandwidth quickly to follow network conditions, because it can become starved by its competition in the same cell.
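The dynamic described above can be sketched with a toy model. The bitrate ladder, cell capacity and greedy player behaviour below are illustrative assumptions, not real player logic:

```python
# Toy model: two ABR players greedily choosing ladder rungs in one cell.
LADDER = [1.0, 2.5, 5.0]   # assumed Mbps rungs: roughly 480p, 720p, 1080p
CAPACITY = 6.0             # assumed shared cell capacity, Mbps

def greedy_step(others_total: float) -> float:
    """Pick the highest rung that fits in the capacity the others leave free."""
    spare = CAPACITY - others_total
    fitting = [r for r in LADDER if r <= spare]
    return fitting[-1] if fitting else LADDER[0]

a, b = 2.5, 2.5            # both players stream 720p; the cell is nearly full
a = 1.0                    # player A steps down to 480p, sensing congestion
b = greedy_step(a)         # B grabs the freed capacity and jumps to 1080p
a = greedy_step(b)         # A is now starved: only the 1.0 Mbps rung fits
print(a, b)                # 1.0 5.0
```

Player A's polite back-off is immediately punished: B captures the freed bandwidth, and A is locked at the bottom rung, which is exactly the incentive problem described above.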

The solution here is simple, the delivery of ABR video content has to be managed and coordinated between all providers. The only way and place to provide this coordination is in the mobile network, as close to the radio resource as possible. [...]

This and more in my upcoming Mobile Edge Computing report.


Part I: What is ABR?
Part II: For ABR
Part III: Why isn't ABR more successful?
Part IV: Alternatives

Tuesday, February 9, 2016

What's wrong with the mobile value chain


I have been compiling quarterly reviews for the retainer clients of my mobile video and SDN/NFV practices and it is clear that there is big gap between consumers interests and the mobile industry revenue model.

Video apps T-Mobile customers can stream for free.
Mobile data traffic continues to grow unabated, fueled by social media and by web service providers moving towards a mobile-first and, in some cases, mobile-only strategy.
Did you know that close to half of Facebook users are exclusively mobile, generating over three quarters of the company's revenue? That half of YouTube views are on mobile devices? That nearly half of Netflix members under 34 watch from a mobile device?
Most of these services are free, advertising-based or subscription-based. They do not rely on usage (time, gigabytes or access) for monetization. As they transition to mobile networks, they do not change their business model, and mobile network operators are left bereft, trying to figure out how their traditional per-byte/minute/message model fits in this new paradigm.

... well, here is a hint: it doesn't.

We have seen recently how T-Mobile in the US is now allowing zero rated video streaming in exchange for a quality cap at 480p in its Binge On service. Verizon has answered in kind just this week with its go90 service.

These might appear to be popular and innovative moves, but they are just "tricks" to acquire and secure high-ARPU postpaid LTE subscribers, akin to the unlimited voice / data packages we see flourish every time a challenger MNO with an empty network tries to aggressively churn its competition.

These tricks are shortsighted and won't help MNOs reconcile the fact that their costs keep increasing while their revenue from traditional services decreases. I am convinced that by 2020, we will see operators or MVNOs with free, or close to free, voice, data and messaging services. What will they do then?

Most MNOs have identified that mobile video and Internet of Things are their largest revenue making opportunity in the medium term. 
Internet of Things can mean a lot of different things but seems too uncertain and immature to build a solid strategy on for a while. There are too many conflicting standards and initiatives, from too many established vendors and start-ups, to make sense of it and create a mass-market business in the short term.
Mobile video, by contrast, is close to a mature market and technology. It is appalling that most network operators have such a poor grasp of it. Take mobile advertising, for instance.
We have just established that nearly half of all digital content is consumed on a mobile device.
2015 was the first year digital advertising spend exceeded broadcast, with $42 billion vs $40 billion. Mobile barely registered, with only $7 billion. Although growing, mobile advertising is only 21% of the advertising spend in the US. Advertisers have identified that mobile video is their largest medium opportunity to reach their most important target demographics (high net worth and youth).
How is it that you have 50% of eyeballs on a medium that draws only 21% of ad spend?
Well, there are several reasons for that, but first and foremost, it is because mobile video is such a poor service, with many vendors and observers reporting slow start times of 3 to 5 seconds on cellular and WiFi, and abandonment rates ranging from 15 to 25%. Network operators' poor understanding of video as a technology and advertising as a model leads to poor video service quality, which yields poor video advertising returns. There are potential strategies that could help there, but there isn't much movement on the MNOs' front. Most initiatives in this space come from OTTs and vendors.

In any case, if mobile video advertising is to reach its potential (80% of mobile advertising, which should be at least equal to digital) and create $33 billion of spend, MNOs had better start treating it seriously. Measuring and managing video QoE becomes key and, while you are at it, if video makes up 75% of your network's traffic, you might as well build a video network that happens to do voice, messaging and browsing, rather than the other way around... Just saying.

In the future, consumers, service providers and OTTs will value a network that can deliver and guarantee the best video quality much more than anything else.

Tuesday, February 2, 2016

How to Binge On?

So... you have been surprised, excited, curious about the T-Mobile US Binge On launch.
The innovative service defines new cooperative models with so-called OTTs by blending existing and new media manipulation technologies.

You are maybe wondering whether it would work for you? Do you know what it would take for you to launch a similar service?
Here is a quick guide of what you might need if you are thinking along those lines.


The regulatory question

First, you probably wonder whether you can even launch such a service. Does it contravene any net neutrality rule? The answer might be hard to find. Most net neutrality provisions are vague, inaccurate or downright technologically impossible to enforce, so when launching a new service, the best one can have is an opinion.
MNOs have essentially two choices: either launch endless minute variations of existing services, or launch innovative services. The latter strategy will always carry a measure of risk, but MNOs can't aspire to be disruptive without risk-taking. In this case, the risk is fairly limited, provided that the service is voluntary, with easy opt-in / opt-out. There will always be challenges, even legal ones, to that operating assumption, but operators have to accept that as part of the cost of innovation. In other words, if you want to create new revenue streams, you have to grow some balls and take some risks; otherwise, just be a great network and abandon any ambition to sell services.


The service

For those not familiar with Binge On, here is a quick overview. Binge On allows any new or existing subscriber with a 3GB data plan or higher to stream videos for free from over 40 popular content providers, including Netflix, Hulu, HBO and ESPN.
The videos are zero rated (do not count towards the subscriber's quota) and are limited to 480p definition.
The service is free.

The content

Obviously, in the case of Binge On, the more content providers with rich content sign on for the service, the richer and more attractive the offering. T-Mobile has been very smart to entice some of the most popular video services to sign on for Binge On. Netflix and HBO have a history of limited collaboration with a few network operators, but no MNO to date has been able to create such a rich list of video partnerships.
Experience proves that the key to successful video services is breadth, depth and originality of the content. In this case, T-Mobile has decided not to intervene in content selection, simply allowing some of the most popular video services to participate in the service.
Notably, Facebook, Twitter, Google, Apple and Amazon properties are missing, with YouTube claiming technical incompatibility as the reason it does not participate.

The technology

What does the service entail technically? The first capability a network needs to enable such a service is to discriminate content from a participating video provider from other services. In some cases, when traffic is not encrypted, it is just a matter of creating a rule in the DPI or web gateway engine to apply zero rating to a specific content / origin.

Picking Netflix traffic out of the rest of the videos is not necessarily simple, since many premium video service providers deliver their service over encrypted protocols to avoid piracy or privacy issues. The result is that a level of integration is necessary for the network to unambiguously detect a video session from Netflix.

In this case, unencrypted metadata in the headers can be used to identify the service provider and even the content. That is not all, though, as conceivably some services might not be exclusively video. If we imagine a service like Facebook being part of Binge On, the network now needs to separate browsing traffic from video. This can be achieved with traffic management platforms that usually deploy heuristics or algorithms to segregate traffic from the same source, looking at packet size, session duration, packet patterns, etc.
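As an illustration of that kind of heuristic, here is a hypothetical flow classifier. The feature names and thresholds are invented for this sketch; real traffic management engines use far richer signal sets and statistical models:

```python
from dataclasses import dataclass

@dataclass
class FlowStats:
    duration_s: float      # how long the flow has been active
    mean_pkt_bytes: float  # average downstream packet size
    bytes_per_s: float     # sustained downstream throughput

def looks_like_video(f: FlowStats) -> bool:
    """Hypothetical rule: long-lived flows with near-MTU packets and a
    sustained streaming-like rate are labeled as video."""
    return (f.duration_s > 10
            and f.mean_pkt_bytes > 1200
            and f.bytes_per_s > 100_000)

print(looks_like_video(FlowStats(120, 1400, 400_000)))  # True: streaming-like
print(looks_like_video(FlowStats(3, 600, 20_000)))      # False: browsing-like
```

The point is not the specific thresholds but the approach: even without decrypting payloads, flow shape alone separates bulk streaming from bursty browsing reasonably well.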

Now that you are able to discriminate the content from participating partners, you need to tie it to the subscribers who have opted in or out of the service. This is usually performed in the PCRF charging function or in the EPC where the new service is created. A set of rules is assembled to associate the list of content providers with a zero-rated class of service, and to associate a subscriber class with these services. The subscriber class is a toggled setting in the subscriber profile that resides in the subscriber database. As a subscriber starts an HBO episode, the network detects that this service is part of Binge On, looks up whether that user is subscribed to the service, and applies the corresponding rate code. As a result, the amount of data consumed in the session is either deducted from the subscriber's quota or not, depending on whether she is a Binge On user.
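The rating decision described above could be sketched as follows. The provider names, function name and rule structure are hypothetical; in a real network this logic lives in the PCRF / charging system, not in application code:

```python
# Hypothetical list of participating providers (illustrative only).
ZERO_RATED_PROVIDERS = {"netflix", "hulu", "hbo", "espn"}

def rate_code(provider: str, subscriber_opted_in: bool) -> str:
    """Zero-rate only when BOTH conditions hold: the provider participates
    in the service AND the subscriber profile has the opt-in toggle set."""
    if subscriber_opted_in and provider in ZERO_RATED_PROVIDERS:
        return "zero-rated"   # traffic is not deducted from the quota
    return "metered"          # traffic counts against the quota

print(rate_code("hbo", True))      # zero-rated
print(rate_code("hbo", False))     # metered: user did not opt in
print(rate_code("youtube", True))  # metered: provider not participating
```

This mirrors the lookup sequence in the paragraph above: classify the session's origin, check the subscriber's toggle, then apply the rate code.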

We are almost done.

The big gamble taken by T-Mobile is that customers will trade unlimited quality for unlimited content. Essentially, the contract is that those who opt in to Binge On will be able to stream unlimited video from participating providers, on the condition that the video's definition is limited to 480p. In many cases, this is an acceptable quality for phones and tablets, as long as you do not tether the video to a laptop or a TV.
That limitation is the quid pro quo that T-Mobile is enforcing, allowing them to be able to have cost and service quality predictability.

That capability requires more integration between the content provider and T-Mobile. 480p is an objective video display target, usually describing a 640 x 480-pixel picture size. Videos encoded at that definition will vary in size depending on the codec used, the number of frames per second and other parameters.

Most premium video providers in Binge On deliver their videos using adaptive bit rate, essentially offering a number of possible streams ranging from low to high definition. In this case, T-Mobile and the content provider have to limit the format to 480p or below. This could be done by the content provider, of course, since it has all the formats. They could decide to send only the 480p and lower versions, but that would be counterproductive: the content provider does not know whether the subscriber has opted in to Binge On, and that information, which belongs to T-Mobile, cannot be freely shared.
As a result, content providers are sending the video in their usual definition, leaving T-Mobile with the task to select the right format.

There are several ways to achieve that. The simplistic approach is just to limit the delivery bit rate so that the phone can never select more than 480p. This is a hazardous approach, because 480p encoding can result in delivery bit rate demands ranging from 700 kbps to 1.5 Mbps depending on the codec being used. This range is too wide for T-Mobile to provide any guarantee. Set the limit too low and some providers will never achieve 480p. Set it too high and subscribers will see fluctuating quality, with even 720p or 1080p formats slipping through.
The best way to achieve the desired result is to intercept the adaptive bit rate manifest delivered by the content provider at the establishment of the session and strip out all definitions above 480p. This guarantees that the video will never be delivered above 480p but can still fluctuate based on network's congestion. This can be achieved either with a specialized video optimization platform or in some of the more advanced EPC.
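A minimal sketch of that manifest-stripping step, assuming an HLS master playlist purely for illustration (real deployments would also cover DASH manifests, absolute URIs and many edge cases):

```python
import re

def strip_above_480p(master_playlist: str) -> str:
    """Drop every variant whose advertised vertical resolution exceeds 480px,
    so the client can never select a higher definition."""
    lines = master_playlist.strip().splitlines()
    out, skip_next = [], False
    for line in lines:
        if skip_next:                 # drop the URI line of a removed variant
            skip_next = False
            continue
        m = re.search(r"RESOLUTION=\d+x(\d+)", line)
        if m and int(m.group(1)) > 480:
            skip_next = True          # drop this #EXT-X-STREAM-INF and its URI
            continue
        out.append(line)
    return "\n".join(out)

MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=640x480
low.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high.m3u8"""

print(strip_above_480p(MASTER))
# Only the 640x480 variant survives in the rewritten playlist.
```

The client still adapts normally among the remaining rungs, which is why this approach preserves ABR's resilience to congestion while enforcing the quality cap.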

As we can see, the service is sophisticated and entails several steps. A network's capacity to deploy such a service is directly linked to its ability to link and instantiate services and network functions in an organic manner. Only the most innovative EPC, traffic detection and video management vendors can provide the flexibility and cost effectiveness needed to launch such a service.


Tuesday, January 26, 2016

2015 review and 2016 predictions

As is now customary, I try to grade what I was predicting for 2015 and see what panned out and what didn't. I'll share as well what I see for 2016.

Content providers, creators, aggregators:

"They will need to simultaneously maximize monetization options by segmenting their user base into new price plans and find a way to unlock value in the mobile market. While many OTT, particularly social networks and radio/audio streaming, have collaborated and signed deals with mobile network operators, we are seeing also a tendency to increasingly encrypt and obfuscate online services to avoid network operators meddling in content delivery."
On that front, I think that both predictions held true. I was envisioning encryption to jump from 10 to 30% of overall data traffic and I got that wrong; in many mature markets, where Netflix is big on mobile, we see upwards of 50% of traffic being encrypted. I still claim some predictive credit here, with one of my first posts indicating the encryption trend two years before it started in earnest.

The prediction about segmentation through pricing as OTT services mature has also largely been fulfilled, with YouTube's 4th attempt, by my count, to launch a paid service. Additionally, the trend of content aggregators investing in original content rights acquisition is accelerating, with Amazon gearing up for movie theaters and Netflix outspending traditional providers such as the BBC, with a combined investment by both companies estimated in the $9Bn range. Soon, we are talking real money.


In 2016, we will see an acceleration of traditional digital services that were originally launched for fixed-line internet transitioning to predominantly mobile or mobile-only plays. Right now, 47% of Facebook users access the service exclusively through mobile, and they account for 78% of the company's revenue. More than 50% of YouTube views are on mobile devices, and the corresponding revenue is growing over 100% year on year. 49% of Netflix's 18-to-34-year-old demographic watches the service on mobile devices. We have seen signs with Twitter's Vine and Periscope, as well as Spotify, MTV and Facebook, that the battlefield will be video services.


Network operators: Wholesaler or value providers?

The operators in 2016 are, as a community, still as confused as in 2015. They perceive threats from each other, which causes many acquisitions; from OTTs, which causes in equal measure many partnerships and ill-advised service launches; and from regulatory bodies, which causes lawyers to fatten up at the net neutrality / privacy buffet.
"we will see both more cooperation and more competition, with integrated offering (OTT could go full MVNO soon) and encrypted, obfuscated traffic on the rise". 
We spoke about encryption; the OTT going full MVNO was somewhat fulfilled by Google's disappointing Project Fi launch. On the cooperation front, we have seen a flurry of announcements, mostly centered around sponsored data or zero-rated subscription services from Verizon and AT&T.
"We will probably also see the first lawsuits from OTT to carriers with respect to traffic mediation, optimization and management. " 
I got that half right. No lawsuits from content providers, but heavy fines from regulators on operators who throttle, cap or prioritize content (Sprint, AT&T, ...).

As for digital services, network operators are gearing up to compete on video with offerings such as mobile TV / LTE broadcast (AT&T, EE, Telekom Slovenije, Vodafone), event streaming (China Telecom, ...), and sponsored data / zero-rated subscription services (Verizon, T-Mobile's Binge On, Sprint, AT&T, Telefonica, ...).

"Some operators will seek to actively manage and mediate the traffic transiting through their networks and will implement HTTPS / SPDY proxy to decrypt and optimize encrypted traffic, wherever legislation is more supple."
I got that dead wrong. Despite interest and trials, operators are not ready to go into open battle with OTTs just yet. Decrypting encrypted traffic is outright illegal in many countries, or at the very least hostile, and seems to be expected only from government agencies...



Mobile Networks Technology

"CAPEX will be on the rise overall with heterogeneous networks and LTE roll-out taking the lion's share of investments. LTE networks will show signs of weakness in terms of peak traffic handling, mainly due to video and audio streaming, and some networks will accelerate LTE-A investments or aggressively curb traffic through data caps, throttles and onerous pricing strategies."
Check and check.
"SDN will continue its progress as a back-office and lab technology in mobile networks, but its incapacity to provide reliable, secure, scalable and manageable network capability will prevent it from making a strong commercial debut in wireless networks. 2018 is the likeliest time frame."
I maintain the view that SDN is still too immature for mass deployment in mobile networks. Although we have seen encouraging trials moving from lab to commercial, we are still a long way, from both a business case and a technology maturity standpoint, from seeing a mobile network core or RAN running exclusively or mostly on SDN.
"NFV will show strong progress and first commercial deployments in wireless networks, but in vertical, proprietary fashion, with legacy functions (DPI, EPC, IMS...) translated in a virtualized environment in a mono vendor approach. "
We have seen many examples of that this year, with various levels of industry and standards support, from Connectem, Affirmed Networks, Ericsson, Cisco and Huawei.

"Orchestration and integration with SDN will be the key investments in the standardization community. The timeframe for mass market interoperable multi vendor commercial deployment is likely 2020."
Orchestration, and MANO in particular, has certainly driven many initiatives (Telefonica's OpenMANO) and acquisitions (Ciena acquired Cyan, for example) and remains the key challenge in 2016 and beyond. SDN and NFV will not take off unless there is a programmatic framework to link customer-facing services to internal services, to functions, to virtual resources and to hardware resources in a multi-vendor fashion. I still maintain 2020 as the probable target for this.

In 2016, the new bit of technology I will investigate is Mobile Edge Computing: the capacity to deploy COTS compute in the radio network, allowing virtualized services to be positioned at the network's edge and enabling IoT, automotive, augmented reality or virtual reality services that require minimal latency to access content even faster.


In conclusion, 2016 shows more than ever that the house of cards is about to collapse. Data traffic is increasing fast, video now dominates every network, and it is just the start. With 4K and then 8K around the corner, not to mention virtual and augmented reality, many of the players in the value chain understand that video is going to be the next few years' battlefield in mobile, OTT and cloud services. This is why we are seeing so much consolidation and so many pivot strategies in the field.

What is new is the fact that where mobile was a secondary concern or barely on the radar for many so-called OTTs, it has now emerged as the predominant, if not exclusive, market segment in revenue.
This means that more pressure will rain down on network operators to deliver bandwidth and speed. My reports and workshops show that mobile advertising is not growing fast enough compared with the pace at which subscribers' eyeballs are moving to mobile screens. This is mostly because video on mobile networks remains a fairly low-quality experience, which will get worse as more subscribers transition to LTE. The key to unlocking the value chain will be collaboration between operators and OTTs, and that will only happen if/when a profitable business model and apportioning of costs is worked out.

At last, my prediction about selfie kills seems, unfortunately, to have been fulfilled, with selfies now killing more people than shark attacks. Inevitably, we have to conclude that in 2016, commercial drones and hoverboards will kill more people than selfies...


That's all folks, see you at MWC next month.