
Wednesday, November 22, 2017

Video of the presentation at TIP 2017: Telefonica's Internet para todos


This is the video describing the project "internet para todos", connecting the unconnected in LatAm.
I present the industry trends and constraints that force telcos to reexamine their model, and the necessary changes in the value chain and technology to enable ultra-low-cost, versatile networks to connect the unconnected.





Internet Para Todos: Connecting the Unconnected in LATAM

Patrick Lopez, VP, Networks Innovation, Telefonica



Tuesday, April 19, 2016

Net neutrality, meet lawful interception

This post is written today from the NFV World Congress, where I am chairing the first day's track on operations. Many presentations in the pre-show workshop day point to an increased effort from standards bodies (ETSI, 3GPP...) and open source organizations (OpenStack, OpenDaylight...) to address security by design in next-generation network architectures.
Law enforcement agencies are increasingly invited to contribute to or advise the standardization work, to ensure their needs are baked into the design of these networks. Unfortunately, there seems to be a large gap between law enforcement agencies' requirements, standards and regulatory bodies. Many of the trends we are observing in mobile networks, from software defined networking to network functions virtualization and 5G, assume that operators will be able to intelligently route traffic and apportion resources elastically. Lawful interception regulations, meanwhile, mandate that operators, upon a lawful request, must provide the means to monitor, intercept and transcribe any electronic communication for security agencies.

It has been hard to escape the headlines lately when it comes to mobile networks, law enforcement and privacy. On one hand, privacy is an inalienable right that we should all be entitled to; on the other, we elect governments with the expectation that they will be able to protect us from harm, physical or digital.

Digital harm, until recently, was mostly illustrated by misrepresentation, scams or identity theft. Increasingly, though, it translates into the physical world, as attacks can impact not only one's reputation and credit rating but also one's job, banking and, soon, cars and connected devices.

I have written at length about the erroneous assumptions that are underlying many of the discourses of net neutrality advocates. 
In order to understand net neutrality and traffic management, one has to understand the different perspectives involved.
  • Network operators compete against each other on price, coverage and, more importantly, network quality. In many cases, they have identified that improving or maintaining quality of experience is the single most important success factor for acquiring and retaining customers. We have seen it time and again with voice services (call drops, voice quality…), messaging (texting capacity, reliability…) and data services (video start, stalls, page loading time…). These KPIs are at the heart of the operator's business. As a result, operators tend to try to improve or control user experience by deploying an array of traffic management functions.
  • Content providers assume that the highest quality of content (HD video, for instance) equals the best experience for the subscriber, and therefore try to capture as much network resource as possible to deliver it. Browser / app / phone manufacturers also assume that more speed equals better user experience, and therefore try to commandeer as much capacity as possible. One reaction to operators performing traffic management is to encrypt traffic in order to obfuscate it.
The flaw here is the assumption that the optimum is the product of many maxima self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

This behavior leads to a network where resources can be in contention and all end-points vie for priority and maximum resource allocation. From this perspective one can understand that there is no such thing as "net neutrality" at least not in wireless networks. 

When network resources are over-subscribed, decisions are taken as to who gets more capacity, priority, speed... The question becomes who should be in a position to make these decisions. Right now, the laissez-faire approach to net neutrality means that the network is not managed; it is subjected to traffic. When in contention, resources manage traffic based on obscure rules embedded in load balancers, routers, base stations, traffic management engines... This approach is the result of lazy, surface thinking. Net neutrality should be the opposite of non-intervention. Its rules should be applied equally to networks, devices / apps / browsers and content providers if what we want to enable is fair and equal access to resources.

Now, who said access to wireless should be fair and equal? Unless networks are nationalized and become government assets, I do not see why private companies in a competitive market couldn't manage their resources in order to optimize their utilization.


If we transport ourselves into a world where all traffic becomes encrypted overnight, networks lose the ability to manage traffic beyond allowing or stopping it and fixing high-level QoS metrics to specific services. Network operators would then be forced to charge exclusively for traffic tonnage, and everyone would pay per byte transmitted. The cost to users would become prohibitive as more and more video of ever higher resolution flows through the networks. It would also mean that these video providers could asphyxiate the other services... More importantly, the user experience would become the fruit of the fight between content providers' abilities to monopolize network capacity, which would go against any of net neutrality's principles. A couple of content providers could dominate not only services but access to these services as well.

The problem is that encryption makes most traffic management and lawful interception provisions extremely difficult, or at the least very inefficient. Privacy is an important facet of net neutrality advocates' discourse. It is indeed the main reason many content and service providers invoke for encrypting traffic. In many cases this may be a genuine concern, but it is hard to reconcile with the fact that many of them provide encryption keys and certificates to third-party networks or CDNs, for instance to improve caching ratios or to perform edge packaging or advertising insertion. There is nothing that would prevent this model from being extended to wireless networks to perform similar operations; commercial interest has so far prevented these types of models from emerging.

If encryption continues to grow and service providers deny operators the capability to decrypt traffic, the traditional burden of lawful interception might be transferred to the service providers themselves. Since many of these providers are transnational, lawful interception mandates are likely to be unenforceable. At this stage we might have to choose, as societies, between digital security and privacy.
In all likelihood, though, one can hope that regulatory bodies will up their technical game and understand the nature of digital traffic in the 21st century. This should lead to lawful interception mandates that apply equally to all parts of the delivery chain, which will force collaborative behavior between the actors.

Monday, June 8, 2015

Data traffic optimization feature set

Data traffic optimization in wireless networks has reached a mature stage as a technology. The innovations that marked the years 2008 – 2012 are now slowing down, and most core vendors exhibit a fairly homogeneous feature set.

The difference comes in the implementation of these features, which can yield vastly different results depending on whether vendors use open source or purpose-built caching and transcoding engines, and whether congestion detection is based on observed or deduced parameters.

Vendors nowadays tend to differentiate on QoE measurement / management and on monetization strategies, including content injection, recommendation and advertising.

Here is a list of commonly implemented optimization techniques in wireless networks.
  •  TCP optimization
    • Buffer bloat management
    • Round trip time management
  • Web optimization
    • GZIP
    •  JPEG / PNG… transcoding
    • Server-side JavaScript
    • White space / comments… removal
  • Lossless optimization
    • Throttling / pacing
    • Caching
    • Adaptive bit rate manipulation
    • Manifest mediation (see the sketch after this list)
    • Rate capping
  • Lossy optimization
    • Frame rate reduction
    • Transcoding
      • Online
      • Offline
      • Transrating
    • Contextual optimization
      • Dynamic bit rate adaptation
      • Device targeted optimization
      • Content targeted optimization
      • Rule-based optimization
      • Policy driven optimization
      • Surgical optimization / Congestion avoidance
  • Congestion detection
    • TCP parameters based
    • RAN explicit indication
    • Probe based
    • Heuristics combination based
  • Encrypted traffic management
    • Encrypted traffic analytics
    • Throttling / pacing
    • Transparent proxy
    • Explicit proxy
  • QoE measurement
    • Web
      • page size
      • page load time (total)
      • page load time (first rendering)
    • Video
      • Temporal measurements
        • Time to start
        • Duration loading
        • Duration and number of buffering interruptions
        • Changes in adaptive bit rates
        • Quantization
        • Delivery MOS
      • Spatial measurements
        • Packet loss
        • Blockiness
        • Blurriness
        • PSNR / SSIM
        • Presentation MOS
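
To make one of these techniques concrete, here is a minimal sketch of manifest mediation (referenced in the list above): rewriting an HLS master playlist so that renditions above a policy-defined bitrate cap are no longer advertised to the player. The cap value and the parsing are illustrative assumptions, not a description of any vendor's implementation.

```python
import re

# Illustrative policy value: advertise nothing above 1.5 Mbps on a congested cell.
BANDWIDTH_CAP_BPS = 1_500_000

def cap_hls_manifest(master_playlist: str, cap_bps: int = BANDWIDTH_CAP_BPS) -> str:
    """Remove from an HLS master playlist the variants whose advertised
    BANDWIDTH exceeds cap_bps, always keeping at least the lowest one."""
    lines = master_playlist.splitlines()

    # Find every variant: an #EXT-X-STREAM-INF tag followed by its URI line.
    variants = []  # (bandwidth, index of the tag line)
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            m = re.search(r"BANDWIDTH=(\d+)", line)
            variants.append((int(m.group(1)) if m else 0, i))

    over_cap = {i for bw, i in variants if bw > cap_bps}
    if variants and len(over_cap) == len(variants):
        over_cap.discard(min(variants)[1])  # never strip every rendition

    out, skip_uri = [], False
    for i, line in enumerate(lines):
        if i in over_cap:
            skip_uri = True          # drop the tag line...
            continue
        if skip_uri and not line.startswith("#"):
            skip_uri = False         # ...and the URI line that follows it
            continue
        out.append(line)
    return "\n".join(out) + "\n"
```

Because the player never sees the higher rungs, it settles on a sustainable bitrate by itself; applied per cell and per session, only when congestion is detected, this is what the list above calls surgical optimization.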


An explanation of each technique and its feature set can be obtained as part of the mobile video monetization report series, individually as a feature report, or in a workshop.

Tuesday, March 10, 2015

Mobile video 2015 executive summary

As is now traditional, I return from Mobile World Congress with a head full of ideas and views on market evolution, fueled by dozens of meetings and impromptu discussions. The 2015 mobile video monetization report, now in its fourth year, reflects the trends and my analysis of the mobile video market, its growth, opportunities and challenges.

Here is the executive summary from the report to be released this month.

2014 has been a year of contrasts for deployments of video monetization platforms in mobile networks. The market has grown in deployments and value, but an unease has gripped some of its protagonists, forcing exits and pivot strategies, while players with new value propositions have emerged. Several factors made this a transition year.

On the growth front, we have seen the emergence of MVNOs and interconnect / clearing houses as buyers, together with the natural turnover and replacement of now aging and fully amortized platforms deployed 5-6 years ago.

Additionally, the market leaders' upgrade strategies have naturally created some space for challengers and new entrants. Mature markets have seen mostly replacements and MVNO greenfield deployments, while emerging markets have added new units in markets that are either too early for 3G or already saturated in 4G. Volume growth has been particularly sustained in Eastern / Central Europe, North Africa, the Middle East and South East Asia.

On the other hand, the emergence and growth of traffic encryption, coupled with the persistent legal and regulatory threats surrounding the net neutrality debate, have cooled down, delayed and in some cases shut down optimization projects as operators rethink their options. Western Europe and North America have seen a marked slowdown, while South America is just about starting to show interest.

The value of deals has been in line with last year's, after sharp erosion due to the competitive environment. The leading vendors have consolidated their approach, taken on new strategies and overall capitalized on their installed base, while many new deals have gone to new entrants and market challengers.

2014 has also been the first year of a commercial public cloud deployment, which should be followed soon by others. Network function virtualization has captivated many network operators’ imagination and science experiment budget, which has prompted the emergence of the notion of traffic classification and management as a service.

Video streaming, specifically, has shown great growth in 2014, consolidating its place as the fastest growing service in mobile networks and digital content altogether. 2014 and early 2015 have seen many acquisitions of video streaming, packaging and encoding technology companies. What is new, however, is that a good portion of these acquisitions were performed not by other technology companies but by OTT players such as Facebook and Twitter.

Mobile video advertising is starting to become a "thing" again, as investments, inventory and views show triple-digit growth. The trend suggests mobile video advertising could become the single largest revenue opportunity for mobile operators within a five-year timeframe, but its implementation demands a change in attitude, organization and approach that is alien to most operators' DNA. The transformation, akin to a heart transplant, will probably leave many dead on the operating table before the graft takes and the technique is refined, but they might not have much choice, looking at Google's and Facebook's announcements at Mobile World Congress 2015.

Will new technologies such as LTE Multicast, for instance, which are due to make their start in earnest this year, promising quality assured HD content, via streaming or download, be able to unlock the value chain? 


The mobile industry is embattled and finds itself facing some great threats to its business model. As the saying goes, those who survive are not necessarily the strongest, but rather those who adapt the fastest.

Friday, July 5, 2013

The war of machine 2 machine: Internet of nothing?

A recent Tweet conversation got me thinking about all the hoopla about machine-to-machine / internet of everything.

Many telecom equipment manufacturers hail the trend as the next big thing for wireless networks: both a bounty to be harvested and a great opportunity for new revenue streams.

There is certainly a lot to think about when more and more devices that were not designed for real-time connectivity are suddenly able to exchange, report, alarm... All these devices that were well served by rudimentary logging software or technology, most of the time for manual retrieval (think of your home gas, water or electricity meter being read by a technician), could in the future be eligible for over-the-air data transfer.

A similar discussion I had at LTE World Summit, where I was chairing the data explosion stream, comes to mind. A utility company, in Italy I believe, had rolled out these "smart" meters. The implementation in the lab was flawless; the utility was going to save millions, with only a handful of employees monitoring the data center instead of hundreds scouring the countryside manually reading meters. What was unexpected was that all meters exhibited the same behavior, sending keep-alives and reporting logs at the same time. This brought the wireless network down in a self-inflicted signalling and payload storm.

When I look at all the companies that have created apps with no knowledge of how a phone or a mobile network behaves, I can't help but think about the consequences of meters, cars, irrigation sensors, gas turbines, fridges and traffic lights trying to send snippets of data and signalling through a wireless network with no understanding of how these signals and streams will affect the infrastructure.

This immediately brings to mind horrific headlines: "Sheep herd monitoring devices bring down network in New Zealand!", "Water and electricity meters fighting over bandwidth..."

More seriously, it means all these device manufacturers will need to hire serious programmers who understand wireless, not only to put the transmitters on the devices but also to code efficiently so that signalling and payload are optimized. Network operators will also need to publish best practices for M2M traffic in terms of frequency, volume, etc., with stringent SLAs, since most of this traffic will be discrete (subscription paid with the service or device, no usage payment).
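
The synchronized smart meter outage described above also suggests what one of those device-side best practices could look like. Below is a minimal, purely illustrative sketch: jittering and batching reports so that a fleet of identical devices does not attach and signal at the same instant. The interval and jitter values are assumptions for the example.

```python
import random
import time

# Illustrative values: a meter that nominally reports every 6 hours.
NOMINAL_INTERVAL_S = 6 * 3600
JITTER_FRACTION = 0.25  # spread wake-ups over +/- 25% of the interval

def next_report_delay(nominal_s: float = NOMINAL_INTERVAL_S,
                      jitter: float = JITTER_FRACTION) -> float:
    """Randomized delay before the next report, so thousands of identical
    meters do not generate a synchronized signalling storm."""
    return nominal_s * (1.0 + random.uniform(-jitter, jitter))

def report_loop(send_batched_report) -> None:
    """Toy device loop: sleep a jittered interval, then send one batched
    report (one network attach, one data burst) instead of chatty,
    fixed-schedule keep-alives."""
    while True:
        time.sleep(next_report_delay())
        send_batched_report()
```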

Monday, May 27, 2013

All bytes are not created equal...



Recent discussions with a number of my clients have brought to light a fundamental misconception. Mobile video is not data. It is not a different use case of data or a particular form of data; it is a different service altogether. The sooner network operators understand that they cannot count, measure and control video the same way as browsing data, the sooner they will have a chance to integrate the value chain of video delivery.

Deep packet inspection engines count bytes, categorize traffic per protocol, bearer or URL, and throttle and prioritize data flows based on rules that are video-myopic. Their concern is Quality of Service (QoS), not Quality of Experience (QoE). Policy and charging engines decide, meter and limit traffic in real time based on the incomplete picture painted by DPIs and other network elements.

Not understanding whether traffic is video (or assuming it is video based solely on the URL) can prove catastrophic for the user experience and the user's bill. How can a traffic management engine instantiate video charging and prioritization rules if it cannot differentiate between download, progressive download and adaptive bit rate? How can it decide what bandwidth is appropriate for a service if it does not understand how the video is encoded, what bit rates are available, whether it is HD or SD, and what the user expects?
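
As an illustration of the kind of differentiation this requires, here is a minimal sketch of one possible heuristic, with thresholds that are purely illustrative assumptions: adaptive bit rate sessions tend to appear as a regular train of similar-sized segment requests, while progressive download appears as one or a few large transfers.

```python
from dataclasses import dataclass

@dataclass
class FlowRequest:
    timestamp: float  # seconds since the flow started
    nbytes: int       # size of the response

# Illustrative thresholds, not production values.
SEGMENT_MAX_BYTES = 4_000_000  # ABR segments are typically a few MB at most
MIN_SEGMENTS = 5               # observe a few requests before deciding

def classify_video_flow(requests: list[FlowRequest]) -> str:
    """Toy heuristic: adaptive bit rate streaming shows up as a regular
    train of similar-sized segment requests; progressive download shows
    up as one (or very few) large, long-lived transfers."""
    if len(requests) >= MIN_SEGMENTS and all(
            r.nbytes <= SEGMENT_MAX_BYTES for r in requests):
        gaps = [b.timestamp - a.timestamp
                for a, b in zip(requests, requests[1:])]
        mean_gap = sum(gaps) / len(gaps)
        # Regular cadence: inter-request gaps cluster around the segment
        # duration the player is pacing itself to.
        if mean_gap > 0 and all(abs(g - mean_gap) < mean_gap for g in gaps):
            return "adaptive-bit-rate"
    if any(r.nbytes > SEGMENT_MAX_BYTES for r in requests):
        return "progressive-download"
    return "unknown"
```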

Content providers naturally push content of the highest quality the network can afford; smartphones and tablets try to grab as much network capacity as is available at the establishment of a session to guarantee user experience, often to the detriment of other connections and devices. It is wrong to assume that the quality of experience in video is the result of a harmonious negotiation between content, device and network. It is actually quite the opposite, with each party pulling in its own direction under conflicting priorities. User experience suffers as a result, and we have started to see instances of users complaining or churning because of bad video experience.

All bytes are not created equal. Video weighs heavier and carries a larger emotional attachment than email or browsing services when it comes to the user's experience of a network's quality. This is one of the subjects I will be presenting at Informa's Mobile Video Global Summit in Berlin next week.



Monday, January 21, 2013

The law of the hungriest: Net neutrality and video


I was reflecting recently on net neutrality and its impact on delivering video in wireless networks. Specifically, most people I have discussed this with seem to think that net neutrality means doing nothing: no intervention from the network operator to prioritize, discriminate, throttle, reduce or suppress one type of traffic versus another, whether per subscriber, location, device or service.

This strikes me as somewhat short-sighted and not very cognizant of how the industry operates. I wonder why net neutrality is to apply to mobile networks but not to handset manufacturers, app providers or content providers, for instance.

There have been several reports of handset vendors or app providers implementing methods that are harmful to networks, either unwittingly or in downright predatory fashion. Some smartphone vendors, for instance, implement proprietary variations of streaming protocols to grab as much of the network's capacity as possible, irrespective of the encoding of the accessed video, to ensure fast and smooth video delivery to their device... to the detriment of others. It is easy to design an app, a browser or a video service that uses as much network capacity as possible, irrespective of what the service actually needs to function normally; the result is a better user experience for the person accessing the service / app / device, but a degraded quality of experience for everyone else.

Why is that not overseen by net neutrality regulatory committees? Why would the network provide unrestricted access to any app / device / video service and let them fight for capacity without control? Mobile networks then become ruled by the law of the hungriest, and when it comes to video, it can quickly become a fight dominated by the most popular web sites, phone vendors or app providers... I think that net neutrality, if it is to happen in mobile networks, must be managed, and that the notion of fair access must extend to all parties involved.

Tuesday, November 6, 2012

LTE and video elasticity

I often get asked at events such as Broadband Traffic Management 2012, where I am chairing the mobile video stream this afternoon: "How does video traffic evolve in an LTE network? Won't LTE negate the need for traffic management and video optimization?"

Jens Schulte-Bockum, CEO of Vodafone Germany, shocked the industry last week by indicating that Vodafone Germany's LTE traffic is 85% mobile video.

I think what most people fail to understand is that video, unlike voice or generic data, is elastic. Technologies such as adaptive streaming and source-based encoding by content providers mean that devices and content providers, given bandwidth, will utilize all that is available.

Device manufacturers implement increasingly aggressive versions of video streaming, grabbing as much bandwidth as is available, independently of the video's encoding, while content providers offer ever-increasing video quality, moving from 480p to 720p and 1080p, and soon 4K.
This was corroborated this morning by Eric Klinker, president and CEO of BitTorrent.
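
A toy model of the player side shows why this is so. The bitrate ladder below is an illustrative assumption; the point is the selection rule: the player always climbs to the highest rung that fits the throughput it measures, so any added LTE capacity is consumed almost immediately.

```python
# Illustrative ABR ladder (bps): roughly 480p, 720p, 1080p and 4K renditions.
LADDER_BPS = [1_000_000, 2_500_000, 5_000_000, 15_000_000]

def select_rendition(measured_throughput_bps: float, safety: float = 0.8) -> int:
    """Pick the highest rendition that fits under a safety margin of the
    measured throughput -- the mechanism that makes video 'elastic':
    whatever capacity the network adds, the player climbs the ladder."""
    budget = measured_throughput_bps * safety
    fitting = [bps for bps in LADDER_BPS if bps <= budget]
    return max(fitting) if fitting else min(LADDER_BPS)

# On HSPA at ~4 Mbps the player settles on the 2.5 Mbps rung;
# hand it LTE at ~20 Mbps and it jumps straight to the 15 Mbps rung.
print(select_rendition(4_000_000))   # -> 2500000
print(select_rendition(20_000_000))  # -> 15000000
```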

Operators need to understand that video must be managed as an independent service, separately from data and voice, as it behaves differently and will "eat up" resources as they are made available.

So the short answer is no, LTE will not solve the issue but rather become a new variable in the equation.

Friday, September 28, 2012

How to weather signalling storms

I was struck a few months back by an anecdote from Telecom Italia about a signalling storm in their network that brought unanticipated outages. After investigation, the operator found that the Android launch of Angry Birds differed from the iOS version in a major way: it was a free app monetized through advertisement, with ads requested and served between levels (or retries).
If you are like me, you can easily go through 10 or more levels (mmmh... retries) in a minute. Each one of these created a request to the ad server, which generated queries to the subscriber database, location and charging engine over Diameter, resulting in +351% Diameter traffic.
The traffic generated by one chatty app brought the network to its knees within days of its launch.
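
The arithmetic of the amplification is worth spelling out. The per-event fan-out and user counts below are illustrative assumptions, not Telecom Italia's figures, but they show how quickly one chatty app multiplies into core signalling.

```python
# Illustrative back-of-the-envelope: one ad request fans out into several
# Diameter transactions in the core (subscriber DB, location, charging).
users = 1_000_000            # concurrent players (assumption)
levels_per_minute = 10       # levels or retries per user per minute
diameter_msgs_per_ad = 4     # HSS + location + charging + answer (assumption)

msgs_per_second = users * levels_per_minute * diameter_msgs_per_ad / 60
print(f"{msgs_per_second:,.0f} extra Diameter messages/s")  # ~666,667
```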



As video traffic congestion becomes more prevalent and operators start to measure subscriber satisfaction in that area, we have seen several solutions emerge (video optimization, RAN optimization, policy management, HSPA+ and LTE upgrades, new pricing models...).
Signalling congestion, by contrast, remains an emerging issue. I sat down yesterday with Tekelec's Director of Strategic Marketing, Joanne Steinberg, to discuss the topic and what operators should do about it.
Tekelec recently (September 2012) released its LTE Diameter Signalling Index. The report projects that Diameter traffic will increase at a 252% CAGR until 2016, from 800k to 46 million messages per second globally. This is due to a radical change in application behavior, as well as the new pricing and business models put in place by operators. Policy management, QoS management, metered charging, two-sided business models and M2M traffic are some of the culprits highlighted in the report.

Diameter is a protocol originally invented to replace RADIUS for the main purposes of Authentication, Authorization and Accounting (AAA). Real-time charging and the evolution to IN drove its implementation. The protocol was created to be lighter than RADIUS while remaining extensible, with a variety of proprietary fields that can be added for specific uses. Its extensibility was the main criterion for its adoption as the protocol of choice for policy and charging functions.
A victim of its own success, the protocol is now used in LTE for a variety of tasks, ranging from querying subscriber databases (HSS) to querying user balances and carrying transactional charging and policy traffic.
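
For readers unfamiliar with the protocol, the extensibility mentioned above comes from its type-length-value structure: every field is an AVP (attribute-value pair) whose header can carry an optional Vendor-Id, which is what lets anyone graft proprietary fields onto the protocol. Here is a minimal sketch of AVP encoding per RFC 6733; the example value at the end is illustrative.

```python
import struct

AVP_FLAG_VENDOR = 0x80     # V bit: a Vendor-Id field is present
AVP_FLAG_MANDATORY = 0x40  # M bit: the receiver must understand this AVP

def encode_avp(code: int, payload: bytes, mandatory: bool = True,
               vendor_id: int | None = None) -> bytes:
    """Encode one Diameter AVP (RFC 6733): 4-byte code, 1-byte flags,
    3-byte length, optional 4-byte Vendor-Id, then the payload padded
    to a 32-bit boundary."""
    flags = AVP_FLAG_MANDATORY if mandatory else 0
    header_len = 8 + (4 if vendor_id is not None else 0)
    avp_len = header_len + len(payload)  # length excludes the padding
    if vendor_id is not None:
        flags |= AVP_FLAG_VENDOR
    out = struct.pack("!I", code) + bytes([flags]) + avp_len.to_bytes(3, "big")
    if vendor_id is not None:
        out += struct.pack("!I", vendor_id)
    return out + payload + b"\x00" * (-len(payload) % 4)

# Example: an Origin-Host AVP (code 264 in the Diameter base protocol).
avp = encode_avp(264, b"hss01.operator.example")
```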

Tekelec's signaling solutions, together with its policy product line (inherited from the Camiant acquisition), provide a variety of solutions to handle the increasing load of Diameter signaling traffic; the company proposes its Diameter Signaling Router as a means to "manage, throttle, load balance and route" Diameter traffic.

In my opinion, data browsing is less predictable than voice or messaging traffic when it comes to signalling. While in the past a message at the establishment of a session, one at the end and optionally a few interim updates were sufficient, today's sophisticated business models and price plans require a lot of signalling traffic. Additionally, Diameter is starting to be used beyond the core packet network, towards the RAN (for RAN optimization) and towards the internet (for OTT two-sided business models). OTT content and app providers do not understand the functioning of mobile networks, and we cannot expect device and app signalling traffic to self-regulate. While some 3GPP effort is being expended to evaluate new architectures and rules such as fast dormancy, the problem is likely to grow faster than the standards' capacity to contain it. I believe that Diameter management and planning are necessary for network operators who are departing from all-you-can-eat data plans toward policy-driven traffic and charging models.