
Tuesday, October 3, 2023

Should regulators forfeit spectrum auctions if they can't resolve Net Neutrality / Fair Share?

I have been writing about Net Neutrality and Fair Share broadband usage for nearly 10 years. Both sides of the argument have merit, and it is difficult to find a balanced view represented in the media these days. Absolutists would lead you to believe that internet usage should be unregulated, with everyone able to stream, download and post anything anywhere, without respect for intellectual property or fair usage; on the other side of the fence, service provider dogmatists would like to control, apportion, prioritize and charge based on their own interests.

Of course, the reality is a little more nuanced. A better understanding of the nature and evolution of traffic, as well as of the cost structure of networks, helps to appreciate the respective parties' stances and offers a better view of what could be done to reduce the chasm.

  1. From a cost structure perspective first, our networks grow and accommodate demand differently depending on whether we are looking at fixed line / cable / fibre broadband or mobile. 
    1. In the first case, capacity growth is a function of technology and civil works. 
      1. On the technology front, the evolution from dial-up / PSTN to copper and then fiber dramatically increases network capacity and has followed roughly 20-year cycles. The investments are enormous and require the deployment and management of central offices and their evolution to edge compute data centers. These investments happen in waves within a relatively short time frame (~5 years). Once operated, the return on investment is a function of the number of users and the utilisation rate of the asset, which in this case means filling the network with traffic.
      2. On the civil works front, throughout the technology evolution, continuous work is ongoing to lay transport fiber along new housing developments while replacing antiquated and aging copper or cable connectivity. This is a continuous burn and its run rate is a function of the operator's financial capacity.
    2. In mobile networks, you can find similar categories but with a very different balance and impact on ROI.
      1. From a technology standpoint, the evolution from 1G to 5G has taken roughly 10 years per cycle. A large part of the investment for each generation is a spectrum license leased from the regulator / government. In addition to this, most network elements, from the access to the core and OSS / BSS, need to be changed. The transport part relies in large part on the fixed network above. Until 5G, most of these elements were constituted of proprietary servers and software, which meant a generational change induced a complete forklift upgrade of the infrastructure. With 5G, the separation of software and hardware, the extensive use of COTS hardware and the implementation of cloud-based separation of traffic and control planes should mean that the next generational upgrade will be less expensive, with only software and part of the hardware necessitating a complete refresh.
      2. The civil work for mobile networks is comparable to the fixed network for new coverage, but follows the same cycles as the technology timeframe with respect to the upgrades and changes necessary to the radio access. Unlike the fixed network, though, there is an obligation of backwards compatibility, with many networks still running 2G, 3G and 4G while deploying 5G. The real estate being essentially antennas and cell sites, this becomes a very competitive environment with limited capacity for growth in space, pushing service providers to share assets (antennas, spectrum, radios...) and to deploy, whenever possible, multi-technology radios.
The conclusion here is that fixed networks have long investment cycles and ROI and low margins, relying on the number of connections and traffic growth, while mobile networks have shorter investment cycles, with bursty margin growth and reduction with each new generation.
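To make the cost-structure contrast concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (CAPEX, amortization period, subscriber count, monthly usage, OPEX uplift) is an illustrative assumption, not operator data; the point is only that cost per gigabyte is driven by how long the asset is amortized and how well it is filled.

# Back-of-the-envelope cost-per-GB model. Every number is an illustrative
# assumption, not real operator data.

def cost_per_gb(capex_eur, amortization_years, subscribers, gb_per_sub_month, opex_ratio=0.5):
    """Annualized infrastructure cost divided by the traffic actually carried."""
    annual_cost = capex_eur / amortization_years * (1 + opex_ratio)  # crude OPEX uplift
    annual_traffic_gb = subscribers * gb_per_sub_month * 12
    return annual_cost / annual_traffic_gb

# Hypothetical fixed / fibre network: heavy civil works, ~20-year cycle, high usage per line.
fixed = cost_per_gb(2_000_000_000, 20, subscribers=1_000_000, gb_per_sub_month=250)

# Hypothetical mobile network: spectrum + RAN refresh, ~10-year cycle, lower usage per SIM.
mobile = cost_per_gb(1_500_000_000, 10, subscribers=2_000_000, gb_per_sub_month=15)

print(f"fixed:  ~{fixed:.2f} EUR/GB")   # a few cents per GB when the asset is well filled
print(f"mobile: ~{mobile:.2f} EUR/GB")  # roughly an order of magnitude more per GB carried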

What does this have to do with Net Neutrality / Fair Share? I am coming to it, but first we need to examine the evolution of traffic and prices to understand where the issue resides.

Now, in the past, we had to pay for every single minute, text or kilobyte received or sent. Network operators were making money off traffic growth and were pushing users and content providers to fill their networks. Video somewhat changed that. A user watching a 30-second video doesn't really perceive whether the video is at 720p, 1080p or 4K, 30 or 60 fps; it is essentially the same experience. That same video, though, can vary in size by a factor of 20 depending on its resolution. To compound the issue, operators foolishly transitioned to all-you-can-eat data plans with 4G to acquire new consumers, a self-inflicted wound that has essentially killed their 5G business case.
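As a rough illustration of that 20x spread (the bitrates below are ballpark assumptions for typical streaming profiles, not measured values), the size of the same 30-second clip follows directly from the bitrate it is encoded at:

# Approximate size of a 30-second clip at typical streaming bitrates.
# Bitrates are ballpark assumptions; actual encoder output varies with codec and content.
DURATION_S = 30
profiles_mbps = {
    "480p30": 1.5,
    "720p30": 3.0,
    "1080p60": 6.0,
    "4K60": 30.0,
}
for name, mbps in profiles_mbps.items():
    size_mb = mbps * DURATION_S / 8  # megabits per second * seconds / 8 = megabytes
    print(f"{name:>8}: ~{size_mb:5.1f} MB")
# 4K60 vs 480p30 is a ~20x difference in bytes carried, for a clip most viewers
# perceive the same way on a phone screen.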

I have written at length about the erroneous assumptions that are underlying some of the discourses of net neutrality advocates. 

In order to understand net neutrality and traffic management, one has to understand the different perspectives involved.
  • Network operators compete against each other on price, coverage and, more importantly, network quality. In many cases, they have identified that improving or maintaining quality of experience is the single most important success factor for acquiring and retaining customers. We have seen it time and again with voice services (call drops, voice quality…), messaging (texting capacity, reliability…) and data services (video start, stalls, page loading time…). These KPIs are at the heart of the operator’s business. As a result, operators tend to try to improve or control user experience by deploying an array of traffic management functions.
  • Content providers assume that the highest quality of content (8K UHD video, for instance) equals maximum experience for the subscriber and therefore try to capture as much network resource as possible to deliver it. Browser / app / phone manufacturers also assume that more speed equals better user experience and therefore try to commandeer as much capacity as possible. 
The flaw here is the assumption that the optimum is the product of many maxima self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

This behavior leads to a network where resources can be in contention and all end-points vie for priority and maximum resource allocation. From this perspective one can understand that there is no such thing as "net neutrality" at least not in wireless networks. 

When network resources are over-subscribed, decisions are taken as to who gets more capacity, priority, speed... The question becomes who should be in a position to make these decisions. Right now, the laissez-faire approach to net neutrality means that the network is not managed; it is subjected to traffic. When in contention, resources are managing traffic based on obscure rules in load balancers, routers, base stations, traffic management engines... This approach is the result of lazy, surface thinking. Net neutrality should be the opposite of non-intervention. Its rules should be applied equally to networks, devices / apps / browsers and content providers if what we want to enable is fair and equal access to resources.
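To illustrate what "deciding who gets what" looks like once a cell is over-subscribed, here is a minimal max-min fair allocation sketch. This is a textbook scheduling policy used purely as an example of an explicit, inspectable rule; real schedulers (proportional fair, weighted fair queuing...) also weigh radio conditions, QoS classes and operator policy, and the flows and capacity below are hypothetical.

# Max-min fair share of a congested cell: each flow gets what it asks for if possible,
# otherwise the spare capacity is split equally among the still-unsatisfied flows.
# Purely illustrative; flow names and numbers are made up.

def max_min_fair(capacity_mbps, demands_mbps):
    alloc = {flow: 0.0 for flow in demands_mbps}
    remaining = dict(demands_mbps)
    cap = capacity_mbps
    while remaining and cap > 1e-9:
        share = cap / len(remaining)               # equal split of what is left
        for flow in list(remaining):
            give = min(share, remaining[flow])
            alloc[flow] += give
            cap -= give
            remaining[flow] -= give
            if remaining[flow] <= 1e-9:            # demand satisfied, drop from next round
                del remaining[flow]
    return alloc

# Hypothetical cell with 20 Mbps left: two video streams asking 15 Mbps each,
# one audio stream, one web session.
print(max_min_fair(20, {"video_A": 15, "video_B": 15, "audio": 0.3, "web": 2}))
# -> audio and web are fully served; the two video flows split the rest (~8.85 Mbps each).
#    Whatever the outcome, someone, somewhere, applied a rule.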

As we contemplate 6G, with its hints of metaverse, augmented / mixed reality and hyper connectivity, the cost structure of network infrastructure hasn't yet been sufficiently decoupled from traffic growth, and, as we have seen, video is elastic and XR will be a heavy burden on the networks. Network operators have essentially failed so far to offer attractive digital services that would monetize their network investments. Video and digital services providers already pay for their on-premise and cloud infrastructure as well as transport; there is little chance they would finance telco operators' capacity growth.

Where does this leave us? It might be time for regulators / governments either to take an active and balanced role in Net Neutrality and Fair Share, to ensure that both sides can find a sustainable business model, or to forfeit spectrum auctions for the next generations.

Tuesday, April 19, 2016

Net neutrality, meet lawful interception

This post is written today from the NFV World Congress, where I am chairing the first day's track on operations. Many presentations in the pre-show workshop day point to an increased effort from standards bodies (ETSI, 3GPP...) and open source organizations (OpenStack, OpenDaylight...) to address security by design in next-generation network architectures.
Law enforcement agencies are increasingly invited to contribute to or advise the standardization work to ensure their needs are baked into the design of these networks. Unfortunately, it seems that there is a large gap between law enforcement agencies' requirements, standards and regulatory bodies. Many of the trends we are observing in mobile networks, from software defined networking to network functions virtualization and 5G, assume that operators will be able to intelligently route traffic and apportion resources elastically. Lawful interception regulations mandate that operators, upon a lawful request, provide the means to monitor, intercept and transcribe any electronic communication for security agencies.

It has been hard to escape the headlines lately when it comes to mobile networks, law enforcement and privacy. On one hand, privacy is an inalienable right that we should all be entitled to; on the other hand, we elect governments with the expectation that they will be able to protect us from harm, physical or digital. 

Digital harm, until recently, was mostly illustrated by misrepresentation, scams or identity theft. Increasingly, though, it translates into the physical world, as attacks can impact not only one's reputation and credit rating but also one's job, banking and, soon, cars and connected devices.

I have written at length about the erroneous assumptions that are underlying many of the discourses of net neutrality advocates. 
In order to understand net neutrality and traffic management, one has to understand the different perspectives involved.
  • Network operators compete against each other on price, coverage and, more importantly, network quality. In many cases, they have identified that improving or maintaining quality of experience is the single most important success factor for acquiring and retaining customers. We have seen it time and again with voice services (call drops, voice quality…), messaging (texting capacity, reliability…) and data services (video start, stalls, page loading time…). These KPIs are at the heart of the operator’s business. As a result, operators tend to try to improve or control user experience by deploying an array of traffic management functions.
  • Content providers assume that the highest quality of content (HD for video, for instance) equals maximum experience for the subscriber and therefore try to capture as much network resource as possible to deliver it. Browser / app / phone manufacturers also assume that more speed equals better user experience and therefore try to commandeer as much capacity as possible. A reaction to operators trying to perform traffic management functions is to encrypt traffic to obfuscate it. 
The flaw here is the assumption that the optimum is the product of many maxima self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

This behavior leads to a network where resources can be in contention and all end-points vie for priority and maximum resource allocation. From this perspective one can understand that there is no such thing as "net neutrality" at least not in wireless networks. 

When network resources are over-subscribed, decisions are taken as to who gets more capacity, priority, speed... The question becomes who should be in a position to make these decisions. Right now, the laissez-faire approach to net neutrality means that the network is not managed; it is subjected to traffic. When in contention, resources are managing traffic based on obscure rules in load balancers, routers, base stations, traffic management engines... This approach is the result of lazy, surface thinking. Net neutrality should be the opposite of non-intervention. Its rules should be applied equally to networks, devices / apps / browsers and content providers if what we want to enable is fair and equal access to resources.

Now, who said access to wireless should be fair and equal? Unless the networks are nationalized and become government assets, I do not see why private companies, in a competitive market couldn't manage their resources in order to optimize their utilization.


If we transport ourselves into a world where all traffic becomes encrypted overnight, networks lose the ability to manage traffic beyond allowing or blocking it and attaching high-level QoS metrics to specific services. That would lead to network operators being forced to charge exclusively for traffic tonnage. At this point, everyone has to pay per byte transmitted. The cost to users would become prohibitive as more and more video of higher resolution flows through the networks. It would also mean that these video providers could asphyxiate the other services... More importantly, it would mean that the user experience would become the fruit of the fight between content providers' abilities to monopolize network capacity, which would go against any net neutrality principle. A couple of content providers could dominate not only services but the access to these services as well.

The problem is that encryption makes most traffic management and lawful interception provisions extremely unlikely or at the least very inefficient. Privacy is an important facet of net neutrality advocates' discourse. It is indeed the main reason many content and service providers invoke for encrypting traffic. In many cases, this might be a genuine concern, but it is hard to reconcile with the fact that many provide encryption keys and certificates to third-party networks or CDNs, for instance, to improve caching ratios, perform edge packaging or insert advertising. There is nothing that would prevent this model from being extended to wireless networks to perform similar operations. Commercial interest has so far prevented these types of models from emerging.

If encryption continues to grow, and service providers deny operators the capability to decrypt traffic, the traditional burden of lawful interception might be transferred to the former. Since many providers are transnational, what is defined as lawful interception is likely to be unenforceable. At this stage, we might have to choose, as societies, between digital security and privacy.
In all likelihood, though, one can hope that regulatory bodies will up their technical game and understand the nature of digital traffic in the 21st century. This should lead to lawful interception mandates being applicable equally to all parts of the delivery chain, which will force collaborative behavior between the actors. 

Friday, March 18, 2016

For or against Adaptive Bit Rate? part V: centralized control

I have seen over the last few weeks much speculation and many claims around T-Mobile's Binge On service launch, and these have accelerated with yesterday's announcement of Google Play and YouTube joining the service. As usual, many are getting on their net neutrality battle horse, using fraught assumptions and misconceptions to reject the initiative.

I have written at length about what ABR is and what its pros and cons are; you can find some extracts in the links at the end of this post. I'll try here to share my views and expose some facts to enable a more pragmatic approach.

I think we can safely assume that every actor in the mobile video delivery chain wants to enable the best user experience, whenever possible.
As I have written in the past, in the current state of affairs, adaptive bit rate is often corrupted in order to seize as much network bandwidth as possible, which results in devices and service providers aggressively competing for bits and bytes.
Content providers assume that highest quality of content (1080p HD video for instance) equals maximum experience for subscriber and therefore try and capture as much network resource as possible to deliver it. Browser / apps / phone manufacturers also assume that more speed equals better user experience, therefore try to commandeer as much capacity as possible. The flaw here is the assumption that the optimum is the product of many maxima self regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

An OTT cannot know why a user’s session downstream speed is degrading; it can just report it. Knowing why is important because it enables better decisions in terms of the corrective actions that need to be undertaken to preserve the user’s experience. For instance, a reduction of bandwidth for a particular user can be the result of a handover (4G to 3G or cells with different capacity), of congestion in a given cell, of the distance between the phone and the antenna, of the user entering a building or an elevator, or of the user reaching her data cap and being throttled, etc. Reasons can be multiple and, for each of them, a corrective action can have a positive or a negative effect on the user’s experience. For instance, in a video streaming scenario, you can have a group of people in a given cell streaming Netflix and others streaming YouTube. Naturally, the video is streamed in progressive download adaptive bit rate format, which means that the stream will try to increase to the highest available download bit rate to deliver the highest video definition possible. All sessions will theoretically increase the delivered definition up to the highest available or the highest delivery bit rate available, whichever comes first. In a network with much capacity, everyone ramps up to 1080p and everyone has a great user experience.

More often than not, though, that particular cell cannot accommodate everyone’s stream at the highest definition at the same time. Adaptive bit rate is supposed to help there again, by stepping down definition until it fits within the available delivery bit rate. It unfortunately can’t work like that when we are looking at multiple sessions from multiple OTTs. Specifically, as soon as one player starts reducing its definition to meet lower bit rate delivery, that freed-up bandwidth is grabbed by other players, which can now look at increasing their definition even more. There is no incentive for a content provider to reduce bandwidth quickly to follow network conditions, because it can become starved by its competition in the same cell.
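The dynamic described above can be sketched in a few lines of simulation. This is a toy model with made-up numbers (a fixed cell capacity, an equal split of leftover capacity as a crude stand-in for TCP fairness, and a simple bitrate ladder), not a reproduction of any real player's rate-adaptation logic, but it shows why the player that steps down first simply donates bandwidth to the players that do not:

# Toy model: three ABR players share a cell. Each greedily picks the highest rung of the
# bitrate ladder that fits what it sees as its available share. A "polite" player caps
# itself one rung below what it could grab. All numbers are illustrative.
LADDER = [0.6, 1.5, 3.0, 6.0]  # Mbps rungs, roughly 360p / 480p / 720p / 1080p

def pick(share_mbps):
    fitting = [r for r in LADDER if r <= share_mbps]
    return fitting[-1] if fitting else LADDER[0]

def simulate(capacity_mbps, players, polite=(), rounds=5):
    rates = {p: LADDER[0] for p in players}
    for _ in range(rounds):
        for p in players:
            others = sum(rates[q] for q in players if q != p)
            rate = pick(max(capacity_mbps - others, 0))
            if p in polite and LADDER.index(rate) > 0:
                rate = LADDER[LADDER.index(rate) - 1]   # voluntarily step down one rung
            rates[p] = rate
    return rates

print(simulate(10, ["player_A", "player_B", "player_C"]))
# -> whoever adapts first grabs 1080p; the last one is stuck at 360p.
print(simulate(10, ["player_A", "player_B", "player_C"], polite={"player_A"}))
# -> the polite player ends up starved at the bottom rung while the others feast.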

The solution here is simple, the delivery of ABR video content has to be managed and coordinated between all providers. The only way and place to provide this coordination is in the mobile network, as close to the radio resource as possible. [...]

This and more in my upcoming Mobile Edge Computing report.


Part I: What is ABR?
Part II: For ABR
Part III: Why isn't ABR more successful?
Part IV: Alternatives

Tuesday, January 26, 2016

2015 review and 2016 predictions

As is now customary, I try to grade what I was predicting for 2015 and see what panned out and what didn't. I'll share as well what I see for 2016.

Content providers, creators, aggregators:

"They will need to simultaneously maximize monetization options by segmenting their user base into new price plans and find a way to unlock value in the mobile market.While many OTT, particularly social networks and radio/ audio streaming have collaborated and signed deals with mobile network operators, we are seeing also a tendency to increasingly encrypt and obfuscate online services to avoid network operators meddling in content delivery." 
On that front, I think that both predictions held true. I was envisioning encryption to jump from 10 to 30% of overall data traffic and I got that wrong, at least in many mature markets, where Netflix is big in mobile, we see upwards of 50% of traffic being encrypted. I still claim some prediction here, with one of my first post indicating the encryption trend 2 years before it started in earnest.

The prediction about pricing segmentation as OTT services mature has also been largely fulfilled, with YouTube's 4th attempt, by my count, to launch a paid service. Additionally, the trend of content aggregators investing in original content rights acquisition is accelerating, with Amazon gearing up for movie theaters and Netflix outspending traditional providers such as the BBC, with a combined investment by both companies estimated in the $9Bn range. Soon, we'll be talking real money.


In 2016, we will see an acceleration of traditional digital services that were originally launched for fixed line internet transitioning to predominantly mobile or mobile-only plays. Right now, 47% of Facebook users access the service exclusively through mobile and account for 78% of the company's revenue. More than 50% of YouTube views are on mobile devices and the corresponding revenue growth is over 100% year on year. 49% of Netflix's 18-to-34-year-old demographic watches the service on mobile devices. We have seen signs, with Twitter's Vine and Periscope as well as Spotify, MTV and Facebook, that the battlefield will be video services.


Network operators: Wholesaler or value providers?

The operators in 2016 are still as confused, as a community, as in 2015. They perceive threats from each other, which causes many acquisitions; from OTTs, which causes in equal measure many partnerships and ill-advised service launches; and from regulatory bodies, which causes lawyers to fatten up at the net neutrality / privacy buffet.
"we will see both more cooperation and more competition, with integrated offering (OTT could go full MVNO soon) and encrypted, obfuscated traffic on the rise". 
We spoke about encryption; the OTT going full MVNO was somewhat fulfilled by Google's disappointing Project Fi launch. On the cooperation front, we have seen a flurry of announcements, mostly centered around sponsored data or zero-rated subscription services from Verizon and AT&T.
"We will probably also see the first lawsuits from OTT to carriers with respect to traffic mediation, optimization and management. " 
I got that half right. No lawsuits from content providers, but heavy fines from regulators on operators who throttle, cap or prioritize content (Sprint, AT&T, ...).

As for digital service providers, network operators are gearing themselves up to compete on video services, with offerings such as mobile TV / LTE broadcast (AT&T, EE, Telekom Slovenije, Vodafone), event streaming (China Telecom) and sponsored data / zero-rated subscription services (Verizon, T-Mobile Binge On, Sprint, AT&T, Telefonica, ...).

"Some operators will seek to actively manage and mediate the traffic transiting through their networks and will implement HTTPS / SPDY proxy to decrypt and optimize encrypted traffic, wherever legislation is more supple."
I got that dead wrong. Despite interest and trials, operators are not ready to go into open battle with OTTs just yet. Decrypting encrypted traffic is certainly illegal in many countries, or at the very least hostile, and seems to be expected only from government agencies...



Mobile Networks Technology

"CAPEX will be on the rise overall with heterogeneous networks and LTE roll-out taking the lion share of investments. LTE networks will show signs of weakness in term of peak traffic handling mainly due to video and audio streaming and some networks will accelerate LTE-A investments or aggressively curb traffic through data caps, throttles and onerous pricing strategies."
Check and check.
"SDN will continue its progress as a back-office and lab technology in mobile networks but its incapacity to provide reliable, secure, scalable and manageable network capability will prevent it to make a strong commercial debut in wireless networks. 2018 is the likeliest time frame."
I maintain the view that SDN is still too immature for mass deployment in mobile networks, although we have seen encouraging trials moving from lab to commercial, we are still a long way from a business case and technology maturity standpoint before we see a mobile network core or RAN running exclusively or mostly on SDN.
"NFV will show strong progress and first commercial deployments in wireless networks, but in vertical, proprietary fashion, with legacy functions (DPI, EPC, IMS...) translated in a virtualized environment in a mono vendor approach. "
We have seen many examples of that this year with various levels of industry and standard support from Connectem, Affirmed Networks, Ericsson, Cisco and Huawei.

"Orchestration and integration with SDN will be the key investments in the standardization community. The timeframe for mass market interoperable multi vendor commercial deployment is likely 2020."
Orchestration / MANO has certainly driven many initiatives (Telefonica OpenMANO) and acquisitions (Ciena acquired Cyan, for example) and remains the key challenge in 2016 and beyond. SDN / NFV will not take off unless there is a programmatic framework to link customer-facing services to internal services, to functions, to virtual resources and to hardware resources in a multi-vendor fashion. I still maintain 2020 as the probable target for this.

In 2016, the new bit of technology I will investigate is Mobile Edge Computing: the capacity to deploy COTS hardware in the radio network, allowing virtualized services to be positioned at the network's edge and enabling IoT, automotive, augmented reality or virtual reality services that require minimal latency to access content even faster.


In conclusion, 2016 shows more than ever signs that the house of cards is about to collapse. Data traffic is increasing fast, video now dominates every network and it is just getting started. With 4K and then 8K around the corner, not to mention virtual or augmented reality, many of the players in the value chain understand that video is going to be the next few years' battlefield in mobile, OTT and cloud services. This is why we are seeing so much concentration and so many pivot strategies in the field. 

What is new is the fact that, where mobile was a secondary concern or barely on the radar for many so-called OTTs, it has now emerged as the predominant if not exclusive market segment in revenue. 
This means that more pressure will rain down on network operators to offer bandwidth and speed. My reports and workshops show that mobile advertising is not growing fast enough in comparison to the subscribers' eyeballs moving to mobile screens. This is mostly due to the fact that video in mobile networks is a pretty low quality service, which will get worse as more subscribers transition to LTE. The key to unlocking the value chain will be collaboration between operators and OTTs, and that will only happen if/when a profitable business model and apportioning of costs is worked out.

At last, my prediction about selfie kills seems, unfortunately, to have been fulfilled, with selfies now killing more people than shark attacks. Inevitably, we have to conclude that in 2016, commercial drones and hoverboards will kill more people than selfies...


That's all folks, see you at MWC next month.

Monday, December 21, 2015

Bytemobile: what's next?

Following the brutal announcement of Bytemobile's product line discontinuation by Citrix, things are starting to get a little clearer in terms of what the potential next steps could be for their customers.

Citrix was the market leader in terms of number of deployments and revenue in the video optimization market when it decided to kill this product offering due to an internal strategic realignment. The news left many customers confused as to what support, if any, they can expect from the company.

Citrix' first order of action over the last month has been to meet with every major account to reassure them that the transition will follow a plan. What transpires at this point in time is that a few features from the ByteMobile T-3100 product family will be migrated to NetScaler, probably towards the end of 2016. Citrix is still circling the wagons at this stage and seems to be trying to evaluate the business case for the transition, which will condition the number of features migrated and the capacity to reach feature parity.

In many cases, network operators who have deployed versions of ByteMobile T-3100 have been put on notice to upgrade to the latest version, as older versions will see end of support notices going out next year.

Concurrently, presumably, Citrix won't be able to confirm NetScaler's detailed roadmap and transition plan until it has a better idea of the number and type of customers that will elect to migrate.

In the meantime, ByteMobile's historical competitors are drawing battle plans to take advantage of this opportunity. A forklift upgrade is never an easy task to negotiate and, no doubt, there will be much pencil sharpening in the new year in core networks procurement departments.

The video optimization market has changed dramatically over the last year. The growth in encrypted traffic, the uncertainty surrounding Citrix and the net neutrality debate have changed the feature set operators have been looking for.
Real-time transcoding orders have dropped severely because of costs and encryption, while TCP optimization, encrypted traffic analytics, video advertising and adaptive bit rate management are gaining increasing favor.

The recent T-Mobile USA "Binge On" offering, providing managed video for premium services, is also closely followed by many network operators and will in all likelihood create more interest in video management collaboration solutions.

As usual, this and more in my report on video monetization.

Tuesday, March 31, 2015

Net neutrality... so what?


[...] 
In the US, on February 26, days before the Mobile World Congress, the Federal Communications Commission released a declaratory ruling on “protecting and promoting the open internet”. The reclassification of fixed and mobile network services as Title II telecom services by the FCC means in substance that network operators will be prevented from blocking, throttling and prioritizing traffic and will have to be transparent in the way their traffic management rules are applied. This is essentially due to an earlier ruling from the DC Circuit, Verizon v. FCC, that struck down the FCC’s rules against blocking and traffic discrimination but remarked that “broadband providers represent a threat to Internet openness and could act in ways that would ultimately inhibit the speed and extent of future broadband deployment.”

It is a real issue that broadband providers in this case are exclusively network operators, and not OTT providers, who have, in my mind, the same capacity and a similar track record in that matter. The FCC tried to provide “more broadband, better broadband and open broadband” and in its haste has singled out one party in the ecosystem, essentially condemning network operators to a utility model. This nearsightedness is unlikely to last, as several companies have already decided to challenge it. Less than a month after its publication, the order is being challenged in court by the United States Telecom Association, a lobbying group representing broadband and wireless network operators, as well as by Alamo, a broadband provider in Louisiana. There is no doubt that legal proceedings will occupy and fatten lawyers on both sides for years to come.

In Europe, the net neutrality debate is also far from settled. After the European Commission seemed to take a no-throttling, no-blocking, no-prioritization stance in its “Digital Single Market” initiative, network operators, through their lobbying arm ETNO (European Telecommunications Network Operators’ association), started to challenge these provisions at the country level. Since the European Commission has not yet passed a law on the subject, the likelihood of a strong net neutrality stance will depend on support from each nation. In November 2014, compromises in the form of “non-discriminatory and proportionate” plans were discussed. The result is that net neutrality is still very much a moving target, with a lot of effort being expended to enable a managed internet experience, with a fast lane and a best-effort lane. The language and ideas surrounding net neutrality are very vague, suggesting either a great lack of technical expertise or a reluctance to provide enforceable guidance (or both). It is more likely that countries at their individual level will start passing laws to regulate some aspects of traffic management until a consensus is found at the European level.


In conclusion, there is obviously much debate over net neutrality globally, with many emotional, commercial, technical implications. There is at this stage no evidence of any regulatory authority having a good grasp of both the technical and commercial realities today to make a fair and enforceable ruling. As a result, politics, public sentiments, lobbying and lawyers will dictate the law for the next 5 years. In the meantime, it is likely that loopholes will be found and that collaborative approaches will show a lucrative business model that is likely to make the whole debate obsolete.

More analysis on traffic encryption, mobile advertising, data, video, mobile and media trends in  "Mobile video monetization 2015". 

Tuesday, March 10, 2015

Mobile video 2015 executive summary

As is now traditional, I return from Mobile World Congress with a head full of ideas and views on market evolution, fueled by dozens of meetings and impromptu discussions. The 2015 mobile video monetization report, now in its fourth year, reflects the trends and my analysis of the mobile video market, its growth, opportunities and challenges.

Here is the executive summary from the report to be released this month.

2014 has been a year of contrasts for deployments of video monetization platforms in mobile networks. The market has grown in deployments and value, but an unease has gripped some of its protagonists, forcing exits and pivot strategies, while players with new value propositions have emerged. This transition year is due to several factors.

On the growth front, we have seen the emergence of MVNOs and interconnect / clearing houses as a buying target, together with the natural turnover and replacement of now aging and fully amortized platforms deployed 5/6 years ago.

Additionally, the market leaders upgrade strategies have naturally also created some space for challengers and new entrants. Mature markets have seen mostly replacements and MVNO green field deployments, while emerging markets have added new units in markets that are either too early for 3G or already saturated in 4G. Volume growth has been particularly sustained in Eastern / Central Europe, North Africa, Middle East and South East Asia.

On the other hand, the emergence and growth of traffic encryption, coupled with the persistent legal and regulatory threats surrounding the net neutrality debate, have cooled down, delayed and in some cases shut down optimization projects, as operators try to rethink their options. Western Europe and North America have seen a marked slowdown, while South America is just about starting to show interest.

The value of the deals has been in line with last year, after sharp erosion due to the competitive environment. The leading vendors have consolidated their approach, taken on new strategies and overall capitalized on their installed base, while many new deals have gone to new entrants and market challengers.

2014 has also been the first year of a commercial public cloud deployment, which should be followed soon by others. Network function virtualization has captivated many network operators’ imagination and science experiment budget, which has prompted the emergence of the notion of traffic classification and management as a service.

Video streaming, specifically, has shown great growth in 2014, consolidating its place as the fastest growing service in mobile networks and digital content altogether. 2014 and early 2015 have seen many acquisitions of video streaming, packaging and encoding technology companies. What is new, however, is that a good portion of these acquisitions were performed not by other technology companies but by OTTs such as Facebook and Twitter.

Mobile video advertising is starting to become a “thing” again, as investments, inventory and views show triple-digit growth. The trend shows mobile video advertising possibly becoming the single largest revenue opportunity for mobile operators within a 5-year timeframe, but its implementation demands a change in attitude, organization and approach that is alien to most operators' DNA. The transformation, akin to a heart transplant, will probably leave many dead on the operating table before the graft takes and the technique is refined, but they might not have much choice, looking at Google's and Facebook's announcements at Mobile World Congress 2015.

Will new technologies such as LTE Multicast, for instance, which are due to make their start in earnest this year, promising quality assured HD content, via streaming or download, be able to unlock the value chain? 


The mobile industry is embattled and finds itself facing some great threats to its business model. As the saying goes, those who will survive are not necessarily the strongest, but rather those who will adapt the fastest.

Wednesday, January 14, 2015

2014 review and 2015 predictions

Last year, around this time, I had made some predictions for 2014. Let's have a look at how I fared and I'll risk some opinions for 2015.
Before predictions, though, new year, new web site, check it out at coreanalysis.ca

Content providers, creators, aggregators:

"OTT video content providers are reaching a stage of maturity where content creation / acquisition was the key in the first phase, followed by subscriber acquisition. As they reach critical mass, the game will change and they will need to simultaneously maximize monetization options by segmenting their user base into new price plans and find a way to unlock value in the mobile market." 
On that front, content creation / acquisition still remains a key focus of large video OTTs (see Netflix's launch of Marco Polo for $90m). Netflix has reported $8.9B of content obligations as of September 2014. On the monetization front, we have also seen signs of maturity, with YouTube experimenting with new premium channels and Netflix charging a premium for 4K streaming. HBO has started to break out of its payTV shell and has signed deals to be delivered as online, broadband-only subscriptions, without cable/satellite.
Netflix has signed a variety of deals with European MSOs and broadband operators as it launched there in 2014.
While many OTT, particularly social networks and radio/ audio streaming have collaborated and signed deals with mobile network operators, we are seeing also a tendency to increasingly encrypt and obfuscate online services to avoid network operators meddling in content delivery.
Both trends will likely accelerate in 2015, with more deals being struck between OTTs and network operators for subscription-based zero-rated data services. We will also see in mobile networks the proportion of encrypted data traffic rise from the low 10s to at least 30% of the overall traffic.

Wholesaler or Value provider?


The discussion about the place of the network operator and MSO in content and service delivery is still very much active. We have seen, late last year, the latest net neutrality sword rattling from network operators and OTTs alike, with even politicians entering the fray and trying to influence the regulatory debates. This will likely not be settled in 2015. As a result, we will see both more cooperation and more competition, with integrated offerings (OTTs could go full MVNO soon) and encrypted, obfuscated traffic on the rise. We will probably also see the first lawsuits from OTTs against carriers with respect to traffic mediation, optimization and management. This adversarial climate will further delay monetization plays relying on mobile advertisement. Only integrated offerings between OTTs and carriers will be able to avail themselves of this revenue source.
Some operators will step away from the value provider strategy and will embrace wholesale models, trying to sign as many MVNO and OTT as possible, focusing on network excellence. These strategies will fail as the price per byte will decline inexorably, unable to sustain a business model where more capacity requires more investment for diminishing returns.
Some operators will seek to actively manage and mediate the traffic transiting through their networks and will implement HTTPS / SPDY proxy to decrypt and optimize encrypted traffic, wherever legislation is more supple.

Mobile Networks

CAPEX will be on the rise overall with heterogeneous networks and LTE roll-out taking the lion share of investments. 
LTE networks will show signs of weakness in term of peak traffic handling mainly due to video and audio streaming and some networks will accelerate LTE-A investments or aggressively curb traffic through data caps, throttles and onerous pricing strategies.

SDN will continue its progress as a back-office and lab technology in mobile networks but its incapacity to provide reliable, secure, scalable and manageable network capability will prevent it from making a strong commercial debut in wireless networks. 2018 is the likeliest time frame.

NFV will show strong progress and first commercial deployments in wireless networks, but in vertical, proprietary fashion, with legacy functions (DPI, EPC, IMS...) translated in a virtualized environment in a mono vendor approach. We will see also micro deployments in emerging markets where cost of ownership takes precedence over performance or reliability. APAC will also see some commercial deployments in large networks (Japan, Korea) in fairly proprietary implementations.
Orchestration and integration with SDN will be the key investments in the standardization community. The timeframe for mass market interoperable multi vendor commercial deployment is likely 2020.

To conclude this post, my last prediction is that someone will likely be bludgeoned to death with their own selfie stick, I'll put my money on Mobile World Congress 2015 as a likely venue, where I am sure countless companies will give them away, to the collective exasperation and eye-rolling of the Barcelona population.

That's all folks, see you soon at one of the 2015 shows.

Monday, October 27, 2014

HTTP 2.0, SPDY, encryption and wireless networks

I had mused, three and a half years ago, at the start of this blog, that content providers might decide to encrypt and tunnel traffic in the future in order to retain control of the user experience.

It is amazing that while wireless browsing is increasingly becoming the medium of choice for access to the internet, the technology it relies on is still designed for fixed, high capacity, lossless, low latency networks. One would think that one would design a technology for its primary (and most challenging) use case and adapt it for more generous conditions, instead of the other way around... but I am ranting again.

We are now definitely seeing this prediction accelerate since Google introduced SPDY and proposed it as default for HTTP 2.0.
While HTTP 2.0 latest draft is due to be completed this month, many players in the industry are silently but definitely committing resources to the battle.

SPDY, in its current version, does not enhance and in many cases decreases user experience in wireless networks. Its reliance on TCP leaves it too dependent on round-trip time, which in turn creates race conditions in lossy networks. SPDY can actually contribute to congestion rather than reduce it in wireless networks.
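One way to see why round-trip time matters so much is the classic Mathis et al. approximation for steady-state TCP throughput, roughly MSS / (RTT x sqrt(loss)). The quick sketch below uses assumed, illustrative RTT and loss figures, not measurements, but it shows how fast a single multiplexed connection degrades on a lossy radio link:

# Mathis et al. approximation of steady-state TCP throughput:
#   throughput ~ MSS / (RTT * sqrt(loss_probability))
# RTT and loss values below are illustrative assumptions, not measurements.
from math import sqrt

MSS_BYTES = 1460

def tcp_throughput_mbps(rtt_ms, loss):
    return (MSS_BYTES * 8) / ((rtt_ms / 1000) * sqrt(loss)) / 1e6

for rtt_ms, loss, label in [
    (20, 0.0001, "fixed broadband"),
    (60, 0.001,  "good LTE cell"),
    (120, 0.01,  "loaded 3G cell"),
]:
    print(f"{label:>15}: ~{tcp_throughput_mbps(rtt_ms, loss):5.1f} Mbps")
# A single multiplexed SPDY connection inherits this sensitivity in full:
# one loss event stalls every stream carried inside it (TCP head-of-line blocking).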

On one side, content providers are using net neutrality arguments to further their case for the need for encryption. They are conflating security (NSA leaks...), privacy (Apple cloud leaks...) and net neutrality (equal, and if possible free, access to networks) concerns.

On the other side, network operators and vendors are trying to argue that net neutrality does not mean not intervening, and that the good of the overall user base is subverted when some content providers and browser/client vendors use aggressive and predatory tactics to monopolize bandwidth in the name of QoE.

At this point, things are still fairly fluid. Google is proposing that most / all traffic be encrypted by default, while network operators are trying to introduce the concept of trusted proxies that can decrypt / encrypt under certain conditions and with the user's assent.

Both these attempts are short-sighted and doomed to fail in my mind and are the result of aggressive strategies to establish market dominance.

In a perfect world, the device, network and content provider negotiate service quality based on device capabilities, subscriber data plan, network capacity and content quality. Technologies such as adaptive bit rate could have been tremendously efficient here, but the operating word in the previous sentence is "negotiate", which assumes collaboration, discovery and access to relevant information to take decisions.

In the current state of affairs, adaptive bit rate is often corrupted in order to seize as much network bandwidth as possible, which results in devices and service providers aggressively competing for bits and bytes.
Network operators tend to try to improve or control user experience by deploying DPI, transparent caches, pacing technology, traffic shaping engines, video transcoding, etc.

Content providers assume that highest quality of content (HD for video for instance) equals maximum experience for subscriber and therefore try and capture as much network resource as possible to deliver it. Browser / apps / phone manufacturers also assume that more speed equals better user experience, therefore try to commandeer as much capacity as possible. The flaw here is the assumption that the optimum is the product of many maxima self regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through these networks.

This behaviour leads to a network where all resources are perpetually in contention and all end-points vie for priority and maximum resource allocation. From this perspective one can understand that there is no such thing as "net neutrality" at least not in wireless networks. When network resources are over-subscribed, decisions are taken as to who gets more capacity, priority, speed... The question becomes who should be in position to make these decisions. Right now, the laissez-faire approach to net neutrality means that the network is not managed, it is subjected to traffic. When in contention, resources are managing traffic based on obscure rules in load balancers, routers, base stations, traffic management engines... This approach is the result of lazy, surface thinking. Net neutrality should be the opposite of non intervention. Its rules should be applied equally to networks, devices / apps/browsers and content providers if what we want to enable is fair and equal access to resources.

Now, who said access to wireless should be fair and equal? Unless the networks are nationalized and become government assets, I do not see why private companies, in a competitive market couldn't manage their resources in order to optimize their utilization.

If we transport ourselves into a world where all traffic becomes encrypted overnight, networks lose the ability to manage traffic beyond allowing or blocking it and attaching high-level QoS metrics to specific services. That would lead to network operators being forced to charge exclusively for traffic. At this point, everyone has to pay per byte transmitted. The cost to users would become prohibitive as more and more video of higher resolution flows through the networks. It would also mean that these video providers could asphyxiate the other services... More importantly, it would mean that the user experience would become the fruit of the fight between content providers' abilities to monopolize network capacity, which would go against any "net neutrality" principle. A couple of content providers could dominate not only services but the access to these services as well.

The best rationale against this scenario is commercial. Advertising is the only common business model that supports pay TV and many web services today. The only way to have an efficient, high-CPM ad model in wireless is to make it relevant and contextual. The only way that is going to happen is if the advertising is injected as close to the user as possible. That means collaboration. Network operators cannot provide subscriber data to third parties, so they have to exploit and anonymize it themselves. This means encryption, if needed, must occur after ad insertion, which needs to occur at the network edge.
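A rough piece of arithmetic shows why the CPM matters so much here. The figures below (ad impressions per user and CPM levels for untargeted versus contextual inventory) are illustrative assumptions, not market data:

# Rough ad-revenue arithmetic; every input is an illustrative assumption.
# CPM = cost per thousand impressions. Contextual, targeted video inventory is
# assumed to command a much higher CPM than untargeted banners.

def monthly_ad_revenue_per_user(ads_per_day, cpm_usd):
    return ads_per_day * 30 / 1000 * cpm_usd

untargeted = monthly_ad_revenue_per_user(ads_per_day=20, cpm_usd=2)    # ~$1.20 per user per month
contextual = monthly_ad_revenue_per_user(ads_per_day=20, cpm_usd=25)   # ~$15 per user per month
print(f"untargeted: ${untargeted:.2f} / user / month, contextual: ${contextual:.2f} / user / month")
# Only the contextual figure approaches the ARPU of a data plan, which is why
# insertion close to the user, where the context lives, is the interesting battleground.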

The most commercially efficient model for all parties involved is collaboration and advertising, but current battle plans show adversarial models, where obfuscation and manipulation are used to reduce opponents' margin of maneuver. Complete analysis and scenarios in my video monetization report here.

Friday, July 4, 2014

Q2 multiscreen video news

I use a service to curate and collate my news. Reading through the last few months, I realized that there are so many subjects worthy of comment that a single post wouldn't begin to address them meaningfully. I reserve in-depth analysis of specific trend or topic for my paying clients, so I decided to review and comment on press clippings and announcements as they become available as a way to illustrate the trends, threats and opportunities surrounding our market.
Here is what caught my attention in the last quarter:

Technology: Is 4K the new 3D?

April, of course, is synonymous with NAB frenzy. Sifting through the trough of announcements at the show, I have noticed a sharp change of direction in vendors' announcements and claims from last year. Where 2013 was all about HEVC H.265, this year seems to be about 4K. While HEVC licensing terms were agreed and announced by MPEG LA in February, Google's royalty-free VP9 has captured some support as well, forcing chipset and platform vendors to contemplate fragmentation and multi-codec support. Obviously, the battle for codec and protocol will determine who controls the management and delivery of 4K content going forward. In this race, not surprisingly, YouTube is siding with its parent company with VP9 support, while Netflix is adopting H.265. Both companies agree, though, and are adamant, that 4K is a lot easier to manage and deliver for OTT properties than for traditional broadcast payTV providers. Netflix forecasts the mass market for 4K to be five years out at the current rate of TV replacements. My opinion is that 4K adoption will suffer from H.265/VP9 fragmentation. We will probably see further delays because of the cost of implementing a dual protocol stack throughout the delivery chain.

Technology: Cloud, SDN, NFV

At NAB as well, vendors were eager to show off their new acronyms, touting dreams of cloud-based, virtualized, self-managed, software-defined networks that would… In reality, most MSOs are still focusing on rolling out HD, improving and automating workflows and overall cost reduction. I think we still have 5 years to go before seeing practical, mature implementations of SDN in professional video. Anything else is a science experiment or a proprietary implementation at this point.

Business: MSO to OTT

One of the big news items was the announcement from AT&T regarding their intent to invest, jointly with the Chernin Group, up to $500 million to create SVOD and advertising-based web streaming services. Umm... Is it too much or not enough? $500 million goes a long way if you want to build a web streaming service, but it does not seem nearly enough if you want to build an attractive content offering.
HBO, the next day, was reported to have signed a multi-year agreement with Amazon. The deal should see some of HBO’s back catalogue series made available to Amazon Prime subscribers. Little by little, HBO nudges the boundaries. You will remember that it signed a deal with Comcast last year to offer HBO Go to Comcast broadband subscribers, without a cable subscription. All signs point to HBO becoming a major league OTT provider when it is ready to cross over.

Business: OTT to Wireless

Almost coincidentally, rumours emerged that Netflix was in discussions with the Vodafone Group to distribute Netflix services on some Vodafone subscriptions. It is likely that these deals will increase in frequency. LTE / 4G will see opportunities for cord-nevers and cord-shavers to access their favourite services and content on cellular networks. That is… if they figure out the charging model (paying $8 a month for Netflix and $150 in data overage charges to Vodafone wouldn't really work).

Business: OTT to MSO

Netflix has integrated its offering into Atlantic Broadband, Grande Communications and RCN Telecom Services set-top boxes, a first in the US after having piloted the concept in Europe. Subscribers will be able to select the service from their payTV provider. It is an interesting strategy for small MSOs to bundle Netflix in hybrid set-top boxes. It increases reach, provides an attractive offering and good differentiation against market leaders.

Business: M&A

Kaltura bought TVinci to expand its SVOD offering to live and linear programming. Arris bought SeaWell Networks for its edge advertising insertion and packaging technology. SeaWell Networks’ strong adaptive bit rate streaming skill set will be invaluable in expanding the company’s multiscreen strategy.

That’s all folks for this quarter! I will keep all the good net neutrality commentary for next month, hopefully when the smoke dissipates from the PR battlefield.


Tuesday, July 1, 2014

Mobile network 2030





It is summer, nice and warm. England and Italy are out of the world cup, France will beat Germany on Friday, then Brazil and Argentina in the coming weeks to obtain their second FIFA trophy. It sounds like a perfect time for a little daydreaming and telecom fiction...

The date is February 15, 2030

The Mobile World Congress is a couple of weeks away and has returned to Cannes, as the attendance, and indeed the investments in what used to be mobile networks, have shrunk drastically over the last few years. Gone are the years of opulence and extravagant launches in Barcelona; the show now looks closer to a medium-sized textile convention than the great mass of flashy technology and gadgets it used to be in its heyday. 

When did it start to devolve? What was the signal that killed what used to be a trillion-dollar industry in the 90's and early 2000's? As usual, there is not one cause but rather a convergence of events that took on a momentum that few saw coming and fewer tried to stop. 

Net neutrality was certainly one of these events. If you remember, back in 2011, people started to realize the level of penetration fixed and wireless networks were exposed to from legal and illegal interception. Following the various NSA scandals, public pressure mounted to protect digital privacy. 
In North America, the battle between net neutrality's proponents and opponents was fierce, eventually leading to a status quo of sorts, with many content providers and network operators in an uneasy collaborative dynamic. Originally, content providers unwilling to pay for traffic delivery in wireless networks attempted to secure a superior user experience by implementing increasingly bandwidth-hungry apps. When these started to come into contention for network resources, carriers started to step in and aggressively throttle, cap or otherwise "optimize" traffic. In reaction, premium content providers moved to an encrypted traffic model as a means to obfuscate traffic and prevent interception, mitigation and optimization by carriers. Soon enough, though, the costs and latency added by encryption proved impractical. Furthermore, some carriers started to throttle and cap all traffic equally, claiming to adhere to the letter of net neutrality, which ended up having a terrible effect on user experience. In the end, cooler heads prevailed and content providers and carriers created integrated video networks, where transport, encryption and ad insertion were performed at the edge, while targeting, recommendation and fulfillment ended up in the content provider's infrastructure. 

In Europe, content and service providers saw at the same time "net neutrality" as the perfect excuse to pressure political and regulatory organizations to force network providers to deliver digital content unfiltered and un-prioritized, at best possible effort. The result ended up being quite disastrous, as we know: with content being produced mostly outside Europe and encrypted, operators became true utility service providers. They discovered overnight that their pipes could become even dumber than they already were.

Of course, the free voice and texting services launched by some of the new-entrant 5G licensees in the 2020s accelerated the trend and the nationalization of many of the pan-European network operator groups.

The transition was relatively easy, since many had transcended to fully virtual networks and contracted ALUSSON, the last "European" telecom equipment manufacturer, to manage their networks. After the operators had collectively spent over 100 billion euros to virtualize them in the first place, ALUSSON emerged as the only clear winner of the cost benefits brought by virtualization. 
Indeed, virtualization was attractive and very cost effective on paper but proved very complex and organizationally intensive to implement in the end. Operators had miscalculated their capacity to shift their workforce from telecom engineering to IT when they found out that the skill set to manage their networks had always been in the vendors' hands. Few groups were able to massively retool their workforce, if you remember the great telco strikes of 2021-2022.
In the end, most ended up contracting and transitioning their assets to their network vendor. Obviously, liberated from the task of managing their networks, most were eager to launch new services, which was one of the initial rationales for virtualization. Unfortunately, they found out that service creation was much better implemented by small, agile, young entrepreneurial structures than by large, unionized, middle-aged ones... With a couple of notable exceptions, broadband networks were written off as broadband access was written into European countries' constitutions, and networks were aggregated at the pan-European level to become pure utilities when they were not downright nationalized.

Outside Europe and North America, Goopple and HuaTE dominate, after voraciously acquiring licenses in emerging countries ill-equipped to weigh the long-term value of these licenses against the free network infrastructure these companies provided. The launch of their proprietary SATERR (Satellite Aerial Terrestrial Relay) technology proved instrumental in creating the first fully vertical service / network / content / device conglomerates.  

Few were the operators who have been able to discern the importance of evolving their core asset "enabling communication" into a dominant position in their market. Those who have succeeded share a few common attributes:

They realized first that their business was not about counting calls, bytes or texts but about enabling communication. They started to think in terms of services rather than technology and understood that the key was in service enablement. Understanding that services come and go and die in a matter of months in the new economy, they strove not to provide the services themselves but to create the platform to enable them.

In some cases, they transitioned to full advertising and personal digital management agencies, harnessing big data and analytics to enrich digital services with presence, location, preference, privacy and corporate awareness. This required many organizational changes, but as it turned out, marketing analysts were much easier and more cost effective to recruit than network and telecom engineers. Network management became the toolset, not the vocation. 

In other cases, operators became abstraction layers, enabling content and service providers to better target, advertise, aggregate, obfuscate, disambiguate and contextualize physical and virtual communication between people and machines.

In all cases they understood that the "value chain" as they used to know it and the consumer need for communication services was better served by an ever changing ecosystem, where there was no "position of strength" and where coopetition was the rule, rather than the exception.