Monday, October 27, 2014

HTTP 2.0, SPDY, encryption and wireless networks

I had mused, three and a half years ago at the start of this blog, that content providers might decide to encrypt and tunnel traffic in order to retain control of the user experience.

It is remarkable that wireless browsing is increasingly becoming the medium of choice for internet access, yet the technology it relies on is still designed for fixed, high-capacity, lossless, low-latency networks. One would think that a technology would be designed for its primary (and most challenging) use case and then adapted to more generous conditions, rather than the other way around... but I am ranting again.

We are now seeing this prediction play out, and accelerate, since Google introduced SPDY and proposed it as the basis for HTTP 2.0.
While the latest HTTP 2.0 draft is due to be completed this month, many players in the industry are quietly but resolutely committing resources to the battle.

SPDY, in its current version, does not enhance, and in many cases decreases, user experience in wireless networks. Its reliance on a single TCP connection leaves it too dependent on round-trip time, which in turn creates race conditions in lossy networks. SPDY can actually contribute to congestion in wireless networks rather than reduce it.
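As a back-of-the-envelope illustration of that sensitivity to round-trip time and loss, the sketch below applies the well-known Mathis et al. approximation of steady-state TCP throughput. The packet size, RTT and loss figures are assumptions meant to be typical of the era, not measurements; the point is that because SPDY multiplexes every request over one TCP connection, this single-connection ceiling applies to the whole page load.

```python
# Back-of-the-envelope sensitivity of a single TCP connection (as used
# by SPDY) to wireless conditions, using the Mathis et al. steady-state
# approximation: throughput ~ (MSS / RTT) * (1.22 / sqrt(loss)).
import math

def tcp_throughput_mbps(mss_bytes, rtt_ms, loss_rate):
    """Approximate achievable TCP throughput, in Mbit/s."""
    rtt_s = rtt_ms / 1000.0
    rate_bps = (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss_rate))
    return rate_bps / 1e6

# Fixed-line conditions: short RTT, nearly lossless.
print(tcp_throughput_mbps(1460, 20, 0.0001))  # ~71 Mbit/s
# Assumed cellular conditions: long RTT, 1% loss.
print(tcp_throughput_mbps(1460, 150, 0.01))   # ~0.95 Mbit/s
```

The same formula also shows why HTTP/1.1's several parallel connections can, perversely, fare better on a lossy link: a loss on one connection stalls only that connection, while a loss on SPDY's single pipe stalls every multiplexed stream behind it.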

On one side, content providers are using net neutrality arguments to further their case for encryption. They conflate concerns about security (the NSA leaks...), privacy (the Apple iCloud leaks) and net neutrality (equal, and if possible free, access to networks).

On the other side, network operators and vendors argue that net neutrality does not mean non-intervention, and that the good of users overall is subverted when some content providers and browser/client vendors use aggressive, predatory tactics to monopolize bandwidth in the name of QoE.

At this point, things are still fairly fluid. Google is proposing that most, if not all, traffic be encrypted by default, while network operators are trying to introduce the concept of trusted proxies that could decrypt / encrypt traffic under certain conditions and with the user's assent.

To my mind, both attempts are short-sighted and doomed to fail; they are the product of aggressive strategies to establish market dominance.

In a perfect world, the device, the network and the content provider would negotiate service quality based on device capabilities, subscriber data plan, network capacity and content quality. Technologies such as adaptive bit rate could have been tremendously efficient here, but the operative word in the previous sentence is "negotiate", which assumes collaboration, discovery and access to the relevant information to make decisions.

In the current state of affairs, adaptive bit rate is oftentimes corrupted in order to seize as much network bandwidth as possible, which results in devices and service providers aggressively competing for bits and bytes (see the sketch after this paragraph).
Network operators, for their part, try to improve or control user experience by deploying DPI, transparent caches, pacing technology, traffic-shaping engines, video transcoding and so on.
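To make the "negotiate" point concrete, here is a minimal sketch contrasting a purely greedy adaptive-bit-rate picker with one that honors a cap agreed with the network. The bitrate ladder, cap and throughput figures are hypothetical, and real players also weigh buffer state; the sketch only shows the difference in intent.

```python
# Minimal sketch: greedy ABR vs. "negotiated" ABR bounded by a cap
# advertised by the network. All numbers are hypothetical (kbps).
LADDER_KBPS = [250, 500, 1000, 2500, 5000]  # available renditions

def greedy_abr(measured_throughput_kbps):
    """Grab the highest rendition the measured throughput can carry."""
    eligible = [r for r in LADDER_KBPS if r <= measured_throughput_kbps]
    return eligible[-1] if eligible else LADDER_KBPS[0]

def negotiated_abr(measured_throughput_kbps, network_cap_kbps):
    """Same selection logic, but bounded by a cap agreed with the network."""
    budget = min(measured_throughput_kbps, network_cap_kbps)
    return greedy_abr(budget)

# A burst of free capacity: the greedy player jumps to 5000 kbps and
# crowds out other flows; the negotiated player stays within its share.
print(greedy_abr(6000))            # -> 5000
print(negotiated_abr(6000, 1500))  # -> 1000
```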

Content providers assume that the highest content quality (HD video, for instance) equals the best experience for the subscriber, and therefore try to capture as much network resource as possible to deliver it. Browser, app and phone makers likewise assume that more speed equals better user experience, and therefore try to commandeer as much capacity as possible. The flaw here is the assumption that the optimum is the product of many maxima, self-regulated by an equal and fair apportioning of resources. This shows a complete ignorance of how networks are designed, how they operate and how traffic flows through them.

This behaviour leads to a network where all resources are perpetually in contention and all end-points vie for priority and maximum resource allocation. From this perspective, one can understand that there is no such thing as "net neutrality", at least not in wireless networks. When network resources are oversubscribed, decisions are made as to who gets more capacity, priority, speed... The question becomes who should be in a position to make these decisions.

Right now, the laissez-faire approach to net neutrality means that the network is not managed; it is subjected to traffic. When in contention, resources manage traffic based on obscure rules buried in load balancers, routers, base stations, traffic management engines... (a simplified example follows this paragraph). This approach is the result of lazy, surface-level thinking. Net neutrality should be the opposite of non-intervention. Its rules should apply equally to networks, devices / apps / browsers and content providers if what we want to enable is fair and equal access to resources.
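For a flavor of what those "obscure rules" actually compute, here is a simplified sketch of weighted max-min fair sharing, one of the classic disciplines a congested scheduler can apply. The flow names, demands and weights are made up for illustration.

```python
# Simplified sketch of one classic contention rule: weighted max-min
# fair sharing of a congested link. Demands are in kbps; names, demands
# and weights are hypothetical.
def weighted_shares(capacity_kbps, flows):
    """flows: {name: (demand_kbps, weight)} -> {name: allocated_kbps}."""
    shares = {name: 0.0 for name in flows}
    active = dict(flows)
    remaining = float(capacity_kbps)
    while active and remaining > 1e-9:
        total_weight = sum(w for _, w in active.values())
        satisfied = []
        for name, (demand, weight) in active.items():
            # Grant each flow its weighted slice, capped by its demand.
            grant = min(demand - shares[name],
                        remaining * weight / total_weight)
            shares[name] += grant
            if shares[name] >= demand - 1e-9:
                satisfied.append(name)
        remaining = capacity_kbps - sum(shares.values())
        for name in satisfied:   # re-share the leftover among the rest
            del active[name]
        if not satisfied:        # everyone is capped by its fair share
            break
    return shares

# A greedy video flow against two modest web flows on a 10 Mbps cell.
print(weighted_shares(10000, {"video": (9000, 1),
                              "web1": (2000, 1), "web2": (2000, 1)}))
# -> video gets ~6000 kbps; each web flow gets its full 2000 kbps
```

Note that nothing in such a rule is visible, or accountable, to the subscriber or the content provider; that opacity is precisely the point being made above.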

Now, who said access to wireless networks should be fair and equal? Unless the networks are nationalized and become government assets, I do not see why private companies in a competitive market could not manage their resources in order to optimize their utilization.

If we transport ourselves into a world where all traffic becomes encrypted overnight, networks lose the ability to manage traffic beyond allowing / blocking it and fixing high-level QoS metrics for specific services. That would lead to network operators being forced to charge exclusively for traffic, at which point everyone pays per byte transmitted. The cost to users would become prohibitive as more and more video, at ever higher resolutions, flows through the networks. It would also mean that these video providers could asphyxiate the other services... More importantly, it would mean that the user experience would become the fruit of the fight between content providers' abilities to monopolize network capacity, which would go against any "net neutrality" principle. A couple of content providers could come to dominate not only the services but the access to these services as well.
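The arithmetic behind "prohibitive" is straightforward. A rough sketch, assuming a typical 1080p bitrate of the era and a hypothetical per-GB price:

```python
# Rough arithmetic behind "prohibitive": the monthly bill for one hour
# of HD video per day when every byte is charged. The bitrate is a
# typical 1080p figure of the era; the per-GB price is hypothetical.
HD_BITRATE_MBPS = 5.0
HOURS_PER_DAY = 1
DAYS_PER_MONTH = 30
PRICE_PER_GB = 10.0   # hypothetical wireless per-GB rate, in $

mb_per_hour = HD_BITRATE_MBPS / 8 * 3600           # Mbit/s -> MB/hour
gb_per_month = mb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH / 1000
print(f"{gb_per_month:.1f} GB/month -> ${gb_per_month * PRICE_PER_GB:.0f}")
# 67.5 GB/month -> $675
```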

The best rationale against this scenario is commercial. Advertising is the only common business model that supports pay TV and many web services today. The only way to have an efficient, high-CPM ad model in wireless is to make it relevant and contextual, and the only way that is going to happen is if the advertising is injected as close to the user as possible. That means collaboration. Network operators cannot provide subscriber data to third parties, so they have to exploit and anonymize it themselves. That means that encryption, where needed, must occur after ad insertion, which needs to happen at the network edge.

The most commercially efficient model for all parties involved runs through collaboration and advertising, but current battle plans show adversarial models, where obfuscation and manipulation are used to reduce the opponent's margin of maneuver. Complete analysis and scenarios are in my video monetization report here.
