Thursday, September 15, 2011

Openet's Intelligent Video Management Solution

As you well know, I have been advocating closer collaboration between DPI, policy management and video optimization for a while (here and here, for instance).


In my mind, most carriers dealt mostly with transactional data traffic until video came along. There are some fundamental differences between managing transactional and flow-based data traffic. The quality of experience of a video service depends as much on the intrinsic quality of the video as on the way that video is delivered.


In a mobile network, with a daisy chain of proxies and gateways (GGSN, DPI, browsing gateway, video optimization engine, caching systems...), the user experience of a streamed video is only going to be as good as the weakest link in that delivery chain.




Gary Rieschick, Director – Wireless and Broadband Solutions at Openet, spoke with me today about the Intelligent Video Management Solution launched this week.
"Essentially, as operators are investing in video optimization solutions, they have been asking how to manage video delivery across separate enforcement points. Some vendors are supporting Gx, others are supporting proprietary extensions or proprietary protocols. Some of these vendors have also created quality of experience metrics that are used locally, for static rule-based video optimization."
Openet has been working with two vendors in the video optimization space to try to harmonize video optimization methods with policy management. For instance, depending on the resulting quality of a video after optimization, the PCRF could decide to zero-rate that video if the quality fell below a certain threshold.
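To make that concrete, here is a minimal sketch of what such a rule could look like; every name, the 0-100 quality scale and the threshold are my illustrative assumptions, not Openet's actual API or the Gx wire format:

```python
# Hypothetical sketch of such a charging rule: if the optimizer reports a
# post-transcoding quality below a threshold, the session is zero-rated
# rather than billed. The names, the 0-100 quality scale and the threshold
# are illustrative assumptions, not Openet's API or a Gx message format.

from dataclasses import dataclass

MIN_ACCEPTABLE_QUALITY = 60  # assumed quality-of-experience floor (0-100)

@dataclass
class QoEReport:
    original_quality: int   # score of the source video before optimization
    delivered_quality: int  # score after optimization/transcoding

def charging_decision(report: QoEReport) -> str:
    """Return the charging action the PCRF could take for a video session."""
    if report.delivered_quality < MIN_ACCEPTABLE_QUALITY:
        return "zero-rate"   # quality too degraded: do not bill the subscriber
    return "rate-normally"

print(charging_decision(QoEReport(original_quality=85, delivered_quality=45)))
# -> zero-rate
```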


The main solution features highlighted by Gary are below:
  • Detection of premium content: the PCRF can be aware of agreements between the content provider and the operator and be provisioned with rules to prioritize or provide better quality to certain content properties.
  • Content prioritization, based on time of day and congestion detection.
  • Synchronization of rules across policy enforcement points, to ensure for instance that the throttling engines at the DPI level and at the video optimization engine level do not clash.
  • Next hop routing, where the PCRF can instruct the DPI to route the traffic within the operator network based on what the traffic is (video, mail, P2P...).
  • Dynamic policies to supplement or replace the static rules provisioned in the video optimization engine, so that policy can react to network congestion indications, subscriber profile, etc. (sketched below).
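As a purely illustrative sketch of the last two bullets (none of these names come from Openet's product), a single rule could be derived by the policy decision point and pushed identically to every enforcement point so they cannot clash:

```python
# Illustrative sketch (not Openet's implementation) of the last two bullets:
# the PCRF derives a single rule from congestion state and subscriber tier,
# then pushes that same rule to every enforcement point so the DPI throttler
# and the video optimizer never work against each other.

class EnforcementPoint:
    """Stand-in for a DPI or video optimization engine."""
    def __init__(self, name: str):
        self.name = name

    def apply(self, rule: dict) -> None:
        # A real deployment would receive this over Gx or a proprietary push.
        print(f"{self.name} applying {rule}")

def decide_rule(congested: bool, subscriber_tier: str) -> dict:
    """Hypothetical dynamic policy: tighten video delivery under congestion."""
    if congested and subscriber_tier != "premium":
        return {"max_video_kbps": 300, "transcode": True}
    return {"max_video_kbps": 1500, "transcode": False}

def synchronize(rule: dict, points: list) -> None:
    """Push the identical rule to all enforcement points to avoid clashes."""
    for point in points:
        point.apply(rule)

rule = decide_rule(congested=True, subscriber_tier="standard")
synchronize(rule, [EnforcementPoint("DPI"), EnforcementPoint("video-optimizer")])
```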


I think this is a good move by Openet to take some thought leadership in this space. Operators need help creating a carefully orchestrated delivery chain for video.
While Openet's solution might work well with a few vendors, I think, though, that a real industry standardization effort is necessary to provide video-specific extensions to the Gx policy interface.
Delivering and optimizing video in a wireless network degrades the user experience whenever the control plane, which carries feedback on congestion, original video quality, resulting video quality, and device and network capabilities, is not shared across all policy enforcement and policy decision points.

Monday, September 12, 2011

Openwave CEO replaced - Consolidations to come in the traffic management market

Openwave announced today the resignation of its CEO, Ken Denman, citing personal reasons. Denman is being replaced by Anne Brennan, the company's CFO.

As we have seen in a previous post, Openwave has been struggling for a while to deliver on the expectations it raised in the market for an integrated traffic management solution for video.

After failing to show results from over 40 announced trials, failing to upsell its installed base on its next generation of products, and buying back old patents to sue RIM and Apple, Openwave sees its CEO resign and, the same day, names Peter Feld as Chairman of the Board, replacing Charles E. Levine.

This market segment, born from the ashes of the WAP gateway market, sees companies like Acision, Bytemobile, Comverse, Ericsson, Flash Networks, Huawei, Mobixell, Nokia Siemens Networks and others become the intelligent gateway in the network. That gateway's role is to complement and orchestrate DPI, charging, PCRF and video optimization. It is a key network function.

As most data traffic is browsing related, companies that used to sell WAP gateways are best positioned to capitalize on upselling a richer, more sophisticated gateway that gives operators the means to control, monetize and optimize browsing and video traffic in their networks.

Openwave was not able to negotiate that turn early enough to prevent its market share from being eaten away by traditional competitors and new entrants. Additionally, as traffic has fundamentally changed since tablets and smartphones entered the market, key capabilities such as TCP, web and video optimization were late to appear in Openwave's roadmap and proved challenging to build rather than buy.

Mobixell started the consolidation with the acquisition of 724 Solutions last year.
I bet we will see more consolidations soon.

Friday, September 9, 2011

How to charge for video? part 3 - Pros and Cons

Here are the pros and cons of the methods identified in the previous post.



Unlimited usage
Pros: Customer friendly; good for acquisition and churn reduction. Will be a real differentiator in the future.
Cons: Hard to plan network capacity. Expensive if data usage continues doubling on a yearly basis.

Fair limit
Pros: Provides some capacity planning.
Cons: The limit tends to change often, as the ratio of abusers vs. heavy users goes down.

Hard cap
Pros: No revenue leakage. Easy network planning (max capacity needed = max number of users x cap).
Cons: Not customer friendly. Does not allow the operator to capture additional revenue.

Hard cap with overage fee
Pros: Can be very profitable with a population that has frequent overage.
Cons: Many customers complain of bill shock.

Soft cap
Pros: Customer friendly, easy to understand.
Cons: Not as profitable in the short term.

Soft cap with throttling
Pros: A better alternative to hard cap in markets where video usage is not yet very heavy.
Cons: Becomes less and less customer friendly as video traffic increases.

Speed capping
Pros: Very effective for charging per type of usage and educating customers.
Cons: Requires a sophisticated network (DPI + charging + subscriber management).

Application bundling
Pros: Popular in mature markets with high competition, where subscribers become expert at shopping and comparing the different offerings.
Cons: Complex; requires a sophisticated network and a good understanding of subscriber demographics and usage to maximize revenue.

Metered usage
Pros: Very effective way to ensure that capacity planning and revenue are tied.
Cons: Not very popular, as many subscribers do not understand megabytes or why 2 minutes of video can "cost" anywhere from 1 to 10 times as much depending on its encoding (illustrated after this table).

Content based charging
Pros: Allows sophisticated tariffing that maximizes revenue.
Cons: Complex; requires a sophisticated network and a good understanding of subscriber demographics and usage to maximize revenue. The technology is not quite ready.

Time of day charging
Pros: For operators with a "prime time" effect, where peaks are an order of magnitude higher than average traffic, an effective way to monetize the need to size for peak.
Cons: Not very popular. The network is still underutilized most of the time.

Location based charging
Pros: Will allow operators with "hot spots" to try to mitigate usage in those zones, or at least to finance the capacity.
Cons: Most subscribers won't accept having to carry a map to understand how much their call/video will cost them.
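To illustrate the metered-usage confusion noted in the table, here is a back-of-the-envelope sketch; the bitrates are assumed typical values, not measurements:

```python
# Back-of-the-envelope arithmetic behind the metered-usage complaint above:
# the same two minutes of video can weigh roughly 10x more depending on the
# encoding bitrate. The bitrates are assumed typical values, not measurements.

def video_megabytes(bitrate_kbps: int, seconds: int) -> float:
    """Data volume in MB of a stream at a given bitrate."""
    return bitrate_kbps * seconds / 8 / 1024  # kbit -> kilobytes -> megabytes

for label, kbps in [("low-quality mobile stream", 250),
                    ("high-quality stream", 2500)]:
    print(f"2 minutes of {label}: {video_megabytes(kbps, 120):.1f} MB")
# -> about 3.7 MB vs. 36.6 MB for the same 2 minutes of video
```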

As with many trends in wireless, it will take a while before the market matures enough to settle on a technology and a business model that is both user-friendly and profitable for the operators. Additionally, the emergence of over-the-top traffic, with content providers and aggregators now selling their services directly to customers, forces the industry to examine charging and tariffing models in a more fundamental fashion.
Revenue sharing, network sharing and load sharing require traditional core network technologies to be exposed to external entities if there is to be a profitable model in which brands, content owners, content providers and operators are not at war. New collaboration models need to be devised. Additionally, while the technology has made much progress, the next generation of DPI, PCRF and OSS/BSS will need to step up to enable these sophisticated charging models.

Thursday, September 8, 2011

How to charge for video? part 2 - pricing models

While 4G is seen as a means to increase capacity, it is also a way for many operators to introduce new charging models and to depart from bundled, unlimited data plans.
Let’s look at some of the strategies in place for data pricing in a video world:
·         Unlimited usage: This category tends to disappear as data demand increases beyond network capacity. It is still used by new entrants or followers with a disruptive play.
o   Fair limit: even with unlimited packages, many operators tend to enforce a fair-use limit, usually set above what roughly 90% of their subscribers use.
·         Capacity capping: this mechanism consists of putting a monthly limit on the amount of data a subscriber can use. It is usually associated with a flat monthly fee and is mostly a defensive measure. Past that limit, the operator has four choices:
o   Hard cap: no data usage is allowed beyond the limit. The subscriber must wait for the next period to use the service anew.
o   Hard cap with overage fee: once the customer has reached her limit, a fee per metered usage is imposed, traditionally at a very high rate. For instance, 20 € for 2GB and 1 € per additional 10MB (see the sketch after this list).
o   Soft cap: the operator introduces several levels of caps; once a customer reaches a cap, she switches to the next one.
o   Soft cap with throttling: the operator throttles the speed of data delivery past the cap, usually at a rate that makes it inefficient or impossible to use data-intensive applications such as video. This is also called "trickle-loading".
·         Speed capping: as video, P2P and download usage gets close to fixed-broadband levels, operators have started to provide means to measure and charge for different speeds and usage. This allows them to create different packages per type of usage:
o   Low speed for transactional (email)
o   Medium speed for real time (social network, internet music and radio)
o   High speed for heavy use (downloads and videos)
·         Application bundling: This method consists in grouping applications or usage by bundles with individual tariffing schemes. For instance, free, unlimited IM, Facebook, Twitter, Email at 20$ per month up to 2GB, No P2P...
·         Metered usage: This method consists of charging based on the amount of data consumed monthly by the subscriber.
·         Contextual charging:
o   Content based charging: This is the target of many operators, being able to differentiate between the types of content, origin, quality and create a tariff grid accordingly. For instance: a pricing structure that will have different rates for HD and SD video, whether it is on deck or off deck, whether it is sport or news, live or VOD...
o   Time of day charging: This is a way to make sure that peak capacity is smoothed throughout the day or to get the most margin from busiest times.
o   Location based charging: Still embryonic. Mostly linked to femtocell deployments.
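To make the overage arithmetic concrete, here is a minimal sketch built on the example figures above (20 € for 2 GB, then 1 € per additional 10 MB); the choice to bill each started 10 MB block is my assumption:

```python
# Minimal sketch of the overage arithmetic from the example above: 20 EUR
# buys 2 GB, then 1 EUR per additional 10 MB. Billing each *started* 10 MB
# block is my assumption; operators may round differently.

import math

PLAN_FEE_EUR = 20.0
PLAN_CAP_MB = 2 * 1024           # the 2 GB cap, expressed in MB
OVERAGE_EUR_PER_BLOCK = 1.0
OVERAGE_BLOCK_MB = 10

def monthly_bill(usage_mb: float) -> float:
    overage_mb = max(0.0, usage_mb - PLAN_CAP_MB)
    blocks = math.ceil(overage_mb / OVERAGE_BLOCK_MB)
    return PLAN_FEE_EUR + blocks * OVERAGE_EUR_PER_BLOCK

print(monthly_bill(1800))   # under the cap: 20.0 EUR
print(monthly_bill(2548))   # 500 MB over: 20 + 50 = 70.0 EUR, hence "bill shock"
```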
In my next post, I will look at the pros and cons of each charging model.

Wednesday, September 7, 2011

How to charge for Video? part 1 - How did we get there?

In January 2009, when Cisco released its first Visual Networking Index, a forecast of data traffic in mobile networks, the first reaction from the market was incredulity.

Cisco was projecting that, based on traffic observed over the previous 5 years, mobile data traffic would double every year. Even more remarkably, video, then a mere 20% of the overall traffic, would rise to account for up to 64% of the traffic by 2013.

The industry met these projections with raised eyebrows, and many dismissed the report as a simple attempt by a vendor to sell more network equipment. While the intention behind the report is undoubtedly to bring carriers to the conclusion that they need to strengthen their networks and prepare for huge CAPEX spending, the observations remain relevant.

By the summer of 2009, networks started experiencing data outages (AT&T). While the trend seemed to accelerate and spread (Verizon, Sprint, Vodafone Germany, Vodafone UK, O2 UK, Orange UK...), carriers and vendors alike started to look at identifying and defining the issue.

Mobile data was indeed growing fast, and video seemed to be a large part of it. Additionally, the outages seemed to be caused by a variety of factors, from radio access network (signalling) to core network (congestion) instability.
It is clear that the massive take-off of smartphones and tablets, coupled with the change in media consumption patterns of mobile subscribers, had taken everyone by surprise.

The main cause, in my mind, for this surge and instability in mobile network traffic is not to be found in the technology but rather in the business model.

At the beginning of 2000, the wireless world is in a state of ferment. 3G licenses are being sold for billions (with the UK auction the most expensive, at £22.4Bn). Wireless operators embark on the promise of the wireless internet (WAP) and multimedia messaging. These promises are not delivered on, and many start looking for content and applications to fill their new-found bandwidth.
USB dongles prove popular in the enterprise market for providing data connectivity on the go. Around that time, flat-fee, all-you-can-eat, unlimited data packages start to appear. While there isn't much attractive content available, these plans prove effective in drawing throngs of subscribers and become a weapon of choice in the customer acquisition arsenal.

Fast forward to 2011 - with the rise of social media, the introduction of smartphones and tablets as new categories, the explosion of user-generated-content and the emergence of apps as the preferred way to access or interact with content in the mobile world -  networks find themselves flooded with data usage.

In the next post, I will look at an inventory of existing charging models.

Thursday, September 1, 2011

Bytemobile T3000 series & Unison update

Bytemobile released this week a new platform (T3000) and a new product (T3100).
With more than 40 operator deployments, Bytemobile is the leader in the video optimization market. The new platform is launched to allow Bytemobile to address the intelligent traffic management market.

Mikko Disini, in charge of the new T3000 series and Unison platforms, discussed with me the rationale behind the introduction of the new product and how it complements Unison.

The T3000 series has been created in an effort to provide more monetization options for mobile broadband operators. For those familiar with Unison, which is essentially a web and video proxy and optimization gateway, T3100 expands beyond browsing to proxy and manipulate all traffic, including UDP-based applications, P2P, video chat, RTSP, etc.
While Unison remains a software-based solution running on off-the-shelf IBM blade centers, the T3000 series is a purpose-built appliance on IBM hardware. T3100 combines load balancing, DPI, PCEF and traffic rules in one package. Bytemobile is planning to introduce new products on the T3000 platform in the future.

Mikko commented that the rationale behind the hardware-based approach is to be more channel-friendly: "It is easier to deploy, package, explain; it is an easier sale."

My opinion is that Bytemobile is making a smart move by expanding its product portfolio into new verticals. While there is a large overlap between Unison and T3100 today, Bytemobile can upsell its installed base with purpose-built solutions. In the past, Unison was a Swiss Army knife with a bit of everything, for a market looking for a quick solution; the growth of traffic is now forcing many vendors to separate applications to achieve more granular scalability.


With T3000, Bytemobile moves more decidedly into the DPI, load balancing and PCEF space than it did with Unison. Additionally, moving to a hardware appliance model will further enable it to resist price erosion, reusing the Unison tactic of bundling several applications and features together with different market prices and models.
What remains to be seen is how effective the strategy will be in acquiring new channels beyond IBM, NSN and TechMahindra, now that T3000 is sure to encroach on bigger players such as F5 and Cisco... or maybe that is the strategy?