Tuesday, May 31, 2011

Worst of breed, golden silo

After 13-odd years in new technology and product introduction, you can't help but look at the trends that pop up in this industry with a cynical eye.

One of the catchphrases I hear often is the "best-of-breed approach". For me, every time I hear it, it is a sure sign that a market segment or a technology is not mature.

It is somewhat counter-intuitive. A best-of-breed, pick-and-choose, componentized approach to service delivery hints at ranges of well-defined components, fungible and interchangeable. You would think that, the interfaces being well defined, each vendor could compete on unique differentiators without negatively impacting service delivery.

Conversely, "silo" has become an increasingly bad word in telecoms, evoking a poorly architected, proprietary daisy chain of components that cannot integrate gracefully into a modern organic network.

Then why is it that best-of-breed always ends up taking longer, costing more and being less reliable than a fully integrated solution from a single vendor?

In my mind, the standards that have been created to describe the ideal networks, from WAP to MMS, from IMS to LTE, have been the product of too much vendor lobbying. The result, in many cases, is vaguely defined physical and functional components, with the lowest common denominator in terms of interfaces and call flows.
Service definition being largely excluded from standards has left little in terms of best practice for integrating functional components efficiently.

There is a reason, in my mind, why the Chinese vendors ZTE and Huawei are doing so well. It is not only because of their cost structure; it is because their all-in-house technology approach for business-critical components makes sense.
It allows fast, replicable deployment and troubleshooting. There is much less complexity in integration and roll-out, which is the most consuming part of CAPEX.

Whenever you see these vendors using third-party technology, it is because that technology is either so mature and stable that it is not worth developing in-house, or so specialized that it has not been developed yet.
In either case, we are talking about fringe technologies. Anything that is business-critical is identified long in advance and developed in-house.
Their products and services might not be as sophisticated or differentiating as those of specialized vendors, but they deliver value by providing the minimum service at the lowest cost, with good-enough reliability.

The companies that will win will either be small niche vendors at the periphery of the larger market opportunities, or companies that are good at providing better value, with stronger benefits, at an equivalent price.

Thursday, May 26, 2011

Mobile TV, video advertising and Apple

FreeWheel released a study this week showing a number of interesting facts in the on-deck content syndication space.
  • Mobile TV represents only 1% of overall TV viewership (including fixed and web).
  • Apple devices represent 80% of mobile TV consumption. Having entered the market first, Apple has a larger brand penetration among tablet and smartphone users, which has prompted content editors and aggregators to prioritize this platform for content availability.

  • Video advertising is starting to grow in revenue and viewing completion, mostly because there is more long-form content available. Mid-rolls are the most effective in terms of completion rate: as long-form content is more engaging, users stay until the ad completes to resume their viewing.

I think on-deck mobile TV is still the least attractive of the mobile video services. It is expensive, and it offers the same service as regular TV with lower quality.

Mobile TV becomes attractive to me when it offers features that regular TV does not, features that acknowledge the fact that I am mobile, like play-and-resume across screens (a feature I created for Videotron in Canada, see below)
or content/channel targeting. Operators that are successful in mobile TV cater to the long tail and offer niche content and channels for specific demographics. Access to the same top-10 channels as fixed TV is only good for a few bucks a month. Real monetization opportunities come from specialized content and channels.

Wednesday, May 25, 2011

The cloud's silver lining

As I mentioned, I am far from being an IT wizard, but I like to try new things and do things by myself before going to the real professionals who can solve the problems I encounter or create.

When I started my consulting practice last month, I thought about the different tools I could use to market my skill set and experience.

Naturally, I had to have a web site. After a little research, I found a wealth of services allowing me to create a web site from scratch, without downloading any special program, using just my computer and a browser.

I selected wix.com and created a professional Flash web site in less than 48 hours. It was intuitive and inexpensive, and it did not require any special training or software.

For me this is a perfect example of the best experience a cloud service can provide: ease of use, dependability, low cost and a good result (at least, I think so :-) ).

Tuesday, May 24, 2011

When the cloud is broken

Since I have started my consulting practice, I have been busy with a variety of activities. Legal, fiscal, sales, marketing, accounting... When you start your company, you have to do a bit of everything yourself.

The most rewarding and at the same time the most frustrating part of the whole experience has been IT. I consider myself effective in computer use: everything from Microsoft, plus a little bit of Internet technology. I am not a programmer (or even an engineer, for that matter), so Excel macros are the extent of my efforts when it comes to coding.

This is a long preamble to introduce my experience of "the cloud" so far. I have used the cloud both as a consumer and in enterprise settings. Like most of you, I have been using web-based email, intranet technology, CRM tools, collaborative software, etc. But I want to focus today on my experience as a consumer.

I had been using cloud technology for a long time before it was called the cloud. I am a gamer. Always have been. Like many gamers, I have long been acquainted with client-server interactivity and have gone through the various thin-client/fat-client debates, both on PCs and handheld devices.

I was quite happy with myself, setting up my practice in about a week, with a web site, a blog, two published editorials, articles of incorporation and all the paperwork and delightful interactions with the tax administration.

The reason behind my diligence might lie elsewhere than pure motivation, though. 

The PlayStation Network (PSN), with its well-publicized hacking and month-long outage, has been a very good productivity enhancer for me. Overnight, my PlayStation (PS3) went from being the main media server and gateway in my house to an obsolete gaming console for offline (non-connected) games. One of Sony's main successes has been its capacity to create a community of players and online applications "in the cloud". Over 77 million players globally have accounts and play online. Massively multiplayer online games require massive processing power, and with 77 million accounts, Sony has done a good job of turning every PS3 out there into both a client and a server for distributed online applications.

PSN's responsibility is mostly account management: IDs, passwords, prepaid balances, etc.
When PSN was hacked, on April 20th, Sony's engineers had no choice but to shut down the service and audit every transaction to find where the security breach had occurred and what information had been stolen.
For a month, I was not able to use the service and platform I had purchased, not knowing what had happened to my data and, worst of all for me, not being able to do anything about it (OK, the worst was not being able to play online). The idea that the problem affected not only me but also millions of others does not make it better, but worse.

Another dissatisfying experience happened on this blog, after working on the last post series (Mobile video 101, 102 and 103).

I had all the drafts ready and set up for publishing. On May 5th, Blogger had an unfortunate incident, resulting in all blog contributions being suspended and wiped out for 24 hours. Eventually the service came back online and all contributions were restored, but it again made me feel very powerless to do anything about the situation. I did not know if my contributions were lost, whether they would be restored or whether I would have to rewrite them.
The team at Blogger did an amazing job restoring the situation, but from now on I keep a copy of my posts on my hard drive before and after I publish them, just in case.

Now, when I look at my experience and at the launch of Google's laptop, which is essentially a browser with access to the cloud, I am not sure I am ready to get into that storm just yet.

When my computer has a virus, runs out of memory or has corrupted files, I might not be able to do very much about it, but at least I feel that I can investigate, troubleshoot or even, in some cases, solve the issue. I can buy software, bring my computer to a nerd doctor or buy a new one. In any case, I have control over the issue's identification, progress and resolution.

These two anecdotal issues gave me none of that. The services have been restored, my account did not hold any sensitive information, and I will probably receive free games, services or compensation, but the lack of control over the situation is what left me most dissatisfied as a user.

What is your experience of cloud computing as a consumer?

Cloud or vault?

My understanding of cloud computing is somewhat superficial. I am going to use the term in the next few posts to cover cloud computing, cloud services, software as a service (SaaS) and, no doubt, applications you would not immediately associate with "the cloud".

The cloud is a means to separate data from its processing and storage. Until recently, data and processing power had to be co-located. Word processors, for instance, are fat clients installed on a PC, used to create and edit documents that are destined to be stored on the same unit. Cloud computing makes it possible to separate these functions, so that, for instance, the storage and processing of the data are physically separated from its access and editing functions.
A browser, a thin client or an app presents content and data, while storage and computation happen in the cloud. This has been made possible by the increase in available fixed and wireless bandwidth and by two key concepts I develop below.

As applications and content require more and more processing power, and as the types of content and applications are in constant flux, we need very flexible models for allocating, in near real time, the capacity to process, deliver and manage content and applications.
This concept in cloud computing is called elasticity.

If you operate a server, for instance to stream video, it has finite capacity in terms of I/O, CPU, wattage, etc. When you reach the system's capacity, performance decreases and in some cases the application shuts down. In a cloud, by contrast, you have a large number of servers that are not dedicated to one application in particular; as demand increases for one service, capacity can be captured from other resources. Of course, this requires virtualization across applications and intelligent networks that can organically adapt to demand. Ideally, a large farm of servers or a collection of data centers presents a general pool of capacity and processing power that can be drawn on, on demand, by the residing applications.

This concept is called fungibility. You might have a large server farm with several applications deployed concurrently in a virtualized environment. Ideally, the resources of the farm are dynamically allocated to each application as the demand for those resources varies over time.
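These two ideas, elasticity and fungibility, can be illustrated with a toy model. Nothing here is a real cloud API; the `allocate` function, the application names and the pool size are all made up for illustration.

```python
# Toy model of elastic, fungible capacity: one shared pool, re-divided
# among applications as their demand shifts over time.

def allocate(total_units, demand):
    """Split a shared pool of capacity units in proportion to demand."""
    total_demand = sum(demand.values())
    if total_demand == 0:
        return {app: 0 for app in demand}
    # Each application gets a share proportional to its demand, capped at
    # what it actually asked for so the pool is never oversubscribed.
    return {app: min(d, round(total_units * d / total_demand))
            for app, d in demand.items()}

pool = 100
# At peak, video captures most of the pool...
print(allocate(pool, {"video": 80, "email": 10, "crm": 10}))
# ...and off-peak, the same pool serves everyone with room to spare.
print(allocate(pool, {"video": 10, "email": 10, "crm": 10}))
```

A dedicated-server model would instead pin a fixed capacity to each application, which is exactly the rigidity that elasticity is meant to remove.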

Cloud or vault?
Cloud computing is a great evolution. It enables us to use resources more efficiently, reducing the fixed footprint of specific applications. What cloud computing is not, though, is free, unlimited resources. Cloud computing is still bound by the laws of physics, so if you need a lot of processing power or storage for an application, the fact that you are using cloud computing does not necessarily mean you are being more efficient. Cloud computing, in my mind, is particularly well suited to spiky, unpredictable, low-I/O, transactional content and apps.

I am not saying that the cloud is not ready for business-critical, high-bandwidth, high-I/O traffic, just that I am not. It is a matter of mindset, maybe of generation, more than of technology.

I feel more confident and more in control with a vault than with a cloud. I would keep all my content, programs and apps in a vault that I can physically access myself, even if it is less efficient, more costly and ultimately less reliable than the cloud. That is, until the cloud is so prevalent, with so much redundancy and so many safety nets, that I could never lose one bit of data and the service could never be interrupted.

I am not ready to relinquish total control over my content and apps. I am less trained, equipped and capable than cloud service providers, but I will need to change my mindset to choose a cloud service over my vault.

I will provide a couple of examples of my best and worst experiences with cloud computing as a consumer in the next few posts.

In the meantime, please comment, are you cloud or vault?

Wednesday, May 18, 2011

Mobile video 103: On-deck, off deck, broadcast and unicast

Mobile video, as a technology and market segment, can at times be a little complicated.
Here is a simple syllabus, in no particular order, of what you need to know to be conversant in mobile video. It is not intended to be exhaustive or very detailed, but rather to provide a knowledge base for those interested in better understanding the market dynamics I address in other posts.

On-deck and off-deck (or over-the top, OTT)
  • On-deck mobile video services are services offered directly by the network operator to its subscribers.
    They usually range from a mix of syndicated/licensed content from aggregators and media companies (music clips, news segments...) to full mobile TV and VOD services such as live and recorded TV, network PVR and catch-up TV, full-movie VOD, etc. These services are usually premium services with specific tariff plans put in place by the carrier.
  • Off-deck services or content are usually provided by third parties not necessarily affiliated with the carrier. These range from social media (LinkedIn, Twitter, Facebook...) to user-generated content (YouTube, Dailymotion...) and professional content (Disney, Hulu, ESPN...). These services are usually not controlled by the network operator, which charges for the transport of the data but not for the content itself.
On-deck content is a premium service and generates an important part of a carrier's revenue. Most carriers do not have content assets, though, and end up aggregating and reselling content from media companies. Off-deck services are what is driving mobile data growth today. Facebook, YouTube, Netflix and others are starting to overwhelm and cannibalize what used to be a good revenue maker for carriers. Hence the concept of the "dumb pipe" that has emerged over the last few years: carriers try to recapture mobile data traffic and revenue on-deck to avoid being a "dumb pipe" that charges for access but not for the value-added content.

Broadcast and Unicast 
  • Broadcast video is the method where the connection between the streaming server and the client is one-to-many. The streaming source is unique and is accessed by many devices. This technology has been used for mobile TV services relying on OMA BCAST, ATSC, MBMS, DVB-H and MediaFLO technologies.
  • Unicast video is the method where the connection between the streaming server and the client is one-to-one. Each connection is unique and can be adapted to the specific conditions of the device, network, etc. The content, its diffusion, quality, advertising, etc. can be personalized for each target. Unicast is used in many mobile TV and video-on-demand services, with the RTSP protocol. For many "legacy" (i.e. non-smartphone) devices, it is the only video streaming technology. This method has been implemented in many mobile networks for mobile TV as an on-deck service.
Although branded as the next generation of mobile TV, broadcast has so far failed in many markets. But this is due more to the business model than to the technology. With broadcast, users all get the same programming and the same ads. Most users will consider paying a nominal monthly fee to get access to a few channels, but this is not where the profits are. Unicast offers an alternative, as each stream and program can be tailored to the subscriber's needs. It means more management overhead, but users are ready to pay a premium for an individualized experience.

    Monday, May 16, 2011

    Mobile video 102: lossless and lossy compression

    Mobile video, as a technology and market segment, can at times be a little complicated.
    Here is a simple syllabus, in no particular order, of what you need to know to be conversant in mobile video. It is not intended to be exhaustive or very detailed, but rather to provide a knowledge base for those interested in better understanding the market dynamics I address in other posts.

    Compression (lossless) and optimization (lossy)
    • Compression is the action of reducing the size of the representation of a media object without losing data. It is lossless when, after decompression, the compressed media is absolutely identical to the original. Compression methods are based on statistical analysis, representing recurrent data items within a file. PNG, GIF, Zip, gzip and deflate are lossless compression formats. Throttling, just-in-time delivery and caching are lossless delivery methods.
    • Optimization is a form of compression called lossy in the sense that it discards data elements to achieve a reduced size. The optimized version is not identical to the original. Transcoding and transrating are lossy methods.
    Lossy optimization methods:
    • Frames per second (fps): a video is composed of a number of still frames (pictures). The illusion of movement is achieved above 15 frames per second. TV is 24 to 30 fps (depending on the standard and on whether it is progressive or interlaced). Many lossy optimization methods reduce the frame rate in order to reduce the size of a file.
      • Key frames: not all frames contain the same amount of data. The main way to reduce the quantity of information in a video is to use statistical analysis to predict motion, in other words to analyse the differences from one frame to the next. Most optimization methods encode only the difference between a frame and the next, therefore not coding all the information. Key frames, or intra frames, are the frames used as reference. When lossy optimization is performed using fps reduction, one has to be careful not to remove the key frames, or the user experience will be garbled with many artifacts.
    • Bit rates: the bit rate is the rate at which a video is encoded (quality) or transmitted.
      • Encoding bit rate: the encoding bit rate represents the amount of information captured in each frame. It is measured in kbps (kilobits per second) or Mbps (megabits per second). HD video is encoded at 20 Mbps, SD at 10 Mbps, internet video usually around 1 Mbps, and video transmitted over wireless between 200 and 700 kbps.
        • Variable bit rate (VBR), or transrating, is a lossy optimization method that varies the encoding bit rate throughout the video to take network conditions into account.
        • Constant bit rate (CBR) is used for broadcast, fixed-line connections and generally lossless transmissions.
      • Delivery bit rate: when a video is transmitted over a wireless network, the connection capacity dictates the user experience. The delivery bit rate should always exceed the video's encoding bit rate for smooth viewing. If the delivery bit rate drops below the encoding bit rate, buffering and stop-and-go are experienced. Lossy optimization techniques such as VBR make it possible to reduce the encoding bit rate in real time as the delivery bit rate varies.
    • Transcoding is the action of decoding a video file and re-encoding it in a different format. This lossy method is effective for bringing a video down from a definition or format that is not suitable for mobile transmission (HD, 3D...). Additionally, a lot of size savings can be achieved by changing the aspect ratio (4:3 or 16:9 from TV) or the picture size (HD 1080 is 1920 x 1080 pixels, while most WVGA smartphones are 800 x 480 pixels). You can drastically reduce a video file's size by changing the picture size.
    • Transprotocoling is the action of changing the protocol used for video transmission. For instance, many legacy phones that do not support progressive download cannot access internet video unless it is transprotocoled to RTSP.
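    To make the bit-rate figures above concrete, here is some back-of-the-envelope arithmetic. The `stream_size_mb` helper is a made-up name, not a standard formula; the rates are the ones quoted in the list.

```python
# Back-of-the-envelope sizes for the encoding rates quoted above.

def stream_size_mb(bitrate_kbps, duration_s):
    """Approximate encoded size in megabytes: rate (kilobits/s) x duration."""
    return bitrate_kbps * duration_s / 8 / 1000

# A 2-minute clip at internet quality (~1 Mbps) vs mobile quality (500 kbps):
print(stream_size_mb(1000, 120))   # 15.0 MB
print(stream_size_mb(500, 120))    # 7.5 MB

# Resizing HD 1080 (1920 x 1080) to WVGA (800 x 480) divides the pixel
# count, and hence the data to encode, by about 5.4:
print(round((1920 * 1080) / (800 * 480), 1))   # 5.4
```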

    Sunday, May 15, 2011

    Mobile video 101: protocols, containers, formats & codecs

    Mobile video, as a technology and market segment, can at times be a little complicated.
    Here is a simple syllabus, in no particular order, of what you need to know to be conversant in mobile video. It is not intended to be exhaustive or very detailed, but rather to provide a knowledge base for those interested in better understanding the market dynamics I address in other posts.

    There are many protocols used in wireless networks to deliver and control video. You have to differentiate between routing protocols (IP), transmission protocols (TCP and UDP), media transport (RTP), application control (RTSP) and content control (RTCP). I will focus here on application and content control.
    These protocols are used to set up, transmit and control video over mobile networks.

    Here are the main ones:
    • RTSP (Real Time Streaming Protocol) is an industry protocol that was created specifically for media streaming. It is used to establish and control (play, stop, resume) a streaming session. It is used in many unicast on-deck mobile TV and VOD services.
    • RTCP (Real-time Transport Control Protocol) is the content control protocol associated with RTP. It provides the statistics (packet loss, bit transmission, jitter...) necessary to allow a server to perform real-time media quality control on an RTSP stream.
    • HTTP download and progressive download (PD): HTTP is a generic protocol used for the transport of many content formats, including video. Download and progressive download differ in that the former needs the whole content to be delivered and saved to the device before it can be played asynchronously, while the latter provides, at the beginning of the session, a set of metadata associated with the content which allows it to be played before its download completes.
      • Microsoft Silverlight, Adobe RTMP and Apple progressive streaming: these three variants of progressive download are proprietary. They offer additional capabilities beyond vanilla HTTP PD (pre-encoding and multiple-stream delivery, client-side stream selection, chunked delivery...) and are the subject of an intense war between the three companies to occupy the mindset of content developers and owners. This is the reason why you cannot browse a Flash site or view a Flash video on your iPhone.
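    To make the HTTP side concrete, here is a sketch of a byte-range request, one mechanism a client can use to fetch just part of a file (for instance the metadata at its start) rather than waiting for the whole thing. The host, path and helper name are hypothetical.

```python
# Sketch of an HTTP/1.1 byte-range request, a building block that lets a
# client fetch part of a video file and start playback before the full
# download completes. Host and path are made up for illustration.

def range_request(host, path, start, end=None):
    """Build a raw GET request asking only for bytes start..end."""
    byte_range = f"bytes={start}-" if end is None else f"bytes={start}-{end}"
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Range: {byte_range}\r\n"
            f"\r\n")

# Ask for only the first 64 KB of a (hypothetical) clip:
print(range_request("example.com", "/clip.mp4", 0, 65535))
```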
    A container in video is a file that is composed of the payload (video, audio, subtitles, programming guide...) and the metadata (codecs, encoding rate, key frames, bit rate...). The metadata is a set of descriptive fields that indicate the nature of the media and its duration in the payload. The most popular containers are:
    • 3GPP (.3GP): the container used in most mobile devices, recommended for video by the 3GPP standards.
    • MPEG-4 Part 14 (.MP4): one of the most popular containers for internet video.
    • Flash Video (FLV, F4V): an Adobe-created container, very popular as the preferred format for BBC, Google Video, Hulu, Metacafe, Reuters, Yahoo Video, YouTube... It requires a Flash player.
    • MPEG-2 TS: the MPEG transport stream is used for broadcast of audio and video. It is used in on-deck broadcast TV services in mobile and in cable/satellite video delivery.
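    The container idea shows up directly in the file layout. An MP4/3GP file is a sequence of "boxes" (also called atoms), each starting with a 4-byte big-endian size and a 4-byte type tag. The sketch below walks only the top level of a fabricated buffer; real files nest boxes and use special size values (0 and 1) that it ignores.

```python
import struct

# Walk the top-level boxes of an MP4/3GP-style byte buffer.
# Each box header is: 4-byte big-endian size, then a 4-byte type tag.

def list_boxes(data):
    """Return (type, size) pairs for the top-level boxes in raw bytes."""
    boxes, offset = [], 0
    while offset + 8 <= len(data):
        size, = struct.unpack(">I", data[offset:offset + 4])
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        if size < 8:   # sizes 0 and 1 have special meanings; stop here
            break
        boxes.append((box_type, size))
        offset += size
    return boxes

# A fabricated two-box file: an 'ftyp' header followed by an empty 'mdat'.
fake = struct.pack(">I", 16) + b"ftyp" + b"3gp4" + b"\x00" * 4 \
     + struct.pack(">I", 8) + b"mdat"
print(list_boxes(fake))   # [('ftyp', 16), ('mdat', 8)]
```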
    Formats are sets of standards that describe how a video file should be encoded and played. The main ones are:

    • H.263: an old codec used in legacy devices and applications. It is mandated by ETSI and 3GPP for IMS and MMS but is being replaced by H.264.
    • H.264 (MPEG-4 Part 10, AVC): a family of standards composed of several profiles for different uses, device types, screen sizes... It is the most popular format in mobile video.
    • MPEG-2: a standard for lossy audio and video compression used in DVD and broadcast (digital TV over the air, cable, satellite). MPEG-2 describes two container types: MPEG-2 TS for broadcast and MPEG-2 PS for files.
    • MPEG-4: an evolution of MPEG-2, adding new functionality such as DRM, 3D and error resilience for transmission over lossy channels (wireless, for instance). There are many features in MPEG-4 that are left to the developer to decide whether to implement or not. The features are grouped by profiles and levels. There are 28 profiles, or parts, in MPEG-4. A codec usually describes which MPEG-4 parts are supported. It is the most popular format on the internet.
    Codec stands for coder/decoder of a media stream: a program that has the ability to decode a video stream and re-encode it. Codecs are used for compression (lossless), optimization (lossy) and encryption of videos. A "raw" video file is usually stored in the YCbCr (YUV) format, which provides a full description of every pixel in a video. This format is so descriptive that it requires a lot of space for storage and a lot of processing power for encoding/decoding. This is why a video is usually encoded with a codec, to allow for a smaller size or variable transmission quality. It is important to understand that while a container obeys strict rules and semantics, codecs are not regulated, and each vendor decides how to decode and encode a media format. Some examples:
    • DivX: a proprietary MPEG-4 implementation by DivX
    • WMV (Windows Media Video): Microsoft proprietary
    • x264: a licensable H.264 encoding software
    • VP6, VP7, VP8...: proprietary codecs developed by On2 Technologies, acquired by Google and released as open source
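    The storage cost of raw YUV mentioned above is easy to quantify. A sketch, assuming 4:2:0 chroma subsampling (1.5 bytes per pixel), a common raw layout; the helper name is made up.

```python
# Sizing the "raw video is huge" claim, assuming 4:2:0 chroma
# subsampling, i.e. 1.5 bytes per pixel on average.

def raw_size_gb(width, height, fps, seconds, bytes_per_pixel=1.5):
    """Uncompressed video size in gigabytes."""
    return width * height * bytes_per_pixel * fps * seconds / 1e9

# One minute of raw 1080p at 30 fps:
print(round(raw_size_gb(1920, 1080, 30, 60), 1))   # 5.6 (GB)

# The same minute encoded at the 20 Mbps HD rate quoted in the 102 post:
print(20e6 * 60 / 8 / 1e9)   # 0.15 (GB)
```

The two orders of magnitude between those numbers are why encoding is unavoidable before storing or transmitting video.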

    Wednesday, May 4, 2011

    The age of video

    Do you remember how it all started? How, back in the day, you needed at least two phones if you traveled frequently between Europe and the US? Back when GSM, CDMA and TDMA were trying to become the dominant radio access?
    You couldn't send a text (let alone a picture message) to another carrier. There was no mobile email, no social networks, no YouTube.

    It was interesting that, at that time, radio technologies were incompatible. Operators were battling each other over who had the most coverage, the largest network, the clearest voice quality. Remember the Sprint ads with the pin drop? Or the Cingular dropped-call ads? Or Verizon's "can you hear me now"?
    It was the '90s, it was the age of voice. Carriers were competing to get as many customers as possible, as fast as possible. The game, then, was "I have the biggest network" or "you can actually complete calls on my network".

    Remember how texting then became the biggest thing? How AT&T introduced SMS to America in the second season of American Idol? Remember how you could pick an all-you-can-eat voice, data and texting plan from any carrier?
    It was the '00s, it was the age of messaging. Carriers were pushing messaging as a way to expand beyond voice. Texting, picture mail, visual voicemail, mobile email... it was the next big thing.

    Now it's all about socializing, networking, updating, tweeting... blogging. Underneath it all, the technology is almost the same; the services have evolved.
    The main thing that is new is video.
    As video grows to become the dominant part of mobile traffic, most carriers will start communicating around it. It could be the age of video. Will they focus on services such as mobile TV and video calls, or will content providers and aggregators like Hulu and YouTube take over?

    Today, most mobile strategies around video are about cost containment. It is a defensive strategy born from the fact that, with the traffic growth, many networks won't be able to meet demand before they have an operational 4G network. It leads to data caps, throttling, new data plans...

    I will address in future posts the monetization strategies the mobile video opportunity can offer and the underlying technologies necessary to implement them.

    As carriers started as voice specialists and became messaging specialists, I believe they will need to become video specialists if they want to capture mobile video monetization opportunities.