Monday, November 18, 2019

Announcing Edge computing and hybrid clouds workshops

After working for 5 years on edge computing, and being possibly one of the only analysts to have evaluated, then developed and deployed the technology in a telco network, I am happy to announce the immediate availability of the following workshops:

Hybrid and edge computing strategy
  • Hybrid cloud and Edge computing opportunity 
  • Demand for hybrid and edge services (internal and external)
  • Wholesale or retail business?
  • Edge strategies: what, where, when, how?
  • Hyperscaler strategies, positions, risks and opportunities
  • Operator strategies
  • Conclusions and recommendations

Edge computing technology
  • Technological trends
  • SDN, NFV, container, lifecycle management
  • Open source, ONF, TIP, Akraino, MobiledgeX, Ori
  • Network disaggregation, Open RAN, Open OLT
  • Edge computing: Build or buy?
  • Nokia, Ericsson, Huawei
  • Dell, Intel, …
  • Open compute, CORD
  • Conclusions and recommendations

Innovation and transformation processes
  • Innovation process and methodology 
  • How to jumpstart technological and commercial innovation
  • Labs, skills, headcount and budget
  • How to transition from innovation to commercial deployment
  • How to scale up sustainably

Drop me a line if you are interested.

Saturday, November 16, 2019

Edge Computing or hybrid cloud?

Edge computing has gained much recognition and hype since I started working on it 5 years ago. I am in the fortunate position of having explored it as an analyst, being one of the early participants in ETSI's Industry Specification Group on Multi-Access Edge Computing (MEC), then having developed and deployed it as an operator for the Telefonica group, and now being back to advising vendors and service providers on the strategies and challenges associated with its development.

One of the key challenges associated with edge computing is that pretty much every actor in the value chain (technology vendors, colocation and hosting companies, infrastructure companies, telecommunication operators, video streaming, gaming and caching services, social media and internet giants, cloud leaders...) is coming at it from a different perspective and perception of what it could (and should) do. This invariably leads to much misunderstanding, as each one is trying to understand the control points in the value chain and assert their position.
  1. Technology vendors see a chance either to further entrench their position, based on proprietary early implementations, or to disrupt the traditional vendor oligopoly, based on open (source) disaggregated networking. Traditional blue-chip vendors also see a chance to move further down the path of replacing black-box networks with white boxes.
  2. Colocation and hosting companies do not quite see why edge is that much different from the cloud hosting they have been doing all along, but they are happy to jump on the bandwagon if it means better margins.
  3. Infrastructure companies see a chance to move up the value chain by providing differentiated, value-added real estate and connectivity services.
  4. Telecommunications operators tend to see edge computing as an opportunity to rejoin the cloud war after having lost the last battles. The promise of futuristic 5G-like services for drones, remote surgery, autonomous cars, etc. is certainly what they have been communicating about, but it will not materialize into tangible revenue streams for another 5 to 7 years. There are, however, other short-term revenues that the technology's deployment can create.
  5. Video streaming, gaming and caching services feel that they have been the edge pioneers, with specialized services or physical slices deep in cloud and telco networks. They tend to resist the move from physical, proprietary appliances towards the open, multi-tenant model that would make the business more profitable for all.
  6. Social media and internet giants tend to feel that there is something they should, or could, do there, but most of their infrastructure relies either on their proprietary private clouds or on public clouds, and it is unclear whether these models are compatible.
  7. Lastly, the cloud leaders certainly see edge computing as a growth opportunity to offer differentiated cloud services and performance, but again, they are unsure whether to push the limits of their own clouds or to integrate with others.

I feel that we have started off on the wrong foot here. There is no such thing as edge computing. There is a cloud, and there are devices and data centers. The largest, most impactful performance move the cloud can make is to integrate with the last mile: the telco networks. Where you want a workload to run, a dataset to reside, or a pipeline to transit through should be the result of:
  • What is available in terms of capacity
  • What your needs and budget are in terms of workload, performance and latency
  • What it costs / what the price is to run in each location
  • What the legal / regulatory restrictions are with respect to locality, sovereignty, privacy...

The rest should be easy enough to calculate programmatically. For this to occur, there is still much work to be done. The "plumbing" (how to connect and administer heterogeneous clouds) is almost there. The largest effort is really for these industries to come together on the reservation, consumption and fulfillment model. We might be able to live today with the Amazon, Microsoft, Alibaba and Google cloud models, but we certainly won't be able to accommodate many more.
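
The placement logic suggested by the criteria above could be sketched as a simple constraint filter followed by a cost minimization. Everything below (site names, fields, prices) is an illustrative assumption, not a real catalog or API.

```python
# Hypothetical sketch of programmatic workload placement: filter out candidate
# sites that violate hard constraints (capacity, latency, jurisdiction,
# budget), then pick the cheapest remaining one. All data is illustrative.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_vcpus: int            # available compute capacity
    latency_ms: float          # measured latency to the end user
    price_per_vcpu_hour: float
    jurisdiction: str          # country code, for data-sovereignty rules

def place(sites, vcpus_needed, max_latency_ms, allowed_jurisdictions, budget_per_hour):
    """Return the cheapest site that satisfies every constraint, or None."""
    eligible = [
        s for s in sites
        if s.free_vcpus >= vcpus_needed                              # capacity
        and s.latency_ms <= max_latency_ms                           # performance
        and s.jurisdiction in allowed_jurisdictions                  # regulatory
        and s.price_per_vcpu_hour * vcpus_needed <= budget_per_hour  # cost
    ]
    return min(eligible, key=lambda s: s.price_per_vcpu_hour, default=None)

sites = [
    Site("central-dc", 5000, 40.0, 0.02, "DE"),
    Site("metro-edge", 200, 8.0, 0.06, "DE"),
    Site("far-edge", 16, 2.0, 0.15, "FR"),
]
best = place(sites, vcpus_needed=8, max_latency_ms=10,
             allowed_jurisdictions={"DE"}, budget_per_hour=1.0)
print(best.name)  # the metro edge site wins on latency and jurisdiction
```

The central data center is cheapest but too far; the far edge is fastest but in the wrong jurisdiction; the metro edge satisfies everything. The hard part, as argued above, is not this calculation but agreeing on a common model so the inputs exist at all.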

This means we need an industry-wide effort for cloud hybridization at the business layer. All network operators need to present the same set of APIs and connectivity services to all cloud operators if we want to see this market move in the right direction.
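
One way to picture such a common business-layer surface: every operator implements the same small contract for discovering edge zones and reserving capacity, so a cloud provider integrates once and deploys everywhere. The interface below is purely a hypothetical illustration, not any existing standard or product API.

```python
# Hypothetical common operator-facing API: if every network operator exposed
# the same minimal contract, cloud providers could integrate once. Every
# class, method and field name here is an illustrative assumption.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class EdgeZone:
    zone_id: str
    city: str
    latency_ms_p95: float  # advertised latency bound to local subscribers

@dataclass
class Reservation:
    reservation_id: str
    zone_id: str
    vcpus: int

class OperatorEdgeAPI(ABC):
    """The same contract, implemented by every operator."""

    @abstractmethod
    def list_zones(self) -> list[EdgeZone]:
        """Discovery: where can capacity be reserved?"""

    @abstractmethod
    def reserve(self, zone_id: str, vcpus: int) -> Reservation:
        """Reservation: claim compute capacity in a zone."""

    @abstractmethod
    def release(self, reservation_id: str) -> None:
        """Fulfillment: hand capacity back when done."""

class DemoOperator(OperatorEdgeAPI):
    """Toy in-memory implementation, just to show the calling pattern."""
    def __init__(self):
        self._zones = [EdgeZone("z1", "Madrid", 6.0)]
        self._count = 0
    def list_zones(self):
        return self._zones
    def reserve(self, zone_id, vcpus):
        self._count += 1
        return Reservation(f"r{self._count}", zone_id, vcpus)
    def release(self, reservation_id):
        pass

op = DemoOperator()
r = op.reserve(op.list_zones()[0].zone_id, vcpus=4)
print(r.reservation_id)  # prints r1
```

The point is not the specific fields but the uniformity: one contract, many operators, so that a fifth or fiftieth cloud model does not multiply the integration cost.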

Thursday, October 17, 2019

A brief history of the cloud

Our industry is undergoing a sea change. Entire service categories where network operators had a leading position have shifted to new entrants.

It started with app stores and social networks for portals, then Skype for voice, then WhatsApp for messaging, then Netflix for video…

Each of these disruptions has been the product of an ambitious digital services strategy, coupled with technological advances that left previous generations instantly obsolete.

Economic disparities across populations have also been an important trend to consider. Whereas in the past a large portion of the population wanted a robust, reliable service, we have seen market segments that used to be marginal emerge in strength, where affordability has become more important than reliability.

The "best in class", no longer affordable for most, has shifted to "good enough" but free or inexpensive for many.

The new digital companies, unencumbered by analog technology legacy, started discovering new ways to scale up IP infrastructure explosively to sustain their growth.

The webscalers were born, and with them, the idea that infrastructure does not have to be dedicated to a single purpose or service but can be shared across services and geographies. This came from the observation that adding orders of magnitude of capacity per year was impractical, slow and costly with legacy networking.

Legacy networking was assembled of specialized appliances, dedicated to a single purpose, integrated into a coherent but rigid network topology. It was reliable but not efficient, stable but complex.

The first breakthrough was to use commercial off-the-shelf hardware: unspecialized servers, which could make up for their lower performance relative to specialized appliances with a much cheaper price point. If a server is 3x less efficient than an appliance but 10x cheaper, it is an easy calculation. Moore's law accelerated this shift tremendously.
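
The "easy calculation" above, worked out in performance per dollar (the units are arbitrary; only the 3x and 10x ratios from the text matter):

```python
# Back-of-envelope comparison: a COTS server that is 3x less efficient than a
# specialized appliance but 10x cheaper still wins on performance per dollar.
appliance_perf, appliance_cost = 1.0, 10.0
server_perf, server_cost = appliance_perf / 3, appliance_cost / 10

appliance_perf_per_dollar = appliance_perf / appliance_cost  # 0.10
server_perf_per_dollar = server_perf / server_cost           # ~0.33

advantage = server_perf_per_dollar / appliance_perf_per_dollar
print(round(advantage, 2))  # the server delivers ~3.33x more performance per dollar
```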

But with more servers and physical elements came more operational complexity. The second breakthrough was to centralize the management of all these servers. Whereas in the past it was necessary to physically access each server to configure or troubleshoot it, the new method made it possible to remotely access any server in any network, and furthermore to provide a consolidated, centralized view of the physical attributes and health of a whole network from a single web portal. Beyond simple monitoring, increasingly sophisticated management routines were built in to deploy, manage and scale programs remotely.

The ability to manage software remotely meant not only that the network manager no longer needed to be physically present at the data center, but also that it was possible to decentralize, deploying several data centers in different geographies and managing them remotely. The cloud was born.

Thursday, September 5, 2019

The cloud and the network edge

There is so much talk about edge computing these days that it is easy to feel confused. There seem to be as many definitions of what and where edge computing is as there are vendors and service providers.

While the amount and type of computing and its location are debatable, there is an element that I feel is missing from the discussion. For me, the edge of a cloud or a network is the closest point to the user / developer that can provide a continuous and consistent experience. This notion of continuity is key, in my mind, to the definition of edge computing. After all, deploying compute or storage capability at a remote location to provide local or decentralized service is not a novelty and can hardly qualify as edge computing.

The clouds' edges

Beyond compute and storage, connectivity is an equally important and often ignored component of edge computing. To provide a consistent and continuous experience, a provider has to control not only the routing but also the connectivity performance associated with its services. A cloud's performance is a function not only of its raw compute capabilities, but also of its geographic scale and the provider's capability to deliver a homogeneous service across a variety of regions. This is why large cloud providers routinely flaunt not only the size and number of their data centers but also the dedicated networks that physically connect them and guarantee cross-regional performance. Deploying data centers near the locations where cloud services are consumed can have a radical impact on user experience, for consumers and developers alike. This is why cloud providers tend to emphasize proximity.

Performance is not strictly related to physical distance, though; or rather, it is more related to your physical distance to the peering point (where your internet service provider's backbone ends and your cloud provider's begins) and the number of hops and devices your traffic has to go through. The distance might be relatively short, but if the traffic has to go through several peering points, or if the peering is unprioritized with the rest of the internet traffic, performance might vary dramatically.
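
A rough model of this point: propagation in fiber contributes roughly 5 microseconds per kilometer, but each hop and each congested peering point adds processing and queuing delay on top. All figures below are illustrative assumptions, chosen only to show that a short, congested path can lose to a long, direct one.

```python
# Rough one-way latency model: propagation delay plus per-hop and per-peering
# overhead. The per-hop and per-peering figures are illustrative assumptions.
def latency_ms(distance_km, hops, peering_points, per_hop_ms=0.5, per_peering_ms=5.0):
    propagation = distance_km * 0.005  # ~5 us/km in fiber
    return propagation + hops * per_hop_ms + peering_points * per_peering_ms

short_but_congested = latency_ms(distance_km=50, hops=12, peering_points=2)
long_but_direct = latency_ms(distance_km=600, hops=4, peering_points=0)
print(short_but_congested, long_but_direct)  # 16.25 vs 5.0: the longer path wins
```

Under these assumptions, 50 km through two peering points is three times slower than 600 km on a dedicated backbone, which is exactly why providers flaunt their private networks.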

A cloud provider is only able to measure and guarantee the performance of its "inner" network, up to where traffic enters or exits to the internet. If we accept that an edge is the point at which a provider can deliver a continuous and consistent experience, then the cloud's edge is certainly somewhere between its centralized data centers and the peering points. Beyond that, as traffic enters the internet, performance is no longer guaranteed or controlled.

This is why we have lately seen Amazon, Google and Microsoft start discussing with network operators the possibility of deploying their edge in fixed and mobile networks. The value proposition is simple: let us deploy our mini data centers in your networks, so that we can provide the same cloud services your customers consume, with enhanced performance due to the reduction of traffic transiting through the unmanaged internet.

Each provider is looking at a slightly different positioning, ranging from Space as a Service (a fully managed infrastructure run by the cloud provider, deployed in a telco data center, with a nominal revenue share on traffic and resource utilization) to an IaaS or PaaS do-it-yourself edge compute environment based on open-source cloud computing tools. The business model is underpinned by the size of each provider's developer community and its ability to monetize the investment through a differentiated user experience. From their perspective, deploying their infrastructure in a telco data center is a natural organic expansion of their growth, and they have trouble understanding the telcos' resistance to accepting essentially free infrastructure that would generate new revenue streams almost immediately.

The telco network's edge

While most telco providers have dabbled in cloud services, they have not been really successful at providing a convincing cloud offering, unable to match the geographical scale or the lifecycle development services of the traditional cloud providers.

Many see edge computing as a chance to re-enter the cloud value chain by providing a valuable performance enhancement based on an asset that is expensive and difficult to rival: their network capillarity. As 5G investment business cases are being examined, it is clear that the consumer retail market alone is unlikely to justify the investment. Wholesale, IoT and B2B2x might be the key to sustainable growth. If 5G's promise is to deliver a substantially different user experience (not just more speed / volume, but different services with guaranteed performance), edge computing is going to be an important part of the deployment options.

Some bet that edge computing will be a market opportunity large enough that a new ecosystem can develop, with developers, providers and clients that can be distinct from the traditional cloud value chain. Their bet is that they can control that ecosystem if they define the infrastructure and the services lifecycle management, and develop a platform for developers to deploy and consume edge services.

Burned to some degree by their experience with video caching - where telcos were too slow to identify the opportunity and accepted video providers' proprietary caches in their networks in exchange for a revenue share that never materialized - telcos resist the idea of deploying dark infrastructure in their networks that they won't be able to monitor, control or monetize.

This is why they tend to wait for, or actively work on, a standards-based edge computing ecosystem, where they can buy and deploy cloud technology at their edge and offer PaaS or SaaS to developers.

The battle at the edge

It is understandable at this stage of market maturity that the different actors in the value chain try to capture as much control as possible until a consensus is reached. I suspect that whether edge computing is more about cloud or more about connectivity could dictate the position of the actors in the resulting ecosystem.

Beyond adversarial positioning, there is certainly a hybrid approach that would merit investigation. While telcos will certainly struggle, for years to come, to create an edge computing infrastructure and platform that offers developers a homogeneous experience irrespective of network or country at the same level as the cloud providers, it is also unlikely that cloud providers will have the appetite to operate and manage not a few but thousands of data centers worldwide if latency becomes a real competitive advantage.

From that perspective, there is surely an opportunity for collaboration, where cloud providers could contribute their cloud infrastructure, development and management tools, and platforms as a service to telcos, which could assemble and operate them in their networks, controlling access and utilization for their internal and own-brand services, while allowing the cloud providers to sell their services in the same environment. An additional benefit would come from deeper integration, allowing cloud providers to consume, operate and resell slices or QoE packages of their cloud services in the edge environment.

This scenario requires a little more openness on both sides, and the understanding that the edge battle is unlikely to be won by any (category of) player alone.

I have been playing around over the last few years with telco edge computing, Amazon Outposts, Google Anthos and Microsoft Azure Stack. There is a path to creating a lot of value, but it requires a level of integration that goes beyond today's comfort level for both cloud providers and telcos. I bet the first collaboration of that sort will create a de facto standard in the industry and enjoy a strong first-mover advantage.

Wednesday, April 3, 2019

Gaming in the edge of cloud

As I wrap up a couple of weeks immersed in the world of cloud gaming, I thought I would share some of what I have learned and a few opinions on the subject while it is still fresh in my mind.

First, a short confession: I have been a gamer since my first Atari Pong and Activision consoles, though I don't play as much as I would like, since I have not been able to reach my lifelong ambition of being paid to play video games.

An opinion formed and became self-evident through my various meetings at the Game Developers Conference last week:

Gaming is like video streaming a few years ago. As video streaming appeared, we used to look at a fraction of mobile users as "bandwidth hogs", with 5-10% of them using 70-80% of data capacity. Within a few years, as LTE was implemented at scale and higher capacity became available, we found out that we were all bandwidth hogs: we were all willing to stream video if the networks were fast and reliable and if the costs were reasonable.

I feel that gaming will go through a similar aha moment once we make every game available on any device, at a reasonable price, without having to buy or build expensive PCs or consoles. We are all gamers, we just don't know it yet.

As I turn my attention to this market, I find that much of my gaming experience still has a lot of friction:
  • Buying a console game requires going to a store or a long download (if I have enough storage left)
  • Buying a PC game requires the same, with the added effort of ensuring that I have enough graphics and compute capacity to run it well. If you're hardcore, you start chasing latency by buying specialized keyboards and mice for fast twitch response.
  • Once I put the disc in my system, I usually have to wait for updates, patches, installation...
  • Once I start playing, my community is usually tied to my console or service; the Venn diagram of my physical and digital friends has overlaps that are artificial
  • I would like to think I would get better results if I had better connectivity, as lag affects my performance, particularly in first-person shooters.
  • I still can't play my favourite games (well) on my phone or tablet.

All in all, gaming is already great, but there are many things we could do as an industry to make it better. As I look at the market and the most popular games, there are a number of clear trends:
  • Game studios are starting to enforce real multiplatform play, thanks to Fortnite
  • There is interest in extending the life cycle of a game from one-shot purchase, to downloadable content, to subscription
  • To satisfy players and keep them engaged, MMO (Massively Multiplayer Online) is key
  • Freemium can work (thanks, Fortnite, Apex...)
  • Cloud streaming works OK for single player but struggles for MMO, particularly looking forward to 1080p, 2K, 4K, VR / AR...
  • Gaming is still a large-screen-first experience

Online gaming requires the cloud. Cloud gaming requires excellent connectivity. Game streaming requires a better cloud and a better telco. Edge computing might be able to help there.