OpenStack was born of two innovators: one a private company founded by three university classmates and recently sold for billions, the other the agency that landed the first humans on the moon. Made free for all to use, the platform now powers the world's largest railway, mobile network, automotive manufacturer and payment card provider, and the list keeps growing. But what if that infrastructure extended beyond the provider's datacentre to your train, to the telephone pole on your street, even into your car? With Taiwan fast becoming a global centre of attention for IoT and Industry 4.0, it is worth thinking about the role open source cloud software could play in this transition.
Edge computing takes application logic, processing power and associated data out of highly centralised datacentres and pushes them to the far boundaries of the network. The benefits include improved reliability, better performance (chiefly from lower network latency and reduced data movement), greater scalability, and new capabilities such as geographically tailored services.
Early Adopters
Though edge computing is a relatively new philosophy, successful early adopters can readily be found.
For instance Verizon, the largest cellphone provider in the USA, is bringing software-defined networking benefits normally limited to massive cloud deployments, literally, to local coffee shops. The company's Linux/OpenStack-based “Universal” CPE allows rapid changes to circuit configuration, or near-instant provisioning of higher-level network services (e.g. an IDS), at the click of a button. It also gives technical staff at businesses with dispersed locations the monitoring and access that may previously have required a long car ride. All this from a little white box about the size of your wireless router (larger units are available, too), probably stashed on top of the cupboard where the coffee beans live.
More regionally, SKT (Korea) and Telstra (Australia) have already made significant starts on their 5G networks. SKT aims to virtualise its network functions not just in the core, but all the way out to the cell tower.
Use Cases
Recently, we gathered IT leaders from around the world, across a wide range of industries, at an event in Silicon Valley. It reminded me a little of a cloud conference ten years ago, where no-one could agree on precisely what “cloud” was. However, over two days some patterns emerged, and we compiled the following list of use cases to define “edge”, as seen from the perspective of companies adopting OpenStack to deliver it.
|        | Hardware                  | Network                                                        | Number of applications | Frequency of device access | Frequency of application update |
|--------|---------------------------|----------------------------------------------------------------|------------------------|----------------------------|---------------------------------|
| Micro  | Single system on a chip   | Low bandwidth, unreliable                                      | Single                 | Never to rare              | Infrequent                      |
| Small  | Up to 1 RU, usually less  | Low bandwidth, unreliable                                      | Multiple               | Rare                       | Occasional                      |
| Medium | 4 RU to a half rack       | Fast; either reliable (critical) or unreliable (non-critical)  | Up to a dozen          | Occasional                 | Daily                           |
| Large  | One rack and up           | Fast, diverse fibre paths                                      | Potentially thousands  | Occasional                 | Daily                           |
Micro Edge Device
Readers familiar with edge computing will readily identify the smallest use case: it matches the typical definition of “IoT”, a small, low-powered device with network connectivity, typically running one application at a time. It could be a 5G-connected sensor, your ChungHwa Telecom MOD box, or a consumer router. One company even suggests a child's soft toy could be a Micro Edge Device.
Connectivity to such a device may be low bandwidth or intermittent. Updates to the device are relatively rare, but must be possible at large scale so that security fixes can reach millions of devices. Configuration information arrives via the network.
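To make that pull-based model concrete, here is a minimal Python sketch of a device that periodically asks a central service for its configuration and tolerates the link being down. The endpoint URL, polling interval and configuration format are illustrative assumptions, not details of any particular product.

```python
# Hypothetical sketch: a micro edge device pulling configuration from a
# central endpoint over an unreliable, low-bandwidth link. The URL, file
# path and back-off values are assumptions for illustration only.
import json
import time
import urllib.request

CONFIG_URL = "https://config.example.com/device/12345"  # assumed endpoint
POLL_INTERVAL = 6 * 60 * 60                             # check a few times a day


def fetch_config(url, retries=5, backoff=30):
    """Try to fetch configuration, backing off while the link is down."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp)
        except OSError:
            # Link is unreliable: wait progressively longer and try again.
            time.sleep(backoff * (attempt + 1))
    return None  # keep running on the last known-good configuration


def apply_config(config):
    """Placeholder: persist the new configuration for the single application."""
    with open("device.conf", "w") as f:
        json.dump(config, f)


while True:
    config = fetch_config(CONFIG_URL)
    if config is not None:
        apply_config(config)
    time.sleep(POLL_INTERVAL)
```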
Small Edge Device
From there it is a small step up to our Verizon example from earlier: a slightly larger device, up to a 1 RU server, that can support multiple applications. These devices are most commonly connected over slow and unreliable DSL links, and unlike the Micro case, the set of applications they run can also change occasionally.
The flexibility this use case offers relative to its size means it is in higher demand than either smaller or larger edge platforms. Vendors currently expect to find such a device in coffee shops, in small retail outlets as a point-of-sale system, or in a Fiber-to-the-Node cabinet. Telcos also see a cell tower site as a Small Edge Device, albeit one with more reliable bandwidth.
Medium Edge Deployment
Moving beyond small devices that can be labelled “IoT” may be unfamiliar territory for some. However, the methodologies for dealing with remote, infrequently-accessed devices apply just as well when the device is a collection of servers running a few more applications. These applications could change daily, and the site can be either backhaul-critical (e.g. cell sites) or backhaul non-critical (e.g. a retail store).
This extends the edge concept to sites such as retail stores in a chain. There may be thousands of stores under a brand (example OpenStack user: Walmart), which today rely on complicated systems to manage the connection of product pricing, payroll and surveillance back to central datacentres.
Something as benign as keeping the in-store music playing can be a challenging task across thousands of remote sites. Edge concepts can be used to cache music, update the catalogue and monitor for problems without needing a knowledgeable person at each location. For a retail store, keeping the cash registers running is mission critical: a local edge deployment should be able to keep the store trading during a network outage and seamlessly re-integrate with its regional datacentre once the issue is resolved.
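As a rough illustration of that outage tolerance, the sketch below is a hypothetical Python example (not taken from any real point-of-sale product) that queues transactions locally while the backhaul is down and forwards them to the regional datacentre when the link returns. The endpoint, file name and record format are all assumptions.

```python
# Hypothetical store-and-forward sketch for a point-of-sale terminal at an
# edge site. Sales are always written to a local journal first, so the store
# keeps trading through a backhaul outage; a background step drains the
# journal to the regional datacentre whenever the link is available.
import json
import os
import urllib.request

JOURNAL = "pos_journal.ndjson"                         # assumed local spool file
REGIONAL_API = "https://dc.example.com/transactions"   # assumed endpoint


def record_sale(sale: dict) -> None:
    """Commit the sale locally; this must succeed even with no network."""
    with open(JOURNAL, "a") as f:
        f.write(json.dumps(sale) + "\n")


def sync_to_region() -> None:
    """Best-effort upload of queued sales; leave the journal intact on failure."""
    if not os.path.exists(JOURNAL):
        return
    with open(JOURNAL) as f:
        pending = [json.loads(line) for line in f if line.strip()]
    try:
        req = urllib.request.Request(
            REGIONAL_API,
            data=json.dumps(pending).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=10)
    except OSError:
        return  # still offline; try again on the next cycle
    os.remove(JOURNAL)  # everything acknowledged, clear the local queue


record_sale({"item": "coffee beans", "amount": 250, "currency": "TWD"})
sync_to_region()
```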
In 5G Radio Access Networks (RAN), tasks are split between real-time and non-real-time. Cell towers need to perform complex real-time manipulation of radio beams to maintain connections to thousands of devices, and this work should be done as close as possible to the tower. Handover of a device from one tower to another, however, can afford a few more milliseconds (a long time in the telco world!) and might be better done at a more centralised location.
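A back-of-the-envelope latency calculation shows why distance matters here. The sketch assumes light travels at roughly 200,000 km/s in fibre and ignores switching and processing delays entirely.

```python
# Rough fibre propagation delay: ~200,000 km/s (about 2/3 the speed of light),
# ignoring switching and processing time.
SPEED_IN_FIBRE_KM_PER_MS = 200.0


def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay over a fibre path of the given length."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS


for km in (1, 20, 100, 500):
    print(f"{km:>4} km away: ~{round_trip_ms(km):.2f} ms round trip")

# Approximate output: 0.01 ms, 0.20 ms, 1.00 ms, 5.00 ms.
# Sub-millisecond beam management effectively has to live at or near the
# tower, while handover logic with a few milliseconds of budget can sit
# further back at a regional site.
```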
Large Edge Deployment
Somewhat contentiously, companies with a large datacentre footprint may consider a remote, unstaffed server room an edge site even if it is of a reasonable size (a rack of servers or more, running potentially thousands of applications). Although such sites are connected to multiple reliable, high-bandwidth fibre paths, physical access to the facility is relatively infrequent, so large organisations could still benefit from treating them with an edge approach. The applications in a Large edge deployment change constantly.
Diversity of Use Cases
Reading the above, it may have become clear that edge computing takes many forms, and that the definitions will take some time to settle. Even so, the industry already appears ready to agree on some key principles. One recurring discussion, for instance, is the current lack of vendor support for this style of deployment. Perhaps that is a space your organisation could fill?
Security: Unsolved Problem or Improved Situation?
Current datacentres are highly resilient, bordering on fortresses. The latest in physical security and biometric sensors limits access to the critical servers within. Some quip that bringing infrastructure to the network edge abandons that physical security, something akin to throwing open those heavy doors and inviting any member of the public to enter with whatever tools they wish. After all, that cell phone site on a remote mountain road is secure only from hackers who don't own a ladder.
However, pointing to recent compromises of applications hosted in central datacentres, edge advocates note that networks can be designed so that each step between the edge and the network core introduces additional data checks, encryption and firewalls. This layered approach could identify hacks and viruses earlier than the traditional model, and limit any compromise to a smaller surface.
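As a toy illustration of a per-hop check, the Python sketch below (hypothetical, and not a description of any particular operator's design) has an aggregation point verify an HMAC on each edge reading before passing it towards the core, so tampered data is dropped at the first layer that notices.

```python
# Toy per-hop data check: an aggregation site verifies an HMAC on each
# reading received from edge devices before forwarding it toward the core.
# The shared key and message format are illustrative assumptions.
import hashlib
import hmac
import json

SHARED_KEY = b"per-site-secret"  # in practice, per-device keys from a key store


def sign(reading: dict) -> dict:
    """What an edge device would attach before sending a reading upstream."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "tag": tag}


def verify_and_forward(message: dict) -> bool:
    """Drop anything whose tag does not match; forward the rest upstream."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["tag"]):
        return False  # tampered or corrupted: stop it at this layer
    # forward_to_core(message)  # the next hop would repeat its own checks
    return True


good = sign({"sensor": "pole-17", "temp_c": 31.2})
assert verify_and_forward(good)
bad = dict(good, payload={"sensor": "pole-17", "temp_c": 99.9})
assert not verify_and_forward(bad)
```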
Regardless, a new deployment approach is a good time to think further on application and network security. With the recent high profile data leaks and hacks, the industry seems keen to do so.
So, What about OpenStack?
From the beginning, OpenStack has been a set of components available for deployers to select (or not) at their option. For instance, there are object storage clusters running quite happily on the Swift project alone, without any other pieces of OpenStack. However, greater focus on this “composability” is needed to support edge cases, particularly at the small-device end of the scale. Our community is committed to making this happen.
How much of the cloud runs at each edge site depends on its size, as per the use cases. A Large Edge deployment is probably going to run a full OpenStack deployment, whereas a Medium Edge deployment may be a “region” or a “cell” of OpenStack Compute (nova). These cases work fairly well today. While Verizon has been successful in deploying OpenStack on a Small Edge Device, the configuration required was complicated, and the choices of what to include or exclude are not yet simple to make. This is where most of the work will occur over the next year.
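As a small illustration of treating each edge site as its own region, the following Python sketch uses the openstacksdk library to boot a workload in a named region. The cloud name, region name, image, flavor and network are placeholders, not details of any deployment described above.

```python
# Illustrative sketch using openstacksdk: point at an edge site that is
# modelled as its own region and boot a small workload there. "mycloud",
# the region name, and the image/flavor/network names are placeholders
# for values defined in the operator's clouds.yaml and cloud.
import openstack

# Connect to the region that represents a particular edge site.
conn = openstack.connect(cloud="mycloud", region_name="edge-site-tpe-01")

image = conn.compute.find_image("cirros")           # placeholder image name
flavor = conn.compute.find_flavor("m1.tiny")        # placeholder flavor name
network = conn.network.find_network("edge-local")   # placeholder network name

server = conn.compute.create_server(
    name="pos-cache",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```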
To accelerate this, the OpenStack community also collaborates with others. ETSI (the European Telecommunications Standards Institute) is working to ensure that next-generation telecommunications networks are well served. OpenEdge Computing provides a real-world test bed to demonstrate the capability of edge, and the French Institute for Research in Computer Science and Automation (Inria), in partnership with Orange Labs, is realising the edge vision for science. If you or your organisation is interested, we welcome you to join the OpenStack community.
Find out more at https://www.openstack.org/edge, or read about OpenStack at http://superuser.openstack.org.