We just announced the use of Ensemble Orchestrator and Connector in DartPoints’ micro data center deployments. This deployment is generating a lot of interest because it’s a bold use of network functions virtualization (NFV) for service delivery. While everyone’s talking about NFV and service innovation, DartPoints is deploying it. The DartPoints use case is really interesting, so let me tell you a little bit about it.
DartPoints is deploying micro data centers that are co-located with their customers. DartPoints’ customers are typically small- to medium-sized businesses that are looking for cost-effective ways to create data centers that can be customized for their particular business needs. By co-locating with DartPoints, a business gets exactly what it needs at a fraction of the cost of developing a data center on its own. Why is co-location so valuable for customers? Why is NFV important for this application – and why is the micro data center application important for NFV? To answer those questions, let’s take a look at how things were done before.
Problems With Remote Macro Data Centers
All we hear about these days is the cloud. That’s because the cloud is growing in importance for all aspects of communications services and business applications. Enterprises want to take advantage of low-cost managed services and computing. However, they face complexities in reaching this goal.
- Distance – The remote nature of today’s data centers introduces delay or latency, which slows performance and application response times.
- Security – Remote data centers are reached over the public internet, which is inherently insecure. Privacy can be achieved with encryption, but that adds cost and delay.
- Network Performance – Access to the data center over the public internet is best-effort, meaning that performance is unpredictable. Replacing internet access with private lines is very expensive.
Extending the Cloud
If customers have problems getting to the cloud, why not bring the cloud to them? That’s what DartPoints is doing: extending the cloud by bringing it to the customer. Access to the micro data center can now be fast and secure using local fiber links going directly between the customer and the micro data center.
The DartPoints micro data center concept is similar to fog computing in that it improves communications efficiency, reduces latency and strengthens security:
Fog computing, also known as fogging, is a distributed computing infrastructure in which some application services are handled at the network edge in a smart device and some application services are handled in a remote data center – in the cloud. The goal of fogging is to improve efficiency and reduce the amount of data that needs to be transported to the cloud for data processing, analysis and storage. This is often done for efficiency reasons, but it may also be carried out for security and compliance reasons.
In a fog computing environment, much of the processing takes place in a data hub on a smart mobile device or on the edge of the network in a smart router or other gateway device. This distributed approach is growing in popularity because of the Internet of Things (IoT) and the immense amount of data that sensors generate.
The big difference is that fog computing centers on very small compute nodes supporting IoT and its large volumes of sensor data, whereas micro data centers serve traditional computing applications.
Making NFV More Efficient by Pooling Resources
Micro data centers are great for end users. But what about service providers?
Micro data centers help solve one of the big questions of NFV: where to host the virtual network functions (VNFs). Hosting VNFs centrally gives the lowest cost and the best opportunity for re-use of resources, but it doesn’t meet the needs of applications that require hosting at or near the customer site. Micro data centers bridge this gap by moving the hosting out to the customer while preserving the benefits of scale. Tom Nolle recently wrote:
It may be that the most interesting thing about the micro-datacenter concept is the fact that it’s a jump to an edge-distributed multi-tenant cloud for VNF hosting. One of the problems I’ve identified with NFV progression from the virtual CPE model is that the logical next step is to start building a central resource pool to offload functions from the edge. That’s obviously the next move from a financial standpoint, but it creates long data paths and it also creates the risk that all your VNFs are now stranded in the center of each metro area, when many NFV applications (mobile services, content delivery, and IoT to name a few) are better served if you host them at the edge.
NFV Brings Service Agility to the Micro Data Center
Micro data centers are good for customers, and they improve the deployment of NFV. So, what does NFV bring to the micro data center game?
DartPoints has demonstrated one advantage: achieving service agility by replacing appliances (routers, firewalls, etc.) with VNFs hosted on open servers. By orchestrating these VNFs, DartPoints can give its customers services on demand and without truck rolls. That means customers get what they want, when they want it.
In addition, the micro data center approach enables DartPoints to achieve some economies of scale. Not only does it prevent stranded assets, but it also lets DartPoints use NFV in an innovative fashion by hosting multiple customers on a single VNF.
The cloud is growing and will soon be everywhere. DartPoints is helping speed up this growth by bringing the cloud to its customers. I expect to see a lot more of this type of NFV-enabled innovation being commercialized in 2016.