Colocation at the Edge: Why regional data centers and hybrid clouds can maximize performance

A year ago, there were more than 500 hyperscale data centers in operation worldwide, with more than 170 more in the pipeline, according to Synergy Research. But the explosion in cloud adoption – public and private – isn’t just driving demand for hyperscale facilities. Driven by the IoT and the advent of 5G, the continuing decentralization of the cloud is also contributing greatly to the growing shift towards distributed ‘edge’ computing.

The edge cloud environment is currently playing a key role in scaling the cloud down to the local level. This allows much of the processing, storage, control, and management of local applications to take place closer to users, machines, and devices. Latency is significantly reduced and application responsiveness optimized, maximizing the productivity, efficiency, competitive advantage, and user and customer experience of the business. Low latency also underpins the availability and future performance potential of 5G mobile network coverage; super-fast streaming video for content delivery providers; real-time cloud gaming; real-time AI, machine learning and deep learning decision-making in industrial automation and medical environments; precise control of unmanned vehicles; and more.

“Driven by IoT and 5G, the continuous decentralization of the cloud is making a huge contribution to the growing shift towards distributed edge computing”

For lower latency and greater flexibility, data centers must be able to quickly provision and scale compute and storage resources as precisely as needed, without compromising IT security and resilience. At the same time, it is important to note that edge computing in edge data centers complements rather than competes with public cloud services.
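To make the idea of rapid, precise provisioning more concrete, here is a minimal sketch using the official Kubernetes Python client to scale a workload running in an edge cluster. The deployment name, namespace, and replica count are illustrative assumptions rather than anything from this article, and a real deployment would also need policy, security, and monitoring around it.

```python
# Minimal sketch: scale an edge workload up or down on demand using the
# official Kubernetes Python client. Names and numbers are illustrative.
from kubernetes import client, config


def scale_edge_workload(deployment: str, namespace: str, replicas: int) -> None:
    """Patch the replica count of a deployment running in an edge cluster."""
    config.load_kube_config()  # assumes a kubeconfig pointing at the edge cluster
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=deployment,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )


if __name__ == "__main__":
    # Hypothetical example: burst a video-analytics service to 10 replicas.
    scale_edge_workload("video-analytics", "edge-apps", 10)
```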

Hence, CIOs and developers that depend on the lowest latency may need to consider the best place to deploy and support new services, and rethink their network architecture. In doing so, large businesses and SMEs, as well as cloud and telecom service providers, will benefit from keeping data and applications closer to users and customers, while less time-sensitive, non-mission-critical data is sent to a centralized public cloud for further analysis or storage.
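As a rough illustration of that split, the sketch below shows one hypothetical way a service might decide where to send work: latency-sensitive or mission-critical records stay at the edge site, while everything else is forwarded to a centralized public cloud for later analysis or storage. The endpoints and latency budget are assumptions made purely for the example.

```python
# Hypothetical routing rule: handle latency-sensitive work at the edge,
# forward non-mission-critical data to a centralized public cloud.
from dataclasses import dataclass

EDGE_ENDPOINT = "https://edge.example-colo.local/ingest"          # assumed URL
CENTRAL_ENDPOINT = "https://analytics.example-cloud.com/ingest"   # assumed URL


@dataclass
class Record:
    payload: bytes
    max_latency_ms: int       # how quickly the producer needs a response
    mission_critical: bool


def choose_target(record: Record, edge_budget_ms: int = 20) -> str:
    """Return the endpoint a record should be sent to."""
    if record.mission_critical or record.max_latency_ms <= edge_budget_ms:
        return EDGE_ENDPOINT        # process close to users, machines, devices
    return CENTRAL_ENDPOINT         # batch analysis and long-term storage


# A control-loop reading stays at the edge; bulk telemetry goes central.
assert choose_target(Record(b"", 10, True)) == EDGE_ENDPOINT
assert choose_target(Record(b"", 5000, False)) == CENTRAL_ENDPOINT
```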

In addition to improving latency, keeping data local can significantly reduce the cost of backing it all up to one or two hyperscale data centers. The cost of transferring high volumes of data can be enormous, as in the case of autonomous vehicles.
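To see the shape of that cost, here is a back-of-the-envelope estimate. Every figure in it (per-vehicle data volume, fleet size, per-GB transfer price) is a made-up assumption used only to show the calculation, not a quoted number.

```python
# Back-of-the-envelope data transfer cost estimate. All figures are
# hypothetical and exist only to show the shape of the calculation.
def monthly_transfer_cost(gb_per_vehicle_per_day: float,
                          vehicles: int,
                          price_per_gb: float,
                          days: int = 30) -> float:
    """Estimated monthly cost of shipping all raw data to a central cloud."""
    return gb_per_vehicle_per_day * vehicles * days * price_per_gb


# e.g. 100 GB per vehicle per day, a 500-vehicle fleet, $0.05 per GB egress:
cost = monthly_transfer_cost(100, 500, 0.05)
print(f"~${cost:,.0f} per month before any local filtering")  # ~$75,000
```

Processing and filtering that data at a regional edge site, and only forwarding summaries, shrinks the volume that ever has to cross the wide-area network.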

Technical approach to hybrid cloud

In response to new and growing market requirements, a more regional edge data center colocation solution has become essential. This directly addresses the data transfer latency and cost issues that often arise with centralized cloud business models that are too reliant on data centers in remote locations – at the other end of the country or even further away. Edge colocation facilities are purposely designed to fill the significant gap between micro-modular (unmanned) data centers – located at the edge of the network, for example next to a cell tower, on a factory floor or in a hospital complex – and the centralized hyperscale facilities.

However, optimizing the best of both worlds between public and local private clouds requires data centers strategically located for regional internet exchanges as well as fiber connectivity, with diverse optical carriers on site. A hybrid architecture – combining public, private, and perhaps legacy on-premises IT – will also often be required, creating complex engineering challenges.
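One practical engineering task in such a hybrid design is simply measuring where latency actually comes from. The sketch below compares TCP connect times to a candidate regional edge site and a distant cloud region; both hostnames are placeholders, and connect time is only a crude proxy for application-level latency.

```python
# Rough round-trip comparison between a regional edge site and a remote
# cloud region, using TCP connect time as a crude latency proxy.
import socket
import time

CANDIDATES = {
    "regional-edge": ("edge-colo.example.net", 443),           # placeholder
    "remote-cloud": ("far-region.example-cloud.com", 443),     # placeholder
}


def connect_time_ms(host: str, port: int, timeout: float = 2.0) -> float:
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000


for label, (host, port) in CANDIDATES.items():
    try:
        print(f"{label}: {connect_time_ms(host, port):.1f} ms")
    except OSError as exc:
        print(f"{label}: unreachable ({exc})")
```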

Application migration will determine the hybrid strategy, and one size does not fit all. Building the business case and preparing the groundwork can be difficult: which applications will be placed in the edge data center and which in the hyperscale data center; how long it will take to migrate all of the applications to the new infrastructure; the skills and experience available in the IT department; what to do with any remaining legacy on-premises IT infrastructure; and the software essential to manage all environments in a hybrid deployment.

“To optimize the best of both worlds between public and local edge private clouds requires data centers strategically located for regional internet exchanges, as well as diverse on-site fiber optic carrier connectivity”

With the above in mind, the level of on-site technical capability available at regional colocation sites will be critical. Direct connection to public cloud provider infrastructure via on-site ports is another factor, along with the flexibility to carry out pre-production testing in the data center to ensure everything works before launch.

Comprehensive view

The ‘need for speed’ in achieving a low-latency connection, coupled with greater bandwidth and the benefits of reduced data transmission costs, should not distract from the fundamentals of colocation: keeping data and storage systems available 24/7 by providing secure and resilient critical infrastructure.

It is wise to check physical and network security, the power supply arrangements, the types of cooling systems used and the overall energy efficiency (PUE); the use of 100% renewably sourced energy should be expected as standard now, but also consider how a potential data center provider is addressing sustainability more broadly. Finally, require proof of uptime service records, certified security and operational credentials, DR and business continuity redundancy, and end-to-end migration and server installation service capabilities.
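For reference, PUE (Power Usage Effectiveness) is simply total facility energy divided by the energy delivered to IT equipment, so a figure close to 1.0 means very little power is lost to cooling and overheads. The meter readings below are made up purely to show the arithmetic.

```python
# PUE = total facility energy / IT equipment energy. A value close to 1.0
# means almost all power reaches the IT load. Readings below are invented.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh


print(f"PUE = {pue(13_000, 10_000):.2f}")  # -> 1.30
```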
