Containerisation

Containerisation has received quite the buzz over the last few years, and rightly so, as for many businesses, it’s become the new way of architecting applications. While it might be the talk of tech forums and industry blogs, that doesn’t mean that every company is utilising it yet, even if it may feel like it.

In this blog, we’re going to look at the fundamentals of containerisation, the tools that make it happen and what this means specifically for enterprise-scale businesses. After all, for businesses operating at real scale, switching to new methodologies and ways of working always takes time, so hopefully, by the end of this post, you’ll have a clearer idea of whether containerisation might be worth exploring further.

“All major cloud firms are favouring containerisation, so it’s clear that, in time, the business world will follow. With the backing of companies like Google, the number of tools available will only multiply while the functionality grows, providing increased flexibility for businesses.”

Guillaume Bettayeb, Cloud Architect at Cloudhelix

What is containerisation?

It’s a means of architecting applications whereby the application itself isn’t bound to the environment it runs within. The abstraction between environment and application makes it easier to deploy software, regardless of whether that’s to a private cloud platform like ours, a public cloud provider like AWS or to dedicated, on-premise infrastructure. So long as the operating system (OS) is the same, your application will run in whatever environment it’s deployed into.

A container is a set of configurations that allows lightweight partitioning of an OS into individual segments. Containers were historically Linux-based, but in the last year or so, progress has been made to bring native containers to Windows within tools such as Docker, something we’ll tell you more about later.

How does it differ from virtualisation? With containers, you can have various separate services or applications running on one host, all accessing the same OS. This is fundamentally different to virtualisation, where each virtual machine (VM) requires its own operating system.
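
To make the distinction concrete, here’s a minimal sketch using the Docker command line. It assumes Docker is installed on a Linux host and pulls two stock images from Docker Hub; the container names are purely illustrative.

# Start two containers from public images
docker run -d --name web nginx:alpine
docker run -d --name cache redis:alpine

# All three commands report the same kernel version, because the containers
# share the host's OS kernel rather than booting their own
uname -r
docker exec web uname -r
docker exec cache uname -r

Two VMs running the same services, by contrast, would each boot and maintain a full guest operating system on top of the hypervisor.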

Why is this significant?

Well, containers share resources, such as the OS, which means they use significantly fewer resources to run. You can run containers inside a VM, in which case each container will utilise the same OS as the VM. VMs are great places to run containers because the VM makes very efficient use of its hypervisor, while the containers make very efficient use of the resources within the VM. This makes it much easier to scale applications, as resources are used very effectively. However, the most efficient way to run containers is still directly on top of a single OS, as containers are more efficient at utilising resources than a hypervisor and individual operating systems.

Before we jump down a VM-shaped rabbit hole, that’s just one means of deploying containers. The whole point is that the application is abstracted from the environment it runs in, therefore by focusing too heavily on VMs, we’re kind of missing the point. It is, however, good news for enterprises who have embraced virtualisation, as containers can be introduced into the ecosystem; it doesn’t have to be a case of starting from scratch in order to take advantage of containers. In fact, depending on what you’re trying to achieve, VMs could be the best place to host containers.

Why containers?

Efficiency

Focusing on containers as individual components, they are really efficient. Because they don’t need their own OS, containers let you squeeze the most out of a particular host, making it easier to scale.

Stop configuring for multiple environments

It’s also very nice to know that utilising containers means you only build the application once. There’s no need to build the app and then go through the added task of configuring it for multiple platforms or types of hardware.

Move away from the monolith

Building your application out into many of these neat, little standalone containers takes away some of the complexity that comes with working with big, legacy applications. If you have a problem with one container or part of an application, you can focus in on that without taking the whole thing offline.

Let your devs do more dev-ing!

With a super clear separation between infrastructure and application, you can keep your staff’s skills focused where they’re meant to be. This is where DevOps comes into play: containerisation is often the route through which teams adopt DevOps practices, creating a more cohesive development function and speeding up the production lifecycle.

Tools of the trade

When it comes to containerisation, there are two main tools out there, and it’s important to understand both and how they differ before you do anything.

First off, Docker. The principal functionality of Docker is to let you build and package containers. A Dockerfile is fed into the Docker command line on your infrastructure (which could be anywhere, remember) and is built into an image, a snapshot of your application that runs once you start it up as a container. Docker has a range of tools that allow you to take a full-blown microservices approach, but you don’t have to use Docker on its own.
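
As a rough illustration, here’s what a minimal Dockerfile might look like for a hypothetical Node.js application; the base image, file names and port below are assumptions made for the example, not anything from a real project.

# Start from a public base image that already contains the runtime
FROM node:20-alpine

# Work inside /app within the image
WORKDIR /app

# Copy the dependency manifest first so the install layer can be cached
COPY package*.json ./
RUN npm install

# Copy the rest of the application code into the image
COPY . .

# The command run when a container is started from this image
CMD ["node", "server.js"]

Running docker build -t my-app . turns this file into an image, and docker run -d -p 3000:3000 my-app starts a container from that image (the tag and port mapping are, again, illustrative).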

Next up, Kubernetes, which lets you deploy, scale and manage containers, known collectively as orchestration. Once you’re spinning up containers in your environment, you will want to begin hosting different containers on different machines or in different locations, and have them run collectively. Kubernetes handles the heavy lifting involved in starting containers when they need starting, ensures containers can speak to one another and deals with failed containers. In short, it makes your containers work together.
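
To give a flavour of what that orchestration looks like, below is a minimal sketch of a Kubernetes Deployment manifest. It assumes an image called my-app:1.0 has already been built and pushed somewhere the cluster can pull it from; every name and label here is illustrative.

# deployment.yaml (illustrative)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3                 # ask Kubernetes to keep three copies running
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: my-app:1.0   # the container image to run
          ports:
            - containerPort: 3000

Applying this with kubectl apply -f deployment.yaml asks the cluster to keep three copies of the container running; if one fails, Kubernetes replaces it, and scaling up is simply a case of increasing the replica count.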

Both of these tools can be used together to create a solid, modern application architecture, but it’s important to understand the principal differences between the two platforms.

“Google runs its entire business on containers. They created Kubernetes, which we use here at Cloudhelix to support clients such as DataJAR, who manage an Apple device management service. With the backing of companies like Google, the number of tools available for container orchestration will only multiply while the functionality grows, providing increased flexibility for businesses.”

Guillaume Bettayeb, Cloud Architect at Cloudhelix

What does the future hold?

Containerisation for enterprises

The world of containerisation is still evolving, with new advances and features being added to Kubernetes and Docker all the time. Enterprises, though, are likely to be running their applications either on physical, dedicated infrastructure or on a cloud platform utilising virtual machines, and it’s possible to begin integrating containers alongside those existing ways of working.

“From an operational perspective, businesses should start thinking about containerisation because containers use the host’s logical resources, making them much cheaper to run than full, isolated VMs. For the same reason, this makes containers a lot faster and, by sharing resources, you can easily run hundreds of containers seamlessly.”

Guillaume Bettayeb, Cloud Architect at Cloudhelix

For example, as we mentioned earlier, you can run containers in virtualised environments, which means you can run containers inside a VM and have them sit alongside existing VMs.

If you have plans to migrate a VM-based application to a multi-cloud environment, you could work through your infrastructure, decoupling and containerising the application. Once everything is containerised, you can be confident it will run in your proposed multi-cloud environment, and the migration itself should be straightforward.

The same goes for bare metal hardware. If you require really low latency and, despite wanting to move your application forward, don’t want to virtualise, you could consider running containers on physical kit.

It’s certainly not a black-and-white, yes/no type affair; it really depends on what you’re hosting. The good news is that, thanks to the portable nature of containers, you should be able to host parts or all of your application wherever it’s technically best suited.

So containers definitely feel like the way the world is moving, and their agility should mean that, once you’ve taken on the approach, you become less locked in to a particular provider or method of hosting. Businesses have become used to poring over hosting decisions because they’re usually stuck with them for a one- or three-year contract, and it takes a whole heap of effort to migrate. Once you’ve gone through the process of containerisation, however, it actually opens up your hosting options rather than locking you in.
