Unlike other popular tools that can trace their roots back to the nineties or even earlier, Docker didn't exist until 2013.
Yet, in a very short period of time, it has managed to become something of a household name (assuming the household consists primarily of IT specialists).
But what is Docker exactly? How did it manage to become so popular so quickly? And how can you deploy it on your VPS? Let's find out.
What is Docker?
To understand what Docker does, we first need to discuss virtualization – the technology that has enabled the creation of your Virtual Private Server.
Every VPS account has its own virtual machine running on top of a physical server owned by your host. To create the virtual server, the provider uses hardware virtualization. This means that the machines are controlled via a hypervisor, and hardware emulation allows the provider to run a Windows VPS on top of a Linux machine and vice versa.
The virtual servers themselves behave much like standalone physical machines. You get root access to them, and you can install and run applications and use them for just about anything you want.
Docker is similar in that it's a collection of tools that let you set up multiple different isolated environments (called containers) on a single host. However, unlike your VPS, the Docker containers are created using OS-level virtualization.
They still get an allocated set of hardware resources, but there's no hypervisor to emulate the hardware, so there's less isolation from the host server. What's more, containers share the host's kernel, so you can't have a container with an operating system that requires a different kernel. Running different Linux distributions on a Linux host is fine, but running a Windows container on it is not.
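To illustrate how the resource allocation works in practice, here's a minimal sketch (the container name and limits are placeholders) that starts an nginx container capped at 512 MB of RAM and a single CPU core:
sudo docker run -d --name limited-nginx --memory 512m --cpus 1 nginx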
Docker was one of the first solutions that popularized OS-level virtualization, and we've seen many similar platforms appear after it. This goes to show that the technology has many different applications. Let's explore some of them.
When can I use Docker?
There are many instances when you'd want multiple different environments on the same server. Here are some of them.
You want to host multiple applications on the same host.
Different applications have different requirements, and this could sometimes cause problems. With Docker, you can put each app in its own container. That way, you can create the perfect environments for all your projects without them getting in each other's way.
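As a quick illustration (the names and ports are placeholders), you could run two different web servers side by side, each in its own container and mapped to its own port on the host:
sudo docker run -d --name site-a -p 8081:80 nginx
sudo docker run -d --name site-b -p 8082:80 httpd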
You want a cleaner host OS.
Even if two applications can co-exist on the same server, the different services they use can create quite a lot of clutter and even compromise performance. Putting them in separate containers leaves your server's operating system cleaner and better optimized.
You want to have a quick way of deploying the same applications on different hosts.
One of Docker's best features is the ability to create images. A Docker image is more or less a snapshot of the container and everything in it exactly the way you have configured it. Taking an image of one container and deploying it on another is a piece of cake, so if you ever need to migrate your applications, Docker can make the job a whole lot easier.
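For example, here's a rough sketch with placeholder names of what a migration could look like. First, you capture the container as an image and export it to a file:
sudo docker commit my-app my-app-image
sudo docker save -o my-app-image.tar my-app-image
After copying the file to the new host, you load the image and start a container from it:
sudo docker load -i my-app-image.tar
sudo docker run -d --name my-app my-app-image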
You need an isolated testing environment.
Developers the world over know that deploying new features and updates without testing is never a good idea. A Docker container is an excellent way of setting up an environment where you can make sure that all the new code works before you present it to the users.
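For instance (a purely illustrative example), you can spin up a disposable Ubuntu shell, experiment inside it, and have the container removed automatically the moment you exit:
sudo docker run --rm -it ubuntu:22.04 bash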
You need a backup solution.
Containers can also act as backups. Although Docker doesn't have a graphical user interface (GUI), once you learn a few commands, you'll see that creating a Docker image is a straightforward process. You can keep an image of a working container in safe storage, and if something in production breaks, you can redirect users to a standby container or simply start a new container from the saved image.
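Here's a hedged sketch of what that might look like, with placeholder names: snapshot the production container as an image, and if disaster strikes, bring up a fresh container from it:
sudo docker commit production-app production-app:backup
sudo docker run -d --name production-app-restored production-app:backup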
Why Docker?
As we've established already, Docker is far from the only solution that uses OS-level virtualization. People new to the technology may be looking at the available platforms and wondering which one is the best.
The truth is, the solutions you'll see on the market are very different. Just because one of them works for a particular project doesn't necessarily mean that it will be the most suitable option for all scenarios.
That being said, Docker does have some universal advantages that make it an appealing option in many cases. Let's have a closer look at them.
It's free and open-source.
Docker Inc., the company behind Docker, does offer a few paid plans starting at $5 per month. With them, you get some nifty extra features like vulnerability scans and audit logs. However, the core platform is open-source and completely free to use. You can deploy Docker containers on any server without paying a penny.
It's easy to set up.
Docker images can save you a lot of time installing applications and configuring the environment to fit your project's needs perfectly. For example, if you want to build a website with WordPress, you can have it installed with a single command, which means that you can start work immediately.
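In practice, the WordPress container itself really is a one-liner, though it also needs a database to talk to. A minimal sketch with the official wordpress and mysql images (the names, password, and port are placeholders you'd want to change) could look like this:
sudo docker network create wp-net
sudo docker run -d --name wp-db --network wp-net -e MYSQL_ROOT_PASSWORD=secret -e MYSQL_DATABASE=wordpress mysql:8
sudo docker run -d --name my-wordpress --network wp-net -p 8080:80 -e WORDPRESS_DB_HOST=wp-db -e WORDPRESS_DB_USER=root -e WORDPRESS_DB_PASSWORD=secret wordpress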
It works on all platforms.
Docker works on Windows, Linux, and macOS, so no matter what your host server runs, you can host your containers on it. This means that it's suitable for a vast range of projects.
Can I use Docker on my VPS?
VPS plans rely on hardware virtualization, meaning virtual servers have their own kernels and operating systems, just like standalone machines. As a result, nothing is stopping you from deploying Docker containers on your VPS.
Here, for example, are the steps you need to take to install Docker on an Ubuntu VPS:
1. Set up the Docker repository.
Docker isn't part of Ubuntu's default repository, so you'll need to add it manually. Doing this will simplify the process of updating the virtualization solution later on.
You first start by updating the software packages with:
sudo apt-get update
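The commands below rely on curl, gnupg, and lsb-release. They're usually present, but on a minimal Ubuntu image you may need to install them first:
sudo apt-get install ca-certificates curl gnupg lsb-release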
Next, you need to add Docker's official GPG key. The command is:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
Finally, setting up the stable repository is done with the following command:
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
2. Install the Docker Engine.
Once again, you need to update the software packages on your server with:
sudo apt-get update
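If you want to double-check that the package will come from Docker's repository rather than Ubuntu's default one, you can optionally run:
apt-cache policy docker-ce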
To install the latest version of the Docker engine, use:
sudo apt-get install docker-ce docker-ce-cli containerd.io
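By default, only root can talk to the Docker daemon, which is why the commands in this guide start with sudo. As an optional extra step, you can add your user to the docker group (and then log out and back in) to drop the sudo prefix:
sudo usermod -aG docker $USER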
3. Verify that Docker is installed correctly.
To ensure the process is successful, you can get Docker to download a test image and run it in a container. The command is:
sudo docker run hello-world
If Docker downloads the image and prints out a "Hello from Docker!" message, everything is good to go.
Conclusion
Docker has frequently been criticized for lacking an easy-to-use control panel or management tools, but the truth is, it was never conceived as a novice-friendly solution. Usually, the people interested in it are pretty handy with a keyboard anyway.
Although ease of use isn't among its priorities, Docker does manage to give developers the performance and flexibility they're after without weighing heavily on the budget. This plays a key role in boosting its popularity.