Docker Container: What It Is and Why It's Important

Source: Tom's IT Pro

Container technology like Docker doesn't just take a running process and isolate it; it packages the process with all of the files and libraries it needs.

Containers in general, and Docker in particular, are a "disruptive" technology that seems to be everywhere. Developers have been the catalyst, making Docker a mainstay in the industry. But outside the developer community, many still aren't sure what all the fuss is about, whether containers are something they should be learning and using, or whether the technology applies to them at all.

Where Did It Come From?

Docker and containers are becoming synonymous: if an organization is considering containers, it's likely talking specifically about Docker. But Docker is a company building technology around a container concept that has existed for decades, beginning with the chroot command found in the Unix operating system.

If you've previously used the chroot command on Linux or Unix to isolate a process from the rest of the operating system, then containers are going to sound very familiar. Both chroot and containers isolate a process, limiting its exposure to the rest of the operating system and likewise keeping the operating system safe from the process. Containers, however, take the concept of process isolation to a new level.
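
If you've never used chroot, a minimal jail on Linux looks something like the sketch below. The directory and copied binary are illustrative, and any shared libraries the shell depends on (as reported by ldd) would need to be copied in as well:

    # Build a bare-bones root filesystem for the jail
    mkdir -p /srv/jail/bin /srv/jail/lib /srv/jail/lib64
    cp /bin/sh /srv/jail/bin/
    # ...copy each shared library that `ldd /bin/sh` reports into the jail...

    # Start a shell whose entire view of the filesystem is /srv/jail
    sudo chroot /srv/jail /bin/sh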

MORE: How to Deploy a Docker Container in Windows Server 2016

How Does It Work?

Container technology like Docker doesn't just take one running process and isolate it. Instead, it gathers everything the process needs, including all of the supporting files, libraries and binaries required to run, and packages them into a container image.
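
As a rough sketch (the application name, base image and paths here are all illustrative, not anything prescribed by Docker), a Dockerfile describing such an image might look like this:

    # Dockerfile: a recipe for everything the application needs
    # The base userland the application will run on
    FROM debian:stable
    # The application binary and its configuration (illustrative names)
    COPY myapp /usr/local/bin/myapp
    COPY myapp.conf /etc/myapp/myapp.conf
    # The port the application listens on
    EXPOSE 8080
    # The process to start when the container runs
    CMD ["/usr/local/bin/myapp"]

Building the image from that recipe is a single command:

    # Build the image from the Dockerfile in the current directory
    docker build -t myapp:1.0 .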

This container image is really where the magic happens in a Docker environment. The image becomes a portable unit, ready to be moved, shipped and deployed to one or more other computers running Docker.
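
Starting a container from an image on any machine that runs Docker is a single command (the image name and port continue the hypothetical example above):

    # Run the image in the background, mapping host port 8080 to the app's port
    docker run -d --name myapp -p 8080:8080 myapp:1.0

    # Confirm the container is running
    docker ps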

How Can It Help Me?

Docker images, with their ability to be deployed, copied, modified and started on multiple computers, lead many to describe containers as a way to virtualize applications.

There are some similarities between Docker containers and virtual machines. Both rely on an abstraction layer that sits between the workload and the underlying resources: a hypervisor abstracts the hardware for a virtual machine, while the container engine abstracts the host operating system for a containerized application. In both cases, that layer lets multiple isolated workloads share the same underlying resources.

The impact of using containers in an environment can be dramatic. Because a container leverages the operating system it runs on, the OS doesn't need to be included in the container image, so an individual container image is often not much larger than the application itself would be if it were installed directly on the host OS.
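
You can see the footprint for yourself by listing the images on a host, which reports each image's size:

    # List local images; the SIZE column shows each image's footprint
    docker images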

Compared to testing applications in virtual machines, that small footprint can save a great deal of storage: there's no need to provision multiple virtual machines, each with its own OS, just to house the application in every one of its deployment environments.

A single application can also be provisioned multiple times on the same host. This is great for simulating the stages of moving an application and its data from a development environment through quality control or product testing and finally to production. Since the instances are each unaware of the others, they can all run on the same host just as though each were in its own virtual machine.
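
Continuing the hypothetical image from earlier, three instances standing in for development, testing and production can run side by side on one host, each under its own name and host port:

    # Each instance gets its own name and host port; all share one host OS
    docker run -d --name myapp-dev  -p 8081:8080 myapp:1.0
    docker run -d --name myapp-test -p 8082:8080 myapp:1.0
    docker run -d --name myapp-prod -p 8083:8080 myapp:1.0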

The other huge benefit of working with Docker containers is that the application's configuration is stored inside the image. If you're working on an application and the development instance is ready for more people to use, you can move it to different hardware because the image is self-contained. All that's needed to share the work is to ship the container image, and it's ready to start almost immediately.

"It's clear that Docker and containers are not just a niche, and it's not a fad."

Installing an application manually can be a lengthy and error-prone process: prerequisites must be verified, the application extracted, configuration items set and the data created or imported. Scripting the installation ensures that no steps are forgotten or missed and removes the element of human error.

But even a scripted solution is slow compared to shipping a container, because inside the container the application is already installed and the data already loaded. All of the configuration items are set just as they were when the container image was created.

To move a Docker image from one machine to another, or from one machine to many, all you have to do is send it. That can be done through a file share, but a better solution exists: a central place where people can store, search for and download their container images.
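
Without a central repository, shipping an image by hand is an export-and-import exercise (the host name below is a placeholder):

    # On the source machine: write the image to a tar archive
    docker save -o myapp.tar myapp:1.0

    # Copy the archive however you like, for example with scp
    scp myapp.tar user@otherhost:/tmp/

    # On the destination machine: load the image and start it
    docker load -i /tmp/myapp.tar
    docker run -d -p 8080:8080 myapp:1.0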

Docker repositories fill that role, and they're a perfect place for storing your container images and keeping them organized. Similar to GitHub, repositories can be public or private, and you can have multiple repositories so you can store images by client, company or project.
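
Publishing an image to a repository is a matter of tagging it with the repository's name and pushing it; the account name below is a placeholder:

    # Tag the local image with the repository name
    docker tag myapp:1.0 mycompany/myapp:1.0

    # Authenticate and push (Docker Hub is the default registry)
    docker login
    docker push mycompany/myapp:1.0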

With the phenomenal adoption of Docker and the rapid growth of its repositories, you can find many popular applications already packaged as container images, ready to be downloaded and run.
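
Pulling and running one of those ready-made images, for example the official nginx web server, takes only a couple of commands:

    # Search the public repositories for an image
    docker search nginx

    # Download it and start a container from it
    docker pull nginx
    docker run -d -p 80:80 nginx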

What is Docker Engine?

Docker Engine is the application that must be installed on any host OS where you want to run or create Docker images. Docker was created for Linux, and in its initial release it was available only there; running it on Windows or Mac required a virtual machine running Linux and the Docker Engine. Later releases added native Windows and Mac clients, so Docker can run directly on a developer machine without a VM.
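
Once the engine is installed, one command confirms that both the client and the server (the daemon) are up:

    # Prints client and server versions; a server reply means the engine is running
    docker version

    # System-wide details about the Docker installation
    docker info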

Bottom Line

The success of Docker and container technology has created a cottage industry, with a wide range of products and services that improve the usability, features or security of containers. Public cloud providers like Amazon, Google and Microsoft all offer services for Docker containers, and their marketplaces provide a rich ecosystem for Docker and containers.

Microsoft is also looking to capitalize on the success of Docker with a native container feature in Windows Server 2016. With the increased use of containers over the last two years, it's clear that Docker and containers are neither a niche nor a fad. The technology will continue to develop and is likely to become as ubiquitous as machine virtualization. What role will it play in your organization?
