Question

Curious how containers are portable across development/testing/cloud environments with no worry needed about the underlying infrastructure. Does the Docker Engine essentially standardize operating systems to the point where the engine is able to launch the same container from the different operating systems? Or does a cloud deployment environment for the container(s) need to be running Linux if the development environment was also running Linux? Thanks!


OTHER TIPS

No, the Docker Engine alone doesn't do that. The underlying system still has to run the appropriate OS kernel to be able to run a container built for that family of operating systems.

There are, however, tools built on top of the Docker Engine, like Docker for Mac, Docker for Windows, and Docker Machine, which automatically create and run a virtual machine, so you can, for example, run Linux containers on a Mac or Windows machine almost seamlessly (with a couple of caveats around mounted folders). This is possible because the Docker command-line client is built as a remote client that talks to the Docker daemon, so it doesn't really matter whether the daemon and the client are actually running on the same physical machine.
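As an illustration of that client/daemon split, the same `docker` CLI can be pointed at a remote daemon just by changing `DOCKER_HOST` (the hostname below is a placeholder):

```shell
# The Docker CLI is a thin client: it talks to whichever daemon
# DOCKER_HOST points at. By default that's the local daemon.
docker info

# Point the same client at a remote Linux daemon over SSH
# (hostname is illustrative):
export DOCKER_HOST=ssh://user@linux-host.example.com

# This now runs on the remote machine's Linux kernel,
# even if the client itself is on a Mac or Windows box.
docker run --rm alpine uname -s
```

This is exactly what Docker for Mac/Windows does under the hood: the client runs natively, while the daemon runs inside a Linux VM.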

There are also orchestration tools built on top of the Docker Engine, like Docker Swarm, that can, to some degree, run a mixed cluster of Docker Engines on top of different operating systems, creating a hybrid cluster that can run a mix of Linux and Windows containers.
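A minimal sketch of how such a mixed cluster is targeted, assuming a Swarm that already has both Linux and Windows nodes joined (service names and images are illustrative; `node.platform.os` is a built-in Swarm placement attribute):

```shell
# Pin a Linux-only service to nodes running a Linux kernel:
docker service create --name web \
  --constraint 'node.platform.os == linux' \
  nginx:alpine

# Pin a Windows-only service to Windows Server nodes:
docker service create --name legacy-app \
  --constraint 'node.platform.os == windows' \
  mcr.microsoft.com/windows/servercore/iis
```

Each container still runs only on a node whose kernel matches its OS family; the orchestrator just routes the work accordingly.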

Licensed under: CC-BY-SA with attribution