In this article I want to show you how to run your multi-container solution on Docker. Docker Compose is a tool for defining and running multiple Docker containers with a single command. With Compose, you use a docker-compose file to configure your application's services. Then, using a single command, you create and start all the services from your configuration.

In a previous article I have shown you how to set up a Docker development environment on Windows or Mac. In case you don't have a working Docker environment yet, you might want to start with that article first.

Why the wutt… I need more containers to run my solution

You might be wondering why you would ever want to run your application across multiple containers. The short answer is: because it is a best practice to run only one process per container.

Do I really need this for my development environment?

You want your development environment to be as close as possible to production. Furthermore, you want to be able to scale your platform independently. So if you need a bigger MySQL cluster, you want to be able to spin up more MySQL containers without having your application code inside those same containers. Having smaller containers might look like overkill, but in the long term it will enable you to redeploy only the containers that actually changed.

With Docker containers we can achieve immutable infrastructure, meaning that if we don't rebuild a container, it won't change either. Having smaller containers will therefore also help you keep as many parts of your infrastructure as possible unchanged. As we are testing against immutable infrastructure, this also means less risk when we deploy the same Docker images to production.

In case you don't know yet how to build or run a single container, you might want to check out my previous article on running your Angular app in an Nginx container. That article will give you some insight into how to build a container and how to run a single container. So before going nuts in this article, make sure you have a basic understanding of how to run a single container.

Same as in the previous article, I will be using a package manager to install the required tooling: Chocolatey for Windows or Homebrew for macOS. You can use `choco install docker-compose` from your PowerShell or `brew install docker-compose` from your Bash, respectively.

docker-compose.yml

```yaml
version: "2"
services:
  web:
    build: web/
  api:
    build: api/
    ports:
      - "3000:3000"
    volumes:
      - .:/code
      - logvolume01:/var/log
    depends_on:
      - redis
  redis:
    image: redis
volumes:
  logvolume01: {}
```

It will build our web application from our previous blogpost. So the Angular2 sample app is located in a subfolder web, and in the root of that directory our Dockerfile is located. In case you want to have more detail on the Dockerfile for the web image, have a look at my previous article.

Furthermore we will also build an api image, using the Dockerfile in the api directory (hence the build: api/ entry). Once the container is built, it will map port 3000 on the host to the container's exposed port 3000 (this is defined in the Dockerfile). Then it will map the current directory to the /code directory in the container, and the logvolume01 volume to the container's log folder. Volumes are used to map folders into your containers so that the data is preserved when the container is destroyed. Furthermore, there is a dependency specified on a cache container. For the Redis container we will just be using the default image available from the Docker registry, instead of building our own. Running `docker-compose up` from this directory will then create and start all these services with a single command.

To clarify things a little further, I will also show you a Dockerfile.

Dockerfile

```dockerfile
FROM node:6.3.1-onbuild
MAINTAINER
EXPOSE 3000
```

As you can see, we are using a Nodejs application that exposes itself on port 3000. Now it is just a matter of adding a Nodejs webserver in this very same folder. Make sure the npm start command is defined in your package.json. This npm script will be called when the container is launched (unless specified differently in your Dockerfile). See the CMD in the Dockerfile we are deriving from.

How to apply in your own projects

The following approach could be a way to orchestrate the platform for your project. I suggest you create one Git project that contains your docker-compose.yml file and uses Git submodules for all your platform components. So imagine your platform requires a web frontend and some backend services: we can easily develop the web frontend and the backend services in their own Git repositories. The main Git project can then be used to version and orchestrate your platform as a whole, for example to coordinate the deployment of the platform. Let's have a look at a potential folder structure.

my-platform

As you can see, we also have a docker-compose.yml file in our backend services, as we want to be able to run and test those services individually.
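To make the suggested platform layout concrete, here is a hedged sketch of what such a main Git project could look like on disk. All names except my-platform are illustrative, and in practice the web and api folders would be Git submodules pointing at the components' own repositories rather than plain directories:

```shell
# Illustrative "my-platform" layout; web/ and api/ stand in for your own
# component repositories, which you would pull in as Git submodules, e.g.:
#   git submodule add <repo-url> web
mkdir -p my-platform/web my-platform/api

touch my-platform/docker-compose.yml      # orchestrates the platform as a whole
touch my-platform/web/Dockerfile          # Angular frontend, its own Git repo
touch my-platform/api/Dockerfile          # Nodejs backend, its own Git repo
touch my-platform/api/docker-compose.yml  # run and test this service on its own

find my-platform | sort
```

The extra docker-compose.yml inside the backend service is what lets you run and test that service individually, while the one in the root drives the platform as a whole.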
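Coming back to the npm start requirement mentioned above: the node onbuild base images run `npm start` as their CMD, so package.json must define a start script. A minimal sketch follows; the file contents and the server.js entry point are illustrative assumptions, not taken from this article:

```shell
# Minimal package.json sketch for the Nodejs api image.
# "server.js" is a placeholder entry point, not from the article.
cat > package.json <<'EOF'
{
  "name": "api",
  "version": "1.0.0",
  "scripts": {
    "start": "node server.js"
  }
}
EOF

# The "start" script is what `npm start` (the base image's CMD) will invoke
# when the container launches.
grep '"start"' package.json
```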