In my previous blog, I talked about dockerizing .NET applications by building a sample .NET application containing multiple components that interact with each other. This blog is the second in a three-part series on containerizing .NET applications using Docker. In this part, we will see how to add Docker support to our application. Let's start the process.
Run The Application On Docker
Let us first add Docker support to our Weather Forecast API project. Right-click the project -> Add -> Docker Support.
And that is it. Visual Studio has already created a new Dockerfile for our project.
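Here is a sketch of what the generated Dockerfile typically looks like for a .NET 5 Web API; the exact project and folder names (WeatherForecastApi here) depend on your solution layout:

```dockerfile
# See https://aka.ms/containerfastmode to understand how Visual Studio
# uses this Dockerfile to build your images for faster debugging.

# Runtime-only base image used for the final stage
FROM mcr.microsoft.com/dotnet/aspnet:5.0 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

# SDK image used only to build and publish the project
FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY ["WeatherForecastApi/WeatherForecastApi.csproj", "WeatherForecastApi/"]
RUN dotnet restore "WeatherForecastApi/WeatherForecastApi.csproj"
COPY . .
WORKDIR "/src/WeatherForecastApi"
RUN dotnet build "WeatherForecastApi.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "WeatherForecastApi.csproj" -c Release -o /app/publish

# Final image: runtime base plus the published output only
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "WeatherForecastApi.dll"]
```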
At the top of the Dockerfile, there is a URL that explains how Visual Studio uses this file to build images. Let us first understand the build process. The first section pulls an image named aspnet, version 5.0, from the Microsoft Container Registry (mcr) and names that stage base. It sets /app as the working directory, into which the published output will later be copied. If we check the details of this image on Docker Hub,
it says ASP.NET Core Runtime, which basically means it provides the binaries to run a .NET Core application. But first we need to build our application, and for that we need another base image; that is what the second section does. It uses an image named sdk and calls that stage build. Let us check out the details of this image on Docker Hub.
This image contains all the libraries necessary to build our project. However, it is required only for the build phase and will not be part of the application's final image. Another directory called src is created, into which the project's .csproj file and then the rest of its content are copied. We could have used the sdk image as our runtime image too, but that would increase the final size of our image considerably: since the sdk image contains not only the runtime libraries but also everything required to build .NET projects, it is quite large.
Let us take a look at the difference in size between these images. I created two console applications named mcrsdk and mcrruntime and added Docker support to both of them. For mcrruntime, I used the default Dockerfile template, which uses separate mcr images for build and runtime. For the mcrsdk project, I used the sdk image for both build and runtime. Here is what these two images look like.
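For reference, the mcrsdk variant collapses everything into a single stage; a minimal sketch, assuming the .csproj sits at the root of the build context:

```dockerfile
# Single-stage build: the sdk image is used both to build and to run,
# so the full SDK ships in the final image.
FROM mcr.microsoft.com/dotnet/sdk:5.0
WORKDIR /src
COPY . .
RUN dotnet publish "mcrsdk.csproj" -c Release -o /app/publish
WORKDIR /app/publish
ENTRYPOINT ["dotnet", "mcrsdk.dll"]
```

Running docker images afterwards makes the size difference between the two tags immediately visible.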
As we can see, the mcrsdk image is roughly three times the size of the runtime-based one. Using the sdk image for our final output makes no sense, as the SDK is needed only for building; keeping it just inflates the image size of our project. And that is the benefit of a multi-stage Dockerfile: we can use different Docker images for different steps in the pipeline, while only one of them becomes the final image. The intermediate stages are used during the build but do not end up in the final image.
The rest of that section restores our project's dependencies and runs the build command. Once the build output is generated, let us move on to the third section.
Here, the build output is published to the /app/publish directory using the dotnet publish command. The /app/publish folder contains only what is needed to run the project, i.e., WeatherForecastApi.dll and all of the project's dependencies.
The final section then starts again from the original runtime image, copies the published output into this new stage called final, and sets the entry point of the application.
When Visual Studio added Docker support to our project, apart from adding a Dockerfile, it also modified the launchSettings.json file to add the following section.
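The added profile looks roughly like this; the exact properties vary by project type, so treat this as a sketch of the typical shape rather than the verbatim file:

```json
"Docker": {
  "commandName": "Docker",
  "launchBrowser": true,
  "launchUrl": "{Scheme}://{ServiceHost}:{ServicePort}/swagger",
  "publishAllPorts": true,
  "useSSL": true
}
```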
We are now ready to run our Web API with Docker. We can see there is a new option for Docker along with other options to run and debug our application.
Let us choose Docker and wait while Visual Studio pulls the runtime and SDK images, builds the image, and runs the Docker container for our application. Visual Studio also displays the logging information for each container in the Output window. Once it completes, it launches the Swagger UI, where we see the Get API. But when we call the API, we see a transient failure while it tries to connect to the database. The same error can be seen in the Logs tab of the Containers window, which also displays other information about our container, such as Environment, Ports, Files, etc.
The reason is that our application is now running as a container with its own ports and address. It cannot reach the database listening on port 5432 on localhost, because inside the container, localhost refers to the container itself rather than to our machine. If we want the container to connect to port 5432 on the host, we have to modify the connection string and change the Host from localhost to host.docker.internal. This special DNS name resolves, from inside the container, to the internal IP address used by the host, allowing the container to reach services running on the host.
```json
"WeatherDatabase": "Host=host.docker.internal;Port=5432;..."
```
If we run the application now with Docker, we can see that it correctly shows the weather forecast data from our local PostgreSQL server.
We need to add Docker support to the other two applications as well. So, let us do that. The process is similar to what we did for our API project.
Adding Docker Support
Let us first add Docker support to the WeatherForecastAdmin project. The Dockerfile looks mostly the same as the API one, the only difference being the names of the files and folders, as the excerpt below shows.
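Under the same template assumptions as before, only the project-specific lines change, e.g.:

```dockerfile
# Only the project-specific lines differ from the API Dockerfile:
COPY ["WeatherForecastAdmin/WeatherForecastAdmin.csproj", "WeatherForecastAdmin/"]
RUN dotnet restore "WeatherForecastAdmin/WeatherForecastAdmin.csproj"
# ...build and publish stages as before...
ENTRYPOINT ["dotnet", "WeatherForecastAdmin.dll"]
```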
This application talks to the RabbitMQ server on localhost, which again will not be reachable if we connect using localhost from inside the container. So let us modify the appsettings.json to use host.docker.internal.
```json
"RabbitMqHost": "host.docker.internal"
```
If we run the application now, the same Admin UI opens, and if we fill in the weather forecast data and submit the form, it correctly sends the data to our local RabbitMQ server.
The last one is WeatherForecastProcessor. Its Dockerfile follows the same multi-stage pattern; a sketch is below. Since this application is a background worker with no web endpoints, I am assuming the runtime base image rather than aspnet.
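```dockerfile
# Runtime-only base image; a worker service needs no ASP.NET stack
FROM mcr.microsoft.com/dotnet/runtime:5.0 AS base
WORKDIR /app

FROM mcr.microsoft.com/dotnet/sdk:5.0 AS build
WORKDIR /src
COPY ["WeatherForecastProcessor/WeatherForecastProcessor.csproj", "WeatherForecastProcessor/"]
RUN dotnet restore "WeatherForecastProcessor/WeatherForecastProcessor.csproj"
COPY . .
WORKDIR "/src/WeatherForecastProcessor"
RUN dotnet build "WeatherForecastProcessor.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "WeatherForecastProcessor.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "WeatherForecastProcessor.dll"]
```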
No surprises here. Make the same changes in appsettings.json for the connection string and the RabbitMQ host:
```json
"ConnectionStrings": {
  "WeatherDatabase": "Host=host.docker.internal;Port=5432;..."
},
"RabbitMqHost": "host.docker.internal"
```
Now, if we run the three applications together, we see the same data displayed by the Get API, since it is fetched from the same local database as before; only now the application is running as a container. In the same way, if we submit the weather forecast data from the admin app, it is submitted to the local RabbitMQ instance, and the worker process receives the message from that instance and writes the data to the local database. The difference is that the three applications are now running as containers.
This is how we run .NET applications on Docker. As we saw, we didn't have to make any major changes to our applications to add Docker support: we effectively added a Dockerfile to each project and made some configuration changes for them to work on Docker. The development and debugging experience with Visual Studio remains the same. We can see all the running containers in the Containers window, which lists data for each container, such as logs, environment variables, port mappings, etc.
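The same information is also available from the Docker CLI; the container name below is hypothetical, yours will appear in the docker ps listing:

```sh
# List running containers together with their port mappings
docker ps
# Stream the logs of a specific container
docker logs -f weatherforecastapi
# Inspect environment variables and other configuration
docker inspect weatherforecastapi
```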
Conclusion
In a typical production scenario, neither the database nor the message queue resides on the machine where the application is running; they run on remote servers to which the application connects.
In our containerized environment, it would be better if the database and the message queue each ran in their own container. We have already created three containers for our applications, and we need two more: one for RabbitMQ and one for the database. There are dependencies in how these containers start and connect to each other. For example, since the API and Processor applications connect to the database, the database must already be running, otherwise those applications will throw an error. So, we need to make sure the database container is up before any of the application containers start, as the sketch below illustrates.
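With plain Docker, that ordering has to be managed by hand, roughly like this; the container names, image tags, and password are placeholders, not values from the project:

```sh
# Start PostgreSQL first so the API and Processor containers can connect
docker run -d --name weather-db -e POSTGRES_PASSWORD=<password> -p 5432:5432 postgres:13
# Then RabbitMQ, needed by the Admin and Processor applications
docker run -d --name weather-mq -p 5672:5672 -p 15672:15672 rabbitmq:3-management
# Only then start the three application containers
```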
In a microservices architecture, this is a typical scenario: we need to manage multiple services and their dependencies. And that is where container orchestration comes in. In the next and final part of this series, we are going to cover the container orchestration tool that Visual Studio supports, i.e., Docker Compose. Till then, happy coding!