
Experiment No. 6 Title: Exploring Containerization and Application Deployment with Docker



Objective:

The objective of this experiment is to provide hands-on experience with Docker containerization and application deployment by deploying an Apache web server in a Docker container. By the end of this experiment, you will understand the basics of Docker, how to create Docker containers, and how to deploy a simple web server application.


Introduction:

Containerization is a technology that has revolutionised the way applications are developed, deployed, and managed in the modern IT landscape. It provides a standardised and efficient way to package, distribute, and run software applications and their dependencies in isolated environments called containers.


Containerization technology has gained immense popularity, with Docker being one of the most well-known containerization platforms. This introduction explores the fundamental concepts of containerization, its benefits, and how it differs from traditional approaches to application deployment.


Key Concepts of Containerization:


  • Containers: Containers are lightweight, stand-alone executable packages that include everything needed to run a piece of software, including the code, runtime, system tools, libraries, and settings. Containers ensure that an application runs consistently and reliably across different environments, from a developer's laptop to a production server.

  • Images: Container images are the templates for creating containers. They are read-only and contain all the necessary files and configurations to run an application. Images are typically built from a set of instructions defined in a Dockerfile.

  • Docker: Docker is a popular containerization platform that simplifies the creation, distribution, and management of containers. It provides tools and services for building, running, and orchestrating containers at scale.

  • Isolation: Containers provide process and filesystem isolation, ensuring that applications and their dependencies do not interfere with each other. This isolation enhances security and allows multiple containers to run on the same host without conflicts.


Benefits of Containerization:

  • Consistency: Containers ensure that applications run consistently across different environments, reducing the "it works on my machine" problem.

  • Portability: Containers are portable and can be easily moved between different host machines and cloud providers.

  • Resource Efficiency: Containers share the host operating system's kernel, which makes them lightweight and efficient in terms of resource utilization.

  • Scalability: Containers can be quickly scaled up or down to meet changing application demands, making them ideal for microservices architectures.

  • Version Control: Container images are versioned, enabling easy rollback to previous application states if issues arise.

  • DevOps and CI/CD: Containerization is a fundamental technology in DevOps and CI/CD pipelines, allowing for automated testing, integration, and deployment.


Containerization vs. Virtualization:

Containerization differs from traditional virtualization, in which a hypervisor virtualizes an entire operating system to run each workload in a separate virtual machine (VM). In contrast:

  • Containers share the host OS kernel, making them more lightweight and efficient.

  • Containers start faster and use fewer resources than VMs.

  • VMs encapsulate an entire OS, while containers package only the application and its dependencies.
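The kernel-sharing point is easy to see in practice. As a minimal sketch (the guard makes it a no-op on machines without Docker), the kernel version reported inside an Alpine container matches the host's, because the container has no kernel of its own:

```shell
#!/bin/sh
# Containers have no kernel of their own: the version printed inside an
# Alpine container matches the host's. (Sketch; skips the container demo
# if Docker is not installed.)
show_kernels() {
    uname -r                              # host kernel version
    if command -v docker >/dev/null 2>&1; then
        docker run --rm alpine uname -r   # same version, different userland
    else
        echo "docker not installed - skipping container demo"
    fi
}
show_kernels
```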



Materials:

  • A computer with Docker installed (https://docs.docker.com/get-docker/)

  • A code editor

  • Basic knowledge of Apache web server


Experiment Steps:

Step 1: Install Docker

  • If you haven't already, install Docker on your computer by following the instructions provided on the Docker website (https://docs.docker.com/get-docker/).

     

To install Docker on Ubuntu, you can follow these steps:

  • Update Package List: It's a good practice to ensure that your package list is up to date. Open a terminal and run the following command:

  • sudo apt update
  • Install Dependencies: Docker requires some dependencies that can be installed with the following command:

  • sudo apt install -y apt-transport-https ca-certificates curl software-properties-common
  • Add Docker GPG Key: Add Docker's official GPG key to ensure that the software you download is from a trusted source. Run this command:

  • curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
  • Add Docker Repository: Add the Docker repository to your system using the following command. You can choose the appropriate release (e.g., "focal" for Ubuntu 20.04) based on your Ubuntu version:

  • echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
  • Update Package List (Again): Run an update to make sure the newly added repository is recognized:

  • sudo apt update
  • Install Docker: Finally, install Docker using the following command:

  • sudo apt install docker-ce
  • Start and Enable Docker Service: After installation, start the Docker service and enable it to start on boot:

  • sudo systemctl start docker
  • sudo systemctl enable docker
  • Verify Docker Installation: To confirm that Docker is installed and running, run the following command:

  • sudo docker --version

You can also check if Docker is running with:

  • sudo systemctl status docker
  • Manage Docker Without Sudo (Optional): By default, you need to use sudo to run Docker commands. If you want to manage Docker without sudo, add your user to the "docker" group with the following command:
    • sudo usermod -aG docker $USER

      After running this command, you will need to log out and log back in or reboot your system for the changes to take effect.

    That's it! You should now have Docker installed and running on your Ubuntu system. You can start using Docker to create and manage containers.
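A quick end-to-end check of the installation (a sketch; the guard prints a notice instead of failing on machines without Docker) is to run the official hello-world image, which pulls a tiny image and prints a confirmation message:

```shell
#!/bin/sh
# End-to-end installation check: run the official hello-world image.
# (Sketch; degrades gracefully if Docker is not installed.)
check_docker() {
    if command -v docker >/dev/null 2>&1; then
        docker run --rm hello-world
    else
        echo "docker not installed"
    fi
}
check_docker
```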

 

Step 2: Create a Simple HTML Page

  • Create a directory for your web server project.

  • Inside this directory, create a file named index.html with a simple "Hello, Docker!" message. This will be the content served by your Apache web server.
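The two steps above can be sketched from the shell like this ("my-apache-site" is just an example directory name, not required by anything that follows):

```shell
#!/bin/sh
# Create the project directory and the page Apache will serve.
# ("my-apache-site" is an example name.)
mkdir -p my-apache-site
cat > my-apache-site/index.html <<'EOF'
<!DOCTYPE html>
<html>
  <head><title>Hello, Docker!</title></head>
  <body><h1>Hello, Docker!</h1></body>
</html>
EOF
```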


Step 3: Create a Dockerfile

  • Create a Dockerfile in the same directory as your web server project. The Dockerfile defines how your Apache web server application will be packaged into a Docker container. Here's an example:


Dockerfile


# Use an official Apache image as the base image

FROM httpd:2.4


# Copy your custom HTML page to the web server's document root

COPY index.html /usr/local/apache2/htdocs/
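If the site grows beyond a single file, the same pattern extends naturally. Here is a sketch (the public/ directory name is hypothetical, not part of this experiment):

```dockerfile
# Use an official Apache image as the base image
FROM httpd:2.4

# Copy the entire public/ directory (hypothetical name) into the
# document root, instead of a single file
COPY public/ /usr/local/apache2/htdocs/
```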


Step 4: Build the Docker Image

  • Build the Docker image by running the following command in the same directory as your Dockerfile:


docker build -t my-apache-server .

  • Replace my-apache-server with a suitable name for your image.


Step 5: Run the Docker Container

Start a Docker container from the image you built:


docker run -p 8080:80 -d my-apache-server

  • This command maps port 80 in the container to port 8080 on your host machine and runs the container in detached mode.


Step 6: Access Your Apache Web Server

Access your Apache web server by opening a web browser and navigating to http://localhost:8080. You should see the "Hello, Docker!" message served by your Apache web server running within the Docker container.
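You can also check the server from the command line. The sketch below assumes the container from Step 5 is running; the function reports what it finds rather than failing:

```shell
#!/bin/sh
# Command-line check of the deployed server (sketch; assumes the
# container from Step 5 is up and mapped to host port 8080).
check_server() {
    if command -v docker >/dev/null 2>&1; then
        # Should print the "Hello, Docker!" page if the container is up
        curl -s http://localhost:8080 || echo "server not reachable"
    else
        echo "docker not installed"
    fi
}
check_server
```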


Step 7: Cleanup

Stop the running Docker container:


docker stop <container_id>

  • Replace <container_id> with the actual ID of your running container.


  • Optionally, remove the container and the Docker image:



docker rm <container_id>

docker rmi my-apache-server
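Cleanup is easier when the container is given a name at start-up, so no ID lookup is needed. A sketch ("web-demo" is an example name, not from the steps above; the guard skips everything if Docker is absent):

```shell
#!/bin/sh
# Run, stop, and remove the container by name instead of by ID.
# ("web-demo" is an example name; sketch only.)
cleanup_demo() {
    if command -v docker >/dev/null 2>&1; then
        docker run -p 8080:80 -d --name web-demo my-apache-server
        docker stop web-demo
        docker rm web-demo
        docker rmi my-apache-server
    fi
    echo "cleanup finished"
}
cleanup_demo
```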


Conclusion:


In this experiment, you explored containerization and application deployment with Docker by deploying an Apache web server in a Docker container. You learned how to create a Dockerfile, build a Docker image, run a Docker container, and access your web server application from your host machine. Docker's containerization capabilities make it a valuable tool for packaging and deploying applications consistently across different environments.


Exercise/Questions:


  1. Explain the concept of containerization. How does it differ from traditional virtualization methods?

  2. Discuss the key components of a container. What are images and containers in the context of containerization?

  3. What is Docker, and how does it contribute to containerization? Explain the role of Docker in building, running, and managing containers.

  4. Describe the benefits of containerization for application deployment and management. Provide examples of scenarios where containerization is advantageous.

  5. Explain the concept of isolation in containerization. How do containers provide process and filesystem isolation for applications?

  6. Discuss the importance of container orchestration tools such as Kubernetes in managing containerized applications. What problems do they solve, and how do they work?

  7. Compare and contrast containerization platforms like Docker, containerd, and rkt. What are their respective strengths and weaknesses?

  8. Explain the process of creating a Docker image. What is a Dockerfile, and how does it help in image creation?

  9. Discuss the security considerations in containerization. What measures can be taken to ensure the security of containerized applications?

  10. Explore real-world use cases of containerization in software development and deployment. Provide examples of industries or companies that have benefited from containerization technologies.
