
Experiment No. 6 Title: Exploring Containerization and Application Deployment with Docker



Objective:

The objective of this experiment is to provide hands-on experience with Docker containerization and application deployment by deploying an Apache web server in a Docker container. By the end of this experiment, you will understand the basics of Docker, how to create Docker containers, and how to deploy a simple web server application.


Introduction

Containerization is a technology that has revolutionized the way applications are developed, deployed, and managed in the modern IT landscape. It provides a standardized and efficient way to package, distribute, and run software applications and their dependencies in isolated environments called containers.


Containerization technology has gained immense popularity, with Docker being one of the most well-known containerization platforms. This introduction explores the fundamental concepts of containerization, its benefits, and how it differs from traditional approaches to application deployment.


Key Concepts of Containerization:


  • Containers: Containers are lightweight, stand-alone executable packages that include everything needed to run a piece of software, including the code, runtime, system tools, libraries, and settings. Containers ensure that an application runs consistently and reliably across different environments, from a developer's laptop to a production server.

  • Images: Container images are the templates for creating containers. They are read-only and contain all the necessary files and configurations to run an application. Images are typically built from a set of instructions defined in a Dockerfile.

  • Docker: Docker is a popular containerization platform that simplifies the creation, distribution, and management of containers. It provides tools and services for building, running, and orchestrating containers at scale.

  • Isolation: Containers provide process and filesystem isolation, ensuring that applications and their dependencies do not interfere with each other. This isolation enhances security and allows multiple containers to run on the same host without conflicts.
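The image/container relationship above can be sketched with a minimal Dockerfile; the base image and command here are illustrative only, not part of this experiment:

```dockerfile
# A minimal image definition: a read-only recipe that Docker turns into
# an image. Starting FROM a small base image keeps the result lightweight.
FROM alpine:3.19

# The command a container created from this image will run on start.
CMD ["echo", "Hello from a container"]
```

Building this file (docker build -t demo .) produces an image; docker run --rm demo then starts a short-lived container from that image, illustrating the template-versus-instance distinction described above.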


Benefits of Containerization:

  • Consistency: Containers ensure that applications run consistently across different environments, reducing the "it works on my machine" problem.

  • Portability: Containers are portable and can be easily moved between different host machines and cloud providers.

  • Resource Efficiency: Containers share the host operating system's kernel, which makes them lightweight and efficient in terms of resource utilization.

  • Scalability: Containers can be quickly scaled up or down to meet changing application demands, making them ideal for microservices architectures.

  • Version Control: Container images are versioned, enabling easy rollback to previous application states if issues arise.

  • DevOps and CI/CD: Containerization is a fundamental technology in DevOps and CI/CD pipelines, allowing for automated testing, integration, and deployment.


Containerization vs. Virtualization:

Containerization differs from traditional virtualization, where a hypervisor virtualizes an entire operating system to run each workload in its own virtual machine (VM). In contrast:

  • Containers share the host OS kernel, making them more lightweight and efficient.

  • Containers start faster and use fewer resources than VMs.

  • VMs encapsulate an entire OS, while containers package only the application and its dependencies.
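One way to see the shared-kernel point concretely (a small sketch, assuming Docker is installed and the alpine image can be pulled) is to compare kernel versions inside and outside a container:

```shell
# The kernel version reported inside the container matches the host's,
# because a container does not boot its own operating system kernel.
uname -r                           # kernel version on the host
docker run --rm alpine uname -r    # same kernel, reported from inside
```

A VM running on the same host would instead report the kernel of its own guest OS.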



Materials:

  • A computer with Docker installed (https://docs.docker.com/get-docker/)

  • A code editor

  • Basic knowledge of Apache web server


Experiment Steps:

Step 1: Install Docker

  • If you haven't already, install Docker on your computer by following the instructions provided on the Docker website (https://docs.docker.com/get-docker/).

     

     To install Docker on Ubuntu, you can follow these steps:

  • Update Package List: It's good practice to ensure that your package list is up to date. Open a terminal and run the following command:

sudo apt update

  • Install Dependencies: Docker requires a few supporting packages, which can be installed with:

sudo apt install -y apt-transport-https ca-certificates curl software-properties-common

  • Add Docker GPG Key: Add Docker's official GPG key so that the packages you download can be verified as coming from a trusted source:

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

  • Add Docker Repository: Add the Docker repository to your system using the following command. The $(lsb_release -cs) fragment automatically selects the release codename for your Ubuntu version (e.g., "focal" for Ubuntu 20.04):

echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

  • Update Package List (Again): Run an update so that the newly added repository is recognized:

sudo apt update

  • Install Docker: Finally, install Docker:

sudo apt install docker-ce

  • Start and Enable Docker Service: After installation, start the Docker service and enable it to start on boot:

sudo systemctl start docker
sudo systemctl enable docker

  • Verify Docker Installation: To confirm that Docker is installed, check the version:

sudo docker --version

     You can also check whether the Docker service is running with:

sudo systemctl status docker

  • Manage Docker Without Sudo (Optional): By default, you need sudo to run Docker commands. If you want to manage Docker without sudo, add your user to the "docker" group:

sudo usermod -aG docker $USER

     After running this command, log out and log back in (or reboot) for the change to take effect.

     That's it! You should now have Docker installed and running on your Ubuntu system, and you can start using it to create and manage containers.
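As an optional end-to-end check beyond docker --version (assuming the daemon is running), Docker publishes a tiny test image:

```shell
# Pulls the hello-world test image and runs it once; a short greeting
# confirms that the daemon, image pulls, and container startup all work.
sudo docker run --rm hello-world
```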

 

Step 2: Create a Simple HTML Page

  • Create a directory for your web server project.

  • Inside this directory, create a file named index.html with a simple "Hello, Docker!" message. This will be the content served by your Apache web server.
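The two steps above can be done from the terminal as follows; the directory name my-apache-server is just an example:

```shell
# Create a project directory and a minimal index.html inside it.
mkdir -p my-apache-server
cd my-apache-server

cat > index.html <<'EOF'
<!DOCTYPE html>
<html>
  <body>
    <h1>Hello, Docker!</h1>
  </body>
</html>
EOF
```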


Step 3: Create a Dockerfile

  • Create a Dockerfile in the same directory as your web server project. The Dockerfile defines how your Apache web server application will be packaged into a Docker container. Here's an example:


Dockerfile


# Use an official Apache image as the base image

FROM httpd:2.4


# Copy your custom HTML page to the web server's document root

COPY index.html /usr/local/apache2/htdocs/


Step 4: Build the Docker Image

  • Build the Docker image by running the following command in the same directory as your Dockerfile:


docker build -t my-apache-server .

  • Replace my-apache-server with a suitable name for your image.
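If the build succeeds, the image appears in your local image list; this optional check assumes you kept the my-apache-server tag used above:

```shell
# List the image by name to confirm the build produced it.
docker images my-apache-server
```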


Step 5: Run the Docker Container

Start a Docker container from the image you built:


docker run -p 8080:80 -d my-apache-server

  • This command maps port 8080 on your host machine to port 80 in the container and runs the container in detached mode.


Step 6: Access Your Apache Web Server

Access your Apache web server by opening a web browser and navigating to http://localhost:8080. You should see the "Hello, Docker!" message served by your Apache web server running within the Docker container.
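You can also fetch the page from the command line; this assumes the container from Step 5 is still running with port 80 mapped to 8080:

```shell
# Request the page without a browser; the HTML from index.html is printed.
curl http://localhost:8080/
```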


Step 7: Cleanup

Stop the running Docker container:


docker stop <container_id>

  • Replace <container_id> with the actual ID of your running container.


  • Optionally, remove the container and the Docker image:



docker rm <container_id>

docker rmi my-apache-server
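An optional variation that simplifies cleanup: if you pass --name when starting the container in Step 5, you can stop and remove it by name instead of looking up its ID (the name my-apache here is just an example):

```shell
# Start the container with an explicit name...
docker run -p 8080:80 -d --name my-apache my-apache-server

# ...then stop and remove it by that name.
docker stop my-apache
docker rm my-apache
```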


Conclusion:


In this experiment, you explored containerization and application deployment with Docker by deploying an Apache web server in a Docker container. You learned how to create a Dockerfile, build a Docker image, run a Docker container, and access your web server application from your host machine. Docker's containerization capabilities make it a valuable tool for packaging and deploying applications consistently across different environments.


Exercise/Questions:


  1. Explain the concept of containerization. How does it differ from traditional virtualization methods?

  2. Discuss the key components of a container. What are images and containers in the context of containerization?

  3. What is Docker, and how does it contribute to containerization? Explain the role of Docker in building, running, and managing containers.

  4. Describe the benefits of containerization for application deployment and management. Provide examples of scenarios where containerization is advantageous.

  5. Explain the concept of isolation in containerization. How do containers provide process and filesystem isolation for applications?

  6. Discuss the importance of container orchestration tools such as Kubernetes in managing containerized applications. What problems do they solve, and how do they work?

  7. Compare and contrast containerization platforms like Docker, containerd, and rkt. What are their respective strengths and weaknesses?

  8. Explain the process of creating a Docker image. What is a Dockerfile, and how does it help in image creation?

  9. Discuss the security considerations in containerization. What measures can be taken to ensure the security of containerized applications?

  10. Explore real-world use cases of containerization in software development and deployment. Provide examples of industries or companies that have benefited from containerization technologies.
