3.10 Working with Docker in Bitbucket Pipelines: Build and Push Images

Docker and Bitbucket Pipelines are a powerful combination. They allow you to automate your build, test, and deployment processes with ease. In this post, we’ll dive into using Docker within Bitbucket Pipelines to build and push container images, making your application delivery more reliable and efficient. This guide focuses on a practical, hands-on approach suitable for beginners and intermediate users.

Why Docker in Bitbucket Pipelines?

Think of Docker as a standardized shipping container for your application. It packages everything your application needs to run – code, runtime, system tools, libraries, settings – guaranteeing consistent behavior across different environments.

Bitbucket Pipelines then automates the process of building these “containers” (Docker images) and pushing them to a registry (like Docker Hub, AWS ECR, or Google Container Registry). This ensures your team always has access to the latest, working version of your application, ready to deploy.

Prerequisites:

  • A Bitbucket account.
  • A Bitbucket repository with your application code and a Dockerfile.
  • A Docker Hub account (or access to another container registry).

1. Setting up your Dockerfile:

A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image. Let’s look at a basic example:

# Use an official Python runtime as a parent image
FROM python:3.9-slim-buster

# Set the working directory to /app
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 8000 available to the world outside this container
EXPOSE 8000

# Define environment variable
ENV NAME World

# Run app.py when the container launches
CMD ["python", "app.py"]

Explanation:

  • FROM python:3.9-slim-buster: Specifies the base image. We’re using a lightweight Python 3.9 image.
  • WORKDIR /app: Sets the working directory inside the container.
  • COPY . /app: Copies all files from your repository into the /app directory in the container.
  • RUN pip install --no-cache-dir -r requirements.txt: Installs Python dependencies listed in requirements.txt. --no-cache-dir reduces image size.
  • EXPOSE 8000: Exposes port 8000, which your application will use.
  • ENV NAME World: Defines an environment variable named NAME with the value World.
  • CMD ["python", "app.py"]: Specifies the command to run when the container starts.
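For completeness, here is a minimal stand-in for the app.py that the Dockerfile runs. This is only a sketch: it simply prints a greeting built from the NAME environment variable, whereas a real application would typically be a web service listening on the EXPOSEd port 8000.

```python
# Minimal stand-in for app.py (a sketch; a real app would likely serve HTTP
# on the port EXPOSEd by the Dockerfile).
import os

def greeting() -> str:
    # Reads the NAME variable set by `ENV NAME World` in the Dockerfile,
    # falling back to "World" if it is unset.
    return "Hello, {}!".format(os.environ.get("NAME", "World"))

if __name__ == "__main__":
    print(greeting())
```

Running the container with `docker run -e NAME=Pipelines …` would override the default and print "Hello, Pipelines!".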

2. Configuring Bitbucket Pipelines:

Now, let’s configure Bitbucket Pipelines using a bitbucket-pipelines.yml file in your repository’s root.

image: docker:latest

definitions:
  services:
    docker:
      memory: 2048

pipelines:
  default:
    - step:
        name: Build and Push Docker Image
        services: [docker]
        script:
          - echo "Logging into Docker Hub..."
          - echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin
          - echo "Building the Docker image..."
          - docker build -t $DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT .
          - echo "Pushing the Docker image..."
          - docker push $DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT

Explanation:

  • image: docker:latest: Uses the official Docker image to run the pipeline steps. Pinning a specific version tag rather than latest makes builds more reproducible.
  • definitions.services.docker.memory: 2048: Allocates 2048MB of memory to the Docker service. Adjust as needed for your application.
  • pipelines.default: Defines the pipeline that runs on every push to any branch that doesn’t match a more specific pipeline definition (not only the default branch).
  • step.name: Build and Push Docker Image: A descriptive name for the step.
  • step.services: [docker]: Specifies that the Docker service is required for this step.
  • step.script: The heart of the pipeline, containing the commands to execute. Let’s break this down further:
    • docker login …: Authenticates against your Docker registry (Docker Hub in this example) using the $DOCKER_USERNAME and $DOCKER_PASSWORD repository variables (see below). Passing the password on stdin keeps it out of the process list.
    • docker build -t $DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT .: Builds the Docker image.
      • -t $DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT: Tags the image with the repository name and the commit hash. This is crucial for version control. $DOCKER_IMAGE_NAME is an environment variable.
      • .: Specifies the location of the Dockerfile (current directory).
    • docker push $DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT: Pushes the built image to the Docker registry.
  • Note on artifacts: the artifacts keyword persists files between pipeline steps. A pushed Docker image lives in the registry, not in the build directory, so it should not be declared as an artifact.
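A common extension, not shown in the pipeline above, is to push a moving latest tag alongside the immutable commit tag, so consumers can pull either a stable reference or the newest build. A hedged sketch of the extra script lines, assuming the same variables:

```yaml
          # Additionally tag the freshly built image as "latest" and push it
          - docker tag "$DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT" "$DOCKER_IMAGE_NAME:latest"
          - docker push "$DOCKER_IMAGE_NAME:latest"
```

Keep in mind that latest is mutable: for deployments, prefer the commit-hash tag so you always know exactly what is running.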

3. Setting up Environment Variables:

For security reasons, we use environment variables to store sensitive information like Docker Hub credentials.

  1. Go to your Bitbucket repository settings.
  2. Click on “Repository Variables” under “Pipelines”.
  3. Add the following variables:
    • DOCKER_USERNAME: Your Docker Hub username.
    • DOCKER_PASSWORD: Your Docker Hub password (or an access token). Important: Use an access token with write permissions instead of your password for increased security.
    • DOCKER_IMAGE_NAME: The name of the image you want to push to Docker Hub (e.g., your_username/your_image_name).

Important Security Note: Consider creating a dedicated Docker Hub account specifically for automated builds to isolate credentials. Use access tokens with the least privileges necessary for your build process.

4. Triggering the Pipeline:

Now, simply commit and push a change to your repository. Bitbucket Pipelines will automatically trigger and execute the pipeline defined in your bitbucket-pipelines.yml file.

5. Verifying the Image:

After the pipeline completes successfully, log in to your Docker Hub account and verify that the image has been built and pushed correctly. You should see a new tag matching the commit hash.

Troubleshooting:

  • “Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?” – Make sure the services: [docker] configuration is present in your bitbucket-pipelines.yml file.
  • “unauthorized: authentication required” – Double-check your DOCKER_USERNAME and DOCKER_PASSWORD (or access token) in the repository variables. Make sure the user has push access to the repository.
  • “Error building the image” – Review your Dockerfile for errors. Common issues include missing dependencies, incorrect paths, or invalid commands. Examine the pipeline logs for detailed error messages.
  • Image build fails due to memory constraints – Increase the memory allocated to the docker service via definitions.services.docker.memory (for example, 3072).
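For the memory case, a hedged configuration sketch: size: 2x doubles the step’s total memory allowance (and consumes extra build minutes), and 3072 is an illustrative value for the docker service, not a recommendation.

```yaml
definitions:
  services:
    docker:
      memory: 3072   # illustrative; raise or lower to match your build

pipelines:
  default:
    - step:
        size: 2x          # doubles the memory available to the whole step
        services: [docker]
        script:
          - docker build -t "$DOCKER_IMAGE_NAME:$BITBUCKET_COMMIT" .
```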

Advanced Considerations:

  • Multi-Stage Builds: Use multi-stage builds in your Dockerfile to reduce image size by separating build dependencies from runtime dependencies.
  • Image Scanning: Integrate image scanning tools (like Snyk or Clair) into your pipeline to identify vulnerabilities in your Docker images.
  • Caching: Leverage Docker layer caching to speed up build times.
  • Conditional Pipelines: Use branches and tags configurations in bitbucket-pipelines.yml to trigger different pipelines based on the branch or tag of the commit.
  • Integration with Other Services: Extend your pipeline to automate deployments to cloud platforms like AWS, Azure, or Google Cloud.
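To illustrate the multi-stage idea, here is a hedged sketch adapting the earlier Dockerfile: dependencies are installed in a builder stage, and only the installed packages plus the application code are copied into the final image. The stage name builder and the /install prefix are arbitrary choices.

```dockerfile
# Build stage: install dependencies into an isolated prefix
FROM python:3.9-slim-buster AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt

# Runtime stage: copy only the installed packages and the app code
FROM python:3.9-slim-buster
WORKDIR /app
COPY --from=builder /install /usr/local
COPY . /app
EXPOSE 8000
CMD ["python", "app.py"]
```

Copying requirements.txt before the rest of the source also improves layer caching: the dependency layer is rebuilt only when requirements.txt changes.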

Conclusion:

Building and pushing Docker images with Bitbucket Pipelines is a powerful way to automate your application delivery process. By following the steps outlined in this guide, you can streamline your workflow, improve reliability, and ensure consistent application behavior across different environments. Experiment with the advanced considerations to further optimize your pipelines and enhance your CI/CD practices. Good luck!
