A look at DevOps tooling and some facts about deployment automation and its tools

Inside and underneath the DevOps automated deployment tools

Automated deployment tools like Jenkins, Kubernetes, Docker, Chef, Ansible, OpenShift, OpenStack and Jira can automate deployment to production environments or containers in various cloud configurations. Here we look at some of the underlying concepts and answer some of your questions…

Considerations, and a look at the environments needed to deploy in DevOps…

Q: What is a Deployment Pipeline?

A: A Deployment Pipeline is an important concept in Continuous Delivery. In a Deployment Pipeline we break the build process into distinct stages, and each stage provides the feedback needed to move on to the next. It is a collaborative effort between the various groups involved in delivering the software. Often the first stage in a Deployment Pipeline is compiling the code and converting it into binaries.
After that we run the automated tests. Depending on the specific scenario, a Deployment Pipeline can also include stages such as performance testing, security checks, and usability testing. With DevOps, the complete development cycle from initial design to production deployment becomes shorter. Our aim is to automate all the stages of the Deployment Pipeline. With a smooth-running Deployment Pipeline, we can achieve the goal of Continuous Delivery (CD).
Deployment Rollback accommodates failures caused by a bug in the code or an issue in production. This gives confidence to release features without worrying about the downtime involved in rolling back.

Deployment Automation best practice:
Automate the deployment process to the point that the build process itself includes a step that deploys the code to a test environment. All stakeholders can then access this test environment and try out the latest delivery.
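
As a rough illustration of these stages, here is a minimal sketch in Python of a pipeline driver that compiles, runs tests, and then deploys to a test environment. The commands and the "test" environment name are assumptions for illustration, not any specific tool's API.

```python
import subprocess
import sys

# Minimal sketch of a deployment pipeline driver.
# The build/test/deploy commands below are placeholders for illustration;
# substitute the commands your project and environments actually use.
STAGES = [
    ("compile", ["make", "build"]),                       # compile code into binaries
    ("automated-tests", ["make", "test"]),                # tests gate the next stage
    ("deploy-test-env", ["make", "deploy", "ENV=test"]),  # stakeholders test here
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"--- stage: {name} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # Feedback from a failing stage stops the pipeline early.
            print(f"Stage '{name}' failed; stopping pipeline.")
            sys.exit(result.returncode)
    print("All stages passed; build is a release candidate.")

if __name__ == "__main__":
    run_pipeline()
```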

 

Cloud computing supports the following deployment models:

  1. Private Cloud:
    Some companies build their own private cloud. A private cloud is a fully functional platform that is owned, operated, and used by only one organization.
    1. The primary reason for a private cloud is security; many companies feel more secure in a private cloud. Other reasons for building a private cloud are strategic decisions or control over operations.
    2. There is also the concept of a Virtual Private Cloud (VPC). In a VPC, the private cloud is built and operated by a hosting company, but it is used exclusively by one organization.
  2. Public Cloud:
    Some companies offer cloud platforms that are open for use and deployment by the public as well as by large organizations, e.g. Google Apps, Amazon Web Services, etc.
    1. Public cloud providers focus on layers and services such as cloud applications and infrastructure management.
    2. In this model, resources are shared among different organizations.
  3. Hybrid Cloud:
    The combination of public and private cloud is known as a Hybrid cloud.
    This approach provides the benefits of both approaches, private and public, so it is a very robust platform.
    A client gets the functionality and features of both cloud platforms. Using a Hybrid cloud, an organization can run its own cloud and can also hand control of part of its cloud to a third party.

 

Atlassian “Jira” can be used for writing requirements and tracking automated deployment tasks.

“Chef” is a tool that can be used to perform automated deployment in most cloud environments, catering for high availability as well.

Some of the popular use cases of “Ansible” are as follows: deploying applications in a reliable and repeatable way, orchestrating complex deployments in a simple way, and releasing updates with zero downtime.
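
As a hedged sketch of driving such a deployment from Python, the ansible-runner library can launch a playbook run; the project directory, playbook name, and inventory path below are assumptions for illustration, not part of any particular project.

```python
# Sketch: triggering an Ansible playbook run from Python with ansible-runner.
# The directory, playbook, and inventory names are assumptions for illustration.
import ansible_runner

r = ansible_runner.run(
    private_data_dir="./deploy_project",  # assumed project layout
    playbook="deploy_app.yml",            # hypothetical playbook that deploys the app
    inventory="inventories/production",   # hypothetical inventory
)

print("Status:", r.status)  # e.g. 'successful' or 'failed'
print("Return code:", r.rc)
```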

 

“Kubernetes” is used to automate the large-scale deployment of containerized applications, and it can be used in all cloud model configurations. It is an open-source system based on the concepts Google uses to deploy and run millions of containers. A Node in “Kubernetes” is responsible for running application workloads; the Node can be a virtual machine or a physical computer in the cluster. On each Node there is a piece of software called the “Kubelet”, which manages the Node and communicates with the Master node in the cluster. A “Kubernetes” cluster also has a Deployment Controller, which monitors the application instances that “Kubernetes” has created in the cluster. If a node, or the machine hosting it, goes down, the Deployment Controller replaces the affected instances by scheduling new ones on other nodes.
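
To make the Node and Deployment concepts concrete, here is a small sketch using the official Kubernetes Python client: it lists the Nodes in the cluster and reads the replica status of a Deployment. The Deployment name "my-app" and the "default" namespace are assumptions for illustration.

```python
# Sketch using the official Kubernetes Python client.
# The Deployment name "my-app" and namespace "default" are assumptions.
from kubernetes import client, config

config.load_kube_config()  # uses your local kubeconfig

# Each Node runs a Kubelet and hosts application instances (Pods).
core = client.CoreV1Api()
for node in core.list_node().items:
    print("Node:", node.metadata.name)

# The Deployment controller keeps the desired number of instances running,
# rescheduling them elsewhere if a node goes down.
apps = client.AppsV1Api()
dep = apps.read_namespaced_deployment(name="my-app", namespace="default")
print(f"Replicas ready: {dep.status.ready_replicas}/{dep.spec.replicas}")
```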

Jenkins is used to create automated flows that run automation tests. The first part of test automation is to develop the test strategy and test cases. Once automated test cases are ready for an application, we must plug them into each Build run. In each Build we run unit tests, integration tests and functional tests. With a “Jenkins” job, we can automate all these tasks. Once all the automated tests pass, we consider the build green. This helps the deployment and release processes build confidence in the application software.
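
A minimal sketch of triggering such a Jenkins job from Python with the python-jenkins library; the server URL, credentials, and job name are placeholders for illustration, not a real setup.

```python
# Sketch: triggering and checking a Jenkins job with the python-jenkins library.
# The URL, credentials, and job name are placeholders for illustration.
import jenkins

server = jenkins.Jenkins(
    "http://jenkins.example.com:8080",
    username="ci-user",
    password="api-token",
)

server.build_job("app-build-and-test")  # runs unit, integration and functional tests

info = server.get_job_info("app-build-and-test")
last = info["lastCompletedBuild"]
if last is not None:
    build = server.get_build_info("app-build-and-test", last["number"])
    print("Last completed build result:", build["result"])  # e.g. 'SUCCESS' -> green
```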

 

Q: Why is automation of deployment very important in Cloud architecture?

A: One of the main reasons for selecting Cloud architecture is scalability of the system.

  • In case of heavy load, we must scale up the system so that there is no performance degradation.
  • While scaling up the system we must start new instances.
  • To provision new instances, we must deploy our application on them.
  • In such a scenario, if we want to save time, it makes sense to automate the deployment process.

(Another term for this is Auto-scaling)

  • With a fully automated deployment process we can start new instances based on automated triggers that are raised when the load reaches a threshold.
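
A hedged sketch of such a trigger is shown below. The load metric source and the start_new_instance call are hypothetical placeholders, since the real hooks depend on your cloud provider's monitoring and auto-scaling APIs.

```python
# Sketch of a threshold-based auto-scaling trigger.
# get_average_cpu_load() and start_new_instance() are hypothetical placeholders;
# in practice these map to your cloud provider's monitoring and provisioning APIs.
import random
import time

CPU_THRESHOLD = 75.0  # percent

def get_average_cpu_load() -> float:
    # Placeholder: pretend to read a fleet-wide CPU metric.
    return random.uniform(40.0, 95.0)

def start_new_instance() -> None:
    # Placeholder: provisioning plus automated deployment of the application.
    print("Provisioning new instance and deploying application...")

def autoscale_loop(poll_seconds: int = 60, iterations: int = 3) -> None:
    for _ in range(iterations):
        load = get_average_cpu_load()
        print(f"Average CPU load: {load:.1f}%")
        if load > CPU_THRESHOLD:
            start_new_instance()
        time.sleep(poll_seconds)

if __name__ == "__main__":
    autoscale_loop(poll_seconds=1)
```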

 

Q: What is a Docker container?

A: A Docker Container is a lightweight system that can run on a Linux operating system or a virtual machine. It packages an application together with its dependencies so that it can run independently.

Since a Docker Container is very lightweight, multiple containers can run simultaneously on a single server or virtual machine.

With a Docker Container we can create an isolated system with restricted services and processes. A Container has a private view of the operating system: it has its own process ID space, file system, and network interface.

Multiple Docker Containers can share the same Kernel.
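
As a small sketch with the Docker SDK for Python, the following starts two lightweight containers side by side on the same host; the alpine image tag and container names are assumptions for illustration.

```python
# Sketch: running multiple lightweight containers on one host with the Docker SDK.
# The alpine image tag and container names are assumptions for illustration.
import docker

client = docker.from_env()

containers = [
    client.containers.run("alpine:3.19", ["sleep", "60"], detach=True, name=f"demo-{i}")
    for i in range(2)
]

for c in containers:
    print(c.name, c.status)  # each container has its own process space and filesystem

# Clean up.
for c in containers:
    c.stop()
    c.remove()
```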

 

Docker is an application-centric solution, optimized for deploying an application. It does not replace a machine by creating a virtual machine; rather, it focuses on making it easy to run an application.

Some of the common use cases of Docker are as follows:

  • Setting up a Development Environment: We can use Docker to set up a development environment containing the applications our code depends on.
  • Testing Automation Setup: Docker can also help in creating the Testing Automation setup. We can set up the different services and apps needed for the automation-testing environment with Docker (see the sketch after this list).
  • Production Deployment: Docker also helps in implementing the Production deployment of an application. We can use it to create the exact environment and process that will be used for the production deployment.
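
A hedged sketch of the testing-automation case mentioned above: starting a throwaway database service with the Docker SDK for Python. The postgres image tag, password, container name, and port mapping are assumptions for illustration.

```python
# Sketch: spinning up a throwaway service for automated tests with the Docker SDK.
# The postgres image tag, credentials, name, and port mapping are assumptions.
import docker

client = docker.from_env()

db = client.containers.run(
    "postgres:16",
    detach=True,
    environment={"POSTGRES_PASSWORD": "test-only"},
    ports={"5432/tcp": 5433},   # host port 5433 -> container port 5432
    name="test-db",
)

# ... run the automated test suite against localhost:5433 here ...

db.stop()
db.remove()
```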

 

Q: Can you lose data when a Docker Container exits?
A: No. Data is only lost when the container is deleted; exiting alone does not remove it. A Docker Container has its own filesystem, and an application running in the Container can write to this filesystem. When the container exits, the data written to its filesystem remains, and when we restart the container the same data can be accessed again.
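
A small sketch of this behaviour with the Docker SDK for Python: write a file inside a container, stop and restart it, and the file is still there; removing the container is what discards its writable layer. The image tag and file path are assumptions for illustration.

```python
# Sketch: container data survives a stop/restart, but not removal.
# The image tag and file path are assumptions for illustration.
import docker

client = docker.from_env()

# Start a long-lived container.
c = client.containers.run("alpine:3.19", ["sleep", "300"], detach=True)

# Write to the container's own filesystem.
c.exec_run(["sh", "-c", "echo hello > /tmp/state.txt"])

# Stopping and restarting keeps the writable layer and its data.
c.stop()
c.start()
print(c.exec_run(["cat", "/tmp/state.txt"]).output)  # b'hello\n'

# Removing the container discards that data.
c.stop()
c.remove()
```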

 

Docker-based containers have the following security concerns:

  • Kernel Sharing: In a container-based system, multiple containers share the same Kernel. If one container causes the Kernel to go down, it takes down all the containers on that host. In a virtual machine environment we do not have this issue.
  • Container Leakage: If a malicious user gains access to one container, they can try to access the other containers on the same host. If a container has security vulnerabilities, it can allow the user to access other containers on the same host machine.
  • Denial of Service: If one container monopolizes Kernel resources, the other containers are starved of resources, creating a Denial-of-Service-like situation (see the resource-limit sketch after this list).
  • Tampered Images: A container image can sometimes be tampered with, which leads to further security concerns. An attacker can run a tampered image to exploit vulnerabilities in the host machine and other containers.
  • Secret Sharing: Generally, one container needs to access other services, and to access a service it requires a key or secret. A malicious user can gain access to this secret, and since multiple containers share it, this may lead to further security concerns.
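
As a sketch of one mitigation for the resource-starvation concern, the Docker SDK for Python lets you cap a container's memory, CPU share, and process count when it starts. The image and the specific limit values below are illustrative assumptions, not recommendations.

```python
# Sketch: capping a container's resources to limit the impact of a runaway container.
# The image and the specific limits are illustrative assumptions.
import docker

client = docker.from_env()

c = client.containers.run(
    "alpine:3.19",
    ["sleep", "60"],
    detach=True,
    mem_limit="256m",        # cap memory usage
    nano_cpus=500_000_000,   # roughly half a CPU
    pids_limit=100,          # cap the number of processes
)
print(c.name, "started with resource limits")

c.stop()
c.remove()
```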

 

Docker is a very powerful tool. Some of the main benefits of using Docker are as follows:

  • Utilize Developer Skills: With Docker we maximize the use of developer skills; there is less need for dedicated build or release engineers, because the same developer can create the software and wrap it up in a single deployable file.
  • Standard Application Image: A Docker-based system allows us to bundle the application software and operating system files into a single Application Image that can be deployed independently.
  • Uniform Deployment: With Docker we can create one package of our software and deploy it on different platforms seamlessly.

 

In the Docker workflow, the Developer builds an Image after developing and testing the software. This Image is shipped to a Registry, from where it is available for deployment to any system. The development process is simpler because much of the QA and deployment preparation happens before the Image is shipped, so the Developer gets feedback early!
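
A minimal sketch of that build-and-ship step with the Docker SDK for Python; the build context, registry address, and tag are assumptions for illustration.

```python
# Sketch: building an image and shipping it to a registry with the Docker SDK.
# The build context, registry address, and tag are assumptions for illustration.
import docker

client = docker.from_env()

# Developer builds the image from a local Dockerfile after developing and testing.
image, build_logs = client.images.build(path=".", tag="registry.example.com/my-app:1.0")

# Ship the image to the registry, where it becomes available for deployment anywhere.
for line in client.images.push("registry.example.com/my-app", tag="1.0",
                               stream=True, decode=True):
    print(line)
```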

 

Next week’s discussion: (Keep your questions coming)

  1. “Which monitoring and operating tools are used in the industry?”
  2. “How will you optimize the Cloud Computing environment?”
    1. “In a Cloud Computing environment we pay by usage, so in such a scenario our usage costs can be much higher.
    2. To optimize the Cloud Computing environment, we must keep a balance between our usage costs and our actual usage…”
    3. Etc…

 

In WiRD's experience it is important to be clear on which tools will be used for each DevOps category. What has your experience been with regard to deploying DevOps testing tools and DevOps automation?
Please share your experiences with WiRD by commenting or interacting on LinkedIn.
