Application Containers: Increase DevOps Agility and Gain Efficiencies
A recent survey of 4,976 respondents showed that high-performing IT organizations deliver changes to production 30 times more frequently and 200 times faster. As software transforms industry after industry, fast delivery is a huge competitive advantage. These organizations do not just deliver faster, they also deliver with higher quality: 60 percent fewer failed deployments and a Mean Time to Recover (MTTR) that is 168 times faster!
Not surprisingly, a key differentiator for these top performers was their adoption of DevOps practices and comprehensive automation. In this article, I will discuss a technology that is rapidly becoming the cornerstone of successful DevOps initiatives: application containers.
A Brief Introduction to Containers
Containers provide a way to securely segment and share an operating system’s resources. Containers are sometimes referred to as OS virtualization, as a single host operating system is made to look like multiple independent systems. Containers are not new: traditional platforms, such as Heroku and Cloud Foundry, have always used containers internally. What’s different now is that the industry is moving towards standardizing containers, and that enterprises can freely choose the best tools and services built around these standards. Interestingly, as part of the launch of its cloud services, Google revealed that it has been running most of its applications in containers for the last 10 years.
Containers provide several benefits and enable DevOps best practices. Here are the top benefits:
1. Immutable Images: with containers, application code is packaged into a binary image as part of the build process. This image can be versioned, tagged, and pushed to an image repository. The image can then be pulled and executed, without any change, on any container host. Immutable images are key to fast recovery and to other DevOps best practices such as blue-green deployments and canary launches.
2. Portability: once built, a container image can be run on any host. This enables basic portability. Multi-cloud container services, like Nirmata and Tutum (recently acquired by Docker, Inc.), build on this basic portability and enable fully automated deployment of entire applications on any cloud provider, with a single click or API request.
3. Standard Units of Operation: since all applications are packaged in a common format, automation tools do not have to deal with the complexity and varying needs of different languages and runtimes. A great analogy here is the shipping container and the transportation industry: imagine the lack of productivity before shipping containers became an international standard!
4. Fast Deployment Times: containers are lightweight and typically start in milliseconds. Since application developers may perform thousands of launches during a release cycle, decreasing deployment times directly translates to huge productivity gains for your most precious resource: your team! One of our customers at Nirmata, a large enterprise IT applications team, estimated a 30 percent increase in agility by moving to containers for automated deployment of their dev-test environments.
5. Increased Utilization: with containers, you can decouple applications from the underlying infrastructure. This makes it possible to share infrastructure across applications and environment types. Container services take this a step further and fully automate the placement and scheduling of application containers. These tools can help increase server utilization from the typical 10-15 percent to a 60-80 percent range. For example, another Nirmata customer leverages AWS Spot Instances and saves 60 percent on server costs by relying on automated recovery and demand-driven scheduling.
6. Enabler for Microservices: high performers have architected for DevOps. The architectural best practices that enable DevOps are now becoming mainstream as microservices. With microservices-style architectures, large monolithic applications are decomposed into smaller runtime components that can be independently versioned and delivered.
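The immutable-image workflow from the first benefit can be sketched with a minimal Dockerfile; the base image, file names, and registry path here are hypothetical, not from the article:

```dockerfile
# Hypothetical build recipe for a small Node.js service.
# Running `docker build -t registry.example.com/myapp:1.0.2 .`
# produces a versioned, immutable image that can be pushed with
# `docker push` and then run, unchanged, on any container host.
FROM node:4-slim
WORKDIR /app
COPY package.json .
RUN npm install --production
COPY . .
EXPOSE 8080
CMD ["node", "server.js"]
```

Because the image is built once and never modified afterwards, rolling back a bad release is just a matter of running the previous tag, which is what makes blue-green deployments and canary launches straightforward.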
As with any new technology, application containers do introduce new challenges. Here are the main areas to consider for production deployments:
1. Security: application containers running on the same server share a host operating system. This may be a concern for some applications. Most deployments today run containers on virtual machines (or cloud instances) and use existing tools to secure the virtual machine. This can impact utilization, but helps provide security and isolation where needed.
Another security concern is ensuring that container images, especially those used in production environments, are approved and verified for use by the security team and are pulled from a trusted source.
Both of these issues have been addressed by Docker, and startups like Illumio, Twistlock, and Apcera are providing comprehensive security and compliance solutions for containers.
2. Orchestration and Lifecycle Management: while deploying a small set of containers is easy, managing containers across different environments and applications can quickly become a complex and painful task. Container deployment and lifecycle management is best left to automated services that can continuously monitor, tune, and optimize deployments. The good news is that startups like Nirmata, Tutum, and others are providing easy-to-use multi-cloud container services designed for enterprise DevOps teams.
3. Monitoring and Troubleshooting: with containers, and the shift towards microservices-style applications, there are many more moving pieces to manage. This requires new approaches for monitoring and troubleshooting applications. Once again, a healthy ecosystem has rapidly emerged, with startups like Sysdig and SignalFx, and existing vendors like DataDog, filling the void.
Application containers are a key enabler for DevOps agility and automation. While no single technology is a silver bullet, containers can be a game-changer for enterprise DevOps!
There is a common misconception that containers are only good for certain types of applications, such as stateless microservices-style applications. This is not the case: almost any application that can run in a virtual machine can also run in a container. Newer storage services, as well as container storage plug-ins, are available for the few cases that cannot be handled directly. Much like the adoption curve for VMs and cloud instances went from “cool, but my application has special needs” to “cloud-first!”, containers are poised to quickly become the first choice for packaging and deploying applications.
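For stateful workloads, the storage plug-ins mentioned above typically surface as volumes. A minimal sketch in Docker Compose syntax (the service and image names are illustrative, not from the article):

```yaml
# Hypothetical sketch: a stateful database container whose data
# lives in a named volume, so it survives container restarts
# and upgrades of the database image itself.
version: "2"
services:
  db:
    image: postgres:9.4
    volumes:
      - dbdata:/var/lib/postgresql/data
volumes:
  dbdata:
```

Separating the data (the volume) from the runtime (the container) is what lets even a traditional database follow the immutable-image pattern: the container can be replaced at will while the state persists.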
By adopting containers, enterprises can immediately leverage several best practices that high-performing IT teams have in place. At Nirmata, the advice we give our customers is to follow these three steps:
1. Containerize: start by containerizing existing applications. This provides immediate value in agility and portability. The process of containerization also brings best practices, such as separating configuration from code, and other principles of 12-factor applications. As an example, a 3-tier application is best containerized as three separate container images, one for each tier.
2. Automate: with containers as the building block, you can now fully automate application delivery and lifecycle management. Early adopters had to build their own tools or cobble together several different tools. Now, enterprise-grade solutions are available for mainstream use.
3. Transform: lead towards a DevOps culture and organization. Also consider transforming monolithic applications into microservices-style applications for increased agility. These transitions take time and should not be rushed.
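The 3-tier example from step 1 can be sketched as a Docker Compose file with one image per tier; the service names, images, and environment variables below are hypothetical placeholders:

```yaml
# Hypothetical 3-tier application: one container image per tier.
# Configuration is injected via environment variables, keeping it
# separate from code, in line with 12-factor principles.
version: "2"
services:
  web:                        # presentation tier
    image: mycorp/web:1.0
    ports:
      - "80:80"
  app:                        # application tier
    image: mycorp/app:1.0
    environment:
      - DB_HOST=db
  db:                         # data tier
    image: postgres:9.4
```

Splitting the tiers into separate images means each one can be versioned, scaled, and redeployed independently, which is exactly the property that later makes a move to microservices easier.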