The mighty microservice

Taking containers to the next level

Microservices: Containers on steroids

As container technology continues to evolve from its earliest days on UNIX, it has found its way into new uses and deployments, inspired by the imagination and creativity of software developers and IT engineers.

With the birth and development of the next generation of container platforms such as Docker, Kubernetes, Mesos and others since 2013, that progression has accelerated, adding microservices into the mix and enabling even more innovation in how enterprises use containers and applications.

Containers isolate software from its surroundings, packaging an application’s code, runtime, system tools, libraries and other components into a lightweight, stand-alone executable that can run in any environment. Microservices are smaller, independently runnable parts of an application that can be deployed and maintained separately, rather than as pieces of a larger, monolithic application.
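To make the idea concrete, the sketch below shows the shape of a single microservice: one small, self-contained program that answers one narrow question over HTTP and can be packaged into its own container. It is purely illustrative and not drawn from any of the companies in this article; the service name, endpoint and port are hypothetical, and only the Python standard library is used.

```python
# A minimal, hypothetical microservice: one small, independently runnable
# service with a single responsibility, built with only the standard library.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class QuoteHandler(BaseHTTPRequestHandler):
    """Answers a single narrow question (a price quote) and nothing else."""

    def do_GET(self):
        if self.path == "/quote":
            body = json.dumps({"symbol": "EXAMPLE", "price": 42.0}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Each microservice runs as its own process (and, typically, in its own
    # container), so it can be deployed, scaled and replaced independently.
    HTTPServer(("0.0.0.0", 8080), QuoteHandler).serve_forever()
```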

What sets today’s containers apart is that they can be more efficient and more secure than containers in the past, especially when microservices are run inside them.

At Capital One Financial, containers and microservices have been used in production over the past several years to better match workloads to compute resources, driving efficiencies in the company’s use of containers, according to Sathiya Shunmugasundaram, a lead software engineer.

“The maturity model has improved significantly,” he says. “Now with fine-grained microservices you design exactly each component with the right sizing of compute resources” to significantly reduce the compute needs of the company’s IT systems.

To accomplish this in production, new APIs are being developed in containers, while legacy APIs are being rewritten from a monolith to use the best practices of containers and microservices, says Shunmugasundaram. At the same time, legacy APIs and applications are being broken up into microservices and deployed into containers as well.

“The confidence that is coming out of other people’s adoption, and microservices themselves, have been evolving successfully,” he says. As container technology evolves, microservices become more realistic and functional. “That is what makes the containers more efficient compared to the past.”

Chris Collins, a senior automation engineer and a leader of container strategies in Duke University’s Office of Information Technology, says the swirling changes involving containers over the last few years are encouraging the school to continuously update its approach to them.

“It’s not like this is an extremely mature technology,” says Collins. “People are coming up with new things. We’re kind of rethinking the way we’ve done things and some of the decisions we made two or three years ago don’t really apply now. The ecosystem has changed.”

At Duke, when it comes to containerization and microservices, there’s a philosophy brewing that just about everything can be containerized if desired, starting with low-hanging fruit such as Web services, says Collins. And as new containers are added, that process is leading to the development of additional microservices.

“It’s pretty easy to take a traditional application and containerize it in multiple containers and then they’re ripe for becoming microservices, especially if they do jobs in the back end,” he says. “That’s what we’re going for now, but we’ve also containerized DNS servers and all kinds of other things.”

A Helping Hand

Containers and microservices “scratch each other’s backs,” says Vincent Batts, principal software engineer for container architecture in the office of the CTO at open source software company Red Hat.

What has encouraged this relationship is that container platforms such as Kubernetes and others have grown and evolved their ecosystems in recent years, allowing microservices to become more flexible and approachable for IT professionals and enterprises, he says.

“You can now deploy these microservices into Kubernetes to extend your container platform and that lets you run your containers easier,” says Batts. “I think containers are making microservices more efficient and microservices are making container orchestration more efficient or more enabled.”

Inside Red Hat, containers have aided IT processes the company was already doing internally, such as scanning for vulnerabilities, adding single sign-on capabilities and more. Using microservices, those services and functions are easier to deploy, manage and integrate, says Batts. “It’s pervasive at this point.”

In addition, container platforms of the past few years have also gained major improvements in security and stability that enterprises require, spawned largely by the business-critical must-haves of corporate users, says Batts.

“There are wild amounts of improvements … all the way down to nitty gritty details inside the kernel,” he says. “We are orders of magnitude better off now than we were four years ago as far as containers go.”

As additional enterprises evaluate containers and microservices to determine if they would be a good fit for their IT infrastructures, Batts says enterprise IT leaders must look at their specific internal use cases to figure out where it would make sense to deploy the technologies.

That’s the approach being taken at Duke University, where establishing a true microservices environment is still relatively new, says Collins. The team started by containerizing some traditional applications, getting familiar with those environments, and then looking at how some of those applications could be broken into smaller pieces as microservices to make things even more efficient.

“It’s a lot less complex so development of each piece and maintaining them becomes a lot easier,” says Collins. “We’re just moving the complexity from development to infrastructure, but I think there’s going to be a general gain overall.”

Kelsey Hightower, staff developer advocate for Google Cloud Platform and a widely known container and open source luminary, says he believes that while the topics of containers and microservices get much of the attention today, it is really the underlying platforms like Kubernetes, Mesos and Docker Swarm that should be getting the credit for the most important recent advances in the field.

“Where the efficiencies have come from has been the growth in container platforms,” which can more densely pack and better manage containers and microservices, he says. “When containers did show up on the scene, it made it easier to package those applications we’d been writing in a common format. It’s these tools that give the efficiencies today … not the containers themselves.”

Microservices are also providing improvements for containers because the code being written and used for them today is the result of better libraries available to and used by developers. “With most of these microservices, most developers aren’t writing all that stuff from scratch, they’re writing their business logic and importing libraries,” he says.

The Sprawl Effect

With the arrival of the Docker container platform in 2013, enterprise users began rediscovering containers as a new way of running applications and an alternative to relying on virtual machines alone. Until then, interest in container technology had been lackluster. Instead of having to run inside a virtual machine (VM), with the weight of a full operating system and other resource-hogging components, applications could run in their own virtual environments inside containers. Because containers take up less space than VMs, applications inside them start up almost instantaneously, whereas VMs tend to be slower due to the additional resources they require to run. (The first implementation of container technology was chroot in 1982, which ran on UNIX. More than a dozen other container approaches have appeared since then, including today’s generation of container offerings.)

Running applications inside VMs has been a steady enterprise trend in recent years, but it has also led to problems of VM sprawl at some companies, which find it increasingly hard to manage all the VMs they are using.

Integrating the use of containers and microservices can potentially help reduce and better manage VM sprawl issues, but it depends on the circumstances, say experts.

“For some infrastructures, [VM sprawl] might be a real problem because they don’t even know what is running and where, and the management of that is just a real burden,” says Batts. “So that might be a real winner for them. For some infrastructures, that’s just not a concern.”

If an enterprise is experiencing serious VM sprawl and is looking to find a realistic way to manage it, the company can evaluate if switching to containers could help reduce the sprawl problems, says Batts. Yet it’s not always a certain fix, he admits.

“If switching to containers doesn’t solve or make their VM sprawl go away, it just adds another thing to manage. I don’t think containers are going to unseat virtual machines, but they will definitely simplify some use cases that folks were running using virtual machines previously,” he says.

Collins, of Duke University, sees a direct correlation between the use of containers and reductions in VM sprawl inside the school’s IT infrastructure. Enterprises like Duke can place hundreds of containers onto a single host, depending on how many resources each one uses, because containers do not require the overhead of running a full-blown operating system the way a VM does, he explains.

“We did really look at trying to do this as a way of containing VM sprawl,” he says. “Getting a whole bunch of containers onto a host makes it a more efficient use of your host,” which can reduce the number of VMs that must be run. Of course, there are tradeoffs with containers as well and the IT team is still in the process of figuring out the right mix of strategies, according to Collins. 

“It’s really not just as easy as throwing a hundred containers on a host, because you have to worry about routing and ports and making sure that there’s orchestration of the containers and connecting them together correctly and in the right order,” he says.

Meanwhile, Google’s Hightower takes a different view on VM sprawl. He thinks containers make the sprawl worse when they are used by IT staffers who do not fully understand them, essentially trading one problem for another. By adding containers into the mix with VMs, users are then faced with two technology issues they must worry about, compared to one issue previously.

“There is a container sprawl because people are bringing in these containers and are not quite sure how they’re built,” says Hightower. “Once the process starts running, it’s going to be leveraging the additional OS layer that’s inside of that container and that’s where the sprawl starts to come from.”

Fortunately for enterprises that are working with containers today, the improved management tools included with Kubernetes, Docker Swarm and other platforms can be used to help control these additional issues by keeping track of what is running and where it is running, he says. “That’s a new capability that most people didn’t have before.”
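As a small, hedged illustration of what “keeping track of what is running and where” can look like on a single host, the sketch below lists the containers a local Docker daemon is running. It assumes the third-party docker SDK for Python is installed and a Docker daemon is reachable; platforms such as Kubernetes and Docker Swarm perform this kind of inventory cluster-wide, which a per-host script like this does not attempt.

```python
# A hedged, single-host sketch of container inventory: what is running, and on
# which host. Assumes the "docker" Python SDK (pip install docker) and a
# reachable Docker daemon; orchestration platforms do this across whole clusters.
import docker

client = docker.from_env()
host_name = client.info().get("Name", "unknown-host")

for container in client.containers.list():
    image = container.image.tags[0] if container.image.tags else "<untagged>"
    print(f"{host_name}: {container.name} ({container.short_id}) "
          f"image={image} status={container.status}")
```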

Analysts’ Outlook

So, where and how will containers and microservices fit into enterprise IT plans and strategies in the future?

According to recent predictions from research firm Gartner, by 2020 more than 50 percent of global organizations are expected to be running containerized applications in production, up from less than 20 percent today. Meanwhile, some 65 percent of Gartner’s customers report that they are currently experimenting with containers in their test labs.

“We’re clearly seeing growing adoption of containers in the customer environments,” says Arun Chandrasekaran, a Gartner research vice president. “The new breed of containers are more application-focused or application packaging-focused, rather than infrastructure- and hardware-focused” as in the past. “This is a primary benefit of these containers today,” because this provides more hands-on tools for developers to customize needed applications, he says.

The figures are much smaller, though, when it comes to estimating the number of enterprise users of microservices today, he adds. A few years ago, microservices were only being used by Silicon Valley giants like Facebook, Netflix and Amazon, as well as by Wall Street banks, but in 2018 he is seeing the trend beginning to seep into the enterprise among sophisticated users and organizations that incorporate advanced technologies and platforms.

So far, the use of containers and microservices is not being seen much within mainstream enterprises, says Chandrasekaran, mostly because those businesses rely mainly on legacy applications and typically lack the open source staff experience needed for such implementations.

As that changes, however, the benefits of containers and microservices will likely spread to more enterprises, he says. “Containers are clearly much more efficient from a developer standpoint because … you’re not actually spending a lot of time managing infrastructure, you’re really spending time writing code, which is really what developers want to do.”

And because containers are more lightweight than VMs, they present enterprises with another big potential benefit: the ability to consider and implement hardware consolidation efforts, he says.

In considering prospective microservices implementations, enterprises should carefully evaluate all the primary benefits that offer promise for IT operations, says Chandrasekaran, including greater agility, which lets developers build and deploy application code faster, as well as faster scaling, thanks to the smaller size and independence of microservices.

In addition, the distributed development capabilities of microservices, which enable the breakdown of applications into smaller components that can be simultaneously built by separate software teams anywhere in the world, can also speed up development schedules, helping companies accelerate their product and business efforts. Microservices can also be useful for risk reduction, he says, since they are smaller components and do not represent a single point of failure or large failure target in enterprise systems.

Gary Chen, research manager of software-defined compute with IDC, says he views microservices and containers as a perfect fit for each other, making them useful tools for a wider swath of enterprises as they explore new ways of working with VMs and critical business applications.

“We see containers being used for microservices and for traditional monolithic applications as well, so it’s not limited,” says Chen. “Containers are certainly ideal for microservices because the microservices are small things and containers are very efficient for holding lots of small things. VMs are not efficient for that. They’re good at holding big things, but when you have lots and lots of small things, VMs become very inefficient.”
