The term Serverless Containers represents the idea that customers can now run containers without having to manage the actual servers or compute infrastructure the containers are running on.

Serverless – Some Background

Traditionally, the “Serverless” term was associated with platforms like AWS Lambda (Functions as a Service). With this new paradigm, users don’t have to deal with any maintenance or management of the underlying infrastructure in order to run their code. 

Here are several characteristics that define Serverless well:

  1. No maintenance of infrastructure
  2. Scale by request – The service will scale based on real-time requests
  3. Utility billing – Pay only for what you use
  4. Scale to zero – The service can scale all the way down to zero when there is no usage
  5. Fast scaling – New capacity is provisioned immediately, without waiting for a utilization threshold to be crossed
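To make the FaaS model behind these characteristics concrete, a function deployed to a platform like AWS Lambda is just a handler the platform invokes once per request. The sketch below follows AWS Lambda's Python handler convention; the greeting logic itself is invented purely for illustration:

```python
# Minimal FaaS-style handler (AWS Lambda Python convention).
# The platform invokes this once per request and bills only for
# execution time -- there are no servers for the user to manage.
import json

def handler(event, context):
    # 'event' carries the request payload; 'context' carries runtime metadata.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The platform handles everything around this function: provisioning, scaling per request, and scaling back to zero when traffic stops.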

When these criteria are met, a PaaS platform can be considered "Serverless," since it handles all the infrastructure and maintenance aspects for the user.

Over time, customers wanted to extend that approach, and instead of packaging their code in a ZIP file and shipping it to a Serverless platform, they wanted to have a way to ship a standard container that includes all of their dependencies, specific OS, packages, code, and configurations. Recently, this has become available with new “Containers as a Service” technologies, and the two converged in what is now known as “Serverless Containers”.


According to Docker’s website, containers “are an abstraction at the app layer that packages code and dependencies together. Multiple containers can run on the same machine and share the OS kernel with other containers, each running as isolated processes in user space. Containers take up less space than VMs (container images are typically tens of MBs in size), can handle more applications and require fewer VMs and Operating systems.” 

Containerization has become extremely popular and has modernized application architecture, helping companies move from monolithic architecture to microservices. 

Bringing the two together, AWS now offers "Serverless Container" products like Fargate, which enables companies to run containers without having to manage EC2 servers or clusters.

Users can now run their microservices in containers in a serverless environment.

Fargate completely removes the need to touch infrastructure for companies that use AWS’ Elastic Container Service (ECS). While this helps reduce the complexity involved in managing the underlying infrastructure, Fargate can be quite expensive. Additionally, it is not currently available for Kubernetes or EKS users. Lastly, the user has zero control or visibility into which instance types are being used.
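The "zero visibility into instance types" point is visible in the API itself: a Fargate task definition specifies only CPU and memory for the task, never an instance type. The sketch below builds such a request payload; the family name, image, and sizes are illustrative, and in practice the resulting dict would be passed to boto3's `ecs_client.register_task_definition`:

```python
# Sketch: build an ECS task definition payload for Fargate.
# Note there is no instance type anywhere -- only CPU/memory sizing.
# (Names, image, and sizes are illustrative; in practice this dict
# would be passed to boto3: ecs_client.register_task_definition(**task_def).)
def fargate_task_definition(family, image, cpu="256", memory="512"):
    return {
        "family": family,
        "requiresCompatibilities": ["FARGATE"],  # run serverless, not on EC2
        "networkMode": "awsvpc",                 # required for Fargate tasks
        "cpu": cpu,        # CPU units for the whole task
        "memory": memory,  # memory (MiB) for the whole task
        "containerDefinitions": [
            {"name": "app", "image": image, "essential": True},
        ],
    }

task_def = fargate_task_definition("my-service", "nginx:latest")
```

AWS decides which hardware actually backs the task, which is exactly the trade-off described above: less to manage, but no control over the underlying instances.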

Google also offers a serverless container option with Cloud Run.

Microsoft Azure offers Container Instances, which, as they write, lets you "easily run containers on Azure without managing servers".

Ocean by Spot provides an ideal Serverless Container experience for enterprises and SMBs alike. Ocean manages all the underlying infrastructure, removing all the time-consuming, nitty-gritty tasks, freeing DevOps to handle more important, core IT and engineering activities. 

With Ocean’s built-in container-driven autoscaling, all resource requests are handled based on real requirements for CPU and memory, resulting in high workload availability. Whenever further node optimization is possible, Ocean stops scheduling Pods or Tasks on underutilized nodes, proactively and gracefully drains them, and bin-packs their containers onto the best-suited instance(s).
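Conceptually, the bin-packing step described above resembles a first-fit-decreasing heuristic: sort pending containers by resource request and place each one onto the first node with room, so that fewer nodes stay busy and the empty ones can be drained. This is only an illustrative sketch of the idea, not Ocean's actual algorithm:

```python
# Illustrative first-fit-decreasing bin-packing of container CPU
# requests onto nodes (NOT Ocean's actual algorithm -- just the idea).
def bin_pack(requests, node_capacity):
    """Place each CPU request onto the first node with room,
    largest requests first; returns a list of nodes (lists of requests)."""
    nodes = []
    for req in sorted(requests, reverse=True):
        for node in nodes:
            if sum(node) + req <= node_capacity:
                node.append(req)  # fits on an existing node
                break
        else:
            nodes.append([req])  # no existing node fits -> add a node
    return nodes

# Packing these vCPU requests onto 4-vCPU nodes needs only 2 nodes:
placement = bin_pack([2, 1, 1, 2, 1, 1], node_capacity=4)
```

Placing the largest requests first tends to fill nodes tightly, which is what lets a real autoscaler consolidate workloads and scale the emptied nodes away.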

Ocean works with Amazon ECS, Amazon EKS, Kops and similar Kubernetes orchestration platforms. 

Of course, Spot can run all your container workloads on EC2 Spot instances with enterprise-level SLAs, so you can enjoy a serverless experience without breaking the bank. Ocean minimizes your day-to-day infrastructure management while automating your cloud cost optimization, delivering savings of up to 90% on popular compute resources like EC2.

With Ocean, you can also control which instance types are used, ensuring your workloads run on the hardware best suited to them.

Use All EC2 Pricing Models

Ocean will also utilize any unused Reserved Instances you might have, further optimizing your overall cloud cost and ROI. If no spot instances are available, Ocean falls back to On-Demand instances, ensuring your workload always runs.