AWS' ninth annual re:Invent event kicked off on November 30, this time held virtually. The three-week event began as every re:Invent has, with a famously lengthy and informative keynote address from AWS Chief Executive Officer Andy Jassy. Jassy opened by discussing AWS' $46 billion revenue run rate as of Q3 2020 and its year-over-year growth rate of 29%.
And while the AWS CEO's address has flooded headlines, behind the scenes the company has been quietly rolling out impressive updates to its serverless features, chief among them a major increase in AWS Lambda's resource limits.
“Starting today, you can allocate up to 10 GB of memory to a Lambda function,” the company wrote in a blog post announcing the new capabilities. “This is more than a 3x increase compared to previous limits. Lambda allocates CPU and other resources linearly in proportion to the amount of memory configured. That means you can now have access to up to 6 vCPUs in each execution environment.”
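With up to 6 vCPUs per execution environment, CPU-bound work inside a handler can now be parallelized. Below is a minimal, hypothetical sketch of a Python handler that fans work out across processes (the `numbers` event field and `_square` worker are illustrative, not from the announcement). Note that Lambda's environment lacks `/dev/shm`, so `multiprocessing.Pool` is unavailable there; `Process` and `Pipe` work instead:

```python
import json
from multiprocessing import Process, Pipe

def _square(conn, n):
    # Illustrative CPU-bound work: send the result back over the pipe
    conn.send(n * n)
    conn.close()

def handler(event, context):
    # Hypothetical event shape: {"numbers": [...]}
    numbers = event.get("numbers", [1, 2, 3])

    # Lambda has no /dev/shm, so multiprocessing.Pool fails there;
    # Process + Pipe is the commonly used workaround.
    processes, parents = [], []
    for n in numbers:
        parent, child = Pipe()
        p = Process(target=_square, args=(child, n))
        p.start()
        processes.append(p)
        parents.append(parent)

    results = [parent.recv() for parent in parents]
    for p in processes:
        p.join()

    return {"statusCode": 200, "body": json.dumps(results)}
```

Whether this parallelism pays off depends on memory configuration, since Lambda allocates CPU in proportion to the memory you assign.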
While serverless computing appears to mean there are no servers, it actually means that developers no longer need to worry about compute, storage, and memory requirements because AWS takes care of them. This lets developers focus on application code instead of provisioning resources.
AWS provides base images for all supported Lambda runtimes, including Python, Node.js, Java, .NET, Go, and Ruby. With these, it is much easier to add code and dependencies without having to refactor an application to work in a cloud-native, serverless way.
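Using one of those base images, packaging a function is little more than copying code in. A minimal sketch, assuming a hypothetical `app.py` containing a `handler` function and a `requirements.txt` for dependencies:

```dockerfile
# Start from the AWS-provided base image for the Python 3.8 runtime
FROM public.ecr.aws/lambda/python:3.8

# Copy function code into the Lambda task root
COPY app.py ${LAMBDA_TASK_ROOT}

# Install dependencies alongside the function code
COPY requirements.txt .
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# Tell Lambda which handler to invoke (module.function)
CMD ["app.handler"]
```

The base image already contains the runtime and the machinery for talking to Lambda, which is why no ENTRYPOINT needs to be set here.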
Developers can deploy any container image as long as it conforms to the AWS Lambda Runtime API to receive invocation requests and send responses after processing them. The ENTRYPOINT configuration should define the filesystem location in the container image that implements the AWS Lambda Runtime Interface Client (RIC). If developers prefer to use an arbitrary base image, they can leverage the open-source AWS Lambda Runtime Interface Client to make it compatible with Lambda's Runtime API.
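For Python, the open-source RIC is published as the `awslambdaric` package, so adapting an arbitrary base image can look like the following sketch (the `python:3.8-slim` base and `app.handler` names are illustrative assumptions):

```dockerfile
# Arbitrary base image rather than an AWS-provided one
FROM python:3.8-slim

# Install the open-source Runtime Interface Client so the image
# can speak Lambda's Runtime API
RUN pip install awslambdaric

# Copy the function code (hypothetical app.py with a handler function)
COPY app.py /var/task/
WORKDIR /var/task

# ENTRYPOINT launches the RIC; CMD names the handler it should invoke
ENTRYPOINT ["python", "-m", "awslambdaric"]
CMD ["app.handler"]
```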
This announcement, combined with support for the AVX2 instruction set, means that developers can use Lambda for more sophisticated workloads like machine learning, gaming, and even high-performance computing.
Amazon has also released the open-source Lambda Runtime Interface Emulator. The emulator allows users to test images locally and verify that they will run when deployed to AWS Lambda. It is included in all AWS-provided base images and can be used with arbitrary images as well.
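In practice, local testing with the emulator amounts to running the container and posting an event to its local endpoint. A sketch of that workflow, assuming a hypothetical locally built image tagged `my-lambda-image` (the port mapping and invocation path follow the emulator's documented interface):

```shell
# Run the image locally; AWS base images ship with the emulator built in
docker run -p 9000:8080 my-lambda-image:latest

# In another terminal, invoke the function through the emulator's endpoint
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" \
  -d '{}'
```

The curl response is the handler's return value, letting developers iterate on the image before pushing it to AWS.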