December 2, 2025
In our previous issue, Anthropic dropped Opus 4.5, AWS unloaded a tsunami of pre:Invent announcements, and CloudFront quietly rolled out a disruptive pricing change. This week, we've got the latest announcements from re:Invent, an introduction to O11ywashing, and Akamai buying their way into the serverless space. Plus, we've got lots of amazing content from the serverless and cloud community!
There is a lot to get to today, so I'm going to try and keep this brief. AWS re:Invent is in full swing and there were plenty of bangers announced over the last few days. Here are some of my favorites.
AWS Lambda Managed Instances is a big one. Running Lambda functions at scale can get expensive quickly, and with Lambda's single-concurrency model, you have to scale out a huge fleet of execution environments to process concurrent requests. With LMIs, your instances handle multiple concurrent requests and run continuously, virtually eliminating cold starts and maximizing your instance's resources. You do have to run at least 12 vCPUs, so smaller instance types will spin up six EC2 instances to meet that minimum. When I get some time I'll do some math to see where the break-even point is (a rough sketch is below), but it's definitely worth exploring. The official blog post gives you a good walkthrough.
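If you want to start running your own numbers, the break-even question is really just arithmetic: flat hourly instance cost versus per-request plus per-GB-second on-demand cost. Here's a rough TypeScript sketch of how I'd frame it. Every price and parameter below is a placeholder to fill in from the AWS pricing pages, not a verified figure.

```typescript
// Back-of-the-napkin break-even sketch: at what sustained request rate does an
// always-on managed instance beat paying per invocation? All prices here are
// placeholders to fill in from the AWS pricing pages.
interface Assumptions {
  requestsPerSecond: number;       // sustained traffic
  avgDurationMs: number;           // average invocation duration
  memoryGb: number;                // allocated memory per invocation
  pricePerMillionRequests: number; // USD, placeholder
  pricePerGbSecond: number;        // USD, placeholder
  instancePricePerHour: number;    // USD, placeholder
}

function hourlyCosts(a: Assumptions) {
  const requestsPerHour = a.requestsPerSecond * 3600;
  // Classic on-demand Lambda: pay per request plus per GB-second of compute.
  const onDemand =
    (requestsPerHour / 1_000_000) * a.pricePerMillionRequests +
    requestsPerHour * (a.avgDurationMs / 1000) * a.memoryGb * a.pricePerGbSecond;
  // Managed instance: a flat hourly rate, regardless of how busy it is.
  const managedInstance = a.instancePricePerHour;
  return { onDemand, managedInstance, instanceWins: managedInstance <= onDemand };
}

// Example with made-up numbers -- swap in real prices before drawing conclusions.
console.log(
  hourlyCosts({
    requestsPerSecond: 50,
    avgDurationMs: 120,
    memoryGb: 1,
    pricePerMillionRequests: 0.2,
    pricePerGbSecond: 0.0000167,
    instancePricePerHour: 0.5,
  })
);
```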
AWS also announced durable functions for multi-step applications and AI workflows. This is a simpler (albeit much less capable) way to build workflows using Lambda functions instead of Step Functions. The way the abstraction is implemented isn't to my taste, as it basically locks you into Lambda, but for many workflows it could make sense. Check out the official blog post for more details.
Lambda also added support for Node.js 24 with the ability to use some of its cool new language features, and if you're curious about what's coming next, take a look at the recently updated Lambda Public Roadmap.
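To give you a flavor of those language features, here's a minimal sketch of a handler on the new Node.js 24 runtime using a couple of the additions that ship with its V8 version, RegExp.escape and Error.isError. The event shape is hypothetical, and the casts are only there because TypeScript's built-in typings may lag the runtime.

```typescript
// Minimal nodejs24.x handler sketch showing RegExp.escape and Error.isError.
// The event shape is a made-up example, not an AWS-defined payload.
export const handler = async (event: { term?: string }) => {
  try {
    // RegExp.escape safely escapes user input before building a pattern,
    // so characters like "." or "*" are matched literally.
    const escape = (RegExp as any).escape as (s: string) => string; // TS lib types may lag Node 24
    const pattern = new RegExp(escape(event.term ?? ""), "i");
    const matches = ["serverless", "server", "lambda"].filter((w) => pattern.test(w));
    return { statusCode: 200, body: JSON.stringify({ matches }) };
  } catch (err) {
    // Error.isError is a more reliable check than `instanceof Error`
    // across realms and Error subclasses.
    const message = (Error as any).isError?.(err) ? (err as Error).message : String(err);
    return { statusCode: 500, body: JSON.stringify({ message }) };
  }
};
```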
Another interesting serverless release was Amazon API Gateway's MCP proxy support. I haven't dug into this yet, but this seems a bit too blackboxy for me. Not sure how much control you'll have.
And then there was AI. They announced the Amazon Nova 2 foundation models, which include Nova 2 Lite, their fast, cost-effective reasoning model. I use Nova Lite for a few different tasks and I've been extremely happy with it. Nova 2 Lite is the same price as its predecessor, but is supposed to give you even better performance. They also introduced Amazon Nova 2 Omni (in preview) as an all-in-one model for multimodal reasoning and image generation, and they announced Amazon Nova 2 Sonic for real-time conversational AI. Plus, the pricing actually seems reasonable. Read more about it here.
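If you haven't tried the Nova models yet, calling them through the Bedrock Converse API is about as simple as it gets. Here's a quick sketch using the AWS SDK for JavaScript v3; note that the model ID below is a placeholder I haven't verified, so check the Bedrock console for the actual Nova 2 Lite identifier in your region.

```typescript
// Quick sketch of calling a Nova model via the Bedrock Converse API.
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });

export async function summarize(text: string): Promise<string> {
  const response = await client.send(
    new ConverseCommand({
      modelId: "amazon.nova-2-lite-v1:0", // placeholder -- verify the real model ID
      messages: [
        { role: "user", content: [{ text: `Summarize in one sentence:\n${text}` }] },
      ],
      inferenceConfig: { maxTokens: 256, temperature: 0.2 },
    })
  );
  return response.output?.message?.content?.[0]?.text ?? "";
}
```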
During Matt Garman's keynote this morning, he announced Amazon Nova Forge, a new service that lets you blend your own data in at certain Nova training checkpoints. You can now train a custom model that gets its general intelligence from Amazon's curated datasets, but because your data is added during the training process, you get much better results than you would from fine-tuning. It's a brilliant idea and could be the future of custom model generation.
And they weren't done. You can now build reliable AI agents for UI workflow automation with Amazon Nova Act, now generally available, plus the Mistral Large 3 and Ministral 3 family is now available first on Amazon Bedrock, and Amazon Bedrock adds 18 fully managed open weight models.
Amazon Bedrock AgentCore now includes Policy (preview), Evaluations (preview) and more, plus the AgentCore Runtime now supports bi-directional streaming, and multimodal retrieval for Bedrock Knowledge Bases is now generally available.
Amazon S3 also dropped some amazing updates. Amazon S3 Tables now offer the Intelligent-Tiering storage class that monitors access patterns and can reduce your costs by up to 80%. And Amazon S3 Tables now support automatic replication of Apache Iceberg tables (if you're into that kinda thing).
Amazon S3 Vectors is now generally available with increased scale and performance. And Amazon S3 increased the maximum object size to 50 TB!
Okay, this is not as brief as I had hoped. There's a lot more to go.
AWS simplified IAM policy creation with IAM Policy Autopilot, a new open source MCP server for builders, and they also announced a preview of the AWS MCP Server, essentially allowing agents to call over 15,000 AWS APIs. And if you think that means agents are coming for your job, you'd better watch out for the AWS DevOps Agent (preview) and AWS Transform custom, which claims it will crush your tech debt with AI-powered code modernization. They might still need humans to find all those pesky unused NAT Gateways, at least... oh wait, never mind.
They finally introduced Database Savings Plans for AWS Databases, eliminated local storage provisioning for Apache Spark workloads on Amazon EMR Serverless, enabled automatic quota management, and added Apache Iceberg on Amazon Redshift.
Amazon OpenSearch Service added GPU-accelerated and auto-optimized vector indexes and introduced Agentic Search. Plus, Amazon GuardDuty extended threat detection to Amazon EC2 and Amazon ECS.
Finally, NVIDIA and AWS are now all buddy-buddy, and if you're good with AI and want to win $50,000, think about joining the AWS AI League 2026 Championship.
In non-AWS news, Akamai Technologies announced the acquisition of Function-as-a-Service company Fermyon. Good for them!
AI is flooding systems with disposable code, while durable code continues to power critical paths that can’t afford surprises. Honeycomb provides sub-second queries, unified telemetry, and AI-ready tooling so teams can spot issues in durable systems before transient changes cause lasting problems. Learn more here. Sponsored
Here are several other announcements that caught my attention:
Okay, if you've gotten this far, congratulations! I think that's enough for this week. Better get back to the hallway track.
Take care,
Jeremy
I hope you enjoyed this newsletter. We're always looking for ideas and feedback to make it better and more inclusive, so please feel free to reach out to me via Bluesky, LinkedIn, X, or email.
Stay up to date on using serverless to build modern applications in the cloud. Get insights from experts, product releases, industry happenings, tutorials and much more, every week!
Check out all of our amazing sponsors and find out how you can help spread the #serverless word by sponsoring an issue.
Jeremy is the Director of Research at CloudZero, founder of Ampt, and an AWS Serverless Hero who has a soft spot for helping people solve problems using the cloud. You can find him ranting about serverless and cloud on Bluesky, LinkedIn, X, the Serverless Chats podcast, and at conferences around the world.
Off-by-none is committed to celebrating the diversity of the serverless community and recognizing the people who make it awesome. If you know of someone doing amazing things with serverless, please nominate them to be a Serverless Star ⭐️!