Jump to
- Useful Libs and Tools
- Best Practices Guides
- Synchronous Invokes, Asynchronous Invokes, Poll-Based Invokes
- MLTA, Debugging, Error Handling
- Code storage for uploaded Lambda functions (`CodeStorageExceededException`)
- Lambda Scaling and Throughput
- Lambda Performance Optimisation
- Lambda Cost Optimisation
- Lambda Function URLs
- Lambda Response Streaming
- Lambda Container Images
- Lambda Layers
- Lambda Extensions
- Design Patterns
- Gotchas
- Lambda base container images
- Lambda execution environment
- AWS Lambda Extensions API specification
- AWS Lambda Logs API specification
- AWS Lambda Runtime API specification
- aws-lambda-extensions - AWS Lambda Extensions sample projects
- Lambda Container Images
- serverless-java-container - Serverless Java container
- gallery.ecr.aws/lambda - AWS Lambda base images
- awslambdaric - AWS Lambda Python Runtime Interface Client (RIC)
- aws-lambda-runtime-interface-emulator - AWS Lambda Runtime Interface Emulator (RIE)
- Example: https://docs.aws.amazon.com/lambda/latest/dg/python-image.html
- Lambda Powertools
- cdk-aws-lambda-powertools-layer - AWS Lambda powertools layer
- aws-lambda-powertools-dotnet - AWS Lambda Powertools for .NET
- aws-lambda-powertools-java - AWS Lambda Powertools for Java
- aws-lambda-powertools-python - AWS Lambda Powertools for Python
- https://awslabs.github.io/aws-lambda-powertools-python/core/logger/
- https://awslabs.github.io/aws-lambda-powertools-python/core/metrics/
- https://awslabs.github.io/aws-lambda-powertools-python/core/tracer/
- https://awslabs.github.io/aws-lambda-powertools-python/utilities/middleware_factory/
- https://aws.amazon.com/blogs/compute/building-well-architected-serverless-applications-understanding-application-health-part-2/
- aws-lambda-powertools-typescript - AWS Lambda Powertools for TypeScript
- Tuning
- alexcasalboni/aws-lambda-power-tuning - AWS Lambda Power Tuning is a state machine powered by AWS Step Functions that helps you optimize your Lambda functions for cost and/or performance in a data-driven way.
- aws-lambda-builders - Lambda Builders is a Python library to compile, build and package AWS Lambda functions for several runtimes & frameworks. Lambda Builders is the brains behind the `sam build` command.
- aws-lambda-dotnet - Lambda Annotations Framework for .NET
- aws-lambda-developer-guide - AWS Lambda Developer Guide with examples
- authorization-lambda-at-edge - Authorization Lambda@Edge (Node.js)
- cargo-lambda - Cargo Lambda provides tools and workflows to help you get started building Rust functions for AWS Lambda from scratch.
- lambci/docker-lambda - A sandboxed local environment that replicates the live AWS Lambda environment
- Using SAM CLI with the CDK to test a Lambda function locally
- Understanding the Different Ways to Invoke Lambda Functions, AWS, 2019-07-02
- How to trigger a Lambda function at specific time in AWS
  - E.g. SQS (message with a `due-date`) -> EventBridge Pipes -> Step Function (Wait until the scheduled `due-date`) -> Lambda (see the sketch below)
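A minimal sketch of the producer side of that pattern, assuming the EventBridge Pipe and the Step Functions state machine (e.g. a Wait state with `TimestampPath` pointing at the due date) are wired up separately; the queue URL and field names are illustrative:

```python
# Enqueue a message that carries its own due date. An EventBridge Pipe can feed
# it into a Step Functions execution whose Wait state pauses until "due_date"
# (ISO 8601) before invoking the target Lambda. The queue URL is a placeholder.
import json
from datetime import datetime, timedelta, timezone

import boto3

sqs = boto3.client("sqs")

due_date = (datetime.now(timezone.utc) + timedelta(hours=2)).strftime("%Y-%m-%dT%H:%M:%SZ")
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/scheduled-jobs",
    MessageBody=json.dumps({"due_date": due_date, "action": "send-reminder"}),
)
```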
- Lambda Performance Insights
- Lambda Powertools (Python, Java, TypeScript, .NET) - used as a lib or as a layer (see the sketch after this list)
- Distributed tracing
- Structured logging
- Async metrics
- Event routing (Python only)
- Streaming (Python only)
- Lambda Telemetry API (Lambda Extensions, New Relic, Sumo Logic) - deployed as layers
- CloudWatch
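A minimal sketch of the "lib" flavour with Powertools for Python (the `service`/namespace names are illustrative; the package can also be pulled in via the public Powertools Lambda layer):

```python
# Structured logging, X-Ray tracing and async (EMF) metrics in one handler.
from aws_lambda_powertools import Logger, Metrics, Tracer
from aws_lambda_powertools.metrics import MetricUnit

logger = Logger(service="orders")
tracer = Tracer(service="orders")
metrics = Metrics(namespace="MyApp", service="orders")

@logger.inject_lambda_context
@tracer.capture_lambda_handler
@metrics.log_metrics(capture_cold_start_metric=True)
def handler(event, context):
    logger.info("Processing order", extra={"order_id": event.get("order_id")})
    metrics.add_metric(name="OrdersProcessed", unit=MetricUnit.Count, value=1)
    return {"status": "ok"}
```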
- Implementing AWS Lambda error handling patterns, AWS, 2023-07-06
- Implementing error handling for AWS Lambda asynchronous invocations, AWS, 2023-04-25
- Lambda Destinations + X-Ray traces
  - With Destinations, you can send asynchronous function execution results to a destination resource without writing code. For each execution status (i.e. Success and Failure), you can choose one destination from four options (see the sketch after this list):
    - another Lambda function,
    - an SNS topic,
    - an SQS standard queue, or
    - EventBridge.
- Dead-letter queues
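A minimal sketch of configuring Destinations (and the related async retry settings) with boto3; the function name and destination ARNs are placeholders:

```python
import boto3

lambda_client = boto3.client("lambda")

lambda_client.put_function_event_invoke_config(
    FunctionName="myTestFunction",
    MaximumRetryAttempts=2,           # async retries before the event goes to OnFailure
    MaximumEventAgeInSeconds=3600,    # discard events older than 1 hour
    DestinationConfig={
        "OnSuccess": {"Destination": "arn:aws:sqs:us-east-1:123456789012:success-queue"},
        "OnFailure": {"Destination": "arn:aws:sns:us-east-1:123456789012:failure-topic"},
    },
)
```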
The Lambda service stores your function code in an internal S3 bucket that's private to your account. Each AWS account is allocated 75 GB of storage in each Region (and this can be increased up to terabytes). Code storage includes the total storage used by both Lambda functions and layers. If you reach the quota, you receive a `CodeStorageExceededException` when you attempt to deploy new functions.
See Lambda quotas.
- From AWS Lambda console > Dashboard
- From AWS CLI:
  ```sh
  aws lambda list-versions-by-function --function-name myTestFunction
  aws lambda get-layer-version --layer-name TestLayer --version-number 2
  ```
  This returns each published version of the function/layer together with the `$LATEST` version. The `CodeSize` attribute shows the total number of bytes used by code storage of this function/layer.
See Monitoring Lambda code storage.
It's best practice to manage the available storage space: clean up old versions of functions and remove unused code.
- For the Serverless Framework, you may use the serverless-prune-plugin.
See Best practices for managing code storage.
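A minimal cleanup sketch, assuming no aliases still point at the versions being removed (the function name and the keep-last-3 policy are illustrative):

```python
import boto3

lambda_client = boto3.client("lambda")
FUNCTION_NAME = "myTestFunction"
KEEP = 3  # number of most recent published versions to keep

paginator = lambda_client.get_paginator("list_versions_by_function")
versions = [
    v
    for page in paginator.paginate(FunctionName=FUNCTION_NAME)
    for v in page["Versions"]
    if v["Version"] != "$LATEST"
]
versions.sort(key=lambda v: int(v["Version"]))

for v in versions[:-KEEP]:
    print(f"Deleting version {v['Version']} ({v['CodeSize']} bytes)")
    lambda_client.delete_function(FunctionName=FUNCTION_NAME, Qualifier=v["Version"])
```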
There are two scaling quotas to consider with concurrency: account concurrency quota and burst concurrency quota.
- The account concurrency is the maximum concurrency in a particular Region. This is shared across all functions in an account. The default Regional concurrency quota starts at 1,000, which you can increase with a service ticket.
- The burst concurrency quota provides an initial burst of traffic for each function, between 500 and 3000 per minute, depending on the Region. After this initial burst, functions can scale by another 500 concurrent invocations per minute for all Regions. If you reach the maximum number of concurrent requests, further requests are throttled.
The function initialization process can introduce latency for your applications. You can reduce this latency by configuring Provisioned Concurrency for a function version or alias. This prepares runtime environments in advance, running the function initialization process, so the function is ready to invoke when needed.
This is primarily useful for synchronous requests to ensure you have enough concurrency before an expected traffic spike. You can still burst above this using standard concurrency.
You can use Application Auto Scaling to adjust Provisioned Concurrency automatically based on Lambda's utilization metric.
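A minimal sketch of both steps with boto3 (function name, alias and capacity numbers are illustrative):

```python
import boto3

lambda_client = boto3.client("lambda")
autoscaling = boto3.client("application-autoscaling")

# Keep 50 execution environments initialised for the "live" alias.
lambda_client.put_provisioned_concurrency_config(
    FunctionName="myTestFunction",
    Qualifier="live",
    ProvisionedConcurrentExecutions=50,
)

# Let Application Auto Scaling track ~70% utilisation of the provisioned pool.
resource_id = "function:myTestFunction:live"
autoscaling.register_scalable_target(
    ServiceNamespace="lambda",
    ResourceId=resource_id,
    ScalableDimension="lambda:function:ProvisionedConcurrency",
    MinCapacity=10,
    MaxCapacity=100,
)
autoscaling.put_scaling_policy(
    ServiceNamespace="lambda",
    ResourceId=resource_id,
    ScalableDimension="lambda:function:ProvisionedConcurrency",
    PolicyName="pc-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 0.7,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "LambdaProvisionedConcurrencyUtilization"
        },
    },
)
```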
- Each runtime environment processes a single request at a time. While a single runtime environment is processing a request, it cannot process other requests.
- The number of runtime environments determines the concurrency. This is the sum of all concurrent requests for currently running functions at a particular point in time.
- The number of transactions Lambda can process per second is the sum of all invokes for that period.
- Reducing a function's invocation duration can increase the transactions per second that a function can process.
To estimate concurrent requests from the number of requests per unit of time (e.g. seconds) and their average duration, use the formula:
RequestsPerSecond x AvgDurationInSeconds = concurrent requests
For example, if a Lambda function takes an average 500 ms to run, at 100 requests per second, the number of concurrent requests is 50:
100 requests/second x 0.5 sec = 50 concurrent requests
See this blog post for more examples.
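The same arithmetic as a tiny helper:

```python
def concurrent_requests(requests_per_second: float, avg_duration_seconds: float) -> float:
    """Estimate concurrency: requests per second multiplied by average duration in seconds."""
    return requests_per_second * avg_duration_seconds

assert concurrent_requests(100, 0.5) == 50
```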
- For synchronous invocations, Lambda returns a throttling error (`429`) to the caller, which must retry the request (see the retry sketch below).
- With asynchronous and event source mapping invokes, Lambda automatically retries the requests.
See also
- Understanding AWS Lambda’s invoke throttling limits, AWS, 2023-07-07
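A minimal retry sketch for the synchronous case (in practice the SDK's built-in retry modes can do this for you; the function name is illustrative):

```python
import json
import time

import boto3

lambda_client = boto3.client("lambda")

def invoke_with_retry(payload: dict, attempts: int = 5):
    for attempt in range(attempts):
        try:
            resp = lambda_client.invoke(
                FunctionName="myTestFunction",
                InvocationType="RequestResponse",
                Payload=json.dumps(payload),
            )
            return json.load(resp["Payload"])
        except lambda_client.exceptions.TooManyRequestsException:
            # Throttled (HTTP 429): back off exponentially and try again.
            time.sleep((2 ** attempt) * 0.1)
    raise RuntimeError("Invocation still throttled after retries")
```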
There are CloudWatch metrics available to monitor your account and function concurrency to ensure that your applications can scale as expected. Monitor function Invocations and Duration to understand throughput. Throttles show throttled invocations.
- ConcurrentExecutions tracks the total number of runtime environments that are processing events. Ensure this doesn't reach your account concurrency to avoid account throttling. Use the metric for individual functions to see which are using account concurrency, and also ensure reserved concurrency is not too high. For example, a function may have a reserved concurrency of 2000, but is only using 10.
- UnreservedConcurrentExecutions shows the number of function invocations without reserved concurrency. This is your available account concurrency buffer.
- Use ProvisionedConcurrencyUtilization to ensure you are not paying for Provisioned Concurrency that you are not using. The metric shows the percentage of allocated Provisioned Concurrency in use.
- ProvisionedConcurrencySpilloverInvocations shows function invocations using standard concurrency, above the configured Provisioned Concurrency value. This may show that you need to increase Provisioned Concurrency.
See Understanding AWS Lambda scaling and throughput.
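A minimal sketch of checking the account-level peak against the account limit (the metric period and window are illustrative):

```python
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/Lambda",
    MetricName="ConcurrentExecutions",
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=60,
    Statistics=["Maximum"],
)
peak = max((p["Maximum"] for p in resp["Datapoints"]), default=0)

limit = boto3.client("lambda").get_account_settings()["AccountLimit"]["ConcurrentExecutions"]
print(f"Peak concurrency over the last hour: {peak} (account limit: {limit})")
```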
Some good reads:
- Optimizing AWS Lambda extensions in C# and Rust, AWS, 13 Apr 2023
- Understanding AWS Lambda scaling and throughput, AWS, 18 Jul 2022
- Optimizing Node.js dependencies in AWS Lambda, AWS, 13 JUL 2022
- Optimizing AWS Lambda function performance for Java, AWS, 31 MAR 2022
- How to Make Your Lambda Functions Run Faster (and Cheaper) by webiny on 2020-11-25
- Shave 99.93% off your Lambda bill with this one weird trick by Michael Hart on 2019-12-10
  - The init stage has the same performance as a 1792 MB Lambda, even if we're only running a 128 MB one (a sketch of front-loading work into the init phase follows this list).
  - Technically you can do up to 10 seconds of work before it starts getting included in the billed duration.
  - Also, you'll always have to pay something for the handler execution - the minimum billed execution time was 100 ms at the time of writing (Lambda has since moved to 1 ms billing granularity).
- Using AWS Lambda with AWS X-Ray - AWS X-Ray can provide a sampled, cross-component breakdown of where initialisation, execution and integration time is being spent.
See also Lambda - Performance optimization.
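A minimal sketch of the front-loading idea above, assuming the expensive work can be done without the invocation event (`heavy_precompute` is a hypothetical helper):

```python
import json

def heavy_precompute():
    # e.g. parse large config files, compile regexes, build clients, warm caches
    return {"warmed": True}

# Module-scope code runs during the init phase, which gets full CPU
# regardless of the configured memory size.
PRECOMPUTED = heavy_precompute()

def handler(event, context):
    # Keep the per-invocation (billed) work as small as possible.
    return {"statusCode": 200, "body": json.dumps(PRECOMPUTED)}
```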
- Right-sizing memory allocation
- Setting a realistic function timeout
- Using Graviton
- Filtering event sources for AWS Lambda functions
- Avoiding recursive invocation with Amazon S3 and AWS Lambda
Useful blog posts
The Lambda Function URLs feature was introduced in April 2022. A function URL is a dedicated HTTP(S) endpoint for the Lambda function (`https://<url-id>.lambda-url.<region>.on.aws`).
Lambda function URLs use resource-based policies for security and access control. When `AuthType` is `NONE` and the resource-based policy grants public access, any unauthenticated user with your function URL can invoke your function.
Recommendation:
- Use an SCP to deny the following actions (see the sketch below):
  - `lambda:CreateFunctionUrlConfig`
  - `lambda:UpdateFunctionUrlConfig`
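A minimal sketch of creating and attaching such an SCP with boto3 (requires AWS Organizations management access; the target root/OU id is a placeholder, and a common refinement is to scope the Deny with the `lambda:FunctionUrlAuthType` condition key so that only `AuthType=NONE` URLs are blocked):

```python
import json

import boto3

org = boto3.client("organizations")

scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyLambdaFunctionUrls",
            "Effect": "Deny",
            "Action": [
                "lambda:CreateFunctionUrlConfig",
                "lambda:UpdateFunctionUrlConfig",
            ],
            "Resource": "*",
        }
    ],
}

policy = org.create_policy(
    Name="deny-lambda-function-urls",
    Description="Block creation/update of Lambda function URLs",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="r-examplerootid",  # placeholder: root or OU to attach the SCP to
)
```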
Useful blog posts
- Protecting an AWS Lambda function URL with Amazon CloudFront and Lambda@Edge, AWS, 2023-08-23
- Building a Serverless ASP.NET Core Web API with AWS Lambda using Function URLs, 2022-10-02
- To make the API available on a public HTTP(S) endpoint, there are generally two approaches:
  - use it along with Amazon API Gateway, or
  - use Lambda function URLs
- Using response streaming with AWS Lambda Web Adapter to optimize performance, AWS, 2023-08-07
- Introducing AWS Lambda response streaming, AWS, 2023-04-07
- Previewing environments using containerized AWS Lambda functions (using Lamda URL, Lambda extension), AWS, 2023-02-06
- Optimizing Lambda functions packaged as container images (blog post)
- Working with Lambda layers and extensions in container images (blog post)
- Centralizing management of AWS Lambda layers across multiple AWS Accounts, AWS, 2023-09-19
- aws-samples/lambda-layer-management
- With AWS Config, EventBridge Scheduler, SSM Automation, StackSets.
- On demand query example:
  ```sh
  aws configservice select-aggregate-resource-config \
    --expression "SELECT accountId, awsRegion, configuration.functionName, configuration.version WHERE resourceType = 'AWS::Lambda::Function' AND configuration.layers.arn = 'YOUR_LAYER_ARN'" \
    --configuration-aggregator-name 'YOUR_AGGREGATOR_NAME' \
    --query "Results" \
    --output json | \
  jq -r '.[] | fromjson | [.accountId, .awsRegion, .configuration.functionName, .configuration.version] | @csv' > output.csv
  ```
- Enhancing runtime security and governance with the AWS Lambda Runtime API proxy extension, AWS, 2023-10-27
- Running a web server in Lambda with Lambda extension - Lambda Web Adapter - blog post.
- Comparing design approaches for building serverless microservices, AWS, 2024-03-04
The Python 3.6/3.7 runtimes run on Amazon Linux 1, while Python 3.8/3.9 run on Amazon Linux 2.
In general it should be fine to upgrade from Python 3.6 to 3.9, but there are cases where you'll need to make some changes. For example, if your code shells out to a system binary such as `curl`: `curl` is not installed in Amazon Linux 2 by default.
If you see an error like `Invalid base64:`, it could be because, since AWS CLI v2, payloads need to be base64-encoded when invoking a Lambda function.
By default, the AWS CLI version 2 now passes all binary input and binary output parameters as base64-encoded strings. A parameter that requires binary input has its type specified as blob (binary large object) in the documentation.
You will also need to pass `--cli-binary-format raw-in-base64-out`. For example (a boto3 alternative is sketched after the links below):
```sh
aws lambda invoke --function-name testsms \
  --invocation-type Event \
  --cli-binary-format raw-in-base64-out \
  --payload '{"key": "test"}' response.json
```
See also
- aws/aws-cli#4968
- https://stackoverflow.com/questions/60310607/amazon-aws-cli-not-allowing-valid-json-in-payload-parameter
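For comparison, a minimal boto3 sketch of the same invoke - the SDK takes the JSON payload directly, so no base64 flags are involved:

```python
import json

import boto3

lambda_client = boto3.client("lambda")
lambda_client.invoke(
    FunctionName="testsms",
    InvocationType="Event",          # asynchronous invoke, like --invocation-type Event
    Payload=json.dumps({"key": "test"}),
)
```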
See AWS Lambda FAQs
Lambda attempts to impose as few restrictions as possible on normal language and operating system activities, but there are a few activities that are disabled:
- Inbound network connections are blocked by AWS Lambda.
- For outbound connections, only TCP/IP and UDP/IP sockets are supported.
- ptrace (debugging) system calls are blocked.
- TCP port 25 traffic is also blocked as an anti-spam measure.
- By default, calling the `callback()` function in a Node.js Lambda function does not end the function execution. It will continue running until the event loop is empty. A common issue with Node.js Lambda functions continuing to run after the callback is called occurs when you are holding on to open database connections.
  - I solved my problems by setting `context.callbackWaitsForEmptyEventLoop = false`.
- https://stackoverflow.com/questions/53296201/request-time-out-from-aws-lambda/53312129
- https://stackoverflow.com/questions/37791258/lambda-timing-out-after-calling-callback
- https://docs.aws.amazon.com/lambda/latest/dg/nodejs-handler.html
- For non-async handlers, function execution continues until the event loop is empty or the function times out. The response isn't sent to the invoker until all event loop tasks are finished. If the function times out, an error is returned instead. You can configure the runtime to send the response immediately by setting `context.callbackWaitsForEmptyEventLoop` to `false`.