Lambda Layers let you package dependencies, custom runtimes, and shared code separately from your function code. This reduces deployment size, speeds up deployments, and ensures consistency across functions. Here’s how to use them effectively.

Layer Basics

Lambda Function Package = Your Code + Layers

Layer structure:
├── python/           # Python packages (auto-added to PYTHONPATH)
│   └── lib/
│       └── python3.12/
│           └── site-packages/
├── nodejs/           # Node.js modules (auto-added to NODE_PATH)
│   └── node_modules/
├── bin/              # Executables (auto-added to PATH)
└── lib/              # Shared libraries (auto-added to LD_LIBRARY_PATH)
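At runtime, Lambda extracts every attached layer into /opt, so only the top-level directories above get picked up automatically. A mis-rooted zip (e.g. site-packages zipped at the top level) silently fails at import time; it can be caught before publishing with a quick structural check. A minimal sketch (`check_layer_roots` is a hypothetical helper, not an AWS API):

```python
import zipfile

# Top-level directories Lambda treats specially when a layer is
# extracted into /opt (see the structure above)
KNOWN_ROOTS = {"python", "nodejs", "bin", "lib"}

def check_layer_roots(zip_path: str) -> set[str]:
    """Return top-level zip entries that Lambda will NOT place
    on any search path automatically."""
    with zipfile.ZipFile(zip_path) as zf:
        roots = {name.split("/")[0] for name in zf.namelist()}
    return roots - KNOWN_ROOTS

# Usage: fail the build if anything sits outside the known roots
# unexpected = check_layer_roots("python-deps-layer.zip")
# if unexpected:
#     raise SystemExit(f"Unrecognized top-level entries: {unexpected}")
```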

Creating a Python Layer

# Create layer directory structure
mkdir -p layer/python/lib/python3.12/site-packages

# Install dependencies into layer
pip install -t layer/python/lib/python3.12/site-packages \
  requests \
  boto3 \
  aws-lambda-powertools

# Package layer
cd layer
zip -r ../python-deps-layer.zip .
cd ..

# Upload to AWS
aws lambda publish-layer-version \
  --layer-name python-deps \
  --description "Common Python dependencies" \
  --zip-file fileb://python-deps-layer.zip \
  --compatible-runtimes python3.11 python3.12 \
  --compatible-architectures x86_64 arm64

Layer with Terraform

# Build layer from requirements.txt
resource "null_resource" "python_layer" {
  triggers = {
    requirements = filesha256("${path.module}/requirements-layer.txt")
  }

  provisioner "local-exec" {
    command = <<-EOF
      rm -rf ${path.module}/layer
      mkdir -p ${path.module}/layer/python/lib/python3.12/site-packages
      pip install -t ${path.module}/layer/python/lib/python3.12/site-packages \
        -r ${path.module}/requirements-layer.txt --quiet
      # zipping is left to data.archive_file below
    EOF
  }
}

data "archive_file" "layer" {
  type        = "zip"
  source_dir  = "${path.module}/layer"
  output_path = "${path.module}/python-layer.zip"
  depends_on  = [null_resource.python_layer]
}

resource "aws_lambda_layer_version" "deps" {
  filename            = data.archive_file.layer.output_path
  layer_name          = "python-common-deps"
  description         = "Common Python dependencies: requests, powertools"
  source_code_hash    = data.archive_file.layer.output_base64sha256

  compatible_runtimes = ["python3.11", "python3.12"]
  compatible_architectures = ["x86_64", "arm64"]
}

# Use layer in function
resource "aws_lambda_function" "api" {
  function_name = "api"
  runtime       = "python3.12"
  handler       = "handler.handler"
  role          = aws_iam_role.lambda.arn

  filename         = "function.zip"
  source_code_hash = filebase64sha256("function.zip")

  layers = [
    aws_lambda_layer_version.deps.arn,
    aws_lambda_layer_version.utils.arn,  # a second layer defined elsewhere; a function can attach up to 5
  ]
}

Node.js Layer

# Create layer structure
mkdir -p layer/nodejs
cd layer/nodejs

# Create package.json
cat > package.json << 'EOF'
{
  "name": "lambda-layer",
  "version": "1.0.0",
  "dependencies": {
    "@aws-sdk/client-dynamodb": "^3.0.0",
    "@aws-sdk/lib-dynamodb": "^3.0.0",
    "axios": "^1.0.0",
    "lodash": "^4.17.21"
  }
}
EOF

npm install --omit=dev   # newer npm spelling of --production

# Package
cd ..
zip -r nodejs-layer.zip nodejs/

# Deploy
aws lambda publish-layer-version \
  --layer-name nodejs-deps \
  --zip-file fileb://nodejs-layer.zip \
  --compatible-runtimes nodejs18.x nodejs20.x

Shared Utilities Layer

# layer/python/utils/__init__.py
"""Shared utilities available to all Lambda functions."""

import json
import logging
from functools import wraps
from typing import Any, Callable

logger = logging.getLogger()

def json_response(status_code: int, body: Any) -> dict:
    """Standard API Gateway response format."""
    return {
        'statusCode': status_code,
        'headers': {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Origin': '*',
        },
        'body': json.dumps(body) if not isinstance(body, str) else body,
    }


def handle_errors(func: Callable) -> Callable:
    """Decorator for consistent error handling."""
    @wraps(func)
    def wrapper(event, context):
        try:
            return func(event, context)
        except ValueError as e:
            logger.warning(f"Validation error: {e}")
            return json_response(400, {'error': str(e)})
        except PermissionError as e:
            logger.warning(f"Permission denied: {e}")
            return json_response(403, {'error': 'Forbidden'})
        except Exception as e:
            logger.exception(f"Unhandled error: {e}")
            return json_response(500, {'error': 'Internal server error'})
    return wrapper


def parse_body(event: dict) -> dict:
    """Parse JSON body from API Gateway event."""
    body = event.get('body', '{}')
    if isinstance(body, str):
        return json.loads(body)
    return body

# Your function code (much cleaner!)
from utils import json_response, handle_errors, parse_body

@handle_errors
def handler(event, context):
    body = parse_body(event)
    name = body.get('name')
    
    if not name:
        raise ValueError("Name is required")
    
    return json_response(200, {'message': f'Hello, {name}!'})
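Because both the handler and the layer utilities are plain Python, they can be exercised locally with synthetic API Gateway events, no AWS plumbing required. A sketch (the helpers above are reproduced inline, slightly trimmed, so the snippet runs standalone without the layer on PYTHONPATH):

```python
import json
from functools import wraps

# Inlined copies of the layer helpers above
def json_response(status_code, body):
    return {"statusCode": status_code,
            "body": json.dumps(body) if not isinstance(body, str) else body}

def handle_errors(func):
    @wraps(func)
    def wrapper(event, context):
        try:
            return func(event, context)
        except ValueError as e:
            return json_response(400, {"error": str(e)})
    return wrapper

def parse_body(event):
    body = event.get("body", "{}")
    return json.loads(body) if isinstance(body, str) else body

@handle_errors
def handler(event, context):
    name = parse_body(event).get("name")
    if not name:
        raise ValueError("Name is required")
    return json_response(200, {"message": f"Hello, {name}!"})

# Synthetic API Gateway events exercise both the happy and error paths
ok = handler({"body": '{"name": "Ada"}'}, None)   # 200
bad = handler({"body": "{}"}, None)               # 400
```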

Binary Dependencies Layer

# Dockerfile for building native binaries
FROM amazonlinux:2023

RUN yum install -y \
    python3.12 \
    python3.12-pip \
    gcc \
    python3.12-devel \
    libffi-devel \
    zip

WORKDIR /build

COPY requirements.txt .

RUN pip3.12 install -t /opt/python/lib/python3.12/site-packages \
    -r requirements.txt

# Package
RUN cd /opt && zip -r /build/layer.zip .

# Build and extract layer
docker build -t lambda-layer-builder .
docker run --rm -v $(pwd):/output lambda-layer-builder \
  cp /build/layer.zip /output/

# Deploy
aws lambda publish-layer-version \
  --layer-name native-deps \
  --zip-file fileb://layer.zip \
  --compatible-runtimes python3.12 \
  --compatible-architectures x86_64
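A common failure mode here is publishing x86_64 wheels to an arm64 function; the mismatch only surfaces as an import error at invoke time. Compiled extensions can be sanity-checked before packaging by reading each shared object's ELF header. A minimal sketch (`elf_arch` and `check_site_packages` are hypothetical helpers; the e_machine values cover the two Lambda architectures):

```python
import pathlib
import struct

# ELF e_machine values for the two Lambda architectures
E_MACHINE = {0x3E: "x86_64", 0xB7: "arm64"}

def elf_arch(path: pathlib.Path) -> str:
    """Read the ELF e_machine field (little-endian u16 at offset 18)."""
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:4] != b"\x7fELF":
        return "not-elf"
    (machine,) = struct.unpack_from("<H", header, 18)
    return E_MACHINE.get(machine, f"unknown(0x{machine:x})")

def check_site_packages(root: str, expected: str) -> list[str]:
    """Return shared objects built for the wrong architecture."""
    return [str(p) for p in pathlib.Path(root).rglob("*.so*")
            if elf_arch(p) not in (expected, "not-elf")]

# Usage before zipping:
# wrong = check_site_packages("layer/python", "arm64")
# if wrong:
#     raise SystemExit(f"Wrong-architecture binaries: {wrong}")
```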

Layer Versioning Strategy

# Version layers with semantic versioning in description
resource "aws_lambda_layer_version" "deps_v2" {
  layer_name  = "python-deps"
  description = "v2.1.0 - Added powertools, updated boto3"
  
  filename         = "layer-v2.1.0.zip"
  source_code_hash = filebase64sha256("layer-v2.1.0.zip")

  compatible_runtimes = ["python3.12"]
}

# Keep the previous release published for rollback (layer versions are immutable)
resource "aws_lambda_layer_version" "deps_v1" {
  layer_name  = "python-deps"
  description = "v1.0.0 - Initial release"
  
  filename         = "layer-v1.0.0.zip"
  source_code_hash = filebase64sha256("layer-v1.0.0.zip")

  compatible_runtimes = ["python3.12"]
}

# Function uses specific version
resource "aws_lambda_function" "api" {
  # ...
  layers = [aws_lambda_layer_version.deps_v2.arn]  # Pin to v2
}

Layer Size Limits

Single layer: 50 MB zipped when uploaded directly (publish from S3 for larger zips), up to 250 MB unzipped
Total layers: 5 per function
Combined size: function package + all layers ≤ 250 MB unzipped

Tips for staying under limits:
- Use __pycache__ cleanup: find . -type d -name __pycache__ -exec rm -rf {} +
- Remove tests: find . -type d -name tests -exec rm -rf {} +
- Remove docs: find . -name "*.md" -delete
- Skip bytecode and prefer prebuilt wheels: pip install --no-compile --prefer-binary
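The 250 MB limit applies to the unzipped contents, which Terraform or CloudFormation won't surface until deploy time. The uncompressed footprint can be read straight from the zip's own metadata beforehand. A small sketch (`unzipped_size_mb` is a hypothetical helper):

```python
import zipfile

def unzipped_size_mb(zip_path: str) -> float:
    """Sum of uncompressed file sizes recorded in the zip's central directory."""
    with zipfile.ZipFile(zip_path) as zf:
        return sum(info.file_size for info in zf.infolist()) / (1024 * 1024)

# Usage: fail fast before publishing
# if unzipped_size_mb("python-deps-layer.zip") > 250:
#     raise SystemExit("Layer exceeds the 250 MB unzipped limit")
```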

Using AWS-Provided Layers

# AWS Lambda Powertools
resource "aws_lambda_function" "api" {
  # ...
  
  layers = [
    # AWS-managed Powertools layer (pinned to version 51; bump the suffix to upgrade)
    "arn:aws:lambda:${var.region}:017000801446:layer:AWSLambdaPowertoolsPythonV2:51",
    
    # Parameters and Secrets extension
    "arn:aws:lambda:${var.region}:177933569100:layer:AWS-Parameters-and-Secrets-Lambda-Extension:11",
  ]
}

# Using Powertools from layer
from aws_lambda_powertools import Logger, Tracer, Metrics
from aws_lambda_powertools.event_handler import APIGatewayRestResolver

logger = Logger()
tracer = Tracer()
metrics = Metrics()
app = APIGatewayRestResolver()

@app.get("/users/<user_id>")
@tracer.capture_method
def get_user(user_id: str):
    logger.info(f"Fetching user {user_id}")
    return {"user_id": user_id}

@logger.inject_lambda_context
@tracer.capture_lambda_handler
@metrics.log_metrics
def handler(event, context):
    return app.resolve(event, context)

CI/CD Pipeline for Layers

# .github/workflows/deploy-layer.yml
name: Deploy Lambda Layer

on:
  push:
    paths:
      - 'layers/**'
      - '.github/workflows/deploy-layer.yml'

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.12'
      
      - name: Build layer
        run: |
          mkdir -p layer/python/lib/python3.12/site-packages
          pip install -t layer/python/lib/python3.12/site-packages \
            -r layers/requirements.txt
          cd layer && zip -r ../layer.zip .
      
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          # long-lived keys shown for brevity; OIDC role assumption is preferable
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      
      - name: Publish layer
        run: |
          VERSION=$(aws lambda publish-layer-version \
            --layer-name python-deps \
            --zip-file fileb://layer.zip \
            --compatible-runtimes python3.12 \
            --query 'Version' --output text)
          echo "Published layer version: $VERSION"

Key Takeaways

  1. Layers reduce deployment size — deploy your code in KB, not MB
  2. Structure matters — use python/ for Python, nodejs/node_modules for Node.js
  3. Version your layers — include version in description, pin functions to specific versions
  4. 5 layer limit — combine related dependencies into single layers
  5. Build for target architecture — use Docker with Amazon Linux for native dependencies
  6. AWS-provided layers — Powertools, Extensions, Pandas — use them before building your own
  7. Share across accounts — add layer permissions for cross-account use

“Layers are like shared libraries for serverless. Get your common dependencies right once, then forget about them.”