Python Decorators: Use Case Examples

Understanding what a decorator is and knowing where to use one are two different skills. This article covers ten practical decorator patterns that solve real problems in Python codebases: measuring performance, logging function calls, caching expensive computations, validating inputs, limiting call frequency, enforcing permissions, retrying failures, restricting instantiation, flagging deprecated code, and tracing execution flow. Each example includes a complete, copy-ready implementation.

Every decorator in this article follows the same foundational structure: an inner function that receives the target and a wrapper that replaces it, plus, when the decorator takes arguments of its own, an outer function that receives that configuration. Each one uses functools.wraps to preserve the original function's metadata. The differences between them come down to what the wrapper does before, after, or around the original call.
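Stripped to its bare parts, that skeleton looks like this (the names decorator and greet are placeholders):

```python
import functools

def decorator(func):
    """Receive the target function and return its replacement."""
    @functools.wraps(func)  # preserve __name__, __doc__, __module__, etc.
    def wrapper(*args, **kwargs):
        # ... behavior before the call goes here ...
        result = func(*args, **kwargs)
        # ... behavior after the call goes here ...
        return result
    return wrapper

@decorator
def greet(name):
    """Return a greeting."""
    return f"Hello, {name}"

print(greet("Ada"))        # Hello, Ada
print(greet.__name__)      # greet -- metadata preserved by functools.wraps
```

Every example that follows is this skeleton with something useful filled into the before/after slots.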

1. Execution Timing

Measuring how long a function takes to run is the single simplest useful decorator. It wraps the call with a start timestamp, executes the function, computes the elapsed time, and reports it. This is valuable during development for identifying bottlenecks and in production for feeding performance metrics to monitoring systems.

import time
import functools

def timer(func):
    """Log execution time of the decorated function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} completed in {elapsed:.4f}s")
        return result
    return wrapper

@timer
def parse_logfile(path):
    with open(path) as f:
        return [line.strip() for line in f if "ERROR" in line]

errors = parse_logfile("/var/log/app.log")
# parse_logfile completed in 0.0312s

The time.perf_counter() clock is preferred over time.time() because it provides the highest available resolution for measuring short durations and is not affected by system clock adjustments.
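Because the wrapper's closure persists between calls, the same shape can also accumulate statistics across calls. The timer_stats variant below is a hypothetical extension that keeps a running call count and total elapsed time as attributes on the wrapper:

```python
import time
import functools

def timer_stats(func):
    """Time each call and accumulate totals on the wrapper."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        # Attributes on the wrapper persist across calls
        wrapper.calls += 1
        wrapper.total_time += time.perf_counter() - start
        return result
    wrapper.calls = 0
    wrapper.total_time = 0.0
    return wrapper

@timer_stats
def square(n):
    return n * n

for i in range(5):
    square(i)
print(square.calls)                 # 5
print(square.total_time >= 0.0)     # True
```

Exposing the accumulated state as wrapper attributes mirrors what functools.lru_cache does with cache_info().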

2. Call Logging

A logging decorator creates an audit trail of every function invocation. It records the function name, the arguments it received, and the value it returned. This is especially useful in API-heavy applications and data pipelines where tracing the sequence of calls matters.

import logging
import functools

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def log_calls(func):
    """Log function name, arguments, and return value."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        args_repr = [repr(a) for a in args]
        kwargs_repr = [f"{k}={v!r}" for k, v in kwargs.items()]
        signature = ", ".join(args_repr + kwargs_repr)
        logger.info("Calling %s(%s)", func.__name__, signature)
        result = func(*args, **kwargs)
        logger.info("%s returned %r", func.__name__, result)
        return result
    return wrapper

@log_calls
def calculate_tax(income, rate=0.25):
    return income * rate

calculate_tax(85000, rate=0.30)
# INFO:__main__:Calling calculate_tax(85000, rate=0.3)
# INFO:__main__:calculate_tax returned 25500.0

The repr() calls ensure that string arguments are displayed with quotes, making it easy to distinguish "42" from 42 in the log output. For production use, you would swap the basic configuration shown here for structured logging that feeds into your observability stack.

3. Memoization (Result Caching)

Memoization stores the result of a function call keyed by its arguments. When the same arguments appear again, the cached result is returned without re-executing the function. This transforms expensive recursive algorithms from exponential to linear time and eliminates redundant API calls or database queries.

import functools

def memoize(func):
    """Cache function results based on arguments."""
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args in cache:
            return cache[args]
        result = func(*args)
        cache[args] = result
        return result
    wrapper.cache = cache
    wrapper.cache_clear = cache.clear
    return wrapper

@memoize
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(80))          # 23416728348467685
print(len(fibonacci.cache))   # 81 entries cached

Python's standard library provides functools.lru_cache, which does the same thing with additional features like a configurable maximum cache size and automatic eviction of the least recently used entries. For simple cases, the custom version above is instructive. For production, prefer lru_cache:

import functools

@functools.lru_cache(maxsize=256)
def expensive_query(user_id, date_range):
    # Simulates a slow database call
    import time
    time.sleep(0.5)
    return {"user_id": user_id, "records": 142}

# First call: 0.5s. Second call with same args: instant.
result = expensive_query("u_8837", "2026-Q1")

Note

Memoization only works for functions with hashable arguments. Lists, dictionaries, and sets cannot be used as cache keys. If your function accepts mutable arguments, convert them to tuples or frozensets before caching.
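One workaround is to convert mutable arguments into hashable equivalents before they become cache keys. The sketch below uses a hypothetical freeze helper and assumes arguments are either already hashable or plain lists, sets, and dicts (with mutually comparable keys):

```python
import functools

def freeze(value):
    """Recursively convert common mutable containers to hashable ones."""
    if isinstance(value, list):
        return tuple(freeze(v) for v in value)
    if isinstance(value, set):
        return frozenset(freeze(v) for v in value)
    if isinstance(value, dict):
        return tuple(sorted((k, freeze(v)) for k, v in value.items()))
    return value

def memoize_frozen(func):
    """Memoize, keying the cache by frozen (hashable) arguments."""
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        key = tuple(freeze(a) for a in args)
        if key not in cache:
            cache[key] = func(*args)  # call with the original arguments
        return cache[key]
    wrapper.cache = cache
    return wrapper

@memoize_frozen
def total(values):
    return sum(values)

print(total([1, 2, 3]))   # 6
print(total([1, 2, 3]))   # 6 -- served from cache
print(len(total.cache))   # 1
```

Note that freezing only makes the key hashable; if the caller mutates the list afterward and calls again, the new contents produce a new key, which is usually the behavior you want.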

4. Input Validation

A validation decorator checks that a function's arguments meet expected criteria before the function body executes. This catches errors at the boundary of the function call rather than deep inside the implementation, producing clearer error messages and preventing invalid state from propagating.

import functools
import inspect

def validate_types(**expected_types):
    """Validate argument types against expected annotations."""
    def decorator(func):
        sig = inspect.signature(func)
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            for param_name, expected in expected_types.items():
                if param_name in bound.arguments:
                    value = bound.arguments[param_name]
                    if not isinstance(value, expected):
                        raise TypeError(
                            f"{func.__name__}() parameter '{param_name}' "
                            f"expected {expected.__name__}, got {type(value).__name__}"
                        )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@validate_types(name=str, age=int)
def create_user(name, age, email=None):
    return {"name": name, "age": age, "email": email}

create_user("Kandi", 30)               # Works
create_user("Kandi", "thirty")          # TypeError: parameter 'age' expected int, got str

The inspect.signature and bind combination maps positional and keyword arguments to their parameter names regardless of how the caller passes them. This means create_user("Kandi", age=30) and create_user("Kandi", 30) are both validated correctly.
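The same bind-based machinery extends naturally from type checks to value checks. The validate_values variant below is hypothetical; it takes predicate functions instead of types:

```python
import functools
import inspect

def validate_values(**predicates):
    """Reject arguments for which the given predicate returns False."""
    def decorator(func):
        sig = inspect.signature(func)
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            for name, predicate in predicates.items():
                if name in bound.arguments and not predicate(bound.arguments[name]):
                    raise ValueError(
                        f"{func.__name__}() got invalid value for '{name}': "
                        f"{bound.arguments[name]!r}"
                    )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@validate_values(age=lambda a: 0 <= a <= 150, name=lambda n: len(n) > 0)
def register_user(name, age):
    return {"name": name, "age": age}

register_user("Kandi", 30)     # Works
# register_user("Kandi", -5)   # ValueError: invalid value for 'age': -5
```

Predicates compose well with the type checker: you can stack @validate_types above @validate_values so types are checked before values.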

5. Rate Limiting

A rate limiting decorator prevents a function from being called more frequently than a specified threshold. This is critical when consuming third-party APIs that enforce call quotas -- exceeding the limit can result in temporary bans or elevated costs.

import time
import functools

def rate_limit(calls_per_second=1):
    """Throttle function calls to a maximum frequency."""
    min_interval = 1.0 / calls_per_second
    def decorator(func):
        last_called = [0.0]
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            elapsed = time.monotonic() - last_called[0]
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)
            last_called[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limit(calls_per_second=2)
def fetch_stock_price(symbol):
    # Simulates API call
    return {"symbol": symbol, "price": 182.63}

# These calls will be spaced at least 0.5s apart
for sym in ["AAPL", "GOOG", "MSFT", "AMZN"]:
    print(fetch_stock_price(sym))

The last_called value is stored in a list rather than a plain variable because closures in Python can read but not rebind variables from enclosing scopes without the nonlocal keyword. Using a mutable container like a list sidesteps this restriction. Alternatively, you could use nonlocal last_called with a plain float.
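For completeness, here is that nonlocal version written out; the behavior is identical, only the closure bookkeeping changes:

```python
import time
import functools

def rate_limit(calls_per_second=1):
    """Throttle calls; closure state is rebound via nonlocal."""
    min_interval = 1.0 / calls_per_second
    def decorator(func):
        last_called = 0.0
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal last_called  # allow rebinding the enclosing variable
            elapsed = time.monotonic() - last_called
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)
            last_called = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limit(calls_per_second=10)
def ping(n):
    return n

start = time.monotonic()
for i in range(3):
    ping(i)
# Two enforced gaps of ~0.1s between three calls
print(time.monotonic() - start >= 0.19)   # True
```

Either style works; nonlocal reads more directly, while the list trick predates the keyword and still appears in older codebases.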

6. Access Control

An access control decorator checks authorization before allowing a function to execute. This pattern appears throughout web frameworks -- Flask uses @login_required, Django uses @permission_required -- but the underlying mechanism is the same: inspect the caller's credentials and either proceed or reject.

import functools

def require_permission(permission):
    """Block execution unless the user has the required permission."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            user_perms = user.get("permissions", [])
            if permission not in user_perms:
                raise PermissionError(
                    f"User '{user.get('name')}' lacks '{permission}' permission"
                )
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_permission("delete")
def remove_document(user, doc_id):
    print(f"Document {doc_id} removed by {user['name']}")

admin = {"name": "Kandi", "permissions": ["read", "write", "delete"]}
viewer = {"name": "Guest", "permissions": ["read"]}

remove_document(admin, "doc_991")     # Document doc_991 removed by Kandi
remove_document(viewer, "doc_991")    # PermissionError raised

The parameterized structure -- require_permission("delete") returning a decorator -- allows different functions to require different permissions while sharing the same enforcement logic.

7. Automatic Retry

A retry decorator re-executes a function when it raises a specified exception. This is essential for network calls, database connections, and any operation subject to transient failures. Adding exponential backoff prevents overwhelming a recovering service.

import time
import functools

def retry(max_tries=3, delay=1.0, backoff=2, exceptions=(Exception,)):
    """Retry with exponential backoff on specified exceptions."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            current_delay = delay
            last_exc = None
            for attempt in range(1, max_tries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    last_exc = exc
                    if attempt < max_tries:
                        print(f"Retry {attempt}/{max_tries} for "
                              f"{func.__name__} in {current_delay:.1f}s")
                        time.sleep(current_delay)
                        current_delay *= backoff
            raise last_exc
        return wrapper
    return decorator

@retry(max_tries=4, delay=0.5, exceptions=(ConnectionError, TimeoutError))
def fetch_weather(city):
    import random
    if random.random() < 0.7:
        raise ConnectionError("Service unavailable")
    return {"city": city, "temp_c": 22}

Pro Tip

For production retry logic with jitter, async support, and composable stop conditions, consider the tenacity library. A custom decorator like this one is appropriate when you want zero dependencies or need to understand the mechanism from the ground up.
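Jitter itself is simple to bolt onto the custom decorator: randomize each sleep so that many clients retrying at once do not fall into lockstep. A minimal sketch (retry_jitter is a hypothetical variant, and the 50-100% jitter window is an arbitrary choice):

```python
import time
import random
import functools

def retry_jitter(max_tries=3, delay=1.0, backoff=2, exceptions=(Exception,)):
    """Retry with exponential backoff plus random jitter."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            current_delay = delay
            for attempt in range(1, max_tries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == max_tries:
                        raise  # out of tries; re-raise the last exception
                    # Sleep between 50% and 100% of the nominal delay
                    time.sleep(current_delay * random.uniform(0.5, 1.0))
                    current_delay *= backoff
        return wrapper
    return decorator

attempts = {"count": 0}

@retry_jitter(max_tries=5, delay=0.01, exceptions=(ConnectionError,))
def flaky():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(flaky())              # ok
print(attempts["count"])    # 3
```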

8. Singleton Pattern

The singleton decorator ensures a class produces only one instance. Subsequent calls to the constructor return the existing instance instead of creating a new one. This is useful for connection pools, configuration managers, and logger instances that should exist exactly once.

import functools

def singleton(cls):
    """Ensure only one instance of the decorated class exists."""
    instances = {}
    @functools.wraps(cls, updated=[])
    def get_instance(*args, **kwargs):
        if cls not in instances:
            instances[cls] = cls(*args, **kwargs)
        return instances[cls]
    return get_instance

@singleton
class DatabasePool:
    def __init__(self, host, port=5432):
        self.host = host
        self.port = port
        print(f"Pool created for {host}:{port}")

    def query(self, sql):
        return f"Executing: {sql}"

pool_a = DatabasePool("db.example.com")   # Pool created for db.example.com:5432
pool_b = DatabasePool("db.example.com")   # No output -- returns existing instance
print(pool_a is pool_b)                    # True

Note that this decorator is applied to a class, not a function. The @singleton syntax replaces the class with get_instance, so calling DatabasePool(...) routes through the decorator's instance cache. The updated=[] argument in functools.wraps stops it from merging the class's __dict__ into the wrapper function's __dict__, which would otherwise clutter the wrapper with the class's methods and attributes.
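One caveat: the membership check in get_instance is not thread-safe -- two threads can both see an empty cache and each construct an instance. A hedged sketch of a lock-protected variant using double-checked locking:

```python
import threading
import functools

def singleton(cls):
    """Thread-safe single-instance decorator using a lock."""
    instances = {}
    lock = threading.Lock()
    @functools.wraps(cls, updated=[])
    def get_instance(*args, **kwargs):
        if cls not in instances:              # fast path, no lock taken
            with lock:
                if cls not in instances:      # re-check under the lock
                    instances[cls] = cls(*args, **kwargs)
        return instances[cls]
    return get_instance

@singleton
class Config:
    def __init__(self, env="prod"):
        self.env = env

print(Config() is Config("staging"))   # True -- second call's args are ignored
```

That "second call's args are ignored" behavior is inherent to the pattern and worth documenting wherever you use it.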

9. Deprecation Warnings

When you need to phase out a function but cannot remove it immediately, a deprecation decorator warns callers that they should migrate to a replacement. Python's built-in warnings module integrates with this pattern to issue alerts that can be silenced, escalated to errors, or filtered by category.

import warnings
import functools

def deprecated(reason="", replacement=""):
    """Emit a DeprecationWarning when the decorated function is called."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            msg = f"{func.__name__}() is deprecated."
            if reason:
                msg += f" Reason: {reason}."
            if replacement:
                msg += f" Use {replacement}() instead."
            warnings.warn(msg, category=DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated(reason="uses legacy auth", replacement="authenticate_v2")
def authenticate(username, password):
    return username == "admin" and password == "secret"

authenticate("admin", "secret")
# DeprecationWarning: authenticate() is deprecated.
#   Reason: uses legacy auth. Use authenticate_v2() instead.

The stacklevel=2 parameter ensures the warning points to the caller of the deprecated function, not to the line inside the decorator. This makes the warning actionable -- developers see exactly where their code needs to change.
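The silencing, escalation, and filtering mentioned above all go through the warnings filter machinery. A short sketch (old_api is a made-up stand-in for a deprecated function):

```python
import warnings

def old_api():
    warnings.warn("old_api() is deprecated.", DeprecationWarning, stacklevel=2)
    return 42

# Record the warning instead of printing it
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", DeprecationWarning)
    old_api()
print(len(caught))                       # 1
print(caught[0].category.__name__)       # DeprecationWarning

# Escalate to an error, as a test suite might
with warnings.catch_warnings():
    warnings.simplefilter("error", DeprecationWarning)
    try:
        old_api()
    except DeprecationWarning as exc:
        print(f"Blocked: {exc}")         # Blocked: old_api() is deprecated.
```

Running a test suite with DeprecationWarning escalated to an error is an effective way to flush out lingering callers before a removal deadline.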

10. Debug Tracing

A tracing decorator records the full lifecycle of a function call: the arguments going in, the return value coming out, and any exception that gets raised. This is more comprehensive than the logging decorator -- it captures failure paths as well as successes and formats the output for easy scanning during debugging sessions.

import functools
import traceback

def trace(func):
    """Trace function calls, returns, and exceptions."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        call_id = id(args) % 10000
        args_str = ", ".join(
            [repr(a) for a in args] +
            [f"{k}={v!r}" for k, v in kwargs.items()]
        )
        print(f"[TRACE {call_id}] -> {func.__name__}({args_str})")
        try:
            result = func(*args, **kwargs)
            print(f"[TRACE {call_id}] <- {func.__name__} returned {result!r}")
            return result
        except Exception as exc:
            print(f"[TRACE {call_id}] !! {func.__name__} raised "
                  f"{type(exc).__name__}: {exc}")
            raise
    return wrapper

@trace
def divide(a, b):
    return a / b

divide(10, 3)
# [TRACE 8821] -> divide(10, 3)
# [TRACE 8821] <- divide returned 3.3333333333333335

divide(10, 0)
# [TRACE 4412] -> divide(10, 0)
# [TRACE 4412] !! divide raised ZeroDivisionError: division by zero

The call_id is derived from the id of the argument tuple, which stays alive for the duration of the call, so the entry and exit lines of one invocation always share an id, and concurrent or recursive invocations in flight at the same time get distinct ids. It is not guaranteed unique across sequential calls, since CPython may reuse memory addresses; a monotonically increasing counter is the more robust choice if you need globally unique ids.
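For recursive code specifically, an indentation-based variant can make the nesting itself visible. The trace_nested sketch below is hypothetical; it tracks call depth in the closure (and is not thread-safe, since all threads would share one depth counter):

```python
import functools

def trace_nested(func):
    """Trace calls with indentation proportional to recursion depth."""
    depth = 0
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal depth
        pad = "  " * depth
        print(f"{pad}-> {func.__name__}({', '.join(map(repr, args))})")
        depth += 1
        try:
            result = func(*args, **kwargs)
        finally:
            depth -= 1  # restore depth even if the call raised
        print(f"{pad}<- {func.__name__} returned {result!r}")
        return result
    return wrapper

@trace_nested
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

print(factorial(3))
# -> factorial(3)
#   -> factorial(2)
#     -> factorial(1)
#     <- factorial returned 1
#   <- factorial returned 2
# <- factorial returned 6
# 6
```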

Quick Reference Table

| Use Case | What It Does | When to Reach For It |
| --- | --- | --- |
| Execution Timing | Measures and reports how long a function takes to run | Performance profiling, SLA monitoring |
| Call Logging | Records function name, arguments, and return value | Audit trails, debugging API flows |
| Memoization | Caches results keyed by arguments to avoid recomputation | Expensive calculations, recursive algorithms, repeated API calls |
| Input Validation | Checks argument types or value constraints before execution | Library APIs, user-facing interfaces, data pipelines |
| Rate Limiting | Throttles call frequency to a maximum rate | Third-party API consumption, web scraping |
| Access Control | Blocks execution unless the caller has the required permission | Web frameworks, multi-tenant applications |
| Automatic Retry | Re-executes on transient failures with backoff | Network calls, database connections, flaky services |
| Singleton | Ensures a class creates only one instance | Connection pools, configuration managers, logger setup |
| Deprecation | Warns callers that a function will be removed | API versioning, library maintenance |
| Debug Tracing | Logs call entry, exit, and exceptions with correlation IDs | Development debugging, recursive call analysis |

Key Takeaways

  1. Decorators extract cross-cutting concerns. Timing, logging, caching, validation, rate limiting, access control, retry, singleton enforcement, deprecation, and tracing are all behaviors that need to wrap function calls without modifying the function body. Decorators keep this logic in one reusable place.
  2. Every decorator follows the same structural pattern. An outer function captures configuration. An inner function receives the target. A wrapper function replaces the target. The differences between all ten examples are in what the wrapper does with the call -- the skeleton is identical.
  3. Always use functools.wraps. Without it, the decorated function loses its __name__, __doc__, and __module__, which breaks debugging tools, documentation generators, and serialization libraries.
  4. Parameterized decorators add a third nesting layer. When a decorator needs its own arguments -- like a permission name, a retry count, or a rate limit -- the outermost function captures those arguments and returns the actual decorator. This is the pattern behind @retry(max_tries=4) and @require_permission("delete").
  5. Decorator state lives in the closure. The memoization cache, the rate limiter's last-call timestamp, and the singleton's instance dictionary all persist between calls because they are defined in the decorator's enclosing scope and captured by the wrapper's closure.
  6. Choose custom decorators or libraries based on complexity. The implementations in this article are production-viable for straightforward needs. For advanced requirements like async retry with jitter, composable cache eviction policies, or thread-safe rate limiting, established libraries like tenacity, functools.lru_cache, and ratelimit provide tested solutions.

Each of the ten patterns in this article solves a problem that appears repeatedly across Python codebases. Decorators are the mechanism that lets you solve each one exactly once and apply it anywhere with a single line.