Writing Parameterized Decorators in Python: The Three-Layer Factory Pattern Explained

A standard Python decorator accepts a function and returns a modified version of it. That pattern works until you need the decorator itself to be configurable — accepting arguments like a retry count, a permission level, or a log format string. At that point, the decorator must become a factory: a callable that receives configuration first, then returns the real decorator second. This three-layer nesting is the decorator factory pattern, and understanding how each layer works is the difference between writing fragile wrappers and building reusable, production-grade infrastructure.

This article walks through every layer of the parameterized decorator pattern in Python, from the closure mechanics that make it possible to production-ready implementations you can use in real codebases. Each section builds on the previous one, and every concept is demonstrated with complete, runnable code.

Why Standard Decorators Cannot Accept Arguments

Before tackling parameterized decorators, it helps to be precise about what happens at definition time when Python encounters the @ syntax. A standard decorator is a callable that takes exactly one argument — the function being decorated — and returns a replacement callable. Consider this minimal timing decorator:

import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def compute(n):
    return sum(range(n))

compute(1_000_000)

When Python reads @timer, it executes compute = timer(compute). The decorator receives compute as its sole argument, wraps it inside wrapper, and returns wrapper. This works because the call signature matches: timer expects exactly one positional argument, and @ passes exactly one.

Now imagine you want that timer to only print when execution exceeds a configurable threshold. You might try @timer(threshold=0.5). But this fails immediately. Python evaluates timer(threshold=0.5) first, which passes a keyword argument to a function that expects a single positional argument func. The result is a TypeError. The @ symbol does not perform any special parsing — it simply calls whatever expression follows it, passing the decorated function as the argument. If that expression is already a function call like timer(threshold=0.5), Python evaluates the call, and then applies the return value as the decorator.
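To make the failure concrete, here is a minimal sketch: calling the single-argument timer with a keyword argument fails before any decoration can happen.

```python
import time
from functools import wraps

def timer(func):
    """The single-argument timer from above."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

try:
    # @timer(threshold=0.5) desugars to this call, which fails
    # before Python ever gets to the function definition below it:
    timer(threshold=0.5)
except TypeError as e:
    print(e)  # timer() got an unexpected keyword argument 'threshold'
```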

Note

The expression after @ is evaluated in full before the result is called with the decorated function. This is why @decorator() with parentheses is fundamentally different from @decorator without them. With parentheses, decorator() runs first and must return a callable that will then receive the function. Without parentheses, decorator itself receives the function directly.

This distinction is the entire reason the factory pattern exists. To pass arguments to a decorator, you need an outer function that absorbs those arguments and returns the real decorator — which then receives the function on the next call.

The Three-Layer Factory Pattern

A parameterized decorator in Python consists of three nested functions, each with a specific role. The outermost function is the factory — it captures the configuration arguments via closure. The middle function is the decorator itself — it receives the target function. The innermost function is the wrapper — it executes around each call to the original function.

Here is the threshold-based timer rewritten as a decorator factory:

import time
from functools import wraps

def timer(threshold=0.0, label=None):
    """Factory: captures configuration, returns a decorator."""
    def decorator(func):
        """Decorator: captures the function, returns a wrapper."""
        @wraps(func)
        def wrapper(*args, **kwargs):
            """Wrapper: runs on every call to the decorated function."""
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            if elapsed >= threshold:
                tag = label or func.__name__
                print(f"[SLOW] {tag} took {elapsed:.4f}s (threshold: {threshold}s)")
            return result
        return wrapper
    return decorator

@timer(threshold=0.1, label="heavy computation")
def compute(n):
    return sum(range(n))

@timer(threshold=0.5)
def fast_lookup(key):
    return {"a": 1, "b": 2}.get(key)

compute(10_000_000)
fast_lookup("a")

When Python encounters @timer(threshold=0.1, label="heavy computation"), it first calls timer(threshold=0.1, label="heavy computation"). This returns decorator — a function that is now closed over threshold=0.1 and label="heavy computation". Python then applies that returned decorator to compute, producing wrapper. From that point forward, every call to compute() invokes wrapper, which has access to func, threshold, and label through the closure chain.

How the Desugaring Works Step by Step

The @ syntax with arguments desugars to a two-step operation. Understanding this equivalence removes all ambiguity about the execution order:

# This decorator application:
@timer(threshold=0.1)
def compute(n):
    return sum(range(n))

# Is exactly equivalent to:
def compute(n):
    return sum(range(n))
compute = timer(threshold=0.1)(compute)

# Breaking that chain apart:
# Step 1: factory call returns a decorator
configured_decorator = timer(threshold=0.1)

# Step 2: decorator call wraps the function
compute = configured_decorator(compute)

The Role of Closures

The factory pattern relies entirely on Python's closure mechanism. Each nested function captures variables from the enclosing scope — not copies of the values, but references to the same binding. The wrapper function can read threshold and label because those names exist in the factory's local scope, which remains alive as long as any inner function holds a reference to it. The decorator function can read those same variables plus func from its own enclosing scope.
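The by-reference nature of closure capture is easy to demonstrate in isolation. Each lambda below closes over the same comprehension variable, so all of them observe its final value:

```python
# Closures capture the binding, not the value at definition time:
funcs = [lambda: i for i in range(3)]
print([f() for f in funcs])  # [2, 2, 2] -- all share the final binding of i

# A default argument evaluates immediately, creating a fresh binding
# per lambda, which is the standard way to capture the current value:
funcs = [lambda i=i: i for i in range(3)]
print([f() for f in funcs])  # [0, 1, 2]
```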

You can verify this by inspecting the closure cells on the resulting wrapper:

@timer(threshold=0.2)
def example():
    pass

# Inspect what the wrapper closes over
for name in example.__code__.co_freevars:
    print(name)
# Output: func, label, threshold

# functools.wraps preserves the original function reference
print(example.__wrapped__)  # <function example at 0x...>

Pro Tip

Always use @functools.wraps(func) on your wrapper function. Without it, the decorated function loses its original __name__, __doc__, __module__, and __qualname__. wraps also adds a __wrapped__ attribute pointing to the original function, which is essential for debugging and introspection.
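A side-by-side sketch shows exactly what is lost without wraps:

```python
from functools import wraps

def without_wraps(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def alpha():
    """Alpha's docstring."""

@with_wraps
def beta():
    """Beta's docstring."""

print(alpha.__name__, alpha.__doc__)  # wrapper None
print(beta.__name__, beta.__doc__)    # beta Beta's docstring.
print(hasattr(alpha, "__wrapped__"))  # False
print(hasattr(beta, "__wrapped__"))   # True
```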

A Second Example: Role-Based Access Control

The three-layer pattern is not limited to timing. Here is a parameterized decorator that enforces role-based access on functions, a pattern commonly used in web frameworks:

from functools import wraps

def require_role(*allowed_roles):
    """Factory: captures the set of permitted roles."""
    def decorator(func):
        """Decorator: wraps the protected function."""
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            """Wrapper: checks the user's role before each call."""
            if user.get("role") not in allowed_roles:
                raise PermissionError(
                    f"User role '{user.get('role')}' is not in {allowed_roles}"
                )
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin", "editor")
def publish_article(user, title):
    return f"Published '{title}' by {user['name']}"

@require_role("admin")
def delete_user(user, target_id):
    return f"Deleted user {target_id}"

admin = {"name": "Kandi", "role": "admin"}
viewer = {"name": "Guest", "role": "viewer"}

print(publish_article(admin, "Factory Patterns"))
# Published 'Factory Patterns' by Kandi

try:
    delete_user(viewer, 42)
except PermissionError as e:
    print(e)
# User role 'viewer' is not in ('admin',)

The factory require_role uses *allowed_roles to accept a variable number of role strings. These are captured by the closure and checked on every call to the wrapper. Each decorated function gets its own independent set of allowed roles, because each @require_role(...) call creates a new closure.

Advanced Patterns: Optional Arguments, Classes, and Type Hints

Decorators That Work With or Without Arguments

A common pain point is wanting a decorator that can be used both as @my_decorator and as @my_decorator(option=True). The standard three-layer pattern requires the parentheses even when using defaults. Python's own @dataclass handles both forms with a sentinel-based technique: if the factory received a callable as its only positional argument, apply the decorator directly; otherwise, return a decorator.

import time
from functools import wraps

def timer(_func=None, /, *, threshold=0.0, label=None):
    """Works as @timer or @timer(threshold=0.5)."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            if elapsed >= threshold:
                tag = label or func.__name__
                print(f"[SLOW] {tag}: {elapsed:.4f}s")
            return result
        return wrapper

    if _func is not None:
        # Called as @timer without parentheses
        return decorator(_func)
    # Called as @timer(...) with arguments
    return decorator

# Both forms work:
@timer
def fast():
    return 42

@timer(threshold=0.01, label="slow path")
def slow():
    time.sleep(0.05)
    return 99

fast()
slow()

The / in the signature makes _func positional-only, while * forces all other arguments to be keyword-only. When you write @timer, Python passes the decorated function as _func. When you write @timer(threshold=0.01), _func remains None because no positional argument was given, and the factory returns decorator for the @ syntax to apply.

Warning

The optional-argument pattern introduces ambiguity. If someone writes @timer(some_function), passing a function as a positional argument, the factory treats it as the no-argument form: some_function itself gets wrapped with the default configuration, and the returned wrapper is then called with the decorated function as its argument, which is almost never what was intended. Use this pattern deliberately, and always enforce keyword-only arguments with * to prevent accidental misuse.

Class-Based Decorator Factories

When a decorator factory needs to maintain shared state across all functions it decorates — for example, a global call counter or a registration table — a class-based approach can be cleaner than nested functions. The __init__ method plays the role of the factory, and __call__ acts as the decorator:

from collections import defaultdict
from functools import wraps

class Registry:
    """A class-based decorator factory that registers functions by group."""

    _registry = defaultdict(list)

    def __init__(self, group):
        """Factory step: capture the group name."""
        self.group = group

    def __call__(self, func):
        """Decorator step: register and return the function unchanged."""
        self._registry[self.group].append(func)
        return func

    @classmethod
    def get_group(cls, group):
        return list(cls._registry[group])

    @classmethod
    def dispatch(cls, group, *args, **kwargs):
        return [fn(*args, **kwargs) for fn in cls._registry[group]]

@Registry("validators")
def check_length(value):
    return len(value) >= 3

@Registry("validators")
def check_alpha(value):
    return value.isalpha()

@Registry("formatters")
def to_upper(value):
    return value.upper()

# Run all validators
data = "Hello"
results = Registry.dispatch("validators", data)
print(results)  # [True, True]

# List registered formatters
print(Registry.get_group("formatters"))
# [<function to_upper at 0x...>]

When Python processes @Registry("validators"), it calls Registry("validators") which creates an instance with self.group = "validators". Python then calls that instance — invoking __call__ — with the decorated function. The function is appended to the registry and returned unmodified. This pattern is used in plugin architectures, command dispatchers, and event systems.

Type-Hinting Parameterized Decorators with ParamSpec

Before Python 3.10, typing a decorator that preserves the decorated function's parameter signature was effectively impossible without a type-checker plugin. PEP 612 introduced ParamSpec, which captures the full parameter specification of a callable. Combined with TypeVar for the return type, you can write decorator factories where the type checker understands that the decorated function retains its original signature.

import time
from typing import Callable, ParamSpec, TypeVar
from functools import wraps

P = ParamSpec("P")
R = TypeVar("R")

def timer(
    threshold: float = 0.0,
    label: str | None = None,
) -> Callable[[Callable[P, R]], Callable[P, R]]:
    """Fully typed parameterized decorator factory."""
    def decorator(func: Callable[P, R]) -> Callable[P, R]:
        @wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            if elapsed >= threshold:
                tag = label or func.__name__
                print(f"[SLOW] {tag}: {elapsed:.4f}s")
            return result
        return wrapper
    return decorator

@timer(threshold=0.1)
def add(a: int, b: int) -> int:
    return a + b

# A type checker (mypy, pyright) sees:
# add(a: int, b: int) -> int
# The original signature is fully preserved.

ParamSpec("P") creates a type variable that represents both *args and **kwargs together. In the wrapper, *args: P.args and **kwargs: P.kwargs tell the type checker that the wrapper accepts exactly the same positional and keyword arguments as the original function. The factory's return type annotation Callable[[Callable[P, R]], Callable[P, R]] states that it returns a decorator whose input and output share the same parameter specification.

For decorators that do change the function's signature, PEP 612 also provides Concatenate, which describes a callable with specific leading parameters followed by whatever P captures. This lets you type a decorator that injects or strips a leading argument:

from typing import Callable, Concatenate, ParamSpec, TypeVar
from functools import wraps

P = ParamSpec("P")
R = TypeVar("R")

def inject_db(
    db_url: str,
) -> Callable[
    [Callable[Concatenate[str, P], R]],
    Callable[P, R],
]:
    """Removes the first 'connection' param by injecting it automatically."""
    def decorator(func: Callable[Concatenate[str, P], R]) -> Callable[P, R]:
        @wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            return func(db_url, *args, **kwargs)
        return wrapper
    return decorator

@inject_db(db_url="postgresql://localhost/mydb")
def get_user(connection: str, user_id: int) -> dict:
    # In production, 'connection' would be used to query the database
    return {"id": user_id, "connection": connection}

# Callers only pass user_id -- connection is injected
result = get_user(user_id=42)
print(result)
# {'id': 42, 'connection': 'postgresql://localhost/mydb'}

The Concatenate[str, P] annotation tells the type checker that the original function expects a str as its first parameter, followed by whatever P captures. The decorator strips that first parameter, so the wrapped function's signature is just P. Pyright and mypy both understand this transformation.

| Pattern                   | When to Use                                                               | Layer Count                       |
|---------------------------|---------------------------------------------------------------------------|-----------------------------------|
| Simple decorator          | No configuration needed, single fixed behavior                            | 2 (decorator + wrapper)           |
| Three-layer factory       | Decorator needs arguments like thresholds, roles, or format strings       | 3 (factory + decorator + wrapper) |
| Optional-argument factory | Sensible defaults exist and both @dec and @dec() should work              | 3 (with _func=None sentinel)      |
| Class-based factory       | Shared state across decorated functions, plugin registries, event systems | 2 (__init__ + __call__)           |

Production Patterns and Common Mistakes

Retry with Configurable Backoff

One of the more useful parameterized decorators in production code is a retry wrapper. The factory accepts the number of attempts and a backoff multiplier, and the wrapper catches exceptions and re-executes the function with increasing delays:

import time
from functools import wraps

def retry(max_attempts=3, backoff=1.0, exceptions=(Exception,)):
    """Retry a function on failure with exponential backoff."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last_exception = e
                    if attempt < max_attempts:
                        delay = backoff * (2 ** (attempt - 1))
                        print(f"Attempt {attempt} failed: {e}. "
                              f"Retrying in {delay:.1f}s...")
                        time.sleep(delay)
            raise last_exception
        return wrapper
    return decorator

@retry(max_attempts=4, backoff=0.5, exceptions=(ConnectionError, TimeoutError))
def fetch_data(url):
    # Simulated flaky network call
    import random
    if random.random() < 0.7:
        raise ConnectionError(f"Failed to reach {url}")
    return {"status": "ok", "url": url}

try:
    result = fetch_data("https://api.example.com/data")
    print(result)
except ConnectionError as e:
    print(f"All retries exhausted: {e}")

Each decorated function gets its own independent retry configuration because each @retry(...) call creates a fresh closure. A function decorated with @retry(max_attempts=5) does not share state with one decorated with @retry(max_attempts=2).
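This independence follows directly from the closure mechanics. A toy sketch (using a hypothetical configured factory rather than the retry code above) makes it visible:

```python
def configured(limit):
    """Toy factory: each wrapper reports the limit it captured."""
    def decorator(func):
        def wrapper():
            return (func.__name__, limit)
        return wrapper
    return decorator

# Two applications of the factory produce two separate closures,
# each holding its own `limit` cell:
@configured(limit=5)
def a():
    pass

@configured(limit=2)
def b():
    pass

print(a())  # ('a', 5)
print(b())  # ('b', 2)
```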

Rate Limiter with Token Bucket

A rate-limiting decorator demonstrates how the wrapper can maintain mutable state across calls through the closure, without any global variables:

import time
from functools import wraps

def rate_limit(calls_per_second=1.0):
    """Enforce a maximum call rate on the decorated function."""
    min_interval = 1.0 / calls_per_second

    def decorator(func):
        last_called = [0.0]  # mutable container to allow closure mutation

        @wraps(func)
        def wrapper(*args, **kwargs):
            elapsed = time.monotonic() - last_called[0]
            if elapsed < min_interval:
                wait = min_interval - elapsed
                time.sleep(wait)
            last_called[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limit(calls_per_second=2)
def send_request(endpoint):
    print(f"[{time.strftime('%H:%M:%S')}] Requesting {endpoint}")
    return {"status": 200}

# These calls are automatically spaced 0.5s apart
for i in range(5):
    send_request(f"/api/item/{i}")

The last_called list is a common Python idiom for mutable state inside a closure. Because closures capture the variable binding — not the value — you can read and write the list's contents from inside wrapper. A plain float would not work here without a nonlocal declaration, since reassigning it inside the wrapper would otherwise create a new local variable instead of rebinding the closed-over one.
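Since Python 3, the closure variable can also be rebound explicitly, so an alternative sketch of the same limiter replaces the mutable container with nonlocal:

```python
import time
from functools import wraps

def rate_limit(calls_per_second=1.0):
    """Same behavior as above, using `nonlocal` instead of a one-item list."""
    min_interval = 1.0 / calls_per_second

    def decorator(func):
        last_called = 0.0

        @wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal last_called  # rebind the enclosing variable directly
            elapsed = time.monotonic() - last_called
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)
            last_called = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator
```

The choice is stylistic: nonlocal (PEP 3104) makes the intent explicit, while the list idiom predates it and still appears in older codebases.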

Common Mistakes to Avoid

There are several recurring errors that trip up developers when writing parameterized decorators. The first, and the one that causes the most confusion, is forgetting to call the factory. Writing @retry instead of @retry() passes the decorated function to the factory as max_attempts, so the decorated name ends up bound to the inner decorator. Calling it then silently returns a wrapper function instead of executing the original code, and the real failure surfaces far from its cause.

# MISTAKE 1: Forgetting parentheses on the factory
@retry          # WRONG -- passes the function as max_attempts
@retry()        # CORRECT -- calls factory with defaults, returns decorator

# MISTAKE 2: Not returning the result from the wrapper
def bad_decorator(threshold=0.0):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            func(*args, **kwargs)  # Missing return!
        return wrapper
    return decorator
# Every decorated function silently returns None

# MISTAKE 3: Mutable default argument shared across all decorated functions
def bad_cache(func, cache={}):      # WRONG -- one dict shared by all
    pass

def good_cache(func):
    cache = {}                       # CORRECT -- each function gets its own
    @wraps(func)
    def wrapper(*args, **kwargs):
        key = args
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]
    return wrapper

# MISTAKE 4: Not using *args, **kwargs (breaks methods)
def rigid_decorator(func):
    @wraps(func)
    def wrapper(x, y):              # WRONG -- can't handle 'self'
        return func(x, y)
    return wrapper
# Always use *args, **kwargs so the wrapper adapts to any signature

Key Takeaways

  1. Three layers serve three purposes: The outer function (factory) captures configuration arguments. The middle function (decorator) captures the target function. The inner function (wrapper) executes on every call. Each layer exists because Python's @ syntax evaluates the expression first, then applies the result as a single-argument callable.
  2. Closures power the entire pattern: Every layer creates a new scope, and inner functions capture variables from enclosing scopes by reference. This is why threshold, label, and func are all accessible inside the wrapper without being passed as arguments — they are closed over from the surrounding scopes.
  3. Always use functools.wraps: Without it, the decorated function loses its name, docstring, module, and qualified name. The __wrapped__ attribute added by wraps also enables introspection tools and debuggers to find the original function.
  4. ParamSpec enables full type safety: Since Python 3.10, ParamSpec and Concatenate from the typing module let you write decorator factories where mypy and pyright preserve the decorated function's exact parameter types and return type. This eliminates the Callable[..., Any] workarounds that previously plagued typed decorator code.
  5. Choose the right pattern for the problem: Use the standard three-layer factory when the decorator needs configuration. Use the optional-argument pattern when sensible defaults make parentheses-free usage desirable. Use a class-based factory when you need shared mutable state like a registry or counter across all decorated functions.

The parameterized decorator pattern is a direct application of closures and higher-order functions — two concepts that sit at the core of Python's design. Once the three-layer structure clicks, it becomes a tool you reach for whenever cross-cutting concerns like logging, validation, caching, retries, or access control need to be configurable per function. The code stays clean, the configuration stays explicit, and the decorated function's original interface remains intact.