A plain decorator takes a function and returns a modified version of it. That works until you need to configure the decorator's behavior — setting a retry count, specifying a log level, or choosing a cache size. At that point you need a decorator factory: a function that accepts your configuration arguments and returns the decorator itself. This pattern adds one more layer of nesting, and understanding exactly what each layer does is the key to writing factories that are clean, reusable, and correct.
This article covers the full anatomy of decorator factories in Python, from the basic three-layer structure through advanced patterns like optional arguments, class-based factories, and type-safe annotations with ParamSpec. It also addresses a question that generic tutorials skip entirely: when a factory is the wrong tool, and what to reach for instead.
The mental model that makes factories intuitive is partial application. A factory is a function that consumes some arguments now and defers the rest for later. When you write @log_calls("DEBUG"), you are partially applying a configuration ("DEBUG") and getting back a specialized function (the decorator) that remembers that configuration forever through its closure. This is the same principle behind functools.partial, currying in functional languages, and dependency injection in object-oriented systems. The three-layer nesting is not arbitrary; it is the minimum structure required to separate configuration time, decoration time, and call time into distinct phases.
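The parallel with functools.partial can be made concrete. Here is a minimal sketch (the `power` and `make_power` names are illustrative, not from the article) showing the same deferral of arguments done two ways:

```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# Partial application via functools.partial: exponent is bound now,
# base is supplied later.
square = partial(power, exponent=2)

# The same idea via a closure -- the mechanism decorator factories use.
def make_power(exponent):
    def apply(base):
        return base ** exponent  # exponent is remembered by the closure
    return apply

cube = make_power(3)

print(square(5))  # 25
print(cube(2))    # 8
```

In both cases a function consumes some arguments now and returns a specialized function that remembers them; a factory does exactly this, except the deferred argument happens to be the function being decorated.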
From Plain Decorator to Factory
A standard decorator has two layers: the decorator function that receives the target function, and the inner wrapper that runs around it:
def log_calls(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def add(a, b):
    return a + b
This works, but the log message format is hardcoded. If you want to control the prefix, you cannot pass an argument to @log_calls directly because log_calls expects a function, not a string. You need an outer function that accepts the prefix and returns a decorator:
def log_calls(prefix):
    def decorator(func):
        def wrapper(*args, **kwargs):
            print(f"[{prefix}] Calling {func.__name__}")
            return func(*args, **kwargs)
        return wrapper
    return decorator

@log_calls("DEBUG")
def add(a, b):
    return a + b

add(3, 5)
The expression @log_calls("DEBUG") first calls log_calls("DEBUG"), which returns the decorator function. Python then applies that returned decorator to add. The outer function is the factory. It produces a decorator on demand, configured with whatever arguments you pass.
The Triple-Nesting Pattern
Every decorator factory follows the same three-layer structure. Each layer has one responsibility:
- Factory: receives the configuration arguments and returns the decorator.
- Decorator: receives the target function and returns the wrapper.
- Wrapper: runs at call time, around the original function call.
The desugared equivalent makes this explicit:
# What Python does when it sees @log_calls("DEBUG")
decorator = log_calls("DEBUG")  # factory returns decorator
add = decorator(add)            # decorator returns wrapper
How Closures Power the Factory
The factory pattern works because of closures. Each inner function captures the variables from its enclosing scope, even after the enclosing function has returned. The decorator closes over the factory's arguments, and the wrapper closes over both the factory's arguments and the target function.
def repeat(n):
    def decorator(func):
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(n):  # n is captured from the factory
                result = func(*args, **kwargs)  # func is captured from the decorator
            return result
        return wrapper
    return decorator

@repeat(4)
def greet(name):
    print(f"Hello, {name}")

greet("reader")
When repeat(4) executes, it binds n = 4 and returns decorator. That decorator function keeps a reference to n through the closure. When the decorator is applied to greet, it binds func = greet and returns wrapper. The wrapper function can access both n and func through its closure chain, even though repeat and decorator have both already returned.
You can verify what a closure captures by inspecting __closure__:
for cell in greet.__closure__:
    print(cell.cell_contents)
Mutable vs Immutable Captured State
A closure holds a reference to the variable, not a copy of its value. This distinction matters when the captured variable is mutable. If your factory captures a list or dictionary, every wrapper that closes over it shares the same object, and mutations made through one wrapper are visible through all the others. Immutable values like integers and strings are safe only as long as you never rebind them: the moment a wrapper rebinds one with nonlocal, it writes to the shared closure cell, and every other wrapper closing over that cell sees the new value. This is why the count_calls bug in the Common Mistakes section is particularly dangerous: the integer itself is immutable, but because the variable lives in the factory layer, every wrapper produced from the same factory call rebinds the same cell and ends up sharing state.
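The sharing hazard can be made visible with a small sketch. The hypothetical `count_calls` factory below places a dictionary in the factory scope; when the returned decorator is reused on two functions, both wrappers close over the same object:

```python
from functools import wraps

def count_calls():
    counts = {}                        # lives in the factory scope
    def decorator(func):
        counts[func.__name__] = 0
        @wraps(func)
        def wrapper(*args, **kwargs):
            counts[func.__name__] += 1  # mutates the shared dict
            return func(*args, **kwargs)
        wrapper.counts = counts         # expose the dict for inspection
        return wrapper
    return decorator

shared = count_calls()                  # one factory call, one dict

@shared
def ping(): pass

@shared
def pong(): pass

ping(); ping(); pong()
print(ping.counts)  # {'ping': 2, 'pong': 1} -- both wrappers see one dict
```

Here the sharing is harmless because the dict is keyed by function name, but the same mechanism silently couples decorated functions when the shared object is a plain counter or cache.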
Preserving Metadata with functools.wraps
Just like plain decorators, decorator factories replace the original function with the wrapper. Without functools.wraps, the wrapped function loses its __name__, __doc__, and other attributes. Always apply @wraps(func) inside the decorator layer:
from functools import wraps

def rate_limit(max_calls, period_seconds):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # rate limiting logic would go here
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limit(max_calls=10, period_seconds=60)
def fetch_data(endpoint):
    """Fetch data from the given API endpoint."""
    return {"endpoint": endpoint, "status": "ok"}

print(fetch_data.__name__)
print(fetch_data.__doc__)
Place @wraps(func) on the innermost wrapper function, not on the decorator. The decorator receives the function, but the wrapper is what replaces it. That is where metadata needs to be copied.
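To make the difference visible, here is a minimal sketch (the `bare` and `proper` factory names are illustrative; the prefix argument is unused for brevity) contrasting the wrapped function's metadata with and without @wraps:

```python
from functools import wraps

def bare(prefix):                  # no @wraps applied
    def decorator(func):
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapper
    return decorator

def proper(prefix):                # @wraps copies the metadata
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapper
    return decorator

@bare("X")
def one():
    """Docstring one."""

@proper("X")
def two():
    """Docstring two."""

print(one.__name__)  # wrapper -- identity lost
print(two.__name__)  # two     -- identity preserved
```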
The Optional-Arguments Pattern
A common usability problem with decorator factories is that they require parentheses even when using defaults. You can write @rate_limit(max_calls=10, period_seconds=60), but you cannot write @rate_limit without parentheses because the factory would receive the function as its first argument instead of a configuration value.
The optional-arguments pattern solves this by checking whether the first argument is a callable:
from functools import wraps

def log_calls(func=None, *, prefix="LOG"):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            print(f"[{prefix}] {f.__name__} called")
            return f(*args, **kwargs)
        return wrapper
    if func is not None:
        # Called without arguments: @log_calls
        return decorator(func)
    # Called with arguments: @log_calls(prefix="DEBUG")
    return decorator

@log_calls
def task_a():
    return "done"

@log_calls(prefix="AUDIT")
def task_b():
    return "done"

task_a()
task_b()
The * after func forces all configuration parameters to be keyword-only. This prevents ambiguity: when @log_calls is used without parentheses, Python passes the function as func. When @log_calls(prefix="AUDIT") is used, func is None and the factory returns the decorator for Python to apply next.
The keyword-only separator * is essential. Without it, @log_calls("AUDIT") would assign "AUDIT" to func instead of prefix, and the factory would try to decorate a string.
Class-Based Decorator Factories
When a decorator factory needs to maintain state across calls or the configuration logic is complex, a class can replace the nested functions. The __init__ method receives the factory arguments, and __call__ acts as the decorator:
import functools
import time

class Retry:
    def __init__(self, max_attempts=3, delay=1.0):
        self.max_attempts = max_attempts
        self.delay = delay

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(1, self.max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exception = e
                    print(f"Attempt {attempt} failed: {e}")
                    if attempt < self.max_attempts:
                        time.sleep(self.delay)
            raise last_exception
        return wrapper

@Retry(max_attempts=3, delay=0.5)
def connect_to_service(url):
    """Attempt connection to an external service."""
    # Simulating a failure
    raise ConnectionError(f"Cannot reach {url}")

print(connect_to_service.__name__)
print(connect_to_service.__doc__)
When Python sees @Retry(max_attempts=3, delay=0.5), it first calls Retry.__init__ with the arguments, creating an instance. Then it calls __call__ on that instance with connect_to_service, which returns the wrapper. The instance attributes self.max_attempts and self.delay serve the same role as closure variables in the function-based factory.
Real-World Factory Examples
Timed Execution with a Threshold
from functools import wraps
import time

def warn_slow(threshold_seconds=1.0):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            if elapsed > threshold_seconds:
                print(f"WARNING: {func.__name__} took {elapsed:.3f}s "
                      f"(threshold: {threshold_seconds}s)")
            return result
        return wrapper
    return decorator

@warn_slow(threshold_seconds=0.5)
def process_batch(items):
    time.sleep(0.7)  # simulate slow processing
    return len(items)

process_batch(range(1000))
Role-Based Access Control
from functools import wraps

def require_role(*allowed_roles):
    def decorator(func):
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            if user.get("role") not in allowed_roles:
                raise PermissionError(
                    f"{user.get('name', 'Unknown')} lacks required role. "
                    f"Allowed: {allowed_roles}"
                )
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin", "editor")
def publish_article(user, title):
    return f"'{title}' published by {user['name']}"

admin = {"name": "Kandi", "role": "admin"}
viewer = {"name": "Guest", "role": "viewer"}

print(publish_article(admin, "Decorator Factories"))
try:
    publish_article(viewer, "Unauthorized Post")
except PermissionError as e:
    print(f"Blocked: {e}")
The factory require_role uses *allowed_roles to accept any number of role strings. The closure captures those roles, and the wrapper checks each caller against them before allowing the function to execute.
Configurable Cache with Max Size
from functools import wraps

def cache(maxsize=128):
    def decorator(func):
        storage = {}
        @wraps(func)
        def wrapper(*args):
            if args in storage:
                return storage[args]
            if len(storage) >= maxsize:
                oldest_key = next(iter(storage))
                del storage[oldest_key]
            result = func(*args)
            storage[args] = result
            return result
        wrapper.cache_info = lambda: {
            "size": len(storage),
            "maxsize": maxsize
        }
        wrapper.cache_clear = lambda: storage.clear()
        return wrapper
    return decorator

@cache(maxsize=64)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(30))
print(fibonacci.cache_info())
This factory creates a simple size-bounded cache scoped to each decorated function. (Eviction removes the oldest inserted entry, so it is FIFO rather than true LRU; a real LRU would reorder entries on access.) The storage dictionary lives inside the decorator's closure, private to each function the factory decorates. The factory argument maxsize controls eviction behavior through the closure chain.
Type-Safe Factories with ParamSpec
Python 3.10 introduced ParamSpec from typing, which lets you preserve the decorated function's exact signature through the factory. Without ParamSpec, type checkers lose track of argument types after decoration. With it, the wrapper's parameter and return types stay correct:
from functools import wraps
from typing import Callable, TypeVar, ParamSpec

P = ParamSpec("P")
R = TypeVar("R")

def retry(max_attempts: int = 3) -> Callable[[Callable[P, R]], Callable[P, R]]:
    def decorator(func: Callable[P, R]) -> Callable[P, R]:
        @wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            last_exc: Exception | None = None
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exc = e
            raise last_exc  # type: ignore[misc]
        return wrapper
    return decorator

@retry(max_attempts=5)
def divide(a: float, b: float) -> float:
    return a / b
The return type annotation on the factory, Callable[[Callable[P, R]], Callable[P, R]], tells the type checker that the factory returns a decorator. That decorator takes a Callable[P, R] and returns a Callable[P, R] with the same parameter types and return type preserved. The wrapper uses P.args and P.kwargs to forward the exact signature, so a type checker like mypy or Pyright can verify that callers of divide pass the correct argument types even through the decorator.
ParamSpec requires Python 3.10 or later. For earlier versions, install typing_extensions and import ParamSpec from there. The runtime behavior is identical; only the type checker integration differs.
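A common compatibility sketch for supporting both older and newer interpreters (assuming typing_extensions is installed on the pre-3.10 ones) is a version-gated import:

```python
import sys

if sys.version_info >= (3, 10):
    from typing import ParamSpec
else:
    # Requires: pip install typing_extensions
    from typing_extensions import ParamSpec

P = ParamSpec("P")  # usable identically in either case
```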
When Not to Use a Factory
A decorator factory is not always the right tool. Reaching for one reflexively when a simpler pattern would suffice adds nesting complexity without proportional benefit. Here are four alternatives to evaluate before committing to a factory, and the specific conditions under which each one wins.
functools.partial as a Lighter Alternative
If your decorator already accepts the target function as its first argument and your configuration is limited to a few keyword arguments, functools.partial can bind those arguments without adding a nesting layer at all:
from functools import partial, wraps

def _log_impl(func, *, prefix="LOG"):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(f"[{prefix}] {func.__name__} called")
        return func(*args, **kwargs)
    return wrapper

log_debug = partial(_log_impl, prefix="DEBUG")
log_audit = partial(_log_impl, prefix="AUDIT")

@log_debug
def task():
    return "done"
This avoids the three-layer nesting entirely. The tradeoff is that you lose the @factory(args) syntax: you need a named partial for each configuration, which works well for a handful of presets but becomes unwieldy if the configuration space is large.
Default Arguments on the Decorator
If your decorator only needs one or two configuration options and the defaults cover the common case, you can skip the factory entirely and place the configuration as default arguments on the decorator itself, using the optional-arguments pattern shown earlier. This works well when most call sites use the default and only a few need customization.
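A minimal sketch of this approach (the `log_result` decorator and its level option are illustrative): the common case needs no parentheses at all, and the rare case passes a keyword.

```python
from functools import wraps

def log_result(func=None, *, level="INFO"):
    """Decorator with configuration carried by keyword-only defaults."""
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            result = f(*args, **kwargs)
            print(f"[{level}] {f.__name__} -> {result!r}")
            return result
        return wrapper
    if func is not None:
        return decorator(func)   # used as @log_result
    return decorator             # used as @log_result(level="DEBUG")

@log_result                      # the common case: defaults, no parentheses
def add(a, b):
    return a + b

@log_result(level="DEBUG")       # the rare case: customized
def mul(a, b):
    return a * b

add(2, 3)
mul(2, 3)
```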
Module-Level Constants
If the configuration is truly global and never varies between decorated functions, a module-level constant is simpler and more explicit than threading the value through a closure:
from functools import wraps

MAX_RETRIES = 3

def retry(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        for attempt in range(MAX_RETRIES):
            try:
                return func(*args, **kwargs)
            except Exception:
                if attempt == MAX_RETRIES - 1:
                    raise
    return wrapper
This trades configurability for clarity. If the value never changes per-function, the factory layer is ceremony without purpose.
The Decision Framework
Use a factory when the configuration varies per decorated function, when the @factory(args) syntax improves call-site readability, or when the factory needs to capture state that must be private to each decorated function. Use functools.partial when you have a small number of fixed presets. Use default arguments when the common case needs no configuration. Use module constants when configuration is global and static. The wrong choice does not break your code, but it does affect how easily the next person can read it.
Common Mistakes
Forgetting to Call the Factory
The single most frequent error with decorator factories is forgetting the parentheses:
# WRONG: passes the function to the factory as the first argument
@repeat
def my_func():
    pass

# CORRECT: calls the factory, which returns the decorator
@repeat(3)
def my_func():
    pass
Without parentheses, repeat receives my_func as its n argument. The decorator function returned by the factory then waits for a function that never arrives, and calling my_func() produces a TypeError. Use the optional-arguments pattern shown earlier if you need both forms to work. If you are stacking multiple factories on a single function, see chained decorator execution order for how Python resolves the nesting.
Placing State in the Wrong Scope
from functools import wraps

def count_calls(label):
    call_count = 0  # lives in the factory scope
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal call_count
            call_count += 1
            print(f"[{label}] call #{call_count}")
            return func(*args, **kwargs)
        return wrapper
    return decorator
The call_count variable is in the factory's scope, not the decorator's. If you call count_calls("API") once and apply the returned decorator to two different functions, they share the same counter, because both wrappers rebind the same call_count cell through nonlocal. To give each function its own counter, move the variable inside the decorator function instead.
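The corrected version moves the counter into the decorator layer, so each application of the decorator creates a fresh cell. A sketch (function names are illustrative):

```python
from functools import wraps

def count_calls(label):
    def decorator(func):
        call_count = 0               # now lives in the decorator scope
        @wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal call_count
            call_count += 1
            print(f"[{label}] {func.__name__} call #{call_count}")
            return func(*args, **kwargs)
        return wrapper
    return decorator

audit = count_calls("API")           # one factory call, reused twice

@audit
def alpha():
    pass

@audit
def beta():
    pass

alpha(); alpha(); beta()  # alpha counts #1 and #2; beta starts at #1
```

Because call_count is created inside decorator, each decorated function gets its own counter even when the decorator object itself is shared.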
Missing Return in the Wrapper
A wrapper that calls func(*args, **kwargs) without a return statement silently discards the return value. In a factory, this bug is harder to spot because the nesting makes the wrapper less visible during code review.
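A minimal sketch of the bug and its fix (the `broken` and `fixed` factory names are illustrative):

```python
from functools import wraps

def broken(prefix):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            func(*args, **kwargs)          # BUG: result is discarded
        return wrapper
    return decorator

def fixed(prefix):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)   # result is forwarded
        return wrapper
    return decorator

@broken("X")
def f():
    return 42

@fixed("X")
def g():
    return 42

print(f())  # None -- the value was silently dropped
print(g())  # 42
```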
Key Takeaways
- A decorator factory returns a decorator: It adds one layer of nesting on top of the standard decorator pattern. The factory captures configuration, the decorator captures the function, and the wrapper executes around it.
- Closures connect the layers: Each inner function retains access to the variables of its enclosing scope. The factory arguments, the target function, and the call-time arguments are all accessible through the closure chain without any global state.
- Always use functools.wraps: Apply @wraps(func) on the wrapper, not the decorator. This preserves __name__, __doc__, __module__, and __wrapped__ through the factory layer.
- The optional-arguments pattern allows both forms: Using func=None with a keyword-only separator lets a single factory work as @deco and @deco(args).
- Classes work as factories too: A class with __init__ for configuration and __call__ for decoration is cleaner when the factory needs complex logic or mutable state.
- ParamSpec preserves type safety: Annotating factories with ParamSpec and TypeVar ensures that type checkers can verify decorated functions' signatures end-to-end.
Decorator factories are the standard approach when you need configurable behavior injected at decoration time. Whether you use the function-based triple-nesting pattern or a class-based approach, the principle is the same: accept configuration, return a decorator, and let closures connect the layers.
How to Build a Decorator Factory
- Define the outer factory function. Write the outermost function that accepts your configuration arguments (such as a prefix string or retry count). This function will return a decorator.
- Define the decorator inside the factory. Inside the factory, define a decorator function that accepts the target function as its single argument. This middle layer bridges configuration and execution.
- Define the wrapper inside the decorator. Inside the decorator, define a wrapper function that accepts *args and **kwargs. The wrapper executes your custom logic around the original function call and returns its result.
- Apply functools.wraps to the wrapper. Decorate the wrapper with @functools.wraps(func) to copy __name__, __doc__, and other metadata from the original function to the wrapper.
- Return the wrapper from the decorator, and the decorator from the factory. The decorator returns the wrapper, and the factory returns the decorator. When used as @factory(args), Python calls the factory, receives the decorator, and applies it to the target function.
Frequently Asked Questions
What is a decorator factory in Python?
A decorator factory is a function that accepts arguments and returns a decorator. It adds an outer layer of nesting around the standard decorator pattern, creating three nested functions: the factory (accepts configuration), the decorator (accepts the function), and the wrapper (executes around the function). This lets you parameterize decorator behavior at decoration time.
Why do decorator factories need three nested functions?
The three layers serve distinct purposes. The outermost function (the factory) receives the configuration arguments and captures them in a closure. The middle function (the decorator) receives the function being decorated. The innermost function (the wrapper) executes around the original function at call time, with access to both the configuration and the original function through closure scoping.
How do I make a decorator that works both with and without arguments?
Use the optional-arguments pattern: define the factory with func=None as the first parameter, followed by keyword-only arguments using the * separator. If func is not None, the decorator was applied without parentheses, so return the decorator applied directly. If func is None, arguments were passed, so return the decorator for later application.
Can a class be used as a decorator factory in Python?
Yes. A class whose __init__ accepts configuration arguments and whose __call__ method accepts and wraps a function works as a decorator factory. When you write @ClassName(args), Python calls __init__ with the arguments, creating an instance. Then it calls __call__ on that instance with the decorated function, which returns the wrapper.
What is the difference between a decorator and a decorator factory?
A decorator is a function that takes a single function as input and returns a modified version of it. A decorator factory is a function that takes configuration arguments and returns a decorator. The factory adds one extra layer of nesting: you call the factory with your arguments, and it gives back a decorator that you then apply to the target function. The syntax @decorator applies a decorator directly, while @factory(args) calls the factory first and applies the returned decorator second.
Why is functools.wraps important in a decorator factory?
functools.wraps copies metadata like __name__, __doc__, __module__, and __qualname__ from the original function to the wrapper function. Without it, the decorated function loses its identity and appears to have the wrapper's name and docstring instead of its own. In a decorator factory, place @wraps(func) on the innermost wrapper function, not on the decorator, because the wrapper is what replaces the original function.