Python decorators that accept arguments can be written as either triple-nested functions or as classes that implement __init__ and __call__. Both approaches produce the same external behavior: the decorator accepts configuration, wraps a function, and returns a modified callable. The difference is internal. Function-based decorators use closures for state. Class-based decorators use instance attributes. This article builds the same decorator both ways, adds stateful behavior, demonstrates inheritance, and provides a clear decision framework for choosing between them.
A decorator is any callable that accepts a function and returns a callable. Functions are callable. Instances of classes that implement __call__ are also callable. Python does not care which one you use. The @ syntax calls whatever follows it, and as long as that callable returns something callable, the decoration succeeds. This flexibility is what makes class-based decorators possible and why they are interchangeable with function-based ones from the caller's perspective.
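As a minimal sketch of this equivalence (`shout` and `Shout` are illustrative names, not from the examples below), both a plain function and an instance of a class with `__call__` can sit after the `@`:

```python
def shout(func):
    # A plain function used as a decorator
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

class Shout:
    # An instance of this class is an equally valid decorator
    def __call__(self, func):
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs).upper()
        return wrapper

@shout
def greet_fn():
    return "hello"

@Shout()
def greet_cls():
    return "hello"

print(greet_fn())   # HELLO
print(greet_cls())  # HELLO
```

From the caller's side the two decorated functions are indistinguishable; only the machinery behind the `@` differs.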
The Same Decorator, Two Ways
To make the comparison concrete, here is a log_calls decorator that accepts a level parameter and prints a message before and after the decorated function runs. First, the function-based version:
```python
from functools import wraps

def log_calls(level="INFO"):                  # Level 1: factory
    def decorator(func):                      # Level 2: decorator
        @wraps(func)
        def wrapper(*args, **kwargs):         # Level 3: wrapper
            print(f"[{level}] Entering {func.__name__}")
            result = func(*args, **kwargs)
            print(f"[{level}] Exiting {func.__name__}")
            return result
        return wrapper
    return decorator

@log_calls(level="DEBUG")
def process_data(payload):
    return {"processed": payload}

print(process_data({"key": "value"}))
# [DEBUG] Entering process_data
# [DEBUG] Exiting process_data
# {'processed': {'key': 'value'}}
```
Now the same behavior implemented as a class:
```python
from functools import wraps

class LogCalls:
    def __init__(self, level="INFO"):         # Accepts parameters
        self.level = level

    def __call__(self, func):                 # Accepts the function
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(f"[{self.level}] Entering {func.__name__}")
            result = func(*args, **kwargs)
            print(f"[{self.level}] Exiting {func.__name__}")
            return result
        return wrapper

@LogCalls(level="DEBUG")
def process_data(payload):
    return {"processed": payload}

print(process_data({"key": "value"}))
# [DEBUG] Entering process_data
# [DEBUG] Exiting process_data
# {'processed': {'key': 'value'}}
```
The output is identical. The usage syntax is identical. The difference is structural: the function version uses three nested def statements with level captured in a closure. The class version uses __init__ to store level as an instance attribute and __call__ to accept the function and return the wrapper. The class has two levels of visual nesting instead of three.
How the Class-Based Mechanism Works
When Python encounters @LogCalls(level="DEBUG"), it evaluates LogCalls(level="DEBUG") first. This creates an instance of LogCalls by calling __init__ with level="DEBUG". The instance stores self.level = "DEBUG". Python then calls the instance with the decorated function: instance(process_data). This triggers __call__, which receives process_data as func, creates the wrapper, and returns it. The name process_data is rebound to the wrapper.
```python
# What Python does behind the scenes:

# Step 1: Create the instance (calls __init__)
instance = LogCalls(level="DEBUG")

# Step 2: Call the instance with the function (calls __call__)
process_data = instance(process_data)

# Combined:
# process_data = LogCalls(level="DEBUG")(process_data)
```
This two-step process mirrors the function-based factory pattern, where log_calls(level="DEBUG") returns a decorator, and the decorator is called with the function. The class-based version makes the two steps more explicit: construction is separate from call.
The key structural difference: in a function-based decorator with arguments, the parameters live in a closure. In a class-based decorator, the parameters live in self. Both are accessible inside the wrapper, but self attributes are visible to external code and testing tools, while closure variables are not.
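To see that visibility difference directly, here is a short sketch (using simplified variants of `log_calls` and `LogCalls`) that reads the configuration from an instance attribute and then digs the same value out of the closure, which requires CPython introspection internals:

```python
from functools import wraps

def log_calls(level="INFO"):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(f"[{level}] {func.__name__}")
            return func(*args, **kwargs)
        return wrapper
    return decorator

class LogCalls:
    def __init__(self, level="INFO"):
        self.level = level
    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            print(f"[{self.level}] {func.__name__}")
            return func(*args, **kwargs)
        return wrapper

@log_calls(level="DEBUG")
def func_version():
    pass

logger = LogCalls(level="DEBUG")

@logger
def class_version():
    pass

# Class-based: configuration is an ordinary, public attribute
print(logger.level)  # DEBUG

# Function-based: the same value is buried in closure cells,
# reachable only via __code__.co_freevars and __closure__
closure_vars = {name: cell.cell_contents
                for name, cell in zip(func_version.__code__.co_freevars,
                                      func_version.__closure__)}
print(closure_vars["level"])  # DEBUG
```

The closure route works, but it depends on interpreter internals and cell ordering; the instance attribute is a stable, documented access path.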
Stateful Decorators: Where Classes Excel
Decorators that need to track state across invocations expose the clearest difference between the two approaches. A call counter that records how many times each decorated function has been called illustrates this well.
Function-Based: State via nonlocal
```python
from functools import wraps

def count_calls(threshold=10):
    def decorator(func):
        call_count = 0

        @wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal call_count
            call_count += 1
            if call_count > threshold:
                print(f"[WARN] {func.__name__} called {call_count} times "
                      f"(threshold: {threshold})")
            return func(*args, **kwargs)
        return wrapper
    return decorator

@count_calls(threshold=3)
def fetch(url):
    return f"Response from {url}"

for i in range(5):
    fetch("https://api.example.com")
# [WARN] fetch called 4 times (threshold: 3)
# [WARN] fetch called 5 times (threshold: 3)
```
The call_count variable lives inside the decorator closure, and the nonlocal keyword is required so the wrapper can rebind it. This works, but the counter is invisible from outside: external code has no way to read call_count, reset it, or assert on it in a test.
Class-Based: State via Instance Attributes
```python
from functools import wraps

class CountCalls:
    def __init__(self, threshold=10):
        self.threshold = threshold
        self.call_count = 0

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            self.call_count += 1
            if self.call_count > self.threshold:
                print(f"[WARN] {func.__name__} called "
                      f"{self.call_count} times "
                      f"(threshold: {self.threshold})")
            return func(*args, **kwargs)
        return wrapper

    def reset(self):
        """Reset the counter to zero."""
        self.call_count = 0

counter = CountCalls(threshold=3)

@counter
def fetch(url):
    return f"Response from {url}"

for i in range(5):
    fetch("https://api.example.com")
# [WARN] fetch called 4 times (threshold: 3)
# [WARN] fetch called 5 times (threshold: 3)

# The state is visible and controllable from outside
print(f"Total calls: {counter.call_count}")  # Total calls: 5
counter.reset()
print(f"After reset: {counter.call_count}")  # After reset: 0
```
The class version exposes call_count as a public attribute and provides a reset() method. External code can read the counter, reset it, or modify the threshold without accessing internal closure variables. The decorator instance counter is a first-class object that can be passed around, inspected, and tested independently.
When using a class-based decorator, assign the instance to a variable before applying it with @. This gives you a reference to the instance for later inspection: counter = CountCalls(threshold=3) followed by @counter.
Extending Decorators Through Inheritance
Class-based decorators can be subclassed, which allows you to create specialized decorators from a general-purpose base. Function-based decorators do not support inheritance because functions cannot be subclassed.
```python
import time
from functools import wraps

class TimedDecorator:
    """Base class: times the decorated function."""
    def __init__(self, label="TIMER"):
        self.label = label

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = self.process(func, *args, **kwargs)
            elapsed = time.perf_counter() - start
            print(f"[{self.label}] {func.__name__}: {elapsed:.4f}s")
            return result
        return wrapper

    def process(self, func, *args, **kwargs):
        """Override this in subclasses to add behavior."""
        return func(*args, **kwargs)

class RetryTimedDecorator(TimedDecorator):
    """Extends TimedDecorator with retry logic."""
    def __init__(self, max_retries=3, label="RETRY"):
        super().__init__(label=label)
        self.max_retries = max_retries

    def process(self, func, *args, **kwargs):
        last_error = None
        for attempt in range(1, self.max_retries + 1):
            try:
                return func(*args, **kwargs)
            except Exception as e:
                last_error = e
                print(f"  Attempt {attempt}/{self.max_retries} failed: {e}")
        raise last_error

# Base decorator: just timing
@TimedDecorator(label="PERF")
def fast_operation():
    return sum(range(10000))

# Extended decorator: timing + retry
attempt_count = 0

@RetryTimedDecorator(max_retries=3, label="NET")
def flaky_request():
    global attempt_count
    attempt_count += 1
    if attempt_count < 3:
        raise ConnectionError("Network timeout")
    return {"status": "ok"}

print(fast_operation())
# [PERF] fast_operation: 0.0003s
# 49995000

print(flaky_request())
#   Attempt 1/3 failed: Network timeout
#   Attempt 2/3 failed: Network timeout
# [NET] flaky_request: 0.0001s
# {'status': 'ok'}
```
RetryTimedDecorator inherits the timing logic from TimedDecorator and overrides only the process method to add retry behavior. The timing wraps the entire retry sequence. To achieve this with function-based decorators, you would need to either duplicate the timing code inside the retry decorator or chain two separate decorators. Inheritance keeps the logic centralized in one place and avoids duplication.
Testing and Inspecting Decorator State
Class-based decorators are easier to test because their state is accessible through instance attributes. You can create an instance, inspect its configuration, apply it to a test function, call the function, and then verify the state changed as expected:
```python
from functools import wraps

class CountCalls:
    def __init__(self, threshold=10):
        self.threshold = threshold
        self.call_count = 0

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            self.call_count += 1
            return func(*args, **kwargs)
        return wrapper

    def reset(self):
        self.call_count = 0

# Testing the decorator independently
def test_count_calls_tracks_invocations():
    counter = CountCalls(threshold=5)

    @counter
    def dummy():
        return "ok"

    assert counter.call_count == 0
    dummy()
    assert counter.call_count == 1
    dummy()
    dummy()
    assert counter.call_count == 3
    counter.reset()
    assert counter.call_count == 0
    print("All assertions passed")

test_count_calls_tracks_invocations()
# All assertions passed
```
With a function-based decorator, the call_count variable is trapped inside a closure. You cannot access it without modifying the decorator's code to attach it to the wrapper function as an attribute, which is an extra step that the class-based approach handles naturally.
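That extra step looks like the following sketch (`count_calls` here is a simplified, argument-less variant for illustration): the closure counter is replaced with an attribute attached to the wrapper function itself:

```python
from functools import wraps

def count_calls(func):
    # Expose the counter by attaching it to the wrapper
    # as a public function attribute
    @wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.call_count += 1
        return func(*args, **kwargs)
    wrapper.call_count = 0  # initialize after defining wrapper
    return wrapper

@count_calls
def dummy():
    return "ok"

dummy()
dummy()
print(dummy.call_count)  # 2
```

This restores visibility, but the state now lives on each decorated function rather than on a decorator object, and methods like reset() still have nowhere natural to go.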
Decision Framework
| Consideration | Function-Based | Class-Based |
|---|---|---|
| Nesting depth (with args) | 3 levels (factory, decorator, wrapper) | 2 levels (`__call__` + wrapper) |
| State management | Closure + `nonlocal` | Instance attributes (`self.x`) |
| State visibility | Hidden in closure | Public on instance |
| Inheritance | Not possible | Full subclassing support |
| Testing | Must call function to observe effects | Inspect instance attributes directly |
| Code volume (simple cases) | Less code for stateless decorators | More boilerplate (class, `__init__`, `__call__`) |
| Idiomatic for Python | Standard, expected by readers | Less common, may surprise readers |
| Reset/control methods | Requires attaching to wrapper | Natural as instance methods |
Use a function-based decorator when the decorator is stateless or has simple, immutable configuration. Logging, timing, access control checks, and argument validation fall into this category. The three-level nesting is worth it for the conciseness and familiarity.
Use a class-based decorator when the decorator needs mutable state that changes across calls (counters, caches, rate-limiting windows), when you want to provide control methods (reset(), clear_cache()), when you need to subclass the decorator for specialized behavior, or when you want to inspect the decorator's configuration and state in tests without calling the decorated function.
Class-based decorators with arguments require careful attention to the __init__ / __call__ split. If you accidentally accept the function in __init__ and the arguments in __call__, the decorator will fail when used with the @ syntax. For a parameterized class decorator, the parameters go in __init__ and the function goes in __call__; the reverse split applies only to the argument-less form, where the function arrives in __init__ and __call__ acts as the wrapper.
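For contrast, here is a sketch of the argument-less form (`CountCallsBare` is an illustrative name): because there are no parameters, the function itself is what __init__ receives, and __call__ runs on every invocation:

```python
class CountCallsBare:
    """Argument-less class decorator: the function arrives in
    __init__, and __call__ itself acts as the wrapper."""
    def __init__(self, func):             # receives the function
        self.func = func
        self.call_count = 0

    def __call__(self, *args, **kwargs):  # runs on every call
        self.call_count += 1
        return self.func(*args, **kwargs)

@CountCallsBare                            # note: no parentheses
def ping():
    return "pong"

print(ping())           # pong
print(ping.call_count)  # 1
```

Keeping the two forms straight is the main pitfall: the presence or absence of parentheses after the decorator name determines which split you need.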
Key Takeaways
- Both approaches produce identical external behavior. A class-based decorator with `__init__`/`__call__` and a function-based triple-nested factory are interchangeable from the caller's perspective. The choice is about internal structure.
- Class-based decorators store state in instance attributes. This makes mutable state like counters, caches, and timestamps straightforward to manage, visible to external code, and easy to test. Function-based decorators require `nonlocal` variables for the same purpose.
- Class-based decorators support inheritance. You can build a base decorator class and subclass it for specialized behavior. Function-based decorators cannot be subclassed.
- Function-based decorators are shorter for stateless cases. When the decorator just logs, times, or validates without tracking anything across calls, the function-based approach is more concise and more familiar to Python readers.
- Always use `@functools.wraps(func)` in both approaches. The wrapper function returned from either style replaces the original function. Without `@wraps`, the decorated function loses its metadata regardless of whether the decorator is a function or a class.
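A quick sketch of what @wraps preserves (`without_wraps` and `with_wraps` are illustrative names):

```python
from functools import wraps

def without_wraps(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @wraps(func)  # copies __name__, __doc__, __module__, etc.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def alpha():
    """Original docstring."""

@with_wraps
def beta():
    """Original docstring."""

print(alpha.__name__, alpha.__doc__)  # wrapper None
print(beta.__name__, beta.__doc__)    # beta Original docstring.
```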
The two approaches to writing decorators with arguments are not competing philosophies. They are tools with different strengths. Function-based decorators are the default for simple, stateless wrappers. Class-based decorators step in when the decorator needs to be an object in its own right, with state, methods, and the ability to participate in an inheritance hierarchy. Knowing both patterns lets you choose the right tool based on the complexity of the behavior you are implementing.