Decorators, callbacks, map(), sorted() with a key argument, event handlers in web frameworks -- all of these depend on the same underlying rule. That rule is not a special feature bolted onto Python's function system. It is a consequence of a single design decision made at the language's foundation: functions are objects. This article traces that rule from the ground up, through the object model, the callable protocol, and into the practical patterns it makes possible.
In languages like C, functions and data live in separate worlds. A function exists at a fixed address in memory, and you pass it around using a function pointer, which is a different kind of thing from a regular variable. In Python, there is no such separation. A function defined with def produces a value -- an object -- that is assigned to a name in the current scope, just like x = 42 assigns an integer object to the name x. That uniformity is the entire foundation.
Everything Is an Object, Including Functions
In Python, every piece of data is an object. Every object has three properties: an identity (its address in memory, retrievable with id()), a type (retrievable with type()), and a value. This applies to integers, strings, lists, and crucially, to functions.
def greet(name):
    """Return a greeting string."""
    return f"Hello, {name}"

# A function has an identity (memory address)
print(id(greet))        # e.g., 140234866219040

# A function has a type
print(type(greet))      # <class 'function'>

# A function has attributes, just like any object
print(greet.__name__)   # greet
print(greet.__doc__)    # Return a greeting string.
print(greet.__code__)   # <code object greet at 0x..., file "...", line 1>
When Python executes the def greet(name): statement, it does not just register a function somewhere. It creates a function object, compiles the body into a code object stored in greet.__code__, and binds the resulting function object to the name greet in the current namespace. The name greet is a variable, and its value is a function object.
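That compiled code object can be inspected directly through documented attributes on greet.__code__. A quick sketch:

```python
import dis

def greet(name):
    return f"Hello, {name}"

# The compiled body lives in the code object
code = greet.__code__
print(code.co_name)      # greet
print(code.co_argcount)  # 1
print(code.co_varnames)  # ('name',)

# dis renders the compiled bytecode (output varies by Python version)
dis.dis(greet)
```

The function object is the outer shell that binds the code object to a name, default arguments, and an environment; the code object holds only the compiled body.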
This means functions participate in the same name-binding mechanics as every other value:
def greet(name):
    return f"Hello, {name}"

# Assign the function object to another name
say_hello = greet

# Both names reference the same object
print(say_hello is greet)          # True
print(id(say_hello) == id(greet))  # True

# Calling through either name executes the same function
print(say_hello("Alice"))  # Hello, Alice
print(greet("Alice"))      # Hello, Alice

# Delete the original name; the object survives
del greet
print(say_hello("Bob"))    # Hello, Bob
After del greet, the original name is gone, but the function object itself remains alive because say_hello still references it. The object persists as long as at least one reference to it exists. This is the same reference-counting behavior that applies to every Python object.
Under the hood in CPython, every Python object (including functions) is represented by a C struct that begins with a PyObject header containing two fields: ob_refcnt (the reference count) and ob_type (a pointer to the object's type). This uniform representation is what makes "everything is an object" possible at the implementation level.
Reference vs Call: The Parentheses Rule
The single piece of syntax that trips up newcomers is the distinction between referencing a function and calling a function. Writing the name without parentheses gives you the object. Writing it with parentheses executes the object and gives you its return value.
def compute():
    return 42

# Reference: the function object itself
print(compute)          # <function compute at 0x...>
print(type(compute))    # <class 'function'>

# Call: the function's return value
print(compute())        # 42
print(type(compute()))  # <class 'int'>
When you pass a function as an argument, you always pass the reference (without parentheses). If you accidentally include parentheses, you pass the result of calling the function, not the function itself:
def double(x):
    return x * 2

def apply(func, value):
    """Call func with value and return the result."""
    return func(value)

# Correct: pass the function object
result = apply(double, 5)
print(result)  # 10

# Wrong: pass the return value of double(5), which is 10
# apply receives 10 (an int), then tries to call 10(5)
# result = apply(double(5), 5)  # TypeError: 'int' object is not callable
This distinction is the bridge between "functions are objects" and "functions can be passed as arguments." Because double without parentheses evaluates to a function object, and function objects are values, they can be passed anywhere any other value can be passed.
The Callable Protocol
What makes a function object callable is not something magical about the function type. It is a protocol. Any object whose type defines a __call__ method can be called with parentheses. Functions happen to implement this protocol, but so can any class you write:
class Multiplier:
    def __init__(self, factor):
        self.factor = factor

    def __call__(self, value):
        return value * self.factor

triple = Multiplier(3)

# triple is an object, not a function defined with def
print(type(triple))     # <class '__main__.Multiplier'>

# But it is callable because Multiplier defines __call__
print(callable(triple))  # True
print(triple(10))        # 30

# It can be passed as an argument anywhere a function is expected
numbers = [1, 2, 3, 4, 5]
print(list(map(triple, numbers)))  # [3, 6, 9, 12, 15]
The built-in callable() function checks whether an object has this protocol. Under the hood in CPython, calling an object triggers the tp_call slot on the object's type struct. Every call in Python -- function calls, method calls, class instantiation, calling an object with __call__ -- goes through this single mechanism.
When you write MyClass(arg) to create an instance, you are calling the class object. Classes are callable because their metaclass (type) defines __call__. The entire instantiation process -- allocating memory, calling __new__, then __init__ -- is triggered by this same callable protocol.
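To make that concrete, a custom metaclass can override __call__ and intercept instantiation. This is an illustrative sketch, not a pattern the article depends on; LoggingMeta and Point are invented names:

```python
class LoggingMeta(type):
    def __call__(cls, *args, **kwargs):
        # Runs when the *class* is called, before __new__ and __init__
        print(f"Instantiating {cls.__name__} with args={args}")
        return super().__call__(*args, **kwargs)

class Point(metaclass=LoggingMeta):
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
# Instantiating Point with args=(1, 2)
print(p.x, p.y)  # 1 2
```

Point(1, 2) triggers LoggingMeta.__call__ because Point is an instance of LoggingMeta, exactly as triple(10) triggers Multiplier.__call__ because triple is an instance of Multiplier.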
This means "passable as an argument" is not limited to def functions. It applies to any callable: lambdas, methods, classes, and instances with __call__. They are all objects, and they all implement the callable protocol.
# All of these are callable objects that can be passed as arguments
def func(x):
    return x + 1

lam = lambda x: x + 1

class Adder:
    def __call__(self, x):
        return x + 1

adder_instance = Adder()

for fn in [func, lam, Adder, adder_instance, int, len]:
    print(f"{str(fn):.<45} callable={callable(fn)}")

# <function func at 0x...>..................... callable=True
# <function <lambda> at 0x...>................. callable=True
# <class '__main__.Adder'>..................... callable=True
# <__main__.Adder object at 0x...>............. callable=True
# <class 'int'>................................ callable=True
# <built-in function len>...................... callable=True
Higher-Order Functions in Practice
A function that takes another function as an argument, or returns a function as its result, is called a higher-order function. This pattern is the direct consequence of first-class functions. Python's standard library relies on it heavily.
Passing Functions to Built-in Functions
# sorted() accepts a key function
words = ["banana", "apple", "cherry", "date"]
by_length = sorted(words, key=len)
print(by_length) # ['date', 'apple', 'banana', 'cherry']
# map() applies a function to every element
temperatures_c = [0, 20, 37, 100]
to_fahrenheit = lambda c: c * 9/5 + 32
temperatures_f = list(map(to_fahrenheit, temperatures_c))
print(temperatures_f) # [32.0, 68.0, 98.6, 212.0]
# filter() keeps elements where the function returns True
numbers = range(20)
is_even = lambda n: n % 2 == 0
evens = list(filter(is_even, numbers))
print(evens) # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
In each case, the built-in function receives a callable object and invokes it internally. The built-in does not care whether it received a def function, a lambda, a method, or a class with __call__. It calls whatever it was given.
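Two callable flavors worth singling out here are methods accessed on a class and bound methods, both of which slot into the same argument positions. A short sketch:

```python
# str.lower accessed on the class is a plain one-argument callable,
# so it works directly as a key function
words = ["Banana", "apple", "Cherry", "date"]
print(sorted(words, key=str.lower))  # ['apple', 'Banana', 'Cherry', 'date']

# A bound method is a callable that carries its instance with it:
# log.append already knows which list to append to
log = []
record = log.append
record("start")
record("stop")
print(log)  # ['start', 'stop']
```

Passing log.append as a callback is a common idiom precisely because the bound method packages the function and its target object into a single value.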
Writing Your Own Higher-Order Functions
def retry(func, attempts=3):
    """Try calling func up to `attempts` times, returning the first success."""
    for i in range(attempts):
        try:
            return func()
        except Exception as e:
            last_error = e
            print(f"Attempt {i + 1} failed: {e}")
    raise last_error

import random

def unreliable_fetch():
    if random.random() < 0.7:
        raise ConnectionError("Network timeout")
    return {"status": "ok", "data": [1, 2, 3]}

# Pass the function object (no parentheses) to retry
result = retry(unreliable_fetch, attempts=5)
print(result)
The retry function knows nothing about what func does. It only knows that func is callable. This separation of concerns -- the retry logic is independent of the business logic -- is only possible because the business logic can be passed in as a value.
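Since retry expects a zero-argument callable, functions that need arguments can be adapted with functools.partial, which freezes arguments in advance and returns a new callable. A sketch with a hypothetical fetch_user and a simplified retry (no logging, invented names):

```python
import functools

def retry(func, attempts=3):
    """Call func until it succeeds, up to `attempts` times."""
    for _ in range(attempts):
        try:
            return func()
        except Exception as e:
            last_error = e
    raise last_error

calls = {"n": 0}

def fetch_user(user_id):
    # Hypothetical fetcher: fails on the first call, succeeds afterwards
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("timeout")
    return {"id": user_id}

# partial freezes user_id, producing a zero-argument callable for retry
fetch_alice = functools.partial(fetch_user, 42)
result = retry(fetch_alice, attempts=3)
print(result)  # {'id': 42}
```

The partial object is itself just another callable value, so it composes with retry without either side knowing about the other.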
Storing Functions in Data Structures
Because functions are objects, they can be stored in lists, dictionaries, or any other container:
def add(a, b):
    return a + b

def subtract(a, b):
    return a - b

def multiply(a, b):
    return a * b

operations = {
    "+": add,
    "-": subtract,
    "*": multiply,
}

def calculate(expression):
    """Parse and evaluate a simple 'a op b' expression."""
    parts = expression.split()
    a, op, b = int(parts[0]), parts[1], int(parts[2])
    func = operations.get(op)
    if func is None:
        raise ValueError(f"Unknown operator: {op}")
    return func(a, b)

print(calculate("10 + 5"))  # 15
print(calculate("10 - 3"))  # 7
print(calculate("10 * 4"))  # 40
The dictionary maps operator strings to function objects. Looking up operations["+"] returns the add function object, which is then called with the operands. This dispatch pattern eliminates long chains of if/elif statements and makes it trivial to add new operations by inserting new entries into the dictionary.
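The same table can be built without hand-written wrappers, because the standard library's operator module already exposes the arithmetic operators as ordinary function objects:

```python
import operator

# operator.add, operator.sub, etc. are plain callables,
# so they can sit in the dispatch table directly
operations = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "/": operator.truediv,
}

print(operations["+"](10, 5))  # 15
print(operations["/"](10, 4))  # 2.5

# Adding an operation is one dictionary entry
operations["**"] = operator.pow
print(operations["**"](2, 10))  # 1024
```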
Closures: Functions That Remember
Because functions are objects, they can be created inside other functions and returned as values. When an inner function references variables from its enclosing scope, it captures those variables. The resulting combination of the inner function and its captured variables is called a closure.
def make_greeter(greeting):
    """Return a function that greets with a specific greeting."""
    def greeter(name):
        return f"{greeting}, {name}!"
    return greeter

hello = make_greeter("Hello")
howdy = make_greeter("Howdy")

print(hello("Alice"))  # Hello, Alice!
print(howdy("Bob"))    # Howdy, Bob!

# The inner function carries its captured variable
print(hello.__closure__)
# (<cell at 0x...: str object at 0x...>,)
print(hello.__closure__[0].cell_contents)
# Hello
When make_greeter("Hello") returns, its local variable greeting would normally be garbage collected. But the inner greeter function references greeting, so Python keeps it alive inside a cell object stored in the function's __closure__ attribute. The function object carries its environment with it.
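Captured variables can even be rebound from inside the closure with the nonlocal statement, which turns a closure into a small piece of private mutable state. A counter sketch:

```python
def make_counter():
    count = 0  # captured in a cell shared with the inner function
    def counter():
        nonlocal count  # rebind the enclosing variable, not a new local
        count += 1
        return count
    return counter

tick = make_counter()
print(tick())  # 1
print(tick())  # 2

# A second counter gets its own independent cell
other = make_counter()
print(other())  # 1
print(tick())   # 3
```

Without nonlocal, the assignment count += 1 would make count a local variable of counter and raise UnboundLocalError on the first call.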
Closures are the mechanism behind function factories, a pattern where one function produces customized functions on demand:
def make_validator(min_val, max_val):
    """Return a function that checks if a value is within range."""
    def validate(value):
        if not (min_val <= value <= max_val):
            raise ValueError(
                f"{value} is outside [{min_val}, {max_val}]"
            )
        return value
    return validate

validate_percentage = make_validator(0, 100)
validate_temperature = make_validator(-273.15, 1000)

print(validate_percentage(85))    # 85
print(validate_temperature(-40))  # -40

try:
    validate_percentage(150)
except ValueError as e:
    print(e)  # 150 is outside [0, 100]
Each call to make_validator creates a new function object with its own captured min_val and max_val. The returned functions are independent of each other, each carrying its own enclosed state.
From First-Class Functions to Decorators
Decorators are the highest-profile application of first-class functions in Python. A decorator is a function that takes a function as its argument and returns a new function (or the same function, modified). Every piece of this sentence depends on functions being objects.
import functools
import time
def timer(func):
    """Decorator that prints how long a function takes to run."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def slow_sum(n):
    return sum(range(n))

print(slow_sum(10_000_000))
# slow_sum took 0.1823s
# 49999995000000
Breaking down what happens step by step: Python executes def slow_sum(n): and creates a function object. The @timer syntax passes that function object to timer() as the argument func. Inside timer, a new function object wrapper is created. wrapper is a closure that captures the original func. timer returns wrapper. Python rebinds the name slow_sum to the returned wrapper object.
Every step requires functions to be objects: passing slow_sum as an argument to timer, creating wrapper inside timer, capturing func in the closure, returning wrapper as a value, and assigning it back to the name slow_sum.
Without the decorator syntax, the same operation looks like this, making the first-class nature even more explicit:
def slow_sum(n):
    return sum(range(n))

# This is exactly what @timer does:
slow_sum = timer(slow_sum)
The expression timer(slow_sum) passes a function object as an argument. The expression slow_sum = timer(slow_sum) assigns the returned function object to a variable. Both operations are only possible because functions are objects.
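One detail worth isolating is the functools.wraps line inside timer. Because the name is rebound to wrapper, the original function's metadata would otherwise be lost. A minimal sketch contrasting a decorator with and without it:

```python
import functools

def plain(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def preserving(func):
    @functools.wraps(func)  # copies __name__, __doc__, etc. from func
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@plain
def first():
    """Original docstring."""

@preserving
def second():
    """Original docstring."""

print(first.__name__)   # wrapper  -- metadata lost
print(second.__name__)  # second   -- metadata preserved
print(second.__doc__)   # Original docstring.
```

functools.wraps is itself a decorator applied to wrapper, so even the fix for this metadata problem is an application of first-class functions.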
Key Takeaways
- The fundamental rule is that functions are objects. Every def statement creates an object of type function with an identity, a type, and attributes. This is not a special feature -- it is a consequence of Python's unified object model where everything is an object.
- Referencing a function and calling a function are different operations. func gives you the object. func() executes the object and gives you its return value. Passing a function as an argument requires the reference form without parentheses.
- The callable protocol unifies all invocation. Functions, lambdas, methods, classes, and objects with __call__ are all callable through the same mechanism. The callable() built-in checks for this capability.
- Higher-order functions are the direct consequence. Because functions are values, they can be passed to other functions, returned from functions, and stored in data structures. This enables patterns like callbacks, dispatch tables, function factories, and decorators.
- Closures keep functions connected to their creation context. When an inner function captures variables from an enclosing scope, those variables are preserved in the function's __closure__ attribute, keeping them alive as long as the function object exists.
Everything in Python's higher-order ecosystem -- decorators, functools.wraps, map, filter, sorted with key, callback registrations in web frameworks, event-driven architectures -- traces back to one decision: def creates an object. Once that rule is internalized, the rest is just applying it.