Function composition is one of those ideas that sounds academic until you actually use it -- and then you wonder how you ever wrote Python without it. At its core, composition is the act of chaining two or more functions together so that the output of one becomes the input of the next, producing a brand-new callable that encapsulates the entire pipeline.
In mathematical notation, if you have two functions f and g, their composition is written as (f . g)(x) = f(g(x)). You feed x into g, take whatever g returns, and hand it straight to f. Python does not ship with a built-in compose function, and that absence is itself a story worth telling. It touches on language design philosophy, several Python Enhancement Proposals, heated mailing-list debates, and the broader tension between functional and imperative programming styles that has shaped Python for over three decades.
Why Function Composition Matters
Every Python developer composes functions whether they realize it or not. Any time you write something like sorted(map(str.strip, lines)), you are piping the output of one callable directly into another. The question is not whether composition happens in your code, but whether you have clean, reusable abstractions for expressing it.
A.M. Kuchling, in Python's official Functional Programming HOWTO, frames the paradigm this way: functional programming decomposes problems into a set of functions where input flows through each function, and each function operates on its input and produces some output. That description is practically a definition of composition itself. When you decompose a problem into small, focused functions and then snap them together like building blocks, you get code that is easier to test, easier to read, and easier to maintain.
Consider a real scenario. You are processing user-submitted form data. Each entry needs to be stripped of whitespace, converted to lowercase, and checked against a list of banned words. Without composition, you might write:
def process_entry(entry):
    stripped = entry.strip()
    lowered = stripped.lower()
    if lowered in BANNED_WORDS:
        return None
    return lowered
That works, but each transformation step is tangled inside a single function body. If you later need to add a step -- say, removing accents or normalizing Unicode -- you have to crack open process_entry and modify it. With composition, each transformation is its own callable, and the pipeline is assembled declaratively:
def strip(s):
    return s.strip()

def lower(s):
    return s.lower()

def reject_banned(s):
    return None if s in BANNED_WORDS else s
process_entry = compose(reject_banned, lower, strip)
Adding a new step means writing a new function and dropping it into the pipeline at the right position. Nothing else changes.
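To make that concrete, here is the extended pipeline with a Unicode-normalization step dropped in. This is a sketch: it assumes a minimal reduce-based compose (a full version is developed later in this article), and the normalize step and BANNED_WORDS contents are illustrative.

```python
import unicodedata
from functools import reduce

BANNED_WORDS = {"spam"}  # illustrative data

def compose(*fns):
    # Minimal right-to-left composition; the full version appears later.
    return reduce(lambda f, g: lambda x: f(g(x)), fns, lambda x: x)

def strip(s):
    return s.strip()

def lower(s):
    return s.lower()

def normalize(s):
    # The new step: Unicode NFC normalization.
    return unicodedata.normalize("NFC", s)

def reject_banned(s):
    return None if s in BANNED_WORDS else s

# Insert the new step at the right position; nothing else changes.
process_entry = compose(reject_banned, normalize, lower, strip)

print(process_entry("  Hello  "))  # hello
print(process_entry(" SPAM "))     # None
```

The pipeline definition is the only line that changed; every existing transformation stayed untouched.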
The Simplest Compose: Two Functions
The most fundamental compose takes exactly two single-argument functions and returns a new function that calls them in sequence. Here is the clearest possible implementation:
def compose2(f, g):
    def composed(x):
        return f(g(x))
    return composed
That is the entire mechanism. g runs first, f runs on the result. Let's use it:
def add_two(x):
    return x + 2

def multiply_by_three(x):
    return x * 3
add_then_multiply = compose2(multiply_by_three, add_two)
print(add_then_multiply(5)) # (5 + 2) * 3 = 21
compose2(f, g) means "apply g first, then f." This follows the mathematical convention where (f . g)(x) = f(g(x)), which reads right-to-left. The rightmost function in the argument list runs first. This is a common source of confusion, and it is one of the reasons some developers prefer the reverse ordering -- often called a "pipe" -- which reads left-to-right.
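To see the two orderings side by side, here is a minimal sketch pairing compose2 with its left-to-right mirror (pipe2 is an illustrative name, not a standard function) -- the same pipeline, spelled both ways:

```python
def compose2(f, g):
    return lambda x: f(g(x))  # right-to-left: g runs first

def pipe2(f, g):
    return lambda x: g(f(x))  # left-to-right: f runs first

def add_two(x):
    return x + 2

def multiply_by_three(x):
    return x * 3

# Same pipeline, two spellings: add two, then multiply by three.
print(compose2(multiply_by_three, add_two)(5))  # 21
print(pipe2(add_two, multiply_by_three)(5))     # 21
```

With compose2 the function that runs first sits on the right; with pipe2 it sits on the left, matching reading order.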
A practical example illustrates this well. Imagine a thermometer that reads Celsius but is slightly inaccurate, requiring an adjustment function before converting to Fahrenheit:
def celsius_to_fahrenheit(t):
    return 1.8 * t + 32

def readjust(t):
    return 0.9 * t - 0.5
convert = compose2(celsius_to_fahrenheit, readjust)
print(convert(10)) # 47.3
The order matters. compose2(celsius_to_fahrenheit, readjust) first readjusts the raw reading, then converts. Swap the arguments and you get a completely different (and incorrect) result. Composition of two functions is generally not commutative.
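A quick sketch makes the non-commutativity concrete -- the same two functions, composed in both orders:

```python
def celsius_to_fahrenheit(t):
    return 1.8 * t + 32

def readjust(t):
    return 0.9 * t - 0.5

def compose2(f, g):
    return lambda x: f(g(x))

correct = compose2(celsius_to_fahrenheit, readjust)  # readjust, then convert
swapped = compose2(readjust, celsius_to_fahrenheit)  # convert, then readjust

print(correct(10))  # 47.3  -- adjust the raw 10 to 8.5, then convert
print(swapped(10))  # 44.5  -- convert 10 to 50.0, then wrongly "adjust"
```

The swapped version applies the thermometer correction to a Fahrenheit value, which is meaningless for this sensor.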
Generalizing to N Functions with functools.reduce
Composing two functions is useful, but real pipelines often involve three, four, or more steps. The pattern for chaining compose2 calls reveals itself quickly -- it is just a left fold over a list of functions. Python's functools.reduce handles this pattern perfectly:
import functools
def compose(*functions):
    def compose2(f, g):
        return lambda x: f(g(x))
    return functools.reduce(compose2, functions, lambda x: x)
The identity function lambda x: x serves as the initial value, ensuring that compose() with no arguments returns an identity function and compose(f) behaves exactly like f (strictly speaking it is f wrapped once by the identity, not f itself). Although reduce processes functions left-to-right, each compose2 call wraps the accumulator as the outer function, which means the rightmost function in the list ends up running first -- preserving the right-to-left mathematical convention. Here it is in action:
def add_one(x):
    return x + 1

def double(x):
    return x * 2

def negate(x):
    return -x
pipeline = compose(negate, double, add_one)
print(pipeline(3)) # -((3 + 1) * 2) = -8
Reading right to left: add_one(3) gives 4, double(4) gives 8, negate(8) gives -8. This approach can also be condensed into a single line:
def compose(*fns):
    return functools.reduce(lambda f, g: lambda x: f(g(x)), fns, lambda x: x)
The PEP Trail: How Python's Standard Library Almost Got compose
Function composition has been discussed in the Python community for over two decades, and its story is intertwined with several PEPs.
PEP 309 -- Partial Function Application (2003)
The most directly relevant PEP is PEP 309, authored by Peter Harris and filed in February 2003. This proposal introduced functools.partial, the tool that lets you "freeze" some portion of a function's arguments. Even during the community feedback phase, Harris noted that the proposed functional module should contain more than just partial, explicitly naming function composition as a natural companion -- though he considered it outside PEP 309's scope.
Function composition was recognized as a natural fit as far back as 2003, yet it was never added. The PEP was accepted and applied in 2005 for Python 2.5. Notably, after acceptance, further discussion on python-dev revealed a desire for tools that operated on function objects but were not strictly related to functional programming -- so the module was renamed from functional to functools to reflect its broader focus. The implementation was proposed by Peter Harris, implemented by Hye-Shik Chang and Nick Coghlan, with adaptations by Raymond Hettinger. But compose itself never made it in.
PEP 443 -- Single-dispatch Generic Functions (2013)
PEP 443, authored by Łukasz Langa and accepted for Python 3.4, introduced functools.singledispatch. It is primarily a dispatch mechanism -- it routes function calls to different implementations based on argument type -- rather than a composition tool in the mathematical sense. Its inclusion in functools demonstrates the module's evolution as a general home for callable-transformation utilities, growing incrementally with each Python release rather than delivering a sweeping functional programming toolkit all at once. That incremental philosophy is the same reason compose still hasn't landed there.
The Ongoing functools.pipe Discussion (2024)
The desire for a standard library composition tool has not gone away. On October 31, 2024, developer dg-pb opened a thread on Python's official Discuss forum titled "functools.pipe -- Function Composition Utility," proposing a C-implemented pipe class for functools. The proposal centered on left-to-right composition, which many Python developers find more intuitive. Discussion participants raised concerns about debuggability and flexibility, pointing out that functional pipelines often require more than simple chaining -- including operations like starmap, filter, and grouping. Performance benchmarks showed a Cython implementation could sit in a useful sweet spot between readability and raw speed.
CPython Issue #116744 -- Pipe Operator for Function Composition (2024)
In March 2024, a feature request was filed on CPython's GitHub proposing support for function composition using the | pipe operator, explicitly drawing on Haskell's . composition operator as the inspiration. The proposal has not been accepted, but the discussion highlights that the community continues to push for first-class composition support in the language.
Guido's Complicated Relationship with Functional Programming
Understanding why Python still lacks a built-in compose requires understanding Guido van Rossum's views on functional programming. In his well-known March 10, 2005 blog post "The fate of reduce() in Python 3000" on Artima, Guido explained his reasoning for wanting to remove reduce(), lambda, map(), and filter() from the language entirely. His core objection was that reduce() calls with non-trivial function arguments were difficult to read:
"So now reduce(). This is actually the one I've always hated most, because, apart from a few examples involving + or *, almost every time I see a reduce() call with a non-trivial function argument, I need to grab pen and paper to diagram what's actually being fed into that function before I understand what the reduce() is supposed to do." — Guido van Rossum, The fate of reduce() in Python 3000, Artima, March 10, 2005
Guido ultimately moved reduce to functools rather than removing it entirely, and lambda, map(), and filter() stayed in the language. But the philosophical stance is clear: Python prioritizes explicit, readable code over compact functional abstractions. That readability-first philosophy is why function composition has remained a user-land pattern rather than a language feature.
Decorators: Composition's Native Habitat in Python
If Python does not have a built-in compose, it does have a mechanism that is essentially composition by another name: decorators. When you stack decorators on a function, you are composing behaviors:
import time
import functools
def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

def logger(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with {args}, {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@timer
@logger
def compute(x, y):
    return x ** y
When you call compute(2, 10), the call chain is timer(logger(compute))(2, 10). That is textbook function composition. The decorator syntax just makes it read top-to-bottom instead of inside-out, which is exactly the readability improvement that Pythonistas value.
Always use @functools.wraps(func) inside your decorator wrappers. Without it, the composed function loses its __name__ and __doc__ attributes, making debugging significantly harder.
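To verify that stacking really is composition, here is a self-contained sketch with two toy decorators (shout and exclaim are illustrative names) showing that the decorated function and the manually composed one behave identically:

```python
import functools

def shout(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

def exclaim(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs) + "!"
    return wrapper

@shout
@exclaim
def greet(name):
    return f"hello {name}"

# The decorator stack above is sugar for explicit composition:
def greet_plain(name):
    return f"hello {name}"

manual = shout(exclaim(greet_plain))

print(greet("world"))   # HELLO WORLD!
print(manual("world"))  # HELLO WORLD!
```

The decorator closest to the def (here, exclaim) is applied first, exactly like the rightmost function in a compose call.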
A Production-Grade compose with Type Hints
For production code, you want a compose function that handles *args and **kwargs on the first function, preserves introspection where possible, and plays nicely with type checkers. Here is a robust implementation using modern Python type annotations. Note the import of Callable from collections.abc rather than typing -- as of Python 3.9, typing.Callable is deprecated in favor of the collections.abc version, and tools like Ruff and mypy will flag the old import for projects targeting 3.9 or later:
import functools
from collections.abc import Callable
from typing import Any
def compose(*functions: Callable[..., Any]) -> Callable[..., Any]:
    """
    Compose functions right-to-left.

    compose(f, g, h)(x) == f(g(h(x)))

    The rightmost function may accept any arguments.
    All other functions must accept a single argument.
    """
    if not functions:
        return lambda x: x
    if len(functions) == 1:
        return functions[0]

    def composed(*args: Any, **kwargs: Any) -> Any:
        # Apply the rightmost function with all arguments
        result = functions[-1](*args, **kwargs)
        # Apply remaining functions right-to-left
        for f in reversed(functions[:-1]):
            result = f(result)
        return result

    # Preserve the name for debugging
    names = " . ".join(
        getattr(f, "__name__", repr(f)) for f in functions
    )
    composed.__name__ = f"compose({names})"
    composed.__doc__ = f"Composition of: {names}"
    return composed
The rightmost function receives all original arguments (supporting multi-argument entry points), while every subsequent function receives a single value:
def fetch_data(url, timeout=30):
    """Simulate fetching data from a URL."""
    return f"data from {url} (timeout={timeout})"

def parse(raw):
    return raw.upper()

def validate(parsed):
    if len(parsed) < 5:
        raise ValueError("Too short")
    return parsed
pipeline = compose(validate, parse, fetch_data)
result = pipeline("https://api.example.com", timeout=10)
print(result) # DATA FROM HTTPS://API.EXAMPLE.COM (TIMEOUT=10)
The pipe Alternative: Left-to-Right Composition
Many developers find right-to-left composition counterintuitive because it reverses the visual order of execution. A "pipe" function composes left-to-right instead -- matching the way data actually flows through your program and mirroring patterns familiar from Elixir's |>, F#'s pipeline operator, and Unix shell pipes. Like the compose implementation above, this imports Callable from collections.abc, which is the correct location for Python 3.9+:
from collections.abc import Callable
from typing import Any
def pipe(*functions: Callable[..., Any]) -> Callable[..., Any]:
    """
    Compose functions left-to-right (pipe).

    pipe(f, g, h)(x) == h(g(f(x)))

    The leftmost function may accept any arguments.
    All other functions must accept a single argument.
    """
    if not functions:
        return lambda x: x
    if len(functions) == 1:
        return functions[0]

    def piped(*args: Any, **kwargs: Any) -> Any:
        result = functions[0](*args, **kwargs)
        for f in functions[1:]:
            result = f(result)
        return result

    names = " | ".join(
        getattr(f, "__name__", repr(f)) for f in functions
    )
    piped.__name__ = f"pipe({names})"
    return piped
process = pipe(
    str.strip,
    str.lower,
    lambda s: s.replace(" ", "_"),
    lambda s: s[:50],
)
print(process(" Hello World ")) # hello_world
Composition with functools.partial: A Powerful Combination
Function composition becomes especially powerful when combined with functools.partial. Partial application lets you fix some arguments ahead of time, creating specialized versions of general functions that slot cleanly into a composition pipeline. Note that the example below uses def for is_high_score rather than assigning a lambda to a name -- this follows PEP 8's programming recommendations, which state: "Always use a def statement instead of an assignment statement that binds a lambda expression directly to an identifier."
from functools import partial
from operator import itemgetter
data = [
    {"user": {"name": "Alice", "score": 95}},
    {"user": {"name": "Bob", "score": 87}},
    {"user": {"name": "Charlie", "score": 92}},
]

get_user = itemgetter("user")
get_score = itemgetter("score")

def is_high_score(score):
    return score >= 90
extract_score = compose(get_score, get_user)
high_scorers = [d for d in data if is_high_score(extract_score(d))]
print(high_scorers)
# [{'user': {'name': 'Alice', 'score': 95}},
# {'user': {'name': 'Charlie', 'score': 92}}]
You can also use partial to adapt functions with multiple parameters for use inside a composition:
from functools import partial
def clamp(value, low, high):
    return max(low, min(high, value))
clamp_percentage = partial(clamp, low=0, high=100)
normalize = pipe(
    float,
    lambda x: x * 100,
    clamp_percentage,
    round,
)
print(normalize("0.873")) # 87
print(normalize("1.5")) # 100
print(normalize("-0.2")) # 0
PEP 309 made partial the cornerstone of functools, and in practice it is the tool that makes composition viable for functions that were not originally designed to participate in a pipeline.
Third-Party Libraries: The compose Package
For developers who want a battle-tested, fully-featured composition tool without writing their own, the compose package on PyPI provides exactly that. It follows the lead of functools.partial, returning callable compose objects that have a meaningful repr, retain correct signature introspection, and can be type-checked. Nested compositions also flatten automatically, so chaining compose objects does not add extra indirection layers.
pip install compose
from compose import compose
def double(x):
    return x * 2

def increment(x):
    return x + 1
double_then_increment = compose(increment, double)
print(double_then_increment(5)) # 11
# Nested compositions flatten automatically
times_eight = compose(double, double, double)
times_sixteen = compose(double, double, double, double)
times_eight_times_two = compose(double, times_eight)
print(times_eight_times_two.functions == times_sixteen.functions) # True
The library also provides acompose for composing async functions and sacompose for mixed sync/async pipelines -- functionality that is increasingly relevant as async Python becomes more widespread.
For a more fully-featured functional toolkit, the toolz library (and its C-accelerated sibling cytoolz) includes a battle-hardened compose and pipe, along with currying, memoization, and a wide range of iterator utilities. It is one of the most widely used functional programming libraries in the Python ecosystem and integrates well with data science pipelines.
Real-World Patterns: Data Processing Pipelines
Function composition truly shines in data processing. Here is a realistic example of cleaning and transforming CSV records:
import csv
import io
from functools import partial
from collections.abc import Callable
from typing import Any
# Full pipe() with edge-case handling; paste this in or import from your utils module.
def pipe(*functions: Callable[..., Any]) -> Callable[..., Any]:
    """
    Compose functions left-to-right (pipe).

    pipe(f, g, h)(x) == h(g(f(x)))

    The leftmost function may accept any arguments.
    All other functions must accept a single argument.
    """
    if not functions:
        return lambda x: x
    if len(functions) == 1:
        return functions[0]

    def piped(*args: Any, **kwargs: Any) -> Any:
        result = functions[0](*args, **kwargs)
        for f in functions[1:]:
            result = f(result)
        return result

    names = " | ".join(
        getattr(f, "__name__", repr(f)) for f in functions
    )
    piped.__name__ = f"pipe({names})"
    return piped
def read_records(csv_text):
    reader = csv.DictReader(io.StringIO(csv_text))
    return list(reader)

def filter_by(records, key, predicate):
    return [r for r in records if predicate(r.get(key, ""))]

def transform_field(records, key, func):
    # Return new dicts rather than mutating in place
    return [
        {k: func(v) if k == key else v for k, v in r.items()}
        for r in records
    ]

def sort_by(records, key, reverse=False):
    # Note: this uses string comparison on r.get(key, ""). If values may be
    # mixed types (e.g. numeric strings vs empty strings), apply an explicit
    # key function or normalize field types in an earlier pipeline stage.
    return sorted(records, key=lambda r: r.get(key, ""), reverse=reverse)
# Build the pipeline using partial + pipe
clean_pipeline = pipe(
    read_records,
    partial(filter_by, key="email", predicate=lambda e: "@" in e),
    partial(transform_field, key="name", func=str.title),
    partial(transform_field, key="email", func=str.lower),
    partial(sort_by, key="name"),
)
csv_data = """name,email
john doe,[email protected]
jane smith,[email protected]
bad record,not-an-email
BOB JONES,[email protected]"""
results = clean_pipeline(csv_data)
for r in results:
    print(r)
# {'name': 'Bob Jones', 'email': '[email protected]'}
# {'name': 'Jane Smith', 'email': '[email protected]'}
# {'name': 'John Doe', 'email': '[email protected]'}
Each step in the pipeline is independently testable. Notice that transform_field returns new dicts rather than mutating records in place -- this avoids hidden side effects and makes each stage genuinely composable. If the email validation logic changes, you swap out one partial call. If you need to add a deduplication step, you insert a new function between existing stages. The pipeline's structure makes the data flow visible at a glance.
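As a sketch of that testability claim, transform_field can be checked completely in isolation (redefined here so the snippet stands alone; the sample record is illustrative):

```python
def transform_field(records, key, func):
    # Same stage as in the pipeline above: returns new dicts, no mutation.
    return [
        {k: func(v) if k == key else v for k, v in r.items()}
        for r in records
    ]

records = [{"name": "john doe", "city": "Oslo"}]
out = transform_field(records, key="name", func=str.title)

assert out == [{"name": "John Doe", "city": "Oslo"}]
assert records[0]["name"] == "john doe"  # input untouched: no side effects
print("transform_field ok")
```

Because the stage neither reads CSV nor sorts, its test needs no fixtures beyond a one-record list -- the payoff of keeping each stage single-purpose.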
Using a Class-Based Approach for Compose
The Python Cookbook (O'Reilly, 2002, edited by Alex Martelli and David Ascher) included a recipe specifically titled "Composing Functions" in Chapter 15, which presented a class with __call__ as the canonical Pythonic approach to composition. The O'Reilly page for that recipe describes the approach this way: "A class defining the special method __call__ is often the best Pythonic approach to constructing new functions." The class-based approach gives you a meaningful repr for free, operator overloading for combining pipelines, and the ability to attach metadata or implement protocols like __len__ to report the number of composed stages:
class Compose:
    """Compose two or more callables into a single pipeline."""

    def __init__(self, *functions):
        if not functions:
            raise ValueError("Compose requires at least one function")
        self.functions = functions

    def __call__(self, *args, **kwargs):
        result = self.functions[-1](*args, **kwargs)
        for f in reversed(self.functions[:-1]):
            result = f(result)
        return result

    def __repr__(self):
        names = ", ".join(
            getattr(f, '__name__', repr(f)) for f in self.functions
        )
        return f"Compose({names})"

    def __add__(self, other):
        """Allow Compose(f) + Compose(g) to create a new composition."""
        if isinstance(other, Compose):
            return Compose(*self.functions, *other.functions)
        if callable(other):
            return Compose(*self.functions, other)
        return NotImplemented
One subtlety worth noting: because Compose.__call__ applies the last function first, Compose(f) + Compose(g) produces a pipeline where g runs before f -- consistent with the right-to-left mathematical convention, but visually the opposite of what the + operator might suggest. If your team finds this confusing, you can rename __add__ to __or__ (using the | operator) for a visual cue that matches Unix pipe direction, or document the ordering convention clearly in the class docstring. The important thing is to pick one convention and apply it consistently.
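Here is a minimal sketch of that | alternative -- a hypothetical Pipe class (an illustration, not part of any standard library) that composes left-to-right so the operator reads in execution order:

```python
class Pipe:
    """Minimal left-to-right composition with the | operator (sketch)."""

    def __init__(self, *functions):
        self.functions = functions

    def __call__(self, *args, **kwargs):
        result = self.functions[0](*args, **kwargs)
        for f in self.functions[1:]:
            result = f(result)
        return result

    def __or__(self, other):
        # Pipe(f) | g runs f first, then g -- matching shell pipe direction.
        if isinstance(other, Pipe):
            return Pipe(*self.functions, *other.functions)
        if callable(other):
            return Pipe(*self.functions, other)
        return NotImplemented

clean = Pipe(str.strip) | str.lower | (lambda s: s.replace(" ", "_"))
print(clean("  Hello World  "))  # hello_world
```

Because __call__ applies functions in list order, the visual order of the | chain and the execution order agree -- the ambiguity that trips people up with + simply cannot arise.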
Performance Considerations
Function composition introduces overhead. Each composed call adds one Python function call to the stack, and function calls in Python are not free. Guido van Rossum himself noted in his tips for fast Python that developers should "be suspicious of function/method calls; creating a stack frame is expensive."
For hot loops processing millions of items, a hand-written loop will outperform a composed pipeline. But for the vast majority of application code -- web request handlers, data transformation scripts, configuration processing -- the overhead is negligible, and the readability and maintainability gains far outweigh the microseconds lost per call.
The functools.pipe discussion benchmarks showed a Cython pipe with operator.itemgetter composition ran at roughly 7.4ms for 100,000 iterations, compared to 8.7ms for an equivalent lambda approach and 5.5ms for manually chained map calls. A CPython C extension would likely improve that further.
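For your own code, a quick timeit check is more trustworthy than any published number. Here is a minimal sketch comparing a reduce-built pipeline against a hand-written equivalent (absolute timings vary by machine, so no expected values are shown):

```python
import timeit
from functools import reduce

def compose(*fns):
    # Minimal right-to-left composition, as built earlier in the article.
    return reduce(lambda f, g: lambda x: f(g(x)), fns, lambda x: x)

def add_one(x):
    return x + 1

def double(x):
    return x * 2

pipeline = compose(double, add_one)  # (x + 1) * 2

def manual(x):
    return (x + 1) * 2

composed_t = timeit.timeit(lambda: pipeline(10), number=100_000)
manual_t = timeit.timeit(lambda: manual(10), number=100_000)
print(f"composed: {composed_t:.4f}s  manual: {manual_t:.4f}s")
```

The composed version pays for the extra stack frames; whether that matters depends entirely on how hot the call site is, which is exactly what profiling tells you.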
Where Composition Is Headed in Python
The absence of a built-in compose or pipe in Python's standard library is not an oversight. It is the result of deliberate choices by the core development team, rooted in the language's readability-first philosophy. But the pressure is building. The October 2024 functools.pipe discussion, the March 2024 CPython pipe-operator feature request, and the continued popularity of third-party composition libraries all signal strong community demand.
Python 3.14, released on October 7, 2025, introduced functools.Placeholder for more flexible partial application -- a feature that makes composition with partially applied functions even more natural. Before 3.14, partial could freeze arguments in two ways: pass positional arguments (prepended in order) or pass keyword arguments (matched by name). The keyword approach let you target non-first parameters -- but only when those parameters accepted keyword form. Functions that declare positional-only parameters (marked with / in their signature) cannot be targeted by name at all, leaving partial unable to freeze anything except leftmost slots. Placeholder closes that gap entirely: it is a singleton sentinel you place in the argument list to reserve any positional slot for later, regardless of whether the underlying parameter accepts a keyword form. See the official docs for details. If the trajectory holds, it would not be surprising to see some form of functools.pipe or functools.compose land in a future Python release.
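A sketch of the pattern, guarded so it runs on older interpreters too (the punctuation table and the strip_punct name are illustrative; the Placeholder branch requires Python 3.14+):

```python
import functools
from functools import partial

table = str.maketrans("", "", "!?.")  # drop basic punctuation

if hasattr(functools, "Placeholder"):  # Python 3.14+
    # str.translate's table parameter is positional-only, so pre-3.14
    # partial could not freeze it. Placeholder reserves the self slot
    # for the call site while freezing the table.
    strip_punct = partial(str.translate, functools.Placeholder, table)
else:
    # Pre-3.14 fallback: a small def wrapper does the same job.
    def strip_punct(s):
        return s.translate(table)

print(strip_punct("done!"))  # done
```

Either branch yields a single-argument callable that slots directly into a compose or pipe chain -- the Placeholder version just gets there without a wrapper.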
Key Takeaways
- Composition is already in your code: Every nested function call is composition. The question is whether you have clean abstractions for it.
- functools.reduce is your friend: Python's standard library gives you all the machinery you need to build a general compose function in a few lines.
- The PEP history explains the gap: PEP 309 (2003) established functools as the home for higher-order tools. The absence of a built-in compose is a deliberate design choice rooted in Python's readability-first philosophy, not an oversight.
- Decorators are composition in disguise: Stacked decorators are the most Pythonic form of function composition already in the language. @functools.wraps keeps composed functions inspectable.
- Choose your direction: Right-to-left (compose) matches mathematical convention. Left-to-right (pipe) matches the way you read data flow. Pick whichever your team finds more readable and use it consistently.
- Use collections.abc.Callable, not typing.Callable: As of Python 3.9, typing.Callable is deprecated. Import Callable from collections.abc in all new code. Static analysis tools like Ruff and mypy will flag the old form for projects targeting 3.9+.
- functools.Placeholder (Python 3.14+) closes a real gap: Before 3.14, partial could not target positional-only parameters (those marked with /) because they cannot be passed by keyword. Placeholder lets you reserve any positional slot with a sentinel, making partial + composition a cleaner pattern across a much wider range of functions -- including built-ins like str.translate -- without resorting to lambda wrappers.
- Third-party libraries fill the gap: The compose package on PyPI and toolz both provide battle-tested, production-ready composition utilities with async support, type-aware introspection, and rich ecosystems.
- Performance is rarely the bottleneck: Composed pipelines add function-call overhead, but for the vast majority of application code the readability and testability gains far outweigh the cost. Profile before optimizing.