TypeVar and Generic in Python: When and How to Use Them

Python's type annotation system has several tools that confuse developers more than almost any others: TypeVar, Generic, and ParamSpec. The confusion is understandable — they solve a specific problem that only surfaces once your code grows beyond simple concrete type annotations. This article explains exactly what that problem is, when these tools are the right solution, and how variance affects the way type checkers evaluate your generics. It also covers how PEP 695 in Python 3.12 and PEP 696 in Python 3.13 changed the game, with new syntax and type parameter defaults that make the older approach feel like a workaround in comparison.

When you write def first(items: list) -> ???, you immediately run into a wall. You know the function returns whatever type of thing is in the list, but there is no way to express that relationship with a plain type annotation. You could write Any, but that defeats the purpose — you lose all the safety that type checking provides. You could write an overloaded version for every type you care about, but that is unmaintainable. TypeVar and Generic are how Python's type system solves this: they let you say "whatever type goes in, that same type comes back out," and they let type checkers like mypy and pyright enforce that contract.

The Problem They Solve

A TypeVar is a placeholder for a type that will be filled in later. It is not a type itself — it is a variable that ranges over types. The official Python documentation defines it succinctly: a TypeVar() expression creates a new type variable, and the argument to TypeVar() must be a string equal to the variable name to which it is assigned. That last rule — the redundant name string — is one of the clearest signals that the old syntax was designed as an interim solution.

Here is the canonical pre-3.12 pattern. To write a generic function that returns the first element of a sequence while preserving its type, you write:

from collections.abc import Sequence
from typing import TypeVar

T = TypeVar('T')

def first(items: Sequence[T]) -> T:
    return items[0]

When a type checker sees first([1, 2, 3]), it knows the input is Sequence[int], which means T is bound to int for that call, so the return type is inferred as int. When it sees first(['a', 'b']), T is bound to str. The same function, different type signatures per call site — that is what generics do.

The Generic base class comes into play when you need a whole class to be parameterized by a type, not just a single function. Suppose you are building a typed stack data structure. Without Generic, there is no way to say "this stack holds integers" and have the type checker enforce that pop() returns int:

from typing import TypeVar, Generic

T = TypeVar('T')

class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

    def peek(self) -> T:
        return self._items[-1]

With this definition, Stack[int]() creates a stack that only accepts integers. Calling .push("hello") on it produces a type error. The Generic[T] base class is the mechanism that registers T as the type parameter for the class and enables the Stack[int] subscription syntax at runtime.

Note

A single TypeVar instance can appear in multiple positions within a signature. When it does, the type checker constrains all positions to agree. In def identity(x: T) -> T, the input and output types are not merely both generic — they are the same type. That is a stronger guarantee than def identity(x: Any) -> Any, which permits the function to silently return a completely different type.
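A minimal sketch of that difference (the function names here are illustrative):

```python
from typing import Any, TypeVar

T = TypeVar('T')

def identity(x: T) -> T:
    # The checker guarantees the return type is the argument's type.
    return x

def loose_identity(x: Any) -> Any:
    # Nothing stops this version from returning a different type,
    # and the checker accepts any use of the result without complaint.
    return str(x)

n = identity(42)         # inferred as int
s = loose_identity(42)   # inferred as Any; the str slips through unchecked
```

Running mypy over follow-up code makes the difference concrete: `n.bit_length()` is verified as valid, while `s.bit_length()` is silently accepted even though it raises AttributeError at runtime, because `s` is actually a str.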

You use TypeVar alone when the generic behavior belongs to a function. You add Generic[T] as a base class when the generic behavior belongs to a class — when the type parameter needs to be remembered across multiple methods. A class that inherits from Generic[T] is how you tell Python's type system that instances of that class carry a type parameter with them. The article on Python type annotations and IDE support covers many patterns where the distinction matters.

One subtlety worth understanding early: a TypeVar is invariant by default. This means that if you declare a variable as Stack[Animal], you cannot pass a Stack[Dog] where it is expected — even if Dog is a subclass of Animal. This is not a bug; it is a deliberate safety rule. The next section explains why, and when you need to change it.
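A sketch of what that looks like in practice, reusing a minimal version of the Stack class (Animal and Dog are illustrative names):

```python
from typing import TypeVar, Generic

T = TypeVar('T')

class Animal: ...
class Dog(Animal): ...

class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

def count(stack: Stack[Animal]) -> int:
    return len(stack._items)

dog_stack: Stack[Dog] = Stack()
# count(dog_stack)          # rejected: Stack is invariant, so Stack[Dog]
#                           # is not a subtype of Stack[Animal]
n = count(Stack[Animal]())  # accepted; an empty stack, so n == 0
```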

Pro Tip

The rule "a TypeVar() expression must always directly be assigned to a variable" is enforced by type checkers, not the Python runtime. At runtime, you can technically do other things, but mypy and pyright will reject code where a TypeVar is used as part of a larger expression or passed around without being assigned first.

Bounded Types, Constraints, and Variance

An unconstrained TypeVar accepts any type. That is useful for generic containers, but many real-world functions are not that permissive. They can work with multiple types, but not every type. Python's type system gives you two distinct tools for narrowing this range: bounds and constraints. They look similar but behave very differently, and choosing the wrong one is a common source of subtle bugs.

A bound restricts T to a type or any subtype of it. You use the bound= keyword argument:

from typing import TypeVar

class Animal:
    def speak(self) -> str: ...

class Dog(Animal):
    def fetch(self) -> None: ...

AnimalT = TypeVar('AnimalT', bound=Animal)

def make_speak(creature: AnimalT) -> AnimalT:
    creature.speak()
    return creature

# Type checker knows the return type is Dog, not just Animal
my_dog: Dog = make_speak(Dog())

The critical behavior of a bounded TypeVar is that the type checker infers the most specific type possible. When you pass a Dog, the return type is inferred as Dog, not Animal. This is what makes bounded TypeVars indispensable for factory patterns and fluent builder APIs — the caller gets back exactly what they put in, with no information lost.

Constraints, by contrast, restrict T to exactly one of a fixed set of types. The type checker will not infer anything more specific than the matching constraint. The official Python typing documentation draws a clear line between the two: a bounded TypeVar resolves to the most specific type the checker can determine, while a constrained one resolves only to whichever listed type matches — never to a subtype of it.

from typing import TypeVar

# T must be exactly str or bytes — nothing else
StrOrBytes = TypeVar('StrOrBytes', str, bytes)

def process(data: StrOrBytes) -> StrOrBytes:
    return data

result = process("hello")   # type inferred as str
result2 = process(b"world") # type inferred as bytes
# process(42)               # type error: int is not str or bytes

Understanding Variance

Variance describes how subtype relationships between concrete types carry over into generic types. It is one of the concepts that PEP 483 — the theoretical foundation for Python's type hints — treats with particular rigor. There are three modes: invariant, covariant, and contravariant.

Invariant (the default) means Generic[Dog] is neither a subtype nor a supertype of Generic[Animal], even though Dog is a subtype of Animal. Mutable containers like list are invariant because allowing substitution would be unsound: if list[Dog] were a subtype of list[Animal], you could pass a list[Dog] to a function expecting list[Animal], that function could append a Cat to it, and you would have broken the caller's guarantee that the list contains only dogs. Mypy even surfaces this in its error messages, noting that List is invariant and suggesting Sequence as an alternative when you only need to read from it.
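Because the interpreter does not enforce annotations, the unsoundness this rule prevents is easy to demonstrate by suppressing the checker (a sketch with illustrative class names):

```python
class Animal: ...
class Dog(Animal): ...
class Cat(Animal): ...

def add_cat(animals: list[Animal]) -> None:
    animals.append(Cat())

dogs: list[Dog] = [Dog()]

# mypy rejects this call precisely because list is invariant.
# Forcing it through shows why: the caller's list[Dog] now holds a Cat.
add_cat(dogs)  # type: ignore[arg-type]
print(any(isinstance(a, Cat) for a in dogs))  # True: the guarantee is broken
```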

Covariant means Generic[Dog] is a subtype of Generic[Animal] when Dog is a subtype of Animal. This is safe for read-only producers. PEP 483 defines covariance as the condition where GenType[t2] is a subtype of GenType[t1] whenever t2 is a subtype of t1 — the subtype relationship flows in the same direction as the parameter relationship. Sequence[T], FrozenSet[T], and Iterator[T] in the standard library are all covariant — you can only read from them, so it is safe to substitute a more specific type. You declare this with covariant=True:

from typing import TypeVar, Generic

T_co = TypeVar('T_co', covariant=True)

class ReadOnlyBox(Generic[T_co]):
    def __init__(self, value: T_co) -> None:
        self._value = value

    def get(self) -> T_co:
        return self._value

# Because T_co is covariant, ReadOnlyBox[Dog] is a subtype of ReadOnlyBox[Animal]
def describe(box: ReadOnlyBox[Animal]) -> None:
    print(box.get().speak())

describe(ReadOnlyBox(Dog()))  # accepted by the type checker

Contravariant reverses the relationship: Generic[Animal] becomes a subtype of Generic[Dog]. This applies to consumers — things that accept a type rather than produce it. The canonical example is Callable[[T], None]: a function that accepts Animal can safely be used anywhere a function accepting Dog is expected, because it can handle at least as much. You declare this with contravariant=True. The Python typing specification is precise on this point: variance is a property of a generic class, not of the type variable in isolation. The same TypeVar could behave covariantly in one class and contravariantly in another, depending on how each class uses it.
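A short sketch of the callback case (the names are illustrative):

```python
from collections.abc import Callable

class Animal:
    def speak(self) -> str:
        return "some sound"

class Dog(Animal):
    pass

heard: list[str] = []

def handle_any_animal(a: Animal) -> None:
    heard.append(a.speak())

def notify_dog(handler: Callable[[Dog], None]) -> None:
    handler(Dog())

# Contravariance in action: a handler for Animal is usable wherever a
# handler for Dog is expected, because it can handle at least as much.
# Callable[[Animal], None] is a subtype of Callable[[Dog], None].
notify_dog(handle_any_animal)
print(heard)  # ['some sound']
```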

From the specification

PEP 483, The Theory of Type Hints, treats variance with particular rigor, framing it as the question of how subtype relationships among concrete types propagate into subtype relationships among the generic types constructed from them.

The practical rule is: use invariant (the default) for mutable containers, covariant for read-only producers, and contravariant for write-only consumers or callbacks. When you get a mypy or pyright error about variance, it is almost always telling you that you are attempting a substitution that would be unsound under those rules.

Invariant (default)
Declare: T = TypeVar('T'). Use for: mutable containers (list, dict, custom writable classes).
Covariant
Declare: T_co = TypeVar('T_co', covariant=True). Use for: read-only producers (Sequence, FrozenSet, Iterator).
Contravariant
Declare: T_contra = TypeVar('T_contra', contravariant=True). Use for: consumers and callbacks (Callable[[T], None], event handlers).

One important restriction: covariant and contravariant TypeVars are only valid for use with classes, not standalone generic functions. The typing specification is explicit on this point: variance is a property of generic types, not of generic functions in isolation. If you try to use a covariant TypeVar in a plain function signature without a wrapping class, mypy will flag it as an error.

Python 3.12 and PEP 695: The New Syntax

Python 3.12, released in October 2023, introduced PEP 695 — a redesign of how generic types and type aliases are declared. The change is substantial enough that the Python documentation characterizes the old approach as a longstanding source of developer confusion. An analysis cited in PEP 695 found that TypeVar was used in 14% of modules across 25 popular typed Python libraries, which gives a sense of how widespread the problem was.

The old approach had three concrete pain points. First, the redundant name string: T = TypeVar('T') requires you to repeat the variable name in quotes, and if you get them out of sync, behavior becomes undefined. Second, global scope pollution: type variables declared at module level appear in globals(), even though their meaning is only valid inside the generic context where they are used. Third, the need to import both TypeVar and Generic from typing, adding boilerplate that feels disproportionate to the task.
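The first two pain points are directly observable at runtime (a minimal sketch):

```python
from typing import TypeVar

T = TypeVar('T')

# The type variable is an ordinary module-level object:
print('T' in globals())    # True
print(type(T).__name__)    # TypeVar

# And nothing at runtime keeps the name string in sync with the variable:
U = TypeVar('Mismatched')  # runs fine; type-checker behavior is undefined
print(U.__name__)          # Mismatched
```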

PEP 695 solves all three at once. The new syntax places type parameters directly in square brackets on the class or function definition:

# Python 3.12+ — no imports required for simple cases
from collections.abc import Sequence

class Stack[T]:
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

def first[T](items: Sequence[T]) -> T:
    return items[0]

# Type alias — also new in 3.12
type Vector[T] = list[T]

The Python 3.12 release notes describe this as "a new, more compact and explicit way" to define generic types. T, *Ts, and **P in the bracket syntax replace TypeVar, TypeVarTuple, and ParamSpec respectively, without any imports. Any class that uses type parameters in this form is automatically treated as a generic type — there is no need to inherit from Generic[T]. PEP 695 also addresses a core conceptual problem: under the old approach, the generic boilerplate obscured what a class did at its core, making type parameters harder to spot during code review.

Bounds, Constraints, and Variance in the New Syntax

Bounds and constraints translate directly into the bracket syntax using a colon:

from collections.abc import Hashable

# Bound: T must be Hashable or a subtype
def deduplicate[T: Hashable](items: list[T]) -> list[T]:
    return list(dict.fromkeys(items))

# Constraints: T must be exactly str or bytes
def encode[T: (str, bytes)](data: T) -> T:
    return data

# TypeVarTuple — variadic generics
def args_as_tuple[*Ts](*args: *Ts) -> tuple[*Ts]: ...

Variance works differently in PEP 695: rather than requiring you to declare it, type checkers infer it automatically. This is one of the most significant ergonomic improvements. Previously, developers had to reason about variance upfront and encode it into naming conventions (the community convention was T_co for covariant, T_contra for contravariant). Under PEP 695, the type checker observes how the type parameter is used — whether it appears in argument position, return position, or both — and determines variance from that. PEP 695 documents this inference as automatic: usage within a class body is the sole input the checker needs.

Note

PEP 695 also changes scoping rules. Type parameters declared with the new syntax do not appear in globals() or locals(). Each generic context gets its own isolated scope. This means the same name T can be used in multiple unrelated generic definitions in the same module without interference — a significant improvement over the old approach, where a module-level T = TypeVar('T') was shared across every usage in that file.

When to Keep the Old Syntax

The new syntax requires Python 3.12 or later. If your project needs to support Python 3.10 or 3.11, you must continue using the TypeVar and Generic imports. The two syntaxes are semantically equivalent for most purposes — class Stack[T]: and class Stack(Generic[T]): produce functionally identical behavior from the perspective of type checkers like mypy and pyright. The difference is syntactic clarity and reduced boilerplate, not functionality.

Important: You cannot mix old and new syntax within the same class

PEP 695 is explicit on this point: using a bracket-style type parameter list on a class and also passing a traditional TypeVar in its Generic[T] base will produce a runtime error. The two syntaxes can coexist freely across different classes and functions in the same module, but not within a single class definition. Pick one approach per class.

There is also one specific case where the old-style TypeVar with explicit covariant=True or contravariant=True remains the required approach even in 3.12+: when you need to override the inferred variance. If your class design requires a type parameter to behave contravariantly in a way the type checker would not infer automatically, you still create a TypeVar explicitly with contravariant=True and use it in your Generic base class. This is uncommon in practice, but it is the escape hatch.

Python 3.13 extended PEP 695 further via PEP 696, adding default values for type parameters — a feature long-requested by library authors writing complex generic APIs. The syntax places the default after an = sign in the bracket list: class Box[T = int]: means that Box() without a type argument is treated as Box[int]. As of Python 3.14 (released October 2025), the new type parameter syntax and its tooling support across mypy and pyright are considered stable and widely adopted.

Before Python 3.12:

from typing import TypeVar, Generic
T = TypeVar('T')                        # module-level scope; name string required
class Stack(Generic[T]): ...
T_co = TypeVar('T_co', covariant=True)  # variance declared manually by convention

Python 3.12+ (PEP 695):

# no imports required
class Stack[T]: ...      # parameters scoped to the class; no string, no import
class ReadOnly[T]: ...   # variance is inferred automatically from usage

Old-style TypeVar + Generic versus PEP 695 bracket syntax. Both are semantically equivalent; the new syntax removes boilerplate, scopes type parameters locally, and lets type checkers infer variance automatically.

ParamSpec: Generics for Callables

There is a third type-variable-like tool that belongs in this article: ParamSpec. Where TypeVar captures a single type, ParamSpec captures the full parameter specification of a callable — the names, types, and order of its arguments. It was introduced in PEP 612 and is essential for writing typed decorators that preserve the signature of the function they wrap.

The problem ParamSpec solves is concrete. Suppose you write a logging decorator. Without ParamSpec, the only way to annotate it is with Callable[..., T] — the ellipsis means "I don't know the parameters." The type checker cannot verify that callers pass the right arguments to the wrapped function. With ParamSpec, you can thread the original parameter specification through the decorator so the type checker knows exactly what the wrapper accepts:

from typing import TypeVar, Callable, ParamSpec
import functools

P = ParamSpec('P')
R = TypeVar('R')

def log_call(func: Callable[P, R]) -> Callable[P, R]:
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(x: int, y: int) -> int:
    return x + y

add(1, 2)       # type checker knows: two ints required
add("a", "b")   # type error: str is not int

The two attributes P.args and P.kwargs let you unpack the captured parameter specification inside the wrapper's signature. This is a specific requirement enforced by the type specification — P.args and P.kwargs must be used together, not independently.

In Python 3.12's bracket syntax, ParamSpec is written with a ** prefix:

from collections.abc import Callable
import functools

def log_call[**P, R](func: Callable[P, R]) -> Callable[P, R]:
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

There is also TypeVarTuple (introduced in PEP 646), which captures a variable number of type arguments — used for functions that operate on tuples of arbitrary length and shape. In the bracket syntax it is written with a * prefix: def args_as_tuple[*Ts](*args: *Ts) -> tuple[*Ts]: .... TypeVarTuple is an advanced feature primarily useful in numerical computing frameworks and low-level library code.

Which tool to reach for

Use TypeVar when you need to preserve the type of a single value through a function or class. Use ParamSpec when you are wrapping a callable and need to preserve its full argument signature. Use TypeVarTuple when you are working with heterogeneous tuples of variable length. In Python 3.12+, all three are available without imports through the bracket syntax: [T], [**P], and [*Ts].

Protocol: Structural Generics

Every example so far has used nominal subtyping: Dog is accepted as an Animal because it explicitly inherits from Animal. Python's type system also supports structural subtyping through Protocol, introduced in PEP 544. The distinction matters here because Protocol and Generic are frequently confused, and in many situations Protocol is the right answer where developers reflexively reach for a bounded TypeVar.

A Protocol describes a structural interface — a set of methods and attributes a type must have — without requiring the type to explicitly declare that it implements the protocol. If a class has a speak() method that returns str, it satisfies a Speakable protocol, regardless of its inheritance chain. This is duck typing made statically checkable:

from typing import Protocol

class Speakable(Protocol):
    def speak(self) -> str: ...

# Works with any class that has a speak() method
# — no inheritance from Speakable required
def make_noise(creature: Speakable) -> str:
    return creature.speak()

class Dog:
    def speak(self) -> str:
        return "woof"

class Robot:
    def speak(self) -> str:
        return "beep"

make_noise(Dog())    # accepted
make_noise(Robot())  # also accepted

The key question — "should I use a bounded TypeVar or a Protocol?" — comes down to whether you need to preserve the specific type through the operation. If the function receives a Dog and must return that same Dog (not just a Speakable), use a bounded TypeVar. If the function just needs to call methods on whatever it receives and returns something else entirely, a Protocol is simpler and more flexible.
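A side-by-side sketch of that rule (the names are illustrative):

```python
from typing import Protocol, TypeVar

class Speakable(Protocol):
    def speak(self) -> str: ...

S = TypeVar('S', bound=Speakable)

# Protocol: we only call methods and return something else entirely,
# so the specific input type does not need to survive.
def transcribe(creature: Speakable) -> str:
    return creature.speak().upper()

# Bounded TypeVar: the caller must get back exactly what it passed in.
def warmed_up(creature: S) -> S:
    creature.speak()
    return creature

class Dog:
    def speak(self) -> str:
        return "woof"

print(transcribe(Dog()))     # WOOF
same_dog = warmed_up(Dog())  # inferred as Dog, not Speakable
```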

Protocols and generics can also be combined. A Protocol can inherit from Generic[T] to describe a generic structural interface — for example, an Iterator[T] protocol that describes any object with a __next__(self) -> T method. The standard library uses this pattern extensively: Iterable, Iterator, Sequence, and Mapping in collections.abc are all defined as generic protocols.
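A sketch of a generic structural interface in the same spirit (Producer and FourProducer are hypothetical names):

```python
from typing import Protocol, TypeVar

T_co = TypeVar('T_co', covariant=True)

class Producer(Protocol[T_co]):
    def produce(self) -> T_co: ...

class FourProducer:
    # Satisfies Producer[int] structurally; no inheritance required.
    def produce(self) -> int:
        return 4

def consume(source: Producer[int]) -> int:
    return source.produce() * 2

print(consume(FourProducer()))  # 8
```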

Protocol vs Generic: the fast decision rule

Reach for Protocol when you want to describe what a type can do and do not need the specific type preserved. Reach for a bounded TypeVar when the function must return the exact same type it received. If you need the function to work across types that share no common base class, Protocol is almost always the cleaner option.

What Actually Happens at Runtime

A question that trips up nearly every developer encountering generics for the first time: does this actually enforce anything when the code runs? The short answer is no. Python's type annotations, including everything built with TypeVar, Generic, and Protocol, are not enforced by the Python interpreter at runtime. They are metadata — hints for type checkers, IDEs, and documentation tools, not runtime guards.

At runtime, T = TypeVar('T') creates an object of type TypeVar, and class Stack(Generic[T]): sets up Stack so it supports subscript syntax — that is why Stack[int] does not raise an error. But Stack[int] and Stack[str] are the same class at runtime. There is no separate specialized class generated per type argument, as there is with C# generics (Java, like Python, erases type parameters at runtime). Python uses type erasure: subscription goes through __class_getitem__ and produces a lightweight alias that records the type argument for introspection, but the instance's behavior is never specialized or checked against that argument.

from typing import TypeVar, Generic

T = TypeVar('T')

class Box(Generic[T]):
    def __init__(self, value: T) -> None:
        self.value = value

int_box = Box[int](42)
str_box = Box[str]("hello")

# These are the same class at runtime
print(type(int_box))              # <class '__main__.Box'>
print(type(str_box))              # <class '__main__.Box'>
print(type(int_box) is type(str_box))  # True

# No runtime enforcement — this runs without error
int_box.value = "not an int"      # type error statically; fine at runtime

This design is intentional. CPython prioritizes performance and simplicity, and full reification of generic types at runtime would introduce significant overhead. The tradeoff is that your type safety net is entirely static — it exists at the point where you run a type checker like mypy or pyright, not when your program executes. This is why integrating a type checker into your CI pipeline matters: the runtime will not catch violations that the static checker would have flagged.

Note

isinstance() checks against generic types behave differently depending on the Python version and whether a runtime-checkable protocol is involved. For a plain generic like Stack[int], isinstance(x, Stack[int]) will raise a TypeError on most Python versions — you can only check against the unparameterized Stack. Protocols decorated with @runtime_checkable are a partial exception: isinstance() can check for the presence of the required methods, but it still cannot check the types of those methods' arguments or return values at runtime.

When Not to Use Generics

The existence of TypeVar and Generic does not mean every function or class that accepts multiple types needs them. Overusing generics is one of the clearest signs of premature type-system engineering, and it produces code that is harder to read without making it meaningfully safer.

The most common misapplication is reaching for TypeVar when Union or a direct concrete type annotation is what the code means. Consider a function that accepts either a string or a list of strings and always returns a list. Using TypeVar here is wrong — the function does not return "whatever type went in," it always returns a list, so the relationship between input and output types is not preserved:

from typing import TypeVar

# Wrong: TypeVar implies input type is preserved in the output
T = TypeVar('T', str, list)
def normalize(value: T) -> T: ...   # misleads callers

# Correct: the output is always list[str], regardless of input type
def normalize(value: str | list[str]) -> list[str]:
    if isinstance(value, str):
        return [value]
    return value

Similarly, TypeVar adds no value when a function already works correctly with Any, object, or a concrete base class. If the function only calls methods defined on object — like __str__ or __repr__ — typing the parameter as object is accurate and clear. Wrapping it in a TypeVar adds complexity without adding information.

The right signal for reaching for TypeVar is a specific structural question: does the caller need to know that the output type is the same as — or directly derived from — the input type? If the answer is no, a plain annotation is almost always the better choice. Generic code is harder to read, harder to debug, and produces more opaque error messages from type checkers. The ergonomic cost is only worth paying when the type relationship is real and the caller needs to rely on it.

A common over-engineering trap

Declaring a TypeVar that is only used once in a function signature — appearing in one parameter but not in the return type or any other parameter — accomplishes nothing. A singly-used TypeVar constrains nothing: it behaves like its bound, or like object when it has none. Some type checkers, pyright among them, warn about this pattern. If you find yourself writing T = TypeVar('T') and only using T in a single position, replace it with the appropriate concrete type or object.
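A sketch of the trap and its fix:

```python
from typing import TypeVar

T = TypeVar('T')

# Over-engineered: T appears exactly once, so it constrains nothing.
def log_value(value: T) -> None:
    print(value)

# Equivalent and clearer: the function accepts anything, and says so.
def log_value_plain(value: object) -> None:
    print(value)

# Where TypeVar earns its keep is the two-position case:
def log_and_return(value: T) -> T:
    print(value)
    return value
```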

Deeper Solutions Worth Knowing

The patterns covered so far — TypeVar, Generic, Protocol, and ParamSpec — are what you will find in most introductions to Python generics. But the standard library and the typing specification include several additional tools that solve real problems more precisely than the general-purpose options, and they are frequently underused.

Self: The Right Tool for Method Chaining and Fluent APIs

A common pattern in object-oriented Python is the builder or fluent API, where methods return self to allow chaining. The naive approach is to annotate the return type as the class itself, but this breaks down under inheritance: if Dog inherits from Animal and overrides a method that returns Animal, callers who hold a Dog lose its specific type after calling the method.

Before Python 3.11, the workaround was a bounded TypeVar:

from typing import TypeVar

T = TypeVar('T', bound='Animal')

class Animal:
    def set_name(self: T, name: str) -> T:
        self._name = name
        return self

class Dog(Animal):
    def fetch(self) -> None: ...

# Type checker knows this is Dog, not Animal
my_dog: Dog = Dog().set_name("Rex")

Python 3.11 added Self (PEP 673), which makes this idiom far cleaner. Self always refers to the type of the current instance, whatever class that is. You no longer need a TypeVar at all:

from typing import Self

class Animal:
    def set_name(self, name: str) -> Self:
        self._name = name
        return self

class Dog(Animal):
    def fetch(self) -> None: ...

# Still returns Dog when called on a Dog instance
my_dog: Dog = Dog().set_name("Rex")
my_dog.fetch()  # no type error — Self preserved the Dog type

Self also works in @classmethod definitions, which is where the bounded-TypeVar workaround was particularly awkward. Any class that returns its own type from a constructor-like class method should use Self as the return annotation.

TypeGuard and TypeIs: Narrowing Inside Generics

Python's type narrowing — the process by which a type checker understands that after an isinstance() check, a variable has a more specific type — works automatically for built-in checks. But it does not work automatically for custom validation functions. TypeGuard (PEP 647, Python 3.10) and TypeIs (PEP 742, Python 3.13) let you annotate a function as a type guard, so the type checker applies narrowing when that function returns True.

from typing import TypeGuard

def is_list_of_str(val: list[object]) -> TypeGuard[list[str]]:
    return all(isinstance(x, str) for x in val)

def process(items: list[object]) -> None:
    if is_list_of_str(items):
        # Type checker now knows: items is list[str] here
        upper = [s.upper() for s in items]  # no error

TypeIs is a stricter variant introduced in Python 3.13: unlike TypeGuard, it is bidirectional — the type checker narrows the type in both branches, not just the True branch. When you know the check is logically exhaustive (either the value is a str or it is not), TypeIs gives the type checker more information to work with. Both are particularly useful when working with heterogeneous generic containers that need to be split by type before processing.

@overload: A Precision Alternative to TypeVar

When a function has a small, known set of input-output type relationships — not an open-ended family of types — @overload is often more precise than a TypeVar. Overloads let you declare separate signatures for each combination, so the type checker can pick the exact one that matches a given call site rather than inferring from a generic variable.

from typing import overload

@overload
def parse(value: str) -> int: ...
@overload
def parse(value: bytes) -> float: ...

def parse(value: str | bytes) -> int | float:
    if isinstance(value, str):
        return int(value)
    return float(value)

result: int = parse("42")      # type checker infers int
result2: float = parse(b"3.14") # type checker infers float

The advantage over a constrained TypeVar here is that @overload can express asymmetric relationships: the input type being str does not mean the output type is str. A constrained TypeVar always returns the same type it received. Use @overload when the input-to-output mapping is discrete and known upfront.

NewType: Nominal Distinctiveness Without Full Generics

NewType creates a distinct named type that is a subtype of an existing type, without the overhead of defining a full class or a generic. At runtime it is a no-op — the value is still just the original type. But the type checker treats the new type as distinct, catching accidental interchangeability between logically different values that happen to share the same underlying type:

from typing import NewType

UserId = NewType('UserId', int)
OrderId = NewType('OrderId', int)

def get_order(user_id: UserId, order_id: OrderId) -> None: ...

uid = UserId(42)
oid = OrderId(99)

get_order(uid, oid)   # correct
get_order(oid, uid)   # type error: OrderId is not UserId

NewType fills a gap that generics do not: it gives you type safety for values that are semantically distinct but structurally identical. Without it, a function expecting a UserId will silently accept any int. It is particularly useful in domain-driven designs where the same primitive type (string, integer, UUID) appears in many different semantic roles.

Never and NoReturn: Exhaustiveness in Generic Dispatch

Never (added to typing in Python 3.11) and its function-level counterpart NoReturn represent the bottom type — a value of type Never can never exist. This sounds abstract, but it has a concrete use in generic code: exhaustive pattern matching. When you write a function that dispatches on a union type and want the type checker to guarantee you have handled every case, a Never-typed fallback is the standard pattern:

from typing import Never

class Circle:
    radius: float

class Rectangle:
    width: float
    height: float

class Triangle:
    base: float
    height: float

# Python 3.12+ type alias syntax
type Shape = Circle | Rectangle | Triangle

def assert_never(x: Never) -> Never:
    raise AssertionError(f"Unhandled case: {x!r}")

def area(shape: Shape) -> float:
    match shape:
        case Circle():    return 3.14159 * shape.radius ** 2
        case Rectangle(): return shape.width * shape.height
        case Triangle():  return 0.5 * shape.base * shape.height
        # If a new variant is added to Shape but this match is not
        # updated, the type checker flags assert_never(shape) as an error
        case _:           assert_never(shape)

This is a more rigorous approach than leaving the fallback as a generic raise ValueError. The type checker uses Never to verify at analysis time that the union is fully covered — you find out about a missing case when you run mypy, not when the code executes in production.

Multiple TypeVars in One Generic Class

A generic class is not limited to a single type parameter. When a class needs to track two or more independent types — a key and a value, for example — you declare multiple TypeVars and pass them all to Generic. The ordering in the Generic[K, V] base determines the order of the type arguments when the class is subscripted:

from typing import TypeVar, Generic

K = TypeVar('K')
V = TypeVar('V')

class Pair(Generic[K, V]):
    def __init__(self, key: K, value: V) -> None:
        self.key = key
        self.value = value

    def swap(self) -> "Pair[V, K]":
        return Pair(self.value, self.key)

# Python 3.12+ equivalent — no imports needed
class Pair[K, V]:
    def __init__(self, key: K, value: V) -> None:
        self.key = key
        self.value = value

    def swap(self) -> "Pair[V, K]":
        return Pair(self.value, self.key)

p: Pair[str, int] = Pair("age", 30)
swapped: Pair[int, str] = p.swap()

Each TypeVar must be distinct — passing the same one twice (Generic[T, T]) is invalid and will produce both a runtime error and a type checker error. In the bracket syntax, the same rule applies: class Pair[T, T]: is a SyntaxError.
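The runtime side of that rule is easy to demonstrate: subscripting Generic with a repeated parameter raises as soon as the class statement executes.

```python
from typing import TypeVar, Generic

T = TypeVar('T')

try:
    class Bad(Generic[T, T]):  # repeated type parameter
        pass
except TypeError as exc:
    # typing rejects the duplicate before the class is ever created
    print(f"rejected: {exc}")
```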

Generics and Dataclasses

Python's @dataclass decorator works with generic classes. You combine Generic[T] (or the bracket syntax) with @dataclass to get an auto-generated __init__, __repr__, and __eq__ that are all type-aware. This is one of the cleanest patterns for typed data containers:

from dataclasses import dataclass
from typing import TypeVar, Generic, Optional

T = TypeVar('T')

@dataclass
class Result(Generic[T]):
    value: Optional[T] = None
    error: Optional[str] = None

    @property
    def ok(self) -> bool:
        return self.error is None

# Python 3.12+ equivalent
@dataclass
class Result[T]:
    value: T | None = None
    error: str | None = None

    @property
    def ok(self) -> bool:
        return self.error is None

success: Result[int] = Result(value=42)
failure: Result[int] = Result(error="not found")

This pattern is particularly valuable in API response handling, where the payload type varies but the envelope structure is fixed. The type checker tracks the T through both the constructor and any property or method that references it.

TypeAlias vs. the type Statement

Before Python 3.12, generic type aliases required TypeAlias from typing to tell the type checker that an assignment was a type alias rather than a regular variable. Without it, type checkers sometimes misinterpreted the assignment:

from typing import TypeVar, TypeAlias

T = TypeVar('T')

# Pre-3.12: explicit TypeAlias annotation
Vector: TypeAlias = list[T]
Matrix: TypeAlias = list[list[T]]

# Python 3.12+: the type statement (PEP 695)
type Vector[T] = list[T]
type Matrix[T] = list[list[T]]

The type statement introduced by PEP 695 is the modern replacement. It eliminates the need for the TypeAlias import, supports the bracket syntax for generic parameters, and uses lazy evaluation — the right-hand side is not evaluated until the alias is used, which avoids circular reference issues. TypeAlias has been deprecated since Python 3.12 in favor of the type statement.

Debugging Generics with reveal_type()

When a generic expression produces an unexpected type checker error, reveal_type() is the diagnostic tool. It is not a runtime function — it is a directive that type checkers recognize. When mypy or pyright encounters reveal_type(expr), it prints the inferred type of the expression at analysis time. This is indispensable for understanding how a type checker resolves generic type parameters:

from typing import TypeVar
from collections.abc import Sequence

T = TypeVar('T')

def first(items: Sequence[T]) -> T:
    return items[0]

result = first([1, 2, 3])
reveal_type(result)  # mypy output: "Revealed type is 'builtins.int'"

result2 = first(["a", "b"])
reveal_type(result2)  # mypy output: "Revealed type is 'builtins.str'"

Use reveal_type() liberally when debugging complex generic signatures. It shows exactly what the type checker infers at each step, which makes it straightforward to identify where a type parameter is being resolved differently than you expected. Python 3.11 added reveal_type() as a built-in that also works at runtime (printing to stderr), but its primary value remains as a static analysis tool.

How mypy and pyright Differ on Generics

While mypy and pyright both implement the Python typing specification, they are independently developed and occasionally diverge on edge cases in generic type checking. The differences that surface in practice tend to involve variance inference, type narrowing within generic contexts, and how aggressively each tool infers type parameters from usage. Pyright, developed by Microsoft and used in Pylance (the VS Code Python extension), tends to implement new PEPs faster and is generally stricter about variance violations. Mypy, the original Python type checker, has broader community adoption and is often more permissive in ambiguous cases. When writing generic libraries intended for broad consumption, running both checkers during development is the most reliable approach to ensuring your type annotations are specification-compliant rather than accidentally dependent on one tool's interpretation.

Spot the Bug

The function below is supposed to be a typed decorator that wraps any callable and logs its name before calling it. The author used TypeVar and Callable, but there is one critical annotation mistake that a type checker will flag. Can you find it?

from typing import TypeVar, Callable
import functools

T = TypeVar('T')

def log_call(func: Callable[..., T]) -> Callable[..., T]:
    @functools.wraps(func)
    def wrapper(*args: T, **kwargs: T) -> T:
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(x: int, y: int) -> int:
    return x + y

The mistake is the annotation *args: T, **kwargs: T inside wrapper: it declares that every positional and keyword argument has the same type T as the return value, which is almost never true. A wrapper's parameters should be typed with ParamSpec (P.args and P.kwargs), or at minimum Any.

Key Takeaways

  1. Use TypeVar for generic functions, Generic[T] for generic classes (pre-3.12): A standalone TypeVar is sufficient when the generic behavior belongs entirely to one function. When a class needs to carry a type parameter across multiple methods, inherit from Generic[T]. In Python 3.12+, both patterns collapse into the bracket syntax — def f[T](...) and class C[T]:.
  2. Bounded TypeVars preserve specificity; constrained ones do not: Use bound=SomeClass when you want the return type to stay as specific as possible (the return type of make_speak(Dog()) is Dog, not Animal). Use positional constraints like TypeVar('T', str, bytes) when you need to accept exactly a fixed set of types and no subtypes of them matter.
  3. Default variance is invariant — change it only when you have a reason: Mutable containers should stay invariant. Read-only producers (things that only return T) can be declared covariant. Consumers and callbacks (things that only accept T) can be contravariant. In Python 3.12+, you rarely need to declare variance at all — type checkers infer it.
  4. Do not mix old and new syntax within the same class: PEP 695 is explicit that combining bracket-style type parameters with a traditional Generic[T] base on the same class produces a runtime error. Mixing is fine across separate classes in the same file — just not within a single definition.
  5. PEP 695 is the modern standard for projects on Python 3.12+: The bracket syntax removes redundant name strings, eliminates global scope pollution, and makes variance inference automatic. The two approaches are semantically equivalent for most purposes, so migration is safe. If you are targeting older Python versions, the old syntax remains fully supported and is not deprecated.
  6. Covariant and contravariant TypeVars cannot be used in plain functions: The typing specification is clear that variance is a property of a generic class, not of a generic function. Placing a covariant=True TypeVar in a function signature that is not part of a class will produce errors from both mypy and pyright.
  7. Use ParamSpec when wrapping callables: TypeVar alone cannot capture the full parameter list of a function. ParamSpec (or **P in bracket syntax) is the correct tool for typed decorators and higher-order functions that need to preserve the wrapped function's complete signature.
  8. Consider Protocol before writing a bounded TypeVar: If your function only needs to call certain methods on its argument and does not need to return the exact same type it received, a Protocol is often simpler and more flexible than a bounded TypeVar. Protocols describe structural interfaces rather than requiring explicit inheritance.
  9. Type annotations are not enforced at runtime: Python uses type erasure. Stack[int] and Stack[str] are the same class at runtime. Violations that mypy or pyright would catch will not raise errors when your program executes — which is why running a type checker in CI matters.
  10. A TypeVar used in only one position in a signature accomplishes nothing: If T appears in a parameter but not in the return type or any other parameter, it is functionally equivalent to object or Any. Use a concrete type annotation instead. Mypy will flag this pattern directly.
  11. Use Self instead of a bounded TypeVar for fluent APIs and method chaining: Self (Python 3.11, PEP 673) always resolves to the type of the current instance and works correctly under inheritance without any TypeVar declaration. It also applies cleanly to @classmethod constructors that return instances of the class.
  12. Use TypeGuard or TypeIs when custom validation functions need to narrow types: Standard isinstance() narrowing is automatic, but type checkers cannot narrow based on arbitrary validation logic. Annotating such functions with TypeGuard[T] or TypeIs[T] tells the checker the type is narrowed inside the guarded branch. TypeIs (Python 3.13) is bidirectional and stricter.
  13. Use @overload when the input-to-output mapping is discrete and asymmetric: A constrained TypeVar always returns the type it received. @overload lets you declare that str input yields int output and bytes input yields float output — relationships a TypeVar cannot express.
  14. Use NewType for nominal distinctiveness without a full class: When the same underlying type (such as int or str) appears in multiple semantically distinct roles, NewType prevents silent substitution between them at zero runtime cost. The values remain the original type at runtime; only the type checker sees the distinction.
  15. Use Never to enforce exhaustive dispatch in generic union handling: Annotating a fallback function's parameter as Never makes the type checker verify that all members of a union are handled. If a new variant is added and the handler is not updated, the checker flags the gap at analysis time rather than at runtime.
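Takeaway 9 is easy to verify interactively; subscripting a generic class creates a typing alias, but every instance shares one runtime class (Stack here is a bare illustrative class):

```python
from typing import TypeVar, Generic

T = TypeVar('T')

class Stack(Generic[T]):
    pass

s1 = Stack[int]()
s2 = Stack[str]()
assert type(s1) is type(s2) is Stack   # no per-type specialization at runtime
assert Stack[int].__origin__ is Stack  # the alias just points back to the class
```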

Generic programming in Python has traveled a long road from the early PEP 484 boilerplate through the cleaner PEP 695 syntax. The core idea has not changed: when you want a function or class to operate over a family of types while preserving type information through each operation, TypeVar and Generic — or their modern equivalents — are the correct tools. But the type system also includes precision instruments — Self, TypeGuard, @overload, NewType, Never — that each solve a narrower problem more cleanly than a general-purpose TypeVar would. Choosing between them is the skill: understanding not just what each tool does, but which one matches the actual shape of the problem. For more on Python's type system and related concepts, explore the full library of Python tutorials on PythonCodeCrack.

Frequently Asked Questions

What is TypeVar in Python?

TypeVar is a placeholder for a type used in generic functions and classes. Declared as T = TypeVar('T'), it tells type checkers like mypy and pyright that a function or class operates over a family of types, preserving the specific type through each operation. In Python 3.12 and later, the explicit TypeVar declaration is replaced by bracket syntax: def f[T](x: T) -> T.

What is the difference between TypeVar and Generic?

TypeVar declares a placeholder for a type. Generic is a base class that registers a class as parameterized by one or more of those placeholders. You use TypeVar alone for generic functions; you combine it with Generic[T] as a base class when you need a generic class. In Python 3.12+, the bracket syntax on a class (class Stack[T]:) handles both jobs at once — no TypeVar declaration and no Generic import needed.

Why does mypy say my list is invariant?

list in Python is invariant because it is mutable. If list[Dog] were a subtype of list[Animal], a function accepting list[Animal] could append a Cat to a list that the caller guaranteed contains only dogs. Mypy surfaces this correctly. The solution is usually to change the annotation to use Sequence[Animal] (which is covariant and read-only) if you only need to read from the list.

Can I use TypeVar and the PEP 695 bracket syntax in the same file?

Yes — different classes and functions in the same file can freely use different syntax. What you cannot do is mix them within a single class definition: a class that uses bracket-style type parameters cannot also specify a traditional Generic[T] base class using an old-style TypeVar. PEP 695 documents this as producing a runtime error.

What is the difference between a bounded TypeVar and a constrained TypeVar?

A bounded TypeVar (bound=SomeClass) accepts SomeClass or any subtype, and the type checker preserves the most specific type — passing a Dog returns a Dog, not just an Animal. A constrained TypeVar (TypeVar('T', str, bytes)) accepts only the exact listed types and resolves to whichever constraint matches, never to a subtype of one.

What is PEP 695 and what did it change in Python 3.12?

PEP 695, introduced in Python 3.12, redesigned how generic types are declared. Instead of writing T = TypeVar('T') and class Stack(Generic[T]):, you write class Stack[T]: — no imports required for simple cases. Type parameters are scoped locally to the class or function, and variance is inferred automatically by the type checker rather than declared manually.

What is variance in Python generics?

Variance describes how subtype relationships between concrete types carry over into generic types. Invariant (the default) means Generic[Dog] is unrelated to Generic[Animal] even if Dog is a subtype of Animal — used for mutable containers. Covariant means Generic[Dog] is a subtype of Generic[Animal] — safe for read-only producers. Contravariant reverses the relationship — used for write-only consumers and callbacks.

What is ParamSpec in Python?

ParamSpec captures the full parameter specification of a callable — the names, types, and ordering of its arguments. It is primarily used to write typed decorators that preserve the signature of the function they wrap. Without ParamSpec, the type checker cannot verify that callers pass the correct arguments to a wrapped function. The equivalent bracket syntax is [**P].

What did Python 3.13 add to the generics system?

Python 3.13 implemented PEP 696, which added default values for type parameters. This lets library authors write class Box[T = int]: so that Box() without an explicit type argument is treated as Box[int] rather than leaving the type parameter unresolved. This mirrors the behavior of default type arguments in languages like TypeScript and C++.

Can I still use the old TypeVar syntax in Python 3.12 and later?

Yes. The old TypeVar and Generic syntax is fully supported and not deprecated in Python 3.12 or later. The two syntaxes are semantically equivalent for most purposes. Projects that must support Python 3.10 or 3.11 should continue using the old syntax. You cannot mix old and new syntax within the same class definition.

What is the difference between Generic and Protocol in Python?

Generic uses nominal subtyping: a class must explicitly inherit from the base to satisfy the type. Protocol uses structural subtyping: any class that implements the required methods and attributes satisfies the protocol, regardless of its inheritance chain. Use Generic when you need to preserve a specific type through a function or class. Use Protocol when you want to describe what a type can do without requiring explicit inheritance — this is statically checkable duck typing.

Are Python type annotations enforced at runtime?

No. Python uses type erasure: TypeVar, Generic, and all type annotations are metadata for type checkers and IDEs, not runtime constraints. Stack[int] and Stack[str] are identical objects at runtime — no specialized class is generated per type argument. A type violation that mypy or pyright would flag as an error will not raise an exception when the program executes. This makes running a type checker as part of your CI pipeline essential if you want these annotations to do their job.

When should I not use TypeVar?

Avoid TypeVar when the function does not preserve the input type in its output — use Union or a concrete type instead. Avoid it when a TypeVar would only appear once in a signature, since a singly-used TypeVar is equivalent to object and conveys nothing to callers or the type checker. Also avoid it when the function only needs to call methods all objects share, or when a Protocol would express the structural requirement more directly. The signal for genuinely needing a TypeVar is that the caller must know the output type is the same as — or derived from — the input type.

What is the Self type in Python and when should I use it?

Self (introduced in Python 3.11 via PEP 673) is a special type that always refers to the type of the current instance. It replaces the bounded-TypeVar workaround previously needed for fluent APIs and method chaining. When a method annotated with -> Self is called on a Dog instance, the type checker knows the return type is Dog, not just its base class. Self also works on @classmethod constructors. Use it any time a method returns self and you need the return type to stay specific under inheritance.

What is TypeGuard in Python and how does it differ from TypeIs?

TypeGuard[T] (Python 3.10, PEP 647) annotates a function as a custom type narrowing predicate. When the function returns True, the type checker narrows the argument to T inside the guarded branch. TypeIs[T] (Python 3.13, PEP 742) is a stricter variant: it is bidirectional, narrowing the type in both the true and false branches. Use TypeGuard for validation functions that confirm a value meets a type constraint. Use TypeIs when the check is logically exhaustive — the value either is or is not the target type — and you want both branches narrowed.

When should I use @overload instead of TypeVar?

Use @overload when a function accepts a small, fixed set of input types and the output type depends on which input was passed — especially when the relationship is asymmetric (for example, str in yields int out, bytes in yields float out). A constrained TypeVar always returns the same type it received, which cannot express that kind of mapping. @overload lets you declare each input-output pair separately so the type checker can pick the exact matching signature at each call site.

What is NewType and when is it better than a full generic class?

NewType creates a distinct named type that is a subtype of an existing one, with zero runtime overhead — at runtime the value is still the original type. The type checker treats the new type as distinct, so passing a UserId where an OrderId is expected becomes a type error even though both are plain ints. Use NewType when the same primitive type appears in semantically different roles and you want to prevent accidental substitution without the overhead of defining a wrapper class or a generic.

What is the Never type in Python and why is it useful in generic code?

Never (added to typing in Python 3.11) represents the bottom type — a value that can never exist. In generic dispatch over union types, annotating a fallback function's parameter as Never turns it into an exhaustiveness check: if any member of the union is left unhandled, the type checker flags the fallback call as an error because a value of the remaining type cannot satisfy Never. This means missing cases in a match or if-chain are caught at analysis time rather than at runtime.

How do I use multiple TypeVars in one generic class?

Declare a separate TypeVar for each independent type parameter and pass them all to Generic: class Pair(Generic[K, V]):. The order in the Generic base determines the subscript order, so Pair[str, int] means K=str and V=int. Each TypeVar must be distinct — passing the same one twice is an error. In Python 3.12+, the equivalent is class Pair[K, V]: with no imports needed.

Can I use Generic with dataclasses?

Yes. Combine @dataclass with Generic[T] (or the bracket syntax in Python 3.12+) to create type-aware data containers with an auto-generated __init__, __repr__, and __eq__. For example, @dataclass class Result(Generic[T]): lets you write Result[int](value=42) and the type checker will enforce that the value field matches the declared type argument.

What is the difference between TypeAlias and the type statement?

TypeAlias (introduced in Python 3.10 via PEP 613) was an annotation used to explicitly mark a variable assignment as a type alias rather than a regular variable. The type statement (introduced in Python 3.12 via PEP 695) replaces it with dedicated syntax: type Vector[T] = list[T]. The type statement supports generic parameters natively, uses lazy evaluation to avoid circular reference issues, and does not require any imports. TypeAlias has been deprecated since Python 3.12 in favor of the type statement.

How do I debug generic type inference with reveal_type()?

reveal_type(expr) is a directive that type checkers like mypy and pyright recognize. When they encounter it, they print the inferred type of the expression at analysis time. This is indispensable for understanding how a type checker resolves generic type parameters — for example, reveal_type(first([1, 2, 3])) will output Revealed type is 'builtins.int' in mypy. Python 3.11 added reveal_type() as a built-in that also works at runtime, but its primary value remains as a static analysis debugging tool.

Do mypy and pyright handle generics differently?

Both implement the same Python typing specification, but they are independently developed and occasionally diverge on edge cases. Pyright (used in VS Code's Pylance extension) tends to implement new PEPs faster and is generally stricter about variance violations. Mypy has broader community adoption and is sometimes more permissive in ambiguous cases. When writing generic libraries intended for broad use, running both checkers during development is the most reliable approach to ensuring your type annotations are specification-compliant.

Sources