What Does Tightly Coupled Mean in Python? Absolute Beginners Tutorial

Tightly coupled code means two pieces of a program depend so heavily on each other's internal details that changing one forces changes in the other. This tutorial explains what that means, shows every major form it takes in Python — from hard-wired constructors and inheritance to global state — and walks through how to fix each one.

No prior knowledge of design patterns is assumed. By the end, you will be able to look at a Python class and identify whether it is tightly or loosely coupled — and know exactly what to do about it.

What Coupling Means

Coupling is a measure of how much one piece of code knows about — and depends on — another piece of code. At one extreme, two parts of a program are completely independent and could be moved to separate projects without either noticing. At the other extreme, two parts are so intertwined that they are effectively one piece of code split across two files.

In any real codebase, some coupling is unavoidable. A Cart class that adds products to a shopping list has to know something about what a product is. The question is: how much does it need to know? A class that knows it needs something with a save method is fine. A class that knows it specifically needs a Database constructed with no arguments is the start of a problem.

Note

Coupling is not about how many classes or functions exist. It is about the nature of the connections between them. A program with one hundred loosely coupled functions is far easier to maintain than a program with five tightly coupled classes.

The Coupling Spectrum
tightly coupled loosely coupled
hardcoded
One class instantiates another directly inside its constructor. Impossible to substitute without editing source code.
self.db = Database()
injected
The dependency is passed in as a parameter. The caller controls what gets provided, so fakes can be used in tests.
def __init__(self, db):
protocol-based
The class depends only on a contract — a Protocol describing required methods. No concrete type is named at all.
db: Storage (Protocol)
pure function
No instance state at all. The function receives everything it needs as arguments and returns a value. Maximum isolation.
def place(item, save_fn):

Tight coupling is the situation where one part of your code reaches into the private internal workings of another part — knowing its exact class name, its internal variable structure, or how it does its job rather than just what it produces. When that internal detail changes, both sides break.

Loose coupling is the opposite: one part communicates with another through a clear, stable boundary — a defined method signature, a shared interface, or a simple function parameter. The internal details on either side can change freely as long as that boundary stays consistent.

Coupling and cohesion go together

Coupling and cohesion are two sides of the same design quality. Coupling measures how much one module depends on another. Cohesion measures how focused a single module is on one job. Higher cohesion — a class that does one thing well — almost always produces lower coupling, because a class with a clear single responsibility has fewer reasons to reach into other classes. When you see a tightly coupled class, it is often also doing too many things at once.

concept relationship map — hover each node to explore
DI fixes achieves strengthens enables alternative to causes Tight Coupling self.db = Database() Dependency Injection Loose Coupling def __init__(self, db): typing.Protocol structural interface Hard to Test no seam for a fake Easy to Test inject a fake object Composition HAS-A, not IS-A Inheritance tightest coupling replace with
hover or tap any node for a plain-English explanation
predict the output read the code, then choose what happens

What does Python print when the following code runs? Think through the object relationships before choosing.

class FakeStore: def __init__(self): self.records = [] def save(self, data): self.records.append(data) class Order: def __init__(self, db): self.db = db def place(self, item): self.db.save(item) store = FakeStore() order = Order(store) order.place("keyboard") order.place("monitor") print(len(store.records))
order.place("keyboard") calls self.db.save("keyboard"), which appends to store.records. Same for "monitor". Because Order received the same FakeStore instance through the constructor — dependency injection in action — both calls go to the same list. len(store.records) is 2. No AttributeError because FakeStore has a valid save method. This is the core testing pattern: inject a fake, then inspect it directly after the fact.
code builder click a token to place it

Build a class definition that creates another class directly inside its constructor — the classic tightly coupled pattern:

your code will appear here...
self.db Order = __init__ Database def ( ) self : class super().__init__() pass
Why: The correct sequence declares class Order: then defines its constructor def __init__(self): and immediately creates a Database instance inside it with self.db = Database(). This is the signature pattern of tight coupling: Order is responsible for creating its own dependency rather than receiving it from outside, which means Order and Database are now permanently linked.

What Tight Coupling Looks Like in Python

The clearest example of tight coupling is a class that creates another class inside its own constructor. Here is a straightforward version:

python
# Tightly coupled — Order creates its own Database
class Database:
    def save(self, data):
        print(f"Saving {data} to database")

class Order:
    def __init__(self):
        self.db = Database()   # hard-wired dependency

    def place(self, item):
        self.db.save(item)

Notice that Order.__init__ calls Database() directly. Order now has three hard-wired facts about Database: its exact class name, the fact that it takes no constructor arguments, and the assumption that it will always be there. If any of those facts changes — say you rename the class, add a required argument, or want to switch to a different storage system during testing — Order breaks along with it.

The two-second coupling check

Look at the constructor. If you see ClassName() — a class being instantiated inside another class — that is tight coupling. If you see self.thing = thing — an object being received and stored — that is dependency injection. One line tells you almost everything you need to know.

The accordion below compares the same three patterns so you can see how the dependency relationship changes across tight coupling, constructor injection, and a protocol.

Pattern
self.db = Database() inside __init__
Problem
Order is permanently linked to the concrete Database class. You cannot substitute anything else without modifying Order's source code.
Pattern
def __init__(self, db): then self.db = db
Effect
Order does not create anything. The caller decides which storage object to provide. A fake can be passed in during testing without changing Order at all.
Pattern
A Protocol defines that anything passed in must have a save(data) method. Order only knows about that contract.
Effect
Any object that happens to have a matching save method works, regardless of where it comes from or what it is named internally.

Why Tight Coupling Creates Real Problems

Tight coupling is not just a style concern. It creates three concrete obstacles as a codebase grows, and each one compounds the others.

Testing becomes difficult. When a class creates its own dependencies internally, you cannot isolate it. Testing Order from the example above means you also invoke real Database logic. If the database is slow, unavailable, or writes to real storage during a test, your test suite becomes fragile and slow.

Reuse is blocked. Suppose a second part of the same application needs to place orders but write them to a file instead of a database — perhaps for an audit log or an offline export. With the tightly coupled version, that is impossible without duplicating or rewriting Order. The class and its storage mechanism are one fused unit. With dependency injection, you pass a different storage object and the same Order class handles both cases without a single line of it changing.

Change propagates unpredictably. A modification to the internal structure of Database — even something as small as renaming a private method — forces you to trace every class that created a Database instance and check whether it is affected.

Watch out

Tight coupling tends to accumulate silently. Each individual shortcut looks harmless in isolation. It is only when you try to change something — or write a test — that the full chain of locked-in dependencies becomes visible.

check your understanding question 1 of 3

Here is a demonstration of the testing problem. Notice how the tightly coupled version has no seam where a fake can be inserted:

python
# Tightly coupled — no way to swap in a fake database during tests
class Order:
    def __init__(self):
        self.db = Database()   # always uses the real Database

# Loosely coupled — pass in whatever you like
class Order:
    def __init__(self, db):
        self.db = db           # caller decides

# In a test you can now do:
class FakeStorage:
    def save(self, data):
        self.saved = data      # records what was saved without touching disk

order = Order(FakeStorage())
order.place("bicycle")
assert order.db.saved == "bicycle"

The loosely coupled version lets you write a FakeStorage class in your test file and pass it straight in. The tightly coupled version has no such entry point.

A seam is a point in your code where you can substitute one thing for another without modifying the surrounding code. Think of it like a physical seam in a garment: it is where two pieces connect, and where you would open the stitching if you needed to swap one piece out.

When a class creates its own dependency internally — self.db = Database() — there is no seam. The class and its dependency are one continuous piece of fabric. To insert a fake, you would have to cut into the middle of the fabric itself, which is what mock.patch does. It reaches into the module internals and temporarily replaces a name.

When a class accepts a dependency through a constructor parameter, the parameter is the seam. No patching is needed. You simply provide a different object at the join point. This is why the presence or absence of a seam is the concrete, practical difference between tightly and loosely coupled code.

spot the bug click the line that contains the bug

This code tries to use dependency injection to loosen coupling, but one line undoes all of that. Find it.

1 class Notifier:
2 def __init__(self, sender):
3 self.sender = EmailSender() # ignores the parameter!
4 def notify(self, message):
5 self.sender.send(message)
The fix: Change line 3 to self.sender = sender. The constructor accepts a sender parameter, but then ignores it and creates a hard-wired EmailSender() anyway. This is a common mistake when refactoring towards dependency injection — the parameter is added to the signature, but the old hard-coded instantiation is never removed.

How to Reduce Tight Coupling in Python

Work through these strategies in order of escalating flexibility. You do not have to reach the last step on every class — step one alone removes a significant amount of unnecessary dependency. Steps four through six address scenarios that most tutorials skip entirely.

  1. Find the hard-wired dependency

    Scan your class for any line inside __init__ or a method that calls SomeClass() to create a new object. That construction call is the coupling. The class that calls it now knows the name and interface of the class it is creating, and both are permanently linked.

  2. Move the creation outside (dependency injection)

    Change the constructor to accept the dependency as a parameter — def __init__(self, db): — then assign it: self.db = db. This is dependency injection. The caller now decides what to pass in, and your class no longer needs to know how to build the thing it uses.

  3. Define what you need, not how it works (typing.Protocol)

    If you want maximum flexibility, describe the expected behaviour with a typing.Protocol. Your class then depends only on the protocol — a description of the methods it needs — rather than on any specific class. Anything in the codebase that happens to implement those methods can be passed in without inheritance or modification.

  4. Replace a shared concrete class with an adapter

    When you depend on a third-party class you cannot modify — say a library's S3Client — do not inject the library class directly. Write a thin adapter class that wraps it and satisfies your own Protocol. Your business logic stays coupled only to your protocol, not to the library's interface. If you later swap S3Client for a local filesystem, you write a second adapter and change one line at the composition root. Nothing inside your application notices.

  5. Decouple producers from consumers with a callable or event

    Some coupling cannot be solved by injecting a collaborator — the problem is that two classes need to react to the same event without either one knowing about the other. The pattern here is to pass a callback or use a simple in-process event dispatcher. Class A fires an event or calls a registered callable; Class B registers its handler at the composition root. Neither A nor B imports the other, and adding a third subscriber requires no changes to A at all. This is the function-level equivalent of the Observer pattern and is especially useful for decoupling side effects — logging, metrics, notifications — from the business logic that triggers them.

  6. Use a DI framework when manual wiring becomes expensive

    Manual dependency injection scales well to dozens of classes, but very large applications accumulate a composition root that runs to hundreds of lines of wiring code. At that scale, a Python DI library such as dependency-injector or lagom can manage the wiring declaratively. These tools let you declare what each class needs and resolve the full object graph automatically. The key benefit is that you keep your business classes completely free of construction logic while the framework handles the wiring mechanics. The trade-off is an additional dependency and a learning curve — for most beginner and intermediate projects, manual injection with a well-organised main.py is the right call.

The adapter pattern: shielding your code from third-party interfaces

An adapter is a wrapper class that translates between your code's interface and a third-party library's interface. Your business classes depend only on the adapter's interface — which you define and control — so the library's internals never reach into your code. Step four above mentions this pattern. Here is what it looks like in Python.

python
from __future__ import annotations
from typing import Protocol

# Your Protocol — the contract your code depends on
class FileStore(Protocol):
    def upload(self, key: str, data: bytes) -> None: ...

# Third-party library class you do not control
class boto3S3Client:
    def put_object(self, Bucket, Key, Body): ...   # their naming, not yours

# Adapter: wraps the library, satisfies your Protocol
class S3Adapter:
    def __init__(self, client: boto3S3Client, bucket: str) -> None:
        self._client = client
        self._bucket = bucket

    def upload(self, key: str, data: bytes) -> None:
        # translate your interface into the library's interface
        self._client.put_object(Bucket=self._bucket, Key=key, Body=data)

# Your business class only knows about FileStore — never boto3
class AssetManager:
    def __init__(self, store: FileStore) -> None:
        self.store = store

    def save_asset(self, name: str, content: bytes) -> None:
        self.store.upload(name, content)

# Composition root — only place boto3 is named
client = boto3S3Client()
store = S3Adapter(client, bucket="my-assets")
manager = AssetManager(store)

# For tests — no AWS credentials needed
class LocalStore:
    def __init__(self): self.files: dict[str, bytes] = {}
    def upload(self, key: str, data: bytes) -> None:
        self.files[key] = data

manager_test = AssetManager(LocalStore())

If AWS changes the put_object signature in a future boto3 release, only S3Adapter needs updating. AssetManager does not exist as far as boto3 is concerned, and boto3 does not exist as far as AssetManager is concerned. The adapter is the only place those two worlds touch.

Decoupling side effects with callbacks

When a class triggers a side effect — logging a purchase, sending a notification, recording a metric — the naive solution is to call those services directly. That introduces three coupling points where one business operation existed. The cleaner approach is to accept a list of callable hooks at construction time and call them after the business action completes.

python
from __future__ import annotations
from typing import Callable

# Order knows nothing about loggers, mailers, or metrics
class Order:
    def __init__(
        self,
        db,
        on_placed: list[Callable[[str], None]] | None = None
    ) -> None:
        self.db = db
        self._on_placed = on_placed or []

    def place(self, item: str) -> None:
        self.db.save(item)
        for handler in self._on_placed:
            handler(item)   # fire every registered callback

# Individual side-effect functions — fully independent
def log_order(item: str) -> None:
    print(f"[LOG] Order placed: {item}")

def send_confirmation(item: str) -> None:
    print(f"[EMAIL] Confirmation sent for: {item}")

def record_metric(item: str) -> None:
    print(f"[METRIC] +1 order: {item}")

# Composition root: wire side effects without touching Order
order = Order(
    db=real_db,
    on_placed=[log_order, send_confirmation, record_metric]
)
order.place("notebook")

# Test: no side effects at all — pass an empty list or a recorder
received: list[str] = []
order_test = Order(db=FakeDb(), on_placed=[received.append])
order_test.place("notebook")
assert received == ["notebook"]

Adding a fourth side effect later — say, triggering a warehouse pick list — requires only adding a new function and registering it at the composition root. Order does not change at all. This is the same decoupling principle as dependency injection, applied to side effects rather than collaborators.

Python version note

typing.Protocol was added in Python 3.8, introduced by PEP 544. If you need runtime isinstance() checks against a Protocol, decorate it with @runtime_checkable. Without that decorator, Protocols are used only at static type-checking time — tools like mypy and pyright will catch violations, but Python itself will not raise an error at runtime if an incompatible object is passed.

python
from __future__ import annotations
from typing import Protocol

# Step 3: define only the contract you need
class Storage(Protocol):
    def save(self, data: str) -> None: ...

# Order depends on the Protocol, not on any concrete class
class Order:
    def __init__(self, db: Storage) -> None:
        self.db = db

    def place(self, item: str) -> None:
        self.db.save(item)

# Production usage
class PostgresDatabase:
    def save(self, data: str) -> None:
        print(f"Persisting {data!r} to Postgres")

# Test usage — no Postgres needed
class InMemoryStorage:
    def __init__(self):
        self.records: list[str] = []
    def save(self, data: str) -> None:
        self.records.append(data)

order = Order(InMemoryStorage())
order.place("notebook")
print(order.db.records)   # ['notebook']
"Program to an interface, not an implementation." — Gamma, Helm, Johnson & Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software (Addison-Wesley, 1995), p. 18

That principle — from the book published in October 1994 with a 1995 copyright, by the authors now known as the Gang of Four — is exactly what dependency injection and typing.Protocol make possible in Python. Your class programs to a description of what it needs — a save method — rather than to a concrete PostgresDatabase class. The concrete class can be swapped at any point without touching the class that depends on it.

Protocol vs. Abstract Base Class: which should you use?

Python offers two ways to define a shared interface: typing.Protocol and abc.ABC. They solve the same problem differently.

How it works
Any class that has the required methods automatically satisfies the Protocol — no inheritance required. The check is structural: "does it have the right shape?"
Best for
When you want maximum flexibility, when the classes you are working with are from third-party libraries you cannot modify, or when you want Python's duck typing to remain natural.
Gotcha
Without @runtime_checkable, Protocols are invisible at runtime. A wrong type can be passed without Python raising an error — only a static type checker will catch it.
How it works
A class must explicitly inherit from the ABC and implement every abstract method. Forgetting to implement one raises TypeError at instantiation time.
Best for
When you control all the classes involved and want Python itself to enforce the contract at runtime — not just at type-check time.
Gotcha
Requires inheritance. Classes from external libraries cannot satisfy your ABC unless they explicitly subclass it, which limits flexibility.

For beginners reducing tight coupling, either works. typing.Protocol is the more Pythonic and flexible choice for interfaces you define yourself. Use abc.ABC when you want Python to loudly break at runtime if a required method is missing.

How to Refactor Existing Tightly Coupled Code

Understanding the concept is one thing — applying it to code that is already in production is another. The concern most beginners have is: if I change the constructor, everything that creates this class will break. That is a real constraint, and the solution is to refactor in steps rather than all at once.

  1. Add a default parameter first

    Change def __init__(self): to def __init__(self, db=None): and then self.db = db or Database(). This is a non-breaking change — all existing call sites still work with no arguments. You now have a seam: tests can pass a fake, and production code is unchanged. This single step eliminates the hardest part of the migration.

  2. Write tests against the new seam

    Now that db is injectable, write test coverage passing a fake. This gives you a safety net before you touch the call sites. Do not skip this step — it is the whole point of the exercise. Once you have tests, you can refactor the call sites with confidence.

  3. Migrate call sites to explicit injection

    Update each place in the codebase that calls MyClass() to instead call MyClass(db=real_db), passing the real object from outside. Work through them one at a time and run tests after each change. When all call sites pass an explicit argument, remove the default and the fallback construction inside the constructor.

  4. Move all construction to the composition root

    Once every call site is explicit, gather all the concrete object creation into one place — typically main.py or an application factory function. This is the composition root. From this point forward, your business classes never name or construct their own dependencies. The coupling is gone.

python
# Step 1 — non-breaking: add an optional parameter
# All existing callers still work unchanged
class Order:
    def __init__(self, db=None):
        self.db = db or Database()   # fallback keeps production working

# Step 2 — tests can now inject a fake immediately
class FakeDb:
    def __init__(self): self.saved = []
    def save(self, data): self.saved.append(data)

order = Order(db=FakeDb())   # test passes a fake, no patch needed

# Step 3 — migrate call sites one by one
# Before: order = Order()
# After:  order = Order(db=real_database)

# Step 4 — remove the default, construction lives only in main.py
class Order:
    def __init__(self, db):   # no default, no fallback — fully decoupled
        self.db = db
How far to go

You do not have to complete all four steps at once, or even reach step four on every class. Step one alone — adding an optional parameter — is a net improvement and can be committed independently. If a class has no tests and no urgency to change, stopping at step two (having written the first test) is still progress. Refactoring toward loose coupling is incremental work, not a rewrite.

Where Do You Actually Create the Objects? The Composition Root

When beginners first learn dependency injection, a natural question appears: if the class no longer creates its own dependencies, who does? The answer is the composition root — a single place in your program (usually main.py or your application factory) where all concrete objects are created and wired together.

This is not just a convention. Centralising all construction in one place means that changing a dependency — say, swapping a Postgres database for SQLite in a test environment — requires only a one-line change in the composition root rather than hunting through every class that touches the database.

python
# composition_root.py  — the ONLY place concrete classes are named
from __future__ import annotations
from typing import Protocol

class Storage(Protocol):
    def save(self, data: str) -> None: ...

class PostgresStorage:
    def save(self, data: str) -> None:
        print(f"Persisting {data!r} to Postgres")

class InMemoryStorage:
    def __init__(self):
        self.records: list[str] = []
    def save(self, data: str) -> None:
        self.records.append(data)

class Order:
    def __init__(self, db: Storage) -> None:
        self.db = db
    def place(self, item: str) -> None:
        self.db.save(item)

class ReportGenerator:
    def __init__(self, db: Storage) -> None:
        self.db = db
    def run(self, label: str) -> None:
        self.db.save(f"report:{label}")

def build_app(use_real_db: bool = True):
    """Composition root: all concrete decisions live here."""
    storage = PostgresStorage() if use_real_db else InMemoryStorage()
    # Every class receives the same storage object — wired once
    return Order(storage), ReportGenerator(storage)

# Production
order, report = build_app(use_real_db=True)
order.place("bicycle")

# Test / CI environment — zero database required
order_test, report_test = build_app(use_real_db=False)
order_test.place("bicycle")
print(order_test.db.records)   # ['bicycle']

Notice that Order and ReportGenerator never mention PostgresStorage or InMemoryStorage by name. The entire swap between environments happens in one function. That is the practical payoff of eliminating tight coupling.

When Tight Coupling Is Acceptable

Not all coupling is a problem worth solving. In Python, there are several common cases where tight coupling between two things is the correct, intentional choice.

Coupling is fine here

The goal is not to eliminate all coupling — that would produce code with no relationships between its parts. The goal is to eliminate coupling that hides intent, blocks testing, or forces changes to spread unpredictably.

Value objects and dataclasses. A @dataclass that represents a product with a name and price is fully coupled to those two fields. That is its entire job. Injecting the price type through a Protocol would be over-engineering. The coupling is stable, intentional, and contained.

Classes that only contain data. If a class has no external side effects — it performs no I/O, makes no network calls, and writes to no shared state — the risk of tight coupling is low. You can still test it without substituting anything, because there is nothing to substitute.

Internal implementation details. When a public class uses a private helper class that is never exposed outside its own module, the coupling between them is a private implementation concern. If both classes change together, that is acceptable — they are one logical unit split across two objects for clarity.

Enumerations. A class that accepts an Enum value as a parameter and switches on it is tightly coupled to that enum. This is usually correct: enums are stable, version-controlled, and part of the public interface you intended to expose.

The practical rule: tight coupling between two things in the same layer and module, where neither thing needs to be swapped or tested in isolation, is usually fine. Tight coupling across layers — where a high-level business class hardcodes a low-level infrastructure class — is where problems appear.

Coupling in Functions, Not Just Classes

Classes are the most common place to find tight coupling, because constructors make the dependency creation visible. But the same principle applies to plain functions. A function that calls another function by name, rather than accepting a callable as an argument, is tightly coupled to that specific function.

tightly coupled function
import smtplib

def notify_user(email):
    # hard-wired to SMTP — cannot
    # be tested without a mail server
    server = smtplib.SMTP("smtp.example.com")
    server.sendmail("[email protected]",
                    email, "Hello")
loosely coupled function
def notify_user(email, send_fn):
    # caller provides any callable
    # — a real mailer or a test stub
    send_fn(email, "Hello")

# Production:
notify_user(email, real_send)

# Test — no SMTP needed:
notify_user(email, lambda e, m: None)

The loosely coupled version accepts a send_fn callable. In production, you pass in a real sending function. In a test, you pass in a lambda that does nothing, records what it received, or raises an intentional error. No mocking framework is required — just a function argument.

spot the bug click the line that contains the bug

This function was supposed to be loosely coupled so it could be tested without a real database. One line breaks that. Find it.

1 def save_order(item, store_fn):
2 record = {"item": item, "saved": True}
3 if not store_fn:
4 store_fn = PostgresStore().save # hard-wires a fallback!
5 store_fn(record)
The fix: Remove lines 3 and 4 entirely. The function accepts store_fn as a parameter, which is correct — but then falls back to constructing a hard-wired PostgresStore() when nothing is passed. That fallback re-introduces the tight coupling the parameter was meant to prevent. The caller should always provide a valid callable; if a default is genuinely needed, bind it with functools.partial at the composition root, not inside the function.

Python's functools.partial makes this pattern practical without verbosity. If a function needs a preconfigured callable with some arguments already bound, partial creates it at the composition root:

python
from functools import partial

def send_email(host, port, from_addr, to_addr, message):
    print(f"Sending via {host}:{port} to {to_addr} — {message}")

# Composition root: bind the infrastructure details once
configured_send = partial(send_email, "smtp.example.com", 587, "[email protected]")

# The rest of the codebase only sees a simple two-argument callable
def notify_user(email, send_fn):
    send_fn(email, "Your order has shipped.")

notify_user("[email protected]", configured_send)

Nothing in notify_user knows about SMTP, host names, or ports. Those infrastructure details are bound at one place — the composition root — and injected as a plain callable. This is the function-level equivalent of dependency injection for classes.

Circular Imports as a Coupling Signal

If you have ever seen Python raise ImportError: cannot import name 'X' from partially initialized module 'Y', you have encountered a circular import. Two modules each import from the other, and Python cannot finish loading either one. This error is one of the clearest symptoms of tight coupling across module boundaries.

Circular imports arise when two modules are so intertwined that neither can exist without the other. This is not a Python limitation to work around — it is a design signal telling you that the two modules are doing too much together and should be separated.

python
# orders.py — imports from notifications.py
from notifications import Notifier

class Order:
    def place(self, item):
        Notifier().send(f"Placed: {item}")

# notifications.py — imports from orders.py  (circular!)
from orders import Order

class Notifier:
    def send(self, message):
        print(message)
        # Notifier is reaching back into Order — why?

The standard resolution is to introduce a shared interface or move the shared concern to a third module that neither depends on the other. If Notifier does not actually need to know about Order, the import can simply be removed. If it does need some data from an order, that data should be passed as a plain value — a string, a number, a dataclass — rather than importing the full Order class.
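
As a sketch of that resolution, shown in one file for brevity with the module boundaries marked in comments: Notifier keeps no import of orders at all, and Order receives its notifier from outside and passes only a plain string.

```python
# notifications.py — no import of orders anywhere
class Notifier:
    def send(self, message: str) -> None:
        # Receives a plain string; knows nothing about Order
        print(message)

# orders.py — depends only on "something with a send method"
class Order:
    def __init__(self, notifier):
        self.notifier = notifier   # injected, which breaks the cycle

    def place(self, item: str) -> None:
        # Pass a plain value, not the Order object itself
        self.notifier.send(f"Placed: {item}")

# Composition root wires the two together
order = Order(Notifier())
order.place("notebook")   # prints "Placed: notebook"
```

Because Order now accepts any object with a send method, a test can inject a fake notifier and assert on the messages it received, with neither module importing the other.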

Practical check

If you resolve a circular import by moving the import inside a function body (def some_method(self): from orders import Order), you have deferred the coupling rather than removed it. The error disappears because the import now runs lazily at call time, after both modules have finished loading, but the circular dependency is still there in the design. The correct fix is to restructure the dependency, not to hide it.

Tight Coupling and Inheritance

Inheritance is the tightest form of coupling Python offers. When class Dog inherits from class Animal, it does not just depend on Animal's public interface — it depends on every internal attribute, every method implementation, and every future change to the parent class. That is a much deeper dependency than constructor injection.

This is why the principle "favour composition over inheritance" appears so often in software design literature. Composition — where a class holds an instance of another class rather than extending it — gives you the behaviour you need without locking the two classes together at the structural level.

inheritance — tightest coupling

python
class FileLogger:
    def log(self, msg):
        print(f"[FILE] {msg}")

# Order IS-A FileLogger — deeply coupled
class Order(FileLogger):
    def place(self, item):
        self.log(f"Placed: {item}")

# Cannot swap the logger without
# changing Order's class hierarchy

composition — loose coupling

python
class FileLogger:
    def log(self, msg):
        print(f"[FILE] {msg}")

# Order HAS-A logger — loosely coupled
class Order:
    def __init__(self, logger):
        self.logger = logger

    def place(self, item):
        self.logger.log(f"Placed: {item}")

# Swap to any logger — no hierarchy change

Inheritance is not always wrong. It makes sense when you genuinely have an is-a relationship that is stable across the lifetime of the codebase — a Square that truly is a Shape, or an HTTPException that truly is an Exception. The warning sign is inheriting purely to get access to another class's methods, with no meaningful is-a relationship. That is tight coupling dressed up as code reuse.

A reliable test: substitute the child class anywhere the parent class is expected. If that substitution always makes sense in plain English, the inheritance is genuine. A Dog substituted for an Animal makes sense. A ReportBuilder substituted for a TextFormatter — wherever a formatter is expected — does not make sense, because a report builder is not a formatter. It just happens to need one.

The deeper intuition: IS-A relationships are about identity and classification. HAS-A relationships are about tools and collaborators. A car IS-A vehicle. A car HAS-A engine. If you find yourself saying "X IS-A Y so that it can call Y's methods," that sentence is backward — what you actually mean is "X HAS-A Y and delegates certain work to it."

When you use composition, you preserve the freedom to swap the collaborator. ReportBuilder(formatter=PlainFormatter()) can just as easily become ReportBuilder(formatter=HTMLFormatter()) without touching the ReportBuilder class at all.
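
A minimal sketch of that swap, with illustrative PlainFormatter and HTMLFormatter classes standing in for real formatters:

```python
class PlainFormatter:
    def format(self, text: str) -> str:
        return text

class HTMLFormatter:
    def format(self, text: str) -> str:
        return f"<h1>{text}</h1>"

class ReportBuilder:
    def __init__(self, formatter):
        self.formatter = formatter   # HAS-A: a collaborator, not a parent

    def build(self, title: str) -> str:
        return self.formatter.format(title)

# Swapping the collaborator requires no change to ReportBuilder
print(ReportBuilder(PlainFormatter()).build("Q3 Sales"))   # Q3 Sales
print(ReportBuilder(HTMLFormatter()).build("Q3 Sales"))    # <h1>Q3 Sales</h1>
```

ReportBuilder never names either formatter class; the caller decides which one to supply.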

Spot the bug

This code uses inheritance to give ReportBuilder access to a formatting method. One line reveals the design problem. Find it.

1 class TextFormatter:
2     def format(self, text): return text.upper()
3
4 class ReportBuilder(TextFormatter):   # IS ReportBuilder a TextFormatter?
5     def build(self, title):
6         return self.format(title)
The problem: Line 4 inherits from TextFormatter solely to access its format method — there is no genuine is-a relationship. A report builder is not a text formatter. This is composition disguised as inheritance.

The fix: Remove the inheritance and accept a formatter through the constructor: def __init__(self, formatter): self.formatter = formatter, then call self.formatter.format(title). Now any object with a format method can be passed in — including a mock in tests — and ReportBuilder is free from the internal details of TextFormatter.

The fragile base class problem

When a parent class changes an internal method — even one the child class does not explicitly call — it can silently break all subclasses. This is called the fragile base class problem and it is a direct consequence of inheritance being the deepest form of tight coupling. A compositional dependency does not have this problem: changing the internals of the composed object leaves the class that holds it untouched, as long as the public interface stays the same.

A practical rule: if you find yourself inheriting from a class only to override one method, composition with a typing.Protocol almost always produces a better design. The child class becomes a collaborator rather than a specialisation, and you gain the ability to swap it independently.
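
As a sketch of that rule, a small typing.Protocol can describe the one method the class actually needs; the FileLogger and Order names echo the earlier example:

```python
from typing import Protocol

class Logger(Protocol):
    def log(self, msg: str) -> None: ...

class FileLogger:
    def log(self, msg: str) -> None:
        print(f"[FILE] {msg}")

class ListLogger:
    def __init__(self) -> None:
        self.lines: list[str] = []
    def log(self, msg: str) -> None:
        self.lines.append(msg)

class Order:
    def __init__(self, logger: Logger) -> None:
        self.logger = logger   # a collaborator satisfying the Protocol

    def place(self, item: str) -> None:
        self.logger.log(f"Placed: {item}")

# Both loggers satisfy Logger structurally — no inheritance anywhere
fake = ListLogger()
Order(fake).place("lamp")
assert fake.lines == ["Placed: lamp"]
```

Neither logger inherits from anything. Order depends only on the contract, and a test double like ListLogger drops in without patching.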

Global State as Hidden Coupling

Global variables and module-level singletons are a form of tight coupling that is harder to see than a ClassName() call in a constructor, because the dependency is invisible in the class signature. A class that reads from or writes to a global variable is coupled to every other piece of code that touches that same global — without any of those relationships appearing in a constructor or method parameter.

python
# globals.py
current_user = None   # global state

# orders.py — reads global without declaring it as a dependency
import globals

class Order:
    def place(self, item):
        # This class is coupled to globals.current_user
        # but nothing in its signature reveals that
        if globals.current_user is None:
            raise ValueError("No user logged in")
        print(f"{globals.current_user} placed: {item}")

The Order class above looks like it only needs an item string — in reality it also depends on globals.current_user being set correctly before it runs. Testing it in isolation requires you to set up and tear down global state around every test, which makes tests order-dependent and brittle. The dependency is real but invisible.

The fix is the same as for any tight coupling — make the dependency explicit:

python
# No globals — user is an explicit parameter
class Order:
    def __init__(self, user: str) -> None:
        self.user = user

    def place(self, item: str) -> None:
        print(f"{self.user} placed: {item}")

# Dependency visible, injectable, testable
order = Order(user="alice")
order.place("notebook")

How to spot hidden global coupling

Search your class bodies for any reference to a name that was not received as a constructor parameter or method argument — and is not a built-in. If the name lives at module level in another file, you have a hidden dependency. The class cannot be understood or tested without also knowing the state of that external name at the moment it runs.
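
One mechanical way to run that search is a function's __code__.co_names attribute, which lists every global and attribute name the compiled function references. It is a coarse tool, since attribute names appear there too, but an empty tuple is a strong signal that nothing hidden is being read:

```python
def uses_global(item):
    # 'current_user' is read from module scope — a hidden dependency
    return f"{current_user} placed: {item}"

def uses_param(user, item):
    # Every name arrives as a parameter — nothing hidden
    return f"{user} placed: {item}"

# co_names reveals the hidden name without running the function
print(uses_global.__code__.co_names)   # ('current_user',)
print(uses_param.__code__.co_names)    # ()
```

Note that defining uses_global does not raise an error even though current_user does not exist; the NameError would only appear when the function is called, which is exactly the late-breaking fragility hidden coupling produces.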

mock.patch vs Dependency Injection

When beginners discover that their tightly coupled code is hard to test, a common first instinct is to reach for unittest.mock.patch. This works — it temporarily replaces a name in a module's namespace so the test sees a fake instead of the real object. But it addresses the symptom rather than the underlying design.

python
from unittest.mock import patch, MagicMock

# The class is still tightly coupled — patch papers over it for the test
class Order:
    def __init__(self):
        self.db = Database()   # hard-wired

def test_order_place():
    mock_db = MagicMock()
    # patch replaces 'Database' in the module namespace during this test only
    with patch('orders.Database', return_value=mock_db):
        order = Order()
        order.place("bicycle")
        mock_db.save.assert_called_once_with("bicycle")

This test passes, but the coupling remains. Anyone reading Order's constructor still cannot tell what it needs without reading the implementation. The test setup is fragile — it depends on knowing the exact import path 'orders.Database', so renaming a module or moving a class silently breaks the patch target. And the test cannot run without understanding how patch works.

Compare to the same test with dependency injection:

python
class Order:
    def __init__(self, db):
        self.db = db   # injected — dependency visible in the signature
    def place(self, item):
        self.db.save(item)

class FakeDb:
    def __init__(self): self.calls = []
    def save(self, data): self.calls.append(data)

def test_order_place():
    fake = FakeDb()
    order = Order(fake)     # no patching, no magic, no import paths
    order.place("bicycle")
    assert fake.calls == ["bicycle"]

When mock.patch is the right choice

mock.patch is the right tool when you do not own the code you are testing against — for example when patching a third-party library, a built-in like open, or a network call inside code you cannot refactor. For code you own and control, dependency injection produces a cleaner and more durable design. Use patch as a last resort on boundaries you cannot move, not as a substitute for good dependency design.
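
For example, patching the built-in open is a legitimate use, because you cannot refactor Python's file API to accept an injected dependency. A sketch using the standard library's mock_open helper (the read_greeting function is hypothetical):

```python
from unittest.mock import patch, mock_open

# A function coupled to the built-in open() — a boundary we cannot move
def read_greeting(path):
    with open(path) as f:
        return f.read().strip()

def test_read_greeting():
    # mock_open builds a fake file object; patch swaps it in for 'open'
    with patch("builtins.open", mock_open(read_data="hello\n")):
        assert read_greeting("greeting.txt") == "hello"

test_read_greeting()
```

Here patch is the right tool precisely because open is a boundary owned by Python itself, not by our code.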

Cohesion and Coupling Together

Coupling measures how much one module depends on another. Cohesion measures how focused a single module is on doing one clear job. The two concepts always travel together in software design because a class with low cohesion — one that does many unrelated things — almost always produces high coupling. Each extra job the class does forces it to reach into more external classes to get what it needs.

A class that handles user authentication, sends email, and writes to a database is doing three jobs. To do all three it must know about an email server, a database connection, and a session store. Each of those is a dependency — and each one is a coupling point. Splitting the class into three focused classes, each with one job, typically reduces coupling as a side effect because each smaller class needs far fewer external dependencies.

The design goal

Good module design aims for high cohesion and low coupling. Each module does one thing well (high cohesion) and communicates with other modules only through clear, minimal interfaces (low coupling). When you find a class that is hard to decouple, look first at whether it is trying to do too much. Splitting the responsibilities often dissolves the coupling naturally.

The Single Responsibility Principle from SOLID — the idea that a class should have only one reason to change — is a practical statement of the same idea. A class with one reason to change has one job. A class with one job needs fewer external dependencies. Fewer external dependencies means lower coupling.

Think about it from the dependency count perspective. A class that does three jobs — authentication, email, and caching — must know about an auth store, a mail server, and a cache. That is three external dependencies, three coupling points, and three reasons the class might break when something outside it changes.

Now split it into three focused classes. The UserRegistrar only knows about a database. The WelcomeMailer only knows about a mailer. The SessionCache only knows about a cache. Each class has one coupling point. Each can be understood, tested, and changed without thinking about the other two.

This is why refactoring toward high cohesion and refactoring toward low coupling so often produce the same result. They are different lenses on the same underlying design quality: does each piece of code do one thing, with one reason to change, needing one minimal set of collaborators?

python
# Low cohesion — one class, three jobs, three dependency coupling points
class UserManager:
    def __init__(self):
        self.db = PostgresDatabase()    # coupling point 1
        self.mailer = SMTPMailer()      # coupling point 2
        self.cache = RedisCache()       # coupling point 3

    def register(self, email, password): ...
    def send_welcome_email(self, email): ...
    def cache_session(self, user_id): ...

# High cohesion — each class has one job, one dependency
class UserRegistrar:
    def __init__(self, db): self.db = db
    def register(self, email, password): ...

class WelcomeMailer:
    def __init__(self, mailer): self.mailer = mailer
    def send(self, email): ...

class SessionCache:
    def __init__(self, cache): self.cache = cache
    def store(self, user_id): ...

Each of the three focused classes above has one constructor parameter. Each can be tested with a single fake. The coupling still exists — each class still depends on something — but the dependencies are explicit, minimal, and swappable. This is what high cohesion produces: lower coupling as a natural consequence.

Temporal Coupling

Most of the coupling discussed so far is structural — one class knows about the internals of another. There is a second, less-discussed form called temporal coupling, which is about when things must happen rather than what they depend on. Two operations are temporally coupled when they must be called in a specific order, and calling them out of sequence produces incorrect or undefined behaviour.

This is common in code that uses initialisation methods separate from the constructor, or that relies on side effects having been applied before a second call is made:

python
# Temporal coupling — setup() must be called before process()
# Nothing in the interface enforces or communicates this
class DataPipeline:
    def __init__(self):
        self._ready = False
        self._data = None

    def setup(self, data):
        self._data = data
        self._ready = True

    def process(self):
        if not self._ready:
            raise RuntimeError("Call setup() before process()")
        return [d.strip() for d in self._data]

# A caller can easily get the order wrong:
pipeline = DataPipeline()
pipeline.process()   # RuntimeError — nothing warned about the ordering

The problem is that the ordering requirement is hidden inside the implementation. A caller reading the class signature sees two public methods and nothing to indicate which must come first. The coupling is between the two method calls themselves, not between two classes.

The standard fix is to collapse the two-step pattern into construction — accept the data in the constructor so an object can only exist in a ready state. If there are good reasons to keep two steps, use a factory function that enforces the order and returns a fully-configured object:

python
from __future__ import annotations

# Fix 1 — make the object always ready at construction
class DataPipeline:
    def __init__(self, data: list[str]) -> None:
        self._data = data   # always ready — no separate setup() required

    def process(self) -> list[str]:
        return [d.strip() for d in self._data]

# Fix 2 — if two-step construction is unavoidable, hide it behind a factory
class DataPipeline:
    def __init__(self, data: list[str]) -> None:
        self._data = data

    @classmethod
    def from_raw(cls, raw_data: list[str]) -> "DataPipeline":
        validated = [d for d in raw_data if d.strip()]
        return cls(validated)   # returns a fully configured instance

    def process(self) -> list[str]:
        return [d.strip() for d in self._data]

# Callers have one clear path — no ordering to remember
pipeline = DataPipeline.from_raw(["  hello  ", "", "  world  "])
print(pipeline.process())   # ['hello', 'world']

Temporal coupling in test setup

Temporal coupling also appears in test suites where tests share state through class-level variables or rely on running in a particular order. If removing or reordering a test causes others to fail, the test suite has temporal coupling. The fix is the same: make each test fully self-contained, constructing its own fixtures from scratch rather than relying on state left by a previous test.

When Loose Coupling Goes Too Far

Dependency injection is a powerful tool, but it can be over-applied. The question worth asking is: does loosening this coupling actually serve a real need — testability, replaceability, or flexibility — or am I injecting things out of habit?

A constructor that accepts eight parameters is a signal. Sometimes all eight are genuinely needed and independently swappable. More often, it means the class has too many responsibilities (a cohesion problem) or that some of those parameters are configuration values that could be grouped into a settings object, or that some dependencies are never actually swapped in practice and would be simpler as internal implementation details.

python
# Over-injected — every internal detail has been pushed to the constructor
# The caller now has to know about — and wire — seven different things
class ReportGenerator:
    def __init__(
        self,
        db,
        formatter,
        logger,
        cache,
        auth_checker,
        rate_limiter,
        audit_log,
    ):
        ...

# Signs you may have gone too far:
# 1. Every test needs to create 7 fakes just to instantiate the class
# 2. Most parameters never change between production and test
# 3. Several parameters are always created together as a group

There are three practical ways to reduce an over-injected constructor without re-introducing tight coupling.

Group related configuration into a dataclass. If several parameters are always provided together and none is independently swapped, a @dataclass groups them cleanly and reduces the constructor signature without hiding the dependency.

Ask whether the dependency is ever substituted. If a logger is always the same standard library logging.Logger and you never need to swap it in tests, injecting it adds friction without benefit. Standard library tools that are stable and have no side effects are reasonable candidates to use directly.

Split the class. If the constructor is long because the class does many things, splitting it along responsibility lines usually reduces the injection load on each resulting class naturally.
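
The first option can be sketched with a @dataclass; the ReportPolicies name and its grouping are hypothetical, chosen to match the ReportGenerator example above:

```python
from dataclasses import dataclass

# Hypothetical policy objects that are always created and passed together
@dataclass
class ReportPolicies:
    auth_checker: object
    rate_limiter: object
    audit_log: object

class ReportGenerator:
    # Two genuinely swappable dependencies plus one grouped settings object
    def __init__(self, db, formatter, policies: ReportPolicies):
        self.db = db
        self.formatter = formatter
        self.policies = policies

policies = ReportPolicies(auth_checker=object(),
                          rate_limiter=object(),
                          audit_log=object())
gen = ReportGenerator(db=object(), formatter=object(), policies=policies)
```

The dependencies are still explicit and injectable, but the constructor signature shrinks from seven parameters to three, and tests can build one ReportPolicies value and reuse it everywhere.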

The practical test

Before injecting a dependency, ask two questions: will I ever need to substitute a different implementation of this, and does having it inside the class make this class impossible to test in isolation? If the answer to both is no, the internal creation may be the simpler choice. Loose coupling is a tool for managing real complexity — not a rule to apply mechanically to every line of code.

Summary

  1. Tight coupling happens when one piece of code creates or directly references the internal details of another, locking the two together so that changes in one force changes in the other.
  2. The clearest form in Python is a class that instantiates another class directly inside its own constructor using SomeClass(), rather than receiving the instance from outside.
  3. The simplest remedy is dependency injection: accept the dependency as a constructor parameter and store it. This gives the caller control and gives you a seam — a point where a substitute can be inserted for testing.
  4. Using a typing.Protocol to describe what you need rather than naming a specific class is the most flexible approach. Any object with matching methods satisfies it, with no inheritance required.
  5. Inheritance is the tightest form of coupling. Favour composition — holding an instance of another class — over inheritance when you need behaviour without a genuine is-a relationship.
  6. Global variables create invisible coupling. A class that reads global state has a hidden dependency that does not appear in its constructor and makes tests order-dependent and fragile.
  7. High cohesion and low coupling reinforce each other. A class that does one job needs fewer external dependencies, which reduces coupling naturally. The Single Responsibility Principle is a practical route to both.
  8. Not all coupling is a problem. Value objects, dataclasses, and private internal helpers are appropriate candidates for tight coupling. The concern is coupling that hides intent, blocks testing, or forces change to propagate through unrelated parts of the codebase.

Once you can identify tight coupling, you will see the refactoring opportunity sitting next to it. In Python, the gap between tightly coupled and loosely coupled code is often just one constructor parameter.

What Actually Happens at the CPython Level When You Instantiate Inside a Constructor

Understanding the mechanism behind tight coupling — not just the design principle — makes it much easier to reason about real code. When Python compiles self.db = Database() inside a class body, something concrete happens in the bytecode that explains exactly why the coupling is fragile.

The bytecode instructions include a LOAD_GLOBAL opcode that looks up the name Database in the module's global namespace at the moment the line executes. This means the tight coupling is not just a design-time concern — it is baked into the bytecode. The class literally carries the name of its dependency as a string reference that must be resolvable at runtime.

python
import dis

class Database:
    def save(self, data):
        print(f"Saving {data}")

class TightOrder:
    def __init__(self):
        self.db = Database()   # hard-wired

class LooseOrder:
    def __init__(self, db):
        self.db = db           # injected

# Disassemble both constructors to see what CPython compiles them to
print("=== TightOrder.__init__ bytecode ===")
dis.dis(TightOrder.__init__)

print("\n=== LooseOrder.__init__ bytecode ===")
dis.dis(LooseOrder.__init__)

# TightOrder output will show a LOAD_GLOBAL for '(NULL + Database)'
# followed by CALL — the dependency name is embedded in the bytecode
#
# LooseOrder output shows only LOAD_FAST for the parameter 'db'
# — no external name lookup required at all
# (Exact opcode names vary by Python version; the distinction remains the same)

Running this yourself with dis.dis() — Python's built-in bytecode disassembler from the standard library dis module — shows the concrete difference. The TightOrder constructor emits a LOAD_GLOBAL instruction that carries the string 'Database' as a constant. If you were to rename Database to PostgresStorage and forget to update one reference, Python raises a NameError at runtime on that line — not at import time, not during static analysis, but only when TightOrder() is actually called. This is the runtime brittleness that design-level descriptions of tight coupling are referring to.

The loosely coupled version's bytecode shows only LOAD_FAST for the db parameter. There is no external name at all — the reference is local to the function frame. Swapping the concrete type is now a caller-side concern entirely outside the compiled bytecode of the class.

Python version note for Python 3.12+

In Python 3.12 and later, the bytecode instruction set was significantly reorganised as part of the specialising adaptive interpreter introduced by PEP 659. The opcode names may differ from those shown above, but the fundamental distinction — external global lookup vs. local parameter load — remains the same. Use dis.dis() on your own Python version to see the precise instructions.

The module-level import is not tight coupling

A common confusion for beginners: importing a module at the top of a file is not tight coupling in the problematic sense. from typing import Protocol at the top of a file is a structural import — it gives you access to a typing construct, not a concrete object with internal state. The coupling concern is about whether one class names and constructs another class inside itself, not about whether it uses imported names for type hints.

Tight coupling becomes a problem at the point where construction occurs — the ClassName() call — not at the import statement. Importing a class for use as a type annotation or Protocol is the correct pattern for defining contracts precisely.
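
One way to make that distinction explicit in code is typing.TYPE_CHECKING, a constant that is True only while a static type checker runs. Combined with a quoted annotation, the imported class is available for type hints without ever loading at runtime; the orders module here is the hypothetical one from the earlier examples:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Seen by the type checker only — never executed at runtime
    from orders import Order

def describe(order: "Order") -> str:
    # The quoted annotation is stored as a string, never evaluated,
    # so the orders module is never actually imported here
    return f"order of {order.item}"

# Any object with an .item attribute works at runtime
class _Demo:
    item = "notebook"

assert describe(_Demo()) == "order of notebook"
```

The function is typed against Order for documentation and tooling, yet carries no runtime dependency on the orders module at all.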


How to Spot Tight Coupling in Your Own Python Code

Recognising tight coupling in code you are already working with is a different skill from understanding the concept in examples. Work through these six checks whenever you review a class.

  1. Look for internal instantiation

    Search your class bodies for any line that calls AnotherClass() directly inside __init__ or a method. If the class is creating its own dependency rather than receiving one, that is tight coupling. The giveaway is a constructor call — ClassName() — that does not belong to the class itself.

  2. Ask whether the class can be tested in isolation

    Try to write a unit test for the class without importing or running any of its dependencies. If you cannot — because the constructor builds real objects you cannot intercept — the class is tightly coupled. The inability to pass in a fake or stub is one of the clearest practical signals.

  3. Check whether the class names a concrete type it does not own

    If a class references a specific class by name — other than through a constructor parameter or type hint on an accepted argument — and that class lives in a different module or layer, tight coupling is likely present. The class is reaching across a boundary it should not need to cross.

  4. Check whether changes in one class force changes in another

    Rename a private attribute or change a constructor signature in one class, then check whether another unrelated class immediately breaks. If it does, those two classes are coupled at the point of the change. This ripple effect is the most visible runtime symptom of tight coupling in a real codebase.

  5. Look for names in method bodies that came from nowhere

    Read through a method body and for every name it uses, ask: where did this come from? It should have arrived as a constructor parameter, a method argument, or a local variable created in this scope. If a name came from a module-level global or a class attribute that was hard-wired at construction time, it is a hidden dependency — and hidden dependencies are coupling that does not show up in the signature.

  6. Check inheritance chains for behaviour reuse without a real is-a relationship

    Look at every class that inherits from another. For each one, ask: is this genuinely an is-a relationship, or did someone inherit just to get access to the parent's methods? If the child class does not truly specialise the parent — if it just needed one method — the inheritance is tight coupling masquerading as code reuse. Consider whether composition would give the same result with less risk.

Frequently Asked Questions

What does "tightly coupled" mean?

Tightly coupled means two pieces of code depend so directly on each other that changing one forces changes in the other. They cannot work independently. In Python, the most common form is a class that creates a concrete instance of another class inside its own constructor.

Why is tight coupling bad?

Tight coupling makes code harder to change, test, and reuse. A modification in one class or function ripples through everything that depends on it, which increases the chance of bugs and slows down development. Testing is particularly affected because tightly coupled code cannot be isolated — running a test for one class forces you to also run the real logic of everything it depends on.

What is the opposite of tight coupling?

The opposite is loose coupling. Loosely coupled code communicates through clear interfaces rather than internal details, so parts can change independently without breaking each other. A class that receives its dependencies through constructor parameters rather than creating them internally is a loosely coupled class.

How do I identify tight coupling in existing code?

Look for these signals: one class directly creating instances of another class inside its own methods using SomeClass(); a function reaching into the internal attributes of another object that it did not receive as a parameter; and test code that cannot run without setting up a large chain of other objects first. If you cannot write a test for a class in isolation, it is almost certainly tightly coupled.

Is all coupling bad?

Not all coupling is bad. Some dependency between parts is unavoidable in any program. The concern is when coupling is so tight that it obscures what a piece of code requires and makes isolated testing or replacement impossible. A class that knows it needs something with a save method is fine. A class that knows it specifically needs a PostgresDatabase created with no arguments is a problem.

What is dependency injection?

Dependency injection means passing a dependency into a class or function from outside rather than letting it create the dependency itself. It is one of the simplest ways to loosen coupling in Python. Instead of self.db = Database() inside the constructor, you write def __init__(self, db): and then self.db = db. The class no longer owns the creation of its dependency and can work with any object that fits the expected interface.

Can a small program be tightly coupled?

Yes. Even a short program with two or three classes can be tightly coupled. The size of the program is not what matters — what matters is whether one part directly reaches into the internals of another. Tight coupling in a small program is less painful to fix, but it is still worth recognising early before the habit carries over into larger projects.

Does Python encourage tight or loose coupling?

Python does not enforce either approach. Its dynamic nature makes it easy to write tightly coupled code by accident — for example by instantiating concrete classes directly inside other classes — but it also provides everything needed to write loosely coupled code with constructor injection, typing.Protocol, and abstract base classes. The choice is yours, which is why understanding the concept matters.

What is a composition root?

A single place in your application, called the composition root, takes responsibility for creating all concrete objects and wiring them together. This is usually a main.py file or an application factory function. By centralising all construction there, swapping an implementation — for example, replacing a real database with an in-memory one for tests — requires a one-line change in one place rather than edits scattered across many classes.

What is the difference between typing.Protocol and abc.ABC?

typing.Protocol uses structural subtyping — any class with the right methods automatically satisfies the Protocol, with no inheritance required. abc.ABC uses nominal subtyping — a class must explicitly inherit from the ABC and implement every abstract method, or Python raises a TypeError at instantiation. Use typing.Protocol when you want maximum flexibility, especially with third-party classes you cannot modify. Use abc.ABC when you want Python itself to enforce the contract at runtime for classes you control.

How can I see tight coupling at the bytecode level?

Use Python's built-in dis module. Running import dis; dis.dis(MyClass.__init__) prints the bytecode for the constructor. A tightly coupled constructor emits a LOAD_GLOBAL instruction carrying the dependency class name as a string constant — the name is literally baked into the compiled bytecode. A loosely coupled constructor using dependency injection shows only LOAD_FAST for the injected parameter — no external name lookup occurs at all. This is not just a design concern; it is the concrete runtime mechanism that makes tight coupling fragile when class names or module structures change.

Is importing a class at the top of a file tight coupling?

Not in the problematic sense. Importing a class for use as a type annotation or to define a Protocol is normal and appropriate — it gives you access to a typing construct without creating a runtime dependency on a concrete object with internal state. The tight coupling concern is about construction: where does ClassName() actually get called? If it is called inside another class's constructor, that is the problem. If the construction happens at the composition root and the import is only for type-checking, there is no tight coupling issue.

Is tight coupling ever acceptable?

Yes. Tight coupling is appropriate when both sides are stable, contained in the same layer or module, and neither needs to be swapped or tested in isolation. Good examples include a @dataclass that is tightly coupled to its own fields, a private helper class that never leaves its module, and an enum that defines a fixed contract. The problem is not coupling itself — it is coupling that crosses layers, hides intent, or prevents testing.

It applies to both. A function that calls another function by its concrete name — rather than accepting a callable as a parameter — is tightly coupled to that function. The fix is the same idea as dependency injection for classes: accept the callable as an argument so the caller decides which function to provide. functools.partial is a practical tool for pre-configuring callables at the composition root without cluttering every call site with infrastructure arguments.
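A short sketch of the same idea for functions, with hypothetical `send_receipt` and `send_email` names:

```python
from functools import partial

# loosely coupled: the notifier is a parameter, not a hard-wired name
def send_receipt(order_id, notify):
    notify(f"Receipt for order {order_id}")

# in a test, inject a fake callable
messages = []
def fake_notify(message):
    messages.append(message)

send_receipt("A-1", fake_notify)

# at the composition root, pre-configure the real notifier once
def send_email(message, server):
    pass   # would talk to an SMTP server in real code

notify_via_smtp = partial(send_email, server="smtp.example.com")
send_receipt("A-2", notify_via_smtp)
```

Call sites never mention the server; `partial` bakes that infrastructure detail in at the one place where wiring happens.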

A circular import — where module A imports from module B and module B imports from module A — is one of the most visible structural symptoms of tight coupling across module boundaries. Python cannot finish initialising either module and raises an ImportError. Moving the import inside a function body defers the error but does not remove the underlying coupling. The correct fix is to restructure the code so that the two modules depend on a third shared interface rather than on each other.

Yes — inheritance is the tightest form of coupling available in Python. A subclass does not just depend on the parent's public interface; it depends on every internal attribute and implementation detail, including changes to methods it never explicitly calls. This is called the fragile base class problem. Composition — where a class holds an instance of another class rather than extending it — gives you the behaviour you need with far looser coupling, and lets you swap the collaborating object independently.
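A classic illustration of the difference, using a stack built both ways:

```python
# fragile base class: Stack inherits every list method, so callers can
# bypass the stack discipline with insert(), sort(), and so on
class Stack(list):
    def push(self, item):
        self.append(item)

# composition: SafeStack holds a list instead of being one, and exposes
# only the operations that preserve the stack invariant
class SafeStack:
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()

s = SafeStack()
s.push(1)
s.push(2)
top = s.pop()
```

If the internal list ever needs to become a `collections.deque`, `SafeStack` can swap it without any caller noticing — the inheriting version cannot.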

Yes, and it is a particularly dangerous kind because it is invisible. A class that reads from or writes to a global variable is coupled to every other piece of code that touches that global, but none of those relationships appear in the constructor or method signatures. This makes the class impossible to test in isolation without setting up and tearing down global state around every test, leading to tests that are order-dependent and fragile. The fix is to make the dependency explicit — pass the value through the constructor instead of reading it from a global.
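A minimal sketch with a hypothetical `TAX_RATE` global:

```python
TAX_RATE = 0.2   # module-level global

# hidden coupling: the dependency never appears in any signature
class InvoiceHidden:
    def total(self, net):
        return net * (1 + TAX_RATE)

# explicit coupling: the rate is injected through the constructor
class Invoice:
    def __init__(self, tax_rate):
        self.tax_rate = tax_rate
    def total(self, net):
        return net * (1 + self.tax_rate)

zero_tax = Invoice(tax_rate=0.0)   # testable with no global setup
```

Testing `InvoiceHidden` with a different rate means mutating `TAX_RATE` before the test and restoring it after; testing `Invoice` is a one-line constructor call.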

mock.patch makes the test pass without fixing the design — the coupling remains in the production code, and the test now depends on knowing the exact import path of the class being patched. Dependency injection produces tests that are simpler, more readable, and do not break when you move or rename a module. Use mock.patch when you cannot refactor the code — for example when patching third-party libraries or built-ins — and use dependency injection for code you own and control.
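Side by side, with hypothetical `Database` and `Report` names — the injected version needs no patching machinery at all:

```python
class Database:
    def save(self, data):
        pass

# tightly coupled: testing this requires mock.patch with the exact
# import path, e.g. mock.patch("orders.Database")
class ReportTight:
    def __init__(self):
        self.db = Database()
    def run(self, data):
        self.db.save(data)

# injected: any object with a save() method will do
class Report:
    def __init__(self, db):
        self.db = db
    def run(self, data):
        self.db.save(data)

class FakeDB:
    def __init__(self):
        self.saved = []
    def save(self, data):
        self.saved.append(data)

fake = FakeDB()
Report(fake).run("q1-report")
```

The fake is plain Python: no import-path strings, nothing to break when the module moves.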

The safest approach is incremental. First, add an optional parameter with a fallback: def __init__(self, db=None): self.db = db or Database(). This is a non-breaking change — all existing call sites work unchanged, but tests can now inject a fake immediately. Once you have test coverage through the new seam, migrate call sites one at a time to pass the real object explicitly. When all call sites are explicit, remove the default and the internal fallback. You do not have to complete this in one commit — stopping at step one or two is still a net improvement.
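Step one of that migration looks like this, with hypothetical `OrderService` and `Database` names:

```python
class Database:
    def save(self, data):
        pass

# step 1: optional parameter with a fallback -- a non-breaking change
class OrderService:
    def __init__(self, db=None):
        self.db = db or Database()

class FakeDB:
    def __init__(self):
        self.saved = []
    def save(self, data):
        self.saved.append(data)

legacy = OrderService()          # existing call sites work unchanged
fake = FakeDB()
svc = OrderService(db=fake)      # tests inject a fake immediately
svc.db.save("order-1")
```

One caveat worth knowing: `db or Database()` falls back whenever `db` is falsy, so `db if db is not None else Database()` is the safer spelling if a falsy-but-valid dependency is ever possible.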

Cohesion measures how focused a single module is on doing one clear job. Coupling measures how much one module depends on another. The two are related: a class that tries to do many unrelated things (low cohesion) must reach into many external classes to get what it needs, which produces high coupling. A class that does one job well (high cohesion) typically needs fewer external dependencies, which naturally reduces coupling. The design goal is high cohesion and low coupling working together.

Temporal coupling is a form of coupling based on order rather than structure. Two operations are temporally coupled when they must be called in a specific sequence and calling them out of that sequence produces incorrect or undefined behaviour — but nothing in the interface communicates this requirement. A common example is a class with a setup() method that must be called before process(), where a caller has no way of knowing that unless they read the implementation. The fix is to eliminate the ordering requirement: accept the data at construction time so the object is always in a ready state, or use a factory method that enforces the sequence internally and returns a fully configured object.
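The before-and-after can be sketched in a few lines, using a hypothetical `ReportBuilder`:

```python
# temporally coupled: setup() must run before process(), but nothing
# in the interface says so -- skipping it raises AttributeError
class ReportBuilderBad:
    def setup(self, text):
        self.text = text
    def process(self):
        return self.text.upper()

# fixed: the data arrives at construction, so the object is always
# in a ready state and no calling order exists to get wrong
class ReportBuilder:
    def __init__(self, text):
        self.text = text
    def process(self):
        return self.text.upper()

result = ReportBuilder("hello").process()
```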

Yes. A constructor that accepts many parameters is often a signal that something has gone wrong — either the class is doing too many things (a cohesion problem), several parameters could be grouped into a configuration object, or some dependencies are stable enough that injecting them adds friction without any practical benefit. Before injecting a dependency, ask two questions: will you ever need to substitute a different implementation of it, and does having it created internally make the class impossible to test in isolation? If the answer to both is no, an internal creation may be simpler. Loose coupling is a tool for managing real variability and testability — not a rule to apply mechanically to every object a class touches.
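The "group related parameters into a configuration object" option is often the quickest win — a hypothetical sketch with a `dataclass`:

```python
from dataclasses import dataclass

# three related settings collapsed into one value object
@dataclass
class SmtpConfig:
    host: str
    port: int
    use_tls: bool

class Mailer:
    def __init__(self, config: SmtpConfig):
        self.config = config   # one parameter instead of three

mailer = Mailer(SmtpConfig(host="smtp.example.com", port=587, use_tls=True))
```

The constructor signature shrinks, and the settings travel together as a single named concept instead of a loose pile of arguments.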

A Protocol that defines ten methods when a class only calls two of them is a form of unnecessary coupling — the class now implicitly requires every dependency it receives to implement nine methods it never uses. This is the same principle as the Interface Segregation Principle from SOLID: clients should not be forced to depend on methods they do not use. The fix is to define narrow, focused Protocols that describe only what the class actually needs. A class that only calls save(data) should depend on a Protocol with one method, not a broad storage interface with ten. Narrow Protocols are easier to satisfy with fakes in tests, and they make the class's true requirements immediately visible to any reader.
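A narrow Protocol in practice, with hypothetical `Saver` and `AuditLog` names:

```python
from typing import Protocol

# narrow: describes only the one method the class actually calls
class Saver(Protocol):
    def save(self, data: str) -> None: ...

class AuditLog:
    def __init__(self, storage: Saver):
        self.storage = storage
    def record(self, event: str):
        self.storage.save(event)

# a one-method fake satisfies the Protocol structurally
class MemoryStore:
    def __init__(self):
        self.items = []
    def save(self, data: str) -> None:
        self.items.append(data)

store = MemoryStore()
AuditLog(store).record("login")
```

Anyone reading the `Saver` Protocol knows exactly what `AuditLog` requires — nothing more — and the test fake stays four lines long.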
