Python Type Hints and Static Analysis with mypy

Table of Contents
- The Bug That Shouldn't Have Made It to Production
- Why Type Hints Changed Python
- Type Hint Basics: From Variables to Functions
- Variable Type Hints
- Function Parameter and Return Types
- Generic Types: Moving Beyond Simple Annotations
- list, dict, tuple, and set
- Variadic Tuples
- Optional and Union Types: Handling None and Alternatives
- Optional: Something or Nothing
- Union Types: Multiple Possibilities
- Literal: When Only Specific Values Are Allowed
- Advanced Type Concepts: TypeVar, Callable, and Protocol
- TypeVar: Generic Functions
- Bounded TypeVar: When You Need Some Constraints
- Callable: Typing Functions as Arguments
- Protocol: Structural Typing
- Generics and Advanced Types
- Mypy Configuration Best Practices
- Running mypy: Configuration and Strictness
- Installing and Running mypy
- Configuration with pyproject.toml
- Handling Third-Party Libraries Without Stubs
- Modern Union Syntax vs. Optional
- Common Type Hint Mistakes
- mypy vs. pyright: Choosing Your Type Checker
- mypy: The Traditional Choice
- pyright: The Performance Alternative
- In Practice
- VS Code Integration
- A Complete Example: Bringing It All Together
- Common Gotchas and How to Avoid Them
- The Compounding Returns of Type Safety
- The Payoff
- Summary
The Bug That Shouldn't Have Made It to Production
Picture this: you're three weeks into a machine learning project, the model training pipeline is humming along, and then, midnight on a Tuesday, a data ingestion function silently swallows a string where it expected an integer. No exception. No warning. Just subtly corrupted data flowing downstream into training batches, producing a model that looks like it works until your client runs it on their dataset and everything falls apart. You spend two days tracing back through logs, comparing outputs, and staring at a function signature that tells you absolutely nothing about what it expects.
This is the bug that type hints exist to prevent. Not some exotic edge case, not a concurrency nightmare, just the everyday reality of Python's dynamic typing letting incompatible values pass through function boundaries without complaint. The language's flexibility is one of its greatest strengths, but that same flexibility can turn a straightforward API into a minefield when you're building systems of any real size. You can't hold the entire call graph in your head. Your test suite can't exhaustively cover every combination of types that callers might pass. But a static type checker can scan your entire codebase in seconds and flag every place where the types don't line up.
The uncomfortable truth is that most Python bugs aren't algorithmic failures; they're type failures. A None where you expected a string. A list where you expected a single item. A float sneaking through where your function was built to handle integers only. These bugs are invisible in plain Python code; they slip through code review because reviewers are reading logic, not types; and they surface at the worst possible moment. Static analysis with type hints changes the equation entirely: you catch those failures the moment you write the code, not when your system is under load at 2 a.m.
We've been using type hints informally since the start of this series. Now it's time to stop being casual about them and harness the power of static analysis tools like mypy and pyright to turn hints into guarantees.
Why Type Hints Changed Python
Type hints didn't arrive in Python until version 3.5, with PEP 484, and even then they were purely optional. The Python community's reaction was mixed. Some developers embraced them immediately as a way to make large codebases manageable. Others pushed back, arguing that static typing was antithetical to Python's philosophy of flexible, expressive code.
What changed minds was scale. As Python moved from scripting language to serious enterprise software, from quick analysis scripts to production ML systems serving millions of requests, the cost of dynamic typing became impossible to ignore. Google, Dropbox, and other large Python shops started building internal type checkers and reporting dramatic reductions in production bugs. Mypy, originally developed at Dropbox, became a community-standard tool.
The key insight is that type hints are not a binary choice between static and dynamic typing. Python's gradual typing system lets you annotate the parts of your code where type safety matters most while leaving dynamic sections untyped. You can start with just function signatures on your most critical utilities and work outward from there. This pragmatic middle ground turned out to be exactly what the community needed: you get the safety benefits where you need them without rewriting your entire codebase.
Today, all major Python IDEs understand type hints natively, and tools like pyright provide instant inline feedback as you type. The ecosystem has matured to the point where writing untyped Python in a new project feels like a deliberate choice to skip safety nets, not the neutral default it once was.
Type Hint Basics: From Variables to Functions
Let's start simple. Type hints are annotations that tell Python (and more importantly, type checkers) what kind of data your variables and functions expect.
Variable Type Hints
name: str = "Alice"
age: int = 30
height: float = 5.8
is_active: bool = True
The syntax is straightforward: variable: Type = value. The type checker will verify that the value matches the declared type. But here's the thing: Python still doesn't enforce these at runtime. If you later write name = 42, Python won't complain, but mypy will catch it during analysis.
Variable annotations are most useful at the module level and in class bodies, where they document what a variable is meant to hold across its entire lifetime. They serve as inline documentation that a type checker can verify, which means they stay accurate in a way that comments never do: if you change the type and forget to update the annotation, mypy will tell you immediately.
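Class-body annotations are also the foundation dataclasses build on: each annotated name becomes a typed field. A minimal sketch (the ServerConfig class and its fields are illustrative, not from this series):

```python
from dataclasses import dataclass


@dataclass
class ServerConfig:
    # each class-body annotation becomes a typed, checked field
    host: str
    port: int
    debug: bool = False


cfg = ServerConfig(host="localhost", port=8080)
print(cfg.port)  # 8080
# ServerConfig(host="x", port="oops") would be flagged by mypy: str is not int
```

The same annotations serve three audiences at once: the reader, the type checker, and the dataclass machinery that generates `__init__` from them.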
Function Parameter and Return Types
Functions are where type hints shine. Every function you write is an implicit contract: you promise to accept certain types and return certain types. Type hints make that contract explicit and machine-verifiable.
def greet(name: str, times: int) -> str:
    """Greet someone multiple times."""
    return f"Hello, {name}! " * times

def process_user(user_id: int) -> dict:
    """Fetch and process user data."""
    # Your logic here
    pass
The arrow -> specifies the return type. Parameters get their types after the colon. This tells anyone reading the code (and any type checker) exactly what goes in and what comes out.
When you annotate function signatures consistently, something useful happens: your IDE can tell you immediately when you're calling a function incorrectly. You don't need to read the docstring or trace through the implementation. The type tells you. This compounds across a codebase, once your utility functions are typed, every caller gets automatic validation at the call site.
Generic Types: Moving Beyond Simple Annotations
Most real-world functions work with collections. That's where generic types come in, and where the expressiveness of Python's type system really starts to shine.
list, dict, tuple, and set
def process_scores(scores: list[int]) -> float:
    """Calculate average of integer scores."""
    return sum(scores) / len(scores)

def user_profiles(user_ids: list[int]) -> dict[int, str]:
    """Map user IDs to profile names."""
    return {uid: f"User_{uid}" for uid in user_ids}

def get_coordinates() -> tuple[float, float]:
    """Return X and Y coordinates."""
    return (42.3, -71.1)

def unique_tags(articles: list[str]) -> set[str]:
    """Extract unique tags from article list."""
    # Your logic here
    return set()
Notice the square brackets? That's the difference between "a list" and "a list of integers." The type checker understands that if a function expects list[int], you can't pass list[str] without the checker raising a type error.
The specificity here matters enormously in practice. A function that takes dict[str, int] is fundamentally different from one that takes dict[str, Any]: the first makes real promises about the values you'll find; the second tells you almost nothing. Parameterizing your collection types rather than leaving them bare is one of the highest-value habits you can build when adopting type hints.
Variadic Tuples
Sometimes you don't know how many items will be in a tuple:
def summarize(*values: int) -> int:
    """Sum any number of integers."""
    return sum(values)

# Inside summarize, values has the type tuple[int, ...]:
def summarize_explicit(values: tuple[int, ...]) -> int:
    """Sum integers from a tuple."""
    return sum(values)
The tuple[int, ...] syntax means "a tuple of any length containing only integers." This is subtly different from list[int]: tuples are typed positionally in Python's type system, and the ellipsis syntax is the special case for homogeneous variable-length tuples.
Optional and Union Types: Handling None and Alternatives
Not every function always returns a value. Not every parameter is always present. These are the situations where beginners often reach for Any, but Python's type system has much better tools.
Optional: Something or Nothing
def find_user_by_email(email: str) -> dict | None:
    """Find user, or return None if not found."""
    # Search logic (found and user_data are illustrative placeholders)
    if found:
        return user_data
    return None
That dict | None is the modern Python 3.10+ syntax. In older versions, you'd write Optional[dict]. Both mean the same thing: the function might return a dict, or it might return None.
The real power of Optional types comes from what happens at call sites. When mypy knows a function can return None, it will flag any place where you use the return value without first checking for None. This is the single most common source of AttributeError and TypeError crashes in Python code: someone returned None, and somewhere downstream it was used as if it were a real value. Optional types make the type checker your partner in enforcing those checks.
Union Types: Multiple Possibilities
def parse_value(data: str | int | float) -> float:
    """Parse numeric data from multiple types."""
    if isinstance(data, str):
        # mypy knows data is str in this branch
        return float(data.strip())
    return float(data)

# With Union (older syntax):
from typing import Union

def parse_value(data: Union[str, int, float]) -> float:
    pass
Union types tell the type checker that a parameter can be one of several types. The function needs to handle all possibilities, and the type checker will verify you're not accidentally treating a string as an int.
Literal: When Only Specific Values Are Allowed
from typing import Literal

def set_log_level(level: Literal["DEBUG", "INFO", "ERROR"]) -> None:
    """Set logging level to one of three valid options."""
    # Type checker ensures only these strings are passed
    pass
Literal types are powerful for APIs where only certain values make sense. Try calling set_log_level("INVALID") and mypy will catch it immediately. This is especially useful when you're building configuration interfaces or state machines where the valid values are a known, finite set: it turns what would otherwise be a runtime ValueError into a type-time error that never reaches your users.
Advanced Type Concepts: TypeVar, Callable, and Protocol
Now we're entering more sophisticated territory. These tools let you write flexible, reusable code while maintaining type safety, and they're the foundation for understanding how the standard library itself is typed.
TypeVar: Generic Functions
Imagine you want a function that works with any type but preserves that type:
from typing import TypeVar

T = TypeVar('T')

def first_element(items: list[T]) -> T:
    """Return the first element without changing its type."""
    return items[0]

# When called with list[int], it returns int
# When called with list[str], it returns str
result_int: int = first_element([1, 2, 3])
result_str: str = first_element(["a", "b", "c"])
TypeVar is how you write truly generic functions. The type checker tracks that if you pass list[int], you get back an int. If you pass list[str], you get back a str.
Without TypeVar, you'd be forced to choose between being specific (only works for one type) or using Any (loses all type information). TypeVar gives you the third option: be generic but preserve type relationships. This is how functions like sorted(), max(), and min() in the standard library can be typed correctly: their type stubs use TypeVar internally.
Bounded TypeVar: When You Need Some Constraints
from typing import TypeVar

Numeric = TypeVar('Numeric', int, float)

def add(a: Numeric, b: Numeric) -> Numeric:
    """Add two numbers of the same type."""
    return a + b
Strictly speaking, this is a constrained TypeVar: it can only ever be int or float, and mypy complains if you try to use it with a string. Its close cousin, TypeVar('T', bound=SomeType), is the bounded form: it accepts SomeType or any of its subtypes.
Callable: Typing Functions as Arguments
Functions are first-class objects in Python. Sometimes you pass them around:
from typing import Callable

def apply_operation(
    values: list[int],
    operation: Callable[[int], int],
) -> list[int]:
    """Apply a function to each value."""
    return [operation(v) for v in values]

def double(x: int) -> int:
    return x * 2

result = apply_operation([1, 2, 3], double)
Callable[[int], int] means "a function that takes an int and returns an int." The square brackets inside specify parameter types, and the final type is the return type.
Higher-order functions are extremely common in Python: decorators, callbacks, strategy patterns, functional pipelines. Without Callable typing, the entire category of "functions as arguments" becomes a black hole in your type coverage. Once you start using Callable consistently, you'll find that many bugs in callback-heavy code surface immediately as type errors rather than cryptic TypeError: X is not callable messages at runtime.
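As a sketch of the callbacks case, here is a hypothetical event registry typed with Callable; the registry name and handler are illustrative. Every handler must accept a str payload and return None, and mypy rejects any registration that doesn't match:

```python
from typing import Callable

# hypothetical registry: event name -> handler
handlers: dict[str, Callable[[str], None]] = {}


def register(event: str, handler: Callable[[str], None]) -> None:
    handlers[event] = handler


seen: list[str] = []


def on_save(payload: str) -> None:
    seen.append(f"saved: {payload}")


register("save", on_save)       # OK: signature matches Callable[[str], None]
handlers["save"]("report.pdf")  # dispatch; mypy knows the call shape is valid
print(seen)
```

Registering a handler with the wrong signature, say one taking an int, would be caught at the `register` call site instead of failing at dispatch time.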
Protocol: Structural Typing
Sometimes you don't care about an object's class, only that it has certain methods:
from typing import Protocol

class Drawable(Protocol):
    """Anything that can be drawn."""
    def draw(self) -> None:
        ...

def render(shape: Drawable) -> None:
    """Render any drawable object."""
    shape.draw()

class Circle:
    def draw(self) -> None:
        print("Drawing circle...")

# This works! Circle matches Drawable structurally
render(Circle())
Protocols enable duck typing with type safety. If an object has the right methods, mypy accepts it; no inheritance required.
Protocol is one of the most powerful tools in Python's type system because it aligns type checking with Python's actual runtime behavior. Python doesn't care about inheritance hierarchies; it cares whether an object has the right attributes and methods. Protocol lets you express exactly that in the type system. This is particularly valuable when working with third-party libraries where you can't modify class hierarchies but need to express structural compatibility.
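Protocols can even participate in isinstance checks if you mark them with @runtime_checkable (which checks method names only, not signatures). A sketch with a hypothetical class standing in for third-party code you can't modify:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class SupportsClose(Protocol):
    def close(self) -> None: ...


class LegacyConnection:
    """Stands in for a third-party class we can't modify."""
    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        self.closed = True


def shutdown(resource: SupportsClose) -> None:
    resource.close()


conn = LegacyConnection()
print(isinstance(conn, SupportsClose))  # structural check on method names
shutdown(conn)
print(conn.closed)
```

Note the caveat: runtime_checkable isinstance only verifies that the methods exist, so mypy's static check remains the stronger guarantee.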
Generics and Advanced Types
Once you're comfortable with the basics, Python's type system opens up significantly. The typing module contains a collection of tools for expressing complex type relationships that come up frequently in real-world code.
TypedDict is invaluable when you're working with dictionary-shaped data that has a known, fixed structure, which describes a huge percentage of JSON payloads, configuration objects, and API responses. Rather than annotating a function parameter as dict[str, Any] and losing all type information about what keys exist, you can define the exact shape of the dictionary and get full field-level type checking.
from typing import TypedDict

class UserRecord(TypedDict):
    id: int
    name: str
    email: str
    active: bool

def send_welcome(user: UserRecord) -> None:
    print(f"Welcome, {user['name']}!")  # Type checker knows 'name' is str
NamedTuple similarly gives you a typed alternative to plain tuples. Where a bare tuple[int, str, float] forces callers to remember that index 0 is the ID, index 1 is the name, and index 2 is the score, a NamedTuple makes that structure explicit and accessible by name:
from typing import NamedTuple

class ModelResult(NamedTuple):
    model_id: str
    accuracy: float
    loss: float

result = ModelResult(model_id="bert-base", accuracy=0.94, loss=0.18)
print(result.accuracy)  # Clear, named access with full type info
For truly advanced scenarios, overload lets you express different return types based on argument types; this is the pattern that explains how open() can return either a text file or a binary file depending on the mode argument. And Final annotates constants that should never be reassigned, giving you a type-checked equivalent of other languages' const.
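A minimal sketch of both features (the double function and MAX_RETRIES constant are hypothetical):

```python
from typing import Final, overload

MAX_RETRIES: Final = 3  # mypy flags any later reassignment of this constant


@overload
def double(x: int) -> int: ...
@overload
def double(x: str) -> str: ...
def double(x):
    # the real implementation; the decorated stubs above exist only for the
    # type checker, which picks the matching overload at each call site
    return x * 2


n = double(21)     # mypy infers int
s = double("ab")   # mypy infers str
print(n, s)
```

Without the overloads, the best you could say is `int | str -> int | str`, which forces every caller to narrow the result; the overloads preserve the exact relationship instead.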
The deeper you go into Python's type system, the more you realize it can express almost any type relationship you care about; it just requires learning the vocabulary.
Mypy Configuration Best Practices
Getting mypy set up correctly from the start saves a lot of pain later. The default mypy configuration is permissive enough that it will pass code with significant type holes, so knowing how to tune it to your needs is essential.
The single most important configuration decision is where to set your strictness level. For new projects, start strict: it's far easier to write typed code from the beginning than to retrofit types onto an existing codebase. For existing projects, start permissive and tighten gradually; adding disallow_untyped_defs = true to require annotations on all function definitions is usually the best first step, because it forces type coverage at the boundaries where it matters most.
Use per-module overrides liberally when you're incrementally adding types to a large codebase. You can set strict mode globally and then carve out exceptions for specific modules that aren't typed yet:
[tool.mypy]
python_version = "3.11"
strict = true

[[tool.mypy.overrides]]
module = "legacy_module.*"
ignore_errors = true

[[tool.mypy.overrides]]
module = "third_party_untyped.*"
ignore_missing_imports = true
This lets you enforce strict typing on all new code while not blocking on legacy modules. The key is to shrink the list of overrides over time rather than letting it grow.
Run mypy as part of your CI pipeline, not just as an optional local check. Type errors that only get caught when someone happens to run mypy locally will inevitably slip through. Treat mypy failures the same way you treat test failures: they block the build. This cultural shift is what actually makes type checking effective at the team level.
Finally, use # type: ignore comments sparingly and always with a comment explaining why. # type: ignore[assignment] with a note about why this particular assignment is intentional is dramatically better than a bare # type: ignore that leaves future maintainers guessing. If you find yourself adding many ignore comments, that's a signal to look more closely at whether the code structure itself could be improved.
Running mypy: Configuration and Strictness
Type hints are inert until you run a type checker. That's where mypy comes in.
Installing and Running mypy
pip install mypy
mypy your_script.py
That's it. mypy analyzes your code and reports type errors. No runtime overhead, no changes to your actual code.
Running mypy for the first time on an untyped codebase is often eye-opening. You'll see hundreds of errors, many of them revealing genuine ambiguities and potential bugs that your tests haven't caught. Don't be discouraged; treat each error as a question the type checker is asking: "Did you really mean this?" The answer is usually no, and fixing the type error usually means clarifying the code in a way that prevents a real bug.
Configuration with pyproject.toml
Most projects configure mypy once and forget about it:
[tool.mypy]
python_version = "3.11"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
disallow_incomplete_defs = true
check_untyped_defs = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_no_return = true
strict = true
The strict = true setting enables the most stringent checks; in fact, it already implies most of the individual flags above, which are spelled out here so you can see what strict mode turns on. Start here if you're serious about type safety. For existing codebases, you might begin with:
[tool.mypy]
python_version = "3.11"
warn_return_any = true
warn_unused_ignores = trueAnd gradually increase strictness as your code improves.
Handling Third-Party Libraries Without Stubs
Not every library has type information. mypy will complain about imports from untyped packages. You have options:
# Option 1: Ignore specific import
from some_old_library import compute  # type: ignore

# Option 2: Configure mypy to ignore the entire package
# In pyproject.toml:
# [[tool.mypy.overrides]]
# module = "some_old_library.*"
# ignore_missing_imports = true

# Option 3: Install stub packages
# pip install types-requests
The types- prefix is the convention for stub packages. These provide type information for popular untyped libraries. The typeshed project maintains stubs for the entire Python standard library and many popular third-party packages, so you're often just one pip install away from full type coverage for your dependencies.
Modern Union Syntax vs. Optional
Python 3.10 introduced the X | Y syntax. Here's the comparison:
# Python 3.10+ (preferred)
def find_by_id(id: int) -> dict | None:
    pass

# Python 3.9 and earlier
from typing import Optional, Union

def find_by_id(id: int) -> Optional[dict]:
    pass

# Also valid but more verbose
def find_by_id(id: int) -> Union[dict, None]:
    pass
All three are equivalent, but the X | Y syntax is cleaner and the future of Python typing. Use it if you're on 3.10+.
The shift to inline union syntax is part of a broader trend toward making type annotations feel less like foreign intrusions from a typed language and more like natural Python. The old from typing import Optional, Union, List, Dict, Tuple import dance was a barrier to adoption: it felt like ceremonial boilerplate before you could write actual code. With 3.10+ syntax, you just write what you mean: int | None, str | bytes | None, list[int] instead of List[int]. If you're starting a new project today, target 3.10+ and use the modern syntax throughout.
Common Type Hint Mistakes
Knowing the right tools is only half the battle; the other half is avoiding the patterns that seem helpful but actually undermine your type safety. These are the mistakes that experienced developers make when they're first adopting type hints, and learning to recognize them early will save you significant rework.
The Any escape hatch is the most dangerous. When you're not sure what type something should be, it's tempting to annotate it as Any and move on. But Any is contagious: a function that returns Any poisons every call site, because the type checker can no longer reason about what comes back. Any is appropriate for genuinely dynamic code (reflection, serialization), but using it as a shortcut for "I don't know what this is" defeats the entire purpose. If you genuinely don't know the type, that's a design signal: your data structures may need to be clarified.
Annotating with a bare dict instead of dict[str, SomeType] is nearly as bad. A function that accepts dict and a function that accepts dict[str, UserRecord] have very different contracts, but mypy will accept both. The former is approximately as useful as Any for the type checker's purposes.
Ignoring narrowing opportunities is another common miss. When you have a str | None parameter and you check if value is None: return, mypy knows that in the rest of the function, value is str. This is called type narrowing, and it's how Optional types actually work in practice, but only if you structure your guards correctly. If you check if value: instead of if value is None:, mypy still narrows correctly, but you've silently treated the empty string the same as None, which may not be what you meant.
Finally, writing list as a return type when you mean list[SomeType] is both a type hint mistake and a documentation failure. The specificity is the whole point.
mypy vs. pyright: Choosing Your Type Checker
mypy is the canonical type checker, but pyright (from Microsoft) has gained traction, especially in VS Code.
mypy: The Traditional Choice
- Mature ecosystem with years of development
- Excellent documentation
- Configurable strictness levels
- Slightly slower than pyright
- More forgiving by default
pyright: The Performance Alternative
- Blazingly fast, written in TypeScript
- Excellent VS Code integration (automatic diagnostics)
- Slightly stricter out of the box
- Better error messages in some cases
- Active development with frequent updates
In Practice
Use mypy if you're starting out or need to integrate with an existing project. Use pyright if you want better IDE integration and faster feedback loops. Many projects use both: mypy in CI/CD for thoroughness, pyright in the IDE for instant feedback.
VS Code Integration
For pyright in VS Code, install the Pylance extension:
{
    "python.analysis.typeCheckingMode": "strict",
    "python.linting.enabled": true
}
You'll see type errors highlighted as you type, without running any external command.
A Complete Example: Bringing It All Together
Let's write a small module that demonstrates all these concepts working in concert. This is the kind of code you'd find at the core of a real data processing pipeline, typed clearly enough that any engineer can understand the contracts at a glance.
from typing import Callable, Protocol, TypeVar, Literal
from dataclasses import dataclass

# Define a protocol for sortable items
class Comparable(Protocol):
    def __lt__(self, other: 'Comparable') -> bool:
        ...

# Type variable for generic operations
T = TypeVar('T')

# Data class with strict type hints
@dataclass
class Result:
    status: Literal["success", "error", "pending"]
    data: dict | None = None
    message: str = ""

def filter_items(
    items: list[T],
    predicate: Callable[[T], bool],
) -> list[T]:
    """Filter items using a predicate function."""
    return [item for item in items if predicate(item)]

def process_results(
    results: list[Result],
) -> tuple[int, int, int]:
    """Count successes, errors, and pending items."""
    success = sum(1 for r in results if r.status == "success")
    errors = sum(1 for r in results if r.status == "error")
    pending = sum(1 for r in results if r.status == "pending")
    return (success, errors, pending)

# Using the functions
if __name__ == "__main__":
    results: list[Result] = [
        Result(status="success", data={"id": 1}),
        Result(status="error", message="Not found"),
        Result(status="pending"),
    ]
    success, errors, pending = process_results(results)
    print(f"Results: {success} success, {errors} errors, {pending} pending")

    # Filter even numbers
    numbers = filter_items([1, 2, 3, 4, 5], lambda x: x % 2 == 0)
    print(f"Even numbers: {numbers}")
When you run mypy on this file, it verifies:
- Result objects only contain valid status values
- filter_items preserves type information
- process_results receives a proper list and returns a proper tuple
- All function calls pass the right types
Zero runtime overhead. Pure static-analysis safety. Notice how the types themselves communicate design intent: Literal["success", "error", "pending"] tells you immediately that Result.status is a state machine with exactly three states, not just any string. That documentation lives in the code where it can be verified, not in a comment or external doc that can drift out of sync.
Common Gotchas and How to Avoid Them
1. Any Type Defeats the Purpose
from typing import Any

# Bad: defeats the point
def process(data: Any) -> Any:
    pass

# Good: be specific
def process(data: dict[str, int]) -> list[str]:
    pass
2. Mutable Default Arguments with Type Hints
# Bad: this is still a footgun
def append_to_list(item: int, items: list[int] = []) -> list[int]:
    items.append(item)
    return items

# Good: use None as default
def append_to_list(item: int, items: list[int] | None = None) -> list[int]:
    if items is None:
        items = []
    items.append(item)
    return items
3. Forgetting That Type Hints Aren't Enforced at Runtime
def add(a: int, b: int) -> int:
    return a + b

# Type checker warns about this
add("hello", "world")  # Mypy error, but Python runs it anyway
Use beartype or pydantic if you need runtime validation.
4. Being Too Loose with Union Types
# Too permissive
def process(value: int | str | float | bool | list):
    pass

# Better: what does the function actually do with these types?
def process(value: int | str) -> float:
    """Convert numeric string or int to float."""
    return float(value)
The Compounding Returns of Type Safety
The benefits of type hints don't scale linearly with codebase size; they compound. A small, well-typed utility library doesn't just protect itself; it propagates type information outward to every caller. When every function in a codebase declares its contracts clearly, refactoring becomes dramatically safer. You can change a function's signature and mypy will tell you every call site that needs updating, instantly. What used to be an exhaustive manual search through a codebase becomes a compiler-style error list to work through.
This is especially significant in AI/ML work, where data pipelines are long chains of transformations. NumPy arrays, pandas DataFrames, and PyTorch tensors all have typed interfaces. When your preprocessing, feature engineering, and model inference stages are properly typed, you catch shape mismatches and dtype incompatibilities before your training job fails three hours in. NumPy ships its own type annotations, and the pandas-stubs package brings type checking to pandas, making the kinds of silent failures that plague data pipelines visible as type errors.
The maturity you build by writing well-typed Python also prepares you for other strongly-typed ecosystems. If you ever work with TypeScript, Go, or Rust, the discipline of thinking about type contracts transfers directly. More immediately, it prepares you for working with typed ML frameworks like JAX with its shape type annotations and PyTorch with its typed tensor operations.
Type hints are not the last word in Python safety: they don't replace tests, they don't catch logic errors, and they can't protect you against misuse by callers who ignore the types. But they're one of the most cost-effective safety investments you can make. The overhead is small, the tooling is excellent, and the return in prevented bugs and improved readability is substantial.
The Payoff
Type hints and static analysis might feel like overhead at first. But after a few weeks of catching bugs before they happen, you'll wonder how you lived without them.
The type checker becomes your assistant, reviewing every line of code and verifying assumptions that might otherwise slip through. Combined with good testing practices from the previous article, you've built a formidable defense against silent failures.
Summary
Type hints transform Python from dynamically typed to gradually typed. mypy and pyright analyze your hints and catch errors without slowing your code down. Start with simple annotations on functions, graduate to generic types, and use TypeVar and Protocol for flexible reusable code. Configure mypy to match your strictness preferences, and let it evolve with your codebase.
The discipline of writing well-typed Python pays dividends immediately in IDE support and refactoring safety, and it compounds over time as your codebase grows. Whether you're building data pipelines, web APIs, or ML training systems, type hints give you a layer of correctness verification that no amount of unit testing can fully replicate. The types express intent; the tests verify behavior. You need both.
The next article covers virtual environments, the infrastructure that keeps your projects isolated and reproducible.