In Python, even though I adore writing tests in a functional manner via pytest, I still have
a soft spot for the tools provided in the unittest.mock module. I like the fact that it's
baked into the standard library and is quite flexible. Moreover, I have yet to see another
mock library, in the Python ecosystem or any other language, that allows you to mock your
targets in such a terse, flexible, and maintainable fashion.
Narrowing types with TypeGuard in Python
Static type checkers like Mypy follow your code flow and statically try to figure out the types of the variables without you having to explicitly annotate inline expressions. For example:
# src.py
from __future__ import annotations


def check(x: int | float) -> str:
    if not isinstance(x, int):
        reveal_type(x)
        # Type is now 'float'.
    else:
        reveal_type(x)
        # Type is now 'int'.
    return str(x)
The reveal_type function is provided by Mypy, and you don't need to import it. But
remember to remove the call before executing the snippet; otherwise, Python will raise a
NameError at runtime, as the function is only understood by Mypy. If you run Mypy against this
snippet, it'll print the following lines:
Why 'NoReturn' type exists in Python
Technically, the type of None in Python is NoneType. However, you’ll rarely see
types.NoneType being used in the wild as the community has pretty much adopted None to
denote the type of the None singleton. This usage is also documented in PEP-484.
Whenever a callable doesn’t return anything, you usually annotate it as follows:
# src.py
from __future__ import annotations
def abyss() -> None:
    return
But sometimes a callable raises an exception and never gets the chance to return anything. Consider this example:
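A sketch along those lines (my reconstruction, not necessarily the post's exact snippet): a function that unconditionally raises should be annotated with NoReturn rather than None, so the type checker knows code after a call to it is unreachable.

```python
# src.py
from __future__ import annotations

from typing import NoReturn


def fail(msg: str) -> NoReturn:
    # This callable never returns; it always raises.
    raise RuntimeError(msg)


def run(flag: bool) -> int:
    if flag:
        return 42
    fail("flag was False")
    # Mypy knows this point is unreachable, so it doesn't
    # complain about a missing return statement here.
```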
Add extra attributes to enum members in Python
While grokking the source code of http.HTTPStatus, I came across this technique for adding extra attributes to the values of enum members. Now, to understand what I mean by adding attributes, let's consider the following example:
# src.py
from __future__ import annotations
from enum import Enum
class Color(str, Enum):
    RED = "Red"
    GREEN = "Green"
    BLUE = "Blue"
Here, I’ve inherited from str to ensure that the values of the enum members are strings.
This class can be used as follows:
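As for the extra-attribute technique itself, here's a sketch of what http.HTTPStatus does (the enum below is my own cut-down example, not the stdlib source): define __new__ before the members so Enum uses it to build each member from its tuple value, and attach the surplus tuple items as attributes.

```python
# src.py
from __future__ import annotations

from enum import Enum


class HTTPStatus(Enum):
    # __new__ must come before the member definitions so Enum
    # calls it with each member's unpacked tuple value.
    def __new__(cls, value: int, phrase: str, description: str = ""):
        obj = object.__new__(cls)
        obj._value_ = value  # The canonical enum value.
        # The surplus tuple items become extra attributes.
        obj.phrase = phrase
        obj.description = description
        return obj

    OK = 200, "OK", "Request fulfilled"
    NOT_FOUND = 404, "Not Found", "Nothing matches the given URI"
```

Lookup by value still works, since only _value_ participates in it: HTTPStatus(404) gives you HTTPStatus.NOT_FOUND, with the phrase and description riding along as plain attributes.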
Peeking into the internals of Python's 'functools.wraps' decorator
The functools.wraps decorator allows you to keep your function’s identity intact after
it's been wrapped by a decorator. Whenever a function is wrapped by a decorator, its identity
properties, such as the function's name, docstring, and annotations, get replaced by those of the
wrapper function. Consider this example:
from __future__ import annotations

# In < Python 3.9, import this from the typing module.
from collections.abc import Callable
from typing import Any


def log(func: Callable) -> Callable:
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        """Internal wrapper."""
        val = func(*args, **kwargs)
        return val

    return wrapper


@log
def add(x: int, y: int) -> int:
    """Add two numbers.

    Parameters
    ----------
    x : int
        First argument.
    y : int
        Second argument.

    Returns
    -------
    int
        Returns the summation of two integers.
    """
    return x + y


if __name__ == "__main__":
    print(add.__doc__)
    print(add.__name__)
Here, I’ve defined a simple logging decorator that wraps the add function. The function
add has its own type annotations and docstring. So, you’d expect the docstring and
name of the add function to be printed when the above snippet gets executed. However,
running the script prints the following instead:
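That is, it prints the wrapper's identity: the docstring "Internal wrapper." and the name "wrapper". Applying functools.wraps to the inner wrapper restores add's metadata; here's a condensed sketch of the fix:

```python
from __future__ import annotations

import functools
from collections.abc import Callable
from typing import Any


def log(func: Callable) -> Callable:
    # wraps copies func's __name__, __doc__, __annotations__, etc.
    # onto wrapper, overriding wrapper's own.
    @functools.wraps(func)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        """Internal wrapper."""
        return func(*args, **kwargs)

    return wrapper


@log
def add(x: int, y: int) -> int:
    """Add two numbers."""
    return x + y


print(add.__name__)  # add
print(add.__doc__)   # Add two numbers.
```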
Limit concurrency with semaphore in Python asyncio
I was working with a rate-limited API endpoint where I continuously needed to send short-polling GET requests without hitting an HTTP 429 error. Perusing the API docs, I found that the endpoint allows a maximum of 100 requests per second. So my goal was to find a way to send the maximum number of requests without encountering the too-many-requests error.
I picked up Python’s asyncio and the amazing HTTPx library by Tom Christie to make the requests. This is the naive version that I wrote in the beginning; it quickly hits the HTTP 429 error:
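The fix builds toward bounding concurrency with asyncio.Semaphore. Here's a self-contained sketch of that pattern; the simulated request (an asyncio.sleep standing in for a real `await client.get(url)` via HTTPx) and the numbers are placeholders of mine:

```python
import asyncio

MAX_CONCURRENT = 100  # The API allows at most 100 requests per second.


async def fetch(sem: asyncio.Semaphore, i: int) -> int:
    # At most MAX_CONCURRENT coroutines can hold the semaphore at once;
    # the rest wait here instead of flooding the endpoint.
    async with sem:
        await asyncio.sleep(0.01)  # Stand-in for `await client.get(url)`.
        return i


async def main() -> list[int]:
    sem = asyncio.Semaphore(MAX_CONCURRENT)
    # gather preserves the order of the awaited results.
    return await asyncio.gather(*(fetch(sem, i) for i in range(500)))


if __name__ == "__main__":
    results = asyncio.run(main())
    print(len(results))
```

Strictly speaking, a semaphore caps in-flight requests rather than requests per second; pairing it with a short sleep per acquisition, as above, keeps the rate near the limit in practice.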
Amphibian decorators in Python
Whether you like it or not, the split world of sync and async functions in the Python ecosystem is something we'll have to live with, at least for now. So, having to write things that work with both sync and async code is an inevitable part of the journey. Projects like Starlette and HTTPx can give you some clever pointers on how to craft APIs that are compatible with both sync and async code.
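One common shape such an "amphibian" API takes (a sketch of the general pattern in my own words, not code from either project) is a decorator that inspects whether its target is a coroutine function and returns a matching sync or async wrapper:

```python
import asyncio
import functools
import inspect


def log_call(func):
    """A decorator that works on both sync and async callables."""
    if inspect.iscoroutinefunction(func):
        # Async target: the wrapper must itself be a coroutine function.
        @functools.wraps(func)
        async def awrapper(*args, **kwargs):
            print(f"calling {func.__name__}")
            return await func(*args, **kwargs)

        return awrapper

    # Sync target: a plain wrapper suffices.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)

    return wrapper


@log_call
def add(x, y):
    return x + y


@log_call
async def aadd(x, y):
    return x + y


print(add(1, 2))
print(asyncio.run(aadd(1, 2)))
```

The key detail is that the branch happens once, at decoration time, so neither wrapper pays a per-call dispatch cost.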
Go Rusty with exception handling in Python
While grokking Black formatter’s codebase, I came across this Rust-influenced error handling model that offers an interesting way of handling exceptions in Python. Exception handling in Python usually follows the EAFP paradigm where it’s easier to ask for forgiveness than permission.
However, Rust has this recoverable error handling workflow that leverages generic Enums. I wanted to explore how Black emulates that in Python. This is how it works:
# src.py
from __future__ import annotations

from typing import Generic, TypeVar, Union

T = TypeVar("T")
E = TypeVar("E", bound=Exception)


class Ok(Generic[T]):
    def __init__(self, value: T) -> None:
        self._value = value

    def ok(self) -> T:
        return self._value


class Err(Generic[E]):
    def __init__(self, e: E) -> None:
        self._e = e

    def err(self) -> E:
        return self._e


Result = Union[Ok[T], Err[E]]
In the above snippet, the two generic types Ok and Err represent the success and
error types of a callable, respectively. These two generics are then combined into a single
Result generic type. You'd use Result to handle exceptions as follows:
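Roughly like this, in my own sketch (the div function is illustrative): a fallible function returns Ok on success and Err on failure, and the caller branches with isinstance instead of wrapping the call site in try/except.

```python
from __future__ import annotations

from typing import Generic, TypeVar, Union

T = TypeVar("T")
E = TypeVar("E", bound=Exception)


# Ok, Err, and Result exactly as in the snippet above.
class Ok(Generic[T]):
    def __init__(self, value: T) -> None:
        self._value = value

    def ok(self) -> T:
        return self._value


class Err(Generic[E]):
    def __init__(self, e: E) -> None:
        self._e = e

    def err(self) -> E:
        return self._e


Result = Union[Ok[T], Err[E]]


def div(a: float, b: float) -> Result[float, ZeroDivisionError]:
    # The exception is caught once, here, and carried as a value.
    try:
        return Ok(a / b)
    except ZeroDivisionError as exc:
        return Err(exc)


res = div(10, 2)
if isinstance(res, Ok):
    print(res.ok())  # 5.0
else:
    print(res.err())
```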
Variance of generic types in Python
I’ve always had a hard time explaining variance of generic types while working with type annotations in Python. This is an attempt to distill the things I’ve picked up on type variance while going through PEP-483.
A pinch of type theory
A generic type is a class or interface that is parameterized over types. Variance refers to how subtyping between the generic types relates to subtyping between their parameters' types.
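A concrete sketch of my own, condensed from the kind of examples PEP-483 uses: a read-only container can safely be covariant, meaning Box[bool] is acceptable wherever Box[int] is expected, because bool is a subtype of int.

```python
from typing import Generic, TypeVar

# covariant=True: Box[Sub] is a subtype of Box[Super]
# whenever Sub is a subtype of Super.
T_co = TypeVar("T_co", covariant=True)


class Box(Generic[T_co]):
    """A read-only container; it only ever produces T_co."""

    def __init__(self, item: T_co) -> None:
        self._item = item

    def get(self) -> T_co:
        return self._item


def unwrap(box: Box[int]) -> int:
    return box.get()


# bool is a subtype of int, so a type checker accepts
# Box[bool] where Box[int] is expected -- that's covariance.
print(unwrap(Box(True)))
```

Covariance is only safe because Box never consumes a T_co; a container with a setter would have to be invariant, and a write-only consumer could be contravariant.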
Create a sub dictionary with O(K) complexity in Python
How’d you create a sub dictionary from a dictionary where the keys of the sub-dict are provided as a list?
I was reading a tweet by Ned Batchelder on this today, and it made me realize that I
usually solve it with O(DK) complexity, where K is the number of keys of the sub-dict and
D is the number of keys of the primary dict. Here's how I usually do it without giving it any
thought whatsoever:
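Something like this, in my reconstruction of both versions: the naive comprehension walks the whole primary dict and scans the key list for each item, while the O(K) version just indexes the K keys directly.

```python
primary = {"a": 1, "b": 2, "c": 3, "d": 4}
keys = ["a", "c"]

# O(DK): iterate all D items, and `k in keys` is a linear scan over K.
sub_slow = {k: v for k, v in primary.items() if k in keys}

# O(K): dict lookups are O(1) on average, so index the K keys directly.
sub_fast = {k: primary[k] for k in keys if k in primary}

print(sub_slow)
print(sub_fast)
```

The `if k in primary` guard keeps the fast version from raising KeyError when a requested key is missing; drop it if the keys are guaranteed to exist.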