I'm trying out Python's type annotations with abstract base classes to write some interfaces. Is there a way to annotate the possible types of *args and **kwargs?

For example, how would one express that the sensible arguments to a function are either an int or two ints? type(args) gives Tuple so my guess was to annotate the type as Union[Tuple[int, int], Tuple[int]], but this doesn't work.

from typing import Union, Tuple

def foo(*args: Union[Tuple[int, int], Tuple[int]]):
    try:
        i, j = args
        return i + j
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

# ok
print(foo((1, 2)))
# mypy does not like this
print(foo(1, 2))

Error messages from mypy:

t.py: note: In function "foo":
t.py:6: error: Unsupported operand types for + ("tuple" and "Union[Tuple[int, int], Tuple[int]]")
t.py: note: At top level:
t.py:12: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:14: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:15: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:15: error: Argument 2 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"

It makes sense that mypy doesn't like this for the function call because it expects there to be a tuple in the call itself. The addition after unpacking also gives a typing error that I don't understand.

How does one annotate the sensible types for *args and **kwargs?

8 Answers


For variable positional arguments (*args) and variable keyword arguments (**kwargs) you only need to specify the expected type for one such argument.

From the Arbitrary argument lists and default argument values section of the Type Hints PEP:

Arbitrary argument lists can as well be type annotated, so that the definition:

def foo(*args: str, **kwds: int): ...

is acceptable and it means that, e.g., all of the following represent function calls with valid types of arguments:

foo('a', 'b', 'c')
foo(x=1, y=2)
foo('', z=0)

So you'd want to specify your method like this:

def foo(*args: int): ...

However, if your function can only accept either one or two integer values, you should not use *args at all, use one explicit positional argument and a second keyword argument:

def foo(first: int, second: Optional[int] = None): ...

Now your function is actually limited to one or two arguments, and both must be integers if specified. *args always means 0 or more, and can't be limited by type hints to a more specific range.
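Applied to the question's foo, the suggested signature might look like the following (a minimal sketch; the body mirrors the question's one-or-two-ints logic):

```python
from typing import Optional

def foo(first: int, second: Optional[int] = None) -> int:
    # One required int plus an optional second int, replacing *args.
    if second is None:
        return first
    return first + second

print(foo(1))     # 1
print(foo(1, 2))  # 3
```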

Martijn Pieters
  • Just curious, why add the `Optional`? Did something change about Python or did you change your mind? Is it still not strictly necessary due to the `None` default? – Praxeolitic Aug 25 '17 at 00:30
  • @Praxeolitic yes, in practice the automatic, implied `Optional` annotation when you use `None` as a default value made certain usecases harder and that is now being removed from the PEP. – Martijn Pieters Aug 25 '17 at 00:44
  • [Here is a link discussing this](https://github.com/python/typing/issues/275) for those interested. It certainly does sound like explicit `Optional` is going to be required in the future. – Rick Feb 05 '18 at 22:22
  • This is actually not supported for Callable: https://github.com/python/mypy/issues/5876 – Shital Shah Jan 11 '20 at 06:49
  • @ShitalShah: that’s not really what that issue is about. `Callable` doesn’t support *any* mention of a type hint for `*args` or `**kwargs` **full stop**. That specific issue is about marking up callables that accept specific arguments *plus an arbitrary number of others*, and so use `*args: Any, **kwargs: Any`, a very specific type hint for the two catch-alls. For cases where you set `*args` and / or `**kwargs` to something more specific you can use a `Protocol`. – Martijn Pieters Jan 11 '20 at 08:17
  • Could it be cleaner to use [`@overload`](https://docs.python.org/3/library/typing.html#typing.overload) on `foo` so that the two valid signatures are `foo(first: int)` and `foo(first: int, second: int)`, leaving the real signature as `foo(first, second=None)`? Or is just a matter of preference? – wjandrea Jan 08 '23 at 20:58
  • @wjandrea overloads won’t really help here; use overloads to aid the type checker when the combination of arguments alters what is being returned; e.g. `foo(1)` returns a string but `foo(1, 2)` produces an int, etc. – Martijn Pieters Jan 08 '23 at 22:46

2022 Update

The mypy team added support for Unpack; it is available in mypy 0.981 or higher.

Attention! Unpack[...] is still considered experimental, so you will need to pass --enable-incomplete-features to enable it.

You can use this feature as follows:

from typing import TypedDict
from typing_extensions import Unpack

class RequestParams(TypedDict):
    url: str
    allow_redirects: bool

def request(**kwargs: Unpack[RequestParams]) -> None: ...

If you call the request function with the arguments defined in the TypedDict, you won't get any errors:

# OK
request(url="https://example.com", allow_redirects=True)

If you forget to pass an argument, mypy will now let you know:

request(url="https://example.com")
# error: Missing named argument "allow_redirects" for "request"  [call-arg]

You can also make the fields non-required by adding total=False to the TypedDict:

class RequestParams(TypedDict, total=False):
    url: str
    allow_redirects: bool

# OK
request(url="https://example.com")

Alternatively, you can use the Required and NotRequired annotations to control whether a keyword argument is required or not:

from typing import TypedDict
from typing_extensions import Unpack, NotRequired

class RequestParams(TypedDict):
    url: str
    allow_redirects: NotRequired[bool]

def request(**kwargs: Unpack[RequestParams]) -> None: ...

# OK
request(url="https://example.com", allow_redirects=True)

Old answer below:

While you can annotate variadic arguments with a type, I don't find it very useful because it assumes that all arguments are of the same type.

The proper type annotation of *args and **kwargs that allows specifying each variadic argument separately is not supported by mypy yet. There is a proposal for adding an Expand helper in the mypy_extensions module; it would work like this:

class Options(TypedDict):
    timeout: int
    alternative: str
    on_error: Callable[[int], None]
    on_timeout: Callable[[], None]

def fun(x: int, **options: Expand[Options]) -> None: ...

The GitHub issue was opened in January 2018 but is still not closed. Note that while the issue is about **kwargs, the Expand syntax will likely be used for *args as well.

Cesar Canassa
  • According to https://github.com/microsoft/pyright/issues/3002#issuecomment-1046100462 the new syntax is `**options: Unpack[Options]` and works in Pylance (but not yet mypy) – rattray Mar 24 '22 at 20:21
  • Great. If the answer is: `# type: ignore[no-untyped-def]`, then that is the answer! – Chris Jun 17 '22 at 17:28
  • @Chris IMO this is the only current answer in this thread and one of the most useful I've found on the `python-typing` tag. – bad_coder Jun 17 '22 at 17:32

The easiest way to do this -- without changing your function signature -- is using @overload

First, some background. You cannot annotate the type of *args as a whole, only the type of the items in args. So you can't say that *args is Tuple[int, int]; you can only say that the type of each item within *args is int. That means that you can't put a limit on the length of *args or use a different type for each item.

To solve this you can consider changing the signature of your function to give it named arguments, each with their own type annotation, but if you want (or need) to keep your function using *args, you can get mypy to work using @overload:

from typing import overload

@overload
def foo(arg1: int, arg2: int) -> int: ...

@overload
def foo(arg: int) -> int: ...

def foo(*args):
    try:
        i, j = args
        return i + j
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

print(foo(1, 2))

Note that you do not add @overload or type annotations to the actual implementation, which must come last.

You can also use this to vary the returned result in a way that makes explicit which argument types correspond with which return type. e.g.:

from typing import Tuple, overload

@overload
def foo(arg1: int, arg2: int) -> Tuple[int, int]: ...

@overload
def foo(arg: int) -> int: ...

def foo(*args):
    try:
        i, j = args
        return j, i
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

print(foo(1, 2))
  • I like this answer because it addresses the more general case. Looking back, I should not have used `(type1)` vs `(type1, type1)` function calls as my example. Maybe `(type1)` vs `(type2, type1)` would have been a better example and shows why I like this answer. This also allows differing return types. However, in the special case where you only have one return type and your `*args` and `*kwargs` are all the same type, the technique in Martjin's answer makes more sense so both answers are useful. – Praxeolitic Aug 25 '17 at 00:07
  • Using `*args` where there is a maximum number of arguments (2 here) is *still wrong* however. – Martijn Pieters Aug 25 '17 at 00:12
  • So, yes, it's good to know about `@overload`, but it is the wrong tool *for this specific job*. – Martijn Pieters Aug 25 '17 at 00:13
  • @MartijnPieters Why is `*args` necessarily wrong here? If the expected calls were `(type1)` vs `(type2, type1)`, then the number of arguments is variable and there isn't an appropriate default for the trailing argument. Why is it important that there's a max? – Praxeolitic Aug 25 '17 at 00:24
  • `*args` is really there for *zero or more*, uncapped, homogenous arguments, *or* for 'passing these along untouched' catch-alls. You have one required argument and one optional. That's totally different and is normally handled by giving the second argument a sentinel default value to detect that it was omitted. – Martijn Pieters Aug 25 '17 at 00:39
  • Using `*args` where the cardinality is *at least n, at most m* where *n* is greater than 0 breaks the self-documenting nature and convention, whilst keyword arguments for optionals has a very long precedent in Python. – Martijn Pieters Aug 25 '17 at 00:41
  • After looking at the PEP, this clearly isn't the intended use of @overload. While this answer shows an interesting way to individually annotate the types of `*args`, an even better answer to the question is that this isn't something that should be done at all. – Praxeolitic Aug 25 '17 at 01:07
  • @MartijnPieters I'm convinced on `*args` but what about `**kwargs`? It seems like it could be consistent with Python style to annotate individual types for `**kwargs` but there just isn't a way to do that. Would you agree? – Praxeolitic Aug 25 '17 at 01:12
  • @Praxeolitic: in Python 3, `**kwargs` is *mostly* used for passing on arguments to another call. But yes, that you can't annotate the values is.. unhelpful sometimes. The PEP is still in flux, the community is hashing out better ways of specifying complex cases as we discover them. See the [extensive discussion around decorators](https://github.com/python/mypy/issues/3157) (follow all the referenced issues) for example. – Martijn Pieters Aug 25 '17 at 07:02
  • This is still the only answer that completely addresses the original question "Is there a way to annotate the *possible* types of *args and **kwargs", emphasis on "possible". I'm not sure why I'm getting down-voted for what is the obvious correct answer. The example was fabricated, but the intent was clear: the OP wanted to explicitly describe the possible signatures of the function `foo`. – chadrik Aug 26 '17 at 00:34
  • At least in MyPy, one is encouraged to use the overloads rather than the direct call. Thus, the user is discouraged from passing in fewer or more positional arguments than necessary. This seems to be a standard, as it works in PyCharm as well. This solution allows multiple heterogenous overloads sharing a common base implementation but with non-runtime validation, and would therefore be what I consider the best solution. It may be a solution to implementing something that's considered non-Pythonic, but that's a secondary issue. – TerrestrialIntelligence Dec 27 '19 at 12:18
  • @MartijnPieters you wrote: "You have one required argument and one optional. That's totally different and is normally handled by giving the second argument a sentinel default value to detect that it was omitted." I agree that a keyword argument would have been more clear, but even in that case you would still need to use `@overload` to properly type this function if the type of result or the first arg varied depending on the existence of the second argument. e.g. if the valid type signatures were `(int) -> int` and `(int, int) -> float` – chadrik Jan 06 '20 at 20:47
  • @chadrik that’s the actual use-case for `@overload`, which is orthogonal to the issue discussed here. Put differently, it doesn't matter that the argument count varies, you also need `@overload` for `(int) -> int` vs `(float) -> float`, or `(bytes, int) -> bytes` vs `(str, int) -> str`, etc. – Martijn Pieters Jan 07 '20 at 00:11
  • `@overload` is required when mimicking _functional polymorphism_. **Python does not natively support functional polymorphism.** If you `def` a function with multiple signatures, the last function `def`'d overrides (redefines) the previous ones. Python mimics functional polymorphism with _optional positional/keyword arguments_ (which coincidentally C++ does not support). Overloads are used when (1) typing ported C/C++ polymorphic functions, or (2) consistency must be maintained between types and their arity from parameters to return types. You have limited `*args` to 2 arguments instead of any. – adam.hendry Sep 18 '22 at 18:26
  • @A.Hendry the ability to edit another user's post is meant to fix or update examples, typos, etc. You've just dumped your opinion into my opening paragraph. Please leave your opinions in the comments. – chadrik Sep 26 '22 at 16:50
  • @chadrik Fair enough; apologies for that. However, thank you for updating your answer: I don't think it was correct in saying "The correct way to do this is....". Your answer is only one way to get the type checking to work; there are others. I wouldn't have jumped straight to using `@overload`, but I will countenance that that is my opinion. Most importantly, in this case, I think it was important to clarify that this is "a" way to do this in order to avoid arguments in the community. – adam.hendry Sep 30 '22 at 00:27
  • This is mistyped: it will fail when calling arguments by name – tbrugere Aug 25 '23 at 18:58

As a short addition to the previous answer, if you're trying to use mypy on Python 2 files and need to use comments to add types instead of annotations, you need to prefix the types for args and kwargs with * and ** respectively:

def foo(param, *args, **kwargs):
    # type: (bool, *str, **int) -> None

This is treated by mypy as being the same as the below, Python 3.5 version of foo:

def foo(param: bool, *args: str, **kwargs: int) -> None: ...

In some cases the content of **kwargs can be a variety of types.

This seems to work for me:

from typing import Any

def testfunc(**kwargs: Any) -> None: ...

or:

from typing import Any, Optional

def testfunc(**kwargs: Optional[Any]) -> None: ...

In the case where you feel the need to constrain the types in **kwargs, I suggest creating a struct-like object and adding the typing there. This can be done with dataclasses or pydantic.

from dataclasses import dataclass

@dataclass
class MyTypedKwargs:
    expected_variable: str
    other_expected_variable: int

def testfunc(expectedargs: MyTypedKwargs) -> None: ...
  • This essentially disables type checking, doesn't it? That's like leaving out the annotation for `kwargs` altogether. – normanius Apr 06 '21 at 21:29
  • `**kwargs` is by design and technically can be anything. If you know what you're getting I suggest defining that as a typed argument. The advantage here is that for cases where using `**kwargs` is acceptable/expected, in ides/tools, like pycharm, it won't give you a notification that the type is incorrect. – monkut Apr 07 '21 at 00:38
  • I partially disagree. I think there are situations where it's reasonable to constrain types for **kwargs or *args. But I also see that type checking and **kwargs don't go together very well (at least for current Python versions). Maybe you want to add this to your answer to better address the OP's question. – normanius Apr 10 '21 at 19:56
  • Yeah, there may be a usecase for typing kwargs, but I would lean toward making your inputs clearer instead of lumping them into kwargs. – monkut Apr 12 '21 at 02:37
  • It's a good practice to avoid using `Any` because it disables type checking completely. Instead you can use `object` and then `# type: ignore` wherever you expand kwargs. – V13 Sep 10 '22 at 15:44

If one wants to describe specific named arguments expected in kwargs, one can instead pass in a TypedDict (which defines required and optional parameters). Optional parameters are what were the kwargs. Note: TypedDict requires Python >= 3.8. See this example:

import typing

class RequiredProps(typing.TypedDict):
    # all of these must be present
    a: int
    b: str

class OptionalProps(typing.TypedDict, total=False):
    # these can be included or they can be omitted
    c: int
    d: int

class ReqAndOptional(RequiredProps, OptionalProps):
    pass

def hi(req_and_optional: ReqAndOptional): ...
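A sketch of how the TypedDict above could be consumed (the body of hi is hypothetical; the original leaves it unspecified):

```python
import typing

class RequiredProps(typing.TypedDict):
    # all of these must be present
    a: int
    b: str

class OptionalProps(typing.TypedDict, total=False):
    # these can be included or they can be omitted
    c: int
    d: int

class ReqAndOptional(RequiredProps, OptionalProps):
    pass

def hi(req_and_optional: ReqAndOptional) -> str:
    # Hypothetical body: repeat "b" a total of "a" times.
    return req_and_optional["b"] * req_and_optional["a"]

print(hi({"a": 2, "b": "hi"}))         # OK: c and d omitted
print(hi({"a": 1, "b": "x", "c": 3}))  # OK: optional key included
```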

I'm trying out Python's type annotations with abstract base classes to write some interfaces. Is there a way to annotate the possible types of *args and **kwargs...How does one annotate the sensible types for *args and **kwargs

There are two general usage categories when it comes to type hinting:

  1. Writing your own code (which you can edit and change)
  2. Using 3rd party code (which you can't edit, or is hard to change)

Most users have some combo of both.

The answer depends on whether your *args and **kwargs have homogeneous types (i.e. all of the same type) or heterogeneous types (i.e. different types), as well as whether there is a fixed number of them or a variable/indeterminate number of them (the term used here is fixed vs. variable arity).

*args and **kwargs have sometimes been used in what I'm loosely calling a "Python-specific design pattern" (see below). It is important to understand when this is being done because it affects the way you should type hint.

Best practice, always, is to stand on the shoulders of giants:

  • I highly recommend reading and studying the typeshed .pyi stubs, especially for the standard library, to learn how developers have typed these things in the wild.

For those who want to see a HOW-TO come to life, please consider upvoting the following PRs:

Case 1: (Writing Your Own Code)


(a) Operating on a Variable Number of Homogeneous Arguments

The first reason *args is used is to write a function that has to work on a variable (indeterminate) number of homogeneous arguments.

Example: summing numbers, accepting command line arguments, etc.

In these cases, all *args are homogeneous (i.e. all the same type).

Example: In the first case, all arguments are ints or floats; In the second case, all arguments are strs.

It is also possible to use Unions, TypeAliases, Generics, and Protocols as the type for *args.

I claim (without proof) that operating on an indeterminate number of homogeneous arguments was the first reason *args was introduced into the Python language.

Consequently, PEP 484 supports providing *args a homogeneous type.
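A minimal sketch of this homogeneous case (the function name is illustrative):

```python
def total(*args: float) -> float:
    # Every positional argument is annotated as float; mypy also accepts
    # ints here, since int is implicitly compatible with float.
    return sum(args)

print(total(1, 2, 3.5))  # 6.5
print(total())           # 0 (zero or more arguments)
```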


Using *args is done much less often than specifying parameters explicitly (i.e. logically, your code base will have many more functions that don't use *args than do). Using *args for homogeneous types is normally done to avoid requiring users to put arguments into a container before passing them to the function.

It is recommended to type parameters explicitly wherever possible.

  • If for nothing else, you would normally be documenting each argument with its type in a docstring anyway (not documenting is a quick way to make others not want to use your code, including your future self.)

Note also that args is a tuple: extra positional arguments are packed into a tuple, so you can't mutate args directly (you would have to pull a mutable object out of args).

(b) Writing Decorators and Closures

The second place where *args will pop up is in decorators. For this, using ParamSpec as described in PEP 612 is the way to go.
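A sketch of the PEP 612 approach (the decorator and function names are illustrative; on Python < 3.10, ParamSpec comes from typing_extensions instead of typing):

```python
import functools
from typing import Callable, TypeVar

try:
    from typing import ParamSpec  # Python >= 3.10
except ImportError:
    from typing_extensions import ParamSpec

P = ParamSpec("P")
R = TypeVar("R")

def log_calls(func: Callable[P, R]) -> Callable[P, R]:
    # P captures func's exact positional and keyword parameters, so calls
    # to the decorated function are checked against the original signature.
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def add(x: int, y: int) -> int:
    return x + y

print(add(2, 3))  # prints "calling add", then 5
```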

(c) Top-Level Functions that Call Helpers

This is the "Python-specific design pattern" I alluded to. For Python >= 3.11, the python docs show examples where you can use TypeVarTuple to type this so the type information is preserved between calls.

  • Using *args this way is typically done to reduce the amount of code to write, esp. when the arguments between multiple functions are the same
  • It has also been used to "swallow up" a variable number of arguments through tuple unpacking that may not be needed in the next function

Here, items in *args have heterogeneous types, and possibly a variable number of them, both of which can be problematic.

The Python typing ecosystem does not have a way to specify heterogeneous *args. 1

Before the advent of type checking, developers would need to check the type of individual arguments in *args (with assert, isinstance, etc.) if they needed to do something differently depending on the type:


  • You need to print passed strs, but sum the passed ints

Thankfully, the mypy developers added type inference and type narrowing to mypy to support these kinds of situations. (Also, existing code bases don't need to change much if they were already using assert, isinstance, etc., to determine the types of the items in *args.)

Consequently, in this case you would do the following:

  • Give *args the type object so its elements can be any type, and
  • use type narrowing where needed with assert ... is (not) None, isinstance, issubclass, etc., to determine the types of individual items in *args
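The two bullets above can be sketched as follows, using the earlier "print the strs, sum the ints" example (the function name is illustrative):

```python
def process(*args: object) -> int:
    # Each item starts as "object"; isinstance narrows it so the type
    # checker allows int addition and str printing on each branch.
    total = 0
    for arg in args:
        if isinstance(arg, int):
            total += arg
        elif isinstance(arg, str):
            print(arg)
    return total

print(process(1, "hello", 2, "world"))  # prints the strs, returns 3
```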

1 Warning:

For Python >= 3.11, *args can be typed with TypeVarTuple, but this is meant to be used when type hinting variadic generics. It should not be used for typing *args in the general case.

TypeVarTuple was primarily introduced to help type hint numpy arrays, tensorflow tensors, and similar data structures, but for Python >= 3.11, it can be used to preserve type information between calls for top-level functions calling helpers as stated before.

Functions that process heterogenous *args (not just pass them through) must still type narrow to determine the types of individual items.

For Python <3.11, TypeVarTuple can be accessed through typing_extensions, but to date there is only provisional support for it through pyright (not mypy). Also, PEP 646 includes a section on using *args as a Type Variable Tuple.
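For the pass-through case described above, a hedged sketch of TypeVarTuple (names are illustrative; on Python < 3.11 this relies on typing_extensions, with the support caveats just mentioned):

```python
from typing import Tuple

try:
    from typing import TypeVarTuple, Unpack  # Python >= 3.11
except ImportError:
    from typing_extensions import TypeVarTuple, Unpack

Ts = TypeVarTuple("Ts")

def pass_through(*args: Unpack[Ts]) -> Tuple[Unpack[Ts]]:
    # The per-position types of *args flow through to the return type,
    # e.g. pass_through(1, "a") is inferred as Tuple[int, str].
    return args

print(pass_through(1, "a"))  # (1, 'a')
```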


(a) Operating on a Variable Number of Homogeneous Arguments

PEP 484 supports typing all values of the **kwargs dictionary as a homogeneous type. All keys are automatically strs.

Like *args, it is also possible to use Unions, TypeAliases, Generics, and Protocols as the type for **kwargs.
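For completeness, the mechanics of a homogeneous **kwargs look like this (a minimal sketch; the function name is illustrative):

```python
def format_attrs(**kwargs: str) -> str:
    # Every keyword argument's *value* must be a str; keys are always str.
    return " ".join(f'{key}="{value}"' for key, value in kwargs.items())

print(format_attrs(href="/home", target="_blank"))
```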

I've not found a compelling use case for processing a homogeneous set of named arguments using **kwargs.

(b) Writing Decorators and Closures

Again, I would point you to ParamSpec as described in PEP 612.

(c) Top-Level Functions that Call Helpers

This is also the "Python-specific design pattern" I alluded to.

For a finite set of heterogeneous keyword types, you can use TypedDict and Unpack if PEP 692 is approved.

However, the same advice as for *args applies here:

  • It is best to explicitly type out your keyword arguments
  • If your types are heterogeneous and of unknown size, type hint with object and type narrow in the function body

Case 2: (3rd Party Code)

This ultimately amounts to following the guidelines for the part (c)s in Case 1.


Static Type Checkers

The answer to your question also depends on the static type checker you use. To date (and to my knowledge), your choices for static type checker include:

  • mypy: Python's de facto static type checker
  • pyright: Microsoft's static type checker
  • pyre: Facebook/Instagram's static type checker
  • pytype: Google's static type checker

I personally have only ever used mypy and pyright. For these, the mypy playground and pyright playground are great places to test out type hinting your code.


ABCs, like descriptors and metaclasses, are tools for building frameworks (1). If there's a chance you could be turning your API from a "consenting adults" Python syntax into a "bondage-and-discipline" syntax (to borrow a phrase from Raymond Hettinger), consider YAGNI.

That said (preaching aside), when writing interfaces, it's important to consider whether you should use Protocols or ABCs.


In OOP, a protocol is an informal interface, defined only in documentation and not in code (see this review article of Fluent Python, Ch. 11, by Luciano Ramalho). Python adopted this concept from Smalltalk, where a protocol was an interface seen as a set of methods to fulfill. In Python, this is achieved by implementing specific dunder methods, which is described in the Python data model and I touch upon briefly here.

Protocols implement what is called structural subtyping. In this paradigm, a subtype is determined by its structure (i.e. its behavior), as opposed to nominal subtyping (where a subtype is determined by its inheritance tree). Structural subtyping is also called static duck typing, as compared to traditional (dynamic) duck typing. (The term is thanks to Alex Martelli.)

Other classes don't need to subclass to adhere to a protocol: they just need to implement specific dunder methods. With type hinting, PEP 544 in Python 3.8 introduced a way to formalize the protocol concept. Now, you can create a class that inherits from Protocol and define any functions you want in it. So long as another class implements those functions, it's considered to adhere to that Protocol.
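The PEP 544 mechanism described above can be sketched like this (class and function names are illustrative):

```python
from typing import Protocol  # Python >= 3.8

class Drawable(Protocol):
    def draw(self) -> str: ...

class Circle:
    # Never subclasses Drawable; implementing draw() is enough
    # for a static type checker to accept it where Drawable is expected.
    def draw(self) -> str:
        return "circle"

def render(item: Drawable) -> str:
    return item.draw()

print(render(Circle()))  # circle
```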


Abstract base classes complement duck-typing and are helpful when you run into situations like:

class Artist:
    def draw(self): ...

class Gunslinger:
    def draw(self): ...

class Lottery:
    def draw(self): ...

Here, the fact that these classes all implement a draw() method doesn't necessarily mean these objects are interchangeable (again, see Fluent Python, Ch. 11, by Luciano Ramalho)! An ABC gives you the ability to make a clear declaration of intent. Also, you can create a virtual subclass by registering the class so you don't have to subclass from it (in this sense, you are following the GoF principle of "favoring composition over inheritance" by not tying yourself directly to the ABC).

Raymond Hettinger gives an excellent talk on ABCs in the collections module in his PyCon 2019 Talk.

Also, Alex Martelli called ABCs goose typing. You can subclass many of the classes in collections.abc, implement only a few methods, and have classes behave like the builtin Python protocols implemented with dunder methods.

The Python Typing Paradigm

Luciano Ramalho gives an excellent talk on this and its relationship to the typing ecosystem in his PyCon 2021 Talk.

Incorrect Approaches


@overload is designed to be used to mimic functional polymorphism.

  • Python does not natively support functional polymorphism (C++ and several other languages do).

    • If you def a function with multiple signatures, the last function def'd overrides (redefines) the previous ones.
def func(a: int, b: str, c: bool) -> None:
    print(f'{a}, {b}, {c}')

def func(a: int, b: bool) -> None:  # silently redefines the first func
    print(f'{a}, {b}')

if __name__ == '__main__':
    func(1, '2', True)  # TypeError: func() takes 2 positional arguments but 3 were given

Python mimics functional polymorphism with optional positional/keyword arguments (coincidentally, C++ does not support keyword arguments).

Overloads are to be used when:

  • (1) typing ported C/C++ polymorphic functions, or
  • (2) the return type must be kept consistent with the types used in a function call

Please see Adam Johnson's blog post "Python Type Hints - How to Use @overload".


(1) Ramalho, Luciano. Fluent Python (p. 320). O'Reilly Media. Kindle Edition.



def __init__(self, *args, **kwargs):  # type: ignore[no-untyped-def]


This is the answer given by Chris in the comments. I did not find consensus within 5 minutes of scanning the answers, and it was not that relevant for me to get the typing of this default Python syntax correct. Still, I do value mypy on my own code, so this was, timewise, an acceptable compromise for me. Perhaps it helps someone.
