
Type annotations for *args and **kwargs

I'm trying out Python's type annotations with abstract base classes to write some interfaces. Is there a way to annotate the possible types of *args and **kwargs?

For example, how would one express that the sensible arguments to a function are either an int or two ints? type(args) gives Tuple so my guess was to annotate the type as Union[Tuple[int, int], Tuple[int]], but this doesn't work.

from typing import Union, Tuple

def foo(*args: Union[Tuple[int, int], Tuple[int]]):
    try:
        i, j = args
        return i + j
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

# ok
print(foo((1,)))
print(foo((1, 2)))
# mypy does not like this
print(foo(1))
print(foo(1, 2))

Error messages from mypy:

t.py: note: In function "foo":
t.py:6: error: Unsupported operand types for + ("tuple" and "Union[Tuple[int, int], Tuple[int]]")
t.py: note: At top level:
t.py:12: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:14: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:15: error: Argument 1 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"
t.py:15: error: Argument 2 to "foo" has incompatible type "int"; expected "Union[Tuple[int, int], Tuple[int]]"

It makes sense that mypy doesn't like this for the function call because it expects there to be a tuple in the call itself. The addition after unpacking also gives a typing error that I don't understand.

How does one annotate the sensible types for *args and **kwargs?


Mark Amery

For variable positional arguments (*args) and variable keyword arguments (**kw) you only need to specify the expected type of a single such argument; the annotation applies to each value individually.

From the Arbitrary argument lists and default argument values section of the Type Hints PEP:

Arbitrary argument lists can as well be type annotated, so that the definition:

def foo(*args: str, **kwds: int): ...

is acceptable and it means that, e.g., all of the following represent function calls with valid types of arguments:

foo('a', 'b', 'c')
foo(x=1, y=2)
foo('', z=0)

So you'd want to specify your method like this:

def foo(*args: int):

However, if your function can only accept either one or two integer values, you should not use *args at all, use one explicit positional argument and a second keyword argument:

from typing import Optional

def foo(first: int, second: Optional[int] = None):

Now your function is actually limited to one or two arguments, and both must be integers if specified. *args always means 0 or more, and can't be limited by type hints to a more specific range.
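
A runnable sketch of that signature (the None default acts as the sentinel that tells you the second argument was omitted):

```python
from typing import Optional

def foo(first: int, second: Optional[int] = None) -> int:
    # second is None exactly when the caller passed only one argument
    if second is None:
        return first
    return first + second

print(foo(1))     # 1
print(foo(1, 2))  # 3
```

With this spelling, mypy rejects `foo()`, `foo(1, 2, 3)`, and `foo("a")` outright, which the *args version could never express.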


Just curious, why add the Optional? Did something change about Python or did you change your mind? Is it still not strictly necessary due to the None default?
@Praxeolitic yes, in practice the automatic, implied Optional annotation when you use None as a default value made certain use cases harder and that is now being removed from the PEP.
Here is a link discussing this for those interested. It certainly does sound like explicit Optional is going to be required in the future.
This is actually not supported for Callable: github.com/python/mypy/issues/5876
@ShitalShah: that’s not really what that issue is about. Callable doesn’t support any mention of a type hint for *args or **kwargs full stop. That specific issue is about marking up callables that accept specific arguments plus an arbitrary number of others, and so use *args: Any, **kwargs: Any, a very specific type hint for the two catch-alls. For cases where you set *args and / or **kwargs to something more specific you can use a Protocol.
chadrik

The proper way to do this is to use @overload:

from typing import overload

@overload
def foo(arg1: int, arg2: int) -> int:
    ...

@overload
def foo(arg: int) -> int:
    ...

def foo(*args):
    try:
        i, j = args
        return i + j
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

print(foo(1))
print(foo(1, 2))

Note that you do not add @overload or type annotations to the actual implementation, which must come last.

You'll need a newish version of both typing and mypy to get support for @overload outside of stub files.

You can also use this to vary the returned result in a way that makes explicit which argument types correspond with which return type. e.g.:

from typing import Tuple, overload

@overload
def foo(arg1: int, arg2: int) -> Tuple[int, int]:
    ...

@overload
def foo(arg: int) -> int:
    ...

def foo(*args):
    try:
        i, j = args
        return j, i
    except ValueError:
        assert len(args) == 1
        i = args[0]
        return i

print(foo(1))
print(foo(1, 2))

I like this answer because it addresses the more general case. Looking back, I should not have used (type1) vs (type1, type1) function calls as my example. Maybe (type1) vs (type2, type1) would have been a better example and shows why I like this answer. This also allows differing return types. However, in the special case where you only have one return type and your *args and **kwargs are all the same type, the technique in Martijn's answer makes more sense, so both answers are useful.
Using *args where there is a maximum number of arguments (2 here) is still wrong however.
So, yes, it's good to know about @overload, but it is the wrong tool for this specific job.
*args is really there for zero or more, uncapped, homogeneous arguments, or for 'passing these along untouched' catch-alls. You have one required argument and one optional. That's totally different and is normally handled by giving the second argument a sentinel default value to detect that it was omitted.
After looking at the PEP, this clearly isn't the intended use of @overload. While this answer shows an interesting way to individually annotate the types of *args, an even better answer to the question is that this isn't something that should be done at all.
Cesar Canassa

Not really supported yet

While you can annotate variadic arguments with a type, I don't find it very useful because it assumes that all arguments are of the same type.

The proper type annotation of *args and **kwargs that allows specifying each variadic argument separately is not supported by mypy yet. There is a proposal for adding an Expand helper to the mypy_extensions module; it would work like this:

class Options(TypedDict):
    timeout: int
    alternative: str
    on_error: Callable[[int], None]
    on_timeout: Callable[[], None]
    ...

def fun(x: int, **options: Expand[Options]) -> None:
    ...

The GitHub issue was opened in January 2018 and is still open. Note that while the issue is about **kwargs, the Expand syntax would likely be used for *args as well.
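
Type checkers have since started adopting an Unpack operator for this (PEP 692; Unpack is in typing from Python 3.11 and in typing_extensions before that). A sketch, assuming a checker that implements PEP 692 — the TYPE_CHECKING guard keeps the snippet runnable on older Pythons where Unpack is unavailable at runtime:

```python
from __future__ import annotations

from typing import TYPE_CHECKING, TypedDict

if TYPE_CHECKING:
    # Only needed by the type checker; never imported at runtime
    from typing import Unpack

class Options(TypedDict, total=False):
    timeout: int
    retries: int

def connect(host: str, **options: Unpack[Options]) -> dict:
    # A PEP 692-aware checker knows options may only contain
    # the keys declared in Options, each with its declared type
    return {"host": host, **options}

print(connect("example.com", timeout=10))
```

With this spelling, `connect("example.com", timeout="10")` or `connect("example.com", retrys=3)` would be flagged, which a plain `**options: int` annotation cannot do.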


According to github.com/microsoft/pyright/issues/… the new syntax is **options: Unpack[Options] and works in Pylance (but not yet mypy)
Great. If the answer is: # type: ignore[no-untyped-def], then that is the answer!
@Chris IMO this is the only current answer in this thread and one of the most useful I've found on the python-typing tag.
Michael0x2a

As a short addition to the previous answer: if you're trying to use mypy on Python 2 files and need to use comments to add types instead of annotations, you need to prefix the types for args and kwargs with * and ** respectively:

def foo(param, *args, **kwargs):
    # type: (bool, *str, **int) -> None
    pass

This is treated by mypy as being the same as the below, Python 3.5 version of foo:

def foo(param: bool, *args: str, **kwargs: int) -> None:
    pass

monkut

In some cases the content of **kwargs can be a variety of types.

This seems to work for me:

from typing import Any

def testfunc(**kwargs: Any) -> None:
    print(kwargs)

or

from typing import Any, Optional

def testfunc(**kwargs: Optional[Any]) -> None:
    print(kwargs)

In the case where you feel the need to constrain the types in **kwargs, I suggest creating a struct-like object and adding the typing there. This can be done with dataclasses or pydantic.

from dataclasses import dataclass

@dataclass
class MyTypedKwargs:
   expected_variable: str
   other_expected_variable: int


def testfunc(expectedargs: MyTypedKwargs) -> None:
    pass

This essentially disables type checking, doesn't it? That's like leaving out the annotation for kwargs altogether.
**kwargs is by design and technically can be anything. If you know what you're getting, I suggest defining it as a typed argument. The advantage is that for cases where using **kwargs is acceptable/expected, IDEs/tools like PyCharm won't give you a notification that the type is incorrect.
I partially disagree. I think there are situations where it's reasonable to constrain types for **kwargs or *args. But I also see that type checking and **kwargs don't go together very well (at least for current Python versions). Maybe you want to add this to your answer to better address the OP's question.
Yeah, there may be a use case for typing kwargs, but I would lean toward making your inputs clearer instead of lumping them into kwargs.
spacether

If one wants to describe specific named arguments expected in kwargs, one can instead pass in a TypedDict (which defines required and optional parameters). The optional parameters take the place of the kwargs. Note: TypedDict requires Python >= 3.8. See this example:

import typing

class RequiredProps(typing.TypedDict):
    # all of these must be present
    a: int
    b: str

class OptionalProps(typing.TypedDict, total=False):
    # these can be included or they can be omitted
    c: int
    d: int

class ReqAndOptional(RequiredProps, OptionalProps):
    pass

def hi(req_and_optional: ReqAndOptional):
    print(req_and_optional)
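
A usage sketch (re-declaring the classes above so the snippet runs standalone); mypy rejects calls that omit a or b, or that pass keys outside the four declared:

```python
import typing

class RequiredProps(typing.TypedDict):
    # all of these must be present
    a: int
    b: str

class OptionalProps(typing.TypedDict, total=False):
    # these can be included or they can be omitted
    c: int
    d: int

class ReqAndOptional(RequiredProps, OptionalProps):
    pass

def hi(req_and_optional: ReqAndOptional) -> ReqAndOptional:
    return req_and_optional

hi({"a": 1, "b": "hello"})          # ok: c and d may be omitted
hi({"a": 1, "b": "hello", "c": 2})  # ok: c is optional
# hi({"a": 1})                      # rejected by mypy: missing required key "b"
```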

sometimes you get so wrapped up in one way of doing something you forget the simple way. thanks for this.