TYP: Type annotations, part 4 #313
Conversation
Pull Request Overview
This pull request updates and standardizes type annotations throughout the array-api-compat codebase, ensuring that union types, keyword arguments, and collection type hints conform to the latest Python syntax and best practices.
- Changes include replacing deprecated type imports, updating union syntax (using the “|” operator), and adding explicit types for **kwargs across multiple modules (see the sketch after this list).
- Several backend-specific modules (torch, numpy, dask, cupy, and common) now consistently annotate parameters and return types.
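As a rough illustration of the kinds of changes listed above (the exact lines differ per module; the names below are invented for the example):

```python
from typing import Optional, Union  # older spellings being phased out


def before(x: Union[int, float], dtype: Optional[str] = None, **kwargs) -> None:
    """Pre-change style: typing.Union/Optional and untyped **kwargs."""


def after(x: int | float, dtype: str | None = None, **kwargs: object) -> None:
    """Post-change style: PEP 604 unions and an explicit type for **kwargs."""
```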
Reviewed Changes
Copilot reviewed 22 out of 22 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| array_api_compat/torch/fft.py | Updated import and kwargs annotation for FFT functions. |
| array_api_compat/torch/_aliases.py | Refined type annotations for promotion and result functions. |
| array_api_compat/numpy/* | Standardized type hints and error messages in linalg, _typing, _info. |
| array_api_compat/dask/array/* | Applied uniform type annotations and minor updates in error texts. |
| array_api_compat/cupy/* | Updated type annotations and internal alias handling. |
| array_api_compat/common/* | Adjusted typing in helper functions and internal modules. |
Comments suppressed due to low confidence (1)
array_api_compat/dask/array/linalg.py:56
- Typo in the error message: 'full_matrics' should be corrected to 'full_matrices'.
```python
raise ValueError("full_matrics=True is not supported by dask.")
```
```diff
     # TODO: Account for other backends.
     return isinstance(x, sparse.SparseArray)


-def is_array_api_obj(x: object) -> TypeIs[_ArrayApiObj]:  # pyright: ignore[reportUnknownParameterType]
+def is_array_api_obj(x: object) -> TypeGuard[_ArrayApiObj]:
```
Given the current definition of `_ArrayApiObj`, `TypeIs` would cause downstream failures for all unknown array-api-compliant libraries.
Doesn't `SupportsArrayNamespace` cover all downstream array types?
`SupportsArrayNamespace` just states that the object has an `__array_namespace__` method, nothing more. It's missing all the other methods and properties of an Array. I would much rather NOT write the full Array protocol here (I did it in array-api-extra and I regret it), as this is squarely in scope for array-api-types.
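For context, a protocol along these lines amounts to roughly the following (a hedged sketch; the actual definition in the codebase may be generic over the returned namespace and differ in details):

```python
from typing import Protocol


class SupportsArrayNamespace(Protocol):
    """Anything exposing an __array_namespace__ method matches this protocol."""

    def __array_namespace__(self, /, *, api_version: str | None = None) -> object: ...


# Matching this protocol says nothing about shape, dtype, __add__, etc.,
# which is why it is not a substitute for a full Array protocol.
```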
```diff
@@ -70,72 +69,11 @@ def shape(self, /) -> _T_co: ...
 DTypeKind: TypeAlias = _DTypeKind | tuple[_DTypeKind, ...]


 # `__array_namespace_info__.dtypes(kind="bool")`
```
holy overengineering Batman!
I was planning on making these generic on their dtype-types. Then, when combined with the array-api-namespace protocol, it becomes possible to statically type the individual dtypes of any array-api library. I don't see any other way to make that happen.
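A hedged sketch of that idea (names are illustrative, not the code under discussion): an info protocol that is generic over the library's dtype type, which a namespace protocol could then parameterize.

```python
from collections.abc import Mapping
from typing import Protocol, TypeVar

DTypeT_co = TypeVar("DTypeT_co", covariant=True)


class DTypesInfo(Protocol[DTypeT_co]):
    """Info object whose dtypes() mapping is typed per library."""

    def dtypes(self, *, kind: str | None = None) -> Mapping[str, DTypeT_co]: ...


# e.g. a NumPy namespace could expose DTypesInfo[np.dtype[Any]], a torch
# namespace DTypesInfo[torch.dtype], and so on.
```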
I do not see much benefit at all in breaking down which dict keys each of the methods of `dtypes()` can return. From both a maintainer and an end-user perspective, I can see very little to no gain for any work in addition to `-> Mapping[str, DType]`.
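In other words, the simpler shape being argued for would be something like the following sketch, where `DType` stands in for whatever dtype alias the module already uses:

```python
from collections.abc import Mapping
from typing import Any

DType = Any  # placeholder for the module's existing dtype alias


def dtypes(*, kind: str | tuple[str, ...] | None = None) -> Mapping[str, DType]:
    """Return the namespace's dtypes keyed by name, without encoding
    statically which keys each `kind` value selects."""
    ...
```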
```
    from numpy import _CopyMode  # noqa: F401
except ImportError:
    pass
from .linalg import matrix_transpose, vecdot  # type: ignore[no-redef]  # noqa: F401
```
reverts regression from #288
what was the issue?
```
**kwargs,
dtype: DType | None = None,
device: Device | None = None,
copy: py_bool | None = _copy_default,
```
This is plain wrong and requires a dedicated follow-up
Many (if not most) of the changes here revert work that I've done in #288. Why didn't you address those while it was still open? Then I could've explained my reasons for it, which would have saved us both a lot of time.
Seeing large portions of my work being deleted with comments like "holy overengineering Batman!" is pretty frustrating, to say the least, because this way you're literally wasting the time and effort I've put into it, without even knowing why I made those decisions.
array_api_compat/common/_aliases.py (outdated)
```diff
@@ -720,7 +722,7 @@ def iinfo(type_: DType | Array, /, xp: Namespace) -> Any:
     "finfo",
     "iinfo",
 ]
-_all_ignore = ["inspect", "array_namespace", "NamedTuple"]
+_all_ignore = ["is_cupy_namespace", "inspect", "array_namespace", "NamedTuple"]
```
Now that we have the `__dir__` functions, are these `_all_ignore`'s still needed?
Ah. I hadn't noticed `__dir__`. It's a nice idea, but it completely neuters `test_all`. I will revert the changes here and write a separate PR to address the issue.
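For reference, the `__dir__` mechanism referred to here is PEP 562's module-level hook; a minimal sketch with invented names:

```python
# some_module.py
_public = ["asarray", "astype", "finfo", "iinfo"]


def __dir__() -> list[str]:
    # dir(some_module) now reports only the curated names, so a test that
    # inspects dir() can no longer catch accidental re-exports on its own.
    return sorted(_public)
```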
```diff
-def _isscalar(a: object) -> TypeIs[int | float | None]:
-    return isinstance(a, (int, float, type(None)))
+def _isscalar(a: object) -> TypeIs[float | None]:
+    return isinstance(a, int | float | NoneType)
```
With

```diff
-    return isinstance(a, int | float | NoneType)
+    return a is None or isinstance(a, int | float)
```

we avoid the `types` import while simultaneously accentuating the violent dissonance between the Python runtime and its type-system, given that the sole purpose of a type-system is to accurately describe the runtime behavior...
Why do we avoid the `types` import? This is a runtime check.
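Either spelling works at runtime on Python 3.10+, where `types.NoneType` and `isinstance()` with `X | Y` unions are available; a small sketch of the equivalence, and of the int/float mismatch mentioned above:

```python
from types import NoneType


def check_with_nonetype(a: object) -> bool:
    return isinstance(a, int | float | NoneType)


def check_with_is_none(a: object) -> bool:
    return a is None or isinstance(a, int | float)


assert check_with_nonetype(1) and check_with_is_none(1)
assert check_with_nonetype(None) and check_with_is_none(None)
# The runtime/type-system dissonance: isinstance(1, float) is False, yet the
# type system accepts int wherever float is expected, which is why an
# annotation like TypeIs[float | None] also covers plain ints.
assert not isinstance(1, float)
```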
array_api_compat/common/_helpers.py (outdated)

```diff
 )

 _API_VERSIONS_OLD: Final = frozenset({"2021.12", "2022.12", "2023.12"})
 _API_VERSIONS: Final = _API_VERSIONS_OLD | frozenset({"2024.12"})


-def _is_jax_zero_gradient_array(x: object) -> TypeGuard[_ZeroGradientArray]:
+def _is_jax_zero_gradient_array(x: object) -> TypeIs[_ZeroGradientArray]:
```
The `TypeGuard` was intentional. Even if `x` is a `_ZeroGradientArray`, and therefore an `npt.NDArray[np.void]`, the function might still return `False`, in which case the `TypeIs` would narrow `x` to not be `npt.NDArray[np.void]`, whereas a `TypeGuard` wouldn't.
So it's better to revert this change (and the one below here).
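To illustrate the difference with a toy example (not the real `_ZeroGradientArray` check; `TypeIs` needs Python 3.13+ or `typing_extensions`):

```python
from typing_extensions import TypeGuard, TypeIs


def is_small_int_typeis(x: object) -> TypeIs[int]:
    return isinstance(x, int) and x < 10


def is_small_int_typeguard(x: object) -> TypeGuard[int]:
    return isinstance(x, int) and x < 10


def demo(x: int | str) -> None:
    if not is_small_int_typeis(x):
        # A checker narrows x to str here, even though x may be a large int
        # at runtime: the incorrect narrowing described above.
        print(x)
    if not is_small_int_typeguard(x):
        # No narrowing in the False branch: x stays int | str.
        print(x)
```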
reverting
```diff
         namespaces.add(jnp)
     elif is_pydata_sparse_array(x):
         if use_compat is True:
             _check_api_version(api_version)
             raise ValueError("`sparse` does not have an array-api-compat wrapper")
         else:
-            import sparse  # pyright: ignore[reportMissingTypeStubs]
+            import sparse
```
there's no need for the `else:` clause
I'd rather not enlarge the scope of this PR.
On a side note, this is one point where I disagree with the ruff policy: the else clause, while not functionally useful, makes the code flow more readable IMHO.
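For what it's worth, the two shapes under discussion look roughly like this (a readability sketch assuming the `sparse` package is importable, not the exact code in the PR):

```python
def without_else(use_compat: bool) -> None:
    # ruff-preferred style: no else after a branch that unconditionally raises
    if use_compat is True:
        raise ValueError("`sparse` does not have an array-api-compat wrapper")
    import sparse  # noqa: F401


def with_else(use_compat: bool) -> None:
    # the form kept here: the else keeps the two outcomes visually parallel
    if use_compat is True:
        raise ValueError("`sparse` does not have an array-api-compat wrapper")
    else:
        import sparse  # noqa: F401
```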
```diff
@@ -139,7 +141,7 @@ def default_dtypes(
         self,
         *,
         device: Device | None = None,
-    ) -> dict[str, dtype[intp | float64 | complex128]]:
+    ) -> DefaultDTypes:
```
This is a lot less precise, so why did you change it?
The ultra-precise annotation was giving neither end users nor maintainers any benefit.
Why did you define a TypeAlias if you're not using it?
I couldn't review that PR in time, apologies.
I apologise, that was an emotional comment and I should not have made it.
Follow-up to #288
Get mypy mostly green everywhere except torch.