
Add python 3.13, Remove python 3.8 #345

Merged
merged 24 commits into linkml:main on Mar 27, 2025

Conversation

sneakers-the-rat
Contributor

@sneakers-the-rat commented Oct 9, 2024

upstream_repo: sneakers-the-rat/linkml
upstream_branch: python-3.13

Sibling of: linkml/linkml#2358
Fix: linkml/linkml#2370
Fix: linkml/linkml#2378

Does what it says on the tin.

Currently linkml/linkml#2358 is failing because there's a from typing import re statement in here, so I also removed that.

I added the ruff upgrade rules and one to check for unused imports just to automate the update, and figured why the heck not leave them in; I would actually really like to at least isort stuff in here, if not give it the same linting rules as upstream.

Other stuff

  • Removed the monkeypatch that allowed arbitrary kwargs in dataclasses. If we want to keep this, someone else will have to re-implement it; the basis for doing it was removed in Python 3.13, and the thing it was supposed to do (report line numbers from instantiation errors) seems better done at YAML loading time than at object instantiation time, since the only time it would work anyway is after the YAML has been loaded with the loader that would be able to report the error in the first place.
  • The tox files have never worked for me with poetry, and it doesn't look like they've been used in a long time anyway (they still use whitelist_externals rather than allowlist_externals), so I fixed them to work with a normal standards-compliant installation rather than wrestling with accursed poetry, but I can take that out if it's not wanted.
  • Added an __all__ to linkml_runtime.linkml_model because importing '*' from it makes linters complain.
  • Fixed pydantic deprecation warnings like dict -> model_dump and parse_obj -> model_validate, which I swear I have done like 20 times but it keeps showing up somehow (see the sketch after this list).
  • Added a __test__ = False flag to TestEnvironment so pytest doesn't try to treat it like a test.
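
A minimal sketch of those last two items (hypothetical Person model and a stand-in for the real TestEnvironment, not actual linkml-runtime code):

from pydantic import BaseModel

class Person(BaseModel):
    name: str

p = Person(name="Alice")

# Deprecated Pydantic v1 names and their v2 replacements:
#   p.dict()              -> p.model_dump()
#   Person.parse_obj(...) -> Person.model_validate(...)
data = p.model_dump()                        # was: p.dict()
p2 = Person.model_validate({"name": "Bob"})  # was: Person.parse_obj({...})

class TestEnvironment:
    # pytest collects classes whose names start with "Test" from test modules;
    # setting __test__ = False opts this helper class out of collection.
    __test__ = False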

@vincentkelleher

Just ran the tests on my local environment and things look fine 👍


Maybe you could add "Programming Language :: Python :: 3.13" to the pyproject.toml file?


@sneakers-the-rat
Contributor Author

sneakers-the-rat commented Jan 24, 2025

Done.

Can this get merged? Would be nice to support python 3.13 and not need to keep rebasing/deconflicting this and the upstream PR

@vincentkelleher

@sneakers-the-rat I've approved this PR but I don't have the rights to merge it.

@ptgolden

@amc-corey-cox ran into this recently when trying to run schema automator under Python 3.13

@sneakers-the-rat I almost made an issue pretty similar to linkml/linkml#2370, but noticed you'd already done the lifting! Thanks

@sierra-moxon
Member

@sneakers-the-rat @ptgolden - can you please fix the failing tests and then carry this through to the finish line?

@ptgolden

The test failures are because upstream linkml still advertises support for python 3.8. It seems that those failures will continue until linkml/linkml#2358 is merged.

@sierra-moxon
Member

Unfortunately, I don't think we want to merge the upstream one until it has passing tests, or at least a comment on why they can't/won't be passable. But I trust @sneakers-the-rat is already fixing those and we'll have this out in no time. Alternatively, if you have some cycles @ptgolden and you want to fix the forked PR upstream, that would be wonderful.

@ptgolden

This should be the first of the two PRs to be merged. When a new version is tagged here for a release that supports 3.13, that new version can be pinned in the linkml PR for updating to 3.13.

I think the fix here is to change the test_upstream action so that we can install the local version of linkml-runtime even if an incompatible python version is declared. That might mean changing these steps:

# we are not using linkml-runtime's lockfile, but simulating what will happen
# when we merge this and update linkml's lockfile
- name: add linkml-runtime to lockfile
  working-directory: linkml
  run: poetry add ../linkml-runtime
# note that we run the installation step always, even if we restore a venv,
# the cache will restore the old version of linkml-runtime, but the lockfile
# will only store the directory dependency (and thus will reinstall it)
# the cache will still speedup the rest of the installation
- name: install linkml
  working-directory: linkml
  run: poetry install --no-interaction -E tests

To run something like:

# in linkml directory
poetry install --no-interaction -E tests
poetry run pip install ../linkml-runtime

This would mean that there would be a Poetry lockfile that doesn't represent the state of the virtual environment, but it would allow installing the local version of linkml-runtime when poetry add would reject it (because of incompatible python versions).

I'd push a commit here but I'll wait to hear thoughts from @sneakers-the-rat

@ptgolden

ptgolden commented Jan 31, 2025

To be clear, the dilemma here was caused by two things (both totally understandable on their own):

  1. A reverse dependency on the state of upstream linkml here because of the test_upstream action
  2. Dropping support for 3.8 at the same time as adding support for 3.13 (and in the corresponding linkml PR)

Because upstream linkml currently requests 3.8 to 3.12, Poetry cannot add this branch as a dependency: the branch no longer supports 3.8.

Upstream linkml can't slide the window from 3.8 to 3.12 -> 3.9 to 3.13 because the current linkml-runtime release does not support 3.13.

(that cycles back and forth forever)

In retrospect, the order could have been:

  1. Support 3.13 in linkml-runtime
  2. Support 3.13 and drop support for 3.8 in linkml
  3. Drop support for 3.8 in linkml-runtime

The solution I proposed above is just a kludge to break out of the cyclical trap these two PRs are in, while still being able to run the test_upstream action.

@sierra-moxon
Member

from dev call:

  • check in with @sneakers-the-rat for feedback
  • would like to consider closing this linkml-runtime PR in favor of a smaller one that just expands support for 3.13, then merging the linkml PR etc... following @ptgolden's suggestion above:
    1. Support 3.13 in linkml-runtime
    2. Support 3.13 and drop support for 3.8 in linkml
    3. Drop support for 3.8 in linkml-runtime

@sneakers-the-rat
Contributor Author

I mean, it's really not a big deal; we just need to merge at the same time. But sure, if y'all want to redo this, be my guest I guess.

@sneakers-the-rat
Contributor Author

# in linkml directory
poetry install --no-interaction -E tests
poetry run pip install ../linkml-runtime

This is how it used to work, but the fact that the current version also tests compatibility of dependencies is a feature, not a bug. It could be two steps, checking for dep compatibility and running tests, but it's like that for a reason.

@dalito
Member

dalito commented Feb 1, 2025

@sierra-moxon Maybe important: who is expected to merge?

I have the permissions to do it, but I don't really know where to use that power without making anyone upset. See also https://github.com/orgs/linkml/discussions/2518

@sneakers-the-rat
Contributor Author

Support 3.13 in linkml-runtime
Support 3.13 and drop support for 3.8 in linkml
Drop support for 3.8 in linkml-runtime

This is not really possible, since the major thing these PRs do is update the code to remove 3.8 features (typing.Dict et al). If we really want to see green checks, we could fake it and leave the linkml-runtime lower bound at 3.8 (while still not testing on 3.8, because those tests would fail), merge both PRs, and then follow up with a trivial PR that just boosts the floor. Alternatively, we could do this in 4 PRs where we support 3.8-3.13 in both, then remove 3.8 in both, but imo it's simpler to just "merge the work that has already been done at the same time" than "make 4 new PRs".
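
For context, a hypothetical before/after showing the kind of change meant by "remove 3.8 features": PEP 585 builtin generics replacing the typing aliases (illustrative only, not code from either PR):

from typing import Dict, List  # aliases required on Python 3.8

def count_nonzero_old(counts: Dict[str, int]) -> List[str]:
    return [k for k, v in counts.items() if v > 0]

# On Python 3.9+ the builtin generics work directly in annotations,
# so the typing.Dict / typing.List imports can be dropped.
def count_nonzero_new(counts: dict[str, int]) -> list[str]:
    return [k for k, v in counts.items() if v > 0]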

@dalito
Member

dalito commented Feb 1, 2025

You could remove 3.8 first and then add 3.13 (so the opposite order). Maybe next time for 3.9 / 3.14 😉

@amc-corey-cox
Contributor

Hey @sneakers-the-rat, thanks for all of your hard work on this. @ptgolden helped walk me through it and we're making progress on getting the last few things wrapped up. I will be shepherding a release of shexc through as quickly as I can, then I'll get these issues fixed and pushed to your linkml 3.13 branch. That should resolve all of the test failures and we'll get this merged as soon as possible.

These are the test failures we need to fix in the linkml 3.13 branch, but we'll hold off on those until we have the shexc thing fixed.

FAILED tests/test_generators/test_pydanticgen.py::test_arrays_anyshape_json_schema[int-expected1] - AssertionError: assert {'items': {'$...ype': 'array'} == {'items': {'$...ype': 'array'}
  Omitting 1 identical items, use -vv to show
  Differing items:
  {'items': {'$ref': '#/$defs/AnyShapeArray___T_'}} != {'items': {'$ref': '#/$defs/AnyShapeArray_int_'}}
  Full diff:
  - {'items': {'$ref': '#/$defs/AnyShapeArray_int_'}, 'type': 'array'}
  ?                                           ^^^
  + {'items': {'$ref': '#/$defs/AnyShapeArray___T_'}, 'type': 'array'}
  ?

@ptgolden

To be clear, this failing test happens in the upstream linkml repo (https://github.com/sneakers-the-rat/linkml).

For some reason, Pydantic's JSON Schema gen acts weird in 3.9 and 3.10 when using collections.abc.Iterable instead of typing.Iterable in linkml/generators/pydanticgen/array.py.

@ptgolden

Putting a couple print statements in this test:

def test_arrays_anyshape_json_schema(dtype, expected):
    class MyModel(BaseModel):
        array: AnyShapeArray[dtype]
        dummy: Optional[AnyShapeArray[str]] = None

    schema = MyModel.model_json_schema()
    array_ref = schema["properties"]["array"]["$ref"].split("/")[-1]
    print(array_ref)
    print(json.dumps(schema["$defs"], indent=2))

Here's output where dtype is int and expected is [{"type": "integer"}]:

  1. Using collections.abc.Iterable:
AnyShapeArray_int_

{
  "AnyShapeArray___T_": {
    "items": {
      "anyOf": [
        {},
        {
          "items": {
            "$ref": "#/$defs/AnyShapeArray___T_"
          },
          "type": "array"
        }
      ]
    },
    "type": "array"
  },
  "AnyShapeArray_int_": {
    "items": {
      "anyOf": [
        {
          "type": "integer"
        },
        {
          "items": {
            "$ref": "#/$defs/AnyShapeArray___T_"
          },
          "type": "array"
        }
      ]
    },
    "type": "array"
  },
  "AnyShapeArray_str_": {
    "items": {
      "anyOf": [
        {
          "type": "string"
        },
        {
          "items": {
            "$ref": "#/$defs/AnyShapeArray___T_"
          },
          "type": "array"
        }
      ]
    },
    "type": "array"
  }
}
  2. Using typing.Iterable:
{
  "AnyShapeArray_int_": {
    "items": {
      "anyOf": [
        {
          "type": "integer"
        },
        {
          "items": {
            "$ref": "#/$defs/AnyShapeArray_int_"
          },
          "type": "array"
        }
      ]
    },
    "type": "array"
  },
  "AnyShapeArray_str_": {
    "items": {
      "anyOf": [
        {
          "type": "string"
        },
        {
          "items": {
            "$ref": "#/$defs/AnyShapeArray_str_"
          },
          "type": "array"
        }
      ]
    },
    "type": "array"
  }
}

@ptgolden

I'm trying to track down exactly why it's happening. Obviously it's something about Pydantic not being able to resolve forward references.

...but it might just be easier, for the sake of getting this and the linkml PR merged, to switch back to using typing.Iterable, finally merge both PRs, and then track down the issue.
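
Roughly, that workaround is a one-line import change in array.py (sketch only; the exact annotation in the real module may differ):

# from collections.abc import Iterable  # schema generation misbehaves on 3.9/3.10
from typing import Iterable              # generic $refs resolve as expected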

@ptgolden

(I should mention that the output in #345 (comment) was on python 3.9)

@sneakers-the-rat
Contributor Author

sneakers-the-rat commented Mar 26, 2025

Can someone with push privs trigger a workflow dispatch for upstream tests after linkml/linkml@7ba9b72?

I could do it with a no-op commit but that's messy.

Edit: also, would it be possible to get commit privs here (assuming there are branch protections that would prevent me from pushing to main) so I can do this myself? It happens regularly enough that I figure I'm annoying whoever has to push the button.

Edit 2: nvm, I figure we're all tired of waiting for the several months it has taken to call twine on pyshexc; I'm just going to monkeypatch it so we can be done with this.

@sneakers-the-rat
Contributor Author

cool so let's merge while it's still working

@Silvanoc
Contributor

cool so let's merge while it's still working

I have the rights to do it... But since my reason for opening discussion 2518 remains, I don't think I should, and I don't know who should 🤷

Member

@dalito left a comment


I also checked locally (on Win10/11) that all tests for Python 3.9-3.13 pass with linkml from the corresponding PR, linkml/linkml@c130374.

It's one of those rare moments when the stars align favorably for a merge. ⭐ 🌟 🤩

@sierra-moxon merged commit c35816f into linkml:main on Mar 27, 2025
13 checks passed
@dalito
Member

dalito commented Mar 27, 2025

Thanks @sierra-moxon ! 🚀

@cmungall
Member

cmungall commented Mar 30, 2025

This breaks downstream code. I have yanked the 1.9.0 runtime release.

Specifically it's the removal of dataclass_extensions_376.py

de47a546f0972078965fad3127051cccbf4246ba152cc62f649cbc7ecc9412fc

This breaks a very large number of codebases, who will need to pin runtime <1.9.0, until all their upstreams are fixed.

Here is what needs to happen:

  1. restore dataclass_extensions_376.py
  2. create a new rc
  3. test the rc on actual downstream code that uses linkml

Step 3 is incredibly important for changes to the runtime that can impact existing code.

After that we can change pythongen to stop generating code that depends on dataclass_extensions_376.

@Silvanoc
Contributor

@cmungall do we have tests for downstream code? How important is it not to break downstream code? And which downstream code should not break?

@sierra-moxon
Member

sierra-moxon added a commit that referenced this pull request Apr 4, 2025