
Add yet another CI for numpy 2.1 #9540

Merged: 1 commit into pydata:main on Sep 25, 2024

Conversation

@hmaarrfk (Contributor)

In response to help with: #9403 (comment)

  • Closes #xxxx
  • Tests added
  • User visible changes (including notable bug fixes) are documented in whats-new.rst
  • New functions/methods are listed in api.rst

@dcherian (Contributor)

Thanks!

Perhaps we should go the other way and have numba, sparse, numbagg only in one env?
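
For illustration, a minimal sketch of what such a "latest numpy" environment file could look like, assuming a conda-forge based setup; the file name, package list, and pins below are hypothetical and are not the ones actually merged in this PR:

```yaml
# Hypothetical CI environment that keeps numpy current by omitting the
# packages that (at the time of this PR) still pinned numpy below 2.1.
name: xarray-tests-numpy21
channels:
  - conda-forge
dependencies:
  - python=3.12
  - numpy>=2.1       # keep numpy "super modern"
  - pandas
  - scipy
  - dask
  # numba, sparse and numbagg are deliberately left out here and kept in a
  # separate environment, so the solver cannot drag numpy back down.
```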

@hmaarrfk (Contributor, Author)

Hmm, as you wish, but my attempt pulled in sparse 0.3.1.

So I guess I should remove sparse from my test environment too.

@hmaarrfk (Contributor, Author)

PS: I also opened conda-forge/sparse-feedstock#25.

@hmaarrfk marked this pull request as ready for review on September 24, 2024 at 05:35
@max-sixty (Collaborator)

I'll merge so we can have this running.

But should we adjust this to always have the latest numpy? The policy would be to eschew any package (like numba) which is incompatible with the latest numpy.

@max-sixty merged commit 378b4ac into pydata:main on Sep 25, 2024 (29 checks passed)
@hmaarrfk (Contributor, Author)

> But should we adjust this to always have the latest numpy?

This seems complicated to me. You can try, but it isn't too much churn to check this from time to time.

I think we are just going through some long-needed numpy cleanup pains. Happy that this is happening!

@max-sixty (Collaborator)

> This seems complicated to me. You can try, but it isn't too much churn to check this from time to time.

I don't know conda well at all, but if we just exclude a list of packages which pin numpy, won't that suffice? It's conditional on a package not newly pinning numpy but should otherwise work?

@hmaarrfk (Contributor, Author)

> but if we just exclude a list of packages which pin numpy,

Many bugs seem to come from various interactions. So the more you exclude, the less you test....

xarray builds on a lot of the scientific stack, so you can't be too "modern".

I think if you just remember to bump the lower bound of numpy every once in a while, you'll get a semi-modern environment.

@hmaarrfk (Contributor, Author) commented Sep 26, 2024

Selfishly, I would just recommend setting your lower bounds, other than numpy, to whatever the output of

nep29 PACKAGE_NAME

gives you.

https://pypi.org/project/nep29/

Edit: added "other than numpy"; leave numpy itself super modern.
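
For illustration, a hypothetical excerpt showing how such lower bounds might appear in a conda environment file; the version numbers below are placeholders rather than actual nep29 output, and would be replaced with whatever `nep29 PACKAGE_NAME` reports at the time of the update:

```yaml
# Hypothetical lower bounds: every dependency except numpy gets a NEP 29
# style floor, while numpy itself is kept as new as possible.
dependencies:
  - numpy>=2.1            # "super modern" numpy
  - pandas>=2.0           # placeholder lower bound
  - scipy>=1.11           # placeholder lower bound
  - matplotlib-base>=3.7  # placeholder lower bound
```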

@max-sixty (Collaborator)

> Many bugs seem to come from various interactions. So the more you exclude, the less you test....

Sure, sure, but IIUC there are a few specific packages which pin numpy (e.g. numba), so if we exclude those then we will get the latest numpy. No?

That was my understanding of the gap here: in our existing CI we basically have either a) minimum deps or b) everything, but we were lacking a "latest numpy" environment because of that pinning (outside of upstream-dev).

(Of course it's possible for a package to suddenly start pinning numpy, but that seems like less maintenance than manually upgrading numpy, which the currently merged code requires...)

@hmaarrfk (Contributor, Author)

> so if we exclude those then we will get the latest numpy. No?

Yes. You "hope" to get the latest numpy, but sometimes the solver does strange things.

Having the >=x.y.z lower bound helps with confidence.

This CI stuff is a never-ending game of whack-a-mole.
