Where to host extension(s) #2735

This issue was moved to a discussion; the conversation continues there.

Comments
My initial reaction was: why is your extension depending on CUDA at all? It should only need Adapt.jl. Looking at your code, the issue seems to be that you want to define a `BroadcastStyle` that promotes to CUDA... I wonder if we could have an "Adapt broadcast style". The goal of Adapt is to not require a GPU dependency.
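For context, the Adapt.jl mechanism mentioned here usually amounts to a single recursive conversion rule. A minimal sketch for a hypothetical wrapper type (`Wrapper` is illustrative, not taken from ShiftedArrays.jl):

```julia
using Adapt

# Hypothetical wrapper array type, standing in for a lazy wrapper such as
# those in ShiftedArrays.jl.
struct Wrapper{T,N,A<:AbstractArray{T,N}} <: AbstractArray{T,N}
    parent::A
end
Base.size(w::Wrapper) = size(w.parent)
Base.getindex(w::Wrapper, I...) = w.parent[I...]

# The Adapt.jl rule: convert the wrapped storage and rewrap, so that
# adapt(CuArray, w) moves the parent to the GPU without this package
# depending on CUDA.jl itself.
Adapt.adapt_structure(to, w::Wrapper) = Wrapper(adapt(to, w.parent))
```

With CUDA.jl loaded, `adapt(CuArray, Wrapper(rand(4)))` would then return a `Wrapper` around a `CuArray`.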
Yes, you are right. The …
I'm not a fan of that, no. That code will then either go untested, or require us to maintain it (while being unfamiliar with it), both of which are not ideal. The Enzyme.jl extension has demonstrated in the past that this is problematic. I do agree that it would be nice if these extensions were back-end agnostic (e.g., depending instead on GPUArrays for the AbstractGPUArray type), but that would require somebody to delve into the broadcast code again...
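For illustration, a back-end-agnostic variant of such a promotion rule could dispatch on GPUArrays' abstract broadcast style rather than on CUDA's concrete one. A rough sketch, reusing the hypothetical `Wrapper` type from the earlier sketch (whether this avoids method ambiguities in a real package is exactly what would need testing):

```julia
using GPUArrays: AbstractGPUArrayStyle
import Base.Broadcast: BroadcastStyle, ArrayStyle

# Hypothetical wrapper type, as in the earlier sketch.
struct Wrapper{T,N,A<:AbstractArray{T,N}} <: AbstractArray{T,N}
    parent::A
end
Base.size(w::Wrapper) = size(w.parent)
Base.getindex(w::Wrapper, I...) = w.parent[I...]

# Give the wrapper its own broadcast style...
BroadcastStyle(::Type{<:Wrapper}) = ArrayStyle{Wrapper}()

# ...and promote any broadcast mixing the wrapper with a GPU array to the
# GPU back-end's style. Dispatching on the abstract AbstractGPUArrayStyle
# covers CUDA, AMDGPU, etc. without a hard dependency on any single back end.
BroadcastStyle(::ArrayStyle{Wrapper}, gpu::AbstractGPUArrayStyle) = gpu
```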
Thanks, I can understand your concerns. The current runtests.jl does perform extensive testing, also on CUDA if supported by the system, but I can see that maintenance is an issue.
Regarding …
I tried out my own suggestions over here: JuliaArrays/OffsetArrays.jl#378
Thanks for the good suggestion. I did not know about the …
To make existing packages CUDA-capable, one can use the `Adapt` mechanism and write an extension. This extension can either reside in the supporting package (e.g. `ShiftedArrays.jl`) or possibly in the `ext` folder of the `CUDA.jl` package. As for `ShiftedArrays.jl`, there were PRs more than a year ago trying to add such CUDA support, but they never led to a merge and a new release: JuliaArrays/ShiftedArrays.jl#67

One could also construct a package containing only the extension, but this seems to be a clear example of type piracy, i.e. not the right way to go.

The question is therefore whether I should move the file in `ext` from this PR, JuliaArrays/ShiftedArrays.jl#70, to the `ext` folder of `CUDA.jl` and make a PR here. Similarly, one could do so for the `FourierTools.jl` package: bionanoimaging/FourierTools.jl#56

Is this an option? It would be great to get this finally done.
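For reference, the package-extension mechanism in question is wired up via `[weakdeps]` and `[extensions]` entries in the host package's Project.toml. A minimal sketch with illustrative names (`MyPackage` / `MyPackageCUDAExt` are placeholders, not the actual contents of the PRs above):

```julia
# ext/MyPackageCUDAExt.jl -- a minimal glue extension (illustrative names).
# It is loaded automatically once both the host package and CUDA.jl are in
# the environment, provided the host package's Project.toml declares:
#
#   [weakdeps]
#   CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
#
#   [extensions]
#   MyPackageCUDAExt = "CUDA"
module MyPackageCUDAExt

using MyPackage   # hypothetical host package, e.g. a ShiftedArrays-like one
using CUDA

# CUDA-specific methods live here, e.g. broadcast-style promotion rules
# for the host package's wrapper types.

end
```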