Interfacing with KernelAbstractions #395

Open
roflmaostc opened this issue Nov 28, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@roflmaostc

Hi,

the following issue occurred with code using KernelAbstractions.jl.

I could probably reduce it to a minimal KernelAbstractions example (a sketch of what that might look like is included after the environment status below), but even this fails:

julia> using DifferentiationInterface, Mooncake

julia> using RadonKA

julia> x = rand(32, 32, 1);

julia> angles = range(0, 2π, 100);

julia> radon(x, angles);

julia> f(x) = sum(radon(x, angles));

julia> f(x)
35091.5054103222

julia> value_and_gradient(f, AutoMooncake(config=nothing), x)
ERROR: No rrule!! available for foreigncall with primal argument types Tuple{Val{:jl_new_task}, Val{Ref{Task}}, Tuple{Val{Any}, Val{Any}, Val{Int64}}, Val{0}, Val{:ccall}, KernelAbstractions.var"#20#23"{KernelAbstractions.Kernel{KernelAbstractions.CPU, KernelAbstractions.NDIteration.DynamicSize, KernelAbstractions.NDIteration.DynamicSize, typeof(RadonKA.cpu_radon_kernel!)}, Tuple{Int64, Int64, Int64}, KernelAbstractions.NDIteration.NDRange{3, KernelAbstractions.NDIteration.DynamicSize, KernelAbstractions.NDIteration.DynamicSize, CartesianIndices{3, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}, Base.OneTo{Int64}}}, CartesianIndices{3, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}, Base.OneTo{Int64}}}}, Tuple{Array{Float64, 3}, Array{Float64, 3}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Float64, Float64, RadonKA.var"#5#6"{Float64}, Nothing}, KernelAbstractions.NDIteration.DynamicCheck, Int64}, Base.GenericCondition{Base.Threads.SpinLock}, Int64, Int64}. This problem has most likely arisen because there is a ccall somewhere in the function you are trying to differentiate, for which an rrule!! has not been explicitly written.You have three options: write an rrule!! for this foreigncall, write an rrule!! for a Julia function that calls this foreigncall, or re-write your code to avoid this foreigncall entirely. If you believe that this error has arisen for some other reason than the above, or the above does not help you to workaround this problem, please open an issue.
Stacktrace:
  [1] rrule!!(::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/rrules/foreigncall.jl:12
  [2] __run
    @ ~/.julia/packages/KernelAbstractions/iW1Rw/src/cpu.jl:98 [inlined]
  [3] (::Tuple{…})(none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…})
    @ Base.Experimental ./<missing>:0
  [4] (::Mooncake.DerivedRule{…})(::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:882
  [5] (::Mooncake.DynamicDerivedRule{…})(::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1669
  [6] #_#16
    @ ~/.julia/packages/KernelAbstractions/iW1Rw/src/cpu.jl:40 [inlined]
  [7] (::Tuple{…})(none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual)
    @ Base.Experimental ./<missing>:0
  [8] DerivedRule
    @ ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:882 [inlined]
  [9] _build_rule!(rule::Mooncake.LazyDerivedRule{…}, args::Tuple{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1748
 [10] LazyDerivedRule
    @ ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1708 [inlined]
 [11] _radon
    @ ~/.julia/packages/RadonKA/FXHjL/src/radon.jl:136 [inlined]
 [12] (::Tuple{…})(none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…})
    @ Base.Experimental ./<missing>:0
 [13] DerivedRule
    @ ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:882 [inlined]
 [14] _build_rule!(rule::Mooncake.LazyDerivedRule{…}, args::Tuple{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1748
 [15] LazyDerivedRule
    @ ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1708 [inlined]
 [16] radon
    @ ~/.julia/packages/RadonKA/FXHjL/src/radon.jl:123 [inlined]
 [17] (::Tuple{…})(none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…})
    @ Base.Experimental ./<missing>:0
 [18] (::Mooncake.DerivedRule{…})(::Mooncake.CoDual{…}, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:882
 [19] (::Mooncake.DynamicDerivedRule{Dict{…}})(::Mooncake.CoDual{typeof(radon), NoFData}, ::Mooncake.CoDual{Array{…}, Array{…}}, ::Mooncake.CoDual{StepRangeLen{…}, NoFData})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:1669
 [20] f
    @ ./REPL[61]:1 [inlined]
 [21] (::Tuple{…})(none::Mooncake.CoDual{…}, none::Mooncake.CoDual{…})
    @ Base.Experimental ./<missing>:0
 [22] DerivedRule
    @ ~/.julia/packages/Mooncake/TQO5a/src/interpreter/s2s_reverse_mode_ad.jl:882 [inlined]
 [23] __value_and_pullback!!(::Mooncake.DerivedRule{…}, ::Float64, ::Mooncake.CoDual{…}, ::Mooncake.CoDual{…})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interface.jl:14
 [24] value_and_pullback!!(::Mooncake.DerivedRule{Tuple{…}, MistyClosures.MistyClosure{…}, Mooncake.Pullback{…}, Val{…}, Val{…}}, ::Float64, ::Function, ::Array{Float64, 3})
    @ Mooncake ~/.julia/packages/Mooncake/TQO5a/src/interface.jl:119
 [25] value_and_pullback(::Function, ::DifferentiationInterfaceMooncakeExt.MooncakeOneArgPullbackPrep{…}, ::AutoMooncake{…}, ::Array{…}, ::Tuple{…})
    @ DifferentiationInterfaceMooncakeExt ~/.julia/packages/DifferentiationInterface/gSdHF/ext/DifferentiationInterfaceMooncakeExt/onearg.jl:36
 [26] prepare_pullback(::Function, ::AutoMooncake{Nothing}, ::Array{Float64, 3}, ::Tuple{Bool})
    @ DifferentiationInterfaceMooncakeExt ~/.julia/packages/DifferentiationInterface/gSdHF/ext/DifferentiationInterfaceMooncakeExt/onearg.jl:22
 [27] prepare_gradient(::typeof(f), ::AutoMooncake{Nothing}, ::Array{Float64, 3})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/gSdHF/src/first_order/gradient.jl:70
 [28] value_and_gradient(::typeof(f), ::AutoMooncake{Nothing}, ::Array{Float64, 3})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/gSdHF/src/fallbacks/no_prep.jl:60
 [29] top-level scope
    @ REPL[63]:1
Some type information was truncated. Use `show(err)` to see complete types.

(@enzyme) pkg> st
Status `~/.julia/environments/enzyme/Project.toml`
  [052768ef] CUDA v5.5.2
  [a0c0ee7d] DifferentiationInterface v0.6.23
  [7da242da] Enzyme v0.13.16
  [da2b9cff] Mooncake v0.4.51
  [86de8297] RadonKA v0.6.2
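
For reference, a minimal KernelAbstractions-only reproduction along the lines hinted at above might look like the following. This is an untested sketch; the kernel and wrapper names are made up for illustration. The point is that any CPU-backend kernel launch goes through Julia Task creation, which is what the jl_new_task foreigncall in the error above points at.

using KernelAbstractions, DifferentiationInterface, Mooncake

@kernel function square_kernel!(y, @Const(x))
    i = @index(Global)
    y[i] = x[i]^2
end

function g(x)
    y = similar(x)
    backend = get_backend(x)                 # CPU() for a plain Array input
    square_kernel!(backend)(y, x; ndrange = length(x))
    KernelAbstractions.synchronize(backend)
    return sum(y)
end

x = rand(16)
g(x)  # runs fine
# value_and_gradient(g, AutoMooncake(config=nothing), x)  # expected to hit the same foreigncall error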

@willtebbutt
Member

Thanks for opening this! The problem looks Task-related to me, which makes sense: Mooncake's ability to handle Tasks is currently rather limited.

Am I interpreting the above correctly that your current code shouldn't involve any GPU code?

willtebbutt added the enhancement (New feature or request) label on Nov 28, 2024
@roflmaostc
Author

Right now it's all CPU-based, but in the future I'd like to use GPUs (and in fact this also works with Enzyme, though it is slower than my custom-written gradients).
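
For context, the Enzyme path mentioned above goes through the same DifferentiationInterface entry point; a minimal sketch of it, with the Enzyme backend left at its defaults and untested here, would be:

using DifferentiationInterface, Enzyme, RadonKA

x = rand(32, 32, 1)
angles = range(0, 2π, 100)
f(x) = sum(radon(x, angles))

value_and_gradient(f, AutoEnzyme(), x)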

@willtebbutt
Member

willtebbutt commented Nov 28, 2024

Cool.

Unfortunately I don't know when I'll be able to resolve this. Properly handling Tasks is one of my larger outstanding to-do items in Mooncake at the moment.

That being said, you mention that you have custom gradients -- are these in the form of ChainRulesCore.rrules? If so, do the KernelAbstractions calls happen inside them?
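
If they are ChainRulesCore.rrules wrapping the whole radon call, the KernelAbstractions launch never has to be differentiated through at all. A rough, untested sketch of that shape follows; it assumes RadonKA exposes an adjoint of radon (called backproject here, treat that as an assumption) and defers to the Mooncake docs for the exact rule-import mechanism.

using ChainRulesCore, RadonKA

# Hypothetical rule: the forward pass may launch KernelAbstractions kernels freely,
# because the AD tool only ever sees this rrule, never the kernel internals.
function ChainRulesCore.rrule(::typeof(radon), x, angles)
    y = radon(x, angles)
    function radon_pullback(ȳ)
        x̄ = backproject(unthunk(ȳ), angles)   # assumed adjoint of radon
        return NoTangent(), x̄, NoTangent()
    end
    return y, radon_pullback
end

# Mooncake would then need to be told to use this rule rather than derive its own;
# its ChainRules import macro is the intended route (check the docs for the exact call):
# Mooncake.@from_rrule Mooncake.DefaultCtx Tuple{typeof(radon), Array{Float64,3}, StepRangeLen}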
