snoop_inference is broken on latest after stackless inference change #56115
Labels: caching, compiler:precompilation (Precompilation of modules), observability (metrics, timing, understandability, reflection, logging, ...)

Comments
CC: @timholy
FWIW, this also affects PrecompileTools.jl.
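For reference, the kind of PrecompileTools workload block affected is sketched below; the workload body is an arbitrary example, not taken from the original report:

```julia
using PrecompileTools

# PrecompileTools records the inference that runs inside the workload so it
# can be cached at precompile time; that recording appears to rely on the
# same snooping machinery this issue breaks.
@setup_workload begin
    data = rand(100)
    @compile_workload begin
        sum(data)
        sort(data)
    end
end
```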
Can we confirm that PrecompileTools and SnoopCompile work properly before closing this?

Attempts to validate this:
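A minimal check along those lines, assuming SnoopCompile's `@snoop_inference` macro (the workload here is an arbitrary example):

```julia
using SnoopCompile

# @snoop_inference records all inference triggered by the expression and
# returns a tree of InferenceTimingNode results; on an affected Julia build
# it does not report the inference timings correctly.
tinf = @snoop_inference begin
    sort(rand(Float16, 100))  # fresh signature so inference actually runs
end
@show tinf
```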
Looks like it's still broken: JuliaGPU/GPUCompiler.jl#676 (comment)
KristofferC pushed a commit that referenced this issue on Mar 21, 2025:
Adds 4 new Float16 fields to CodeInstance to replace Compiler.Timings with continually collected and available measurements. Sample results on a novel method signature:

```julia
julia> @time code_native(devnull, ÷, dump_module=false, (Int32, UInt16));
  0.006262 seconds (3.62 k allocations: 186.641 KiB, 75.53% compilation time)

julia> b = which(÷, (Int32, UInt16)).specializations[6].cache
CodeInstance for MethodInstance for div(::Int32, ::UInt16)

julia> reinterpret(Float16, b.time_infer_self)
Float16(0.0002766)

julia> reinterpret(Float16, b.time_infer_total)
Float16(0.00049)

julia> reinterpret(Float16, b.time_infer_cache_saved)
Float16(0.02774)

julia> reinterpret(Float16, b.time_compile)
Float16(0.003773)
```

Closes #56115 (cherry picked from commit 18b5d8f)
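For illustration only, a rough sketch of how these per-CodeInstance fields might be aggregated across a method's specializations; `total_inference_time` is a hypothetical helper, not part of the commit, and it assumes the `time_infer_self` field and Float16 encoding shown above:

```julia
# Rough sketch (not from the commit): sum the self-inference time recorded on
# the cached CodeInstance of each compiled specialization of a method.
function total_inference_time(m::Method)
    total = 0.0
    for mi in Base.specializations(m)     # compiled MethodInstances of m
        isdefined(mi, :cache) || continue  # skip instances with no cache yet
        total += Float64(reinterpret(Float16, mi.cache.time_infer_self))
    end
    return total
end

total_inference_time(which(÷, (Int32, UInt16)))
```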
We should be able to restore basically the old behavior with the new stack.
Originally posted by @vtjnash in #55575 (comment)