
typeof(...) in Core failed to infer error in @code_warntype #49197

@bvdmitri

Description


The issue after the discussion

The @code_warntype typeof(1) gives an uninformative error message.

julia> @code_warntype typeof(1)
typeof(...) @ Core none:0
  failed to infer
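For context, typeof is a builtin defined in Core, so there is no ordinary Julia method body for the reflection macros to walk, which is presumably why @code_warntype has nothing to analyze here. Wrapping the call in a plain function (the wrapper name below is hypothetical, just for illustration) gives @code_warntype something it can infer:

```julia
# `typeof` is a Core builtin with no Julia-level method body, which is
# why `@code_warntype typeof(1)` reports "failed to infer". A thin
# wrapper (hypothetical name) restores normal reflection:
mytypeof(x) = typeof(x)

# `@code_warntype mytypeof(1)` now shows a MethodInstance with
# Body::Type{Int64} instead of the uninformative error.
```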

The original issue

Suppose we write a structure:

struct TypeConverter{T} end

TypeConverter(::Type{T}) where {T} = TypeConverter{T}()

(converter::TypeConverter{T})(something) where {T} = convert(T, something)
(converter::TypeConverter{T})(something::AbstractArray) where {T} = map(converter, something)

Works perfectly fine at first glance:

julia> converter = TypeConverter(Float32)
TypeConverter{Float32}()

julia> converter(1.0)
1.0f0

julia> converter([ 1.0, 1.0 ])
2-element Vector{Float32}:
 1.0
 1.0

EDIT: the $ interpolation is missing in the @btime calls below. The problem is that this code does not really work as I would expect it to, and it allocates (type instability?):

julia> @btime converter(1.0)
  10.803 ns (1 allocation: 16 bytes)
1.0f0
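As the EDIT notes, the allocation here likely comes from benchmarking through the non-constant global converter, not from the conversion itself; a call through an untyped global binding is dynamically dispatched. A minimal sketch of the distinction (the cconverter name is made up for illustration):

```julia
struct TypeConverter{T} end
TypeConverter(::Type{T}) where {T} = TypeConverter{T}()
(c::TypeConverter{T})(x) where {T} = convert(T, x)

# A non-`const` global forces dynamic dispatch at the call site, which
# is what allocates in the @btime runs above; the conversion itself is
# allocation-free.
converter = TypeConverter(Float32)          # non-const global
const cconverter = TypeConverter(Float32)   # const: call site is inferable

# With BenchmarkTools one would write `@btime $converter(1.0)` to
# interpolate the global value, or benchmark `cconverter` directly.
cconverter(1.0)   # returns 1.0f0 with no allocation in the call itself
```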

But if I call @code_warntype I get this cryptic error message:

julia> @code_warntype typeof(converter(1.0))
typeof(...) @ Core none:0
  failed to infer

Interestingly enough, @code_warntype does not show any error if I call it directly on converter(1.0), and it does infer Float32. Why does typeof not work in combination with @code_warntype? And why does the converter call allocate?

julia> @code_warntype converter(1.0)
MethodInstance for (::TypeConverter{Float32})(::Float64)
  from (converter::TypeConverter{T})(something) where T @ Main REPL[3]:1
Static Parameters
  T = Float32
Arguments
  converter::Core.Const(TypeConverter{Float32}())
  something::Float64
Body::Float32
1 ─ %1 = Main.convert($(Expr(:static_parameter, 1)), something)::Float32
└──      return %1

If I look at the @code_native output, I also do not see any place where it could allocate:

@code_native converter(1.0)
.section	__TEXT,__text,regular,pure_instructions
	.build_version macos, 13, 0
	.globl	_julia_TypeConverter_12357      ; -- Begin function julia_TypeConverter_12357
	.p2align	2
_julia_TypeConverter_12357:             ; @julia_TypeConverter_12357
; ┌ @ /Users/bvdmitri/.julia/dev/ReactiveMP.jl/src/helpers/helpers.jl:223 within `TypeConverter`
	.cfi_startproc
; %bb.0:                                ; %top
; │┌ @ number.jl:7 within `convert`
; ││┌ @ float.jl:233 within `Float32`
	fcvt	s0, d0
; │└└
	ret
	.cfi_endproc
; └
                                        ; -- End function
.subsections_via_symbols

but it does, and the code is slow. How can a single fcvt s0, d0 instruction allocate 16 bytes?

@btime converter(1.0)
17.327 ns (1 allocation: 16 bytes)

Motivation

There are a couple of famous issues like #15276 and #47760.

Basically, the issue slows down code of the form map(e -> convert(T, e), collection).

julia> foo(T, x) = map((e) -> convert(T, e), x)
foo (generic function with 1 method)

julia> foo(x) = map((e) -> convert(Float64, e), x)
foo (generic function with 2 methods)

julia> x = ones(100);

julia> @btime foo($x);
  33.359 ns (1 allocation: 896 bytes)

julia> @btime foo(Float64, $x);
  9.333 μs (119 allocations: 3.22 KiB)

The structure above is an attempt to circumvent this related issue.
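One standard mitigation for the slow foo(T, x) variant (a sketch, not the only option): Julia avoids specializing on arguments whose type is a plain Type unless the signature forces it with a static parameter, so writing ::Type{T} where {T} lets the closure capture a concretely known T. The foo_spec name below is made up for illustration:

```julia
# Julia does not specialize on a bare `Type` argument by default, so in
# `foo(T, x) = map(e -> convert(T, e), x)` the closure captures an
# abstractly typed `T` and every `convert` is dispatched dynamically.
# A static parameter in the signature forces specialization:
foo_spec(::Type{T}, x) where {T} = map(e -> convert(T, e), x)

x = ones(100)
foo_spec(Float32, x)   # inferred as Vector{Float32}, single allocation
```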

Version

I also checked on Julia 1.9-rc1; the issue is present there as well. My local version:

Julia Version 1.8.5
Commit 17cfb8e65ea (2023-01-08 06:45 UTC)
Platform Info:
  OS: macOS (arm64-apple-darwin21.5.0)
  CPU: 10 × Apple M2 Pro
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-13.0.1 (ORCJIT, apple-m1)
  Threads: 1 on 6 virtual cores

Metadata

Labels: error messages (Better, more actionable error messages); observability (metrics, timing, understandability, reflection, logging, ...)