
Error in convert-whisper-to-coreml.py? #1009

@jaybinks

Description


On a nearly brand-new Mac mini (M2), I get the following error when trying to generate the Core ML files.

Any ideas or suggestions would be appreciated.

/opt/homebrew/lib/python3.11/site-packages/whisper/timing.py:57: NumbaDeprecationWarning: The 'nopython' keyword argument was not supplied to the 'numba.jit' decorator. The implicit default value for this argument is currently False, but it will be changed to True in Numba 0.59.0. See https://numba.readthedocs.io/en/stable/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit for details.
  @numba.jit

ModelDimensions(n_mels=80, n_audio_ctx=1500, n_audio_state=1280, n_audio_head=20, n_audio_layer=32, n_vocab=51865, n_text_ctx=448, n_text_state=1280, n_text_head=20, n_text_layer=32)
/opt/homebrew/lib/python3.11/site-packages/whisper/model.py:166: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert x.shape[1:] == self.positional_embedding.shape, "incorrect audio shape"

Converting PyTorch Frontend ==> MIL Ops: 100%|██████████▉| 2613/2614 [00:00<00:00, 3476.92 ops/s]
Running MIL frontend_pytorch pipeline: 100%|██████████| 5/5 [00:00<00:00, 33.49 passes/s]
Running MIL default pipeline: 100%|██████████| 57/57 [00:12<00:00,  4.50 passes/s]
Running MIL backend_mlprogram pipeline: 100%|██████████| 10/10 [00:00<00:00, 368.28 passes/s]

Traceback (most recent call last):
  File "/Users/jaybinks/src/whisper.cpp/models/convert-whisper-to-coreml.py", line 323, in <module>
    encoder = convert_encoder(hparams, encoder, quantize=args.quantize)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jaybinks/src/whisper.cpp/models/convert-whisper-to-coreml.py", line 259, in convert_encoder
    model = ct.convert(
            ^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/_converters_entry.py", line 492, in convert
    mlmodel = mil_convert(
              ^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 188, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 212, in _mil_convert
    proto, mil_program = mil_convert_to_proto(
                         ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 303, in mil_convert_to_proto
    out = backend_converter(prog, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/converter.py", line 130, in __call__
    return backend_load(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/backend/mil/load.py", line 283, in load
    raise RuntimeError("BlobWriter not loaded")

RuntimeError: BlobWriter not loaded

coremlc: error: Model does not exist at models/coreml-encoder-large.mlpackage -- file:///Users/jaybinks/src/whisper.cpp/

mv: rename models/coreml-encoder-large.mlmodelc to models/ggml-large-encoder.mlmodelc: No such file or directory
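For what it's worth, the failure looks like it happens inside coremltools itself rather than in the whisper.cpp script. Below is a minimal sketch I put together to isolate it (not part of convert-whisper-to-coreml.py; the Tiny module, input name, and shapes are arbitrary). It converts a trivial traced PyTorch model with the same convert_to="mlprogram" backend the encoder conversion uses. If this also dies with RuntimeError: BlobWriter not loaded, the problem is in the coremltools / Python 3.11 install rather than in the conversion script:

import torch
import coremltools as ct

# Tiny stand-in model; name and shapes are arbitrary, chosen only to
# exercise the same ct.convert(..., convert_to="mlprogram") code path
# that convert-whisper-to-coreml.py uses for the encoder.
class Tiny(torch.nn.Module):
    def forward(self, x):
        return x + 1

traced = torch.jit.trace(Tiny().eval(), torch.zeros(1, 4))

model = ct.convert(
    traced,
    inputs=[ct.TensorType(name="x", shape=(1, 4))],
    convert_to="mlprogram",  # same ML Program backend that raises "BlobWriter not loaded"
)
print("conversion OK:", model)

Running this with the same /opt/homebrew Python 3.11 that produced the traceback above should show whether the ML Program backend works at all outside of whisper.cpp.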
