Intrinsic Model Server

A simple API server on top of your favorite locally runnable foundation models. We support:

  • llama.cpp-compatible models, including k-quant models
    • ARM NEON, BLAS (CUDA coming soon)
  • Whisper transcription models (COMING SOON)
  • Visual models for object detection and segmentation (COMING SOON)

Check out our documentation!
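Since the server exposes completions over HTTP, a client can be a few lines of Python. This is only a sketch: the base URL, route, and payload fields (`prompt`, `max_tokens`) are assumptions for illustration, not the server's documented API — see the documentation for the real routes.

```python
# Hypothetical client sketch. The port, route, and payload field names
# below are assumptions, not the server's documented API.
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed default host/port


def build_payload(prompt: str, max_tokens: int = 128) -> dict:
    """Assemble a completion request body (field names are assumed)."""
    return {"prompt": prompt, "max_tokens": max_tokens}


def complete(prompt: str, model: str = "my-llama-model") -> str:
    """POST a completion request to a hypothetical per-model route."""
    body = json.dumps(build_payload(prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/v1/{model}/complete",  # assumed route
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["text"]  # assumed response field
```

With a model registered and the server running locally, `complete("Hello")` would return the generated text for that prompt.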

Supported model types

  • LLM completion
  • Transcription (coming soon)

