LocalAI is an autonomous, community-driven, simple local OpenAI-compatible API, written in Go. It can be used as a drop-in replacement for OpenAI, running on the CPU of consumer-grade hardware. It supports ggml-compatible models such as LLaMA, Alpaca, GPT4All, Vicuna, Koala, GPT4All-J, and Cerebras.

  • OpenAI-compatible API
  • Supports multiple models
  • Keeps models loaded in memory after the first request, for faster inference
  • Supports prompt templates
  • Does not shell out: it uses C bindings (go-llama.cpp and go-gpt4all-j.cpp) for faster inference and better performance
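Because the API is OpenAI-compatible, a client can talk to it with a plain HTTP request. The sketch below builds an OpenAI-style chat completion payload and shows where it would be sent; the model name `ggml-gpt4all-j` and the `localhost:8080` address are assumptions for illustration (your server address and model files may differ).

```shell
# Build an OpenAI-style chat completion request body.
# "ggml-gpt4all-j" is a placeholder model name: LocalAI serves whatever
# ggml model files you point it at.
PAYLOAD='{"model": "ggml-gpt4all-j", "messages": [{"role": "user", "content": "How are you?"}], "temperature": 0.7}'
echo "$PAYLOAD"

# With a LocalAI server running (assumed here on localhost:8080), the same
# payload would go to the OpenAI-style endpoint:
# curl http://localhost:8080/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

Since the endpoint shape matches OpenAI's, existing OpenAI client libraries can usually be pointed at the local server by overriding their base URL.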

It is based on llama.cpp: models compatible with llama.cpp are supported, as are GPT4All-J and Cerebras-GPT models in ggml format.


It should also be compatible with StableLM and GPTNeoX ggml models (untested).

Note: you may need to convert older models to the new format (see here) to run gpt4all.

