Installing LLM Engine

Configuration

Configuration is done through an environment variables file. Copy .env.example to create your .env file.
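As a sketch, a .env derived from .env.example might define the variables referenced in the platform sections below (the values here are placeholders, not real credentials or endpoints):

```shell
# Placeholder values — replace with your own keys and endpoints
DEFAULT_OPENAI_API_KEY=sk-...
DEFAULT_OPENAI_BASE_URL=https://api.openai.com/v1
BEDROCK_API_KEY=...
BEDROCK_BASE_URL=...
OLLAMA_BASE_URL=http://localhost:11434
```

Only set the variables for the platforms you intend to use.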

Running locally

  1. Start by copying .env.example to .env.
  2. Install MongoDB (e.g. on an Ubuntu 24 server).
  3. Run MongoDB with mongod.

    💡 Note: Mac users who installed MongoDB with Homebrew should run brew services start mongodb-community instead of mongod.

  4. Install Node.js at the version specified in the package.json file (consider using nvm).
  5. Install Yarn.
  6. Install all dependencies with yarn install.
  7. Run yarn run dev to serve the API locally.
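The steps above can be sketched as a single shell session. The exact install commands vary by platform and are assumptions here; consult the MongoDB and nvm documentation for your system:

```shell
# Sketch of the local setup (install commands are platform-dependent assumptions)
sudo apt-get install -y mongodb-org            # MongoDB from the official repo (Ubuntu)
mongod --fork --logpath /var/log/mongod.log    # or: brew services start mongodb-community (macOS)
nvm install && nvm use                         # assumes an .nvmrc matching package.json's Node version
npm install -g yarn                            # one way to install Yarn
yarn install                                   # install dependencies
yarn run dev                                   # serve the API locally
```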

LLM Model selection

LLM Engine supports a range of LLM platforms.

OpenAI

  1. Configure DEFAULT_OPENAI_API_KEY and DEFAULT_OPENAI_BASE_URL in your .env file.
  2. When creating a Conversation with an Agent, specify llmPlatform to be openai and llmModel to be an available OpenAI model.
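As an illustration, the payload for creating a Conversation might look like the following. Only llmPlatform and llmModel come from this guide; the endpoint path, the agentId field, and the model name are assumptions:

```typescript
// Sketch of a Conversation-creation request body.
// llmPlatform/llmModel are from this guide; everything else is hypothetical.
interface CreateConversationBody {
  agentId: string;
  llmPlatform: "openai" | "bedrock" | "ollama";
  llmModel: string;
}

const body: CreateConversationBody = {
  agentId: "my-agent",       // hypothetical agent identifier
  llmPlatform: "openai",     // route the conversation to OpenAI
  llmModel: "gpt-4o-mini",   // any model your OpenAI account exposes
};

// e.g. POST it to the locally running API (URL is an assumption):
// await fetch("http://localhost:3000/conversations", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
console.log(JSON.stringify(body));
```

The same shape applies to the other platforms below: swap llmPlatform to bedrock or ollama and llmModel to a model that platform serves.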

Note that this will work with any OpenAI-compatible LLM provider.

AWS Bedrock (including Claude)

  1. Configure BEDROCK_API_KEY and BEDROCK_BASE_URL in your .env file.
  2. When creating a Conversation with an Agent, specify llmPlatform to be bedrock and llmModel to be an available Bedrock model.

Open Source Models via vLLM

Open source models are available through vLLM running locally or on one of two hosted serverless providers:

Open Source Models via Ollama

Open source models are also available through Ollama running locally.

  1. Install Ollama locally.
  2. Configure OLLAMA_BASE_URL in your .env file.
  3. When creating a Conversation with an Agent, specify llmPlatform to be ollama and llmModel to be an available open source model supported by Ollama.

Optional: Retrieval Augmented Generation

If you would like to make use of Retrieval Augmented Generation (RAG) see our rag guide.

Optional: Nextspace integration

If you would like to use LLM Engine with the Nextspace client, see our nextspace guide.

Optional: Zoom integration

If you would like to use LLM Engine with Zoom, see our zoom guide.