FAQ

Local LLM Setup

LM Studio Setup

LM Studio is a desktop application that lets you run large language models locally on your computer.

Configuration

  1. Go to the Developer tab in LM Studio
  2. Click Start Server to start the local server (default port is 1234)
  3. Enable CORS in the server settings; this is required for Hyprnote to connect

[Screenshot: enabling CORS in LM Studio's server settings]
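To confirm the server is actually listening before moving on, you can probe its OpenAI-compatible model list endpoint from a terminal. This is a minimal sketch assuming the default port 1234:

```shell
# Probe LM Studio's OpenAI-compatible /v1/models endpoint (default port 1234).
if curl -sf --max-time 2 http://127.0.0.1:1234/v1/models > /dev/null; then
  status="up"
else
  status="down"
fi
echo "LM Studio server is $status"
```

If the server is up, `curl http://127.0.0.1:1234/v1/models` also prints the list of currently loaded models.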

Connecting to Hyprnote

  1. Open Hyprnote and go to Settings > Intelligence
  2. Expand the LM Studio provider card
  3. The default base URL http://127.0.0.1:1234/v1 should work if you haven't changed the port
  4. Select LM Studio as your provider and choose a model from the dropdown
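You can also exercise the same base URL by hand with a chat completion request. A sketch; `llama-3.2-1b-instruct` is a placeholder model id, so substitute one you actually have loaded in LM Studio:

```shell
# Sketch of the kind of request an app sends to the local /v1 endpoint.
# The model id below is a placeholder; use one loaded in LM Studio.
payload='{"model": "llama-3.2-1b-instruct", "messages": [{"role": "user", "content": "Say hello"}]}'
curl -s http://127.0.0.1:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$payload" || echo "request failed (is the LM Studio server running?)"
```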

Troubleshooting

If Hyprnote cannot connect to LM Studio:

  • Ensure the LM Studio server is running (check the Developer tab)
  • Verify CORS is enabled in LM Studio settings
  • Check that the port matches (default is 1234)
  • Make sure no firewall is blocking the connection
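If the checklist above doesn't pinpoint the problem, a raw TCP probe tells you whether anything is listening on the port at all. This sketch uses bash's `/dev/tcp` redirection (available on macOS and most Linux shells) and assumes the default port:

```shell
# Probe TCP port 1234 on loopback using bash's built-in /dev/tcp redirection.
if (exec 3<>/dev/tcp/127.0.0.1/1234) 2>/dev/null; then
  port_state="open"
else
  port_state="closed"
fi
echo "Port 1234 is $port_state"
```

"closed" here means either the server isn't running or something (a firewall, a changed port) is in the way.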

Ollama Setup

Ollama is a command-line tool for running large language models locally.

Connecting to Hyprnote

  1. Open Hyprnote and go to Settings > Intelligence
  2. Expand the Ollama provider card
  3. The default base URL http://127.0.0.1:11434/v1 should work
  4. Select Ollama as your provider and choose a model from the dropdown
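As with LM Studio, you can verify the endpoint from a terminal before configuring Hyprnote. A minimal sketch assuming the default port 11434:

```shell
# Probe Ollama's OpenAI-compatible /v1/models endpoint (default port 11434).
if curl -sf --max-time 2 http://127.0.0.1:11434/v1/models > /dev/null; then
  ollama_status="up"
else
  ollama_status="down"
fi
echo "Ollama endpoint is $ollama_status"
```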

Troubleshooting

If Hyprnote cannot connect to Ollama:

  • Ensure Ollama is running (ollama serve)
  • Check that you have at least one model pulled (ollama list)
  • Verify the port is correct (default is 11434)
  • On macOS, Ollama may already be running as a background service
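The first two checks above can be scripted. This sketch also guards for machines where the Ollama CLI isn't on the PATH:

```shell
# Check for the Ollama CLI, then list pulled models if the server responds.
if command -v ollama > /dev/null 2>&1; then
  have_ollama="yes"
  ollama list || echo "CLI found but the server is not responding; try: ollama serve"
else
  have_ollama="no"
  echo "Ollama is not installed or not on PATH"
fi
```

If `ollama list` prints nothing, pull a model first (for example `ollama pull <model-name>`), then retry the dropdown in Hyprnote.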

Expose Ollama to the network
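By default Ollama binds only to 127.0.0.1, so other devices on your network can't reach it. Ollama's documented way to change this is the OLLAMA_HOST environment variable; a sketch for the two common cases:

```shell
# Linux, or any manual start: bind to all interfaces instead of loopback only.
OLLAMA_HOST=0.0.0.0 ollama serve

# macOS app: set the variable for launchd, then restart the Ollama app.
launchctl setenv OLLAMA_HOST "0.0.0.0"
```

Once exposed, other machines on the network can use `http://<your-machine-ip>:11434/v1` as the base URL. Be aware this makes the server reachable by anyone on the same network.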