Supercharge Your Cursor: Using LLM-Router with Local and Cloud AI Models on Mac

As a Mac user and coding enthusiast, I'm thrilled to share how you can enhance your development workflow by using LLM-router to seamlessly integrate Claude 3.5 Sonnet and local Ollama models with the Cursor code editor.
The magic comes from Keith Coleman's LLM-router project: https://github.com/kcolemangt/llm-router
What is LLM-Router?
LLM-router is a reverse proxy that allows you to switch between different AI language models when using the Cursor code editor. It solves a significant limitation of Cursor: the inability to easily use local models or switch between multiple AI providers.
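Because the router speaks the OpenAI-style API that Cursor expects, you pick a backend simply by prefixing the model name. Here's a minimal sketch of the idea with curl (it assumes the router exposes the standard OpenAI-style /v1/chat/completions route; the port and prefixes come from the config.json we create below):
# Route a request to the local Ollama backend via the "ollama/" prefix
curl -s http://localhost:11411/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ollama/llama2", "messages": [{"role": "user", "content": "Hello"}]}'
# Switch to Anthropic by changing only the model prefix
curl -s http://localhost:11411/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "anthropic/claude-3-5-sonnet-20240620", "messages": [{"role": "user", "content": "Hello"}]}'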
Why Should Mac Users Care?
- Cloud and Local Model Integration: Use Anthropic's Claude 3.5 Sonnet and Ollama models running on your Mac directly within Cursor.
- Multi-Provider Flexibility: Easily switch between local models and Claude 3.5 Sonnet without changing Cursor settings.
- Improved Privacy: Use local models for sensitive code while leveraging Claude's power for complex tasks.
Setting Up LLM-Router on Mac
We'll walk through the setup process using two scripts: a configuration script and a run script.
Prerequisites
- Cursor installed on your Mac
- Homebrew package manager installed
- Ollama installed (for local models)
- ngrok installed, plus an ngrok account (for creating a secure tunnel)
- An Anthropic API key for Claude 3.5 Sonnet
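If any of these are missing, Homebrew can install most of them. A quick sketch, assuming the current formula and cask names:
# Install the tools (cask/formula names may change over time)
brew install --cask cursor
brew install ollama
brew install ngrok/ngrok/ngrok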
Step 1: Configuration Script
First, let's create a script to download LLM-router and set up the configuration:
#!/bin/bash
# Configuration Script for LLM-Router
# Create a directory for LLM-router
mkdir -p ~/llm-router
cd ~/llm-router
# Download LLM-router binary
curl -LO https://github.com/kcolemangt/llm-router/releases/download/v0.0.1/llm-router-darwin-arm64
chmod +x llm-router-darwin-arm64
cat << EOF > config.json
{
  "listening_port": 11411,
  "backends": [
    {
      "name": "anthropic",
      "base_url": "https://api.anthropic.com",
      "prefix": "anthropic/",
      "default": true,
      "require_api_key": true,
      "key_env_var": "ANTHROPIC_API_KEY"
    },
    {
      "name": "ollama",
      "base_url": "http://localhost:11434",
      "prefix": "ollama/"
    }
  ]
}
EOF
echo "Configuration complete!"
echo "Next, run the run_llm_router.sh script to start LLM-router and ngrok."
Save this script as configure_llm_router.sh and make it executable:
chmod +x configure_llm_router.sh
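Optional sanity check: confirm the downloaded binary matches your Mac's architecture and the generated JSON parses (both are standard macOS tools):
file ~/llm-router/llm-router-darwin-arm64   # should report a Mach-O 64-bit arm64 executable
python3 -m json.tool ~/llm-router/config.json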
Step 2: Run Script
Now, let's create a script to run LLM-router and ngrok:
#!/bin/bash
# Run Script for LLM-Router
cd ~/llm-router
# Check if ANTHROPIC_API_KEY is set
if [ -z "$ANTHROPIC_API_KEY" ]; then
  echo "Error: ANTHROPIC_API_KEY is not set. Please set it before running this script."
  exit 1
fi
# Start LLM-router
echo "Starting LLM-router..."
./llm-router-darwin-arm64 &
# Start ngrok with plain-text logging so the public URL lands in the log file
echo "Starting ngrok..."
ngrok http 11411 --log=stdout > ngrok.log 2>&1 &
# Give ngrok a moment to start, then extract the public URL from the log
sleep 5
NGROK_URL=$(grep -o 'https://[a-zA-Z0-9.-]*\.ngrok-free\.app' ngrok.log | head -n 1)
if [ -z "$NGROK_URL" ]; then
  echo "Error: no ngrok URL found. Check ngrok.log for details."
  exit 1
fi
echo "Setup complete!"
echo "Use this URL in Cursor's 'Override OpenAI Base URL' setting:"
echo "${NGROK_URL}/v1"
echo "LLM-router is now running with Claude 3.5 Sonnet and Ollama support."
Save this script as run_llm_router.sh and make it executable:
chmod +x run_llm_router.sh
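Both LLM-router and ngrok keep running in the background after the script exits. When you're finished, you can stop them with:
pkill -f llm-router-darwin-arm64
pkill ngrok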
Using the Scripts
- Run the configuration script once to set everything up:
./configure_llm_router.sh
- Whenever you want to use LLM-router, run the run script:
export ANTHROPIC_API_KEY=your_api_key_here
./run_llm_router.sh
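Tip: to avoid exporting the key in every new terminal, you can append it to your shell profile (zsh is the default shell on modern macOS; replace the placeholder with your real key):
echo 'export ANTHROPIC_API_KEY=your_api_key_here' >> ~/.zshrc
source ~/.zshrc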
Final Configuration in Cursor
- Open Cursor preferences.
- Navigate to the AI settings.
- In the "Override OpenAI Base URL" field, enter the URL provided by the run script (ending with
/v1
).
Using LLM-Router with Cursor
Now you can use different models by prefixing them in Cursor:
- For Claude 3.5 Sonnet: anthropic/claude-3-5-sonnet-20240620 or just claude-3-5-sonnet-20240620 (the Anthropic backend is set as the default in config.json, so the prefix is optional)
- For Ollama models: ollama/llama2
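Before the first Ollama request, make sure the model has been pulled and the Ollama server is running (the macOS app starts the server automatically; otherwise run ollama serve in a terminal):
ollama pull llama2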
Leveraging Claude 3.5 Sonnet
Claude 3.5 Sonnet is Anthropic's most capable model at the time of writing, excelling at a wide range of tasks including:
- Complex coding assistance
- In-depth code reviews
- Architectural design discussions
- Debugging and error analysis
- Documentation generation
When working on challenging programming tasks or when you need high-quality, nuanced responses, Claude 3.5 Sonnet is an excellent choice.
Conclusion
With LLM-router, you've unlocked a new level of flexibility in your AI-assisted coding workflow. You can now easily switch between local Ollama models for privacy-sensitive tasks and Claude 3.5 Sonnet for complex problem-solving, all within Cursor.
Remember to run the run_llm_router.sh script each time you want to use LLM-router with Cursor. This setup provides a perfect balance of flexibility, performance, and privacy in your coding environment.
Happy coding, fellow Mac enthusiasts!