How to Run Claude Code with Gemini, OpenAI, or Anthropic Models

Stephen Ndegwa
·
3 min read


The AI ecosystem is evolving fast, and many developers want to use Claude Code, Anthropic’s command-line coding tool, with different backends such as Google Gemini, OpenAI, or Anthropic itself. claude-code-proxy is a lightweight proxy server that connects Claude Code to these models seamlessly.

In this guide, we’ll walk you through installation, configuration, and usage on Windows, macOS, and Linux, including Gemini API key setup, Vertex AI support, and OpenAI fallback.

What is claude-code-proxy?

claude-code-proxy is a transparent proxy that allows Anthropic clients (like Claude Code) to use:

  • Gemini (Google AI Studio / Vertex AI)
  • OpenAI models
  • Direct Anthropic backends

It translates Anthropic API requests to your chosen backend, and maps Claude models like:

Claude Model   Default Mapping      Gemini Mapping
haiku          openai/gpt-4o-mini   gemini-2.0-flash
sonnet         openai/gpt-4o        gemini-2.5-pro

This lets you use your preferred AI provider without modifying Claude Code.
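As an illustrative sketch of what this remapping amounts to (model names taken from the table above; the proxy's actual logic lives in its server code and may differ), it is essentially a lookup on the Claude model alias:

```python
# Illustrative sketch of the proxy's model remapping (NOT the actual
# server code): incoming Claude model names are matched by alias and
# replaced with the configured backend model.
MODEL_MAP = {
    "haiku": {"openai": "openai/gpt-4o-mini", "google": "gemini-2.0-flash"},
    "sonnet": {"openai": "openai/gpt-4o", "google": "gemini-2.5-pro"},
}

def remap(claude_model: str, provider: str) -> str:
    """Map a Claude model name to the backend model for a provider."""
    for alias, backends in MODEL_MAP.items():
        if alias in claude_model:  # e.g. "claude-3-5-sonnet-..." contains "sonnet"
            return backends[provider]
    return claude_model  # unknown models pass through unchanged

print(remap("claude-3-5-sonnet-20241022", "google"))  # → gemini-2.5-pro
```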


Prerequisites

Before installing, ensure you have:

  • Python 3.10+ with pip
  • Node.js LTS (for Claude Code)
  • Git (for cloning the repo)

Optional (for Vertex AI):

  • Google Cloud project with Vertex AI enabled
  • Google Cloud CLI installed

Install Required Tools

Windows:

# Python
winget install Python.Python.3

# Git
winget install Git.Git

# Node.js
winget install OpenJS.NodeJS.LTS

macOS (Homebrew):

brew install python git node

Linux (Debian/Ubuntu):

sudo apt update
sudo apt install python3 python3-pip git nodejs npm -y

Installing uv

uv is a fast Python package and project manager; the proxy uses it to install and manage its dependencies automatically.

Windows (PowerShell):

irm https://astral.sh/uv/install.ps1 | iex

macOS / Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Verify:

uv --version

Cloning the Repository

git clone https://github.com/1rgs/claude-code-proxy.git
cd claude-code-proxy

Configuring .env

  1. Copy the example file:
# Windows
copy .env.example .env

# macOS / Linux
cp .env.example .env
  2. Edit .env to include your API keys and preferred backend.

Example A: Gemini via API Key

PREFERRED_PROVIDER="google"
GEMINI_API_KEY="YOUR_GEMINI_API_KEY"
OPENAI_API_KEY="dummy-key"
BIG_MODEL="gemini-2.5-pro"
SMALL_MODEL="gemini-2.0-flash"

Example B: OpenAI

PREFERRED_PROVIDER="openai"
OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
BIG_MODEL="gpt-4.1"
SMALL_MODEL="gpt-4.1-mini"

Example C: Anthropic Direct

PREFERRED_PROVIDER="anthropic"
ANTHROPIC_API_KEY="YOUR_ANTHROPIC_KEY"

Optional: Vertex AI (GCP)

PREFERRED_PROVIDER="google"
USE_VERTEX_AUTH=true
VERTEX_PROJECT="your-gcp-project-id"
VERTEX_LOCATION="us-central1"
OPENAI_API_KEY="dummy-key"
BIG_MODEL="gemini-2.5-pro"
SMALL_MODEL="gemini-2.0-flash"

Running the Proxy

Start the server from the repository folder:

uv run uvicorn server:app --host 127.0.0.1 --port 8082 --reload
  • --reload is optional (auto-reloads on code changes)
  • uv installs the project's dependencies automatically on first run

Test URL: http://127.0.0.1:8082/docs (FastAPI's interactive API docs; if the page loads, the proxy is running)
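You can also smoke-test the proxy programmatically. The sketch below assumes the proxy exposes the standard Anthropic Messages route (/v1/messages) on port 8082; the x-api-key value is a placeholder, since the proxy uses the keys from your .env:

```python
import json
from urllib import request, error

# Anthropic-style /v1/messages payload; the proxy translates it to the
# configured backend and remaps the model name.
payload = {
    "model": "claude-3-5-sonnet-20241022",  # remapped to BIG_MODEL by the proxy
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}

req = request.Request(
    "http://127.0.0.1:8082/v1/messages",
    data=json.dumps(payload).encode(),
    headers={"content-type": "application/json", "x-api-key": "placeholder"},
)
try:
    with request.urlopen(req, timeout=10) as resp:
        print(json.loads(resp.read())["content"][0]["text"])
except error.URLError as exc:
    print(f"proxy not reachable: {exc}")
```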


Installing and Using Claude Code

  1. Install globally:
npm install -g @anthropic-ai/claude-code
  2. Connect to the proxy:
# Temporary session
set ANTHROPIC_BASE_URL=http://127.0.0.1:8082        # Windows (Command Prompt)
$env:ANTHROPIC_BASE_URL = "http://127.0.0.1:8082"   # Windows (PowerShell)
export ANTHROPIC_BASE_URL=http://127.0.0.1:8082     # macOS / Linux

claude

Now all Claude Code requests go through your selected backend.


Advanced Gemini Setup (Vertex AI)

If you prefer Application Default Credentials (ADC):

  1. Install Google Cloud CLI:
# macOS
brew install --cask google-cloud-sdk

# Windows
winget install Google.Cloud.SDK
  2. Authenticate:
gcloud auth application-default login
  3. Enable the Vertex AI API:
gcloud services enable aiplatform.googleapis.com
  4. Update .env:
USE_VERTEX_AUTH=true
VERTEX_PROJECT="your-gcp-project-id"
VERTEX_LOCATION="us-central1"

Linux and macOS Commands

  • Clone repo: git clone ...
  • Copy .env: cp .env.example .env
  • Run proxy: uv run uvicorn server:app --host 0.0.0.0 --port 8082
  • Set environment variable:
export ANTHROPIC_BASE_URL=http://127.0.0.1:8082
  • Start Claude Code:
claude
  • Permanent variable (macOS / Linux; use ~/.bashrc instead of ~/.zshrc if your shell is bash):
echo 'export ANTHROPIC_BASE_URL=http://127.0.0.1:8082' >> ~/.zshrc
source ~/.zshrc

Common Issues and Troubleshooting

Problem                          Solution
uv not found                     Restart the terminal; ensure your PATH was updated
Proxy starts but Claude errors   Check .env formatting; quotes must be straight
Gemini model not recognized      Use gemini-2.5-pro or gemini-2.0-flash exactly
Port 8082 in use                 Change --port to an unused port
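For the last issue, a quick way to check whether port 8082 is free before choosing another one (a small standard-library sketch, not part of the proxy):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something already accepts connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0  # 0 means the connect succeeded

if port_in_use(8082):
    print("Port 8082 is taken; start uvicorn with a different --port.")
else:
    print("Port 8082 is free.")
```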

Conclusion

With claude-code-proxy, you can seamlessly use Claude Code with OpenAI, Gemini, or Anthropic. This setup works across Windows, macOS, and Linux, and supports Vertex AI authentication, streaming responses, and custom model mapping.

Once set up, you can experiment with multiple AI backends through a single Claude Code interface, which is ideal for developers, researchers, and AI enthusiasts.
