
Use Ling / Ring on OpenRouter

OpenRouter provides a unified, OpenAI-compatible entry point: point base_url at OpenRouter, use an OpenRouter API key in requests, and call multiple models with the same SDK or HTTP stack.

This page covers common use cases:

  • Ling: General purpose
  • Ring: Complex reasoning, math, and code

What you need

Go to OpenRouter to get:

  • An OpenRouter API Key
  • The model id for the Ant Ling model from OpenRouter’s model list

Important: OpenRouter’s model field usually expects OpenRouter’s model identifier (not the native Ant Ling model name). Use the id shown in the OpenRouter console or model page.


Configure your environment so OpenAI-compatible SDKs pick up the OpenRouter endpoint and key:

```shell
export OPENAI_BASE_URL=https://openrouter.ai/api/v1
export OPENAI_API_KEY=${OPENROUTER_API_KEY}
```

OpenRouter also recommends attaching site metadata on requests (for analytics and risk control), commonly:

  • HTTP-Referer: your site or project URL
  • X-Title: your application name

How you set these headers depends on the SDK; examples below.
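If you are not using an SDK at all, the same headers can be attached to a plain HTTP request. Below is a minimal stdlib-only sketch against OpenRouter's OpenAI-compatible chat completions endpoint; `build_headers` and `chat` are helper names introduced here, and the referer/title values are placeholders you should replace with your own:

```python
import json
import urllib.request


def build_headers(api_key, referer="https://your.site", title="Your App Name"):
    """Assemble the auth header plus OpenRouter's recommended metadata headers."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        "HTTP-Referer": referer,  # your site or project URL
        "X-Title": title,         # your application name
    }


def chat(api_key, model, messages):
    """POST a chat completion to OpenRouter and return the reply text."""
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode("utf-8"),
        headers=build_headers(api_key),
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With an SDK, the equivalent is passing the headers per request (as in the `extra_headers` examples below) or once at client construction.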


Calling Ling (text chat)

Set model to the Ling model id you see on OpenRouter (placeholder in the example).

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "https://openrouter.ai/api/v1"),
    api_key=os.environ["OPENAI_API_KEY"],
)

resp = client.chat.completions.create(
    model="inclusionai/ling-2.6-flash:free",  # <- replace with the actual id on OpenRouter
    messages=[
        {"role": "system", "content": "You are a professional, concise assistant."},
        {"role": "user", "content": "Give five bullet points on how to write maintainable API documentation."},
    ],
    extra_headers={
        "HTTP-Referer": "https://your.site",
        "X-Title": "Your App Name",
    },
)
print(resp.choices[0].message.content)
```

Calling Ring (reasoning / code)

The call shape is the same as Ling; what matters is choosing the correct OpenRouter model id for Ring.

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "https://openrouter.ai/api/v1"),
    api_key=os.environ["OPENAI_API_KEY"],
)

resp = client.chat.completions.create(
    model="inclusionai/ling-2.6-flash:free",  # <- replace with the actual Ring model id on OpenRouter
    messages=[
        {"role": "user", "content": "Given a string s, return the number of palindromic substrings. Give an O(n^2) solution and explain the idea."},
    ],
    extra_headers={
        "HTTP-Referer": "https://your.site",
        "X-Title": "Your App Name",
    },
)
print(resp.choices[0].message.content)
```

FAQ

Why do I get an error when I use the native Ant Ling model name?

OpenRouter’s model field usually expects its own identifier (often provider/model). Copy the id from OpenRouter’s model list and paste it into model.
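Instead of copying by hand, you can also look ids up programmatically. OpenRouter exposes a public model-list endpoint at `/api/v1/models`; the sketch below assumes it returns a JSON object with a `data` array of models carrying an `id` field (verify the response shape against OpenRouter's current docs). `matching_ids` and `list_openrouter_ids` are helper names introduced here:

```python
import json
import urllib.request


def matching_ids(models, keyword):
    """Filter a model-list payload for ids containing keyword (case-insensitive)."""
    return [m["id"] for m in models if keyword.lower() in m["id"].lower()]


def list_openrouter_ids(keyword):
    """Fetch OpenRouter's public model list and return ids matching keyword."""
    with urllib.request.urlopen("https://openrouter.ai/api/v1/models", timeout=30) as resp:
        payload = json.load(resp)
    return matching_ids(payload["data"], keyword)
```

For example, `list_openrouter_ids("ling")` would surface the exact ids to paste into the `model` field.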

How do I debug 401/403 responses?

  • Confirm you are using an OpenRouter API Key
  • Confirm OPENAI_BASE_URL=https://openrouter.ai/api/v1
  • Confirm the chosen model is available for your OpenRouter account (permissions, balance, region, etc.)
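The first two checks above can be automated before you ever send a request. A small sketch, assuming the conventions in this page (the `sk-or-` key-prefix check reflects the format OpenRouter keys commonly use; treat it as a heuristic, not a guarantee):

```python
import os


def diagnose(env=None):
    """Return a list of likely misconfigurations for OpenRouter access."""
    env = os.environ if env is None else env
    problems = []
    base_url = env.get("OPENAI_BASE_URL", "")
    api_key = env.get("OPENAI_API_KEY", "")
    if not base_url.startswith("https://openrouter.ai/api/v1"):
        problems.append("OPENAI_BASE_URL is not https://openrouter.ai/api/v1")
    if not api_key:
        problems.append("OPENAI_API_KEY is empty; set it to your OpenRouter API key")
    elif not api_key.startswith("sk-or-"):
        # Heuristic only: OpenRouter keys commonly start with "sk-or-".
        problems.append("OPENAI_API_KEY does not look like an OpenRouter key")
    return problems
```

An empty result means the environment looks right; remaining 401/403 errors then usually come down to account-level issues (permissions, balance, region).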