
Using Ling in OpenClaw

OpenClaw is an AI coding assistant that supports custom LLM providers. By pointing its API endpoint at Ant Ling, you can use Ling models directly in OpenClaw for code completion, generation, explanation, and more.


What You Need

  • OpenClaw installed, with the openclaw command available in your terminal
  • A Ling API Key, created in the API Console
Configuration

Run the following command to launch the interactive setup wizard:

```shell
openclaw onboard
```

Follow the prompts to complete each step:

1. Select Custom Provider as the model provider
   ◇ Model/auth provider │ Custom Provider
2. Enter https://api.ant-ling.com/v1 as the API Base URL
   ◇ API Base URL │ https://api.ant-ling.com/v1
3. Select Paste API key now as the API Key input method
   ◇ How do you want to provide this API key? │ Paste API key now
4. Go to the API Console to create and copy your Ling API Key, then paste it
   ◇ API Key (leave blank if not required) │ <YOUR_API_KEY>
5. Select OpenAI-compatible as the endpoint compatibility mode
   ◇ Endpoint compatibility │ OpenAI-compatible
6. Enter the model ID. The example uses Ling-2.6-flash; change it to Ling-2.6-1T, Ling-2.5-1T, or Ring-1T as needed
   ◇ Model ID │ Ling-2.6-flash
7. After verification succeeds, enter an Endpoint ID and an optional model alias to complete setup
   ◇ Verification successful.
   ◇ Endpoint ID │ ant-ling
   ◇ Model alias (optional) │ Ling-2.6-flash
   Configured custom provider: ant-ling/Ling-2.6-flash
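Because the endpoint runs in OpenAI-compatible mode, you can sanity-check the configuration outside OpenClaw with a direct request. This is a minimal sketch: it assumes the standard OpenAI-style /v1/chat/completions route (not confirmed by this page) and an API key exported as ANT_LING_API_KEY.

```shell
# Hypothetical sanity check: assumes the OpenAI-compatible
# /v1/chat/completions route and a key in ANT_LING_API_KEY.
curl https://api.ant-ling.com/v1/chat/completions \
  -H "Authorization: Bearer $ANT_LING_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "Ling-2.6-flash",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

A JSON completion response here confirms the Base URL, key, and model ID before you rely on them inside OpenClaw.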

FAQ

Connection Failed / 401 Error

  • Confirm your API Key has been created and is active in the API Console 
  • Confirm the Base URL is https://api.ant-ling.com/v1 with no trailing slash
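To check the key and Base URL independently of OpenClaw, you can query the endpoint directly. This is a sketch under one assumption: that the service exposes the OpenAI-compatible /v1/models route, which this page does not confirm.

```shell
# Hypothetical key check; assumes the OpenAI-compatible /v1/models route.
# A 401 here means the key itself is invalid or inactive, ruling out
# an OpenClaw-side misconfiguration.
curl https://api.ant-ling.com/v1/models \
  -H "Authorization: Bearer $ANT_LING_API_KEY"
```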

Incorrect Model Name

Model names are case-sensitive. Make sure the name matches the documentation exactly, e.g. Ling-2.6-flash, not ling-2.6-flash.

Slow Response

Enable streaming output (Stream) so tokens appear as they are generated, instead of waiting for the full response to complete.
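On an OpenAI-compatible endpoint, streaming is requested per call with the stream flag. A minimal sketch, again assuming the standard OpenAI-style /v1/chat/completions route and a key in ANT_LING_API_KEY:

```shell
# Hypothetical streaming request: "stream": true returns incremental
# server-sent events instead of a single final JSON body; curl -N
# disables output buffering so chunks print as they arrive.
curl -N https://api.ant-ling.com/v1/chat/completions \
  -H "Authorization: Bearer $ANT_LING_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "Ling-2.6-flash",
        "stream": true,
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```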

