Using Ling in OpenClaw
OpenClaw is an AI coding assistant that supports custom LLM providers. By pointing its API endpoint to Ant Ling, you can use Ling models directly in OpenClaw for code completion, generation, explanation, and more.
What You Need
- Create API Key
- Install OpenClaw
Configuration
Run the following command to launch the interactive setup wizard:
openclaw onboard

Follow the prompts to complete each step:
1. Select Custom Provider as the model provider

   ◇ Model/auth provider
   │ Custom Provider

2. Enter https://api.ant-ling.com/v1 as the API Base URL

   ◇ API Base URL
   │ https://api.ant-ling.com/v1

3. Select Paste API key now as the API Key input method, then go to the API Console to create and copy your Ling API Key

   ◇ How do you want to provide this API key?
   │ Paste API key now
   ◇ API Key (leave blank if not required)
   │ <YOUR_API_KEY>

4. Select OpenAI-compatible as the endpoint compatibility mode

   ◇ Endpoint compatibility
   │ OpenAI-compatible

5. Enter the model ID. The example uses Ling-2.6-flash; change to Ling-2.6-1T, Ling-2.5-1T, or Ring-1T as needed

   ◇ Model ID
   │ Ling-2.6-flash

6. Verification succeeds. Enter an Endpoint ID and an optional model alias to complete setup

   ◇ Verification successful.
   │
   ◇ Endpoint ID
   │ ant-ling
   │
   ◇ Model alias (optional)
   │ Ling-2.6-flash

   Configured custom provider: ant-ling/Ling-2.6-flash

FAQ
Connection Failed / 401 Error
- Confirm your API Key has been created and is active in the API Console
- Confirm the Base URL is https://api.ant-ling.com/v1 with no trailing slash
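If both checks pass and the error persists, you can test the key and Base URL outside OpenClaw. Because the endpoint is OpenAI-compatible, a plain chat-completions request is enough. A minimal sketch using only the Python standard library (the payload follows the standard OpenAI request shape; fill in your own key):

```python
import json
from urllib.request import Request

BASE_URL = "https://api.ant-ling.com/v1"   # from the setup wizard, no trailing slash
API_KEY = "<YOUR_API_KEY>"                 # create one in the API Console

# Standard OpenAI-compatible chat-completions payload.
payload = {
    "model": "Ling-2.6-flash",
    "messages": [{"role": "user", "content": "Say hello"}],
}

request = Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) sends it; a 200 response with a "choices"
# array confirms the endpoint and key work, while a 401 points at the key.
```

If this request also returns 401, the problem is with the key itself rather than with OpenClaw's configuration.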
Incorrect Model Name
Model names are case-sensitive. Make sure the name matches the documentation exactly, e.g. Ling-2.6-flash, not ling-2.6-flash.
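Because a single wrong letter case makes the model ID invalid, a strict lookup can catch typos before the request is sent. A sketch (the set below contains only the four IDs named on this page; the real catalogue may be larger):

```python
# Model IDs mentioned on this page; matching is case-sensitive.
KNOWN_MODELS = {"Ling-2.6-flash", "Ling-2.6-1T", "Ling-2.5-1T", "Ring-1T"}

def check_model_id(model_id: str) -> bool:
    """Case-sensitive membership test: ling-2.6-flash is NOT Ling-2.6-flash."""
    return model_id in KNOWN_MODELS

assert check_model_id("Ling-2.6-flash")      # exact ID: accepted
assert not check_model_id("ling-2.6-flash")  # lower-cased variant: rejected
```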
Slow Response
Enable streaming output (Stream) so tokens are displayed as they are generated, which makes responses feel much faster.
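Streaming does not make the model finish sooner; it delivers the answer as incremental deltas so the first tokens appear immediately. A sketch of how OpenAI-style stream chunks are assembled on the client side (the chunk dicts are simplified stand-ins for real server-sent events):

```python
# Each streamed chunk carries a small "delta" rather than the full message.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", "}}]},
    {"choices": [{"delta": {"content": "world"}}]},
    {"choices": [{"delta": {}}]},  # final chunk: no content, stream is done
]

# Concatenate the deltas in arrival order to rebuild the full reply.
reply = "".join(
    chunk["choices"][0]["delta"].get("content", "") for chunk in chunks
)
print(reply)  # Hello, world
```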