LangChain

Using OpenRouter with LangChain

LangChain provides a standard interface for working with chat models. You can use OpenRouter with LangChain through the dedicated ChatOpenRouter integration packages. For more details on LangChain's model interface, see the LangChain Models documentation.

import { ChatOpenRouter } from "@langchain/openrouter";

const model = new ChatOpenRouter(
  "anthropic/claude-sonnet-4.6",
  { temperature: 0.8 }
);

// Example usage
const response = await model.invoke([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello, how are you?" },
]);

For full documentation, including streaming, tool calling, structured output, reasoning, multimodal inputs, provider routing, and more, see the LangChain integration guides.
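As one illustration of the streaming pattern, chunks arriving from a chat model's stream are typically accumulated into the full response text. The sketch below is dependency-free: `fakeStream` stands in for what a call like `model.stream(messages)` would yield, and the minimal `{ content: string }` chunk shape is an assumption for illustration, not the integration's actual type.

```typescript
// Minimal chunk shape assumed for this sketch.
interface Chunk {
  content: string;
}

// Stand-in for a real model stream, which yields many small deltas.
async function* fakeStream(): AsyncGenerator<Chunk> {
  yield { content: "Hello" };
  yield { content: ", " };
  yield { content: "world" };
}

// Accumulate streamed deltas into the complete response text.
async function collectStream(stream: AsyncIterable<Chunk>): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    full += chunk.content; // append each delta as it arrives
  }
  return full;
}

collectStream(fakeStream()).then((text) => console.log(text));
```

In a real application you would usually act on each chunk as it arrives (e.g. render it to the UI) rather than only after the loop completes.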