Use any LLM-Model via OpenRouter
Engage with any large language model available through OpenRouter, using n8n's automation capabilities to create dynamic AI-driven interactions. The workflow receives a chat message, processes it through an AI Agent, and generates a response with the LLM model specified in the Settings node, while chat memory preserves conversational context. Businesses can deploy it for advanced customer-support chatbots, interactive content generation, or personalized user experiences, experimenting with cutting-edge models without building complex infrastructure. It reduces the development time and technical overhead of integrating and managing multiple LLMs, providing a flexible, scalable foundation for AI-powered applications.
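Under the hood, the workflow's LLM Model node simply points an OpenAI-style chat node at OpenRouter, which exposes an OpenAI-compatible API. As a rough sketch of what that call looks like outside n8n (the helper function and variable names here are illustrative, not part of the workflow):

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Assemble the URL, headers, and payload for an OpenRouter chat call.

    OpenRouter's API is OpenAI-compatible, so the payload mirrors the
    OpenAI chat-completions format; only the base URL and key differ.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        # Any model slug from https://openrouter.ai/models works here,
        # e.g. the workflow's default "deepseek/deepseek-r1-distill-llama-8b".
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return OPENROUTER_URL, headers, payload

# To actually send it (requires a valid OpenRouter API key):
# url, headers, payload = build_chat_request(
#     "deepseek/deepseek-r1-distill-llama-8b", "Hello!", "YOUR_KEY")
# req = urllib.request.Request(url, json.dumps(payload).encode(), headers)
# reply = json.loads(urllib.request.urlopen(req).read())
# print(reply["choices"][0]["message"]["content"])
```

Because the model name is just a string in the payload, the workflow can make it a plain Set-node field and swap models without touching the rest of the graph.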
Workflow JSON
{"id": "VhN3CX6QPBkX77pZ", "meta": {"instanceId": "98bf0d6aef1dd8b7a752798121440fb171bf7686b95727fd617f43452393daa3", "templateCredsSetupCompleted": true}, "name": "Use any LLM-Model via OpenRouter", "tags": [{"id": "uumvgGHY5e6zEL7V", "name": "Published Template", "createdAt": "2025-02-10T11:18:10.923Z", "updatedAt": "2025-02-10T11:18:10.923Z"}], "nodes": [{"id": "b72721d2-bce7-458d-8ff1-cc9f6d099aaf", "name": "Settings", "type": "n8n-nodes-base.set", "position": [-420, -640], "parameters": {"options": {}, "assignments": {"assignments": [{"id": "3d7f9677-c753-4126-b33a-d78ef701771f", "name": "model", "type": "string", "value": "deepseek/deepseek-r1-distill-llama-8b"}, {"id": "301f86ec-260f-4d69-abd9-bde982e3e0aa", "name": "prompt", "type": "string", "value": "={{ $json.chatInput }}"}, {"id": "a9f65181-902d-48f5-95ce-1352d391a056", "name": "sessionId", "type": "string", "value": "={{ $json.sessionId }}"}]}}, "typeVersion": 3.4}, {"id": "a4593d64-e67a-490e-9cb4-936cc46273a0", "name": "Sticky Note", "type": "n8n-nodes-base.stickyNote", "position": [-460, -740], "parameters": {"width": 180, "height": 400, "content": "## Settings\nSpecify the model"}, "typeVersion": 1}, {"id": "3ea3b09a-0ab7-4e0f-bb4f-3d807d072d4e", "name": "Sticky Note1", "type": "n8n-nodes-base.stickyNote", "position": [-240, -740], "parameters": {"color": 3, "width": 380, "height": 400, "content": "## Run LLM\nUsing OpenRouter to make model fully configurable"}, "typeVersion": 1}, {"id": "19d47fcb-af37-4daa-84fd-3f43ffcb90ff", "name": "When chat message received", "type": "@n8n/n8n-nodes-langchain.chatTrigger", "position": [-660, -640], "webhookId": "71f56e44-401f-44ba-b54d-c947e283d034", "parameters": {"options": {}}, "typeVersion": 1.1}, {"id": "f5a793f2-1e2f-4349-a075-9b9171297277", "name": "AI Agent", "type": "@n8n/n8n-nodes-langchain.agent", "position": [-180, -640], "parameters": {"text": "={{ $json.prompt }}", "options": {}, "promptType": "define"}, "typeVersion": 1.7}, {"id": 
"dbbd9746-ca25-4163-91c5-a9e33bff62a4", "name": "Chat Memory", "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow", "position": [-80, -460], "parameters": {"sessionKey": "={{ $json.sessionId }}", "sessionIdType": "customKey"}, "typeVersion": 1.3}, {"id": "ef368cea-1b38-455b-b46a-5d0ef7a3ceb3", "name": "LLM Model", "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi", "position": [-200, -460], "parameters": {"model": "={{ $json.model }}", "options": {}}, "credentials": {"openAiApi": {"id": "", "name": "[Your openAiApi]"}}, "typeVersion": 1.1}, {"id": "32601e76-0979-4690-8dcf-149ddbf61983", "name": "Sticky Note2", "type": "n8n-nodes-base.stickyNote", "position": [-460, -320], "parameters": {"width": 600, "height": 240, "content": "## Model examples\n\n* openai/o3-mini\n* google/gemini-2.0-flash-001\n* deepseek/deepseek-r1-distill-llama-8b\n* mistralai/mistral-small-24b-instruct-2501:free\n* qwen/qwen-turbo\n\nFor more see https://openrouter.ai/models"}, "typeVersion": 1}], "active": false, "pinData": {}, "settings": {"executionOrder": "v1"}, "versionId": "6d0caf5d-d6e6-4059-9211-744b0f4bc204", "connections": {"Settings": {"main": [[{"node": "AI Agent", "type": "main", "index": 0}]]}, "LLM Model": {"ai_languageModel": [[{"node": "AI Agent", "type": "ai_languageModel", "index": 0}]]}, "Chat Memory": {"ai_memory": [[{"node": "AI Agent", "type": "ai_memory", "index": 0}]]}, "When chat message received": {"main": [[{"node": "Settings", "type": "main", "index": 0}]]}}}
How to Import This Workflow
1. Copy the workflow JSON above using the Copy Workflow JSON button.
2. Open your n8n instance and go to Workflows.
3. Click Import from JSON and paste the copied workflow.
Don't have an n8n instance? Start your free trial at n8nautomation.cloud
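If you prefer to import programmatically, n8n instances with the public REST API enabled accept workflow creation via `POST /api/v1/workflows` with an `X-N8N-API-KEY` header. A minimal sketch (the `import_workflow` helper and the field allow-list are illustrative; the API rejects read-only export fields such as `id` and `tags`, so they are stripped first):

```python
import json
import urllib.request

def import_workflow(base_url: str, api_key: str, workflow: dict) -> urllib.request.Request:
    """Build a POST request that creates a workflow via n8n's public REST API.

    The create endpoint accepts only a subset of the exported JSON, so
    read-only keys (id, tags, versionId, pinData, ...) are dropped.
    """
    allowed = {"name", "nodes", "connections", "settings"}
    body = {k: v for k, v in workflow.items() if k in allowed}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/v1/workflows",
        data=json.dumps(body).encode(),
        headers={"X-N8N-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# req = import_workflow("https://your-instance.example.com", "YOUR_API_KEY",
#                       json.loads(workflow_json))
# urllib.request.urlopen(req)  # returns the created workflow on success
```

Remember to reconnect your OpenRouter credentials on the LLM Model node after importing, since credential IDs are not portable between instances.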
Related Templates
Text to Speech (OpenAI)
Converts text into natural-sounding speech using OpenAI's Text-to-Speech API. It sends your input text to OpenAI and receives an audio file in return. This is useful for creating audio versions of articles, generating voiceovers for videos, or providing accessibility features for web content. Quickly transform written content into engaging audio.
LangChain - Example - Code Node Example
Explore a basic LangChain agent that answers questions using a custom tool. This workflow connects n8n's AI nodes and custom code nodes to OpenAI for language model interactions. It's useful for developers building custom AI assistants or researchers experimenting with agentic workflows. This saves development time by providing a ready-to-use example of a LangChain agent.
AI-Powered Candidate Shortlisting Automation for ERPNext
Automate AI-powered candidate shortlisting for ERPNext job applications. This workflow connects ERPNext, Google Gemini, WhatsApp, and Outlook to process resumes, evaluate candidates, and communicate outcomes. Recruiters and HR departments can use this to efficiently screen applicants, automatically reject unqualified candidates, and send acceptance notifications. It significantly reduces manual review time and streamlines the hiring process.