RAG on living data
Automate the creation and maintenance of a Retrieval-Augmented Generation (RAG) system over your live Notion data with this n8n workflow. The automation connects Notion, OpenAI, and Supabase to keep a vector database in sync with your most current information, so AI responses always draw on your organization's living knowledge base.

On a schedule, the workflow fetches recently updated pages from Notion, retrieves each page's blocks, concatenates them into a single string, and generates vector representations with OpenAI embeddings. Before inserting, it deletes any existing embeddings for the page (matched on the Notion page ID stored in the chunk metadata), so the Supabase vector store never serves stale chunks.

A separate path handles real-time RAG queries: when a chat message arrives via the Chat Trigger, the Question and Answer Chain retrieves relevant chunks from the Supabase Vector Store, and the OpenAI Chat Model uses them to generate accurate, context-aware answers.

This system suits businesses and teams that need up-to-date answers to complex questions, want to onboard new employees with dynamic documentation, or plan to build chatbots that always reference the latest internal policies and procedures. By automating the entire RAG pipeline, the workflow removes the manual effort of maintaining a knowledge base, keeps AI responses current, and gives users instant access to accurate information.
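The refresh logic described above (delete any existing chunks for a page, then re-chunk and re-embed) can be sketched in plain Python. This is a minimal sketch, not the workflow itself: the in-memory table stands in for Supabase, `embed()` for the OpenAI embeddings node, and `chunk()` for the Token Splitter (chunkSize: 500); all three helpers are assumptions for illustration.

```python
def embed(text: str) -> list[float]:
    # Placeholder for the OpenAI embeddings call (assumption, not the real API).
    return [float(len(text)), float(sum(map(ord, text)) % 97)]

def chunk(text: str, size: int = 500) -> list[str]:
    # Rough stand-in for the Token Splitter node (chunkSize: 500),
    # splitting by characters instead of tokens.
    return [text[i:i + size] for i in range(0, len(text), size)] or [""]

def refresh_page(table: list[dict], page_id: str, name: str, content: str) -> None:
    # Step 1: delete old embeddings if they exist, filtering on the stored
    # metadata id (the workflow uses metadata->>id=eq.<page_id> in Supabase).
    table[:] = [row for row in table if row["metadata"]["id"] != page_id]
    # Step 2: split, embed, and insert each chunk together with its metadata.
    for piece in chunk(content):
        table.append({
            "content": piece,
            "embedding": embed(piece),
            "metadata": {"id": page_id, "name": name},
        })

store: list[dict] = []
refresh_page(store, "page-1", "Onboarding", "Welcome to the team. " * 50)
refresh_page(store, "page-1", "Onboarding", "Updated policy text. " * 50)
```

Deleting by metadata id before inserting is what lets a page that was split into many chunks be replaced as a whole when it changes in Notion, instead of accumulating stale fragments.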
Workflow JSON
{"id": "JxFP8FJ2W7e4Kmqn", "meta": {"instanceId": "fb8bc2e315f7f03c97140b30aa454a27bc7883a19000fa1da6e6b571bf56ad6d", "templateCredsSetupCompleted": true}, "name": "RAG on living data", "tags": [], "nodes": [{"id": "49086cdf-a38c-4cb8-9be9-d3e6ea5bdde5", "name": "Embeddings OpenAI", "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi", "position": [1740, 1040], "parameters": {"options": {}}, "credentials": {"openAiApi": {"id": "", "name": "[Your openAiApi]"}}, "typeVersion": 1}, {"id": "f0670721-92f4-422a-99c9-f9c2aa6fe21f", "name": "Token Splitter", "type": "@n8n/n8n-nodes-langchain.textSplitterTokenSplitter", "position": [2380, 540], "parameters": {"chunkSize": 500}, "typeVersion": 1}, {"id": "fe80ecac-4f79-4b07-ad8e-60ab5f980cba", "name": "Loop Over Items", "type": "n8n-nodes-base.splitInBatches", "position": [1180, -200], "parameters": {"options": {}}, "typeVersion": 3}, {"id": "81b79248-08e8-4214-872b-1796e51ad0a4", "name": "Question and Answer Chain", "type": "@n8n/n8n-nodes-langchain.chainRetrievalQa", "position": [744, 495], "parameters": {"options": {}}, "typeVersion": 1.3}, {"id": "e78f7b63-baef-4834-8f1b-aecfa9102d6c", "name": "Vector Store Retriever", "type": "@n8n/n8n-nodes-langchain.retrieverVectorStore", "position": [844, 715], "parameters": {}, "typeVersion": 1}, {"id": "1d5ffbd0-b2cf-4660-a291-581d18608ecd", "name": "OpenAI Chat Model", "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi", "position": [704, 715], "parameters": {"model": "gpt-4o", "options": {}}, "credentials": {"openAiApi": {"id": "", "name": "[Your openAiApi]"}}, "typeVersion": 1}, {"id": "37a3063f-aa21-4347-a72f-6dd316c58366", "name": "When chat message received", "type": "@n8n/n8n-nodes-langchain.chatTrigger", "position": [524, 495], "webhookId": "74479a54-418f-4de2-b70d-cfb3e3fdd5a7", "parameters": {"public": true, "options": {}}, "typeVersion": 1.1}, {"id": "5924bc01-1694-4b5c-8a06-7c46ee4c6425", "name": "Schedule Trigger", "type": "n8n-nodes-base.scheduleTrigger", "position": [520, 
-200], "parameters": {"rule": {"interval": [{"field": "minutes", "minutesInterval": 1}]}}, "typeVersion": 1.2}, {"id": "5067eda6-8bbe-407a-a6af-93e81be53661", "name": "Sticky Note", "type": "n8n-nodes-base.stickyNote", "position": [620, 0], "parameters": {"width": 329.16412916774584, "height": 312.52803480051045, "content": "## Switch trigger (optional)\nIf you are on the cloud plan, consider switching to the Notion Trigger Node instead, to save on executions."}, "typeVersion": 1}, {"id": "33458828-484d-426b-a3d1-974a81c6162e", "name": "Limit", "type": "n8n-nodes-base.limit", "position": [1620, -60], "parameters": {}, "typeVersion": 1}, {"id": "4d39503a-378e-4942-a5d4-8c62785aac44", "name": "Limit1", "type": "n8n-nodes-base.limit", "position": [2660, -60], "parameters": {}, "typeVersion": 1}, {"id": "0e0b1391-3fe5-4d80-a2eb-a2483b79d9a6", "name": "Delete old embeddings if exist", "type": "n8n-nodes-base.supabase", "position": [1400, -60], "parameters": {"tableId": "documents", "operation": "delete", "filterType": "string", "filterString": "=metadata->>id=eq.{{ $('Input Reference').item.json.id }}"}, "credentials": {"supabaseApi": {"id": "", "name": "[Your supabaseApi]"}}, "typeVersion": 1, "alwaysOutputData": true}, {"id": "4a8614e4-0a53-4731-bc68-57505d7d0a09", "name": "Get page blocks", "type": "n8n-nodes-base.notion", "position": [1840, -60], "parameters": {"blockId": {"__rl": true, "mode": "id", "value": "={{ $('Input Reference').item.json.id }}"}, "resource": "block", "operation": "getAll", "returnAll": true, "fetchNestedBlocks": true}, "credentials": {"notionApi": {"id": "", "name": "[Your notionApi]"}}, "executeOnce": true, "typeVersion": 2.2}, {"id": "8c922895-49d6-4778-8356-6f6cf49e5420", "name": "Default Data Loader", "type": "@n8n/n8n-nodes-langchain.documentDefaultDataLoader", "position": [2300, 260], "parameters": {"options": {"metadata": {"metadataValues": [{"name": "id", "value": "={{ $('Input Reference').item.json.id }}"}, {"name": "name", "value": 
"={{ $('Input Reference').item.json.name }}"}]}}}, "typeVersion": 1}, {"id": "8ad7ff2e-4bc2-4821-ae03-bab2dc11d947", "name": "Sticky Note1", "type": "n8n-nodes-base.stickyNote", "position": [2220, 400], "parameters": {"width": 376.2098538932132, "height": 264.37628764336097, "content": "## Adjust chunk size and overlap\nFor more accurate search results, increase the overlap. For the *text-embedding-ada-002* model the chunk size plus overlap must not exceed 8191"}, "typeVersion": 1}, {"id": "8078d59a-f45f-4e96-a8ec-6c2f1c328e84", "name": "Input Reference", "type": "n8n-nodes-base.noOp", "position": [960, -200], "parameters": {}, "typeVersion": 1}, {"id": "aae6c517-a316-40e3-aee9-1cc4b448689f", "name": "Notion Trigger", "type": "n8n-nodes-base.notionTrigger", "disabled": true, "position": [740, 120], "parameters": {"event": "pagedUpdatedInDatabase", "pollTimes": {"item": [{"mode": "everyMinute"}]}, "databaseId": {"__rl": true, "mode": "list", "value": "ec6dc7b4-9ce0-47f7-8025-ef09295999fd", "cachedResultUrl": "https://www.notion.so/ec6dc7b49ce047f78025ef09295999fd", "cachedResultName": "Knowledge Base"}}, "credentials": {"notionApi": {"id": "", "name": "[Your notionApi]"}}, "typeVersion": 1}, {"id": "3a43d66d-d4e3-4ca1-aee9-85ac65160e45", "name": "Get updated pages", "type": "n8n-nodes-base.notion", "position": [740, -200], "parameters": {"filters": {"conditions": [{"key": "Last edited time|last_edited_time", "condition": "equals", "lastEditedTime": "={{ $now.minus(1, 'minutes').toISO() }}"}]}, "options": {}, "resource": "databasePage", "operation": "getAll", "databaseId": {"__rl": true, "mode": "list", "value": "ec6dc7b4-9ce0-47f7-8025-ef09295999fd", "cachedResultUrl": "https://www.notion.so/ec6dc7b49ce047f78025ef09295999fd", "cachedResultName": "Knowledge Base"}, "filterType": "manual"}, "credentials": {"notionApi": {"id": "", "name": "[Your notionApi]"}}, "typeVersion": 2.2}, {"id": "bbf1296f-4e2b-4a38-bdf3-ae2b63cc7774", "name": "Sticky Note23", "type": 
"n8n-nodes-base.stickyNote", "position": [900, -300], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "This placeholder serves as a reference point so it is easier to swap the data source with a different service"}, "typeVersion": 1}, {"id": "631e1e10-0b52-4a17-89a4-769ac563321f", "name": "Sticky Note24", "type": "n8n-nodes-base.stickyNote", "position": [1340, -160], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "All chunks of a previous version of the document are being deleted by filtering the meta data by the given ID"}, "typeVersion": 1}, {"id": "6c830c83-4b70-4719-8e2a-26846e60085c", "name": "Sticky Note25", "type": "n8n-nodes-base.stickyNote", "position": [1560, -160], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Reduce the active streams/items to just 1 to prevent the following nodes from double-processing"}, "typeVersion": 1}, {"id": "46c8e4e4-0a5e-4ede-947b-5773710d4e55", "name": "Sticky Note26", "type": "n8n-nodes-base.stickyNote", "position": [1780, -160], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Retrieve all page contents/blocks"}, "typeVersion": 1}, {"id": "0369e610-d074-4812-9d04-8615b42965a5", "name": "Sticky Note27", "type": "n8n-nodes-base.stickyNote", "position": [2600, -160], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Reduce the active streams/items to just 1 to prevent the following nodes from double-processing"}, "typeVersion": 1}, {"id": "4f3bce54-1650-45fa-abb0-c881358c7e8d", "name": "Sticky Note28", "type": "n8n-nodes-base.stickyNote", "position": [2220, -160], "parameters": {"color": 7, "width": 375.9283286479995, "height": 275.841854198618, "content": "Embed item and store in Vector Store. 
Depending on the length the content is being split up into multiple chunks/embeds"}, "typeVersion": 1}, {"id": "44125921-e068-4a5d-a56b-b0e63c103556", "name": "Supabase Vector Store1", "type": "@n8n/n8n-nodes-langchain.vectorStoreSupabase", "position": [924, 935], "parameters": {"options": {}, "tableName": {"__rl": true, "mode": "list", "value": "documents", "cachedResultName": "documents"}}, "credentials": {"supabaseApi": {"id": "", "name": "[Your supabaseApi]"}}, "typeVersion": 1}, {"id": "467322a9-949d-4569-aac6-92196da46ba5", "name": "Sticky Note30", "type": "n8n-nodes-base.stickyNote", "position": [460, 400], "parameters": {"color": 7, "width": 730.7522093855692, "height": 668.724737081502, "content": "Simple chat bot to ask specific questions while having access to the context of the Notion Knowledge Base which was stored in the Vector Store"}, "typeVersion": 1}, {"id": "27f078cf-b309-4dd1-a8ce-b4fc504d6e29", "name": "Sticky Note31", "type": "n8n-nodes-base.stickyNote", "position": [1660, 900], "parameters": {"color": 7, "width": 219.31927574471658, "height": 275.841854198618, "content": "Model used for both creating and reading embeddings"}, "typeVersion": 1}, {"id": "2f59cba1-4318-47e7-bf0b-b908d4186b86", "name": "Supabase Vector Store", "type": "@n8n/n8n-nodes-langchain.vectorStoreSupabase", "position": [2280, -60], "parameters": {"mode": "insert", "options": {}, "tableName": {"__rl": true, "mode": "list", "value": "documents", "cachedResultName": "documents"}}, "credentials": {"supabaseApi": {"id": "", "name": "[Your supabaseApi]"}}, "typeVersion": 1}, {"id": "729849e7-0eff-40c2-ae00-ae660c1eec69", "name": "Sticky Note32", "type": "n8n-nodes-base.stickyNote", "position": [1120, -300], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Process each page/document separately."}, "typeVersion": 1}, {"id": "3f632a24-ca0a-45c4-801d-041aa3f887a7", "name": "Sticky Note29", "type": "n8n-nodes-base.stickyNote", 
"position": [2220, 120], "parameters": {"color": 7, "width": 376.0759088111347, "height": 275.841854198618, "content": "Store additional meta data with each embed, especially the Notion ID, which can be later used to find all belonging entries of one page, even if they got split into multiple embeds."}, "typeVersion": 1}, {"id": "ffaf3861-5287-4f57-8372-09216a18cb4d", "name": "Sticky Note33", "type": "n8n-nodes-base.stickyNote", "position": [460, -300], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Using a manual approach for polling data from Notion for more accuracy."}, "typeVersion": 1}, {"id": "cbbedfc0-4d64-42a6-8f55-21e04887305f", "name": "Sticky Note34", "type": "n8n-nodes-base.stickyNote", "position": [680, -300], "parameters": {"width": 216.47293010628914, "height": 275.841854198618, "content": "## Select Database\nChoose the database which represents your Knowledge Base"}, "typeVersion": 1}, {"id": "8b6767f2-1bc9-42fb-b319-f39f6734b9f2", "name": "Sticky Note35", "type": "n8n-nodes-base.stickyNote", "position": [2000, -160], "parameters": {"color": 7, "width": 216.47293010628914, "height": 275.841854198618, "content": "Combine all contents to a single text formatted into one line which can be easily stored as an embed"}, "typeVersion": 1}, {"id": "cdff1756-77d7-421e-8672-25c9862840b0", "name": "Concatenate to single string", "type": "n8n-nodes-base.summarize", "position": [2060, -60], "parameters": {"options": {}, "fieldsToSummarize": {"values": [{"field": "content", "separateBy": "\n", "aggregation": "concatenate"}]}}, "typeVersion": 1}], "active": false, "pinData": {}, "settings": {"executionOrder": "v1"}, "versionId": "51075175-868a-4a3a-9580-5ad55e25ac71", "connections": {"Limit": {"main": [[{"node": "Get page blocks", "type": "main", "index": 0}]]}, "Limit1": {"main": [[{"node": "Loop Over Items", "type": "main", "index": 0}]]}, "Notion Trigger": {"main": [[{"node": "Input Reference", "type": "main", 
"index": 0}]]}, "Token Splitter": {"ai_textSplitter": [[{"node": "Default Data Loader", "type": "ai_textSplitter", "index": 0}]]}, "Get page blocks": {"main": [[{"node": "Concatenate to single string", "type": "main", "index": 0}]]}, "Input Reference": {"main": [[{"node": "Loop Over Items", "type": "main", "index": 0}]]}, "Loop Over Items": {"main": [[], [{"node": "Delete old embeddings if exist", "type": "main", "index": 0}]]}, "Schedule Trigger": {"main": [[{"node": "Get updated pages", "type": "main", "index": 0}]]}, "Embeddings OpenAI": {"ai_embedding": [[{"node": "Supabase Vector Store", "type": "ai_embedding", "index": 0}, {"node": "Supabase Vector Store1", "type": "ai_embedding", "index": 0}]]}, "Get updated pages": {"main": [[{"node": "Input Reference", "type": "main", "index": 0}]]}, "OpenAI Chat Model": {"ai_languageModel": [[{"node": "Question and Answer Chain", "type": "ai_languageModel", "index": 0}]]}, "Default Data Loader": {"ai_document": [[{"node": "Supabase Vector Store", "type": "ai_document", "index": 0}]]}, "Supabase Vector Store": {"main": [[{"node": "Limit1", "type": "main", "index": 0}]]}, "Supabase Vector Store1": {"ai_vectorStore": [[{"node": "Vector Store Retriever", "type": "ai_vectorStore", "index": 0}]]}, "Vector Store Retriever": {"ai_retriever": [[{"node": "Question and Answer Chain", "type": "ai_retriever", "index": 0}]]}, "When chat message received": {"main": [[{"node": "Question and Answer Chain", "type": "main", "index": 0}]]}, "Concatenate to single string": {"main": [[{"node": "Supabase Vector Store", "type": "main", "index": 0}]]}, "Delete old embeddings if exist": {"main": [[{"node": "Limit", "type": "main", "index": 0}]]}}}How to Import This Workflow
1. Copy the workflow JSON above using the Copy Workflow JSON button.
2. Open your n8n instance and go to Workflows.
3. Click Import from JSON and paste the copied workflow.
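Once imported, the chat path behaves like a standard retrieval-QA chain: the incoming question is matched against the stored chunks, and the best matches become the model's context. A minimal sketch, using word overlap as a toy stand-in for embedding cosine similarity; the helper names (`retrieve`, `build_prompt`) are illustrative and not part of the workflow:

```python
import re

def tokens(text: str) -> set[str]:
    # Lowercased word tokens for the toy similarity measure.
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def score(doc: str, question: str) -> float:
    # Jaccard word overlap: a placeholder for cosine similarity of embeddings.
    d, q = tokens(doc), tokens(question)
    return len(d & q) / len(d | q) if d | q else 0.0

def retrieve(rows: list[str], question: str, k: int = 2) -> list[str]:
    # Return the k best-matching chunks, like the Vector Store Retriever.
    return sorted(rows, key=lambda r: score(r, question), reverse=True)[:k]

def build_prompt(question: str, context: list[str]) -> str:
    # Stand-in for the Question and Answer Chain's prompt assembly.
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {question}"

rows = [
    "Vacation policy: employees get 25 days of paid leave per year.",
    "Expense policy: submit receipts within 30 days.",
    "Office hours: the building opens at 8am.",
]
question = "How many days of paid vacation leave do employees get?"
context = retrieve(rows, question)
prompt = build_prompt(question, context)
```

In the real workflow the same OpenAI embeddings model scores similarity on both the write and read paths, which is why one Embeddings OpenAI node feeds both Supabase Vector Store nodes.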
Don't have an n8n instance? Start your free trial at n8nautomation.cloud