Create children's AI story videos from drawings and auto-publish to YouTube with Blotato
🎥 From Drawing to Story: Auto-Publish AI Video to YouTube with Blotato

## Overview

Transform a hand-drawn character sketch into a fully animated, narrated video story, automatically. This 3-part pipeline uses Claude AI, image generation, and video synthesis to go from a simple drawing to a publish-ready video, with no manual editing required.

Perfect for: indie creators, educators, storytellers, and anyone who wants to bring hand-drawn characters to life at scale.

## How It Works

### Part 1: From Drawing to Story (Bringing Characters to Life)
- A form submission triggers the workflow with an uploaded drawing
- The image is analyzed by Google Gemini to extract characters and their traits
- Character images are generated via Nano Banana (Gemini image generation)
- A full story is written by Claude AI, split into scenes, and passed to Part 2

### Part 2: From Characters to Scenes (Rendering the Visual Story)
- Character images are downloaded and converted to Base64 references
- Scene images are generated with Nano Banana, using the character references for visual consistency
- Scene image URLs are mapped and the video pipeline is triggered

### Part 3: From Scenes to Screen (Video, Narration & Final Render)
- Video prompts and narration context are generated by Claude AI
- Videos are generated via AtlasCloud (Kling 3.0 Pro) with a polling loop
- Narration audio is created with ElevenLabs and uploaded
- Shotstack assembles the final video with synchronized audio
- The final video is published to YouTube (and optionally TikTok) via Blotato

> ⚠️ Important: Workflow Structure
>
> This template is split into 3 separate workflows.
> Each part must be imported and deployed as its own workflow in n8n.
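The character-consistency trick in Part 2 works by attaching each character's reference image inline alongside the scene prompt. The sketch below mirrors the workflow's "Build Scene Image Payloads" Code node; the helper name and sample values are illustrative, but the payload shape matches what the workflow sends to the Gemini image endpoint:

```javascript
// Build one Gemini ("Nano Banana") request per scene: the scene prompt
// first, then one inline base64 image per character used in that scene.
function buildScenePayload(chunk, characterBase64Map) {
  const parts = [{ text: chunk.image_prompt }];

  for (const charIdx of chunk.characters_used) {
    const char = characterBase64Map.find(c => c.character_index === charIdx);
    if (char) {
      parts.push({
        inline_data: { mime_type: char.mimetype, data: char.base64 },
      });
    }
  }

  return {
    contents: [{ parts }],
    generationConfig: {
      responseModalities: ["TEXT", "IMAGE"],
      imageConfig: { aspectRatio: "16:9", imageSize: "1K" },
    },
  };
}

// Example: a scene featuring character 1 only.
const payload = buildScenePayload(
  { image_prompt: "A fox splashing by a river", characters_used: [1] },
  [{ character_index: 1, base64: "abc123", mimetype: "image/png" }]
);
```

Because every scene request carries the same reference images, the model keeps each character's look stable across all scenes of the story.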
> 📺 Watch the step-by-step tutorial to set everything up correctly: @youtube

## Requirements

Credentials needed:
- Blotato API credentials (YouTube/TikTok publishing)
- AtlasCloud API key (Kling 3.0 Pro video generation)
- Anthropic API key (Claude AI for story and prompt generation)
- Google Gemini API key (image analysis and Nano Banana image generation)
- ElevenLabs API key (narration audio)
- Shotstack API key (video assembly)
- Cloudinary credentials (image and audio hosting)

Setup steps:
1. Configure credentials for each service above in n8n
2. Set up the form trigger with a file upload field for the drawing
3. Deploy the 3 workflows in order and connect them via webhooks
4. Run a test submission with a simple sketch to validate the full pipeline

## Support

- 📩 Contact: LinkedIn
- 📺 YouTube: @DRFIRASS
- Workshops: Mes Ateliers n8n

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / Mes Ateliers n8n
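When connecting the workflows via webhooks, you can test Part 2 in isolation by POSTing a payload of the shape Part 1 produces. The field names below match the workflow's "Build Scene Payload" node; the webhook URL and all sample values are placeholders you would swap for your own:

```javascript
// Illustrative Part 1 -> Part 2 handoff payload (sample values only).
const part2Payload = {
  title: "The Brave Little Fox",
  characters: [
    { character_index: 1, image_url: "https://example.com/fox.png" },
  ],
  chunks: [
    {
      chunk_number: 1,
      script: "Once upon a time, a little fox found a shiny river.",
      image_prompt:
        "Using the attached character images as reference, create a scene showing...",
      duration_seconds: 10,
      characters_used: [1],
      character_images: ["https://example.com/fox.png"],
    },
  ],
};

// Once Part 2 is deployed, send a test request to its production
// webhook URL (placeholder below):
// fetch("https://YOUR-INSTANCE.app.n8n.cloud/webhook/YOUR-PATH", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(part2Payload),
// });
```

Note that n8n test webhooks (`/webhook-test/...`) only listen while you are executing the workflow manually; switch to the production webhook path once everything works.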
## Workflow JSON
{"id":"yfIi4CChMQkYHTfj","meta":{"instanceId":"de822f81f3a2367cef7d9549771a77783236bc9596481be2ae65c05fbcc4b4fd","templateCredsSetupCompleted":true},"name":"workflow1 copy - MEF","tags":[],"nodes":[{"id":"c0d8791a-573e-4803-94ef-78983a97c250","name":"On form submission","type":"n8n-nodes-base.formTrigger","position":[-1136,-224],"webhookId":"df73cf22-0e12-4c4d-b7a6-cf50dfaa5044","parameters":{"options":{},"formTitle":"SketchTales","formFields":{"values":[{"fieldName":"Image","fieldType":"file","fieldLabel":"Upload your child's drawing","multipleFiles":false,"requiredField":true}]}},"typeVersion":2.4},{"id":"67676d60-5dfa-4f0a-9fdc-614c95ba445e","name":"Analyze an image","type":"@n8n/n8n-nodes-langchain.googleGemini","position":[-512,-224],"parameters":{"text":"You are an expert at understanding children's artwork. Analyze this drawing and extract ALL characters/creatures/people you see.\\n\\nFor each character give:\\n- A fun simple name a child would use\\n- Detailed visual description (colors, size, features, clothing, expression)\\n\\nAlso extract:\\n- Setting (where is this?)\\n- Mood (happy, adventurous, silly, etc.)\\n\\nRespond ONLY in this exact JSON format, no markdown, no backticks:\\n{\\\"characters\\\": [{\\\"name\\\": \\\"...\\\", \\\"description\\\": \\\"...\\\"}], \\\"setting\\\": \\\"...\\\", \\\"mood\\\": \\\"...\\\"}","modelId":{"__rl":true,"mode":"list","value":"models/gemini-3-pro-preview","cachedResultName":"models/gemini-3-pro-preview"},"options":{},"resource":"image","inputType":"binary","operation":"analyze"},"credentials":{"googlePalmApi":{"id":"credential-id","name":"Google Gemini(PaLM) Api account 2"}},"typeVersion":1.1},{"id":"102bdbc6-5742-4f6a-9e3c-b24ea60c77e3","name":"Edit Fields","type":"n8n-nodes-base.set","position":[-320,-224],"parameters":{"options":{},"assignments":{"assignments":[{"id":"f3431bde-ff93-412e-b836-082b3752fcf3","name":"content.parts[0].text","type":"string","value":"={{ $json.content.parts[0].text 
}}"}]}},"typeVersion":3.4},{"id":"e7d86999-0cd4-470f-bb0f-e4903d227b34","name":"Extract from File","type":"n8n-nodes-base.extractFromFile","position":[-928,-224],"parameters":{"options":{},"operation":"binaryToPropery","binaryPropertyName":"Image"},"typeVersion":1.1},{"id":"8a501e82-ba99-4de0-9a90-2594ef76cd83","name":"Convert to File","type":"n8n-nodes-base.convertToFile","position":[-720,-224],"parameters":{"options":{},"operation":"toBinary","sourceProperty":"data"},"typeVersion":1.1},{"id":"7940864c-cc04-4b49-ae9f-57c4941297fa","name":"Parse Characters","type":"n8n-nodes-base.code","position":[-112,-224],"parameters":{"jsCode":"const response = $input.first().json;\nlet text = response.content.parts[0].text;\ntext = text.replace(/```json\\n?/g, '').replace(/```\\n?/g, '').trim();\nconst data = JSON.parse(text);\n\nconst chars = data.characters;\nlet charList = '';\nlet jsonFormat = '{\\n';\n\nfor (let i = 0; i < chars.length; i++) {\n charList += 'Character ' + (i + 1) + ': ' + chars[i].name + ' β ' + chars[i].description + '\\n\\n';\n if (i < chars.length - 1) {\n jsonFormat += ' \"character_' + (i + 1) + '_prompt\": \"...\",\\n';\n } else {\n jsonFormat += ' \"character_' + (i + 1) + '_prompt\": \"...\"\\n';\n }\n}\n\njsonFormat += '}';\n\nreturn chars.map((char, i) => ({\n json: {\n character_index: i + 1,\n character_name: char.name,\n character_desc: char.description,\n character_count: chars.length\n }\n}));"},"typeVersion":2},{"id":"44a22476-14ab-4cc1-ad6c-c2d56f24d167","name":"Build Character Prompt","type":"n8n-nodes-base.code","position":[-1136,176],"parameters":{"jsCode":"const items = $input.all();\n\nlet charList = '';\nlet jsonFormat = '{\\n';\n\nfor (let i = 0; i < items.length; i++) {\n const c = items[i].json;\n charList += 'Character ' + c.character_index + ': ' + c.character_name + ' β ' + c.character_desc + '\\n\\n';\n if (i < items.length - 1) {\n jsonFormat += ' \"character_' + c.character_index + '_prompt\": \"...\",\\n';\n } else {\n 
jsonFormat += ' \"character_' + c.character_index + '_prompt\": \"...\"\\n';\n }\n}\n\njsonFormat += '}';\n\nreturn [{\n json: {\n character_count: items.length,\n character_list: charList,\n json_format: jsonFormat\n }\n}];"},"typeVersion":2},{"id":"65f98464-7b46-478d-a930-1bc6fd4ab6fb","name":"Split Character Prompts","type":"n8n-nodes-base.code","position":[-544,176],"parameters":{"jsCode":"const prompts = $input.first().json.output.prompts;\n\nreturn prompts.map(p => ({\n json: {\n character_index: p.character_index,\n character_name: p.character_name,\n prompt: p.prompt\n }\n}));"},"typeVersion":2},{"id":"8c98944a-4e07-4b08-b0b0-468d02244c75","name":"Nano Banana Payload","type":"n8n-nodes-base.code","position":[-336,176],"parameters":{"jsCode":"const items = $input.all();\nconst base64 = $('Extract from File').first().json.data;\n\nreturn items.map(item => ({\n json: {\n character_index: item.json.character_index,\n character_name: item.json.character_name,\n payload: {\n contents: [\n {\n parts: [\n { text: item.json.prompt },\n {\n inline_data: {\n mime_type: \"image/png\",\n data: base64\n }\n }\n ]\n }\n ],\n generationConfig: {\n responseModalities: [\"TEXT\", \"IMAGE\"],\n imageConfig: {\n aspectRatio: \"16:9\",\n imageSize: \"1K\"\n }\n }\n }\n }\n}));"},"typeVersion":2},{"id":"ae297634-568c-4233-a0d4-e30278e01fd6","name":"Nano Banana Generate","type":"n8n-nodes-base.httpRequest","position":[-112,176],"parameters":{"url":"https://generativelanguage.googleapis.com/v1beta/models/gemini-3-pro-image-preview:generateContent","method":"POST","options":{},"jsonBody":"={{ $json.payload }}","sendBody":true,"sendHeaders":true,"specifyBody":"json","authentication":"predefinedCredentialType","headerParameters":{"parameters":[{"name":"Content-Type","value":"application/json"}]},"nodeCredentialType":"googlePalmApi"},"credentials":{"googlePalmApi":{"id":"credential-id","name":"Google Gemini(PaLM) Api account 
2"}},"retryOnFail":true,"typeVersion":4.3,"waitBetweenTries":5000},{"id":"a9c8ccbb-c44d-43f8-8cf4-2aac28b792d3","name":"Extract Character Image","type":"n8n-nodes-base.set","position":[112,176],"parameters":{"options":{},"assignments":{"assignments":[{"id":"b5b58c86-29ff-4815-84dd-e54331612abb","name":"data","type":"string","value":"={{ $json.candidates[0].content.parts[0].inlineData.data }}"}]}},"typeVersion":3.4},{"id":"5d6e03b9-7714-4806-b217-4868a9b0ae1d","name":"Character Image to File","type":"n8n-nodes-base.convertToFile","position":[320,176],"parameters":{"options":{},"operation":"toBinary","sourceProperty":"data"},"typeVersion":1.1},{"id":"0aa4b18a-d40b-4545-a7ae-402c7ce8d21d","name":"Map Character URLs","type":"n8n-nodes-base.code","position":[736,176],"parameters":{"jsCode":"const cloudinary = $input.all();\nconst characters = $('Parse Characters').all(); // change to your code node name that has character_name\n\nreturn cloudinary.map((item, i) => ({\n json: {\n character_index: i + 1,\n character_name: characters[i].json.character_name,\n character_image_url: item.json.secure_url\n }\n}));"},"typeVersion":2},{"id":"aeeb4e87-af77-4bb8-bd1e-42715c625512","name":"Build Story Context","type":"n8n-nodes-base.code","position":[-1120,576],"parameters":{"jsCode":"const items = $input.all();\n\nlet charList = '';\nlet charMap = [];\n\nfor (const item of items) {\n const c = item.json;\n charList += c.character_name + ' (image: ' + c.character_image_url + ')\\n';\n charMap.push({\n character_index: c.character_index,\n character_name: c.character_name,\n character_image_url: c.character_image_url\n });\n}\n\nreturn [{\n json: {\n character_count: items.length,\n character_list: charList,\n character_map: charMap\n }\n}];"},"typeVersion":2},{"id":"d9e694a2-db68-4109-b0f1-63dd0a634163","name":"Split Story Chunks","type":"n8n-nodes-base.code","position":[-576,576],"parameters":{"jsCode":"const story = $input.first().json.output;\nconst charMap = $('Map Character 
URLs').all(); // change to your node name with character URLs\n\nreturn story.chunks.map(chunk => ({\n json: {\n title: story.title,\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n image_prompt: chunk.image_prompt,\n duration_seconds: chunk.duration_seconds,\n characters_used: chunk.characters_used,\n character_images: chunk.characters_used.map(i => \n charMap[i - 1].json.character_image_url\n )\n }\n}));"},"typeVersion":2},{"id":"5d98edd7-da5c-4a55-b825-caf729278015","name":"Extract Unique Character URLs","type":"n8n-nodes-base.code","position":[-368,576],"parameters":{"jsCode":"const chunks = $input.all();\nconst allUrls = new Set();\n\nfor (const chunk of chunks) {\n for (const url of chunk.json.character_images) {\n allUrls.add(url);\n }\n}\n\nreturn [...allUrls].map((url, i) => ({\n json: {\n character_index: i + 1,\n image_url: url\n }\n}));"},"typeVersion":2},{"id":"03c82920-81ad-4dab-8781-c41c4604a9d3","name":"Build Scene Payload","type":"n8n-nodes-base.code","position":[-160,576],"parameters":{"jsCode":"const story = $('Generate Story').first().json.output;\nconst charUrls = $('Extract Unique Character URLs').all();\n\nconst characters = charUrls.map(c => ({\n character_index: c.json.character_index,\n image_url: c.json.image_url\n}));\n\nconst chunks = story.chunks.map(chunk => ({\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n image_prompt: chunk.image_prompt,\n duration_seconds: chunk.duration_seconds,\n characters_used: chunk.characters_used,\n character_images: chunk.characters_used.map(i =>\n characters.find(c => c.character_index === i).image_url\n )\n}));\n\nreturn [{\n json: {\n title: story.title,\n characters: characters,\n chunks: chunks\n }\n}];"},"typeVersion":2},{"id":"e977d9c6-e243-4e9e-8304-970eea973aba","name":"Trigger Scene 
Workflow","type":"n8n-nodes-base.httpRequest","position":[48,576],"parameters":{"url":"https://dr-firass.app.n8n.cloud/webhook-test/9a2bdfcb-57ae-4a8f-a3dc-78382b9334a0","method":"POST","options":{},"jsonBody":"={{ $json }}","sendBody":true,"specifyBody":"json"},"typeVersion":4.3},{"id":"ada7addf-8329-460d-8fcf-5d8478de85f5","name":"Structured Output Parser2","type":"@n8n/n8n-nodes-langchain.outputParserStructured","position":[-768,304],"parameters":{"jsonSchemaExample":"{\n \"prompts\": [\n {\n \"character_index\": 1,\n \"character_name\": \"...\",\n \"prompt\": \"...\"\n }\n ]\n}"},"typeVersion":1.3},{"id":"bb283fd5-6ebf-4e43-8de2-04bd4f522254","name":"Anthropic Chat Model1","type":"@n8n/n8n-nodes-langchain.lmChatAnthropic","position":[-912,304],"parameters":{"model":{"__rl":true,"mode":"list","value":"claude-sonnet-4-5-20250929","cachedResultName":"Claude Sonnet 4.5"},"options":{}},"credentials":{"anthropicApi":{"id":"credential-id","name":"Anthropic account 6"}},"typeVersion":1.3},{"id":"0bad2b61-4545-4f8b-9e4a-20a7e0ff9283","name":"Structured Output Parser","type":"@n8n/n8n-nodes-langchain.outputParserStructured","position":[-784,736],"parameters":{"jsonSchemaExample":"{\n \"title\": \"...\",\n \"chunks\": [\n {\n \"chunk_number\": 1,\n \"script\": \"...\",\n \"image_prompt\": \"...\",\n \"characters_used\": [1],\n \"duration_seconds\": 10\n }\n ]\n}"},"typeVersion":1.3},{"id":"7e9113bf-d405-4278-b62c-f6864fcbc39f","name":"Anthropic Chat Model","type":"@n8n/n8n-nodes-langchain.lmChatAnthropic","position":[-928,736],"parameters":{"model":{"__rl":true,"mode":"list","value":"claude-sonnet-4-5-20250929","cachedResultName":"Claude Sonnet 4.5"},"options":{}},"credentials":{"anthropicApi":{"id":"credential-id","name":"Anthropic account 6"}},"typeVersion":1.3},{"id":"fd6cd8ee-bd3a-461c-993c-169aeddb8acc","name":"Upload an asset from file 
data1","type":"n8n-nodes-cloudinary.cloudinary","position":[544,176],"parameters":{"operation":"uploadFile","additionalFieldsFile":{}},"credentials":{"cloudinaryApi":{"id":"credential-id","name":"Cloudinary account 7"}},"typeVersion":1},{"id":"402f8d66-8cdf-4f5a-b2a5-b04d719d2f44","name":"Generate Story","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[-928,576],"parameters":{"text":"=Write a children's story using these {{ $json.character_count }} characters:\n\n{{ $json.character_list }}\n\nRespond in this EXACT JSON format:\n{\n \"title\": \"...\",\n \"chunks\": [\n {\n \"chunk_number\": 1,\n \"script\": \"narration text here\",\n \"image_prompt\": \"Using the attached character images as reference, create a scene showing...\",\n \"characters_used\": [1, 2],\n \"duration_seconds\": 10\n }\n ]\n}","batching":{},"messages":{"messageValues":[{"message":"You are a beloved children's storybook author and visual director. You write short, magical stories for ages 3-7. You will receive characters from a child's drawing. Write a story in chunks. Each chunk is about 10 seconds of narration (2-3 sentences). For each chunk you also write a detailed image generation prompt for the scene illustration. Rules: 1) Use the EXACT character names provided in the narration. 2) Each chunk specifies which characters appear by their index number. 3) Story has a beginning, a small adventure, a resolution, and a warm ending. 4) Language is simple, warm, magical β sensory details, sounds, colors, feelings. 5) Keep it to 4-6 chunks total. 6) Each image_prompt MUST start with 'Using the attached character images as reference, create a scene showing...' 7) Image style for ALL: 'soft watercolor, rounded shapes, warm lighting, children's storybook animation, Pixar meets Eric Carle'. 8) Include scene setting, character poses, expressions, actions, and mood in each image_prompt. 9) duration_seconds should match how long the narration takes to read aloud slowly. 
10) Respond ONLY valid JSON β no explanation, no markdown, no backticks."}]},"promptType":"define","hasOutputParser":true},"typeVersion":1.9},{"id":"30739bcf-70f4-44e5-9307-ce3c65e08d9e","name":"Generate Character Prompts","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[-912,160],"parameters":{"text":"=I have {{ $json.character_count }} characters. Generate an image prompt for each.\n\n{{ $json.character_list }}\n\nReturn EXACTLY {{ $json.character_count }} items in this JSON format:\n{{ $json.json_format }}","batching":{},"messages":{"messageValues":[{"message":"You are an expert prompt engineer for AI image generation. You will receive characters from a child's drawing. Write a detailed image generation prompt for EACH character that will be sent to an image-to-image AI model along with the original drawing as reference. All characters must share the SAME art style so they belong in the same storybook. Rules: 1) Every prompt MUST start with 'Using the attached image as reference, create an animated cartoon version of...'. 2) Reference EXACT colors, features, clothing, proportions. 3) Every prompt includes 'front-facing, full body, clean white background'. 4) Style for ALL: 'soft watercolor, rounded shapes, warm lighting, children's storybook animation, Pixar meets Eric Carle'. 5) No text in image. 6) ONE character per prompt β isolate only that character from the drawing. 
7) Respond ONLY valid JSON β no explanation, no markdown, no backticks."}]},"promptType":"define","hasOutputParser":true},"typeVersion":1.9},{"id":"450c3e35-ca79-4e4c-a623-13ef050f494a","name":"Sticky Note3","type":"n8n-nodes-base.stickyNote","position":[-1280,-368],"parameters":{"color":7,"width":2320,"height":1264,"content":"## Part 1 β From Drawing to Story: Characters, Images & Narrative"},"typeVersion":1},{"id":"5d18efd8-0e84-4f57-a39b-e41bc0d475f3","name":"Webhook","type":"n8n-nodes-base.webhook","position":[-1152,1008],"webhookId":"87f99887-37fa-432a-ba5b-fd8043373e64","parameters":{"path":"87f99887-37fa-432a-ba5b-fd8043373e64","options":{},"httpMethod":"POST"},"typeVersion":2.1},{"id":"483a8e4d-b3b0-4022-a388-b3eafcbe97ee","name":"Split Scene Chunks","type":"n8n-nodes-base.code","position":[-928,1008],"parameters":{"jsCode":"const body = $input.first().json.body;\n\nreturn body.chunks.map(chunk => ({\n json: {\n title: body.title,\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n image_prompt: chunk.image_prompt,\n duration_seconds: chunk.duration_seconds,\n characters_used: chunk.characters_used,\n character_images: chunk.character_images\n }\n}));"},"typeVersion":2},{"id":"a5e18250-500b-4095-8b8d-496d214ceaf8","name":"Extract Character URLs","type":"n8n-nodes-base.code","position":[-720,1008],"parameters":{"jsCode":"const body = $('Webhook').first().json.body;\n\nreturn body.characters.map(c => ({\n json: {\n character_index: c.character_index,\n image_url: c.image_url\n }\n}));"},"typeVersion":2},{"id":"133845a0-1d49-4505-840e-0c647730a4bc","name":"Download Character Images","type":"n8n-nodes-base.httpRequest","position":[-544,1008],"parameters":{"url":"={{ $json.image_url }}","options":{}},"typeVersion":4.3},{"id":"60a78731-bbc4-4891-b50e-55c0429b35e8","name":"Character Images to 
Base64","type":"n8n-nodes-base.extractFromFile","position":[-336,1008],"parameters":{"options":{},"operation":"binaryToPropery"},"typeVersion":1.1},{"id":"83ea57c6-9f84-4ba0-a6c5-35356a5a199b","name":"Build Base64 Map","type":"n8n-nodes-base.code","position":[-128,1008],"parameters":{"jsCode":"const items = $input.all();\n\nreturn [{\n json: {\n character_base64: items.map((item, i) => ({\n character_index: i + 1,\n base64: item.json.data,\n mimetype: item.json.mimetype || 'image/jpeg'\n }))\n }\n}];"},"typeVersion":2},{"id":"6a322b10-3b98-414c-a184-bc5777a53e5f","name":"Build Scene Image Payloads","type":"n8n-nodes-base.code","position":[-1168,1424],"parameters":{"jsCode":"const charData = $('Build Base64 Map').first().json.character_base64;\nconst body = $('Webhook').first().json.body;\n\nreturn body.chunks.map(chunk => {\n const parts = [\n { text: chunk.image_prompt }\n ];\n\n for (const charIdx of chunk.characters_used) {\n const char = charData.find(c => c.character_index === charIdx);\n if (char) {\n parts.push({\n inline_data: {\n mime_type: char.mimetype,\n data: char.base64\n }\n });\n }\n }\n\n return {\n json: {\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n duration_seconds: chunk.duration_seconds,\n payload: {\n contents: [{ parts }],\n generationConfig: {\n responseModalities: [\"TEXT\", \"IMAGE\"],\n imageConfig: {\n aspectRatio: \"16:9\",\n imageSize: \"1K\"\n }\n }\n }\n }\n };\n});"},"typeVersion":2},{"id":"009372bc-69aa-4edd-8706-9cd0aa7cb1d3","name":"Nano Banana Scene Generate","type":"n8n-nodes-base.httpRequest","position":[-976,1424],"parameters":{"url":"https://generativelanguage.googleapis.com/v1beta/models/gemini-3-pro-image-preview:generateContent","method":"POST","options":{},"jsonBody":"={{ $json.payload 
}}","sendBody":true,"sendHeaders":true,"specifyBody":"json","authentication":"predefinedCredentialType","headerParameters":{"parameters":[{"name":"Content-Type","value":"application/json"}]},"nodeCredentialType":"googlePalmApi"},"credentials":{"googlePalmApi":{"id":"credential-id","name":"Google Gemini(PaLM) Api account 2"}},"retryOnFail":true,"typeVersion":4.3,"waitBetweenTries":5000},{"id":"ef558c1f-8fcc-419d-aa46-da2b1e73558f","name":"Extract Scene Image","type":"n8n-nodes-base.set","position":[-752,1424],"parameters":{"options":{},"assignments":{"assignments":[{"id":"b5b58c86-29ff-4815-84dd-e54331612abb","name":"data","type":"string","value":"={{ $json.candidates[0].content.parts[0].inlineData.data }}"}]}},"typeVersion":3.4},{"id":"145af6ac-6e9f-4fbf-9c1b-04193c88b57a","name":"Scene Image to File","type":"n8n-nodes-base.convertToFile","position":[-544,1424],"parameters":{"options":{},"operation":"toBinary","sourceProperty":"data"},"typeVersion":1.1},{"id":"8a926c81-c894-488c-b4cd-d87e98923296","name":"Map Scene URLs","type":"n8n-nodes-base.code","position":[-128,1424],"parameters":{"jsCode":"const cloudinary = $input.all();\nconst body = $('Webhook').first().json.body;\n\nconst chunks = body.chunks.map((chunk, i) => ({\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n image_prompt: chunk.image_prompt,\n duration_seconds: chunk.duration_seconds,\n scene_image_url: cloudinary[i].json.secure_url\n}));\n\nreturn [{\n json: {\n title: body.title,\n chunks: chunks\n }\n}];"},"typeVersion":2},{"id":"0f81613e-826f-45b1-ae18-37f1bdc517a6","name":"Trigger Video Workflow","type":"n8n-nodes-base.httpRequest","position":[80,1424],"parameters":{"url":"https://dr-firass.app.n8n.cloud/webhook-test/ec089d71-c7c3-482a-b11d-f7c24ec16fbc","method":"POST","options":{},"jsonBody":"={{ $json }}","sendBody":true,"specifyBody":"json"},"typeVersion":4.3},{"id":"d7dd8fba-1eb8-441c-a7e7-2fd0257bfd77","name":"Sticky 
Note","type":"n8n-nodes-base.stickyNote","position":[-1280,944],"parameters":{"color":7,"width":2304,"height":688,"content":"## Part 2 β From Characters to Scenes: Rendering the Visual Story"},"typeVersion":1},{"id":"fe093455-f66b-428d-93ec-f8ba8391f326","name":"Upload an asset from file data","type":"n8n-nodes-cloudinary.cloudinary","position":[-336,1424],"parameters":{"operation":"uploadFile","additionalFieldsFile":{}},"credentials":{"cloudinaryApi":{"id":"credential-id","name":"Cloudinary account 7"}},"typeVersion":1},{"id":"06619dc2-b9df-4d46-89b9-cb2eb59d168f","name":"Structured Output Parser1","type":"@n8n/n8n-nodes-langchain.outputParserStructured","position":[-400,1952],"parameters":{"jsonSchemaExample":"{\n \"scenes\": [\n {\n \"chunk_number\": 1,\n \"video_prompt\": \"Use the provided image as the first frame. ...\",\n \"narration\": \"shortened narration here\"\n }\n ]\n}"},"typeVersion":1.3},{"id":"369bb7b5-ab54-494c-8c0d-ea8cee07b13f","name":"Wait","type":"n8n-nodes-base.wait","position":[-864,2224],"webhookId":"6446b033-86ea-460a-915e-eab984f38441","parameters":{"unit":"minutes","amount":11},"typeVersion":1.1},{"id":"8b1a782e-4218-4e71-9f10-e5e5be20d0a1","name":"If","type":"n8n-nodes-base.if","position":[-480,2224],"parameters":{"options":{},"conditions":{"options":{"version":3,"leftValue":"","caseSensitive":true,"typeValidation":"loose"},"combinator":"and","conditions":[{"id":"47c96ffd-1daa-417b-8ccc-0702e38fbfc9","operator":{"type":"string","operation":"equals"},"leftValue":"={{ $json.data.status }}","rightValue":"completed"}]},"looseTypeValidation":true},"executeOnce":false,"typeVersion":2.3},{"id":"31f82df9-bcbd-4c0a-89da-830de57868b6","name":"Loop Over 
Items","type":"n8n-nodes-base.splitInBatches","position":[-32,2432],"parameters":{"options":{}},"typeVersion":3},{"id":"13eeeaee-9410-4afa-ac0f-798b8cb575b2","name":"Wait1","type":"n8n-nodes-base.wait","position":[560,2400],"webhookId":"8643f429-d5a7-4e94-8fa2-a047905bb585","parameters":{},"typeVersion":1.1},{"id":"4e0a68d7-1843-4592-8d21-1db57dec2271","name":"Wait2","type":"n8n-nodes-base.wait","position":[-352,2864],"webhookId":"0759957f-d934-4d6b-8765-42fc51f12087","parameters":{"unit":"minutes","amount":1},"typeVersion":1.1},{"id":"d8d2e7a0-bc42-4bd4-ad98-373bc6aca49a","name":"Split Video Chunks","type":"n8n-nodes-base.code","position":[-976,1808],"parameters":{"jsCode":"const body = $input.first().json.body;\n\nconst staticData = $getWorkflowStaticData('global');\nstaticData.title = body.title;\nstaticData.chunks = body.chunks;\n\nreturn body.chunks.map(chunk => ({\n json: {\n chunk_number: chunk.chunk_number,\n script: chunk.script,\n image_prompt: chunk.image_prompt,\n duration_seconds: chunk.duration_seconds,\n scene_image_url: chunk.scene_image_url\n }\n}));"},"typeVersion":2},{"id":"9617c307-c9d8-41d5-adcd-cd654e8e0b41","name":"Build Narration Context","type":"n8n-nodes-base.code","position":[-768,1808],"parameters":{"jsCode":"const items = $input.all();\n\nlet chunkList = '';\nfor (const item of items) {\n const c = item.json;\n chunkList += 'Scene ' + c.chunk_number + ':\\n';\n chunkList += 'Script: ' + c.script + '\\n';\n chunkList += 'Image description: ' + c.image_prompt + '\\n';\n chunkList += 'Duration: ' + c.duration_seconds + 's\\n\\n';\n}\n\nreturn [{\n json: {\n chunk_count: items.length,\n chunk_list: chunkList\n }\n}];"},"typeVersion":2},{"id":"3fadb3ec-8e75-4026-bdbb-b1a94ab0285c","name":"Generate Video Prompts","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[-544,1808],"parameters":{"text":"=I have {{ $json.chunk_count }} scenes. 
Write a video animation prompt and shortened narration for each.\n\n{{ $json.chunk_list }}\n\nRespond in this EXACT JSON format:\n{\n \"scenes\": [\n {\n \"chunk_number\": 1,\n \"video_prompt\": \"Use the provided image as the first frame. ...\",\n \"narration\": \"shortened narration here\"\n }\n ]\n}","batching":{},"messages":{"messageValues":[{"message":"You are an expert video prompt engineer and children's storybook narrator. You receive scene descriptions and their narration scripts. For each scene you do TWO things: 1) Write a video animation prompt for AI image-to-video generation (Kling style). 2) Rewrite the narration to be SHORTER β must take no more than 8 seconds to read aloud at a slow storytelling pace (roughly 15-18 words max). The video starts from the provided scene image. Rules for video_prompt: 1) Describe what MOVES and HOW β character gestures, facial expressions, wind, particles, camera pans/zooms. 2) Keep camera movements cinematic but gentle β slow zoom, slight pan, subtle dolly. 3) Match the mood of the narration. 4) Duration is 10 seconds per clip. 5) Start each prompt with 'Use the provided image as the first frame.' 6) No text in video. Rules for narration: 1) Keep the same story meaning but use fewer words. 2) Maximum 18 words per chunk. 3) Warm, magical tone for ages 3-7. 4) Must sound natural when read aloud. 
Respond ONLY valid JSON β no explanation, no markdown, no backticks."}]},"promptType":"define","hasOutputParser":true},"typeVersion":1.9},{"id":"43b34654-a84f-4848-83ae-34592133ea5a","name":"Merge Video Prompts","type":"n8n-nodes-base.code","position":[-192,1808],"parameters":{"jsCode":"const scenes = $input.first().json.output.scenes;\nconst body = $('Webhook1').first().json.body;\n\nreturn scenes.map((scene, i) => ({\n json: {\n chunk_number: scene.chunk_number,\n script: body.chunks[i].script,\n duration_seconds: body.chunks[i].duration_seconds,\n scene_image_url: body.chunks[i].scene_image_url,\n video_prompt: scene.video_prompt\n }\n}));"},"typeVersion":2},{"id":"25cc95f0-505a-4a01-923d-bb5626b7726f","name":"Build Kling Payloads","type":"n8n-nodes-base.code","position":[16,1808],"parameters":{"jsCode":"const items = $input.all();\n\nreturn items.map((item, i) => {\n const nextImage = i < items.length - 1 ? items[i + 1].json.scene_image_url : undefined;\n\n const payload = {\n model: \"kwaivgi/kling-v3.0-pro/image-to-video\",\n cfg_scale: 0.5,\n duration: item.json.duration_seconds,\n image: item.json.scene_image_url,\n ...(nextImage && { end_image: nextImage }),\n prompt: item.json.video_prompt,\n negative_prompt: \"talking, speaking, mouth movement, lip sync, dialogue, voice, speech\",\n sound: true\n };\n\n return {\n json: {\n chunk_number: item.json.chunk_number,\n script: item.json.script,\n body: JSON.stringify(payload)\n }\n };\n});"},"typeVersion":2},{"id":"3d586f77-d0f2-46ea-8578-3455efe7453a","name":"Atlas Cloud Generate Video","type":"n8n-nodes-base.httpRequest","position":[-1152,2224],"parameters":{"url":"https://api.atlascloud.ai/api/v1/model/generateVideo","method":"POST","options":{},"jsonBody":"={{ $json.body }}","sendBody":true,"sendHeaders":true,"specifyBody":"json","headerParameters":{"parameters":[{"name":"Authorization","value":"Bearer 
YOUR_TOKEN_HERE"},{"name":"Content-Type","value":"application/json"}]}},"typeVersion":4.3},{"id":"fc0a7246-4b43-45b4-9d19-4ce16e2ece36","name":"Atlas Cloud Poll Result","type":"n8n-nodes-base.httpRequest","position":[-688,2224],"parameters":{"url":"=https://api.atlascloud.ai/api/v1/model/prediction/{{ $json.data.id }}","options":{},"sendHeaders":true,"headerParameters":{"parameters":[{"name":"Authorization","value":"Bearer YOUR_TOKEN_HERE"}]}},"typeVersion":4.3},{"id":"b2ffff3c-561a-4433-8252-a6aa2cf3ac34","name":"Extract Video URLs","type":"n8n-nodes-base.code","position":[-240,2256],"parameters":{"jsCode":"const items = $input.all();\n\nreturn items.map((item, i) => ({\n json: {\n chunk_number: i + 1,\n video_url: item.json.data.outputs[0]\n }\n}));"},"typeVersion":2},{"id":"6e5c0287-0bb7-485a-a4a8-cb869f67e7ac","name":"ElevenLabs Narrate","type":"n8n-nodes-base.httpRequest","position":[208,2288],"parameters":{"url":"https://api.elevenlabs.io/v1/text-to-speech/onwK4e9ZLuTAKqWW03F9","method":"POST","options":{"response":{"response":{"responseFormat":"file"}}},"jsonBody":"={{ $json.body }}","sendBody":true,"sendHeaders":true,"specifyBody":"json","headerParameters":{"parameters":[{"name":"xi-api-key","value":"YOUR_API_HERE"},{"name":"Content-Type","value":"application/json"}]}},"typeVersion":4.3},{"id":"b98705d6-1ba5-4b0a-8dea-7f6c8681f7da","name":"Extract Audio URLs","type":"n8n-nodes-base.code","position":[816,2480],"parameters":{"jsCode":"const items = $input.all();\n\nreturn items.map((item, i) => ({\n json: {\n chunk_number: i + 1,\n audio_url: item.json.secure_url\n }\n}));"},"typeVersion":2},{"id":"9d4836ef-d682-425c-a347-9cb05ef7835c","name":"Build Shotstack Payload","type":"n8n-nodes-base.code","position":[-1152,2864],"parameters":{"jsCode":"const audioItems = $input.all();\nconst videoItems = $('Extract Video URLs').all();\nconst chunkItems = $('Split Video Chunks').all();\n\nif (!chunkItems || chunkItems.length === 0) {\n throw new Error('Aucun chunk 
trouvΓ© depuis \"Split Video Chunks\".');\n}\nif (!videoItems || videoItems.length === 0) {\n throw new Error('Aucune vidΓ©o trouvΓ©e depuis \"Extract Video URLs\".');\n}\nif (!audioItems || audioItems.length === 0) {\n throw new Error('Aucun audio reΓ§u en entrΓ©e.');\n}\n\nconst staticData = $getWorkflowStaticData('global');\nconst title = staticData.title ?? chunkItems[0]?.json?.title ?? 'Sans titre';\n\nconst scenes = chunkItems.map((chunkItem, i) => ({\n chunk_number: chunkItem.json.chunk_number,\n duration_seconds: chunkItem.json.duration_seconds,\n video_url: videoItems[i]?.json?.video_url ?? null,\n audio_url: audioItems[i]?.json?.audio_url ?? null\n}));\n\nreturn [{\n json: {\n title: title,\n scenes: scenes\n }\n}];"},"typeVersion":2},{"id":"95bd87ae-5ace-4043-be74-fdd2abaf5cd7","name":"Build Shotstack Timeline","type":"n8n-nodes-base.code","position":[-848,2864],"parameters":{"jsCode":"const data = $input.first().json;\n\nlet currentTime = 0;\nconst videoClips = [];\nconst audioClips = [];\n\nfor (const scene of data.scenes) {\n videoClips.push({\n asset: {\n type: \"video\",\n src: scene.video_url\n },\n start: currentTime,\n length: \"auto\"\n });\n\n audioClips.push({\n asset: {\n type: \"audio\",\n src: scene.audio_url,\n volume: 1\n },\n start: currentTime,\n length: \"auto\"\n });\n\n currentTime += scene.duration_seconds;\n}\n\nreturn [{\n json: {\n body: JSON.stringify({\n timeline: {\n tracks: [\n { clips: videoClips },\n { clips: audioClips }\n ]\n },\n output: {\n format: \"mp4\",\n resolution: \"hd\"\n }\n })\n }\n}];"},"typeVersion":2},{"id":"fd067e22-704e-4d9f-8419-2fd0f9625cf0","name":"Shotstack Render","type":"n8n-nodes-base.httpRequest","position":[-560,2864],"parameters":{"url":"https://api.shotstack.io/edit/v1/render","method":"POST","options":{},"jsonBody":"={{ $json.body 
}}","sendBody":true,"sendHeaders":true,"specifyBody":"json","headerParameters":{"parameters":[{"name":"x-api-key","value":"YOUR_API_HERE"},{"name":"Content-Type","value":"application/json"}]}},"typeVersion":4.3},{"id":"99918688-3d1d-43c3-a8cf-de2e027a068d","name":"Shotstack Poll Result","type":"n8n-nodes-base.httpRequest","position":[-160,2864],"parameters":{"url":"=https://api.shotstack.io/edit/v1/render/{{ $json.response.id }}","options":{},"sendHeaders":true,"headerParameters":{"parameters":[{"name":"x-api-key","value":"YOUR_API_HERE"}]}},"typeVersion":4.3},{"id":"648603cd-b60c-4ca1-99ee-5bf0eae12b0f","name":"Prepare Narration Loop","type":"n8n-nodes-base.code","position":[-32,2256],"parameters":{"jsCode":"const scenes = $('Generate Video Prompts').first().json.output.scenes;\n\nreturn scenes.map(scene => ({\n json: {\n chunk_number: scene.chunk_number,\n narration: scene.narration,\n body: JSON.stringify({\n text: scene.narration,\n model_id: \"eleven_multilingual_v2\",\n voice_settings: {\n stability: 0.6,\n similarity_boost: 0.75,\n style: 0.7,\n use_speaker_boost: true\n }\n })\n }\n}));"},"typeVersion":2},{"id":"fd7c209b-3b3c-48ab-a90d-07a22aa43a71","name":"Generate Video Prompts1","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[64,2832],"parameters":{"text":"=Story title: {{ $('Webhook1').item.json.body.title }}\n\nScene scripts:\n{{ $('Split Video Chunks').all().map(s => 'Scene ' + s.json.chunk_number + ': ' + s.json.script).join('\\n') }}\n\nGenerate a video title and description.","batching":{},"messages":{"messageValues":[{"message":"You are a creative content writer. You receive a story title and scene scripts from an AI-generated video. Based on the content, tone, and characters, write a fitting video title and description. Rules: 1) Title should capture the essence of the story and be engaging. Max 60 characters. 2) Description should be 2-3 sentences that make people want to watch. Match the tone of the story. 
3) Include 3-5 relevant hashtags at the end. 4) Respond ONLY with valid JSON — no explanation, no markdown, no backticks. Format: {\"video_title\": \"...\", \"video_description\": \"...\"}"}]},"promptType":"define","hasOutputParser":true},"typeVersion":1.9},{"id":"45c1aee1-8fe6-4410-abf9-63f7b6714d88","name":"Youtube","type":"@blotato/n8n-nodes-blotato.blotato","position":[464,2800],"parameters":{"options":{"scheduledTime":""},"platform":"youtube","accountId":{"__rl":true,"mode":"list","value":"30379","cachedResultUrl":"https://backend.blotato.com/v2/accounts/30379","cachedResultName":"Dr. Firas (PlayKids - English)"},"postContentText":"={{ $json.output.scenes[0].description }}","postContentMediaUrls":"={{ $('Shotstack Poll Result').item.json.response.url }}","postCreateYoutubeOptionTitle":"={{ $json.output.scenes[0].video_title }}","postCreateYoutubeOptionMadeForKids":true,"postCreateYoutubeOptionShouldNotifySubscribers":false},"credentials":{"blotatoApi":{"id":"credential-id","name":"Blotato account"}},"typeVersion":2},{"id":"3d6f21dd-b5de-4ea2-aed7-827b43ad8773","name":"Tiktok","type":"@blotato/n8n-nodes-blotato.blotato","disabled":true,"position":[464,2960],"parameters":{"options":{},"platform":"tiktok","accountId":{"__rl":true,"mode":"list","value":"8047","cachedResultUrl":"https://backend.blotato.com/v2/accounts/8047","cachedResultName":"DR FIRASS (Dr. 
Firas)"},"postContentText":"={{ $json.output.scenes[0].video_title }}","postContentMediaUrls":"={{ $('Shotstack Poll Result').item.json.response.url }}","postCreateTiktokOptionPrivacyLevel":"SELF_ONLY"},"credentials":{"blotatoApi":{"id":"credential-id","name":"Blotato account"}},"typeVersion":2},{"id":"846c0106-4687-4aa9-9211-6c1cc4aa79ba","name":"Webhook1","type":"n8n-nodes-base.webhook","position":[-1184,1808],"webhookId":"abbcaa45-a01a-4e75-bcac-e2734a9bb489","parameters":{"path":"abbcaa45-a01a-4e75-bcac-e2734a9bb489","options":{},"httpMethod":"POST"},"typeVersion":2.1},{"id":"7348aa99-f6ab-434b-b531-0cff4cff0e5a","name":"Anthropic Chat Model2","type":"@n8n/n8n-nodes-langchain.lmChatAnthropic","position":[-528,1952],"parameters":{"model":{"__rl":true,"mode":"list","value":"claude-sonnet-4-5-20250929","cachedResultName":"Claude Sonnet 4.5"},"options":{}},"credentials":{"anthropicApi":{"id":"credential-id","name":"Anthropic account 6"}},"typeVersion":1.3},{"id":"4343a6ce-1c77-4535-9bfc-2f52b64922e4","name":"Upload an asset from file data2","type":"n8n-nodes-cloudinary.cloudinary","position":[400,2288],"parameters":{"operation":"uploadFile","resource_type_file":"raw","additionalFieldsFile":{}},"credentials":{"cloudinaryApi":{"id":"credential-id","name":"Cloudinary account 7"}},"typeVersion":1},{"id":"c9d907b0-38db-48c1-9fbd-037d460f3412","name":"Structured Output Parser3","type":"@n8n/n8n-nodes-langchain.outputParserStructured","position":[208,2976],"parameters":{"jsonSchemaExample":"{\n \"scenes\": [\n {\n \"video_title\": \"Use the provided image as the first frame. 
...\",\n \"description\": \"shortened narration here\"\n }\n ]\n}"},"typeVersion":1.3},{"id":"ef413214-9e30-4e13-a27f-f5ea737f0d67","name":"Anthropic Chat Model3","type":"@n8n/n8n-nodes-langchain.lmChatAnthropic","position":[80,2976],"parameters":{"model":{"__rl":true,"mode":"list","value":"claude-sonnet-4-5-20250929","cachedResultName":"Claude Sonnet 4.5"},"options":{}},"credentials":{"anthropicApi":{"id":"credential-id","name":"Anthropic account 6"}},"typeVersion":1.3},{"id":"d3bb0328-d8a2-42f7-8b58-f33078e39cea","name":"Sticky Note1","type":"n8n-nodes-base.stickyNote","position":[-1280,1696],"parameters":{"color":7,"width":2304,"height":1456,"content":"## Part 3 — From Scenes to Screen: Prompts, Audio & Final Assembly"},"typeVersion":1},{"id":"168bf49e-77ea-4b3c-8e9b-aa7c5ebf5976","name":"Sticky Note2","type":"n8n-nodes-base.stickyNote","position":[-2160,-368],"parameters":{"width":816,"height":2000,"content":"# 🎬 From Drawing to Story — Auto-Publish AI Video to YouTube with Blotato\n\nTransform a hand-drawn sketch into a fully narrated, animated video — automatically generated and published to YouTube.\n\n> ⚠️ **This template is split into 3 separate workflows.**\n> Each part must be imported into its **own workflow** in n8n.\n\n---\n\n## 🎥 Tutorial\n\n@[youtube](bmx4IHX5lhI)\n\n---\n\n## ⚙️ How It Works\n\n- **Part 1** — Analyze the drawing, generate character images & write the story\n- **Part 2** — Generate scene images with character consistency\n- **Part 3** — Generate videos, narrate with AI audio & auto-publish to YouTube\n\n---\n\n## Credentials Needed\n\n- **[Blotato](https://blotato.com/?ref=firas)** — YouTube / TikTok publishing\n- **[AtlasCloud](https://www.atlascloud.ai?ref=8QKPJE)** — Kling Pro 3.0 video generation\n- **Anthropic** — Claude AI (story & prompts)\n- **ElevenLabs** — Narration audio\n- **Shotstack** — Final video assembly\n- **Nano Banana** — Image generation\n\n---\n\n## Setup Steps\n\n1. Configure credentials for each service above in n8n\n2. Set up a form trigger with a file upload field for the drawing\n3. Deploy the 3 workflows in order and connect them via webhooks\n4. Run a test submission with a simple sketch to validate the full pipeline"},"typeVersion":1}],"active":false,"pinData":{},"settings":{"binaryMode":"separate","availableInMCP":false,"executionOrder":"v1"},"versionId":"113acb6b-26b0-4415-9d0c-701ee10357cd","connections":{"If":{"main":[[{"node":"Extract Video URLs","type":"main","index":0}],[{"node":"Wait","type":"main","index":0}]]},"Wait":{"main":[[{"node":"Atlas Cloud Poll Result","type":"main","index":0}]]},"Wait1":{"main":[[{"node":"Loop Over Items","type":"main","index":0}]]},"Wait2":{"main":[[{"node":"Shotstack Poll Result","type":"main","index":0}]]},"Webhook":{"main":[[{"node":"Split Scene Chun...
How to Import This Workflow
1. Copy the workflow JSON above using the Copy Workflow JSON button.
2. Open your n8n instance and go to Workflows.
3. Click Import from JSON and paste the copied workflow.
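If an import fails, a quick check is to parse the copied text and confirm it has the top-level fields an n8n export contains (`nodes` and `connections`, as in the JSON above). A minimal Node.js sketch; the sample export string here is a hypothetical stand-in, not this template's full file:

```javascript
// Sanity-check a copied workflow export before pasting it into n8n.
// The sample below is a hypothetical minimal export for illustration.
const exported = `{
  "name": "Part 3 - Scenes to Screen",
  "nodes": [
    { "id": "a1", "name": "Webhook1", "type": "n8n-nodes-base.webhook" }
  ],
  "connections": {}
}`;

function validateWorkflowJson(raw) {
  let wf;
  try {
    wf = JSON.parse(raw); // a truncated copy fails right here
  } catch (e) {
    return { ok: false, reason: "invalid JSON: " + e.message };
  }
  if (!Array.isArray(wf.nodes) || wf.nodes.length === 0) {
    return { ok: false, reason: "missing nodes array" };
  }
  if (typeof wf.connections !== "object" || wf.connections === null) {
    return { ok: false, reason: "missing connections object" };
  }
  return { ok: true, nodeCount: wf.nodes.length };
}

console.log(validateWorkflowJson(exported)); // → { ok: true, nodeCount: 1 }
```

A copy cut off mid-string is the most common paste problem, and `JSON.parse` catches it immediately.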
Don't have an n8n instance? Start your free trial at n8nautomation.cloud
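The workflow JSON above implements render polling with If and Wait nodes looping back to the Atlas Cloud and Shotstack poll requests. Outside n8n, the same pattern can be sketched roughly like this; the `fetchStatus` stub and its status values are illustrative, not the real API responses:

```javascript
// Sketch of the poll-until-done loop used around "Atlas Cloud Poll Result"
// and "Shotstack Poll Result" (If -> Wait -> poll again).
// fetchStatus is a local stub standing in for the HTTP poll request.
const responses = [
  { status: "queued" },
  { status: "rendering" },
  { status: "done", url: "https://example.com/render.mp4" },
];
let call = 0;
async function fetchStatus() {
  return responses[Math.min(call++, responses.length - 1)];
}

async function pollUntilDone(maxAttempts = 10, delayMs = 10) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await fetchStatus();
    if (res.status === "done") return res.url;        // If node: success branch
    if (res.status === "failed") throw new Error("render failed");
    await new Promise((r) => setTimeout(r, delayMs)); // Wait node
  }
  throw new Error("timed out waiting for render");
}
```

Capping attempts matters: without a bound, a stuck render would loop the workflow forever.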
Related Templates
Auto-create TikTok videos with VEED.io AI avatars, ElevenLabs & GPT-4
Automate the creation and distribution of trending TikTok videos using AI avatars. This workflow connects Telegram, Perplexity, OpenAI, ElevenLabs, VEED.io, and BLOTATO to generate scripts, synthesize voice, create video, and publish across multiple social platforms. Content creators and marketers can rapidly produce engaging short-form video content without manual editing.
IT Ops AI SlackBot Workflow - Chat with your knowledge base
Give your IT operations team instant access to information with an AI-powered Slackbot that chats with your Confluence knowledge base. This workflow integrates OpenAI and Slack: when a direct message arrives, a webhook trigger verifies the request, posts an initial placeholder message, and hands the query to an AI Agent. The agent uses a Window Buffer Memory to maintain context and can call a Confluence Workflow Tool to retrieve relevant information. Once processing finishes, the placeholder is deleted and the agent sends a complete response back in Slack, cutting response times and freeing IT staff from manually searching for answers to common questions.
Manage Google Calendar and Gmail from Telegram with a Claude AI assistant
Manage your Google Calendar and Gmail directly from Telegram using a Claude AI assistant. This workflow connects Telegram for input, OpenAI and LmChatOpenRouter for AI processing, and Google Calendar and Gmail for managing events and emails. Users can schedule meetings, check their calendar, or reply to emails on the go, making personal and professional organization more efficient.