node_name | display_name | categories | subcategories | group | version | description | credentials_required | operations_supported | properties_schema | source_package | source_file_path | github_permalink |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
agent | AI Agent | ["AI"] | ["Agents", "Root Nodes"] | ["transform"] | 3.1 | Generates an action plan and executes it. Can use external tools. | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/agents/Agent/Agent.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/agents/Agent/Agent.node.ts |
agentTool | AI Agent Tool | ["AI"] | ["Tools", "Recommended Tools"] | ["transform"] | 3 | Generates an action plan and executes it. Can use external tools. | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/agents/Agent/AgentTool.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/agents/Agent/AgentTool.node.ts |
chainLlm | Basic LLM Chain | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 1.9 | A simple chain to prompt a large language model | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/chains/ChainLLM/ChainLlm.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/ChainLLM/ChainLlm.node.ts |
chainRetrievalQa | Question and Answer Chain | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 1.7 | Answer questions about retrieved documents | [] | [] | [{"name":"query","displayName":"Query","type":"string"},{"name":"query","displayName":"Query","type":"string"},{"name":"query","displayName":"Query","type":"string"},{"name":"text","displayName":"Prompt (User Message)","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/chains/ChainRetrievalQA/ChainRetrievalQa.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/ChainRetrievalQA/ChainRetrievalQa.node.ts |
chainSummarization | Summarization Chain | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 2.1 | Transforms text into a concise summary | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/chains/ChainSummarization/ChainSummarization.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/ChainSummarization/ChainSummarization.node.ts |
chat | Chat | ["Core Nodes", "HITL"] | ["Human in the Loop"] | ["input"] | 1.3 | Send a message into the chat | [] | ["send"] | [{"name":"generalNotice","displayName":"Verify you","type":"notice"},{"name":"operation","displayName":"Operation","type":"options"},{"name":"message","displayName":"Message","type":"string"},{"name":"options","displayName":"Options","type":"collection"},{"name":"options","displayName":"Options","type":"collection"},{"... | @n8n | packages/@n8n/nodes-langchain/nodes/trigger/ChatTrigger/Chat.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/trigger/ChatTrigger/Chat.node.ts |
chatTrigger | Chat Trigger | ["Core Nodes"] | [] | ["trigger"] | 1.4 | Runs the workflow when an n8n generated webchat is submitted | ["httpBasicAuth"] | [] | [{"name":"public","displayName":"Make Chat Publicly Available","type":"boolean"},{"name":"mode","displayName":"Mode","type":"options"},{"name":"hostedChatNotice","displayName":"Chat will be live at the URL above once this workflow is published. Live executions will show up in the \u2018executions\u2019 tab","type":"not... | @n8n | packages/@n8n/nodes-langchain/nodes/trigger/ChatTrigger/ChatTrigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/trigger/ChatTrigger/ChatTrigger.node.ts |
documentBinaryInputLoader | Binary Input Loader | ["AI"] | ["Document Loaders"] | ["transform"] | 1 | Use binary data from a previous step in the workflow | [] | [] | [{"name":"loader","displayName":"Loader Type","type":"options"},{"name":"binaryDataKey","displayName":"Binary Data Key","type":"string"},{"name":"splitPages","displayName":"Split Pages","type":"boolean"},{"name":"column","displayName":"Column","type":"string"},{"name":"separator","displayName":"Separator","type":"strin... | @n8n | packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentBinaryInputLoader/DocumentBinaryInputLoader.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentBinaryInputLoader/DocumentBinaryInputLoader.node.ts |
documentDefaultDataLoader | Default Data Loader | ["AI"] | ["Document Loaders"] | ["transform"] | 1.1 | Load data from previous step in the workflow | [] | [] | [{"name":"notice","displayName":"This will load data from a previous step in the workflow. <a href=","type":"notice"},{"name":"dataType","displayName":"Type of Data","type":"options"},{"name":"jsonMode","displayName":"Mode","type":"options"},{"name":"binaryMode","displayName":"Mode","type":"options"},{"name":"loader","... | @n8n | packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentDefaultDataLoader/DocumentDefaultDataLoader.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentDefaultDataLoader/DocumentDefaultDataLoader.node.ts |
documentGithubLoader | GitHub Document Loader | ["AI"] | ["Document Loaders"] | ["transform"] | 1.1 | Use GitHub data as input to this chain | ["githubApi"] | [] | [{"name":"repository","displayName":"Repository Link","type":"string"},{"name":"branch","displayName":"Branch","type":"string"},{"name":"textSplittingMode","displayName":"Text Splitting","type":"options"},{"name":"additionalOptions","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentGithubLoader/DocumentGithubLoader.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentGithubLoader/DocumentGithubLoader.node.ts |
documentJsonInputLoader | JSON Input Loader | ["AI"] | ["Document Loaders"] | ["transform"] | 1 | Use JSON data from a previous step in the workflow | [] | [] | [{"name":"pointers","displayName":"Pointers","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentJSONInputLoader/DocumentJsonInputLoader.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/document_loaders/DocumentJSONInputLoader/DocumentJsonInputLoader.node.ts |
embeddingsAwsBedrock | Embeddings AWS Bedrock | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Embeddings AWS Bedrock | ["aws"] | [] | [{"name":"model","displayName":"Model","type":"options"}] | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsAwsBedrock/EmbeddingsAwsBedrock.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsAwsBedrock/EmbeddingsAwsBedrock.node.ts |
embeddingsAzureOpenAi | Embeddings Azure OpenAI | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Embeddings Azure OpenAI | ["azureOpenAiApi"] | [] | [{"name":"model","displayName":"Model (Deployment) Name","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsAzureOpenAi/EmbeddingsAzureOpenAi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsAzureOpenAi/EmbeddingsAzureOpenAi.node.ts |
embeddingsCohere | Embeddings Cohere | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Cohere Embeddings | ["cohereApi"] | [] | [{"name":"notice","displayName":"Each model is using different dimensional density for embeddings. Please make sure to use the same dimensionality for your vector store. The default model is using 768-dimensional embeddings.","type":"notice"},{"name":"modelName","displayName":"Model","type":"options"}] | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsCohere/EmbeddingsCohere.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsCohere/EmbeddingsCohere.node.ts |
embeddingsGoogleGemini | Embeddings Google Gemini | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Google Gemini Embeddings | ["googlePalmApi"] | [] | [{"name":"notice","displayName":"Each model is using different dimensional density for embeddings. Please make sure to use the same dimensionality for your vector store. The default model is using 768-dimensional embeddings.","type":"notice"},{"name":"modelName","displayName":"Model","type":"options"}] | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsGoogleGemini/EmbeddingsGoogleGemini.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsGoogleGemini/EmbeddingsGoogleGemini.node.ts |
embeddingsGoogleVertex | Embeddings Google Vertex | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Google Vertex Embeddings | ["googleApi"] | [] | [{"name":"notice","displayName":"Each model is using different dimensional density for embeddings. Please make sure to use the same dimensionality for your vector store. The default model is using 768-dimensional embeddings. You can find available models <a href=","type":"notice"},{"name":"projectId","displayName":"Pro... | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsGoogleVertex/EmbeddingsGoogleVertex.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsGoogleVertex/EmbeddingsGoogleVertex.node.ts |
embeddingsHuggingFaceInference | Embeddings Hugging Face Inference | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use HuggingFace Inference Embeddings | ["huggingFaceApi"] | [] | [{"name":"notice","displayName":"Each model is using different dimensional density for embeddings. Please make sure to use the same dimensionality for your vector store. The default model is using 768-dimensional embeddings.","type":"notice"},{"name":"modelName","displayName":"Model Name","type":"string"},{"name":"opti... | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsHuggingFaceInference/EmbeddingsHuggingFaceInference.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsHuggingFaceInference/EmbeddingsHuggingFaceInference.node.ts |
embeddingsLemonade | Embeddings Lemonade | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Lemonade Embeddings | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsLemonade/EmbeddingsLemonade.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsLemonade/EmbeddingsLemonade.node.ts |
embeddingsMistralCloud | Embeddings Mistral Cloud | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Embeddings Mistral Cloud | ["mistralCloudApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsMistralCloud/EmbeddingsMistralCloud.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsMistralCloud/EmbeddingsMistralCloud.node.ts |
embeddingsOllama | Embeddings Ollama | ["AI"] | ["Embeddings"] | ["transform"] | 1 | Use Ollama Embeddings | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsOllama/EmbeddingsOllama.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/embeddings/EmbeddingsOllama/EmbeddingsOllama.node.ts |
guardrails | Guardrails | ["AI"] | ["Agents", "Miscellaneous", "Root Nodes"] | ["transform"] | 2 | Safeguard AI models from malicious input or prevent them from generating undesirable responses | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/Guardrails/Guardrails.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/Guardrails/Guardrails.node.ts |
informationExtractor | Information Extractor | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 1.2 | Extract information from text in a structured format | [] | [] | [{"name":"text","displayName":"Text","type":"string"},{"name":"From Attribute Descriptions"},{"name":"attributes","displayName":"Attributes","type":"fixedCollection"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/chains/InformationExtractor/InformationExtractor.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/InformationExtractor/InformationExtractor.node.ts |
lmChatAlibabaCloud | Alibaba Cloud Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["alibabaCloudApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatAlibabaCloud/LmChatAlibabaCloud.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatAlibabaCloud/LmChatAlibabaCloud.node.ts |
lmChatAnthropic | Anthropic Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1.5 | Language Model Anthropic | ["anthropicApi"] | [] | [{"name":"model","displayName":"Model","type":"resourceLocator"},{"name":"model","displayName":"Model","type":"resourceLocator"},{"name":"model","displayName":"Model","type":"resourceLocator"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMChatAnthropic/LmChatAnthropic.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMChatAnthropic/LmChatAnthropic.node.ts |
lmChatAwsBedrock | AWS Bedrock Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1.1 | Language Model AWS Bedrock | ["aws"] | [] | [{"name":"modelSource","displayName":"Model Source","type":"options"},{"name":"model","displayName":"Model","type":"options"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatAwsBedrock/LmChatAwsBedrock.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatAwsBedrock/LmChatAwsBedrock.node.ts |
lmChatAzureOpenAi | Azure OpenAI Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["azureOpenAiApi"] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatAzureOpenAi/LmChatAzureOpenAi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatAzureOpenAi/LmChatAzureOpenAi.node.ts |
lmChatCohere | Cohere Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["cohereApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatCohere/LmChatCohere.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatCohere/LmChatCohere.node.ts |
lmChatDeepSeek | DeepSeek Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["deepSeekApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatDeepSeek/LmChatDeepSeek.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatDeepSeek/LmChatDeepSeek.node.ts |
lmChatGoogleGemini | Google Gemini Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1.1 | Chat Model Google Gemini | ["googlePalmApi"] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatGoogleGemini/LmChatGoogleGemini.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatGoogleGemini/LmChatGoogleGemini.node.ts |
lmChatGoogleVertex | Google Vertex Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | Chat Model Google Vertex | ["googleApi"] | [] | [{"name":"projectId","displayName":"Project ID","type":"resourceLocator"},{"name":"modelName","displayName":"Model Name","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatGoogleVertex/LmChatGoogleVertex.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatGoogleVertex/LmChatGoogleVertex.node.ts |
lmChatGroq | Groq Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | Language Model Groq | ["groqApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatGroq/LmChatGroq.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatGroq/LmChatGroq.node.ts |
lmChatLemonade | Lemonade Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | Language Model Lemonade Chat | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMChatLemonade/LmChatLemonade.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMChatLemonade/LmChatLemonade.node.ts |
lmChatMinimax | MiniMax Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["minimaxApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatMinimax/LmChatMinimax.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatMinimax/LmChatMinimax.node.ts |
lmChatMistralCloud | Mistral Cloud Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["mistralCloudApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatMistralCloud/LmChatMistralCloud.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatMistralCloud/LmChatMistralCloud.node.ts |
lmChatMoonshot | Moonshot Kimi Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1.1 | For advanced usage with an AI chain | ["moonshotApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatMoonshot/LmChatMoonshot.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatMoonshot/LmChatMoonshot.node.ts |
lmChatOllama | Ollama Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | Language Model Ollama | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMChatOllama/LmChatOllama.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMChatOllama/LmChatOllama.node.ts |
lmChatOpenAi | OpenAI Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1.3 | For advanced usage with an AI chain | ["openAiApi"] | [] | [{"name":"model","displayName":"Model","type":"options"},{"name":"model","displayName":"Model","type":"resourceLocator"},{"name":"notice","displayName":"When using non-OpenAI models via ","type":"notice"},{"name":"responsesApiEnabled","displayName":"Use Responses API","type":"boolean"},{"name":"builtInTools","displayNa... | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMChatOpenAi/LmChatOpenAi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMChatOpenAi/LmChatOpenAi.node.ts |
lmChatOpenRouter | OpenRouter Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["openRouterApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatOpenRouter/LmChatOpenRouter.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatOpenRouter/LmChatOpenRouter.node.ts |
lmChatVercelAiGateway | Vercel AI Gateway Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain via Vercel AI Gateway | ["vercelAiGatewayApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatVercelAiGateway/LmChatVercelAiGateway.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatVercelAiGateway/LmChatVercelAiGateway.node.ts |
lmChatXAiGrok | xAI Grok Chat Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Chat Models (Recommended)"] | ["transform"] | 1 | For advanced usage with an AI chain | ["xAiApi"] | [] | [{"name":"notice","displayName":"If using JSON response format, you must include word ","type":"notice"},{"name":"model","displayName":"Model","type":"options"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LmChatXAiGrok/LmChatXAiGrok.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LmChatXAiGrok/LmChatXAiGrok.node.ts |
lmCohere | Cohere Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Text Completion Models"] | ["transform"] | 1 | Language Model Cohere | ["cohereApi"] | [] | [{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMCohere/LmCohere.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMCohere/LmCohere.node.ts |
lmLemonade | Lemonade Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Text Completion Models"] | ["transform"] | 1 | Language Model Lemonade | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMLemonade/LmLemonade.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMLemonade/LmLemonade.node.ts |
lmOllama | Ollama Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Text Completion Models"] | ["transform"] | 1 | Language Model Ollama | [] | [] | | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMOllama/LmOllama.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMOllama/LmOllama.node.ts |
lmOpenAi | OpenAI Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Text Completion Models"] | ["transform"] | 1 | For advanced usage with an AI chain | ["openAiApi"] | [] | [{"name":"deprecated","displayName":"This node is using OpenAI completions which are now deprecated. Please use the OpenAI Chat Model node instead.","type":"notice"},{"name":"model","displayName":"Model","type":"resourceLocator"},{"name":"notice","displayName":"When using non OpenAI models via Base URL override, not al... | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMOpenAi/LmOpenAi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMOpenAi/LmOpenAi.node.ts |
lmOpenHuggingFaceInference | Hugging Face Inference Model | ["AI"] | ["Language Models", "Root Nodes", "Language Models", "Text Completion Models"] | ["transform"] | 1 | Language Model HuggingFaceInference | ["huggingFaceApi"] | [] | [{"name":"model","displayName":"Model","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/llms/LMOpenHuggingFaceInference/LmOpenHuggingFaceInference.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/llms/LMOpenHuggingFaceInference/LmOpenHuggingFaceInference.node.ts |
manualChatTrigger | Manual Chat Trigger | ["Core Nodes"] | ["Core Nodes", "Other Trigger Nodes"] | ["trigger"] | 1.1 | Runs the flow on new manual chat message | [] | [] | [{"name":"notice","displayName":"This node is where a manual chat workflow execution starts. To make one, go back to the canvas and click \u2018Chat\u2019","type":"notice"},{"name":"openChat","displayName":"Chat and execute workflow","type":"button"}] | @n8n | packages/@n8n/nodes-langchain/nodes/trigger/ManualChatTrigger/ManualChatTrigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/trigger/ManualChatTrigger/ManualChatTrigger.node.ts |
mcpClient | MCP Client | [] | [] | ["transform"] | 1 | Standalone MCP Client | [] | [] | [{"name":"endpointUrl","displayName":"MCP Endpoint URL","type":"string"},{"name":"authentication","displayName":"Authentication","type":"options"},{"name":"credentials","displayName":"Credentials","type":"credentials"},{"name":"tool","displayName":"Tool","type":"resourceLocator"},{"name":"inputMode","displayName":"Inpu... | @n8n | packages/@n8n/nodes-langchain/nodes/mcp/McpClient/McpClient.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/mcp/McpClient/McpClient.node.ts |
mcpClientTool | MCP Client Tool | ["AI"] | ["Tools", "Recommended Tools"] | ["output"] | 1.2 | Connect tools from an MCP Server | [] | [] | [{"name":"sseEndpoint","displayName":"SSE Endpoint","type":"string"},{"name":"endpointUrl","displayName":"Endpoint","type":"string"},{"name":"authentication","displayName":"Authentication","type":"options"},{"name":"authentication","displayName":"Authentication","type":"options"},{"name":"credentials","displayName":"Cr... | @n8n | packages/@n8n/nodes-langchain/nodes/mcp/McpClientTool/McpClientTool.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/mcp/McpClientTool/McpClientTool.node.ts |
mcpTrigger | MCP Server Trigger | ["AI", "Core Nodes"] | ["Root Nodes", "Model Context Protocol", "Core Nodes", "Other Trigger Nodes"] | ["trigger"] | 2 | Expose n8n tools as an MCP Server endpoint | ["httpBearerAuth"] | [] | [{"name":"authentication","displayName":"Authentication","type":"options"},{"name":"path","displayName":"Path","type":"string"},{"name":"setup"},{"name":"default"},{"name":"default"}] | @n8n | packages/@n8n/nodes-langchain/nodes/mcp/McpTrigger/McpTrigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/mcp/McpTrigger/McpTrigger.node.ts |
memoryBufferWindow | Simple Memory | ["AI"] | ["Memory", "For beginners"] | ["transform"] | 1.4 | Stores in n8n memory, so no credentials required | [] | [] | [{"name":"scalingNotice","displayName":"This node stores memory locally in the n8n instance. It is not compatible with Queue Mode or Multi-Main setups, as memory will not be shared across workers. For production use with scaling, consider using an external memory store such as Redis, Postgres, or another persistent mem... | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryBufferWindow/MemoryBufferWindow.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryBufferWindow/MemoryBufferWindow.node.ts |
memoryChatRetriever | Chat Messages Retriever | ["AI"] | ["Miscellaneous"] | ["transform"] | 1 | Retrieve chat messages from memory and use them in the workflow | [] | [] | [{"name":"deprecatedNotice","displayName":"This node is deprecated. Use ","type":"notice"},{"name":"simplifyOutput","displayName":"Simplify Output","type":"boolean"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryChatRetriever/MemoryChatRetriever.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryChatRetriever/MemoryChatRetriever.node.ts |
memoryManager | Chat Memory Manager | ["AI"] | ["Miscellaneous", "Root Nodes"] | ["transform"] | 1.1 | Manage chat messages memory and use it in the workflow | [] | [] | [{"name":"mode","displayName":"Operation Mode","type":"options"},{"name":"insertMode","displayName":"Insert Mode","type":"options"},{"name":"deleteMode","displayName":"Delete Mode","type":"options"},{"name":"messages","displayName":"Chat Messages","type":"fixedCollection"},{"name":"lastMessagesCount","displayName":"Mes... | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryManager/MemoryManager.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryManager/MemoryManager.node.ts |
memoryMongoDbChat | MongoDB Chat Memory | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.1 | Stores the chat history in MongoDB collection. | ["mongoDb"] | [] | [{"name":"collectionName","displayName":"Collection Name","type":"string"},{"name":"databaseName","displayName":"Database Name","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryMongoDbChat/MemoryMongoDbChat.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryMongoDbChat/MemoryMongoDbChat.node.ts |
memoryMotorhead | Motorhead | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.4 | Use Motorhead Memory | ["motorheadApi"] | [] | [{"name":"deprecationNotice","displayName":"The Motorhead project is no longer maintained. This node is deprecated and will be removed in a future version.","type":"notice"},{"name":"sessionId","displayName":"Session ID","type":"string"},{"name":"sessionId","displayName":"Session ID","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryMotorhead/MemoryMotorhead.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryMotorhead/MemoryMotorhead.node.ts |
memoryPostgresChat | Postgres Chat Memory | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.4 | Stores the chat history in Postgres table. | ["postgres"] | [] | [{"name":"tableName","displayName":"Table Name","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryPostgresChat/MemoryPostgresChat.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryPostgresChat/MemoryPostgresChat.node.ts |
memoryRedisChat | Redis Chat Memory | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.6 | Stores the chat history in Redis. | ["redis"] | [] | [{"name":"sessionKey","displayName":"Session Key","type":"string"},{"name":"sessionKey","displayName":"Session ID","type":"string"},{"name":"sessionTTL","displayName":"Session Time To Live","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryRedisChat/MemoryRedisChat.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryRedisChat/MemoryRedisChat.node.ts |
memoryXata | Xata | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.5 | Use Xata Memory | ["xataApi"] | [] | [{"name":"sessionId","displayName":"Session ID","type":"string"},{"name":"sessionId","displayName":"Session ID","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryXata/MemoryXata.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryXata/MemoryXata.node.ts |
memoryZep | Zep | ["AI"] | ["Memory", "Other memories"] | ["transform"] | 1.4 | Use Zep Memory | ["zepApi"] | [] | [{"name":"deprecationNotice","displayName":"This Zep integration is deprecated and will be removed in a future version.","type":"notice"},{"name":"supportedVersions","displayName":"Only works with Zep Cloud and Community edition <= v0.27.2","type":"notice"},{"name":"sessionId","displayName":"Session ID","type":"string"... | @n8n | packages/@n8n/nodes-langchain/nodes/memory/MemoryZep/MemoryZep.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/memory/MemoryZep/MemoryZep.node.ts |
microsoftAgent365Trigger | Microsoft Agent 365 Trigger | ["Core Nodes"] | [] | ["trigger"] | 1.1 | Trigger for Microsoft Agent 365 | ["microsoftAgent365Api"] | [] | [{"name":"previewNotice","displayName":"This is an early preview for building Agents with Microsoft Agent 365 and n8n. You need to be part of the <a href=","type":"notice"},{"name":"systemPrompt","displayName":"System Prompt","type":"string"},{"name":"notice","type":"notice"},{"name":"needsFallback","displayName":"Enab... | @n8n | packages/@n8n/nodes-langchain/nodes/vendors/Microsoft/MicrosoftAgent365Trigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vendors/Microsoft/MicrosoftAgent365Trigger.node.ts |
modelSelector | Model Selector | ["AI"] | ["Language Models"] | ["transform"] | 1 | Use this node to select one of the connected models to this node based on workflow data | [] | [] | [{"name":"rules","displayName":"Rules","type":"fixedCollection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/ModelSelector/ModelSelector.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/ModelSelector/ModelSelector.node.ts |
openAi | OpenAI | ["AI"] | ["Agents", "Miscellaneous", "Root Nodes"] | ["transform"] | 2.3 | Message an assistant or GPT, analyze images, generate audio, etc. | [] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/vendors/OpenAi/OpenAi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vendors/OpenAi/OpenAi.node.ts | |
outputParserAutofixing | Auto-fixing Output Parser | ["AI"] | ["Output Parsers"] | ["transform"] | 1 | Deprecated, use structured output parser | [] | [] | [{"name":"info","displayName":"This node wraps another output parser. If the first one fails it calls an LLM to fix the format","type":"notice"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserAutofixing/OutputParserAutofixing.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserAutofixing/OutputParserAutofixing.node.ts |
outputParserItemList | Item List Output Parser | ["AI"] | ["Output Parsers"] | ["transform"] | 1 | Return the results as separate items | [] | [] | [{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserItemList/OutputParserItemList.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserItemList/OutputParserItemList.node.ts |
outputParserStructured | Structured Output Parser | ["AI"] | ["Output Parsers"] | ["transform"] | 1.3 | Return data in a defined JSON format | [] | [] | [{"name":"jsonSchema","displayName":"JSON Schema","type":"json"},{"name":"autoFix","displayName":"Auto-Fix Format","type":"boolean"},{"name":"customizeRetryPrompt","displayName":"Customize Retry Prompt","type":"boolean"},{"name":"prompt","displayName":"Custom Prompt","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserStructured/OutputParserStructured.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/output_parser/OutputParserStructured/OutputParserStructured.node.ts |
rerankerCohere | Reranker Cohere | ["AI"] | ["Rerankers"] | ["transform"] | 1 | Use Cohere Reranker to reorder documents after retrieval from a vector store by relevance to the given query. | ["cohereApi"] | [] | [{"name":"modelName","displayName":"Model","type":"options"},{"name":"topN","displayName":"Top N","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/rerankers/RerankerCohere/RerankerCohere.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/rerankers/RerankerCohere/RerankerCohere.node.ts |
retrieverContextualCompression | Contextual Compression Retriever | ["AI"] | ["Retrievers"] | ["transform"] | 1 | Enhances document similarity search by contextual compression. | [] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverContextualCompression/RetrieverContextualCompression.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverContextualCompression/RetrieverContextualCompression.node.ts | |
retrieverMultiQuery | MultiQuery Retriever | ["AI"] | ["Retrievers"] | ["transform"] | 1 | Automates prompt tuning, generates diverse queries and expands document pool for enhanced retrieval. | [] | [] | [{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverMultiQuery/RetrieverMultiQuery.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverMultiQuery/RetrieverMultiQuery.node.ts |
retrieverVectorStore | Vector Store Retriever | ["AI"] | ["Retrievers"] | ["transform"] | 1 | Use a Vector Store as Retriever | [] | [] | [{"name":"topK","displayName":"Limit","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverVectorStore/RetrieverVectorStore.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverVectorStore/RetrieverVectorStore.node.ts |
retrieverWorkflow | Workflow Retriever | ["AI"] | ["Retrievers"] | ["transform"] | 1.1 | Use an n8n Workflow as Retriever | [] | [] | [{"name":"executeNotice","displayName":"The workflow will receive ","type":"notice"},{"name":"source","displayName":"Source","type":"options"},{"name":"workflowId","displayName":"Workflow ID","type":"string"},{"name":"workflowId","displayName":"Workflow","type":"workflowSelector"},{"name":"workflowJson","displayName":"... | @n8n | packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverWorkflow/RetrieverWorkflow.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/retrievers/RetrieverWorkflow/RetrieverWorkflow.node.ts |
sentimentAnalysis | Sentiment Analysis | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 1.1 | Analyze the sentiment of your text | [] | [] | [{"name":"inputText","displayName":"Text to Analyze","type":"string"},{"name":"detailedResultsNotice","displayName":"Sentiment scores are LLM-generated estimates, not statistically rigorous measurements. They may be inconsistent across runs and should be used as rough indicators only.","type":"notice"},{"name":"options... | @n8n | packages/@n8n/nodes-langchain/nodes/chains/SentimentAnalysis/SentimentAnalysis.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/SentimentAnalysis/SentimentAnalysis.node.ts |
textClassifier | Text Classifier | ["AI"] | ["Chains", "Root Nodes"] | ["transform"] | 1.1 | Classify your text into distinct categories | [] | [] | [{"name":"inputText","displayName":"Text to Classify","type":"string"},{"name":"categories","displayName":"Categories","type":"fixedCollection"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/chains/TextClassifier/TextClassifier.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/chains/TextClassifier/TextClassifier.node.ts |
textSplitterCharacterTextSplitter | Character Text Splitter | ["AI"] | ["Text Splitters"] | ["transform"] | 1 | Split text into chunks by characters | [] | [] | [{"name":"separator","displayName":"Separator","type":"string"},{"name":"chunkSize","displayName":"Chunk Size","type":"number"},{"name":"chunkOverlap","displayName":"Chunk Overlap","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterCharacterTextSplitter/TextSplitterCharacterTextSplitter.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterCharacterTextSplitter/TextSplitterCharacterTextSplitter.node.ts |
textSplitterRecursiveCharacterTextSplitter | Recursive Character Text Splitter | ["AI"] | ["Text Splitters"] | ["transform"] | 1 | Split text into chunks by characters recursively, recommended for most use cases | [] | [] | [{"name":"chunkSize","displayName":"Chunk Size","type":"number"},{"name":"chunkOverlap","displayName":"Chunk Overlap","type":"number"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterRecursiveCharacterTextSplitter/TextSplitterRecursiveCharacterTextSplitter.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterRecursiveCharacterTextSplitter/TextSplitterRecursiveCharacterTextSplitter.node.ts |
textSplitterTokenSplitter | Token Splitter | ["AI"] | ["Text Splitters"] | ["transform"] | 1 | Split text into chunks by tokens | [] | [] | [{"name":"chunkSize","displayName":"Chunk Size","type":"number"},{"name":"chunkOverlap","displayName":"Chunk Overlap","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterTokenSplitter/TextSplitterTokenSplitter.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/text_splitters/TextSplitterTokenSplitter/TextSplitterTokenSplitter.node.ts |
toolCalculator | Calculator | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1 | Make it easier for AI agents to perform arithmetic | [] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolCalculator/ToolCalculator.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolCalculator/ToolCalculator.node.ts | |
toolCode | Code Tool | ["AI"] | ["Tools", "Recommended Tools"] | ["transform"] | 1.3 | Write a tool in JS or Python | [] | [] | [{"name":"noticeTemplateExample","displayName":"See an example of a conversational agent with custom tool written in JavaScript <a href=","type":"notice"},{"name":"name","displayName":"Name","type":"string"},{"name":"name","displayName":"Name","type":"string"},{"name":"description","displayName":"Description","type":"s... | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolCode/ToolCode.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolCode/ToolCode.node.ts |
toolExecutor | Tool Executor | ["Core Nodes"] | ["Helpers"] | ["transform"] | 1 | Key-value pairs, where key is the name of the tool name and value is the parameters to pass to the tool | [] | [] | [{"name":"query","displayName":"Query","type":"json"},{"name":"toolName","displayName":"Tool Name","type":"string"},{"name":"node","displayName":"Node","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/ToolExecutor/ToolExecutor.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/ToolExecutor/ToolExecutor.node.ts |
toolHttpRequest | HTTP Request Tool | ["AI"] | ["Tools", "Recommended Tools"] | ["output"] | 1.1 | Makes an HTTP request and returns the response data | [] | [] | [{"name":"toolDescription","displayName":"Description","type":"string"},{"name":"method","displayName":"Method","type":"options"},{"name":"placeholderNotice","displayName":"Tip: You can use a {placeholder} for any part of the request to be filled by the model. Provide more context about them in the placeholders section... | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolHttpRequest/ToolHttpRequest.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolHttpRequest/ToolHttpRequest.node.ts |
toolSearXng | SearXNG | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1 | Search in SearXNG | ["searXngApi"] | [] | [{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolSearXng/ToolSearXng.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolSearXng/ToolSearXng.node.ts |
toolSerpApi | SerpApi (Google Search) | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1 | Search in Google using SerpAPI | ["serpApi"] | [] | [{"name":"oldVersionNotice","displayName":"This node is deprecated and will not be updated in the future. Please use the official verified community node instead.","type":"notice"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolSerpApi/ToolSerpApi.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolSerpApi/ToolSerpApi.node.ts |
toolThink | Think Tool | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1.1 | Invite the AI agent to do some thinking | [] | [] | [{"name":"description","displayName":"Think Tool Description","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolThink/ToolThink.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolThink/ToolThink.node.ts |
toolVectorStore | Vector Store Question Answer Tool | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1.1 | Answer questions with a vector store | [] | [] | [{"name":"name","displayName":"Data Name","type":"string"},{"name":"description","displayName":"Description of Data","type":"string"},{"name":"topK","displayName":"Limit","type":"number"}] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolVectorStore/ToolVectorStore.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolVectorStore/ToolVectorStore.node.ts |
toolWikipedia | Wikipedia | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1 | Search in Wikipedia | [] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolWikipedia/ToolWikipedia.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolWikipedia/ToolWikipedia.node.ts | |
toolWolframAlpha | Wolfram|Alpha | ["AI"] | ["Tools", "Other Tools"] | ["transform"] | 1 | Connects to WolframAlpha | ["wolframAlphaApi"] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolWolframAlpha/ToolWolframAlpha.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolWolframAlpha/ToolWolframAlpha.node.ts | |
toolWorkflow | Call n8n Sub-Workflow Tool | ["AI"] | ["Tools", "Recommended Tools"] | ["transform"] | 2.2 | Uses another n8n workflow as a tool. Allows packaging any n8n node(s) as a tool. | [] | [] | @n8n | packages/@n8n/nodes-langchain/nodes/tools/ToolWorkflow/ToolWorkflow.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/tools/ToolWorkflow/ToolWorkflow.node.ts | |
vectorStoreInMemoryInsert | In Memory Vector Store Insert | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Insert data into an in-memory vector store | [] | [] | [{"name":"notice","displayName":"The embbded data are stored in the server memory, so they will be lost when the server is restarted. Additionally, if the amount of data is too large, it may cause the server to crash due to insufficient memory.","type":"notice"},{"name":"clearStore","displayName":"Clear Store","type":"... | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreInMemoryInsert/VectorStoreInMemoryInsert.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreInMemoryInsert/VectorStoreInMemoryInsert.node.ts |
vectorStoreInMemoryLoad | In Memory Vector Store Load | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Load embedded data from an in-memory vector store | [] | [] | [{"name":"memoryKey","displayName":"Memory Key","type":"string"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreInMemoryLoad/VectorStoreInMemoryLoad.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreInMemoryLoad/VectorStoreInMemoryLoad.node.ts |
vectorStorePineconeInsert | Pinecone: Insert | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Insert data into Pinecone Vector Store index | ["pineconeApi"] | [] | [{"name":"pineconeNamespace","displayName":"Pinecone Namespace","type":"string"},{"name":"notice","displayName":"Specify the document to load in the document loader sub-node","type":"notice"},{"name":"clearNamespace","displayName":"Clear Namespace","type":"boolean"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStorePineconeInsert/VectorStorePineconeInsert.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStorePineconeInsert/VectorStorePineconeInsert.node.ts |
vectorStorePineconeLoad | Pinecone: Load | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Load data from Pinecone Vector Store index | ["pineconeApi"] | [] | [{"name":"pineconeNamespace","displayName":"Pinecone Namespace","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStorePineconeLoad/VectorStorePineconeLoad.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStorePineconeLoad/VectorStorePineconeLoad.node.ts |
vectorStoreSupabaseInsert | Supabase: Insert | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Insert data into Supabase Vector Store index | ["supabaseApi"] | [] | [{"name":"setupNotice","displayName":"Please refer to the <a href=","type":"notice"},{"name":"queryName","displayName":"Query Name","type":"string"},{"name":"notice","displayName":"Specify the document to load in the document loader sub-node","type":"notice"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreSupabaseInsert/VectorStoreSupabaseInsert.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreSupabaseInsert/VectorStoreSupabaseInsert.node.ts
vectorStoreSupabaseLoad | Supabase: Load | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Load data from Supabase Vector Store index | ["supabaseApi"] | [] | [{"name":"queryName","displayName":"Query Name","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreSupabaseLoad/VectorStoreSupabaseLoad.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreSupabaseLoad/VectorStoreSupabaseLoad.node.ts |
vectorStoreZepInsert | Zep Vector Store: Insert | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Insert data into Zep Vector Store index | ["zepApi"] | [] | [{"name":"deprecationNotice","displayName":"This Zep integration is deprecated and will be removed in a future version.","type":"notice"},{"name":"collectionName","displayName":"Collection Name","type":"string"},{"name":"notice","displayName":"Specify the document to load in the document loader sub-node","type":"notice... | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreZepInsert/VectorStoreZepInsert.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreZepInsert/VectorStoreZepInsert.node.ts |
vectorStoreZepLoad | Zep Vector Store: Load | ["AI"] | ["Vector Stores"] | ["transform"] | 1 | Load data from Zep Vector Store index | ["zepApi"] | [] | [{"name":"deprecationNotice","displayName":"This Zep integration is deprecated and will be removed in a future version.","type":"notice"},{"name":"collectionName","displayName":"Collection Name","type":"string"},{"name":"options","displayName":"Options","type":"collection"}] | @n8n | packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreZepLoad/VectorStoreZepLoad.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/@n8n/nodes-langchain/nodes/vector_store/VectorStoreZepLoad/VectorStoreZepLoad.node.ts |
Brandfetch | Brandfetch | ["Utility", "Sales"] | [] | ["output"] | 1 | Consume Brandfetch API | ["brandfetchApi"] | ["color", "company", "font", "industry", "logo"] | [{"name":"operation","displayName":"Operation","type":"options"},{"name":"domain","displayName":"Domain","type":"string"},{"name":"download","displayName":"Download","type":"boolean"},{"name":"imageTypes","displayName":"Image Type","type":"multiOptions"},{"name":"imageFormats","displayName":"Image Format","type":"multi... | nodes-base | packages/nodes-base/nodes/Brandfetch/Brandfetch.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/Brandfetch/Brandfetch.node.ts |
actionNetwork | Action Network | ["Sales", "Marketing"] | [] | ["transform"] | 1 | Consume the Action Network API | ["actionNetworkApi"] | ["attendance", "event", "person", "personTag", "petition", "signature", "tag"] | [{"name":"resource","displayName":"Resource","type":"options"}] | nodes-base | packages/nodes-base/nodes/ActionNetwork/ActionNetwork.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/ActionNetwork/ActionNetwork.node.ts |
activeCampaign | ActiveCampaign | ["Marketing"] | [] | ["transform"] | 1 | Create and edit data in ActiveCampaign | ["activeCampaignApi"] | ["account", "accountContact", "connection", "contact", "contactList", "contactTag", "deal", "ecommerceCustomer", "ecommerceOrder", "ecommerceOrderProducts", "list", "tag"] | [{"name":"resource","displayName":"Resource","type":"options"}] | nodes-base | packages/nodes-base/nodes/ActiveCampaign/ActiveCampaign.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/ActiveCampaign/ActiveCampaign.node.ts |
activeCampaignTrigger | ActiveCampaign Trigger | ["Marketing"] | [] | ["trigger"] | 1 | Handle ActiveCampaign events via webhooks | ["activeCampaignApi"] | [] | [{"name":"events","displayName":"Event Names or IDs","type":"multiOptions"},{"name":"sources","displayName":"Source","type":"multiOptions"}] | nodes-base | packages/nodes-base/nodes/ActiveCampaign/ActiveCampaignTrigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/ActiveCampaign/ActiveCampaignTrigger.node.ts |
acuitySchedulingTrigger | Acuity Scheduling Trigger | ["Productivity"] | [] | ["trigger"] | 1 | Handle Acuity Scheduling events via webhooks | ["acuitySchedulingApi"] | [] | [{"name":"authentication","displayName":"Authentication","type":"options"},{"name":"event","displayName":"Event","type":"options"},{"name":"resolveData","displayName":"Resolve Data","type":"boolean"}] | nodes-base | packages/nodes-base/nodes/AcuityScheduling/AcuitySchedulingTrigger.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/AcuityScheduling/AcuitySchedulingTrigger.node.ts |
adalo | Adalo | ["Data & Storage"] | [] | ["transform"] | 1 | Consume Adalo API | ["adaloApi"] | ["create"] | [{"name":"resource","displayName":"Resource","type":"options"},{"name":"operation","displayName":"Operation","type":"options"},{"name":"collectionId","displayName":"Collection ID","type":"string"}] | nodes-base | packages/nodes-base/nodes/Adalo/Adalo.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/Adalo/Adalo.node.ts |
affinity | Affinity | ["Sales"] | [] | ["output"] | 1 | Consume Affinity API | ["affinityApi"] | ["list", "listEntry", "organization", "person"] | [{"name":"resource","displayName":"Resource","type":"options"}] | nodes-base | packages/nodes-base/nodes/Affinity/Affinity.node.ts | https://github.com/n8n-io/n8n/blob/stable/packages/nodes-base/nodes/Affinity/Affinity.node.ts |
# n8n Nodes Catalog
A structured, machine-readable catalog of n8n node metadata extracted directly from the n8n GitHub repository. Covers 524 nodes across `packages/nodes-base` (431 nodes) and `packages/@n8n/nodes-langchain` (93 nodes), sourced from `n8n@2.20.6`.
Updated monthly. Last updated: 2026-05.
## Dataset Summary
This dataset catalogs what each n8n node is: its name, category, supported operations, credential requirements, properties schema, and source location. Existing n8n datasets on HuggingFace (workflow collections, builder training sets) focus on how workflows are assembled. This dataset fills the gap underneath - the node-level metadata that lets an AI agent reason about which nodes to use and what they support, without guessing from stale training data.
## Intended Uses

- **LLM training and fine-tuning.** Ground models in current n8n node capabilities. A model that has seen this catalog stops hallucinating node names and operation signatures.
- **Agent tooling at inference time.** An AI agent building an n8n workflow can load this dataset as context to select the right node, check credential requirements, and validate operation names before generating a workflow.
- **Developer reference.** "What n8n nodes support database operations?" is currently a docs-browsing exercise. With this dataset it is a one-liner (see Sample Queries below).
- **Research.** Quantitative analysis of the n8n node ecosystem: coverage by category, credential distribution, operation surface area over time.
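The "one-liner" claim for developer reference can be sketched directly against `nodes.json`. This is only a sketch: the two inline records mirror real rows from this catalog (Adalo and ActiveCampaign) and stand in for loading the full file.

```python
import json

# Stand-in for the full catalog; in practice:
#   with open("nodes.json", encoding="utf-8") as f:
#       nodes = json.load(f)
nodes = [
    {"display_name": "Adalo", "categories": ["Data & Storage"]},
    {"display_name": "ActiveCampaign", "categories": ["Marketing"]},
]

# "Which n8n nodes support database operations?" as a one-liner over `categories`
db_nodes = [n["display_name"] for n in nodes if "Data & Storage" in n["categories"]]
print(db_nodes)  # ['Adalo']
```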
## Files

| File | Description |
|---|---|
| `nodes.json` | Canonical record-per-node, UTF-8 JSON array |
| `nodes.parquet` | Columnar output, Snappy-compressed, ready for the HuggingFace dataset viewer |
| `extract.py` | Extraction script; run it to regenerate the catalog |
## Schema

All fields are locked to what the extraction script (`extract.py`) produces. Do not rely on fields not listed here.
| Field | Type | Notes |
|---|---|---|
| `node_name` | string | Internal identifier (e.g. `slack`, `airtable`). Matches the `name` field in `INodeTypeDescription`. |
| `display_name` | string | Human-readable name shown in the n8n UI. |
| `categories` | list[string] | Node category tags from the `.node.json` codex (authoritative) or inline `codex.categories` in `.node.ts` (fallback). Examples: `Communication`, `AI`, `Data & Storage`. |
| `subcategories` | list[string] | Subcategory values, flattened from the `codex.subcategories` dict. Keys (parent categories) are dropped; only the leaf values are kept. |
| `group` | list[string] | n8n execution group: `input`, `output`, or `transform`. |
| `version` | string | For single-version nodes: the explicit `version` value. For multi-version nodes (`defaultVersion`): the current default version. |
| `description` | string | One-line description from `INodeTypeDescription.description`. |
| `credentials_required` | list[string] | Credential type names from the node's `credentials` array. Empty for trigger nodes, core nodes, and multi-version nodes where credentials live in versioned implementation files. |
| `operations_supported` | list[string] | Values from the `operation` property options array. Falls back to `resource` options if no `operation` property exists. Empty for nodes without a resource/operation picker (e.g. webhooks, core transforms). |
| `properties_schema` | string (JSON) | Compact array of top-level property descriptors: `[{"name": "...", "displayName": "...", "type": "..."}]`. Top-level only; nested options are not included. Serialized as a JSON string. |
| `source_package` | string | `nodes-base` or `@n8n` (for nodes-langchain nodes). |
| `source_file_path` | string | Repo-relative path to the primary `.node.ts` file. |
| `github_permalink` | string | Permanent GitHub link to the file at the extracted tag. |
**Note on list fields in Parquet:** `categories`, `subcategories`, `group`, `credentials_required`, and `operations_supported` are stored as JSON strings in the Parquet file (e.g. `'["Communication","HITL"]'`). Parse with `json.loads()`.
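Concretely, decoding one of those Parquet list fields is a single `json.loads` call (the example value is the one quoted in the note):

```python
import json

raw = '["Communication","HITL"]'  # a `categories` value as stored in nodes.parquet
categories = json.loads(raw)     # back to a real Python list
print(categories)  # ['Communication', 'HITL']
```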
## Methodology

The catalog is extracted from the n8n GitHub repository using `extract.py`. The script:
- Downloads the n8n release tarball for the target tag (default: latest release). The tarball is cached locally to make re-runs fast.
- Walks `packages/nodes-base/nodes/` and `packages/@n8n/nodes-langchain/nodes/`, collecting `.node.ts` files that are NOT inside versioned or implementation sub-directories (`v1/`, `v2/`, `V1/`, `V2/`, `actions/`, `methods/`, `transport/`, etc.).
- For each `.node.ts`, parses the TypeScript source with a targeted regex/AST approach to extract the `INodeTypeDescription` fields. Also reads the sibling `.node.json` codex file for category metadata when present.
- Handles multi-version nodes by reading `baseDescription` from the primary file and recording `defaultVersion`. Versioned implementation files (`V1/`, `V2/`, etc.) are excluded, as they are not standalone nodes.
- Emits `nodes.json` (UTF-8 JSON array) and `nodes.parquet` (Snappy-compressed columnar).
The script is idempotent: re-running with the same tag produces identical output. Run with `--tag n8n@2.20.6` to pin to a specific release.

**What is not included:** credential definitions, utility modules, the core workflow engine, and EE-only nodes that don't follow the standard descriptor pattern.
## Update Cadence

This dataset is updated monthly via an automated pipeline. The `github_permalink` field anchors each record to the specific tag it was extracted from, so older rows remain stable across updates.

The *Last updated* field at the top of this card tracks the most recent extraction run.
## Sample Queries

Find Slack-related nodes and their supported operations (pandas):

```python
import pandas as pd, json

df = pd.read_parquet("nodes.parquet")
df["ops"] = df["operations_supported"].apply(json.loads)
slack_nodes = df[df["node_name"].str.contains("slack", case=False)]
print(slack_nodes[["display_name", "ops"]])
```
List all nodes requiring OAuth2 credentials (HuggingFace `datasets` API):

```python
from datasets import load_dataset
import json

ds = load_dataset("automatelab/n8n-nodes-catalog", split="train")
oauth = [r for r in ds if "oAuth2Api" in json.loads(r["credentials_required"])]
print([r["display_name"] for r in oauth])
```
Count nodes by category (SQL via DuckDB):

```sql
SELECT category, COUNT(*) AS node_count
FROM read_parquet('nodes.parquet'),
     UNNEST(json_extract_string(categories, '$[*]')) AS t(category)
GROUP BY category
ORDER BY node_count DESC;
```
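For environments without DuckDB, the same per-category count works in pandas via `explode`. This is a sketch: the two inline rows are real records from this catalog, standing in for the full `pd.read_parquet("nodes.parquet")` result.

```python
import json
import pandas as pd

# Two real rows from the catalog, standing in for:
#   df = pd.read_parquet("nodes.parquet")
df = pd.DataFrame([
    {"node_name": "affinity", "categories": '["Sales"]'},
    {"node_name": "actionNetwork", "categories": '["Sales", "Marketing"]'},
])

counts = (
    df["categories"]
    .apply(json.loads)  # list columns are JSON strings in the Parquet file
    .explode()          # one row per (node, category) pair
    .value_counts()     # node count per category
)
print(counts)
```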
## Companion Blog Post

A detailed companion post covering the dataset, queryability, and AI-agent use cases is now live: *n8n nodes catalog: structured for AI agents*.
## License

Our additions (catalog format, the extraction script `extract.py`, this dataset card, and any editorial framing) are licensed under CC-BY-4.0.
**Upstream node metadata:** The node metadata in this catalog is derived from n8n source code. Upstream node metadata copyright n8n team, used under the n8n Sustainable Use License. This dataset is a community-maintained catalog/index of that metadata, not a redistribution.
The n8n Sustainable Use License permits derivative works free of charge for non-commercial use and requires preservation of the copyright notice. The full license text is available at https://docs.n8n.io/sustainable-use-license/ and in the n8n repository.
## Maintainer
AutomateLab - AI automation guides and tools.