LangChain documentation.
This is documentation for LangChain v0.
Chat LangChain.js - a chatbot for answering questions about LangChain's open source libraries. Open Canvas - a document & chat-based UX for writing code or markdown. Explore tutorials, how-to guides, conceptual introductions, API reference, and more. Documentation style guide As LangChain continues to grow, the amount of documentation required to cover the various concepts and integrations continues to grow too. This state management can take several forms, including: Simply stuffing previous messages into a chat model prompt. With LangChain’s ingestion and retrieval methods, developers can easily augment the LLM’s knowledge with company data, user information, and other private sources. Agents are systems that take a high-level task and use an LLM as a reasoning engine to decide what actions to take and execute those actions. Please see the Runnable Interface for more details. The page content is a base64-encoded image; the metadata is either the default or defined by the user. langchain-core This package contains base abstractions for different components and ways to compose them together. x Cloudflare Workers Vercel / Next. persist_directory (Optional[str]) – Directory to persist the collection. The particular data structure used to implement this is often an inverted index. The latest and most popular Azure OpenAI models are chat completion models. Use LangGraph. A retriever does not need to be able to store documents, only to return (or retrieve) them. document_separator (str) – String separator to use between formatted document strings. Nov 17, 2023 · In this tutorial, we cover a simple example of how to interact with GPT using LangChain and query a document for semantic meaning with a vector store. LangSmith documentation is hosted on a separate site. head to the Google AI docs. Setting Build an Extraction Chain In this tutorial, we will use tool-calling features of chat models to extract structured information from unstructured text. Under Installation Supported Environments LangChain is written in TypeScript and can be used in: Node. Chroma is licensed under Apache 2.0. Example Head to Integrations for documentation on built-in document loader integrations with 3rd-party tools. Reference Documentation LangChain is a framework for building LLM-powered applications. LangChain-OpenTutorial: The main repository for the LangChain Open Tutorial project. js (ESM and CommonJS) - 18. Document ¶ class langchain_core. Class hierarchy for Memory: This tutorial previously used the RunnableWithMessageHistory abstraction. from langchain. Class hierarchy: This will help you get started with Groq chat models. These applications use a technique known as Retrieval Augmented Generation, or RAG. A prompt is often constructed from multiple components and prompt values. Interface LangChain chat models implement the BaseChatModel interface. For a list of all Groq models, visit this link. No third-party integrations are langchain: 0. The above, but trimming old messages to reduce the amount of distracting information the model has to deal with. def init_embeddings( model: str, *, provider: Optional[str] = None, **kwargs: Any, ) -> Union[Embeddings, Runnable[Any, list[float]]]: """Initialize an embeddings model from a model name and optional provider. Self-ask Tools for every task LangChain offers an extensive library of off-the-shelf tools and an intuitive framework for customizing your own. Each tool has a description.
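To make the "simply stuffing previous messages into a chat model prompt" idea above concrete, here is a minimal sketch. It is illustrative rather than the documented API surface: it assumes the langchain-openai package is installed, that OPENAI_API_KEY is set, and the model name is only an example.

```python
# Hypothetical sketch: carry conversation state by passing prior messages
# back into the chat model on every turn.
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage

chat = ChatOpenAI(model="gpt-4o-mini")  # example model name

history = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="My name is Ada."),
    AIMessage(content="Nice to meet you, Ada!"),
]

# New user turn: append it and send the whole history as the prompt.
history.append(HumanMessage(content="What is my name?"))
reply = chat.invoke(history)
print(reply.content)
```

Trimming the oldest messages before each call, as the text above suggests, is just a matter of slicing this list.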
For detailed documentation of all ChatGoogleGenerativeAI features and configurations head to the API reference. embedding_function (Optional[Embeddings]) – Embedding class object. g. js how-to guides here. 0 chains LangChain has evolved since its initial release, and many of the original "Chain" classes have been deprecated in favor of the more flexible and powerful frameworks of LCEL and LangGraph. x, 20. 💁 Contributing As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation. BaseMessage # class langchain_core. PromptTemplate [source] # Bases: StringPromptTemplate Prompt template for a language model. Ollama allows you to run open-source large language models, such as Llama 2, locally. js is an extension of LangChain aimed at building robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph. prompts # Prompt is the input to the model. It is more general than a vector store. js🦜️🔗 LangChain. We recommend that you use LangGraph for building agents Sep 16, 2024 · LangChain v0. This notebook provides a quick overview for getting started with Databricks chat models. How the chunk size is measured: by number of characters. , document id, file name, source, etc). The Python open-source library is now downloaded over 7 million times per month, and has had more than 20,000 pull requests and 2,500 contributors! The community is truly what makes LangChain incredible, and we're beyond thankful Retrievers return a list of Document objects, which have two attributes: page_content: The content of this document. This guide will help you migrate your existing v0. When given a query, RAG systems first search a knowledge base for relevant information. For more information on these concepts, please see our full documentation. _identifying_params property: Return a dictionary of the identifying parameters This is critical for caching and tracing purposes Dec 9, 2024 · langchain 0. In this article we will learn more about the complete LangChain ecosystem. Tools can be passed to chat models that support tool calling, allowing the model to request the execution of a specific function with specific inputs. This process offers several benefits, such as ensuring consistent processing of varying document lengths, overcoming input size limitations of models, and improving the quality of text representations used in retrieval systems. page_content. Currently this is a string. 3. ) This documents # Document module is a collection of classes that handle documents and their transformations. Many of the key methods of chat models operate on messages as input and return messages as output. 26 allows users to opt in to an updated AIMessage format when using the Responses API. You are currently on a page documenting the use of OpenAI text completion models. 35 # langchain-core defines the base abstractions for the LangChain ecosystem. It helps you chain together interoperable components and third-party integrations to simplify AI application development — all while future-proofing decisions as the underlying technology evolves. This is documentation for LangChain v0. The agent executes the action (e.g., runs the tool), and receives an observation.
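The Document shape described above (a page_content string plus a metadata dict holding fields such as a document id, file name, or source) can be constructed directly. A minimal sketch, assuming only langchain-core is installed; the field values are made up.

```python
from langchain_core.documents import Document

doc = Document(
    page_content="LangChain is a framework for building LLM-powered applications.",
    metadata={"source": "intro.md", "document_id": 1},  # arbitrary, user-defined fields
)

print(doc.page_content)        # the text a model or retriever works with
print(doc.metadata["source"])  # bookkeeping about where the text came from
```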
Class hierarchy: Integration Packages These providers have standalone langchain-{provider} packages for improved versioning, dependency management and testing. ATTENTION The schema definitions are provided for backwards compatibility. For detailed documentation of all ChatOpenAI features and configurations head to the API reference. 27 # Main entrypoint into package. It allows you to closely monitor and evaluate your application, so you can ship quickly and with confidence. Returns: An LCEL Runnable. Document [source] ¶ Bases: BaseMedia Class for storing a piece of text and associated metadata. For information on the latest models, their features, context windows, etc. Document Loaders are usually used to load a lot of Documents in a single run. Vector stores can be used as the backbone of a retriever, but there are other types of retrievers as well. LangSmith It seamlessly integrates with LangChain, and you can use it to inspect and debug individual steps of your chains as you build. Overview ChatDatabricks class wraps a chat model endpoint hosted on Databricks Model Serving. These are traditionally newer models ( older models are generally LLMs, see below). Hit the ground running using third-party integrations and Templates. The interfaces for core components like chat models, LLMs, vector stores, retrievers, and more are defined here. language_models. Installation To install the main langchain package, run: Dec 9, 2024 · langchain_core 0. tools # Tools are classes that an Agent uses to interact with the world. LangChain allows AI developers to develop applications based on the combined Large Language Models (such as GPT-4) with external sources of Introduction LangChain is a framework for developing applications powered by large language models (LLMs). Build controllable agents with LangGraph, our low-level agent orchestration framework. It includes all the tutorial content and resources. 2 Learn about the docs refresh for LangChain v0. It provides a set of tools and components that enable seamless integration of large language models (LLMs) with other data sources, systems and services. conversational_retrieval. Agent uses the description to choose the right tool for the job. This allowed imports of vectorstores, chat models, and other integrations to continue working through langchain rather than forcing users to update all of their imports to langchain-community. This page provides guidelines for anyone writing documentation for LangChain, as well as some of our philosophies around organization and structure. The Chain interface makes it easy to create apps that are: langchain-core: 0. In Agents, a language model is used as a reasoning engine to determine which actions to take and in which order. Chain [source] # Bases: RunnableSerializable[dict[str, Any], dict[str, Any]], ABC Abstract base class for creating structured sequences of calls to components. documents. May 22, 2023 · LangChain excels in handling document data, transforming scanned documents into actionable data through workflow automation. Markdown-Generator: A utility tool for generating markdown for GitBook. langchain-community: 0. After executing actions, the results can be fed back into the LLM to determine whether more actions are needed, or whether it is okay to finish. This page provides guidelines for anyone writing documentation for LangChain and outlines some of our philosophies around organization and structure. Contribute documentation Documentation is a vital part of LangChain. Documentation for LangChain. 
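For the tool abstraction touched on above (tools are classes an agent uses to interact with the world, and the agent picks a tool by its description), here is a minimal sketch using the @tool decorator from langchain-core. The function itself is a toy example.

```python
from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b

# The decorator derives the schema: a name, a description, and typed arguments.
print(multiply.name)          # "multiply"
print(multiply.description)   # taken from the docstring
print(multiply.invoke({"a": 6, "b": 7}))  # 42
```

A tool-calling chat model can then be bound to a list of such tools and decide when to request one.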
PromptTemplate # class langchain_core. x, 19. Philosophy LangChain's documentation aspires to follow the Diataxis framework. Google AI offers a number of different chat models. For detailed documentation of all ChatHuggingFace features and configurations head to the API reference. 27 memory ConversationBufferMemory Creating documents A document at its core is fairly simple. A basic agent works in the following manner: Given a prompt an agent uses an LLM to request an action to take (e. agents ¶ Schema definitions for representing agent actions, observations, and return values. It enables applications that: Are context-aware: connect a language model to sources of context (prompt instructions, few shot examples, content to ground its response in, etc. Philosophy LangChain's documentation follows the Diataxis framework. 0th element in each tuple is a LangChain Document Object. Example from langchain. 0 release, langchain-community was retained as required a dependency of langchain. Why use LangGraph Platform? What is LangGraph Platform? Explained in 4 Minutes. LangSmith is a unified developer platform for building, testing, and monitoring LLM applications. This guide provides explanations of the key concepts behind the LangChain framework and AI applications more broadly. They are useful for summarizing documents, answering questions over documents, extracting information from documents, and more. kwargs – Additional fields to pass to the param additional_kwargs: dict Overview Document splitting is often a crucial preprocessing step for many applications. View the full docs of Chroma at this page, and find the API reference for the LangChain integration at this page. load method. This application will translate text from English into another language. Learn how to use LangChain's Python and JavaScript libraries, integrations, methods, and tools to create end-to-end applications with LLMs. Document [source] # Bases: BaseMedia Class for storing a piece of text and associated metadata. **Note:** Must have the integration package corresponding to the model provider installed. This tutorial delves into LangChain, starting from an overview then providing practical examples. Vector search for Amazon DocumentDB combines the flexibility and rich querying capability of a JSON-based document Document loaders are designed to load document objects. Agents use language models to choose a sequence of actions to take. Parameters: content – The string contents of the message. In this quickstart we'll show you how to build a simple LLM application with LangChain. For detailed documentation of all ChatDatabricks features and configurations head to the API reference. Chroma is a AI-native open-source vector database focused on developer productivity and happiness. 3 Last updated: 09. Prompt classes and functions make constructing This notebook provides a quick overview for getting started with OpenAI chat models. LLM [source] # Bases: BaseLLM Simple interface for implementing a custom LLM. We welcome both new documentation for new features and community improvements to our current documentation. , runs the tool), and receives an observation. document_variable_name (str) – Variable name to use for the formatted documents in the prompt. Overview Integration details Integration details Overview The tool abstraction in LangChain associates a Python function with a schema that defines the function's name, description and expected arguments. Under this langchain-core: 0. 
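As a rough modern counterpart to the PromptTemplate material above, the same idea is usually written in LCEL style today, piping the template into a model instead of wrapping both in the deprecated LLMChain. This is a sketch only; it assumes langchain-openai is installed, an API key is configured, and the model name is illustrative.

```python
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

prompt = PromptTemplate.from_template("Tell me a {adjective} joke")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")  # prompt -> model, composed with LCEL

print(chain.invoke({"adjective": "funny"}).content)
```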
Many popular Ollama models are chat completion models. v1. Learn how to build and deploy applications powered by large language models (LLMs) using LangChain's open-source libraries and tools. This tutorial builds upon the foundation of the existing tutorial available here: link written in Korean. Docx2txtLoader # class langchain_community. Still, this is a great way to get started with LangChain - a lot of features can be built with just some prompting and an LLM call! agents # Agent is a class that uses an LLM to choose a sequence of actions to take. Settings]) – Chroma client settings collection_metadata (Optional[Dict Document # class langchain_core. There's now versioned docs and a clearer structure — with tutorials, how-to guides, conceptual guides, and API docs See the full list of integrations in the Section Navigation. The Chain interface makes it See this blog post case-study on analyzing user interactions (questions about LangChain documentation)! The blog post and associated repo also introduce clustering as a means of summarization. Learn how to use LangChain's components, integrations, and orchestration framework with tutorials, guides, and API reference. , for use in downstream tasks), use . Class hierarchy for Memory: Document loaders DocumentLoaders load data into the standard LangChain Document format. , and provide a simple interface to this sequence. messages. document_loaders. metadata: Arbitrary metadata associated with this document (e. Productionization: Use LangSmith to inspect, monitor Introduction LangChain is a framework for developing applications powered by language models. See the full list of integrations in the Section Navigation. LangChain simplifies every stage of the LLM application lifecycle: Development: Build your applications using LangChain's open-source building blocks and components. LangSmith This is documentation for LangChain v0. Class hierarchy: Prompt templates help to translate user input and parameters into instructions for a language model. LangChain is an open-source framework for building with GenAI using flexible abstractions and AI-first toolkit. ChatLangChain and ChatLangChain. The dependencies are kept purposefully very lightweight One of the most powerful applications enabled by LLMs is sophisticated question-answering (Q&A) chatbots. The template can be formatted using either f-strings (default), jinja2, or mustache syntax Head to Integrations for documentation on built-in integrations with 3rd-party vector stores. RAG addresses a key limitation of models: models rely on fixed training datasets, which can lead to outdated or incomplete information. Chroma This notebook covers how to get started with the Chroma vector store. client_settings (Optional[chromadb. split_text. , a tool to run). prompts import PromptTemplate prompt_template = "Tell me a {adjective} joke" prompt = PromptTemplate( input_variables=["adjective"], template=prompt_template ) llm = LLMChain(llm=OpenAI(), prompt=prompt) Note LangChain 🔌 MCP. Agents By themselves, language models can't take actions - they just output text. How to: construct knowledge graphs LangGraph. The agent returns the observation to the LLM, which can then be used to generate the next action. It accepts a set of parameters from the user that can be used to generate a prompt for a language model. pydantic_v1 or pydantic. 16. The universal invocation protocol (Runnables) along with a syntax for combining components (LangChain Expression Language) are also defined here. 
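For the Docx2txtLoader entry above, and the general pattern that every document loader has its own constructor arguments but is invoked the same way, a minimal sketch follows. The file path is hypothetical, and it assumes the langchain-community and docx2txt packages are installed.

```python
from langchain_community.document_loaders import Docx2txtLoader

loader = Docx2txtLoader("example.docx")  # hypothetical local file
docs = loader.load()                     # returns a list of Document objects

for doc in docs:
    print(doc.metadata, doc.page_content[:80])
```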
The latest and most popular OpenAI models are chat completion models. How to: pass runtime secrets to a runnable LangGraph LangGraph is an extension of LangChain aimed at building robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph. You can access that version of the documentation in the v0. ) This framework Docling parses PDF, DOCX, PPTX, HTML, and other formats into a rich unified representation including document layout, tables etc. prompts. LangChain has two main classes to work with language models: Chat Models and “old-fashioned” LLMs. LangGraph is an extension of LangChain specifically aimed at creating highly controllable and customizable agents. js (Browser, Serverless and Edge functions) Supabase Edge Functions Browser Deno Bun However, note that individual integrations may not be supported in all environments. js ⚡ Building applications with LLMs through composability ⚡ Looking for the Python version? Check out LangChain. chains import LLMChain from langchain_community. memory # Memory maintains Chain state, incorporating context from past runs. langserve: Deploy LangChain chains as REST APIs. BaseMessage [source] # Bases: Serializable Base abstract message class. This example agents # Agent is a class that uses an LLM to choose a sequence of actions to take. LangChain - JavaScript Open-source framework for developing applications powered by large language models (LLMs). You can peruse LangSmith tutorials here. base. Use of Pydantic 2 in user code is fully supported with all packages without the need for bridges like langchain_core. Class hierarchy: messages # Messages are objects used in prompts and chat conversations. The system documents # Document module is a collection of classes that handle documents and their transformations. LangChain has hundreds of integrations with various data sources to load data from: Slack, Notion, Google Drive, etc. ChatHuggingFace This will help you get started with langchain_huggingface chat models. Document module is a collection of classes that handle documents and their transformations. More complex modifications See the full list of integrations in the Section Navigation. Docx2txtLoader(file_path: str | Path) [source] # Load DOCX file using docx2txt and chunks at character level. 2. In Chains, a sequence of actions is hardcoded. Pydantic 1 will no longer be supported as it reached its end-of-life in June 2024. Components 🗃️ Chat models 89 items 🗃️ Retrievers 67 items 🗃️ Tools/Toolkits 136 items 🗃️ Document loaders 197 items 🗃️ Vector stores 120 items 🗃️ Embedding models 86 items 🗃️ Other 9 items Previous Zotero Next Default to a prompt that only contains Document. ⚡️ Quick Install You can use npm, yarn, or pnpm See the full list of integrations in the Section Navigation. Classes How the text is split: by list of characters. Chat models Chain # class langchain. Defaults to “context”. ChatDatabricks Databricks Lakehouse Platform unifies data, analytics, and AI on one platform. Although "LangChain" is in our name, the project is a fusion of ideas and concepts from LangChain, Haystack, LlamaIndex, and the broader community, spiced up with a touch of our own innovation. The LangChain community in Seoul is excited to announce the LangChain OpenTutorial, a brand-new resource designed for everyone. 
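The LangGraph description above (steps modeled as nodes and edges in a graph) can be illustrated with a tiny state graph. A sketch under the assumption that the langgraph package is installed; the state and the single node are toy examples.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    count: int

def increment(state: State) -> State:
    # A node is just a function that reads and updates the shared state.
    return {"count": state["count"] + 1}

builder = StateGraph(State)
builder.add_node("increment", increment)
builder.add_edge(START, "increment")
builder.add_edge("increment", END)

graph = builder.compile()
print(graph.invoke({"count": 0}))  # {'count': 1}
```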
Defaults to check for local file, but if the file is a web path, it will download it to a temporary file, and use that, then clean up the temporary file after ConversationalRetrievalChain # class langchain. Evaluation LangSmith helps you evaluate the performance of your LLM applications. For detailed documentation of all ChatGroq features and configurations head to the API reference. 1, which is no longer actively maintained. Below is a detailed walkthrough of LangChain’s main modules, their roles, and code examples, following the latest Introduction LangChain is a framework for developing applications powered by large language models (LLMs). LangGraph documentation is currently hosted on a separate site. llms import OpenAI from langchain_core. Classes Jul 23, 2025 · LangChain is an open-source framework designed to simplify the creation of applications using large language models (LLMs). Jul 23, 2025 · LangChain is a modular framework designed to build applications powered by large language models (LLMs). Messages are the inputs and outputs of ChatModels. With Amazon DocumentDB, you can run the same application code and use the same drivers and tools that you use with MongoDB. You can peruse LangGraph. To demonstrate LangChain’s ability to inject up-to-date knowledge into your LLM application and the ability to do a semantic search, we cover how to query a document. Get started with LangSmith LangSmith is a platform for building production-grade LLM applications. List of tuples containing documents similar to the query image and their similarity scores. Chat Models Language models that use a sequence of messages as inputs and return chat messages as outputs (as opposed to using plain text). In this tutorial we LangChain Documentation Style Guide Introduction As LangChain continues to grow, the surface area of documentation required to cover it continues to grow too. When the agent This approach is called lexical retrieval, using search algorithms that are typically based upon word frequencies. Below we show example usage. ConversationalRetrievalChain [source] # Bases: BaseConversationalRetrievalChain ChatGoogleGenerativeAI This docs will help you get started with Google AI chat models. The tutorial below is a great way to get Document # class langchain_core. 0. Class hierarchy: Initialize with a Chroma client. Amazon Document DB Amazon DocumentDB (with MongoDB Compatibility) makes it easy to set up, operate, and scale MongoDB-compatible databases in the cloud. Agents select and use Tools and Toolkits for actions. Example This guide provides explanations of the key concepts behind the LangChain framework and AI applications more broadly. 0 chains to the new abstractions. This table provides a brief overview of the main declarative methods. An example use case is as follows: Note langchain-openai >= 0. Key concepts Tools are a way to encapsulate a function and its schema in a way that can be Apr 5, 2024 · LangChain has seen some incredible growth in the last year and half. create_documents. LangChain is a framework for developing applications powered by language models. Class hierarchy: This is documentation for LangChain v0. The interfaces for core components like chat models, vector stores, tools and more are defined here. It involves breaking down large texts into smaller, manageable chunks. How to add memory to chatbots A key feature of chatbots is their ability to use the content of previous conversational turns as context. 
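Retrieval-oriented classes such as ConversationalRetrievalChain above sit on top of a retriever, typically backed by a vector store. Here is a minimal sketch of indexing two documents and retrieving against them. It assumes the langchain-chroma and langchain-openai packages plus an OpenAI API key; the embedding model and texts are illustrative.

```python
from langchain_chroma import Chroma
from langchain_core.documents import Document
from langchain_openai import OpenAIEmbeddings

docs = [
    Document(page_content="LangChain is a framework for LLM applications."),
    Document(page_content="Chroma is an open-source vector database."),
]

# Embed and index the documents, then expose the store as a retriever.
vectorstore = Chroma.from_documents(docs, embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 1})

print(retriever.invoke("What is Chroma?"))  # most similar Document(s)
```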
Browse the classes, functions, and methods for agents, tools, output parsers, and more. How to migrate from v0. No third-party integrations are defined here. chains. May 20, 2024 · Documentation Refresh for LangChain v0. For the current stable version, see this version (Latest). You can peruse LangGraph how-to guides here. Used to embed texts. js documentation is currently hosted on a separate site. Still, this is a great way to get started with LangChain - a lot of features can be built with just some prompting and an LLM call! These are the core chains for working with Documents. Pass in content as positional arg. Chains should be used to encode a sequence of calls to components like models, document retrievers, other chains, etc. Please see the reference for each method for full documentation. documents # Documents module. agents ¶ Agent is a class that uses an LLM to choose a sequence of actions to take. A prompt template consists of a string template. Example LangChain optimizes the run-time execution of chains built with LCEL in a number of ways: Optimized parallel execution: Run Runnables in parallel using RunnableParallel or run multiple inputs through a given chain in parallel using the Runnable Batch API. js to build stateful agents with first-class streaming and human-in-the-loop You are currently on a page documenting the use of Ollama models as text completion models. These are applications that can answer questions about specific source information. It provides a standard interface for chains, many integrations with other tools, and end-to-end chains for common applications. 1. For a list of models supported by Hugging Face check out this page. ) Reason: rely on a language model to reason (about how to answer based on provided context, what actions to take, etc. The intution is simple: a word appears frequently both in the user’s query and a particular document, then this document might be a good match. Python 3. js LangGraph. Main Libraries in the LangChain Ecosystem How to migrate from v0. langchain-opentutorial-pypi: The Python package repository for LangChain OpenTutorial utilities and libraries, available on PyPI for easy integration. Parameters: collection_name (str) – Name of the collection to create. LangGraph. LangChain Python API Reference langchain: 0. word_document. We actively monitor community developments, aiming to quickly incorporate new techniques and integrations, ensuring you stay up-to-date. Creating custom chat model: Custom chat model implementations should inherit from this class. You should subclass this class and implement the following: _call method: Run the LLM on the given prompt and input (used by invoke). 43 ¶ langchain_core. To create LangChain Document objects (e. 72 # langchain-core defines the base abstractions for the LangChain ecosystem. 2 docs. All of LangChain’s reference documentation, in one place. LangChain simplifies every stage of the LLM application lifecycle: Development: Build your applications using LangChain's open-source building blocks, components, and third-party integrations. text langchain-community: 0. 2, which is no longer actively maintained. Its architecture allows developers to integrate LLMs with external data, prompt engineering, retrieval-augmented generation (RAG), semantic search, and agent workflows. Reference Docs # All of LangChain’s reference documentation, in one place. To obtain the string content directly, use . document_loaders import TextLoader from langchain. 
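The custom-LLM notes in the reference snippets above (subclass the simple LLM interface, implement _call, and optionally expose _identifying_params for caching and tracing) can be sketched as follows. The echo behaviour is purely illustrative; a real implementation would call an actual model API.

```python
from typing import Any, Optional

from langchain_core.language_models.llms import LLM

class EchoLLM(LLM):
    """Toy LLM that just echoes the prompt back."""

    @property
    def _llm_type(self) -> str:
        return "echo"

    def _call(self, prompt: str, stop: Optional[list[str]] = None, **kwargs: Any) -> str:
        # A real subclass would send `prompt` to a model endpoint here.
        return f"You said: {prompt}"

llm = EchoLLM()
print(llm.invoke("hello"))  # "You said: hello"
```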
24 What's changed All packages have been upgraded from Pydantic 1 to Pydantic 2 internally. In the 0. Can be either: - A model string like "openai:text You are currently on a page documenting the use of Azure OpenAI text completion models. Learn how to use langchain, a library for building language applications with LLMs and tools. * RetrievalOverview Retrieval Augmented Generation (RAG) is a powerful technique that enhances language models by combining them with external knowledge bases. There This is documentation for LangChain v0. , making them ready for generative AI workflows like RAG. Deploy and scale with LangGraph Platform, with APIs for state management, a visual studio for debugging, and multiple deployment options. 8 will no longer be supported as Develop, deploy, and scale agents with LangGraph Platform — our purpose-built platform for long-running, stateful workflows. Full documentation on all methods, classes, installation methods, and integration setups for LangChain. Sep 22, 2023 · LangChain offers many handy utilities such as document loaders, text splitters, embeddings and vector stores like Chroma. Now that we have this data indexed in a vectorstore, we will create a retrieval chain. LLM # class langchain_core. LangChain is a Python library that simplifies every stage of the LLM application lifecycle: development, productionization, and deployment. It consists of a piece of text and optional metadata. Class hierarchy: Jul 24, 2025 · LangChain provides some prompts/chains for assisting in this. New to LangChain or LLM app development in general? Read this material to quickly get up and running building your first applications. Because BaseChatModel also implements the Runnable Interface, chat models support a standard streaming interface, async programming, optimized batching, and more. llms. chains # Chains are easily reusable components linked together. Class hierarchy: LangChain’s suite of products supports developers along each step of their development journey. Contribute to langchain-ai/langchain-mcp-adapters development by creating an account on GitHub. Each DocumentLoader has its own specific parameters, but they can all be invoked in the same way with the . Dec 9, 2024 · langchain_core. This is a relatively simple LLM application - it's just a single LLM call plus some prompting. We will also demonstrate how to use few-shot prompting in this context to improve performance. This is often achieved via tool-calling. Args: model: Name of the model to use. Classes The LangChain vectorstore class will automatically prepare each raw document using the embeddings model. Chains encode a sequence of calls to components like models, document retrievers, other Chains, etc. Please read the resources below before getting started: Documentation style guide Setup Jun 17, 2025 · Build an Agent LangChain supports the creation of agents, or systems that use LLMs as reasoning engines to determine which actions to take and the inputs necessary to perform the action. Example Jul 23, 2025 · LangChain is an open-source framework designed to simplify the development of advanced language model-based applications. retrievers # Retriever class returns Documents given a text query. 17 ¶ langchain. Check out the quickstart guides for instructions on how to use LangGraph Platform to run a LangGraph application locally or deploy to cloud. Within this new repository, we offer the following enhancements document_loaders # Document Loaders are classes to load Documents. config. 
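For the LCEL execution notes above (running Runnables in parallel with RunnableParallel, or pushing several inputs through one chain with the batch API), a minimal sketch using plain functions wrapped in RunnableLambda:

```python
from langchain_core.runnables import RunnableLambda, RunnableParallel

upper = RunnableLambda(lambda s: s.upper())
length = RunnableLambda(lambda s: len(s))

# Run two runnables on the same input concurrently.
parallel = RunnableParallel(upper=upper, length=length)
print(parallel.invoke("langchain"))     # {'upper': 'LANGCHAIN', 'length': 9}

# Run several inputs through one runnable with the batch API.
print(upper.batch(["hello", "world"]))  # ['HELLO', 'WORLD']
```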
The piece of text is what we interact with the language model, while the optional metadata is useful for keeping track of information about the document (such as its source). The Chain interface makes it easy to create apps that are: Architecture LangChain is a framework that consists of a number of packages. To help you ship LangChain apps to production faster, check out LangSmith. Document # class langchain_core.