Dataset Viewer
Auto-converted to Parquet

Columns: id (string, length 14-16), text (string, length 31-2.05k), source (string, length 49-114)
92e5897c3119-0
Model Comparison# Constructing your language model application will likely involve choosing between many different options of prompts, models, and even chains to use. When doing so, you will want to compare these different options on different inputs in an easy, flexible, and intuitive way. LangChain provides the concept of a ModelLaboratory to test out and try different models. from langchain import LLMChain, OpenAI, Cohere, HuggingFaceHub, PromptTemplate from langchain.model_laboratory import ModelLaboratory llms = [ OpenAI(temperature=0), Cohere(model="command-xlarge-20221108", max_tokens=20, temperature=0), HuggingFaceHub(repo_id="google/flan-t5-xl", model_kwargs={"temperature":1}) ] model_lab = ModelLaboratory.from_llms(llms) model_lab.compare("What color is a flamingo?") Input: What color is a flamingo? OpenAI Params: {'model': 'text-davinci-002', 'temperature': 0.0, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'best_of': 1} Flamingos are pink. Cohere Params: {'model': 'command-xlarge-20221108', 'max_tokens': 20, 'temperature': 0.0, 'k': 0, 'p': 1, 'frequency_penalty': 0, 'presence_penalty': 0} Pink HuggingFaceHub Params: {'repo_id': 'google/flan-t5-xl', 'temperature': 1} pink
https://python.langchain.com/en/latest/model_laboratory.html
92e5897c3119-1
pink prompt = PromptTemplate(template="What is the capital of {state}?", input_variables=["state"]) model_lab_with_prompt = ModelLaboratory.from_llms(llms, prompt=prompt) model_lab_with_prompt.compare("New York") Input: New York OpenAI Params: {'model': 'text-davinci-002', 'temperature': 0.0, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'best_of': 1} The capital of New York is Albany. Cohere Params: {'model': 'command-xlarge-20221108', 'max_tokens': 20, 'temperature': 0.0, 'k': 0, 'p': 1, 'frequency_penalty': 0, 'presence_penalty': 0} The capital of New York is Albany. HuggingFaceHub Params: {'repo_id': 'google/flan-t5-xl', 'temperature': 1} st john s from langchain import SelfAskWithSearchChain, SerpAPIWrapper open_ai_llm = OpenAI(temperature=0) search = SerpAPIWrapper() self_ask_with_search_openai = SelfAskWithSearchChain(llm=open_ai_llm, search_chain=search, verbose=True) cohere_llm = Cohere(temperature=0, model="command-xlarge-20221108") search = SerpAPIWrapper() self_ask_with_search_cohere = SelfAskWithSearchChain(llm=cohere_llm, search_chain=search, verbose=True) chains = [self_ask_with_search_openai, self_ask_with_search_cohere] names = [str(open_ai_llm), str(cohere_llm)]
https://python.langchain.com/en/latest/model_laboratory.html
92e5897c3119-2
names = [str(open_ai_llm), str(cohere_llm)] model_lab = ModelLaboratory(chains, names=names) model_lab.compare("What is the hometown of the reigning men's U.S. Open champion?") Input: What is the hometown of the reigning men's U.S. Open champion? OpenAI Params: {'model': 'text-davinci-002', 'temperature': 0.0, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'best_of': 1} > Entering new chain... What is the hometown of the reigning men's U.S. Open champion? Are follow up questions needed here: Yes. Follow up: Who is the reigning men's U.S. Open champion? Intermediate answer: Carlos Alcaraz. Follow up: Where is Carlos Alcaraz from? Intermediate answer: El Palmar, Spain. So the final answer is: El Palmar, Spain > Finished chain. So the final answer is: El Palmar, Spain Cohere Params: {'model': 'command-xlarge-20221108', 'max_tokens': 256, 'temperature': 0.0, 'k': 0, 'p': 1, 'frequency_penalty': 0, 'presence_penalty': 0} > Entering new chain... What is the hometown of the reigning men's U.S. Open champion? Are follow up questions needed here: Yes. Follow up: Who is the reigning men's U.S. Open champion? Intermediate answer: Carlos Alcaraz. So the final answer is: Carlos Alcaraz > Finished chain. So the final answer is: Carlos Alcaraz By Harrison Chase
https://python.langchain.com/en/latest/model_laboratory.html
7a20e8db90b3-0
LangChain Gallery# Lots of people have built some pretty awesome stuff with LangChain. This is a collection of our favorites. If you see any other demos that you think we should highlight, be sure to let us know! Open Source# HowDoI.ai This is an experiment in building a large-language-model-backed chatbot. It can hold a conversation, remember previous comments/questions, and answer all types of queries (history, web search, movie data, weather, news, and more). YouTube Transcription QA with Sources An end-to-end example of doing question answering on YouTube transcripts, returning the timestamps as sources to legitimize the answer. QA Slack Bot This application is a Slack Bot that uses Langchain and OpenAI’s GPT3 language model to provide domain-specific answers. You provide the documents. ThoughtSource A central, open resource and community around data and tools related to chain-of-thought reasoning in large language models. LLM Strategy This Python package adds a decorator llm_strategy that connects to an LLM (such as OpenAI’s GPT-3) and uses the LLM to “implement” abstract methods in interface classes. It does this by forwarding requests to the LLM and converting the responses back to Python data using Python’s @dataclasses. Zero-Shot Corporate Lobbyist A notebook showing how to use GPT to help with the work of a corporate lobbyist. Dagster Documentation ChatBot Build a GitHub support bot with GPT3, LangChain, and Python. Google Folder Semantic Search A jupyter notebook demonstrating how you could create a semantic search engine on documents in one of your Google Folders. Talk With Wind Record sounds of anything (birds, wind, fire, train station) and chat with it.
https://python.langchain.com/en/latest/gallery.html
7a20e8db90b3-1
Record sounds of anything (birds, wind, fire, train station) and chat with it. ChatGPT LangChain This simple application demonstrates a conversational agent implemented with OpenAI GPT-3.5 and LangChain. When necessary, it leverages tools for complex math, searching the internet, and accessing news and weather. GPT Math Techniques A Hugging Face spaces project showing off the benefits of using PAL for math problems. GPT Political Compass Measure the political compass of GPT. Notion Database Question-Answering Bot Open source GitHub project shows how to use LangChain to create a chatbot that can answer questions about an arbitrary Notion database. LlamaIndex LlamaIndex (formerly GPT Index) is a project consisting of a set of data structures that are created using GPT-3 and can be traversed using GPT-3 in order to answer queries. Grover’s Algorithm Leveraging Qiskit, OpenAI and LangChain to demonstrate Grover’s algorithm QNimGPT A chat UI to play Nim, where a player can select an opponent, either a quantum computer or an AI ReAct TextWorld Leveraging the ReActTextWorldAgent to play TextWorld with an LLM! Fact Checker This repo is a simple demonstration of using LangChain to do fact-checking with prompt chaining. DocsGPT Answer questions about the documentation of any project Misc. Colab Notebooks# Wolfram Alpha in Conversational Agent Give ChatGPT a WolframAlpha neural implant Tool Updates in Agents Agent improvements (6th Jan 2023) Conversational Agent with Tools (Langchain AGI) Langchain AGI (23rd Dec 2022) Proprietary# Daimon A chat-based AI personal assistant with long-term memory about you.
https://python.langchain.com/en/latest/gallery.html
7a20e8db90b3-2
Daimon A chat-based AI personal assistant with long-term memory about you. AI Assisted SQL Query Generator An app to write SQL using natural language, and execute against real DB. Clerkie Stack Tracing QA Bot to help debug complex stack tracing (especially the ones that go multi-function/file deep). Sales Email Writer By Raza Habib, this demo utilizes LangChain + SerpAPI + HumanLoop to write sales emails. Give it a company name and a person, this application will use Google Search (via SerpAPI) to get more information on the company and the person, and then write them a sales message. Question-Answering on a Web Browser By Zahid Khawaja, this demo utilizes question answering to answer questions about a given website. A followup added this for YouTube videos, and then another followup added it for Wikipedia. Mynd A journaling app for self-care that uses AI to uncover insights and patterns over time. previous Glossary next Deployments Contents Open Source Misc. Colab Notebooks Proprietary By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/gallery.html
6d8a57828694-0
.md .pdf Tracing Contents Tracing Walkthrough Changing Sessions Tracing# By enabling tracing in your LangChain runs, you’ll be able to more effectively visualize, step through, and debug your chains and agents. First, you should install tracing and set up your environment properly. You can use either a locally hosted version of this (uses Docker) or a cloud hosted version (in closed alpha). If you’re interested in using the hosted platform, please fill out the form here. Locally Hosted Setup Cloud Hosted Setup Tracing Walkthrough# When you first access the UI, you should see a page with your tracing sessions. An initial one “default” should already be created for you. A session is just a way to group traces together. If you click on a session, it will take you to a page with no recorded traces that says “No Runs.” You can create a new session with the new session form. If we click on the default session, we can see that to start we have no traces stored. If we now start running chains and agents with tracing enabled, we will see data show up here. To do so, we can run this notebook as an example. After running it, we will see an initial trace show up. From here we can explore the trace at a high level by clicking on the arrow to show nested runs. We can keep on clicking further and further down to explore deeper and deeper. We can also click on the “Explore” button of the top level run to dive even deeper. Here, we can see the inputs and outputs in full, as well as all the nested traces. We can keep on exploring each of these nested traces in more detail. For example, here is the lowest level trace with the exact inputs/outputs to the LLM. Changing Sessions#
https://python.langchain.com/en/latest/tracing.html
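The walkthrough above assumes you already have a chain or agent to run. The sketch below is one minimal way to generate a trace under the setup described there; it assumes an OpenAI API key is configured, and the specific agent and tool choices are illustrative rather than prescribed by the walkthrough.

# Minimal sketch: enable the LangChain tracer, then run an agent so a trace
# (with nested runs) shows up in the "default" session of the tracing UI.
import os

# Enable tracing before running any chains or agents.
os.environ["LANGCHAIN_HANDLER"] = "langchain"

from langchain.llms import OpenAI
from langchain.agents import initialize_agent, load_tools

llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)  # illustrative tool choice

# A standard zero-shot agent; each run is recorded as a trace you can explore in the UI.
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("What is 13 raised to the 0.43 power?")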
6d8a57828694-1
Changing Sessions# To initially record traces to a session other than "default", you can set the LANGCHAIN_SESSION environment variable to the name of the session you want to record to: import os os.environ["LANGCHAIN_HANDLER"] = "langchain" os.environ["LANGCHAIN_SESSION"] = "my_session" # Make sure this session actually exists. You can create a new session in the UI. To switch sessions mid-script or mid-notebook, do NOT set the LANGCHAIN_SESSION environment variable. Instead: langchain.set_tracing_callback_manager(session_name="my_session")
https://python.langchain.com/en/latest/tracing.html
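Putting the two approaches from the section above together, a possible flow in a single notebook might look like the following sketch; the chain being traced is just a placeholder, and the session names are assumed to already exist in the UI.

# Record early runs to a named session, then switch sessions mid-notebook.
import os

os.environ["LANGCHAIN_HANDLER"] = "langchain"
os.environ["LANGCHAIN_SESSION"] = "my_session"  # must already exist in the UI

import langchain
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0)
chain = LLMChain(llm=llm, prompt=PromptTemplate.from_template("Say hello to {name}."))
chain.run(name="Alice")  # traced to "my_session"

# Switch sessions without touching the environment variable, as described above.
langchain.set_tracing_callback_manager(session_name="my_other_session")
chain.run(name="Bob")  # traced to "my_other_session"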
2caf5235632c-0
API References# All of LangChain’s reference documentation, in one place. Full documentation on all methods, classes, and APIs in LangChain. Prompts Utilities Chains Agents
https://python.langchain.com/en/latest/reference.html
f0b03329f5fa-0
Welcome to LangChain# LangChain is a framework for developing applications powered by language models. We believe that the most powerful and differentiated applications will not only call out to a language model via an API, but will also: Be data-aware: connect a language model to other sources of data Be agentic: allow a language model to interact with its environment The LangChain framework is designed with the above principles in mind. This is the Python-specific portion of the documentation. For a purely conceptual guide to LangChain, see here. For the JavaScript documentation, see here. Getting Started# Check out the guide below for a walkthrough of how to get started using LangChain to create a Language Model application. Getting Started Documentation Modules# There are several main modules that LangChain provides support for. For each module we provide some examples to get started, how-to guides, reference docs, and conceptual guides. These modules are, in increasing order of complexity: Models: The various model types and model integrations LangChain supports. Prompts: This includes prompt management, prompt optimization, and prompt serialization. Memory: Memory is the concept of persisting state between calls of a chain/agent. LangChain provides a standard interface for memory, a collection of memory implementations, and examples of chains/agents that use memory. Indexes: Language models are often more powerful when combined with your own text data - this module covers best practices for doing exactly that. Chains: Chains go beyond just a single LLM call, and are sequences of calls (whether to an LLM or a different utility). LangChain provides a standard interface for chains, lots of integrations with other tools, and end-to-end chains for common applications.
https://python.langchain.com/en/latest/index.html
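To make the Prompts, Models, and Chains descriptions above concrete, here is a minimal sketch of the three working together; it assumes an OpenAI API key is configured, and the prompt text is only an example.

from langchain import LLMChain, OpenAI, PromptTemplate

# Prompts: a reusable template with a single input variable.
prompt = PromptTemplate(
    template="What is a good name for a company that makes {product}?",
    input_variables=["product"],
)

# Models: any supported LLM wrapper; OpenAI is used here purely as an example.
llm = OpenAI(temperature=0.9)

# Chains: an LLMChain ties the prompt and the model into a single callable step.
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("colorful socks"))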
f0b03329f5fa-1
Agents: Agents involve an LLM making decisions about which Actions to take, taking that Action, seeing an Observation, and repeating that until done. LangChain provides a standard interface for agents, a selection of agents to choose from, and examples of end-to-end agents. Use Cases# The above modules can be used in a variety of ways. LangChain also provides guidance and assistance in this. Below are some of the common use cases LangChain supports. Personal Assistants: The main LangChain use case. Personal assistants need to take actions, remember interactions, and have knowledge about your data. Question Answering: The second big LangChain use case. Answering questions over specific documents, only utilizing the information in those documents to construct an answer. Chatbots: Since language models are good at producing text, they are ideal for creating chatbots. Querying Tabular Data: If you want to understand how to use LLMs to query data that is stored in a tabular format (CSVs, SQL, dataframes, etc.), you should read this page. Interacting with APIs: Enabling LLMs to interact with APIs is extremely powerful, as it gives them access to more up-to-date information and allows them to take actions. Extraction: Extract structured information from text. Summarization: Summarizing longer documents into shorter, more condensed chunks of information. A type of Data Augmented Generation. Evaluation: Generative models are notoriously hard to evaluate with traditional metrics. One new way of evaluating them is using language models themselves to do the evaluation. LangChain provides some prompts/chains for assisting in this. Reference Docs# All of LangChain’s reference documentation, in one place. Full documentation on all methods, classes, installation methods, and integration setups for LangChain. Reference Documentation LangChain Ecosystem# Guides for how other companies/products can be used with LangChain LangChain Ecosystem
https://python.langchain.com/en/latest/index.html
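As a rough illustration of the Agents description above (not an official recipe), the sketch below wires a search tool and a calculator into a standard zero-shot agent; it assumes OpenAI and SerpAPI keys are set in the environment, and the question is only an example.

from langchain.llms import OpenAI
from langchain.agents import initialize_agent, load_tools

llm = OpenAI(temperature=0)

# Tools are the Actions the agent can take; their results are the Observations
# fed back into the LLM on each step.
tools = load_tools(["serpapi", "llm-math"], llm=llm)

# A zero-shot ReAct-style agent decides which tool to call at each step until it is done.
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("Who won the 2022 men's US Open, and what is his age raised to the 0.5 power?")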
f0b03329f5fa-2
Guides for how other companies/products can be used with LangChain LangChain Ecosystem Additional Resources# Additional collection of resources we think may be useful as you develop your application! LangChainHub: The LangChainHub is a place to share and explore other prompts, chains, and agents. Glossary: A glossary of all related terms, papers, methods, etc. Whether implemented in LangChain or not! Gallery: A collection of our favorite projects that use LangChain. Useful for finding inspiration or seeing how things were done in other applications. Deployments: A collection of instructions, code snippets, and template repositories for deploying LangChain apps. Tracing: A guide on using tracing in LangChain to visualize the execution of chains and agents. Model Laboratory: Experimenting with different prompts, models, and chains is a big part of developing the best possible application. The ModelLaboratory makes it easy to do so. Discord: Join us on our Discord to discuss all things LangChain! Production Support: As you move your LangChains into production, we’d love to offer more comprehensive support. Please fill out this form and we’ll set up a dedicated support Slack channel. next Quickstart Guide Contents Getting Started Modules Use Cases Reference Docs LangChain Ecosystem Additional Resources By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/index.html
2b8aa385ef5b-0
LangChain Ecosystem# Guides for how other companies/products can be used with LangChain AI21 Labs AtlasDB Banana CerebriumAI Chroma Cohere DeepInfra Deep Lake ForefrontAI Google Search Wrapper Google Serper Wrapper GooseAI Graphsignal Hazy Research Helicone Hugging Face Jina Milvus Modal NLPCloud OpenAI OpenSearch Petals PGVector Pinecone PromptLayer Qdrant Runhouse SearxNG Search API SerpAPI StochasticAI Unstructured Weights & Biases Weaviate Wolfram Alpha Wrapper Writer
https://python.langchain.com/en/latest/ecosystem.html
74446be6ae35-0
Deployments# So you’ve made a really cool chain - now what? How do you deploy it and make it easily sharable with the world? This section covers several options for that. Note that these are meant as quick deployment options for prototypes and demos, and not for production systems. If you are looking for help with deployment of a production system, please contact us directly. What follows is a list of template GitHub repositories that are intended to be very easy to fork and modify to use your chain. This is far from an exhaustive list of options, and we are EXTREMELY open to contributions here. Streamlit# This repo serves as a template for how to deploy a LangChain app with Streamlit. It implements a chatbot interface. It also contains instructions for how to deploy this app on the Streamlit platform. Gradio (on Hugging Face)# This repo serves as a template for how to deploy a LangChain app with Gradio. It implements a chatbot interface, with a “Bring-Your-Own-Token” approach (nice for not racking up big bills). It also contains instructions for how to deploy this app on the Hugging Face platform. This is heavily influenced by James Weaver’s excellent examples. Beam# This repo serves as a template for how to deploy a LangChain app with Beam. It implements a Question Answering app and contains instructions for deploying the app as a serverless REST API. Vercel# A minimal example on how to run LangChain on Vercel using Flask. SteamShip# This repository contains LangChain adapters for Steamship, enabling LangChain developers to rapidly deploy their apps on Steamship.
https://python.langchain.com/en/latest/deployments.html
74446be6ae35-1
This includes: production-ready endpoints, horizontal scaling across dependencies, persistent storage of app state, multi-tenancy support, etc. Langchain-serve# This repository allows users to serve local chains and agents as RESTful, gRPC, or Websocket APIs thanks to Jina. Deploy your chains & agents with ease and enjoy independent scaling, serverless and autoscaling APIs, as well as a Streamlit playground on Jina AI Cloud.
https://python.langchain.com/en/latest/deployments.html
3295cc81a896-0
Glossary# This is a collection of terminology commonly used when developing LLM applications. It contains references to external papers or sources where the concept was first introduced, as well as to places in LangChain where the concept is used. Chain of Thought Prompting# A prompting technique used to encourage the model to generate a series of intermediate reasoning steps. A less formal way to induce this behavior is to include “Let’s think step-by-step” in the prompt. Resources: Chain-of-Thought Paper Step-by-Step Paper Action Plan Generation# A prompt usage that uses a language model to generate actions to take. The results of these actions can then be fed back into the language model to generate a subsequent action. Resources: WebGPT Paper SayCan Paper ReAct Prompting# A prompting technique that combines Chain-of-Thought prompting with action plan generation. This induces the model to think about what action to take, then take it. Resources: Paper LangChain Example Self-ask# A prompting method that builds on top of chain-of-thought prompting. In this method, the model explicitly asks itself follow-up questions, which are then answered by an external search engine. Resources: Paper LangChain Example Prompt Chaining# Combining multiple LLM calls together, with the output of one step being the input to the next. Resources: PromptChainer Paper Language Model Cascades ICE Primer Book Socratic Models Memetic Proxy#
https://python.langchain.com/en/latest/glossary.html
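As a small illustration of the Chain of Thought Prompting entry above, the sketch below bakes the informal "Let's think step-by-step" trigger into a prompt template; the model choice, template wording, and example question are assumptions for illustration, not part of the glossary.

from langchain import LLMChain, OpenAI, PromptTemplate

# Appending "Let's think step-by-step" is the informal chain-of-thought trigger
# mentioned in the glossary; the surrounding template is just an example.
cot_prompt = PromptTemplate(
    template="Q: {question}\nA: Let's think step-by-step.",
    input_variables=["question"],
)

chain = LLMChain(llm=OpenAI(temperature=0), prompt=cot_prompt)
print(chain.run("If a train travels 60 miles in 90 minutes, what is its speed in mph?"))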
3295cc81a896-1
Language Model Cascades ICE Primer Book Socratic Models Memetic Proxy# Encouraging the LLM to respond in a certain way by framing the discussion in a context that the model knows of and that will result in that type of response. For example, as a conversation between a student and a teacher. Resources: Paper Self Consistency# A decoding strategy that samples a diverse set of reasoning paths and then selects the most consistent answer. It is most effective when combined with Chain-of-thought prompting. Resources: Paper Inception# Also called “First Person Instruction”. Encouraging the model to think a certain way by including the start of the model’s response in the prompt. Resources: Example MemPrompt# MemPrompt maintains a memory of errors and user feedback, and uses them to prevent repetition of mistakes. Resources: Paper
https://python.langchain.com/en/latest/glossary.html
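To make the Inception ("First Person Instruction") entry concrete, here is a hedged sketch: the prompt ends with the beginning of the model's own response, so the completion continues in that frame. The template text and example task are assumptions for illustration only.

from langchain import LLMChain, OpenAI, PromptTemplate

# The prompt ends with the start of the assistant's answer ("Sure, here are three steps:"),
# nudging the model to continue in that voice - the first-person-instruction idea.
inception_prompt = PromptTemplate(
    template="User: How do I {task}?\nAssistant: Sure, here are three steps:\n1.",
    input_variables=["task"],
)

chain = LLMChain(llm=OpenAI(temperature=0), prompt=inception_prompt)
print(chain.run("make a pot of coffee"))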
ab3bb860dcff-0
Index _ | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W _ __call__() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) A aapply() (langchain.chains.LLMChain method) aapply_and_parse() (langchain.chains.LLMChain method) add() (langchain.docstore.InMemoryDocstore method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-1
add() (langchain.docstore.InMemoryDocstore method) add_documents() (langchain.vectorstores.VectorStore method) add_embeddings() (langchain.vectorstores.FAISS method) add_example() (langchain.prompts.example_selector.LengthBasedExampleSelector method) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector method) add_texts() (langchain.vectorstores.AtlasDB method) (langchain.vectorstores.Chroma method) (langchain.vectorstores.DeepLake method) (langchain.vectorstores.ElasticVectorSearch method) (langchain.vectorstores.FAISS method) (langchain.vectorstores.Milvus method) (langchain.vectorstores.OpenSearchVectorSearch method) (langchain.vectorstores.Pinecone method) (langchain.vectorstores.Qdrant method) (langchain.vectorstores.VectorStore method) (langchain.vectorstores.Weaviate method) agenerate() (langchain.chains.LLMChain method) (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-2
(langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) agenerate_prompt() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) agent (langchain.agents.AgentExecutor attribute) (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-3
(langchain.agents.SelfAskWithSearchChain attribute) ai_prefix (langchain.agents.ConversationalAgent attribute) aiosession (langchain.serpapi.SerpAPIWrapper attribute) aleph_alpha_api_key (langchain.llms.AlephAlpha attribute) allowed_tools (langchain.agents.Agent attribute) (langchain.agents.ZeroShotAgent attribute) answers (langchain.utilities.searx_search.SearxResults property) api_answer_chain (langchain.chains.APIChain attribute) api_docs (langchain.chains.APIChain attribute) api_request_chain (langchain.chains.APIChain attribute) api_url (langchain.llms.StochasticAI attribute) aplan() (langchain.agents.Agent method) apply() (langchain.chains.LLMChain method) apply_and_parse() (langchain.chains.LLMChain method) apredict() (langchain.chains.LLMChain method) aprep_prompts() (langchain.chains.LLMChain method) are_all_true_prompt (langchain.chains.LLMSummarizationCheckerChain attribute) arun() (langchain.serpapi.SerpAPIWrapper method) as_retriever() (langchain.vectorstores.VectorStore method) AtlasDB (class in langchain.vectorstores) B bad_words (langchain.llms.NLPCloud attribute) base_embeddings (langchain.chains.HypotheticalDocumentEmbedder attribute) base_url (langchain.llms.AI21 attribute) (langchain.llms.ForefrontAI attribute) (langchain.llms.Writer attribute) batch_size (langchain.llms.AzureOpenAI attribute) beam_search_diversity_rate (langchain.llms.Writer attribute) beam_width (langchain.llms.Writer attribute) best_of (langchain.llms.AlephAlpha attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-4
best_of (langchain.llms.AlephAlpha attribute) (langchain.llms.AzureOpenAI attribute) C callback_manager (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) chain (langchain.chains.ConstitutionalChain attribute) chains (langchain.chains.SequentialChain attribute) (langchain.chains.SimpleSequentialChain attribute) CharacterTextSplitter (class in langchain.text_splitter) check_assertions_prompt (langchain.chains.LLMCheckerChain attribute) (langchain.chains.LLMSummarizationCheckerChain attribute) Chroma (class in langchain.vectorstores) chunk_size (langchain.embeddings.OpenAIEmbeddings attribute) client (langchain.llms.Petals attribute) combine_docs_chain (langchain.chains.AnalyzeDocumentChain attribute) combine_documents_chain (langchain.chains.MapReduceChain attribute) combine_embeddings() (langchain.chains.HypotheticalDocumentEmbedder method) completion_bias_exclusion_first_token_only (langchain.llms.AlephAlpha attribute) constitutional_principles (langchain.chains.ConstitutionalChain attribute) construct() (langchain.llms.AI21 class method) (langchain.llms.AlephAlpha class method) (langchain.llms.Anthropic class method) (langchain.llms.AzureOpenAI class method) (langchain.llms.Banana class method) (langchain.llms.CerebriumAI class method) (langchain.llms.Cohere class method) (langchain.llms.DeepInfra class method) (langchain.llms.ForefrontAI class method) (langchain.llms.GooseAI class method) (langchain.llms.HuggingFaceEndpoint class method) (langchain.llms.HuggingFaceHub class method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-5
(langchain.llms.HuggingFaceHub class method) (langchain.llms.HuggingFacePipeline class method) (langchain.llms.Modal class method) (langchain.llms.NLPCloud class method) (langchain.llms.OpenAI class method) (langchain.llms.OpenAIChat class method) (langchain.llms.Petals class method) (langchain.llms.PromptLayerOpenAI class method) (langchain.llms.PromptLayerOpenAIChat class method) (langchain.llms.SagemakerEndpoint class method) (langchain.llms.SelfHostedHuggingFaceLLM class method) (langchain.llms.SelfHostedPipeline class method) (langchain.llms.StochasticAI class method) (langchain.llms.Writer class method) content_handler (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.SagemakerEndpoint attribute) CONTENT_KEY (langchain.vectorstores.Qdrant attribute) contextual_control_threshold (langchain.llms.AlephAlpha attribute) control_log_additive (langchain.llms.AlephAlpha attribute) copy() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-6
(langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) coroutine (langchain.agents.Tool attribute) countPenalty (langchain.llms.AI21 attribute) create_assertions_prompt (langchain.chains.LLMSummarizationCheckerChain attribute) create_csv_agent() (in module langchain.agents) create_documents() (langchain.text_splitter.TextSplitter method) create_draft_answer_prompt (langchain.chains.LLMCheckerChain attribute) create_index() (langchain.vectorstores.AtlasDB method) create_json_agent() (in module langchain.agents) create_llm_result() (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) create_openapi_agent() (in module langchain.agents) create_outputs() (langchain.chains.LLMChain method) create_pandas_dataframe_agent() (in module langchain.agents) create_prompt() (langchain.agents.Agent class method) (langchain.agents.ConversationalAgent class method) (langchain.agents.ConversationalChatAgent class method) (langchain.agents.ReActTextWorldAgent class method) (langchain.agents.ZeroShotAgent class method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-7
(langchain.agents.ZeroShotAgent class method) create_sql_agent() (in module langchain.agents) create_vectorstore_agent() (in module langchain.agents) create_vectorstore_router_agent() (in module langchain.agents) credentials_profile_name (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.SagemakerEndpoint attribute) critique_chain (langchain.chains.ConstitutionalChain attribute) D database (langchain.chains.SQLDatabaseChain attribute) decider_chain (langchain.chains.SQLDatabaseSequentialChain attribute) DeepLake (class in langchain.vectorstores) delete_collection() (langchain.vectorstores.Chroma method) delete_dataset() (langchain.vectorstores.DeepLake method) deployment_name (langchain.llms.AzureOpenAI attribute) description (langchain.agents.Tool attribute) device (langchain.llms.SelfHostedHuggingFaceLLM attribute) dict() (langchain.agents.Agent method) (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-8
(langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) (langchain.prompts.BasePromptTemplate method) (langchain.prompts.FewShotPromptTemplate method) (langchain.prompts.FewShotPromptWithTemplates method) do_sample (langchain.llms.NLPCloud attribute) (langchain.llms.Petals attribute) E early_stopping (langchain.llms.NLPCloud attribute) early_stopping_method (langchain.agents.AgentExecutor attribute) (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) echo (langchain.llms.AlephAlpha attribute) ElasticVectorSearch (class in langchain.vectorstores) embed_documents() (langchain.chains.HypotheticalDocumentEmbedder method) (langchain.embeddings.CohereEmbeddings method) (langchain.embeddings.FakeEmbeddings method) (langchain.embeddings.HuggingFaceEmbeddings method) (langchain.embeddings.HuggingFaceHubEmbeddings method) (langchain.embeddings.HuggingFaceInstructEmbeddings method) (langchain.embeddings.OpenAIEmbeddings method) (langchain.embeddings.SagemakerEndpointEmbeddings method) (langchain.embeddings.SelfHostedEmbeddings method) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings method) (langchain.embeddings.TensorflowHubEmbeddings method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-9
(langchain.embeddings.TensorflowHubEmbeddings method) embed_instruction (langchain.embeddings.HuggingFaceInstructEmbeddings attribute) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings attribute) embed_query() (langchain.chains.HypotheticalDocumentEmbedder method) (langchain.embeddings.CohereEmbeddings method) (langchain.embeddings.FakeEmbeddings method) (langchain.embeddings.HuggingFaceEmbeddings method) (langchain.embeddings.HuggingFaceHubEmbeddings method) (langchain.embeddings.HuggingFaceInstructEmbeddings method) (langchain.embeddings.OpenAIEmbeddings method) (langchain.embeddings.SagemakerEndpointEmbeddings method) (langchain.embeddings.SelfHostedEmbeddings method) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings method) (langchain.embeddings.TensorflowHubEmbeddings method) endpoint_kwargs (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.SagemakerEndpoint attribute) endpoint_name (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.SagemakerEndpoint attribute) endpoint_url (langchain.llms.CerebriumAI attribute) (langchain.llms.ForefrontAI attribute) (langchain.llms.HuggingFaceEndpoint attribute) (langchain.llms.Modal attribute) engines (langchain.utilities.searx_search.SearxSearchWrapper attribute) entity_extraction_chain (langchain.chains.GraphQAChain attribute) error (langchain.chains.OpenAIModerationChain attribute) example_keys (langchain.prompts.example_selector.SemanticSimilarityExampleSelector attribute) example_prompt (langchain.prompts.example_selector.LengthBasedExampleSelector attribute) (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-10
(langchain.prompts.FewShotPromptWithTemplates attribute) example_selector (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) example_separator (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) examples (langchain.prompts.example_selector.LengthBasedExampleSelector attribute) (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) F FAISS (class in langchain.vectorstores) fetch_k (langchain.prompts.example_selector.MaxMarginalRelevanceExampleSelector attribute) finish_tool_name (langchain.agents.Agent property) (langchain.agents.ConversationalAgent property) format() (langchain.prompts.BasePromptTemplate method) (langchain.prompts.ChatPromptTemplate method) (langchain.prompts.FewShotPromptTemplate method) (langchain.prompts.FewShotPromptWithTemplates method) (langchain.prompts.PromptTemplate method) format_messages() (langchain.prompts.MessagesPlaceholder method) format_prompt() (langchain.prompts.BasePromptTemplate method) (langchain.prompts.ChatPromptTemplate method) (langchain.prompts.StringPromptTemplate method) frequency_penalty (langchain.llms.AlephAlpha attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.Cohere attribute) (langchain.llms.GooseAI attribute) frequencyPenalty (langchain.llms.AI21 attribute) from_agent_and_tools() (langchain.agents.AgentExecutor class method) from_chains() (langchain.agents.MRKLChain class method) from_colored_object_prompt() (langchain.chains.PALChain class method) from_documents() (langchain.vectorstores.AtlasDB class method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-11
from_documents() (langchain.vectorstores.AtlasDB class method) (langchain.vectorstores.Chroma class method) (langchain.vectorstores.Qdrant class method) (langchain.vectorstores.VectorStore class method) from_embeddings() (langchain.vectorstores.FAISS class method) from_examples() (langchain.prompts.example_selector.MaxMarginalRelevanceExampleSelector class method) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector class method) (langchain.prompts.PromptTemplate class method) from_existing_index() (langchain.vectorstores.Pinecone class method) from_file() (langchain.prompts.PromptTemplate class method) from_huggingface_tokenizer() (langchain.text_splitter.TextSplitter class method) from_llm() (langchain.chains.ChatVectorDBChain class method) (langchain.chains.ConstitutionalChain class method) (langchain.chains.ConversationalRetrievalChain class method) (langchain.chains.GraphQAChain class method) (langchain.chains.HypotheticalDocumentEmbedder class method) (langchain.chains.QAGenerationChain class method) (langchain.chains.SQLDatabaseSequentialChain class method) from_llm_and_api_docs() (langchain.chains.APIChain class method) from_llm_and_tools() (langchain.agents.Agent class method) (langchain.agents.ConversationalAgent class method) (langchain.agents.ConversationalChatAgent class method) (langchain.agents.ZeroShotAgent class method) from_math_prompt() (langchain.chains.PALChain class method) from_model_id() (langchain.llms.HuggingFacePipeline class method) from_params() (langchain.chains.MapReduceChain class method) from_pipeline() (langchain.llms.SelfHostedHuggingFaceLLM class method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-12
from_pipeline() (langchain.llms.SelfHostedHuggingFaceLLM class method) (langchain.llms.SelfHostedPipeline class method) from_string() (langchain.chains.LLMChain class method) from_template() (langchain.prompts.PromptTemplate class method) from_texts() (langchain.vectorstores.AtlasDB class method) (langchain.vectorstores.Chroma class method) (langchain.vectorstores.DeepLake class method) (langchain.vectorstores.ElasticVectorSearch class method) (langchain.vectorstores.FAISS class method) (langchain.vectorstores.Milvus class method) (langchain.vectorstores.OpenSearchVectorSearch class method) (langchain.vectorstores.Pinecone class method) (langchain.vectorstores.Qdrant class method) (langchain.vectorstores.VectorStore class method) (langchain.vectorstores.Weaviate class method) from_tiktoken_encoder() (langchain.text_splitter.TextSplitter class method) func (langchain.agents.Tool attribute) G generate() (langchain.chains.LLMChain method) (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-13
(langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) generate_prompt() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-14
(langchain.llms.StochasticAI method) (langchain.llms.Writer method) get_all_tool_names() (in module langchain.agents) get_answer_expr (langchain.chains.PALChain attribute) get_full_inputs() (langchain.agents.Agent method) get_num_tokens() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) get_num_tokens_from_messages() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-15
(langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) get_params() (langchain.serpapi.SerpAPIWrapper method) get_principles() (langchain.chains.ConstitutionalChain class method) get_sub_prompts() (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) get_text_length (langchain.prompts.example_selector.LengthBasedExampleSelector attribute) globals (langchain.python.PythonREPL attribute) graph (langchain.chains.GraphQAChain attribute) H hardware (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.SelfHostedPipeline attribute) headers (langchain.utilities.searx_search.SearxSearchWrapper attribute) I
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-16
headers (langchain.utilities.searx_search.SearxSearchWrapper attribute) I i (langchain.agents.ReActTextWorldAgent attribute) inference_fn (langchain.embeddings.SelfHostedEmbeddings attribute) (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.SelfHostedPipeline attribute) inference_kwargs (langchain.embeddings.SelfHostedEmbeddings attribute) initialize_agent() (in module langchain.agents) InMemoryDocstore (class in langchain.docstore) input_key (langchain.chains.QAGenerationChain attribute) input_keys (langchain.chains.ConstitutionalChain property) (langchain.chains.ConversationChain property) (langchain.chains.HypotheticalDocumentEmbedder property) (langchain.chains.QAGenerationChain property) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector attribute) input_variables (langchain.chains.SequentialChain attribute) (langchain.chains.TransformChain attribute) (langchain.prompts.BasePromptTemplate attribute) (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) (langchain.prompts.MessagesPlaceholder property) (langchain.prompts.PromptTemplate attribute) J json() (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-17
(langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) K k (langchain.chains.QAGenerationChain attribute) (langchain.chains.VectorDBQA attribute) (langchain.chains.VectorDBQAWithSourcesChain attribute) (langchain.llms.Cohere attribute) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector attribute) (langchain.utilities.searx_search.SearxSearchWrapper attribute) L langchain.agents module langchain.chains module langchain.docstore module langchain.embeddings module langchain.llms module langchain.prompts module langchain.prompts.example_selector module langchain.python module langchain.serpapi module langchain.text_splitter module langchain.utilities.searx_search module langchain.vectorstores module
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-18
module langchain.vectorstores module LatexTextSplitter (class in langchain.text_splitter) length (langchain.llms.ForefrontAI attribute) (langchain.llms.Writer attribute) length_no_input (langchain.llms.NLPCloud attribute) length_penalty (langchain.llms.NLPCloud attribute) length_pentaly (langchain.llms.Writer attribute) list_assertions_prompt (langchain.chains.LLMCheckerChain attribute) llm (langchain.chains.LLMBashChain attribute) (langchain.chains.LLMChain attribute) (langchain.chains.LLMCheckerChain attribute) (langchain.chains.LLMMathChain attribute) (langchain.chains.LLMSummarizationCheckerChain attribute) (langchain.chains.PALChain attribute) (langchain.chains.SQLDatabaseChain attribute) llm_chain (langchain.agents.Agent attribute) (langchain.agents.ZeroShotAgent attribute) (langchain.chains.HypotheticalDocumentEmbedder attribute) (langchain.chains.LLMRequestsChain attribute) (langchain.chains.QAGenerationChain attribute) llm_prefix (langchain.agents.Agent property) (langchain.agents.ConversationalAgent property) (langchain.agents.ConversationalChatAgent property) (langchain.agents.ZeroShotAgent property) load_agent() (in module langchain.agents) load_chain() (in module langchain.chains) load_fn_kwargs (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.SelfHostedPipeline attribute) load_local() (langchain.vectorstores.FAISS class method) load_prompt() (in module langchain.prompts)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-19
load_prompt() (in module langchain.prompts) load_tools() (in module langchain.agents) locals (langchain.python.PythonREPL attribute) log_probs (langchain.llms.AlephAlpha attribute) logit_bias (langchain.llms.AlephAlpha attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.GooseAI attribute) logitBias (langchain.llms.AI21 attribute) logprobs (langchain.llms.Writer attribute) lookup_tool() (langchain.agents.AgentExecutor method) M MarkdownTextSplitter (class in langchain.text_splitter) max_checks (langchain.chains.LLMSummarizationCheckerChain attribute) max_iterations (langchain.agents.AgentExecutor attribute) (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) max_length (langchain.llms.NLPCloud attribute) (langchain.llms.Petals attribute) (langchain.prompts.example_selector.LengthBasedExampleSelector attribute) max_marginal_relevance_search() (langchain.vectorstores.FAISS method) (langchain.vectorstores.Milvus method) (langchain.vectorstores.Qdrant method) (langchain.vectorstores.VectorStore method) max_marginal_relevance_search_by_vector() (langchain.vectorstores.FAISS method) (langchain.vectorstores.VectorStore method) max_new_tokens (langchain.llms.Petals attribute) max_retries (langchain.embeddings.OpenAIEmbeddings attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.OpenAIChat attribute) (langchain.llms.PromptLayerOpenAIChat attribute) max_tokens (langchain.llms.AzureOpenAI attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-20
max_tokens (langchain.llms.AzureOpenAI attribute) (langchain.llms.Cohere attribute) (langchain.llms.GooseAI attribute) max_tokens_for_prompt() (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) max_tokens_limit (langchain.chains.RetrievalQAWithSourcesChain attribute) (langchain.chains.VectorDBQAWithSourcesChain attribute) max_tokens_to_sample (langchain.llms.Anthropic attribute) maximum_tokens (langchain.llms.AlephAlpha attribute) maxTokens (langchain.llms.AI21 attribute) memory (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) (langchain.chains.ConversationChain attribute) merge_from() (langchain.vectorstores.FAISS method) METADATA_KEY (langchain.vectorstores.Qdrant attribute) Milvus (class in langchain.vectorstores) min_length (langchain.llms.NLPCloud attribute) min_tokens (langchain.llms.GooseAI attribute) minimum_tokens (langchain.llms.AlephAlpha attribute) minTokens (langchain.llms.AI21 attribute) model (langchain.embeddings.CohereEmbeddings attribute) (langchain.llms.AI21 attribute) (langchain.llms.AlephAlpha attribute) (langchain.llms.Anthropic attribute) (langchain.llms.Cohere attribute) model_id (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings attribute) (langchain.llms.HuggingFacePipeline attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-21
(langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.Writer attribute) model_key (langchain.llms.Banana attribute) model_kwargs (langchain.embeddings.HuggingFaceHubEmbeddings attribute) (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.Banana attribute) (langchain.llms.CerebriumAI attribute) (langchain.llms.GooseAI attribute) (langchain.llms.HuggingFaceEndpoint attribute) (langchain.llms.HuggingFaceHub attribute) (langchain.llms.HuggingFacePipeline attribute) (langchain.llms.Modal attribute) (langchain.llms.OpenAIChat attribute) (langchain.llms.Petals attribute) (langchain.llms.PromptLayerOpenAIChat attribute) (langchain.llms.SagemakerEndpoint attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.StochasticAI attribute) model_load_fn (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.SelfHostedPipeline attribute) model_name (langchain.chains.OpenAIModerationChain attribute) (langchain.embeddings.HuggingFaceEmbeddings attribute) (langchain.embeddings.HuggingFaceInstructEmbeddings attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.GooseAI attribute) (langchain.llms.NLPCloud attribute) (langchain.llms.OpenAIChat attribute) (langchain.llms.Petals attribute) (langchain.llms.PromptLayerOpenAIChat attribute) model_reqs (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-22
model_reqs (langchain.embeddings.SelfHostedHuggingFaceEmbeddings attribute) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) (langchain.llms.SelfHostedPipeline attribute) model_url (langchain.embeddings.TensorflowHubEmbeddings attribute) modelname_to_contextsize() (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) module langchain.agents langchain.chains langchain.docstore langchain.embeddings langchain.llms langchain.prompts langchain.prompts.example_selector langchain.python langchain.serpapi langchain.text_splitter langchain.utilities.searx_search langchain.vectorstores N n (langchain.llms.AlephAlpha attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.GooseAI attribute) NLTKTextSplitter (class in langchain.text_splitter) num_beams (langchain.llms.NLPCloud attribute) num_return_sequences (langchain.llms.NLPCloud attribute) numResults (langchain.llms.AI21 attribute) O observation_prefix (langchain.agents.Agent property) (langchain.agents.ConversationalAgent property) (langchain.agents.ConversationalChatAgent property) (langchain.agents.ZeroShotAgent property) openai_api_key (langchain.chains.OpenAIModerationChain attribute) OpenSearchVectorSearch (class in langchain.vectorstores) output_key (langchain.chains.QAGenerationChain attribute) output_keys (langchain.chains.ConstitutionalChain property) (langchain.chains.HypotheticalDocumentEmbedder property)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-23
(langchain.chains.HypotheticalDocumentEmbedder property) (langchain.chains.QAGenerationChain property) output_parser (langchain.agents.ConversationalChatAgent attribute) (langchain.prompts.BasePromptTemplate attribute) output_variables (langchain.chains.TransformChain attribute) P p (langchain.llms.Cohere attribute) params (langchain.serpapi.SerpAPIWrapper attribute) (langchain.utilities.searx_search.SearxSearchWrapper attribute) partial() (langchain.prompts.BasePromptTemplate method) (langchain.prompts.ChatPromptTemplate method) penalty_bias (langchain.llms.AlephAlpha attribute) penalty_exceptions (langchain.llms.AlephAlpha attribute) penalty_exceptions_include_stop_sequences (langchain.llms.AlephAlpha attribute) persist() (langchain.vectorstores.Chroma method) (langchain.vectorstores.DeepLake method) Pinecone (class in langchain.vectorstores) plan() (langchain.agents.Agent method) predict() (langchain.chains.LLMChain method) predict_and_parse() (langchain.chains.LLMChain method) prefix (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) prefix_messages (langchain.llms.OpenAIChat attribute) (langchain.llms.PromptLayerOpenAIChat attribute) prep_prompts() (langchain.chains.LLMChain method) prep_streaming_params() (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) prepare_for_new_call() (langchain.agents.Agent method) presence_penalty (langchain.llms.AlephAlpha attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.Cohere attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-24
(langchain.llms.AzureOpenAI attribute) (langchain.llms.Cohere attribute) (langchain.llms.GooseAI attribute) presencePenalty (langchain.llms.AI21 attribute) Prompt (in module langchain.prompts) prompt (langchain.chains.ConversationChain attribute) (langchain.chains.LLMBashChain attribute) (langchain.chains.LLMChain attribute) (langchain.chains.LLMMathChain attribute) (langchain.chains.PALChain attribute) (langchain.chains.SQLDatabaseChain attribute) python_globals (langchain.chains.PALChain attribute) python_locals (langchain.chains.PALChain attribute) PythonCodeTextSplitter (class in langchain.text_splitter) Q qa_chain (langchain.chains.GraphQAChain attribute) Qdrant (class in langchain.vectorstores) query_instruction (langchain.embeddings.HuggingFaceInstructEmbeddings attribute) (langchain.embeddings.SelfHostedHuggingFaceInstructEmbeddings attribute) query_suffix (langchain.utilities.searx_search.SearxSearchWrapper attribute) R random_seed (langchain.llms.Writer attribute) raw_completion (langchain.llms.AlephAlpha attribute) RecursiveCharacterTextSplitter (class in langchain.text_splitter) reduce_k_below_max_tokens (langchain.chains.RetrievalQAWithSourcesChain attribute) (langchain.chains.VectorDBQAWithSourcesChain attribute) region_name (langchain.embeddings.SagemakerEndpointEmbeddings attribute) (langchain.llms.SagemakerEndpoint attribute) remove_end_sequence (langchain.llms.NLPCloud attribute) remove_input (langchain.llms.NLPCloud attribute) repetition_penalties_include_completion (langchain.llms.AlephAlpha attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-25
repetition_penalties_include_completion (langchain.llms.AlephAlpha attribute) repetition_penalties_include_prompt (langchain.llms.AlephAlpha attribute) repetition_penalty (langchain.llms.ForefrontAI attribute) (langchain.llms.NLPCloud attribute) (langchain.llms.Writer attribute) repo_id (langchain.embeddings.HuggingFaceHubEmbeddings attribute) (langchain.llms.HuggingFaceHub attribute) request_timeout (langchain.llms.AzureOpenAI attribute) requests_wrapper (langchain.chains.APIChain attribute) (langchain.chains.LLMRequestsChain attribute) results() (langchain.serpapi.SerpAPIWrapper method) (langchain.utilities.searx_search.SearxSearchWrapper method) retriever (langchain.chains.ConversationalRetrievalChain attribute) (langchain.chains.RetrievalQA attribute) (langchain.chains.RetrievalQAWithSourcesChain attribute) return_all (langchain.chains.SequentialChain attribute) return_direct (langchain.chains.SQLDatabaseChain attribute) return_intermediate_steps (langchain.agents.AgentExecutor attribute) (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) (langchain.chains.PALChain attribute) (langchain.chains.SQLDatabaseChain attribute) (langchain.chains.SQLDatabaseSequentialChain attribute) return_stopped_response() (langchain.agents.Agent method) return_values (langchain.agents.Agent attribute) (langchain.agents.ZeroShotAgent attribute) revised_answer_prompt (langchain.chains.LLMCheckerChain attribute) revised_summary_prompt (langchain.chains.LLMSummarizationCheckerChain attribute) revision_chain (langchain.chains.ConstitutionalChain attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-26
revision_chain (langchain.chains.ConstitutionalChain attribute) run() (langchain.python.PythonREPL method) (langchain.serpapi.SerpAPIWrapper method) (langchain.utilities.searx_search.SearxSearchWrapper method) S save() (langchain.agents.Agent method) (langchain.agents.AgentExecutor method) (langchain.llms.AI21 method) (langchain.llms.AlephAlpha method) (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.Banana method) (langchain.llms.CerebriumAI method) (langchain.llms.Cohere method) (langchain.llms.DeepInfra method) (langchain.llms.ForefrontAI method) (langchain.llms.GooseAI method) (langchain.llms.HuggingFaceEndpoint method) (langchain.llms.HuggingFaceHub method) (langchain.llms.HuggingFacePipeline method) (langchain.llms.Modal method) (langchain.llms.NLPCloud method) (langchain.llms.OpenAI method) (langchain.llms.OpenAIChat method) (langchain.llms.Petals method) (langchain.llms.PromptLayerOpenAI method) (langchain.llms.PromptLayerOpenAIChat method) (langchain.llms.SagemakerEndpoint method) (langchain.llms.SelfHostedHuggingFaceLLM method) (langchain.llms.SelfHostedPipeline method) (langchain.llms.StochasticAI method) (langchain.llms.Writer method) (langchain.prompts.BasePromptTemplate method) (langchain.prompts.ChatPromptTemplate method) save_agent() (langchain.agents.AgentExecutor method) save_local() (langchain.vectorstores.FAISS method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-27
save_local() (langchain.vectorstores.FAISS method) search() (langchain.docstore.InMemoryDocstore method) (langchain.docstore.Wikipedia method) search_kwargs (langchain.chains.ChatVectorDBChain attribute) (langchain.chains.VectorDBQA attribute) (langchain.chains.VectorDBQAWithSourcesChain attribute) search_type (langchain.chains.VectorDBQA attribute) searx_host (langchain.utilities.searx_search.SearxSearchWrapper attribute) SearxResults (class in langchain.utilities.searx_search) select_examples() (langchain.prompts.example_selector.LengthBasedExampleSelector method) (langchain.prompts.example_selector.MaxMarginalRelevanceExampleSelector method) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector method) serpapi_api_key (langchain.serpapi.SerpAPIWrapper attribute) similarity_search() (langchain.vectorstores.AtlasDB method) (langchain.vectorstores.Chroma method) (langchain.vectorstores.DeepLake method) (langchain.vectorstores.ElasticVectorSearch method) (langchain.vectorstores.FAISS method) (langchain.vectorstores.Milvus method) (langchain.vectorstores.OpenSearchVectorSearch method) (langchain.vectorstores.Pinecone method) (langchain.vectorstores.Qdrant method) (langchain.vectorstores.VectorStore method) (langchain.vectorstores.Weaviate method) similarity_search_by_vector() (langchain.vectorstores.Chroma method) (langchain.vectorstores.FAISS method) (langchain.vectorstores.VectorStore method) similarity_search_with_score() (langchain.vectorstores.Chroma method) (langchain.vectorstores.FAISS method) (langchain.vectorstores.Milvus method) (langchain.vectorstores.Pinecone method) (langchain.vectorstores.Qdrant method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-28
(langchain.vectorstores.Pinecone method) (langchain.vectorstores.Qdrant method) similarity_search_with_score_by_vector() (langchain.vectorstores.FAISS method) SpacyTextSplitter (class in langchain.text_splitter) split_documents() (langchain.text_splitter.TextSplitter method) split_text() (langchain.text_splitter.CharacterTextSplitter method) (langchain.text_splitter.NLTKTextSplitter method) (langchain.text_splitter.RecursiveCharacterTextSplitter method) (langchain.text_splitter.SpacyTextSplitter method) (langchain.text_splitter.TextSplitter method) (langchain.text_splitter.TokenTextSplitter method) sql_chain (langchain.chains.SQLDatabaseSequentialChain attribute) stop (langchain.chains.PALChain attribute) (langchain.llms.Writer attribute) stop_sequences (langchain.llms.AlephAlpha attribute) stream() (langchain.llms.Anthropic method) (langchain.llms.AzureOpenAI method) (langchain.llms.OpenAI method) (langchain.llms.PromptLayerOpenAI method) streaming (langchain.llms.Anthropic attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.OpenAIChat attribute) (langchain.llms.PromptLayerOpenAIChat attribute) strip_outputs (langchain.chains.SimpleSequentialChain attribute) suffix (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) T task (langchain.embeddings.HuggingFaceHubEmbeddings attribute) (langchain.llms.HuggingFaceEndpoint attribute) (langchain.llms.HuggingFaceHub attribute) (langchain.llms.SelfHostedHuggingFaceLLM attribute) temperature (langchain.llms.AI21 attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-29
temperature (langchain.llms.AI21 attribute) (langchain.llms.AlephAlpha attribute) (langchain.llms.Anthropic attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.Cohere attribute) (langchain.llms.ForefrontAI attribute) (langchain.llms.GooseAI attribute) (langchain.llms.NLPCloud attribute) (langchain.llms.Petals attribute) (langchain.llms.Writer attribute) template (langchain.prompts.PromptTemplate attribute) template_format (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) (langchain.prompts.PromptTemplate attribute) text_length (langchain.chains.LLMRequestsChain attribute) text_splitter (langchain.chains.AnalyzeDocumentChain attribute) (langchain.chains.MapReduceChain attribute) (langchain.chains.QAGenerationChain attribute) TextSplitter (class in langchain.text_splitter) tokenizer (langchain.llms.Petals attribute) tokens (langchain.llms.AlephAlpha attribute) tokens_to_generate (langchain.llms.Writer attribute) TokenTextSplitter (class in langchain.text_splitter) tool() (in module langchain.agents) tools (langchain.agents.AgentExecutor attribute) (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) top_k (langchain.chains.SQLDatabaseChain attribute) (langchain.llms.AlephAlpha attribute) (langchain.llms.Anthropic attribute) (langchain.llms.ForefrontAI attribute) (langchain.llms.NLPCloud attribute) (langchain.llms.Petals attribute) (langchain.llms.Writer attribute)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-30
(langchain.llms.Petals attribute) (langchain.llms.Writer attribute) top_k_docs_for_context (langchain.chains.ChatVectorDBChain attribute) top_p (langchain.llms.AlephAlpha attribute) (langchain.llms.Anthropic attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.ForefrontAI attribute) (langchain.llms.GooseAI attribute) (langchain.llms.NLPCloud attribute) (langchain.llms.Petals attribute) (langchain.llms.Writer attribute) topP (langchain.llms.AI21 attribute) transform (langchain.chains.TransformChain attribute) truncate (langchain.embeddings.CohereEmbeddings attribute) (langchain.llms.Cohere attribute) U unsecure (langchain.utilities.searx_search.SearxSearchWrapper attribute) update_forward_refs() (langchain.llms.AI21 class method) (langchain.llms.AlephAlpha class method) (langchain.llms.Anthropic class method) (langchain.llms.AzureOpenAI class method) (langchain.llms.Banana class method) (langchain.llms.CerebriumAI class method) (langchain.llms.Cohere class method) (langchain.llms.DeepInfra class method) (langchain.llms.ForefrontAI class method) (langchain.llms.GooseAI class method) (langchain.llms.HuggingFaceEndpoint class method) (langchain.llms.HuggingFaceHub class method) (langchain.llms.HuggingFacePipeline class method) (langchain.llms.Modal class method) (langchain.llms.NLPCloud class method) (langchain.llms.OpenAI class method) (langchain.llms.OpenAIChat class method) (langchain.llms.Petals class method)
https://python.langchain.com/en/latest/genindex.html
ab3bb860dcff-31
(langchain.llms.Petals class method) (langchain.llms.PromptLayerOpenAI class method) (langchain.llms.PromptLayerOpenAIChat class method) (langchain.llms.SagemakerEndpoint class method) (langchain.llms.SelfHostedHuggingFaceLLM class method) (langchain.llms.SelfHostedPipeline class method) (langchain.llms.StochasticAI class method) (langchain.llms.Writer class method) use_multiplicative_presence_penalty (langchain.llms.AlephAlpha attribute) V validate_template (langchain.prompts.FewShotPromptTemplate attribute) (langchain.prompts.FewShotPromptWithTemplates attribute) (langchain.prompts.PromptTemplate attribute) VectorStore (class in langchain.vectorstores) vectorstore (langchain.chains.ChatVectorDBChain attribute) (langchain.chains.VectorDBQA attribute) (langchain.chains.VectorDBQAWithSourcesChain attribute) (langchain.prompts.example_selector.SemanticSimilarityExampleSelector attribute) verbose (langchain.agents.MRKLChain attribute) (langchain.agents.ReActChain attribute) (langchain.agents.SelfAskWithSearchChain attribute) (langchain.llms.AzureOpenAI attribute) (langchain.llms.OpenAI attribute) (langchain.llms.OpenAIChat attribute) W Weaviate (class in langchain.vectorstores) Wikipedia (class in langchain.docstore) By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/genindex.html
d0595c4e7a1e-0
.rst .pdf Chains Chains# Note Conceptual Guide Using an LLM in isolation is fine for some simple applications, but many more complex ones require chaining LLMs - either with each other or with other experts. LangChain provides a standard interface for Chains, as well as some common implementations of chains for ease of use. The following sections of documentation are provided: Getting Started: A getting started guide for chains, to get you up and running quickly. How-To Guides: A collection of how-to guides. These highlight how to use various types of chains. Reference: API reference documentation for all Chain classes. previous How to use multiple memory classes in the same chain next Getting Started By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
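To make the standard Chain interface mentioned above concrete, here is a minimal, hedged sketch (added for illustration, not part of the original page); it assumes an OpenAI API key is configured and uses the LLMChain covered in Getting Started:
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Build a simple chain: a prompt template plus an LLM.
llm = OpenAI(temperature=0)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)

# Every chain shares the same basic calling interface:
chain.run("colorful socks")            # single string in, single string out
chain({"product": "colorful socks"})   # dict in, dict out (inputs plus output keys)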
https://python.langchain.com/en/latest/modules/chains.html
1d44c8274ab3-0
.rst .pdf Indexes Contents Go Deeper Indexes# Note Conceptual Guide Indexes refer to ways to structure documents so that LLMs can best interact with them. This module contains utility functions for working with documents, different types of indexes, and then examples for using those indexes in chains. The most common way that indexes are used in chains is in a “retrieval” step. This step refers to taking a user’s query and returning the most relevant documents. We draw this distinction because (1) an index can be used for other things besides retrieval, and (2) retrieval can use other logic besides an index to find relevant documents. We therefore have a concept of a “Retriever” interface - this is the interface that most chains work with. Most of the time when we talk about indexes and retrieval we are talking about indexing and retrieving unstructured data (like text documents). For interacting with structured data (SQL tables, etc.) or APIs, please see the corresponding use case sections for links to relevant functionality. The primary index and retrieval types supported by LangChain are currently centered around vector databases, and therefore much of the documentation here dives deep on those topics. For an overview of everything related to this, please see the below notebook for getting started: Getting Started We then provide a deep dive on the four main components. Document Loaders How to load documents from a variety of sources. Text Splitters An overview of the abstractions and implementations around splitting text. VectorStores An overview of VectorStores and the many integrations LangChain provides. Retrievers An overview of Retrievers and the implementations LangChain provides. Go Deeper# Document Loaders Text Splitters Vectorstores Retrievers previous Structured Output Parser next Getting Started Contents Go Deeper
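To make the “Retriever” interface described above concrete, here is a minimal, hedged sketch (added for illustration, not part of the original page). It assumes the FAISS vector store integration (faiss-cpu installed) and an OpenAI API key are available, and the file path is only a placeholder:
from langchain.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.llms import OpenAI

# Load and split documents, then index them in a vector store.
documents = TextLoader("state_of_the_union.txt").load()
texts = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(documents)
vectorstore = FAISS.from_documents(texts, OpenAIEmbeddings())

# A retriever wraps the index behind the generic "fetch relevant documents" interface.
retriever = vectorstore.as_retriever()
docs = retriever.get_relevant_documents("What did the president say about the economy?")

# Most retrieval chains accept any Retriever, not just one backed by a vector store.
qa = RetrievalQA.from_chain_type(llm=OpenAI(temperature=0), chain_type="stuff", retriever=retriever)
qa.run("What did the president say about the economy?")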
https://python.langchain.com/en/latest/modules/indexes.html
1d44c8274ab3-1
previous Structured Output Parser next Getting Started Contents Go Deeper By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/indexes.html
b34d811e5bf3-0
.rst .pdf Memory Memory# Note Conceptual Guide By default, Chains and Agents are stateless, meaning that they treat each incoming query independently (as are the underlying LLMs and chat models). In some applications (chatbots being a GREAT example) it is highly important to remember previous interactions, at both a short-term and a long-term level. The concept of “Memory” exists to do exactly that. LangChain provides memory components in two forms. First, LangChain provides helper utilities for managing and manipulating previous chat messages. These are designed to be modular and useful regardless of how they are used. Second, LangChain provides easy ways to incorporate these utilities into chains. The following sections of documentation are provided: Getting Started: An overview of how to get started with different types of memory. How-To Guides: A collection of how-to guides. These highlight different types of memory, as well as how to use memory in chains. Memory Getting Started How-To Guides previous VectorStore Retriever next Getting Started By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
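As an illustrative aside (added here, not part of the original page), this minimal sketch shows the second form of memory - plugging a memory component into a chain. It assumes an OpenAI API key is configured; ConversationBufferMemory is just one of the available memory classes:
from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# The memory object stores prior turns and injects them into each new prompt.
conversation = ConversationChain(
    llm=OpenAI(temperature=0),
    memory=ConversationBufferMemory(),
    verbose=True,
)

conversation.predict(input="Hi, my name is Sam.")
conversation.predict(input="What is my name?")  # answered using the stored history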
https://python.langchain.com/en/latest/modules/memory.html
ea6607c01582-0
.rst .pdf Prompts Contents Go Deeper Prompts# Note Conceptual Guide The new way of programming models is through prompts. A “prompt” refers to the input to the model. This input is rarely hard-coded, but rather is often constructed from multiple components. A PromptTemplate is responsible for the construction of this input. LangChain provides several classes and functions to make constructing and working with prompts easy. This section of documentation is split into four sections: LLM Prompt Templates How to use PromptTemplates to prompt Language Models. Chat Prompt Templates How to use PromptTemplates to prompt Chat Models. Example Selectors It is often useful to include examples in prompts. These examples can be hardcoded, but it is often more powerful if they are dynamically selected. This section goes over example selection. Output Parsers Language models (and Chat Models) output text. But you will often want more structured information than just text back. This is where output parsers come in. Output Parsers are responsible for (1) instructing the model how output should be formatted, (2) parsing output into the desired format (including retrying if necessary). Go Deeper# Prompt Templates Chat Prompt Template Example Selectors Output Parsers previous TensorflowHub next Prompt Templates Contents Go Deeper By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
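As a small illustrative example (added here, not from the original page), a PromptTemplate composes the final model input from a template string and its input variables:
from langchain.prompts import PromptTemplate

# The template declares its input variables; format() fills them in.
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)

prompt.format(product="colorful socks")
# -> 'What is a good name for a company that makes colorful socks?'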
https://python.langchain.com/en/latest/modules/prompts.html
0212ca310510-0
.rst .pdf Agents Contents Go Deeper Agents# Note Conceptual Guide Some applications will require not just a predetermined chain of calls to LLMs/other tools, but potentially an unknown chain that depends on the user’s input. In these types of chains, there is an “agent” which has access to a suite of tools. Depending on the user input, the agent can then decide which, if any, of these tools to call. In this section of documentation, we first start with a Getting Started notebook that goes over how to use everything related to agents in an end-to-end manner. We then split the documentation into the following sections: Tools An overview of the various tools LangChain supports. Agents An overview of the different agent types. Toolkits An overview of toolkits, and examples of the different ones LangChain supports. Agent Executor An overview of the Agent Executor class and examples of how to use it. Go Deeper# Tools Agents Toolkits Agent Executors previous Chains next Getting Started Contents Go Deeper By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
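To make this concrete, here is a minimal, hedged sketch (added for illustration, not part of the original page) of a tool-using agent; it assumes OpenAI and SerpAPI keys are configured, and the agent-type identifier string may differ between LangChain versions:
from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)

# Tools the agent may choose to call (web search and a calculator).
tools = load_tools(["serpapi", "llm-math"], llm=llm)

# The agent looks at the input and decides, step by step, which tool (if any) to call.
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("Who is the reigning men's U.S. Open champion, and what is his age raised to the 0.5 power?")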
https://python.langchain.com/en/latest/modules/agents.html
6adbec34c16d-0
.rst .pdf Models Contents Go Deeper Models# Note Conceptual Guide This section of the documentation deals with different types of models that are used in LangChain. On this page we will go over the model types at a high level, but we have individual pages for each model type. The pages contain more detailed “how-to” guides for working with that model, as well as a list of different model providers. LLMs Large Language Models (LLMs) are the first type of models we cover. These models take a text string as input, and return a text string as output. Chat Models Chat Models are the second type of models we cover. These models are usually backed by a language model, but their APIs are more structured. Specifically, these models take a list of Chat Messages as input, and return a Chat Message. Text Embedding Models The third type of models we cover are text embedding models. These models take text as input and return a list of floats. Go Deeper# LLMs Chat Models Text Embedding Models previous Quickstart Guide next LLMs Contents Go Deeper By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
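As an added illustration (not part of the original page), the three model types have deliberately different call signatures; this minimal sketch assumes OpenAI-backed integrations and a configured API key:
from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain.embeddings import OpenAIEmbeddings

# LLM: text string in, text string out.
llm = OpenAI(temperature=0)
llm("Say hello.")

# Chat model: a list of chat messages in, a chat message out.
chat = ChatOpenAI(temperature=0)
chat([HumanMessage(content="Say hello.")])

# Text embedding model: text in, a list of floats out.
embeddings = OpenAIEmbeddings()
vector = embeddings.embed_query("hello")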
https://python.langchain.com/en/latest/modules/models.html
ce4c5f622b46-0
.ipynb .pdf Getting Started Contents Why do we need chains? Query an LLM with the LLMChain Combine chains with the SequentialChain Create a custom chain with the Chain class Getting Started# In this tutorial, we will learn about creating simple chains in LangChain: how to create a chain, add components to it, and run it. We will cover: Using a simple LLM chain Creating sequential chains Creating a custom chain Why do we need chains?# Chains allow us to combine multiple components together to create a single, coherent application. For example, we can create a chain that takes user input, formats it with a PromptTemplate, and then passes the formatted response to an LLM. We can build more complex chains by combining multiple chains together, or by combining chains with other components. Query an LLM with the LLMChain# The LLMChain is a simple chain that takes in a prompt template, formats it with the user input, and returns the response from an LLM. To use the LLMChain, first create a prompt template. from langchain.prompts import PromptTemplate from langchain.llms import OpenAI llm = OpenAI(temperature=0.9) prompt = PromptTemplate( input_variables=["product"], template="What is a good name for a company that makes {product}?", ) We can now create a very simple chain that will take user input, format the prompt with it, and then send it to the LLM. from langchain.chains import LLMChain chain = LLMChain(llm=llm, prompt=prompt) # Run the chain only specifying the input variable. print(chain.run("colorful socks")) Rainbow Socks Co.
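As a brief, hedged aside (not part of the original tutorial): the same chain can also process several inputs in one call, reusing the chain defined above:
# apply() takes a list of input dicts and returns one output per dict.
input_list = [
    {"product": "colorful socks"},
    {"product": "mechanical keyboards"},
]
chain.apply(input_list)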
https://python.langchain.com/en/latest/modules/chains/getting_started.html
ce4c5f622b46-1
print(chain.run("colorful socks")) Rainbow Socks Co. You can use a chat model in an LLMChain as well: from langchain.chat_models import ChatOpenAI from langchain.prompts.chat import ( ChatPromptTemplate, HumanMessagePromptTemplate, ) human_message_prompt = HumanMessagePromptTemplate( prompt=PromptTemplate( template="What is a good name for a company that makes {product}?", input_variables=["product"], ) ) chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt]) chat = ChatOpenAI(temperature=0.9) chain = LLMChain(llm=chat, prompt=chat_prompt_template) print(chain.run("colorful socks")) Rainbow Threads This is one of the simpler types of chains, but understanding how it works will set you up well for working with more complex chains. Combine chains with the SequentialChain# The next step after calling a language model is to make a series of calls to a language model. We can do this using sequential chains, which are chains that execute their links in a predefined order. Specifically, we will use the SimpleSequentialChain. This is the simplest type of a sequential chain, where each step has a single input/output, and the output of one step is the input to the next. In this tutorial, our sequential chain will: First, create a company name for a product. We will reuse the LLMChain we’d previously initialized to create this company name. Then, create a catchphrase for the product. We will initialize a new LLMChain to create this catchphrase, as shown below. second_prompt = PromptTemplate( input_variables=["company_name"], template="Write a catchphrase for the following company: {company_name}", )
https://python.langchain.com/en/latest/modules/chains/getting_started.html
ce4c5f622b46-2
template="Write a catchphrase for the following company: {company_name}", ) chain_two = LLMChain(llm=llm, prompt=second_prompt) Now we can combine the two LLMChains, so that we can create a company name and a catchphrase in a single step. from langchain.chains import SimpleSequentialChain overall_chain = SimpleSequentialChain(chains=[chain, chain_two], verbose=True) # Run the chain specifying only the input variable for the first chain. catchphrase = overall_chain.run("colorful socks") print(catchphrase) > Entering new SimpleSequentialChain chain... Cheerful Toes. "Spread smiles from your toes!" > Finished SimpleSequentialChain chain. "Spread smiles from your toes!" Create a custom chain with the Chain class# LangChain provides many chains out of the box, but sometimes you may want to create a custom chain for your specific use case. For this example, we will create a custom chain that concatenates the outputs of 2 LLMChains. In order to create a custom chain: Start by subclassing the Chain class, Fill out the input_keys and output_keys properties, Add the _call method that shows how to execute the chain. These steps are demonstrated in the example below: from langchain.chains import LLMChain from langchain.chains.base import Chain from typing import Dict, List class ConcatenateChain(Chain): chain_1: LLMChain chain_2: LLMChain @property def input_keys(self) -> List[str]: # Union of the input keys of the two chains. all_input_vars = set(self.chain_1.input_keys).union(set(self.chain_2.input_keys)) return list(all_input_vars) @property
https://python.langchain.com/en/latest/modules/chains/getting_started.html
ce4c5f622b46-3
return list(all_input_vars) @property def output_keys(self) -> List[str]: return ['concat_output'] def _call(self, inputs: Dict[str, str]) -> Dict[str, str]: output_1 = self.chain_1.run(inputs) output_2 = self.chain_2.run(inputs) return {'concat_output': output_1 + output_2} Now, we can try running the chain that we called. prompt_1 = PromptTemplate( input_variables=["product"], template="What is a good name for a company that makes {product}?", ) chain_1 = LLMChain(llm=llm, prompt=prompt_1) prompt_2 = PromptTemplate( input_variables=["product"], template="What is a good slogan for a company that makes {product}?", ) chain_2 = LLMChain(llm=llm, prompt=prompt_2) concat_chain = ConcatenateChain(chain_1=chain_1, chain_2=chain_2) concat_output = concat_chain.run("colorful socks") print(f"Concatenated output:\n{concat_output}") Concatenated output: Rainbow Socks Co. "Step Into Colorful Comfort!" That’s it! For more details about how to do cool things with Chains, check out the how-to guide for chains. previous Chains next How-To Guides Contents Why do we need chains? Query an LLM with the LLMChain Combine chains with the SequentialChain Create a custom chain with the Chain class By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/getting_started.html
13dfae385a41-0
.rst .pdf How-To Guides How-To Guides# A chain is made up of links, which can be either primitives or other chains. Primitives can be prompts, models, or arbitrary functions. The examples here are broken up into three sections: Generic Functionality Covers both generic chains (that are useful in a wide variety of applications) as well as generic functionality related to those chains. Async API for Chain Loading from LangChainHub LLM Chain Sequential Chains Serialization Transformation Chain Index-related Chains Chains related to working with indexes. Analyze Document Chat Index Graph QA Hypothetical Document Embeddings Question Answering with Sources Question Answering Summarization Retrieval Question/Answering Retrieval Question Answering with Sources Vector DB Text Generation All other chains All other types of chains! API Chains Self-Critique Chain with Constitutional AI BashChain LLMCheckerChain LLM Math LLMRequestsChain LLMSummarizationCheckerChain Moderation PAL SQLite example previous Getting Started next Async API for Chain By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/how_to_guides.html
d9c429cca091-0
.ipynb .pdf Transformation Chain Transformation Chain# This notebook showcases using a generic transformation chain. As an example, we will create a dummy transformation that takes in a super long text, filters the text to only the first 3 paragraphs, and then passes that into an LLMChain to summarize those. from langchain.chains import TransformChain, LLMChain, SimpleSequentialChain from langchain.llms import OpenAI from langchain.prompts import PromptTemplate with open("../../state_of_the_union.txt") as f: state_of_the_union = f.read() def transform_func(inputs: dict) -> dict: text = inputs["text"] shortened_text = "\n\n".join(text.split("\n\n")[:3]) return {"output_text": shortened_text} transform_chain = TransformChain(input_variables=["text"], output_variables=["output_text"], transform=transform_func) template = """Summarize this text: {output_text} Summary:""" prompt = PromptTemplate(input_variables=["output_text"], template=template) llm_chain = LLMChain(llm=OpenAI(), prompt=prompt) sequential_chain = SimpleSequentialChain(chains=[transform_chain, llm_chain]) sequential_chain.run(state_of_the_union) ' The speaker addresses the nation, noting that while last year they were kept apart due to COVID-19, this year they are together again. They are reminded that regardless of their political affiliations, they are all Americans.' previous Serialization next Analyze Document By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/transformation.html
dff49f0221fa-0
.ipynb .pdf Sequential Chains Contents SimpleSequentialChain Sequential Chain Memory in Sequential Chains Sequential Chains# The next step after calling a language model is to make a series of calls to a language model. This is particularly useful when you want to take the output from one call and use it as the input to another. In this notebook we will walk through some examples for how to do this, using sequential chains. Sequential chains are defined as a series of chains, called in deterministic order. There are two types of sequential chains: SimpleSequentialChain: The simplest form of sequential chains, where each step has a singular input/output, and the output of one step is the input to the next. SequentialChain: A more general form of sequential chains, allowing for multiple inputs/outputs. SimpleSequentialChain# In this series of chains, each individual chain has a single input and a single output, and the output of one step is used as input to the next. Let’s walk through a toy example of doing this, where the first chain takes in the title of an imaginary play and then generates a synopsis for that title, and the second chain takes in the synopsis of that play and generates an imaginary review for that play. %load_ext dotenv %dotenv cannot find .env file from langchain.llms import OpenAI from langchain.chains import LLMChain from langchain.prompts import PromptTemplate # This is an LLMChain to write a synopsis given a title of a play. llm = OpenAI(temperature=.7) template = """You are a playwright. Given the title of play, it is your job to write a synopsis for that title. Title: {title} Playwright: This is a synopsis for the above play:""" prompt_template = PromptTemplate(input_variables=["title"], template=template)
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-1
prompt_template = PromptTemplate(input_variables=["title"], template=template) synopsis_chain = LLMChain(llm=llm, prompt=prompt_template) # This is an LLMChain to write a review of a play given a synopsis. llm = OpenAI(temperature=.7) template = """You are a play critic from the New York Times. Given the synopsis of play, it is your job to write a review for that play. Play Synopsis: {synopsis} Review from a New York Times play critic of the above play:""" prompt_template = PromptTemplate(input_variables=["synopsis"], template=template) review_chain = LLMChain(llm=llm, prompt=prompt_template) # This is the overall chain where we run these two chains in sequence. from langchain.chains import SimpleSequentialChain overall_chain = SimpleSequentialChain(chains=[synopsis_chain, review_chain], verbose=True) review = overall_chain.run("Tragedy at sunset on the beach") > Entering new SimpleSequentialChain chain... Tragedy at Sunset on the Beach is a story of a young couple, Jack and Sarah, who are in love and looking forward to their future together. On the night of their anniversary, they decide to take a walk on the beach at sunset. As they are walking, they come across a mysterious figure, who tells them that their love will be tested in the near future. The figure then tells the couple that the sun will soon set, and with it, a tragedy will strike. If Jack and Sarah can stay together and pass the test, they will be granted everlasting love. However, if they fail, their love will be lost forever.
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-2
The play follows the couple as they struggle to stay together and battle the forces that threaten to tear them apart. Despite the tragedy that awaits them, they remain devoted to one another and fight to keep their love alive. In the end, the couple must decide whether to take a chance on their future together or succumb to the tragedy of the sunset. Tragedy at Sunset on the Beach is an emotionally gripping story of love, hope, and sacrifice. Through the story of Jack and Sarah, the audience is taken on a journey of self-discovery and the power of love to overcome even the greatest of obstacles. The play's talented cast brings the characters to life, allowing us to feel the depths of their emotion and the intensity of their struggle. With its compelling story and captivating performances, this play is sure to draw in audiences and leave them on the edge of their seats. The play's setting of the beach at sunset adds a touch of poignancy and romanticism to the story, while the mysterious figure serves to keep the audience enthralled. Overall, Tragedy at Sunset on the Beach is an engaging and thought-provoking play that is sure to leave audiences feeling inspired and hopeful. > Finished chain. print(review) Tragedy at Sunset on the Beach is an emotionally gripping story of love, hope, and sacrifice. Through the story of Jack and Sarah, the audience is taken on a journey of self-discovery and the power of love to overcome even the greatest of obstacles. The play's talented cast brings the characters to life, allowing us to feel the depths of their emotion and the intensity of their struggle. With its compelling story and captivating performances, this play is sure to draw in audiences and leave them on the edge of their seats.
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-3
The play's setting of the beach at sunset adds a touch of poignancy and romanticism to the story, while the mysterious figure serves to keep the audience enthralled. Overall, Tragedy at Sunset on the Beach is an engaging and thought-provoking play that is sure to leave audiences feeling inspired and hopeful. Sequential Chain# Of course, not all sequential chains will be as simple as passing a single string as an argument and getting a single string as output for all steps in the chain. In this next example, we will experiment with more complex chains that involve multiple inputs, and where there are also multiple final outputs. Of particular importance is how we name the input/output variables. In the above example we didn’t have to think about that because we were just passing the output of one chain directly as input to the next, but here we do have to worry about that because we have multiple inputs. # This is an LLMChain to write a synopsis given a title of a play and the era it is set in. llm = OpenAI(temperature=.7) template = """You are a playwright. Given the title of play and the era it is set in, it is your job to write a synopsis for that title. Title: {title} Era: {era} Playwright: This is a synopsis for the above play:""" prompt_template = PromptTemplate(input_variables=["title", 'era'], template=template) synopsis_chain = LLMChain(llm=llm, prompt=prompt_template, output_key="synopsis") # This is an LLMChain to write a review of a play given a synopsis. llm = OpenAI(temperature=.7) template = """You are a play critic from the New York Times. Given the synopsis of play, it is your job to write a review for that play. Play Synopsis: {synopsis}
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-4
Play Synopsis: {synopsis} Review from a New York Times play critic of the above play:""" prompt_template = PromptTemplate(input_variables=["synopsis"], template=template) review_chain = LLMChain(llm=llm, prompt=prompt_template, output_key="review") # This is the overall chain where we run these two chains in sequence. from langchain.chains import SequentialChain overall_chain = SequentialChain( chains=[synopsis_chain, review_chain], input_variables=["era", "title"], # Here we return multiple variables output_variables=["synopsis", "review"], verbose=True) review = overall_chain({"title":"Tragedy at sunset on the beach", "era": "Victorian England"}) > Entering new SequentialChain chain... > Finished chain. Memory in Sequential Chains# Sometimes you may want to pass along some context to use in each step of the chain or in a later part of the chain, but maintaining and chaining together the input/output variables can quickly get messy. Using SimpleMemory is a convenient way to manage this and clean up your chains. For example, using the previous playwright SequentialChain, let’s say you wanted to include some context about the date, time, and location of the play, and, using the generated synopsis and review, create some social media post text. You could add these new context variables as input_variables, or you can add a SimpleMemory to the chain to manage this context: from langchain.chains import SequentialChain from langchain.memory import SimpleMemory llm = OpenAI(temperature=.7)
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-5
from langchain.memory import SimpleMemory llm = OpenAI(temperature=.7) template = """You are a social media manager for a theater company. Given the title of play, the era it is set in, the date,time and location, the synopsis of the play, and the review of the play, it is your job to write a social media post for that play. Here is some context about the time and location of the play: Date and Time: {time} Location: {location} Play Synopsis: {synopsis} Review from a New York Times play critic of the above play: {review} Social Media Post: """ prompt_template = PromptTemplate(input_variables=["synopsis", "review", "time", "location"], template=template) social_chain = LLMChain(llm=llm, prompt=prompt_template, output_key="social_post_text") overall_chain = SequentialChain( memory=SimpleMemory(memories={"time": "December 25th, 8pm PST", "location": "Theater in the Park"}), chains=[synopsis_chain, review_chain, social_chain], input_variables=["era", "title"], # Here we return multiple variables output_variables=["social_post_text"], verbose=True) overall_chain({"title":"Tragedy at sunset on the beach", "era": "Victorian England"}) > Entering new SequentialChain chain... > Finished chain. {'title': 'Tragedy at sunset on the beach', 'era': 'Victorian England', 'time': 'December 25th, 8pm PST', 'location': 'Theater in the Park',
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
dff49f0221fa-6
'location': 'Theater in the Park', 'social_post_text': "\nSpend your Christmas night with us at Theater in the Park and experience the heartbreaking story of love and loss that is 'A Walk on the Beach'. Set in Victorian England, this romantic tragedy follows the story of Frances and Edward, a young couple whose love is tragically cut short. Don't miss this emotional and thought-provoking production that is sure to leave you in tears. #AWalkOnTheBeach #LoveAndLoss #TheaterInThePark #VictorianEngland"} previous LLM Chain next Serialization Contents SimpleSequentialChain Sequential Chain Memory in Sequential Chains By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/sequential_chains.html
d26ce7ac7f6e-0
.ipynb .pdf Serialization Contents Saving a chain to disk Loading a chain from disk Saving components separately Serialization# This notebook covers how to serialize chains to and from disk. The serialization format we use is json or yaml. Currently, only some chains support this type of serialization. We will grow the number of supported chains over time. Saving a chain to disk# First, let’s go over how to save a chain to disk. This can be done with the .save method, and specifying a file path with a json or yaml extension. from langchain import PromptTemplate, OpenAI, LLMChain template = """Question: {question} Answer: Let's think step by step.""" prompt = PromptTemplate(template=template, input_variables=["question"]) llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), verbose=True) llm_chain.save("llm_chain.json") Let’s now take a look at what’s inside this saved file !cat llm_chain.json { "memory": null, "verbose": true, "prompt": { "input_variables": [ "question" ], "output_parser": null, "template": "Question: {question}\n\nAnswer: Let's think step by step.", "template_format": "f-string" }, "llm": { "model_name": "text-davinci-003", "temperature": 0.0, "max_tokens": 256, "top_p": 1, "frequency_penalty": 0, "presence_penalty": 0, "n": 1, "best_of": 1, "request_timeout": null,
https://python.langchain.com/en/latest/modules/chains/generic/serialization.html
d26ce7ac7f6e-1
"best_of": 1, "request_timeout": null, "logit_bias": {}, "_type": "openai" }, "output_key": "text", "_type": "llm_chain" } Loading a chain from disk# We can load a chain from disk by using the load_chain method. from langchain.chains import load_chain chain = load_chain("llm_chain.json") chain.run("whats 2 + 2") > Entering new LLMChain chain... Prompt after formatting: Question: whats 2 + 2 Answer: Let's think step by step. > Finished chain. ' 2 + 2 = 4' Saving components separately# In the above example, we can see that the prompt and llm configuration information is saved in the same json as the overall chain. Alternatively, we can split them up and save them separately. This is often useful to make the saved components more modular. In order to do this, we just need to specify llm_path instead of the llm component, and prompt_path instead of the prompt component. llm_chain.prompt.save("prompt.json") !cat prompt.json { "input_variables": [ "question" ], "output_parser": null, "template": "Question: {question}\n\nAnswer: Let's think step by step.", "template_format": "f-string" } llm_chain.llm.save("llm.json") !cat llm.json { "model_name": "text-davinci-003", "temperature": 0.0, "max_tokens": 256, "top_p": 1, "frequency_penalty": 0,
https://python.langchain.com/en/latest/modules/chains/generic/serialization.html
d26ce7ac7f6e-2
"top_p": 1, "frequency_penalty": 0, "presence_penalty": 0, "n": 1, "best_of": 1, "request_timeout": null, "logit_bias": {}, "_type": "openai" } config = { "memory": None, "verbose": True, "prompt_path": "prompt.json", "llm_path": "llm.json", "output_key": "text", "_type": "llm_chain" } import json with open("llm_chain_separate.json", "w") as f: json.dump(config, f, indent=2) !cat llm_chain_separate.json { "memory": null, "verbose": true, "prompt_path": "prompt.json", "llm_path": "llm.json", "output_key": "text", "_type": "llm_chain" } We can then load it in the same way chain = load_chain("llm_chain_separate.json") chain.run("whats 2 + 2") > Entering new LLMChain chain... Prompt after formatting: Question: whats 2 + 2 Answer: Let's think step by step. > Finished chain. ' 2 + 2 = 4' previous Sequential Chains next Transformation Chain Contents Saving a chain to disk Loading a chain from disk Saving components separately By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/serialization.html
2f1475b3ec59-0
.ipynb .pdf Async API for Chain Async API for Chain# LangChain provides async support for Chains by leveraging the asyncio library. Async methods are currently supported in LLMChain (through arun, apredict, acall), LLMMathChain (through arun and acall), ChatVectorDBChain, and QA chains. Async support for other chains is on the roadmap. import asyncio import time from langchain.llms import OpenAI from langchain.prompts import PromptTemplate from langchain.chains import LLMChain def generate_serially(): llm = OpenAI(temperature=0.9) prompt = PromptTemplate( input_variables=["product"], template="What is a good name for a company that makes {product}?", ) chain = LLMChain(llm=llm, prompt=prompt) for _ in range(5): resp = chain.run(product="toothpaste") print(resp) async def async_generate(chain): resp = await chain.arun(product="toothpaste") print(resp) async def generate_concurrently(): llm = OpenAI(temperature=0.9) prompt = PromptTemplate( input_variables=["product"], template="What is a good name for a company that makes {product}?", ) chain = LLMChain(llm=llm, prompt=prompt) tasks = [async_generate(chain) for _ in range(5)] await asyncio.gather(*tasks) s = time.perf_counter() # If running this outside of Jupyter, use asyncio.run(generate_concurrently()) await generate_concurrently() elapsed = time.perf_counter() - s
https://python.langchain.com/en/latest/modules/chains/generic/async_chain.html
2f1475b3ec59-1
await generate_concurrently() elapsed = time.perf_counter() - s print('\033[1m' + f"Concurrent executed in {elapsed:0.2f} seconds." + '\033[0m') s = time.perf_counter() generate_serially() elapsed = time.perf_counter() - s print('\033[1m' + f"Serial executed in {elapsed:0.2f} seconds." + '\033[0m') BrightSmile Toothpaste Company BrightSmile Toothpaste Co. BrightSmile Toothpaste Gleaming Smile Inc. SparkleSmile Toothpaste Concurrent executed in 1.54 seconds. BrightSmile Toothpaste Co. MintyFresh Toothpaste Co. SparkleSmile Toothpaste. Pearly Whites Toothpaste Co. BrightSmile Toothpaste. Serial executed in 6.38 seconds. previous How-To Guides next Loading from LangChainHub By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/async_chain.html
ff4d357a6a69-0
.ipynb .pdf LLM Chain Contents Single Input Multiple Inputs From string LLM Chain# This notebook showcases a simple LLM chain. from langchain import PromptTemplate, OpenAI, LLMChain Single Input# First, let’s go over an example using a single input. template = """Question: {question} Answer: Let's think step by step.""" prompt = PromptTemplate(template=template, input_variables=["question"]) llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), verbose=True) question = "What NFL team won the Super Bowl in the year Justin Beiber was born?" llm_chain.predict(question=question) > Entering new LLMChain chain... Prompt after formatting: Question: What NFL team won the Super Bowl in the year Justin Beiber was born? Answer: Let's think step by step. > Finished LLMChain chain. ' Justin Bieber was born in 1994, so the NFL team that won the Super Bowl in 1994 was the Dallas Cowboys.' Multiple Inputs# Now let’s go over an example using multiple inputs. template = """Write a {adjective} poem about {subject}.""" prompt = PromptTemplate(template=template, input_variables=["adjective", "subject"]) llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), verbose=True) llm_chain.predict(adjective="sad", subject="ducks") > Entering new LLMChain chain... Prompt after formatting: Write a sad poem about ducks. > Finished LLMChain chain.
https://python.langchain.com/en/latest/modules/chains/generic/llm_chain.html
ff4d357a6a69-1
Prompt after formatting: Write a sad poem about ducks. > Finished LLMChain chain. "\n\nThe ducks swim in the pond,\nTheir feathers so soft and warm,\nBut they can't help but feel so forlorn.\n\nTheir quacks echo in the air,\nBut no one is there to hear,\nFor they have no one to share.\n\nThe ducks paddle around in circles,\nTheir heads hung low in despair,\nFor they have no one to care.\n\nThe ducks look up to the sky,\nBut no one is there to see,\nFor they have no one to be.\n\nThe ducks drift away in the night,\nTheir hearts filled with sorrow and pain,\nFor they have no one to gain." From string# You can also construct an LLMChain from a string template directly. template = """Write a {adjective} poem about {subject}.""" llm_chain = LLMChain.from_string(llm=OpenAI(temperature=0), template=template) llm_chain.predict(adjective="sad", subject="ducks") "\n\nThe ducks swim in the pond,\nTheir feathers so soft and warm,\nBut they can't help but feel so forlorn.\n\nTheir quacks echo in the air,\nBut no one is there to hear,\nFor they have no one to share.\n\nThe ducks paddle around in circles,\nTheir heads hung low in despair,\nFor they have no one to care.\n\nThe ducks look up to the sky,\nBut no one is there to see,\nFor they have no one to be.\n\nThe ducks drift away in the night,\nTheir hearts filled with sorrow and pain,\nFor they have no one to gain." previous Loading from LangChainHub next Sequential Chains Contents Single Input
https://python.langchain.com/en/latest/modules/chains/generic/llm_chain.html
ff4d357a6a69-2
previous Loading from LangChainHub next Sequential Chains Contents Single Input Multiple Inputs From string By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/llm_chain.html
5d016e2898f5-0
.ipynb .pdf Loading from LangChainHub Loading from LangChainHub# This notebook covers how to load chains from LangChainHub. from langchain.chains import load_chain chain = load_chain("lc://chains/llm-math/chain.json") chain.run("whats 2 raised to .12") > Entering new LLMMathChain chain... whats 2 raised to .12 Answer: 1.0791812460476249 > Finished chain. 'Answer: 1.0791812460476249' Sometimes chains will require extra arguments that were not serialized with the chain. For example, a chain that does question answering over a vector database will require a vector database. from langchain.embeddings.openai import OpenAIEmbeddings from langchain.vectorstores import Chroma from langchain.text_splitter import CharacterTextSplitter from langchain import OpenAI, VectorDBQA from langchain.document_loaders import TextLoader loader = TextLoader('../../state_of_the_union.txt') documents = loader.load() text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0) texts = text_splitter.split_documents(documents) embeddings = OpenAIEmbeddings() vectorstore = Chroma.from_documents(texts, embeddings) Running Chroma using direct local API. Using DuckDB in-memory for database. Data will be transient. chain = load_chain("lc://chains/vector-db-qa/stuff/chain.json", vectorstore=vectorstore) query = "What did the president say about Ketanji Brown Jackson" chain.run(query)
https://python.langchain.com/en/latest/modules/chains/generic/from_hub.html
5d016e2898f5-1
query = "What did the president say about Ketanji Brown Jackson" chain.run(query) " The president said that Ketanji Brown Jackson is a Circuit Court of Appeals Judge, one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans, and will continue Justice Breyer's legacy of excellence." previous Async API for Chain next LLM Chain By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/generic/from_hub.html
5f9019c73bd5-0
.ipynb .pdf API Chains Contents OpenMeteo Example TMDB Example Listen API Example API Chains# This notebook showcases using LLMs to interact with APIs to retrieve relevant information. from langchain.chains.api.prompt import API_RESPONSE_PROMPT from langchain.chains import APIChain from langchain.prompts.prompt import PromptTemplate from langchain.llms import OpenAI llm = OpenAI(temperature=0) OpenMeteo Example# from langchain.chains.api import open_meteo_docs chain_new = APIChain.from_llm_and_api_docs(llm, open_meteo_docs.OPEN_METEO_DOCS, verbose=True) chain_new.run('What is the weather like right now in Munich, Germany in degrees Farenheit?') > Entering new APIChain chain... https://api.open-meteo.com/v1/forecast?latitude=48.1351&longitude=11.5820&temperature_unit=fahrenheit&current_weather=true {"latitude":48.14,"longitude":11.58,"generationtime_ms":0.33104419708251953,"utc_offset_seconds":0,"timezone":"GMT","timezone_abbreviation":"GMT","elevation":521.0,"current_weather":{"temperature":33.4,"windspeed":6.8,"winddirection":198.0,"weathercode":2,"time":"2023-01-16T01:00"}} > Finished chain. ' The current temperature in Munich, Germany is 33.4 degrees Farenheit with a windspeed of 6.8 km/h and a wind direction of 198 degrees. The weathercode is 2.' TMDB Example# import os os.environ['TMDB_BEARER_TOKEN'] = "" from langchain.chains.api import tmdb_docs
https://python.langchain.com/en/latest/modules/chains/examples/api.html
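As a hedged illustration of the same pattern, APIChain.from_llm_and_api_docs accepts any plain-text API description, not only the bundled ones. The endpoint and parameters below are fictional and exist only for this sketch.
```python
from langchain.chains import APIChain
from langchain.llms import OpenAI

# Fictional API documentation; example.com does not actually expose this endpoint.
MY_API_DOCS = """
BASE URL: https://api.example.com/v1

The /quotes endpoint returns quotations as JSON.
Optional query parameters:
  author: filter quotes by author name
  limit: maximum number of quotes to return
"""

llm = OpenAI(temperature=0)
quote_chain = APIChain.from_llm_and_api_docs(llm, MY_API_DOCS, verbose=True)
# quote_chain.run("Find one quote by Ada Lovelace")  # would issue a request to the fictional API
```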
5f9019c73bd5-1
from langchain.chains.api import tmdb_docs headers = {"Authorization": f"Bearer {os.environ['TMDB_BEARER_TOKEN']}"} chain = APIChain.from_llm_and_api_docs(llm, tmdb_docs.TMDB_DOCS, headers=headers, verbose=True) chain.run("Search for 'Avatar'") > Entering new APIChain chain... https://api.themoviedb.org/3/search/movie?query=Avatar&language=en-US
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-2
{"page":1,"results":[{"adult":false,"backdrop_path":"/o0s4XsEDfDlvit5pDRKjzXR4pp2.jpg","genre_ids":[28,12,14,878],"id":19995,"original_language":"en","original_title":"Avatar","overview":"In the 22nd century, a paraplegic Marine is dispatched to the moon Pandora on a unique mission, but becomes torn between following orders and protecting an alien civilization.","popularity":2041.691,"poster_path":"/jRXYjXNq0Cs2TcJjLkki24MLp7u.jpg","release_date":"2009-12-15","title":"Avatar","video":false,"vote_average":7.6,"vote_count":27777},{"adult":false,"backdrop_path":"/s16H6tpK2utvwDtzZ8Qy4qm5Emw.jpg","genre_ids":[878,12,28],"id":76600,"original_language":"en","original_title":"Avatar: The Way of Water","overview":"Set more than a decade after the events of the first film, learn the story of the Sully family (Jake, Neytiri, and their kids), the trouble that follows them, the lengths they go to keep each other safe, the battles they fight to stay alive, and the tragedies they
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-3
they fight to stay alive, and the tragedies they endure.","popularity":3948.296,"poster_path":"/t6HIqrRAclMCA60NsSmeqe9RmNV.jpg","release_date":"2022-12-14","title":"Avatar: The Way of Water","video":false,"vote_average":7.7,"vote_count":4219},{"adult":false,"backdrop_path":"/uEwGFGtao9YG2JolmdvtHLLVbA9.jpg","genre_ids":[99],"id":111332,"original_language":"en","original_title":"Avatar: Creating the World of Pandora","overview":"The Making-of James Cameron's Avatar. It shows interesting parts of the work on the set.","popularity":541.809,"poster_path":"/sjf3xjuofCtDhZghJRzXlTiEjJe.jpg","release_date":"2010-02-07","title":"Avatar: Creating the World of Pandora","video":false,"vote_average":7.3,"vote_count":35},{"adult":false,"backdrop_path":null,"genre_ids":[99],"id":287003,"original_language":"en","original_title":"Avatar: Scene Deconstruction","overview":"The deconstruction of the Avatar scenes and sets","popularity":394.941,"poster_path":"/uCreCQFReeF0RiIXkQypRYHwikx.jpg","release_date":"2009-12-18","title":"Avatar: Scene
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-4
Scene Deconstruction","video":false,"vote_average":7.8,"vote_count":12},{"adult":false,"backdrop_path":null,"genre_ids":[28,18,878,12,14],"id":83533,"original_language":"en","original_title":"Avatar 3","overview":"","popularity":172.488,"poster_path":"/4rXqTMlkEaMiJjiG0Z2BX6F6Dkm.jpg","release_date":"2024-12-18","title":"Avatar 3","video":false,"vote_average":0,"vote_count":0},{"adult":false,"backdrop_path":null,"genre_ids":[28,878,12,14],"id":216527,"original_language":"en","original_title":"Avatar 4","overview":"","popularity":162.536,"poster_path":"/qzMYKnT4MG1d0gnhwytr4cKhUvS.jpg","release_date":"2026-12-16","title":"Avatar 4","video":false,"vote_average":0,"vote_count":0},{"adult":false,"backdrop_path":null,"genre_ids":[28,12,14,878],"id":393209,"original_language":"en","original_title":"Avatar 5","overview":"","popularity":124.722,"poster_path":"/rtmmvqkIC5zDMEd638Es2woxbz8.jpg","release_date":"2028-12-20","title":"Avatar 5","video":false,"vote_average":0,"vote_count":0},{"adult":false,"backdrop_path":"/nNceJtrrovG1MUBHMAhId0ws9Gp.jpg","genre_ids":[99],"id":183392,"original_language":"en","original_title":"Capturing Avatar","overview":"Capturing Avatar is a feature length behind-the-scenes
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-5
Avatar is a feature length behind-the-scenes documentary about the making of Avatar. It uses footage from the film's development, as well as stock footage from as far back as the production of Titanic in 1995. Also included are numerous interviews with cast, artists, and other crew members. The documentary was released as a bonus feature on the extended collector's edition of Avatar.","popularity":109.842,"poster_path":"/26SMEXJl3978dn2svWBSqHbLl5U.jpg","release_date":"2010-11-16","title":"Capturing Avatar","video":false,"vote_average":7.8,"vote_count":39},{"adult":false,"backdrop_path":"/eoAvHxfbaPOcfiQyjqypWIXWxDr.jpg","genre_ids":[99],"id":1059673,"original_language":"en","original_title":"Avatar: The Deep Dive - A Special Edition of 20/20","overview":"An inside look at one of the most anticipated movie sequels ever with James Cameron and cast.","popularity":629.825,"poster_path":"/rtVeIsmeXnpjNbEKnm9Say58XjV.jpg","release_date":"2022-12-14","title":"Avatar: The Deep Dive - A Special Edition of
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-6
The Deep Dive - A Special Edition of 20/20","video":false,"vote_average":6.5,"vote_count":5},{"adult":false,"backdrop_path":null,"genre_ids":[99],"id":278698,"original_language":"en","original_title":"Avatar Spirits","overview":"Bryan Konietzko and Michael Dante DiMartino, co-creators of the hit television series, Avatar: The Last Airbender, reflect on the creation of the masterful series.","popularity":51.593,"poster_path":"/oBWVyOdntLJd5bBpE0wkpN6B6vy.jpg","release_date":"2010-06-22","title":"Avatar Spirits","video":false,"vote_average":9,"vote_count":16},{"adult":false,"backdrop_path":"/cACUWJKvRfhXge7NC0xxoQnkQNu.jpg","genre_ids":[10402],"id":993545,"original_language":"fr","original_title":"Avatar - Au Hellfest 2022","overview":"","popularity":21.992,"poster_path":"/fw6cPIsQYKjd1YVQanG2vLc5HGo.jpg","release_date":"2022-06-26","title":"Avatar - Au Hellfest 2022","video":false,"vote_average":8,"vote_count":4},{"adult":false,"backdrop_path":null,"genre_ids":[],"id":931019,"original_language":"en","original_title":"Avatar: Enter The World","overview":"A behind the scenes look at the new James Cameron blockbuster
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-7
the scenes look at the new James Cameron blockbuster “Avatar”, which stars Aussie Sam Worthington. Hastily produced by Australia’s Nine Network following the film’s release.","popularity":30.903,"poster_path":"/9MHY9pYAgs91Ef7YFGWEbP4WJqC.jpg","release_date":"2009-12-05","title":"Avatar: Enter The World","video":false,"vote_average":2,"vote_count":1},{"adult":false,"backdrop_path":null,"genre_ids":[],"id":287004,"original_language":"en","original_title":"Avatar: Production Materials","overview":"Production material overview of what was used in Avatar","popularity":12.389,"poster_path":null,"release_date":"2009-12-18","title":"Avatar: Production Materials","video":true,"vote_average":6,"vote_count":4},{"adult":false,"backdrop_path":"/x43RWEZg9tYRPgnm43GyIB4tlER.jpg","genre_ids":[],"id":740017,"original_language":"es","original_title":"Avatar: Agni Kai","overview":"","popularity":9.462,"poster_path":"/y9PrKMUTA6NfIe5FE92tdwOQ2sH.jpg","release_date":"2020-01-18","title":"Avatar: Agni
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-8
Agni Kai","video":false,"vote_average":7,"vote_count":1},{"adult":false,"backdrop_path":"/e8mmDO7fKK93T4lnxl4Z2zjxXZV.jpg","genre_ids":[],"id":668297,"original_language":"en","original_title":"The Last Avatar","overview":"The Last Avatar is a mystical adventure film, a story of a young man who leaves Hollywood to find himself. What he finds is beyond his wildest imagination. Based on ancient prophecy, contemporary truth seeking and the future of humanity, The Last Avatar is a film that takes transformational themes and makes them relevant for audiences of all ages. Filled with love, magic, mystery, conspiracy, psychics, underground cities, secret societies, light bodies and much more, The Last Avatar tells the story of the emergence of Kalki Avatar- the final Avatar of our current Age of Chaos. Kalki is also a metaphor for the innate power and potential that lies within humanity to awaken and create a world of truth, harmony and
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-9
awaken and create a world of truth, harmony and possibility.","popularity":8.786,"poster_path":"/XWz5SS5g5mrNEZjv3FiGhqCMOQ.jpg","release_date":"2014-12-06","title":"The Last Avatar","video":false,"vote_average":4.5,"vote_count":2},{"adult":false,"backdrop_path":null,"genre_ids":[],"id":424768,"original_language":"en","original_title":"Avatar:[2015] Wacken Open Air","overview":"Started in the summer of 2001 by drummer John Alfredsson and vocalist Christian Rimmi under the name Lost Soul. The band offers a free mp3 download to a song called \"Bloody Knuckles\" if one subscribes to their newsletter. In 2005 they appeared on the compilation “Listen to Your Inner Voice” together with 17 other bands released by Inner Voice Records.","popularity":6.634,"poster_path":null,"release_date":"2015-08-01","title":"Avatar:[2015] Wacken Open Air","video":false,"vote_average":8,"vote_count":1},{"adult":false,"backdrop_path":null,"genre_ids":[],"id":812836,"original_language":"en","original_title":"Avatar - Live At Graspop 2018","overview":"Live At Graspop Festival Belgium
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-10
2018","overview":"Live At Graspop Festival Belgium 2018","popularity":9.855,"poster_path":null,"release_date":"","title":"Avatar - Live At Graspop 2018","video":false,"vote_average":9,"vote_count":1},{"adult":false,"backdrop_path":null,"genre_ids":[10402],"id":874770,"original_language":"en","original_title":"Avatar Ages: Memories","overview":"On the night of memories Avatar performed songs from Thoughts of No Tomorrow, Schlacht and Avatar as voted on by the fans.","popularity":2.66,"poster_path":"/xDNNQ2cnxAv3o7u0nT6JJacQrhp.jpg","release_date":"2021-01-30","title":"Avatar Ages: Memories","video":false,"vote_average":10,"vote_count":1},{"adult":false,"backdrop_path":null,"genre_ids":[10402],"id":874768,"original_language":"en","original_title":"Avatar Ages: Madness","overview":"On the night of madness Avatar performed songs from Black Waltz and Hail The Apocalypse as voted on by the fans.","popularity":2.024,"poster_path":"/wVyTuruUctV3UbdzE5cncnpyNoY.jpg","release_date":"2021-01-23","title":"Avatar Ages:
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-11
Ages: Madness","video":false,"vote_average":8,"vote_count":1},{"adult":false,"backdrop_path":"/dj8g4jrYMfK6tQ26ra3IaqOx5Ho.jpg","genre_ids":[10402],"id":874700,"original_language":"en","original_title":"Avatar Ages: Dreams","overview":"On the night of dreams Avatar performed Hunter Gatherer in its entirety, plus a selection of their most popular songs. Originally aired January 9th 2021","popularity":1.957,"poster_path":"/4twG59wnuHpGIRR9gYsqZnVysSP.jpg","release_date":"2021-01-09","title":"Avatar Ages: Dreams","video":false,"vote_average":0,"vote_count":0}],"total_pages":3,"total_results":57}
https://python.langchain.com/en/latest/modules/chains/examples/api.html
5f9019c73bd5-12
> Finished chain. ' This response contains 57 movies related to the search query "Avatar". The first movie in the list is the 2009 movie "Avatar" starring Sam Worthington. Other movies in the list include sequels to Avatar, documentaries, and live performances.' Listen API Example# import os from langchain.llms import OpenAI from langchain.chains.api import podcast_docs from langchain.chains import APIChain # Get api key here: https://www.listennotes.com/api/pricing/ listen_api_key = 'xxx' llm = OpenAI(temperature=0) headers = {"X-ListenAPI-Key": listen_api_key} chain = APIChain.from_llm_and_api_docs(llm, podcast_docs.PODCAST_DOCS, headers=headers, verbose=True) chain.run("Search for 'silicon valley bank' podcast episodes, audio length is more than 30 minutes, return only 1 results") previous Vector DB Text Generation next Self-Critique Chain with Constitutional AI Contents OpenMeteo Example TMDB Example Listen API Example By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/examples/api.html
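A small defensive sketch around the Listen API snippet above: the example only runs when a key is actually configured. The environment variable name is an assumption made for this sketch, not something defined by the page.
```python
import os

from langchain.chains import APIChain
from langchain.chains.api import podcast_docs
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
listen_api_key = os.environ.get("LISTEN_API_KEY")  # variable name assumed for this sketch

if listen_api_key:
    headers = {"X-ListenAPI-Key": listen_api_key}
    chain = APIChain.from_llm_and_api_docs(llm, podcast_docs.PODCAST_DOCS, headers=headers, verbose=True)
    print(chain.run("Search for 'silicon valley bank' podcast episodes, return only 1 result"))
else:
    print("Set LISTEN_API_KEY to run this example.")
```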
0038c0c5eb10-0
.ipynb .pdf Moderation Contents How to use the moderation chain How to append a Moderation chain to an LLMChain Moderation# This notebook walks through examples of how to use a moderation chain, and several common ways of doing so. Moderation chains are useful for detecting text that could be hateful, violent, etc. Moderation can usefully be applied both to user input and to the output of a language model. Some API providers, like OpenAI, specifically prohibit you, or your end users, from generating certain types of harmful content. To comply with this (and to generally keep your application from being harmful), you may often want to append a moderation chain to your LLMChains, to make sure any output the LLM generates is not harmful. If the content passed into the moderation chain is harmful, there is no single best way to handle it; the right approach depends on your application. Sometimes you may want to throw an error in the chain (and have your application handle that). Other times, you may want to return something to the user explaining that the text was harmful. There could even be other ways to handle it. We will cover all of these in this notebook. In this notebook, we will show: How to run any piece of text through a moderation chain. How to append a Moderation chain to an LLMChain. from langchain.llms import OpenAI from langchain.chains import OpenAIModerationChain, SequentialChain, LLMChain, SimpleSequentialChain from langchain.prompts import PromptTemplate How to use the moderation chain# Here’s an example of using the moderation chain with default settings (it will return a string explaining that the text was flagged). moderation_chain = OpenAIModerationChain() moderation_chain.run("This is okay") 'This is okay' moderation_chain.run("I will kill you")
https://python.langchain.com/en/latest/modules/chains/examples/moderation.html
0038c0c5eb10-1
'This is okay' moderation_chain.run("I will kill you") "Text was found that violates OpenAI's content policy." Here’s an example of using the moderation chain to throw an error. moderation_chain_error = OpenAIModerationChain(error=True) moderation_chain_error.run("This is okay") 'This is okay' moderation_chain_error.run("I will kill you") --------------------------------------------------------------------------- ValueError Traceback (most recent call last) Cell In[7], line 1 ----> 1 moderation_chain_error.run("I will kill you") File ~/workplace/langchain/langchain/chains/base.py:138, in Chain.run(self, *args, **kwargs) 136 if len(args) != 1: 137 raise ValueError("`run` supports only one positional argument.") --> 138 return self(args[0])[self.output_keys[0]] 140 if kwargs and not args: 141 return self(kwargs)[self.output_keys[0]] File ~/workplace/langchain/langchain/chains/base.py:112, in Chain.__call__(self, inputs, return_only_outputs) 108 if self.verbose: 109 print( 110 f"\n\n\033[1m> Entering new {self.__class__.__name__} chain...\033[0m" 111 ) --> 112 outputs = self._call(inputs) 113 if self.verbose: 114 print(f"\n\033[1m> Finished {self.__class__.__name__} chain.\033[0m") File ~/workplace/langchain/langchain/chains/moderation.py:81, in OpenAIModerationChain._call(self, inputs) 79 text = inputs[self.input_key]
https://python.langchain.com/en/latest/modules/chains/examples/moderation.html
0038c0c5eb10-2
79 text = inputs[self.input_key] 80 results = self.client.create(text) ---> 81 output = self._moderate(text, results["results"][0]) 82 return {self.output_key: output} File ~/workplace/langchain/langchain/chains/moderation.py:73, in OpenAIModerationChain._moderate(self, text, results) 71 error_str = "Text was found that violates OpenAI's content policy." 72 if self.error: ---> 73 raise ValueError(error_str) 74 else: 75 return error_str ValueError: Text was found that violates OpenAI's content policy. Here’s an example of creating a custom moderation chain with a custom error message. It requires some knowledge of OpenAI’s moderation endpoint results (see docs here). class CustomModeration(OpenAIModerationChain): def _moderate(self, text: str, results: dict) -> str: if results["flagged"]: error_str = f"The following text was found that violates OpenAI's content policy: {text}" return error_str return text custom_moderation = CustomModeration() custom_moderation.run("This is okay") 'This is okay' custom_moderation.run("I will kill you") "The following text was found that violates OpenAI's content policy: I will kill you" How to append a Moderation chain to an LLMChain# To easily combine a moderation chain with an LLMChain, you can use the SequentialChain abstraction. Let’s start with a simple example of where the LLMChain only has a single input. For this purpose, we will prompt the model so it says something harmful. prompt = PromptTemplate(template="{text}", input_variables=["text"])
https://python.langchain.com/en/latest/modules/chains/examples/moderation.html
0038c0c5eb10-3
prompt = PromptTemplate(template="{text}", input_variables=["text"]) llm_chain = LLMChain(llm=OpenAI(temperature=0, model_name="text-davinci-002"), prompt=prompt) text = """We are playing a game of repeat after me. Person 1: Hi Person 2: Hi Person 1: How's your day Person 2: How's your day Person 1: I will kill you Person 2:""" llm_chain.run(text) ' I will kill you' chain = SimpleSequentialChain(chains=[llm_chain, moderation_chain]) chain.run(text) "Text was found that violates OpenAI's content policy." Now let’s walk through an example of using it with an LLMChain which has multiple inputs (a bit more tricky because we can’t use the SimpleSequentialChain) prompt = PromptTemplate(template="{setup}{new_input}Person2:", input_variables=["setup", "new_input"]) llm_chain = LLMChain(llm=OpenAI(temperature=0, model_name="text-davinci-002"), prompt=prompt) setup = """We are playing a game of repeat after me. Person 1: Hi Person 2: Hi Person 1: How's your day Person 2: How's your day Person 1:""" new_input = "I will kill you" inputs = {"setup": setup, "new_input": new_input} llm_chain(inputs, return_only_outputs=True) {'text': ' I will kill you'} # Setting the input/output keys so it lines up moderation_chain.input_key = "text" moderation_chain.output_key = "sanitized_text" chain = SequentialChain(chains=[llm_chain, moderation_chain], input_variables=["setup", "new_input"])
https://python.langchain.com/en/latest/modules/chains/examples/moderation.html
0038c0c5eb10-4
chain(inputs, return_only_outputs=True) {'sanitized_text': "Text was found that violates OpenAI's content policy."} previous LLMSummarizationCheckerChain next PAL Contents How to use the moderation chain How to append a Moderation chain to an LLMChain By Harrison Chase © Copyright 2023, Harrison Chase. Last updated on Mar 28, 2023.
https://python.langchain.com/en/latest/modules/chains/examples/moderation.html
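The notebook mentions raising an error and letting the application handle it; a minimal sketch of that strategy, assuming the chain is configured with error=True as shown earlier, looks like this.
```python
from langchain.chains import OpenAIModerationChain

strict_moderation = OpenAIModerationChain(error=True)

def safe_run(text: str) -> str:
    """Return the moderated text, or a fallback message if it was flagged."""
    try:
        return strict_moderation.run(text)
    except ValueError:
        # Raised when OpenAI's moderation endpoint flags the text and error=True.
        return "Sorry, that response was withheld by the content filter."

print(safe_run("This is okay"))
print(safe_run("I will kill you"))
```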
f78aa9346454-0
.ipynb .pdf BashChain Contents Customize Prompt BashChain# This notebook showcases using LLMs and a bash process to perform simple filesystem commands. from langchain.chains import LLMBashChain from langchain.llms import OpenAI llm = OpenAI(temperature=0) text = "Please write a bash script that prints 'Hello World' to the console." bash_chain = LLMBashChain(llm=llm, verbose=True) bash_chain.run(text) > Entering new LLMBashChain chain... Please write a bash script that prints 'Hello World' to the console. ```bash echo "Hello World" ```['```bash', 'echo "Hello World"', '```'] Answer: Hello World > Finished chain. 'Hello World\n' Customize Prompt# You can also customize the prompt that is used. Here is an example that prompts the model to avoid using the ‘echo’ utility from langchain.prompts.prompt import PromptTemplate _PROMPT_TEMPLATE = """If someone asks you to perform a task, your job is to come up with a series of bash commands that will perform the task. There is no need to put "#!/bin/bash" in your answer. Make sure to reason step by step, using this format: Question: "copy the files in the directory named 'target' into a new directory at the same level as target called 'myNewDirectory'" I need to take the following actions: - List all files in the directory - Create a new directory - Copy the files from the first directory into the second directory ```bash ls mkdir myNewDirectory cp -r target/* myNewDirectory ``` Do not use 'echo' when writing the script. That is the format. Begin! Question: {question}"""
https://python.langchain.com/en/latest/modules/chains/examples/llm_bash.html
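The page is truncated at this point; as a sketch of the likely continuation, the custom template defined above would be wrapped in a PromptTemplate and handed to the chain, assuming LLMBashChain accepts a prompt argument like the other LLM-backed chains in this version.
```python
from langchain.chains import LLMBashChain
from langchain.llms import OpenAI
from langchain.prompts.prompt import PromptTemplate

# _PROMPT_TEMPLATE is the string defined in the text above.
PROMPT = PromptTemplate(input_variables=["question"], template=_PROMPT_TEMPLATE)

llm = OpenAI(temperature=0)
bash_chain = LLMBashChain(llm=llm, prompt=PROMPT, verbose=True)
# bash_chain.run("List the files in the current directory.")
```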
