Multiple chains (composing several chains)

Runnables can easily be used to string together multiple chains.

chain1: asks, in English, which city Obama is from. chain2: takes chain1's answer as the {city} input and asks which country that city is in, responding in Chinese.

from operator import itemgetter

from dotenv import load_dotenv  # loads environment variables (e.g. OPENAI_API_KEY) from a .env file
from langchain.globals import set_debug  # toggles LangChain's debug mode
from langchain.prompts import ChatPromptTemplate
from langchain_community.chat_models import ChatOpenAI
# A string output parser turns chain1's chat output into the plain string that prompt2 expects
from langchain_core.output_parsers import StrOutputParser

load_dotenv()  # actually load the environment variables
set_debug(True)  # enable LangChain debug logging

prompt1 = ChatPromptTemplate.from_template("what is the city {person} is from?")
prompt2 = ChatPromptTemplate.from_template(
    "what country is the city {city} in? respond in {language}"
)

model = ChatOpenAI()

chain1 = prompt1 | model | StrOutputParser()

chain2 = (
    {"city": chain1, "language": itemgetter("language")}
    | prompt2
    | model
    | StrOutputParser()
)

response = chain2.invoke({"person": "obama", "language": "Chinese"})
print('response >> ', response)
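In chain2, the plain dict {"city": chain1, "language": itemgetter("language")} is coerced by LCEL into a RunnableParallel, and the itemgetter callable into a RunnableLambda; that is why the debug trace below shows a RunnableParallel step wrapping a nested RunnableSequence (chain1) and a RunnableLambda. A minimal sketch of the same mapping written out explicitly (the names mapping and chain2_explicit are illustrative, reusing the chain1, prompt2 and model objects defined above):

from operator import itemgetter
from langchain_core.runnables import RunnableParallel

# Explicit form of the dict literal used in chain2: a plain dict is coerced
# into a RunnableParallel, and itemgetter("language") into a RunnableLambda.
mapping = RunnableParallel(city=chain1, language=itemgetter("language"))
chain2_explicit = mapping | prompt2 | model | StrOutputParser()

# Behaves the same as chain2 above:
# chain2_explicit.invoke({"person": "obama", "language": "Chinese"})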

Output

(.venv) zgpeace@zgpeaces-MacBook-Pro git:(develop) ✗% python LCEL/chains_multiple.py ~/Workspace/LLM/langchain-llm-app

[chain/start] [1:chain:RunnableSequence] Entering Chain run with input:

{

"person": "obama",

"language": "Chinese"

}

[chain/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel] Entering Chain run with input:

{

"person": "obama",

"language": "Chinese"

}

[chain/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence] Entering Chain run with input:

{

"person": "obama",

"language": "Chinese"

}

[chain/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 4:prompt:ChatPromptTemplate] Entering Prompt run with input:

{

"person": "obama",

"language": "Chinese"

}

[chain/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 4:chain:RunnableLambda] Entering Chain run with input:

{

"person": "obama",

"language": "Chinese"

}

[chain/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 4:prompt:ChatPromptTemplate] [3ms] Exiting Prompt run with output:

{

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"prompts",

"chat",

"ChatPromptValue"

],

"kwargs": {

"messages": [

{

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"schema",

"messages",

"HumanMessage"

],

"kwargs": {

"content": "what is the city obama is from?",

"additional_kwargs": {}

}

}

]

}

}

[chain/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 4:chain:RunnableLambda] [4ms] Exiting Chain run with output:

{

"output": "Chinese"

}

[llm/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 5:llm:ChatOpenAI] Entering LLM run with input:

{

"prompts": [

"Human: what is the city obama is from?"

]

}

[llm/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 5:llm:ChatOpenAI] [2.75s] Exiting LLM run with output:

{

"generations": [

[

{

"text": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii.",

"generation_info": {

"finish_reason": "stop",

"logprobs": null

},

"type": "ChatGeneration",

"message": {

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"schema",

"messages",

"AIMessage"

],

"kwargs": {

"content": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii.",

"additional_kwargs": {}

}

}

}

]

],

"llm_output": {

"token_usage": {

"completion_tokens": 21,

"prompt_tokens": 16,

"total_tokens": 37

},

"model_name": "gpt-3.5-turbo",

"system_fingerprint": null

},

"run": null

}

[chain/start] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 6:parser:StrOutputParser] Entering Parser run with input:

[inputs]

[chain/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence > 6:parser:StrOutputParser] [1ms] Exiting Parser run with output:

{

"output": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii."

}

[chain/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel > 3:chain:RunnableSequence] [2.76s] Exiting Chain run with output:

{

"output": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii."

}

[chain/end] [1:chain:RunnableSequence > 2:chain:RunnableParallel] [2.78s] Exiting Chain run with output:

{

"city": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii.",

"language": "Chinese"

}

[chain/start] [1:chain:RunnableSequence > 7:prompt:ChatPromptTemplate] Entering Prompt run with input:

{

"city": "Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii.",

"language": "Chinese"

}

[chain/end] [1:chain:RunnableSequence > 7:prompt:ChatPromptTemplate] [1ms] Exiting Prompt run with output:

{

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"prompts",

"chat",

"ChatPromptValue"

],

"kwargs": {

"messages": [

{

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"schema",

"messages",

"HumanMessage"

],

"kwargs": {

"content": "what country is the city Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii. in? respond in Chinese",

"additional_kwargs": {}

}

}

]

}

}

[llm/start] [1:chain:RunnableSequence > 8:llm:ChatOpenAI] Entering LLM run with input:

{

"prompts": [

"Human: what country is the city Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii. in? respond in Chinese"

]

}

[llm/end] [1:chain:RunnableSequence > 8:llm:ChatOpenAI] [2.55s] Exiting LLM run with output:

{

"generations": [

[

{

"text": "巴拉克·奥巴马,美国第44任总统,出生在夏威夷的檀香山市。",

"generation_info": {

"finish_reason": "stop",

"logprobs": null

},

"type": "ChatGeneration",

"message": {

"lc": 1,

"type": "constructor",

"id": [

"langchain",

"schema",

"messages",

"AIMessage"

],

"kwargs": {

"content": "巴拉克·奥巴马,美国第44任总统,出生在夏威夷的檀香山市。",

"additional_kwargs": {}

}

}

}

]

],

"llm_output": {

"token_usage": {

"completion_tokens": 40,

"prompt_tokens": 37,

"total_tokens": 77

},

"model_name": "gpt-3.5-turbo",

"system_fingerprint": null

},

"run": null

}

[chain/start] [1:chain:RunnableSequence > 9:parser:StrOutputParser] Entering Parser run with input:

[inputs]

[chain/end] [1:chain:RunnableSequence > 9:parser:StrOutputParser] [1ms] Exiting Parser run with output:

{

"output": "巴拉克·奥巴马,美国第44任总统,出生在夏威夷的檀香山市。"

}

[chain/end] [1:chain:RunnableSequence] [5.33s] Exiting Chain run with output:

{

"output": "巴拉克·奥巴马,美国第44任总统,出生在夏威夷的檀香山市。"

}

response >> 巴拉克·奥巴马,美国第44任总统,出生在夏威夷的檀香山市。
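Note from the trace that chain1 returned a whole sentence ("Barack Obama, the 44th President of the United States, was born in Honolulu, Hawaii.") rather than just a city name, so the question sent to prompt2 reads awkwardly. If you only want the bare city name to flow into {city}, one possible tweak (not part of the original example) is to constrain prompt1:

# Hypothetical adjustment: ask for the city name only, so the value
# substituted into {city} in prompt2 stays short.
prompt1 = ChatPromptTemplate.from_template(
    "what is the city {person} is from? Answer with only the city name."
)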

Code

https://github.com/zgpeace/pets-name-langchain/tree/develop

References

https://python.langchain.com/docs/expression_language/cookbook/multiple_chains
