# LangChain Tutorial: Building Complex Chains with LCEL

Series index: LangChain Tutorial - Series Articles

LangChain provides a flexible and powerful expression language, the LangChain Expression Language (LCEL), for building complex logic chains. By composing different runnable objects, LCEL supports advanced patterns such as sequential chains, nested chains, parallel chains, routing, and dynamic construction, covering the needs of a wide range of scenarios. This article walks through each of these features and how to implement them.

## Sequential Chains

The core of LCEL is composing runnables in sequence, where the output of each runnable is automatically passed as input to the next. A sequential chain can be built with the pipe operator (`|`) or the explicit `.pipe()` method.

A simple example:

```python
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Here's a bear joke for you:

Why did the bear dissolve in water?
Because it was a polar bear!
```

In this example, the prompt template formats the input for the model, the model generates a joke, and the output parser converts the result into a string.
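The `.pipe()` form mentioned above is equivalent to the `|` operator. A minimal sketch, assuming the `prompt` and `model` objects from the previous example are in scope:

```python
from langchain_core.output_parsers import StrOutputParser

# Same chain as above, composed with .pipe() instead of the | operator.
piped_chain = prompt.pipe(model).pipe(StrOutputParser())
print(piped_chain.invoke({"topic": "bears"}))
```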
## Nested Chains

Nested chains combine multiple chains into more complex logic. For example, the joke-generating chain can be composed with a second chain that judges how funny the joke is:

```python
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")
composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Haha, that's a clever play on words! Using "polar" to imply the bear dissolved or became polar/polarized when put in water. Not the most hilarious joke ever, but it has a cute, groan-worthy pun that makes it mildly amusing.
```

## Parallel Chains

`RunnableParallel` runs several chains in parallel and combines their results into a single dictionary, which suits scenarios where multiple tasks must be handled at once:

```python
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)
```

Output:

```
{'joke': "Why don't bears like fast food? Because they can't catch it!",
 'poem': 'In the quiet of the forest, the bear roams free\nMajestic and wild, a sight to see.'}
```
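Note that the `{"joke": chain}` dict in the nested-chain example is implicitly coerced to a `RunnableParallel` when composed with `|`. A minimal sketch of the equivalent dict form, assuming `joke_chain` and `poem_chain` from the example above:

```python
from langchain_core.runnables import RunnableParallel

# A mapping passed to RunnableParallel (or a dict literal composed with `|`)
# behaves the same as the keyword-argument form above.
parallel_from_dict = RunnableParallel({"joke": joke_chain, "poem": poem_chain})
print(parallel_from_dict.invoke({"topic": "bear"}))
```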
## Routing

Routing selects which sub-chain to run based on the input. LCEL offers two ways to implement it.

### Using a custom function (dynamic routing via `RunnableLambda`)

```python
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

# Classifier chain: labels a question as LangChain, Anthropic, or Other.
chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")


def route(info):
    # Pick a sub-chain based on the classifier's label.
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain


full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)
```

### Using `RunnableBranch`

`RunnableBranch` picks a branch by evaluating condition/runnable pairs in order, falling back to a default:

```python
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: "anthropic" in x["topic"].lower(), anthropic_chain),
    (lambda x: "langchain" in x["topic"].lower(), langchain_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
result = full_chain.invoke({"question": "how do I use Anthropic?"})
print(result)
```

## Dynamic Construction

Parts of a chain can be generated at runtime based on the input: a function decorated with `@chain` (or wrapped in `RunnableLambda`) may return a new `Runnable`, which is then executed in its place.

```python
from operator import itemgetter

from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()


@chain
def contextualize_if_needed(input_: dict):
    # Only rewrite the question when there is chat history to resolve against.
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")


@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"


qa_instructions = (
    "Answer the user question given the following context:\n\n{context}."
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)
```

Output:

```
According to the context provided, Egypt's population in 2024 is estimated to be about 111 million.
```
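Because `contextualize_if_needed` is itself a `Runnable`, its two paths can be probed in isolation. A quick sketch, assuming the objects defined in the example above; the rewritten question is model-dependent:

```python
# No chat history: the passthrough branch returns the question unchanged.
print(contextualize_if_needed.invoke({"question": "what about egypt"}))
# -> what about egypt

# With chat history: the LLM rewrites the question into a standalone one,
# e.g. something like "what is the population of egypt".
print(
    contextualize_if_needed.invoke(
        {
            "question": "what about egypt",
            "chat_history": [("human", "what's the population of indonesia")],
        }
    )
)
```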
## Complete Example

The demos above can be run as a single script: concatenate the snippets in order, move `from operator import itemgetter` up with the other imports, and separate the demos with a divider such as `print("\n-----------------------------------\n")`.

For the J-LangChain implementation of these examples, see: J-LangChain - 智能链构建.

## Summary

LCEL's support for sequential chains, nested chains, parallel chains, routing, and dynamic construction gives developers a powerful toolkit for building complex language tasks. Whether the job is a simple linear flow or complex dynamic decision-making, LCEL can meet the need efficiently; used well, these features let developers quickly assemble efficient, flexible chains to support applications across a wide range of scenarios.