langchain_community.chains.ernie_functions.base.create_ernie_fn_chain

langchain_community.chains.ernie_functions.base.create_ernie_fn_chain(functions: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]], llm: BaseLanguageModel, prompt: BasePromptTemplate, *, output_key: str = 'function', output_parser: Optional[BaseLLMOutputParser] = None, **kwargs: Any) → LLMChain

[Legacy] Create an LLM chain that uses Ernie functions.

Parameters
  • functions (Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]]) – A sequence of dictionaries, pydantic.BaseModel classes, or Python functions. If dictionaries are passed in, they are assumed to already be valid Ernie functions. If only a single function is passed in, the model will be forced to use that function. pydantic.BaseModels and Python functions should have docstrings describing what the function does. For best results, pydantic.BaseModels should include descriptions of their parameters, and Python functions should have Google Python style argument descriptions in their docstrings. Additionally, Python functions should only use primitive types (str, int, float, bool) or pydantic.BaseModels for arguments (see the sketch after this parameter list).

  • llm (BaseLanguageModel) – The language model to use, assumed to support the Ernie function-calling API.

  • prompt (BasePromptTemplate) – BasePromptTemplate to pass to the model.

  • output_key (str) – The key to use when returning the output in LLMChain.__call__.

  • output_parser (Optional[BaseLLMOutputParser]) – BaseLLMOutputParser to use for parsing model outputs. By default it will be inferred from the function types. If pydantic.BaseModels are passed in, the OutputParser will try to parse outputs using those. Otherwise model outputs will simply be parsed as JSON. If multiple functions are passed in and they are not pydantic.BaseModels, the chain output will include both the name of the function that was returned and the arguments to pass to the function.

  • kwargs (Any) –
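
As an illustration of the Python-function form described for the functions parameter, here is a minimal sketch of a function that could be passed in. The record_person name, its arguments, and its return value are assumptions made for this sketch, not part of the API; the only requirements stated above are primitive (or pydantic.BaseModel) argument types and a Google Python style docstring.

from typing import Optional

def record_person(name: str, age: int, fav_food: Optional[str] = None) -> dict:
    """Record some identifying information about a person.

    Args:
        name: The person's name.
        age: The person's age.
        fav_food: The person's favorite food, if known.
    """
    # Hypothetical behavior: simply echo the recorded fields back as a dict.
    return {"name": name, "age": age, "fav_food": fav_food}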

Returns

An LLMChain that will pass in the given functions to the model when run.

Return type

LLMChain

Example

from typing import Optional

from langchain.chains.ernie_functions import create_ernie_fn_chain
from langchain_community.chat_models import ErnieBotChat
from langchain_core.prompts import ChatPromptTemplate

from langchain.pydantic_v1 import BaseModel, Field


class RecordPerson(BaseModel):
    """Record some identifying information about a person."""

    name: str = Field(..., description="The person's name")
    age: int = Field(..., description="The person's age")
    fav_food: Optional[str] = Field(None, description="The person's favorite food")


class RecordDog(BaseModel):
    """Record some identifying information about a dog."""

    name: str = Field(..., description="The dog's name")
    color: str = Field(..., description="The dog's color")
    fav_food: Optional[str] = Field(None, description="The dog's favorite food")


llm = ErnieBotChat(model_name="ERNIE-Bot-4")
prompt = ChatPromptTemplate.from_messages(
    [
        ("user", "Make calls to the relevant function to record the entities in the following input: {input}"),
        ("assistant", "OK!"),
        ("user", "Tip: Make sure to answer in the correct format"),
    ]
)
chain = create_ernie_fn_chain([RecordPerson, RecordDog], llm, prompt)
chain.run("Harry was a chubby brown beagle who loved chicken")
# -> RecordDog(name="Harry", color="brown", fav_food="chicken")
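
As a hedged follow-up, the sketch below reuses the llm and prompt defined above together with the plain record_person function sketched under the functions parameter. Because record_person is not a pydantic.BaseModel, the chain output is simply parsed as JSON, so chain.run returns a dict of arguments rather than a model instance; the input string and the output shown in the comment are illustrative assumptions, not captured from a real run.

chain = create_ernie_fn_chain([record_person], llm, prompt)
chain.run("Sally is 13 and her favorite food is pizza")
# -> (illustrative) {"name": "Sally", "age": 13, "fav_food": "pizza"}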