langchain_community.chains.ernie_functions.base.create_ernie_fn_runnable
- langchain_community.chains.ernie_functions.base.create_ernie_fn_runnable(functions: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]], llm: Runnable, prompt: BasePromptTemplate, *, output_parser: Optional[Union[BaseOutputParser, BaseGenerationOutputParser]] = None, **kwargs: Any) → Runnable [source]
Create a runnable sequence that uses Ernie functions.
- Parameters
functions (Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]]) – A sequence of dictionaries, pydantic.BaseModel classes, or Python functions. If dictionaries are passed in, they are assumed to already be valid Ernie functions. If only a single function is passed in, the model will be forced to use that function. pydantic.BaseModel classes and Python functions should have docstrings describing what they do. For best results, pydantic.BaseModel classes should include descriptions of their parameters, and Python function docstrings should describe their arguments in Google Python style. Additionally, Python functions should only use primitive types (str, int, float, bool) or pydantic.BaseModel classes for their arguments (see the sketches after the example below).
llm (Runnable) – Language model to use, assumed to support the Ernie function-calling API.
prompt (BasePromptTemplate) – BasePromptTemplate to pass to the model.
output_parser (Optional[Union[BaseOutputParser, BaseGenerationOutputParser]]) – BaseLLMOutputParser to use for parsing model outputs. By default it will be inferred from the function types. If pydantic.BaseModel classes are passed in, the output parser will try to parse outputs using those; otherwise model outputs will simply be parsed as JSON. If multiple functions are passed in and they are not pydantic.BaseModel classes, the chain output will include both the name of the function that was returned and the arguments to pass to it.
kwargs (Any) –
- Returns
A runnable sequence that, when run, will pass in the given functions to the model.
- Return type
Runnable
Example
from typing import Optional

from langchain.chains.ernie_functions import create_ernie_fn_runnable
from langchain_community.chat_models import ErnieBotChat
from langchain_core.prompts import ChatPromptTemplate
from langchain.pydantic_v1 import BaseModel, Field


class RecordPerson(BaseModel):
    """Record some identifying information about a person."""

    name: str = Field(..., description="The person's name")
    age: int = Field(..., description="The person's age")
    fav_food: Optional[str] = Field(None, description="The person's favorite food")


class RecordDog(BaseModel):
    """Record some identifying information about a dog."""

    name: str = Field(..., description="The dog's name")
    color: str = Field(..., description="The dog's color")
    fav_food: Optional[str] = Field(None, description="The dog's favorite food")


llm = ErnieBotChat(model_name="ERNIE-Bot-4")
prompt = ChatPromptTemplate.from_messages(
    [
        ("user", "Make calls to the relevant function to record the entities in the following input: {input}"),
        ("assistant", "OK!"),
        ("user", "Tip: Make sure to answer in the correct format"),
    ]
)
chain = create_ernie_fn_runnable([RecordPerson, RecordDog], llm, prompt)
chain.invoke({"input": "Harry was a chubby brown beagle who loved chicken"})
# -> RecordDog(name="Harry", color="brown", fav_food="chicken")
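In addition to the pydantic models above, functions also accepts plain Python functions. The sketch below is not part of the original reference; the record_dog helper and the prompt wording are illustrative assumptions. It shows a single Python function with primitive-typed arguments and a Google-style docstring being passed in, in which case the model is forced to call that function and, because it is not a pydantic.BaseModel, the output is simply parsed as JSON.

from typing import Optional

from langchain.chains.ernie_functions import create_ernie_fn_runnable
from langchain_community.chat_models import ErnieBotChat
from langchain_core.prompts import ChatPromptTemplate


def record_dog(name: str, color: str, fav_food: Optional[str] = None) -> dict:
    """Record some identifying information about a dog.

    Args:
        name: The dog's name.
        color: The dog's color.
        fav_food: The dog's favorite food.
    """
    return {"name": name, "color": color, "fav_food": fav_food}


llm = ErnieBotChat(model_name="ERNIE-Bot-4")
prompt = ChatPromptTemplate.from_messages(
    [
        ("user", "Record the entities in the following input: {input}"),
        ("assistant", "OK!"),
        ("user", "Tip: Make sure to answer in the correct format"),
    ]
)
# A single non-pydantic function: the model is forced to call it, and the
# chain output is parsed as JSON, i.e. the arguments to pass to record_dog.
chain = create_ernie_fn_runnable([record_dog], llm, prompt)
chain.invoke({"input": "Harry was a chubby brown beagle who loved chicken"})
# -> e.g. {"name": "Harry", "color": "brown", "fav_food": "chicken"}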
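Dictionaries are passed through unchanged and assumed to already be valid Ernie functions. The sketch below assumes an OpenAI-functions-style schema (name, description, and a JSON Schema parameters object); the exact schema shape and the record_dog_schema name are assumptions, not taken from this page.

# Assumed function schema: name / description / JSON Schema "parameters".
# Dicts are not validated or converted, so they must already be in the
# format the Ernie function-calling API expects.
record_dog_schema = {
    "name": "RecordDog",
    "description": "Record some identifying information about a dog.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string", "description": "The dog's name"},
            "color": {"type": "string", "description": "The dog's color"},
            "fav_food": {"type": "string", "description": "The dog's favorite food"},
        },
        "required": ["name", "color"],
    },
}

# Reusing the llm and prompt defined above. A dict is not a pydantic model,
# so the model output is parsed as JSON arguments rather than a BaseModel.
chain = create_ernie_fn_runnable([record_dog_schema], llm, prompt)
chain.invoke({"input": "Harry was a chubby brown beagle who loved chicken"})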