v3.2.1.3
New Features
[Request: OAIClient]
Added a new request plugin for models whose API format resembles OpenAI's but enforces additional rules, such as not supporting multiple system messages or requiring a strict user-assistant message order. It is especially useful for local model services started by local model-serving libraries such as Xinference.
HOW TO USE:
import Agently
agent_factory = (
Agently.AgentFactory(is_debug=True)
.set_settings("current_model", "OAIClient")
# Mixtral for example
.set_settings("model.OAIClient.url", "https://api.mistral.ai/v1")
# if you want to use Moonshot Kimi:
#.set_settings("model.OAIClient.url", "https://api.moonshot.cn/v1")
# set model name
# Mixtral model list: https://docs.mistral.ai/platform/endpoints/
.set_settings("model.OAIClient.options", { "model": "open-mistral-7b" })
# Moonshot model list: https://platform.moonshot.cn/docs/pricing#文本生成模型-moonshot-v1
# set API-KEY if needed
.set_settings("model.OAIClient.auth.api_key", "")
# set proxy if needed
#.set_proxy("http://127.0.0.1:7890")
# you can also change message rules
#.set_settings("model.OAIClient.message_rules", {
# "no_multi_system_messages": True, # True by default, will combine multi system messages into one
# "strict_orders": True, # True by default, will transform messages' order into "User-Assistant-User-Assistant" strictly
# "no_multi_type_messages": True, # True by default, will only allow text messages
#})
)
agent = agent_factory.create_agent()
(
agent
.set_role("You love EMOJI very much and try to use EMOJI in every sentence.")
.chat_history([
{ "role": "user", "content": "It's a beautiful day, isn't it?" },
{ "role": "assistant", "content": "Right, shine and bright!☀️" }
])
.input("What do you suggest us to do today?")
# use .start("completions") if your model is a completion model
.start("chat")
)
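The message_rules options in the comments above can be hard to visualize. The sketch below is a hypothetical illustration (not Agently's actual internals) of what no_multi_system_messages and strict_orders conceptually do to a message list before it is sent to the model:

```python
# Hypothetical sketch: conceptually apply "no_multi_system_messages"
# (merge all system messages into one) and "strict_orders" (merge
# consecutive same-role messages so user/assistant strictly alternate).
def apply_message_rules(messages):
    # Merge every system message into a single leading one
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    result = []
    if system_parts:
        result.append({"role": "system", "content": "\n".join(system_parts)})
    # Collapse consecutive messages from the same role
    for m in rest:
        if result and result[-1]["role"] == m["role"]:
            result[-1]["content"] += "\n" + m["content"]
        else:
            result.append(dict(m))
    return result

messages = [
    {"role": "system", "content": "Be brief."},
    {"role": "system", "content": "Use EMOJI."},
    {"role": "user", "content": "Hi"},
    {"role": "user", "content": "How are you?"},
    {"role": "assistant", "content": "Great!😀"},
]
print(apply_message_rules(messages))
```

This is only meant to show why such rules matter: some OpenAI-like backends reject requests with two system messages or two consecutive user messages, so the plugin normalizes the list first.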
Updates
[Request: ERNIE]
Added support for the system parameter in the new API reference: the system prompt is now passed to the system parameter directly instead of being transformed into one of the user chat messages. Maplemx/Agently@dc52bdc
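To illustrate the change, the sketch below contrasts the old and new payload shapes. The ERNIE chat API documents a top-level system parameter; the exact field layout shown here is an assumption for illustration only:

```python
# Before: the system prompt was folded into a user chat message
old_payload = {
    "messages": [
        {"role": "user", "content": "You are a helpful assistant.\n\nHello!"},
    ],
}

# After: the system prompt travels in the top-level "system" parameter,
# leaving the messages list as pure conversation turns
new_payload = {
    "system": "You are a helpful assistant.",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}
```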
[Request]
Optimized the prompt for lists that contain multiple items. Maplemx/Agently@9f378c7
Bug Fixes
[Request Alias]
Fixed bugs that caused .general() and .abstract() not to work. Maplemx/Agently@5f6dd5e
[Agent Component: Segment]
Fixed a bug that caused the streaming handler not to work. Maplemx/Agently@8ad370c
[Request: ERNIE]
Fixed some quotation mark conflicts. Maplemx/Agently@fcdcdf0
New Features
[Request Plugin: OAIClient]
Added a new request plugin, OAIClient, to help developers request models whose APIs look very similar to the OpenAI API format (but usually carry some hidden rules that differ from it). This plugin can also be used to request local model services started by local model-serving libraries such as Xinference.
HOW TO USE:
import Agently
agent_factory = (
Agently.AgentFactory(is_debug=True)
.set_settings("current_model", "OAIClient")
# Mixtral as an example
.set_settings("model.OAIClient.url", "https://api.mistral.ai/v1")
# If you want to use Moonshot's Kimi, use this url instead:
#.set_settings("model.OAIClient.url", "https://api.moonshot.cn/v1")
# Set the exact model to use
# Mixtral model list: https://docs.mistral.ai/platform/endpoints/
.set_settings("model.OAIClient.options", { "model": "open-mistral-7b" })
# Moonshot model list: https://platform.moonshot.cn/docs/pricing#文本生成模型-moonshot-v1
# Set API-KEY if needed (local models may not need one)
.set_settings("model.OAIClient.auth.api_key", "")
# Set a proxy if needed
#.set_proxy("http://127.0.0.1:7890")
# You can also change the message handling rules
#.set_settings("model.OAIClient.message_rules", {
# "no_multi_system_messages": True, # True by default; multiple system messages will be combined into one
# "strict_orders": True, # True by default; messages will be forced into strict "User-Assistant-User-Assistant" order
# "no_multi_type_messages": True, # True by default; only text messages are kept, with the text value placed directly in content
#})
#})
)
agent = agent_factory.create_agent()
(
agent
.set_role("You love EMOJI very much and try to use EMOJI in every sentence.")
.chat_history([
{ "role": "user", "content": "It's a beautiful day, isn't it?" },
{ "role": "assistant", "content": "Right, shine and bright!☀️" }
])
.input("What do you suggest us to do today?")
# use .start("completions") if your model is a completion model
.start("chat")
)
Updates
[Request Plugin: ERNIE]
Added direct support for the system parameter in the new API reference. The system prompt for ERNIE is now passed directly to the API's system parameter instead of being transformed into a user chat message.
[Request Optimization]
Optimized the prompting method for lists that can contain multiple items.
Bug Fixes
[Request Alias]
Fixed issues that caused .general() and .abstract() not to work.
[Agent Component: Segment]
Fixed an issue that caused the handler not to work during streaming output.
[Request Plugin: ERNIE]
Fixed some quotation mark conflict issues.