Replies: 2 comments 2 replies
-
The error you're encountering, "Model name llama2 does not support function calling API," indicates that the model you selected does not advertise a native function calling API, so FunctionCallingAgentWorker refuses to use it. To resolve this issue, you need a model that is explicitly marked to support function calling. This typically involves ensuring the model's metadata has the is_function_calling_model flag set to True. If you're working directly with the Ollama integration, note that llama2 does not report function calling support, so you'll need either a model that does or a different agent worker (such as a ReAct-style agent that drives tool use through prompting).
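To make the failure mode concrete, here is a minimal, self-contained sketch of the kind of metadata check described above. This is an illustration, not llama_index's actual code: LLMMetadata and make_function_calling_agent below are simplified stand-ins, though is_function_calling_model is the real name of the flag on llama_index's LLM metadata.

```python
# Illustrative sketch (NOT llama_index internals): how a capability flag on
# model metadata can gate a function-calling agent constructor.
from dataclasses import dataclass


@dataclass
class LLMMetadata:
    model_name: str
    # Models that expose a native function/tool calling API set this to True.
    is_function_calling_model: bool = False


def make_function_calling_agent(metadata: LLMMetadata) -> str:
    """Refuse to build a function-calling agent for models lacking the API."""
    if not metadata.is_function_calling_model:
        raise ValueError(
            f"Model name {metadata.model_name} does not support function calling API."
        )
    return f"agent({metadata.model_name})"


# llama2 served via Ollama reports no native function-calling support,
# so construction fails with the error seen in the question:
try:
    make_function_calling_agent(LLMMetadata("llama2"))
except ValueError as err:
    print(err)
```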
-
@AashiDutt you should use ReActAgentWorker -- the FunctionCallingAgentWorker is only for LLMs with a specific built-in API for tool calling (this includes OpenAI, Anthropic, and Mistral)
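For reference, swapping the worker class in the original snippet would look roughly like this. This is a sketch under the assumption that the llama-index and llama-index-llms-ollama packages are installed and an Ollama server is running locally; initial_tools is the user's own tool list from the question.

```python
# Sketch: a ReAct-style agent drives tool use through prompting, so it works
# with local models (like llama2 via Ollama) that lack a tool-calling API.
from llama_index.core.agent import ReActAgentWorker, AgentRunner
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2", request_timeout=120.0)

agent_worker = ReActAgentWorker.from_tools(
    initial_tools,  # the tool list defined earlier in the user's code
    llm=llm,
    verbose=True,
)
agent = AgentRunner(agent_worker)
```

The only change from the failing code is the worker class; AgentRunner wraps either kind of worker the same way.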
-
Hello,
I want to use a local LLM within an agent worker without using any APIs. Here is the code I'm working with:
from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.agent import AgentRunner
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2")
agent_worker = FunctionCallingAgentWorker.from_tools(
    initial_tools,
    llm=llm,
    verbose=True,
)
agent = AgentRunner(agent_worker)
This code returned:
Cell In[107], line 4
1 from llama_index.core.agent import FunctionCallingAgentWorker
2 from llama_index.core.agent import AgentRunner
----> 4 agent_worker = FunctionCallingAgentWorker.from_tools(
5 initial_tools,
6 llm = llm,
7 verbose = True
8 )
10 agent = AgentRunner(agent_worker)
File ~/miniconda3/envs/DL/lib/python3.10/site-packages/llama_index/core/agent/function_calling/step.py:125, in FunctionCallingAgentWorker.from_tools(cls, tools, tool_retriever, llm, verbose, max_function_calls, callback_manager, system_prompt, prefix_messages, **kwargs)
121 prefix_messages = [ChatMessage(content=system_prompt, role="system")]
123 prefix_messages = prefix_messages or []
--> 125 return cls(
126 tools=tools,
127 tool_retriever=tool_retriever,
128 llm=llm,
129 prefix_messages=prefix_messages,
130 verbose=verbose,
131 max_function_calls=max_function_calls,
132 callback_manager=callback_manager,
133 **kwargs,
134 )
...
71 )
72 self._llm = llm
73 self._verbose = verbose
ValueError: Model name llama2 does not support function calling API.