
LLM Tools #2164

Closed · wants to merge 16 commits

Conversation

@Shulyaka (Contributor) commented May 7, 2024

Proposed change

Add developer documentation on how to add a new LLM tool.

Type of change

  • Document existing features within Home Assistant
  • Document new or changing features for which there is an existing pull request elsewhere
  • Spelling or grammatical corrections, or rewording for improved clarity
  • Changes to the backend of this documentation
  • Removed stale or deprecated documentation

Additional information
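
For context, a minimal sketch of what such a tool definition might look like. It assumes an `llm.Tool` base class with `name`, `description`, `parameters`, and an `async_call` method; the exact interface was still being settled while this PR was open, so the class, attribute, and signature names below are assumptions rather than the final API.

```python
import voluptuous as vol

from homeassistant.core import HomeAssistant
from homeassistant.helpers import llm


class TimerTool(llm.Tool):
    """Hypothetical tool that starts a timer (illustrative only)."""

    name = "start_timer"
    description = "Start a timer for the given number of minutes"
    parameters = vol.Schema({vol.Required("minutes"): vol.Coerce(int)})

    async def async_call(self, hass: HomeAssistant, tool_input: llm.ToolInput) -> dict:
        """Run the tool and return a JSON-serializable result for the LLM."""
        minutes = tool_input.tool_args["minutes"]
        # A real tool would start a timer via a service call here.
        return {"result": f"Timer set for {minutes} minutes"}
```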

@Shulyaka mentioned this pull request May 7, 2024

```python
function_response = await llm.async_call_tool(
    self.hass, tool_input
)
except Exception as e:  # pylint: disable=broad-exception-caught
```
Contributor:

Limit the set of exceptions caught to just those thrown by well-behaved tools. While in theory it is good to be defensive here, the reason for being more specific is to avoid masking bugs.

The two types of errors I can foresee here are:

  • HomeAssistantError, raised when a known problem happens, e.g. a transient error from a backend
  • An input validation exception (e.g. vol.Invalid or some equivalent validation exception)

Also, returning an error as a dictionary is not a common pattern in the Home Assistant code base. Given that LLM calls happen from conversation agents, perhaps an example with an IntentResponse would work? e.g. https://github.com/home-assistant/core/blob/8168aff25370e5a7afcb989d83f77815cbbf6903/homeassistant/components/openai_conversation/conversation.py#L120
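
A rough sketch of that suggestion, assuming the draft `llm.async_call_tool` helper used in this PR (the final helper API landed in a later PR and may differ) and an agent method that already has `user_input`, `tool_input`, and `conversation_id` in scope:

```python
import voluptuous as vol

from homeassistant.components import conversation
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import intent, llm


async def _async_call_tool_or_error(self, user_input, tool_input, conversation_id):
    """Call the tool, converting expected failures into an error response (sketch)."""
    try:
        return await llm.async_call_tool(self.hass, tool_input)
    except (HomeAssistantError, vol.Invalid) as err:
        # Only catch errors a well-behaved tool is expected to raise,
        # so genuine bugs still surface as tracebacks.
        intent_response = intent.IntentResponse(language=user_input.language)
        intent_response.async_set_error(
            intent.IntentResponseErrorCode.UNKNOWN,
            f"Error calling tool: {err}",
        )
        return conversation.ConversationResult(
            response=intent_response, conversation_id=conversation_id
        )
```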


```python
from homeassistant.helpers import llm
...
```
@home-assistant bot marked this pull request as draft May 10, 2024 19:53

@home-assistant (bot):

Please take a look at the requested changes, and use the Ready for review button when you are done, thanks 👍

Learn more about our pull request process.

Shulyaka and others added 6 commits May 11, 2024 15:45
Co-authored-by: Allen Porter <allen.porter@gmail.com>
Co-authored-by: Allen Porter <allen.porter@gmail.com>
Co-authored-by: Allen Porter <allen.porter@gmail.com>
@Shulyaka marked this pull request as ready for review May 12, 2024 00:11
@home-assistant bot requested a review from allenporter May 12, 2024 00:11
@allenporter (Contributor) left a comment

Looks good!

```python
) -> conversation.ConversationResult:
"""Process a sentence."""

while True:
```
Contributor:

check indent

Contributor:

Perhaps the example should just try at most two LLM calls, though I realize it will get verbose.

Contributor (Author):

You mean remove the loop?

```python
    self.hass, tool_input
)
except (HomeAssistantError, vol.Invalid) as e:
    tool_response = {"error": type(e).__name__}
```
Contributor:

Make a ConversationResult here?

Contributor (Author):

Too early. We need to feed the tool result back to the LLM and wait for its response.
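
To illustrate that flow: a rough sketch of the loop under discussion, assuming the draft `llm.async_call_tool` helper from this PR, a hypothetical `self._async_query_llm` helper for the actual model call, and placeholder `llm_response.tool_call` / `llm_response.text` attributes and message format:

```python
import voluptuous as vol

from homeassistant.components import conversation
from homeassistant.exceptions import HomeAssistantError
from homeassistant.helpers import intent, llm


async def _async_chat_loop(self, user_input, messages, conversation_id):
    """Keep querying the LLM until it stops requesting tools (illustrative sketch)."""
    while True:
        # Hypothetical helper: send the conversation so far to the model.
        llm_response = await self._async_query_llm(messages)

        tool_input = llm_response.tool_call  # None once the model answers in plain text
        if tool_input is None:
            # Only now is the exchange finished, so build the ConversationResult.
            intent_response = intent.IntentResponse(language=user_input.language)
            intent_response.async_set_speech(llm_response.text)
            return conversation.ConversationResult(
                response=intent_response, conversation_id=conversation_id
            )

        try:
            # Draft helper discussed in this PR; the final API may differ.
            tool_response = await llm.async_call_tool(self.hass, tool_input)
        except (HomeAssistantError, vol.Invalid) as err:
            tool_response = {"error": type(err).__name__, "error_text": str(err)}

        # Feed the tool result back to the LLM and loop for its next reply.
        messages.append({"role": "tool", "content": tool_response})
```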

Apply suggestions from code review
@balloob mentioned this pull request May 20, 2024
@Shulyaka (Contributor, Author):

Superseded by #2178

@Shulyaka closed this May 22, 2024