This is an MVP (minimum viable product) version; there is still much that can be optimized, and it will be improved gradually.
If you have any ideas or suggestions, please open an issue to let me know.
English | 简体中文
This component integrates the power of LangChain into Home Assistant. With this component, users can engage in natural language conversations to control smart devices, create automations, and more.
- **Conversational Control:** Use natural language to interact with your smart home. Simply chat with Home Assistant using voice or text commands to perform actions such as turning on lights, adjusting the thermostat, or creating automation rules.
- **Contextual Understanding:** LangChain's memory capability lets this component track the context of the conversation and handle follow-up requests.
- **Multi-LLM Support:** Built on LangChain, this component will be able to support a variety of large language models in the future, including locally deployed models, which avoids users' privacy concerns.
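To illustrate the contextual-understanding feature above, here is a minimal sketch of how conversation memory makes follow-up requests resolvable. This is not the component's actual implementation; the `ConversationMemory` class below is a hypothetical stand-in for the memory classes LangChain provides.

```python
class ConversationMemory:
    """Keeps prior (user, assistant) turns so the model sees context."""

    def __init__(self):
        self.turns = []

    def save(self, user_input, assistant_output):
        # Record one completed exchange.
        self.turns.append((user_input, assistant_output))

    def as_prompt_context(self):
        # Flatten the history into text prepended to the next prompt,
        # so a follow-up like "now dim it" can be resolved.
        return "\n".join(
            f"User: {u}\nAssistant: {a}" for u, a in self.turns
        )


memory = ConversationMemory()
memory.save("Turn on the living room light", "Done, the light is on.")
print(memory.as_prompt_context())
# → User: Turn on the living room light
#   Assistant: Done, the light is on.
```

Because the previous turn is included in the prompt context, the model can tell that "it" in a follow-up refers to the living room light.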
- Call a service, including triggering a scene
- Add an automation
- Add a script
- Add a scene
- ...
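As an illustration of the "add an automation" action, an automation created through conversation ends up as ordinary Home Assistant YAML. The alias, trigger, and entity ID below are hypothetical examples, not output guaranteed by this component:

```yaml
# Hypothetical result of asking the assistant to
# "turn on the porch light at sunset"
automation:
  - alias: "Porch light at sunset"
    trigger:
      - platform: sun
        event: sunset
    action:
      - service: light.turn_on
        target:
          entity_id: light.porch
```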
- Copy the `llm_conversation_assist` folder into `<your config directory>/custom_components`.
- Restart Home Assistant to load the component.
- Open the Home Assistant frontend or mobile app.
- Navigate to Settings > Devices & services.
- Select llm_conversation_assist in the Integrations tab.
- Click ADD SERVICE and follow the config flow to complete the setup.
  - Configure the necessary settings of the LLM model.
  - Specify a config name `<your agent name>`.
- Navigate to Settings > Voice assistants.
- Click Add assistant.
- Specify an assistant name `<your assistant name>`.
- Choose `<your agent name>` as the Conversation agent.
- Open the Home Assistant frontend or mobile app.
- Click the conversation agent icon or open the conversation panel.
- Select your assistant by switching to `<your assistant name>`.
- Now start your conversation.
```yaml
# configuration.yaml
logger:
  default: info
  logs:
    custom_components.llm_conversation_assist: debug
```
This is most likely caused by a failure to install the Python dependency packages.

- If the LLM you want to use does not depend on a given Python package, you can simply delete the corresponding package name from `manifest.json`:
  - OpenAI -> `langchain-openai`
  - Tongyi -> `dashscope`
  - Qianfan -> `qianfan`
- You can also manually install the corresponding dependencies via `pip install`.