Ollama
The Ollama integration adds a conversation agent in Home Assistant powered by a local Ollama server.
Controlling Home Assistant is an experimental feature that provides the AI access to the Assist API of Home Assistant. You can control which devices and entities it can access from the exposed entities page. The AI is able to provide you with information about your devices and to control them.
This integration does not integrate with sentence triggers.
This integration requires an external Ollama server, which is available for macOS, Linux, and Windows. Follow the download instructions to set one up.
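As a sketch of that setup, assuming the Ollama CLI is installed, you can start the server and pre-download a model from a terminal (the `OLLAMA_HOST` variable is only needed if Home Assistant runs on a different machine than the server, and `llama3.1:8b` is just an example model name):

```shell
# Start the Ollama server, listening on all interfaces so that a
# Home Assistant instance on another machine can reach it.
OLLAMA_HOST=0.0.0.0 ollama serve &

# Pre-download a model so setup in Home Assistant is faster.
ollama pull llama3.1:8b

# List locally available models to verify the download.
ollama list

# Confirm the server is reachable on its default port.
curl http://localhost:11434/api/tags
```

During setup, Home Assistant will point at this server's URL (by default `http://<server-ip>:11434`).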
Configuration
To add the Ollama service to your Home Assistant instance, use this My button:
Manual configuration steps
If the My button above does not work, you can also perform the following steps manually:
- Browse to your Home Assistant instance.
- Go to Settings > Devices & services.
- In the bottom right corner, select the Add Integration button.
- From the list, select Ollama.
- Follow the on-screen instructions to complete the setup.
Options
Options for Ollama can be set via the user interface, by taking the following steps:
- Browse to your Home Assistant instance.
- Go to Settings > Devices & services.
- If multiple instances of Ollama are configured, choose the instance you want to configure.
- Select the integration, then select Configure.
- Model: Name of the Ollama model to use, such as mistral or llama2:13b. Models will be automatically downloaded during setup.
- Instructions: Instructions for the AI on how it should respond to your requests. They are written using Home Assistant Templating.
- Control Home Assistant: Whether the model is allowed to interact with Home Assistant. It can only control or provide information about entities that are exposed to it. This feature is considered experimental; see Controlling Home Assistant below for details on model limitations.
- Context window size: The number of tokens the model can take as input. Home Assistant defaults to 8k, which is larger than the default value in Ollama Server (2k); you may adjust it based on the maximum context size of the specific model used. A larger value better supports larger homes with more entities, while a smaller value may lower Ollama server RAM usage.
- Max history messages: Maximum number of messages to keep for each conversation (0 = no limit). Limiting this value causes older messages in a conversation to be dropped.
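When choosing a context window size, you can check a model's maximum context length with the Ollama CLI (a sketch, assuming the model has already been pulled; the model name here is an example):

```shell
# Print model details; the "context length" field in the model
# info is the upper bound for the context window size option.
ollama show llama3.1:8b
```

Setting the option above this value has no effect, since the model itself cannot attend to more tokens.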
Controlling Home Assistant
If you want to experiment with local LLMs using Home Assistant, we currently recommend using the llama3.1:8b
model and exposing fewer than 25 entities. Note that smaller models are more likely to make mistakes than larger models.
Only models that support Tools may control Home Assistant.
Smaller models may not reliably maintain a conversation while controlling Home Assistant. As a workaround, you can set up two instances of the integration:
- Add the Ollama integration without enabling control of Home Assistant. You can use this conversation agent to have a conversation.
- Add an additional Ollama integration, using the same model, enabling control of Home Assistant. You can use this conversation agent to control Home Assistant.