ollama-python 0.4.7-1 source package in Ubuntu
Changelog
ollama-python (0.4.7-1) unstable; urgency=medium

  * New upstream release.

 -- Edward Betts <edward@4angle.com>  Sat, 25 Jan 2025 12:50:13 +0100
Upload details
- Uploaded by: Home Assistant Team
- Uploaded to: Sid
- Original maintainer: Home Assistant Team
- Architectures: all
- Section: misc
- Urgency: Medium Urgency
Publishing
| Series | Published | Component | Section |
|---|---|---|---|
| Plucky | release | universe | misc |
Downloads
| File | Size | SHA-256 Checksum |
|---|---|---|
| ollama-python_0.4.7-1.dsc | 2.4 KiB | 99d8aed011b8628ff81c3d68b46119263bbeee7d86ffb35f2bf4073ac5935dfa |
| ollama-python_0.4.7.orig.tar.gz | 41.5 KiB | 6d59d084ebf61d5eff140af25e40562e377823471a1933f191471b11b533288c |
| ollama-python_0.4.7-1.debian.tar.xz | 2.4 KiB | b79faf1630d229ecc13a1cbea32ec1dad79cec938346581e9ab89997e469a75f |
Available diffs
- diff from 0.4.5-1 to 0.4.7-1 (14.6 KiB)
No changes file available.
Binary packages built by this source
- python3-ollama: Library for interacting with the Ollama server and its AI models
This library provides functionality for integrating with an Ollama server to
interact with AI language models and create conversational experiences. It
allows querying and generating text through an API that communicates with the
server, supporting various operations such as model management, message
exchange, and prompt handling. Ollama requires configuration to connect to a
network-accessible server, after which it can be used to fetch and generate
information based on context received from Home Assistant or similar
platforms. Through model specification and prompt templates, the library
adapts responses to the specific environment, although it operates without
direct command over connected devices.
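The description above can be sketched in code. The following is a minimal, hedged example of using the `ollama` module shipped by python3-ollama for message exchange with a local server; it assumes a server is reachable at the default `http://localhost:11434` and that a model such as `llama3.2` has already been pulled — neither assumption comes from this package page, and the call is skipped gracefully if no server is available.

```python
def build_chat_messages(system_prompt, user_prompt):
    """Assemble the role/content message list the Ollama chat API expects."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    messages = build_chat_messages(
        "You are a concise assistant.",
        "Summarise what a .dsc file is in one sentence.",
    )
    try:
        import ollama  # provided by the python3-ollama binary package
        # model name "llama3.2" is an assumption; use any model you have pulled
        response = ollama.chat(model="llama3.2", messages=messages)
        print(response["message"]["content"])
    except Exception as exc:  # no server, no model, or module unavailable
        print(f"Ollama call skipped: {exc}")
```

The prompt-template adaptation mentioned in the description would be done by varying the system message per environment; the library itself only relays the messages to the server.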
