# llama-utils
LlamaIndex utility package
## Current release info
| Name | Downloads | Version | Platforms |
|------|-----------|---------|-----------|
## llama-utils - Large Language Model Utility Package

llama-utils is a utility package for building applications with large language models.
## Main Features

- Built on top of llama-index
## Package Overview

```mermaid
graph TB
    Package[llama-utils]
    Package --> SubPackage1[Indexing]
    Package --> SubPackage3[Storage]
    SubPackage1 --> Module1[index_manager.py]
    SubPackage1 --> Module2[custom_index.py]
    SubPackage3 --> Module5[storage.py]
    SubPackage3 --> Module6[config_loader.py]
```
A complete overview of the design and architecture is available here.
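In code, the components above are ordinary Python modules. The only import path confirmed by this README is the storage module used in the quick-start example below; the other module names in the diagram are shown for orientation, so treat any other import path as an assumption:

```python
# Storage module from the diagram; this import path is the one used in the
# quick-start example below. The other files shown in the diagram
# (index_manager.py, custom_index.py, config_loader.py) live in their own
# subpackages, whose exact import paths are not given in this README.
from llama_utils.retrieval.storage import Storage
```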
## Installing llama-utils
Installing llama-utils from the conda-forge channel can be achieved by:

```bash
conda install -c conda-forge llama-utils=0.2.0
```
It is possible to list all the versions of llama-utils available on your platform with:

```bash
conda search llama-utils --channel conda-forge
```
### Install from GitHub

To install the latest development version, you can install the library directly from GitHub:

```bash
pip install git+https://github.com/Serapieum-of-alex/llama-utils
```
### pip

To install the latest release, you can use pip:

```bash
pip install llama-utils==0.2.0
```
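Whichever method you use, you can confirm the installation succeeded by querying the installed distribution with Python's standard library. This only assumes the package was installed under the distribution name llama-utils, as in the commands above:

```python
# Print the installed version of the llama-utils distribution.
# Uses only the standard library; raises PackageNotFoundError if the
# package is not installed.
from importlib.metadata import version

print(version("llama-utils"))
```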
## Quick start
- First, download ollama and install it.
- Then run the following command to pull the llama3 model:

  ```bash
  ollama pull llama3
  ```
- Then run the ollama server (if you get an error, check the errors section below to solve it):

  ```bash
  ollama serve
  ```

Now you can use the `llama-utils` package to interact with the `ollama` server:
```python
from llama_utils.retrieval.storage import Storage

# Directory where the storage will be persisted.
STORAGE_DIR = "examples/data/llama3"

# Create an empty storage, read the example documents, add them, and save.
storage = Storage.create()
data_path = "examples/data/essay"
docs = storage.read_documents(data_path)
storage.add_documents(docs)
storage.save(STORAGE_DIR)
```
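If the example cannot reach the model, a quick way to troubleshoot is to check that the ollama server is actually running and that llama3 was pulled. The snippet below is independent of llama-utils and assumes ollama's default address, http://localhost:11434:

```python
# Standalone sanity check: confirm the local ollama server is reachable and
# list the models it has pulled. Not part of llama-utils; assumes ollama's
# default listen address (http://localhost:11434).
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        models = json.load(resp).get("models", [])
        print("ollama is running; pulled models:", [m["name"] for m in models])
except OSError as exc:
    print("ollama server is not reachable:", exc)
```

If the server is not reachable, start it again with `ollama serve` as shown in the steps above.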