Focuses on running LLMs at the edge, on devices like the Raspberry Pi.
Updated Apr 2, 2025 · Jupyter Notebook
A high-performance, modular AI chat solution for Jetson™ edge devices. It integrates Ollama with the Meta Llama 3.2 3B model for LLM inference, FastAPI-based Langchain middleware, and OpenWebUI.
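The middleware layer described above could be sketched roughly as follows: a thin Python service that forwards a user prompt to a local Ollama server running Llama 3.2 3B and returns the model's reply. This is a minimal, hedged illustration, not the project's actual code; the endpoint path and default port (`/api/chat`, `11434`) are Ollama's documented defaults, and the model tag `llama3.2:3b` is assumed from the description.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint
MODEL = "llama3.2:3b"  # assumed Ollama tag for Meta Llama 3.2 3B


def build_payload(prompt: str) -> dict:
    """Assemble a non-streaming chat payload in the shape /api/chat expects."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def chat(prompt: str) -> str:
    """Forward the prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

In the actual project this call would sit behind a FastAPI route (with LangChain handling prompt orchestration) so that OpenWebUI can talk to it over HTTP.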