MC Lab is a user-friendly desktop application for running large language models (LLMs) locally. It integrates with Hugging Face, letting you download and manage models effortlessly, and it also accepts manually supplied `.gguf` files for full flexibility. On launch, MC Lab automatically starts a FastAPI server you can use to chat with your locally hosted models.
- Download Models from Hugging Face: Search for and download models directly from the Hugging Face Hub.
- Manual Model Input: Specify custom `.gguf` files for running LLMs.
- Integrated FastAPI Server:
  - Automatically starts a FastAPI server when the app launches.
  - Access the chat endpoint at `http://127.0.0.1:8000/chat`.
- Interactive Chat Interface: Chat directly with the downloaded models via the app or API.
- Full-Screen, Modern GUI: Designed with PyQt5 for a sleek and intuitive user experience.
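Programmatically, the download behind the "Download LLM" button can be sketched with `huggingface_hub` (which is in the requirements); the repo and filename below are illustrative placeholders, not values MC Lab hard-codes:

```python
from huggingface_hub import hf_hub_download

def download_gguf(repo_id: str, filename: str) -> str:
    """Download one file from the Hugging Face Hub; returns the local path."""
    return hf_hub_download(repo_id=repo_id, filename=filename)

if __name__ == "__main__":
    # Illustrative example; any repo hosting .gguf files works.
    path = download_gguf("TheBloke/Llama-2-7B-Chat-GGUF",
                         "llama-2-7b-chat.Q4_K_M.gguf")
    print(path)
```

Files are cached locally, so repeated calls return the cached path instead of re-downloading.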
Follow these steps to get started with MC Lab:
- Python 3.8+
- Install the required Python libraries:

  ```bash
  pip install -r requirements.txt
  ```

  Required libraries include:

  - PyQt5
  - huggingface_hub
  - fastapi
  - uvicorn
  - langchain-core
  - langchain-community
- Clone the repository:

  ```bash
  git clone https://github.com/metriccoders/mc-lab.git
  cd mc-lab
  ```

- Run the application:

  ```bash
  python app.py
  ```
Running `app.py` launches the GUI. The application starts in full-screen mode and offers the following features:
- Search and Download Models:
- Use the search bar to find models on Hugging Face.
- Download the selected model by clicking the "Download LLM" button.
- Manual Model Loading:
  - Specify the path to a `.gguf` file manually.
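Under the hood, a local `.gguf` file can be loaded through `langchain-community` (already in the requirements). A minimal sketch, assuming the model path is supplied by the user; the exact parameters MC Lab passes may differ:

```python
def load_gguf(model_path: str):
    """Load a local .gguf model with llama.cpp via langchain-community."""
    # Imported lazily so this module can be inspected without langchain installed.
    from langchain_community.llms import LlamaCpp
    return LlamaCpp(model_path=model_path, n_ctx=2048, temperature=0.7)

if __name__ == "__main__":
    llm = load_gguf("models/your-model.gguf")  # hypothetical path
    print(llm.invoke("Hello!"))
```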
- Chat with Models:
- Use the chat interface in the GUI to interact with your models.
- The FastAPI server runs automatically when the application starts.
- Chat Endpoint: `http://127.0.0.1:8000/chat`
  - Send a POST request to this endpoint with the following format:

    ```json
    { "message": "Your message here" }
    ```

  - The server will respond with the model's output.
You can interact with the FastAPI server using any HTTP client, such as curl or Postman.
Example using curl:
```bash
curl -X POST "http://127.0.0.1:8000/chat" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, LLM!"}'
```

Response:

```json
{
  "response": "Hello! How can I assist you today?"
}
```

Coming soon!
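The same request can be made from Python. A minimal client sketch using the `requests` library, with the endpoint and payload shapes shown above:

```python
import requests

CHAT_URL = "http://127.0.0.1:8000/chat"

def chat(message: str) -> str:
    """Send a message to the local MC Lab server and return the reply."""
    resp = requests.post(CHAT_URL, json={"message": message}, timeout=60)
    resp.raise_for_status()
    # The server responds with {"response": "..."} as documented above.
    return resp.json()["response"]

if __name__ == "__main__":
    print(chat("Hello, LLM!"))
```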
We welcome contributions to MC Lab! Feel free to submit issues or pull requests to improve the application.
MC Lab is released under the MIT License.
For any questions or feedback, please reach out to us at suhas@metriccoders.com.
Enjoy exploring the power of LLMs with MC Lab!