localGPT

About

LocalGPT is an open-source initiative that allows you to converse with your documents without compromising your privacy. With everything running locally, you can be assured that no data ever leaves your computer. Dive into the world of secure, local document interactions with LocalGPT.


Key Features

  • Utmost Privacy: Your data remains on your computer and is never sent to external services.
  • Versatile Model Support: Seamlessly integrate a variety of open-source models, including HuggingFace (HF), GPTQ, GGML, and GGUF formats.
  • Diverse Embeddings: Choose from a range of open-source embeddings.
  • Reuse Your LLM: Once downloaded, reuse your LLM without the need for repeated downloads.
  • Chat History: Remembers your previous conversations (within a session).
  • API: LocalGPT has an API that you can use for building RAG applications.
  • Graphical Interface: LocalGPT comes with two GUIs: one uses the API and the other is standalone (based on Streamlit).
  • GPU, CPU, HPU & MPS Support: Works on multiple platforms out of the box; chat with your data using CUDA, CPU, HPU (Intel® Gaudi®), or MPS.

How It Works

LocalGPT uses a combination of local models and LangChain to run the entire RAG pipeline locally, ensuring data privacy and reasonable performance. The ingest.py script uses LangChain tools to parse the documents and create embeddings locally using InstructorEmbeddings; the result is stored in a local Chroma vector store. The run_localGPT.py script uses a local LLM (Large Language Model) to understand questions and generate answers. The context for the answers is retrieved from the local vector store using a similarity search that locates the relevant passages in the documents. LocalGPT also supports integration with other LLMs from HuggingFace.
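
The sketch below illustrates the ingestion and retrieval flow described above. It is a minimal, illustrative Python example, assuming LangChain, InstructorEmbedding, and Chroma are installed; the file path, chunking parameters, and embedding model name are placeholder assumptions rather than localGPT's exact defaults.

# Minimal sketch of the local RAG flow: parse, embed, store, retrieve.
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceInstructEmbeddings
from langchain.vectorstores import Chroma

# 1. Parse a document and split it into chunks (ingest.py does this with LangChain loaders).
docs = PyPDFLoader("SOURCE_DOCUMENTS/example.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200).split_documents(docs)

# 2. Create embeddings locally with an Instructor model and persist them to a Chroma store.
embeddings = HuggingFaceInstructEmbeddings(model_name="hkunlp/instructor-large")
db = Chroma.from_documents(chunks, embeddings, persist_directory="DB")

# 3. At question time, a similarity search retrieves the relevant context,
#    which run_localGPT.py then passes to a local LLM to generate the answer.
context = db.similarity_search("What does the document say about privacy?", k=4)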

Use Cases

LocalGPT can be used in various scenarios, including:

  • Secure document interactions
  • Natural language processing tasks
  • Building RAG (Retrieval-Augmented Generation) applications

Technical Requirements

To set up LocalGPT, follow these steps:

  1. Clone the repository using git:
     git clone https://github.com/PromtEngineer/localGPT.git
  2. Install conda for virtual environment management. Create and activate a new virtual environment:
     conda create -n localGPT python=3.10.0
     conda activate localGPT
  3. Install the dependencies using pip:
     pip install -r requirements.txt
  4. Install LLAMA-CPP for GGML and GGUF models. Refer to the provided instructions for installation details.
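
Once the environment is set up, the typical workflow follows the pipeline described in How It Works: place your documents in the project's source documents folder, run the ingest.py script to build the local vector store, and then run the run_localGPT.py script to ask questions about them.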

Benefits

  • Ensures utmost privacy by keeping all data on the user's computer
  • Provides versatile model support with integration options for various open-source models
  • Offers diverse embeddings to choose from
  • Enables reuse of downloaded LLMs without repeated downloads
  • Remembers chat history for seamless conversations
  • Provides an API for building RAG applications
  • Offers a graphical interface with two GUI options
  • Supports multiple platforms, including GPU, CPU, HPU (Intel® Gaudi®), and MPS

Conclusion

LocalGPT is a powerful open-source tool that allows secure and local document interactions while maintaining data privacy. With its versatile model support, diverse embeddings, and support for multiple platforms, LocalGPT is a valuable asset for natural language processing tasks and building RAG applications.