
LLM Flutter Application

This is a Flutter application that uses the llama.cpp library to run large language models (LLMs) offline.

Features

  • Offline Model Execution: The application runs LLMs entirely on-device, making it ideal for environments with limited or no internet connectivity.
  • Cross-Platform: Built with Flutter, the application can be compiled for multiple platforms (see Supported Platforms below).
  • Efficient Performance: llama.cpp provides efficient on-device inference of large models.

Demo

Demo screenshots: Images 1–4.

Getting Started

To get started with this project, clone the repository and navigate to the project directory.

git clone https://github.com/mizy/local-agent-chat.git
cd local-agent-chat
git submodule update --init --recursive

Supported Platforms

  • macOS
  • Linux
  • iOS

Building the Project

To build the project, run flutter build with your target platform as a subcommand, for example:

flutter build macos
flutter build linux
flutter build ios

This generates a release build for the specified platform.

Running the Project

To run the project, use the following command:

flutter run

Adding a New Prompt Format

To add a new prompt format, add an entry to antiprompt_map in llama_cpp_dart/src/llm.cpp.
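The sketch below is a hedged illustration only, assuming antiprompt_map maps a format name to the antiprompt (stop) strings for that template; the actual entry shape, and the existing entries shown here, should be checked against llm.cpp. The "myformat" name and its stop string are hypothetical placeholders.

#include <map>
#include <string>
#include <vector>

// Each prompt format maps to the antiprompt (stop) strings that mark
// the end of the model's turn during generation.
std::map<std::string, std::vector<std::string>> antiprompt_map = {
    {"alpaca", {"### Instruction:"}},
    {"chatml", {"<|im_end|>"}},
    // Hypothetical new format: replace the name and stop string
    // with the ones your template actually uses.
    {"myformat", {"<|user|>"}},
};

After rebuilding the native library, the new format name should then be selectable wherever the app chooses a prompt format.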

Contributing

Contributions are welcome!

License

This project is licensed under the terms of the MIT license.
