Meltdown: Interface for llama.cpp and ChatGPT

I’m afraid that linking to something I’ve been working on for a while as my first post might not be well received, but I think you might find it interesting nonetheless.

I’m making a text interface to chat with local and remote models. It is written entirely in Python and uses tkinter/Tcl, which should be bundled with a normal Python installation.
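
As a quick illustration of the stack (this is just a minimal sketch, not Meltdown’s own code): tkinter ships with a standard CPython install, so a window can be opened with nothing but the standard library.

```python
# Minimal sketch, not Meltdown itself: shows that tkinter/Tcl comes bundled
# with a standard CPython install, so no extra GUI dependencies are needed.
import tkinter as tk

root = tk.Tk()
root.title("tkinter check")
tk.Label(root, text="tkinter/Tcl is bundled with a normal Python install").pack(padx=20, pady=20)
root.mainloop()
```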

I made it because I wasn’t able to find an interface that felt right to me, though I didn’t try them all. I like adding “power user” features whenever I think of one.

You can see a video demo I made here: https://meltdown.merkoba.com/

Some features:

  • Load llama.cpp models (only GGUF tested so far); see the first sketch after this list.

  • Use your ChatGPT API key with a specific OpenAI model; see the second sketch after this list.

  • Model configuration tweaks like temperature, top-p, etc.

  • Sessions with conversations spread across tabs; these can be saved and loaded.

  • Configurations can be saved and loaded.

  • Markdown support, including syntax highlighting for code snippets.

  • Click, right-click, or double-click words to copy, explain, or search them, or to open a new conversation.

  • Dark and light themes available.

  • Commands with tab completion and a similarity check.

  • Command-line arguments to configure how the program works.

  • Saved context to use with the models.

  • Save logs as either JSON or plain text.

  • Run a command upon saving a log, like opening it with a text editor.

  • Compact mode which hides some panels.

  • Scrollable panel to pack more configs.

  • Automatically prepend and append text to your prompt.

  • Close tabs in different ways: old, others, all, etc.

  • Display CPU, RAM, and temperature; clicking these opens a task manager. This can be expanded to work on more systems.

  • Input history to go back to previous prompts using the up/down arrows, buttons, or the mouse wheel.

  • Keyboard shortcuts to perform various actions.

  • Variables to use for the system, for example @name_user, @name_ai, and @date (a substitution sketch follows this list).

  • Responses are streamed live.
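
To make the local-model bullets above more concrete, here is a minimal sketch (not Meltdown’s actual code) using the third-party llama-cpp-python bindings to load a GGUF file and stream a reply with temperature and top-p set; the model path is hypothetical.

```python
# Minimal sketch, not Meltdown's code: load a GGUF model via the
# llama-cpp-python bindings and stream a reply with sampling tweaks.
from llama_cpp import Llama

llm = Llama(model_path="models/example.gguf", n_ctx=2048)  # hypothetical path

stream = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    temperature=0.7,
    top_p=0.9,
    stream=True,  # yields the response chunk by chunk
)

for chunk in stream:
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="", flush=True)
print()
```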
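
And the remote side, sketched with the official openai Python package (again, not Meltdown’s code; the model name is just an example that your key may or may not have access to):

```python
# Minimal sketch, not Meltdown's code: call the OpenAI API with your own
# key and a specific model, streaming the reply as it is generated.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # your OpenAI/ChatGPT API key

stream = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; use one your key can access
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.7,
    top_p=0.9,
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="", flush=True)
print()
```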
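
The variables bullet boils down to simple text substitution, something along these lines (a hypothetical sketch, not the actual implementation):

```python
# Hypothetical sketch of @variable substitution; Meltdown's real
# implementation may differ in its details.
from datetime import date

def expand_variables(text: str, name_user: str, name_ai: str) -> str:
    # Replace the documented placeholders with their current values.
    return (
        text.replace("@name_user", name_user)
            .replace("@name_ai", name_ai)
            .replace("@date", date.today().isoformat())
    )

print(expand_variables("Today is @date. @name_ai, please greet @name_user.", "Alice", "Melt"))
```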

I don’t know if this works on systems other than mine, but you are encouraged to try.
