Generate text using local LLMs with customizable prompts
Install with your favorite plugin manager, e.g. lazy.nvim
Example with Lazy
```lua
-- Custom Parameters (with defaults)
{
  "MHD-GDev/genlms.nvim",
  dependencies = {
    "nvim-lualine/lualine.nvim",
  },
  config = function()
    require("genlms").setup({
      quit_map = "q",
      retry_map = "<c-r>",
      accept_map = "<c-cr>",
      host = "localhost",
      port = "1234",
      display_mode = "split",
      show_prompt = true,
      show_model = true,
      no_auto_close = false,
      json_response = true,
      result_filetype = "markdown",
      debug = false,
    })
    -- Key mappings (change as you like)
    vim.keymap.set({ "n", "v" }, "<leader>]", ":Genlms<CR>")
    vim.keymap.set("n", "<leader>ga", "<CMD>Genlms Ask<CR>", { noremap = true })
    vim.keymap.set("n", "<leader>gc", "<CMD>Genlms Chat<CR>", { noremap = true })
    vim.keymap.set("n", "<leader>gg", "<CMD>Genlms Generate<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>gD", ":'<,'>Genlms Document_Code<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>gC", ":'<,'>Genlms Change<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>ge", ":'<,'>Genlms Enhance_Code<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>gR", ":'<,'>Genlms Review_Code<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>gs", ":'<,'>Genlms Summarize<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>ga", ":'<,'>Genlms Ask<CR>", { noremap = true })
    vim.keymap.set("v", "<leader>gx", ":'<,'>Genlms Fix_Code<CR>", { noremap = true })
    vim.keymap.set("n", "<leader>gl", "<CMD>GenLoadModel<CR>", { noremap = true })
    vim.keymap.set("n", "<leader>gu", "<CMD>GenUnloadModel<CR>", { noremap = true })
  end,
},
```
Use the `:Genlms` command to generate text based on predefined and customizable prompts.
Example key maps:
```lua
vim.keymap.set({ 'n', 'v' }, '<leader>]', ':Genlms<CR>')
```
You can also directly invoke it with one of the predefined prompts or your custom prompts:
```lua
vim.keymap.set('v', '<leader>]', ':Genlms Enhance_Grammar_Spelling<CR>')
```
After a conversation begins, the entire context is sent to the LLM. That allows you to ask follow-up questions with `:Genlms Chat`. Once the window is closed, you start with a fresh conversation.
For prompts that don't automatically replace the previously selected text (`replace = false`), you can replace the selected text with the generated output using `<c-cr>` (the `accept_map`).
To use genlms you first need to load a model; models are loaded and unloaded with the `:GenLoadModel` and `:GenUnloadModel` commands.
- You can download models from Hugging Face.
All prompts are defined in `require('genlms').prompts`; you can enhance or modify them.
Example:
```lua
require('genlms').prompts['Elaborate_Text'] = {
  prompt = "Elaborate the following text:\n$text",
  replace = true,
}
require('genlms').prompts['Fix_Code'] = {
  prompt = "Fix the following code. Only output the result in format ```$filetype\n...\n```:\n```$filetype\n$text\n```",
  replace = true,
  extract = "```$filetype\n(.-)```",
}
```
You can use the following properties per prompt:
- `prompt`: (string | function) Prompt either as a string or a function which should return a string. The result can use the following placeholders:
  - `$text`: Visually selected text or the content of the current buffer
  - `$filetype`: File type of the buffer (e.g. `javascript`)
  - `$input`: Additional user input
  - `$register`: Value of the unnamed register (yanked text)
- `replace`: `true` if the selected text shall be replaced with the generated output
- `extract`: Regular expression used to extract the generated result
- `model`: The model to use, default: `local-model`
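Putting these properties together, a custom prompt might use a function-valued `prompt` along with the `$input` and `$text` placeholders. This is a minimal sketch; the prompt name `Translate` is a hypothetical example, not one of the plugin's predefined prompts:

```lua
-- Hypothetical custom prompt: $input asks the user for a target
-- language, $text is replaced by the visual selection or buffer.
require('genlms').prompts['Translate'] = {
  prompt = function()
    return "Translate the following text to $input:\n$text"
  end,
  replace = false, -- show the result instead of replacing the selection
}
```

With this defined, `:'<,'>Genlms Translate` on a visual selection would prompt for additional input before sending the request.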
User selections can be delegated to Telescope with `telescope-ui-select`.
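A minimal sketch of that setup, assuming `telescope.nvim` and `telescope-ui-select.nvim` are installed, follows the standard `telescope-ui-select` configuration:

```lua
-- Route vim.ui.select (used for selection menus) through Telescope.
require('telescope').setup({
  extensions = {
    ['ui-select'] = require('telescope.themes').get_dropdown({}),
  },
})
require('telescope').load_extension('ui-select')
```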
- This project was inspired by gen.nvim, which laid the foundation for this version.
- For more information, see the original project.
- We would like to express our gratitude to the original authors and contributors who made this project possible.
