A starter template for LazyVim. Refer to the documentation to get started.
There are two approaches to installing the prerequisites. You may pick either one, or a combination of both.
Please see the installation requirements for LazyVim here. For installing Neovim (version 0.9.0 or later), refer to their installation guide.
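To confirm that the version requirement is met, you can check your Neovim version from the command line (a quick sanity check, not part of the official guide):

```sh
nvim --version | head -n 1
# Prints something like: NVIM v0.9.5 (any version >= 0.9.0 works)
```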
The link above mentions lazygit as an optional requirement. Since it makes working
with Git in Neovim so much easier and more joyful, I'd recommend installing it, be it manually
or as a Nix package (as part of your Nix set-up).
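If you go the Nix route, a one-off install could look like this (a sketch only; the right approach depends on whether you use nix-env, flakes, or Home Manager):

```sh
# Try lazygit in a throwaway shell first
nix-shell -p lazygit --run lazygit

# Or install it imperatively from nixpkgs (assumes the nixpkgs channel)
nix-env -iA nixpkgs.lazygit
```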
- [Optional: for Scala] Neovim will configure `metals` for you, but you will have to manually install `metals` when on a `*.scala` file with `:MetalsInstall`.
- [Optional: for Protocol Buffers] Install the protols plugin. Make sure that you have `cargo` version `1.88.0` installed. This plugin requires `edition 2024`; I used rustup's nightly toolchain.
- [Optional: for Go] Install the Go CI linter to avoid annoying warnings:

```sh
go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest
```

Nix is a package management tool for reproducible development environments.
Make sure that you have Nix installed by following the installation steps here.
Clone this repository and follow the README steps in its nix directory.
For the most basic Neovim setup, you may run nvim inside a nix-shell like so:

```sh
~/$PATH_TO_CLONED_REPO/bash-utils/nix/dynamic-nix-shell.sh nvim
```

You can then add more modules after `nvim` in the previous command, such as `python`, depending on your current project (see the sketch below).
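For instance, opening Neovim with Python tooling on top could look like this (a hypothetical invocation; the available module names depend on what dynamic-nix-shell.sh supports):

```sh
# nvim plus a python module in one reproducible shell ("python" module name assumed)
~/$PATH_TO_CLONED_REPO/bash-utils/nix/dynamic-nix-shell.sh nvim python
```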
Make sure that you have your prerequisites in place and then follow the steps from this page.
This LazyVim configuration integrates with CodeCompanion.nvim to provide AI-powered coding assistance. By default, it is configured to use Ollama with the qwen3:14b model, which is a cost-efficient (free) solution for local AI inference.
To use CodeCompanion with Ollama, you need to install Ollama and download the qwen3:14b model.
For Linux (Debian-based):
- Install Ollama:

  ```sh
  curl -fsSL https://ollama.com/install.sh | sh
  ```

- Download the qwen3 model:

  ```sh
  ollama pull qwen3:14b
  ```
For macOS:
- Install Ollama: Download the macOS application from ollama.com/download and follow the installation instructions.

- Download the qwen3 model:

  ```sh
  ollama pull qwen3:14b
  ```
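After installation, you can verify that the model is available locally (an optional sanity check using standard Ollama commands):

```sh
# The model should appear in the list of locally available models
ollama list

# Optionally, start an interactive session to confirm the model loads
ollama run qwen3:14b
```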
The default model for CodeCompanion is qwen3:14b due to its cost-effectiveness. However, you can easily switch to other models supported by Ollama or other adapters (e.g., OpenAI, DeepSeek, Anthropic) by modifying the lua/plugins/codecompanion.lua file.
To change the default model for Ollama, locate the ollama_qwen adapter definition and modify the model field:
```lua
ollama_qwen = function()
  return require("codecompanion.adapters").extend("ollama", {
    name = "ollama_qwen",
    schema = {
      model = {
        default = "qwen3:14b", -- Change 'qwen3:14b' to your desired Ollama model
      },
    },
  })
end,
```

To switch to a different adapter (e.g., OpenAI), you would modify the `strategies` section:
```lua
strategies = {
  chat = { adapter = "openai" }, -- Change 'ollama_qwen' to 'openai' or another configured adapter
  inline = { adapter = "openai" },
  agent = { adapter = "deepseek" },
},
```

Remember to set the corresponding API key as an environment variable if you are using a paid service like OpenAI, DeepSeek, or Anthropic. For example, for OpenAI, set `OPENAI_API_KEY`.
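A minimal way to provide the key is to export it in your shell profile (a sketch; the placeholder value must be replaced with your actual key):

```sh
# In ~/.bashrc, ~/.zshrc, or similar; "sk-..." is a placeholder
export OPENAI_API_KEY="sk-..."
```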