[SIGIR'24] The official implementation code of MOELoRA.
Repository for Chat LLaMA - training a LoRA for the LLaMA (1 or 2) models on HuggingFace with 8-bit or 4-bit quantization. Research only.
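For context, a minimal hedged sketch of the kind of 4-bit setup such a repo uses, via HuggingFace transformers, bitsandbytes, and peft; the model id and hyperparameters below are placeholders, not this repository's actual code:

```python
# Illustrative sketch: loading a LLaMA-family model in 4-bit for LoRA training.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize base weights to 4-bit NF4
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # run matmuls in bf16
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",             # placeholder model id
    quantization_config=bnb_config,
)
# Casts layer norms to fp32 and enables input gradients so LoRA can train on top.
model = prepare_model_for_kbit_training(model)
```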
A generalized framework for subspace tuning methods in parameter efficient fine-tuning.
LoRA: Low-Rank Adaptation of Large Language Models, implemented in PyTorch
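Conceptually, such an implementation reduces to a frozen linear layer plus a trainable low-rank residual. A minimal sketch (names and initialization scale are illustrative, not this repo's code):

```python
# Minimal LoRA-augmented linear layer (illustrative sketch).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.requires_grad_(False)  # freeze the pretrained weights
        # Low-rank factors: B starts at zero, so the adapter is a no-op at init.
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # y = W0 x + (alpha / r) * B A x
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)
```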
The official implementation for MTLoRA: A Low-Rank Adaptation Approach for Efficient Multi-Task Learning (CVPR '24)
Easy wrapper for inserting LoRA layers in CLIP.
GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
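GaLore's core idea is different from adapter methods: the optimizer states live in a low-rank subspace of the gradient rather than in extra weight matrices. A heavily simplified, hedged sketch of the projection step (the actual repo implements this inside a custom optimizer):

```python
# Simplified sketch of gradient low-rank projection (not the repo's optimizer).
import torch

def refresh_projection(grad: torch.Tensor, r: int) -> torch.Tensor:
    # Recompute the subspace every T steps from the current gradient's SVD.
    U, S, Vh = torch.linalg.svd(grad, full_matrices=False)
    return U[:, :r]                      # top-r left singular vectors

def galore_sgd_step(weight, grad, P, lr=1e-3):
    low_rank_grad = P.T @ grad           # (r, in): optimizer state stays this small
    # ...Adam moments etc. would be kept at this reduced shape...
    weight -= lr * (P @ low_rank_grad)   # project the update back to full size
```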
Fine-tuning Mistral-7B with PEFT (Parameter-Efficient Fine-Tuning) and LoRA (Low-Rank Adaptation) on the Puffin dataset (multi-turn conversations between GPT-4 and real humans)
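A hedged sketch of the typical PEFT + LoRA wiring for a model like this (model id and hyperparameters are illustrative):

```python
# Attaching LoRA adapters with HuggingFace PEFT (illustrative settings).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # adapt the attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically well under 1% of all parameters
```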
Over 60 figures and diagrams of LLMs, quantization, low-rank adapters (LoRA), and chat templates FREE TO USE in your blog posts, slides, presentations, or papers.
This repository contains the lab work for Coursera course on "Generative AI with Large Language Models".
A curated list of Parameter Efficient Fine-tuning papers with a TL;DR
Efficiently fine-tuned large language model (LLM) for sentiment analysis on the IMDB dataset.
Advanced AI-driven tool for generating unique video game characters using Stable Diffusion, DreamBooth, and LoRA adaptations. Enhances creativity with customizable, high-quality character designs tailored specifically for game developers and artists.
A simple, neat implementation of different LoRA methods for training/fine-tuning Transformer-based models (e.g., BERT, GPTs). [Research purpose]
Long-term project on a custom AI architecture, consisting of cutting-edge machine-learning techniques such as FlashAttention, Grouped-Query Attention, ZeRO-Infinity, BitNet, etc.
EDoRA: Efficient Weight-Decomposed Low-Rank Adaptation via Singular Value Decomposition
Fine-tuning an MLP with LoRA and DoRA on the CIFAR-10 dataset
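For reference, DoRA decomposes the adapted weight into a trainable magnitude and a unit-norm direction built from the LoRA update. A minimal hedged sketch of one such layer (initialization details simplified, not this repo's code):

```python
# DoRA-style linear layer: magnitude-direction decomposition over a LoRA update.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Frozen pretrained weight (randomly initialized here for illustration).
        self.weight = nn.Parameter(
            torch.empty(out_features, in_features), requires_grad=False
        )
        nn.init.kaiming_uniform_(self.weight)
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r
        # Trainable magnitude, initialized to the row-wise norm of the frozen weight.
        self.m = nn.Parameter(self.weight.norm(dim=1, keepdim=True))

    def forward(self, x):
        merged = self.weight + self.scaling * (self.B @ self.A)  # W0 + BA
        direction = merged / merged.norm(dim=1, keepdim=True)    # unit-norm rows
        return F.linear(x, self.m * direction)                   # rescale by magnitude
```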
Machine Learning Notes 2025 (LoRA, etc.)