
Task_3 #24

Open

Description

@Aayushkanjani
  1. Define the Llama2-7B-chat-hf model.
  2. Load the preprocessed dataset.
  3. Tune the training parameters and the supervised fine-tuning parameters (use 4-bit quantization); a minimal sketch follows this list.
  4. Push your notebook to the Task3_solution folder, named Your_rollno.ipynb if you are from IIIT A; if you are from a different college, name it IITBHU(YOURNAME).ipynb.
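
Below is a minimal sketch of how steps 1-3 could be wired together with `transformers`, `bitsandbytes`, `peft`, and `trl` (plus `accelerate` for `device_map="auto"`). The model id, the dataset path, the `text` column name, and every hyperparameter are assumptions, not requirements of this task; the `SFTTrainer` keyword arguments shown match older trl releases (newer releases move some of them into `SFTConfig`).

```python
# Sketch only: QLoRA-style supervised fine-tuning of Llama2-7B-chat-hf with 4-bit quantization.
import torch
from datasets import load_from_disk
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig
from trl import SFTTrainer

# Assumption: any Llama-2-7b-chat-hf checkpoint works; the official meta-llama repo is gated.
model_name = "NousResearch/Llama-2-7b-chat-hf"

# 1. Define the model, loaded in 4-bit (NF4) to fit a single consumer GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
    bnb_4bit_use_double_quant=False,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name, quantization_config=bnb_config, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # Llama2 has no pad token by default

# 2. Load the preprocessed dataset (placeholder path, saved earlier with save_to_disk).
dataset = load_from_disk("path/to/preprocessed_dataset")

# 3. SFT parameters: LoRA adapters on top of the frozen 4-bit base model.
peft_config = LoraConfig(r=64, lora_alpha=16, lora_dropout=0.1, bias="none", task_type="CAUSAL_LM")

training_args = TrainingArguments(
    output_dir="./results",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=1,
    learning_rate=2e-4,
    num_train_epochs=1,
    fp16=True,
    logging_steps=25,
)

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",   # assumption: the preprocessed split exposes a "text" column
    max_seq_length=512,
    tokenizer=tokenizer,
    args=training_args,
)
trainer.train()
trainer.model.save_pretrained("./llama2-7b-chat-sft-adapter")  # saves only the LoRA adapter
```

Saving just the LoRA adapter keeps the notebook artifacts small; merging it back into the base model is optional and not required by this task.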

--> Please create a PR following the mentioned template only.
--> This is a competitive issue, so only the best PR will be merged.
--> The deadline for the task is 27 December at 11:59 pm; the best PR will be merged on 27 December after a thorough review of all the notebooks.

Metadata

    Labels

    Points: 50; competitive (for competitive issues, only top PRs need to be accepted)
