Slides and Materials for a workshop on NLP, Transformers, LLMs and Agents. Taught at the University of Hildesheim, February 2025

Natural Language Processing with Large Language Models: From Bag of Words to Agents

These are the materials for a workshop on Natural Language Processing (NLP) with Large Language Models (LLMs). The workshop introduces NLP step by step, covering bag-of-words models, static embeddings, contextualized embeddings/Transformers, and classification tasks. It then turns to Large Language Models: how to deploy them locally with the Ollama framework, and how to use agents for more complex tasks, including Retrieval Augmented Generation (RAG).
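As a small taste of the first topic, a bag-of-words representation can be sketched in a few lines of plain Python (the function name and example documents here are illustrative, not taken from the workshop notebooks):

```python
from collections import Counter

def bag_of_words(docs):
    """Build a shared vocabulary and a count vector for each document."""
    tokenized = [doc.lower().split() for doc in docs]
    # Vocabulary: all unique tokens across the corpus, in sorted order
    vocab = sorted({tok for doc in tokenized for tok in doc})
    # One vector per document: token counts aligned with the vocabulary
    vectors = [[Counter(doc)[word] for word in vocab] for doc in tokenized]
    return vocab, vectors

docs = ["the cat sat", "the cat and the dog"]
vocab, vectors = bag_of_words(docs)
print(vocab)    # ['and', 'cat', 'dog', 'sat', 'the']
print(vectors)  # [[0, 1, 0, 1, 1], [1, 1, 1, 0, 2]]
```

In practice, a library implementation such as scikit-learn's `CountVectorizer` would be used instead; this sketch only illustrates the idea of mapping texts to count vectors over a fixed vocabulary.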

The workshop presentations are accompanied by coding notebooks with practical implementations and exercises. While the presentations and the accompanying slides are in German, all the other materials are in English.

To prepare for the workshop, please refer to the setup. You will need a suitable Python environment, accounts for OpenAI and Hugging Face, as well as an Ollama installation.

Note that, due to time constraints, we will not explicitly look at fine-tuning a model yourself. For this, I refer you to a detailed article by Towards Data Science and excellent tutorials by Hauke Licht on how to train a model and how to evaluate the changes introduced by fine-tuning. If you are looking to fine-tune LLMs, the Ludwig framework may also be helpful.

The workshop is held on the 27th and 28th of February 2025 at the University of Hildesheim.
