# LLM examples

This repository contains examples of LLM deployments on Aalto University premises. Quick code sketches for some of the workflows below are included at the end of this README.

## Use local models

### Huggingface transformers

See this document.

### Server via vLLM

See this document.

### Batch inference via vLLM

See this document.

### Chat with your PDF via LangChain (RAG)

See this document.

## Use Aalto's open-source LLM API

See this document.
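## Quick sketches

For orientation, here is a minimal local-inference sketch using Hugging Face transformers. The model name is only a small placeholder; see the linked document above for the Aalto-specific setup.

```python
# Minimal local text-generation sketch with Hugging Face transformers.
# "gpt2" is only a small placeholder model; substitute the model you actually want to run.
from transformers import pipeline

pipe = pipeline("text-generation", model="gpt2")

result = pipe("Large language models are", max_new_tokens=30)
print(result[0]["generated_text"])
```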
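A vLLM server exposes an OpenAI-compatible HTTP API, so once a server is running it can be queried with the standard `openai` client. The address, port, and model name below are placeholders for your own deployment.

```python
# Sketch of querying a running vLLM server through its OpenAI-compatible API.
# Assumes a server was started elsewhere, e.g.:
#   vllm serve <model-name>
# The base_url and model name below are placeholders for your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's default listen address and port
    api_key="EMPTY",  # vLLM accepts any key unless one is configured
)

response = client.chat.completions.create(
    model="<model-name>",  # must match the model the server was started with
    messages=[{"role": "user", "content": "Say hello in Finnish."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

If Aalto's open-source LLM API is OpenAI-compatible (check the linked document), the same client pattern applies with a different `base_url` and API key.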
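For batch inference without a server, vLLM's offline `LLM` class generates completions for a whole list of prompts in one call. The model name is again a placeholder.

```python
# Sketch of offline batch inference with vLLM (no server involved).
# The model name is a placeholder; pick one that fits the available GPU memory.
from vllm import LLM, SamplingParams

llm = LLM(model="<model-name>")
params = SamplingParams(temperature=0.7, max_tokens=64)

prompts = [
    "Explain retrieval-augmented generation in one sentence.",
    "What is Aalto University known for?",
]
outputs = llm.generate(prompts, params)

for out in outputs:
    print(out.prompt, "->", out.outputs[0].text)
```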