In this lesson you'll learn how to apply AI patterns to solve real problems. This is where concepts become solutions. You'll build applications that understand meaning, ground responses in your data, and process documents intelligently.
- Understand how embeddings represent meaning as numbers
- Build semantic search that understands intent, not just keywords
- Implement Retrieval-Augmented Generation (RAG) to ground AI responses in your data
- Create applications that process and understand documents and images
- Know when to use each pattern and how to combine them
This lesson is divided into five parts:
1. Understand how AI represents meaning as vectors and build search that finds by intent.
2. Ground AI responses in your own documents and data.
3. Process images, PDFs, and visual content with multimodal AI.
4. Build sophisticated applications that combine multiple patterns.
5. Run AI models locally using AI Toolkit, Docker Model Runner, and Foundry Local.
In the previous lesson, you learned the core techniques: chat, streaming, function calling, and middleware. But techniques alone don't solve problems.
Patterns are proven combinations of techniques that solve specific types of problems:
| Pattern | Problem It Solves |
|---|---|
| Semantic Search | "Find things by meaning, not keywords" |
| RAG | "Answer questions using my specific data" |
| Vision Processing | "Understand and extract information from images" |
| Document Understanding | "Process and analyze document content" |
| Local Model Runners | "Run AI privately and offline on my own hardware" |
This lesson teaches you to recognize problems and apply the right pattern.
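The first two patterns rest on a single idea: an embedding model turns text into a vector of numbers, and vector math then measures meaning. As a minimal illustration (plain C# with toy vectors, independent of any particular embedding provider), cosine similarity is the standard way two embedding vectors are compared:

```csharp
using System;

// Cosine similarity measures the angle between two vectors.
// For embeddings, values near 1.0 mean the underlying texts are
// semantically close; values near 0.0 mean they are unrelated.
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, magA = 0, magB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot  += a[i] * b[i];
        magA += a[i] * a[i];
        magB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
}

// Toy 3-dimensional "embeddings" for illustration only;
// real models produce vectors with hundreds or thousands of dimensions.
float[] cat    = { 0.9f, 0.1f, 0.0f };
float[] kitten = { 0.8f, 0.2f, 0.1f };

Console.WriteLine(CosineSimilarity(cat, kitten)); // close to 1.0
```

Semantic search and RAG both reduce to this comparison: embed the query, embed the documents, and rank by similarity rather than by keyword overlap.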
All code samples for this lesson are located in the `samples/CoreSamples/` directory:
| Category | Sample | Description |
|---|---|---|
| Embeddings & RAG | RAGSimple-02MEAIVectorsMemory | In-memory vector store |
| | RAGSimple-03MEAIVectorsAISearch | Azure AI Search |
| | RAGSimple-04MEAIVectorsQdrant | Qdrant vector store |
| Vision | Vision-01MEAI-AzureOpenAI | Vision with Azure OpenAI |
| | Vision-02MEAI-Ollama | Local vision with Ollama |
| | Vision-03MEAI-AOAI | Vision with Azure OpenAI |
| Documents | OpenAI-FileProcessing-Pdf-01 | PDF document processing |
| Local Models | AIToolkit-02-MEAI-Chat | AI Toolkit with MEAI |
| | DockerModels-02-MEAI-Chat | Docker Model Runner with MEAI |
| | AIFoundryLocal-01-MEAI-Chat | Foundry Local with MEAI |
Each lesson document links directly to the relevant samples.
Start with understanding how AI represents meaning:
Continue to Part 1: Embeddings and Semantic Search →
Once you complete all parts of this lesson, you'll be ready for AI Agents in Lesson 4:
- Building autonomous agents that make decisions
- Multi-agent orchestration
- Agent tools and plugins
Continue to Lesson 4: AI Agents →
- Microsoft.Extensions.VectorData Documentation: Working with vector databases in .NET
- Build a Vector Search App: End-to-end quickstart for semantic search
- Embeddings Explained: How AI represents meaning as numbers