LLM Basics
- Course code: GK840035
- Duration: 2 days
Other delivery methods
Method
This course is available in the following formats:
- Class Connect: connect to a class in HD
- Classroom training: classroom learning
- On-site at the customer
- Virtual learning
Request this course in a different delivery format.
Course description
Dates
- Method: Classroom training
  Date: 12-13 January 2026 | 09:30 to 17:00
  Location: Eindhoven (Evoluon, Noord Brabantlaan 1) (W. Europe)
  Language: Dutch
- Method: Virtual learning
  Date: 12-13 January 2026 | 09:30 to 17:00
  Location: Virtual-and-classroom (W. Europe)
  Language: Dutch
- Method: Virtual learning
  Date: 15-16 January 2026 | 09:30 to 17:00
  Location: Virtual-and-classroom (W. Europe)
  Language: English
- Method: Virtual learning
  Date: 15-16 January 2026 | 10:30 to 18:00
  Location: Virtual-and-classroom (W. Europe)
  Language: English
- Method: Classroom training
  Date: 16-17 February 2026 | 09:30 to 17:00
  Location: Groningen/Paterswolde (Groningerweg 19) (W. Europe)
  Language: Dutch
- Method: Virtual learning
  Date: 16-17 February 2026 | 09:30 to 17:00
  Location: Virtual-and-classroom (W. Europe)
  Language: Dutch
Target audience
- AI/ML enthusiasts interested in learning about NLP (Natural Language Processing) and Large Language Models (LLMs)
- Data scientists and engineers interested in using LLMs for inference and fine-tuning
- Software developers wanting basic practical experience with NLP frameworks and LLMs
- Students and professionals curious about the basics of transformers and how they power AI models
Course objectives
Working in an engaging, hands-on learning environment and guided by an expert instructor, students will learn the basics of Large Language Models (LLMs) and how to use them for inference to build AI-powered applications.
- Understand the basics of Natural Language Processing
- Implement text preprocessing and tokenization techniques using NLTK
- Explain word embeddings and the evolution of language models
- Use RNNs and LSTMs for handling sequential data
- Describe what transformers are and use key models like BERT and GPT
- Understand the risks and limitations of LLMs
- Use pre-trained models from Hugging Face to implement NLP tasks
- Understand the basics of Retrieval-Augmented Generation (RAG) systems
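As a taste of the tokenization objective above, here is a minimal NLTK sketch. It uses the rule-based Treebank tokenizer (chosen here because it needs no model downloads; the exact tools and exercises used in class may differ):

```python
# Minimal tokenization sketch with NLTK's rule-based Treebank tokenizer.
from nltk.tokenize import TreebankWordTokenizer

tokenizer = TreebankWordTokenizer()
text = "Large Language Models power modern NLP applications."
tokens = tokenizer.tokenize(text)
print(tokens)
# ['Large', 'Language', 'Models', 'power', 'modern', 'NLP', 'applications', '.']
```

Note how punctuation becomes its own token; neural tokenizers covered later in the course (e.g. subword tokenizers in Hugging Face) split text differently.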
Course content
1) Introduction to NLP
- What is NLP?
- NLP Basics: Text Preprocessing and Tokenization
- NLP Basics: Word Embeddings
- Introducing Traditional NLP Libraries
- A brief history of modeling language
- Introducing PyTorch and Hugging Face for Text Preprocessing
- Neural Networks and Text Data
- Building Language Models using RNNs and LSTMs
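The "language models using RNNs and LSTMs" topic above can be sketched in a few lines of PyTorch: embed token ids, run them through an LSTM, and project each hidden state to vocabulary logits. The sizes below are illustrative, not the ones used in class:

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    # Minimal RNN-based language model: embed tokens, run an LSTM,
    # and project hidden states back to vocabulary logits.
    def __init__(self, vocab_size=50, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, embed_dim)
        out, _ = self.lstm(x)       # (batch, seq, hidden_dim)
        return self.head(out)       # (batch, seq, vocab_size) logits

batch = torch.randint(0, 50, (2, 8))  # two dummy sequences of 8 token ids
logits = TinyLM()(batch)
print(logits.shape)                   # torch.Size([2, 8, 50])
```

Training such a model to predict the next token is what "modeling language" means here; transformers (part 2) replace the LSTM with self-attention.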
2) Transformers and LLMs
- Introduction to Transformers
- Using Hugging Face’s Transformers for inference
- LLMs and Generative AI
- Current LLM Options
- Fine-tuning GPT
- Aligning LLMs with Human Values
- Retrieval-Augmented Generation (RAG) Systems
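The retrieval step of a RAG system, the last topic above, can be sketched in plain Python: embed the documents and the query, pick the most similar document, and pass it to the LLM as context. The bag-of-words "embedding" below is a stand-in for the neural embedding models used in practice:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real RAG systems use a neural embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    # Return the document most similar to the query.
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))

docs = [
    "Transformers use self-attention to model token relationships.",
    "Pandas is a library for tabular data analysis.",
]
context = retrieve("how does self-attention work in transformers", docs)
print(context)  # the retrieved text would be prepended to the LLM prompt
```

Swapping the toy embedding for a real one (and adding a vector index) turns this sketch into the retrieval half of a production RAG pipeline.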
Prerequisites
- Proficiency in Python programming
- Familiarity with data analysis using Pandas