LLM Basics
- Course Code: GK840035
- Duration: 2 days
Course Delivery
This course is available in the following formats:
- Company Event: an event held at your company
- Public Classroom: traditional classroom learning
- Virtual Learning: instructor-led learning delivered online
Course Overview
Virtual Learning
This interactive training can be taken from any location, whether your office or home, and is delivered live by a trainer. There are no delegates in the classroom with the instructor; all delegates are connected virtually. Virtual delegates do not travel to the course: Global Knowledge sends you all the information you need before the course starts, and you can test your login in advance.
Course Schedule
- Virtual Learning | 15-16 January, 2026, 8:30 AM to 4:00 PM | Virtual (GMT Standard Time) | English
- Virtual Learning | 15-16 January, 2026, 9:30 AM to 5:00 PM | Virtual (GMT Standard Time) | English
- Virtual Learning | 16-17 February, 2026, 8:30 AM to 4:00 PM | Virtual (GMT Standard Time) | English
- Virtual Learning | 19-20 February, 2026, 9:30 AM to 5:00 PM | Virtual (GMT Standard Time) | English
- Virtual Learning | 19-20 March, 2026, 8:30 AM to 4:00 PM | Virtual (GMT Standard Time) | English
- Virtual Learning | 19-20 March, 2026, 9:30 AM to 5:00 PM | Virtual (GMT Standard Time) | English
Target Audience
- AI/ML Enthusiasts interested in learning about Natural Language Processing (NLP) and Large Language Models (LLMs)
- Data Scientists and Engineers interested in using LLMs for inference and fine-tuning
- Software Developers seeking basic practical experience with NLP frameworks and LLMs
- Students and Professionals curious about the basics of transformers and how they power AI models
Course Objectives
Working in an engaging, hands-on learning environment and guided by an expert instructor, students will learn the basics of Large Language Models (LLMs) and how to use them for inference to build AI-powered applications.
- Understand the basics of Natural Language Processing
- Implement text preprocessing and tokenization techniques using NLTK (see the short sketch after this list)
- Explain word embeddings and the evolution of language models
- Use RNNs and LSTMs for handling sequential data
- Describe what transformers are and use key models like BERT and GPT
- Understand the risks and limitations of LLMs
- Use pre-trained models from Hugging Face to implement NLP tasks
- Understand the basics of Retrieval-Augmented Generation (RAG) systems
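As a rough preview of the kind of hands-on work these objectives describe, the following is a minimal, illustrative sketch of text preprocessing and tokenization with NLTK. The example sentence and the stop-word filtering step are assumptions chosen for illustration, not course material.

```python
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

# One-time downloads of the tokenizer models and stop-word list
# (newer NLTK releases may ask for "punkt_tab" instead of "punkt").
nltk.download("punkt")
nltk.download("stopwords")

text = "Large Language Models are transforming Natural Language Processing."

# Lowercase and split the raw string into word-level tokens.
tokens = word_tokenize(text.lower())

# Drop punctuation and common English stop words such as "are".
stop_words = set(stopwords.words("english"))
filtered = [t for t in tokens if t.isalpha() and t not in stop_words]

print(filtered)
# e.g. ['large', 'language', 'models', 'transforming', 'natural', 'language', 'processing']
```

Preprocessing of this kind typically feeds the word-embedding and language-modeling exercises listed in the course content below.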
Course Content
1) Introduction to NLP
- What is NLP?
- NLP Basics: Text Preprocessing and Tokenization
- NLP Basics: Word Embeddings
- Introducing Traditional NLP Libraries
- A brief history of modeling language
- Introducing PyTorch and HuggingFace for Text Preprocessing
- Neural Networks and Text Data
- Building Language Models using RNNs and LSTMs
2) Transformers and LLMs
- Introduction to Transformers
- Using Hugging Face’s Transformers for inference (see the sketch after this outline)
- LLMs and Generative AI
- Current LLM Options
- Fine-tuning GPT
- Aligning LLMs with Human Values
- Retrieval-Augmented Generation (RAG) Systems
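To give a flavour of the "Transformers for inference" topic above, here is a minimal sketch using Hugging Face's pipeline API. The example inputs, the library's default sentiment model, and the choice of GPT-2 for generation are assumptions made purely for illustration; the course may use different models and tasks.

```python
from transformers import pipeline

# A ready-made sentiment-analysis pipeline: the pre-trained model and its
# tokenizer are downloaded automatically on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("This course gave me a solid introduction to transformers."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# The same high-level API also covers generative models such as GPT-2.
generator = pipeline("text-generation", model="gpt2")
print(generator("Large Language Models are", max_new_tokens=20)[0]["generated_text"])
```

The pipeline abstraction hides tokenization, model loading, and decoding behind one call, which is why it is a common starting point before working with models and tokenizers directly.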
Course Prerequisites
- Proficiency in Python programming
- Familiarity with data analysis using Pandas