
LLM Basics

  • Course Code GK840035
  • Duration 2 days


Delivery Method

This course is available in the following formats:

  • Closed

  • Scheduled class (traditional classroom learning)

  • Virtual Learning

Request this course in a different delivery format.

This course provides a comprehensive introduction to Large Language Models (LLMs), focusing on what they are, how to build them using PyTorch, and how to use them for inference in language tasks. Participants will learn about the history of LLMs, how LLMs fit into the larger AI/Generative AI landscape, neural-network-based language models, and how to use RNNs, LSTMs, and transformers for natural language processing tasks.

Company Events

These events can be delivered exclusively for your company, at our locations or yours, and tailored to your delegates and needs. Company Events can be standard or customized course deliveries.


Who Should Attend

  • AI/ML enthusiasts interested in learning about NLP (Natural Language Processing) and Large Language Models (LLMs)
  • Data scientists and engineers interested in using LLMs for inference and fine-tuning
  • Software developers wanting basic practical experience with NLP frameworks and LLMs
  • Students and professionals curious about the basics of transformers and how they power AI models

Course Objectives

Working in an engaging, hands-on learning environment, guided by an expert instructor, students will learn the basics of Large Language Models (LLMs) and how to use them for inference to build AI-powered applications.

  • Understand the basics of Natural Language Processing   
  • Implement text preprocessing and tokenization techniques using NLTK   
  • Explain word embeddings and the evolution of language models   
  • Use RNNs and LSTMs for handling sequential data   
  • Describe what transformers are and use key models like BERT and GPT   
  • Understand the risks and limitations of LLMs   
  • Use pre-trained models from Hugging Face to implement NLP tasks   
  • Understand the basics of Retrieval-Augmented Generation (RAG) systems

1) Introduction to NLP   

  • What is NLP?  
  • NLP Basics: Text Preprocessing and Tokenization  
  • NLP Basics: Word Embeddings  
  • Introducing Traditional NLP Libraries  
  • A brief history of modeling language  
  • Introducing PyTorch and Hugging Face for Text Preprocessing  
  • Neural Networks and Text Data  
  • Building Language Models using RNNs and LSTMs 
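
The "Building Language Models using RNNs and LSTMs" topic can be sketched as a minimal PyTorch LSTM next-token predictor; the vocabulary size, dimensions, and toy batch below are illustrative assumptions, not course materials:

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Minimal LSTM language model: embed tokens, run an LSTM,
    and project each hidden state to next-token logits."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, embed_dim)
        out, _ = self.lstm(x)       # (batch, seq, hidden_dim)
        return self.head(out)       # (batch, seq, vocab_size) logits

model = LSTMLanguageModel()
batch = torch.randint(0, 1000, (2, 10))  # 2 toy sequences of 10 token ids
logits = model(batch)
print(logits.shape)  # torch.Size([2, 10, 1000])
```

Training such a model amounts to minimizing cross-entropy between these logits and the input sequence shifted by one position.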

2) Transformers and LLMs   

  • Introduction to Transformers  
  • Using Hugging Face’s Transformers for inference   
  • LLMs and Generative AI  
  • Current LLM Options   
  • Fine-tuning GPT  
  • Aligning LLMs with Human Values   
  • Retrieval-Augmented Generation (RAG) Systems
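
The inference topic above boils down to very little code with Hugging Face's `pipeline` API; this sketch uses the library's default sentiment-analysis checkpoint, which is downloaded on first use (the input sentence is an illustrative example):

```python
from transformers import pipeline

# Load the default sentiment-analysis model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

result = classifier("This course was a great introduction to LLMs!")[0]
print(result["label"], round(result["score"], 3))
```

The same one-line `pipeline(...)` call supports other tasks such as `"text-generation"` and `"question-answering"`, swapping in an appropriate pre-trained model.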

Prerequisites
  • Proficiency in Python programming
  • Familiarity with data analysis using Pandas