Open Source Generative AI

$2595

4 days

2024-08-20

Enroll

Essential Skills Gained

Master advanced prompt engineering.

Understand AI architecture, especially Transformers.

Write a real-world AI web application.

Describe tokenization and word embeddings.

Format

4-day course with lectures and hands-on labs.

Audience

Project Managers

Architects

Developers

Data Acquisition Specialists

Description

Learn to build practical AI applications through hands-on labs. You will design and develop Transformer models while keeping data security in mind. The course covers Transformer architectures, Python programming, hardware requirements, training techniques, and AI tasks such as classification and regression. It includes hands-on exercises with open-source LLM frameworks, advanced topics such as fine-tuning and quantization, and AI certification from Alta3 Research. Ideal for Python developers, DevSecOps engineers, and managers or directors, the course requires basic Python skills and provides access to a GPU-accelerated server for practical experience.

Download PDF ➔

Summary

  • 💻 Welcome to Alta3 Live

  • 💻 Register for Poll

Learning Your Environment

  • 💻 Using Vim

  • 💻 Tmux

  • 💻 VS Code Integration

  • 💻 Revision Control with GitHub

Deep Learning Intro

  • 💬 What is Intelligence?

  • 💬 Generative AI Unveiled

  • 💬 The Transformer Model

  • 💬 Feed Forward Neural Networks

  • 💻 Tokenization

  • 💻 Word Embeddings

  • 💻 Positional Encoding
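As a preview of the positional-encoding lab, the standard sinusoidal scheme from the original Transformer paper can be sketched in plain Python. This is an illustrative sketch, not the course's lab code:

```python
import math

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings, as in the original Transformer:

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)       # even dimensions use sine
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)  # odd dimensions use cosine
    return pe

pe = positional_encoding(seq_len=4, d_model=8)
# Position 0 encodes as alternating 0.0 (sin) and 1.0 (cos) values.
```

Each position gets a unique, bounded vector that is simply added to the token's word embedding, letting the model distinguish token order.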

Build a Transformer Model from Scratch

  • 💬 PyTorch

  • 💻 Construct a Tensor from a Dataset

  • 💻 Orchestrate Tensors in Blocks and Batches

  • 💻 Initialize PyTorch Generator Function

  • 💻 Train the Transformer Model

  • 💻 Apply Positional Encoding and Self-Attention

  • 💻 Attach the Feed Forward Neural Network

  • 💻 Build the Decoder Block

  • 💻 Transformer Model as Code
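The self-attention step these labs build in PyTorch can be previewed as scaled dot-product attention in dependency-free Python. A minimal sketch for intuition only, not the lab implementation:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def matmul(a, b):
    """Multiply an (n, k) matrix by a (k, m) matrix, as nested lists."""
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def self_attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = len(k[0])
    scores = matmul(q, transpose(k))  # (seq, seq) similarity scores
    weights = [softmax([s / math.sqrt(d_k) for s in row]) for row in scores]
    return matmul(weights, v), weights
```

Each output row is a weighted mix of all value vectors, with the weights in every row summing to 1; in the labs the same computation is expressed with PyTorch tensors so it runs on the GPU.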

Prompt Engineering

  • 💬 Introduction to Prompt Engineering

  • 💻 Getting Started with Gemini

  • 💻 Developing Basic Prompts

  • 💻 Intermediate Prompts: Define Task, Inputs, Outputs, Constraints, and Style

  • 💻 Advanced Prompts: Chaining, Role Setting, Feedback, and Examples
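The intermediate-prompt structure above (task, inputs, outputs, constraints, style) can be sketched as a simple template builder. The section names follow this outline; the helper and example text are illustrative, not course material:

```python
def build_prompt(task, inputs, outputs, constraints, style):
    """Assemble a structured prompt from the five sections named in the outline."""
    sections = [
        ("Task", task),
        ("Inputs", inputs),
        ("Outputs", outputs),
        ("Constraints", constraints),
        ("Style", style),
    ]
    return "\n\n".join(f"{name}:\n{text}" for name, text in sections)

prompt = build_prompt(
    task="Summarize the release notes below.",
    inputs="<paste release notes here>",
    outputs="A bulleted summary of user-facing changes.",
    constraints="No more than five bullets; do not invent features.",
    style="Plain, non-marketing language.",
)
```

Separating the sections makes prompts easier to review and iterate on than a single free-form paragraph.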

Hardware Requirements

  • 💬 The GPU's Role in AI Performance (CPU vs. GPU)

  • 💬 Current GPUs: Cost vs. Value

  • 💬 Tensor Cores vs. Older GPU Architectures

Pre-trained LLM

  • 💬 A History of Neural Network Architectures

  • 💬 Introduction to the llama.cpp Interface

  • 💬 Preparing an A100 for Server Operations

  • 💻 Operate Llama 2 Models with llama.cpp

  • 💻 Selecting a Quantization Level to Meet Performance and Perplexity Requirements

  • 💬 Running the llama.cpp Package

  • 💻 Llama Interactive Mode

  • 💻 Persistent Context with Llama

  • 💻 Constraining Output with Grammars

  • 💻 Deploy a Llama API Server

  • 💻 Develop a Llama Client Application

  • 💻 Write a Real-World AI Application Using the Llama API
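A client application talks to llama.cpp's built-in HTTP server by POSTing JSON to its `/completion` endpoint. The sketch below only constructs the request body; the field names (`prompt`, `n_predict`, `temperature`, `stop`) follow the llama.cpp server documentation, but verify them against the version you deploy:

```python
import json

# Request body for llama.cpp's HTTP server /completion endpoint.
# Field names should be checked against the docs for your llama.cpp build.
payload = {
    "prompt": "Q: What is quantization in the context of LLMs?\nA:",
    "n_predict": 128,     # maximum number of tokens to generate
    "temperature": 0.7,   # sampling temperature
    "stop": ["\nQ:"],     # stop generating at the next question
}

body = json.dumps(payload)
# A client would POST `body` to http://<server>:8080/completion (the
# default port) and read the generated text from the response JSON.
```

Keeping the payload as a plain dict makes it easy to add server options such as grammar-constrained output without changing the client's request logic.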

Fine Tuning

  • 💻 Using PyTorch to Fine-Tune Models

  • 💻 Advanced Prompt Engineering Techniques
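Fine-tuning with PyTorch ultimately comes down to a gradient-descent loop over a pre-trained model's weights. This toy, dependency-free sketch shows that loop on a single weight purely to illustrate the mechanics; it is a hypothetical example, not the lab code:

```python
# Toy illustration of the update loop that fine-tuning performs at scale:
# start from a "pre-trained" weight and nudge it toward new data.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, roughly y = 2x
w = 1.0    # "pre-trained" weight to be fine-tuned
lr = 0.01  # learning rate

for epoch in range(200):
    for x, y in data:
        y_hat = w * x                # forward pass
        grad = 2 * (y_hat - y) * x   # gradient of squared error w.r.t. w
        w -= lr * grad               # gradient-descent update
```

In real fine-tuning, PyTorch's autograd computes the gradients and an optimizer applies the updates across millions of parameters, but the shape of the loop is the same.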

Testing and Pushing Limits

  • 💻 Maximizing Model Limits

  • 💬 Curriculum Path: Generative AI

Your Team Has Unique Training Needs.

Your team deserves training as unique as they are.

Let us tailor the course to your needs at no extra cost.