Jumpstart your career in data science with our comprehensive AI/ML course designed for aspiring data scientists and professionals seeking expertise in machine learning algorithms and artificial intelligence programming.
Understand current AI and machine learning methods, tools, and techniques.
Implement machine learning models using popular algorithms.
Design hands-on projects applying AI concepts to solve computational problems.
Explore further learning opportunities in advanced AI/ML courses.
Introduction to Artificial Intelligence (AI) & Machine Learning (AI & ML JumpStart) is a three-day, foundation-level, hands-on course that explores the fast-changing field of artificial intelligence (AI): programming, logic, search, machine learning, and natural language understanding. Students will learn current AI/ML methods, tools, and techniques; their application to computational problems; and their contribution to understanding intelligence.

In this course, we cut through the math and learn exactly how machine learning algorithms work. Although students will need an aptitude for math, the focus is on the algorithms used to create machine learning models. The course presents a wide variety of related technologies, concepts, and skills in a fast-paced, hands-on format, giving students a solid foundation for understanding and working with AI and machine learning. Each topic area presents a specific challenge, current progress, and approaches to the problem at hand. Attendees will leave the course with a practical understanding of core skills, methods, and algorithms, prepared for continued learning in next-level, more advanced follow-on courses that dive deeper into specific skill sets or tools.
Is machine learning difficult?
What is artificial intelligence
Difference between AI and machine learning
Machine learning examples
Three different types of machine learning: supervised, unsupervised, and reinforcement learning
Difference between labeled and unlabeled data
The difference between regression and classification, and how they are used
Fitting a line through a set of data points
Coding the linear regression algorithm in Python
Using Turi Create to build a linear regression model to predict housing prices in a real dataset
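Before turning to Turi Create, it helps to see the algorithm itself. Below is a minimal plain-Python sketch of fitting a line by gradient descent on made-up points; the variable names and learning rate are illustrative, not the course's exact code.

```python
# Minimal linear regression via gradient descent (toy data, illustration only).
def fit_line(xs, ys, lr=0.01, epochs=2000):
    """Fit y = w*x + b by minimizing mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Points lying exactly on y = 2x + 1; the fit should recover w ≈ 2, b ≈ 1
w, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

The same model, applied to a real housing dataset, is what Turi Create builds in the hands-on lab.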
What is polynomial regression
Fitting a more complex curve to nonlinear data
Examples of linear regression
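As a quick sketch of polynomial regression, the snippet below fits a degree-2 curve to points generated from a known quadratic; the data is made up and `numpy.polyfit` stands in for the course's own fitting code.

```python
import numpy as np

# Toy nonlinear data: points lying exactly on y = 2x^2 + 1
xs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs**2 + 1.0

# polyfit returns coefficients from highest to lowest degree
coeffs = np.polyfit(xs, ys, deg=2)   # should recover ≈ [2.0, 0.0, 1.0]
```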
What is underfitting and overfitting
Solutions for avoiding overfitting
Testing the model complexity graph, and regularization
Calculating the complexity of the model
Picking the best model in terms of performance and complexity
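The trade-off above can be seen in a few lines. In this toy demonstration (the data and "noise" are made up), a degree-5 polynomial interpolates the training noise exactly and then extrapolates badly, while a simple line generalizes well.

```python
import numpy as np

# Underlying truth is y = x; the "noise" is a fixed perturbation
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
noise = np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1])
y_train = x_train + noise

line = np.poly1d(np.polyfit(x_train, y_train, deg=1))     # underfit-resistant
wiggly = np.poly1d(np.polyfit(x_train, y_train, deg=5))   # interpolates noise

# Evaluate on a held-out point where the true value is y = 6
line_error = abs(line(6.0) - 6.0)
wiggly_error = abs(wiggly(6.0) - 6.0)   # much larger: the model overfit
```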
What is classification
Sentiment analysis
How to draw a line that separates points of two colors
What is a perceptron
Coding the perceptron algorithm in Python and Turi Create
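As a taste of the hands-on work, here is a minimal plain-Python sketch of the perceptron update rule on a made-up, linearly separable dataset; the data and hyperparameters are illustrative only.

```python
# A minimal perceptron trainer (toy data, illustration only).
def train_perceptron(points, labels, lr=0.1, epochs=100):
    """Learn w1, w2, b so that (w1*x + w2*y + b >= 0) matches labels (0 or 1)."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x, y), label in zip(points, labels):
            pred = 1 if w1 * x + w2 * y + b >= 0 else 0
            # Perceptron rule: nudge the boundary toward misclassified points
            w1 += lr * (label - pred) * x
            w2 += lr * (label - pred) * y
            b += lr * (label - pred)
    return w1, w2, b

# Linearly separable toy data: label is 1 when x + y > 1
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (2, 1)]
lbls = [0, 0, 0, 1, 1]
w1, w2, b = train_perceptron(pts, lbls)
preds = [1 if w1 * x + w2 * y + b >= 0 else 0 for x, y in pts]
```

Because the data is separable, the perceptron convergence theorem guarantees this loop eventually classifies every point correctly.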
Hard assignments and soft assignments
The sigmoid function
Discrete perceptrons vs. continuous perceptrons
Logistic regression algorithm for classifying data
Coding the logistic regression algorithm in Python
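A minimal sketch of that algorithm on one feature, using made-up data: the sigmoid turns a linear score into a probability, and gradient descent on the log loss moves the boundary between the classes.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=3000):
    """Learn w, b for P(y=1 | x) = sigmoid(w*x + b) by minimizing log loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: the class flips around x = 2.5
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
probs = [sigmoid(w * x + b) for x in xs]   # soft assignments, not hard 0/1 calls
```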
Types of errors a model can make
The confusion matrix
What are accuracy, recall, precision, F-score, sensitivity, and specificity
What is the ROC curve
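All of these metrics fall straight out of the confusion matrix. The counts below are made up purely for illustration:

```python
# Four cells of a binary confusion matrix (hypothetical counts)
tp, fp, fn, tn = 40, 10, 5, 45

accuracy = (tp + tn) / (tp + fp + fn + tn)          # 0.85
precision = tp / (tp + fp)                           # 0.8
recall = tp / (tp + fn)                              # also called sensitivity
specificity = tn / (tn + fp)
f_score = 2 * precision * recall / (precision + recall)
```

The ROC curve then plots sensitivity against 1 − specificity as the classification threshold sweeps from 0 to 1.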
What is Bayes' theorem
Dependent and independent events
The prior and posterior probabilities
Calculating conditional probabilities
Using the naive Bayes model
Coding the naive Bayes algorithm in Python
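The classic use case is a spam filter. Here is a tiny plain-Python sketch on made-up messages: per-class word counts give the conditional probabilities, Laplace smoothing handles unseen words, and the "naive" part is summing the per-word log probabilities as if words were independent.

```python
import math
from collections import Counter

# Toy training data (made up for illustration)
spam = ["win money now", "win lottery money", "claim money prize"]
ham = ["meeting at noon", "lunch at noon tomorrow", "project meeting notes"]

def word_counts(docs):
    return Counter(word for doc in docs for word in doc.split())

spam_counts, ham_counts = word_counts(spam), word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_score(message, counts, prior):
    # log P(class) + sum of log P(word | class), with Laplace smoothing
    total = sum(counts.values())
    s = math.log(prior)
    for w in message.split():
        s += math.log((counts[w] + 1) / (total + len(vocab)))
    return s

def classify(message):
    p_spam = log_score(message, spam_counts, 0.5)
    p_ham = log_score(message, ham_counts, 0.5)
    return "spam" if p_spam > p_ham else "ham"
```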
What is a decision tree
Using decision trees for classification and regression
Building an app-recommendation system using users’ information
Accuracy, Gini index, and entropy
Using Scikit-Learn to train a decision tree
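Training a tree in Scikit-Learn takes only a few lines. The dataset below is a made-up stand-in for the app-recommendation example (features and labels are invented for illustration):

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical users: [age, already_owns_similar_app]; label 1 = recommend
X = [[15, 0], [18, 0], [25, 1], [30, 0], [35, 1], [40, 0]]
y = [0, 0, 1, 1, 1, 1]

# criterion="gini" uses the Gini index; "entropy" is the other option covered
tree = DecisionTreeClassifier(criterion="gini", max_depth=2, random_state=0)
tree.fit(X, y)
preds = tree.predict([[16, 0], [33, 1]]).tolist()
```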
What is a neural network
Architecture of a neural network: nodes, layers, depth, and activation functions
Training neural networks
Potential problems in training neural networks
Techniques to improve neural network training
Using neural networks as regression models
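To make the architecture concrete, here is the forward pass of a one-hidden-layer network with hand-picked weights (chosen purely for illustration): inputs flow through a hidden layer with a ReLU activation to a single linear output node, which is exactly the shape used for regression.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

# Hand-set weights, illustration only: 2 inputs -> 3 hidden nodes -> 1 output
W1 = np.array([[1.0, -1.0],
               [0.5, 0.5],
               [-1.0, 1.0]])
b1 = np.array([0.0, 0.1, 0.0])
W2 = np.array([1.0, 2.0, 1.0])
b2 = 0.5

def forward(x):
    hidden = relu(W1 @ x + b1)   # hidden-layer activations
    return W2 @ hidden + b2      # linear output node: usable for regression

y = forward(np.array([1.0, 2.0]))
```

Training would adjust W1, b1, W2, b2 by backpropagation; this sketch shows only the data flow.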
What is a support vector machine
Determining which linear classifier for a dataset has the best boundary
Using the kernel method to build nonlinear classifiers
Coding support vector machines and the kernel method in Scikit-Learn
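The XOR pattern is the standard illustration of why the kernel method matters: no straight line separates it, but an RBF kernel does. A minimal Scikit-Learn sketch (toy data, illustrative hyperparameters):

```python
from sklearn.svm import SVC

# XOR: not linearly separable
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

linear_svm = SVC(kernel="linear").fit(X, y)
rbf_svm = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X, y)

linear_acc = linear_svm.score(X, y)   # a line cannot get all four points right
rbf_acc = rbf_svm.score(X, y)         # the kernel trick separates them
```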
What ensemble learning is
Using bagging to combine classifiers
Using boosting to combine classifiers
Ensemble methods: random forests, AdaBoost, gradient boosting, and XGBoost
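A random forest (bagging applied to decision trees) is one line in Scikit-Learn; the dataset below is made up for illustration:

```python
from sklearn.ensemble import RandomForestClassifier

# Toy one-feature data: the label flips at x = 6
X = [[i] for i in range(12)]
y = [0] * 6 + [1] * 6

# 50 trees, each trained on a bootstrap sample, vote on the final label
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
preds = forest.predict([[1], [10]]).tolist()
acc = forest.score(X, y)
```

AdaBoost, gradient boosting, and XGBoost swap the bagging step for boosting: trees are built sequentially, each focusing on the previous trees' mistakes.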
Cleaning up and preprocessing data to make it readable by our model
Using Scikit-Learn to train and evaluate several models
Using grid search to select good hyperparameters for our model
Using k-fold cross-validation to use our data for training and validation simultaneously
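Grid search and k-fold cross-validation combine naturally in Scikit-Learn's `GridSearchCV`: every hyperparameter combination is scored by cross-validation, and the best one wins. A minimal sketch on made-up data (the estimator and grid are illustrative choices):

```python
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Toy one-feature data: the label flips at x = 10
X = [[i] for i in range(20)]
y = [0] * 10 + [1] * 10

# Try three values of k, each scored with 4-fold cross-validation
grid = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5]},
    cv=4,
)
grid.fit(X, y)
best_k = grid.best_params_["n_neighbors"]
best_score = grid.best_score_
```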
Your team deserves training as unique as they are.
Let us tailor the course to your needs at no extra cost.
Aaron Steele
Casey Pense
Chris Tsantiris
Javier Martin
Justin Gilley
Kathy Le
Kelson Smith
Oussama Azzam
Pascal Rodmacq
Randall Granier