Introduction to Machine Learning Training in Syracuse

Enroll in or hire us to teach our Introduction to Machine Learning class in Syracuse, New York by calling us at 303.377.6176. Like all HSG classes, Introduction to Machine Learning may be offered either onsite or via instructor-led virtual training. Consider looking at our public training schedule to see if it is scheduled: Public Training Classes.
Provided there are enough attendees, Introduction to Machine Learning may be taught at one of our local training facilities.
We offer private customized training for groups of 3 or more attendees.

Course Description

 
Why write programs when the computer can instead learn them from data? In this class you will learn how to make this happen, from the simplest machine learning algorithms to quite sophisticated ones.
Course Length: 3 Days
Course Tuition: $2090 (US)

Prerequisites

Experience in software development, project management, or business or systems analysis is desirable, but not mandatory.

Course Outline

 
Introduction
 
What is Machine Learning
Why is Machine Learning important
Stages of gaining knowledge
Types of Machine Learning
Groupings and Classification
Challenges
 
Input to Output
 
Functional Learning
Parametric and Non-Parametric Functions
Bias and Variance
Bias-Variance Trade-Off
Overfitting and Underfitting
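
As a rough illustration of the bias-variance and overfitting ideas in this module (the course does not prescribe a toolchain; NumPy is assumed here purely for illustration), a minimal sketch comparing underfit, reasonable, and overfit polynomial models on the same data:

import numpy as np

# Toy data: a noisy sine curve. Low-degree fits underfit (high bias),
# very high-degree fits chase the noise (high variance / overfitting).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

for degree in (1, 4, 15):
    coeffs = np.polyfit(x, y, degree)      # least-squares polynomial fit
    mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(f"degree {degree:2d}: training MSE = {mse:.3f}")
# Training error always falls as the degree grows, but error on new data
# does not -- the bias-variance trade-off in miniature.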
 
Linear Regression
 
Linear Fitting
Loss Function
Least Squares Fit
Polynomial and Quadratic Models
Regularization
Lasso and Ridge Regression
Cross-Validation
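
To make the least-squares, regularization, and cross-validation topics above concrete, a minimal sketch (scikit-learn is an assumption of this illustration, not a course requirement):

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

# Synthetic data with a few irrelevant features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.5, 0.0, -2.0, 0.0, 0.5]) + rng.normal(scale=0.3, size=100)

models = [("least squares", LinearRegression()),
          ("ridge (L2)", Ridge(alpha=1.0)),
          ("lasso (L1)", Lasso(alpha=0.1))]
for name, model in models:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()  # 5-fold CV
    print(f"{name:14s} mean R^2 = {r2:.3f}")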
 
Logistic Regression
 
Log Odds
Standard Logistic Function
Training a Logistic Regression Model
Advantages and Disadvantages
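
A minimal sketch of the log-odds and standard logistic function ideas listed above, assuming scikit-learn and NumPy (illustrative only):

import numpy as np
from sklearn.linear_model import LogisticRegression

# Two Gaussian blobs as a two-class problem.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
               rng.normal(1.0, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = LogisticRegression().fit(X, y)

# The fitted model is linear in the log-odds; the standard logistic
# (sigmoid) function turns log-odds into probabilities.
log_odds = X @ clf.coef_.ravel() + clf.intercept_[0]
prob = 1.0 / (1.0 + np.exp(-log_odds))
print(np.allclose(prob, clf.predict_proba(X)[:, 1]))   # True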
 
 
Linear Discriminant Analysis
 
Purpose
Learning LDA Models
Mean and Variance
Making Predictions
Bayes Theorem
Extensions to LDA
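
As a small illustration of learning an LDA model from class means, variances, and priors combined via Bayes' theorem (scikit-learn is assumed here, for illustration only):

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
               rng.normal(1.5, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis().fit(X, y)   # per-class means, shared covariance
print("class means:\n", lda.means_)
print("class priors:", lda.priors_)            # combined with likelihoods via Bayes' theorem
print("training accuracy:", lda.score(X, y))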
 
Classification Models
 
Definition
Stages
Two Class and Multi-Class Tasks
Techniques
Example of a Decision Tree
Decision Tree Induction
Structure of a split
Measure of node impurity
Stopping criteria
Advantages and Disadvantages
Generalization
Classifier Performance
Occam's Razor
Addressing Overfitting
Pruning
Model Validation
Validation Strategies
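
To ground the decision-tree induction, impurity, pruning, and validation topics above, a minimal sketch (scikit-learn and its bundled Iris data are assumptions of this illustration):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target,
                                                    random_state=0)

# Splits are chosen by Gini impurity; max_depth acts as a simple stopping
# (pre-pruning) criterion that limits overfitting.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print(export_text(tree, feature_names=data.feature_names))
print("held-out accuracy:", tree.score(X_test, y_test))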
 
Bayesian Classifiers - Naive Bayes
 
Bayesian Classification
Examples
Bayes classifiers
Naive Bayes classifier
Estimate from data
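
A minimal sketch of estimating a naive Bayes classifier from data (Gaussian naive Bayes via scikit-learn; an assumption of this illustration, not the course):

import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(50, 3)),
               rng.normal(2.0, 1.0, size=(50, 3))])
y = np.array([0] * 50 + [1] * 50)

nb = GaussianNB().fit(X, y)    # per-class feature means/variances estimated from data
print("per-class feature means:\n", nb.theta_)
print("class priors:", nb.class_prior_)
print("P(class | x) for the first sample:", nb.predict_proba(X[:1]))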
 
K-Nearest Neighbors
 
Introduction
Instance-based classifiers
Rote Learning
Nearest Neighbor classifier
Definition of Nearest Neighbor
Distance metrics
Choosing the value of K
Scaling issues
Different names
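
To illustrate distance metrics, the choice of K, and the scaling issues listed above (scikit-learn and its bundled Wine data are assumed here, for illustration only):

from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)

# Euclidean distances are dominated by large-valued features, so the
# inputs are standardized before the neighbor search.
for k in (1, 5, 15):
    knn = make_pipeline(StandardScaler(),
                        KNeighborsClassifier(n_neighbors=k, metric="euclidean"))
    acc = cross_val_score(knn, X, y, cv=5).mean()
    print(f"k = {k:2d}: cross-validated accuracy = {acc:.3f}")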
 
Neural Networks
 
Historical Sketch
Applications
Biological replication
Artificial Neurons
Activation Functions
Learning Networks
Perceptrons
Perceptron structure
Decision boundary
Training process
Training rule
Squared error function
Gradient of the error function
Gradient descent
Equivalence of Perceptron and Linear Models
Structure of Multilayer Neural Network
Neural Network architectures
Roles of nodes
Algorithm for Learning Neural Network
Sigmoid Unit
Backpropagation
Backward Pass
Convergence of Backpropagation
Avoiding Overfitting
Expressiveness of Multilayer Neural Networks
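
A compact sketch of a multilayer network of sigmoid units trained by gradient descent with backpropagation (scikit-learn's MLPClassifier is assumed here purely for illustration):

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One hidden layer of sigmoid ("logistic") units, trained with stochastic
# gradient descent and backpropagation; early stopping holds back a
# validation split to help avoid overfitting.
mlp = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic",
                    solver="sgd", learning_rate_init=0.1, max_iter=2000,
                    early_stopping=True, random_state=0)
mlp.fit(X_train, y_train)
print("test accuracy:", mlp.score(X_test, y_test))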
 
 
Support Vector Machines
 
Linearly Separable Classes
Computation of Optimal Hyperplane
Maximum Margin
Non-Linearity
Non-Linear Boundary
Transformations
Kernel Trick
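
As a minimal illustration of maximum-margin separation and the kernel trick for a non-linear boundary (scikit-learn assumed, illustrative only):

from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles are not linearly separable in the input space; the
# RBF kernel implicitly transforms them so a maximum-margin hyperplane
# can separate the classes (the kernel trick).
X, y = make_circles(n_samples=300, factor=0.4, noise=0.05, random_state=0)

linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

print("linear kernel accuracy:", linear_svm.score(X, y))
print("RBF kernel accuracy:   ", rbf_svm.score(X, y))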
 
 
Ensemble Methods
 
Classifiers
Building and Using a Committee Ensemble
Binomial Distribution
Why do Ensembles Work
Diversity
Accuracy and speed
Bootstrap Sampling
Bagging
Boosting
AdaBoost
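
A short sketch contrasting bagging (bootstrap sampling) with AdaBoost boosting over decision stumps (scikit-learn and its bundled breast-cancer data are assumptions of this illustration):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
stump = DecisionTreeClassifier(max_depth=1, random_state=0)

# Bagging averages a committee trained on bootstrap samples; AdaBoost
# reweights the training examples so later learners focus on the
# mistakes of earlier ones.
bagging = BaggingClassifier(stump, n_estimators=50, random_state=0)
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("single stump", stump),
                    ("bagging", bagging),
                    ("AdaBoost", boosting)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:12s} cross-validated accuracy = {acc:.3f}")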
