AI Tutorials for Software Engineers

A series of tutorials that dive deep into the fundamental concepts underlying modern artificial intelligence, covering both the theory and implementations in Go and Rust.

These tutorials should be accessible to anyone; only a basic understanding of mathematics is assumed. All the prerequisites and foundational concepts are introduced along the way, building understanding from the basics to the advanced material.


Probability & Statistics (coming soon)

An introduction to probability and statistics and the fundamental concepts that will be important in many of the tutorials to follow. It starts with the basic concept of the probability of an event occurring and moves on to probability density functions, odds, log odds, and odds ratios, among other topics.
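
To make these relationships concrete before the tutorial arrives, here is a minimal Go sketch (an illustration, not the tutorial's code) that converts a probability into odds and log odds:

```go
package main

import (
	"fmt"
	"math"
)

// odds converts a probability p in (0, 1) into odds, defined as p / (1 - p).
func odds(p float64) float64 {
	return p / (1 - p)
}

func main() {
	p := 0.8 // probability of the event occurring
	o := odds(p)
	fmt.Printf("probability: %.2f\n", p)
	fmt.Printf("odds:        %.2f\n", o)           // 4.00, i.e. 4 to 1
	fmt.Printf("log odds:    %.4f\n", math.Log(o)) // 1.3863
}
```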

Differential Calculus (coming soon)

A deep dive into the rate of change of a quantity for a given input. This tutorial on differential calculus walks through the concept of the slope of a line and the techniques used to calculate the slope of a curve at a given point. Derivatives of several important functions are introduced along with proofs, as well as the chain rule and the product rule, which will be important in the tutorials to follow. Finally, partial derivatives are explained with an eye to understanding the derivative of a function with respect to a given parameter.
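
As a small preview of the central idea, here is a minimal Go sketch (an illustration, not the tutorial's code) that approximates a derivative numerically and checks the chain rule against it:

```go
package main

import (
	"fmt"
	"math"
)

// derivative approximates f'(x) with a central difference quotient,
// the numerical analogue of the limit definition of the derivative.
func derivative(f func(float64) float64, x, h float64) float64 {
	return (f(x+h) - f(x-h)) / (2 * h)
}

func main() {
	square := func(x float64) float64 { return x * x }
	// The exact derivative of x^2 is 2x, so at x = 3 we expect 6.
	fmt.Printf("numeric: %.6f\n", derivative(square, 3, 1e-5))
	fmt.Printf("exact:   %.6f\n", 2.0*3.0)

	// The chain rule in action: d/dx sin(x^2) = cos(x^2) * 2x.
	sinSq := func(x float64) float64 { return math.Sin(x * x) }
	fmt.Printf("chain rule, numeric: %.6f\n", derivative(sinSq, 1, 1e-5))
	fmt.Printf("chain rule, exact:   %.6f\n", math.Cos(1)*2)
}
```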

Matrix Algebra & Matrix Calculus (coming soon)

Matrices and matrix algebra are critical to understand, and this tutorial walks through the building blocks and fundamental concepts of matrices and vectors and the key axioms of linear algebra. It also covers matrix calculus and matrix differentiation, which will be key to understanding differential calculus and partial derivatives in n dimensions.
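
As a taste of working with matrices in code, here is a minimal Go sketch (an illustration, not the tutorial's code) of matrix multiplication, the operation at the heart of the matrix forms used later:

```go
package main

import "fmt"

// matMul multiplies an m×n matrix a by an n×p matrix b, returning
// the m×p product: c[i][j] = Σ_k a[i][k] * b[k][j].
func matMul(a, b [][]float64) [][]float64 {
	m, n, p := len(a), len(b), len(b[0])
	c := make([][]float64, m)
	for i := range c {
		c[i] = make([]float64, p)
		for j := 0; j < p; j++ {
			for k := 0; k < n; k++ {
				c[i][j] += a[i][k] * b[k][j]
			}
		}
	}
	return c
}

func main() {
	a := [][]float64{{1, 2}, {3, 4}}
	b := [][]float64{{5, 6}, {7, 8}}
	fmt.Println(matMul(a, b)) // [[19 22] [43 50]]
}
```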


Simple Linear Regression

Simple linear regression is a statistical technique for making predictions from data. The tutorial introduces a linear model in two dimensions that examines the relationship between one dependent variable and one independent variable and then finds the line that best fits the data. Once the line of best fit is determined, it can be used to predict values of y (the dependent variable) for any value of x (the independent variable).
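
Here is a minimal Go sketch of the technique using the standard least-squares formulas (an illustration; the tutorial's own implementation may differ):

```go
package main

import "fmt"

// fitLine computes the least-squares line y = b0 + b1*x for paired samples:
// b1 = Σ(x_i - x̄)(y_i - ȳ) / Σ(x_i - x̄)², and b0 = ȳ - b1*x̄.
func fitLine(x, y []float64) (b0, b1 float64) {
	n := float64(len(x))
	var xBar, yBar float64
	for i := range x {
		xBar += x[i]
		yBar += y[i]
	}
	xBar /= n
	yBar /= n

	var num, den float64
	for i := range x {
		num += (x[i] - xBar) * (y[i] - yBar)
		den += (x[i] - xBar) * (x[i] - xBar)
	}
	b1 = num / den
	b0 = yBar - b1*xBar
	return b0, b1
}

func main() {
	x := []float64{1, 2, 3, 4, 5}
	y := []float64{2.1, 3.9, 6.2, 8.0, 9.8}
	b0, b1 := fitLine(x, y)
	fmt.Printf("y = %.3f + %.3f*x\n", b0, b1)        // y = 0.150 + 1.950*x
	fmt.Printf("prediction at x = 6: %.3f\n", b0+b1*6) // 11.850
}
```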


Simple Linear Regression - Proof of the Closed Form

This tutorial walks through a detailed proof of the closed form solution of simple linear regression introduced above. The proof sets the two partial derivatives of the cost function to zero and solves the resulting pair of equations for the two parameters. While it provides the curious reader with a detailed explanation of how the closed form solution is obtained (and requires an understanding of differential calculus), it isn't strictly necessary for the material ahead and can be skipped if needed.
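
For orientation, in standard notation the equations the proof solves look like this (a sketch; the tutorial's own symbols may differ):

```latex
% Least-squares cost for the model y = \beta_0 + \beta_1 x:
J(\beta_0, \beta_1) = \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 x_i \right)^2

% Setting both partial derivatives to zero gives the two equations to solve:
\frac{\partial J}{\partial \beta_0} = -2 \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 x_i \right) = 0

\frac{\partial J}{\partial \beta_1} = -2 \sum_{i=1}^{n} x_i \left( y_i - \beta_0 - \beta_1 x_i \right) = 0
```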


Multiple Regression

Multiple regression extends the two dimensional linear model introduced in Simple Linear Regression to k + 1 dimensions, with one dependent variable, k independent variables, and k + 1 parameters. The general matrix form of the model is introduced along with a closed form solution that depends on the matrix X^T X, built from the feature matrix X, being invertible (which is not always the case) and sets the stage for using gradient descent to solve linear regression.
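
In standard notation, the matrix form and closed form solution look like this (a sketch; the tutorial's own symbols may differ):

```latex
% The model in matrix form, with X the n x (k+1) feature matrix
% (a leading column of ones for the intercept), y the n x 1 vector of
% observations, and \beta the (k+1) x 1 parameter vector:
y = X\beta + \varepsilon

% The closed form least-squares solution, valid when X^T X is invertible:
\hat{\beta} = (X^T X)^{-1} X^T y
```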


Multiple Regression - Deriving the Matrix Form


Gradient Descent for Simple Linear Regression
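
This entry has no summary yet; as a rough preview of the technique named in the title, here is a minimal Go sketch of gradient descent applied to the simple linear model (an illustration, not the tutorial's code):

```go
package main

import "fmt"

// gradientDescent fits y = b0 + b1*x by repeatedly stepping both
// parameters opposite the gradient of the mean squared error.
func gradientDescent(x, y []float64, lr float64, epochs int) (b0, b1 float64) {
	n := float64(len(x))
	for e := 0; e < epochs; e++ {
		var g0, g1 float64
		for i := range x {
			err := (b0 + b1*x[i]) - y[i] // prediction error for sample i
			g0 += err                    // accumulates ∂MSE/∂b0 (up to the 2/n factor)
			g1 += err * x[i]             // accumulates ∂MSE/∂b1
		}
		b0 -= lr * 2 * g0 / n
		b1 -= lr * 2 * g1 / n
	}
	return b0, b1
}

func main() {
	x := []float64{1, 2, 3, 4, 5}
	y := []float64{2.1, 3.9, 6.2, 8.0, 9.8}
	b0, b1 := gradientDescent(x, y, 0.01, 10000)
	fmt.Printf("y = %.3f + %.3f*x\n", b0, b1) // approaches the closed form fit
}
```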


Gradient Descent for Multiple Regression


Logistic Regression (coming soon)


Cost Function & Gradient Descent for Logistic Regression (coming soon)


Derivative of the Cost Function for Logistic Regression (coming soon)


Naive Bayes Classifiers


Decision Trees


Markov Decision Processes


Reinforcement Learning