In this post I will look at “Regularization” as a way to address a common problem with machine learning implementations, namely over-fitting. We’ll work through the equations for both logistic regression and linear regression. After getting the regularization equations worked out, we’ll look at an example in Python showing how this can be used to rescue a badly over-fit linear regression model.
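To give a flavor of the idea, here is a minimal sketch of L2 (“ridge”) regularization shrinking the coefficients of an over-fit polynomial model. The data, the degree, and the lambda value are my own illustration, not the example from the post’s notebook.

```python
import numpy as np

# Toy data: 10 noisy samples of a sine curve
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(10)

# Degree-9 polynomial features: badly over-parameterized for 10 points
X = np.vander(x, 10, increasing=True)

# Un-regularized normal equation: theta = (X^T X)^-1 X^T y
theta_plain = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge version: theta = (X^T X + lambda * I)^-1 X^T y
lam = 1.0
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(10), X.T @ y)

# The penalty shrinks the huge coefficients the over-fit model
# needs in order to chase the noise
print(np.abs(theta_plain).max(), np.abs(theta_ridge).max())
```

Choosing the regularization strength lambda is itself part of the problem; here it is set arbitrarily just to show the shrinking effect.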
Machine Learning and Data Science: Logistic Regression Theory
Logistic regression is a widely used Machine Learning method for binary classification. It is also a good stepping stone toward understanding Neural Networks. In this post I will present the theory behind it, including a derivation of the Logistic Regression Cost Function gradient.
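As a quick sketch of what that derivation yields: the gradient of the logistic regression cost comes out to (1/m) Xᵀ(h − y), where h = sigmoid(Xθ). The snippet below checks that form against a numerical finite-difference gradient on random data of my own making, not the post’s notation or data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Cross-entropy cost for logistic regression
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / len(y)

def gradient(theta, X, y):
    # Derived form: grad J = (1/m) * X^T (h - y)
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

rng = np.random.default_rng(1)
X = np.c_[np.ones(20), rng.standard_normal((20, 2))]  # bias + 2 features
y = (rng.random(20) > 0.5).astype(float)
theta = 0.1 * rng.standard_normal(3)

# Central-difference check on the first gradient component
eps = 1e-6
step = np.array([eps, 0.0, 0.0])
numeric = (cost(theta + step, X, y) - cost(theta - step, X, y)) / (2 * eps)
analytic = gradient(theta, X, y)[0]
print(numeric, analytic)  # the two should agree closely
```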
Machine Learning and Data Science: Linear Regression Part 6
This will be the last post in the Linear Regression series. We will look at the problems of over- and under-fitting data, along with non-linear feature variables.
Machine Learning and Data Science: Linear Regression Part 5
In this post I will present the matrix/vector form of the Linear Regression problem and derive the “exact” solution for the parameters.
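That closed-form result is the normal equation, θ = (XᵀX)⁻¹Xᵀy. A minimal sketch on synthetic data of my own making (not the post’s data set):

```python
import numpy as np

# Synthetic data with known parameters plus a little noise
rng = np.random.default_rng(42)
X = np.c_[np.ones(50), rng.standard_normal((50, 2))]  # bias + 2 features
true_theta = np.array([2.0, -1.0, 0.5])
y = X @ true_theta + 0.01 * rng.standard_normal(50)

# Solve the linear system rather than explicitly inverting X^T X,
# which is both faster and numerically safer
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # recovers roughly [2.0, -1.0, 0.5]
```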
Machine Learning and Data Science: Linear Regression Part 4
In this post I’ll be working up, analyzing, and visualizing Gradient Descent for Linear Regression. The post is a Jupyter notebook, with all the code for the plots and functions in Python, available on my GitHub account.
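The core loop is compact enough to sketch here. This is a toy example of my own, assuming the usual squared-error cost, not the notebook’s actual data or code:

```python
import numpy as np

# Synthetic data: y = 1.0 + 3.0 * x plus noise
rng = np.random.default_rng(7)
X = np.c_[np.ones(100), rng.standard_normal(100)]  # bias + 1 feature
y = X @ np.array([1.0, 3.0]) + 0.1 * rng.standard_normal(100)

theta = np.zeros(2)
alpha = 0.1  # learning rate
m = len(y)
for _ in range(1000):
    # Gradient of the squared-error cost: (1/m) * X^T (X theta - y)
    theta -= alpha * X.T @ (X @ theta - y) / m

print(theta)  # converges to roughly [1.0, 3.0]
```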
Machine Learning and Data Science: Linear Regression Part 3
In Part 3 of this series on Linear Regression I will go into more detail about the Model and Cost function, including several graphs that will hopefully give insight into their nature and serve as a reference for developing algorithms in the next post.
Machine Learning and Data Science: Linear Regression Part 2
In Part 2 of this series on Linear Regression I will pull a data set of house sale prices and “features” from Kaggle and explore it in a Jupyter notebook with pandas and seaborn. We will extract a good subset of the data to use in our example analysis of the linear regression algorithms.
Machine Learning and Data Science: Linear Regression Part 1
Linear regression could possibly be considered the “Hello World” problem of Machine Learning. Its implementation touches on many of the fundamental ideas and problems in this field. I’ll give you some guidance for understanding and implementing this fundamental idea.
Machine Learning and Data Science: Introduction
This is the start of a series of posts on Machine Learning and Data Science. I’ll be exploring the algorithms and tools of the field through tutorials, guides, how-tos, reviews, and “real world” applications. The posts will be done in Jupyter notebooks, which will be available on GitHub.
Docker and NVIDIA-docker on your workstation: Integration with your Desktop
I’ve been doing this series of posts about setting up Docker for your desktop system, so why not literally add containers to your desktop! With Docker configured the way we have it, containers launch like any other application you run. In this post I’ll show you how to add icons and menu items to launch containers.
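To make the idea concrete, a menu item on most Linux desktops is just a `.desktop` file following the freedesktop.org Desktop Entry format. The sketch below is my own illustration; the image name, icon, and port are placeholders, not the actual entries from the post.

```ini
[Desktop Entry]
Type=Application
Name=Jupyter (Docker)
Comment=Launch a containerized Jupyter notebook server
Exec=docker run --rm -p 8888:8888 my-jupyter-image
Icon=jupyter
Terminal=true
Categories=Development;
```

Dropped into `~/.local/share/applications/`, an entry like this shows up in the application menu alongside ordinary programs.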