Deep learning for long-term predictions

At Sentiance, we use machine learning to extract intelligence from smartphone sensor data such as accelerometer, gyroscope and location. We’ve been doing this for quite a while now, and are very proud of our state-of-the-art results regarding sensor-based activity detection, map matching, driving behavior, venue mapping and...
Continue reading »

Feature extraction using PCA

In this article, we discuss how Principal Component Analysis (PCA) works and how it can be used as a dimensionality reduction technique for classification problems. At the end of this article, Matlab source code is provided for demonstration purposes. In an earlier article, we discussed the so-called...
Continue reading »
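As a quick taste of the idea behind the article, here is a minimal sketch of PCA-based dimensionality reduction in Python/NumPy (the article itself provides Matlab code; this version and its toy data are illustrative assumptions): center the data, eigendecompose the covariance matrix, and project onto the top-k eigenvectors.

```python
import numpy as np

# Toy data: 200 samples in 3 dimensions (rows are samples).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.5, 0.2, 0.1]])

# Center the data: PCA operates on zero-mean data.
Xc = X - X.mean(axis=0)

# Eigendecomposition of the sample covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # ascending order for symmetric matrices

# Sort eigenvectors by decreasing eigenvalue and keep the top k components.
order = np.argsort(eigvals)[::-1]
k = 2
W = eigvecs[:, order[:k]]

# Project onto the k principal components: the reduced representation.
X_reduced = Xc @ W
print(X_reduced.shape)  # (200, 2)
```

The key point is that the top eigenvectors of the covariance matrix span the directions of largest variance, so projecting onto them discards as little information as possible for a given k.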

A geometric interpretation of the covariance matrix

In this article, we provide an intuitive, geometric interpretation of the covariance matrix by exploring the relation between linear transformations and the resulting data covariance. Most textbooks explain the shape of data based on the concept of covariance matrices. Instead, we take a backwards approach and explain the...
Continue reading »
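To preview the "backwards approach" the teaser hints at, here is a small NumPy sketch (the particular scaling and rotation values are made up for illustration): starting from white data with identity covariance, applying a linear transformation T produces data whose covariance is approximately T T^T.

```python
import numpy as np

rng = np.random.default_rng(1)

# Start from "white" data: zero mean, identity covariance.
X = rng.normal(size=(5000, 2))

# Apply a linear transformation: scale, then rotate.
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
S = np.diag([3.0, 1.0])
T = R @ S

Y = X @ T.T  # each sample becomes y = T x

# The sample covariance of the transformed data approaches T T^T.
print(np.cov(Y, rowvar=False))
print(T @ T.T)
```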

The Curse of Dimensionality in classification

In this article, we will discuss the so-called ‘Curse of Dimensionality’ and explain why it is important when designing a classifier. In the following sections, I will provide an intuitive explanation of this concept, illustrated by a clear example of overfitting due to the curse of dimensionality....
Continue reading »
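As an illustration of the kind of overfitting the article describes (this simulation is my own sketch, not the article's example): with a fixed number of training samples and purely random labels, a least-squares linear classifier looks increasingly perfect on the training set as the dimensionality grows, while test accuracy stays at chance.

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_test = 20, 1000

for d in [2, 5, 10, 50]:
    # Features are pure noise and labels (+1/-1) are random,
    # so there is nothing real to learn.
    X_train = rng.normal(size=(n_train, d))
    y_train = rng.choice([-1.0, 1.0], size=n_train)
    X_test = rng.normal(size=(n_test, d))
    y_test = rng.choice([-1.0, 1.0], size=n_test)

    # Fit a linear classifier by least squares.
    w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

    train_acc = np.mean(np.sign(X_train @ w) == y_train)
    test_acc = np.mean(np.sign(X_test @ w) == y_test)
    print(f"d={d:3d}  train={train_acc:.2f}  test={test_acc:.2f}")
```

Once d reaches the number of training samples, the classifier can separate even random labels perfectly, which is exactly the trap of adding features without adding data.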

How to draw a covariance error ellipse?

In this post, I will show how to draw an error ellipse, a.k.a. a confidence ellipse, for 2D normally distributed data. The error ellipse represents an iso-contour of the Gaussian distribution and allows you to visualize a 2D confidence interval. The following figure shows a 95% confidence ellipse for...
Continue reading »
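For a preview of the computation involved, here is a short Python sketch of one common way to construct such an ellipse (the toy covariance is an assumption): the ellipse axes follow the eigenvectors of the covariance matrix, with lengths scaled by the square root of the eigenvalues times the chi-square quantile (about 5.991 at 95% for 2 degrees of freedom).

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Ellipse

rng = np.random.default_rng(3)
X = rng.multivariate_normal([0, 0], [[4.0, 1.5], [1.5, 1.0]], size=500)

# Eigendecomposition of the sample covariance matrix.
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # ascending order

# 95% confidence: chi-square quantile with 2 degrees of freedom.
chi2_95 = 5.991

# Axis lengths are 2*sqrt(chi2 * eigenvalue); the orientation follows
# the eigenvector belonging to the largest eigenvalue.
width, height = 2 * np.sqrt(chi2_95 * eigvals[::-1])
angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))

fig, ax = plt.subplots()
ax.scatter(X[:, 0], X[:, 1], s=5)
ax.add_patch(Ellipse(X.mean(axis=0), width, height, angle=angle,
                     fill=False, color='red'))
ax.set_aspect('equal')
plt.show()
```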

Why divide the sample variance by N-1?

In this article, we will derive the well-known formulas for calculating the mean and the variance of normally distributed data, in order to answer the question in the article’s title. However, for readers who are not interested in the ‘why’ of this question but only in the...
Continue reading »
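As a quick empirical check of the bias the article derives (a simulation sketch of my own, with arbitrary parameters): averaging the divide-by-N variance estimate over many small samples undershoots the true variance by a factor of (N-1)/N, while dividing by N-1 is unbiased.

```python
import numpy as np

rng = np.random.default_rng(4)
true_var = 4.0   # variance of the underlying normal distribution
n = 5            # a small sample size makes the bias obvious

# Draw many samples of size n and estimate the variance both ways.
samples = rng.normal(0.0, np.sqrt(true_var), size=(100_000, n))
var_n = samples.var(axis=1, ddof=0)    # divide by N   (biased)
var_n1 = samples.var(axis=1, ddof=1)   # divide by N-1 (unbiased)

print("true variance:          ", true_var)
print("mean of /N estimates:   ", var_n.mean())   # about (N-1)/N * 4 = 3.2
print("mean of /(N-1) estimates:", var_n1.mean()) # about 4.0
```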

What are eigenvectors and eigenvalues?

Eigenvectors and eigenvalues have many important applications in computer vision and machine learning in general. Well-known examples are PCA (Principal Component Analysis) for dimensionality reduction and EigenFaces for face recognition. An interesting use of eigenvectors and eigenvalues is also illustrated in my post about error ellipses. Furthermore,...
Continue reading »
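To make the defining property concrete before diving into the article, here is a tiny NumPy sketch (the example matrix is arbitrary): an eigenvector v of A satisfies A v = λ v, meaning the transformation only scales it, never rotates it.

```python
import numpy as np

# A symmetric matrix, as encountered in PCA (covariance matrices are
# symmetric, so their eigenvectors are mutually orthogonal).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eigh(A)

# Verify the defining property A v = lambda v for each pair.
for lam, v in zip(eigvals, eigvecs.T):
    print("lambda =", lam)
    print("A @ v  =", A @ v)
    print("lam*v  =", lam * v)
```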