Comments on: A geometric interpretation of the covariance matrix https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/ A blog about intelligent algorithms, machine learning, computer vision, datamining and more. Fri, 21 Jul 2017 05:50:18 +0000 hourly 1 https://wordpress.org/?v=3.8.39 By: NA https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-398 Mon, 12 Jun 2017 03:04:46 +0000 http://www.visiondummy.com/?p=440#comment-398 I really liked this explanation, thanks.

By: Kamesh https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-395 Thu, 08 Jun 2017 21:32:10 +0000 http://www.visiondummy.com/?p=440#comment-395 Great write-up and explanation.

Consider \Sigma = [2 0.1; 0.1 3]. If you perform an eigenvalue-eigenvector decomposition, i.e. \Sigma = V L V^T, the V obtained is not necessarily a rotation matrix. It is orthogonal, but not a rotation matrix. I think Niranjan Kotha sees the same issue. The problem is that the factorization doesn’t always yield a rotation matrix (orthogonal, yes, but not necessarily special orthogonal: the determinant may be -1 rather than +1).
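This observation can be checked numerically. Below is a minimal NumPy sketch using the matrix from the comment; whether the returned V has determinant +1 or -1 depends on the library's sign convention, and flipping the sign of one eigenvector column is one standard way to force a proper rotation without changing the decomposition:

```python
import numpy as np

Sigma = np.array([[2.0, 0.1],
                  [0.1, 3.0]])

# Eigendecomposition of a symmetric matrix: Sigma = V @ diag(L) @ V.T
L, V = np.linalg.eigh(Sigma)

# V is orthogonal, so det(V) is +1 or -1; only +1 is a rotation.
if np.linalg.det(V) < 0:
    # Negating one column keeps v v^T (and hence the decomposition)
    # unchanged, but flips the determinant to +1.
    V[:, 0] = -V[:, 0]

assert np.isclose(np.linalg.det(V), 1.0)
assert np.allclose(V @ np.diag(L) @ V.T, Sigma)
```

The fix works because an eigenvector is only defined up to sign: replacing v by -v leaves v v^T, and therefore V diag(L) V^T, unchanged.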

By: Kevin https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-374 Sun, 23 Apr 2017 23:27:46 +0000 http://www.visiondummy.com/?p=440#comment-374 very instructive!! thank you-

By: renzocoppola https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-366 Sun, 02 Apr 2017 07:09:19 +0000 http://www.visiondummy.com/?p=440#comment-366 Really cool. I thought I would never see the connection between these matrices and the transformations in the CMA-ES algorithm.

By: simon https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-355 Fri, 10 Mar 2017 05:11:08 +0000 http://www.visiondummy.com/?p=440#comment-355 Do you know of any mathematics book where I can find a rigorous dissertation about this?

By: Manoj Gupta https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-351 Mon, 06 Mar 2017 09:41:52 +0000 http://www.visiondummy.com/?p=440#comment-351 Very good explanation, and a worthwhile read.
But I have a doubt: why does an eigenvector point in a single direction even though the spread is in both directions?
It’s true that the two opposite directions cancel out and we are left with zero…
Where am I going wrong geometrically?
Second:
Why don’t we get a complex conjugate pair of eigenvectors when we rotate the white data by a rotation matrix?

By: harmyder https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-342 Tue, 17 Jan 2017 08:30:29 +0000 http://www.visiondummy.com/?p=440#comment-342 A better way to put it is that it is simply the variance of the projected data. So that is a mistake: it should be variance, not covariance.

By: Dave https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-330 Fri, 25 Nov 2016 19:42:52 +0000 http://www.visiondummy.com/?p=440#comment-330 Awesome article! Really helped me to understand this eigenvalue/eigenvector stuff :)..thanks!!!

By: humansofportsmouth https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-325 Sun, 13 Nov 2016 15:29:43 +0000 http://www.visiondummy.com/?p=440#comment-325 Thank you a lot.. I was trying to implement my MCMC code using a proposal covariance matrix, and thanks to your method everything is clear to me now :)

By: Duong Tuan Nguyen https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-317 Mon, 03 Oct 2016 03:36:35 +0000 http://www.visiondummy.com/?p=440#comment-317 Very useful article. However, I don’t understand why \vec{v}^{\intercal} \Sigma \vec{v} is the variance of the projected data. Can anyone explain it to me?

Thanks in advance!
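The identity follows from linearity of expectation: for zero-mean data, \text{Var}(\vec{v}^{\intercal}\vec{x}) = E[(\vec{v}^{\intercal}\vec{x})^2] = \vec{v}^{\intercal} E[\vec{x}\vec{x}^{\intercal}] \vec{v} = \vec{v}^{\intercal}\Sigma\vec{v}. A minimal numerical sketch (the random data and the particular unit vector are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up correlated 2-D data, centered to zero mean.
X = rng.standard_normal((10000, 2)) @ np.array([[2.0, 0.5],
                                                [0.0, 1.0]])
X -= X.mean(axis=0)

# Sample covariance matrix of the data.
Sigma = X.T @ X / (X.shape[0] - 1)

v = np.array([0.6, 0.8])      # arbitrary unit vector (0.36 + 0.64 = 1)
projected = X @ v             # 1-D data projected onto v

# Variance of the projection equals v^T Sigma v.
assert np.isclose(projected.var(ddof=1), v @ Sigma @ v)
```

With the data centered and `ddof=1` used consistently, the two quantities agree exactly, not just approximately, since both reduce to the same sum of squared projections.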
