Comments for Computer vision for dummies (https://www.visiondummy.com), a blog about intelligent algorithms, machine learning, computer vision, data mining and more.

Comment on "What are eigenvectors and eigenvalues?" by zh, Fri, 21 Jul 2017 (https://www.visiondummy.com/2014/03/eigenvalues-eigenvectors/#comment-416):
Great work!

Comment on "The Curse of Dimensionality in classification" by Sara, Mon, 03 Jul 2017 (https://www.visiondummy.com/2014/04/curse-dimensionality-affect-classification/#comment-411):
Hi, I am a student following a data analytics course module at my university. I was so confused by all the theories and the awful explanations from the lecturer. Thank God I found this article!

Comment on "Why divide the sample variance by N-1?" by Vincent Spruyt, Wed, 21 Jun 2017 (https://www.visiondummy.com/2014/03/divide-variance-n-1/#comment-406):
Hi Michael, it's an honor, great work! Thanks a lot!

Comment on "Why divide the sample variance by N-1?" by Michael Cheng, Tue, 20 Jun 2017 (https://www.visiondummy.com/2014/03/divide-variance-n-1/#comment-405):
Hi there, I like your essay very much. I hope you do not mind that I have translated your article into Chinese.
Here is the link: http://commanber.com/2017/06/17/sample-variance/
I will remove my article immediately if you do not allow me to release the Chinese version.
Thanks very much! Have a good day!

Comment on "The Curse of Dimensionality in classification" by Kevin George, Thu, 15 Jun 2017 (https://www.visiondummy.com/2014/04/curse-dimensionality-affect-classification/#comment-402):
Hi Vincent, first of all I would like to thank you for the wonderful blog post and the work you are doing on visiondummy.

I have one doubt about equation 2. Shouldn't there be only one distance from a sample point to a centroid? How can there be two distances, i.e. minimum and maximum distances?

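If equation 2 is the usual distance-concentration ratio (dist_max - dist_min) / dist_min, then the minimum and maximum are taken over all sample points, not for a single point: they are the distances from the centroid to the nearest and the farthest sample. A minimal numpy sketch of how that ratio shrinks as dimensionality grows (illustrative only, not code from the article):

import numpy as np

# As dimensionality grows, the nearest and farthest samples end up
# almost equally far from the centroid, so the ratio tends to zero.
rng = np.random.default_rng(0)
n_samples = 1000
for d in (2, 10, 100, 1000):
    points = rng.uniform(0.0, 1.0, size=(n_samples, d))
    centroid = points.mean(axis=0)
    dists = np.linalg.norm(points - centroid, axis=1)
    print(d, (dists.max() - dists.min()) / dists.min())
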
Comment on "Feature extraction using PCA" by NA, Mon, 12 Jun 2017 (https://www.visiondummy.com/2014/05/feature-extraction-using-pca/#comment-400):
Please write articles describing how LDA, SVD, and ICA work. I like how you are able to take subjects that are very complicated and teach them in a simple manner, at a level I can understand.

Comment on "A geometric interpretation of the covariance matrix" by NA, Mon, 12 Jun 2017 (https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-398):
I really liked this explanation, thanks.

Comment on "How to draw a covariance error ellipse?" by Rick Sprague, Mon, 12 Jun 2017 (https://www.visiondummy.com/2014/04/draw-error-ellipse-representing-covariance-matrix/#comment-397):
My half-minor axis is a negative value, and the square root returns a "nan".

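The half-axis lengths of the error ellipse have the form sqrt(s * lambda_i), where the lambda_i are the eigenvalues of the covariance matrix and s is a chi-square quantile (about 5.991 for a 95% ellipse with two degrees of freedom). A negative value under the square root therefore means one eigenvalue is negative, i.e. the matrix is not positive semi-definite and so not a valid covariance matrix, which usually points to a data-entry or estimation error. A hedged numpy sketch of the check (function name hypothetical, not the article's code):

import numpy as np

def ellipse_half_axes(cov, chisq_val=5.991):  # 95% quantile, 2 degrees of freedom
    # eigh is the right routine for symmetric matrices
    eigvals, eigvecs = np.linalg.eigh(cov)
    if np.any(eigvals < 0):
        raise ValueError("matrix is not positive semi-definite; "
                         "check how the covariance was estimated")
    return np.sqrt(chisq_val * eigvals), eigvecs

half_axes, directions = ellipse_half_axes(np.array([[2.0, 0.1],
                                                    [0.1, 3.0]]))
print(half_axes)
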
Comment on "A geometric interpretation of the covariance matrix" by Kamesh, Thu, 08 Jun 2017 (https://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/#comment-395):
Great write-up and explanation.

Consider \Sigma = [2 0.1; 0.1 3]. If you perform an eigenvalue-eigenvector decomposition, i.e. \Sigma = V L V^T, the V obtained is not necessarily a rotation matrix: it is orthogonal, but not special orthogonal (its determinant can be -1). I think Niranjan Kotha sees the same issue. The problem is that the factorization doesn't always yield a rotation matrix.

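Eigenvectors are only defined up to sign, so a decomposition routine is free to return an orthogonal V with determinant -1 (a reflection). Flipping the sign of one eigenvector column restores determinant +1 without changing V L V^T. A minimal numpy sketch (illustrative, not the article's code):

import numpy as np

sigma = np.array([[2.0, 0.1],
                  [0.1, 3.0]])
eigvals, V = np.linalg.eigh(sigma)
if np.linalg.det(V) < 0:
    V[:, 0] = -V[:, 0]  # sign flip leaves V @ diag(eigvals) @ V.T unchanged
assert np.linalg.det(V) > 0
assert np.allclose(V @ np.diag(eigvals) @ V.T, sigma)
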
Comment on "What are eigenvectors and eigenvalues?" by Swee Mok, Tue, 06 Jun 2017 (https://www.visiondummy.com/2014/03/eigenvalues-eigenvectors/#comment-393):
Great explanation.

It looks like there is a typo in the second line of the derivation of equation (6): the \lambda^2 term should have a positive sign.

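For reference, the characteristic polynomial of a general 2x2 matrix does carry a positive \lambda^2 term, whatever the specific matrix used in the article's equation (6):

\det\begin{pmatrix} a-\lambda & b \\ c & d-\lambda \end{pmatrix}
  = (a-\lambda)(d-\lambda) - bc
  = \lambda^2 - (a+d)\lambda + (ad - bc) = 0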