Kunyu's Blog
Archive
Change is neither good nor bad.
It simply is.
Tags: All (5) · Machine Learning (3) · SVD (3) · Bias–variance Tradeoff (2) · Linear Regression (2) · Statistical Learning (2) · PCA (1) · Regularization (1)
2021
Answered on Zhihu: Why is the denominator of the sample variance n − 1?
Translated from my earlier post: Estimate Population Variance: should we divide by n - 1 or n
2020
Estimate Population Variance: should we divide by n - 1 or n
Understand why we use (n − 1) in sample variance, and why dividing by n still gives us a good estimator
Principal Component Analysis (PCA)
Understand PCA and how we do it via EVD and SVD, and why the SVD implementation is better
Multicollinearity and Ridge Regression
Understand multicollinearity and how it compromises least squares, and how ridge regression helps
2019
SVD and Underdetermined Least Squares
Understand how SVD derives a consistent expression for the least-squares weights