In this chapter, we will cover the following topics:
1. Fitting a line through data
2. Evaluating the linear regression model
3. Using ridge regression to overcome linear regression's shortfalls
4. Optimizing the ridge regression parameter
5. Using sparsity to regularize models
6. Taking a more fundamental approach to regularization with LARS
7. Using linear methods for classification – logistic regression
8. Directly applying Bayesian ridge regression
9. Using boosting to learn from errors
Introduction
Linear models are fundamental in statistics and machine learning. Many methods rely on a linear combination of variables to describe the relationships in the data. Quite often, considerable effort goes into finding the transformations needed so that the data can be described by a linear combination.
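As a minimal sketch of that idea, consider hypothetical data where y grows exponentially in x: a straight line fits the raw values poorly, but a log transform of y makes the relationship exactly linear, after which an ordinary least-squares fit recovers the parameters. (The data below is invented for illustration; `np.polyfit` is used here as a stand-in for any linear fitting routine.)

```python
import numpy as np

# Hypothetical noiseless data: y = 2.0 * exp(0.5 * x), which is
# not linear in x.
x = np.linspace(1.0, 10.0, 50)
y = 2.0 * np.exp(0.5 * x)

# After the transform, log(y) = log(2.0) + 0.5 * x is an exact
# linear combination, so a degree-1 least-squares fit recovers
# slope 0.5 and intercept log(2.0).
slope, intercept = np.polyfit(x, np.log(y), 1)
```

The same pattern applies to other transformations (square roots, polynomial features, and so on): the goal is always to reach a representation in which a linear model is adequate.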
In this chapter, we build up from the simplest idea of fitting a straight line through data, move on to classification, and finally arrive at Bayesian ridge regression.