Deep Learning: Simple_Regression (Simple Linear Regression)


1. The main workflow of simple linear regression

Simple linear regression is implemented mainly with the least-squares method. Training amounts to making the objective function (the loss function) shown below as small as possible, so that the fitted line deviates as little as possible from the data.
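Concretely, given training pairs (x_i, y_i), least squares looks for the line y = ax + b that minimizes the sum of squared residuals:

    J(a, b) = \sum_{i=1}^{n} \bigl(y_i - (a x_i + b)\bigr)^2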

1.2. Minimizing this loss gives closed-form values for a and b, which define the fitted line.
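Setting the partial derivatives of J(a, b) with respect to a and b to zero yields the standard closed-form solution, which is exactly what the fit code below implements:

    a = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad b = \bar{y} - a\,\bar{x}

Both sums can be written as dot products of mean-centered vectors, which is why the implementation below uses dot instead of an explicit loop.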

2. Code implementation of simple linear regression

2.1. First, prepare the two training sets x_train and y_train.
    import numpy as np

    x_train = np.array([1, 3, 5, 7, 8, 10]).astype("float")
    y_train = np.array([2, 3, 4, 6, 8, 9]).astype("float")
    # Or generate a larger synthetic training set:
    x_big = np.random.random(size=1000)                      # x training set
    y_big = x_big * 2.0 + 3.0 + np.random.normal(size=1000)  # y training set
2.2. Initialize the two parameters a and b.
    def __init__(self):
        self.a_ = None  # slope, set by fit()
        self.b_ = None  # intercept, set by fit()
2.3. Train on the training set to solve for a and b.
    def fit(self, x_train, y_train):  # training function
        self.x_mean = np.mean(x_train)
        self.y_mean = np.mean(y_train)
        # Vectorized via dot products of mean-centered vectors
        num = (x_train - self.x_mean).dot(y_train - self.y_mean)
        d = (x_train - self.x_mean).dot(x_train - self.x_mean)
        self.a_ = num / d
        self.b_ = self.y_mean - self.a_ * self.x_mean
2.4. Evaluate the prediction function.

Once a and b are known, the fitted line ŷ = a·x + b is fully determined. Given an array x_predict of inputs, the predict function evaluates the line at each point.

    def predict(self, x_predict):  # called directly; takes an array of x values
        return np.array([self._predict(x) for x in x_predict])

    def _predict(self, x):  # evaluate the fitted line at a single point
        return x * self.a_ + self.b_
2.5. Compute R_Square.

The closer R_Square is to 1, the better the fit, i.e. the smaller the gap between the predicted values and the original data.
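For reference, this is the usual coefficient of determination, where ŷ_i = a x_i + b are the predictions; the code below computes it directly:

    R^2 = 1 - \frac{\sum_{i=1}^{n} (\hat{y}_i - y_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}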

    def R_Square(self, x_train, y_train):  # return the R-squared score
        y_hat = self.predict(x_train)
        return 1 - ((y_hat - y_train).dot(y_hat - y_train)
                    / (y_train - self.y_mean).dot(y_train - self.y_mean))

3. Complete example code

    from matplotlib import pyplot as plt
    import numpy as np


    class SimpleRegression():
        def __init__(self):
            self.a_ = None  # slope, set by fit()
            self.b_ = None  # intercept, set by fit()

        def fit(self, x_train, y_train):  # training function
            self.x_mean = np.mean(x_train)
            self.y_mean = np.mean(y_train)
            # Equivalent explicit-loop version:
            # num = 0.0
            # d = 0.0
            # for x, y in zip(x_train, y_train):
            #     num += (x - self.x_mean) * (y - self.y_mean)
            #     d += (x - self.x_mean) ** 2
            num = (x_train - self.x_mean).dot(y_train - self.y_mean)  # vectorized via dot
            d = (x_train - self.x_mean).dot(x_train - self.x_mean)
            self.a_ = num / d
            self.b_ = self.y_mean - self.a_ * self.x_mean
            plt.scatter(x_train, y_train)
            plt.plot(x_train, self.a_ * x_train + self.b_)
            plt.show()

        def predict(self, x_predict):  # prediction function
            return np.array([self._predict(x) for x in x_predict])

        def _predict(self, x):
            return x * self.a_ + self.b_

        def R_Square(self, x_train, y_train):  # return the R-squared score
            y_hat = self.predict(x_train)
            return 1 - ((y_hat - y_train).dot(y_hat - y_train)
                        / (y_train - self.y_mean).dot(y_train - self.y_mean))


    if __name__ == '__main__':
        my_plt = SimpleRegression()
        # x_train = np.array([1, 3, 5, 7, 8, 10]).astype("float")
        # y_train = np.array([2, 3, 4, 6, 8, 9]).astype("float")
        # my_plt.fit(x_train, y_train)
        # reg = my_plt.predict(np.array([7, 3, 2]))
        # print(reg)
        x_big = np.random.random(size=1000)                      # x training set
        y_big = x_big * 2.0 + 3.0 + np.random.normal(size=1000)  # y training set
        my_plt.fit(x_big, y_big)
        print(my_plt.a_, my_plt.b_)
        print(my_plt.R_Square(x_big, y_big))  # score the fit on the training sets
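As a quick sanity check (not part of the original walkthrough), the same fit can be reproduced with scikit-learn's LinearRegression; a minimal sketch, assuming scikit-learn is installed:

    from sklearn.linear_model import LinearRegression
    import numpy as np

    x_big = np.random.random(size=1000)
    y_big = x_big * 2.0 + 3.0 + np.random.normal(size=1000)

    reg = LinearRegression()
    reg.fit(x_big.reshape(-1, 1), y_big)  # sklearn expects a 2-D feature matrix
    print(reg.coef_[0], reg.intercept_)   # should be close to 2.0 and 3.0
    print(reg.score(x_big.reshape(-1, 1), y_big))  # R^2, same definition as R_Square

Since LinearRegression solves the same least-squares problem, its coefficients and R² score should closely match a_, b_, and R_Square above.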