A Keras-based regression model for fitting a linear equation


0. Complete code

The code below does two things: 1. it uses the Keras library to build a regression model that fits a linear equation; 2. it compares the performance of four optimizers (Adam, SGD, Adagrad, Adadelta).

from keras.models import Sequential
from keras.layers import Dense
import numpy as np
from keras import optimizers

if __name__ == '__main__':
    # Ground-truth line: y = 2.5 * x + 1.5
    w = 2.5
    b = 1.5
    X = np.linspace(2, 100, 50)
    Y = X * w + b
    print('X[:5]:', X[:5])
    print('Y[:5]:', Y[:5])
    # Four optimizers to compare, each with a hand-picked learning rate
    adam = optimizers.Adam(lr=0.02)
    sgd = optimizers.SGD(lr=0.0002)
    adagrad = optimizers.Adagrad(lr=0.3)
    adadelta = optimizers.Adadelta(lr=0.3)
    optimizer_list = [adam, sgd, adagrad, adadelta]
    epochs_list = [100, 200, 500, 1000]
    for epochs in epochs_list:
        for optimizer in optimizer_list:
            # One Dense unit with a single input: y_hat = w * x + b
            model = Sequential()
            model.add(Dense(input_dim=1, units=1))
            model.compile(loss='mse', optimizer=optimizer)
            model.fit(X, Y, steps_per_epoch=10, epochs=epochs, verbose=False)
            # Read the trained weight and bias back out of the layer
            trained_w = model.layers[0].get_weights()[0][0][0]
            trained_b = model.layers[0].get_weights()[1][0]
            w_error = abs(trained_w - w)
            b_error = abs(trained_b - b)
            print('epochs:%d, optimizer:%s,\t w error:%.4f, b error:%.4f'
                  % (epochs, optimizer.__class__, w_error, b_error))

Running the code above produces the following output:

 X[:5]: [ 2.  4.  6.  8. 10.]
 Y[:5]: [ 6.5 11.5 16.5 21.5 26.5]
 epochs:100, optimizer:<class 'keras.optimizers.Adam'>,   w error:0.0083, b error:0.5539
 epochs:100, optimizer:<class 'keras.optimizers.SGD'>,    w error:0.0195, b error:1.3155
 epochs:100, optimizer:<class 'keras.optimizers.Adagrad'>,    w error:0.0297, b error:1.9919
 epochs:100, optimizer:<class 'keras.optimizers.Adadelta'>,   w error:0.4450, b error:0.9875
 epochs:200, optimizer:<class 'keras.optimizers.Adam'>,   w error:0.0032, b error:0.2133
 epochs:200, optimizer:<class 'keras.optimizers.SGD'>,    w error:0.0181, b error:1.2160
 epochs:200, optimizer:<class 'keras.optimizers.Adagrad'>,    w error:0.0046, b error:0.3051
 epochs:200, optimizer:<class 'keras.optimizers.Adadelta'>,   w error:0.3739, b error:0.3786
 epochs:500, optimizer:<class 'keras.optimizers.Adam'>,   w error:0.0000, b error:0.0000
 epochs:500, optimizer:<class 'keras.optimizers.SGD'>,    w error:0.0135, b error:0.9093
 epochs:500, optimizer:<class 'keras.optimizers.Adagrad'>,    w error:0.0050, b error:0.3327
 epochs:500, optimizer:<class 'keras.optimizers.Adadelta'>,   w error:0.0027, b error:0.0172
 epochs:1000, optimizer:<class 'keras.optimizers.Adam'>,  w error:0.0000, b error:0.0000
 epochs:1000, optimizer:<class 'keras.optimizers.SGD'>,   w error:0.0083, b error:0.5563
 epochs:1000, optimizer:<class 'keras.optimizers.Adagrad'>,   w error:0.0141, b error:0.9425
 epochs:1000, optimizer:<class 'keras.optimizers.Adadelta'>,  w error:0.0101, b error:0.4870
 

From these results: with 100 epochs, Adam performs best and SGD is second best; with 200 epochs, Adam performs best and Adagrad is second best; with 500 epochs, Adam performs best and Adadelta is second best; with 1000 epochs, Adam performs best.
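The script above compares optimizers only through the recovered w and b. A minimal sketch of how the training loss itself could also be compared, assuming the same keras 2.x API as above; `batch_size` is used here in place of `steps_per_epoch`, and only two of the optimizers are shown:

from keras.models import Sequential
from keras.layers import Dense
from keras import optimizers
import numpy as np

# Same data as above: y = 2.5 * x + 1.5
X = np.linspace(2, 100, 50)
Y = X * 2.5 + 1.5

# Record the final training mse for each optimizer
for opt in [optimizers.Adam(lr=0.02), optimizers.SGD(lr=0.0002)]:
    model = Sequential()
    model.add(Dense(input_dim=1, units=1))
    model.compile(loss='mse', optimizer=opt)
    history = model.fit(X, Y, batch_size=10, epochs=200, verbose=False)
    print(opt.__class__.__name__, 'final mse:', history.history['loss'][-1])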

1. Conclusion

For a regression model that fits a linear equation, the Adam optimizer produces a good fit.
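The script above uses the standalone keras 2.x API. A minimal sketch of the same fit written against the tf.keras API (assuming TensorFlow 2.x, where the learning-rate argument is named `learning_rate` rather than `lr`) might look like this:

import numpy as np
import tensorflow as tf

# Ground-truth line: y = 2.5 * x + 1.5
X = np.linspace(2, 100, 50).reshape(-1, 1)
Y = 2.5 * X + 1.5

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(units=1),
])
model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam(learning_rate=0.02))
model.fit(X, Y, batch_size=10, epochs=500, verbose=0)

# Recover the fitted weight and bias from the single Dense layer
trained_w, trained_b = model.layers[0].get_weights()
print('w:', trained_w[0][0], 'b:', trained_b[0])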
