[HIT Version] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Log 6)

2020-05-27 18:10:19

This article introduces a dynamic ReLU activation function proposed by a team at Harbin Institute of Technology (HIT): the Adaptively Parametric ReLU (APReLU). It was originally applied to vibration-signal-based fault diagnosis and lets every sample have its own unique ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Continuing from the previous post:

[HIT Version] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Log 5)

This post continues tuning the hyperparameters to test the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset.

The basic principle of APReLU is shown in the figure below:

Figure: Adaptively Parametric ReLU (APReLU), a dynamic ReLU activation function
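
In equation form (this simply restates the aprelu function in the code below; here $W_1$ and $W_2$ stand for the two Dense layers with bias terms omitted for brevity, GAP for global average pooling, BN for batch normalization, and $\sigma$ for the sigmoid):

$$y = \max(x, 0) + \alpha \odot \min(x, 0)$$

$$\alpha = \sigma\Big(\mathrm{BN}\big(W_2\,\mathrm{ReLU}\big(\mathrm{BN}\big(W_1\,[\mathrm{GAP}(\min(x,0));\,\mathrm{GAP}(\max(x,0))]\big)\big)\big)\Big)$$

The coefficients $\alpha$ are computed per channel from global statistics of the feature map and broadcast over the spatial dimensions, which is how every input sample ends up with its own negative-slope parameters.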

First, earlier tuning runs showed that the loss drops sharply when the learning rate falls from 0.1 to 0.01 and again from 0.01 to 0.001. Those runs stopped once the learning rate reached 0.001, which raises the question: if the learning rate keeps decreasing, will the loss keep falling too?

Second, with the APReLU activation function the deep residual network has a more complex structure and is harder to train, so it may need more training epochs.

Therefore, this experiment restores the training length to 1000 epochs and sets the learning rate to 0.1, 0.01, 0.001, and 0.0001 for epochs 1-300, 301-600, 601-900, and 901-1000, respectively, as sketched below.
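
Spelled out as a standalone mapping (a minimal sketch with a hypothetical helper name; the script below realizes the same schedule multiplicatively inside a LearningRateScheduler callback):

def lr_for_epoch(epoch):
    # Piecewise-constant learning rate; epochs counted from 1
    if epoch <= 300:
        return 0.1
    elif epoch <= 600:
        return 0.01
    elif epoch <= 900:
        return 0.001
    else:
        return 0.0001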

Also, using an APReLU right before the final global average pooling seems to hurt training, because APReLU contains a sigmoid function. The APReLU before global average pooling is therefore replaced with a plain ReLU. (For the APReLUs inside the residual blocks, the extra training difficulty they introduce should be tolerable thanks to the identity shortcut paths.)
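
One way to see the sigmoid argument (a standard property of the sigmoid, not a claim from the paper): its derivative is bounded by 1/4 and vanishes in saturation,

$$\sigma'(x) = \sigma(x)\big(1 - \sigma(x)\big) \le \tfrac{1}{4}$$

so gradients passing through it are attenuated. Inside a residual block the identity shortcut bypasses this attenuation, but an APReLU placed right before global average pooling would have no such bypass.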

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Normalize the data: scale to [0,1], then subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiply by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric ReLU: scale the negative part and add the positive part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding (concatenate a zero feature map) to match channel counts
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization()(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
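
A note for reproduction: the script targets TensorFlow 1.x with Keras 2.2.1, as stated in its docstring. On newer Keras versions (roughly 2.4 and later; this is an assumption about your environment), fit_generator is deprecated and model.fit accepts the generator directly:

# Equivalent call on newer Keras versions (assumption: Keras >= 2.4)
model.fit(datagen.flow(x_train, y_train, batch_size=100),
          validation_data=(x_test, y_test), epochs=1000,
          verbose=1, callbacks=[reduce_lr])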

The experimental results are as follows:

Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/1000
90s 179ms/step - loss: 2.6847 - acc: 0.4191 - val_loss: 2.2382 - val_acc: 0.5544
Epoch 2/1000
62s 125ms/step - loss: 2.1556 - acc: 0.5605 - val_loss: 1.8942 - val_acc: 0.6254
Epoch 3/1000
63s 125ms/step - loss: 1.8590 - acc: 0.6206 - val_loss: 1.6930 - val_acc: 0.6629
Epoch 4/1000
62s 125ms/step - loss: 1.6407 - acc: 0.6615 - val_loss: 1.4932 - val_acc: 0.6958
Epoch 5/1000
62s 125ms/step - loss: 1.4706 - acc: 0.6923 - val_loss: 1.3326 - val_acc: 0.7317
Epoch 6/1000
62s 125ms/step - loss: 1.3352 - acc: 0.7167 - val_loss: 1.2327 - val_acc: 0.7465
Epoch 7/1000
62s 125ms/step - loss: 1.2271 - acc: 0.7365 - val_loss: 1.1326 - val_acc: 0.7583
Epoch 8/1000
62s 125ms/step - loss: 1.1426 - acc: 0.7512 - val_loss: 1.0737 - val_acc: 0.7718
Epoch 9/1000
62s 125ms/step - loss: 1.0724 - acc: 0.7643 - val_loss: 1.0268 - val_acc: 0.7720
Epoch 10/1000
62s 125ms/step - loss: 1.0256 - acc: 0.7687 - val_loss: 0.9672 - val_acc: 0.7842
Epoch 11/1000
62s 125ms/step - loss: 0.9772 - acc: 0.7766 - val_loss: 0.9104 - val_acc: 0.8032
Epoch 12/1000
63s 125ms/step - loss: 0.9385 - acc: 0.7839 - val_loss: 0.8971 - val_acc: 0.8017
Epoch 13/1000
62s 125ms/step - loss: 0.9109 - acc: 0.7910 - val_loss: 0.8675 - val_acc: 0.8073
Epoch 14/1000
62s 125ms/step - loss: 0.8799 - acc: 0.7961 - val_loss: 0.8410 - val_acc: 0.8118
Epoch 15/1000
62s 125ms/step - loss: 0.8680 - acc: 0.7975 - val_loss: 0.8337 - val_acc: 0.8106
Epoch 16/1000
62s 125ms/step - loss: 0.8426 - acc: 0.8045 - val_loss: 0.7960 - val_acc: 0.8194
Epoch 17/1000
62s 124ms/step - loss: 0.8230 - acc: 0.8088 - val_loss: 0.8293 - val_acc: 0.8065
Epoch 18/1000
62s 125ms/step - loss: 0.8143 - acc: 0.8094 - val_loss: 0.7952 - val_acc: 0.8215
Epoch 19/1000
62s 125ms/step - loss: 0.7971 - acc: 0.8148 - val_loss: 0.7876 - val_acc: 0.8169
Epoch 20/1000
62s 125ms/step - loss: 0.7856 - acc: 0.8204 - val_loss: 0.7765 - val_acc: 0.8247
Epoch 21/1000
62s 124ms/step - loss: 0.7774 - acc: 0.8191 - val_loss: 0.7441 - val_acc: 0.8361
Epoch 22/1000
62s 125ms/step - loss: 0.7718 - acc: 0.8247 - val_loss: 0.7552 - val_acc: 0.8325
Epoch 23/1000
62s 125ms/step - loss: 0.7674 - acc: 0.8272 - val_loss: 0.7786 - val_acc: 0.8241
Epoch 24/1000
62s 125ms/step - loss: 0.7582 - acc: 0.8271 - val_loss: 0.7566 - val_acc: 0.8282
Epoch 25/1000
62s 125ms/step - loss: 0.7448 - acc: 0.8315 - val_loss: 0.7507 - val_acc: 0.8336
Epoch 26/1000
62s 125ms/step - loss: 0.7459 - acc: 0.8336 - val_loss: 0.7725 - val_acc: 0.8217
Epoch 27/1000
62s 125ms/step - loss: 0.7418 - acc: 0.8340 - val_loss: 0.7581 - val_acc: 0.8335
Epoch 28/1000
62s 124ms/step - loss: 0.7335 - acc: 0.8354 - val_loss: 0.7402 - val_acc: 0.8360
Epoch 29/1000
62s 125ms/step - loss: 0.7332 - acc: 0.8372 - val_loss: 0.7429 - val_acc: 0.8394
Epoch 30/1000
62s 125ms/step - loss: 0.7243 - acc: 0.8405 - val_loss: 0.7322 - val_acc: 0.8393
Epoch 31/1000
62s 124ms/step - loss: 0.7227 - acc: 0.8422 - val_loss: 0.7098 - val_acc: 0.8468
Epoch 32/1000
62s 125ms/step - loss: 0.7189 - acc: 0.8392 - val_loss: 0.7359 - val_acc: 0.8396
Epoch 33/1000
62s 125ms/step - loss: 0.7144 - acc: 0.8455 - val_loss: 0.7071 - val_acc: 0.8442
Epoch 34/1000
62s 125ms/step - loss: 0.7111 - acc: 0.8460 - val_loss: 0.7401 - val_acc: 0.8404
Epoch 35/1000
62s 125ms/step - loss: 0.7061 - acc: 0.8480 - val_loss: 0.7155 - val_acc: 0.8497
Epoch 36/1000
62s 124ms/step - loss: 0.7072 - acc: 0.8488 - val_loss: 0.7355 - val_acc: 0.8430
Epoch 37/1000
62s 125ms/step - loss: 0.7077 - acc: 0.8496 - val_loss: 0.7167 - val_acc: 0.8521
Epoch 38/1000
62s 125ms/step - loss: 0.6971 - acc: 0.8518 - val_loss: 0.7595 - val_acc: 0.8315
Epoch 39/1000
62s 125ms/step - loss: 0.6971 - acc: 0.8508 - val_loss: 0.7278 - val_acc: 0.8423
Epoch 40/1000
62s 125ms/step - loss: 0.6923 - acc: 0.8553 - val_loss: 0.7252 - val_acc: 0.8452
Epoch 41/1000
62s 125ms/step - loss: 0.6935 - acc: 0.8538 - val_loss: 0.7169 - val_acc: 0.8461
Epoch 42/1000
62s 125ms/step - loss: 0.6902 - acc: 0.8560 - val_loss: 0.7214 - val_acc: 0.8500
Epoch 43/1000
62s 125ms/step - loss: 0.6874 - acc: 0.8576 - val_loss: 0.7078 - val_acc: 0.8492
Epoch 44/1000
62s 125ms/step - loss: 0.6869 - acc: 0.8585 - val_loss: 0.7122 - val_acc: 0.8526
Epoch 45/1000
62s 124ms/step - loss: 0.6830 - acc: 0.8587 - val_loss: 0.7509 - val_acc: 0.8411
Epoch 46/1000
62s 124ms/step - loss: 0.6867 - acc: 0.8583 - val_loss: 0.7015 - val_acc: 0.8555
Epoch 47/1000
62s 124ms/step - loss: 0.6795 - acc: 0.8614 - val_loss: 0.7051 - val_acc: 0.8529
Epoch 48/1000
62s 125ms/step - loss: 0.6790 - acc: 0.8597 - val_loss: 0.7037 - val_acc: 0.8524
Epoch 49/1000
62s 125ms/step - loss: 0.6790 - acc: 0.8612 - val_loss: 0.7121 - val_acc: 0.8526
Epoch 50/1000
62s 125ms/step - loss: 0.6713 - acc: 0.8638 - val_loss: 0.7031 - val_acc: 0.8556
Epoch 51/1000
62s 125ms/step - loss: 0.6655 - acc: 0.8658 - val_loss: 0.6827 - val_acc: 0.8617
Epoch 52/1000
62s 124ms/step - loss: 0.6725 - acc: 0.8649 - val_loss: 0.7000 - val_acc: 0.8566
Epoch 53/1000
62s 125ms/step - loss: 0.6669 - acc: 0.8677 - val_loss: 0.7089 - val_acc: 0.8599
Epoch 54/1000
62s 125ms/step - loss: 0.6654 - acc: 0.8652 - val_loss: 0.6769 - val_acc: 0.8662
Epoch 55/1000
62s 125ms/step - loss: 0.6674 - acc: 0.8668 - val_loss: 0.7016 - val_acc: 0.8570
Epoch 56/1000
62s 124ms/step - loss: 0.6670 - acc: 0.8670 - val_loss: 0.6838 - val_acc: 0.8647
Epoch 57/1000
62s 125ms/step - loss: 0.6667 - acc: 0.8672 - val_loss: 0.7112 - val_acc: 0.8595
Epoch 58/1000
62s 125ms/step - loss: 0.6629 - acc: 0.8688 - val_loss: 0.7012 - val_acc: 0.8587
Epoch 59/1000
62s 125ms/step - loss: 0.6649 - acc: 0.8678 - val_loss: 0.6854 - val_acc: 0.8656
Epoch 60/1000
62s 125ms/step - loss: 0.6592 - acc: 0.8699 - val_loss: 0.6989 - val_acc: 0.8614
Epoch 61/1000
62s 125ms/step - loss: 0.6591 - acc: 0.8696 - val_loss: 0.6978 - val_acc: 0.8603
Epoch 62/1000
62s 124ms/step - loss: 0.6589 - acc: 0.8711 - val_loss: 0.6866 - val_acc: 0.8626
Epoch 63/1000
62s 124ms/step - loss: 0.6516 - acc: 0.8736 - val_loss: 0.6845 - val_acc: 0.8612
Epoch 64/1000
62s 125ms/step - loss: 0.6520 - acc: 0.8743 - val_loss: 0.7003 - val_acc: 0.8597
Epoch 65/1000
62s 125ms/step - loss: 0.6544 - acc: 0.8736 - val_loss: 0.6992 - val_acc: 0.8593
Epoch 66/1000
62s 125ms/step - loss: 0.6529 - acc: 0.8735 - val_loss: 0.6723 - val_acc: 0.8708
Epoch 67/1000
62s 125ms/step - loss: 0.6534 - acc: 0.8740 - val_loss: 0.6958 - val_acc: 0.8610
Epoch 68/1000
62s 124ms/step - loss: 0.6468 - acc: 0.8737 - val_loss: 0.6829 - val_acc: 0.8640
Epoch 69/1000
62s 125ms/step - loss: 0.6463 - acc: 0.8760 - val_loss: 0.7142 - val_acc: 0.8552
Epoch 70/1000
62s 125ms/step - loss: 0.6461 - acc: 0.8764 - val_loss: 0.6814 - val_acc: 0.8661
Epoch 71/1000
62s 125ms/step - loss: 0.6459 - acc: 0.8764 - val_loss: 0.6884 - val_acc: 0.8656
Epoch 72/1000
62s 125ms/step - loss: 0.6430 - acc: 0.8768 - val_loss: 0.6644 - val_acc: 0.8760
Epoch 73/1000
62s 125ms/step - loss: 0.6406 - acc: 0.8774 - val_loss: 0.6803 - val_acc: 0.8710
Epoch 74/1000
62s 125ms/step - loss: 0.6395 - acc: 0.8781 - val_loss: 0.6845 - val_acc: 0.8665
Epoch 75/1000
62s 125ms/step - loss: 0.6413 - acc: 0.8773 - val_loss: 0.7124 - val_acc: 0.8560
Epoch 76/1000
62s 125ms/step - loss: 0.6383 - acc: 0.8804 - val_loss: 0.7164 - val_acc: 0.8554
Epoch 77/1000
62s 125ms/step - loss: 0.6385 - acc: 0.8806 - val_loss: 0.6843 - val_acc: 0.8661
Epoch 78/1000
62s 124ms/step - loss: 0.6349 - acc: 0.8830 - val_loss: 0.7035 - val_acc: 0.8599
Epoch 79/1000
62s 124ms/step - loss: 0.6330 - acc: 0.8818 - val_loss: 0.6983 - val_acc: 0.8591
Epoch 80/1000
62s 125ms/step - loss: 0.6348 - acc: 0.8810 - val_loss: 0.6886 - val_acc: 0.8626
Epoch 81/1000
62s 125ms/step - loss: 0.6323 - acc: 0.8817 - val_loss: 0.6763 - val_acc: 0.8680
Epoch 82/1000
62s 125ms/step - loss: 0.6320 - acc: 0.8825 - val_loss: 0.6560 - val_acc: 0.8758
Epoch 83/1000
62s 125ms/step - loss: 0.6327 - acc: 0.8820 - val_loss: 0.6592 - val_acc: 0.8779
Epoch 84/1000
62s 124ms/step - loss: 0.6296 - acc: 0.8813 - val_loss: 0.6822 - val_acc: 0.8690
Epoch 85/1000
62s 125ms/step - loss: 0.6310 - acc: 0.8810 - val_loss: 0.6825 - val_acc: 0.8703
Epoch 86/1000
62s 125ms/step - loss: 0.6331 - acc: 0.8832 - val_loss: 0.6891 - val_acc: 0.8665
Epoch 87/1000
62s 125ms/step - loss: 0.6330 - acc: 0.8818 - val_loss: 0.6806 - val_acc: 0.8704
Epoch 88/1000
62s 125ms/step - loss: 0.6274 - acc: 0.8841 - val_loss: 0.6832 - val_acc: 0.8681
Epoch 89/1000
62s 125ms/step - loss: 0.6313 - acc: 0.8821 - val_loss: 0.6796 - val_acc: 0.8694
Epoch 90/1000
62s 125ms/step - loss: 0.6258 - acc: 0.8854 - val_loss: 0.6600 - val_acc: 0.8772
Epoch 91/1000
62s 125ms/step - loss: 0.6270 - acc: 0.8841 - val_loss: 0.6670 - val_acc: 0.8758
Epoch 92/1000
62s 125ms/step - loss: 0.6281 - acc: 0.8824 - val_loss: 0.6881 - val_acc: 0.8710
Epoch 93/1000
62s 124ms/step - loss: 0.6265 - acc: 0.8847 - val_loss: 0.6886 - val_acc: 0.8698
Epoch 94/1000
62s 125ms/step - loss: 0.6214 - acc: 0.8871 - val_loss: 0.6896 - val_acc: 0.8640
Epoch 95/1000
62s 125ms/step - loss: 0.6241 - acc: 0.8860 - val_loss: 0.6674 - val_acc: 0.8721
Epoch 96/1000
62s 125ms/step - loss: 0.6252 - acc: 0.8844 - val_loss: 0.6571 - val_acc: 0.8791
Epoch 97/1000
62s 125ms/step - loss: 0.6227 - acc: 0.8856 - val_loss: 0.6486 - val_acc: 0.8797
Epoch 98/1000
62s 125ms/step - loss: 0.6178 - acc: 0.8866 - val_loss: 0.6849 - val_acc: 0.8717
Epoch 99/1000
62s 125ms/step - loss: 0.6162 - acc: 0.8881 - val_loss: 0.6726 - val_acc: 0.8709
Epoch 100/1000
62s 124ms/step - loss: 0.6209 - acc: 0.8861 - val_loss: 0.6682 - val_acc: 0.8732
Epoch 101/1000
62s 125ms/step - loss: 0.6190 - acc: 0.8883 - val_loss: 0.6810 - val_acc: 0.8723
Epoch 102/1000
62s 125ms/step - loss: 0.6181 - acc: 0.8872 - val_loss: 0.6678 - val_acc: 0.8745
Epoch 103/1000
63s 125ms/step - loss: 0.6163 - acc: 0.8883 - val_loss: 0.6870 - val_acc: 0.8704
Epoch 104/1000
62s 125ms/step - loss: 0.6105 - acc: 0.8910 - val_loss: 0.6576 - val_acc: 0.8775
Epoch 105/1000
62s 125ms/step - loss: 0.6120 - acc: 0.8902 - val_loss: 0.6571 - val_acc: 0.8800
Epoch 106/1000
62s 125ms/step - loss: 0.6146 - acc: 0.8882 - val_loss: 0.6560 - val_acc: 0.8772
Epoch 107/1000
62s 125ms/step - loss: 0.6186 - acc: 0.8870 - val_loss: 0.6773 - val_acc: 0.8720
Epoch 108/1000
62s 125ms/step - loss: 0.6189 - acc: 0.8879 - val_loss: 0.6503 - val_acc: 0.8846
Epoch 109/1000
62s 125ms/step - loss: 0.6110 - acc: 0.8896 - val_loss: 0.6625 - val_acc: 0.8782
Epoch 110/1000
62s 125ms/step - loss: 0.6185 - acc: 0.8862 - val_loss: 0.6735 - val_acc: 0.8712
Epoch 111/1000
62s 125ms/step - loss: 0.6101 - acc: 0.8900 - val_loss: 0.6510 - val_acc: 0.8809
Epoch 112/1000
62s 125ms/step - loss: 0.6132 - acc: 0.8897 - val_loss: 0.6817 - val_acc: 0.8703
Epoch 113/1000
62s 125ms/step - loss: 0.6049 - acc: 0.8941 - val_loss: 0.6524 - val_acc: 0.8776
Epoch 114/1000
62s 125ms/step - loss: 0.6129 - acc: 0.8884 - val_loss: 0.6532 - val_acc: 0.8778
Epoch 115/1000
62s 125ms/step - loss: 0.6077 - acc: 0.8906 - val_loss: 0.6650 - val_acc: 0.8771
Epoch 116/1000
62s 124ms/step - loss: 0.6079 - acc: 0.8915 - val_loss: 0.6643 - val_acc: 0.8759
Epoch 117/1000
62s 125ms/step - loss: 0.6102 - acc: 0.8903 - val_loss: 0.6661 - val_acc: 0.8757
Epoch 118/1000
62s 124ms/step - loss: 0.6103 - acc: 0.8909 - val_loss: 0.6641 - val_acc: 0.8748
Epoch 119/1000
62s 125ms/step - loss: 0.6081 - acc: 0.8908 - val_loss: 0.6744 - val_acc: 0.8718
Epoch 120/1000
62s 125ms/step - loss: 0.6060 - acc: 0.8931 - val_loss: 0.6355 - val_acc: 0.8881
...
Epoch 280/1000
62s 124ms/step - loss: 0.5694 - acc: 0.9060 - val_loss: 0.6318 - val_acc: 0.8881
Epoch 281/1000
62s 125ms/step - loss: 0.5601 - acc: 0.9100 - val_loss: 0.6203 - val_acc: 0.8932
Epoch 282/1000
62s 125ms/step - loss: 0.5631 - acc: 0.9071 - val_loss: 0.6395 - val_acc: 0.8856
Epoch 283/1000
62s 125ms/step - loss: 0.5646 - acc: 0.9088 - val_loss: 0.6373 - val_acc: 0.8895
Epoch 284/1000
62s 124ms/step - loss: 0.5605 - acc: 0.9083 - val_loss: 0.6456 - val_acc: 0.8836
Epoch 285/1000
62s 124ms/step - loss: 0.5618 - acc: 0.9094 - val_loss: 0.6225 - val_acc: 0.8900
Epoch 286/1000
62s 124ms/step - loss: 0.5683 - acc: 0.9061 - val_loss: 0.6444 - val_acc: 0.8853
Epoch 287/1000
62s 124ms/step - loss: 0.5661 - acc: 0.9075 - val_loss: 0.6479 - val_acc: 0.8834
Epoch 288/1000
62s 125ms/step - loss: 0.5622 - acc: 0.9099 - val_loss: 0.6137 - val_acc: 0.8955
Epoch 289/1000
62s 125ms/step - loss: 0.5630 - acc: 0.9075 - val_loss: 0.6212 - val_acc: 0.8944
Epoch 290/1000
62s 125ms/step - loss: 0.5621 - acc: 0.9084 - val_loss: 0.6434 - val_acc: 0.8861
Epoch 291/1000
62s 125ms/step - loss: 0.5656 - acc: 0.9087 - val_loss: 0.6248 - val_acc: 0.8911
Epoch 292/1000
62s 124ms/step - loss: 0.5625 - acc: 0.9085 - val_loss: 0.6322 - val_acc: 0.8902
Epoch 293/1000
62s 125ms/step - loss: 0.5637 - acc: 0.9094 - val_loss: 0.6321 - val_acc: 0.8867
Epoch 294/1000
62s 125ms/step - loss: 0.5668 - acc: 0.9070 - val_loss: 0.6236 - val_acc: 0.8887
Epoch 295/1000
62s 125ms/step - loss: 0.5622 - acc: 0.9091 - val_loss: 0.6359 - val_acc: 0.8880
Epoch 296/1000
62s 125ms/step - loss: 0.5614 - acc: 0.9094 - val_loss: 0.6290 - val_acc: 0.8901
Epoch 297/1000
62s 125ms/step - loss: 0.5610 - acc: 0.9092 - val_loss: 0.6358 - val_acc: 0.8905
Epoch 298/1000
62s 125ms/step - loss: 0.5584 - acc: 0.9103 - val_loss: 0.6199 - val_acc: 0.8935
Epoch 299/1000
62s 125ms/step - loss: 0.5660 - acc: 0.9069 - val_loss: 0.6153 - val_acc: 0.8957
Epoch 300/1000
62s 124ms/step - loss: 0.5578 - acc: 0.9106 - val_loss: 0.6273 - val_acc: 0.8939
Epoch 301/1000
lr changed to 0.010000000149011612
62s 124ms/step - loss: 0.4654 - acc: 0.9431 - val_loss: 0.5402 - val_acc: 0.9195
Epoch 302/1000
63s 125ms/step - loss: 0.4160 - acc: 0.9576 - val_loss: 0.5281 - val_acc: 0.9208
Epoch 303/1000
63s 125ms/step - loss: 0.3942 - acc: 0.9640 - val_loss: 0.5227 - val_acc: 0.9234
Epoch 304/1000
62s 125ms/step - loss: 0.3791 - acc: 0.9677 - val_loss: 0.5185 - val_acc: 0.9257
Epoch 305/1000
62s 125ms/step - loss: 0.3685 - acc: 0.9689 - val_loss: 0.5151 - val_acc: 0.9273
Epoch 306/1000
62s 125ms/step - loss: 0.3548 - acc: 0.9717 - val_loss: 0.5098 - val_acc: 0.9268
Epoch 307/1000
62s 125ms/step - loss: 0.3455 - acc: 0.9737 - val_loss: 0.5064 - val_acc: 0.9260
Epoch 308/1000
62s 124ms/step - loss: 0.3382 - acc: 0.9758 - val_loss: 0.5038 - val_acc: 0.9268
Epoch 309/1000
62s 125ms/step - loss: 0.3281 - acc: 0.9766 - val_loss: 0.5063 - val_acc: 0.9248
Epoch 310/1000
62s 125ms/step - loss: 0.3208 - acc: 0.9779 - val_loss: 0.5018 - val_acc: 0.9242
Epoch 311/1000
62s 125ms/step - loss: 0.3133 - acc: 0.9792 - val_loss: 0.5024 - val_acc: 0.9248
Epoch 312/1000
62s 125ms/step - loss: 0.3078 - acc: 0.9790 - val_loss: 0.4962 - val_acc: 0.9250
Epoch 313/1000
63s 125ms/step - loss: 0.2999 - acc: 0.9810 - val_loss: 0.5008 - val_acc: 0.9234
Epoch 314/1000
62s 125ms/step - loss: 0.2930 - acc: 0.9817 - val_loss: 0.4988 - val_acc: 0.9227
Epoch 315/1000
62s 125ms/step - loss: 0.2868 - acc: 0.9824 - val_loss: 0.4896 - val_acc: 0.9221
Epoch 316/1000
62s 125ms/step - loss: 0.2815 - acc: 0.9827 - val_loss: 0.4896 - val_acc: 0.9255
Epoch 317/1000
62s 125ms/step - loss: 0.2752 - acc: 0.9834 - val_loss: 0.4882 - val_acc: 0.9233
Epoch 318/1000
62s 125ms/step - loss: 0.2719 - acc: 0.9836 - val_loss: 0.4935 - val_acc: 0.9225
Epoch 319/1000
62s 125ms/step - loss: 0.2659 - acc: 0.9839 - val_loss: 0.4843 - val_acc: 0.9230
Epoch 320/1000
62s 125ms/step - loss: 0.2607 - acc: 0.9845 - val_loss: 0.4881 - val_acc: 0.9221
Epoch 321/1000
62s 125ms/step - loss: 0.2561 - acc: 0.9850 - val_loss: 0.4871 - val_acc: 0.9200
Epoch 322/1000
62s 125ms/step - loss: 0.2543 - acc: 0.9846 - val_loss: 0.4793 - val_acc: 0.9227
Epoch 323/1000
62s 125ms/step - loss: 0.2500 - acc: 0.9852 - val_loss: 0.4661 - val_acc: 0.9221
Epoch 324/1000
62s 125ms/step - loss: 0.2459 - acc: 0.9851 - val_loss: 0.4621 - val_acc: 0.9260
Epoch 325/1000
62s 125ms/step - loss: 0.2410 - acc: 0.9855 - val_loss: 0.4690 - val_acc: 0.9236
Epoch 326/1000
62s 125ms/step - loss: 0.2352 - acc: 0.9866 - val_loss: 0.4689 - val_acc: 0.9227
Epoch 327/1000
62s 125ms/step - loss: 0.2334 - acc: 0.9860 - val_loss: 0.4711 - val_acc: 0.9205
Epoch 328/1000
62s 125ms/step - loss: 0.2296 - acc: 0.9863 - val_loss: 0.4718 - val_acc: 0.9231
Epoch 329/1000
62s 125ms/step - loss: 0.2259 - acc: 0.9869 - val_loss: 0.4648 - val_acc: 0.9212
Epoch 330/1000
62s 125ms/step - loss: 0.2211 - acc: 0.9875 - val_loss: 0.4697 - val_acc: 0.9229
Epoch 331/1000
62s 125ms/step - loss: 0.2228 - acc: 0.9861 - val_loss: 0.4697 - val_acc: 0.9200
Epoch 332/1000
62s 124ms/step - loss: 0.2175 - acc: 0.9862 - val_loss: 0.4546 - val_acc: 0.9224
Epoch 333/1000
62s 125ms/step - loss: 0.2143 - acc: 0.9872 - val_loss: 0.4580 - val_acc: 0.9229
Epoch 334/1000
62s 124ms/step - loss: 0.2107 - acc: 0.9878 - val_loss: 0.4492 - val_acc: 0.9197
Epoch 335/1000
62s 125ms/step - loss: 0.2080 - acc: 0.9875 - val_loss: 0.4626 - val_acc: 0.9184
Epoch 336/1000
62s 125ms/step - loss: 0.2066 - acc: 0.9870 - val_loss: 0.4614 - val_acc: 0.9180
Epoch 337/1000
62s 125ms/step - loss: 0.2045 - acc: 0.9871 - val_loss: 0.4447 - val_acc: 0.9210
Epoch 338/1000
62s 125ms/step - loss: 0.2001 - acc: 0.9874 - val_loss: 0.4554 - val_acc: 0.9207
Epoch 339/1000
62s 125ms/step - loss: 0.1991 - acc: 0.9877 - val_loss: 0.4527 - val_acc: 0.9206
Epoch 340/1000
62s 125ms/step - loss: 0.1958 - acc: 0.9878 - val_loss: 0.4630 - val_acc: 0.9157
Epoch 341/1000
62s 125ms/step - loss: 0.1957 - acc: 0.9868 - val_loss: 0.4447 - val_acc: 0.9225
Epoch 342/1000
62s 125ms/step - loss: 0.1939 - acc: 0.9870 - val_loss: 0.4558 - val_acc: 0.9160
Epoch 343/1000
63s 125ms/step - loss: 0.1921 - acc: 0.9866 - val_loss: 0.4451 - val_acc: 0.9195
Epoch 344/1000
62s 125ms/step - loss: 0.1929 - acc: 0.9860 - val_loss: 0.4431 - val_acc: 0.9213
Epoch 345/1000
62s 125ms/step - loss: 0.1889 - acc: 0.9864 - val_loss: 0.4386 - val_acc: 0.9213
Epoch 346/1000
62s 125ms/step - loss: 0.1865 - acc: 0.9869 - val_loss: 0.4504 - val_acc: 0.9167
Epoch 347/1000
62s 125ms/step - loss: 0.1847 - acc: 0.9870 - val_loss: 0.4285 - val_acc: 0.9196
Epoch 348/1000
62s 125ms/step - loss: 0.1836 - acc: 0.9865 - val_loss: 0.4252 - val_acc: 0.9220
Epoch 349/1000
62s 124ms/step - loss: 0.1827 - acc: 0.9864 - val_loss: 0.4364 - val_acc: 0.9205
Epoch 350/1000
62s 125ms/step - loss: 0.1800 - acc: 0.9870 - val_loss: 0.4379 - val_acc: 0.9214
...
Epoch 560/1000
62s 124ms/step - loss: 0.1442 - acc: 0.9865 - val_loss: 0.3911 - val_acc: 0.9212
Epoch 561/1000
62s 124ms/step - loss: 0.1459 - acc: 0.9852 - val_loss: 0.3984 - val_acc: 0.9185
Epoch 562/1000
62s 124ms/step - loss: 0.1473 - acc: 0.9851 - val_loss: 0.4080 - val_acc: 0.9196
Epoch 563/1000
62s 124ms/step - loss: 0.1465 - acc: 0.9860 - val_loss: 0.4058 - val_acc: 0.9166
Epoch 564/1000
62s 124ms/step - loss: 0.1423 - acc: 0.9870 - val_loss: 0.4046 - val_acc: 0.9180
Epoch 565/1000
62s 124ms/step - loss: 0.1486 - acc: 0.9851 - val_loss: 0.4022 - val_acc: 0.9184
Epoch 566/1000
62s 124ms/step - loss: 0.1478 - acc: 0.9853 - val_loss: 0.3896 - val_acc: 0.9224
Epoch 567/1000
62s 124ms/step - loss: 0.1470 - acc: 0.9850 - val_loss: 0.4141 - val_acc: 0.9151
Epoch 568/1000
62s 124ms/step - loss: 0.1438 - acc: 0.9862 - val_loss: 0.4139 - val_acc: 0.9197
Epoch 569/1000
62s 125ms/step - loss: 0.1470 - acc: 0.9851 - val_loss: 0.4143 - val_acc: 0.9156
Epoch 570/1000
62s 125ms/step - loss: 0.1484 - acc: 0.9845 - val_loss: 0.4151 - val_acc: 0.9148
Epoch 571/1000
62s 125ms/step - loss: 0.1479 - acc: 0.9849 - val_loss: 0.4206 - val_acc: 0.9136
Epoch 572/1000
62s 124ms/step - loss: 0.1458 - acc: 0.9855 - val_loss: 0.4172 - val_acc: 0.9147
Epoch 573/1000
62s 124ms/step - loss: 0.1450 - acc: 0.9860 - val_loss: 0.4267 - val_acc: 0.9156
Epoch 574/1000
62s 124ms/step - loss: 0.1514 - acc: 0.9834 - val_loss: 0.4357 - val_acc: 0.9127
Epoch 575/1000
62s 124ms/step - loss: 0.1475 - acc: 0.9851 - val_loss: 0.4212 - val_acc: 0.9142
Epoch 576/1000
62s 125ms/step - loss: 0.1464 - acc: 0.9858 - val_loss: 0.4141 - val_acc: 0.9162
Epoch 577/1000
62s 125ms/step - loss: 0.1478 - acc: 0.9846 - val_loss: 0.4065 - val_acc: 0.9151
Epoch 578/1000
62s 125ms/step - loss: 0.1418 - acc: 0.9868 - val_loss: 0.4090 - val_acc: 0.9145
Epoch 579/1000
62s 125ms/step - loss: 0.1456 - acc: 0.9852 - val_loss: 0.4350 - val_acc: 0.9101
Epoch 580/1000
62s 125ms/step - loss: 0.1422 - acc: 0.9870 - val_loss: 0.4116 - val_acc: 0.9185
Epoch 581/1000
62s 125ms/step - loss: 0.1449 - acc: 0.9858 - val_loss: 0.4245 - val_acc: 0.9127
Epoch 582/1000
62s 125ms/step - loss: 0.1429 - acc: 0.9863 - val_loss: 0.4157 - val_acc: 0.9163
Epoch 583/1000
62s 125ms/step - loss: 0.1473 - acc: 0.9851 - val_loss: 0.4094 - val_acc: 0.9165
Epoch 584/1000
62s 125ms/step - loss: 0.1500 - acc: 0.9845 - val_loss: 0.4269 - val_acc: 0.9115
Epoch 585/1000
62s 125ms/step - loss: 0.1450 - acc: 0.9860 - val_loss: 0.4189 - val_acc: 0.9165
Epoch 586/1000
62s 125ms/step - loss: 0.1450 - acc: 0.9859 - val_loss: 0.4153 - val_acc: 0.9153
Epoch 587/1000
62s 125ms/step - loss: 0.1453 - acc: 0.9859 - val_loss: 0.4166 - val_acc: 0.9155
Epoch 588/1000
62s 125ms/step - loss: 0.1409 - acc: 0.9875 - val_loss: 0.4088 - val_acc: 0.9193
Epoch 589/1000
62s 125ms/step - loss: 0.1455 - acc: 0.9854 - val_loss: 0.4220 - val_acc: 0.9149
Epoch 590/1000
62s 125ms/step - loss: 0.1466 - acc: 0.9848 - val_loss: 0.4264 - val_acc: 0.9136
Epoch 591/1000
62s 125ms/step - loss: 0.1424 - acc: 0.9868 - val_loss: 0.4212 - val_acc: 0.9178
Epoch 592/1000
62s 125ms/step - loss: 0.1441 - acc: 0.9862 - val_loss: 0.4271 - val_acc: 0.9127
Epoch 593/1000
62s 124ms/step - loss: 0.1469 - acc: 0.9852 - val_loss: 0.4247 - val_acc: 0.9170
Epoch 594/1000
62s 125ms/step - loss: 0.1468 - acc: 0.9845 - val_loss: 0.4080 - val_acc: 0.9192
Epoch 595/1000
62s 125ms/step - loss: 0.1437 - acc: 0.9857 - val_loss: 0.4111 - val_acc: 0.9174
Epoch 596/1000
62s 125ms/step - loss: 0.1451 - acc: 0.9852 - val_loss: 0.4290 - val_acc: 0.9124
Epoch 597/1000
62s 124ms/step - loss: 0.1465 - acc: 0.9856 - val_loss: 0.4203 - val_acc: 0.9167
Epoch 598/1000
62s 125ms/step - loss: 0.1451 - acc: 0.9855 - val_loss: 0.4203 - val_acc: 0.9136
Epoch 599/1000
62s 125ms/step - loss: 0.1460 - acc: 0.9857 - val_loss: 0.4248 - val_acc: 0.9161
Epoch 600/1000
62s 124ms/step - loss: 0.1466 - acc: 0.9856 - val_loss: 0.4286 - val_acc: 0.9143
Epoch 601/1000
lr changed to 0.0009999999776482583
62s 125ms/step - loss: 0.1318 - acc: 0.9907 - val_loss: 0.3912 - val_acc: 0.9255
Epoch 602/1000
62s 124ms/step - loss: 0.1212 - acc: 0.9945 - val_loss: 0.3822 - val_acc: 0.9269
Epoch 603/1000
62s 125ms/step - loss: 0.1176 - acc: 0.9960 - val_loss: 0.3786 - val_acc: 0.9289
Epoch 604/1000
62s 125ms/step - loss: 0.1168 - acc: 0.9959 - val_loss: 0.3779 - val_acc: 0.9286
Epoch 605/1000
62s 125ms/step - loss: 0.1146 - acc: 0.9965 - val_loss: 0.3782 - val_acc: 0.9295
Epoch 606/1000
62s 125ms/step - loss: 0.1130 - acc: 0.9973 - val_loss: 0.3791 - val_acc: 0.9294
Epoch 607/1000
62s 125ms/step - loss: 0.1127 - acc: 0.9974 - val_loss: 0.3780 - val_acc: 0.9301
Epoch 608/1000
62s 125ms/step - loss: 0.1118 - acc: 0.9976 - val_loss: 0.3777 - val_acc: 0.9300
Epoch 609/1000
62s 125ms/step - loss: 0.1112 - acc: 0.9975 - val_loss: 0.3760 - val_acc: 0.9298
Epoch 610/1000
62s 125ms/step - loss: 0.1102 - acc: 0.9978 - val_loss: 0.3769 - val_acc: 0.9301
Epoch 611/1000
62s 125ms/step - loss: 0.1106 - acc: 0.9977 - val_loss: 0.3775 - val_acc: 0.9309
Epoch 612/1000
62s 125ms/step - loss: 0.1092 - acc: 0.9979 - val_loss: 0.3781 - val_acc: 0.9304
Epoch 613/1000
62s 124ms/step - loss: 0.1096 - acc: 0.9979 - val_loss: 0.3768 - val_acc: 0.9297
Epoch 614/1000
62s 125ms/step - loss: 0.1092 - acc: 0.9979 - val_loss: 0.3770 - val_acc: 0.9302
Epoch 615/1000
62s 125ms/step - loss: 0.1084 - acc: 0.9982 - val_loss: 0.3779 - val_acc: 0.9309
Epoch 616/1000
62s 125ms/step - loss: 0.1077 - acc: 0.9983 - val_loss: 0.3804 - val_acc: 0.9299
Epoch 617/1000
62s 125ms/step - loss: 0.1073 - acc: 0.9983 - val_loss: 0.3799 - val_acc: 0.9302
Epoch 618/1000
62s 125ms/step - loss: 0.1069 - acc: 0.9985 - val_loss: 0.3816 - val_acc: 0.9305
Epoch 619/1000
62s 125ms/step - loss: 0.1063 - acc: 0.9985 - val_loss: 0.3814 - val_acc: 0.9303
Epoch 620/1000
62s 125ms/step - loss: 0.1066 - acc: 0.9983 - val_loss: 0.3817 - val_acc: 0.9301
Epoch 621/1000
62s 125ms/step - loss: 0.1060 - acc: 0.9987 - val_loss: 0.3811 - val_acc: 0.9303
Epoch 622/1000
62s 124ms/step - loss: 0.1058 - acc: 0.9985 - val_loss: 0.3815 - val_acc: 0.9298
Epoch 623/1000
62s 124ms/step - loss: 0.1051 - acc: 0.9986 - val_loss: 0.3810 - val_acc: 0.9302
Epoch 624/1000
62s 124ms/step - loss: 0.1050 - acc: 0.9986 - val_loss: 0.3825 - val_acc: 0.9303
Epoch 625/1000
62s 124ms/step - loss: 0.1048 - acc: 0.9987 - val_loss: 0.3845 - val_acc: 0.9294
Epoch 626/1000
62s 125ms/step - loss: 0.1040 - acc: 0.9988 - val_loss: 0.3842 - val_acc: 0.9296
Epoch 627/1000
62s 125ms/step - loss: 0.1037 - acc: 0.9988 - val_loss: 0.3833 - val_acc: 0.9304
Epoch 628/1000
62s 124ms/step - loss: 0.1048 - acc: 0.9982 - val_loss: 0.3844 - val_acc: 0.9303
Epoch 629/1000
62s 124ms/step - loss: 0.1045 - acc: 0.9984 - val_loss: 0.3829 - val_acc: 0.9289
Epoch 630/1000
62s 125ms/step - loss: 0.1032 - acc: 0.9988 - val_loss: 0.3823 - val_acc: 0.9302
Epoch 631/1000
62s 125ms/step - loss: 0.1034 - acc: 0.9987 - val_loss: 0.3809 - val_acc: 0.9314
Epoch 632/1000
62s 125ms/step - loss: 0.1029 - acc: 0.9987 - val_loss: 0.3812 - val_acc: 0.9309
Epoch 633/1000
62s 125ms/step - loss: 0.1023 - acc: 0.9990 - val_loss: 0.3815 - val_acc: 0.9303
Epoch 634/1000
62s 124ms/step - loss: 0.1025 - acc: 0.9987 - val_loss: 0.3854 - val_acc: 0.9303
Epoch 635/1000
62s 124ms/step - loss: 0.1022 - acc: 0.9988 - val_loss: 0.3849 - val_acc: 0.9305
Epoch 636/1000
62s 124ms/step - loss: 0.1015 - acc: 0.9989 - val_loss: 0.3840 - val_acc: 0.9312
Epoch 637/1000
62s 124ms/step - loss: 0.1012 - acc: 0.9991 - val_loss: 0.3831 - val_acc: 0.9308
Epoch 638/1000
62s 125ms/step - loss: 0.1012 - acc: 0.9990 - val_loss: 0.3830 - val_acc: 0.9315
Epoch 639/1000
62s 124ms/step - loss: 0.1012 - acc: 0.9989 - val_loss: 0.3826 - val_acc: 0.9309
Epoch 640/1000
62s 125ms/step - loss: 0.1006 - acc: 0.9990 - val_loss: 0.3838 - val_acc: 0.9309
...
Epoch 890/1000
62s 124ms/step - loss: 0.0611 - acc: 0.9997 - val_loss: 0.3721 - val_acc: 0.9315
Epoch 891/1000
62s 124ms/step - loss: 0.0606 - acc: 0.9998 - val_loss: 0.3726 - val_acc: 0.9324
Epoch 892/1000
62s 123ms/step - loss: 0.0606 - acc: 0.9997 - val_loss: 0.3737 - val_acc: 0.9321
Epoch 893/1000
62s 123ms/step - loss: 0.0607 - acc: 0.9996 - val_loss: 0.3709 - val_acc: 0.9325
Epoch 894/1000
62s 123ms/step - loss: 0.0602 - acc: 0.9999 - val_loss: 0.3701 - val_acc: 0.9325
Epoch 895/1000
62s 124ms/step - loss: 0.0604 - acc: 0.9997 - val_loss: 0.3670 - val_acc: 0.9327
Epoch 896/1000
62s 124ms/step - loss: 0.0606 - acc: 0.9995 - val_loss: 0.3646 - val_acc: 0.9325
Epoch 897/1000
62s 124ms/step - loss: 0.0603 - acc: 0.9997 - val_loss: 0.3693 - val_acc: 0.9315
Epoch 898/1000
62s 123ms/step - loss: 0.0602 - acc: 0.9996 - val_loss: 0.3705 - val_acc: 0.9312
Epoch 899/1000
62s 124ms/step - loss: 0.0599 - acc: 0.9997 - val_loss: 0.3697 - val_acc: 0.9309
Epoch 900/1000
62s 123ms/step - loss: 0.0600 - acc: 0.9997 - val_loss: 0.3694 - val_acc: 0.9313
Epoch 901/1000
lr changed to 9.999999310821295e-05
62s 123ms/step - loss: 0.0597 - acc: 0.9998 - val_loss: 0.3694 - val_acc: 0.9313
Epoch 902/1000
62s 124ms/step - loss: 0.0595 - acc: 0.9998 - val_loss: 0.3685 - val_acc: 0.9316
Epoch 903/1000
62s 124ms/step - loss: 0.0597 - acc: 0.9998 - val_loss: 0.3685 - val_acc: 0.9314
Epoch 904/1000
62s 124ms/step - loss: 0.0599 - acc: 0.9997 - val_loss: 0.3686 - val_acc: 0.9316
Epoch 905/1000
62s 124ms/step - loss: 0.0598 - acc: 0.9997 - val_loss: 0.3684 - val_acc: 0.9316
...
Epoch 967/1000
62s 124ms/step - loss: 0.0590 - acc: 0.9996 - val_loss: 0.3649 - val_acc: 0.9328
Epoch 968/1000
62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9328
Epoch 969/1000
62s 124ms/step - loss: 0.0589 - acc: 0.9998 - val_loss: 0.3647 - val_acc: 0.9325
Epoch 970/1000
62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9325
Epoch 971/1000
62s 123ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9328
Epoch 972/1000
62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9331
Epoch 973/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9331
Epoch 974/1000
62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9326
Epoch 975/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9322
Epoch 976/1000
62s 124ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 977/1000
62s 124ms/step - loss: 0.0591 - acc: 0.9996 - val_loss: 0.3646 - val_acc: 0.9323
Epoch 978/1000
62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 979/1000
62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9320
Epoch 980/1000
62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9319
Epoch 981/1000
62s 123ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9317
Epoch 982/1000
62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3650 - val_acc: 0.9317
Epoch 983/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9323
Epoch 984/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3651 - val_acc: 0.9322
Epoch 985/1000
62s 124ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3651 - val_acc: 0.9321
Epoch 986/1000
62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9323
Epoch 987/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3644 - val_acc: 0.9318
Epoch 988/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9322
Epoch 989/1000
62s 124ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9322
Epoch 990/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9319
Epoch 991/1000
62s 124ms/step - loss: 0.0584 - acc: 0.9998 - val_loss: 0.3647 - val_acc: 0.9323
Epoch 992/1000
62s 124ms/step - loss: 0.0585 - acc: 0.9997 - val_loss: 0.3647 - val_acc: 0.9320
Epoch 993/1000
62s 124ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9318
Epoch 994/1000
62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3650 - val_acc: 0.9320
Epoch 995/1000
62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3650 - val_acc: 0.9315
Epoch 996/1000
62s 123ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9319
Epoch 997/1000
62s 123ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9318
Epoch 998/1000
62s 124ms/step - loss: 0.0584 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9320
Epoch 999/1000
62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3646 - val_acc: 0.9320
Epoch 1000/1000
62s 124ms/step - loss: 0.0581 - acc: 0.9999 - val_loss: 0.3646 - val_acc: 0.9323
Train loss: 0.062079589650034905
Train accuracy: 0.9986200013160705
Test loss: 0.3645842906832695
Test accuracy: 0.9323000019788742

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530
