Get the "Applied Data Science Edge"!

The ViralML School

Fundamental Market Analysis with Python - Find Your Own Answers On What Is Going on in the Financial Markets

Web Work

Python Web Work - Prototyping Guide for Maker

Use HTML5 Templates, Serve Dynamic Content, Build Machine Learning Web Apps, Grow Audiences & Conquer the World!

Hot off the Press!

The Little Book of Fundamental Market Indicators

My New Book: "The Little Book of Fundamental Analysis: Hands-On Market Analysis with Python" is Out!

Using Autoencoders and Keras to Better Understand your Customers

Introduction

An autoencoder is a great tool to dig deep into your data. If you are unsure of what to focus on or you want to look at the bigger picture, an unsupervised or semi-supervised model might be able to give you fresh new insights and new areas to investigate. It is a great compliment to a traditional supervised model.



If you liked it, please share it:

Code

AutoEncoder Good vs Bad Credit Risk - German Customers
In [1]:
from IPython.display import Image
Image(filename='autoencoders-keras.png', width='80%')
Out[1]:

Using Autoencoders to Better Understand your Customers

Measuring Customer Credit Risk with AutoEncoders and Keras 2.0

Let's Look at a Simple Credit-Risk Example and Unearth New Patterns, Anomalies, and Actionable Insights

How to use unsupervised learning to get actionable insights

An autoencoder is a great tool to dig deep into your data. If you are unsure of what to focus on or you want to look at the bigger picture, a unsupervised or semi-surpervised model might be able to give you fresh new insights and new areas to invstetigate. It is a great compliment to a traditional supervised model.

In this walk-through, we'll see how we can apply autoencoders to customer's seeking loans and flag any abnormal behavior. We will use as our Autoencoder Keras code base a great walk-through from Chitta Ranjan "Extreme Rare Event Classification using Autoencoders in Keras" but we will take it further and reach actionable insights as we all know, there isn't much data science with actionable insights.

The Open Source Statlog (German Credit Data) Data Set from UC Irvine Machine Learning Repository

"This dataset classifies people described by a set of attributes as good or bad credit risks. Comes in two formats (one all numeric). Also comes with a cost matrix"

https://archive.ics.uci.edu/ml/datasets/statlog+(german+credit+data)

The Autoencoder and anomaly detection

Autoencoding mostly aims at reducing feature space in order to distill the essential aspects of the data versus more conventional deeplearning which blows up the feature space to capture non-linearities and subtle interactions within the data. Autoencoding can also be seen as a non-linear alternative to PCA. It is similar to what we use in image, music, and file compression. We compress the excess until the data is to distorted to be of any value.

Anomaly detection (or outlier detection) is the identification of items, events or observations which do not conform to an expected pattern or other items in a dataset - Wikipedia.com

Anomaly Detection is a big scientific domain, and with such big domains, come many associated techniques and tools. The autoencoder is one of those tools and the subject of this walk-through.

Finding actionable insights

At the end of the day, actionable insights is what data science is all about. If a person cannot apply the findings and predictions from a data science project, it is of little use besides as an academic exercise. What we are doing here, is usng Chitta's approach of training an autoencoder model using only a subset of the data - in this case only customer with good credit profiles. This engages the encoder to learn what a "good" customer looks like by reducing that data to its core essentials. Once the model is trained, we take out-of-sample data and have the model predict (i.e. compress then reconstruct the data) and compare the error reconstruction value. The smaller the error, the closer the customer is to the "good" profile, the larger the error, the more anomalous it is.

Investigative work

The out-of-sample customers with high reconstruction errors are to be scrutenized closely to understand why they are different from the typical "good" customer.

In [2]:
#https://towardsdatascience.com/extreme-rare-event-classification-using-autoencoders-in-keras-a565b386f098
%matplotlib inline
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model, load_model
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.callbacks import ModelCheckpoint, TensorBoard
from tensorflow.keras import regularizers
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, precision_recall_curve
from sklearn.metrics import recall_score, classification_report, auc, roc_curve, accuracy_score
from sklearn.metrics import precision_recall_fscore_support, f1_score
from pandas.api.types import is_numeric_dtype
import urllib

Download data from UC Irvine Machine Learning Repository

https://archive.ics.uci.edu/ml/datasets/statlog+(german+credit+data)

In [3]:
url="http://archive.ics.uci.edu/ml/machine-learning-databases/statlog/german/german.data"
raw_data = urllib.request.urlopen(url)
credit=pd.DataFrame(raw_data)
data = pd.read_csv(url, delimiter=' ', header=None)
data.columns = ['HasChecking', 'DurationInMonths', 'CreditHistory', 'CreditPurpose', 'CreditAmount', 
         'SavingsAccount', 'EmployedSince', 'InstallmentRatePercentIncome', 'StatusGender', 
        'OtherDebtorsGuarantors', 'ResidenceSince', 'Property', 'Age', 'OtherInstallmentPlans', 'Housing', 
         'NumberExistingCredits', 'Job', 'FamilyLiablities', 'HasPhone', 'ForeignWorker', 'CreditRisk']
data.head()
Out[3]:
HasChecking DurationInMonths CreditHistory CreditPurpose CreditAmount SavingsAccount EmployedSince InstallmentRatePercentIncome StatusGender OtherDebtorsGuarantors ... Property Age OtherInstallmentPlans Housing NumberExistingCredits Job FamilyLiablities HasPhone ForeignWorker CreditRisk
0 A11 6 A34 A43 1169 A65 A75 4 A93 A101 ... A121 67 A143 A152 2 A173 1 A192 A201 1
1 A12 48 A32 A43 5951 A61 A73 2 A92 A101 ... A121 22 A143 A152 1 A173 1 A191 A201 2
2 A14 12 A34 A46 2096 A61 A74 2 A93 A101 ... A121 49 A143 A152 1 A172 2 A191 A201 1
3 A11 42 A32 A42 7882 A61 A74 2 A93 A103 ... A122 45 A143 A153 1 A173 2 A191 A201 1
4 A11 24 A33 A40 4870 A61 A73 3 A93 A101 ... A124 53 A143 A153 2 A173 2 A191 A201 2

5 rows × 21 columns

Untangle of categorical data

In [4]:
# Quick untangle of categorical data
numerical_features = [f for f in list(data) if is_numeric_dtype(data[f])]
numerical_features
Out[4]:
['DurationInMonths',
 'CreditAmount',
 'InstallmentRatePercentIncome',
 'ResidenceSince',
 'Age',
 'NumberExistingCredits',
 'FamilyLiablities',
 'CreditRisk']
In [5]:
non_numerical_features = [f for f in list(data) if f not in numerical_features]
non_numerical_features
Out[5]:
['HasChecking',
 'CreditHistory',
 'CreditPurpose',
 'SavingsAccount',
 'EmployedSince',
 'StatusGender',
 'OtherDebtorsGuarantors',
 'Property',
 'OtherInstallmentPlans',
 'Housing',
 'Job',
 'HasPhone',
 'ForeignWorker']
In [6]:
# make dummy vars out of non_numerical_features
data_ready = pd.get_dummies(data, columns=non_numerical_features, 
               drop_first=False,
               dummy_na=False)
print(data_ready.shape)
data_ready.head()
(1000, 62)
Out[6]:
DurationInMonths CreditAmount InstallmentRatePercentIncome ResidenceSince Age NumberExistingCredits FamilyLiablities CreditRisk HasChecking_A11 HasChecking_A12 ... Housing_A152 Housing_A153 Job_A171 Job_A172 Job_A173 Job_A174 HasPhone_A191 HasPhone_A192 ForeignWorker_A201 ForeignWorker_A202
0 6 1169 4 4 67 2 1 1 1 0 ... 1 0 0 0 1 0 0 1 1 0
1 48 5951 2 2 22 1 1 2 0 1 ... 1 0 0 0 1 0 1 0 1 0
2 12 2096 2 3 49 1 2 1 0 0 ... 1 0 0 1 0 0 1 0 1 0
3 42 7882 2 4 45 1 2 1 1 0 ... 0 1 0 0 1 0 1 0 1 0
4 24 4870 3 4 53 2 2 2 1 0 ... 0 1 0 0 1 0 1 0 1 0

5 rows × 62 columns

In [7]:
# from data notes we know that: (1 = Good, 2 = Bad)
data_ready[['CreditRisk','HasChecking_A11',
            'HasChecking_A12',
            'HasChecking_A13',
           'HasChecking_A14']].groupby('CreditRisk').sum()
Out[7]:
HasChecking_A11 HasChecking_A12 HasChecking_A13 HasChecking_A14
CreditRisk
1 139.0 164.0 49.0 348.0
2 135.0 105.0 14.0 46.0
In [8]:
# fix binary outcome - good versus bad credit risk
# from data notes we know that: (1 = Good, 2 = Bad) so 0 will be good credit and 1 will be bad credit
data_ready['CreditRisk'].replace([1,2], [0,1], inplace=True)
data_ready['CreditRisk'].value_counts()
Out[8]:
0    700
1    300
Name: CreditRisk, dtype: int64
In [9]:
# save data set to file to run it into FastML.io
data_ready.to_csv("german-credit-scores-ready.csv", index=None)

Prepare train/test split for autoencoder

In [10]:
print(data_ready.shape)
features = [f for f in list(data_ready) if f not in ['CreditRisk']]
print(len(features))
X_train, X_test, Y_train, Y_test = train_test_split(data_ready[features],
                                                    data_ready['CreditRisk'],
                                                    test_size=0.3, 
                                                    random_state=1)

print('Data split - Train:', len(X_train), 'Test:', len(X_test))
# check for nulls
X_train.isnull().values.sum()
(1000, 62)
61
Data split - Train: 700 Test: 300
Out[10]:
0

Create set of negative outcomes only

In [11]:
# Create set of negative outcomes only
X_train_0 = X_train.copy()
X_train_0['CreditRisk'] = Y_train
X_train_0 = X_train_0[X_train_0['CreditRisk']==0]
X_train_0 = X_train_0.drop('CreditRisk', axis=1)

X_test_0 = X_test.copy()
X_test_0['CreditRisk'] = Y_test
X_test_0 = X_test_0[X_test_0['CreditRisk']==0]
X_test_0 = X_test_0.drop('CreditRisk', axis=1)
In [12]:
# Auto encoder parameters
nb_epoch = 500
batch_size = 128
input_dim = X_train_0.shape[1]
encoding_dim = 24
hidden_dim = int(encoding_dim / 2)
learning_rate = 1e-3

# set up autoencoder layers
input_layer = Input(shape=(input_dim, ))
encoder = Dense(encoding_dim, activation="relu", activity_regularizer=regularizers.l1(learning_rate))(input_layer)
encoder = Dense(hidden_dim, activation="relu")(encoder)
decoder = Dense(hidden_dim, activation="relu")(encoder)
decoder = Dense(encoding_dim, activation="relu")(decoder)
decoder = Dense(input_dim, activation="linear")(decoder)
autoencoder = Model(inputs=input_layer, outputs=decoder)
autoencoder.summary()
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         [(None, 61)]              0         
_________________________________________________________________
dense (Dense)                (None, 24)                1488      
_________________________________________________________________
dense_1 (Dense)              (None, 12)                300       
_________________________________________________________________
dense_2 (Dense)              (None, 12)                156       
_________________________________________________________________
dense_3 (Dense)              (None, 24)                312       
_________________________________________________________________
dense_4 (Dense)              (None, 61)                1525      
=================================================================
Total params: 3,781
Trainable params: 3,781
Non-trainable params: 0
_________________________________________________________________
In [13]:
autoencoder.compile(metrics=['accuracy'],
                    loss='mean_squared_error',
                    optimizer='adam')
cp = ModelCheckpoint(filepath="autoencoder_classifier.h5",
                               save_best_only=True,
                               verbose=0)
tb = TensorBoard(log_dir='./logs',
                histogram_freq=0,
                write_graph=True,
                write_images=True)
history = autoencoder.fit(X_train_0, X_train_0,
                    epochs=nb_epoch,
                    batch_size=batch_size,
                    shuffle=True,
                    validation_data=(X_test_0, X_test_0),
                    verbose=1,
                    callbacks=[cp, tb]).history
Train on 486 samples, validate on 214 samples
Epoch 1/500
486/486 [==============================] - 1s 1ms/sample - loss: 244747.4117 - accuracy: 0.0000e+00 - val_loss: 253974.1184 - val_accuracy: 0.0000e+00
Epoch 2/500
486/486 [==============================] - 0s 115us/sample - loss: 239188.4346 - accuracy: 0.0000e+00 - val_loss: 248766.5923 - val_accuracy: 0.0000e+00
Epoch 3/500
486/486 [==============================] - 0s 108us/sample - loss: 234344.3735 - accuracy: 0.0000e+00 - val_loss: 244873.4216 - val_accuracy: 0.0000e+00
Epoch 4/500
486/486 [==============================] - 0s 100us/sample - loss: 230918.9148 - accuracy: 0.0000e+00 - val_loss: 241503.7335 - val_accuracy: 0.0000e+00
Epoch 5/500
486/486 [==============================] - 0s 97us/sample - loss: 227855.6805 - accuracy: 0.0000e+00 - val_loss: 237799.7355 - val_accuracy: 0.0514
Epoch 6/500
486/486 [==============================] - 0s 95us/sample - loss: 224124.9300 - accuracy: 0.7572 - val_loss: 234227.9474 - val_accuracy: 1.0000
Epoch 7/500
486/486 [==============================] - 0s 117us/sample - loss: 221061.6575 - accuracy: 1.0000 - val_loss: 230862.6460 - val_accuracy: 1.0000
Epoch 8/500
486/486 [==============================] - 0s 121us/sample - loss: 217688.0509 - accuracy: 1.0000 - val_loss: 226893.3088 - val_accuracy: 1.0000
Epoch 9/500
486/486 [==============================] - 0s 123us/sample - loss: 213543.6346 - accuracy: 1.0000 - val_loss: 222273.1960 - val_accuracy: 1.0000
Epoch 10/500
486/486 [==============================] - 0s 105us/sample - loss: 208771.8510 - accuracy: 1.0000 - val_loss: 216696.5336 - val_accuracy: 1.0000
Epoch 11/500
486/486 [==============================] - 0s 101us/sample - loss: 203233.6640 - accuracy: 1.0000 - val_loss: 209914.8011 - val_accuracy: 1.0000
Epoch 12/500
486/486 [==============================] - 0s 105us/sample - loss: 196265.0678 - accuracy: 1.0000 - val_loss: 201906.2097 - val_accuracy: 1.0000
Epoch 13/500
486/486 [==============================] - 0s 114us/sample - loss: 188327.8687 - accuracy: 1.0000 - val_loss: 192233.4617 - val_accuracy: 1.0000
Epoch 14/500
486/486 [==============================] - 0s 96us/sample - loss: 178786.3682 - accuracy: 1.0000 - val_loss: 180726.5117 - val_accuracy: 1.0000
Epoch 15/500
486/486 [==============================] - 0s 110us/sample - loss: 167058.9880 - accuracy: 1.0000 - val_loss: 167089.1267 - val_accuracy: 1.0000
Epoch 16/500
486/486 [==============================] - 0s 112us/sample - loss: 153439.3546 - accuracy: 1.0000 - val_loss: 150628.8289 - val_accuracy: 1.0000
Epoch 17/500
486/486 [==============================] - 0s 114us/sample - loss: 136551.2887 - accuracy: 1.0000 - val_loss: 131927.3164 - val_accuracy: 1.0000
Epoch 18/500
486/486 [==============================] - 0s 102us/sample - loss: 118326.2820 - accuracy: 1.0000 - val_loss: 111009.7158 - val_accuracy: 1.0000
Epoch 19/500
486/486 [==============================] - 0s 103us/sample - loss: 98247.7781 - accuracy: 1.0000 - val_loss: 88470.0093 - val_accuracy: 1.0000
Epoch 20/500
486/486 [==============================] - 0s 98us/sample - loss: 76514.9987 - accuracy: 1.0000 - val_loss: 65685.5165 - val_accuracy: 1.0000
Epoch 21/500
486/486 [==============================] - 0s 99us/sample - loss: 54403.4239 - accuracy: 1.0000 - val_loss: 44085.0270 - val_accuracy: 1.0000
Epoch 22/500
486/486 [==============================] - 0s 103us/sample - loss: 35223.7744 - accuracy: 1.0000 - val_loss: 25097.8242 - val_accuracy: 1.0000
Epoch 23/500
486/486 [==============================] - 0s 103us/sample - loss: 18361.2358 - accuracy: 1.0000 - val_loss: 11054.7527 - val_accuracy: 1.0000
Epoch 24/500
486/486 [==============================] - 0s 101us/sample - loss: 7338.4071 - accuracy: 1.0000 - val_loss: 2836.5093 - val_accuracy: 1.0000
Epoch 25/500
486/486 [==============================] - 0s 148us/sample - loss: 1380.0139 - accuracy: 1.0000 - val_loss: 287.3479 - val_accuracy: 1.0000
Epoch 26/500
486/486 [==============================] - 0s 38us/sample - loss: 476.2958 - accuracy: 1.0000 - val_loss: 1249.6360 - val_accuracy: 1.0000
Epoch 27/500
486/486 [==============================] - 0s 54us/sample - loss: 1725.0303 - accuracy: 1.0000 - val_loss: 2488.7248 - val_accuracy: 1.0000
Epoch 28/500
486/486 [==============================] - 0s 51us/sample - loss: 2368.2002 - accuracy: 1.0000 - val_loss: 2190.2103 - val_accuracy: 1.0000
Epoch 29/500
486/486 [==============================] - 0s 45us/sample - loss: 1754.3152 - accuracy: 1.0000 - val_loss: 1217.8686 - val_accuracy: 1.0000
Epoch 30/500
486/486 [==============================] - 0s 43us/sample - loss: 854.5082 - accuracy: 1.0000 - val_loss: 392.5995 - val_accuracy: 1.0000
Epoch 31/500
486/486 [==============================] - 0s 109us/sample - loss: 232.6467 - accuracy: 1.0000 - val_loss: 81.2493 - val_accuracy: 1.0000
Epoch 32/500
486/486 [==============================] - 0s 107us/sample - loss: 64.5736 - accuracy: 1.0000 - val_loss: 79.0374 - val_accuracy: 1.0000
Epoch 33/500
486/486 [==============================] - 0s 38us/sample - loss: 110.4166 - accuracy: 1.0000 - val_loss: 166.7161 - val_accuracy: 1.0000
Epoch 34/500
486/486 [==============================] - 0s 44us/sample - loss: 173.0434 - accuracy: 1.0000 - val_loss: 191.5789 - val_accuracy: 1.0000
Epoch 35/500
486/486 [==============================] - 0s 43us/sample - loss: 168.5858 - accuracy: 1.0000 - val_loss: 145.7843 - val_accuracy: 1.0000
Epoch 36/500
486/486 [==============================] - 0s 114us/sample - loss: 112.2683 - accuracy: 1.0000 - val_loss: 76.0257 - val_accuracy: 1.0000
Epoch 37/500
486/486 [==============================] - 0s 109us/sample - loss: 55.8100 - accuracy: 1.0000 - val_loss: 33.4435 - val_accuracy: 1.0000
Epoch 38/500
486/486 [==============================] - 0s 98us/sample - loss: 29.6674 - accuracy: 1.0000 - val_loss: 25.9507 - val_accuracy: 1.0000
Epoch 39/500
486/486 [==============================] - 0s 44us/sample - loss: 29.5931 - accuracy: 1.0000 - val_loss: 32.8762 - val_accuracy: 1.0000
Epoch 40/500
486/486 [==============================] - 0s 45us/sample - loss: 36.4343 - accuracy: 1.0000 - val_loss: 36.7774 - val_accuracy: 1.0000
Epoch 41/500
486/486 [==============================] - 0s 45us/sample - loss: 37.5134 - accuracy: 1.0000 - val_loss: 33.3732 - val_accuracy: 1.0000
Epoch 42/500
486/486 [==============================] - 0s 53us/sample - loss: 32.9260 - accuracy: 1.0000 - val_loss: 27.7309 - val_accuracy: 1.0000
Epoch 43/500
486/486 [==============================] - 0s 115us/sample - loss: 27.6261 - accuracy: 1.0000 - val_loss: 24.2405 - val_accuracy: 1.0000
Epoch 44/500
486/486 [==============================] - 0s 40us/sample - loss: 25.5530 - accuracy: 1.0000 - val_loss: 24.3688 - val_accuracy: 1.0000
Epoch 45/500
486/486 [==============================] - 0s 44us/sample - loss: 25.8363 - accuracy: 1.0000 - val_loss: 25.5442 - val_accuracy: 1.0000
Epoch 46/500
486/486 [==============================] - 0s 49us/sample - loss: 26.6065 - accuracy: 1.0000 - val_loss: 25.4562 - val_accuracy: 1.0000
Epoch 47/500
486/486 [==============================] - 0s 47us/sample - loss: 26.2834 - accuracy: 1.0000 - val_loss: 24.5825 - val_accuracy: 1.0000
Epoch 48/500
486/486 [==============================] - 0s 110us/sample - loss: 25.5789 - accuracy: 1.0000 - val_loss: 24.0790 - val_accuracy: 1.0000
Epoch 49/500
486/486 [==============================] - 0s 109us/sample - loss: 25.1499 - accuracy: 1.0000 - val_loss: 23.7746 - val_accuracy: 1.0000
Epoch 50/500
486/486 [==============================] - 0s 42us/sample - loss: 25.1061 - accuracy: 1.0000 - val_loss: 23.8283 - val_accuracy: 1.0000
Epoch 51/500
486/486 [==============================] - 0s 44us/sample - loss: 25.1831 - accuracy: 1.0000 - val_loss: 23.8134 - val_accuracy: 1.0000
Epoch 52/500
486/486 [==============================] - 0s 124us/sample - loss: 25.1478 - accuracy: 1.0000 - val_loss: 23.6620 - val_accuracy: 1.0000
Epoch 53/500
486/486 [==============================] - 0s 102us/sample - loss: 25.0185 - accuracy: 1.0000 - val_loss: 23.6154 - val_accuracy: 1.0000
Epoch 54/500
486/486 [==============================] - 0s 42us/sample - loss: 25.0779 - accuracy: 1.0000 - val_loss: 23.6887 - val_accuracy: 1.0000
Epoch 55/500
486/486 [==============================] - 0s 46us/sample - loss: 24.9258 - accuracy: 1.0000 - val_loss: 23.7485 - val_accuracy: 1.0000
Epoch 56/500
486/486 [==============================] - 0s 44us/sample - loss: 24.9555 - accuracy: 1.0000 - val_loss: 23.6559 - val_accuracy: 1.0000
Epoch 57/500
486/486 [==============================] - 0s 116us/sample - loss: 24.8631 - accuracy: 1.0000 - val_loss: 23.5793 - val_accuracy: 1.0000
Epoch 58/500
486/486 [==============================] - 0s 39us/sample - loss: 24.8429 - accuracy: 1.0000 - val_loss: 23.7517 - val_accuracy: 1.0000
Epoch 59/500
486/486 [==============================] - 0s 43us/sample - loss: 25.2281 - accuracy: 1.0000 - val_loss: 23.6835 - val_accuracy: 1.0000
Epoch 60/500
486/486 [==============================] - 0s 47us/sample - loss: 25.0608 - accuracy: 1.0000 - val_loss: 23.6990 - val_accuracy: 1.0000
Epoch 61/500
486/486 [==============================] - 0s 50us/sample - loss: 25.1129 - accuracy: 1.0000 - val_loss: 24.1105 - val_accuracy: 1.0000
Epoch 62/500
486/486 [==============================] - 0s 116us/sample - loss: 24.9244 - accuracy: 1.0000 - val_loss: 23.4623 - val_accuracy: 1.0000
Epoch 63/500
486/486 [==============================] - 0s 43us/sample - loss: 24.8105 - accuracy: 1.0000 - val_loss: 23.5127 - val_accuracy: 1.0000
Epoch 64/500
486/486 [==============================] - 0s 47us/sample - loss: 24.7006 - accuracy: 1.0000 - val_loss: 23.5584 - val_accuracy: 1.0000
Epoch 65/500
486/486 [==============================] - 0s 111us/sample - loss: 24.7103 - accuracy: 1.0000 - val_loss: 23.3841 - val_accuracy: 1.0000
Epoch 66/500
486/486 [==============================] - 0s 100us/sample - loss: 24.7027 - accuracy: 1.0000 - val_loss: 23.3591 - val_accuracy: 1.0000
Epoch 67/500
486/486 [==============================] - 0s 37us/sample - loss: 24.6442 - accuracy: 1.0000 - val_loss: 23.5340 - val_accuracy: 1.0000
Epoch 68/500
486/486 [==============================] - 0s 102us/sample - loss: 24.9565 - accuracy: 1.0000 - val_loss: 23.3418 - val_accuracy: 1.0000
Epoch 69/500
486/486 [==============================] - 0s 38us/sample - loss: 24.7676 - accuracy: 1.0000 - val_loss: 23.3622 - val_accuracy: 1.0000
Epoch 70/500
486/486 [==============================] - 0s 42us/sample - loss: 24.6236 - accuracy: 1.0000 - val_loss: 23.7142 - val_accuracy: 1.0000
Epoch 71/500
486/486 [==============================] - 0s 108us/sample - loss: 24.6611 - accuracy: 1.0000 - val_loss: 23.3071 - val_accuracy: 1.0000
Epoch 72/500
486/486 [==============================] - 0s 95us/sample - loss: 24.7179 - accuracy: 1.0000 - val_loss: 23.2201 - val_accuracy: 1.0000
Epoch 73/500
486/486 [==============================] - 0s 36us/sample - loss: 24.5565 - accuracy: 1.0000 - val_loss: 23.5932 - val_accuracy: 1.0000
Epoch 74/500
486/486 [==============================] - 0s 43us/sample - loss: 24.6085 - accuracy: 1.0000 - val_loss: 23.2734 - val_accuracy: 1.0000
Epoch 75/500
486/486 [==============================] - 0s 42us/sample - loss: 24.5606 - accuracy: 1.0000 - val_loss: 23.3659 - val_accuracy: 1.0000
Epoch 76/500
486/486 [==============================] - 0s 43us/sample - loss: 24.5140 - accuracy: 1.0000 - val_loss: 23.2677 - val_accuracy: 1.0000
Epoch 77/500
486/486 [==============================] - 0s 43us/sample - loss: 24.4722 - accuracy: 1.0000 - val_loss: 23.2384 - val_accuracy: 1.0000
Epoch 78/500
486/486 [==============================] - 0s 47us/sample - loss: 24.5221 - accuracy: 1.0000 - val_loss: 23.3221 - val_accuracy: 1.0000
Epoch 79/500
486/486 [==============================] - 0s 120us/sample - loss: 24.4196 - accuracy: 1.0000 - val_loss: 23.1190 - val_accuracy: 1.0000
Epoch 80/500
486/486 [==============================] - 0s 44us/sample - loss: 24.6408 - accuracy: 1.0000 - val_loss: 23.1513 - val_accuracy: 1.0000
Epoch 81/500
486/486 [==============================] - 0s 55us/sample - loss: 24.7101 - accuracy: 1.0000 - val_loss: 23.5560 - val_accuracy: 1.0000
Epoch 82/500
486/486 [==============================] - 0s 52us/sample - loss: 24.7452 - accuracy: 1.0000 - val_loss: 23.2533 - val_accuracy: 1.0000
Epoch 83/500
486/486 [==============================] - 0s 59us/sample - loss: 24.4723 - accuracy: 1.0000 - val_loss: 23.5363 - val_accuracy: 1.0000
Epoch 84/500
486/486 [==============================] - 0s 154us/sample - loss: 24.6869 - accuracy: 1.0000 - val_loss: 23.0386 - val_accuracy: 1.0000
Epoch 85/500
486/486 [==============================] - 0s 56us/sample - loss: 24.4915 - accuracy: 1.0000 - val_loss: 23.0581 - val_accuracy: 1.0000
Epoch 86/500
486/486 [==============================] - 0s 61us/sample - loss: 24.2847 - accuracy: 1.0000 - val_loss: 23.3878 - val_accuracy: 1.0000
Epoch 87/500
486/486 [==============================] - 0s 160us/sample - loss: 24.3564 - accuracy: 1.0000 - val_loss: 22.9500 - val_accuracy: 1.0000
Epoch 88/500
486/486 [==============================] - 0s 55us/sample - loss: 24.2306 - accuracy: 1.0000 - val_loss: 23.0569 - val_accuracy: 1.0000
Epoch 89/500
486/486 [==============================] - 0s 63us/sample - loss: 24.2272 - accuracy: 1.0000 - val_loss: 22.9728 - val_accuracy: 1.0000
Epoch 90/500
486/486 [==============================] - 0s 147us/sample - loss: 24.0800 - accuracy: 1.0000 - val_loss: 22.8654 - val_accuracy: 1.0000
Epoch 91/500
486/486 [==============================] - 0s 56us/sample - loss: 24.1064 - accuracy: 1.0000 - val_loss: 22.8986 - val_accuracy: 1.0000
Epoch 92/500
486/486 [==============================] - 0s 55us/sample - loss: 24.0637 - accuracy: 1.0000 - val_loss: 22.9063 - val_accuracy: 1.0000
Epoch 93/500
486/486 [==============================] - 0s 161us/sample - loss: 24.1343 - accuracy: 1.0000 - val_loss: 22.7990 - val_accuracy: 1.0000
Epoch 94/500
486/486 [==============================] - 0s 147us/sample - loss: 24.1664 - accuracy: 1.0000 - val_loss: 22.7829 - val_accuracy: 1.0000
Epoch 95/500
486/486 [==============================] - 0s 57us/sample - loss: 24.2081 - accuracy: 1.0000 - val_loss: 23.1522 - val_accuracy: 1.0000
Epoch 96/500
486/486 [==============================] - 0s 54us/sample - loss: 24.1566 - accuracy: 1.0000 - val_loss: 22.8048 - val_accuracy: 1.0000
Epoch 97/500
486/486 [==============================] - 0s 63us/sample - loss: 24.3904 - accuracy: 1.0000 - val_loss: 22.9191 - val_accuracy: 1.0000
Epoch 98/500
486/486 [==============================] - 0s 160us/sample - loss: 23.9058 - accuracy: 1.0000 - val_loss: 22.7694 - val_accuracy: 1.0000
Epoch 99/500
486/486 [==============================] - 0s 53us/sample - loss: 24.1562 - accuracy: 1.0000 - val_loss: 22.8875 - val_accuracy: 1.0000
Epoch 100/500
486/486 [==============================] - 0s 148us/sample - loss: 23.9802 - accuracy: 1.0000 - val_loss: 22.6405 - val_accuracy: 1.0000
Epoch 101/500
486/486 [==============================] - 0s 53us/sample - loss: 23.9980 - accuracy: 1.0000 - val_loss: 22.7006 - val_accuracy: 1.0000
Epoch 102/500
486/486 [==============================] - 0s 61us/sample - loss: 24.0067 - accuracy: 1.0000 - val_loss: 22.8127 - val_accuracy: 1.0000
Epoch 103/500
486/486 [==============================] - 0s 60us/sample - loss: 24.1000 - accuracy: 1.0000 - val_loss: 22.7620 - val_accuracy: 1.0000
Epoch 104/500
486/486 [==============================] - 0s 163us/sample - loss: 24.1177 - accuracy: 1.0000 - val_loss: 22.6284 - val_accuracy: 1.0000
Epoch 105/500
486/486 [==============================] - 0s 51us/sample - loss: 23.8043 - accuracy: 1.0000 - val_loss: 23.3841 - val_accuracy: 1.0000
Epoch 106/500
486/486 [==============================] - 0s 143us/sample - loss: 23.9940 - accuracy: 1.0000 - val_loss: 22.5727 - val_accuracy: 1.0000
Epoch 107/500
486/486 [==============================] - 0s 160us/sample - loss: 24.0319 - accuracy: 1.0000 - val_loss: 22.5135 - val_accuracy: 1.0000
Epoch 108/500
486/486 [==============================] - 0s 55us/sample - loss: 23.6404 - accuracy: 1.0000 - val_loss: 22.9543 - val_accuracy: 1.0000
Epoch 109/500
486/486 [==============================] - 0s 155us/sample - loss: 23.7837 - accuracy: 1.0000 - val_loss: 22.4497 - val_accuracy: 1.0000
Epoch 110/500
486/486 [==============================] - 0s 122us/sample - loss: 23.7463 - accuracy: 1.0000 - val_loss: 22.4376 - val_accuracy: 1.0000
Epoch 111/500
486/486 [==============================] - 0s 45us/sample - loss: 23.5988 - accuracy: 1.0000 - val_loss: 22.5767 - val_accuracy: 1.0000
Epoch 112/500
486/486 [==============================] - 0s 115us/sample - loss: 23.8050 - accuracy: 1.0000 - val_loss: 22.4331 - val_accuracy: 1.0000
Epoch 113/500
486/486 [==============================] - 0s 44us/sample - loss: 23.8988 - accuracy: 1.0000 - val_loss: 22.5401 - val_accuracy: 1.0000
Epoch 114/500
486/486 [==============================] - 0s 54us/sample - loss: 23.5363 - accuracy: 1.0000 - val_loss: 22.5186 - val_accuracy: 1.0000
Epoch 115/500
486/486 [==============================] - 0s 49us/sample - loss: 23.9342 - accuracy: 1.0000 - val_loss: 22.6657 - val_accuracy: 1.0000
Epoch 116/500
486/486 [==============================] - 0s 121us/sample - loss: 23.7822 - accuracy: 1.0000 - val_loss: 22.3406 - val_accuracy: 1.0000
Epoch 117/500
486/486 [==============================] - 0s 45us/sample - loss: 23.5619 - accuracy: 1.0000 - val_loss: 22.6574 - val_accuracy: 1.0000
Epoch 118/500
486/486 [==============================] - 0s 119us/sample - loss: 23.7134 - accuracy: 1.0000 - val_loss: 22.3186 - val_accuracy: 1.0000
Epoch 119/500
486/486 [==============================] - 0s 104us/sample - loss: 23.6031 - accuracy: 1.0000 - val_loss: 22.3061 - val_accuracy: 1.0000
Epoch 120/500
486/486 [==============================] - 0s 116us/sample - loss: 23.6062 - accuracy: 1.0000 - val_loss: 22.2714 - val_accuracy: 1.0000
Epoch 121/500
486/486 [==============================] - 0s 56us/sample - loss: 23.4790 - accuracy: 1.0000 - val_loss: 22.2848 - val_accuracy: 1.0000
Epoch 122/500
486/486 [==============================] - 0s 52us/sample - loss: 23.6660 - accuracy: 1.0000 - val_loss: 22.3629 - val_accuracy: 1.0000
Epoch 123/500
486/486 [==============================] - 0s 51us/sample - loss: 23.7014 - accuracy: 1.0000 - val_loss: 22.2876 - val_accuracy: 1.0000
Epoch 124/500
486/486 [==============================] - 0s 49us/sample - loss: 23.5113 - accuracy: 1.0000 - val_loss: 22.5954 - val_accuracy: 1.0000
Epoch 125/500
486/486 [==============================] - 0s 128us/sample - loss: 23.8901 - accuracy: 1.0000 - val_loss: 22.2047 - val_accuracy: 1.0000
Epoch 126/500
486/486 [==============================] - 0s 112us/sample - loss: 23.5534 - accuracy: 1.0000 - val_loss: 22.1839 - val_accuracy: 1.0000
Epoch 127/500
486/486 [==============================] - 0s 49us/sample - loss: 23.3731 - accuracy: 1.0000 - val_loss: 22.2270 - val_accuracy: 1.0000
Epoch 128/500
486/486 [==============================] - 0s 112us/sample - loss: 23.3232 - accuracy: 1.0000 - val_loss: 22.1516 - val_accuracy: 1.0000
Epoch 129/500
486/486 [==============================] - 0s 98us/sample - loss: 23.3328 - accuracy: 1.0000 - val_loss: 22.1417 - val_accuracy: 1.0000
Epoch 130/500
486/486 [==============================] - 0s 38us/sample - loss: 23.4664 - accuracy: 1.0000 - val_loss: 22.2290 - val_accuracy: 1.0000
Epoch 131/500
486/486 [==============================] - 0s 117us/sample - loss: 23.5393 - accuracy: 1.0000 - val_loss: 22.1106 - val_accuracy: 1.0000
Epoch 132/500
486/486 [==============================] - 0s 46us/sample - loss: 23.4888 - accuracy: 1.0000 - val_loss: 22.5057 - val_accuracy: 1.0000
Epoch 133/500
486/486 [==============================] - 0s 51us/sample - loss: 23.3559 - accuracy: 1.0000 - val_loss: 22.1423 - val_accuracy: 1.0000
Epoch 134/500
486/486 [==============================] - 0s 51us/sample - loss: 23.6186 - accuracy: 1.0000 - val_loss: 22.2716 - val_accuracy: 1.0000
Epoch 135/500
486/486 [==============================] - 0s 120us/sample - loss: 23.2513 - accuracy: 1.0000 - val_loss: 22.0569 - val_accuracy: 1.0000
Epoch 136/500
486/486 [==============================] - 0s 114us/sample - loss: 23.2482 - accuracy: 1.0000 - val_loss: 21.9687 - val_accuracy: 1.0000
Epoch 137/500
486/486 [==============================] - 0s 113us/sample - loss: 23.2566 - accuracy: 1.0000 - val_loss: 21.9553 - val_accuracy: 1.0000
Epoch 138/500
486/486 [==============================] - 0s 116us/sample - loss: 23.0587 - accuracy: 1.0000 - val_loss: 21.9206 - val_accuracy: 1.0000
Epoch 139/500
486/486 [==============================] - 0s 52us/sample - loss: 23.0650 - accuracy: 1.0000 - val_loss: 21.9908 - val_accuracy: 1.0000
Epoch 140/500
486/486 [==============================] - 0s 142us/sample - loss: 23.0825 - accuracy: 1.0000 - val_loss: 21.8459 - val_accuracy: 1.0000
Epoch 141/500
486/486 [==============================] - 0s 52us/sample - loss: 23.2834 - accuracy: 1.0000 - val_loss: 21.9337 - val_accuracy: 1.0000
Epoch 142/500
486/486 [==============================] - 0s 51us/sample - loss: 23.1927 - accuracy: 1.0000 - val_loss: 21.9001 - val_accuracy: 1.0000
Epoch 143/500
486/486 [==============================] - 0s 50us/sample - loss: 23.1042 - accuracy: 1.0000 - val_loss: 21.9176 - val_accuracy: 1.0000
Epoch 144/500
486/486 [==============================] - 0s 49us/sample - loss: 23.0247 - accuracy: 1.0000 - val_loss: 22.0265 - val_accuracy: 1.0000
Epoch 145/500
486/486 [==============================] - 0s 47us/sample - loss: 23.0102 - accuracy: 1.0000 - val_loss: 21.8545 - val_accuracy: 1.0000
Epoch 146/500
486/486 [==============================] - 0s 45us/sample - loss: 22.9487 - accuracy: 1.0000 - val_loss: 21.9043 - val_accuracy: 1.0000
Epoch 147/500
486/486 [==============================] - 0s 117us/sample - loss: 22.9646 - accuracy: 1.0000 - val_loss: 21.7566 - val_accuracy: 1.0000
Epoch 148/500
486/486 [==============================] - 0s 112us/sample - loss: 23.0706 - accuracy: 1.0000 - val_loss: 21.7184 - val_accuracy: 1.0000
Epoch 149/500
486/486 [==============================] - 0s 46us/sample - loss: 22.9316 - accuracy: 1.0000 - val_loss: 21.7754 - val_accuracy: 1.0000
Epoch 150/500
486/486 [==============================] - 0s 121us/sample - loss: 22.9490 - accuracy: 1.0000 - val_loss: 21.6801 - val_accuracy: 1.0000
Epoch 151/500
486/486 [==============================] - 0s 46us/sample - loss: 22.9278 - accuracy: 1.0000 - val_loss: 22.0064 - val_accuracy: 1.0000
Epoch 152/500
486/486 [==============================] - 0s 48us/sample - loss: 23.0354 - accuracy: 1.0000 - val_loss: 21.6979 - val_accuracy: 1.0000
Epoch 153/500
486/486 [==============================] - 0s 122us/sample - loss: 23.2189 - accuracy: 1.0000 - val_loss: 21.6392 - val_accuracy: 1.0000
Epoch 154/500
486/486 [==============================] - 0s 44us/sample - loss: 23.1077 - accuracy: 1.0000 - val_loss: 21.8571 - val_accuracy: 1.0000
Epoch 155/500
486/486 [==============================] - 0s 118us/sample - loss: 23.1456 - accuracy: 1.0000 - val_loss: 21.6109 - val_accuracy: 1.0000
Epoch 156/500
486/486 [==============================] - 0s 43us/sample - loss: 22.6906 - accuracy: 1.0000 - val_loss: 22.3255 - val_accuracy: 1.0000
Epoch 157/500
486/486 [==============================] - 0s 55us/sample - loss: 22.9995 - accuracy: 1.0000 - val_loss: 21.6393 - val_accuracy: 1.0000
Epoch 158/500
486/486 [==============================] - 0s 52us/sample - loss: 22.7031 - accuracy: 1.0000 - val_loss: 21.8058 - val_accuracy: 1.0000
Epoch 159/500
486/486 [==============================] - 0s 125us/sample - loss: 22.7688 - accuracy: 1.0000 - val_loss: 21.5246 - val_accuracy: 1.0000
Epoch 160/500
486/486 [==============================] - 0s 111us/sample - loss: 22.6298 - accuracy: 1.0000 - val_loss: 21.4692 - val_accuracy: 1.0000
Epoch 161/500
486/486 [==============================] - 0s 47us/sample - loss: 22.5591 - accuracy: 1.0000 - val_loss: 21.4919 - val_accuracy: 1.0000
Epoch 162/500
486/486 [==============================] - 0s 45us/sample - loss: 22.5397 - accuracy: 1.0000 - val_loss: 21.4809 - val_accuracy: 1.0000
Epoch 163/500
486/486 [==============================] - 0s 57us/sample - loss: 22.5670 - accuracy: 1.0000 - val_loss: 21.4838 - val_accuracy: 1.0000
Epoch 164/500
486/486 [==============================] - 0s 55us/sample - loss: 22.5586 - accuracy: 1.0000 - val_loss: 21.5641 - val_accuracy: 1.0000
Epoch 165/500
486/486 [==============================] - 0s 126us/sample - loss: 22.4849 - accuracy: 1.0000 - val_loss: 21.3539 - val_accuracy: 1.0000
Epoch 166/500
486/486 [==============================] - 0s 50us/sample - loss: 22.7630 - accuracy: 1.0000 - val_loss: 21.4625 - val_accuracy: 1.0000
Epoch 167/500
486/486 [==============================] - 0s 125us/sample - loss: 22.8059 - accuracy: 1.0000 - val_loss: 21.3022 - val_accuracy: 1.0000
Epoch 168/500
486/486 [==============================] - 0s 45us/sample - loss: 22.8251 - accuracy: 1.0000 - val_loss: 21.3454 - val_accuracy: 1.0000
Epoch 169/500
486/486 [==============================] - 0s 113us/sample - loss: 22.4895 - accuracy: 1.0000 - val_loss: 21.2618 - val_accuracy: 1.0000
Epoch 170/500
486/486 [==============================] - 0s 47us/sample - loss: 22.4537 - accuracy: 1.0000 - val_loss: 21.2757 - val_accuracy: 1.0000
Epoch 171/500
486/486 [==============================] - 0s 49us/sample - loss: 22.4148 - accuracy: 1.0000 - val_loss: 21.3028 - val_accuracy: 1.0000
Epoch 172/500
486/486 [==============================] - 0s 54us/sample - loss: 22.5223 - accuracy: 1.0000 - val_loss: 21.4331 - val_accuracy: 1.0000
Epoch 173/500
486/486 [==============================] - 0s 50us/sample - loss: 22.5490 - accuracy: 1.0000 - val_loss: 21.2709 - val_accuracy: 1.0000
Epoch 174/500
486/486 [==============================] - 0s 118us/sample - loss: 22.5188 - accuracy: 1.0000 - val_loss: 21.2563 - val_accuracy: 1.0000
Epoch 175/500
486/486 [==============================] - 0s 46us/sample - loss: 22.3508 - accuracy: 1.0000 - val_loss: 21.3474 - val_accuracy: 1.0000
Epoch 176/500
486/486 [==============================] - 0s 46us/sample - loss: 23.0242 - accuracy: 1.0000 - val_loss: 21.4334 - val_accuracy: 1.0000
Epoch 177/500
486/486 [==============================] - 0s 49us/sample - loss: 22.4548 - accuracy: 1.0000 - val_loss: 22.2415 - val_accuracy: 1.0000
Epoch 178/500
486/486 [==============================] - 0s 50us/sample - loss: 22.7246 - accuracy: 1.0000 - val_loss: 21.3203 - val_accuracy: 1.0000
Epoch 179/500
486/486 [==============================] - 0s 51us/sample - loss: 22.5290 - accuracy: 1.0000 - val_loss: 21.2587 - val_accuracy: 1.0000
Epoch 180/500
486/486 [==============================] - 0s 124us/sample - loss: 22.3924 - accuracy: 1.0000 - val_loss: 21.1763 - val_accuracy: 1.0000
Epoch 181/500
486/486 [==============================] - 0s 106us/sample - loss: 22.3635 - accuracy: 1.0000 - val_loss: 21.1358 - val_accuracy: 1.0000
Epoch 182/500
486/486 [==============================] - 0s 48us/sample - loss: 22.4582 - accuracy: 1.0000 - val_loss: 21.2167 - val_accuracy: 1.0000
Epoch 183/500
486/486 [==============================] - 0s 117us/sample - loss: 22.5764 - accuracy: 1.0000 - val_loss: 21.0942 - val_accuracy: 1.0000
Epoch 184/500
486/486 [==============================] - 0s 50us/sample - loss: 22.4426 - accuracy: 1.0000 - val_loss: 21.4165 - val_accuracy: 1.0000
Epoch 185/500
486/486 [==============================] - 0s 57us/sample - loss: 22.3930 - accuracy: 1.0000 - val_loss: 21.2454 - val_accuracy: 1.0000
Epoch 186/500
486/486 [==============================] - 0s 53us/sample - loss: 22.8573 - accuracy: 1.0000 - val_loss: 21.2905 - val_accuracy: 1.0000
Epoch 187/500
486/486 [==============================] - 0s 56us/sample - loss: 22.3172 - accuracy: 1.0000 - val_loss: 21.3518 - val_accuracy: 1.0000
Epoch 188/500
486/486 [==============================] - 0s 54us/sample - loss: 22.5535 - accuracy: 1.0000 - val_loss: 21.4528 - val_accuracy: 1.0000
Epoch 189/500
486/486 [==============================] - 0s 54us/sample - loss: 22.3188 - accuracy: 1.0000 - val_loss: 21.0961 - val_accuracy: 1.0000
Epoch 190/500
486/486 [==============================] - 0s 117us/sample - loss: 22.2552 - accuracy: 1.0000 - val_loss: 21.0481 - val_accuracy: 1.0000
Epoch 191/500
486/486 [==============================] - 0s 44us/sample - loss: 22.1588 - accuracy: 1.0000 - val_loss: 21.0808 - val_accuracy: 1.0000
Epoch 192/500
486/486 [==============================] - 0s 116us/sample - loss: 21.9856 - accuracy: 1.0000 - val_loss: 20.8718 - val_accuracy: 1.0000
Epoch 193/500
486/486 [==============================] - 0s 45us/sample - loss: 22.1104 - accuracy: 1.0000 - val_loss: 20.9005 - val_accuracy: 1.0000
Epoch 194/500
486/486 [==============================] - 0s 114us/sample - loss: 22.0412 - accuracy: 1.0000 - val_loss: 20.8300 - val_accuracy: 1.0000
Epoch 195/500
486/486 [==============================] - 0s 46us/sample - loss: 21.9934 - accuracy: 1.0000 - val_loss: 20.9977 - val_accuracy: 1.0000
Epoch 196/500
486/486 [==============================] - 0s 45us/sample - loss: 21.9834 - accuracy: 1.0000 - val_loss: 20.8508 - val_accuracy: 1.0000
Epoch 197/500
486/486 [==============================] - 0s 50us/sample - loss: 21.9101 - accuracy: 1.0000 - val_loss: 20.8824 - val_accuracy: 1.0000
Epoch 198/500
486/486 [==============================] - 0s 51us/sample - loss: 22.0395 - accuracy: 1.0000 - val_loss: 20.8633 - val_accuracy: 1.0000
Epoch 199/500
486/486 [==============================] - 0s 122us/sample - loss: 22.3733 - accuracy: 1.0000 - val_loss: 20.7639 - val_accuracy: 1.0000
Epoch 200/500
486/486 [==============================] - 0s 45us/sample - loss: 21.9460 - accuracy: 1.0000 - val_loss: 21.1738 - val_accuracy: 1.0000
Epoch 201/500
486/486 [==============================] - 0s 103us/sample - loss: 21.8259 - accuracy: 1.0000 - val_loss: 20.7152 - val_accuracy: 1.0000
Epoch 202/500
486/486 [==============================] - 0s 42us/sample - loss: 21.8269 - accuracy: 1.0000 - val_loss: 20.7336 - val_accuracy: 1.0000
Epoch 203/500
486/486 [==============================] - 0s 44us/sample - loss: 21.7836 - accuracy: 1.0000 - val_loss: 20.7850 - val_accuracy: 1.0000
Epoch 204/500
486/486 [==============================] - 0s 100us/sample - loss: 21.9049 - accuracy: 1.0000 - val_loss: 20.6898 - val_accuracy: 1.0000
Epoch 205/500
486/486 [==============================] - 0s 39us/sample - loss: 22.0283 - accuracy: 1.0000 - val_loss: 20.6949 - val_accuracy: 1.0000
Epoch 206/500
486/486 [==============================] - 0s 43us/sample - loss: 21.9275 - accuracy: 1.0000 - val_loss: 20.7503 - val_accuracy: 1.0000
Epoch 207/500
486/486 [==============================] - 0s 126us/sample - loss: 21.7847 - accuracy: 1.0000 - val_loss: 20.5668 - val_accuracy: 1.0000
Epoch 208/500
486/486 [==============================] - 0s 48us/sample - loss: 21.6178 - accuracy: 1.0000 - val_loss: 20.9389 - val_accuracy: 1.0000
Epoch 209/500
486/486 [==============================] - 0s 46us/sample - loss: 21.8333 - accuracy: 1.0000 - val_loss: 20.5700 - val_accuracy: 1.0000
Epoch 210/500
486/486 [==============================] - 0s 109us/sample - loss: 21.7559 - accuracy: 1.0000 - val_loss: 20.5591 - val_accuracy: 1.0000
Epoch 211/500
486/486 [==============================] - 0s 41us/sample - loss: 21.8980 - accuracy: 1.0000 - val_loss: 20.6031 - val_accuracy: 1.0000
Epoch 212/500
486/486 [==============================] - 0s 41us/sample - loss: 21.7312 - accuracy: 1.0000 - val_loss: 20.5708 - val_accuracy: 1.0000
Epoch 213/500
486/486 [==============================] - 0s 42us/sample - loss: 21.9550 - accuracy: 1.0000 - val_loss: 20.8798 - val_accuracy: 1.0000
Epoch 214/500
486/486 [==============================] - 0s 46us/sample - loss: 22.1932 - accuracy: 1.0000 - val_loss: 20.6338 - val_accuracy: 1.0000
Epoch 215/500
486/486 [==============================] - 0s 46us/sample - loss: 22.2262 - accuracy: 1.0000 - val_loss: 21.1199 - val_accuracy: 1.0000
Epoch 216/500
486/486 [==============================] - 0s 50us/sample - loss: 21.5963 - accuracy: 1.0000 - val_loss: 20.6213 - val_accuracy: 1.0000
Epoch 217/500
486/486 [==============================] - 0s 55us/sample - loss: 21.6695 - accuracy: 1.0000 - val_loss: 20.6302 - val_accuracy: 1.0000
Epoch 218/500
486/486 [==============================] - 0s 114us/sample - loss: 21.5449 - accuracy: 1.0000 - val_loss: 20.5010 - val_accuracy: 1.0000
Epoch 219/500
486/486 [==============================] - 0s 42us/sample - loss: 21.5508 - accuracy: 1.0000 - val_loss: 20.5666 - val_accuracy: 1.0000
Epoch 220/500
486/486 [==============================] - 0s 124us/sample - loss: 21.8018 - accuracy: 1.0000 - val_loss: 20.3597 - val_accuracy: 1.0000
Epoch 221/500
486/486 [==============================] - 0s 51us/sample - loss: 21.6910 - accuracy: 1.0000 - val_loss: 20.6262 - val_accuracy: 1.0000
Epoch 222/500
486/486 [==============================] - 0s 48us/sample - loss: 21.4805 - accuracy: 1.0000 - val_loss: 20.4485 - val_accuracy: 1.0000
Epoch 223/500
486/486 [==============================] - 0s 54us/sample - loss: 21.5861 - accuracy: 1.0000 - val_loss: 20.4086 - val_accuracy: 1.0000
Epoch 224/500
486/486 [==============================] - 0s 52us/sample - loss: 21.4398 - accuracy: 1.0000 - val_loss: 20.4021 - val_accuracy: 1.0000
Epoch 225/500
486/486 [==============================] - 0s 51us/sample - loss: 21.3774 - accuracy: 1.0000 - val_loss: 20.3953 - val_accuracy: 1.0000
Epoch 226/500
486/486 [==============================] - 0s 125us/sample - loss: 21.3786 - accuracy: 1.0000 - val_loss: 20.2776 - val_accuracy: 1.0000
Epoch 227/500
486/486 [==============================] - 0s 44us/sample - loss: 21.3079 - accuracy: 1.0000 - val_loss: 20.4068 - val_accuracy: 1.0000
Epoch 228/500
486/486 [==============================] - 0s 124us/sample - loss: 21.2856 - accuracy: 1.0000 - val_loss: 20.2452 - val_accuracy: 1.0000
Epoch 229/500
486/486 [==============================] - 0s 49us/sample - loss: 21.3115 - accuracy: 1.0000 - val_loss: 20.3599 - val_accuracy: 1.0000
Epoch 230/500
486/486 [==============================] - 0s 128us/sample - loss: 21.2890 - accuracy: 1.0000 - val_loss: 20.1963 - val_accuracy: 1.0000
Epoch 231/500
486/486 [==============================] - 0s 115us/sample - loss: 21.3211 - accuracy: 1.0000 - val_loss: 20.1895 - val_accuracy: 1.0000
Epoch 232/500
486/486 [==============================] - 0s 44us/sample - loss: 21.3778 - accuracy: 1.0000 - val_loss: 20.2523 - val_accuracy: 1.0000
Epoch 233/500
486/486 [==============================] - 0s 50us/sample - loss: 21.4090 - accuracy: 1.0000 - val_loss: 20.3556 - val_accuracy: 1.0000
Epoch 234/500
486/486 [==============================] - 0s 121us/sample - loss: 21.1402 - accuracy: 1.0000 - val_loss: 20.1739 - val_accuracy: 1.0000
Epoch 235/500
486/486 [==============================] - 0s 46us/sample - loss: 21.2996 - accuracy: 1.0000 - val_loss: 20.2022 - val_accuracy: 1.0000
Epoch 236/500
486/486 [==============================] - 0s 44us/sample - loss: 21.4075 - accuracy: 1.0000 - val_loss: 20.2238 - val_accuracy: 1.0000
Epoch 237/500
486/486 [==============================] - 0s 43us/sample - loss: 21.8646 - accuracy: 1.0000 - val_loss: 20.1798 - val_accuracy: 1.0000
Epoch 238/500
486/486 [==============================] - 0s 49us/sample - loss: 21.3093 - accuracy: 1.0000 - val_loss: 21.1372 - val_accuracy: 1.0000
Epoch 239/500
486/486 [==============================] - 0s 50us/sample - loss: 21.5410 - accuracy: 1.0000 - val_loss: 20.5018 - val_accuracy: 1.0000
Epoch 240/500
486/486 [==============================] - 0s 54us/sample - loss: 21.3922 - accuracy: 1.0000 - val_loss: 21.1579 - val_accuracy: 1.0000
Epoch 241/500
486/486 [==============================] - 0s 52us/sample - loss: 21.7955 - accuracy: 1.0000 - val_loss: 20.3960 - val_accuracy: 1.0000
Epoch 242/500
486/486 [==============================] - 0s 121us/sample - loss: 21.2987 - accuracy: 1.0000 - val_loss: 20.1095 - val_accuracy: 1.0000
Epoch 243/500
486/486 [==============================] - 0s 125us/sample - loss: 21.1496 - accuracy: 1.0000 - val_loss: 20.0457 - val_accuracy: 1.0000
Epoch 244/500
486/486 [==============================] - 0s 45us/sample - loss: 21.1805 - accuracy: 1.0000 - val_loss: 20.2356 - val_accuracy: 1.0000
Epoch 245/500
486/486 [==============================] - 0s 52us/sample - loss: 21.7706 - accuracy: 1.0000 - val_loss: 20.2139 - val_accuracy: 1.0000
Epoch 246/500
486/486 [==============================] - 0s 53us/sample - loss: 21.8809 - accuracy: 1.0000 - val_loss: 20.4326 - val_accuracy: 1.0000
Epoch 247/500
486/486 [==============================] - 0s 60us/sample - loss: 21.5969 - accuracy: 1.0000 - val_loss: 21.0014 - val_accuracy: 1.0000
Epoch 248/500
486/486 [==============================] - 0s 52us/sample - loss: 21.7623 - accuracy: 1.0000 - val_loss: 20.4361 - val_accuracy: 1.0000
Epoch 249/500
486/486 [==============================] - 0s 56us/sample - loss: 21.4713 - accuracy: 1.0000 - val_loss: 20.7299 - val_accuracy: 1.0000
Epoch 250/500
486/486 [==============================] - 0s 49us/sample - loss: 21.7155 - accuracy: 1.0000 - val_loss: 20.2952 - val_accuracy: 1.0000
Epoch 251/500
486/486 [==============================] - 0s 48us/sample - loss: 21.4574 - accuracy: 1.0000 - val_loss: 20.1590 - val_accuracy: 1.0000
Epoch 252/500
486/486 [==============================] - 0s 50us/sample - loss: 21.1156 - accuracy: 1.0000 - val_loss: 20.2077 - val_accuracy: 1.0000
Epoch 253/500
486/486 [==============================] - 0s 121us/sample - loss: 21.1025 - accuracy: 1.0000 - val_loss: 19.8886 - val_accuracy: 1.0000
Epoch 254/500
486/486 [==============================] - 0s 46us/sample - loss: 20.9458 - accuracy: 1.0000 - val_loss: 19.9081 - val_accuracy: 1.0000
Epoch 255/500
486/486 [==============================] - 0s 123us/sample - loss: 20.8776 - accuracy: 1.0000 - val_loss: 19.8145 - val_accuracy: 1.0000
Epoch 256/500
486/486 [==============================] - 0s 46us/sample - loss: 20.9954 - accuracy: 1.0000 - val_loss: 19.9209 - val_accuracy: 1.0000
Epoch 257/500
486/486 [==============================] - 0s 47us/sample - loss: 20.9214 - accuracy: 1.0000 - val_loss: 19.8484 - val_accuracy: 1.0000
Epoch 258/500
486/486 [==============================] - 0s 49us/sample - loss: 21.0076 - accuracy: 1.0000 - val_loss: 19.9577 - val_accuracy: 1.0000
Epoch 259/500
486/486 [==============================] - 0s 56us/sample - loss: 20.8530 - accuracy: 1.0000 - val_loss: 19.8785 - val_accuracy: 1.0000
Epoch 260/500
486/486 [==============================] - 0s 132us/sample - loss: 20.7806 - accuracy: 1.0000 - val_loss: 19.7778 - val_accuracy: 1.0000
Epoch 261/500
486/486 [==============================] - 0s 116us/sample - loss: 20.7344 - accuracy: 1.0000 - val_loss: 19.7378 - val_accuracy: 1.0000
Epoch 262/500
486/486 [==============================] - 0s 53us/sample - loss: 20.7598 - accuracy: 1.0000 - val_loss: 19.8063 - val_accuracy: 1.0000
Epoch 263/500
486/486 [==============================] - 0s 48us/sample - loss: 20.7699 - accuracy: 1.0000 - val_loss: 19.7888 - val_accuracy: 1.0000
Epoch 264/500
486/486 [==============================] - 0s 55us/sample - loss: 20.7739 - accuracy: 1.0000 - val_loss: 19.7782 - val_accuracy: 1.0000
Epoch 265/500
486/486 [==============================] - 0s 49us/sample - loss: 20.7317 - accuracy: 1.0000 - val_loss: 19.8550 - val_accuracy: 1.0000
Epoch 266/500
486/486 [==============================] - 0s 114us/sample - loss: 21.1274 - accuracy: 1.0000 - val_loss: 19.7374 - val_accuracy: 1.0000
Epoch 267/500
486/486 [==============================] - 0s 44us/sample - loss: 21.1536 - accuracy: 1.0000 - val_loss: 19.8735 - val_accuracy: 1.0000
Epoch 268/500
486/486 [==============================] - 0s 123us/sample - loss: 20.7223 - accuracy: 1.0000 - val_loss: 19.7349 - val_accuracy: 1.0000
Epoch 269/500
486/486 [==============================] - 0s 47us/sample - loss: 20.8494 - accuracy: 1.0000 - val_loss: 19.9074 - val_accuracy: 1.0000
Epoch 270/500
486/486 [==============================] - 0s 47us/sample - loss: 21.0253 - accuracy: 1.0000 - val_loss: 19.8672 - val_accuracy: 1.0000
Epoch 271/500
486/486 [==============================] - 0s 52us/sample - loss: 20.9080 - accuracy: 1.0000 - val_loss: 20.0456 - val_accuracy: 1.0000
Epoch 272/500
486/486 [==============================] - 0s 49us/sample - loss: 21.1303 - accuracy: 1.0000 - val_loss: 20.0554 - val_accuracy: 1.0000
Epoch 273/500
486/486 [==============================] - 0s 63us/sample - loss: 20.8351 - accuracy: 1.0000 - val_loss: 19.8985 - val_accuracy: 1.0000
Epoch 274/500
486/486 [==============================] - 0s 55us/sample - loss: 20.6647 - accuracy: 1.0000 - val_loss: 19.7632 - val_accuracy: 1.0000
Epoch 275/500
486/486 [==============================] - 0s 114us/sample - loss: 20.7136 - accuracy: 1.0000 - val_loss: 19.6149 - val_accuracy: 1.0000
Epoch 276/500
486/486 [==============================] - 0s 51us/sample - loss: 20.5765 - accuracy: 1.0000 - val_loss: 19.6206 - val_accuracy: 1.0000
Epoch 277/500
486/486 [==============================] - 0s 122us/sample - loss: 20.5948 - accuracy: 1.0000 - val_loss: 19.5436 - val_accuracy: 1.0000
Epoch 278/500
486/486 [==============================] - 0s 44us/sample - loss: 20.5872 - accuracy: 1.0000 - val_loss: 19.5940 - val_accuracy: 1.0000
Epoch 279/500
486/486 [==============================] - 0s 55us/sample - loss: 20.5841 - accuracy: 1.0000 - val_loss: 19.6965 - val_accuracy: 1.0000
Epoch 280/500
486/486 [==============================] - 0s 50us/sample - loss: 20.5280 - accuracy: 1.0000 - val_loss: 19.5495 - val_accuracy: 1.0000
Epoch 281/500
486/486 [==============================] - 0s 134us/sample - loss: 20.5709 - accuracy: 1.0000 - val_loss: 19.5189 - val_accuracy: 1.0000
Epoch 282/500
486/486 [==============================] - 0s 56us/sample - loss: 20.5294 - accuracy: 1.0000 - val_loss: 19.5754 - val_accuracy: 1.0000
Epoch 283/500
486/486 [==============================] - 0s 49us/sample - loss: 20.5566 - accuracy: 1.0000 - val_loss: 19.6344 - val_accuracy: 1.0000
Epoch 284/500
486/486 [==============================] - 0s 51us/sample - loss: 20.7790 - accuracy: 1.0000 - val_loss: 19.5461 - val_accuracy: 1.0000
Epoch 285/500
486/486 [==============================] - 0s 55us/sample - loss: 21.3008 - accuracy: 1.0000 - val_loss: 19.5835 - val_accuracy: 1.0000
Epoch 286/500
486/486 [==============================] - 0s 52us/sample - loss: 22.0381 - accuracy: 1.0000 - val_loss: 20.4664 - val_accuracy: 1.0000
Epoch 287/500
486/486 [==============================] - 0s 54us/sample - loss: 22.6789 - accuracy: 1.0000 - val_loss: 20.3553 - val_accuracy: 1.0000
Epoch 288/500
486/486 [==============================] - 0s 56us/sample - loss: 22.3706 - accuracy: 1.0000 - val_loss: 21.2280 - val_accuracy: 1.0000
Epoch 289/500
486/486 [==============================] - 0s 58us/sample - loss: 21.7581 - accuracy: 1.0000 - val_loss: 20.4917 - val_accuracy: 1.0000
Epoch 290/500
486/486 [==============================] - 0s 54us/sample - loss: 22.1126 - accuracy: 1.0000 - val_loss: 20.3503 - val_accuracy: 1.0000
Epoch 291/500
486/486 [==============================] - 0s 53us/sample - loss: 21.1331 - accuracy: 1.0000 - val_loss: 19.9682 - val_accuracy: 1.0000
Epoch 292/500
486/486 [==============================] - 0s 50us/sample - loss: 21.1426 - accuracy: 1.0000 - val_loss: 20.0443 - val_accuracy: 1.0000
Epoch 293/500
486/486 [==============================] - 0s 50us/sample - loss: 20.9534 - accuracy: 1.0000 - val_loss: 19.7301 - val_accuracy: 1.0000
Epoch 294/500
486/486 [==============================] - 0s 52us/sample - loss: 20.8936 - accuracy: 1.0000 - val_loss: 19.5545 - val_accuracy: 1.0000
Epoch 295/500
486/486 [==============================] - 0s 51us/sample - loss: 20.6587 - accuracy: 1.0000 - val_loss: 19.6748 - val_accuracy: 1.0000
Epoch 296/500
486/486 [==============================] - 0s 118us/sample - loss: 20.9227 - accuracy: 1.0000 - val_loss: 19.4478 - val_accuracy: 1.0000
Epoch 297/500
486/486 [==============================] - 0s 45us/sample - loss: 20.4507 - accuracy: 1.0000 - val_loss: 19.9100 - val_accuracy: 1.0000
Epoch 298/500
486/486 [==============================] - 0s 131us/sample - loss: 20.5599 - accuracy: 1.0000 - val_loss: 19.3339 - val_accuracy: 1.0000
Epoch 299/500
486/486 [==============================] - 0s 120us/sample - loss: 20.4325 - accuracy: 1.0000 - val_loss: 19.2498 - val_accuracy: 1.0000
Epoch 300/500
486/486 [==============================] - 0s 49us/sample - loss: 20.5348 - accuracy: 1.0000 - val_loss: 19.3605 - val_accuracy: 1.0000
Epoch 301/500
486/486 [==============================] - 0s 48us/sample - loss: 20.5208 - accuracy: 1.0000 - val_loss: 19.9722 - val_accuracy: 1.0000
Epoch 302/500
486/486 [==============================] - 0s 67us/sample - loss: 20.8363 - accuracy: 1.0000 - val_loss: 19.7248 - val_accuracy: 1.0000
Epoch 303/500
486/486 [==============================] - 0s 54us/sample - loss: 20.4950 - accuracy: 1.0000 - val_loss: 19.4431 - val_accuracy: 1.0000
Epoch 304/500
486/486 [==============================] - 0s 49us/sample - loss: 20.6728 - accuracy: 1.0000 - val_loss: 20.1065 - val_accuracy: 1.0000
Epoch 305/500
486/486 [==============================] - 0s 50us/sample - loss: 20.5922 - accuracy: 1.0000 - val_loss: 19.5254 - val_accuracy: 1.0000
Epoch 306/500
486/486 [==============================] - 0s 49us/sample - loss: 20.6744 - accuracy: 1.0000 - val_loss: 19.3632 - val_accuracy: 1.0000
Epoch 307/500
486/486 [==============================] - 0s 49us/sample - loss: 20.6017 - accuracy: 1.0000 - val_loss: 19.2556 - val_accuracy: 1.0000
Epoch 308/500
486/486 [==============================] - 0s 49us/sample - loss: 20.5328 - accuracy: 1.0000 - val_loss: 19.4243 - val_accuracy: 1.0000
Epoch 309/500
486/486 [==============================] - 0s 129us/sample - loss: 20.3820 - accuracy: 1.0000 - val_loss: 19.2302 - val_accuracy: 1.0000
Epoch 310/500
486/486 [==============================] - 0s 53us/sample - loss: 20.1697 - accuracy: 1.0000 - val_loss: 19.3945 - val_accuracy: 1.0000
Epoch 311/500
486/486 [==============================] - 0s 129us/sample - loss: 20.2434 - accuracy: 1.0000 - val_loss: 19.1944 - val_accuracy: 1.0000
Epoch 312/500
486/486 [==============================] - 0s 46us/sample - loss: 20.5390 - accuracy: 1.0000 - val_loss: 19.4807 - val_accuracy: 1.0000
Epoch 313/500
486/486 [==============================] - 0s 50us/sample - loss: 20.5056 - accuracy: 1.0000 - val_loss: 19.3977 - val_accuracy: 1.0000
Epoch 314/500
486/486 [==============================] - 0s 104us/sample - loss: 20.2063 - accuracy: 1.0000 - val_loss: 19.1880 - val_accuracy: 1.0000
Epoch 315/500
486/486 [==============================] - 0s 40us/sample - loss: 20.1568 - accuracy: 1.0000 - val_loss: 19.2334 - val_accuracy: 1.0000
Epoch 316/500
486/486 [==============================] - 0s 48us/sample - loss: 20.1345 - accuracy: 1.0000 - val_loss: 19.2073 - val_accuracy: 1.0000
Epoch 317/500
486/486 [==============================] - 0s 49us/sample - loss: 20.2675 - accuracy: 1.0000 - val_loss: 19.4617 - val_accuracy: 1.0000
Epoch 318/500
486/486 [==============================] - 0s 47us/sample - loss: 20.1963 - accuracy: 1.0000 - val_loss: 19.4229 - val_accuracy: 1.0000
Epoch 319/500
486/486 [==============================] - 0s 143us/sample - loss: 20.0772 - accuracy: 1.0000 - val_loss: 19.0630 - val_accuracy: 1.0000
Epoch 320/500
486/486 [==============================] - 0s 118us/sample - loss: 20.0008 - accuracy: 1.0000 - val_loss: 19.0303 - val_accuracy: 1.0000
Epoch 321/500
486/486 [==============================] - 0s 46us/sample - loss: 19.9731 - accuracy: 1.0000 - val_loss: 19.1249 - val_accuracy: 1.0000
Epoch 322/500
486/486 [==============================] - 0s 125us/sample - loss: 19.9824 - accuracy: 1.0000 - val_loss: 18.9493 - val_accuracy: 1.0000
Epoch 323/500
486/486 [==============================] - 0s 43us/sample - loss: 19.8602 - accuracy: 1.0000 - val_loss: 19.2543 - val_accuracy: 1.0000
Epoch 324/500
486/486 [==============================] - 0s 51us/sample - loss: 20.1252 - accuracy: 1.0000 - val_loss: 19.0770 - val_accuracy: 1.0000
Epoch 325/500
486/486 [==============================] - 0s 48us/sample - loss: 20.0598 - accuracy: 1.0000 - val_loss: 19.0717 - val_accuracy: 1.0000
Epoch 326/500
486/486 [==============================] - 0s 50us/sample - loss: 20.1691 - accuracy: 1.0000 - val_loss: 19.3448 - val_accuracy: 1.0000
Epoch 327/500
486/486 [==============================] - 0s 58us/sample - loss: 20.0427 - accuracy: 1.0000 - val_loss: 19.2157 - val_accuracy: 1.0000
Epoch 328/500
486/486 [==============================] - 0s 53us/sample - loss: 20.3618 - accuracy: 1.0000 - val_loss: 19.4399 - val_accuracy: 1.0000
Epoch 329/500
486/486 [==============================] - 0s 52us/sample - loss: 20.3051 - accuracy: 1.0000 - val_loss: 18.9536 - val_accuracy: 1.0000
Epoch 330/500
486/486 [==============================] - 0s 55us/sample - loss: 20.1468 - accuracy: 1.0000 - val_loss: 19.3420 - val_accuracy: 1.0000
Epoch 331/500
486/486 [==============================] - 0s 48us/sample - loss: 20.0283 - accuracy: 1.0000 - val_loss: 19.2068 - val_accuracy: 1.0000
Epoch 332/500
486/486 [==============================] - 0s 52us/sample - loss: 20.5420 - accuracy: 1.0000 - val_loss: 19.2421 - val_accuracy: 1.0000
Epoch 333/500
486/486 [==============================] - 0s 57us/sample - loss: 20.3218 - accuracy: 1.0000 - val_loss: 19.5505 - val_accuracy: 1.0000
Epoch 334/500
486/486 [==============================] - 0s 57us/sample - loss: 20.0621 - accuracy: 1.0000 - val_loss: 19.2254 - val_accuracy: 1.0000
Epoch 335/500
486/486 [==============================] - 0s 57us/sample - loss: 20.3883 - accuracy: 1.0000 - val_loss: 20.6591 - val_accuracy: 1.0000
Epoch 336/500
486/486 [==============================] - 0s 54us/sample - loss: 20.6916 - accuracy: 1.0000 - val_loss: 20.2333 - val_accuracy: 1.0000
Epoch 337/500
486/486 [==============================] - 0s 48us/sample - loss: 20.6007 - accuracy: 1.0000 - val_loss: 19.4948 - val_accuracy: 1.0000
Epoch 338/500
486/486 [==============================] - 0s 55us/sample - loss: 20.3186 - accuracy: 1.0000 - val_loss: 19.2976 - val_accuracy: 1.0000
Epoch 339/500
486/486 [==============================] - 0s 128us/sample - loss: 20.1616 - accuracy: 1.0000 - val_loss: 18.9264 - val_accuracy: 1.0000
Epoch 340/500
486/486 [==============================] - 0s 47us/sample - loss: 20.0590 - accuracy: 1.0000 - val_loss: 19.0204 - val_accuracy: 1.0000
Epoch 341/500
486/486 [==============================] - 0s 54us/sample - loss: 20.1068 - accuracy: 1.0000 - val_loss: 19.1906 - val_accuracy: 1.0000
Epoch 342/500
486/486 [==============================] - 0s 47us/sample - loss: 19.9858 - accuracy: 1.0000 - val_loss: 19.2822 - val_accuracy: 1.0000
Epoch 343/500
486/486 [==============================] - 0s 119us/sample - loss: 19.7921 - accuracy: 1.0000 - val_loss: 18.8162 - val_accuracy: 1.0000
Epoch 344/500
486/486 [==============================] - 0s 108us/sample - loss: 19.8028 - accuracy: 1.0000 - val_loss: 18.7687 - val_accuracy: 1.0000
Epoch 345/500
486/486 [==============================] - 0s 44us/sample - loss: 19.7399 - accuracy: 1.0000 - val_loss: 18.9233 - val_accuracy: 1.0000
Epoch 346/500
486/486 [==============================] - 0s 51us/sample - loss: 20.4257 - accuracy: 1.0000 - val_loss: 18.9100 - val_accuracy: 1.0000
Epoch 347/500
486/486 [==============================] - 0s 51us/sample - loss: 20.2231 - accuracy: 1.0000 - val_loss: 18.8615 - val_accuracy: 1.0000
Epoch 348/500
486/486 [==============================] - 0s 50us/sample - loss: 19.6846 - accuracy: 1.0000 - val_loss: 19.1566 - val_accuracy: 1.0000
Epoch 349/500
486/486 [==============================] - 0s 51us/sample - loss: 19.9675 - accuracy: 1.0000 - val_loss: 19.0447 - val_accuracy: 1.0000
Epoch 350/500
486/486 [==============================] - 0s 52us/sample - loss: 19.9091 - accuracy: 1.0000 - val_loss: 18.9383 - val_accuracy: 1.0000
Epoch 351/500
486/486 [==============================] - 0s 58us/sample - loss: 19.7488 - accuracy: 1.0000 - val_loss: 19.0782 - val_accuracy: 1.0000
Epoch 352/500
486/486 [==============================] - 0s 139us/sample - loss: 19.7121 - accuracy: 1.0000 - val_loss: 18.7072 - val_accuracy: 1.0000
Epoch 353/500
486/486 [==============================] - 0s 43us/sample - loss: 19.6531 - accuracy: 1.0000 - val_loss: 18.7545 - val_accuracy: 1.0000
Epoch 354/500
486/486 [==============================] - 0s 48us/sample - loss: 19.7172 - accuracy: 1.0000 - val_loss: 19.0209 - val_accuracy: 1.0000
Epoch 355/500
486/486 [==============================] - 0s 59us/sample - loss: 19.6870 - accuracy: 1.0000 - val_loss: 18.7216 - val_accuracy: 1.0000
Epoch 356/500
486/486 [==============================] - 0s 49us/sample - loss: 19.7068 - accuracy: 1.0000 - val_loss: 19.0275 - val_accuracy: 1.0000
Epoch 357/500
486/486 [==============================] - 0s 56us/sample - loss: 19.8271 - accuracy: 1.0000 - val_loss: 19.1061 - val_accuracy: 1.0000
Epoch 358/500
486/486 [==============================] - 0s 125us/sample - loss: 19.8784 - accuracy: 1.0000 - val_loss: 19.0864 - val_accuracy: 1.0000
Epoch 359/500
486/486 [==============================] - 0s 51us/sample - loss: 19.9816 - accuracy: 1.0000 - val_loss: 18.9375 - val_accuracy: 1.0000
Epoch 360/500
486/486 [==============================] - 0s 48us/sample - loss: 19.9275 - accuracy: 1.0000 - val_loss: 19.2635 - val_accuracy: 1.0000
Epoch 361/500
486/486 [==============================] - 0s 54us/sample - loss: 19.9285 - accuracy: 1.0000 - val_loss: 19.0434 - val_accuracy: 1.0000
Epoch 362/500
486/486 [==============================] - 0s 51us/sample - loss: 19.7219 - accuracy: 1.0000 - val_loss: 19.0417 - val_accuracy: 1.0000
Epoch 363/500
486/486 [==============================] - 0s 54us/sample - loss: 19.8191 - accuracy: 1.0000 - val_loss: 18.7572 - val_accuracy: 1.0000
Epoch 364/500
486/486 [==============================] - 0s 52us/sample - loss: 19.6101 - accuracy: 1.0000 - val_loss: 18.7083 - val_accuracy: 1.0000
Epoch 365/500
486/486 [==============================] - 0s 52us/sample - loss: 19.5547 - accuracy: 1.0000 - val_loss: 18.7795 - val_accuracy: 1.0000
Epoch 366/500
486/486 [==============================] - 0s 121us/sample - loss: 19.4988 - accuracy: 1.0000 - val_loss: 18.5561 - val_accuracy: 1.0000
Epoch 367/500
486/486 [==============================] - 0s 47us/sample - loss: 19.4880 - accuracy: 1.0000 - val_loss: 18.5638 - val_accuracy: 1.0000
Epoch 368/500
486/486 [==============================] - 0s 115us/sample - loss: 19.4536 - accuracy: 1.0000 - val_loss: 18.5353 - val_accuracy: 1.0000
Epoch 369/500
486/486 [==============================] - 0s 47us/sample - loss: 19.4149 - accuracy: 1.0000 - val_loss: 18.5928 - val_accuracy: 1.0000
Epoch 370/500
486/486 [==============================] - 0s 54us/sample - loss: 19.4895 - accuracy: 1.0000 - val_loss: 18.7368 - val_accuracy: 1.0000
Epoch 371/500
486/486 [==============================] - 0s 51us/sample - loss: 19.6099 - accuracy: 1.0000 - val_loss: 18.7585 - val_accuracy: 1.0000
Epoch 372/500
486/486 [==============================] - 0s 53us/sample - loss: 19.5965 - accuracy: 1.0000 - val_loss: 18.7547 - val_accuracy: 1.0000
Epoch 373/500
486/486 [==============================] - 0s 55us/sample - loss: 19.5787 - accuracy: 1.0000 - val_loss: 18.8380 - val_accuracy: 1.0000
Epoch 374/500
486/486 [==============================] - 0s 64us/sample - loss: 19.5272 - accuracy: 1.0000 - val_loss: 18.6056 - val_accuracy: 1.0000
Epoch 375/500
486/486 [==============================] - 0s 56us/sample - loss: 19.3828 - accuracy: 1.0000 - val_loss: 18.7792 - val_accuracy: 1.0000
Epoch 376/500
486/486 [==============================] - 0s 48us/sample - loss: 19.4579 - accuracy: 1.0000 - val_loss: 18.5906 - val_accuracy: 1.0000
Epoch 377/500
486/486 [==============================] - 0s 71us/sample - loss: 19.8052 - accuracy: 1.0000 - val_loss: 18.9281 - val_accuracy: 1.0000
Epoch 378/500
486/486 [==============================] - 0s 54us/sample - loss: 19.9671 - accuracy: 1.0000 - val_loss: 19.1021 - val_accuracy: 1.0000
Epoch 379/500
486/486 [==============================] - 0s 57us/sample - loss: 19.8134 - accuracy: 1.0000 - val_loss: 18.7547 - val_accuracy: 1.0000
Epoch 380/500
486/486 [==============================] - 0s 60us/sample - loss: 19.7136 - accuracy: 1.0000 - val_loss: 19.2976 - val_accuracy: 1.0000
Epoch 381/500
486/486 [==============================] - 0s 61us/sample - loss: 20.4431 - accuracy: 1.0000 - val_loss: 19.1965 - val_accuracy: 1.0000
Epoch 382/500
486/486 [==============================] - 0s 64us/sample - loss: 20.0849 - accuracy: 1.0000 - val_loss: 19.2784 - val_accuracy: 1.0000
Epoch 383/500
486/486 [==============================] - 0s 71us/sample - loss: 20.1980 - accuracy: 1.0000 - val_loss: 19.1405 - val_accuracy: 1.0000
Epoch 384/500
486/486 [==============================] - 0s 54us/sample - loss: 20.3417 - accuracy: 1.0000 - val_loss: 19.0433 - val_accuracy: 1.0000
Epoch 385/500
486/486 [==============================] - 0s 53us/sample - loss: 20.0625 - accuracy: 1.0000 - val_loss: 19.5177 - val_accuracy: 1.0000
Epoch 386/500
486/486 [==============================] - 0s 60us/sample - loss: 20.1105 - accuracy: 1.0000 - val_loss: 18.9639 - val_accuracy: 1.0000
Epoch 387/500
486/486 [==============================] - 0s 56us/sample - loss: 19.7988 - accuracy: 1.0000 - val_loss: 19.0421 - val_accuracy: 1.0000
Epoch 388/500
486/486 [==============================] - 0s 49us/sample - loss: 19.9363 - accuracy: 1.0000 - val_loss: 18.6958 - val_accuracy: 1.0000
Epoch 389/500
486/486 [==============================] - 0s 54us/sample - loss: 19.5419 - accuracy: 1.0000 - val_loss: 18.7096 - val_accuracy: 1.0000
Epoch 390/500
486/486 [==============================] - 0s 128us/sample - loss: 19.3936 - accuracy: 1.0000 - val_loss: 18.5153 - val_accuracy: 1.0000
Epoch 391/500
486/486 [==============================] - 0s 44us/sample - loss: 19.4369 - accuracy: 1.0000 - val_loss: 18.5680 - val_accuracy: 1.0000
Epoch 392/500
486/486 [==============================] - 0s 56us/sample - loss: 19.3352 - accuracy: 1.0000 - val_loss: 18.6645 - val_accuracy: 1.0000
Epoch 393/500
486/486 [==============================] - 0s 122us/sample - loss: 19.3134 - accuracy: 1.0000 - val_loss: 18.3614 - val_accuracy: 1.0000
Epoch 394/500
486/486 [==============================] - 0s 58us/sample - loss: 19.2042 - accuracy: 1.0000 - val_loss: 18.5874 - val_accuracy: 1.0000
Epoch 395/500
486/486 [==============================] - 0s 55us/sample - loss: 19.4462 - accuracy: 1.0000 - val_loss: 18.3630 - val_accuracy: 1.0000
Epoch 396/500
486/486 [==============================] - 0s 130us/sample - loss: 19.3570 - accuracy: 1.0000 - val_loss: 18.3445 - val_accuracy: 1.0000
Epoch 397/500
486/486 [==============================] - 0s 46us/sample - loss: 19.2350 - accuracy: 1.0000 - val_loss: 18.4307 - val_accuracy: 1.0000
Epoch 398/500
486/486 [==============================] - 0s 53us/sample - loss: 19.3072 - accuracy: 1.0000 - val_loss: 18.3779 - val_accuracy: 1.0000
Epoch 399/500
486/486 [==============================] - 0s 48us/sample - loss: 19.3142 - accuracy: 1.0000 - val_loss: 18.4407 - val_accuracy: 1.0000
Epoch 400/500
486/486 [==============================] - 0s 127us/sample - loss: 19.1900 - accuracy: 1.0000 - val_loss: 18.3248 - val_accuracy: 1.0000
Epoch 401/500
486/486 [==============================] - 0s 120us/sample - loss: 19.1765 - accuracy: 1.0000 - val_loss: 18.2941 - val_accuracy: 1.0000
Epoch 402/500
486/486 [==============================] - 0s 45us/sample - loss: 19.1034 - accuracy: 1.0000 - val_loss: 18.3719 - val_accuracy: 1.0000
Epoch 403/500
486/486 [==============================] - 0s 123us/sample - loss: 19.1507 - accuracy: 1.0000 - val_loss: 18.2792 - val_accuracy: 1.0000
Epoch 404/500
486/486 [==============================] - 0s 47us/sample - loss: 19.2217 - accuracy: 1.0000 - val_loss: 18.3167 - val_accuracy: 1.0000
Epoch 405/500
486/486 [==============================] - 0s 115us/sample - loss: 19.1742 - accuracy: 1.0000 - val_loss: 18.2502 - val_accuracy: 1.0000
Epoch 406/500
486/486 [==============================] - 0s 42us/sample - loss: 19.2297 - accuracy: 1.0000 - val_loss: 18.4377 - val_accuracy: 1.0000
Epoch 407/500
486/486 [==============================] - 0s 44us/sample - loss: 19.3554 - accuracy: 1.0000 - val_loss: 18.3224 - val_accuracy: 1.0000
Epoch 408/500
486/486 [==============================] - 0s 50us/sample - loss: 19.1203 - accuracy: 1.0000 - val_loss: 18.2811 - val_accuracy: 1.0000
Epoch 409/500
486/486 [==============================] - 0s 50us/sample - loss: 19.1916 - accuracy: 1.0000 - val_loss: 18.4490 - val_accuracy: 1.0000
Epoch 410/500
486/486 [==============================] - 0s 48us/sample - loss: 19.1206 - accuracy: 1.0000 - val_loss: 18.2642 - val_accuracy: 1.0000
Epoch 411/500
486/486 [==============================] - 0s 48us/sample - loss: 19.1908 - accuracy: 1.0000 - val_loss: 18.5101 - val_accuracy: 1.0000
Epoch 412/500
486/486 [==============================] - 0s 51us/sample - loss: 19.2873 - accuracy: 1.0000 - val_loss: 18.3038 - val_accuracy: 1.0000
Epoch 413/500
486/486 [==============================] - 0s 110us/sample - loss: 19.1570 - accuracy: 1.0000 - val_loss: 18.1459 - val_accuracy: 1.0000
Epoch 414/500
486/486 [==============================] - 0s 42us/sample - loss: 19.2451 - accuracy: 1.0000 - val_loss: 18.1903 - val_accuracy: 1.0000
Epoch 415/500
486/486 [==============================] - 0s 47us/sample - loss: 19.0606 - accuracy: 1.0000 - val_loss: 18.1985 - val_accuracy: 1.0000
Epoch 416/500
486/486 [==============================] - ETA: 0s - loss: 17.9833 - accuracy: 1.000 - 0s 45us/sample - loss: 19.0805 - accuracy: 1.0000 - val_loss: 18.3060 - val_accuracy: 1.0000
Epoch 417/500
486/486 [==============================] - 0s 45us/sample - loss: 19.2533 - accuracy: 1.0000 - val_loss: 18.4681 - val_accuracy: 1.0000
Epoch 418/500
486/486 [==============================] - 0s 45us/sample - loss: 19.1189 - accuracy: 1.0000 - val_loss: 18.4394 - val_accuracy: 1.0000
Epoch 419/500
486/486 [==============================] - 0s 47us/sample - loss: 19.1846 - accuracy: 1.0000 - val_loss: 18.2864 - val_accuracy: 1.0000
Epoch 420/500
486/486 [==============================] - 0s 47us/sample - loss: 19.1658 - accuracy: 1.0000 - val_loss: 18.3709 - val_accuracy: 1.0000
Epoch 421/500
486/486 [==============================] - 0s 45us/sample - loss: 19.0684 - accuracy: 1.0000 - val_loss: 18.3216 - val_accuracy: 1.0000
Epoch 422/500
486/486 [==============================] - 0s 45us/sample - loss: 19.2172 - accuracy: 1.0000 - val_loss: 18.1900 - val_accuracy: 1.0000
Epoch 423/500
486/486 [==============================] - 0s 46us/sample - loss: 19.1497 - accuracy: 1.0000 - val_loss: 18.3765 - val_accuracy: 1.0000
Epoch 424/500
486/486 [==============================] - 0s 111us/sample - loss: 19.6638 - accuracy: 1.0000 - val_loss: 18.1331 - val_accuracy: 1.0000
Epoch 425/500
486/486 [==============================] - 0s 48us/sample - loss: 19.2554 - accuracy: 1.0000 - val_loss: 18.4740 - val_accuracy: 1.0000
Epoch 426/500
486/486 [==============================] - 0s 46us/sample - loss: 19.0497 - accuracy: 1.0000 - val_loss: 18.2853 - val_accuracy: 1.0000
Epoch 427/500
486/486 [==============================] - 0s 46us/sample - loss: 19.1153 - accuracy: 1.0000 - val_loss: 18.2517 - val_accuracy: 1.0000
Epoch 428/500
486/486 [==============================] - 0s 104us/sample - loss: 19.2446 - accuracy: 1.0000 - val_loss: 18.1087 - val_accuracy: 1.0000
Epoch 429/500
486/486 [==============================] - 0s 97us/sample - loss: 19.1213 - accuracy: 1.0000 - val_loss: 18.0881 - val_accuracy: 1.0000
Epoch 430/500
486/486 [==============================] - 0s 38us/sample - loss: 19.3886 - accuracy: 1.0000 - val_loss: 18.4827 - val_accuracy: 1.0000
Epoch 431/500
486/486 [==============================] - 0s 40us/sample - loss: 19.2693 - accuracy: 1.0000 - val_loss: 18.2603 - val_accuracy: 1.0000
Epoch 432/500
486/486 [==============================] - 0s 40us/sample - loss: 19.6075 - accuracy: 1.0000 - val_loss: 18.8467 - val_accuracy: 1.0000
Epoch 433/500
486/486 [==============================] - 0s 40us/sample - loss: 19.5816 - accuracy: 1.0000 - val_loss: 18.6564 - val_accuracy: 1.0000
Epoch 434/500
486/486 [==============================] - 0s 46us/sample - loss: 19.4655 - accuracy: 1.0000 - val_loss: 18.6439 - val_accuracy: 1.0000
Epoch 435/500
486/486 [==============================] - 0s 46us/sample - loss: 19.1441 - accuracy: 1.0000 - val_loss: 18.5792 - val_accuracy: 1.0000
Epoch 436/500
486/486 [==============================] - 0s 44us/sample - loss: 19.1409 - accuracy: 1.0000 - val_loss: 18.4139 - val_accuracy: 1.0000
Epoch 437/500
486/486 [==============================] - 0s 46us/sample - loss: 19.1067 - accuracy: 1.0000 - val_loss: 18.1427 - val_accuracy: 1.0000
Epoch 438/500
486/486 [==============================] - 0s 45us/sample - loss: 18.9256 - accuracy: 1.0000 - val_loss: 18.1729 - val_accuracy: 1.0000
Epoch 439/500
486/486 [==============================] - 0s 46us/sample - loss: 19.0200 - accuracy: 1.0000 - val_loss: 18.1376 - val_accuracy: 1.0000
Epoch 440/500
486/486 [==============================] - 0s 51us/sample - loss: 19.0779 - accuracy: 1.0000 - val_loss: 18.1289 - val_accuracy: 1.0000
Epoch 441/500
486/486 [==============================] - 0s 46us/sample - loss: 18.9454 - accuracy: 1.0000 - val_loss: 18.2912 - val_accuracy: 1.0000
Epoch 442/500
486/486 [==============================] - 0s 104us/sample - loss: 18.9693 - accuracy: 1.0000 - val_loss: 17.9997 - val_accuracy: 1.0000
Epoch 443/500
486/486 [==============================] - 0s 43us/sample - loss: 18.9522 - accuracy: 1.0000 - val_loss: 18.0328 - val_accuracy: 1.0000
Epoch 444/500
486/486 [==============================] - 0s 44us/sample - loss: 18.9790 - accuracy: 1.0000 - val_loss: 18.2195 - val_accuracy: 1.0000
Epoch 445/500
486/486 [==============================] - 0s 46us/sample - loss: 18.8183 - accuracy: 1.0000 - val_loss: 18.0444 - val_accuracy: 1.0000
Epoch 446/500
486/486 [==============================] - 0s 47us/sample - loss: 18.9367 - accuracy: 1.0000 - val_loss: 18.0113 - val_accuracy: 1.0000
Epoch 447/500
486/486 [==============================] - 0s 112us/sample - loss: 18.8131 - accuracy: 1.0000 - val_loss: 17.9804 - val_accuracy: 1.0000
Epoch 448/500
486/486 [==============================] - 0s 46us/sample - loss: 18.9710 - accuracy: 1.0000 - val_loss: 18.0881 - val_accuracy: 1.0000
Epoch 449/500
486/486 [==============================] - 0s 47us/sample - loss: 18.8236 - accuracy: 1.0000 - val_loss: 18.0173 - val_accuracy: 1.0000
Epoch 450/500
486/486 [==============================] - 0s 47us/sample - loss: 18.9337 - accuracy: 1.0000 - val_loss: 18.1703 - val_accuracy: 1.0000
Epoch 451/500
486/486 [==============================] - 0s 53us/sample - loss: 19.1274 - accuracy: 1.0000 - val_loss: 18.4453 - val_accuracy: 1.0000
Epoch 452/500
486/486 [==============================] - 0s 47us/sample - loss: 19.0596 - accuracy: 1.0000 - val_loss: 18.6007 - val_accuracy: 1.0000
Epoch 453/500
486/486 [==============================] - 0s 45us/sample - loss: 19.4985 - accuracy: 1.0000 - val_loss: 18.3900 - val_accuracy: 1.0000
Epoch 454/500
486/486 [==============================] - 0s 46us/sample - loss: 19.5120 - accuracy: 1.0000 - val_loss: 18.5889 - val_accuracy: 1.0000
Epoch 455/500
486/486 [==============================] - 0s 48us/sample - loss: 19.0744 - accuracy: 1.0000 - val_loss: 18.7625 - val_accuracy: 1.0000
Epoch 456/500
486/486 [==============================] - 0s 47us/sample - loss: 19.2478 - accuracy: 1.0000 - val_loss: 18.7751 - val_accuracy: 1.0000
Epoch 457/500
486/486 [==============================] - 0s 50us/sample - loss: 19.4347 - accuracy: 1.0000 - val_loss: 18.0735 - val_accuracy: 1.0000
Epoch 458/500
486/486 [==============================] - 0s 45us/sample - loss: 19.2960 - accuracy: 1.0000 - val_loss: 18.0631 - val_accuracy: 1.0000
Epoch 459/500
486/486 [==============================] - 0s 49us/sample - loss: 18.9759 - accuracy: 1.0000 - val_loss: 18.0915 - val_accuracy: 1.0000
Epoch 460/500
486/486 [==============================] - 0s 45us/sample - loss: 18.9603 - accuracy: 1.0000 - val_loss: 18.1797 - val_accuracy: 1.0000
Epoch 461/500
486/486 [==============================] - 0s 47us/sample - loss: 19.2679 - accuracy: 1.0000 - val_loss: 18.0801 - val_accuracy: 1.0000
Epoch 462/500
486/486 [==============================] - 0s 50us/sample - loss: 19.2293 - accuracy: 1.0000 - val_loss: 18.1801 - val_accuracy: 1.0000
Epoch 463/500
486/486 [==============================] - 0s 48us/sample - loss: 18.9713 - accuracy: 1.0000 - val_loss: 18.3905 - val_accuracy: 1.0000
Epoch 464/500
486/486 [==============================] - 0s 51us/sample - loss: 18.9866 - accuracy: 1.0000 - val_loss: 18.1680 - val_accuracy: 1.0000
Epoch 465/500
486/486 [==============================] - 0s 53us/sample - loss: 19.0301 - accuracy: 1.0000 - val_loss: 18.0637 - val_accuracy: 1.0000
Epoch 466/500
486/486 [==============================] - 0s 45us/sample - loss: 18.9062 - accuracy: 1.0000 - val_loss: 17.9972 - val_accuracy: 1.0000
Epoch 467/500
486/486 [==============================] - 0s 49us/sample - loss: 19.1282 - accuracy: 1.0000 - val_loss: 18.2974 - val_accuracy: 1.0000
Epoch 468/500
486/486 [==============================] - 0s 49us/sample - loss: 19.0282 - accuracy: 1.0000 - val_loss: 18.0489 - val_accuracy: 1.0000
Epoch 469/500
486/486 [==============================] - 0s 47us/sample - loss: 18.8528 - accuracy: 1.0000 - val_loss: 18.5054 - val_accuracy: 1.0000
Epoch 470/500
486/486 [==============================] - 0s 49us/sample - loss: 19.1916 - accuracy: 1.0000 - val_loss: 18.2069 - val_accuracy: 1.0000
Epoch 471/500
486/486 [==============================] - 0s 47us/sample - loss: 18.9214 - accuracy: 1.0000 - val_loss: 18.1673 - val_accuracy: 1.0000
Epoch 472/500
486/486 [==============================] - 0s 47us/sample - loss: 19.0827 - accuracy: 1.0000 - val_loss: 18.0769 - val_accuracy: 1.0000
Epoch 473/500
486/486 [==============================] - 0s 48us/sample - loss: 19.6739 - accuracy: 1.0000 - val_loss: 18.3184 - val_accuracy: 1.0000
Epoch 474/500
486/486 [==============================] - 0s 43us/sample - loss: 20.0370 - accuracy: 1.0000 - val_loss: 18.6771 - val_accuracy: 1.0000
Epoch 475/500
486/486 [==============================] - 0s 43us/sample - loss: 19.9550 - accuracy: 1.0000 - val_loss: 18.7205 - val_accuracy: 1.0000
Epoch 476/500
486/486 [==============================] - 0s 45us/sample - loss: 19.0169 - accuracy: 1.0000 - val_loss: 18.6227 - val_accuracy: 1.0000
Epoch 477/500
486/486 [==============================] - 0s 43us/sample - loss: 19.0480 - accuracy: 1.0000 - val_loss: 18.5263 - val_accuracy: 1.0000
Epoch 478/500
486/486 [==============================] - 0s 101us/sample - loss: 19.0852 - accuracy: 1.0000 - val_loss: 17.9724 - val_accuracy: 1.0000
Epoch 479/500
486/486 [==============================] - 0s 95us/sample - loss: 18.9068 - accuracy: 1.0000 - val_loss: 17.8208 - val_accuracy: 1.0000
Epoch 480/500
486/486 [==============================] - 0s 102us/sample - loss: 18.8421 - accuracy: 1.0000 - val_loss: 17.8138 - val_accuracy: 1.0000
Epoch 481/500
486/486 [==============================] - 0s 38us/sample - loss: 19.0660 - accuracy: 1.0000 - val_loss: 17.8768 - val_accuracy: 1.0000
Epoch 482/500
486/486 [==============================] - 0s 42us/sample - loss: 19.0865 - accuracy: 1.0000 - val_loss: 18.0447 - val_accuracy: 1.0000
Epoch 483/500
486/486 [==============================] - 0s 55us/sample - loss: 18.9791 - accuracy: 1.0000 - val_loss: 17.8646 - val_accuracy: 1.0000
Epoch 484/500
486/486 [==============================] - 0s 46us/sample - loss: 18.7176 - accuracy: 1.0000 - val_loss: 18.1852 - val_accuracy: 1.0000
Epoch 485/500
486/486 [==============================] - 0s 112us/sample - loss: 18.7032 - accuracy: 1.0000 - val_loss: 17.7804 - val_accuracy: 1.0000
Epoch 486/500
486/486 [==============================] - 0s 49us/sample - loss: 18.5973 - accuracy: 1.0000 - val_loss: 18.1828 - val_accuracy: 1.0000
Epoch 487/500
486/486 [==============================] - 0s 47us/sample - loss: 19.1166 - accuracy: 1.0000 - val_loss: 18.0846 - val_accuracy: 1.0000
Epoch 488/500
486/486 [==============================] - 0s 50us/sample - loss: 18.8871 - accuracy: 1.0000 - val_loss: 18.0150 - val_accuracy: 1.0000
Epoch 489/500
486/486 [==============================] - 0s 46us/sample - loss: 18.7316 - accuracy: 1.0000 - val_loss: 17.8995 - val_accuracy: 1.0000
Epoch 490/500
486/486 [==============================] - 0s 49us/sample - loss: 18.6492 - accuracy: 1.0000 - val_loss: 18.0894 - val_accuracy: 1.0000
Epoch 491/500
486/486 [==============================] - 0s 51us/sample - loss: 18.8296 - accuracy: 1.0000 - val_loss: 18.0175 - val_accuracy: 1.0000
Epoch 492/500
486/486 [==============================] - 0s 43us/sample - loss: 18.6899 - accuracy: 1.0000 - val_loss: 17.7933 - val_accuracy: 1.0000
Epoch 493/500
486/486 [==============================] - 0s 43us/sample - loss: 18.6250 - accuracy: 1.0000 - val_loss: 17.8344 - val_accuracy: 1.0000
Epoch 494/500
486/486 [==============================] - 0s 107us/sample - loss: 18.7036 - accuracy: 1.0000 - val_loss: 17.6585 - val_accuracy: 1.0000
Epoch 495/500
486/486 [==============================] - 0s 43us/sample - loss: 19.0651 - accuracy: 1.0000 - val_loss: 18.0547 - val_accuracy: 1.0000
Epoch 496/500
486/486 [==============================] - 0s 43us/sample - loss: 19.4789 - accuracy: 1.0000 - val_loss: 18.3463 - val_accuracy: 1.0000
Epoch 497/500
486/486 [==============================] - 0s 45us/sample - loss: 19.3182 - accuracy: 1.0000 - val_loss: 18.2170 - val_accuracy: 1.0000
Epoch 498/500
486/486 [==============================] - 0s 43us/sample - loss: 19.1415 - accuracy: 1.0000 - val_loss: 18.5524 - val_accuracy: 1.0000
Epoch 499/500
486/486 [==============================] - 0s 47us/sample - loss: 19.0787 - accuracy: 1.0000 - val_loss: 18.3649 - val_accuracy: 1.0000
Epoch 500/500
486/486 [==============================] - 0s 49us/sample - loss: 19.5499 - accuracy: 1.0000 - val_loss: 19.3236 - val_accuracy: 1.0000
In [15]:
test_x_predictions = autoencoder.predict(X_test)
print(test_x_predictions.shape)
mse = np.mean(np.power(X_test - test_x_predictions, 2), axis=1)
mse
(300, 61)
Out[15]:
507     20.325340
818    171.442944
452      3.171927
368      9.374798
242     13.166230
          ...    
459      3.613510
415      5.667701
61      21.307917
347      2.346015
349      7.616852
Length: 300, dtype: float64
In [16]:
Y_test.value_counts()
Out[16]:
0    214
1     86
Name: CreditRisk, dtype: int64
In [17]:
fpr, tpr, thresholds = roc_curve(Y_test, mse)
print('thresholds', np.mean(thresholds))
auc(fpr, tpr)
thresholds 21.05576286049008
Out[17]:
0.5947620082590741
In [18]:
threshold_fixed = 21.05
accuracy_score(Y_test, [1 if s > threshold_fixed else 0 for s in mse])
Out[18]:
0.68
In [19]:
error_df_test = pd.DataFrame({'Reconstruction_error': mse,
                        'True_class': Y_test})
error_df_test = error_df_test.reset_index()

groups = error_df_test.groupby('True_class')
fig, ax = plt.subplots()
for name, group in groups:
    ax.plot(group.index, group.Reconstruction_error, marker='o', ms=3.5, linestyle='',
            label= "Bad Credit Risk" if name == 1 else "Good Credit Risk")
ax.hlines(threshold_fixed, ax.get_xlim()[0], ax.get_xlim()[1], colors="r", 
          zorder=100, label='Threshold')
ax.legend()
plt.title("Reconstruction error for different classes")
plt.ylabel("Reconstruction error")
plt.xlabel("Data point index")
plt.show();
In [20]:
pred_y = [1 if e > threshold_fixed else 0 for e in error_df_test['Reconstruction_error'].values]
conf_matrix = confusion_matrix(error_df_test['True_class'], pred_y)
plt.figure(figsize=(8, 6))
sns.heatmap(conf_matrix, 
            xticklabels=["Good Credit Risk","Bad Credit Risk"], 
            yticklabels=["Good Credit Risk","Bad Credit Risk"], 
            annot=True, fmt="d");
plt.title("Confusion matrix")
plt.ylabel('True class')
plt.xlabel('Predicted class')
plt.show()
In [21]:
false_pos_rate, true_pos_rate, thresholds = roc_curve(error_df_test['True_class'], 
                                                      error_df_test['Reconstruction_error'])
roc_auc = auc(false_pos_rate, true_pos_rate,)
plt.plot(false_pos_rate, true_pos_rate, linewidth=5, label='AUC = %0.3f'% roc_auc)
plt.plot([0,1],[0,1], linewidth=5)
plt.xlim([-0.01, 1])
plt.ylim([0, 1.01])
plt.legend(loc='lower right')
plt.title('Receiver operating characteristic curve (ROC)')
plt.ylabel('True Positive Rate')
plt.xlabel('False Positive Rate')
plt.show()

What's going on with these anomalies?

Data legend

https://archive.ics.uci.edu/ml/datasets/statlog+(german+credit+data)

In [22]:
# What are some of those features that we see via poor reconstruction error?
X_test['Error'] = mse
X_test['CreditRisk'] = Y_test
X_test = X_test.sort_values('Error', ascending=False)
X_test.head()
Out[22]:
DurationInMonths CreditAmount InstallmentRatePercentIncome ResidenceSince Age NumberExistingCredits FamilyLiablities HasChecking_A11 HasChecking_A12 HasChecking_A13 ... Job_A171 Job_A172 Job_A173 Job_A174 HasPhone_A191 HasPhone_A192 ForeignWorker_A201 ForeignWorker_A202 Error CreditRisk
236 6 14555 1 2 23 1 1 0 1 0 ... 1 0 0 0 0 1 1 0 247.861527 1
887 48 15672 2 2 23 1 1 0 1 0 ... 0 0 1 0 0 1 1 0 201.712036 1
818 36 15857 2 3 43 1 1 1 0 0 ... 0 0 0 1 1 0 1 0 171.442944 0
744 39 14179 4 4 30 2 1 1 0 0 ... 0 0 0 1 0 1 1 0 147.728383 0
274 30 11998 1 1 34 1 1 1 0 0 ... 0 1 0 0 0 1 1 0 92.356162 1

5 rows × 63 columns

In [24]:
normal_values = X_test[X_test['CreditRisk']==0].mean()
normal_values
Out[24]:
DurationInMonths                  19.191589
CreditAmount                    3053.785047
InstallmentRatePercentIncome       3.018692
ResidenceSince                     2.808411
Age                               35.542056
                                   ...     
HasPhone_A192                      0.425234
ForeignWorker_A201                 0.943925
ForeignWorker_A202                 0.056075
Error                             12.183574
CreditRisk                         0.000000
Length: 63, dtype: float64
In [25]:
diff_values = normal_values - X_test.head(10)
diff_values = diff_values.T
diff_values
Out[25]:
236 887 818 744 274 395 374 756 205 812
DurationInMonths 13.191589 -28.808411 -16.808411 -19.808411 -10.808411 -19.808411 -40.808411 13.191589 -10.808411 -16.808411
CreditAmount -11501.214953 -12618.214953 -12803.214953 -11125.214953 -8944.214953 -8706.214953 -11728.214953 1754.785047 -7569.214953 -6575.214953
InstallmentRatePercentIncome 2.018692 1.018692 1.018692 -0.981308 2.018692 1.018692 0.018692 2.018692 0.018692 -0.981308
ResidenceSince 0.808411 0.808411 -0.191589 -1.191589 1.808411 -0.191589 -1.191589 1.808411 -1.191589 -1.191589
Age 12.542056 12.542056 -7.457944 5.542056 1.542056 3.542056 -24.457944 -38.457944 -2.457944 11.542056
... ... ... ... ... ... ... ... ... ... ...
HasPhone_A192 -0.574766 -0.574766 0.425234 -0.574766 -0.574766 -0.574766 -0.574766 0.425234 -0.574766 -0.574766
ForeignWorker_A201 -0.056075 -0.056075 -0.056075 -0.056075 -0.056075 -0.056075 -0.056075 0.943925 -0.056075 -0.056075
ForeignWorker_A202 0.056075 0.056075 0.056075 0.056075 0.056075 0.056075 0.056075 -0.943925 0.056075 0.056075
Error -235.677953 -189.528462 -159.259370 -135.544809 -80.172588 -67.862145 -61.851474 -46.803959 -44.388845 -42.902533
CreditRisk -1.000000 -1.000000 0.000000 0.000000 -1.000000 0.000000 -1.000000 0.000000 0.000000 -1.000000

63 rows × 10 columns

In [30]:
# pick an index header and analyze
customer = 236
to_analyze = pd.DataFrame(diff_values[customer])
to_analyze = to_analyze.sort_values(customer, ascending=True)
print(to_analyze.head(1))
print(to_analyze.tail(1))
                       236
CreditAmount -11501.214953
                        236
DurationInMonths  13.191589
In [27]:
X_test[['CreditAmount','CreditRisk']].groupby('CreditRisk').mean()
Out[27]:
CreditAmount
CreditRisk
0 3053.785047
1 4035.046512
In [28]:
X_test[['DurationInMonths','CreditRisk']].groupby('CreditRisk').mean()
Out[28]:
DurationInMonths
CreditRisk
0 19.191589
1 25.116279
In [31]:
# pick an index header and analyze
customer = 887
to_analyze = pd.DataFrame(diff_values[customer])
to_analyze = to_analyze.sort_values(customer, ascending=True)
print(to_analyze.head(1))
print(to_analyze.tail(1))
                       887
CreditAmount -12618.214953
           887
Age  12.542056

Show Notes

(pardon typos and formatting -
these are the notes I use to make the videos)

An autoencoder is a great tool to dig deep into your data. If you are unsure of what to focus on or you want to look at the bigger picture, an unsupervised or semi-supervised model might be able to give you fresh new insights and new areas to investigate. It is a great compliment to a traditional supervised model.