When using a trained model in an online service, I would like to update it every day with the newly accumulated data, but retraining on the entire dataset in a daily batch job costs both time and money. In image recognition it is common to take a pre-trained model such as VGG16 and fine-tune it on the images you want to classify. So this time, I saved a model built on ordinary data and tried fine-tuning it.
Only the key points are covered here; working sample code is available at the following link. https://github.com/tizuo/keras/blob/master/%E8%BB%A2%E7%A7%BB%E5%AD%A6%E7%BF%92%E3%83%86%E3%82%B9%E3%83%88.ipynb
This time, we split the iris data into two parts and train on them in turn.
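As a concrete sketch of that split (the variable names `X_base`, `new_X`, etc. are my own choice, not from the article), one half serves as the "base" data and the other as the "newly accumulated" data:

```python
# Shuffle the iris dataset and split it into a "base" half and a
# "new" half. Variable names (X_base, new_X, ...) are my own choice.
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
rng = np.random.RandomState(0)
idx = rng.permutation(len(iris.data))          # 150 samples, shuffled
X, y = iris.data[idx], iris.target[idx]

Y = np.eye(3)[y]                               # one-hot encode the 3 classes

half = len(X) // 2
X_base, Y_base = X[:half], Y[:half]            # data for the base model
new_X, new_Y = X[half:], Y[half:]              # "newly accumulated" data
```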
First, define the base model. The point is simply to set the `name` parameter on each layer whose weights you want to carry over.
```python
from keras.models import Sequential
from keras.layers import Dense, Activation

model_b = Sequential()
model_b.add(Dense(4, input_shape=(4, ), name='l1'))
model_b.add(Activation('relu'))
model_b.add(Dense(4, name='l2'))
model_b.add(Activation('relu'))
model_b.add(Dense(3, name='cls'))
model_b.add(Activation('softmax'))
```
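Before saving, the base model has to be compiled and fitted. A minimal, self-contained sketch with stand-in data shaped like the first half of the iris set (the optimizer, epoch count, and batch size here are my own choices, not from the article):

```python
# Compile and fit the base model on stand-in data shaped like the
# first half of the iris set (75 samples, 4 features, 3 classes).
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation

model_b = Sequential()
model_b.add(Dense(4, input_shape=(4, ), name='l1'))
model_b.add(Activation('relu'))
model_b.add(Dense(4, name='l2'))
model_b.add(Activation('relu'))
model_b.add(Dense(3, name='cls'))
model_b.add(Activation('softmax'))

model_b.compile(optimizer='adam', loss='categorical_crossentropy',
                metrics=['accuracy'])

rng = np.random.RandomState(0)
X_base = rng.rand(75, 4)                       # stand-in for the base half
Y_base = np.eye(3)[rng.randint(0, 3, 75)]      # stand-in one-hot labels
model_b.fit(X_base, Y_base, epochs=5, batch_size=8, verbose=0)
```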
After fitting it on the base data, save the model's weights.
```python
model_b.save_weights('my_model_weights.h5')
```
Next, define the model for the new data, keeping the layer names aligned with the base model. This example also adds a Dropout layer so that the new data is not reflected too strongly.
```python
from keras.layers import Dropout

model_n = Sequential()
model_n.add(Dense(4, input_shape=(4, ), name='l1'))
model_n.add(Activation('relu'))
model_n.add(Dense(4, name='l2'))
model_n.add(Activation('relu'))
model_n.add(Dropout(0.5))
model_n.add(Dense(3, name='cls'))
model_n.add(Activation('softmax'))
```
Load the saved weights into the newly created model and train it on the new data. Passing `by_name=True` loads weights only into layers whose names match the saved model.
```python
# Load the base model's weights into the layers with matching names
model_n.load_weights('my_model_weights.h5', by_name=True)

# Compile & run
model_n.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model_n.fit(new_X, new_Y, epochs=50, batch_size=1, verbose=1)
```
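If you want the inherited layers to keep their base weights exactly (as is common when fine-tuning VGG16 and the like), Keras can also freeze them by setting `trainable = False` before compiling. The article itself retrains all layers, so this is just an optional variation:

```python
# Optional variation: freeze the inherited layers ('l1', 'l2') so that
# only the new classifier trains on the new data. Do this before compile().
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout

model_n = Sequential()
model_n.add(Dense(4, input_shape=(4, ), name='l1'))
model_n.add(Activation('relu'))
model_n.add(Dense(4, name='l2'))
model_n.add(Activation('relu'))
model_n.add(Dropout(0.5))
model_n.add(Dense(3, name='cls'))
model_n.add(Activation('softmax'))

for layer in model_n.layers:
    if layer.name in ('l1', 'l2'):
        layer.trainable = False        # keep the base weights fixed

model_n.compile(optimizer='adam', loss='categorical_crossentropy',
                metrics=['accuracy'])
# Now only the 'cls' kernel and bias are updated during fit().
```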
This approach also seems useful when you want to train on a dataset in chunks because it is too large to fit in memory at once.
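A hedged sketch of that chunked-training idea (the `load_chunk` helper and the chunk sizes are hypothetical; repeated `fit()` calls keep updating the same weights, so the full dataset never has to be in memory at once):

```python
# Train one chunk at a time; weights carry over between fit() calls,
# so only one chunk of data is held in memory at any moment.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(4, input_shape=(4, ), name='l1'))
model.add(Activation('relu'))
model.add(Dense(3, name='cls'))
model.add(Activation('softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy')

def load_chunk(i):
    # Hypothetical helper: stands in for reading chunk i from disk.
    rng = np.random.RandomState(i)
    return rng.rand(50, 4), np.eye(3)[rng.randint(0, 3, 50)]

for i in range(3):                     # e.g. 3 chunks instead of one big array
    X_chunk, Y_chunk = load_chunk(i)
    model.fit(X_chunk, Y_chunk, epochs=5, batch_size=8, verbose=0)
```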