TensorFlow v1.1 has been released. It was a bit disappointing that there was no mention of XLA, but as the roadmap indicated, the integration with Keras is starting. It is still only at the RC stage, but it is one of the highlights for me personally, so I gave it a quick try.
I reused the Keras code from my earlier article, "Miscellaneous commentary on TensorFlow's High Level API".
```python
import tensorflow as tf
from tensorflow.contrib.keras.python import keras
from tensorflow.contrib.keras.python.keras.models import Sequential
from tensorflow.contrib.keras.python.keras.layers import Dense
from sklearn import cross_validation

# Data preparation
iris = tf.contrib.learn.datasets.base.load_iris()
train_x, test_x, train_y, test_y = cross_validation.train_test_split(
    iris.data, iris.target, test_size=0.2
)
num_classes = 3
train_y = keras.utils.to_categorical(train_y, num_classes)
test_y = keras.utils.to_categorical(test_y, num_classes)

# Model definition
model = Sequential()

# Network definition
model.add(Dense(10, activation='relu', input_shape=(4,)))
model.add(Dense(20, activation='relu'))
model.add(Dense(10, activation='relu'))
model.add(Dense(3, activation='softmax'))

# Check the model summary
model.summary()

# Compile the model
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])

# Training
history = model.fit(train_x, train_y,
                    batch_size=100,
                    epochs=2000,
                    verbose=1,
                    validation_data=(test_x, test_y))

# Evaluate the trained model
score = model.evaluate(test_x, test_y, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
```
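As a quick follow-up that is not in the original article, the trained model can also be used for inference with the standard Keras `model.predict` call. A minimal sketch, assuming the training code above has already been run:

```python
import numpy as np

# Predict class probabilities for the held-out set and recover class labels
probs = model.predict(test_x)
pred_labels = np.argmax(probs, axis=1)
true_labels = np.argmax(test_y, axis=1)
print('First 5 predictions:', pred_labels[:5])
print('First 5 ground truth:', true_labels[:5])
```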
- As shown above, the Keras modules are accessible simply by importing from `tensorflow.contrib.keras.python`, which looks promising.
- Combined with the Estimator API, I can see a future where the division of labor between data scientists and engineers progresses (see the sketch after this list).
- I have a feeling that `CloudML Engine` will become easier to use.
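On the Estimator point: the article only hints at it, and the bridge was not yet available in v1.1, but later TensorFlow releases (1.4+) expose `tf.keras.estimator.model_to_estimator`. A minimal sketch under that assumption, reusing the compiled `model`, `train_x`, and `train_y` from above:

```python
import numpy as np
import tensorflow as tf

# Wrap the compiled Keras model as an Estimator
# (assumes TF >= 1.4, where tf.keras.estimator.model_to_estimator exists).
estimator = tf.keras.estimator.model_to_estimator(keras_model=model)

# Estimators consume input functions; the feature key must match the name
# of the Keras input layer (check model.input_names).
input_name = model.input_names[0]
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={input_name: train_x.astype(np.float32)},
    y=train_y.astype(np.float32),
    batch_size=100,
    num_epochs=None,
    shuffle=True)

estimator.train(input_fn=train_input_fn, steps=1000)
```

The appeal is exactly the division of labor mentioned above: a data scientist can define and compile the model in Keras, while an engineer runs it through the Estimator machinery (distributed training, `CloudML Engine`, serving) without rewriting it.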
I have only tried it quickly, so I would like to look into it in more detail another time.