TensorFlow Hub is a way to share pretrained model components. See the TensorFlow Module Hub for a searchable listing of pretrained models. This tutorial demonstrates how to use these pretrained components with Keras.
Use layer_hub to load a MobileNet and transform it into a Keras layer. Any TensorFlow 2-compatible image classifier URL from tfhub.dev will work here.
classifier_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/2"
mobilenet_layer <- layer_hub(handle = classifier_url)
We can then create our Keras model:
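A minimal sketch of that model (assuming the keras package is loaded and mobilenet_layer is the layer created above): we wrap the hub layer between an input of MobileNet's expected 224x224 RGB shape and the model output.

```r
library(keras)

# MobileNet v2 expects 224x224 RGB images.
input <- layer_input(shape = c(224, 224, 3))
output <- input %>% mobilenet_layer()

model <- keras_model(input, output)
summary(model)
```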
Download a single image to try the model on.
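One way to do this (the image URL and preprocessing below are illustrative; `model` is the classifier built above, and hub image modules expect inputs in the [0, 1] range):

```r
# The Grace Hopper portrait is a common test image (URL is an example).
img_url <- "https://www.tensorflow.org/images/grace_hopper.jpg"
img_path <- tempfile(fileext = ".jpg")
download.file(img_url, img_path, mode = "wb")

# Load, resize to the model's input size, scale to [0, 1],
# and add a batch dimension before predicting.
img <- image_load(img_path, target_size = c(224, 224)) %>%
  image_to_array()
img <- img / 255
img <- array_reshape(img, c(1, dim(img)))

pred <- predict(model, img)
which.max(pred)  # index of the highest-scoring class
```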
Using TF Hub it is simple to retrain the top layer of the model to recognize the classes in our dataset.
For this example you will use the TensorFlow flowers dataset:
if (!dir.exists("flower_photos")) {
  url <- "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz"
  tgz <- tempfile(fileext = ".tgz")
  download.file(url, destfile = tgz)
  utils::untar(tgz, exdir = ".")
}
data_root <- "flower_photos"
The simplest way to load this data into our model is using image_data_generator. All of TensorFlow Hub's image modules expect float inputs in the [0, 1] range; use the image_data_generator's rescale parameter to achieve this.
image_generator <- image_data_generator(rescale = 1/255, validation_split = 0.2)
training_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "training"
)
validation_data <- flow_images_from_directory(
  directory = data_root,
  generator = image_generator,
  target_size = c(224, 224),
  subset = "validation"
)
The resulting object is an iterator that returns (image_batch, label_batch) pairs.
TensorFlow Hub also distributes models without the top classification layer. These can be used to do transfer learning easily. Any TensorFlow 2-compatible image feature vector URL from tfhub.dev will work here.
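For example, the feature vector variant of the same MobileNet module:

```r
feature_extractor_url <- "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/2"
feature_extractor_layer <- layer_hub(handle = feature_extractor_url)
```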
Now we can create our classification model by attaching a classification head on top of the feature extractor layer. We define the following model:
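A sketch of such a model, assuming `feature_extractor_layer` is an image feature vector module loaded with layer_hub as described above; the dense head has one unit per flower class:

```r
input <- layer_input(shape = c(224, 224, 3))
output <- input %>%
  feature_extractor_layer() %>%
  # One output unit per class found by flow_images_from_directory.
  layer_dense(units = training_data$num_classes, activation = "softmax")

model <- keras_model(input, output)
summary(model)
```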
We can now train our model in the same way we would train any other Keras model. We first use compile to configure the training process:
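For example (the optimizer and metric choices here are typical for this task, not mandated by the tutorial):

```r
model %>% compile(
  optimizer = "adam",
  loss = "categorical_crossentropy",
  metrics = "accuracy"
)
```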
We can then use the fit_generator function to fit our model, since our data comes from a generator:
model %>%
  fit_generator(
    training_data,
    steps_per_epoch = training_data$n / training_data$batch_size,
    validation_data = validation_data
  )
You can then export your model with:
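For instance, with export_savedmodel (the directory name "savedmodel" is arbitrary):

```r
export_savedmodel(model, "savedmodel")
```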
You can also reload the model with the model_from_saved_model function. Note that you need to pass the custom_objects argument with the definition of the KerasLayer, since it's not a default Keras layer.
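A sketch of reloading, assuming the model was exported to "savedmodel"; here the KerasLayer implementation is taken from the Python tensorflow_hub module via reticulate (this particular mapping is an assumption and may differ across package versions):

```r
hub <- reticulate::import("tensorflow_hub")

reloaded_model <- model_from_saved_model(
  "savedmodel",
  custom_objects = list("KerasLayer" = hub$KerasLayer)
)
```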
We can verify that the predictions of both the trained model and the reloaded model are equal:
steps <- as.integer(validation_data$n / validation_data$batch_size)
all.equal(
  predict_generator(model, validation_data, steps = steps),
  predict_generator(reloaded_model, validation_data, steps = steps)
)
The saved model can also be loaded for inference later, or converted to TensorFlow Lite or TensorFlow.js.