This example demonstrates re-training of a pre-trained model in the browser. The model was initially trained in Python and converted to the TensorFlow.js format; it can then be further trained using data from the browser. Re-training an already trained network on new data is a form of transfer learning.
In this case, the pretrained model was trained on a subset of the MNIST data: only the digits 0 - 4. The data we'll use for transfer learning in the browser consists of the digits 5 - 9. This example shows that the first several layers of a pretrained model can serve as feature extractors on new data during transfer learning, resulting in faster training on the new data.
When retraining the model, three different approaches are available: freezing the feature-extractor layers and training only the later layers, training all layers without freezing, and reinitializing all weights before training.
Below is an ASCII bitmap of some test examples from the digits in the new transfer-learning data set: digits 5 through 9. The numbers are the grayscale integer values of the image's pixels. You can edit these values to see how changing pixel values affects the classification probabilities output by the model.