Train an LSTM (Long Short-Term Memory) model to generate text
This example allows you to train a model to generate text in the style of some existing source text. The model is designed to predict the next character in a text given some preceding string of characters. Doing this repeatedly builds up a text, character by character.
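The generation loop described above can be sketched as follows. This is a minimal illustration, not the demo's actual implementation: `predictNextChar` is a hypothetical stand-in for the trained LSTM's prediction step, and the 40-character window size is taken from the seed-text requirement mentioned later on this page.

```javascript
const WINDOW = 40; // number of preceding characters the model conditions on

// Hypothetical stub standing in for the trained model: a real LSTM would
// return the character it predicts most likely to come next.
function predictNextChar(window) {
  return window[window.length - 1] === ' ' ? 'a' : ' ';
}

// Repeatedly predict one character and append it, sliding the window forward.
function generate(seed, length) {
  let text = seed;
  for (let i = 0; i < length; i++) {
    const window = text.slice(-WINDOW); // the last WINDOW characters so far
    text += predictNextChar(window);    // append one predicted character
  }
  return text;
}
```

Each new character is appended to the running text, so later predictions are conditioned on earlier ones.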
It can take a while to generate an effective model. Try increasing the number of epochs to improve the results; we have found that about 50-100 epochs are needed to start generating reasonable text.
Text Generation Parameters
To generate text, the model needs some number of preceding characters from which to continue; we call these characters the seed text. You can type one in, or we will extract a random substring from the input text to be the seed text. Note that the seed text must be at least 40 characters long.
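Extracting a random seed from the input text can be sketched like this. The function name and error handling are illustrative assumptions, not the demo's actual code; only the 40-character minimum comes from the text above.

```javascript
const SEED_LENGTH = 40; // minimum seed length the model expects

// Pick a random 40-character substring of the source text to use as a seed.
function randomSeed(sourceText) {
  if (sourceText.length < SEED_LENGTH) {
    throw new Error('Source text is shorter than the required seed length.');
  }
  const start = Math.floor(Math.random() * (sourceText.length - SEED_LENGTH + 1));
  return sourceText.slice(start, start + SEED_LENGTH);
}
```

A seed typed in by hand would need the same length check before being passed to the model.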