The OpenAI model generates short summaries of books

OpenAI has trained a neural network to summarize entire books. The model is fine-tuned from GPT-3 and uses recursive task decomposition: it first summarizes small sections of the book and then combines those partial summaries into a summary of the whole.

Compared to an end-to-end training procedure, recursive decomposition makes it possible to summarize books of unlimited length: the approach sidesteps the limit on how much context the model can hold at once.
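The idea can be illustrated with a minimal sketch. Here `summarize` is a hypothetical stand-in for a model call (stubbed to return the first sentence), and the chunk size plays the role of the model's context limit; none of these names come from OpenAI's implementation.

```python
def summarize(text: str) -> str:
    # Stand-in for a model call: keep only the first sentence.
    first = text.split(". ")[0].strip()
    return first if first.endswith(".") else first + "."

def recursive_summary(text: str, context_limit: int = 200) -> str:
    # Base case: the whole text fits in the (assumed) context window.
    if len(text) <= context_limit:
        return summarize(text)
    # Split into chunks that each fit, summarize them independently...
    chunks = [text[i:i + context_limit]
              for i in range(0, len(text), context_limit)]
    joined = " ".join(summarize(c) for c in chunks)
    # ...then recurse on the concatenated partial summaries.
    if len(joined) >= len(text):  # guard against non-shrinking summaries
        return summarize(joined)
    return recursive_summary(joined, context_limit)
```

Because each recursion step only ever sees text that fits in the context window, the total length of the book no longer matters.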

In addition to recursive decomposition, the network was trained with reinforcement learning from human feedback: human labelers read the corresponding section of the book and rated its summary. In this way, the model learned to produce summaries that meet the reviewers' expectations.
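One simple way human ratings can steer generation is best-of-n selection against a reward model trained to predict those ratings. The sketch below is purely illustrative: `reward_model` is a stub (using brevity as a trivial proxy for a learned rating predictor), not OpenAI's actual reward model.

```python
def reward_model(summary: str) -> float:
    # Stub for a model trained to predict human ratings.
    # Trivial proxy here: prefer shorter summaries.
    return -len(summary)

def best_of_n(candidates: list[str]) -> str:
    # Generate several candidate summaries, then keep the one
    # the reward model scores highest.
    return max(candidates, key=reward_model)
```

In the real setup, the reward model's scores also serve as the reinforcement signal during training, so the policy itself shifts toward summaries reviewers rate highly.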

In about 5% of cases, the OpenAI model generated summaries whose quality was comparable to ones written by a person. The model also achieved state-of-the-art results on the BookSum benchmark.

The model was developed as part of OpenAI's broader effort to align model output with human preferences and goals.
