
This repository showcases text generation using an RNN trained on Shakespeare's writings. The project aims to predict the subsequent character in a sequence based on input characters. The model can generate extended sequences of text through iterative calling.


Text Generation with an RNN

This project demonstrates how to generate text using a character-based Recurrent Neural Network (RNN). We work with a dataset of Shakespeare's writing from Andrej Karpathy's The Unreasonable Effectiveness of Recurrent Neural Networks. Given a sequence of characters from this data (e.g. "Shakespear"), the model predicts the next character in the sequence ("e"). Longer sequences of text can be generated by calling the model repeatedly.

Project Objectives:

  • Generate text using an RNN.

  • Create training examples and targets for text generation.

  • Build an RNN model for sequence generation using Keras Subclassing.

  • Create a text generator and evaluate the output.
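The second objective, creating training examples and targets, can be sketched in plain Python (the slicing itself needs no TensorFlow). Each chunk of seq_length + 1 characters yields an input sequence and a target sequence shifted one character to the right; the names below are illustrative, not the project's actual API:

```python
# Sketch: build (input, target) pairs for next-character prediction.
# For each chunk of seq_length + 1 characters, the input is the first
# seq_length characters and the target is the same text shifted by one.

text = "First Citizen: Before we proceed any further, hear me speak."
seq_length = 10

def split_input_target(chunk):
    """Drop the last character for the input, the first for the target."""
    return chunk[:-1], chunk[1:]

chunks = [text[i:i + seq_length + 1]
          for i in range(0, len(text) - seq_length, seq_length + 1)]
pairs = [split_input_target(c) for c in chunks]

print(pairs[0])  # ('First Citi', 'irst Citiz')
```

At every position the model thus sees the character it should have predicted, which is what lets the loss be computed across the whole sequence at once.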

Sample Output:

QUEENE:  
I had thought thou hadst a Roman; for the oracle,  
Thus by All bids the man against the word,  
Which are so weak of care, by old care done;  
Your children were in your holy love,  
And the precipitation through the bleeding throne.  
  
BISHOP OF ELY:  
Marry, and will, my lord, to weep in such a one were prettiest;  
Yet now I was adopted heir  
Of the world's lamentable day,  
To watch the next way with his father with his face?  
  
ESCALUS:  
The cause why then we are all resolved more sons.  
  
VOLUMNIA:  
O, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, it is no sin it should be dead,  
And love and pale as any will to that word.  
  
QUEEN ELIZABETH:  
But how long have I heard the soul for this world,  
And show his hands of life be proved to stand.  
  
PETRUCHIO:  
I say he look'd on, if I must be content  
To stay him from the fatal of our country's bliss.  
His lordship pluck'd from this sentence then for prey,  
And then let us twain, being the moon,  
were she such a case as fills m  

While some of the sentences are grammatical, most do not make sense. The model has not learned the meaning of words, but consider the following:

  • The model is character-based. When training started, the model did not know how to spell an English word, or that words were even a unit of text.

  • The structure of the output resembles a play: blocks of text generally begin with a speaker name in all capital letters, similar to the dataset.

  • The model is trained on small batches of text (100 characters each), and is still able to generate a longer sequence of text with coherent structure.

Technologies Used:

  • Python 3.10

  • TensorFlow

  • NumPy

  • NLP

  • RNN

Getting Started

To get a local copy up and running follow these simple steps.

Installation

  1. Clone the repo:
     git clone https://github.com/stefanshipinkoski/text-generation-RNN
  2. Change into the project directory:
     cd text-generation-RNN
  3. Create a conda environment and install the required dependencies:
     conda env create -f enviorment.yml
  4. Activate the conda environment:
     conda activate water-quality-env
  5. Run Jupyter Notebook in the directory:
     jupyter notebook

Contribution:

Contributions to this project are welcome. If you're interested in improving or extending this text generation RNN, consider the following:

  • Enhancing the model's training on larger text sequences to potentially improve coherence in generated text.

  • Experimenting with different RNN architectures or hyperparameters to optimize text generation results.

  • Implementing post-processing techniques to refine the generated text for better readability or context coherence.

  • Adding additional datasets or texts to diversify the model's training for broader language generation capabilities.

Feel free to fork this repository, make improvements, and open pull requests with your enhancements. All contributions are appreciated and encouraged!
