Code a Poetry Generator With RNNs

Artificial Intelligence has become increasingly creative, producing art, music, and even poetry. One fascinating application is building a Poetry Generator using Recurrent Neural Networks (RNNs). In this blog, we’ll explore how RNNs work for text generation, and how you can create your own AI poet using Python and TensorFlow/Keras.


Why RNNs for Text Generation?

Traditional neural networks struggle with sequential data like text, where context and order matter. Recurrent Neural Networks (RNNs) are specially designed to handle sequences, allowing them to "remember" previous inputs using internal memory.

For poetry generation, RNNs can learn grammar, rhythm, and patterns in poems by training on a dataset of existing poetry. Over time, the model begins to generate new, original verses that mimic the style and structure of its training data.
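To make the "internal memory" idea concrete, here is a toy sketch (illustration only, not the model trained below) of a single vanilla-RNN step in NumPy: the hidden state h is updated from each new input *and* from its own previous value, so after the loop it summarizes the whole sequence.

```python
import numpy as np

# Toy illustration: one vanilla-RNN recurrence, h_t = tanh(Wx·x_t + Wh·h_{t-1}).
# The Wh term is the "memory" path that carries earlier inputs forward.
rng = np.random.default_rng(0)
W_x = rng.normal(size=(4, 3))       # input -> hidden weights (hidden=4, input=3)
W_h = rng.normal(size=(4, 4))       # hidden -> hidden weights (the memory path)

h = np.zeros(4)                     # initial hidden state
sequence = rng.normal(size=(5, 3))  # 5 time steps of 3-dim inputs

for x_t in sequence:
    h = np.tanh(W_x @ x_t + W_h @ h)  # h now depends on every input seen so far

print(h.shape)  # (4,)
```

LSTMs (used below) refine this recurrence with gates that control what gets remembered or forgotten, which helps over long sequences like poems.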


Step-by-Step: Build Your Poetry Generator

1. Prepare the Dataset

Start by collecting a dataset of poems. For example, you can use classic poems from public domain sources like Project Gutenberg. Once collected, clean the text by:

Converting to lowercase

Removing unwanted characters (punctuation, extra spaces)

python

with open('poems.txt', 'r', encoding='utf-8') as f:
    text = f.read().lower()

print("Text length:", len(text))


2. Tokenize the Text

To train the model, map each unique character to an integer index (and keep the reverse mapping for decoding the output later).


python

chars = sorted(set(text))
char_to_idx = {char: i for i, char in enumerate(chars)}
idx_to_char = {i: char for i, char in enumerate(chars)}
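As a quick sanity check, the two mappings round-trip losslessly. This toy snippet uses a short sample string for illustration; in the real script, `text` is the full poem corpus loaded earlier.

```python
# Illustration only: verify encode/decode round-trips on a toy string.
sample = "a rose is a rose"
chars = sorted(set(sample))
char_to_idx = {char: i for i, char in enumerate(chars)}
idx_to_char = {i: char for i, char in enumerate(chars)}

encoded = [char_to_idx[c] for c in sample]          # text -> integers
decoded = ''.join(idx_to_char[i] for i in encoded)  # integers -> text

print(decoded == sample)  # True
```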

Create sequences of fixed length:


python

seq_length = 100
X = []
y = []

for i in range(0, len(text) - seq_length):
    X.append([char_to_idx[c] for c in text[i:i+seq_length]])
    y.append(char_to_idx[text[i + seq_length]])

Normalize and one-hot encode:


python

import numpy as np

X = np.reshape(X, (len(X), seq_length, 1)) / len(chars)
y = np.eye(len(chars))[y]

3. Build the RNN Model

Use Keras to define an RNN (LSTM-based) model:


python

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(256, input_shape=(seq_length, 1)))
model.add(Dense(len(chars), activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(X, y, epochs=20, batch_size=128)


4. Generate Poetry

To generate text, pick a seed sequence and predict characters iteratively:


python

import random

start = random.randint(0, len(text) - seq_length - 1)
pattern = text[start:start + seq_length]
generated = pattern

for _ in range(500):
    x_input = np.reshape([char_to_idx[char] for char in pattern], (1, seq_length, 1)) / len(chars)
    prediction = model.predict(x_input, verbose=0)
    index = np.argmax(prediction)
    generated_char = idx_to_char[index]
    generated += generated_char
    pattern = pattern[1:] + generated_char

print(generated)

Tips for Better Results

Train on larger datasets (Shakespeare, romantic poetry, etc.)

Use deeper or bidirectional LSTM layers

Experiment with temperature-based sampling for more diverse outputs
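The last tip can be sketched as follows. This helper (an assumption, not part of the script above) rescales the model's predicted probabilities before sampling: temperatures below 1.0 make the output more conservative, temperatures above 1.0 make it more surprising. It is a drop-in replacement for the `np.argmax(prediction)` line in the generation loop.

```python
import numpy as np

def sample_with_temperature(probs, temperature=1.0):
    """Sample a character index from a probability vector, rescaled by temperature."""
    probs = np.asarray(probs, dtype=np.float64)
    logits = np.log(probs + 1e-8) / temperature   # rescale log-probabilities
    exp = np.exp(logits - logits.max())           # subtract max for numerical stability
    p = exp / exp.sum()                           # renormalize to a distribution
    return np.random.choice(len(p), p=p)

# In the generation loop, replace the argmax line with e.g.:
# index = sample_with_temperature(prediction[0], temperature=0.8)
```

At very low temperatures this behaves like argmax; at temperature 1.0 it samples from the model's distribution unchanged.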


Conclusion

Creating a poetry generator with RNNs combines the beauty of literature with the power of AI. While the initial output may be rough, continued training and dataset refinement can lead to surprisingly poetic results. Whether you're an AI enthusiast or a literature lover, this project is a creative and educational journey into the world of neural networks and generative text. 


Learn more: Generative AI Course

Read More : Build Your Own AI Meme Generator

Read More : Create a Deepfake Video Generator

Read More : Using Unity and AI to Generate Game Environments

Visit Our IHUB Talent Institute Hyderabad.
