RNNs and LSTM Networks

Code: char_rnn.py

Are you interested in creating a chat bot or doing language processing with Deep Learning? This tutorial will show you one of Caffe2’s example Python scripts that you can run out of the box and modify to start your project from: a working Recurrent Neural Network (RNN). This particular RNN is a Long Short Term Memory (LSTM) network, capable of learning and maintaining a memory over time while showing gradual improvement. For more information, you might want to check out A. Karpathy’s blog and his RNN project, the inspiration for this Caffe2 variation of the code he posted. It goes into further technical detail with some great animations and examples.

What’s fascinating about this script is that you can throw a variety of data sources at it, like the works of Shakespeare, and not only will it “learn” English, grammar, and spelling, it will also pick up on the nuances of structure and prose used in those works. Likewise, you can feed it speeches from Obama, and while the output might sound like typical political rhetoric from a 3 AM C-SPAN filibuster, you’ll spot a familiar cadence and rhythm that you can almost hear in the nearly intelligible words.

Fed source code, it will learn that language’s structure and mimic it, producing code that may not compile but looks reasonable at first glance.

Grab and combine ancient cookbooks from Project Gutenberg and create an AI that spits out recipes reminiscent of feasts from Game of Thrones. Feed it even larger sets of data and see what unique creations might be born from your own LSTM network.

Usage Example - Shakespeare

First you’ll want to download some Shakespeare as your training data. Save it as shakespeare.txt right in the /caffe2_root/caffe2/python/examples directory.
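If you don’t already have a copy, here’s a minimal sketch for fetching one. The URL below points at the tiny-shakespeare file from Karpathy’s char-rnn repo; any plain-text Shakespeare source works just as well.

  # A minimal sketch for grabbing training text (Python 2, which Caffe2's
  # example scripts target). The URL points at the tiny-shakespeare file
  # from Karpathy's char-rnn repo; any plain-text source will do.
  import urllib2

  URL = ("https://raw.githubusercontent.com/karpathy/"
         "char-rnn/master/data/tinyshakespeare/input.txt")

  with open("shakespeare.txt", "wb") as out:
      out.write(urllib2.urlopen(URL).read())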

Then run the script, passing in the shakespeare.txt file as the argument for --train_data, and sit back for a bit.

  python char_rnn.py --train_data shakespeare.txt

Output will be something like this initially:

  ---------- Iteration 5500 ----------
  Bonge adpold,
  Youre on, of your ins encisgath no housr the hould:
  go restibless.
  
  OTNOUS:
  Ot of your vese the hisghts ank our he wirmshe so ir demand!
  If lorst in and envire lake's Remans a weaker your with a am do entice, I his still.
  
  DEBUG:char_rnn:Loss since last report: 48.8397690892
  DEBUG:char_rnn:Smooth loss: 50.3314862722
  Characters Per Second: 8379
  Iterations Per Second: 335

As you let it run for a while, it will eventually begin to form real words and understandable sentences. Let it run while we check out some other parameters.

Using Your GPU and the Other Options

If you have a GPU on your computer, then try out that setting by adding the --gpu flag.

  python char_rnn.py --train_data shakespeare.txt --gpu

You may also want to adjust some of the other optional parameters to increase performance or change how the training behaves.

  • --gpu: enables the network to utilize GPU resources, massively increasing the speed of training (or learning)
  • --seq_length: the number of characters in one continuous sequence grabbed from the training data and passed into the network (see the sketch after this list); defaults to 25
  • --batch_size: the number of sequences processed per iteration; defaults to 1
  • --iters_to_report: how often the script prints a status report like the one above; defaults to 500
  • --hidden_size: the size of the neural network’s hidden layer; defaults to 100
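To make seq_length concrete, here’s a rough sketch of how a character stream becomes input/target pairs for next-character prediction. This is illustrative only; char_rnn.py does its own batching internally, and the function name here is made up.

  # Illustrative only: slicing text into seq_length-sized windows for
  # next-character prediction. The target is the input shifted by one
  # character, so the network learns "given these characters, what comes
  # next?" at every position.
  def make_example(text, offset, seq_length):
      inputs = text[offset : offset + seq_length]
      targets = text[offset + 1 : offset + seq_length + 1]
      return inputs, targets

  text = open("shakespeare.txt").read()
  inputs, targets = make_example(text, 0, 25)  # seq_length defaults to 25
  print(repr(inputs))
  print(repr(targets))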

Meanwhile What’s Cleopatra Doing?

Even after just a few minutes you can see changes in the output that are getting close to Shakespeare, or English at least, with a sort of screenplay structure. You can see in the output below that we’re at iteration 184,000 of the network. The loss has dropped quite quickly, from around 50 when we first started down to about 35.

  ---------- Iteration 184000 ----------
  Lent parture there are do, brother's chawne
  A father would conquer I will my seem.
  
  Second Vreture aporier service the sidgethines, when the blood lie without again toediers
  So be would be offers' true yaunder,,
  And I will onseling say?
  
  CLEOPATRA:
  When he do. Tyouth from whell?
  
  DEBUG:char_rnn:Loss since last report: 35.090320313
  DEBUG:char_rnn:Smooth loss: 35.1464676378
  Characters Per Second: 8267
  Iterations Per Second: 330

Example Parameter Changes

Here’s another example of running the RNN for about 10 minutes, but increasing batch_size to 32, hidden_size to 512, and seq_length to 32. Note how the screenplay format is evident, with names followed by dialogue, and the smooth loss is now down around 0.23.

  python char_rnn.py --train_data shakespeare.txt --gpu --batch_size 32 --hidden_size 512 --seq_length 32
  
  ---------- Iteration 26000 ----------
  ;
  Of chave you the tixle, and to yo as my son, Brenis a fair,
  
  DUKE EINIOR:
  Who kous! I dre. Shall with thee,
  The malice and tongue Guilty of it owed by thee
  That teaching the heavier fates of death of grace
  As I chid the poor hair to kiss our field;
  Meat thou before the seven blow with letters out,
  And turn my virtues are ourselves; there are ye dead;
  And melted legs contracted in such ones,
  Some beer whose aim must be supposed both blind.
  
  ISABELLA:
  A warrior of the sway.
  
  SIR HUGH EVANS:
  For a
  DEBUG:char_rnn:Loss since last report: 0.254487129331
  DEBUG:char_rnn:Smooth loss: 0.232923667422
  Characters Per Second: 32311
  Iterations Per Second: 31
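If you’d rather track progress with numbers than by eyeballing samples, the DEBUG lines are easy to scrape. Here’s a hypothetical helper (not part of char_rnn.py), assuming you saved the script’s logging output to a file such as train.log:

  # Hypothetical helper: pull the "Smooth loss" values out of a saved
  # char_rnn.py log (e.g. `python char_rnn.py ... 2> train.log`, since
  # Python logging writes to stderr by default).
  import re

  SMOOTH_LOSS = re.compile(r"Smooth loss: ([0-9.]+)")

  def smooth_losses(log_path):
      with open(log_path) as log:
          return [float(m.group(1))
                  for m in (SMOOTH_LOSS.search(line) for line in log) if m]

  losses = smooth_losses("train.log")
  if losses:
      print("%d reports, first %.2f, latest %.2f"
            % (len(losses), losses[0], losses[-1]))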

Notes

When acquiring data sources for this script, try to get plain ASCII text rather than UTF-8 or other encodings if you want it to run out of the box. Otherwise, you’ll need to preprocess the data or modify the script to handle different encodings.
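One way to do that preprocessing, as a minimal sketch (the file names here are placeholders): decompose accented characters and drop anything that has no ASCII equivalent.

  # A minimal sketch for coercing a UTF-8 source into plain ASCII before
  # training. "input.txt" and "ascii.txt" are placeholder names.
  import codecs
  import unicodedata

  with codecs.open("input.txt", "r", encoding="utf-8", errors="ignore") as f:
      text = f.read()

  # NFKD decomposition splits accented characters (e.g. "é" -> "e" plus a
  # combining accent); encoding with errors="ignore" then drops whatever
  # can't be represented in ASCII.
  ascii_text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore")

  with open("ascii.txt", "wb") as f:
      f.write(ascii_text)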