README.org in ruby_brain-0.1.2 vs README.org in ruby_brain-0.1.3
- old
+ new
@@ -6,28 +6,21 @@
The code of RubyBrain is written in a neuron-oriented style.
This means that there is a class which represents a neuron, and each neuron in a network is an instance of that class.
So, you can treat neurons flexibly in a network.
As a trade-off, the speed is very slow, so it may not be reasonable to use this library in the core of an application.
However, this library may help you gain a deeper knowledge of neural networks and deep learning.
+
+ *caution*
+ Currently RubyBrain supports only the sigmoid activation function.
+ So, the output layer can also use only sigmoid.
+ As a result, the network can only handle outputs in the range 0~1.
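+
+ If your target values fall outside of 0~1, a simple workaround is to scale them
+ into that range before training and scale the network outputs back afterwards.
+ The following is a minimal sketch in plain Ruby (it does not use any RubyBrain API):
+ #+BEGIN_SRC ruby
+ # hypothetical raw target values outside the 0~1 range
+ targets = [12.0, 35.5, 7.2, 50.0]
+ min, max = targets.min, targets.max
+
+ # scale each target into [0, 1] before training
+ scaled_targets = targets.map { |t| (t - min) / (max - min) }
+
+ # scale a network output in [0, 1] back to the original range after prediction
+ restore = ->(y) { y * (max - min) + min }
+ restore.call(scaled_targets[1])  # => 35.5
+ #+END_SRC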
* Installation
- Add this line to your application's Gemfile:
+ You can install it simply with:
- #+BEGIN_SRC ruby
- gem 'ruby_brain'
- #+END_SRC
-
- And then execute:
-
#+BEGIN_SRC shell
- $ bundle
- #+END_SRC
-
- Or install it yourself as:
-
- #+BEGIN_SRC shell
$ gem install ruby_brain
#+END_SRC
* Usage
@@ -136,11 +129,23 @@
a_network.load_weights_from_yaml_file('/path/to/saved/weights/file.yml')
#+END_SRC
* Examples
+** Imitating a wave form
+ Currently this example is available only as an iRuby notebook.
+ You can see it at the following link.
+
+ [[http://nbviewer.jupyter.org/github/elgoog/ruby_brain/blob/master/examples/wave_form.ipynb][example/wave_form.ipynb - iRuby]]
+
** MNIST
+
+ You can see the following contents in an iRuby notebook.
+
+ [[http://nbviewer.jupyter.org/github/elgoog/ruby_brain/blob/master/examples/mnist.ipynb][example/mnist.ipynb - iRuby]]
+
+
The following code is included in [[https://github.com/elgoog/ruby_brain/blob/master/examples/mnist.rb][examples/mnist.rb]].
This module depends on the [[https://rubygems.org/gems/mnist][mnist]] gem to load the MNIST data into Ruby arrays.
#+BEGIN_SRC ruby
@@ -151,41 +156,57 @@
Get the MNIST dataset from [[http://yann.lecun.com/exdb/mnist/][THE MNIST DATABASE of handwritten digits]] if the dataset files don't exist in the working directory,
and load it into =dataset=, which holds the training set (=dataset.first=) and the test set (=dataset.last=).
#+BEGIN_SRC ruby
dataset = RubyBrain::DataSet::Mnist::data
+ training_dataset = dataset.first
+ test_dataset = dataset.last
# both training_dataset and test_dataset have :input and :output keys
- dataset.keys # => [:input, :output]
+ training_dataset.keys # => [:input, :output]
+ test_dataset.keys # => [:input, :output]
- # :input dataset has 60000(samples) x 784(28 * 28 input pixcels)
- dataset[:input].size # => 60000
- dataset[:input].first.size # => 784
+ # :input of training_dataset has 60000(samples) x 784(28 * 28 input pixels)
+ training_dataset[:input].size # => 60000
+ training_dataset[:input].first.size # => 784
- # :output dataset has 60000(samples) x 10(classes 0~9)
- dataset[:output].size # => 60000
- dataset[:output].first.size # => 10
+ # :output of training_dataset has 60000(samples) x 10(classes 0~9)
+ training_dataset[:output].size # => 60000
+ training_dataset[:output].first.size # => 10
+
+ # :input of test_dataset has 10000(samples) x 784(28 * 28 input pixels)
+ test_dataset[:input].size # => 10000
+ test_dataset[:input].first.size # => 784
+
+ # :output of test_dataset has 10000(samples) x 10(classes 0~9)
+ test_dataset[:output].size # => 10000
+ test_dataset[:output].first.size # => 10
#+END_SRC
- Divide =dataset= into training and test data.
+ In this example, we use only the first 5000 samples of training_dataset,
+ because RubyBrain is slow and it takes a very long time to learn the full training_dataset.
NUM_TRAIN_DATA specifies how many of the first images are used as training data.
- We use first 5000 images for training here.
+ Here it is set to 5000.
#+BEGIN_SRC ruby
+ # use only first 5000 samples for training
NUM_TRAIN_DATA = 5000
- training_input = dataset[:input][0..(NUM_TRAIN_DATA-1)]
- training_supervisor = dataset[:output][0..(NUM_TRAIN_DATA-1)]
+ training_input = training_dataset[:input][0..(NUM_TRAIN_DATA-1)]
+ training_supervisor = training_dataset[:output][0..(NUM_TRAIN_DATA-1)]
+ # use full test dataset
+ test_input = test_dataset[:input]
+ test_supervisor = test_dataset[:output]
#+END_SRC
Then construct and initialize the network.
In this case, an image has 784 (28x28) pixels and there are 10 classes (0..9).
So, the network structure should be [784, 50, 10], with 1 hidden layer which has 50 units.
You can construct this structure with the following code.
#+BEGIN_SRC ruby
# network structure [784, 50, 10]
- network = RubyBrain::Network.new([dataset[:input].first.size, 50, dataset[:output].first.size])
+ network = RubyBrain::Network.new([training_input.first.size, 50, training_supervisor.first.size])
# learning rate is 0.7
network.learning_rate = 0.7
# initialize network
network.init_network
#+END_SRC
@@ -213,31 +234,32 @@
end
end
#+END_SRC
Then, you can review each class (label) predicted by the model with the following code.
+ This code shows each picture as ASCII art and lists the answer (test_supervisor) and the predicted label.
#+BEGIN_SRC ruby
results = []
test_input.each_with_index do |input, i|
input.each_with_index do |e, j|
print(e > 0.3 ? 'x' : ' ')
puts if (j % 28) == 27  # newline after every 28 pixels (one image row)
end
puts
supervisor_label = test_supervisor[i].argmax
- predicated_label = network.get_forward_outputs(test_input[i]).argmax
+ predicated_label = network.get_forward_outputs(input).argmax
puts "test_supervisor: #{supervisor_label}"
puts "predicate: #{predicated_label}"
results << (supervisor_label == predicated_label)
puts "------------------------------------------------------------"
end
puts "accuracy: #{results.count(true).to_f/results.size}"
#+END_SRC
- I tried to train wioth above conditions.
- The accuracy of trained model was 92.3%.
+ I tried training with the above conditions.
+ After some trial runs, the accuracy of the trained model was 93.12%.
The weights file is [[https://github.com/elgoog/weights_ruby_brain/blob/master/weights_782_50_10_1.yml][here]].
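
If you just want to try the trained model without training it yourself, you can load those weights into a network with the same [784, 50, 10] structure.
A minimal sketch is below; it assumes the weights file has been downloaded into the working directory as =weights_782_50_10_1.yml= and reuses =test_input= from the code above.
#+BEGIN_SRC ruby
require 'ruby_brain'

# same structure as in the training example above
network = RubyBrain::Network.new([784, 50, 10])
network.init_network

# load the pretrained weights
# (assumes the file was downloaded into the working directory)
network.load_weights_from_yaml_file('weights_782_50_10_1.yml')

# classify one 784-pixel image, e.g. the first element of test_input
outputs = network.get_forward_outputs(test_input.first)
predicted_label = outputs.index(outputs.max)  # index of the largest output, i.e. the digit
#+END_SRC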
* Contributing
1. Fork it ( https://github.com/elgoog/ruby_brain/fork )