README.md in ruby-dnn-0.14.3 vs README.md in ruby-dnn-0.15.0
- old
+ new
@@ -1,11 +1,12 @@
# ruby-dnn
[![Gem Version](https://badge.fury.io/rb/ruby-dnn.svg)](https://badge.fury.io/rb/ruby-dnn)
[![Build Status](https://travis-ci.org/unagiootoro/ruby-dnn.svg?branch=master)](https://travis-ci.org/unagiootoro/ruby-dnn)
-ruby-dnn is a ruby deep learning library. This library supports full connected neural network and convolution neural network.
-Currently, you can get 99% accuracy with MNIST and 74% with CIFAR 10.
+ruby-dnn is a Ruby deep learning library. It supports fully connected neural networks, convolutional neural networks,
+and recurrent neural networks.
+Currently, you can get 99% accuracy on MNIST and 78% on CIFAR-10.
## Installation
Add this line to your application's Gemfile:
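The Gemfile entry itself falls outside this diff excerpt. For a gem published under the name ruby-dnn (as the badge above indicates), it would presumably be the standard form:

```ruby
# Gemfile — presumed entry for the published ruby-dnn gem
gem 'ruby-dnn'
```

After that, `bundle install` (or a direct `gem install ruby-dnn`) is the usual way to install it.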
@@ -77,11 +78,12 @@
## Implemented
| Category | Implemented classes |
|:-----------|------------:|
| Connections | Dense, Conv2D, Conv2DTranspose, Embedding, SimpleRNN, LSTM, GRU |
-| Layers | Flatten, Reshape, Dropout, BatchNormalization, MaxPool2D, AvgPool2D, UnPool2D |
| Activations | Sigmoid, Tanh, Softsign, Softplus, Swish, ReLU, LeakyReLU, ELU |
+| Basic | Flatten, Reshape, Dropout, BatchNormalization |
+| Pooling | MaxPool2D, AvgPool2D, GlobalAvgPool2D, UnPool2D |
| Optimizers | SGD, Nesterov, AdaGrad, RMSProp, AdaDelta, RMSPropGraves, Adam, AdaBound |
| Losses | MeanSquaredError, MeanAbsoluteError, Hinge, HuberLoss, SoftmaxCrossEntropy, SigmoidCrossEntropy |
## TODO
* Write a test.
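To make the Implemented table above concrete, here is a minimal sketch of how these classes might compose. The README's usage section falls outside this diff excerpt, so the `Sequential`/`InputLayer`/`setup`/`train` API below is assumed from the library's published examples rather than confirmed by the text shown here:

```ruby
require "dnn"
include DNN::Models
include DNN::Layers
include DNN::Optimizers
include DNN::Losses

# A convolutional classifier composed from the Connections, Basic, and Pooling rows.
cnn = Sequential.new
cnn << InputLayer.new([32, 32, 3])   # CIFAR-10 shaped input
cnn << Conv2D.new(16, 5)
cnn << BatchNormalization.new
cnn << ReLU.new
cnn << MaxPool2D.new(2)
cnn << Flatten.new
cnn << Dense.new(10)
cnn.setup(Adam.new, SoftmaxCrossEntropy.new)

# A recurrent classifier using the LSTM connection listed in the table,
# treating each 28x28 MNIST image as 28 time steps of 28 features.
rnn = Sequential.new
rnn << InputLayer.new([28, 28])
rnn << LSTM.new(200)
rnn << LSTM.new(200, return_sequences: false)
rnn << Dense.new(10)
rnn.setup(Adam.new, SoftmaxCrossEntropy.new)

# Training is then a single call (inputs as Numo::NArray; epoch count and
# batch size here are illustrative):
# rnn.train(x_train, y_train, 10, batch_size: 128, test: [x_test, y_test])
```

The `setup` call wires an optimizer and a loss from the corresponding table rows onto the model, which is what lets any optimizer/loss pair from those rows be swapped in without changing the layer stack.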