Adding support for dropout in the NeuralNetwork class

I added support for dropout to the NeuralNetwork class so that you can reduce overfitting during training. Just specify the dropout ratio when you add a layer, using the ‘dropout’ argument:

    model = Model(num_input=28*28)                           # 784 input units (28x28 image)
    model.add(Layer(1024, activation=af.RELU, dropout=0.5))  # keep 50% of units during training
    model.add(Layer(512, activation=af.RELU, dropout=0.5))
    model.add(Layer(10, activation=af.SIGMOID))              # output layer: no dropout

See the MNIST Fashion example for the complete script. The model above reached 88.84% accuracy after 20 epochs.

The dropout ratio is the ratio of units that you want to keep. So if you specify 0.8, 20% of the units are randomly dropped. The actual implementation is a little more complicated; if you want to see how it works, have a look at the code.
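
The extra complication is usually “inverted dropout”: during training each unit is zeroed out with probability 1 − keep ratio, and the surviving activations are scaled up by 1/keep ratio so the expected output stays the same and no rescaling is needed at inference time. Here is a minimal sketch of that idea (my illustration, not necessarily the exact code in this class; apply_dropout and keep_prob are hypothetical names):

    import numpy as np

    def apply_dropout(activations, keep_prob, training=True):
        # Inverted dropout: randomly zero units, then rescale the survivors.
        # keep_prob is the ratio of units to keep (the 'dropout' argument above).
        if not training or keep_prob >= 1.0:
            return activations  # nothing to do at inference time
        # Bernoulli mask: 1 with probability keep_prob, 0 otherwise
        mask = np.random.rand(*activations.shape) < keep_prob
        # Dividing by keep_prob keeps the expected activation unchanged,
        # so the forward pass needs no extra scaling at test time.
        return activations * mask / keep_prob

Because of that scaling, inference is simply a pass-through: the mask and the division are skipped entirely once training is done.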

I used this paper as a reference, but any mistakes in the implementation are mine.