Comparison of Activation Functions for NSE Stock Price Prediction.
ANN time-series prediction has been successfully implemented and tested on NSE stock prediction by Barack (2014) and on other stock exchanges (Safi & White, 2017); however, these studies focused on factors such as the training algorithm, network sizing and learning parameters, bypassing the selection of activation functions. With the different types of activation functions currently available for use in ANNs, further studies were needed to test whether particular activation functions or their combinations can improve the performance of an ANN in stock price prediction. Research has shown the activation function to be one of the essential parameters of an artificial neural network (ANN), whose performance can be improved by the use of various activation functions and their combinations (Ozkan & Erbek, 2003; Sibi, Jones & Siddarth, 2013). The study involved implementing networks containing varied activation functions, which were trained and tested on NSE data to measure their performance. RMSE and MSE were used as the basis for evaluating the accuracy of predictions. The study concluded that a network of S-SF-SF-S ("sigmoid-softmax") performed best with minimal training, i.e. 100 epochs, and that its performance degraded if trained further. The configurations T-T-T-T ("hyperbolic tangent"), L-L-L-L ("linear"), S-T-T-S ("sigmoid-hyperbolic tangent") and S-S-S-S ("sigmoid") followed respectively for the majority of the stocks tested, though they required more training, reaching up to 2000 epochs. Generally, it was observed that changing the activation function has either a positive or a negative effect on the performance of the ANN. Further comprehensive research is recommended on building better ANN models for areas where ANNs are applied other than prediction, e.g. image recognition and classification, since this research has shown that changing the activation function can have a positive or negative effect on ANN performance.
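The comparison described above can be sketched in code. The following is a minimal, hypothetical illustration (not the study's actual implementation): a tiny feed-forward network whose per-layer activations are chosen by a code string such as "S-SF-SF-S", evaluated with MSE and RMSE on synthetic data standing in for NSE price features. The layer sizes, random data and untrained weights are assumptions for demonstration only.

```python
import numpy as np

# Candidate activation functions named in the study:
# S = sigmoid, T = hyperbolic tangent, L = linear, SF = softmax.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

ACTIVATIONS = {"S": sigmoid, "T": np.tanh, "L": lambda x: x, "SF": softmax}

def forward(x, weights, biases, code):
    """Forward pass through layers whose activations are given by a
    code string like 'S-SF-SF-S' (one code per weight layer)."""
    h = x
    for W, b, key in zip(weights, biases, code.split("-")):
        h = ACTIVATIONS[key](h @ W + b)
    return h

# Error measures used in the study to compare configurations.
def mse(y_true, y_pred):
    return float(np.mean((y_true - y_pred) ** 2))

def rmse(y_true, y_pred):
    return float(np.sqrt(mse(y_true, y_pred)))

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 5))   # hypothetical windowed price features
y = rng.normal(size=(32, 1))   # hypothetical next-day price target
sizes = [5, 8, 8, 8, 1]        # 4 weight layers -> 4 activation slots
weights = [rng.normal(scale=0.1, size=(a, b)) for a, b in zip(sizes, sizes[1:])]
biases = [np.zeros(b) for b in sizes[1:]]

# Swap only the activation code, holding everything else fixed,
# to isolate the effect of the activation functions.
for code in ["S-SF-SF-S", "T-T-T-T", "L-L-L-L", "S-T-T-S", "S-S-S-S"]:
    pred = forward(x, weights, biases, code)
    print(code, "RMSE:", round(rmse(y, pred), 4))
```

In a full experiment each configuration would be trained (e.g. for 100 to 2000 epochs, as in the study) before the RMSE comparison; here the untrained pass only demonstrates how changing the activation code alone changes the network's output and hence its error.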