
New Algebraic Activation Function for Multi-Layered Feed Forward Neural Networks


REFERENCES

  • S. Goh and D. P. Mandic, “Recurrent neural networks with trainable amplitude of activation functions,” Neural Netw., Vol. 16, no. 8, pp. 1095–1100, May 2003.
  • M. Solazzi and A. Uncini, “Regularising neural networks using flexible multivariate activation function,” Neural Netw., Vol. 17, no. 2, pp. 247–260, Mar. 2004.
  • L. Vecci, F. Piazza, and A. Uncini, “Learning and approximation capabilities of adaptive spline activation function neural networks,” Neural Netw., Vol. 11, no. 2, pp. 259–270, Mar. 1998.
  • J. Han and C. Moraga, “The influence of the sigmoid function parameters on the speed of backpropagation learning,” in Proceedings of the International Workshop on Artificial Neural Networks: From Natural to Artificial Neural Computation, Spain, 1995, pp. 195–201.
  • S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Singapore: Pearson Education, 1998.
  • Y. H. Pao, Adaptive Pattern Recognition and Neural Networks. Reading, MA: Addison-Wesley, 1989, pp. 161–170.
  • D. von Seggern, CRC Standard Curves and Surfaces with Mathematica, 2nd ed. Boca Raton, FL: CRC Press, 2007.
  • T. M. Mitchell, Machine Learning. New York: McGraw-Hill, 1997.
  • D. W. Hosmer and S. Lemeshow, Applied Logistic Regression. New York: John Wiley & Sons, 1989.
  • J. Dombi, “Membership function as an evaluation,” Fuzzy Sets Syst., Vol. 35, pp. 1–22, 1990, doi: 10.1016/0165-0114(90)90014-W.
  • P. Chandra and Y. Singh, “An activation function adapting training algorithm for sigmoidal feedforward networks,” Neurocomputing, Vol. 61, pp. 429–437, Oct. 2004.
  • L. Zhou and L. Zhang, “A log-sigmoid Lagrangian neural network for solving nonlinear programming,” in Proceedings of the ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing, Qingdao, China, Jul. 30–Aug. 1, 2007, pp. 427–431.
  • G. A. Anastassiou, Intelligent Systems: Approximation by Artificial Neural Networks, Intelligent Systems Reference Library 19. Berlin: Springer-Verlag, 2011.
  • A. R. Barron, “Universal approximation bounds for superpositions of a sigmoidal function,” IEEE Trans. Inform. Theory, Vol. 39, no. 3, pp. 930–945, May 1993.
  • G. Lewicki and G. Marino, “Approximation of functions of finite variation by superpositions of a sigmoidal function,” Appl. Math. Lett., Vol. 17, pp. 1147–1152, Jan. 2004.
  • Z. Chen and F. Cao, “The approximation operators with sigmoidal functions,” Comput. Math. Appl., Vol. 58, no. 4, pp. 758–765, Aug. 2009.
  • Z. Chen and F. Cao, “The construction and approximation of a class of neural networks operators with ramp functions,” J. Comput. Anal. Appl., Vol. 14, no. 1, pp. 101–112, Jan. 2012.
  • G. H. L. Cheang, “Approximation with neural networks activated by ramp sigmoids,” J. Approx. Theory, Vol. 162, pp. 1450–1465, Aug. 2010.
  • D. Costarelli and R. Spigler, “Approximation by series of sigmoidal functions with applications to neural networks,” Annali di Matematica Pura ed Applicata, Vol. 194, no. 1, pp. 289–306, Feb. 2015.
  • D. Costarelli and G. Vinti, “Approximation by max-product neural network operators of Kantorovich type,” Results Math., Vol. 69, no. 3, pp. 505–519, Jun. 2016.
  • N. Hahm and B. Hong, “Approximation order to a function in C(R) by superposition of a sigmoidal function,” Appl. Math. Lett., Vol. 15, pp. 591–597, Jul. 2002.
  • N. Kyurkchiev and S. Markov, Sigmoid Functions: Some Approximation and Modelling Aspects (Some Moduli in Programming Environment Mathematica). Saarbrücken: LAP Lambert Academic Publishing, 2015.
  • D. Costarelli and R. Spigler, “Solving Volterra integral equations of the second kind by sigmoidal functions approximation,” J. Int. Equ. Appl., Vol. 25, no. 2, pp. 193–222, Nov. 2013.
  • D. Costarelli and R. Spigler, “A collocation method for solving nonlinear Volterra integro-differential equations of the neutral type by sigmoidal functions,” J. Int. Equ. Appl., Vol. 26, no. 1, pp. 15–52, Nov. 2014.
  • D. Costarelli, “Neural network operators: Constructive interpolation of multivariate functions,” Neural Netw., Vol. 67, pp. 28–36, Jul. 2015.
  • D. L. Elliott, “A better activation function for artificial neural networks,” Institute for Systems Research, University of Maryland, College Park, MD, ISR Technical Rep. TR 93-8, 1993. Available: http://ufnalski.edu.pl/zne/ci_2014/papers/Elliott_TR_93-8.pdf
  • B. Mendil and K. Benmahammed, “Simple activation functions for neural and fuzzy neural networks,” in Proceedings of the IEEE International Symposium on Circuits and Systems, Orlando, FL, May 30–Jun. 2, 1999, pp. 347–350.
  • J. Sopena, E. Romero, and R. Alquézar, “Neural networks with periodic and monotonic activation functions: a comparative study in classification problems,” in Proceedings of the International Conference on Artificial Neural Networks, 1999, pp. 323–328.
  • W. Duch and N. Jankowski, “Survey of neural transfer functions,” Neural Comput. Surveys, Vol. 2, pp. 163–212, 1999.
  • J. Kamruzzaman and S. Aziz, “A note on activation function in multilayer feedforward learning,” in Proceedings of the International Joint Conference on Neural Networks, Honolulu, HI, May 12–17, 2002, pp. 519–523.
  • G. S. D. S. Gomes and T. B. Ludermir, “Optimization of the weights and asymmetric activation function family of neural network for time series forecasting,” Expert Syst. Appl., Vol. 40, no. 16, pp. 6438–46, Nov. 2013.
  • G. Miguez, A. E. Xavier, and N. Maculan, “An evaluation of the bihyperbolic function in the optimization of the backpropagation algorithm,” Int. Trans. Operat. Res., Vol. 21, no. 5, pp. 835–54, Sep. 2014.
  • M. Abdulkarim, W. F. W. Ahmad, A. Shafie, and R. Razali, “Performance of multi-layer perceptron neural networks with an exponential decay activation function in airwaves estimation,” Int. J. Adv. Studies Comput. Sci. Eng., Vol. 2, no. 2, p. 25, 2013.
  • S. M. Siniscalchi and V. M. Salerno, “Adaptation to new microphones using artificial neural networks with trainable activation functions,” IEEE Trans. Neural Netw. Learn. Syst., Apr. 2016.
  • M. Riedmiller and H. Braun, “A direct adaptive method for faster backpropagation learning: The RPROP algorithm,” in Proceedings of the IEEE International Conference on Neural Networks, San Francisco, CA, Mar. 28–Apr. 1, 1993, pp. 586–591.
  • W. McCulloch and W. Pitts, “A logical calculus of the ideas immanent in nervous activity,” Bull. Math. Biophys., Vol. 5, pp. 115–133, 1943.
