ABSTRACT
Artificial Neural Networks (ANNs) have become an important tool for modelling, designing and controlling chemical and biochemical processes, because their learning and generalisation properties give them the full power of a self-organising system. A neural network is formed of synthetic neurones grouped in layers (input, output and hidden). Each neurone's task is to process the signals received through its dendrites according to its threshold function (the main step in processing the information); the answer is then output through its axon to the rest of the neurones (1). This accounts for the ability of the net to use the information stored in the individual neurones' weights. In the learning phase, the derivative of the threshold function plays, in almost every case, a key role in matching the answer of the net to the correct outputs of the training set, provided that a steepest-descent learning rule is used (1). It is therefore clear that choosing a suitable threshold function is an essential step in building an appropriate neural network. By far the most widely used threshold function is the well-known sigmoid. The authors examined the impact on the performance of a given neural network (input, hidden and output layers kept the same) of replacing this function with another, more versatile sigmoid.
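The abstract does not reproduce the authors' formula, so the sketch below uses one common two-parameter generalisation of the standard sigmoid, f(x) = α / (1 + e^(−βx)), purely as an illustration of how two shape parameters (here named `alpha` and `beta`, both assumptions) can tune a threshold function and its derivative, the quantity that drives gradient-descent learning:

```python
import numpy as np

def sigmoid_ab(x, alpha=1.0, beta=1.0):
    """Hypothetical two-parameter sigmoid (assumed form, not the paper's):
    alpha scales the output range, beta controls the slope at the origin.
    With alpha = beta = 1 it reduces to the standard sigmoid 1/(1+e^-x)."""
    return alpha / (1.0 + np.exp(-beta * x))

def sigmoid_ab_prime(x, alpha=1.0, beta=1.0):
    """Derivative of sigmoid_ab, as needed by any gradient-descent
    (e.g. back-propagation) learning rule."""
    s = sigmoid_ab(x, 1.0, beta)        # unit-range sigmoid value
    return alpha * beta * s * (1.0 - s)
```

Changing `alpha` and `beta` reshapes both the function and its derivative, which is why, per neurone, such parameters can be matched to each neurone's needs.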
The shape of this function changes markedly as α and/or β vary. This new threshold function seems more promising because it can be matched to each neurone's needs by adjusting α and β accordingly. The training data was a set of experimentally obtained points on the drying of powdered sebacic acid with hot air, the neural network learning rule being the back-propagation algorithm (1). The learning rate for the new threshold function is drastically affected by the α and β values.
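To see why α and β affect the effective learning rate, a minimal single-neurone delta-rule sketch is given below. It assumes an illustrative two-parameter sigmoid f(z) = α / (1 + e^(−βz)) (the paper's own formula is not reproduced in the abstract); the names `delta_rule_step`, `lr`, `alpha` and `beta` are all hypothetical. Because the threshold function's derivative multiplies the error in the weight update, larger β steepens the derivative and speeds up learning, while α rescales it:

```python
import numpy as np

def sigmoid(z, alpha=1.0, beta=1.0):
    # Assumed two-parameter sigmoid, for illustration only.
    return alpha / (1.0 + np.exp(-beta * z))

def sigmoid_prime(z, alpha=1.0, beta=1.0):
    s = sigmoid(z, 1.0, beta)            # unit-range sigmoid value
    return alpha * beta * s * (1.0 - s)

def delta_rule_step(w, x, target, lr=0.1, alpha=1.0, beta=1.0):
    """One gradient-descent step for a single neurone. The derivative
    of the threshold function scales the error term, so alpha and beta
    directly modulate the effective learning rate."""
    z = np.dot(w, x)
    y = sigmoid(z, alpha, beta)
    grad = (y - target) * sigmoid_prime(z, alpha, beta) * x
    return w - lr * grad
```

Iterating `delta_rule_step` drives the neurone's output toward the target; with different (α, β) pairs the same `lr` produces visibly different convergence speeds, which is the behaviour the abstract reports.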