Abstract
This paper presents a model to extract and predict the communication behaviour of parallel applications. The behaviour was extracted by introducing system calls into the Linux kernel to obtain communication information about application tasks. The extracted information is organised as time series of the number of bytes transmitted and received during each task's execution. The dimension of these time series is reduced using a self-organising neural network architecture that detects common resource-usage states and compacts communication events. This reduction simplifies the design of the prediction model, since the model need not consider an excessive number of distinct communication characteristics. The reduced information is submitted to a time-delay neural network, which allows the volume of future data transfers to be predicted. The resulting predictions may be used in scheduling algorithms, allowing the best resources to be allocated according to communication events. If there is no communication, processes can be distributed considering CPU capacity alone; otherwise, it is necessary to evaluate when and how many bytes are transferred before allocating tasks to neighbouring networks.
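The two-stage pipeline described above can be illustrated with a minimal sketch. This is not the paper's implementation: the quantisation step below is a crude stand-in for the ART-2A state detection, and the window-matching predictor is a crude stand-in for the time-delay neural network; the trace values are synthetic.

```python
def quantize(series, n_states=3):
    """Map raw byte counts to discrete usage-state indices 0..n_states-1
    (a simple stand-in for the ART-2A clustering of resource-usage states)."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_states or 1  # avoid division by zero on a flat trace
    return [min(int((x - lo) / width), n_states - 1) for x in series]

def predict_next(states, window=3):
    """Predict the next state from a fixed delay window: find earlier
    occurrences of the most recent window and return the state that most
    often followed it (a stand-in for the time-delay network's mapping
    from a lagged input window to a predicted transfer volume)."""
    recent = tuple(states[-window:])
    follow = [states[i + window]
              for i in range(len(states) - window)
              if tuple(states[i:i + window]) == recent]
    if not follow:
        return states[-1]  # fall back to persistence when the window is unseen
    return max(set(follow), key=follow.count)

# Hypothetical transfer trace (bytes per interval) with a periodic burst.
trace = [10, 900, 20, 15, 880, 25, 12, 910, 18, 14, 895]
states = quantize(trace)
print(predict_next(states))  # after a burst, a low-traffic state follows
```

A scheduler could consume such predictions exactly as the abstract suggests: a predicted low-traffic state permits placement by CPU capacity alone, while a predicted burst argues for allocating the communicating tasks within neighbouring networks.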
Notes
3. ljsenger@uepg.br
4. The compact applications are more complex than the kernel ones.
5. Note that each application execution generated a distinct information set to be classified by the ART-2A neural network. The results obtained from the neural network were then used to predict each application's future behaviour.