Prof. Cevdet Aykanat’s project on communication models for scalable parallel programming received TÜBİTAK 1001 support (September 2021).
High Performance Computing is pivotal to the success of many recent, high-impact machine learning, data science, and scientific computing applications. The workloads encountered in such parallel applications exhibit sparse and irregular communication patterns, so communication overhead manifests itself as a bottleneck when scaling these applications on petascale distributed-memory systems. The communication cost involves metrics such as total communication volume and total message count, as well as the maximum volume and number of messages handled by a single processor. Minimizing communication overhead therefore requires encapsulating multiple communication cost metrics simultaneously. This project will investigate communication models that capture these metrics.
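To make the metrics above concrete, the following is a minimal illustrative sketch (not from the project itself) that computes them from a hypothetical point-to-point communication-volume matrix `V`, where `V[i][j]` is the number of data units processor `i` sends to processor `j`:

```python
def communication_metrics(V):
    """Compute the communication cost metrics mentioned above from a
    volume matrix V (V[i][j] = data units sent from processor i to j)."""
    n = len(V)
    # Total communication volume: sum of all off-diagonal entries.
    total_volume = sum(V[i][j] for i in range(n) for j in range(n) if i != j)
    # Total message count: number of nonzero off-diagonal entries.
    total_messages = sum(1 for i in range(n) for j in range(n)
                         if i != j and V[i][j] > 0)
    # Per-processor send volume and send message count.
    send_volume = [sum(V[i][j] for j in range(n) if j != i) for i in range(n)]
    send_count = [sum(1 for j in range(n) if j != i and V[i][j] > 0)
                  for i in range(n)]
    return {
        "total_volume": total_volume,
        "total_messages": total_messages,
        "max_volume": max(send_volume),    # max volume handled by one processor
        "max_messages": max(send_count),   # max message count of one processor
    }

# Example: 3 processors with a sparse, irregular communication pattern.
V = [[0, 4, 0],
     [2, 0, 1],
     [0, 0, 0]]
print(communication_metrics(V))
# → {'total_volume': 7, 'total_messages': 3, 'max_volume': 4, 'max_messages': 2}
```

A model that minimizes only `total_volume` can still leave one processor with a disproportionate `max_volume` or `max_messages`, which is why the project targets multiple metrics simultaneously.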