System 5: Performance Correlation for Autotuning Efficiency
Reiji Suda
2013-03-28
14:05:00 - 14:30:00
Room 101, Mathematics Research Center Building (orig. New Math. Bldg.)
If a method is effective on one platform (or for one problem), it may also be effective on a similar platform (or for a similar problem). Such correlation is often observed when we investigate computing performance, and it should be useful for autotuning: when we observe performance under one condition, we can use that observation to estimate, and possibly to tune, performance under another condition. However, to use correlated performance information appropriately and effectively in autotuning, we need a good understanding of how the performance values are correlated. In this talk I will present our efforts in analyzing, modeling, and exploiting performance correlations for autotuning: autotuning under condition changes, effects of matrix size on dense matrix kernel performance, effects of algorithmic parameters on a sparse matrix solver, effects of temperature on power consumption, and performance correlation between sub-matrix and full-matrix computations in sparse matrix-vector products. Our methodology is to model performance and its uncertainty in a Bayesian framework, and to tune parameters using Bayesian sequential experimental design. In some cases, online autotuning appears more suitable than offline autotuning. We also found a case where a variogram is useful for modeling correlation, and kriging can predict performance with a certain accuracy.
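As a rough illustration of the variogram/kriging idea mentioned above (not the speaker's actual code), the sketch below does ordinary kriging in one dimension with an exponential variogram, then uses the kriging variance as a simple sequential-design criterion for choosing the next configuration to measure. All data values, the variogram model, and its parameters are made up for the example.

```python
import numpy as np

def variogram(h, sill=1.0, corr_len=2.0):
    """Exponential variogram model: gamma(h) = sill * (1 - exp(-|h|/corr_len)).
    The model and parameters here are illustrative assumptions."""
    return sill * (1.0 - np.exp(-np.abs(h) / corr_len))

def krige(x_obs, y_obs, x_new):
    """Ordinary kriging: predict y at x_new; return (prediction, kriging variance)."""
    n = len(x_obs)
    # Kriging system: pairwise variograms between observations, plus a
    # Lagrange-multiplier row/column enforcing that the weights sum to one.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(x_obs[:, None] - x_obs[None, :])
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(x_new - x_obs)
    w = np.linalg.solve(A, b)
    return w[:n] @ y_obs, w @ b  # prediction and its kriging variance

# Hypothetical measurements: performance of a kernel at four parameter values.
x_obs = np.array([1.0, 2.0, 4.0, 5.0])
y_obs = np.array([10.0, 12.0, 13.0, 12.5])

# Predict performance at an untried configuration.
pred, var = krige(x_obs, y_obs, 3.0)

# A simple sequential-design step: measure next where the kriging variance
# (predictive uncertainty) is largest among candidate configurations.
candidates = np.linspace(1.0, 5.0, 41)
next_x = max(candidates, key=lambda c: krige(x_obs, y_obs, c)[1])
```

Picking the point of maximum predictive variance is only one (purely exploratory) criterion; a Bayesian sequential experimental design, as in the talk, would instead choose the measurement that maximizes expected information gain or expected tuning benefit.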