Tiling parameters prediction using Machine Learning techniques
Abstract
The tiling transformation is one of the most crucial code optimizations
for exposing data locality and parallelism. The main idea is to split the
initial iteration space into blocks and to traverse them in a specific
order. This transformation is parametric and very sensitive to parameter
tuning: poorly tuned parameters can yield much lower performance than the
original code. Existing state-of-the-art solutions handle this issue by
considering only a restricted set of parameters, which guarantees safe
solutions.
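As a purely illustrative example (not taken from this work), the C sketch below shows a 2D loop nest and its tiled counterpart: the inter-tile loops ii and jj enumerate blocks of the iteration space, while the intra-tile loops i and j traverse the points of each block. The tile sizes TI and TJ are illustrative values of the kind of parameter whose poor tuning can make the tiled code slower than the original.

```c
#include <stddef.h>

#define N  1024
#define TI 32   /* illustrative tile sizes: these are the parameters to tune */
#define TJ 32

/* Original loop nest: an out-of-place transpose. */
void transpose(double B[N][N], const double A[N][N]) {
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            B[j][i] = A[i][j];
}

/* Tiled version: the iteration space is split into TI x TJ blocks
 * (inter-tile loops ii, jj), and each block is traversed by the
 * intra-tile loops i, j, improving data locality on B and A. */
void transpose_tiled(double B[N][N], const double A[N][N]) {
    for (size_t ii = 0; ii < N; ii += TI)
        for (size_t jj = 0; jj < N; jj += TJ)
            for (size_t i = ii; i < ii + TI && i < N; i++)
                for (size_t j = jj; j < jj + TJ && j < N; j++)
                    B[j][i] = A[i][j];
}
```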
Our work proposes solutions that go beyond current state-of-the-art
techniques and gain additional speedup by considering a larger set of
tiling options. Our approach is based on Machine Learning methods and
automatically derives heuristics to tune the tiling parameters. We can
predict: 1) the optimal partitioning matrix of the iteration space, 2) the
tile sizes, 3) the optimal inter-tile scanning directions, and 4) the
optimal intra-tile scanning directions. Selecting these parameters well is
especially crucial for programs that have data dependencies.
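To make these four parameters concrete, here is a minimal, hypothetical C sketch; the specific choices are illustrative assumptions, not predictions made by this work. It uses a rectangular partitioning (a diagonal partitioning matrix), tile sizes TI and TJ, an inter-tile scan that goes forward in ii and backward in jj, and forward intra-tile scans. Because the statement carries no dependencies, every scan order is legal here; in the presence of data dependencies, only some combinations remain valid, which is precisely what the prediction has to respect.

```c
#define N  1024
#define TI 64   /* tile sizes (parameter 2), illustrative values */
#define TJ 16

/* Rectangular tiling corresponds to a diagonal partitioning matrix
 * diag(1/TI, 1/TJ) (parameter 1). The loop bounds below encode the
 * inter-tile scanning directions (parameter 3) and the intra-tile
 * scanning directions (parameter 4). */
void tiled_sketch(double B[N][N], const double A[N][N]) {
    for (long ii = 0; ii < N; ii += TI)              /* inter-tile: forward  */
        for (long jj = N - TJ; jj >= 0; jj -= TJ)    /* inter-tile: backward */
            for (long i = ii; i < ii + TI; i++)      /* intra-tile: forward  */
                for (long j = jj; j < jj + TJ; j++)  /* intra-tile: forward  */
                    B[i][j] = 2.0 * A[i][j];         /* dependence-free statement */
}
```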
We introduce the sets of features that feed our models' predictions.
The first set encodes data dependencies, the second captures the level of
parallelism and data locality in the code, and the third aggregates
information about the iteration space.
Our approach surpasses existing feature spaces for tiling parameter prediction.
Moreover, it could be used in conjunction with auto-tuners to
guide the iterative search.