Strategy Representation by Decision Trees with Linear Classifiers

Authors

ASHOK Pranav, BRÁZDIL Tomáš, CHATTERJEE Krishnendu, KŘETÍNSKÝ Jan, LAMPERT Christoph, TOMAN Viktor

Year of publication 2019
Type Article in Proceedings
Conference Quantitative Evaluation of Systems (QEST 2019)
MU Faculty or unit

Faculty of Informatics

DOI http://dx.doi.org/10.1007/978-3-030-30281-8_7
Keywords Strategy Representation; Decision Trees; Linear Classifiers
Description Graph games and Markov decision processes (MDPs) are standard models in reactive synthesis and in the verification of probabilistic systems with nondeterminism. The class of omega-regular winning conditions (e.g., safety, reachability, liveness, and parity conditions) provides a robust and expressive specification formalism for properties that arise in the analysis of reactive systems. The resolutions of nondeterminism in games and MDPs are represented as strategies, and we consider succinct representations of such strategies. The decision-tree data structure from machine learning retains the decision flavor of strategies and allows entropy-based minimization to obtain succinct trees. However, in contrast to traditional machine-learning problems, where small errors are tolerated, no error is allowed for winning strategies in graph games and MDPs: the decision tree must represent the entire strategy. In this work we propose decision trees with linear classifiers for the representation of strategies in graph games and MDPs. We have implemented strategy representation using this data structure, and we present experimental results for problems on graph games and MDPs which show that this new data structure yields a much more efficient strategy representation than standard decision trees.
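
For illustration, below is a minimal sketch (in Python, not the authors' implementation) of the kind of data structure the description refers to: a decision tree whose inner nodes test linear inequalities over integer-valued state-action feature vectors and whose leaves accept or reject a state-action pair, so that the tree can represent a strategy exactly, with no error. All names and the example feature vectors are hypothetical and chosen only for this sketch.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    # Inner node: the vector x goes left if sum_i w[i]*x[i] <= b, else right.
    # Leaf node: w is None and `accept` holds the Boolean decision.
    w: Optional[List[float]] = None
    b: float = 0.0
    accept: bool = False
    left: Optional["Node"] = None
    right: Optional["Node"] = None

    def decide(self, x: List[int]) -> bool:
        """Return True iff the tree accepts the feature vector x."""
        if self.w is None:  # leaf
            return self.accept
        if sum(wi * xi for wi, xi in zip(self.w, x)) <= self.b:
            return self.left.decide(x)
        return self.right.decide(x)


# Hypothetical example: a state-action pair encoded as two integers (s, a)
# is accepted when s + a <= 3, and otherwise falls through to a second
# linear test s - a <= 0.
tree = Node(
    w=[1.0, 1.0], b=3.0,
    left=Node(accept=True),
    right=Node(w=[1.0, -1.0], b=0.0,
               left=Node(accept=True),
               right=Node(accept=False)),
)

print(tree.decide([1, 2]))   # True  (1 + 2 <= 3)
print(tree.decide([2, 5]))   # True  (2 + 5 > 3, but 2 - 5 <= 0)
print(tree.decide([5, 2]))   # False (5 + 2 > 3 and 5 - 2 > 0)

Compared to a standard decision tree, which splits on a single feature at each node, a linear test over several features can often separate the allowed and disallowed state-action pairs with far fewer nodes, which is the source of the succinctness reported in the paper.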
