The results of the 2024 edition of SYNTCOMP are in. First, we list the winners of each realizability track run this year: parity games, LTL, and finite-word LTL (LTLf). Then, we give more details about running times and scores, along with some graphs (showing only the best configuration per tool, i.e. the one that solved the most benchmarks). Finally, we give links to the benchmarks and the participating solvers.
The data used comes from the StarExec jobs. All data, scripts, benchmarks, and (binaries of the) tools are archived as a Zenodo artifact here. Congratulations to the winners and thanks to all the participants!
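For readers who want to recompute the numbers below from the archived data, here is a minimal sketch of how the per-configuration totals and the best-configuration-per-tool selection could be derived from a StarExec job-information CSV. The file name, column names, and result labels are assumptions based on typical StarExec exports, not the exact layout of the artifact; the actual scripts are in the Zenodo archive.

```python
# Sketch: aggregate a StarExec job-information CSV into per-configuration
# totals and pick the best configuration per tool (most benchmarks solved).
import csv
from collections import defaultdict

solved = defaultdict(int)    # (solver, configuration) -> benchmarks solved
cpu = defaultdict(float)     # (solver, configuration) -> total CPU time (s)
wall = defaultdict(float)    # (solver, configuration) -> total wallclock time (s)

# "job_information.csv", the column names, and the result labels below are
# hypothetical; adjust them to match the CSV in the Zenodo artifact.
with open("job_information.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["solver"], row["configuration"])
        if row["result"] in ("REALIZABLE", "UNREALIZABLE"):
            solved[key] += 1
            cpu[key] += float(row["cpu time"])
            wall[key] += float(row["wallclock time"])

# Best configuration per tool: the one that solved the most benchmarks
# (the criterion used for the graphs mentioned above).
best = {}
for (tool, config), n in solved.items():
    if tool not in best or n > solved[(tool, best[tool])]:
        best[tool] = config

for tool, config in sorted(best.items()):
    key = (tool, config)
    print(f"{tool} ({config}): {solved[key]} solved, "
          f"{cpu[key]:.2f} s CPU, {wall[key]:.2f} s wallclock")
```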
The winners
Parity game realizability
- Sequential realizability: Knor
- Parallel realizability: Knor
LTL realizability
- Sequential realizability: SemML
- Parallel realizability: SemML
LTLf realizability
- Sequential realizability: Nike
- Parallel realizability: Nike
Parity Game Realizability

In the tables below, Solved (seq/par) gives the number of benchmarks solved within the time limit, counted against CPU time for the sequential ranking and against wallclock time for the parallel ranking.
Solver | Configuration | Solved (seq/par) | CPU time (s) | Wallclock time (s) |
---|---|---|---|---|
Knor | real_rtl | 458/458 | 421.28 | 421.96 |
ltlsynt | pgreal | 378/378 | 10441.48 | 10442.41 |
LTL Realizability
Solver | Configuration | Solved (seq/par) | CPU time (s) | Wallclock time (s) |
---|---|---|---|---|
SemML | semml_real | 972/967 | 32740.56 | 24985.99 |
Acacia bonsai | real | 768/788 | 31902.39 | 35682.57 |
Acacia bonsai (k-d trees) | real | 734/750 | 27733.07 | 30132.75 |
ltlsynt | segrealacd | 944/936 | 27596.39 | 27286.30 |
ltlsynt | segrealds | 899/891 | 27256.03 | 26623.95 |
ltlsynt | segreallar | 941/933 | 28680.57 | 28347.82 |
sdf | real | 840/838 | 30893.46 | 31067.37 |
sdf | real_vg_hors_concours | 839/838 | 29815.81 | 31876.83 |
LTLf Realizability
Solver | Configuration | Solved (seq/par) | CPU time (s) | Wallclock time (s) |
---|---|---|---|---|
LydiaSyft | default.sh | 220/220 | 6882.87 | 6887.78 |
LydiaSyft | print_strategy_in_file.sh | 220/220 | 6888.98 | 6893.82 |
Nike | bdd_ff | 230/230 | 3972.70 | 3943.48 |
Nike | bdd_random | 181/181 | 18472.69 | 18464.63 |
Nike | bdd_tt | 225/225 | 5319.70 | 5283.62 |
Nike | default | 234/234 | 5326.30 | 5326.69 |
Nike | hash_random | 137/137 | 12187.83 | 12167.53 |
Nike | hash_tt | 231/231 | 6464.69 | 6462.08 |
lisa | seqreal1.sh | 209/209 | 5649.48 | 5645.42 |
lisa | seqreal2.sh | 192/192 | 8297.10 | 8295.19 |
lisa | seqreal3.sh | 190/190 | 7642.33 | 7646.71 |
ltlfsynt | segltlfrealbase | 77/77 | 22988.60 | 22990.84 |
ltlfsynt | segltlfrealopt | 77/77 | 22951.00 | 22952.93 |
ltlfsynt | segltlfsyntbase | 78/78 | 24793.56 | 24795.99 |
ltlfsynt | segltlfsyntopt | 77/77 | 23028.29 | 23030.55 |
tople | default | 131/131 | 9375.09 | 9339.21 |
Benchmarks and Solvers
The set of benchmarks used in 2024 can be fetched from the usual repository.
Below, you will find information about the participating solvers: their source code, license, and the report or preferred publication to cite if you use the tool in your academic work.
Solver | Source repo | License | Report to cite |
---|---|---|---|
ltlsynt | GitLab repo | GPL-3.0+ | Paper |
Acacia bonsai | GitHub repo | GPL-3.0 | Paper |
Knor | GitHub repo | GPL-3.0 | Paper |
sdf | GitHub repo | MIT | arXiv report |
Nike | GitHub repo | AGPL v3.0 | arXiv report |
LydiaSyft | GitHub repo | AGPL v3.0 | report |
Lisa | GitHub repo | GPL v3.0 | arXiv report |
tople | | Custom: all rights reserved | Contact authors |
SemML | GitLab repo | GPL v3.0 | Contact author |