
QF_LRA (Application Track)

Competition results for the QF_LRA division as of Tue Jul 18 22:06:21 GMT

Benchmarks in this division: 10

Winner : SMTInterpol

Result table [1]

                      Parallel performance
Solver           Error Score  Correctly Solved Score  avg. CPU time  avg. WALL time
CVC4                  0                 693                17965.80       17966.71
SMTInterpol           0                 769                 9921.74        9046.35
Yices2                0                 749                 9625.35        9625.58
mathsat-5.4.1 n       0                 795                 2948.68        2947.94
opensmt2              0                   0                    0.03           0.31
z3-4.5.0 n            0                 742                17462.12       17463.78

n. Non-competing.

1. Scores are computed according to Section 7 of the rules.
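The authoritative scoring procedure is Section 7 of the rules; as a rough sketch of how a ranking like the one above can be derived from the table, the snippet below orders competing solvers by fewer errors, then more correctly solved queries, then lower average wall time. The tie-breaking order is an assumption for illustration, not the official definition.

```python
# Hypothetical ranking sketch (assumed criterion: fewer errors first,
# then more correctly solved, then lower avg. wall time); the real
# computation is defined in Section 7 of the rules.
rows = [
    # (solver, competing, error score, solved score, avg CPU, avg wall)
    ("CVC4",          True,  0, 693, 17965.80, 17966.71),
    ("SMTInterpol",   True,  0, 769,  9921.74,  9046.35),
    ("Yices2",        True,  0, 749,  9625.35,  9625.58),
    ("mathsat-5.4.1", False, 0, 795,  2948.68,  2947.94),  # non-competing
    ("opensmt2",      True,  0,   0,     0.03,     0.31),
    ("z3-4.5.0",      False, 0, 742, 17462.12, 17463.78),  # non-competing
]

def rank_key(row):
    _name, _competing, err, solved, _cpu, wall = row
    return (err, -solved, wall)

competing = [r for r in rows if r[1]]
winner = min(competing, key=rank_key)[0]
print(winner)  # SMTInterpol, matching the announced winner above
```

Note that mathsat-5.4.1 solved the most queries overall but, as a non-competing entrant, is excluded from the winner determination.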


Last modified: Fri 21 Jul 2017 10:18 UTC