
QF_LRA (Application Track)

Competition results for the QF_LRA division as of Thu 7 Jul 2016 07:24:34 GMT

Benchmarks in this division: 10

Winner : SMTInterpol

Result table ¹

                         Parallel performance
Solver         Error Score   Correctly Solved Score   avg. CPU time (s)   avg. WALL time (s)
CVC4 ⁿ         0             666                      19103.84            19092.12
Yices2         0             741                       9649.02             9643.03
MathSat5 ⁿ     0             795                       1733.19             1731.39
SMTInterpol    0             755                      13463.70            12147.38
z3 ⁿ           0             744                      16908.14            16898.90

n. Non-competitive.

1. Scores are computed according to Section 7 of the rules.


Last modified: Thu 07 Jul 2016 07:28 UTC