
QF_LRA (Main Track)

Competition results for the QF_LRA division as of Thu 7 Jul 2016, 07:24:34 GMT

Benchmarks in this division: 1626

Winners:

Sequential performance: CVC4
Parallel performance: CVC4

Result table [1]

Solver          Sequential performance              Parallel performance                              Other
                Error   Solved     avg. CPU         Error   Solved     avg. CPU     avg. WALL         Unsolved
                score   score      time (s)         score   score      time (s)     time (s)          benchmarks
CVC4            0.000   1601.997    61.989          0.000   1601.997    62.004       62.088            20
MathSat5 (n)    0.000   1574.475   109.915          0.000   1574.475   109.957      109.870            64
OpenSMT2        0.000   1510.710   214.228          0.000   1510.710   214.323      214.198           263
SMT-RAT         0.000   1415.279   344.351          0.000   1415.279   344.538      344.329           433
SMTInterpol     0.000   1571.240   118.790          0.000   1572.061   127.002      112.669            54
Yices2          0.000   1593.356    65.700          0.000   1593.356    65.730       65.654            34
toysmt          0.000   1172.885   582.313          0.000   1172.885   582.500      582.280           811
veriT-dev       0.000   1577.045    95.717          0.000   1577.045    95.759       95.701            55
z3 (n)          0.000   1547.832   157.261          0.000   1547.832   157.327      157.232           145

(n) Non-competitive entrant.

[1] Scores are computed according to Section 7 of the rules.
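For readers unfamiliar with the column meanings, the sketch below is an illustrative, simplified scoring routine only: the official computation is the one in Section 7 of the SMT-COMP rules (which involves benchmark weighting and status handling not reproduced here). The `score` function and the sample run data are hypothetical, not taken from the competition.

```python
# Illustrative sketch only, NOT the official SMT-COMP scoring.
# It mirrors the three per-track columns in the table above:
# error score (wrong answers), correctly-solved score (right answers),
# and average CPU time, with run time capped at the timeout.

def score(results, timeout=1200.0):
    """results: list of (expected, reported, cpu_seconds) per benchmark.
    reported is 'sat', 'unsat', or 'unknown' (timeout or give-up)."""
    errors = sum(1 for exp, rep, _ in results
                 if rep in ('sat', 'unsat') and rep != exp)
    correct = sum(1 for exp, rep, _ in results if rep == exp)
    avg_cpu = sum(min(t, timeout) for _, _, t in results) / len(results)
    return errors, correct, avg_cpu

# Hypothetical three-benchmark run: two solved, one timeout.
runs = [('sat', 'sat', 0.4), ('unsat', 'unsat', 3.1), ('sat', 'unknown', 1500.0)]
errors, correct, avg_cpu = score(runs)
print(errors, correct, round(avg_cpu, 3))
```

Under this simplification, an "unknown" answer is never an error; it only depresses the solved score and (via the timeout cap) raises the average CPU time, which is why a sound but slow solver such as toysmt shows a zero error score alongside a large unsolved count.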


Last modified: Thu 07 Jul 2016 07:28 UTC