
QF_LRA (Main Track)

Competition results for the QF_LRA division as of Fri Jul 13 00:02:11 GMT 2018

Benchmarks in this division: 1649
Time limit: 1200s

Winners:

Sequential Performance   Parallel Performance
CVC4                     CVC4

Result table¹

Sequential Performance

Solver          Error Score   Correctly Solved Score   CPU Time Score   Solved   Unsolved
CVC4                  0.000                 1586.833           69.006     1566         83
Ctrl-Ergo             0.000                 1450.082          172.097     1354        295
MathSATⁿ              0.000                 1536.458          107.673     1461        188
SMTInterpol           0.000                 1548.476          102.257     1521        128
SMTRAT-MCSAT          0.000                 1090.526          409.015      711        938
SMTRAT-Rat            0.000                 1297.891          275.918      984        665
SPASS-SATT            0.000                 1586.396           64.292     1590         59
Yices 2.6.0           0.000                 1583.186           63.901     1567         82
opensmt2              0.000                 1498.663          131.674     1329        320
veriT                 0.000                 1568.212           79.840     1527        122
z3-4.7.1ⁿ             0.000                 1527.249          113.154     1435        214

Parallel Performance

Solver          Error Score   Correctly Solved Score   CPU Time Score   Wall Time Score   Solved   Unsolved
CVC4                  0.000                 1586.833           69.007            69.191     1566         83
Ctrl-Ergo             0.000                 1487.857          550.944           138.566     1438        211
MathSATⁿ              0.000                 1536.458          107.673           107.679     1461        188
SMTInterpol           0.000                 1553.375          111.861            96.992     1531        118
SMTRAT-MCSAT          0.000                 1090.526          409.020           409.048      711        938
SMTRAT-Rat            0.000                 1297.891          275.921           275.936      984        665
SPASS-SATT            0.000                 1586.396           64.293            64.281     1590         59
Yices 2.6.0           0.000                 1583.186           63.901            63.896     1567         82
opensmt2              0.000                 1498.663          131.674           131.683     1329        320
veriT                 0.000                 1568.212           79.840            79.846     1527        122
z3-4.7.1ⁿ             0.000                 1527.249          113.155           113.146     1435        214

ⁿ. Non-competing.

¹. Scores are computed according to Section 7 of the rules.
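
For readers who want to see how a summary row like those above could be assembled from raw per-benchmark data, the Python sketch below folds individual results into error, solved, and time figures. It is a minimal sketch only: the BenchmarkResult fields and the unweighted scoring are illustrative assumptions, not the competition's actual implementation; the real scores use the benchmark-family weighting defined in Section 7 of the rules.

    # Minimal sketch: aggregate per-benchmark results into one result-table row.
    # The field names and the unweighted scoring here are assumptions for
    # illustration; SMT-COMP's actual scores follow Section 7 of the rules.
    from dataclasses import dataclass

    TIME_LIMIT = 1200.0  # seconds, per the division settings above

    @dataclass
    class BenchmarkResult:
        solver: str
        answer: str      # "sat", "unsat", or "unknown"
        expected: str    # expected status, or "unknown"
        cpu_time: float  # seconds

    def summarize(results: list[BenchmarkResult]) -> dict:
        """Fold per-benchmark results into one summary row."""
        errors = solved = unsolved = 0
        cpu_score = 0.0
        for r in results:
            wrong = (r.answer in ("sat", "unsat")
                     and r.expected not in ("unknown", r.answer))
            if wrong:
                errors += 1       # wrong answer; the real rules penalize this
            elif r.answer in ("sat", "unsat") and r.cpu_time <= TIME_LIMIT:
                solved += 1
                cpu_score += r.cpu_time
            else:
                unsolved += 1     # timeout or "unknown"
        return {
            "error_score": errors,
            "correctly_solved_score": solved,  # unweighted; Section 7 weights by family
            "cpu_time_score": cpu_score,
            "solved": solved,
            "unsolved": unsolved,
        }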


Last modified: Fri 13 Jul 2018 00:04 UTC