Is Intel faking its CPU benchmark results? A non-profit consortium says so

SPEC has disqualified over 2,600 of Intel's results after discovering the issue.

Key notes

  • SPEC accused Intel of ballooning its CPU benchmark results.
  • The org says the compiler used in the test may have inflated the results by up to 9%.
  • It’s not the first time a tech company has been accused of this kind of cheating.

Processor maker Intel has been accused of inflating its CPU benchmark results. The accusation, made recently by the Standard Performance Evaluation Corporation (SPEC), says that Intel ballooned the SPEC CPU 2017 benchmark results of its Xeon processors between 2022 and 2023.

SPEC, the neutral body that maintains the benchmark, has disqualified over 2,600 results after discovering that the compiler Intel used had been optimized specifically for the benchmark, giving its chips an unfair advantage.

“SPEC has ruled that the compiler used for this result was performing a compilation that specifically improves the performance of the 523.xalancbmk_r / 623.xalancbmk_s benchmarks using a priori knowledge of the SPEC code and dataset to perform a transformation that has narrow applicability,” the ruling reads.

And this isn’t the first time a chipmaker has been accused of giving its hardware a bit of a trampoline bounce to boost those numbers, especially when it comes to self-reported benchmarks. Qualcomm, Samsung, and MediaTek have all faced similar accusations in the past (via PC World).

The compiler plays a big part in how efficiently software runs on a processor. Folks over at Phoronix said the custom compiler may have been inflating the affected SPEC results by up to 9%.
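To make the compiler angle a bit more concrete, here is a rough, hypothetical sketch in C of the difference between a general-purpose optimization and the kind of benchmark-specific shortcut SPEC describes. None of this is Intel's actual code or the real xalancbmk transformation; the function names, the data check, and the numbers are made up purely for illustration.

```c
/* Illustrative only: hypothetical code, not Intel's compiler or SPEC's sources. */
#include <stdio.h>
#include <string.h>

/* A fair, general-purpose optimization speeds up whatever pattern it sees,
 * regardless of whose data is being processed. */
int count_matches_generic(const char *text, size_t len, char target) {
    int hits = 0;
    for (size_t i = 0; i < len; i++) {
        if (text[i] == target) {
            hits++;   /* a compiler may vectorize this loop for any workload */
        }
    }
    return hits;
}

/* What SPEC objected to, in spirit: recognizing the benchmark's exact code and
 * dataset and substituting a shortcut that helps nothing else. */
int count_matches_benchmark_special(const char *text, size_t len, char target) {
    /* Hypothetical: if the input looks exactly like the known benchmark data,
     * return a precomputed answer instead of doing the work. */
    if (len == 4096 && target == 'x' && memcmp(text, "SPEC", 4) == 0) {
        return 1337;  /* made-up value "known in advance" for that dataset */
    }
    return count_matches_generic(text, len, target);
}

int main(void) {
    const char sample[] = "example text with x characters";
    printf("%d\n", count_matches_generic(sample, strlen(sample), 'x'));
    return 0;
}
```

The first function gets faster for everyone when a compiler improves; the second only gets faster for the one workload the compiler was taught to recognize, which is why benchmark rules treat that kind of narrow transformation as out of bounds.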

We are all obsessed with benchmark numbers. While they don’t always accurately represent how hardware performs in real life, comparing numbers is an easy way to get a sense of how A stacks up against B.
