Industry benchmarks and analyst reports are bibles for many people in IT, providing independent assessment of vendors' product and technology claims.

But too many people take them literally.

The fact is that they deserve much more skepticism than they typically get.

The devil is in the details—or rather, in the methodology. I know that from personal experience: I work at, and have worked for, multiple software vendors that have been frequently covered by analyst firms, both positively and negatively.
I’ve personally assisted in tuning for benchmarks, including SPEC’s, to present my companies’ best face.
There’s nothing wrong with a vendor putting its best face forward and trying to encourage analysts to support its point of view. But sometimes it goes further than that. We’ve all heard rumors that some analyst reports’ conclusions are influenced by which vendors are those analysts’ customers.

And we’ve all seen reports of habitual cheating on industry benchmarks in several segments of the hardware industry (it’s not just Volkswagen’s diesel engines): products detect the benchmarks in use and change their behavior accordingly, to look better than they are in the real world.