Our research paper, “Searching for the Ground Truth: Assessing the Similarity of Benchmarking Runs”, has been accepted for presentation at the 14th ACM/SPEC International Conference on Performance Engineering (ICPE). The study examines the stability of benchmark measurements across 586 micro-benchmarks: we explore the challenges, evaluate previous approaches, and introduce a new heuristic. Even in a peer-review setting, humans find it hard to identify dissimilar benchmark runs, which highlights the complexity of the problem. Our heuristic reaches 92% sensitivity, offering a promising contribution to the assessment of benchmarking run similarity.
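
To give a feel for the underlying task (this is not the heuristic from the paper), the sketch below compares two hypothetical benchmark runs by checking whether bootstrapped confidence intervals of their mean measurement overlap; the function names, sample data, and thresholds are illustrative assumptions only.

```python
import random
import statistics
from typing import Optional, Sequence, Tuple


def bootstrap_mean_ci(samples: Sequence[float], iterations: int = 1000,
                      confidence: float = 0.95,
                      rng: Optional[random.Random] = None) -> Tuple[float, float]:
    """Bootstrap a confidence interval for the mean of one benchmark run."""
    rng = rng or random.Random(42)
    means = sorted(
        statistics.fmean(rng.choices(samples, k=len(samples)))
        for _ in range(iterations)
    )
    lower = means[int((1 - confidence) / 2 * iterations)]
    upper = means[int((1 + confidence) / 2 * iterations) - 1]
    return lower, upper


def runs_look_similar(run_a: Sequence[float], run_b: Sequence[float]) -> bool:
    """Treat two runs as similar when their bootstrapped mean CIs overlap."""
    lo_a, hi_a = bootstrap_mean_ci(run_a)
    lo_b, hi_b = bootstrap_mean_ci(run_b)
    return lo_a <= hi_b and lo_b <= hi_a


if __name__ == "__main__":
    # Hypothetical execution-time samples (ms) from two runs of one micro-benchmark.
    stable = [10.1, 10.3, 9.9, 10.2, 10.0, 10.1]
    drifted = [11.8, 12.1, 11.9, 12.3, 12.0, 11.7]
    print(runs_look_similar(stable, stable))   # True: same distribution
    print(runs_look_similar(stable, drifted))  # False: measurements have shifted
```

A CI-overlap check like this is only a crude baseline; the point of the paper is precisely that deciding when two runs are "the same" is harder than such simple rules suggest.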