I disagree. Most people who get a rash from a piece of jewelry probably just stop wearing it, and don't visit a doctor if the rash then goes away.
Asking a sample of people whether they've had such a rash, then testing those who have for the allergen (or just allergen-testing the entire sample), would be
considerably easier than figuring out the genetic predisposition[1] and then testing a sample of people for that predisposition.
The other reason it's an estimate is that prevalence changes over time and isn't uniform across the population: to be allergic to something you first need to be sensitised to it through exposure.
I'm not sure whether the article is open access, but I can access the full text; here's a quote:
I had to keep that quote short, because it's a fascinating article.
If that doesn't persuade you, then perhaps this sentence from the article will:
(Update) Finally, the obvious[2] argument against this being 'overcautious' genetic-susceptibility testing rather than 'true' allergy is the massive reported difference in rate between men and women. That ought to be quite hard to get past the referees if you weren't really checking for allergy proper.
[1] Just consider - how would you do that? The obvious way to start is to screen a sample of people for nickel allergy - by which point you'd have the prevalence data already!
[2] when you've already gone over the less-obvious reasons...