The dental industry in America is massive. Why is it such an important part of the American lifestyle?
The Brits are something else; they don’t count. I’m from Central Europe, and I was shocked when I saw the kind of teeth Brits run around with.
You know, ads for dental hygiene products over here use slogans like “gives you fresh breath” or “makes your teeth white”.
Over there, they advertise with “prevents your teeth from falling out”.
That said, the “American Smile” and the obsession with super-white teeth is its own kind of extreme. White does not equal healthy, and many bleaching methods are actually bad for the health of your teeth.
It’s not a secret (literally front-page material) that over 10 million Brits cannot get the dental healthcare they need. They’re pulling their own teeth at home; it’s completely unreal.
Edit: source (yes, I know it’s the Mirror; it was just the first link that popped up): https://www.mirror.co.uk/news/uk-news/desperate-patients-extract-teeth-11million-29379748.amp
over 10 million Brits cannot get the dental healthcare they need.
I’m an American, but I am confident that in the UK, you can get cosmetic dental work done. You may have to pay for it – like, it may not be an NHS thing – but there is a private healthcare industry that exists alongside the state-run one in the UK.
Uh, yeah. Of course you can get private stuff. You can get that anywhere, lol. That was not the question.
If it’s the Mirror, I don’t click it or believe it. Stop spreading propaganda.
What part of “it was the first link” don’t you understand?
Stop spamming people’s inboxes just because you’re too fucking lazy to do a 2-minute search.