I can feel the difference between 71 and 73 in my house.
At 73, my kids’ room is uncomfortably hot. At 71, it has a perfect chill for sleeping.
What is your point? That people who use Celsius can’t feel the difference between 21.7°C and 22.8°C?
If you’re worried about your thermometer, you’ll be happy to hear that metric ones usually have finer precision than Fahrenheit ones, since they go in 0.5°C steps. A +1°F step is +5/9°C (about 0.56°C), so the Fahrenheit one actually gives you less precision!
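For anyone who wants to sanity-check the arithmetic, here’s a quick Python sketch (the `f_to_c` helper is just illustrative, not anything from a real thermostat):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# The 71°F vs 73°F difference quoted upthread, in Celsius:
print(round(f_to_c(71), 1))  # 21.7
print(round(f_to_c(73), 1))  # 22.8

# Step sizes: a 1°F step is 5/9 ≈ 0.56°C, so a thermostat that
# moves in 0.5°C increments is slightly finer-grained than one
# that moves in whole-°F increments.
print(round(5 / 9, 2))  # 0.56
```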
The point was they need that extra decimal because C isn’t good for human temperature sense.
It’s not like you are prohibited from using decimals in Fahrenheit. It’s that you don’t need three digits, because whole Fahrenheit degrees already work better for people.
And fuck you for making me defend the most ass backwards measurement system on the planet.
It’s just an incredibly weak defense. Why is it worse for C to use an extra decimal for these differences? I can just as well argue that C is a more accurate representation, because small differences in temperature are represented as small numbers. Just like your argument, this is purely an opinion - until you can show me that not needing the extra decimal is objectively better, or until I can show you that representing smaller differences as smaller numbers is objectively better, neither of them holds any weight.
I don’t know if my thermostat is just wrong or if the layout of my house makes it inaccurate, but 64-65 in my house is frigid.
Plus we have a baby, so 67-68 is really the lowest we could go at night, I think.
But I agree, I sleep better in general when the blankets are warm and the house is cold!