So why not make 100 the hottest? Let me guess, 0 isn’t the coldest in America either, right? It’s just so arbitrary, and pure cope to say it’s the best way to describe temperature.
It’s just so arbitrary
All of them are. The decision to use water at all is completely arbitrary. Even Kelvin and Rankine are completely arbitrary: the “width” of a degree is not set by any fundamental physical constant, but defined relative to an entirely arbitrary reference (historically, the properties of water).
Technically all arbitrary, but Fahrenheit is definitely on a whole different level of arbitrary.
Celsius - 0 = precise freezing point of water and 100 = precise boiling point (at standard atmospheric pressure)
Kelvin - same degree size as C, but shifted so 0 is absolute zero, the precise lowest possible temperature
Fahrenheit - 0 is the imprecise freezing point of some random brine mixture, 100 is the imprecise average body temperature of the developer (the standard conversions between these scales are sketched below)
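To make those fixed points concrete, here’s a quick sketch using the textbook conversion formulas (nothing here is specific to the argument, just the standard relations):

```python
def c_to_f(c):
    """Celsius to Fahrenheit: the same interval spans 100 C-degrees or 180 F-degrees."""
    return c * 9.0 / 5.0 + 32.0

def c_to_k(c):
    """Celsius to Kelvin: identical degree size, origin shifted to absolute zero."""
    return c + 273.15

assert c_to_f(0) == 32.0        # water freezes
assert c_to_f(100) == 212.0     # water boils
assert c_to_k(-273.15) == 0.0   # absolute zero
```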
100 is the imprecise average body temperature of the developer
That’s a myth. It’s no more true than the myth that it was the body temperature of horses, or that the scale was designed to reflect how humans experience the weather. (It happens to reflect how humans experience the weather, but this was an incidental characteristic and not the purpose for which the scale was designed.)
The Fahrenheit scale starts to make sense when you realize he was a geometer. It turns out that a base-10 system of angular measurement objectively sucks ass, so he wasn’t particularly interested in geometrically irrelevant numbers like “100”, but in geometrically interesting numbers like “180”. He put 180 degrees between the freezing and boiling points of water. (212F - 32F = 180F)
After settling on the “width” of his degree, he measured down to a repeatable origin point, which happened to be 32 of his degrees below the freezing point of water. He wanted a dial thermometer to point straight down in ice water, straight up in boiling water, and to use the same angular degrees as a protractor.
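A minimal sketch of that dial idea (my own illustration of the geometry, not Fahrenheit’s actual instrument): with ice water pointing straight down and boiling water straight up, one Fahrenheit degree is exactly one angular degree.

```python
def dial_angle_deg(temp_f):
    """Angle swept up from 'straight down' (ice water, 32 F) on a
    protractor-style dial where boiling water (212 F) points straight up.

    Because 212 - 32 = 180, one Fahrenheit degree is one angular degree.
    """
    return temp_f - 32.0

assert dial_angle_deg(32) == 0     # ice water: needle straight down
assert dial_angle_deg(122) == 90   # halfway: needle horizontal
assert dial_angle_deg(212) == 180  # boiling water: needle straight up
```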
The calibration point he chose wasn’t the “freezing point” of the “random brine mixture”. The brine was water, ice, and ammonium chloride, which together form a frigorific mixture due to the phase change of the water. As the mixture is cooled, it resists getting colder than 0F due to the phase change of the water to ice. As it is warmed, it resists getting warmer than 0F due to the phase change of ice to water. (Obviously, it can’t maintain this relationship indefinitely. But so long as there is ice and liquid brine, the brine will maintain this temperature.) This makes it repeatable in labs around the world.
And it wasn’t a “random” brine mixture: it was the coldest and most stable frigorific mixture known to the scientific community.
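If it helps, here’s a toy energy-balance sketch of why a frigorific mixture holds its temperature. The numbers are illustrative figures for pure water, not the real ammonium-chloride brine, and the model is deliberately crude; it just shows the buffering principle:

```python
LATENT_HEAT_J_PER_G = 334.0    # heat of fusion of ice, approximate
SPECIFIC_HEAT_J_PER_G_C = 4.2  # liquid water, approximate

def mixture_temp_f(heat_j, ice_g=100.0, liquid_g=100.0):
    """Temperature (F) of an ice/brine mixture after adding heat_j joules.

    Positive heat melts ice, negative heat freezes liquid; the reading
    only moves off 0 F once one of the two phases is used up.
    """
    total_g = ice_g + liquid_g
    if heat_j >= 0:
        if heat_j <= ice_g * LATENT_HEAT_J_PER_G:
            return 0.0  # still melting ice: temperature is pinned
        leftover_j = heat_j - ice_g * LATENT_HEAT_J_PER_G
        rise_c = leftover_j / (total_g * SPECIFIC_HEAT_J_PER_G_C)
        return rise_c * 9.0 / 5.0  # a rise of 1 C reads as 1.8 F
    else:
        if -heat_j <= liquid_g * LATENT_HEAT_J_PER_G:
            return 0.0  # still freezing liquid: temperature is pinned
        leftover_j = -heat_j - liquid_g * LATENT_HEAT_J_PER_G
        drop_c = leftover_j / (total_g * SPECIFIC_HEAT_J_PER_G_C)
        return -drop_c * 9.0 / 5.0

assert mixture_temp_f(10_000) == 0.0   # well inside the buffer
assert mixture_temp_f(-10_000) == 0.0  # buffered on the cold side too
assert mixture_temp_f(100_000) > 0.0   # buffer exhausted: reading climbs
```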
This criticism of Fahrenheit is born of simple ignorance: people don’t understand how or why the scale was developed, and assume he was an idiot. He wasn’t. He had very good reasons for his choices.
We live on a water planet. The weather we care about is water.
If you look at the overnight low you probably want to know if frost was likely. Guess what Celsius temperature frost happens at.
That factoid makes Celsius relevant for about 4 out of the 12 months, and humans lack the capacity to distinguish between 60 and 100 on the Celsius scale. Anything at those temperatures just feels like blisters.
Why not make it more arbitrary, then? Why not abandon metric conventions and use a base like twelve, which at least has nice fractions? Because it’s nice. It’s pleasing having it be 0F and 100F, it’s a clean range, and it also covers most of the temperature variance people actually experience.
It just happens to work out pretty nicely.
You’re literally just applying the antithesis of the metric system to the question and asking me why we don’t do it that way; idk what you’re expecting me to say here.
Do Celsius users not consider something like -20C to be “pretty cold” and 40C to be “pretty hot”? That’s equally arbitrary.