It's a pretty neat analysis, but it looks like the "nearly exactly" part must be a coincidence for the particular methodology and data they used (most significantly that it's based on 2018 weather).
Fahrenheit was created in northern Europe, using the temperature of a salt water and ice mixture as the zero calibration point. It was later adjusted to define the difference between water's freezing and boiling points to be exactly 180°, since 180 is a highly composite number with many divisors. So off the bat, it's a bit odd that 0°F and 100°F would match the 1st and 99th percentiles of population-adjusted daily highs and lows in the US with that much precision. It's a coincidence already in the sense that the creator was not aiming for this.
But it's also a coincidence because they used 2018 data, which was a particularly warm year on average. (2012 was warmer, but I don't see any warmer years before 2012 in the National Weather Service's table, which goes all the way back to 1875.) Average temperature across the US can vary by 3° or 4°F year to year. The population-adjusted temperature should vary even more because it depends a lot on which weather systems hit the major population centers that year. I'm not sure how much the 1st and 99th percentiles would change if they redid the analysis for a different year, but they would probably vary by several degrees.
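For anyone curious what "population-adjusted percentiles" would actually involve: a rough sketch is to weight each station-day temperature by the population near that station, then read the 1st/99th percentiles off the weighted distribution. The station data and population figures below are invented purely for illustration; this is my guess at the method, not their actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: daily highs/lows (°F) for 3 "stations" over a year,
# with a made-up population weight per station.
highs = rng.normal(loc=[65, 75, 55], scale=15, size=(365, 3))
lows = highs - rng.uniform(10, 25, size=(365, 3))
population = np.array([8.4e6, 4.0e6, 2.7e6])  # hypothetical city sizes

# Pool all daily highs and lows, each weighted by its station's population.
temps = np.concatenate([highs.ravel(), lows.ravel()])
weights = np.tile(population, 2 * 365)

# Weighted percentiles: sort temps, build the weighted CDF, interpolate.
order = np.argsort(temps)
t_sorted, w_sorted = temps[order], weights[order]
cdf = np.cumsum(w_sorted) / w_sorted.sum()
p1, p99 = np.interp([0.01, 0.99], cdf, t_sorted)
print(f"1st percentile: {p1:.1f}°F, 99th percentile: {p99:.1f}°F")
```

Rerun that with a different year's (or different synthetic) data and the endpoints shift, which is the point: 0°F and 100°F landing almost exactly on the percentiles is sensitive to the year you pick.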
It's also kind of interesting that you would never have gotten this result before around 2012 or so, due to global warming.
It doesn't matter if it's a coincidence or not. The fact that it works out that way still plays to its convenience and "good feel" in the US.
Arguing that it's a coincidence isn't really relevant.
I agree with the poster further up: I'm more or less good with all metric units except temperature. While I still "feel" all the US customary units better than metric, I can intuitively "see" meters, liters, and kilograms. But Celsius continues to elude me, even after 8+ years of dating and then marriage to someone who grew up in a metric country.
I'm not sure you fully read my comment. It only works out for 2018. If you redid their analysis for any other year, you'd get that the 1st percentile is -4°F or something similar.
I only called out the "nearly exactly" part of the claim. US weather is approximately in the range of 0-100°F, give or take 20 degrees. But the analysis found 0°F to be almost exactly the 1st percentile of daily highs and lows, to within a twentieth of a percentile point.
It's true that US temperatures run around 0°F-100°F, but it's usually false that those temperatures are the 1st and 99th percentiles.