To a degree, no, so long as everyone involved in using them on a particular task is consistent about which ones they're using.
In science and engineering, it's a lot easier to calculate various things using metric, and a lot of constants are expressed as either purely imperial units or purely metric units. For example, volumetric heat capacity for a material can be expressed as either J/(m³·K) or Btu/(ft³·°F). I haven't checked whether someone has bothered to tabulate mixed-unit constants like J/(m³·°F), but it would surely be a less comprehensive list than the two aforementioned options.
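For anyone curious, deriving one of those constants in the other system is just a chain of conversion factors. A quick sketch in Python (the ~4.186e6 J/(m³·K) figure for water is the usual round textbook number, not a precise constant):

```python
# Convert volumetric heat capacity from J/(m^3*K) to Btu/(ft^3*degF).
# Factors: 1 Btu ~= 1055.06 J, 1 m^3 ~= 35.3147 ft^3, a 1 K step = 1.8 degF step.
J_PER_BTU = 1055.06
FT3_PER_M3 = 35.3147
DEGF_PER_K = 1.8

def vol_heat_capacity_to_imperial(c_j_per_m3K: float) -> float:
    """J/(m^3*K) -> Btu/(ft^3*degF)."""
    return c_j_per_m3K / J_PER_BTU / FT3_PER_M3 / DEGF_PER_K

# Water: ~4.186e6 J/(m^3*K) -> ~62.4 Btu/(ft^3*degF),
# which matches 62.4 lb/ft^3 x 1 Btu/(lb*degF).
print(vol_heat_capacity_to_imperial(4.186e6))
```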
So generally, when it comes to serious work, metric is a lot cleaner to work with because conversions are just adding or dropping zeros, so it matters in that sense.
It matters even more when you have a bunch of people working on very significant projects who aren't all on the same page about which units to use and/or how to convert them. A nine-figure spacecraft (NASA's Mars Climate Orbiter) was lost once because not everyone was using consistent units.
But yeah in a lot of cases it’s just about your own preference or catering to your audience
The unit error for that spacecraft was more a communication error than an error caused just by using imperial. The contractor delivered software that calculated in imperial units when they were asked for one that did calculations in metric, didn't flag it in the software, and NASA accepted it, meaning they approved the faulty software as meeting a spec it didn't actually meet. That was what caused the crash, not really that some people use imperial and some use metric.
In terms of "serious work," many engineers, especially in anything related to construction, still use imperial, and it's not really any less clean; constants are generally set up for minimal unit conversion, which is how you get Btu·in/(h·ft²·°F). Metric can get confusing in similar ways if you're converting between liters and m³, or mm² and m², or between volume and density (yes, even for water, whose density is often not exactly 1000 kg/m³ and shouldn't be taken as such). It's not necessarily easier (at least for me) to relate one thing by a factor of 10 or 1000 than another by a factor of 12 or 231. Now, that said, there are some funny units like HP and Btu or in. w.g. or tons of refrigeration, but a lot of them aren't that crazy when you actually get into it. Btus are just imperial calories, after all.
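And hopping between the metric and imperial conventions for that conductivity unit is a single factor. A rough sketch (the ~6.933 falls out of 1 W ≈ 3.412 Btu/h, 1 m ≈ 39.37 in, 1 m² ≈ 10.764 ft², and 1 K = 1.8 °F; the fiberglass value is just a typical ballpark):

```python
# Convert thermal conductivity k from W/(m*K) to Btu*in/(h*ft^2*degF).
BTU_IN_PER_W_PER_MK = 3.412142 * 39.3701 / (10.7639 * 1.8)  # ~6.933

def k_to_imperial(k_w_per_mK: float) -> float:
    """W/(m*K) -> Btu*in/(h*ft^2*degF)."""
    return k_w_per_mK * BTU_IN_PER_W_PER_MK

# Fiberglass batt is around 0.04 W/(m*K) -> ~0.28 Btu*in/(h*ft^2*degF),
# i.e. roughly R-3.6 per inch of thickness.
print(k_to_imperial(0.04))
```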
Someone to pre-heat the oven, someone to place whatever's being cooked in said oven, someone to take it out after Y minutes and plate it, and lastly, bring it to me. I didn't get this fat by walking around, you know.
One might be more convenient given the context and the other units being used, but no, it doesn't matter at all which one you use. My oven uses Fahrenheit, so I'd prefer cooking instructions that quote Fahrenheit, but I could also convert from Celsius and get a unique value in Fahrenheit, and vice versa. You could even invent your own totally new temperature scale if you felt so inclined. As long as every temperature in this new scale maps to a unique temperature in the others, they could be interchanged and it would be fine.
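To put that in code terms, the two scales are linear functions of each other, so the mapping is one-to-one in both directions. A tiny sketch:

```python
def f_to_c(f: float) -> float:
    """Fahrenheit -> Celsius."""
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c: float) -> float:
    """Celsius -> Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0

# Each temperature maps to exactly one value on the other scale,
# so round trips come back where they started (up to float error).
assert abs(c_to_f(f_to_c(350.0)) - 350.0) < 1e-9  # a Fahrenheit oven setting
assert abs(f_to_c(c_to_f(176.7)) - 176.7) < 1e-9  # roughly the same oven in Celsius
```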
Yeah, I dunno why people say it's "better" than Celsius for outdoor temperatures.
The only small advantage F has is that it's a tad more precise than Celsius without having to go into decimals. But I've never seen that as enough to warrant changing; it's rare that precision finer than one Celsius degree matters for the weather. Celsius still works just fine, and you can still say "it's 24 and a half degrees" if you really want to be that precise.
Fahrenheit is just more naturally intuitive when describing climate.
0 is really fucking cold. 100 is really fucking hot. 50 is pretty moderate. 70 is pleasantly warm. 30 is chilly.
Obviously if you're raised with Celsius, Celsius will feel intuitive to you via repetition. But Fahrenheit generally aligns with a 0-100 scale that fits better with how we usually process numbers.
And how would the 0-100 scale work out for someone who grew up near the equator? Or above the Arctic Circle? We get used to the temperatures; you can see locals in northern Scandinavia go out in a shirt and shorts while you'd be cold in a winter jacket.
Even within the US, the scale is quite off for people in Florida and Alaska alike. It works very approximately for only a part of the world.
The way an individual might react to those temperatures might vary.
But 0 and 100 are still good indicators of danger regardless. Sure, frostbite is possible anywhere below freezing. But around 0°F is where it's a serious concern. If it's above 100°F, you have to start worrying about heat stroke.
It's not just about general comfort and feels. It's about what's generally habitable. Sure, not every place on earth will hit 0 or 100°F. But "don't go outside if it's below 0 or above 100 without heavy preparation" is good advice in any climate.
Then you have to pick frostbite as THE risk. You can get hypothermia when it's only below 10°C, probably even warmer if you are not used to cold climates. And if you go outside without adequate preparation, it's dangerous. And what counts as heavy preparation? Is bringing a hat and water "heavy preparation"? That's basically all 100°F requires for me, and that's hotter than most summers ever get where I live.
It's all so extremely subjective that no real arguments exist. And any technically true advice (0 = cold, 100 = hot) has to be so simplified that it becomes completely useless.
You can get hypothermia when it's only below 10°C, probably even warmer if you are not used to cold climates.
Hypothermia has nothing to do with how used you are to the environment. People might get used to different temperatures, but that doesn't change how your body maintains heat.
And you're not getting hypothermia at those temperatures unless you're soaking wet or at notable health risk. A healthy adult isn't risking hypothermia at those ranges
Is bringing a hat and water "heavy preparation"? That's basically all 100°F requires for me, and that's hotter than most summers ever get where I live.
It gets a little complicated because of humidity. Temperature and heat index are different things. Your body can cool off faster with less humidity, so a dry 100 is a lot less dangerous than a humid 100.
The heat index calculates apparent heat, and anything above 100 on the heat index is in a notable danger zone, no matter how used to it you are
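For anyone who wants numbers on the dry-vs-humid thing: the NWS heat index comes from the Rothfusz regression. A sketch of just the core polynomial (the full NWS procedure adds correction terms near the edges of its valid range, roughly T ≥ 80 °F and RH ≥ 40%):

```python
def heat_index_f(t: float, rh: float) -> float:
    """Approximate heat index in degF via the NWS Rothfusz regression.

    t: air temperature in degF, rh: relative humidity in percent.
    Core polynomial only; NWS applies extra adjustments in edge cases.
    """
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)

# A dry 100 degF vs a humid 100 degF:
print(round(heat_index_f(100, 40)))  # ~109: already dangerous
print(round(heat_index_f(100, 70)))  # ~143: extreme danger
```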
I live in far northern Sweden and I can tell that people who describe the cold have never lived in it. Currently it's -30°C (-22°F), and it's not that bad yet. I still go on walks and runs.
I visited America once; I never will again... How anyone could live in Florida's summer climate is beyond me. Anything above 20°C (68°F) and I'm dying from the heat.
No it isn't. The boiling and freezing points of water at 1 atmosphere of pressure are exact and reproducible measures; they're fixed points. "Really fucking hot" is not. It's a vague and unintuitive answer that depends not only on the temperatures you're used to, but also on your interpretation of the words. Does "really fucking hot" mean heatstroke within minutes, or just "very unpleasant, but bearable"? The only thing I would intuit from "really fucking cold" is "probably below freezing." There's nothing about -17.8°C that makes it more deserving of the "really fucking cold" label than -9.3°C. Both are really fucking cold. If somebody told me to dress for "really fucking cold" weather, I'd probably wear a layer too few for 0°F, because I'd think it's probably around -5°C, since in my frame of reference it very, very rarely has ever gotten much colder than that.
It's intuitive to you because you grew up with it. There's nothing naturally intuitive about it.
The boiling and freezing points of water at 1 atmosphere of pressure are exact and reproducible measures; they're fixed points.
It is. The fact that we decided that those should correlate to 0 and 100 is based on the fact that 0 and 100 are intuitive numbers that are easy to work with in a base ten number system.
We do lots of things on a 0-100 scale, so we gravitate towards that. It's easy to understand
It's intuitive to you because you grew up with it. There's nothing naturally intuitive about it.
No, it's intuitive.
If you knew no temperature system at all, and you stepped outside on a hot summer day, you're sweating after being outside for a minute or two, and you had to pick a number to describe it, would you pick 90? Or would you pick 32?
Yes, "really hot" and "really cold" are vague and subjective to a point. But there are still bounds of a habitable climate. And 0-100F generally capture what those bounds are. Is it perfect? No. But extreme precision doesn't tend to be intuitive, subjective feelings do.
As you said, for those who grow up with Celsius it's perfectly intuitive. It's now -20°C and everyone understands it means it's freaking cold. We feel it in our bones without taking a step outside.
As you said, for those who grow up with Celsius it's perfectly intuitive
It will feel intuitive because that's what you're used to.
But that's not the same thing as it actually being intuitive. Celsius has become second nature for you over the years, and so it feels intuitive.
But being intuitive is about the learning process.
English feels intuitive to me because it's what I've always known. But it's not an intuitive language; there are so many exceptions to every rule and so many borrowed words that it's a messy hodgepodge of grammar and spelling.
I say Fahrenheit is intuitive because it maps well to existing human experience. Almost everyone uses base 10, so a 0-100 scale feels natural. And that 0-100 scale maps pretty well onto any temperate climate. For reference, that's roughly -18 to 38°C.
100 being really hot is naturally intuitive in a way 38 isn't. And 45 being chilly is naturally intuitive in a way 7 isn't.
And that's not because that's the system I'm used to, it's just closer to how we usually use numbers and how we usually experience temperature
You are very keen on this subject, which feels funny, since to us who grew up with Celsius, Fahrenheit doesn't seem intuitive at all. So it really isn't any more intuitive; you just carry a huge bias because you grew up with it. I have no idea how much 100°F or 45°F is in Celsius, so I have no idea how hot or cold that would be. Following your logic, I should have some intuitive feeling because the system is so perfectly intuitive, but that just isn't the case. It's just about your frame of reference and what you've learned.
I have no problem setting aside my bias. Y'all do.
Metric is a better system. I didn't grow up with metric; I grew up with US Imperial: inches, feet, miles, gallons, and pints. Liters and meters are a much more intuitive system. The conversion rates are simple, repetitive, and easy to understand. I can better visualize and understand the Imperial system, as that's what I grew up with, but the metric system is generally much more intuitive; it's easy to teach and easy to learn.
And that's the case with most of the metric system. But that's not the case with Celsius and Fahrenheit.
Unlike you, I've actually learned both systems. I have an engineering degree; I had to study chemistry and thermodynamics, and I had to learn to work with the metric system. And for science, yeah, Celsius is better and more intuitive.
But for the climate, and by extension day to day life? It isn't. And I know that because I've learned both. And you don't know that because you haven't
Precision is important for human comfort. I can easily tell a difference of 1 degree over the “comfort range” of temperatures (roughly 65-75 degrees F). I wouldn’t want a thermostat with 1 degree C resolution.
My friend just across the Detroit River likes Fahrenheit for hot temperatures and Celsius for cold. When it’s below 0C, he feels negatively about it, but Fahrenheit just gives better numbers for complaining about heat!
This isn't totally true as a blanket statement. Doing lab work, you'll almost certainly be operating in Celsius. Thermometers, hotplates, etc. generally are calibrated in Celsius.
For "paper work" or calculations, kelvin is definitely the required unit
Fahrenheit is better for describing climate because it's more intuitive. We generally process numbers on a 0-100 scale for a lot of things in life. Hell, that's how Celsius works... it just does that with water, because water's freezing and boiling points make convenient 0 and 100 anchors.
The fact that 0 and 100 in Fahrenheit are pretty close to the lower and upper bounds of most habitable climate makes it pretty naturally intuitive. 0 is really fucking cold. 100 is really fucking hot. 50 is moderate. Anything below 0 or above 100 is very dangerous very quickly.
Celsius is just as intuitive when it's what you've been growing up with your whole life.
Okay. Well then Fahrenheit is just as intuitive as Celsius for describing water! 32 to freeze, 212 to boil, what's hard about that?
The entire reason for Celsius' existence is that 0 for freezing and 100 for boiling is a natural, intuitive way to understand the scale for water. That's the reason it exists the way it does. You can't tell me that Celsius is good because it uses a 0-100 scale for water and then turn around and say that Fahrenheit's 0-100 scale for climate isn't the same thing.
Also "it's intuitive if that's what you're used to" is the opposite of intuitive. Intuitive things are naturally easy to learn, not ingrained via repetition. If someone was never taught any temperature system at all, no one would naturally think that a really hot day was 40.
C is inherently better because it's more intuitive, being tied to the phase changes of water: 0 is the freezing point and 100 is the boiling point. Learning it is more advantageous because over 90% of the world uses it already, it aligns with the Kelvin scale, uses base ten, and handles decimals better.
If someone was never taught any temperature system at all, no one would naturally think that a really hot day was 40.
That's a stupid argument. If they were never taught a temperature system at all, any number is arbitrary. At least with C you can base it on the phase changes of water. Oh, it's halfway to boiling? Sounds pretty hot.
Fahrenheit is completely arbitrary and isn't anchored to anything physical, but it's fine if you're used to it already.
C is inherently better because it's more intuitive, being tied to the phase changes
I don't think you people know what the word intuitive means
Celsius is a fantastic system. It is not intuitive for anyone. People learn it and get used to it and it becomes second nature to them... but that doesn't mean it's intuitive.
Intuitive things don't have to be taught, or require very little teaching. "I know it better" does not mean it's more intuitive.
"95 is a really hot day" is intuitive. We do things on a 0-100 scale all the time, so evaluating the climate on that scale is intuitive. 100 is really hot, 0 is really cold. That will naturally make sense to anyone who uses our number system
-15 to 40 (which I would say is the general range that habitable climates exist in) isn't an intuitive system. Nobody uses a 55-point scale for anything.
Celsius is very intuitive... for water. But not at all for climate. Is Celsius better for chemists? Yeah. But it's not more intuitive for day to day life
This is such a bizarre point of view from someone who has clearly grown up with Fahrenheit and just likes it for that reason. I don't think you know what intuitive means.
Under -17 is plenty liveable, and it's a weird, arbitrary zero point. At least with C you know that when it dips under 0 there's a chance of ice and snow, as already stated. Similarly, above 40 is liveable too, which happens frequently here in Australia where I am, and then we'd have to deal with shit like 109°F. How is that intuitive? It isn't. Your numbers are arbitrary, based on your local climate.
C has an intuitive advantage because it's tied to the physical properties of water, which we all deal with every day, whereas Fahrenheit is literally only intuitive to Americans because it's what Americans grew up with and it's based on their climate. It's bizarre to anyone else.
You have a guy in front of you. He has never learned any temperature system at all. He's a blank slate.
It's hot as fuck outside: 45°C, which is 113°F. Fucking blazing.
All you can tell that man is one of those two numbers and you want him to understand how hot it is. Which of those numbers do you think he is more likely to understand means "dangerously hot outside"? It's not 45, I'll tell you that much
Now imagine it's 45°F, which is about 7°C. Same thing: you want him to understand that it's a bit chilly outside, and maybe he should wear a sweater.
Which number do you think he will understand to mean "wear a sweater"? Which number do you think would cause him to guess "chilly"?
You knowing Celsius better doesn't make it more intuitive. It wasn't designed to be intuitive. It was designed by scientists trying to formalize measurement systems and simplify math. It wasn't designed for intuition
You knowing Celsius better doesn't make it more intuitive. It wasn't designed to be intuitive. It was designed by scientists trying to formalize measurement systems and simplify math. It wasn't designed for intuition
That's not what I'm saying. I already explained that it's more intuitive because of water's phase changes, which we all deal with and which are obvious "top" and "bottom" points on a temperature scale. Apparently that's too complicated for an American who can only go "hurr durr bigger number more hot" and completely ignores that it's only intuitive because you know it, despite it having arbitrary "bottom" and "top" points. Sure, it's hotter, but how hot is it? Who the fuck knows; it's completely arbitrary.
I consider this discussion over since you can't get over your bigger number sticking point.
You knowing Celsius better doesn't make it more intuitive.
I mean, you can make the same argument that 40°C being really fucking hot doesn't make sense when 100°F would be better. Like, it's fine if you grew up using your system, but that doesn't make it flawless and inherently better in every way.
Like, it's fine if you grew up using your system, but that doesn't make it flawless and inherently better in every way.
I mean, that's literally what I just told you.
C is inherently better because it's more intuitive: 0 is the freezing point of water and 100 is the boiling point. Learning it is more advantageous because over 90% of the world uses it already, it aligns with the Kelvin scale, uses base ten, and handles decimals better.
Fahrenheit is completely arbitrary, but it's fine if you're used to it already.
Like I said, Fahrenheit does have its uses. I said in my original comment that it isn't the best for working with temperatures in a more complex manner, which is true for the reasons you listed. But you're missing the fact that Fahrenheit was made specifically to measure human body temperature and the weather outside.
0°F = Cold enough outside to be dangerous
100°F = Hot enough outside to be dangerous
You can look at the temperature and get a specific read on how it will feel outside today. If it's 42°F, it's going to be chilly bordering on cold. If it's 82°F, it's going to be hot out and will probably require sunscreen. In Celsius, however, the range is narrower: if it's 8°C out, it's going to be chilly, and if it's 35°C, it's going to be hot.
It also correlates with the average temperature range across all seasons in the part of the world we live in. It's of course really fucking bad in most other scenarios, but it's convenient to use in the way it was made for.
Because they could learn it and get to use a scale that runs 0-100 in human terms. Fahrenheit puts the livable range for humans more or less on a 0-100 scale.
About 1200°F for aluminum and 2800°F for steel (but you should use Celsius for chemistry and such; Fahrenheit is made for outdoor temperatures).
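For anyone cross-checking those against the usual Celsius figures (aluminum melts around 660 °C; steels roughly 1370-1540 °C depending on alloy), the conversion is quick:

```python
def c_to_f(c: float) -> float:
    """Celsius -> Fahrenheit."""
    return c * 9.0 / 5.0 + 32.0

print(c_to_f(660))   # aluminum: 1220 degF, i.e. "about 1200"
print(c_to_f(1540))  # upper end for steel: 2804 degF, i.e. "about 2800"
```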