Definition of insanity
Ever heard the expression that the definition of insanity is doing the same thing over and over and expecting a different result? It is often falsely attributed to Albert Einstein, Ben Franklin, and others.
Insanity is often thought of as a psychological term, and in some ways, it is exactly that.
However, recognized psychological conditions are listed in the Diagnostic and Statistical Manual of Mental Disorders, also referred to as the DSM-5, and the DSM-5 does not recognize “insanity” as a psychological condition.
The term is actually used in legal settings to describe any number of mental conditions which cause a defendant to be unable to distinguish right from wrong or assist in their defense.
Also, it’s important to understand that doing the same thing can in fact yield different results if you haven’t controlled for all the variables related to the action.
Throw a baseball in an open field, and it lands safely on the ground. Throw a baseball in a greenhouse, and a window is likely to be broken.
Ultimately, it’s a stupid and ignorant cliché, and should be banished from the lexicon of colloquial sayings.
The United States Penny
The word penny is the name of a British coin (the plural is pence), similar in size and stature to the United States one-cent coin. So in America, we do not officially have pennies, we have cents…as in one perCENT of a dollar.
Do we use all of our brains?
We’ve all heard the claim that we only use about 10% of our brain. It’s the underlying basis for the belief that some of us can predict the future, do telekinesis, and other brain-powered myths.
The brain, like any other body part, uses energy to do what it does, and if it doesn’t do so, it will atrophy and die. Yet our brains stay relatively intact most of our lives. So it stands to reason that all components of it are doing something, and neurological scans have confirmed this is true, even with such mundane tasks as pouring coffee.
So as interesting as this saying may be, it seems that while people who share this cliché may only use 10% of their brains, the rest of us use 100%.
Feed a cold, starve a fever?
As most of you know, your body is pretty good at fighting off diseases, viruses, bacteria, and other things that could kill you if you didn’t have an immune system and the regenerative capacity to replace dead cells.
What seems to be lost in this cliché is that physics dictates that performing any action requires energy. We humans get our energy from the food we eat, energy that ultimately traces back to the sun.
So while colds and fevers might be different, the fact remains that you need energy to combat either one of them.
Eating isn’t optional, it’s required.
So no matter what ails you, unless your doctor specifically tells you not to eat for some reason, such as a gastrointestinal problem, or prepping for something like a colonoscopy, you should ALWAYS feed yourself a normal and healthy diet.
The fact that you’re often tired and weak when sick is evidence that your body is hogging energy resources to fight whatever it is that ails you, so how could depriving it of energy possibly make sense?
Shouldn’t the sun have burned out by now?
There are a multitude of ways matter can be turned into energy. One is a chemical reaction, such as burning fuel. There’s nuclear fission, also known as splitting atoms, such as that which was used in the atomic bombs dropped in World War II. And there’s nuclear fusion, joining lighter atoms to form heavier ones—it’s the most powerful of the three. Fusion is what the sun constantly does.
Let’s hit you with some numbers to hammer this point home.
We burn gasoline to power our cars, and that chemical reaction produces about 1.4 electron volts per atom of carbon. For fission, we use uranium atoms, which when split produce about 210,000,000 electron volts per atom.
I know what you’re thinking: that seems like a typo. But indeed, nuclear fission of uranium is about 150,000,000 times more powerful, atom for atom, than burning a similar amount of gasoline (largely carbon).
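That comparison can be sketched with a little arithmetic. This is a rough sketch: the 1.4 eV and 210 MeV values are the round figures above, and the per-gram step assumes carbon-12 and uranium-235 masses.

```python
# Rough energy comparison: burning carbon vs. fissioning uranium.
# Round figures from the text: ~1.4 eV per carbon atom burned,
# ~210,000,000 eV (210 MeV) released per uranium atom split.
ev_per_carbon_burned = 1.4
ev_per_uranium_split = 210_000_000

# Atom for atom:
per_atom_ratio = ev_per_uranium_split / ev_per_carbon_burned
print(f"Per atom: {per_atom_ratio:,.0f}x")  # Per atom: 150,000,000x

# A uranium-235 atom is about 19.6x heavier than a carbon-12 atom,
# so gram for gram the advantage shrinks accordingly.
mass_ratio = 235 / 12
per_gram_ratio = per_atom_ratio / mass_ratio
print(f"Per gram: {per_gram_ratio:,.0f}x")  # roughly 7.7 million x
```

The per-gram figure matters because fuel is bought and carried by mass, not by the atom.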
It should be noted that a uranium atom has roughly 20 times the mass of a carbon atom, so gram for gram, fission’s advantage shrinks to roughly 7 to 8 million times. Fusion does better still: fusing hydrogen releases on the order of 60 million times as much energy, gram for gram, as burning carbon.
The astute of you may have just realized that I’ve already clued you in to why the sun hasn’t burned up yet. Because it’s not burning, it’s fusing.
While the sun is approximately 109 times the diameter of the Earth, it has 330,000 times the mass. So take however long an Earth-sized pile of fuel would take to “burn out”, multiply that by 330,000 for the sun’s extra mass, then by roughly 60,000,000 for fusion’s per-gram advantage over burning, and that’s roughly how long the sun can keep fusing versus burning like a campfire.
THAT my friends is why it hasn’t burned out yet.
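The multiplication above can be sketched numerically. These are the article’s illustrative round numbers, not precise astrophysics:

```python
# Back-of-envelope: the sun's fusing lifetime relative to an
# Earth-sized pile of fuel burning chemically, per the text's logic.
earth_burn_time = 1                       # normalized: "however long Earth would take to burn"
sun_mass_in_earths = 330_000              # the sun has ~330,000 Earth masses of fuel
fusion_vs_burning_per_gram = 60_000_000   # fusion's rough per-gram advantage over combustion

sun_lifetime = earth_burn_time * sun_mass_in_earths * fusion_vs_burning_per_gram
print(f"{sun_lifetime:.2e}")  # 1.98e+13 -- about twenty trillion times longer
```

However long you imagine an Earth-sized campfire lasting, the sun lasts roughly twenty trillion times that.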
On a side note, the reason we can’t do fusion efficiently on Earth is that the sun is 330,000 times Earth’s mass, and all that mass creates crushing gravitational pressure that Earth simply doesn’t have. So to produce fusion on Earth, we have to supply the confining energy ourselves from a man-made source. So far, the energy required to trigger fusion has exceeded the energy we get back out, making it a net loss and therefore not useful as a power source.
Why aren’t there indoor AC Units?
Ever notice that unlike heaters, your AC unit must reside outside? Even if you put one in your window, half of it is still not indoors.
This is a neat lesson in the physics of heating versus cooling—it’s pretty damn interesting.
When we create heat, we’re releasing stored energy, as mentioned in the points above—combustion, fission, fusion, etc. So your heater simply burns kerosene, or draws electrical energy out of the wall, and vents that energy into the area you’re heating.
Cooling is the opposite of that and WAY more complicated to do. You’re not technically cooling something; you’re removing energy from it.
In a perfect world, cooling would simply be converting energy back into matter, but we frankly don’t know how to do that well or efficiently, and we rarely see it occurring in nature. So we have to find another way.
In admittedly oversimplified terms, an air conditioner works by putting energy into the unit, then as it vents that energy out one end of the unit, the other end is cooled commensurately.
If you’ve ever used one of those compressed air cans to clean your computer, you’ve experienced the temperature drop when something goes from a compressed to an uncompressed state.
The energy was put into the can at the factory that made it; the heat generated by compressing it stayed at that factory.
Now it’s shipped full of potential energy, and when you release that energy out of the nozzle, the rest of the can moves toward zero kelvin (the coldest anything can be—the point of zero thermal energy, which is about -459.67° Fahrenheit).
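As a sketch of the physics, here’s the absolute-zero conversion plus an idealized estimate of that expansion cooling. The ideal-gas adiabatic model and the 5-atmosphere can pressure are assumptions for illustration (real duster cans also cool via liquid evaporating inside):

```python
# Zero kelvin in Fahrenheit: F = K * 9/5 - 459.67
def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_fahrenheit(0.0))  # -459.67, absolute zero

# Idealized adiabatic expansion: T2 = T1 * (P2/P1) ** ((gamma-1)/gamma)
# gamma ~ 1.4 for air; the 5 atm can pressure is a hypothetical figure.
def expansion_temperature(t1_k: float, p1_atm: float, p2_atm: float,
                          gamma: float = 1.4) -> float:
    return t1_k * (p2_atm / p1_atm) ** ((gamma - 1.0) / gamma)

t2 = expansion_temperature(293.15, p1_atm=5.0, p2_atm=1.0)  # start at ~68 F room temp
print(round(t2, 1))  # about 185 K: far colder, though nowhere near 0 K
```

So the can gets dramatically colder, even if it never gets anywhere close to absolute zero.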
In a nutshell, your AC unit takes energy out of the wall and vents heat outside from one end, making the other end cold. If AC units didn’t have a place to vent that heat outside of the area they’re trying to cool, the hot and cold would cancel out, leaving no net cooling of the affected area (in fact, the electrical work input would warm it slightly). It’s why the back of your refrigerator is warm despite the inside being cold, too.
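The energy bookkeeping behind that can be sketched with the first law of thermodynamics. The coefficient of performance (COP) value here is a hypothetical but typical figure, not from the article:

```python
# First-law sketch for an AC / heat pump: the heat vented outside must
# equal the heat pulled from the room PLUS the electrical work put in.
# Vent both ends into the same room and there is no net cooling.
cop = 3.0                       # hypothetical coefficient of performance
work_in_kwh = 1.0               # electricity drawn from the wall

heat_removed_kwh = cop * work_in_kwh              # pulled out of the room
heat_vented_kwh = heat_removed_kwh + work_in_kwh  # dumped outside
print(heat_removed_kwh, heat_vented_kwh)  # 3.0 4.0
```

The vented side always carries more heat than the cooled side loses, which is exactly why it has to go outdoors.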