Category Archives: Logic & Reason

Framing and Perception: Using Skepticism to Avoid Being Deceived

We’ve all heard the expression "there are two sides to every story." It implies that one side is telling the truth, and the other side is lying. While that can be true, it can also be that both sides are right, and both are simply leaving out crucial factors. It could be that neither side is right, and the truth is something else entirely. It could be that one side is right, and the other believes they’re right but is simply mistaken. And most commonly, it could be a matter of opinion, and there simply isn’t a right or wrong in the first place.

The point of skepticism is to consume information in such a way that you are least likely to be deceived or to make bad assumptions, leading to more intelligent decisions and typically better outcomes for you. Let’s look at some examples.

In April of 2019, several news outlets reported that eating just one slice of bacon can increase your risk of colorectal cancer by 20%. You can see one instance of this report from CNN here. CNN was not dishonest in its reporting; that data is true.

But when you click through to the actual study and apply a little skepticism (and some math), you might look at it a little differently.

There were 475,581 participants in the study, and a mere 2,609 cases of cancer reported among all participants. If one group’s cases are 20% higher than the other’s, that works out to approximately a 45.4/54.6 split (54.6/45.4 ≈ 1.20, or 20% more).

54.6% of 2,609 ≈ 1,425 (0.30% of the total group)

45.4% of 2,609 ≈ 1,184 (0.25% of the total group)

So while 1,425 is indeed 20% more than 1,184, out of the total group of people observed (475,581), a mere 0.55% contracted colorectal cancer. The bacon eaters accounted for 241 more cases, a mere 0.05% overall increase (0.30% vs. 0.25%).

An almost entirely insignificant 0.05%, or 241 out of 475,581 people, doesn’t sound nearly as scary as 20%, does it? But scary sells news media, and journalists are rarely scientists.
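If you want to sanity-check a headline like this yourself, the arithmetic is simple enough to script. Here’s a minimal Python sketch of the numbers above, assuming (for simplicity) that the two consumption groups were the same size:

```python
# Back-of-the-envelope check of the bacon/cancer numbers above.
# Simplifying assumption: the two groups are the same size.
participants = 475_581   # total people observed in the study
cases = 2_609            # total colorectal cancer cases reported

# If one group's cases are 20% higher than the other's
# (x + 1.2x = all cases), the cases split ~45.4% / 54.6%.
low = cases * (1 / 2.2)     # ~1,184 cases
high = cases * (1.2 / 2.2)  # ~1,425 cases

relative = (high - low) / low           # the scary headline number
absolute = (high - low) / participants  # the change in your actual risk

print(f"Relative increase: {relative:.0%}")   # -> 20%
print(f"Absolute increase: {absolute:.2%}")   # -> 0.05%
```

The headline reports the relative number; the absolute number is the one that describes your actual change in risk.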

This skill isn’t just for science stories, either; you can apply it to a myriad of things you’ll read or see in the media.

Imagine a news story we’ll call statement A with a headline that reads, “Woman courageously does all that is needed to put food on the plate for her child.”

Female Shoplifter

But then imagine a different news outlet runs a different headline we’ll call Statement B that reads, “Woman fired for drinking while at work, stole unhealthy snacks and booze from a grocery store.”

Statement A makes her sound like a hero, but Statement B tells a very different story. Both can be 100% true, but the context changes how you feel about the story entirely.

The point of all this is to make you think about any news story you read, and maybe to change the way you consume information. So here are a couple of ideas on how to do that.

  • Avoid click-bait headlines from sources you’ve never heard of, or that you know are openly biased. They’re almost entirely bullshit, so why waste your time on them? The good ones will link to credible sources, and you should click through and read the whole story if you do go down that road. But in general, if people stop clicking on clickbait, the people producing it will respond to the lack of demand by ceasing to make it.
  • Read the article and not just the headline. Even reputable sources have resorted to click-bait headlines just so you’ll read their stories over the nonsense from non-reputable sites. You’re missing a lot of context and nuance if you don’t read the story. Not to mention, you look silly when you add your own comment that clearly shows you didn’t read the article.
  • Any story that says something like, “The such-and-such that such-and-such doesn’t want you to know” or “Person A destroys person B” is bullshit. All of it. Every single one of them. Stop sharing that nonsense. Seriously.
  • If you see a story and it seems pretty amazing, but you aren’t seeing it on reputable sources, I assure you, some podunk website did not scoop Reuters or AP. It’s bullshit that they didn’t vet properly, or worse, that they just made up.
  • Check a second source. This one is huge. If you see a story on a site that’s kinda reputable but not great, look for it on a site like Reuters or AP. If you confirm from multiple reputable sources, then it’s probably true. But if it’s multiple sources with the same bias, you should probably still avoid it.
  • Think about what’s being said in the story, and whether there could be another way of looking at it. For instance, if I told you France gets 75% of its energy from nuclear, while the United States only gets 20%, you could easily assume that France is the leader in nuclear energy compared to the United States. But if I told you France has 58 nuclear power facilities whereas the United States has 98, you’d think the US is the leader. Both are true, but each tells a different story (see the short sketch below). So it pays to dig into the data when you can, and form your own opinion based on all the information.

    Nuclear Power Plant Emits Only Water Vapor
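To see how easily the same facts support opposite framings, here’s a trivial sketch using only the figures quoted in the bullet above:

```python
# Two true framings of the same nuclear-energy facts cited above.
france_share, us_share = 0.75, 0.20  # share of each nation's energy from nuclear
france_plants, us_plants = 58, 98    # number of nuclear power facilities

print(f"Framing 1: France gets {france_share:.0%} of its energy from nuclear "
      f"vs. {us_share:.0%} for the US -- France looks like the leader.")
print(f"Framing 2: The US runs {us_plants} facilities vs. {france_plants} "
      f"for France ({us_plants / france_plants - 1:.0%} more) -- "
      f"the US looks like the leader.")
```

Neither framing is false; they just answer different questions (share of the energy mix vs. absolute capacity).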

Hopefully this helps you think about how you consume news, and prevents you from being that embarrassing friend on social media who’s always sharing bullshit articles everyone but you seems to know aren’t true. You’ll thank me later. 🙂


The Myth of the “Militia” clause in the 2nd Amendment

Gary Nolan (and THE Scrappy Doo)

A well-regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed. ~ 2nd Amendment to the US Constitution.

Nothing seems to evoke more passion from either side of the political aisle than the 2nd amendment to the U.S. Constitution. People on the Individual-Right side of the fence often cite the “shall not be infringed” clause, but those who wish to limit or eliminate the individual right to bear arms often cite the “well-regulated militia” clause.

Former ACLU Leader and Mathematics Professor Ira Glasser

The anti-individual-right argument is that the framers meant for Americans to be able to form militias to protect the people or the country, and those militias would need to be armed. This sentiment has been echoed by such noteworthy civil rights leaders as former ACLU head and mathematics professor Ira Glasser, who discussed it at length during the March 2nd, 2018 Comedy Cellar podcast. (click to listen)

NYC Comedy Cellar

Side Note: Don’t be fooled that the podcast is from the Comedy Cellar. Owner Noam Dworman is quite fond of discussing politics, and is incredibly thoughtful, fair, and insightful on the subject.

Oddly, the person who got it right (IMO), contradicting Ira Glasser, was Noam’s co-host, comedian Dan Naturman, who often describes himself as left-leaning. Dan holds a law degree from Fordham University, and Noam studied law at the University of Pennsylvania, making them, not Glasser, the only people in the discussion trained in law.

While Noam Dworman tends to seem centrist, he understandably felt the need to defer to Glasser’s judgment, since Glasser’s work with the ACLU would suggest he was the more knowledgeable person in the room—but on this issue, he just wasn’t.

Comedian Dan Naturman
The argument the anti-individual-right group presents seems pretty sound on the face of it, but there are several flaws in this line of thinking that make it unarguably incorrect.

But let’s break down the flaws of these arguments one by one.

THE CONTRADICTION

The first issue is that this reading entirely contradicts the rest of the Bill of Rights.

The Bill of Rights was drafted not as a set of laws for the people to heed, but instead as a set of limits on how government may restrict the people’s individual rights.

The Bill of Rights Institute writes:

The first 10 amendments to the Constitution make up the Bill of Rights. James Madison wrote the amendments, which list specific prohibitions on governmental power, in response to calls from several states for greater constitutional protection for individual liberties.

So if the 2nd Amendment was drafted to allow the establishment of militias, and was not meant as an individual right, it would be inconsistent with the other nine amendments.

Our founding fathers believed you have inalienable rights by virtue of existing, and they cannot be taken from you. They don’t come from government at all—the founders of our country were very clear on that when they wrote the Declaration of Independence.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.

The Bill of Rights places limits on what government may do; it does not establish government entities. Those things are laid out in the first portion of the Constitution. It is also written with the intention that the power comes from the people, not the government.

The 1st Amendment

For instance, our first amendment states “Congress shall make no law respecting an establishment of religion…” versus something like “You are free to practice any religion of your choosing.”

This pattern is consistent throughout the Bill of Rights, and while most see both of those phrases as essentially the same, there’s an incredibly important distinction. As the Bill of Rights is written, the people have the power and are imposing a limit on the government’s ability to restrict their religious freedom. In the second, theoretical example, it implies government has the power and is granting religious rights to the people.

So now that we understand the Bill of Rights (it’s in the name, for Pete’s sake) establishes rights of the people, not of government, and thus contradicts the idea that the 2nd amendment was meant to help local militias form, let’s move on to issue #2.

THE MEANING OF THE WORD MILITIA

So what did the framers mean by the word "militia" in the 2nd amendment?

If we assume the term "militia" refers to local military and police (which are government entities, after all), then the people arguing the 2nd amendment was set up to allow local governments to establish militias comprised of the people believe our forefathers wrote an amendment that says government cannot infringe on government’s right to bear arms. This is not only inconsistent with the rest of the Bill of Rights, which guarantees individual rights, but its redundancy is nonsensical. If government cannot infringe on government’s right to carry guns, then there would be no reason to even mention it in the first place.

The militia clause does refer to government, but not just local governments; it means any government. It wasn’t a right of the militia, it was a limit on it. This will make more sense as we move on to the next issues.

THE MEANING OF THE WORD REGULATED

The third important issue people get wrong is believing "well-regulated" is synonymous with "well-organized."

Merriam-Webster defines "regulate" as:

A: to govern or direct according to rule

B: to bring under the control of law or constituted authority

If the people arguing it was meant to set up militias were correct, "well-organized" would have been the more proper wording. Instead, the only logical interpretation is that they meant for the militia (the armed wing of government) to be controlled, or regulated. They believed the way you do this is to prohibit government from disarming the public, as the British tried to do before the Revolutionary War broke out—arguably the prime motivation for the 2nd amendment in the first place.

WHAT IS BEING PROTECTED?

The last issue with the anti-individual-right argument is what it claims is being protected. The incorrect argument is that it’s protecting a country (state) which is free. But the reality is that it is protecting freedom itself, ensuring the state remains free. You’d have to deny nearly all of America’s founding history to believe that freedom wasn’t at the core of everything the founding fathers did. Freedom was always more important in their minds, far more important than the state.

ANOTHER WAY TO WRITE IT

With all that in mind, let me reword the amendment in the way it was intended using language that is maybe more understandable in today’s vernacular.

In order for the people of the United States to remain free, the militia shall be kept under control by the people of the United States who have the right to keep and bear arms.

This is precisely how the amendment was intended, and the only interpretation of it which cannot be easily challenged.

Click here for several quotes by the founding fathers that bear out this claim. It was painfully obvious that they wanted government to fear the people, as many of them specifically wrote.

They knew government would always be prone to becoming bloated and oppressive. And while a government can pass a million laws, those laws have no teeth if there is no militia to enforce them. So in order to keep that government, specifically its enforcement wing (the militia), well-regulated, the people should be armed as well. This way, the government (and militia) always retain some level of fear of the people. It’s the only use of the word "regulated" that makes sense.

In 2008, with District of Columbia v. Heller, the United States Supreme Court held that it is an individual right, saying the following:

The Second Amendment protects an individual right to possess a firearm unconnected with service in a militia, and to use that arm for traditionally lawful purposes, such as self-defense within the home. (click for entire transcript)

This opinion, being current precedent, effectively settles the argument for now.

IS THIS IDEA OF THE GOVERNMENT FEARING THE PEOPLE EVEN REALISTIC NOW?

Since the United States military is vastly more powerful than its armed citizens, many argue the point is moot, since we "the people" couldn’t possibly fight it. In a narrow scope that might seem accurate, but again, if you apply a little skepticism, it isn’t.

It’s true the U.S. military’s might is overwhelming—it’s arguably more powerful than nearly all the rest of the world’s militaries combined.

But it’s still given orders by an elected government. When the military kills its own citizens, it usually isn’t well-received by the people who voted for that government, threatening their re-electability.

It’s also important to remember ours is a voluntary military composed of the people, and its members aren’t likely to murder their brothers and sisters without good cause.

But we actually have evidence it can work. Nevada rancher Cliven Bundy and a large group of supporters showed up armed to the teeth to fight the federal government over a land dispute. No shots were fired, perhaps in remembrance of the Waco, Texas incident, and the federal government did back down. Those armed citizens, in this modern era of massive military prowess, did precisely what the 2nd amendment was designed to enable: they fought the law, and the law didn’t win.

So why do gun control advocates believe this is what the 2nd amendment implies? It’s likely a simple case of confirmation bias—a phenomenon whereby someone attempting to prove something they hope to be true (or false) eschews interpretations that conflict with their bias and/or accepts suspect data that supports it, due to an inner desire to substantiate their argument.

We are all prone to doing this, and with the exception of devout skeptics like myself, we’ll rarely even know we’re doing it, let alone act to correct it.

Some may have come to these beliefs through their own life experience. For instance, former US representative Gabby Giffords, who was brutally shot in 2011 by a crazed killer on an unhinged political shooting spree, or former Reagan White House press secretary James Brady, who was shot in a failed presidential assassination attempt.

Their lives were forever changed by gun violence, so it’s quite reasonable that they would advocate limiting our right to bear arms. And when people have been forever affected by senseless gun violence, it behooves all of us to respect their trepidation with respect to lax gun laws. We haven’t walked a mile in their shoes.

James Brady & The Reagan Assassination Attempt

But apologies to those who wish to limit our right to bear arms and believe the "militia" clause supports their argument: the "militia" reading just isn’t consistent with the rest of the Constitution, and you’re unfortunately misinterpreting the clause.

Famous People and Their Causes

This may surprise you, but famous people have opinions. Gwyneth Paltrow believes a jade egg shoved in a woman’s hoo-hah somehow makes her healthier (click the link, because it doesn’t).

A large majority of Hollywood believes Trump is basically Satan, and many black athletes have taken a knee during the national anthem because they believe the police are too quick to shoot a young black man.

Gwyneth Paltrow/Chris Martin and Family

When they hold these opinions, being people who are used to the spotlight, they rarely shy away from sharing their feelings on any given subject—using their bully pulpit to encourage others to follow their lead.

There are a few important facets to these expressions of beliefs that I feel are worth discussion.

First things first: they have a right to an opinion, and they should share that opinion if they’re passionate about it. They should be shown respect for speaking out on something that’s important to them. Their success means that if it is a cause worth fighting for, they can shine a light on a subject in a way that we non-famous people simply can’t.

On Twitter, I’ve often seen regular people telling athletes with an opinion on politics to "just shut up and play (insert their respective sport here)," or telling British physics professor Brian Cox, who’s quite vocal about Brexit, to "just stick to science."

Professor Brian Cox

I understand why people might feel this way, since such famous people are not famous for politics, and thus not presumed to be experts on the subject. But politics isn’t science; it’s entirely driven by subjectivity, meaning one person’s opinion is just as valid as another’s. And as a libertarian, I believe anyone who speaks truth to power (even if I think they’re misinformed about what is truthful) is still doing something noble.

By all means, make the effort to correct them if you think they’re wrong on the facts, but do so respectfully, and applaud anyone with a voice for speaking out.

Phil Mickelson spoke out against California and its high taxes, and was blasted as an elitist. So what! He earned his money with his work ethic. Most people will never know how hard it is to be that good at anything, and I assure you it didn’t happen with a mere 9-5, 40-hours-a-week effort.

PGA Tour Golfer Phil Mickelson

Colin Kaepernick started a movement to call out cases where officers shoot unarmed black men and few repercussions follow, something we should all be bothered by when it happens. We can quibble over whether some of the shootings he rallied against were justified (some may very well have been), but it does happen nonetheless, and we shouldn’t excuse it.

But all that being said, people should understand that being famous doesn’t make someone an expert, and thus adds no additional credibility to their argument versus your neighbor who may be espousing the same opinion (unless they’re an expert in the field).

So while we should not discourage them from speaking out with things like "just shut up and play your sport," please bear in mind that you shouldn’t blindly follow them either. You shouldn’t assume they’re in command of the facts, or that the information they provide is truthful. The only thing you can presume to be true is that their heart is in the right place, and they mean well.

Just about every issue is way more complicated than any non-expert understands. So listen to what people say, but apply your own skepticism, and if you care about the issue, take the time to look up credible sources and form your opinions based on them. Doing something, or believing something, because a famous person told you to is irresponsible at best.

Logically Fallacious – The Misuse of Logical Fallacies

People who fancy themselves intellectuals often take pride in calling out someone’s argument as a logical fallacy. While it’s good that people are aware of logical fallacies, and know the value of avoiding them in reasoned debate, it appears many know the words, but don’t necessarily understand what they so eloquently recite.

Logical fallacies are flaws in the way people make arguments: someone makes a definitive statement, as if something must be true or false, when it may be either.

For instance, there’s the tu quoque fallacy, which translates to "you too." The idea is that just because someone doesn’t do the thing they said you should do, it doesn’t mean their advice is invalid. People not familiar with the name of this fallacy might simply argue someone is guilty of "do as I say, not as I do" hypocrisy.

Imagine I advise you not to drink alcohol, citing all the health issues that go along with it. That is genuinely good advice. Even if I drink myself, that doesn’t make it bad advice. So arguing that my advice must be invalid because I drink (or I wouldn’t drink myself) is the tu quoque fallacy.

These are matter-of-fact statements, which is where the tu quoque fallacy applies. However, it’s not applicable to subjective claims.

For instance, if I say that I believe drinking is immoral, and then I drink anyway, and someone criticizes me for it, they’re not committing the tu quoque fallacy, they’re just rightfully calling me out for being a hypocrite.

In the first example, I made a factual statement; in the second, I shared an opinion.

Another example where logical fallacies are misattributed is when people assume the answer is binary, in that it must be true or false.

For instance, imagine I say that someone wants to legalize marijuana because they just want to smoke it themselves. That’s a logical fallacy, arguably either a Non-Sequitur, or a Strawman fallacy, depending on how it was presented, because it’s entirely plausible that such a statement is not true.

Click Image for more info

However, that doesn’t mean it is automatically false, either. And this is where many people who correctly cite the argument as logically fallacious fall into their own logically fallacious hole, by assuming it must not be true.

What may be logically fallacious may still be more likely than not, or at least plausible. It’s a logical fallacy because the person who made the argument argued as if it must be true, when it’s merely plausible.

So I applaud everyone for trying to be a better debater, or for educating people (and themselves) on logical fallacies. It’s just important not to go down your own logically fallacious hole doing it.

A Critical Look At Political Correctness, the Easily Offended, and Why We Should Change This Culture

Political correctness is a term that typically evokes annoyance and even hatred from almost anyone who hears it. Yet despite this nearly universal disdain, political correctness seems to be as pervasive as ever.

As an example, in 2017, the TV show Bates Motel, an adaptation of Alfred Hitchcock’s 1960 epic thriller Psycho, opted to rescript what is arguably the most famous scene in movie history. The story is about a man (Norman Bates) who suffers from multiple personality disorder. Aside from his own personality, he would also take on the persona of his mother, a psychopathic killer who would murder women she felt were immoral.

Psycho (On Set)

When Norman became his mother, he would often dress up as her, and in the original and now famous shower scene, where a young woman is stabbed to death by Norman during a schizophrenic episode, he was wearing his mother’s dress.

However, the Bates Motel showrunners, seemingly for fear of offending the transgender community, opted not to have Norman (played by Freddie Highmore) wearing his mother’s clothes. The argument was that they didn’t want to paint transgender people in a negative light. On the face of it, this can sound fair, but political correctness always does at first.

Freddie Highmore as Norman Bates – A&E Series Bates Motel

The first issue should be glaringly obvious: Norman Bates wasn’t transgender, he was schizophrenic with multiple personality disorder. He wasn’t a man who identified as a woman; in his mind, he was his mother. So the showrunners, for fear of offending people they weren’t even depicting, made the scene less accurate out of irrational fear.

The referenced article above shows the writers clearly understood this, but the fear of being attacked by those who misunderstood the show’s intent was so great, they decided not to risk offending anyone.

In general, the idea of political correctness can be broken down into a couple of camps.

  • One is a selfless reason—you don’t want to offend someone because you’re a good person, and you just don’t like offending people.
  • The other is selfish—you’re concerned it might harm your brand or business if people happen to be offended. You don’t so much care that they’re offended, but if they make a lot of noise attacking your business (or you personally), you’re concerned it could harm you financially. The above example falls into the latter camp.

If either camp is genuinely trying to avoid offending people, why is this a problem, then? Shouldn’t that be a good thing? The answer is a little murky, but let’s dig into the dirt a bit.

The Straw Man Argument

You may have heard of the logical fallacy known as the straw man argument. If not, click the video above from PBS. But the Straw Man Fallacy principle also applies to those who are easily offended.

Imagine I said, “I like Gary Johnson, the Libertarian candidate for president in 2012 and 2016.” Full stop. Now imagine a Trump or Clinton supporter who hears my statement, then gets offended and responds to me, “Oh, so you think Hillary/Trump is a bad person then? You’re a horrible person.”

Hopefully you see the problem here. I didn’t say anything about Hillary or Trump, and it’s genuinely quite possible I like all three people. So they’re mad at a straw man version of my argument, not what I actually said and intended.

This is why being easily offended is often the problem of the person who chose to mischaracterize your argument and be offended by it, not the problem of the person who said it.

For this reason, it’s important we not coddle such people, and give their behavior credence. They’ve made a mistake, and condoning and/or excusing that mistake doesn’t help anyone. Worse yet, it creates a whole new problem.

Factitious Disorder Imposed On Self (Munchausen Syndrome) is a condition where people claim to be ill in some way, when they’re either making it up, or they’ve actually harmed themselves, in order to gain sympathy for their illness from people who don’t know they’ve done it to themselves.

Many people who claim to be offended may not actually be offended per se, but much like those who suffer from factitious disorder, they’ve learned that by proclaiming they’ve been offended on social media or some other public forum, they gain sympathy from their followers, fans, or friends. They’re being conditioned to be offended by things going forward to attain even more attention (sympathy), creating a downward spiral of dishonest dialogue, fake outrage, and people who are afraid to speak their minds.

So just by virtue of it not even being honest outrage, or an honest assessment of the thing that outraged them, it’s already an illogical and potentially immoral condition. But this isn’t where the negatives end.

The Wisdom Of The First Amendment

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

U.S. Constitution: 1st Amendment

As most people know, the first amendment of the US Constitution wasn’t written so we could discuss the weather freely, or say only things we all already believe. Our founding fathers understood you should have the right to say something offensive if it’s what you truly feel or believe. You should also be encouraged to speak truth to power when leaders say things that simply aren’t true.

This was of course about freedom from prosecution by government for saying such things, but the logic of protecting that speech is important outside of first amendment constraints as well. If people are afraid to speak their mind, you’ll never learn what they’re thinking. They might have ideas that could change the world, or at least maybe your world view—hear them out.

Martin Luther King Jr., for instance, was saying things we understand to be true and uncontroversial now, but that were quite controversial then. So much so that he was murdered over them. But you can go a lot further back in history to see why this is important. Galileo, for instance, was famously convicted of heresy and placed under house arrest for his arguments about the nature of our solar system. He described heliocentrism—the idea that the sun, not the Earth, sits at the center of the solar system, contrary to what the Catholic church held at the time. Not only is this not controversial now, only the most delusional of people think it isn’t fact.

Galileo

While some things may be controversial forever, many things that are edgy today will almost assuredly be commonplace tomorrow, and that’s typically a good thing. People are often afraid of change, but adaptation is the key to survival, and free speech is key to having the discussions that help us evolve our way of thinking as time goes on.

Political correctness and being easily offended are the biggest detriments to these discussions, and reasonable people should make an effort to ensure such discussions aren’t quashed by aggressive social justice warriors.

As for how to fix this, the answer isn’t attacking people verbally with insults; that won’t win over hearts and minds. Technically, I’m arguing that you do nothing. No really, don’t do a thing. If someone gets upset and demands apologies because they were offended, don’t say a thing. Let them realize no one agrees with them by not agreeing with them.

If you see attention-seeking behavior like this in general, the best thing you can do is simply not respond to it. It’s like the urban legend version of Ferberizing a baby, letting them cry it out alone, but with adults. (Ferber didn’t actually argue for letting kids cry it out. His actual arguments are here.)

These people are seeking attention. If you don’t give them any, they will be conditioned not to waste energy on an ineffective technique. We made it effective in the first place; we can make it ineffective, too.

Internet Troll

There will surely be a knee-jerk reaction to respond, either by giving in (if you’re not buying into my idea that it’s a problem), or by trolling and lashing out at them for behaving childishly. You would think those options are opposites, but the fact is they’re both attention. And if you respond negatively in an effort to get them to "grow up," others who don’t share your view (and mine) will sympathize with them even more because you were such a meanie to them.

Now that we’ve talked about how to stifle the politically correct and easily offended, how do we promote the reasons for stifling them in the first place?

The answer is also pretty simple: talk. Don’t yell or attack; have respectful discourse with people. If you’re the type to avoid discussions that might get contentious, don’t avoid them. If others can’t respond in kind, then again, go back to not responding.

You can also stand up for facts. If someone says something you know isn’t true, chime in respectfully, and let them know they may be incorrect. Cite sources for extra credit. If at any point the conversation devolves, again…walk away. If enough people do this, eventually, reasonable discourse can and will prevail.

Analysis: Are Science and Scientists Often Wrong?

When debating some controversial science claim, I’ve often heard people argue that “scientists are always wrong.” Usually it’s from those arguing for some “thing,” medicinal or otherwise, that’s supposed to make your life better, but seems to fly in the face of science, or at least isn’t backed by any reputable study.

For example: people arguing marijuana (or at least some of its chemical constituents) kills cancer, but “western medicine” wants to keep you sick with things like chemotherapy, so they’re suppressing the evidence. Something I largely debunked here with just a little critical thinking. So I don’t need to rehash that specific point again.

But what I do want to cover is the notion that scientists are often wrong. If you were to ask this question and require a simple "yes" or "no" answer, the answer I suppose is "yes, yes they are." But that’s partly by design, and this is an important part few seem to understand.

If you were to ask most people outside the science community what science is, they’d probably conjure up people in a lab with beakers mixing chemicals together, and hoping that by combining bleach, marijuana, gluten-free wheat, and organic apple seeds, somehow, you’ll have a cure for any particular rare condition that ails you.

But what is science really? It’s a method—thus the moniker “the scientific method.” It’s a means by which you can most likely find the truth about something.

This is WAY oversimplified, but it basically goes like this:

  1. You observe something in the world, and go “Hmm?” Emphasis on the question mark.

This is how science starts—people have questions.

Non-scientists will often answer them with something complicated and/or supernatural like gods or aliens, if they’re struggling to find a more natural answer to their question. Others just make a random guess based on what they think is most likely the best answer, and go with it, evidence-be-damned.

Why?

Because science is hard work, and moving on past this phase requires far more than just imagination.

Scientists, however, will assume nothing until there is evidence of something. So if they’re compelled to answer the question, they’ll move to phase 2.

2. You gather as much evidence as you can on the thing you saw.

From this point forward, we separate the scientists (or skeptics like myself, since I’m not a professional scientist) from the non-scientists, because non-skeptics/scientists stopped after phase 1 when they opted for a guess.

If there’s no evidence to gather, sadly your work here is done, and you must accept that you don’t know. Think of cryptozoology, like Bigfoot "experts" or ghost hunters and such. They have no evidence to test (like an actual Bigfoot to observe and examine, alive or dead), yet they make claims anyway, which are always pure speculation.

So whatever they’re doing, trust me, it isn’t science. Using scientific words, and scientific equipment doesn’t make one a scientist, following the rules of the scientific method does.

3. If you are able to gather evidence, you form a hypothesis, what a layperson might call an “educated guess,” based on the evidence you’ve gathered.

This forming of a hypothesis is different from a guess, in that it is based on the evidence you’ve gathered so far, and none of the evidence gathered should be contrary to your hypothesis.

A guess is often just what you think is most likely, but isn’t always weighed against the evidence you have. You see this often in political or religious debates, where people have an ideology, and any evidence they’ve gathered so far, if it doesn’t support their ideology, is thrown out as if the evidence must somehow be flawed. It’s a process called confirmation bias, and sadly we all do it. Especially if we’re not even aware it’s a thing, and that we should avoid it.

4. Here’s where those beakers might come in. Time to do some testing.

Now here’s the interesting part. If you’re a scientist, you try to prove yourself wrong. Yeah, I said it—WRONG. It’s a principle called falsification.

If you can’t disprove (falsify) your hypothesis, then you assume you have a potentially true hypothesis. Professionals will try to get such findings published in a peer-reviewed and reputable journal, then hope other scientists in their field will test it.

Know what those others will do?

You guessed it, try to prove the hypothesis wrong as well. Not because they want the first scientist to be wrong, or are their competition, but because that’s just how it works.

So why try to prove it wrong, versus prove it right? Derek Muller from the highly-respected YouTube channel Veritasium made an excellent video explaining why, in a pretty unique presentation. I encourage you to watch it. It will make you think differently, if you don’t already think this way.
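To make falsification concrete, here’s a minimal, purely illustrative Python sketch. The trial numbers are hypothetical; the point is the posture of the test. We try to knock the hypothesis down by asking whether blind chance could produce the observed "effect." If chance matches it easily, the hypothesis is falsified as likely noise; if chance almost never matches it, the hypothesis survives this attempt:

```python
import random

# Hypothetical trial: 50 treated vs. 50 control patients (made-up data).
# Falsification posture: assume the treatment does nothing, and see how
# often random shuffling alone reproduces the observed difference.
random.seed(42)

treated_recovered, treated_n = 34, 50
control_recovered, control_n = 22, 50

observed = treated_recovered / treated_n - control_recovered / control_n

# Pool every patient outcome, then repeatedly deal them into two random
# groups, as if the treatment labels meant nothing at all.
outcomes = [1] * (treated_recovered + control_recovered)
outcomes += [0] * (treated_n + control_n - len(outcomes))

hits, trials = 0, 10_000
for _ in range(trials):
    random.shuffle(outcomes)
    chance_diff = (sum(outcomes[:treated_n]) / treated_n
                   - sum(outcomes[treated_n:]) / control_n)
    if chance_diff >= observed:
        hits += 1

print(f"Observed difference: {observed:.2f}")
print(f"Chance alone matched or beat it in {hits / trials:.2%} of shuffles")
# A tiny percentage means this attempt to falsify failed, and the
# hypothesis survives (for now). A large one means it's probably noise.
```

This is the same logic, shrunk down, behind the statistical tests used in real trials: the scientist’s job is to give chance every opportunity to explain the result away.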

How often are scientists wrong?

Of all the sciences, one of the most rigorously tested would surely be biology, specifically pharmacology, or medicine. As this story reports, approximately 1 in 5,000 drugs actually makes it from concept to FDA approval, which means the other 4,999 were effectively falsified. Those seem like pretty horrible results, for sure.

It’s not all bad, though. One of these drugs, for instance, was sildenafil, the active ingredient in Viagra. It was initially meant for the treatment of blood pressure, and through clinical testing proved ineffective for that purpose, but highly effective at "pitching tents." Serendipity at its finest, since Viagra has proven far more profitable for its maker than the blood pressure medicine likely would have been.

But such serendipity is simply an added benefit of rigorous testing, and the proper documentation of all findings. Science is technically always about the unknown. You can ignore things that don’t fit into your desired outcome, or you can follow the data wherever it takes you and learn from it.

But with medicine, obviously lives are at stake in a pretty profound way, so the level of scrutiny there is rightfully going to be higher than any other field of science.

To a layperson, this might seem like the argument is that scientists are wrong 4,999 times out of 5,000, and this is where the “scientists are always wrong” myth starts to germinate. Not because they are wrong, but because of how science is often reported.

You see, technically, they weren’t wrong. They never made the claim you often heard. They formed a plausible idea, and then tested it rigorously to see if it stood up to the scientific method. With medicine, the number of phases a drug goes through is staggering.

Again, very oversimplified, but it’s something like this:

  1. Test it in a lab (say, in a petri dish): take some live diseased cells, put them in a dish, and see if the chemical in question kills them, or otherwise does what you’re hoping it does.
  2. Test it in animals, like rats.
  3. Test it in an animal that may be genetically closer to humans, like apes.
  4. Test it on a few healthy humans to make sure they don’t get sick.
  5. Test it on a very small number of humans to see if it helps.
  6. Test it on a medium-sized group of humans to see if you can show a statistically significant result.
  7. Test it on a large group of humans so you have a result from which confident conclusions can be drawn.
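To get a feel for how a gauntlet like this produces numbers as brutal as 1 in 5,000, here’s a toy calculation. The per-phase pass rate below is entirely hypothetical, not a real FDA statistic; the point is only that even modest attrition at each phase compounds dramatically:

```python
# Toy illustration: a hypothetical 30% of candidates surviving each of
# the seven (oversimplified) phases listed above. Not real FDA data.
pass_rate = 0.30
phases = 7

survival = pass_rate ** phases
print(f"Overall survival rate: {survival:.6f}")      # 0.000219
print(f"That's about 1 in {round(1 / survival):,}")  # about 1 in 4,572
```

Small per-phase odds multiply, which is why almost every candidate drug is eventually falsified somewhere along the way.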

Now you can start to understand why it can take 12 years for a drug to get to market. But here’s where the "scientists are always wrong" argument often comes into play: after phase 1, the findings are published. After phase 2, the findings are published. After phase 3, again, the findings are published. This is true for all phases.

Abraham Lincoln Weighing In On the Internet

Now, a reporter, website, or any other type of media who knows nothing about science, picks up the published study from phase 1, and writes a big, attention-grabbing headline that reads “Scientists discover cure for cancer,” and a straw man of the finest quality is born.

That’s because they don’t understand these results are merely a step along the road to a cure. And with respect to cancer, each one is different anyway; the tests would surely be against one type of cancer, such as lung, breast, or prostate cancer, not cancer as a whole.

A year later, when this substance fails phase 2, another reporter reports that the same substance is now shown to be ineffective at curing cancer. And the public is left thinking the scientists screwed up—they didn’t.

People who know nothing about science irresponsibly misrepresented the phase 1 story, the populace (which largely isn’t scientists) didn’t know how to decipher the misleading clickbaity headline, and voila: "Scientists are always wrong."

You can also find this notion with people who are skeptical of larger theories, like the big bang theory, or evolution. They’ll point out that “evolution doesn’t explain how life started” or other things we don’t know yet.

But what such people seem not to understand is that large theories have a couple of important facets they aren’t considering.

First, think of a particular scientific theory as a puzzle depicting Albert Einstein standing in his study. Your puzzle has a thousand pieces, and you’ve so far correctly placed 950 of them. You can clearly see it’s Einstein in his study at this point, but there are a few small details (missing pieces), maybe a few books on the shelf in the background, that you can’t yet identify. You’re still not sure about those details, and they could change the picture significantly, but it’s much more likely they will not, instead just filling in those small blanks.

With evolution, for instance, this might be the fact that it isn’t understood how non-living organics (carbon-based substances) became living organisms (carbon-based life forms). Just because we don’t understand that facet doesn’t mean the other "950 pieces" we do understand aren’t true, or are suspect.

The other important part to understand about a theory, is that it’s a theory instead of a law, because it isn’t entirely observable. We can see the effect of gravity on something and measure it accordingly, so that’s a law.

Charles Darwin: Author of On the Origin of Species and impetus for the Theory of Evolution

But with evolution or the big bang, we can’t go back in time and watch it happen. So all we can do is theorize based on data we have, and try to recreate the event in some small way so we can observe it. From there, we can make a fair assumption the theory holds true if replicated.

Since skeptics of these theories are often religious, they’ll refute science with the Bible, Quran, or other religious works, as if we should assume such works are true. But almost all claims made by modern-day scientists which contradict religion have a mountain of evidence supporting them, to the point that even the pope has acquiesced to many of them, as reported here. And it’s important to understand that such religious works aren’t supported by evidence either, as far as we know. We can’t go back in time and observe them being written, nor do we have any supporting documents to back up their claims. One delusional person thousands of years ago could literally have written such a text and sold it to a larger group of people as truth, a religion could have been born from it, and we’d have no way of knowing. So assuming such religious texts must be right on the subject of gaps in scientific knowledge does not follow any reasonable logic.

So are scientists always wrong? Of course not. Through the course of their methods, they form hypotheses which they often prove wrong, but by the time they reach the point of making a claim, they are demonstrably far more correct than any other group of people on the planet. Be a skeptic and question everything, including science. But proper skepticism should lead you to find that the scientists did their part correctly; the errors crept in as that information made its way to you.

Diversity of Career – There’s More To Life Than a College Degree

As memes become a pervasive influence in our lives, it’s important to always approach them with a healthy dose of skepticism. A good meme will cite sources that are credible, but many like this one largely just espouse a particular ideology, and aren’t exactly subject to fact checking. For the most part, they’re just intended to appeal to your emotions.

Nonetheless, it’s still worth questioning whether the premise even makes sense, and this one is a perfect example of a meme that feels good, especially to someone like Mark in the meme, but is severely flawed in premise, despite the strong hint of truth in it.

So first, let’s address the logical argument to be made here.

In this country, it can be difficult to find a good career where a college degree isn’t required, even if the degree is unrelated to the career you seek. I find this notion of "any degree will do" infuriating, but sadly my beloved free market, coupled with ignorance, likely created it. Although I think a fair argument could be made that a lack of decent tort reform by government hasn’t helped either, which I’ll address as well.

Let me provide an anecdote to explain the hint of truth in this meme. I have a friend I worked with at a local GM car dealership. By law (don’t get me started on this one), dealers are required to be independent of the factory, so there can be no factory-owned GM, Ford, or other marque dealers. Tesla, to their credit, took to fighting this, but they didn’t win.
Because the factory and the dealer are independent of each other, when someone takes their car in for a warranty repair, the dealer will fix it, then charge the factory for what they’ve done. The factory then will audit and sometimes inspect the parts replaced to be sure the dealer is not performing unneeded repairs, or fraudulent claims.

My friend had over a decade of experience working the parts counter at our dealership. He was intimately familiar with why parts were replaced under warranty, and how to assess the condition of those parts. This knowledge made him highly qualified to review and audit warranty claims for General Motors. So he applied for the corporate job, and was refused consideration because he had no college experience, despite his clear knowledge and expertise on the issue.

The person who was hired by GM to do this job did have a degree, entirely unrelated to automotive repair, and had almost no mechanical repair experience to speak of. As a result, their ignorance made it easy for us on the dealer side to take advantage of them, highlighting the flaw of choosing a college degree over relevant experience. It’s a bad business decision that can cost a company a lot of money.

Recently, as CNBC notes here, many tech companies are starting to see the light and have removed the college degree requirement, understanding that there were a lot of talented people they were passing up: people who went to trade schools, or who simply have work history relevant to the field. Kudos to them for recognizing this practice was less than ideal.

As for the tort issue: in some instances, if a company is worried about getting sued, having someone who is "qualified" by virtue of having a degree may help indemnify them. You see something similar in the auto repair field, where a shop will hire an ASE-certified mechanic over one who doesn’t have such certifications, even if the latter is clearly more knowledgeable, because if a car is improperly repaired, the shop can at least say they hired someone who was "certified."

But what should matter is simply whether the person did the job correctly, and whether they had enough relevant experience in auto repair to be trusted with such repairs.

The truth of the meme is that skilled trades can indeed pay quite well, often beating their college-educated counterparts when the degree is in something with a very small job market. When is the last time you saw a job opening for a philosopher, after all? This is not meant to attack philosophy; it’s an important way of thinking. It’s only to say that as a stand-alone skill, it doesn’t offer up many career opportunities outside of teaching philosophy.

The issue is a little deceptive, however, because if many of these degree-holding future employees pursued a career related to their degree, they would likely land a more lucrative career. But sadly, I’ve known many who simply got a degree to have a degree, because it would help them find a career, and they took the path of least resistance by getting what they felt was the easiest degree possible, with no real interest in the field.

The problem with this meme is that it does the one thing it’s trying to prevent: it demeans people who chose an alternative path by going to college for a degree they might actually want to pursue a career in.

So instead of creating a divide, it would be better to understand that just as genetic diversity is incredibly important for the advancement of life, diversity of career creates a stronger economy, one that isn’t in danger of collapse because it relies too heavily on any one skill.

We need college-educated people to do things like cure diseases and engineer safe buildings. But we also need skilled tradespeople to do things like fix our cars, electrical systems, and plumbing. We need people to mow our lawns, stock shelves, and do the things which require almost no training at all, too.

If you vilify any one of those people for their life choices, you’re behaving poorly. We need them—all of them, to do their thing. If you don’t want to do those things yourself, be thankful someone else is willing to do them for you.

If you want to lash out at someone, may I suggest these three groups instead?

1) The person who could be productive and do one of those jobs, but instead opts to live off the taxpayer, live off their parents, or just become a vagrant.

Photo from a story on how to be homeless, which outlines effective survival tactics for those who simply choose that life. Click photo for details
To be clear, I’m not talking about the people who can’t work because of a disability. I’m referring to those who refuse to do their part in society, and worse yet, expect others like me to subsidize their lives.

2) The person who gets an education in a field they have no intention of using, gets poor grades doing it, and did it because they just didn’t want to work, or weren’t driven enough to commit to a career path. They wasted their parents’ money, or racked up student loans for a degree they’ll basically never use. That’s a bad investment at best, and potentially depraved indifference if they’re willingly wasting their parents’ money. Their parents worked hard for that money, and probably could have used it for something more worthwhile if their child hadn’t recklessly wasted it.

Since some do it on the premise you need any degree to get a good job, this leads me to my third group…

3) Employers who put people with an irrelevant degree (in relation to the job they’re seeking) above people with real-world experience in that role but no college degree. Those college-educated employees have shown they’re uncommitted, and that they make poor life choices. The person who started working at 18 has shown that they just want to work.

One thing is for certain: whatever makes someone happy and solvent should not be condemned if it isn’t harming others, especially when they’re being a useful member of society. Diversity is always good, and all levels of career are required for a successful society—we can’t all be doctors or lawyers and expect the world to sort out its own problems. If you find yourself looking down on people who do or don’t have a degree because they’re different from you, the problem isn’t them, it’s you.