This problem concerns something pretty well known from our high school education – the quadratic formula. In a follow-up post, I will present the solution through a shortened version of the process by which a mathematician might have gone about finding it.

Equations are among the most important objects in mathematics. In algebra especially, solving equations is a central task. We solve equations for many reasons – they can help us deal with our finances, they tell us through many scientific laws how the world around us behaves, and they can help us predict certain future outcomes. It should not be surprising, then, that one of the broadest and most important questions in mathematics is: what kinds of equations can we figure out how to solve?

The first equations we learn how to solve in algebra class are one-variable equations. These are equations like $3x + 5 = 2$, or more broadly $Ax + B = C$ for any numbers $A, B, C$. We learn early on in algebra how to solve these. For the equation $3x + 5 = 2$, we first subtract 5 from both sides and end up with $3x = -3$. We then divide both sides by 3 and conclude that $x = -1$. If we use the same process on $Ax + B = C$, we can conclude that $x = \dfrac{C-B}{A}$ (so long as $A$ is not zero). This is a complete solution to this type of equation.
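The two algebraic steps just described – subtract $B$, then divide by $A$ – can be sketched in a few lines of code. This is a minimal illustration; the function name and the explicit zero check are my own additions:

```python
def solve_linear(a, b, c):
    """Solve A*x + B = C for x, mirroring the two algebraic steps above."""
    if a == 0:
        raise ValueError("A must be nonzero")
    return (c - b) / a  # subtract B from both sides, then divide both sides by A

print(solve_linear(3, 5, 2))  # the example 3x + 5 = 2 gives -1.0
```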

The next most challenging type of equation is called quadratic. These are equations that involve regular numbers, $x$, and $x^2$, which is $x \cdot x$. For example, the equation $x^2 + 3x + 7 = 0$ is a quadratic equation. Before we move on, it is natural to ask why quadratic equations would matter, and why we would try to solve these first instead of some other type of equation. The reason that quadratic equations are the “next easiest” to solve is that they involve nothing more difficult than multiplying $x$ by itself – and only once. One very good reason to care about them is that the graphs of quadratic equations, called parabolas, are exactly the shape traced out by objects thrown through the air. Because motion under constant gravity – as in Newton’s physics – is described by a quadratic equation in time, understanding quadratic equations lets us understand how projectiles move through the air.

With equations like $Ax + B = C$, we developed an overall solution of $x = \dfrac{C - B}{A}$ that will always work, no matter the values of $A, B$ or $C$ (so long as $A$ is not zero). The question of the quadratic formula is to try to find a similar solution to the quadratic equation

$ax^2 + bx + c = 0.$

As an example, we could ask what all the solutions to $x^2 = 4$ are, and intuitively we arrive at the answers $x = 2$ and $x = -2$. But the equation $x^2 = 0$ has only one solution, $x = 0$. We learn from this that we are sometimes looking for two solutions – not necessarily just one – and so we want our quadratic formula to tell us about both solutions whenever there are in fact two.
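As a preview, this goal can be sketched in code. This is a hedged illustration only – it uses the well-known quadratic formula, which the upcoming post will actually derive, and the function name is my own:

```python
import math

def quadratic_roots(a, b, c):
    """Return the real solutions of a*x^2 + b*x + c = 0 (assumes a != 0)."""
    disc = b * b - 4 * a * c     # the discriminant decides how many solutions exist
    if disc < 0:
        return ()                # no real solutions
    if disc == 0:
        return (-b / (2 * a),)   # one repeated solution, like x^2 = 0
    r = math.sqrt(disc)
    return ((-b + r) / (2 * a), (-b - r) / (2 * a))  # two solutions, like x^2 = 4

print(quadratic_roots(1, 0, -4))  # x^2 = 4  ->  (2.0, -2.0)
print(quadratic_roots(1, 0, 0))   # x^2 = 0  ->  (0.0,)
```

Note how the discriminant $b^2 - 4ac$ is what distinguishes the two-solution case from the one-solution case discussed above.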

This is about all you need to understand what our goal is – to find a systematic way to use the numbers $a, b$, and $c$ to find solutions $x$ to the quadratic equation $ax^2 + bx + c = 0$. In an upcoming post, we will show how this problem can be solved.

## Critical Thinking Toolkit: Don’t be a “Mere Skeptic”

This is the last of the initial series of discussions I am putting forward about evaluating premises in an argument – or more colloquially, evaluating anything that somebody tells you is true. We’ve gone through a variety of nuances so far – about properly evaluating probabilities, when people do and do not need to justify what they say, and even the nature of knowledge itself. A lot of this is based upon not coming to conclusions too quickly. This post is exactly the opposite.

There is such a thing as being too critical. As many have done before, I give such people the label ‘skeptics‘. The reason I do so is that skepticism refers generally to being doubtful about something – normally until sufficient evidence is provided, of course. But skepticism can go too far. Flashing back to a previous post, being skeptical of properly basic beliefs is, I would say, either a sign of philosophical ignorance or intellectual dishonesty. If you have not read my posts that discuss properly basic beliefs, imagine a person trying to convince you that they don’t exist, or that you don’t exist, or that the buildings and trees and animals around you don’t exist. Properly basic beliefs are, very broadly, those things that are at this level of undeniability. To attempt to reject these is taking the principle behind skepticism much too far.

So then, what are the implications of skepticism for evaluating arguments? The difficult part of this question is defining exactly how much evidence is required to justify believing something – that is a hard problem, and I am not going to attempt it here. It is easier to lay out some criteria of things to avoid. So, let’s lay out a few things to avoid in “being skeptical.”

(1) Make sure that it is at least possible for someone to give you enough evidence to convince you.

(2) Defeating an argument requires bringing forth an alternative.

I will explain each point in order. The first point ought to be obvious, I think, but there are people who have de facto rejected this. Let’s give an example. Michael Shermer is a prominent atheist speaker and debater, and he promotes what is colloquially known as Shermer’s last law – that “any sufficiently advanced extraterrestrial intelligence is indistinguishable from God [1].” If you just think about this for a minute, you realize that one implication of the claim here is that any evidence that could point to God’s existence also points to extraterrestrial intelligence, and this serves as an escape route. To see what I mean – suppose that God does in fact exist. Not necessarily the Christian God, but some “capital G” God. Then Shermer’s last law makes it impossible to produce evidence of this God’s existence, because any and all evidence is explainable another way – via extraterrestrial intelligence. Therefore, Shermer’s last law produces a clear bias against the idea of God existing. This contradicts (1). We should never do this. Whatever way we view the world, we should not believe anything that prevents us from being corrected if we happen to be wrong.

The second point is easier to explain. You can’t really disprove a person by saying “well I think you’re wrong” every time they say something. You can’t just say no to everything – there must be something on the basis of which you say no to other things. You can’t honestly say “no” to something unless you believe “yes” about that thing. In the context of religion, you can’t unequivocally reject Christianity unless you can establish something to take its place – and you can’t reject atheism unless you can establish something to take its place.

Everything included in this article can be summarized quite quickly – make sure that nothing you currently believe prevents you from being corrected. If something does, then that belief is probably incorrect. Christianity provides multiple routes of disproving it, as does every organized religion that I know of, as does classical atheism. Let’s all do our best to be honest about what we believe and why we believe it.

## Current Happenings and Future Plans for the Blog

Life has been quite hectic lately. It’s increasingly difficult to find time to write on the blog. I should have more time to write a few weeks from now, and once that time comes I should be able to make progress on a lot of the ideas I’ve been wanting to develop here. For the time being, I thought it would be fun to write about what my plans are going forward and to offer an opportunity for my readers to make suggestions of areas that I can write on in the future.

What’s Going on Now

My Mathematical Studies

There is quite a lot going on right now in my mathematical work. I’ll summarize the many converging areas in which I am involved right now.

The most natural place to begin is with my own “education” in the sense that most people would use the word – my coursework. Since it is now the summer, I do not have normal semester courses going on, but I do have a four-week summer class. The reason? In graduate schools (I believe all of them, but I can only directly speak for mathematics), there isn’t exactly a clean organization in terms of years. That isn’t the best way to think about it. You might instead think of graduate school in terms of three “barriers”: the qualifying exams, the research proposal, and the thesis. To give a very brief metaphor for these stages, ‘Stage 1’ might be like learning about the world at large, ‘Stage 2’ might be like going deeper into a continent or a large chunk of one, and ‘Stage 3’ might be like learning more about a particular city or state than anyone has ever done before. I am currently on the cusp of finishing ‘Stage 1’ – at my school, we call the end of this stage of learning the qualifying exams. Without passing this stage, you cannot get a Ph.D. At my particular institution, three categories of qualifying exam are offered twice a year, and we must pass two of the three to pass into ‘Stage 2’. I have already passed one exam – the Algebra Exam – and my summer course is a preparatory course for what is called the Analysis Exam. So, I am spending a lot of my time preparing for this exam.

I am also working part-time as a teaching assistant at an REU – which stands for ‘Research Experience for Undergraduates’ – at the University of Virginia (being held virtually, since we are still in quarantine). This is a six-week program that assembles many of the brightest mathematical minds of college and high school age. The goal of the program is to provide mentorship for these bright young men and women and to guide them towards publishing cutting-edge academic papers. Working with these young mathematicians (who really aren’t that much younger than I am, and all of whom have much more natural talent than I do) is such a pleasure. They move so quickly that I sometimes can’t quite keep up with them, and it is an absolutely wonderful learning opportunity for me – both in learning more advanced mathematical concepts and in learning more about education.

Finally, there is a research component to my job. I have been writing papers to publish in academic journals fairly regularly since entering graduate school (if you want to see my publications, you can check the link at the end of the article, though this link might be changing sometime in the near future). All of my recent work is closely related to the theory of modular forms – an area of study primarily within number theory, but one that also has applications in the study of black holes and many other areas in academia. My writing so far has focused on partition functions – which can be thought of as studying the various ways in which numbers can be written as sums of smaller numbers – though I also helped write some very exciting papers on Ramanujan’s tau function, in a line of work pioneered by my advisor Ken Ono.

There is also a networking aspect to my job – sadly, this can’t really happen during a global quarantine, so I cannot comment much on any of this.

Critical Thinking Series

I have also recently begun a series of posts about critical thinking that will go on to be quite a long series. Sadly, as far as I can tell, we don’t really teach good critical thinking anymore – and if we do, there is no evidence whatsoever of this in the public sphere. Almost every time I see commentary on some important public event, at least some important people – and very many “regular” people – speaking about it make fairly blatant logical fallacies or major miscues in some philosophical area like ethics, epistemology, metaphysics, or theology. And these miscues, when there are a great many of them, contribute to a culture that no longer knows how to use proper critical thinking.

Therefore, I have decided to devote a lot of time to this series, because if critical thinking and logical skills are mastered, then a great many blunders can be avoided immediately. I recognize, to be fair, that I also fall prey to bad thinking sometimes – but regardless of your political party, religion, race, or anything else, you will fall prey to far more fallacies in your thinking when you don’t know they are fallacious. Because I know my own thinking will sometimes fall short, I want to be surrounded by people who will call me out when it does. This is how growth works, and it is, I believe, an essential component of what we need to do in order to solve many of the largest problems we face as a society.

I hope that series will be helpful and educational for all who read it.

What’s Coming

Database Series

This is a project that has been in the works for a long, long time, and still is not quite ready yet. But I look forward to the day that it is ready.

The database series will not exactly be articles in the way that I usually write. Basically, the goal of this series is to compile large lists of resources and data that would be useful in understanding various problems. So far, I have begun compiling two such databases, neither of which is quite ready yet. In one of them, I have used a “Top 100 Mathematicians of All-Time” list and tried my best to determine the religious beliefs of these men and women. The reason for this is that there is a fairly prevalent theme in a large chunk of modern society that religious people are anti-science and always have been, and so I thought that such a review might be at least one piece of data that would be relevant in evaluating whether such a claim is accurate or not.

I am also working on a database of apologetics ministries through which those who want to investigate religions intellectually can look. I do my best to only list a ministry that I feel is both intellectually honest and helpful. The list is primarily of Christian ministries, but I will also include at least one Muslim speaker and one atheist speaker who I have gained a lot of respect for as I have listened to them.

The database series will also be updated as I come across more information. So, unlike my regular posts, these posts will be updated over time when I find it appropriate to do so. I am hoping that these will be helpful – perhaps not as much at first, when there are only one or two of them, but over time, for people who are curious to learn more.

Book/Article Summary Series

I’m not quite sure what I want to do with this yet, but I want to provide an ongoing series in which I summarize some of my favorite intellectually significant books and books that have had a large influence on culture. I expect it will be quite a while before this series really gets going, but probably before the end of 2020 I will have done one or two of these.

How Mathematicians Think

This is a series about mathematics that I have wanted to write ever since the blog began, but which I haven’t quite figured out how to write. What I want to do with this is to attempt to explain how a mathematician thinks about math. I want to answer questions like the following:

• What does it mean for mathematics to be new?
• What are the different fields of math (analogous to physics, chemistry, biology, etc. in science) – and why do we think of them in that way?
• What does it mean for an idea to be an ‘example’ of another idea?
• How do we shift between different mathematical fields and ideas?

These, and many more, are the sorts of questions I hope to address in that series. Once I feel satisfied that I have done a good enough job with my goal for the series, I will begin writing it.

Understanding Calculus

This coming semester at UVA, I am going to be an instructor for Calculus 1. Although I have tutored dozens of people in these topics and have been a teaching assistant in this class before, this semester will be the first time I have taken the lead instructor role, and it’s an exciting step up.

It is, however, a lot of work. Calculus is a different sort of conceptual entity than much of the math that comes before it. By that I mean that it isn’t like calculus is just “algebra 3” or “geometry 2” or “trigonometry 2”. It uses algebra, geometry, and trigonometry in some important places, but it has its own unique elements.

As an instructor, this “newness” means that an important part of what I need to do for my students is to ground these new concepts clearly for them and lay out good, clear examples of how to use these new tools to solve new problems.

In light of this responsibility of mine as a calculus teacher, I want to take this sort of education to the blog. I’m still working on exactly how I want to present the material, but my goal is to write a series of discussions that enable someone with essentially no background in mathematics to understand what calculus is all about and what we can use it for. For example, I will go through how calculus is used to discuss important concepts in physics like areas, speeds, acceleration, and smooth surfaces.

The Kalam Cosmological Argument

I’ve also been thinking a lot about how to begin writing about apologetic arguments, and I have decided to begin soon on a fairly long reading list that will enable me to discuss in detail what is known as the Kalam cosmological argument for God’s existence.

I’ve discussed this argument briefly elsewhere in the blog, but I think for a few reasons that this is a good place to start. In discussing the Kalam, we get a good example of both a rigorous deductive argument and of an inference to the best explanation. Going through a full discussion of the Kalam is also intellectually quite rich – it involves discussing metaphysics, philosophy of time, philosophy of science, developments in modern physics, and theology. Off the top of my head, no single line of reasoning in natural theology has such a diversity of deep discussions readily accessible on the surface. This is probably why the Kalam is my personal favorite area of study in natural theology.

So, you can expect that, a few months from now, I will be writing up a series that makes an effort to be as careful as possible in providing a thorough discussion of the various ins and outs of the Kalam. It will take me a while to get there, since I want to bone up on my reading on the subject before I write on it, but it should be a very interesting discussion.

Q&A Series

This one isn’t really up to me, but I have wanted to start doing Q&A type posts. Of course, I need Q’s to do this, and I do not yet have anyone asking me good questions directly that I can address in Q&As. Therefore, I would love for my current readers to direct any questions they’d like to see me address to my email address, mathematicalapologist@gmail.com.

I’d love to hear what my readers would like to see discussed, because one of my main objectives in my writing is to help others learn about interesting and important ideas. Feel free to contact me over email any time; I would be happy to have discussions and take suggestions from anyone.

https://people.virginia.edu/~wlc3vf/

## Critical Thinking Toolkit: Get the Categories Right

This is another brief, but important, note within our discussion of evaluating claims that people make while trying to prove a point. The shortest way to say it is that there is more than one type of claim, and you should evaluate each claim based upon the category into which it fits. This summary may be a little too vague on its own, so let me explain what I mean.

As a mathematician, I am trained to look at mathematics with certain standards. I am taught to rely on nothing more than definitions, foundational mathematical axioms, and the basic rules of logic. Because of the goal of mathematicians – to know with logical rigor things that are true about the mathematical realm – this is an entirely appropriate way to look at mathematics. I say this is appropriate because this way of looking at mathematics is designed specifically to achieve the goal of knowing, with logical rigor, the correct answer. In fewer words, the method fits the goal.

Why do I point this out? Well, not every field of academia uses the same standard. For instance, if I were to use my mathematical standards while studying history, I would learn literally nothing – I would just point out that it is at least possible that someone made everything up, and therefore, on the grounds of the rules of logic, I would reject all of history – even events that happened five minutes ago! If I apply mathematical standards to studies in literature, the exact same thing happens. Even if I apply mathematical standards to science, I will accuse the scientist of fallacy, since empirical evidence has no place within the pure rules of logic. In fact, by using these standards I would destroy all of human knowledge except for mathematics itself and a couple of areas of philosophy.

This should not tell you that everything outside of mathematics ought to be placed in doubt. It just tells you that the standard used by mathematicians is not appropriate outside of mathematics. There are plenty of other standards that are appropriate in different contexts. In a courtroom, the standard is that the evidence point to its conclusion beyond a reasonable doubt. This is still a very high bar, but it isn’t as high as the mathematical standard. Most scientific disciplines use the standard of empirical verification of theoretical predictions – which is significantly different from either the courtroom or mathematics. When studying history, you use various kinds of historical techniques that would be entirely inappropriate outside of historical studies.

So, if we are evaluating a claim that someone else makes, we must be very careful to understand what kind of claim they are making and what kinds of evidence are relevant to the question. It is possible that an idea will have multiple dimensions – for example, evaluating ancient history involves geography, literary studies, anthropology, historical studies, and archaeology (at least these – perhaps even more). Each of these areas has its own methods and techniques designed to answer very specific kinds of questions, and we should always be careful that we are taking all of this into account.

Of course, in order to do this we must take some time to learn about the various kinds of techniques and where academic professionals view those techniques as relevant. If you want to learn whether a certain mathematical technique is valid, you ask a mathematician, not a scientist: there are statements known by mathematicians to be false that the scientific method would end up regarding as true, because the counterexamples to these statements, while we know they exist, are beyond the power of modern computers to locate. A scientific approach will get you a lot of right ideas, but a lot of wrong ones too. If you want to know about scientific techniques, mathematicians and historians can be helpful, but you’re going to learn the most from a scientist. The same goes for everything else that humans study today.

This is why I take so much time attempting to explain careful methods of thinking. We will often go astray if we use the scientific method when it isn’t appropriate, or if we apply a non-scientific methodology in science. We must be careful if we want to know the truth about reality, because the deeper truths about reality are not always simple.

## Critical Thinking Toolkit: More on Basic Beliefs

In a previous post in this series, I discussed what are called properly basic beliefs. Because this is, in my experience thus far, a frequently misunderstood idea, I thought I’d add a dedicated but brief exposition of the idea and of when it is and is not appropriate.

A really great example of this goes back to the philosopher and mathematician René Descartes. In his philosophical work, he discusses a very interesting question – is there anything that can be known so surely that it is impossible to doubt? Descartes was able to rule out many extremely obvious beliefs we all hold as possible answers to his question. For example, if it is even possible that a demonic being is deceiving our five senses (which, on a purely logical level, it most certainly seems to be), then nothing we see, taste, feel, hear, or smell is impossible to doubt, because we could doubt those things if the demon really existed.

However, Descartes arrives at a now-famous saying that provides a real solution to his question. The saying is cogito ergo sum – which translates to English as I think, therefore I am. Descartes’ point here is that you cannot be deceived into believing that you exist (more specifically that you are a thinking thing) because you cannot deceive a thing that is incapable of thought. You cannot, for instance, convince a rock that it doesn’t exist. Similarly, because you have thoughts, you must exist.

This is a famous example of what I mean by properly basic beliefs. A properly basic belief would be a belief that has within itself such strong justification for its truth that there is no reason to doubt it without immense counterevidence. That we exist is a very extreme example of a properly basic belief, but there are many more – some of which I have already mentioned in this article. Here are a few examples of kinds of beliefs that I would consider properly basic:

• Our memories are mostly reliable.
• The past actually happened (the universe was not created five minutes ago with an appearance of being older).
• Our five senses are reliable.
• The physical world around us actually exists.
• I have arms, legs, fingers, etc.
• My name is (fill in your name here).
• Our own emotional states (e.g., “I am happy right now”).
• People that I meet every day actually have thoughts and feelings – that is, they have minds just like I do.

It is worth noting that most of these things actually don’t pass Descartes’ test of “impossible to doubt.” The so-called “Evil Demon Hypothesis” can be used to hypothetically doubt all of these. However, there is no real reason to think that the evil demon world is actually the way things are. You could try to ‘debunk’ many of these with talk of being in a very long dream, or that we are all living in the Matrix, or that the entire world is a computer program. But none of these things are really serious reasons to question whether I actually have arms or not. My experience of having arms is in and of itself enough reason for me to reject any claim that I don’t have arms.

For an extremely deep discussion about properly basic beliefs, see Alvin Plantinga’s work. He even points out that if Christianity is in fact true, then the witness of the Holy Spirit would make our belief that God exists a properly basic belief (I am oversimplifying here – I’ll try to write more on this once I feel educated enough on the matter to not misrepresent Plantinga). This example is much more controversial, and admittedly does depend on Christianity being true – and in fact Islam, Judaism, and Mormonism can make similar claims. But, philosophically, all this ends up meaning is that there is no “de jure objection” to belief that God exists – there are only “de facto objections.” The distinction is that you can try to show why someone is incorrect in a belief in two ways. You can try to convince them that the belief is false (this is what de facto means). The other way is more subtle – if you can convince someone who believes X that by believing X they in some way undermine their ability to rationally believe that X is true, then that is a good reason to stop believing X. This is what de jure means. As an example, a belief that leads one to conclude that humans are incapable of thought is irrational, because that belief is itself embedded within the world of thought, so to speak. The argument that Alvin Plantinga makes is, in summary, that properly basic beliefs cannot be shown to be irrational, because there is nothing “deeper” than them that can be undermined.

And this is really the point of a belief that is properly basic. It is a kind of belief that is so foundational to our experience of reality that there is nothing “deeper” we can appeal to. For anyone who wants to learn more, Plantinga discusses some of these things in his book Warranted Christian Belief and other books that appear in a series with this one.

## Critical Thinking Toolkit: Weighing Evidence with Bayes’ Theorem

When we are involved in important discussions, it is important not to take pivotal claims – others’ or our own – merely at face value. Instead, we debate the various available ideas. That is, we bring forth evidence that we have thought about and that we believe supports our position, and we listen to the evidence presented by those who disagree with us. We ask those who disagree with us difficult questions about what they believe, and they ask us difficult questions about what we believe. In all of this, the big question lies in trying to evaluate which of the two positions is better.

This is especially important if we are undecided on some idea and we are listening to two intelligent people express their opinions. Suppose, for example, that Jim doesn’t have any developed opinion on an important political topic in the upcoming presidential election. The most responsible thing Jim could do would be to seek out the best and most intelligent proponents of the major positions within his country on that issue (note that this might not be the candidates themselves) to hear the best evidence that each perspective has to offer. How is Jim to decide which position to take?

Of course, in this situation a lot depends on what the issue is. I don’t claim to provide an analysis that will work in absolutely every situation. To be more specific, I don’t claim to explain in detail what kinds of evidences are valid or invalid in different situations. However, as a mathematician, I can provide insight on a precise way to evaluate evidence once you have it – a method that is based on foundational concepts in the mathematical theory of probability.

Why Apply Probability to Decision-Making?

The first question we might ask is why we bring probability into the picture at all. You might say that since these questions have right answers, all the probabilities are either 100% or 0%. In a sense, this is correct. However, for very nearly every question we ever think about, arriving at this all-or-nothing situation would require something close to omniscience, and it would require being absolutely free of all psychological bias. A little time spent learning about psychology will debunk the idea that we are unbiased – each and every one of us carries a lot of bias. And I hardly need to justify the claim that none of us knows absolutely everything there is to know about a given often-debated question – that just isn’t possible. So, while it is “technically correct” in one sense that everything reduces down to 0% or 100%, it is entirely unhelpful to think of things this way, because none of us can actually arrive at a truly 100% answer to any complicated question.

What then do we do? One extremely helpful thing is to bring in probability theory. In mathematics, probability theory is the study of how likely things are and of the proper ways of determining how likely something is to be true (among other related ideas). Probability theory is a highly developed and sophisticated field – you can earn a Ph.D. studying tiny slivers of its broad landscape. Fortunately, we don’t need all of that machinery – the most important foundational tool of probability theory for these kinds of debates doesn’t use any horribly complicated ideas. The tool I am thinking of is called Bayes’ theorem, which serves as the foundation of such disciplines as Bayesian statistics. Before I explain how this affects how we ought to view evidence in day-to-day life, let me spend a moment developing Bayes’ theorem itself.

What is Bayes’ Theorem?

In probability theory, the most important notation of all is the way we denote the probabilities of various things. Usually, instead of writing out full sentences to describe the situations we care about, mathematicians use a one-letter shorthand for those situations. Here, I will use $A$ to denote some kind of claim that someone makes to you. When we want to talk about probabilities, we define $P(A)$ to be the probability – a number ranging anywhere from 0 to 1 – that $A$ is true. For example, if $A$ is “the coin I just flipped landed on heads,” then $P(A) = \frac{1}{2}$, which means the odds are 50-50. If instead $A$ is “I rolled a six on a standard 6-sided die,” then $P(A) = \frac{1}{6}$.

Another important idea in probability theory is the introduction of the word “not” into the vocabulary. In the context of the 6-sided die, I think it is clear enough that either I will roll a six, or I will not roll a six. Since one of those two absolutely must happen, the total probability must be 100%, or $P = 1$. Since it is impossible that both of these things happen at the same time, probability theory dictates that

$P(A) + P(\text{opposite of } A) = 1.$

If we are in Jim’s situation from earlier and are trying to decide whether a particular claim by someone else (let’s call it $A$) is true or false, we care about both of the values $P(A)$ and $P(\text{opposite of } A)$. Often, in mathematics, instead of writing out ‘opposite of $A$’ we instead use $A^c$. The letter $c$ stands for ‘complement,’ so $A^c$ means ‘the complement of $A$,’ or just ‘opposite of $A$.’ The equation above might then be expressed as $P(A) + P(A^c) = 1$.
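The complement rule is easy to check numerically. Here is a minimal Python sketch (my own illustration – the post itself uses no code) using the six-sided die example from above:

```python
# Numerical illustration of the complement rule P(A) + P(A^c) = 1,
# using the six-sided die example. Fractions keep the arithmetic exact.
from fractions import Fraction

p_six = Fraction(1, 6)      # P(A): probability of rolling a six
p_not_six = 1 - p_six       # P(A^c): probability of NOT rolling a six

# The two probabilities always sum to exactly 1.
assert p_six + p_not_six == 1
print(p_not_six)  # → 5/6
```

The same identity holds no matter what event $A$ we pick, which is why probability theorists can always trade a question about $A$ for a question about $A^c$.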

We now run into a harder question – how does evidence factor into the equation? It is certainly true that the value of $P(A)$ is related to how much evidence we have available to us and how convincing we find that evidence. Probability theory has a way to write this, too. Say that $A$ is the thing that you want to know about, and $E$ is the totality of evidence you have that relates to $A$. This is all the evidence – evidence for $A$ being true and evidence for $A$ being false. When a probability theorist wants to express the likelihood of $A$ being true based upon the evidence $E$ available to us, they write $P(A | E)$, which is read aloud as “the probability of $A$ given $E$.”

When we write things in this way, how then do we determine $P(A | E)$? This is where Bayes’ theorem comes in. This is a mathematical theorem that tells us how to compute $P(A | E)$ using other probabilities. The idea is as follows. If we want to know how likely $A$ is given evidence $E$, we want to look at how $A$ increases or decreases the likelihood of finding $E$. In other words, we want to know how well $A$ happening explains why we find $E$, and we compare that to how well the alternatives to $A$ explain why we find $E$. The mathematical statement of Bayes’ theorem is given below:

Bayes’ Theorem: The following formula is true:

$P(A | E) = \dfrac{P(E | A) P(A)}{P(E)}.$

To see how this works, say that we are trying to determine whether $A$ or $B$ happened, and we are using the evidence $E$ available to us to try to decide. Then we can use Bayes’ theorem to calculate $P(A | E)$ and $P(B | E)$, and whichever of these options is larger is the more probable explanation.
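As a sketch of how this comparison works in practice, here is a short Python illustration. All the priors and likelihoods below are made-up numbers chosen for illustration only – nothing here comes from a real dataset:

```python
# Comparing two rival explanations A and B of the same evidence E
# using Bayes' theorem. All numbers are illustrative assumptions.
def posterior(p_e_given_h, p_h, p_e):
    """Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

p_a, p_b = 0.5, 0.5    # prior probabilities P(A), P(B) (assumed equal)
p_e_given_a = 0.9      # A explains the evidence well
p_e_given_b = 0.2      # B explains the evidence poorly

# P(E) by the law of total probability, assuming A and B are the
# only two possible explanations:
p_e = p_e_given_a * p_a + p_e_given_b * p_b

print(posterior(p_e_given_a, p_a, p_e))  # ≈ 0.818 – A is more probable
print(posterior(p_e_given_b, p_b, p_e))  # ≈ 0.182
```

Notice that the two posteriors sum to 1, exactly as the complement rule demands: given the evidence, either $A$ or $B$ must be the explanation.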

Of course, in most situations it doesn’t make a whole lot of sense to use actual numbers, because in the vast majority of situations, using actual numbers is an oversimplification. However, the theorem still provides the correct conceptual framework within which to think about evidence. To see why the framework is helpful, let’s work through an example.

Winning the Lottery

Let’s say you bought a lottery ticket. You obviously are excited and are hoping to win – let’s use the shorthand $W$ to denote your ticket being the winner. Now, we can imagine thinking about our likelihood of winning the lottery in four different situations. In all cases, suppose your odds of winning are one-in-a-million exactly. (Note: The first situation is labelled ‘Situation 0’ since nothing really happens there)

Situation 0: All you know is that you have a ticket and 1-in-a-million odds. Use $E_0$ as shorthand for the evidence available to us in Situation 0.

Situation 1: Your friend tells you they think you won! You ask why, and they say, “Oh, I just have a really good feeling about it.” Use $E_1$ as shorthand for the evidence available to us in Situation 1.

Situation 2: Your friend tells you that they saw the lottery number in the news, and they remembered a few of the numbers on your ticket and that you got those right, but they don’t remember the rest of your ticket. Use $E_2$ as shorthand for the evidence available to us in Situation 2.

Situation 3: You watch the news and see the lottery ticket announced, and you check your ticket and the numbers all match! Use $E_3$ as shorthand for the evidence available to us in Situation 3.

One important thing to note upfront about all of this is that $P(W)$ is the same in all four situations – it is exactly one chance in a million. But, as we learn more and more about the winning ticket number, the way we think about our chances of winning changes. Here is how one might think about the various ways of calculating $P(W | E)$ for these situations.

• In Situation 0, we have no evidence at all, so $P(W | E_0)$ is just $P(W)$. Since we don’t have any evidence, nothing changes.
• In Situation 1, the evidence from our friend is pretty unconvincing. But, there is a slim, slim chance that they already know you’re going to win for some reason and they are just trying to conceal their excitement. This is a very unlikely scenario, but it is possible, so $P(W | E_1)$ is a minuscule amount larger than $P(W)$ – but it doesn’t really make much of a difference.
• In Situation 2, we have some concrete information from a fairly reliable source. We don’t have enough to know whether we won or not, but $P(W | E_2)$ will be much larger than $P(W)$, but will still be pretty small. Perhaps it will be something around 0.1% or so – much better than the 0.0001% we started with, but still not particularly good.
• In Situation 3, if we read the TV screen correctly, and if this was really a news channel, then we definitely won the lottery. So, $P(W | E_3)$ is almost all the way up to 100% – the only doubts left are whether we read the screen and our ticket correctly. Perhaps it is about 99.9%.

In terms of Bayes’ theorem,

$P(W | E_0) = \dfrac{P(E_0 | W)P(W)}{P(E_0)} \approx \dfrac{1*P(W)}{1} = P(W),$

$P(W | E_1) = \dfrac{P(E_1 | W) P(W)}{P(E_1)} \approx \dfrac{1*P(W)}{\text{number barely less than 1}} = P(W) + \text{very small number}.$

$P(W | E_2) = \dfrac{P(E_2 | W) P(W)}{P(E_2)} \approx \dfrac{1*P(W)}{0.001} = 1000 P(W) = 0.1\%,$

$P(W | E_3) = \dfrac{P(E_3 | W) P(W)}{P(E_3)} \approx \dfrac{0.999 * P(W)}{P(W)} = 99.9\%.$
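These ballpark calculations can be checked with a few lines of Python. The likelihoods and evidence probabilities below are rough estimates consistent with the discussion above – illustrative figures, not exact values:

```python
# Reproducing the rough lottery posteriors. All P(E | W) and P(E)
# values are illustrative ballpark estimates, not exact probabilities.
p_w = 1e-6  # P(W): one-in-a-million prior chance of winning

def posterior(p_e_given_w, p_e):
    """Bayes' theorem specialized to the winning-ticket claim W."""
    return p_e_given_w * p_w / p_e

# Situation 0: no evidence at all, so nothing changes.
p0 = posterior(1.0, 1.0)    # = p_w exactly

# Situation 2: friend remembers a few matching digits; P(E_2) ≈ 0.001.
p2 = posterior(1.0, 0.001)  # ≈ 1000 * p_w, i.e. about 0.1%

# Situation 3: every number checked personally; P(E_3) ≈ p_w, and
# P(E_3 | W) ≈ 0.999 (a small chance of misreading the ticket).
p3 = posterior(0.999, p_w)  # ≈ 99.9%

print(p0, p2, p3)
```

The striking part is the last line: dividing by a tiny $P(E_3)$ is exactly what catapults a one-in-a-million prior up to near-certainty.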

Something else worth noting here is that nothing particularly “extraordinary” was required to leap from 0.0001% all the way up to 99.9%. All that happened is that we watched the news, and the fact that the news announced the correct lottery number is not surprising at all. This challenges the mantra we sometimes hear that “extraordinary claims require extraordinary evidence.” Bayes’ theorem shows why the mantra misleads: quite ordinary evidence can produce an enormous shift in probability, so long as that evidence is far more likely if the claim is true than if it is false.

Conclusion

When thinking about how evidence affects the likelihood of various beliefs, it is important to remember the balance that Bayes’ theorem gives us. Every idea must be thought of as likely or unlikely not in a vacuum, but in light of the available alternative ideas.

## Critical Thinking Toolkit: What Does it Mean to “Know”?

One of the most common and comical aspects of the lives of very young children that I find quite joyful can be summarized in one word… “Why?” Well, it is more like a sequence of these “Why” questions over and over again until your head is about to explode. Even though there is a silliness to asking “why” so many times in a row, the young child doing so also shows a joyful curiosity that we all share. And, as it turns out, answering the simple question “Why?” is sometimes very hard.

A similar brand of question that receives detailed attention in philosophy is “how do you know?” More specifically, in philosophy there is an entire area of study known as epistemology that aims to analyze what it actually means to know something. While I do not plan to go super deep into epistemology here, it is important to understand such words as believe, know, and certain when we are trying to learn how to carefully have intellectual discussions with others.

My only goal here is to clarify some proper ways to use terms like these within this particular context, because many times these ideas are not understood in a properly nuanced manner.

What is Belief?

This one ought to be the simplest of all, and yet for a couple of reasons it has become difficult. The reason is that the word “belief” in popular thought has acquired a new meaning in the religious context. To a great many, the term belief in the context of religion denotes at least a complete absence of any evidence or support for the religious position, if not (in extreme cases) an outright denial of all existing evidence. This definition is deeply biased. I would compare it to a fan of a particular sports team insisting that their team is the best team to be a fan of. I honestly think it could even be argued that this definition is a form of either propaganda or outright intellectual discrimination against religious people. A more popular colloquial term for the thing I am describing here is blind faith. Please, please do not take that definition seriously if you have been exposed to it. It is completely alien to EVERY religious person – and, for that matter, to every person who has ever lived. Nobody I have ever heard of consciously rejects what he or she knows to be evidence without some kind of counterevidence. Let’s respect one another enough to understand the basic idea that everyone has reasons for thinking what they do.

The word belief is really quite simple. A quick Google search gave me the definition “an acceptance that a statement is true or that something exists.” I am perfectly happy with this definition, as is anyone who holds to a particular religious tradition. When I say that I am a Christian, what I mean is that I accept that the core tenets of the Christian religion are true. When a person says that they believe in this religion or that religion, that is all that they mean.

I hate to even have to define the word belief, but I have seen that in some circles (even if very small ones) the word belief has been corrupted in an overtly biased manner, and so I felt the need to clarify outright what I mean when I use this term.

What is Certainty?

Another word that sometimes gets misused is certainty. Clearing this one up is similar to the previous situation – but it is a little different, because the terms certain and certainty have two genuine meanings, and these meanings sometimes get blended into one.

What do I mean by this? The easiest way to distinguish them is to say that certainty has a logical definition and a psychological definition, both of which are legitimate but need to be recognized as different. The logical use of the word certain is the kind of thing you get out of formal logic and mathematics – mathematicians are certain that 2 + 2 = 4 because, according to the fundamental laws of mathematics and logic, this statement must be true. The psychological use of the word certain is not directly related to whether something is true – it relates to what we think or feel about a particular idea. Using the psychological definition, it is entirely possible that I am certain that X is true and you are certain that X is not true. This happens all the time – just listen to any political or religious discussion in the public sphere and you will find people who are (psychologically) certain of two opposing viewpoints. But neither of them is (logically) certain of their view, because logical certainty is unattainable in pretty much every area of thought – even science! We cannot even be logically certain that gravity exists. Logical certainty cannot be obtained by any amount of evidence – only by logic. You can be logically certain of Y when X is logically known to be true (e.g. 2 + 2 = 4) and X implies Y (e.g. 2 + 2 = 4 implies that 2 + 3 = 5). That’s about as far as you can go with logical certainty.

The important thing to realize with this distinction is that it is very, very easy for us to treat our own psychological certainty as if it were logical certainty. We should do our best to avoid doing this. We must recognize that we can be wrong about a lot of things, even things that we feel certain about. We must be willing to listen to those who disagree with us – even if we turn out to be right and they wrong, more times than not listening to those who disagree with you will help you understand more deeply what you believe and why you believe it.

What is Knowledge?

I have moved throughout this article from the easy definitions to the difficult ones. The leap from defining belief to defining certainty was, in my estimation, fairly small, but defining the word knowledge is very nearly impossible. There is an entire field of academic study called epistemology that is built upon attempting to clearly define what knowledge is.

The most fundamental attempt at defining knowledge is the JTB model – which stands for justified true belief. I’ll give a sketch of the train of thought behind this attempt. It is fairly obvious that if you know something, then you believe that thing. Furthermore, you can’t actually know something, in a proper sense of the word, unless it turns out to be true (if what you believe turns out to be false, it is an example of psychological certainty that fails to qualify as genuine knowledge). But a true belief doesn’t always count as knowledge. For instance, someone could acquire a strange but strong belief about something they know nothing about while in a psychedelic trance, but even if they turn out to be correct, none of us would say that they knew that thing was true. Upon reflection, we realize that this is because the individual didn’t have any good reason to believe what they did – their belief was basically random and just happened to be right. This ‘good reason’ is what the term justified is meant to denote.

As it turns out, epistemologists almost universally reject JTB as a sufficient definition of knowledge. Roughly, one big reason for this is that something that appears to be good evidence might not actually be connected to the thing you believe. These are sometimes called Gettier-type counterexamples in honor of the philosopher Edmund Gettier, who challenged the JTB model. One popular example of this kind of situation involves watching a sports game. Imagine that you know your favorite team is going to play their rival on TV today, and so you turn on the TV to watch the game. But, unknown to you, a channel was playing a rerun of last year’s game between the two teams, and instead of watching this year’s game, you watch last year’s game. As it turns out, this year’s game had a final score of 70-60, which is exactly the same as last year’s game. So, by watching last year’s game on TV, you come to believe that your favorite team won the game this year by a score of 70-60. This belief of yours is true, and since you watched a game on TV in which your team beat their rival 70-60, you have a very good reason to believe that it is true. And yet it doesn’t seem like it can be said that you know this.

There are lots of examples like this one, each showing that good justification for believing something might not connect to the thing you believe in the way that you think it does. This is strong evidence that, while JTB is an extremely useful model for defining knowledge, it isn’t always quite right. What ought to be added or modified to fix these situations is debated and is worth thinking about, but for the sake of learning critical thinking skills, using JTB as a background of what it means to know will be good enough to avoid most mistakes.

How Do We Know a Premise?

In a philosophical argument, then, what does it mean to say that we know a premise is true? We must believe the premise, the premise must actually be true, and we must have some good reasons to back up our believing the premise. While epistemology has gone much further than this in discussions of what knowledge is, the simple JTB model will be good enough for the purposes of thinking carefully through our beliefs and through philosophical arguments.

## Critical Thinking Toolkit: The Burden of Proof

One important aspect of philosophical arguments is whether or not there exist any “default positions.” That is, if I am trying to convince someone about X and that person is trying to convince me of the opposite of X, is either of us in some kind of default position? When is an idea innocent until proven guilty? Guilty until proven innocent? Or when does an idea start off somewhere in the middle – where all relevant viewpoints equally are required to prove their point?

The technical term used to lay out these distinctions is the burden of proof. The burden of proof is exactly what it sounds like – it is a burden or obligation to prove what you are saying. So, the questions of the previous paragraph are summarized in one question – in any disagreement, who has the burden of proof?

The most commonly discussed and well-known application of the idea of the burden of proof comes in criminal trials. In a criminal trial, the jury is instructed that the accused is to be considered innocent until proven guilty, and that the evidence must point to guilt beyond a reasonable doubt. This is a clear affirmation that between the two sides of a trial – the defense and the prosecution – only the prosecution carries a burden of proof. There is, in this case, a default assumption that the man or woman on trial is innocent. The defense is not required to prove innocence, only to show that the prosecution cannot prove guilt.

But is this always so? Are we to always use the innocent until proven guilty standard? And if so, how do we even know which side is the “innocent” side? Well, really the answer is that no, we do not always use that standard. There are other situations where it is appropriate to apply a similar standard, but most situations are not like that. In most situations, the fact of the matter is that both sides of a debate have a burden of proof. Why is this so? To explain, it would help to provide a rule of thumb for when a person involved in a debate carries a burden of proof. After making the rule of thumb clear, we can go back and add some additional details about some special situations where the rule of thumb is not considered applicable.

Generally speaking, the burden of proof falls on any person who makes an objective truth claim. A truth claim is just any claim that “such-and-such is the case,” “such-and-such is true,” “such-and-such is false,” or “such-and-such really happened.” Not everything is a truth claim. For example, if your mother tells you to clean your room, that isn’t a truth claim – rather, it is a command. Questions also are not truth claims – they are just questions. Perhaps a person asking a question is trying to get you to think of some truth claim – maybe they are asking a leading question – but in and of itself a question makes no claims. (To be grammatically precise, truth claims use the indicative mood – but don’t get bogged down in fancy language; we all have a pretty strong intuitive understanding of what is and is not a truth claim when we slow down and think about it.)

What then do we mean by an objective truth claim? In this context, by objective I mean external to our own consciousness. For instance, the sentence “I am happy today” is a truth claim, but it is not objective in the sense I am using it, because emotional states like ‘happiness’ reside in your consciousness. The reason that this kind of distinction is important is because it isn’t really possible for anybody to bring any evidence against your claim that you are happy – they do not have access to your thoughts and feelings, so how would they know? For this reason, a person who says “I am happy right now” has no burden of proof at all – their statement is to be accepted at face value as a statement of their perceptions of their own emotions. Of course, sometimes people are in denial about things, so people can be mistaken about certain things – but to show that this is the case would require a great deal of convincing from the outside.

Putting aside the abstract discussion for now, let’s look at examples that matter. Take politics, for instance. If a Democrat and a Republican are debating some topic, who has the burden of proof? For very nearly every political issue I can think of, both sides carry equal burdens of proof, since each side is making a claim about the way the world around us is. To decide whether it would be better to raise or lower taxes, for example, both sides must bring forward their best evidence for their own position, and then the evidence can be evaluated and conclusions drawn.

Another common example is in matters of religious debate. For the most part, in the context of public discourse, every religious perspective has its own burden of proof. Christians, Muslims, Hindus, Jews, and atheists are all making claims that a certain deity exists or doesn’t exist, that certain things do or don’t happen in the afterlife, that certain events in history happened or didn’t happen. The exception to that rule is agnostics – because the position of an agnostic is that they don’t know what to believe, they aren’t making a truth claim about the outside world. So agnostics carry no burden of proof – everyone else does. So far as I can tell, every group seems to realize this except for some subset of atheists. Many atheists who say things like they “only lack a belief in God” and that “you can’t prove a negative” are, whether they realize it or not, putting up a smokescreen to make themselves look like agnostics when they aren’t. To be sure, many of these people really do believe these things, and many who say this actually are agnostics. But many of those who say they merely “lack belief” will also actively try to convince you that there is no God – which is a truth claim and carries with it a burden of proof. The claim that there is no God, and the burden of proof that comes with it, has been part of the very definition of atheism for the entire history of philosophy (and still is – some recent popular-level thinkers have tried to redefine the word atheist to mean something more like agnostic, but academic philosophy doesn’t take that seriously as far as I know).

In summary, very nearly all actual debates that people engage in and nearly all questions that people really care about will be like the examples above – these are situations in which all sides carry a burden of proof. But there are some areas where this is not so. The epistemological term basic belief, or properly basic belief, is often used in these situations. In short, a properly basic belief is a belief that is so foundational and so deeply integrated into our ability to think at all that no burden of proof is required. Going back to the criminal trial example, a properly basic belief is like the “innocent” verdict – we are allowed to assume that our properly basic beliefs are correct unless it is shown to us beyond a reasonable doubt that they are mistaken. Broadly speaking, properly basic beliefs include things like our sensory perceptions and beliefs about our own thoughts and feelings. It may help to see some more specific examples.

One helpful example of what properly basic beliefs look like is my belief that I have a body. While technically someone could try to convince you that you’re only hallucinating your body, or that you are in “the matrix” and don’t actually have a body (at least not like the one you think you have), the immediate experience of seeing and controlling your body every day is enough to safely conclude that you actually do have one. We don’t have to prove to people that we have bodies – we just know that we do. A similar example is the belief that the people you talk to every day are not “robots” but have thoughts and feelings in the same kind of way that you do – in other words, that there are other minds besides your own. Is it possible that you are really the only mind in the universe and are constantly hallucinating everything around you? Well, no one can really disprove that. But that doesn’t mean we have to take such an accusation seriously, either. We are being perfectly reasonable when we assume that the people we talk to also have minds, because this is a properly basic belief and carries no burden of proof.

Hopefully, this discussion of the burden of proof, truth claims, and properly basic beliefs helps to shed light on the reality that, more times than not, being intellectually responsible requires bringing forth some form of evidence for that which we believe. And for my Christian readers, the New Testament explicitly tells us that our faith in Jesus is one of the things that we should be prepared to defend using evidence, for we are told:

“Always be prepared to give an answer to everyone who asks you to give the reason for the hope that you have.” – 1 Peter 3:15 (NIV)

Having laid down at least an initial discussion of intellectual integrity and the importance of using evidence to defend what we believe, we can move on to a discussion of how to think about evidence itself.

## Critical Thinking Toolkit: Valid and Sound Arguments

This is the first post I am making in what will hopefully become a long and detailed series of posts about how to think more clearly about difficult questions. Since we ought to try to think clearly in every domain of life, we must begin the discussion at the broadest level, with the most important things to understand about clear and logical thinking. The obvious place to start here is the idea of a philosophical argument.

Defining ‘Argument’ and Related Terminology

First things first, understand that this is not an argument in the popular definition of two people bickering back-and-forth about some topic. This is nothing like arguments you see on the internet, or arguing with your parents or siblings. There need not be any anger at all behind a philosophical argument, just as there doesn’t have to be (and effectively never is) any anger behind a mathematical argument. When I use the word argument in the context of a philosophical argument, all that I mean is an effort to systematically put together a collection of information to draw some sort of meaningful conclusion.

A philosophical argument can come in a variety of levels of detail, sophistication, and length. However, since at its heart all we mean by an argument is an effort to combine some information to draw a meaningful conclusion, any philosophical argument worth its salt can be boiled down into a format called a syllogism – which is more or less just a fancy way of saying a bullet-by-bullet presentation of the pieces of information that are relevant to the argument. In a syllogism, the initial pieces of information are called premises, and the culminating new information drawn from them is called the conclusion. Most often, a philosophical argument will consist of a syllogism along with extended discussions of why each premise should be considered true, and these discussions are called defenses of the premises. Here is a basic example of a philosophical argument that illustrates these definitions.

1. If it is raining right now, the ground is wet.
2. It is raining right now.
3. Therefore, the ground is wet.

The collection of statements 1-3 is a syllogism. Statements 1 and 2 are premises, and statement 3 is the conclusion. To make the argument more detailed, you could make a defense of Premise 1 by convincing your listener that water makes things wet, and you could defend Premise 2 by looking out a window, walking outside, or checking a weather app.

Also important in discussions of arguments are objections to an argument. An objection to an argument is basically anything that is meant to show that the argument fails to establish its conclusion. For instance, in the above argument, you could object to Premise 1 by noting that ground inside of a building or underneath a shelter won’t become wet when it is raining, or you could object to Premise 2 by showing your friend that there are no clouds in the sky. These considerations would, if accepted as true, make the conclusion “Therefore, the ground is wet” no longer follow with certainty.

Objections can come in many forms, and methods of objection are a topic for later discussion. But before we can embark on that discussion, we first must lay out clearly what a philosophical argument actually has to accomplish in order to be considered a success. There are two basic criteria that arguments aim to satisfy – an argument must be valid and sound before it should be trusted as correct.

What is a Valid Argument?

Of the two conditions mentioned, validity is weaker (in the sense that being sound requires validity). An argument is valid if it is the case that, if we take as absolute truth every premise, then we cannot help but affirm the truth of the conclusion. Sometimes, the term valid is instead replaced with logically valid, because what we mean is that we can use the rules laid down by the formal theory of logic to move us from premises to conclusions. The argument about rain I presented earlier is logically valid in this sense – it has the format

1. If A, then B
2. A
3. Therefore, B

In the philosophy of logic, this form is given the title modus ponens – Latin for the mode of affirming. I’m not going to try to explain why this is accepted as logically valid – I quite frankly don’t know how to boil this down any more than it already is. If anyone reading this believes 1 and 2 but not 3, please do try to explain to me why. So long as A and B genuinely mean the same thing in all three statements, this is unavoidable.
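For readers who like to see things checked mechanically, here is a small Python sketch (my own illustration – the post itself uses no code) that brute-forces every truth assignment to A and B and confirms that modus ponens never leads from true premises to a false conclusion:

```python
# Brute-force verification that modus ponens is logically valid:
# whenever both premises ("if A then B" and "A") are true,
# the conclusion ("B") is true as well.
from itertools import product

def implies(a, b):
    # "if A then B" is false only when A is true and B is false
    return (not a) or b

for a, b in product([True, False], repeat=2):
    premises_true = implies(a, b) and a
    if premises_true:
        assert b  # the conclusion is never false when the premises hold

print("modus ponens holds in every truth assignment")
```

This is, of course, just a mechanical restatement of the point above: accepting 1 and 2 while rejecting 3 is not a coherent option.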

Why does validity matter? Aren’t there other ways to know something than pure logic? Well, yes, sort of. It is true that very, very little can be known using pure logic alone. But the “impure” aspects of a philosophical argument should never, ever be found in the logical validity of an argument. Rather, uncertainty ought to arise out of the truth of a premise – which is where the idea of a sound argument comes in.

What is a Sound Argument?

In order to define a sound argument, I find it helpful to first show an example of an argument that satisfies validity but not soundness. This is an “argument” that Steve committed a crime – but upon reading it, it should be instantly clear that something is awry (which is the whole point):

1. Everyone who goes to jail commits a crime.
2. Steve was sent to jail.
3. Therefore, Steve committed a crime.

Anyone who has studied logic will affirm that this argument is logically valid – if you accept as true both Premise 1 and Premise 2, then the conclusion that Steve committed a crime cannot be avoided. And yet, we all know there is something wrong here. As much as we would like Premise 1 to be true, it is not true. Sometimes, people who have not committed any crimes go to jail. So, Premise 1 is false.

This example shows what we mean by an unsound argument – which we ought to reject. Informally, a philosophical argument should be called sound if it is logically valid, is based upon true premises, and genuinely leads a thinking person to truth. To further clarify this, here is a definition of a sound argument that I find quite reasonable:

An argument is sound if and only if it is logically valid, all its premises are in fact true, and its premises are more likely to be true than false in light of the evidence.

Now, I will explain why I include each point in this definition. If an argument is not logically valid, then by definition I am perfectly within my intellectual rights to accept every premise as true and still reject the conclusion – so logical validity is required. If there is any confusion here, I will provide examples later in this series of logically invalid arguments that sometimes pass as valid (look up the genetic fallacy, the ad hominem fallacy, or the fallacy of composition for some examples). I hope that I do not need to defend here how important it is that our beliefs line up with the basic laws of logic (which is really what logical validity means).

The jail argument already shows how an argument can be logically valid and yet rest on a false premise. Interestingly, we might note here that when an argument is unsound, we cannot on that basis deny its conclusion. In the jail argument, it is quite possible that Steve really did commit a crime. It is quite possible that Premise 2 and the conclusion are both true. The important point is that Premise 1 is false, so the argument is not sound – since one of its premises is false, you cannot use it to meaningfully learn anything by this method.

So, this example shows why my definition of a sound argument requires that all the premises be actually true. However, there is yet another way an argument can be unsound, which is more subtle. To see what I mean, suppose that we go back in time 1,000 years – to a time when everyone believed that the Earth stood still and the Sun revolved around it. People believed this, among other reasons, because no evidence had yet been produced that the Earth actually moves. Now, in this situation, suppose I propose the following argument:

1. If the Earth rotates on its axis once a day, the Sun will appear to rise and set in the sky.
2. The Earth rotates on its axis once a day.
3. Therefore, the Sun appears to rise and set in the sky.

To a modern reader, this is a sound explanation of why the Sun appears to rise and set (albeit massively simplified – of course you would want to give a visual or mathematical demonstration of how Premise 1 actually works). However, to a person living 1,000 years ago, Premise 2 would seem highly unlikely in light of the evidence available to them. So, that person need not believe the conclusion because of this argument – they can instead believe that the Sun rises and sets because it actually travels around the Earth, since to them, that is where all the evidence points.

It is for this reason that in discussing whether an argument is sound – roughly speaking, whether it ought to convince an intelligent and informed person of its conclusion – we have to take into account what evidence is available to us. This is why, in the definition of soundness I give, I specify that every premise must be more likely than not given the evidence. To be fair, you could demand a different standard than "more likely than not" – in a court of law, for instance, the standard is raised to "beyond a reasonable doubt" because of the terrible consequences of a false conviction. But despite the potential for adjusting the degree of likelihood we demand of our evidence, I find the "more likely than not" standard most reasonable.

## Conclusion

In this post, we have defined the minimal criteria an argument should meet in order to be convincing. In summary, to convince your listener of a point, your line of reasoning – your argument – should line up with the laws of logic, it should not rely on any false premise, and every premise should be supported by the evidence. In the future, I will spend more time elaborating on how to evaluate the premises of an argument, and the evidence for each premise, carefully and even mathematically.

## Intro to Series: Thinking Clearly and Critically

We hear so often how important critical thinking is for succeeding in the job market, in the world, however you want to say it. Critical thinking is put forward as a difference-maker that separates success from failure. And for good reason – by definition, critical thinking is the practice of not taking statements at face value and using careful, tried-and-true methods of sifting through information, finding the truth amidst confusion, and finding good solutions amidst a sea of bad ones.

As much as there is talk about things like critical thinking… what actually is critical thinking? And how do we go about developing our critical thinking skills?

One way to engage in critical thinking, and the one I perceive to be most common in education, is through problem-solving. By forcing yourself to think through a complicated problem, there is a natural tendency to develop stronger critical thinking skills, just as going to the gym tends to make one stronger and more athletic. This is one very important aspect of critical thinking. Learning problem-solving helps you think more quickly and assemble ideas in useful ways.

But this is not all that critical thinking is. There is more, and it is the aspect that seems much more lacking in modern society: the skill of recognizing and avoiding bad reasoning. There are a great many "patterns of thought" out there – and they are not all equal. Some help you find truth; others are obviously flawed. Some look like they work but really don't, and some don't look like they work but actually do. Some apply in every situation, and some apply in certain cases but not others. I cannot remember even once in my education learning about any of these critical thinking tools, at least not before studying mathematics in college. Even then, I still didn't know most of the philosophical tools I use in my writing and thinking now – and I fell victim to many of these bad ways of thinking. Hopefully, I don't anymore; at the very least, by knowing about them I can do a better job of preventing myself from slipping into them.

I've previously done a series on types of mathematical proofs, and this series is meant as an extension of that into a landscape of ideas broader than mathematics. Over the weeks, months, and possibly even years that I expect to work on this series, I hope to cover a wide variety of fields and how experts in them use critical thinking, as well as many common flawed patterns of thinking that we all instinctively fall prey to. I also hope to provide useful examples of commonplace fallacies to help people think more clearly about issues in the public square as well as in more personal matters.