Evidence, Credibility, and the Homunculus Courtroom

We should think of our beliefs and the evidence we engage with as if we had a little homunculus TV courtroom in our brain adjudicating whether to admit evidence into the record. Obviously, this is incredibly difficult to pull off in real time, but it's a nice thought experiment to pause and consider the weight of a claim being made.

This idea came to me while watching a YouTube video covering the recent downfall of a famous hustle influencer. The presenter observed that she would normally not take people's personal lives into consideration when judging their professional work. In this case, however, the influencer sold conferences and products marketed as relationship coaching courses under the pretense of having a great marriage, a claim swiftly undermined when she divorced approximately two years later.

I was impressed with this statement by the presenter – she was right! Under normal circumstances, a person's personal life shouldn't bear on something like this, but given that the evidence under consideration was whether someone misled others about her personal life and got them to pay for her "expertise," there were grounds to consider this piece of evidence relevant. My homunculus courtroom judge ruled that the testimony was admissible.

This is a silly thought experiment to anthropomorphize cognitive processes that are otherwise just a black box to me. I suppose it's a little farfetched to think that we have this much control over our beliefs, but maybe the next time I listen to a claim (or gossip, or something that doesn't jibe with my experience… or claims that I want to be true…), I will remember my homunculus courtroom and think twice about the claim's believability.

Stay Awesome,

Ryan

The Art of Self-Discipline

On the days when I'm languishing and finding it difficult to be productive, when procrastination and anxiety keep me in rabbit holes of distraction and at the end of the day I look at the clock and realize how much time I've wasted, it's easy to write myself off as a lazy, slovenly person. It's easy to think of myself as the kind of person who does not have discipline, that I wasn't born with that trait – fatalism kicks in; I should accept who I am.

But that’s not what self-discipline is. It’s easy to see self-discipline as some sort of binary state when you are comparing yourself against others further along their own paths than where you want to go.

The Romans had a saying that "we can't all be Catos," referring to the Stoic politician who served the state with self-sacrifice. But that saying is wrong. It should be "we aren't all Catos, yet."

In virtue ethics, your moral character is judged against an abstract ideal – the Stoic Sage. But possessing virtue is not a static trait or state of character. Possessing virtue is a process of becoming, of doing the right thing at the right time.

Having self-discipline doesn't mean you are a paragon of discipline. It means you are exercising discipline in the moment. If you fail, it just means you are still working on becoming who you want to be.

The Japanese refer to this as dō (道), the Way. You never reach perfection, but your life is one long project of incremental progress towards what you are meant to be.

That is the way of self-discipline.

Stay Awesome,

Ryan

The Last Person to Know Everything

On a recent CBC podcast episode about Leibniz and Voltaire’s thoughts about evil and God, one of the interviewees referred to Leibniz as “the last man to know everything.” I find this notion utterly fascinating. Upon hearing that title, I jumped online to search for the “best biography on Leibniz” and found a highly acclaimed book detailing an intellectual biography of the 17th-century thinker. Once I clear some books on my current reading list, I’ll dive into this hefty book.

"Leibniz: An Intellectual Biography" by Maria Rosa Antognazza

This isn’t the first time I’ve encountered the moniker of “the last person who knew everything.” In fact, that was the title of a biography I read back in late 2018 on Enrico Fermi.

I've been drawn to this idea for a long time, probably originating with the first time I saw the 1994 film Renaissance Man starring Danny DeVito. That was where I first learned the term renaissance man, more commonly known as a polymath – a person with considerable knowledge and expertise across a wide variety of domains. While I wouldn't quite call it a goal, this has been an aspiration of mine since I was a child.

I suppose as the sciences progress, it becomes increasingly difficult to lay claim to being "the last person who knew everything." Each field grows increasingly complex as we push the boundaries of the known world, raising the threshold of what counts as expertise.

It would seem we need to seriously consider the observation recently made by Professor Adam Grant on the differences between experience and expertise.

Instead of seeking to always have depth of knowledge, perhaps we should give equal consideration to wisdom and how we can apply our experiences and expertise to solve interesting problems. While more nebulous as a goal, I think it steers us in the right direction. At the very least, it’s a good vision to aspire towards.

Stay Awesome,

Ryan

PS – a non-exhaustive list of the traits that distinguish a "last person who knew everything:"

  • Intellectual curiosity
  • Intellectual humility
  • Interests spanning a variety of domains, both sciences and arts
  • A grasp of the methods and tools of science
  • Generating novel insights
  • The ability to see problems in terms of first principles
  • Engaging in idea arbitrage
  • Focus and flow in work – liking what you do

History vs The Past

While listening to a BBC podcast about Herodotus, the panelists described how Herodotus set about his project with the purpose of recording events with some accuracy before the details were lost from memory. Unlike some historians from Greek antiquity, Herodotus was writing about events within his own lifetime. This created a new kind of writing that set itself apart from others in his genre because it aimed at corroborating stories rather than recording myth.

This is an interesting distinction worth keeping in mind. There is a difference between "history" and "the past." It can be helpful to think of history as a subset of the past. History is the collection of stories we tell, and as a consequence it is necessarily selective in what gets included and what is left out. This makes sense from a practical standpoint – it is nearly impossible to capture every detail, and not every fact in the past bears a tangible, causal relationship to the story being told (even if, arguably, from a systems perspective many things create ripples of unknown influence that overlap with other events).

The challenge of history is accuracy – capturing events that happened with fidelity and charity. As new facts are discovered, and as new facets of importance enter the discourse, history is revised to (hopefully) move closer to our aim of truth. (For the moment, let's ignore questions about power and who tells these stories and for what aim.)

However, we must not confuse history (the stories we tell) with the past (events that happened prior to the now). Ignoring this distinction places us in danger of imbuing our myths with an illusion of objectivity. The stories we tell ourselves matter, of course, but they also carry power. Who tells the stories, and whose stories get left out, can carry harmful consequences.

We try to learn the lessons from history, but we cannot be so arrogant as to think that history is complete.

Stay Awesome,

Ryan

Work Focus While Home Alone

Working from home poses challenges for most folks when it comes to being able to focus. Many of my colleagues noted how difficult the summer months could be while children were home from school. For me, with an infant at home, the distractions were fairly minimal, especially because my wife handled 99% of the care during the working day while on leave from work.

But now she's gone back to work and our child is at daycare during the day. While you'd think this means my productivity has jumped by leaps and bounds, it's actually done the opposite. With no one in the house to bother me, no one to look over my shoulder, and no need to quickly hide the fact that I'm goofing off watching irrelevant videos on YouTube instead of looking at spreadsheets, the seemingly unlimited time means I have a hard time getting started.

This almost seems like a cousin of Parkinson's law: instead of work expanding to fill the allotted time, the strength of my impulse to get started is negatively correlated with the amount of free (unsupervised) time I find myself with. The more time I have, the more inertia there is to overcome.

Stay Awesome,

Ryan

PS – as a note to my future-self: there is a connection here with what Mel Robbins says about procrastination, that it’s not a function of laziness but instead a coping mechanism for the anxiety felt by the task. I should look into this more.

My Best Interest

If you want a good newsletter, you should check out Arnold Schwarzenegger’s newsletter. I signed up a few months back and have thoroughly enjoyed each update. I find him such a fascinating and inspiring person, not just from his bodybuilding work, his acting career, or even his time in politics, but above all because he strikes me as a fundamentally decent person.

He made two interrelated observations in the latest issue that stuck out for me. A significant portion of the email dealt with clarifying and elaborating on his viral "screw your freedoms" moment during an interview about why people should get their vaccines. In his expanded comments, he urges his readers to pay attention to the motivations of people trying to give them advice, and to discard opinions that are not in their best interest (including his own). By this, he means fitness influencers and politicians, whose motivations are clicks and ad revenue for the former, and outrage, donations, and votes for the latter. When it comes to your health, these people are not giving advice based on your health and wellbeing.

The second related comment is that if you can’t trust government or social media, who should you trust? To that, he says you should trust your doctor because your doctor took an oath to protect you. Your doctor is paid with only one expectation in return – the promotion of your wellbeing and health.

"Talk to your doctor, not people who don't have your health as their main responsibility. The Instagram and Facebook accounts you follow that give information on vaccines are not concerned about your health. They are concerned with getting more followers and making money.

"I have seen way too many stories about people who listened to politicized information about the vaccine instead of their doctors, and then changed their minds when it was too late.

"At the end of the day, everyone has to make their own decision about getting vaccinated. But if I can inspire even a few of you or your friends or family to avoid another one of these tragic stories that tore families apart, I want to do it."

He urges us to trust the experts and take wisdom from their experience. When presented with advice, we should ask ourselves what the advice-giver gets in return for our compliance. Do they benefit from our participation? What do we lose by their gain? These are important checks that we should make when deciding what’s in our best interest.

Stay Awesome,

Ryan

A Draft Ontology of Shipwrecks and Identity

In a recent podcast episode I was listening to, the hosts were speaking glibly about progressivism. I won't name the show because who was saying it is ultimately unimportant. At the crux of their tangent away from Plato's Greater and Lesser Hippias was their dismissal of progressivist attitudes towards the flow of history – the idea that those who come later in history can assert some sort of superiority (technological, moral, intellectual) over the less developed, unenlightened peoples of the past. While I think they were onto something in noting that change is not prima facie better, I was unconvinced by their move to dismiss it, because they didn't adequately set out criteria by which to judge advancement. Instead, they proceeded to discuss (within the context of Plato's dialogue) the relative comparisons of Athenian and Spartan laws. And when they came to a discussion in the text about the relative merits of spoons according to function and form, one host came dangerously close, in my estimation, to undermining his rebuttals of progressivism.

But this isn't a post about their podcast episode – truthfully, I haven't gone back and re-listened to the episode to check the fidelity of the above paragraph, because that's not what I want to write about. It serves as a frame for what it made me think about.

Instead, their conversation reminded me of folks who complain about changes to our understanding of the world, especially as it relates to mental health and/or personal identity concerning gender. There is a resistance to keeping an open mind because these changes don't harmonize with a worldview that's often formed and set in one's late teens or early twenties. You see this come out in a number of ways when they talk about these issues, but a canonical refrain is "back in my day, they didn't have x," whether that is expanded definitions of mental health issues or nonbinary categories of gender. The proliferation of new words to capture experiences is seen as a self-evident refutation of these developments, because they think the relative plurality of new understandings of the world must not be grounded in anything solid or universal. That is to say, if they haven't experienced it, then it must clearly not be real in an ontological sense.

When I say real in an ontological sense, I mean whether the phenomenon the word is attached to has an existence grounded in a concrete thing1. In the podcast, they discussed this in the context of trees and trying to identify tree types. In the Platonic tradition, trees are trees because the thing I'm seeing out my window – which grows tall, has a solid brown body covered in a rough exterior that is thicker near the bottom, and branches out at the top, terminating in thin green pieces – participates in the Form of tree, or tree-ness. The concept of tree is tied to the physical object, but to Plato the Form of tree exists independently of the tree in front of me. In biology, living things are categorized according to common, reliable traits that distinguish different types of organisms from one another. A maple tree and a pine tree don't share every physical trait, but they share a sufficient number that we call them both trees. The concept of tree is an abstraction used to describe something about the physical object. If we are being rigorous, there may be a debate whether the concept of tree as described above (as a Platonic Form) has an ontological existence, but for the purposes of our discussion, tree as a category is real because it is tied to a thing that physically exists out there in the world.

And so to circle back, the anti-progressivist disclaims new labels for people on the thinking that the label/category doesn't map to something real. There is a reduction problem in their mind – the mental disorder or gender identity (which I treat here as two separate concepts, not intended to be inclusive of one another) is not mapped to anything that can be pointed to. To them, gender identity is reducible only to secondary sexual characteristics (genitals), and mental health is based on stereotyped behaviours easily observed (signs) rather than reported (symptoms). In the anti-progressivist mind, creating a new name or category means creating a new phenomenon; a phenomenon that did not exist before.

Here we come to this post's titular line of thinking. What the anti-progressivist is confusing is the difference between creating new categories and giving words to describe something that already existed but had yet to be clarified. For this, I invoke the late Paul Virilio and shipwrecks. The anti-progressivist2 treats mental disorder and gender identity as concepts invented wholly anew, like the concept of a shipwreck. Before the invention of the ship, there was no concept of a shipwreck, nor train derailment before trains, nor car crash before automobiles, etc.3 These concept categories did not exist previously, and their existence is contingent on our inventing them (even if by accident). But I think this is the wrong way to capture what is going on when we create a new category of understanding.

It is not the case that more children are coming out as transgender because it's faddish, trendy, or a socially acceptable way of acting out against social norms. To the contrary, it's more likely that more children (or people generally) are publicly identifying as trans (or nonbinary, or homosexual/bisexual/asexual, etc.) because we've given them language to make sense of what they are feeling within. I do not have a source to provide, but I once read a lament that because of previously draconian crackdowns on LGBT communities, many people did not live long enough to allow their existences to be counted. The number of people who identify as LGBT is not growing because people are suddenly "becoming queer." Rather, our language and society are moving towards a place that has space for a plurality of lives.

And I think the same thing is happening as we redefine and clarify mental health issues – these issues are likely not new4, but instead we are better able to understand the internal lives of others because we are listening to what these individuals are saying about their experiences. We aren’t inventing new categories so much as we are finally recognizing things that can now be counted. As the saying goes, what gets measured gets managed. The old terms that were used to medicalize people’s internal lives were insufficient to either understand or treat the person, and so we refine our language to better capture their experiences.

When we reclassify our language, we create nuance. We create a more interesting and vibrant world. This is a good thing – we understand the world in new ways and can appreciate the diversity and complexity that comes from this understanding. I agree that progress for its own sake is not automatically good. Progress must be paired with wisdom and experience if we want to avoid creating harm in the future. But progress should not be halted on the belief that change is flippant, nor should it be dismissed because it introduces complexity to our worldview. The anti-progressivist seems to hold that society is sliding from order to disorder, away from some ideal that we must actively work to return to. To them, anything new is to be distrusted merely because past progress yielded harms. They place more weight on the missteps and ignore the improvements to the quality of our lives. This view is just as false as assuming a teleological bent to society's evolution – that society is always aiming at getting better.

Society is neither sliding away from perfection nor building towards it. It is moving from simplicity to complexity; from blunt and clumsy to fine and precise.5 As our understanding of the world grows, so too must our language to describe it. With understanding comes language, with language comes empathy, with empathy comes diversity, and from diversity comes strength.6

Stay Awesome,

Ryan

[P.S. – A few days after publishing this, I read a post from Seth Godin on Cyber-realists that says some of what I say above about wisdom tempering progress, but much more succinctly.]

Notes:

1For those who have studied metaphysics and ontology, I apologize if my take comes off as uninformed – I must admit that this is me working through the ideas in my head.

2I’ve typed this word many times in this post without critically thinking whether this is the appropriate term to give to the person/line of thinking to which I’m responding. However, this post is mostly a first-draft attempt at clarifying my thoughts, and so I leave it for now with the understanding that this is all mutable upon further consideration.

3This is perhaps one of the few areas where I’m sympathetic to the anti-progressivist – not all progress is devoid of negatives or downsides. With any effect, there will be unintended or unanticipated side effects and consequences. The technology that helps preserve food and makes it cheaper to produce might also be causing health problems from fast food, for example.

4Here I’m talking about reclassifying old or outdated diagnostic methods, rather than genuinely new classifications that are the result of modern life, though this might be up for debate – is it genuinely new or merely a sub-classification of already existing conditions, such as video game addiction. I’m out of my expertise here, so I can’t say anything with authority on the matter.

5There is a conversation to be had here that brings Kuhn into the party, but this post has gone on too long. I like Kuhn's idea that rather than a steady march of progress, science changes through the adoption of new worldviews, but I think this is less about knowledge and more about the sociology of the people who know. People, ideology, and politics make science messy.

6Admittedly, this appears to be a slippery slope that requires a lot more argument to make clear. As with Kuhn, this could be left to a different post, but my main argument is that diversity is good because it hedges against downsides. I think there are limited cases where uniformity and homogeneity are preferable, but those are exceptions that prove the rule.

Nicola McDermott’s Notebook

As my wife and I were watching some of the Olympic coverage, we caught a recap of the finals for the women's high jump event. I was always terrible at high jump as a child, but I stand at around 191 cm (over 6 feet tall), so watching the athletes jump heights that would clear my head instils a lot of awe in me. One jumper in particular caught my eye – silver medalist Nicola McDermott from Australia. After each of her jumps, the camera would catch her diligently writing in a notebook.

Screencap from video: https://youtu.be/tYFV02xldbE

I have heard of athletes who meticulously journal to help with performance psychology, but this was the first time I caught it live. You can learn a bit more about what she records after each jump in this Guardian piece. One quote from the piece caught my eye:

“The 2.04m – I gave myself a 10/10 for that jump, the execution,” McDermott explained. “I felt the clearance in the air. But the lack of experience with the timing meant that it just didn’t happen today.”

I like that she framed it not as a failure of her abilities, but as a lack of experience with the execution. Instead of seeing it as "I can't do this," it's "I haven't done this yet." The productivity sphere labels this an example of growth mindset, and given the stakes of the Olympics, it's inspiring to see an athlete maintain such an upbeat, positive attitude in a situation that would likely cause me to beat myself down in defeat.

While mere mortals like myself typically go through the motions of any given action, Nicola's journaling habit and mindset are worth modelling as a method of giving yourself immediate feedback and a view from without – one that gets you out of your own head. It also builds ownership over the process, because it forces you to break the activity down into discrete parts that you can focus on and improve.

Congratulations to Nicola on her personal best, and the example she sets in performance excellence.

Stay Awesome,

Ryan

Shrinking the Change

In recent weeks, I've built reading time intentionally into my work day. I've had it "in my calendar" for quite a while as an intention, but I hadn't meaningfully engaged with that blocked-off time since I first put it there. The intention was to recognize that my skills and career path would require me to commit time to learning and personal development, but I quickly got lazy and found other, less productive things to occupy that time.

I was annoyed with how I'd let the whole day slip by without getting a good start on my tasks (my vice being YouTube's algorithm), so I thought I'd redirect my attention a bit. I reasoned that if my brain wanted to fight engaging with work (because it's hard), then I could use that time to read instead. I set a timer, and once I finish the reading sprint, I start on some task for the day.

Surprisingly, I’m having some success with the process. It’s not perfect, but on the days I start with reading, I’m more likely to resist temptation and tackle items on my to do list.

Coincidentally, I'm reading the book Switch by Chip and Dan Heath. One of the chapters discusses the concept of shrinking the change, which is a fancy way of saying that we should break big, scary tasks into smaller, more manageable bites. Committing to something long and ill-defined is hard for my lazy brain to comply with, but it will comply with an easy edict, like "read for the next 25 minutes" or "spend the next 25 minutes downloading course information files to be processed."

These tips and hacks are not new – everyone has some flavour of it as part of their productivity system. But like losing weight, it’s not a knowledge problem. Learning and reading about how to lose weight won’t make you shed the fat, nor will it help you amp up productivity.

The challenge for me is tricking myself into not being lazy. If the only way I can do that is making an Odysseus pact, or treating myself like a child, setting a timer, and promising to do a little bit before I get a reward, then so be it.

Stay Awesome,

Ryan