Friday Round-up – August 7, 2020

This was a light week for content that stuck with me, so here is the sole entry in the round-up for the week ending August 7th:

💭 Reflection – Citing our sources – How to Think for Yourself | Ozan Varol blog post and Don’t Quote. Make it Yours and Say it Yourself | Derek Sivers blog post

The Varol piece was new, and as I read it, it reminded me of the Sivers piece, so I’m pairing them together. I’m a little conflicted about the message. On the one hand, I agree with the sentiments both writers are expressing. In Varol’s case, citation often becomes a stand-in for original thinking. Rather than expressing your own unique ideas, you regurgitate what you’ve consumed from others (whether you cite it or not, as is on display in the Good Will Hunting example). Likewise, Sivers is on to something when he suggests that once we’ve integrated a fact into our mental apparatus, we shouldn’t have to cite a source for it in contexts where that’s no longer appropriate. It makes sense to cite sources when writing something that will be graded in school, but it feels stilted in informal settings.

Where I feel conflicted is when there is a need to trace ideas back to verify their content. Misinformation being thrown out into the void isn’t a new phenomenon, but it has certainly accelerated in recent years. The internet has optimized itself around three facts of human nature – we like sensation, we like things that are familiar (that accord with what we already believe), and we are less critical of our in-group. Information bubbles therefore form online, creating a digital environment conducive to the rapid spread of memetic viruses. When you think about it, it’s a marvelous analogy: the online information bubble is a closed environment of like-minded people, which amounts to a rough analogue of an immune system. A memetic virus latches onto one person, who spreads it to people in their network. Since the folks in that network share similar belief structures, the homogeneous group quickly spreads the meme throughout the information bubble. Through repetition and repeated confirmation-biased exposure, the meme is incorporated into each individual’s belief network. It writes itself into the web of belief, much as some viruses write themselves into a host’s DNA.

I’m using the example of a memetic virus, but I think this framework applies equally to more benign cases. Scientists release findings as news releases ahead of peer review, which get amplified and distorted by the media, and then amplified and distorted again through information bubbles. See here for an example:

At each phase, part of the signal is lost or transformed, like a social media game of telephone. When one person in the chain misunderstands the data, that misunderstanding shapes how the idea gets replicated. Over time, it becomes the digital equivalent of a cancerous mutation of the original information.

This is why it’s important that we take care with how information is communicated. As soon as you print something like “the majority of people believe x” or “studies showed a y% decrease in the effect” without proper context for what the data is saying (or its limitations), that claim gets incorporated into people’s webs of belief. If you are part of the population that already believes something and you read that information, it reinforces your prior beliefs and you go on replicating the idea.

And so I’m torn. On the one hand, I shouldn’t need to cite my sources in casual conversation (a la Sivers), and I shouldn’t substitute the ideas of others for my own original thoughts (a la Varol), but at the same time, I want the things I encounter to be verifiable and scrutable. I don’t know what the solution is, other than to flag it and remind myself to be wary of absolutist language.

Stay Awesome,

Ryan

Appealing to my Smarmy Brain

Photo by Nicolas Picard on Unsplash

From time to time, I catch myself thinking some pretty stupid stuff for entirely dumb reasons. A piece of information finds a way to bypass any critical thinking faculties I proudly think I possess and worms its way into my belief web. Almost like a virus, which is a great segue.

A perfect example of this happened last week in relation to the COVID-19 news, and I thought it important to share here, both as an exercise in humility to remind myself that I should not think myself above falling for false information, and as my contribution to correcting misinformation floating around the web.

Through a friend’s Stories on Instagram, I saw the following screencap from Twitter:

My immediate thought was to nod my head in approval and take some smug satisfaction that of course I’m smart enough to already know this is true.

Thankfully, some small part at the back of my brain immediately raised a red flag and called for a timeout to review the facts. I’m so glad that unconscious part was there.

It said to me, “Hang on… is hand sanitizer ‘anti-bacterial’?”

I mean, yes, technically it is. But is it “anti-bacterial” in the way this tweet implies? The framing treats hand sanitizer’s anti-bacterial properties as though killing bacteria is exclusively what it was designed for, the way antibiotics are. You can’t take antibiotics for a cold or the flu, for example, because those are viral infections, not bacterial ones.

Rather than leaving this belief untested, I jumped on ye ol’ Googles to find out more. I found a write-up hosted by the National Center for Biotechnology Information discussing alcohol-based sanitizers.

According to the author on the topic of alcohol-based hand sanitizers (ABHS),

A study published in 2017 in the Journal of Infectious Diseases evaluated the virucidal activity of ABHS against re-emerging viral pathogens, such as Ebola virus, Zika virus (ZIKV), severe acute respiratory syndrome coronavirus (SARS-CoV), and Middle East respiratory syndrome coronavirus (MERS-CoV) and determined that they and other enveloped viruses could be efficiently inactivated by both WHO formulations I and II (ethanol-based and isopropanol-based respectively). This further supports the use of ABHS in healthcare systems and viral outbreak situations.

There are some special cases where ABHS are not effective against certain non-enveloped viruses (e.g. norovirus), but for the purposes of what is happening around the world, ABHS are effective. The primary protective measure is still to wash your hands thoroughly with soap and water and to follow the other safety precautions as prescribed.

The tweet, while right that we need to wash our hands and not rely too heavily on hand sanitizer, is nevertheless factually wrong. Thanks to a mix of accurate information (bacteria ≠ viruses) and inaccurate information (the implication that hand sanitizer is not anti-viral), packaged in a way that appeals to my “I’m smarter than you” personality, I nearly fell for its memetic misinformation.

There are a number of lessons I’ve taken from this experience:

  1. My network is not immune to false beliefs, so I must still guard against accepting information based on in-group status.
  2. Misinformation that closely resembles true facts will tap into my confirmation bias.
  3. I’m more likely to agree with statements coded in a smarmy or condescending tone, and that tone carries greater transmission weight in online discourse.
  4. Appeals to authority (science) resonate with me – because this came from a scientist who is tired of misinformation (I, too, am tired of misinformation), I was more likely to agree with something that sounded like something I already believe.
  5. Just because someone says they are a scientist doesn’t make that status true, nor does it mean what they are saying is automatically right.
  6. Even if the person really is a scientist, being one does not confer special epistemic status when they are speaking outside their primary domain.
  7. In the aftermath, the tweet was pulled and the person tried to correct the misinformation, but the incident highlights that the norms of Twitter (and social media more broadly) are entirely antithetical to nuance and contextual understanding.

It’s interesting how much the spread of information (memetics) resembles the spread of pathogens. If the harmful thing attacking us is well suited to sidestepping our defenses, whether that’s the body’s immune system or our critical thinking faculties, the invader can easily integrate, establish itself in our web of belief, and prepare to spread.

The one thing that really bums me out about this event is the inadvertent harm that comes to scientific authority. We as a society are caught in a period of intense distrust of the establishment that is coinciding with the largest explosion of information our species has ever seen. The result of this is not that good information is scarce, but rather the signal-to-noise ratio is so imbalanced that good information is getting swept away in the tide. If people grow distrustful of the sources of information that will help protect us, then forget worrying about gatekeepers that keep knowledge hidden; there will be no one left to listen.

Stay Awesome,

Ryan