This was a light week for consuming content that stuck with me, so here is the sole entry in the round-up for the week ending on August 7th:
💭Reflection – Citing our sources – How to Think for Yourself | Ozan Varol blog post and Don’t Quote. Make it Yours and Say it Yourself | Derek Sivers blog post
The Varol piece was new, and as I read it, it reminded me of the Sivers piece, so I'm pairing them together. I'm a little conflicted about the message. On the one hand, I agree with both writers about the sentiments they are expressing. In Varol's case, citation often becomes a substitute for original thinking. Rather than expressing your own unique ideas, you regurgitate what you've consumed from others (whether you are citing it or not, as is on display in the Good Will Hunting example). Likewise, Sivers is on to something when he suggests that integrating facts into our mental apparatus should not require us to cite our sources once it's no longer the appropriate context. It makes sense to cite sources when writing something that will be graded in school, but it feels stilted in informal settings.
Where I feel conflicted is when there is a need to trace ideas back to verify the content. I don't think it's a new phenomenon, but the pace at which misinformation is thrown out into the void has certainly accelerated in recent years. The internet has optimized itself around three facts of human nature – we like sensation, we like things that are familiar (that accord with what we already believe), and we are less critical of our in-group. Therefore, information bubbles get set up online, which creates a digital environment that's conducive to the rapid spread of memetic viruses. When you think about it, it's a marvelous analogy: the online information bubble is a closed environment of like-minded people, which functions as a rough analogue of a shared immune system. A memetic virus latches onto one person, who spreads it to people in their network. Since the folks in the network share similar belief structures, the homogeneous group quickly spreads the meme throughout the information bubble. The meme is then incorporated into the belief network of the individuals through repetition and confirmation bias. It writes itself into the belief web, much the same way viruses incorporate themselves into DNA.
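The spread dynamic described above can be sketched as a toy contagion model (my own illustration, not anything from either post): in a densely connected, like-minded bubble a meme reaches everyone almost immediately, while in a sparsely connected group it crawls.

```python
# Toy contagion model of meme spread (illustrative sketch with made-up
# networks): a person adopts the meme as soon as any neighbor holds it.

def spread(adjacency, seed, rounds):
    """Return the set of adopters after `rounds` of spreading from `seed`."""
    adopters = {seed}
    for _ in range(rounds):
        new = {
            person
            for person, neighbors in adjacency.items()
            if person not in adopters and any(n in adopters for n in neighbors)
        }
        if not new:
            break
        adopters |= new
    return adopters

# A tight bubble: five people, everyone connected to everyone.
bubble = {i: [j for j in range(5) if j != i] for i in range(5)}
# A sparse chain: five people, each only knows their immediate neighbors.
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

print(len(spread(bubble, seed=0, rounds=2)))  # the whole bubble: 5
print(len(spread(chain, seed=0, rounds=2)))   # only partway down the chain: 3
```

The point of the sketch is just the structural one from the paragraph above: homogeneity and dense connection are what let the meme saturate the bubble so quickly.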
I'm using the example of a memetic virus, but I think this framework applies equally to more benign examples. Scientists release findings in the form of news releases ahead of peer review, which get amplified and distorted through the media, and then further amplified and distorted within information bubbles. See here for an example:
At each phase, part of the signal is lost or transformed, like a social media game of telephone. When one person in the chain misunderstands the data, that affects how the idea gets replicated. Over time, it becomes the digital version of a cancerous mutation in the underlying information.
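That per-hop loss compounds. A back-of-the-envelope sketch (the 80% retention figure is an assumption of mine, purely for illustration): if each retelling preserves only a fraction of the original signal, fidelity decays exponentially with the number of hops.

```python
# Toy "telephone game" decay (illustrative numbers, not data from the post):
# each retelling keeps `retention` of the signal, so loss compounds per hop.

def fidelity(hops, retention=0.8):
    """Fraction of the original signal surviving after `hops` retellings."""
    return retention ** hops

for hops in (1, 3, 5):
    print(f"{hops} hops: {fidelity(hops):.0%} of the signal remains")
# 1 hop: 80%, 3 hops: 51%, 5 hops: 33%
```

Even a generous retention rate leaves only a third of the original meaning after five retellings, which is roughly the chain from press release to headline to share to comment to recollection.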
This is why it's important that we take care with how information is communicated, because as soon as you print something like "the majority of people believe x," or "studies showed a y% decrease in the effect," without proper context for what the data says (or its limitations), that gets incorporated into people's webs of belief. If you are part of the population that believes something and you read that information, it reinforces your prior beliefs and you continue replicating the idea.
And so I'm torn. On the one hand, I shouldn't need to cite my sources when having a casual conversation (a la Sivers), and I shouldn't be substituting the ideas of others for original thoughts (a la Varol), but at the same time, I want the things I encounter to be verifiable and scrutable. I don't know what the solution to this is, other than to flag it and remind myself to be wary of absolutist language.