In a throwaway thought experiment in his 1641 treatise Meditations on First Philosophy, René Descartes posited an evil genius or demon that systematically deceives us, distorting our understanding of the world. Contrary to what first-year philosophy students everywhere believe (a younger version of myself included), Descartes did not actually think an evil manipulator was holding us back from understanding the nature of the real world. Instead, he used the demon as part of a larger project to radically re-conceive epistemology in an era when rapid advances in science threatened to overturn centuries of our understanding of the world. He felt that knowledge was built upon shaky ground thanks to an over-reliance on the received authorities of Greek antiquity and the Church’s use of Aristotelian scholasticism. Like Francis Bacon twenty years earlier, Descartes set out to find knowledge that stood independent of received authority.
In the first two Meditations, Descartes considers the sources of our beliefs and how we come to know what we think we know. He wants to find an unshakable truth on which to build all knowledge, and through an exercise of radical doubt he calls into question many of the core facts we hold: knowledge gained from the senses is often in error, we often can’t distinguish the real from fantasy, and, through the evil genius, perhaps even our abstract knowledge like mathematics could be an illusion.
When I teach this to first-year students, they either don’t take his concerns seriously because of the force of the impressions the real world gives us through sense data (a stubbed toe in the dark seems to prove forcefully that the world external to our senses is very real), or they take Descartes too literally and think he really believed a demon was actively deceiving him. Either way, they conclude that Descartes’ concerns are not worth worrying about; that this mode of thinking is the product of an earlier, less sophisticated age.
Unless you are a scholar delving into Descartes’ work, the real purpose of teaching the Meditations is to give students a framework for thinking through complex philosophical problems. Descartes starts from a position of epistemic doubt and runs with it in a thought experiment to see where it takes him. It is a useful exercise to run your students through to get them to examine their received opinions and held dogmas.
However, in light of my rant a few weeks back about informed consent and vaccines, I’ve discovered a new contemporary use for Descartes’ evil genius. In some sense, the evil genius is *real*: it takes the form of fear that shortcuts our ability to learn about the world and revise our held beliefs. Descartes posited that the evil demon could put ideas into our heads that made us believe things completely against logic. The demon was able to strip away the world beyond the senses and even cast doubt on abstract concepts like mathematics.
Much in the same way Descartes’ demon was able to “deceive” him into believing things contrary to the nature of reality, our fear of the unknown and of future harm can cause us to hold beliefs that do not map onto facts about the world. Worse yet, the story we tell about those facts can get warped, and new explanations can be invented to account for what we are seeing. This becomes the breeding ground for conspiracy thinking, the backfire effect, and entrenched adherence to one’s beliefs. We hate to be wrong, and so we bend over backwards to contort our understanding of the facts to hold fast to our worldview.
In truth, we are all susceptible to Descartes’ demon, especially those who believe themselves to be above these kinds of faults of logic. In psychology, this is the Dunning-Kruger effect, and there are all sorts of reasons given for why people overestimate their competence. But in the context of an entrenched worldview that is susceptible to fear of the unknown lurks Descartes’ demon, ready to pounce upon us with false beliefs about the world. Its call is strong, its grip is tight, and the demon is there to lull us into tribalism. We fight against those we see as merchants of un-truth, and in a twisted sense of irony, the weapons of truth we wield only affect those already on our side, while those we seek to attack are left untouched. It becomes a dog-whistle that calls to those who already think and believe as we do.
If we hope to combat this modern Cartesian demon, we’ll need to find a new way of reaching those we see on the other side.
In the ethics of conducting research with human participants, there is the concept of “informed consent.” At its foundation, informed consent is the process of communicating enough information about a research project that a prospective participant can decide whether they want to take part in the study. A lot of nuance goes into selecting what gets communicated: there is necessary information that needs to be shared, but you don’t want to share so much that the participant is overwhelmed by the volume.
When I review research ethics applications, I am privy to a lot of information about the project. In the course of reviewing the project, I have to make judgement calls about what should be included in the informed consent letters that participants read. It would be counter-productive if the participant had to read all the documentation I am required to read when reviewing an application, so we use certain best practices and principles to decide what information gets communicated as a standard, and what is left in the application.
There are, of course, some challenges that we must confront in this process. As I said, when reviewing a research project, you have to balance the needs of the project against the needs of a participant. All research, by virtue of exploring the unknown, carries with it an element of risk. When you involve humans in a research project, you are asking them to shoulder some of that risk in the name of progress. Our job as researchers and reviewers is to anticipate risk and mitigate it where possible. We are stewards of the well-being of the participants, and we use our experience and expertise to protect them.
This means that one challenge is communicating risk to participants and helping them understand its implications. In many instances, participants are well aware of the risks posed to their normal, everyday lived experiences and how the research intersects with them. The patient living with a medical condition is aware of their pain or suffering, and can appreciate the risks associated with medical interventions. A person living in poverty is acutely aware of what it means to live in poverty, and understands that discussing their experiences can be psychologically and emotionally difficult. Our job (as reviewers and researchers) is to ensure that the participant is made aware of the risk, to mitigate it as much as we can without compromising the integrity of the research program, and to contextualize the risk so that the participant can make choices for themselves without coercion.
The concept of informed consent is hugely important, arguably the most important component of research projects involving humans as participants. It is an acknowledgement that people are ends in themselves, not a means to furthering knowledge or the researcher’s private or professional goals. Indeed, without respect for the autonomy of the participant, research projects are unlikely to move forward even when research funds are available.
All of this is preamble to the anger I felt when I read a recent CBC report on how anti-vaxx advocates are using the concept of informed consent as a dog-whistle to their adherents – a way of both raising awareness of their cause and raising money with well-meaning politicians and the public.
In fairness, I can see the chain of reasoning at play that tries to connect informed consent with concerns about vaccines. For instance, in the article there is a photo of supporters of a vaccine choice group with a banner that reads “If there is a risk there must be a choice.” This sentiment is entirely consistent with the principles of informed consent. The problem with this application is that the risk is not being communicated and understood properly within context, and instead fear, misinformation, and conspiracies that lead to paternalistic paranoia are short-cutting the conversation. Further, the incentive structures that are borne out of the economics of our medical system are doing little to address these fears. Because so little money is flowing from the government to the medical system, doctors are forced to maximize the number of patients they see in a day just to ensure enough money is coming into the practice to pay for space, equipment, staff, insurance, and supplies. Rather than seeking quality face-to-face time with a patient, doctors have to make a choice to limit patient time to just focus on a chief complaint and address questions as efficiently as they can.
I don’t think it’s all the doctor’s fault either. I think we as patients – or more specifically, we as a society – have a terrible grasp of medical and scientific literacy. I don’t have a strong opinion about the root cause, but some combination of underfunded schooling, rapid technological innovation, growing income disparities, entertainment pacification, a lack of mental health support, increasingly complex life systems, and precarious economic living in the average household all contribute to the poor grasp people have of what makes the world around us work. It’s not that we are hyper-specialized in our worldviews; I think “life” is too complex for the average person to invest time into understanding. Let’s be clear: it is not that the average person isn’t smart enough to grasp it (even if my frustration with people sometimes leads me to that conclusion). Instead, I think people are pulled in so many directions that they don’t have the time or economic freedom to deal with things that don’t immediately pay off for them. People are so fixated on just making it day-to-day and trying not to fall behind that the leisure time to devote to these kinds of activities becomes a luxury.
What this results in, then, is the perfect storm of ignorance and fear that congeals into a tribal call to rebel against the paternalism of a system that is ironically also too cash-strapped to allow the flexibility to educate people on the nature of risk. People don’t have the time and ability to educate themselves, and doctors don’t have the time to share their experiences and knowledge with their patients.
Within this gap, opportunistic charlatans and sophists thrive, capitalizing on people’s fears to push their own agendas. This is why bad actors like the disgraced former doctor Andrew Wakefield and movement leader Del Bigtree are able to charge fees to speak at anti-vaccination events. I’m not saying a person who spreads a message should do it for free. What I am saying is that they turn a personal profit by preying on people’s fears while doing little to investigate the thing they claim to worry about.
We must find a way to communicate two simultaneous truths:
There is an inherent risk in everything; bad stuff happens to good people, and you can do everything right and still lose. Nevertheless, the risks involved with vaccines are worth shouldering because of the net good that comes from them, and because the risks themselves are vanishingly small.
In the 22 years since Wakefield published his study and the 16 years since its retraction, there has not been any peer-reviewed credible evidence that supports many of the claims given by the anti-vaxx movement. The movement is predicated on fears people have of the probability of something bad happening to them or their loved ones. The motivation behind the fear is legitimate, but the object of the fear is a bogeyman that hides behind whatever shadows it can find as more and more light is cast on this area.
The anti-vaxx ideology knows it cannot address head-on the mounting scientific evidence that discredits its premise, and so it instead focuses on a different avenue of attack.
This bears repeating: the anti-vaxx ideology cannot debate or refute the scientific evidence about vaccination. We know vaccines work. We know how they work; we know why they work. We understand the probabilities of the risk; we know the type and magnitudes of the risks. These things are known to us. Anti-vaxx belief is a deliberate falsehood when it denies any of what we know.
Because of this, the anti-vaxx ideology is shifting to speak to those deep fears we have of the unknown; instead of dealing with the facts of medicine, it is sinking its claws into the deep desire we have for freedom and autonomy. It shortcuts our rational experience and appeals to the fears evolution has given us to grapple with the unknown – the knee-jerk rejection of things we don’t understand.
Informed consent as a concept is the latest victim of anti-vaxx’s contagion. It’s seeping in and corrupting it from the inside, turning the very principle of self-directed autonomy against a person’s self-interest. It doesn’t cast doubt by calling the science into question. Instead, it casts doubt precisely because the average person doesn’t understand the science, and so that unknown becomes scary to us and we reject or avoid what brings us fear.
Anti-vaxx ideology is a memetic virus. In our society’s wealth, luxury, and tech-enabled friction-free lives, we have allowed this dangerous idea to gain strength. By ignoring and ridiculing it until now, we have come to a point where it threatens to disrupt social homeostasis. Unless we do something to change the conditions we find ourselves in – unless we are willing to do the hard work – I fear this ideology will replicate at a rate we can’t stop. It will reach a critical mass, infect enough people, and threaten to undo all the hard work achieved in the past. We have already seen evidence of this as once-eradicated diseases pop up in our communities. Our immunity and inoculations have weakened. Let’s hope those walls don’t break.
Last week, I gave a highlight of the best books I read in 2019. Below, I present what I read in 2019. By comparison to 2016, 2017, and 2018, last year was a paltry year in reading for me.
Harry Potter and the Deathly Hallows
The Bullet Journal Method
Trumpocracy – The Corruption of the American Republic
Daniel H. Pink
The Gift of Failure
Better – A Surgeon’s Notes on Performance
The Graveyard Book
Built to Last
Right Here Right Now, Stephen J. Harper
Complications – A Surgeon’s Notes on an Imperfect Science
J. Michael Straczynski
A Game of Thrones, George R.R. Martin
Scott H. Young
Reader Come Home
Andrew G. McCabe
The Path Made Clear
I have a few thoughts as to why my reading rate dropped off significantly last year and what I can do about it in the year to come.
Last year had a few significant pressures on my life that might have affected my desire to read. We started basement renovations early in the year, only to discover our basement’s foundation was cracked, requiring us to source quotes and opinions for repairs. This delayed our basement renovation, which didn’t finish until the summer. The protracted project weighed heavily on our minds throughout the year as we questioned whether we were making the right decisions for our home repairs, or whether we would need to make additional fixes later down the line.
Another big change for me was a change of my job at work. While I wouldn’t say it affected me as strongly as the basement renos, it disrupted my routine enough to impact my desire to focus on reading when I came home from work. Couple that with another full year as Board Chair for the non-profit I head up, and it left me with less cognitive bandwidth for self-improvement.
Podcasts and Music
If 2016 was my year of purchasing books, 2017 saw me start to utilize Libby to access the library, and 2018 was an all-out race for me to go through as many audiobooks as my brain could absorb, I felt a greater push away from books in 2019. Instead of working my way through 8-15 hours of content for one piece of work, I found the shorter format of podcasts more satisfying on my commutes. I enjoyed the variety in topics, shows, and voices.
However, I also found I was drawn back to listening to music instead of information. With the sheer volume of books I’ve consumed in the last three years, it was nice to go long stretches without a goal of getting through books (or trying to learn new things) and instead let the melodies, riffs, percussion, and lyrics sweep me away.
Overall, my rate for the year was a bit varied. I started slow in January and February, then picked back up in March. April only saw one book completed, then I found my footing again through May onward. However, October is when my wife and I traveled abroad for our honeymoon, and I never recovered my reading habit for the rest of the year.
Given that I spent most of the last three years focusing on business, personal development, and productivity books, I didn’t feel a strong desire to read those books in 2019. Even among the books I did read from that area, I found looking back that I don’t remember anything of note from those books. Neither the book’s theses nor the examples they offered have stuck with me as I enter the new year.
I’ve mentioned the concept of the animated bibliography a few times on this blog, and I think I’ve hit peak saturation for the genre. I’ve read the canon, and reading new books in the genre is yielding diminishing returns; that is, I’m not really seeing many new insights that leave me wanting more.
In my list last week, I commented that the books I’m drawn to now are starting to shift away from business and productivity and more towards moral lessons found in fiction, biography/memoir, and journalistic explorations of current events. That’s not to say I won’t continue to be tempted to pick up the latest book that promises to fix my life, but it does mean I intend to be more selective in what I choose to prioritize.
Assuming I continue to live a somewhat healthy life that is free from accidents, I figure that I have around 45-50 more years of life left. If I read around 3 books consistently per month, I will get another 1,650 books in my lifetime (4 per month is 2,208 books, and 5 books per month is 2,760 more books before I die). While that sounds like a lot, it’s a drop in the bucket compared to the number of books that come out each year and the books that have already been written. There is more to life and learning than being more productive or seeking more meaning in one’s life. I’ve grown to appreciate the value of storytelling this past year, and there are a lot of stories out there to sink into. If I only get access to a few thousand more stories, I should make sure they count.
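The back-of-the-envelope math above can be sketched out quickly; here is a minimal version, assuming roughly 46 years of reading left (my assumption, chosen from the 45–50 range, since it reproduces the 2,208 and 2,760 figures exactly and the ~1,650 figure once rounded):

```python
# Rough estimate of lifetime reading remaining at a steady monthly rate.
# Assumes ~46 years of reading left (the post estimates 45-50 years).

def remaining_books(books_per_month: int, years_left: int = 46) -> int:
    """Books per month, times 12 months per year, times years remaining."""
    return books_per_month * 12 * years_left

for rate in (3, 4, 5):
    print(f"{rate} books/month -> {remaining_books(rate):,} more books")
# 3 books/month -> 1,656 more books (rounded to ~1,650 in the post)
# 4 books/month -> 2,208 more books
# 5 books/month -> 2,760 more books
```

Even at the optimistic end, the total is a few thousand titles against the hundreds of thousands published every year.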
I’ve been thinking about endurance recently, specifically in two areas of my life. First, I’ve been experimenting with intermittent fasting since January of this year, and I’ll be sharing some reflections on it soon. Fasting each day requires a certain amount of endurance to push through cognitively and physically demanding tasks while your body deals with the exertion in a fasted state.
Second, as the winter weather hits us, I have to endure colder temperatures while working at the bar. I’ve managed to push myself over the last two years and use a sufficient number of clothing layers to eschew wearing a coat while on the door position. I have the coat on hand, but I like the challenge of working without it and standing outdoors for long stretches of time exposed to the elements.
It might seem silly or pointless to put myself in these positions when I don’t have to – I make enough money so that I never need to worry about food scarcity or not owning enough proper clothing to protect myself. On some level, it’s stupid machismo to willfully deprive myself in this way. Yet, I like the challenge and the sense of satisfaction that I can achieve some level of control or mastery over myself and my situation.
While recently listening to Oprah’s book The Path Made Clear, I came across a really interesting way of framing this tendency of mine. The specific section runs from 4:53–5:52 of the clip below, where Oprah chats with Alanis Morissette about the yearning for a time in the future when all your present problems are solved and you are finally happy. They discuss how this forward-oriented hope for the future never manifests as peace; money and fame don’t bring you happiness or contentment. Instead, you are always chasing that future where you are free from whatever pain you feel in the present.
“One of the big lessons I’ve learned over the last little while has been that if I can be comfortable with pain, which is different than suffering, if I can be comfortable with pain, as just an indication, and it’s potentially a daily thing (in my case it often is) then there won’t be my living in the future all the time; that one day if and when I will be happy.
“And even if I’m not comfortable doing that, I’m very uncomfortable in pain – I hate it – we run from it with all kinds of addictive, fun things (temporarily fun things). But at least knowing it’s a portal, and that on the other side is this great sense of peace that goes beyond this ego development.”
~Alanis Morissette (lightly edited for readability)
This sentiment spoke to me. I have an affinity for stoicism and the idea that one should re-frame their relationship with the external. I like knowing that I can endure, even when I don’t have to. It becomes practice for those moments when I need to dig in deep to perform, because life isn’t always easy. Through this practice, I can also appreciate my comforts all the more. And it doesn’t need to run in opposition to my goal of removing discomfort from my life, so long as I remember that I’m not entitled to a life of comfort and ease but instead have to earn it intentionally.
I acknowledge that I’m fortunate not to live with serious pain or suffering. I have a comfortable life and I wouldn’t exchange it for machismo points. I don’t think the point of life is to suffer, but instead my goal is to learn to suffer well when life brings me pain.
It is sometimes amazing how cyclical social and political problems can be. While I am not pessimistic about our ability to move forward in something that can be recognized as “progress,” I do have some cynical attitudes towards our collective habit of backsliding. I realized some time ago that while we espouse enlightened positions, such as “never again,” people as a whole tend to be historically myopic and prone to letting fear get the best of them – or, to quote Agent Kay: “A person is smart. People are dumb, panicky, dangerous animals and you know it.”
As of writing, I’m working my way through Ron Chernow’s biography of Alexander Hamilton. Around ten hours into the audiobook, Chernow discusses the political maneuvering between Hamilton and New York Governor George Clinton over getting the newly drafted Constitution ratified by nine states in order to bring it into force. The two sat on opposite sides of the federal government question: Hamilton believed a strong central federal government was the key to sustaining the American experiment, while Clinton was distrustful of a central government superseding the power of the states. Hamilton had a poor opinion of Clinton, believing him to be concerned only with consolidating his own wealth and power, pandering to the populace only when elections rolled around.
Chernow gives a striking description of what Hamilton feared, and in a single line spells out a looming threat we are seeing anew in our own modern political discourse. Hamilton worried that “American democracy would be spoiled by demagogues who would mouth populist shibboleths to conceal their despotism.”
Chernow penned those words some fifteen years ago. Whether it’s 1788, 2004, or the dawning of the neo-20’s, the fears expressed in those words caution us that we must remain vigilant against those who seek to exploit our fears to manifest their vision in reality.
Recently, a lot of things circulating through social media and my podcast feeds have been enraging me. I try to mitigate these things through a number of strategies – limiting my time on social media, intentionally targeting positive messages, not reading comments, not engaging, reminding myself that it is ok to disagree about things, etc. The hardest things for me to let go are cases where my thought-processes seem to wildly diverge from others about the framing of the same set of facts.
Initially, I wanted this blog post to be my master rebuttal. I wanted to lay out my case for why the conclusions others are drawing from this or that event are wrong and why. I wanted to emphasize what the important, salient points are that we should keep in mind.
But I know in my heart that would be an exercise in futility. A blog post is easy to skip; easy to ignore. I won’t change hearts and minds by arguing against a strawman average of the viewpoints expressed in my network of known-people. It would be antagonistic, hostile, and unproductive towards my goals. In all likelihood, it would backfire and entrench or alienate friends.
Instead, I will offer a different approach that I want to continuously remind myself of. When I feel compelled to dig in my heels for an argument, I should remind myself of the following.
First, remember what Aristotle (via Will Durant) tells us about virtue and excellence. It doesn’t matter what others say or share/post online; we are what we repeatedly do. Excellence, then, is not an act, but a habit.
Second, turning to fiction, remember why The Doctor helps people.
“Winning; is that what you think it’s about? I’m not trying to win. I’m not doing this because I want to beat someone or ‘cause I hate someone, or because I want to blame someone. It’s not because it’s fun. God knows it’s not because it’s easy. It’s not even because it works because it hardly ever does. I do what I do because it’s right! Because it’s decent. And above all, it’s kind. It’s just that. Just kind.”
Series 10, Episode 12: “The Doctor Falls”
Ultimately, it’s not about what I will say, or argue. Arguing with other people doesn’t make me a decent person; picking fights online doesn’t put me on the high road. If I want to bring about change in people I care about, it’s important to remember to be kind. Always be kind and help people, because it’s the right thing to do.
I recognize that being kind doesn’t give a lot of direction and can seem cowardly when meeting the systems that do real harm to the vulnerable and oppressed. In fact, espousing kindness can easily slip into inaction or forced neutrality. It’s hard to be prescriptive in this case at a granular level.
However, if I start from the core values of kindness and action – that what matters is doing things that are kind to others – then I can use those values as a filter for deciding what I will choose to do.
Instead of arguing online, I choose to try and lead by being kind.
While reading a post from A Learning A Day, I thought I’d keep the irony train rolling by linking to Rohan’s linked post from Derek Sivers about the perceived need to quote an idea’s source. Specifically, I wanted to respond to this point:
2. School teaches us to reference. But we’re not trying to impress a teacher anymore. And every unnecessary fact dilutes our point.
I often reflect on the learning objectives I expect to achieve in a course lesson while teaching. I try to parse out the meaningful things I want students to learn from rote procedural tasks that don’t serve a purpose. The last thing I want to do is to reinforce the wrong lesson or derive the wrong conclusion from a student’s performance (e.g. did a student do well on a test because they understood the material, or because they are good at taking tests?).
Derek’s point above about references is well-taken and got me thinking: why do I want students to cite their sources? I brainstormed a few reasons and listed them below with comments.
I want a student to be mindful of their research process (procedure).
Having gone through writing my master’s thesis, it’s easy to lose track of references and citations if you don’t stay on top of it. This isn’t super relevant to most assignment learning objectives, but it’s a good practice to have before launching into a bigger endeavor or capstone project.
I want a student to critically examine their own knowledge (what do they believe to be true facts, where did that fact come from, and why do they think it’s true).
I’m not sure if making students cite their sources achieves this aim on its own, but I suppose I could use citation requirements to help guide them through this process.
I want a student to be mindful of idea ownership and give credit to people who have done work.
I’ve used this mostly in plagiarism cases where students copied work and submitted it as their own. I try to distinguish between sloppy citing and outright theft, and I remind students that they shouldn’t get marks for work they didn’t do. I’m still undecided if this is a rule of the academy or a legitimate thing to prevent fraudulently passing work off as your own in the future. This point, though, is mostly relevant in academic contexts as opposed to Derek’s notes about doing this during conversations.
I want an easy way to see if the student did the work.
This is a trick I’ve developed to see whether a student giving me their opinion is right by chance, or if they have informed their opinion by doing the course reading. The same result could be gained if students inserted relevant information without citations, but the citations help to highlight this when I’m reading through their submission. In other words, it makes my job easier.
I want to reinforce good academic writing habits.
Using references is part of what it means to write academically, and is used as part of the integrity process. This is only a good reason if my objective is to teach/reinforce academic writing for students.
This is the way it has always been done.
More cynically, requiring citations is part of the tradition, and who am I to question it? It’s not a good reason to require them, but it is what it is. I won’t give it its own entry in the list, but a more sadistic version of this is “I had to do it, so you have to (go through this rite of passage) too!”
I want to remain consistent with departmental policies and culture.
Whether written or unstated, most departments adhere to some level of standards. This was less the case for me in undergrad and it depended largely on the preferences of the prof. By the time of my thesis, I ended up developing a hybrid referencing system that did not strictly follow any of the major citation methods. I received no comments from anyone who reviewed my thesis on my citation practices.
It’s important to trace an idea’s lineage as much as possible to spot fabrication.
If you are going to insert facts or conclusions into your work, it’s important to point to where you found them. Without a citation or an adequate accounting of how you know what you purport to know, it’s possible the information is made up. Being able to trace these things helps, though this is more useful from a scholarship point of view; I suspect a lay-reader isn’t concerned with checking a text for factual accuracy and instead takes it on authorial authority.
Related – to see if a student is able to properly reference work, or at the very least charitably restate ideas without dropping important content.
This perhaps falls under sloppy citation practices, but on occasion students will misunderstand a piece of text and paraphrase or summarize it incorrectly. Knowing which source the student is drawing from has pedagogical merit if you take the time to compare the student’s work with the source and discuss the divergence.
Related – when an author cites their sources, a reader can use the bibliography of sources for further reading.
This is perhaps more for book nerds, but I love having references so I can learn more about things that pique my interest. This is, however, not the context Derek is referencing when he discusses giving citations during a normal conversation. If Derek’s conversation partner was interested and wanted to know more, I’m sure they would ask Derek for more information.
More abstractly, knowledge and academia form a web of mutually reinforcing facts, so academic writing is an extension of that reality.
This one is a bit of a stretch as a reason why a student who is not adding to a body of knowledge must rigorously cite sources in a pedagogical exercise, but I include this more epistemological point to try to be exhaustive.
It’s a symbolic representation that the student (in most contexts) is not generating new or novel work/insights that creates new knowledge, but instead is remixing ideas from other sources.
I think this is a good reminder of what the goal of the assignment should be (students are often far too ambitious in what they think they can reasonably achieve in x-number of pages), but I wouldn’t consider this to be an adequate reason to insist on proper citations.
Like other skills, the act of referencing needs to be practiced.
I’m sympathetic to this, but as Derek is implying, you should be practicing skills that transfer into other domains or that you will need. In most instances, outside of school you don’t need to cite sources.
Citing references is part of the argumentation process. In order to build a successful argument, you must clearly express and state your premises, which includes any premises taken from the work of others (either their premises or their conclusions).
I’m also sympathetic to this as I think everyone should keep in mind that arguments need to be made to help convey ideas. It shows the logical chain from premise to conclusion and seeks to make the implicit explicit, and the unstated stated.
Other than a subset of the reasons above, a strict requirement for citations is often unnecessarily enforced in the classroom, and is almost never required outside of the academic setting. I think there are some good pedagogical reasons to have students go through the effort to cite their sources, but you should be intentional when teaching as to when those cases apply. For instance, I am less strict about my students citing sources and instead I look for them to directly apply material from the course in their assignments (instead of giving me their opinions).
I enjoyed Derek’s point about how citing sources is a common trope in pop non-fiction, which sounds like a convergence on my ideas concerning animated bibliographies, or Ryan Holiday’s “15 academic studies” comment from a few weeks back. Maybe Derek’s right – we should have more courage to integrate knowledge into our existing schema and be prepared to state things as facts instead of citing our sources. I’m not sure I’m prepared to abandon the practice wholesale, but it has given me something to chew on.