Science News from August 2019


Just in case you’re wondering, I do have some legitimate excuses for taking so long to write and post this. It’s been a pretty crazy few weeks, thanks to a sixteen-day-long sinus infection (it seemed like a lot longer than that), the demise of my car, and just general life/work things. But I had this blog post almost ready to go two weeks ago, so it’d be silly to just scrap it. So at this point, I’m just posting what I’ve got; I apologize if it’s a little disorganized and inadequately edited.

I’d like to start by pointing out that we are slowly getting closer to living in a universe where Star Trek technology is a reality. Unfortunately, this isn’t about transporter beams. (Although I’ll come back to that in a few paragraphs) It’s about cloaking devices, which isn’t even Earth technology; hence my choice to use a picture of a Klingon space vessel. Although I’m most familiar with the original series, the internet informs me that the use of cloaking technology would violate the Treaty of Algeron, signed by the Federation and the Romulans in the year 2311. However, our 21st-century Earth physicists have been indicating for a few years now that invisibility is plausible. This article, which is more than a year old, describes some of the more promising developments towards the goal of manipulating light waves to make things invisible. The article references Harry Potter instead of Star Trek, though, which is just silly because the invisibility cloak in Harry Potter is clearly magic.

This new article also mentions Harry Potter, although the technology that it describes is even less similar to a magical invisibility cloak. This time, instead of blocking light waves, the researchers are working on blocking water waves. It’s a little less exciting, maybe, but it has practical applications. Obviously, it’s useful for anything that has to do with boats. The article quotes one physicist as joking that their work will make it easier to have coffee on a boat, but I’m not sure that’s so much a “joke” as it is a worthy pursuit. What makes this development especially interesting, though, is that it essentially works the same way that a light-wave-blocking cloaking device would. 

So what about the transporter beams I mentioned earlier? Well, as you probably realize, real Star-Trek-esque teleportation technology doesn’t exist and isn’t likely to reach us anytime soon. I only mention this development in the context of transporter beams because of this somewhat clickbaitish headline and the Star Trek references in the article. In actuality, this story is about communication and computer science. (In other words, it has more relevance to Uhura than to Scotty)

Until now, quantum computers have used qubits, which are the quantum counterpart of the bit in a regular computer. A bit is the smallest unit of computer information storage; it represents a single 0 or 1 in binary code. 8 bits make one byte, which is the amount of computer storage needed for a single letter (or other character), and most computer files take up at least several hundred kilobytes (aka KB), each of which is actually 1024 bytes. This random (but adorable) picture of my cat Melchizedek takes up over 102 KB on my computer; it’s 105,404 bytes, which is 843,232 bits. The point of all this is that a bit or qubit is a very small thing. Again, it stores one digit of binary code. The news story here is that now there’s such a thing as a qutrit, which stores a digit of ternary code instead of binary. While binary code has two possible values for each digit, ternary code has three possible values, and therefore, a qutrit can hold a bit more information than a qubit or a bit.
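For anyone who wants to check my arithmetic, here’s a quick sketch of it in Python. The byte count is the one quoted above for the cat picture; the variable names are just my own, and the last line uses the standard information-theory way of comparing a ternary digit to a binary one:

```python
import math

# File-size arithmetic from the cat-photo example above.
size_bytes = 105_404
size_bits = size_bytes * 8        # 8 bits per byte -> 843232 bits
size_kb = size_bytes / 1024       # 1 KB = 1024 bytes -> just over 102 KB

# Information per digit: a binary digit carries 1 bit, while a ternary
# digit (the classical counterpart of a qutrit) carries log2(3) bits,
# roughly 1.585 bits -- more information, not a physically bigger thing.
bits_per_trit = math.log2(3)

print(size_bits, round(size_kb, 1), round(bits_per_trit, 3))
```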

I’ve just about run out of reasonable excuses for Star Trek references, but I do have a couple other things to share that have extraterrestrial subject matter. This first one stays pretty close to home. Astronomer David Kipping has suggested that we could make a giant telescope by essentially using the Earth’s atmosphere as the lens. We’d just need a spaceship in the right place, equipped with the right devices; the refraction of the light is a natural phenomenon. You can see Kipping’s paper here.

Meanwhile, five of Jupiter’s recently-discovered moons have been named. The names were chosen via an online contest, but the options were somewhat limited, since it has already been established that all of Jupiter’s moons must be named after figures from Greek or Roman mythology who were either lovers of Zeus/Jupiter or descended from him. (While there are some differences between the Greek and Roman myths besides the gods’ names, it’s still fair to consider Zeus and Jupiter to be essentially equivalent from an astronomical standpoint) These five moons each get their name from a Greek goddess who is either a daughter or granddaughter of Zeus. Their names are Pandia, Ersa, Eirene, Philophrosyne, and Eupheme. For a complete (and apparently up-to-date) list of moon names, click here. Just to be clear, the picture I’ve used here is Jupiter itself, not a moon.

Moving on to a different scientific field, I’m happy to report that science has confirmed that chocolate is good for you, especially dark chocolate. This cross-sectional study of over 13,000 participants indicates that eating some dark chocolate every other day could reduce the likelihood of having depressive symptoms by 70 percent. (Google Docs put a squiggly line under the phrase “every other day” and suggests that I meant “every day”. Maybe Google knows something that the rest of us don’t.) In other nutritional news, the flavonoids found in apples and tea evidently lower the risk of cancer and heart disease.  And this pilot study has suggested that pregnant women would do well to drink pomegranate juice or consume other foods high in polyphenols (a category of antioxidants) in order to protect their prenatal children from brain problems caused by IUGR, (intrauterine growth restriction) a very common health issue which often involves inadequate oxygen flow to the developing brain. A larger clinical trial is already underway.

A recent study indicates that there could be a genetic link between language development problems and childhood mental health problems. Educators have noted in the past that kids with developmental language delays are often the same kids with emotional or behavioral problems, but as the article says, it’s always been generally assumed that the reason for this is causal; the idea is that the frustration or confusion that comes from language difficulties causes (or exacerbates) emotional issues. But this study used statistical analysis of participants’ genetic data to evaluate the correlation, and researchers think that both types of problems could be caused by the same genes.

Interestingly enough, another potential cause of childhood mental health problems is strep throat. This article isn’t really new information; clinicians have been studying PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections)  for a couple decades, and there is still debate over what is actually going on in the patients’ brains, whether PANDAS should be considered a form of OCD or a separate disorder, and whether PANDAS is even a real thing. The article linked above describes one specific case and describes some of the questions and research. 

Here’s another one about child development. A researcher in Australia developed a preschool program based entirely upon music and movement. The idea is that these kinds of activities are such effective tools in early childhood education that they can help to close the achievement gap between children from different socioeconomic situations. This is one of those cases where common knowledge is a little ahead of official scientific research. As a children’s librarian, I can verify that virtually anyone who works with kids knows that singing and dancing helps kids learn. There are several reasons for this, ranging from “it activates numerous brain regions at the same time, thereby forming connections that aid both memory and comprehension” to “it burns off excess energy and keeps the kids out of trouble”. For these and other reasons, most preschool programs (and, of course, library storytimes) involve a lot of songs, most of which have their own dances or hand movements. Besides that, school-age kids who take classes in music and/or dance tend to be more academically successful than those who don’t. As far as I know, nobody has ever done the research to officially confirm that, but lots of people in education and the arts are aware of it. (If there’s anyone reading this who has the academic background and the means to do such a study, I’d like to request that you pursue that. Please and thank you.)

The next two studies I want to mention are so closely related in their subject matter and findings that I initially thought that these two articles were about the same study. But they aren’t. Although they both involved using MRI technology to watch what someone’s brain does while they read, this one from scientists in England was looking at how the brain translates markings on paper into meaningful words, while this study from the University of California, Berkeley compared the brain activity in subjects who read and listened to specific stories. The most conclusive finding from that particular study was that, from a neurological perspective, listening to audiobooks or podcasts is essentially equivalent to reading with your eyes, which almost seems to contradict the England study, which obviously showed that visual processing is the first neurological step in reading. That’s not really a contradiction, though, when you stop to think about just how busy the brain is when reading.

Neurologists have always known that reading is a complex cognitive process that doesn’t have its own designated brain region; it requires cooperation between several different cognitive processes, including those that process visual input, those that relate to spoken language, and higher-order thinking. (That is, our conscious thoughts) But these recent studies have given neurologists a clearer map of written language’s path through the brain. Both of these studies also indicate that the exact brain activity varies slightly depending upon what the participant is reading. In fact, in the study from the University of California, researchers could predict what words the person was reading based upon brain activity. It would seem that our brains associate words with each other based both upon how that word is pronounced and what it means. These studies have some very useful potential applications, such as helping people with dyslexia or auditory processing disorders. In the meantime, they provide us with a couple new fun facts. As far as I’m concerned, fun facts are pretty important, too.

Particularly Awesome Books of 2016


I know it’s been ages since I’ve posted on this blog. And even now that I finally am back here clicking that “Publish” button again, it’s just a reblog from my other blog. But this is something I spent a lot of time compiling, and besides, my 2014 list and 2015 list made their first appearances on Kaleidoscope49. It only makes sense to stick the 2016 list here, too. So feel free to ignore this post or to read it thoroughly and then find and read all of the books that pique your interest. And hopefully, I’ll be back with more new posts sometime soon.

Particularly Awesome Books of 2016

This list has been a long time in coming. Not only have I spent an entire year reading a whole lot of children’s literature and keeping a running list of books that I especially liked, but it’s tak…

Source: Best Books of 2016

Socrates vs. Literacy


Reading is among the most important skills that people can learn. It is necessary in school and in most types of jobs, it is a valuable source of information, and it is a wonderful form of entertainment. As a librarian, I consider it part of my job to promote literacy and the enjoyment of reading. But literacy as we know it has not existed for all of human history. I wrote a post a few days ago on my other blog about the history of literacy. In the process of pulling together my facts, I came across a few fun tidbits that didn’t make it into that post. One of the most interesting was a list of reasons that the great philosopher Socrates opposed written language.

Socrates lived in Athens in the fifth century B.C. At that time, the Greek alphabet existed and many Athenians were literate, but reading and writing were not as widespread as they were a couple generations later. Socrates is known for his oral discourse. He would converse with people on the streets, mainly by asking questions. Most of what we know about him comes from Plato, who was his student. Plato wrote extensively, but Socrates himself did not record any of his philosophical ideas.

Socrates’ back-and-forth method of philosophizing leads to one of his concerns with the written word. Written language is more permanent and unchanging than the spoken word. In a conversation, you can ask questions and give answers, you can clarify what someone else meant, you can amend what you have previously said or inform others that you have changed your mind. Of course, you can do any of those things in writing, but the original book or essay or Facebook post still exists in its original, inflexible state. Of course, Socrates didn’t anticipate the internet, where conversation can happen via written word in real time, and where posts can be edited. If he had, maybe that would have somewhat satisfied him on this point.

However, another one of Socrates’ complaints is something that is even more valid in the days of the internet than it was in his own time. Socrates put a lot of emphasis on Truth with a capital T. He believed in the importance of knowledge, which didn’t just mean knowing facts, but having a thorough understanding of Truth. Knowing a lot of factual information was, in Socrates’ eyes, a superficial form of knowledge. But it is that type of knowledge that is most easily transferred via books. By making information readily available to all people, we encourage a look-it-up mentality which, to Socrates, is very inferior to the seeking of wisdom. Although Socrates is known to have had a very egalitarian view, he feared the idea of making information accessible to the “wrong people”. Even worse, the written word isn’t necessarily true, but someone who is just quickly looking up a fact is easy to fool. That’s where the internet is worse than books. Since anyone can post things online, the internet is full of quick facts without context and false information.

Finally, when people learn to rely on written language, it affects their ability to remember oral language. Although the Greeks are remembered for their contribution to written language, they also had a very strong oral tradition. As in many non-literate cultures, they remembered their history and their folklore by telling and retelling the stories, and therefore, individual people had to develop the skill of remembering words with great accuracy. While making that information more quickly accessible, writing things down decreases the need for that kind of memorization.

In my opinion, Socrates’ concerns are valid, but the advantages of literacy outweigh the disadvantages. It’s true that the written word is less flexible than the spoken word, but sometimes, that’s a good thing. It’s true that the ability to read makes it possible for people to gain very superficial knowledge, but superficial knowledge is better than ignorance. It’s true that relying on written language decreases the ability to remember oral culture—and that is perhaps the most compelling argument ever made against written language—but the ability to remember things with absolute accuracy is sometimes more valuable than the ability to remember things without any help.

In our own time and culture, it would be unheard of for someone to claim that literacy is a bad thing. I don’t think I’ve said anything controversial by disagreeing with Socrates on this point. But it is interesting to stop and think about his objections and just how sensible they are. As cultures change over the centuries and millennia, our methods for storing and sharing information have naturally changed, and they will probably continue to change. (I, for one, do not think that the print book is on its way out anytime soon, but perhaps in another two thousand years, it will be.) For any change as major as that, there will be downsides. And as far as Socrates’ concerns go, it’s interesting to note that we wouldn’t be able to remember and discuss his ideas if Plato hadn’t written them down.

There’s This Book I’m Reading, episode 9


While killing time in a bookstore with my sister and brother about three weeks ago, I came across a book with an intriguing title: Star Wars Psychology (edited by Travis Langley, PhD, 2015). Upon taking it off the shelf and looking at it, I found that it is a series of short essays by various Star Wars fans who also happen to have knowledge (and, in most cases, advanced degrees) in psychology or related fields. As a side note, I later looked at the contributor bios in the back and was fascinated by just how cool and nerdy most of those people are. One of them, a Star Wars fan by the name of Jay Scarlet, is even a librarian like me, except cooler because he has a master’s degree in psychology as well. Anyway, as you have probably guessed, I purchased the book.

I haven’t finished reading it, but I probably will finish it this evening or perhaps tomorrow. I recommend it for anyone who has an interest in both Star Wars and psychology. It is slightly less academic than I had initially expected, making it a relatively light read, especially given the brevity of most of the essays. But that’s not necessarily a bad thing. At the very least, the book is an analysis of the motivations of certain Star Wars characters. Just for fun, here are my comments on a few of the chapters that particularly caught my attention.

The second chapter in the book, written by Jenna Busch and Janina Scarlet, PhD, is “So You Want to be a Jedi? Learning the Ways of the Force through Acceptance and Commitment Therapy.” I don’t know a lot about Acceptance and Commitment Therapy, but I was already aware that it focuses largely on the concept of mindfulness. Mindfulness, which is similar to but not synonymous with meditation, has received a lot of positive attention in the media and mental health world. I have mixed feelings about the very concept, because so many people praise it as a cure for mental illness or a way of solving everyday life problems, neither of which is scientifically feasible. However, I am given to understand that research does show that practicing mindfulness is helpful in reducing stress and handling emotions without shutting them down. Contrary to how some people describe it, mindfulness is not a mystical experience or a secret technique. Busch and Scarlet define it as “paying attention to the present moment on purpose, without judgment or distraction,” which is really the same as what the word means in vernacular usage. The writers of this essay assert that mindfulness is a core aspect of Jedi training. It may sound a little funny, but seeing mindfulness framed as a Jedi-related concept helps me to understand it as a beneficial and legitimate concept.

Another psychological idea that this book clarified a little for me is self-actualization, as described by the famous humanistic psychologist Abraham Maslow. It’s a phrase that I’ve heard quite a lot, but I didn’t have a clear sense of what exactly it meant. Apparently, it has to do with realizing your full potential and feeling content with who you are and where you are in life. In this book, the concept was described using the example of Darth Maul, in chapter four, by Travis Langley and Jenna Busch. (Apparently I like her writing, since I’ve mentioned both the parts she wrote) Darth Maul doesn’t get much screen time, really, and his movie is my least favorite of the six, but he is a pretty cool villain. Busch, Langley, and Sam Witwer (who voices Darth Maul in the animated Clone Wars series) describe Darth Maul as being self-absorbed but highly insecure, in contrast to being self-actualized. It’s interesting seeing self-actualization described as an antonym for self-absorption. But it makes sense that extreme insecurity is just as self-centered as over-confidence.

Although I find psychology fascinating in general, I don’t often gravitate towards topics relating to gender psychology; however, the aspects of this book that touch upon those topics interested me very much. (Not to mention the fact that this book took a very balanced approach to gender psychology, which I appreciated.) The chapter on “Grief and Masculinity: Anakin the Man” by Billy San Juan, PsyD, describes the emotional journey that led Anakin to the dark side. While no one who has watched Episodes II and III will be unfamiliar with that journey, it’s fascinating and even somewhat eye-opening to observe the way it parallels some people’s real-world experiences. And a later chapter (“A Distressing Damsel: Leia’s Heroic Journey” by Mara Wood) describes Princess Leia’s character development throughout the original trilogy by drawing from the research and writings of a therapist named Maureen Murdock, whose works I am now interested in reading myself.

There are a number of other particularly interesting parts of this book, such as the short passages on personality traits that come at the end of each of the five parts. But in the interest of relative brevity, I will conclude here. If you want to hear more, read it yourself. (And don’t worry about spoilers; it was written before The Force Awakens came into being.)

2 AM


“2 AM isn’t a place, it’s an emotion,” a night blogger once said on tumblr. And then others replied and pointed out that 2 AM isn’t a place at all, and yet others added, “That’s because it’s an emotion.” I was not that particular night blogger, nor do I know him/her personally, so I cannot say whether the use of the word “place” instead of “time” was a mistake or a philosophical statement or a decision based upon the aesthetic sound of the sentence. But I do know that the statement as it stands is true. 2 AM isn’t a place. It’s an emotion.

It is considered typical to sleep at night and to be awake during the day. 2 AM is not a time for typical people; it is a time for people who have odd schedules, whether out of necessity or because they genuinely like it better that way. 2 AM is a time when there are few sound waves in the air, but a great many metaphorical sound waves over the internet. 2 AM is full of ramblings that are either ridiculous or profound, and sometimes both. At least online, 2 AM belongs to the night bloggers and the overworked students, two groups of people who are (or at least should be) notorious for blending extreme genius and utter nonsense in one pithy remark.

At 2 AM, the internet is the only way to express the thoughts that run through the fatigued and overly creative mind of the night blogger. At 2 AM, the real world doesn’t exist, and the internet is all there is. At 2 AM, a blinking cursor on a computer screen or a page full of densely packed words offers greater possibilities than anything that a night blogger ever gets to see in the daytime. Thoughts don’t count for much if they can’t be formed into letters and words, and they count for nothing at all if they are formed into spoken words that are forever gone as soon as the sound waves fade into the oblivion of the motionless air which fills the place that we call Real Life. But on the internet, a fleeting random thought can be preserved in visible form so that fellow night bloggers or tomorrow’s day bloggers can see it and be duly amused by its absurdity or impressed by its profoundness or confused by its randomness.

I myself am not known for the kind of posts that show up around 2 AM, although I am occasionally responsible for a nonsensical insight that may or may not be worthy of remembering. One of the more recent of these (although it occurred well before 2 AM) is the concept that real life is nothing more than a frame narrative for everything that one reads or writes. This may perhaps be more true of my life than most people’s lives, especially this semester, since I am taking a class that involves reading five to seven YA novels a week, which is rather a lot of fiction reading when you’re a full-time student who also has a job and also feels compelled to find some time and mental effort for other reading and writing in addition to schoolwork. But the fact remains that many people, especially among the demographic that is most likely to be on tumblr in the middle of the night, spend much of their time and conscious thought on fiction, whether in the form of novels or television or other media. And I would argue that many types of nonfiction should also be taken into consideration in this matter, because non-fictional narrative prose often resonates in a reader or viewer’s mind in the same way that fiction does. It seems to me that it is no exaggeration to say that our lives are largely dominated by stories that are not our own.

As any avid reader or writer knows, the frame narrative is never the important or interesting part. The good bits of the story are always saved for the innermost tale. The frame narrative is simple and straightforward and sometimes quite dull. If Real Life is a frame narrative, it sadly does a good job of following this standard. Some people claim that the enjoyment of fiction is a form of escapism, and I think that this is entirely true, but not quite in the way that they mean. An avid reader is not completely ignoring his or her own life. An avid reader is using the fictional lives of others to justify the fact that his or her own life is too empty and simple and straightforward and dull to have much of any significance unless it is simply a framework for which other stories can be metaphors.

But 2 AM is when the frame narrative of reality goes on hiatus. Typical people use this opportunity to sleep. They spend many hours lying perfectly still and resting their minds so that they can wake up in the morning and spend the next day of their real lives doing all of the real-life things that they think make their real lives important. But those of us who are awake at 2 AM, whether because of homework or because we like 2 AM, experience a view of the world that normal people miss. There comes a time of night when reality pauses itself and its place can be taken by fiction or by rambling words of incoherent wisdom typed on a computer screen by a fatigued night blogger who didn’t even necessarily mean it the way it sounded.

2 AM isn’t a place, it’s an emotion, and like other emotions, it is exhausting and incapacitating if it is felt too strongly, too frequently, or for too long a period of time. I myself would prefer to be asleep at 2 AM if my life allowed for that to be an achievable goal. But when I am awake at 2 AM, it occurs to me that people don’t know what they’re talking about when they spout cliches about living life to the fullest. Living life to the fullest doesn’t mean going out and doing crazy, exciting things. If that’s the way you’re looking at it, you’re forcing yourself to choose between craziness and normality. Living life to the fullest means taking advantage of the wondrous opportunities offered by books and the internet to experience excitement even while your own real life is filled with the mundaneness of not being the sort of person who goes out and does crazy, exciting things.

2 AM is where you can have it both ways. 2 AM is where it’s crazy and exciting just to be conscious and to have the wonderful ability to preserve your conscious thoughts in written form or to experience other people’s written thoughts without being interrupted by reality. 2 AM is where the frame narrative meets the cooler inner story because there isn’t any need to keep the two completely separate. 2 AM is where things don’t need to make sense because sense isn’t the most important thing around here.

Come to think of it, maybe 2 AM is a place after all.

There’s this Book I’m Reading, episode 7


A number of years back, I read something in the newspaper that has stuck with me ever since then. Unfortunately, I don’t remember when or exactly where I saw it, so I can’t properly cite it. I don’t even remember whether it was a review or an opinion piece or a column, but it was about The Da Vinci Code by Dan Brown. It began by quoting the opening sentence of George Orwell’s 1984: “It was a bright cold day in April, and the clocks were striking thirteen.” Although there’s nothing unrealistic or fantastical about the idea of a clock that doesn’t start over at twelve, it’s just jarring enough to alert the reader to the fact that the story isn’t set in the real world. George Orwell is introducing a fictional reality. It isn’t characterized by magic and mythical creatures, but they count hours slightly differently than we do, so we are aware from the outset that there is a disconnect between the book and the real world. As the book progresses and the reader gradually learns about the historical events that were made up for the book, the reader remains conscious that those facts are part of the story. The Da Vinci Code, the newspaper writer said, lacks this subtle acknowledgement that it’s not based in reality. And unfortunately, the result is that many people believe the alternative historical facts that were made up for the book. My teenage self found this point to be very profound even though I hadn’t read either 1984 or The Da Vinci Code at that point. But I remembered that statement last winter when I did read 1984 and again this autumn when I did read The Da Vinci Code. And I still think that it’s a significant point.

It seems to me that Orwell’s 1984 is actually considerably less fanciful than The Da Vinci Code. The historical facts that Orwell fabricated were, from his perspective, in the near future. The historical facts that Brown invented are in the distant past. As I indicated in this blog post from last year, I think that Orwell’s imagined version of the 1980s was a fairly realistic possibility of the direction that the late twentieth century could have taken. When it comes to Dan Brown’s fabricated history, it doesn’t matter whether or not his facts are realistic because he wasn’t guessing about the future; he was writing about his own version of historical events that have already happened.

I really enjoyed reading The Da Vinci Code.  Both as a recreational reader and as someone with a degree in English, I thought it was interesting and well-written. The plot is exciting and engaging, the characters are believable and likable, and there are a number of interesting themes and motifs. An additional appealing factor is that it involves historical details from a variety of time periods, which gives it the tone of a time-travel story even though most of the novel takes place in a single night. It’s highly intellectual for a sensational bestseller, and it’s very fast-paced and eventful for a novel that is essentially about historical research. Not only is it a gripping page-turner, but it raises the kinds of intellectual questions that leave you thinking long after you finish the book. What is the relationship between a symbol and a symbolized idea? If a symbol needs to be decoded, does that make it more meaningful or less meaningful? When it comes to ancient artifacts, is it more important to preserve something or to bring public awareness to it? What is it about human nature that makes us believe that secrets are meant to be discovered and revealed, and is that an impulse that should be followed? From an academic perspective, is it more important to debunk mistaken beliefs or to allow the continuation of a historically rich religious tradition?

But this book has raised other issues that aren’t about the questions and experiences of the characters, but rather about the relationship between the author and the reader. Dan Brown’s goal was to create a fascinating story with religious themes, and in that, he succeeded. It seems to me that he was also deliberately expressing his distaste for religion in general and Christian beliefs in particular. He succeeded in that, too. In the process of writing a novel that has sensational appeal, raises the intellectual and academic questions that he wants to bring to readers’ minds, and expresses his negative views of Christian theology, he alters historical facts. This is something that fiction writers do all the time to make a point or to tell a good story. Historical fiction often tends to be wildly inaccurate because the writer is not only trying to bring history to life, but also to tell a story that is interesting and original. Since readers know that fiction is, by definition, made up by the author, it isn’t necessarily immoral for an author to alter historical facts in order to tell the story that he or she wants to tell. But does the author have a responsibility to make sure that the readers know which facts are made up? Is it immoral for the author to put fabricated details into the mouths of academically respected characters? Is it wrong for the author to write about altered versions of actual people and organizations?

In general, I’d have to say that the answer to the above questions is no. An author who writes a fictional story shouldn’t have to be responsible for ensuring that readers don’t accidentally believe that the story is true. If it’s okay for Margaret Mitchell to tell us that there was a woman named Scarlett O’Hara who lived in Georgia during the Civil War, if it’s okay for people like Thomas Malory and T.H. White to tell us stories about a medieval king named Arthur, if it’s okay for the BBC to tell us that there’s a man from the planet Gallifrey who travels through space and time in a blue box, then it should be okay for Dan Brown to tell us that there was a man in first-century Judea named Jesus who was married to a woman named Mary Magdalene and that religious leaders have since then gone out of their way to keep this union a secret.

But there are a couple of things that make the situation regarding The Da Vinci Code a little different. One is that Jesus wasn’t just any historical figure; he is the basis of a large religious tradition that Dan Brown is undermining and discrediting when he makes up stories about Jesus that he hopes readers will believe to some extent. I gather that Dan Brown himself is not a Christian and that he doesn’t believe in the divinity of Jesus, so from his own perspective, it’s no more heretical and immoral to fictionalize Jesus than it is to fictionalize any other historical person. But even if we are to take this issue from Dan Brown’s point of view and leave Jesus’ divinity out of the debate, it seems to me that it’s still awfully irreverent and insulting to write a story that knowingly and deliberately contradicts other people’s faith while presenting the fabricated details in a way that attempts to persuade readers of their legitimacy. I know that Dan Brown himself has said that The Da Vinci Code is just a story, but he makes all of his characters academic experts and cites imaginary sources that sound real.

That brings me to the other problem, which is that Dan Brown tries too hard to hide his imaginative hand in his version of Jesus. Sure, any intelligent and discerning reader knows not to believe everything that he or she reads in a novel, even if it involves a real person, but people are going to have a harder time distinguishing fabricated facts from actually true background information when both kinds of details come from the mouths of characters who are described as leading experts in their fields. It’s not unreasonable for readers to subconsciously assume that, when a fictional expert states a historical fact, the author has done research and verified the truth of that fact. When Leigh Teabing, a fictional scholar of ancient documents concerning Jesus, claims that there are 80 extracanonical gospels and implies that they are consistent with each other, it’s only natural that many readers will take it for granted that this is true, when in fact Dan Brown exaggerated the number to make his point seem sensible, and the extracanonical gospels are not at all in unity with each other. When Robert Langdon, a fictional authority in the field of symbols, interprets almost everything as a symbol of the sacred feminine, it’s only natural that many readers will take it for granted that it’s true that a surprisingly large amount of famous artwork and literature contains hidden allusions to Mary Magdalene and/or pagan goddesses and/or a vaguely theistic concept of femininity itself. When a prose passage that is evidently Robert Langdon’s train of thought says that the word “Jehovah” is a blend of the Hebrew word for the sacred feminine and the name of the Hebrew male God “YHWH”, it’s only natural that many readers will think that’s true and totally forget that “Jehovah” is a Latinized spelling of “YHWH”, not a combination of another name with “YHWH”. (That one struck me as being particularly absurd.
The worship of YHWH did not involve the belief in the existence of a corresponding goddess or an abstract divinely female entity, and if there was such a female divine being, her name wouldn’t have begun with the letter J or a Hebrew equivalent of it, because there was no Hebrew equivalent of the letter J.)

It would be an interesting project to go through the book and meticulously fact-check each piece of information that is presented as a nonfictional fact. It wouldn’t surprise me if some authors have actually done so, since I know that The Da Vinci Code has sparked a phenomenal amount of discussion. I do know of one book that dedicates a fair amount of time and space to explaining where Dan Brown got his ideas. (The book is Fabricating Jesus: How Modern Scholars Distort the Gospels by Craig A. Evans, and I would recommend it. Although it isn’t absolutely unbiased, it’s less biased than most for-the-general-public books on the topic. Evans isn’t deliberately making up sensational theories for the sake of making a name for himself, which is what some of the “modern scholars” whom he mentions have done.) Interestingly enough, not all of Dan Brown’s made-up facts are actually original. It seems that he did do at least a little research on some of the most extreme ideas that historians have suggested about Jesus and the early church. Evans lists some of the fabricated facts that Dan Brown uses and points out the lack of credibility of their sources. In theory, as a fiction writer, Dan Brown has the prerogative to pick a few radical and bizarre theories and create a story in which they are true. But this makes his misinformation particularly insidious, because it gives him the ability to frame his imaginary facts in a scholarly context.

Admittedly, there’s a very fine line between making up facts to tell an interesting story and making up facts that fool readers, and it has more to do with the readers’ perception than the author’s intention. But I do think that Dan Brown went too far. I lost count of how many times, while I was reading The Da Vinci Code, I suddenly realized that it was responsible for the spread of a blatantly untrue “fact” that I’ve heard people say time and time again. (For instance, it’s absolutely false that the gnostic gospels were favorable towards women and were hidden by the Roman Catholic church because the popes were sexist. The four gnostic gospels that I have read are far, far more sexist than anything in the Bible.) People don’t even need to have read The Da Vinci Code themselves to have heard and repeated these falsehoods. Then these things get passed around as fun facts or as ammunition against Christianity, and few people are going to feel the need to look them up and see if they’re accurate.

The problem here isn’t the fact that someone wrote a book that isn’t historically accurate. The problem is that our culture enjoys debunking Christianity so much that there are people who are willing to believe anything they read in a fictional book that backs up their arguments, even though they are otherwise intelligent people who wouldn’t take that approach to any other topic. Although I am suspicious of Dan Brown’s motives in writing such a book, I certainly wouldn’t argue for censoring his work because of the factual liberties that he takes. But I do think that the general reader population ought to keep in mind that Dan Brown is not a theological expert or a historical expert, that his religious-themed writings are fictional, and that his version of Jesus is not the Jesus who actually existed and who is the foundation of Christianity.

There’s This Book I’m Reading, Episode 5

1 Comment

I read a lot of stuff. Much of it is for school, but when I can find the time, I like to read just for the fun of it, too, and I have always found that pleasure reading is just as intellectual and conveys just as much knowledge as school reading. For example, here is something I have learned through extensive pleasure reading: Douglas Adams was really clever. He was both a skilled writer and an all-around genius who either had extremely varied fields of knowledge or was very talented at using knowledge he didn’t even have. Either way, reading a book by Douglas Adams is both an enjoyable and an intellectual experience.

My familiarity with Douglas Adams’ writing is primarily limited to the Hitchhiker’s Guide to the Galaxy series. I actually hadn’t read those books until about the time the movie came out, which Google informs me was in 2005. That means that I was fourteen (well, thirteen and a half; it was in the spring), and I’m almost a little embarrassed to admit that I hadn’t already read the books by then. I knew that my father liked them, and I seem to recall that he had recommended them to me on more than one occasion, yet I somehow didn’t read them until there was a movie ready to be watched shortly thereafter. Given the fact that I have always considered myself to be a greater book-lover than movie-lover, I cannot justify the movie-centric priorities that I displayed as a thirteen-and-a-half-year-old. But this is unimportant, because the point is that I did in fact read the books and I loved them and have since read them many times and continued to love them every time.

This blog post isn’t about The Hitchhiker’s Guide to the Galaxy. It’s about another book by Douglas Adams called Dirk Gently’s Holistic Detective Agency. Sadly, it is the only Douglas Adams book I have read apart from the Hitchhiker’s Guide to the Galaxy series, but it has greatly reinforced and increased my high opinion of Douglas Adams and has reminded me that I must find and read more Douglas Adams books, particularly The Long Dark Tea-Time of the Soul. Dirk Gently’s Holistic Detective Agency was written in 1987, three years after So Long, and Thanks for All the Fish (the fourth Hitchhiker’s Guide to the Galaxy book) and five years before Mostly Harmless (the fifth and last Hitchhiker’s Guide to the Galaxy book). In many ways, most notably the writing style, it is very similar to the Hitchhiker’s Guide to the Galaxy series, but it is certainly a book worth enjoying, admiring, and discussing in its own right.

Douglas Adams, like great British writers before him (this is an allusion to Shakespeare, by the way), is remarkable for his skill in characterization. Not only are the characters memorable and interesting, but Douglas Adams is very good at realistically articulating the thoughts of apparently normal characters in ridiculous situations, ridiculous characters in apparently normal situations, and any kind of character in any kind of situation between the two extremes. If I were writing an unreasonably long paper arguing that Douglas Adams’ characterization is just as brilliant as Shakespeare’s (oh, why did I not think of that several months ago? That would have been such an awesome English senior seminar paper!), I could take several pages giving textual examples. But I am not writing a paper here; I don’t have a minimum length, but I do have a limited amount of time to dedicate to this blog post, so I will instead stick to a couple of characters in the book I am specifically discussing.

Apparently, there's a movie. I have not seen the movie, but I would like to do so at some point.


Richard and Susan are both pretty normal people. They are talented and notably intelligent (Richard works with computers and Susan is a cellist), but they act and think more or less like any other Earth human who has never encountered extra-terrestrial technology or been faced with paradoxes of the space-time continuum. Richard is absent-minded and obsessed with his job; Susan is his girlfriend who wishes he would step away from the computer screen a bit more often. Richard is somewhat in trouble with his boss because he’s behind schedule on certain tasks; Susan is his boss’s sister, who is annoyed that her brother leaves long rambling messages on her answering machine telling her to pressure Richard into getting his work done. But somewhere along the line, they get involved in a bizarre course of events that involves a murder and police investigation, a ghost, and inexplicable anomalies in the fabric of space and time, which Richard cannot solve with his computer simulations.

Then there’s Reg, an eccentric and absent-minded professor who reminds me very much of a certain professor I have had, except that Reg is even odder and his conversation is even more convoluted. Like the aforementioned professor, Reg is inherently likable, even though the reader can tell right away that there’s something extremely strange about him. If nothing else, it’s weird that he’s a professor and nobody knows exactly what his field is. The fact that his position is called “the Regius Professorship of Chronology” is a hint, but not a very specific one. Reg’s extreme absent-mindedness, which first appears to be a trait that Douglas Adams uses just for the sake of characterizing Reg according to a stereotype and adding an extra touch of humor, turns out to be part of the plot. That’s another thing about Douglas Adams; many of the most random and silly side-notes of the beginning of the story later turn out to be significant and incredibly brilliant plot twists.

And there’s Dirk Gently himself, a character who cannot be described in any way other than to quote directly from the book itself. When Reg casually mentions Dirk, formally known as Svlad Cjelli, Richard “wondered what had lately become of his former… was friend the word? He seemed more like a succession of extraordinary events than a person. The idea of him actually having friends as such seemed not so much unlikely, more a sort of mismatching of concepts, like the idea of the Suez crisis popping out for a bun.” Richard and Svlad had known each other as undergraduate students, during which time Svlad had spread the rumor that he was psychic by denying it far more vehemently than necessary and then failing to disprove it. This, as Douglas Adams emphasizes, is the best way to make up a convincing story. Now, Dirk Gently is a terribly unsuccessful private detective who believes in the interrelatedness of all things so strongly that he deems it necessary to go sit on a beach in Bermuda while working on a case concerning a missing cat. Dirk Gently is the kind of character who can spout off fascinating theories regarding Schrodinger’s cat that almost make sense in one chapter, admit that he was just saying that to be ridiculous in another chapter, and later yet, say profound and quotable things like, “It is a rare mind indeed that can render the hitherto nonexistent blindingly obvious. The cry ‘I could have thought of that’ is a very popular and misleading one, for the fact is that they didn’t, and a very significant and revealing fact it is too. This, if I am not mistaken, is the staircase we seek. Shall we ascend?” I left that last bit in there because I like it and intend to use it in regular conversation whenever possible.

There are many other brilliant things about the book that I don’t have time to describe in any detail, such as the Electric Monk and Richard’s sofa that’s stuck in an impossible place on the stairs. One of the best things about Douglas Adams’ stories is those random details that seem so simple and/or humorous, but required an extreme degree of intelligence and creativity to write. And there are many other wonderfully quotable lines from the book that I don’t have time to quote. Another one of the best things about Douglas Adams’ stories is that they are rife with clever and quotable lines. But I think that the thing I like the absolute most about Douglas Adams is that his writing style is so memorable and even inspiring. Every now and then, I read over something I’ve written and notice a phrase or sentence that sounds a little like Douglas Adams, or even a group of sentences that express a very Douglas-Adams-esque idea. When Douglas Adams’ influence manifests itself in my own writing, those are the times that I am most satisfied with my writing, because he has set the standard to which I aspire. Maybe that’s a little funny, because in some cases (obviously not the one quoted above), his wording and phrasing are so simple and vernacular and his ideas seem so natural. One reads Douglas Adams and thinks to oneself, “I could have thought of that!” But the fact is that one didn’t, and a very significant and revealing fact it is, too. This, if I am not mistaken, is the staircase we seek. Shall we ascend?

There’s This Book I’m Reading, Episode 3


I just started rereading a book that I’ve had for a number of years and have already read quite a few times. My parents gave it to me for my birthday one year, but I don’t remember how old I was. It was a pretty long time ago; the main character turns fifteen in the middle of the book, and I know I was younger than her the first time I read it. Right now, I don’t really have a lot of time for reading, so it’s probably going to take me the rest of the semester to finish the book. (That’s not how I like to read. Back when I was a young child and had practically as much time as I wanted to read, I sometimes would read novels that were a couple hundred pages long all in one sitting. Ah, the good old days!)

Sophie’s World, by Jostein Gaarder, was published in Norway in 1991 and was apparently a major bestseller there. The English version was first published in 1994. The book is about a Norwegian girl named Sophie Amundsen who starts getting random anonymous letters about the history of philosophy. A significant portion of the book consists of these letters and the subsequent conversations between Sophie and her philosophy teacher once he is no longer anonymous. The book is very cleverly written; it’s a novel with an interesting plot, but it’s also so factual that it could practically be used as a textbook.

SPOILER ALERT: Throughout the course of the book, Sophie and her no-longer-anonymous philosophy teacher Alberto learn that they are merely characters in a story being written by a man named Albert Knag as a birthday gift for his daughter Hilde Moller-Knag, who is exactly the same age as Sophie. This, of course, wreaks havoc with their perceptions and opinions about identity and reality. Meanwhile, Hilde herself becomes a main character in the novel, and in one passage that I especially like, the author breaks the fourth wall by having Sophie and Alberto discuss the possibility that even Hilde and her father are merely characters in another story. And this forces the reader (aka me) to wonder whether or not they (aka I) are/am only a character in yet another story.

This isn’t my planet; this is Saturn. But I don’t have a good picture of my planet, so this will have to do.

This happens to be a fun thing to wonder. It seems to be somewhat related to some of the strange beliefs that I had when I was little. For example, when I was angry at my family, I would tell myself that I was probably actually from a different planet and had been brought to Earth as an infant and placed into a randomly selected Earthling family who would spend my entire childhood conducting some sort of interstellar sociological experiment on me by being mean to me and seeing how long it would take me to discover that I wasn’t one of them. This was, of course, the most logical way to explain any incident in which I felt I was being unfairly treated.  Alternatively, sometimes I thought that I was the normal one and it was everyone else in my family who was an alien. Either way, every detail of my life was clearly part of a sinister conspiracy.

When I wasn’t angry and was in a more rational mindset (depending upon how you define a rational mindset), I played with the idea that I had a visible thought bubble constantly floating above my head and everyone else could therefore read all my thoughts. This thought bubble obviously couldn’t be seen in mirrors or photographs, or else I would have been able to see it myself, and I never did. Furthermore, it was obvious that the entire world was in a conspiracy to keep this knowledge from me. Probably, my family hoped that this secrecy would allow me to live a normal life. My parents must have had to monitor every situation in which I might meet someone new so that they could explain to them that they were not to mention my abnormal thought bubble. I eventually discarded this entire theory on the grounds that it was scientifically impossible for me to have a visible thought bubble over my head because, for something to be visible, it must have light bouncing off of it, and in that case, there would be no way of explaining why the thought bubble doesn’t show up in the mirror. Still, to this day, I avoid thinking thoughts that I wouldn’t want people around me to read. This is very useful when it keeps me from getting distracted in classes.

When I wasn’t busy making up crazy conspiracy theories, I pondered the fact that I have never been able to conclusively prove that I exist at all. You can tell that this has been an ongoing issue in my life ‘cause I wrote about it here.  If I, like Sophie Amundsen, am only a character in a book, then that would be a reasonable explanation for how I can simultaneously be conscious and unreal. If this is all true, many things make a lot of sense now. I think I need to find the author and have a little talk about what’s going to happen in the editing process.

There’s This Book I’m Reading, Episode 2

Leave a comment

It’s a wonder that I was allowed to watch movies when I was little, because I would usually bombard my parents with questions for days afterwards. I would want to know why Luke Skywalker had to go back to the Death Star and fight with Darth Vader, whose voice it really was that Ray Kinsella heard in the cornfields, why Harry Beaton wanted to run away, and why everyone was so happy when Truman escaped from the Truman Show, even though they had loved watching that show so much. (These questions are in reference to Star Wars VI, Field of Dreams, Brigadoon, and The Truman Show, respectively.) Then, when I ran out of questions to ask about the plot, I’d want to know what the point of the movie was. I just assumed that any movie other than the simplest and most banal cartoon was making some specific and philosophical point. My little-kid self wouldn’t have had much of an appreciation for sappy chick flicks. Actually, my non-little-kid self doesn’t care much for most chick flicks either, although I have noticed that non-intellectual genres aren’t necessarily devoid of interesting and intelligent ideas. That’s even more true in the case of books than of movies.

Margaret Mitchell

Although it’s considered a great classic, Gone with the Wind isn’t exactly the most intellectually deep book. In my opinion, it’s actually quite a light read, even though it’s just as long as War and Peace, which is known for not being a light read. I’m not saying I don’t like Gone with the Wind; in fact, it’s actually one of my favorite books, and I read it about once a year. (In case this isn’t obvious by implication, I’m reading it right now) I wouldn’t even say that there’s nothing thought-provoking about it, but most of the interesting ideas it discusses are spelled out in specific detail. As far as I’m aware, there are no subtle meanings in minor plot points, no hidden metaphors in the descriptive sections and the imagery, and no room for analyzing the characters’ personalities or motives, because everything is explained specifically in the text. One doesn’t even need to wonder what the point of the book is, because Margaret Mitchell tells readers: It’s about what she calls gumption.  In my copy of the book, there’s a blurb with a quote from the author that says, “If the novel has a theme, it is that of survival. What makes some people come through catastrophes and others, apparently just as able, strong and brave go under? It happens in every upheaval. Some people survive; others don’t. What qualities are in those who fight their way through triumphantly that are lacking in those that go under? I only know that survivors used to call that quality ‘gumption’. So I wrote about people who had gumption and people who didn’t.”

Is it merely a coincidence that the author of the book and the actress who played the main character in the movie look so much like each other?

It seems to me that in that quote, Margaret Mitchell was being unnecessarily simple and concise. Her book is about a little bit more than people who have gumption and people who don’t. I think that Gone with the Wind is about the differences between people’s personalities in a more general sense. I once read a non-fiction book that used the four main characters in Gone with the Wind (Scarlett O’Hara, Rhett Butler, Ashley Wilkes, and Melanie Hamilton Wilkes) as examples of four distinct personality types. I’m not in favor of trying to sort people into a small number of personality types, but for the sake of that book’s argument, Gone with the Wind was an ideal example. Each of the main characters’ personalities is in contrast with all of the others’.

Gumption isn’t the only personality trait that Margaret Mitchell uses to differentiate the personalities of the different characters. The other main one is analytical thought. It’s pretty obvious because there are quite a few instances throughout the book in which Mitchell explains a character’s response to something by introducing it with the phrase, ‘Never analytical…’ Scarlett is frequently described as being ‘never analytical’. She takes everything at face value and acts impulsively. She shares this trait with her father and many of the residents of the plantations in the early chapters, but most of the other main characters- Ashley, Melanie, Ellen O’Hara, Mammy, Dilcey, Will- are very analytical. Rhett Butler kind of falls into either category, depending upon the situation. Actually, I suppose that the same could be said for Scarlett, because she’s certainly capable of being analytical when nobody else is there to think things through for her. I think it’s worth noting that, in terms of gumption and of analytical-ness, Ashley Wilkes’ personality is almost completely opposite Scarlett’s, while Rhett Butler’s is almost identical to hers.

I say almost because there’s one significant way in which Scarlett is very much like Ashley and very much unlike Rhett, at least in the first few chapters. She changes her mind about it throughout the book and has several meaningful conversations about it with numerous other characters, which I take as an indication that it’s another very central point of Gone with the Wind. It is the question of whether or not it’s important to adhere to social norms. Scarlett resents many things about the culture in which she lives and the restrictions that it places on her, but she is deeply rooted in the mindset behind them, and so she is reluctant to openly defy them. The combination of necessity and Rhett Butler’s influence persuades her, time and again, to go back on the principles instilled in her, to the point that she becomes alienated from her own culture, rather than being exemplary of it, as she appears to be in the first couple chapters. I said earlier that the book doesn’t leave many questions unanswered, but one that it does leave unanswered is which point of view is right. There are several instances where Scarlett asks someone, usually Rhett or Ashley, if she has done the right thing by rejecting societal values for survivalist ones, and they always give ambiguous answers, even though their own views are quite obvious. From the little that I know of Margaret Mitchell, I think she wasn’t entirely clear on what she thought of that question.

True love, according to the movies

One claim that I am not going to make about the point of Gone with the Wind is that it is a love story. I know that both the movie and the book (which are incidentally more similar than movies and books usually are) have been classified as quintessential love stories, but I think that’s silly. If one reads Gone with the Wind as a love story, it is a pretty bad one, because almost all of the characters are absurdly selfish. Scarlett and Rhett especially are, and they are held up as a prime example of the ideal literary romance. I could go more into detail about the selfishness of all of the main characters and most of the minor ones (with the exceptions of a couple of the slaves, Scarlett’s mother, and Melanie), but that’s not really my point. My point is that Gone with the Wind, just like pop culture in general, throws the word ‘love’ around very loosely and doesn’t really mean much of anything by it. Most of the relationships in the book, romantic or otherwise, are characterized more by selfishness or unbreakable social connections than by anything that ought to be called love.

The purpose of this picture here is to add color. That’s all.

But although the people in the book don’t love each other, one other prevalent theme in Gone with the Wind is love of the land. In fact, I would argue that it is maybe even more central than the themes of personality differences and societal norms. The plantation Tara and the city Atlanta are described in such detail and are so important to Scarlett that it’s impossible to treat that point as being insignificant, and many of the major events in the plot are related to Scarlett’s love for one or the other place. Besides that, in the section of the book that occurs during the Civil War, there are frequent factual interludes that describe military maneuvers in great detail. Even though it was obviously something that the characters were aware of and concerned about, it seems a little out of place to have those kinds of details scattered throughout a story that is essentially a literary version of the ultimate chick flick. I know that Gone with the Wind is a war story and that Margaret Mitchell wanted to show the horrors of war, but she does that much better in the hospital scenes and the descriptions of the blighted countryside. The stories of the Yankees travelling through the South don’t add much to that, unless the real point is land and ownership of land. And I can think of quite a few quotes from the book (including some from the very beginning and the very end) that would back up that argument.

Thus ends my rambling and hastily written list of opinions about Gone with the Wind. And it somehow ended up being over 1500 words. I’m not quite sure how that happened.

I Was Clever When I Was A Little Kid



Childhood memories

Learning how to read is very difficult. I say that from personal experience, because I remember very well the confusion and frustration of the beginning of my reading career. But I was pretty determined about it; my little four-year-old self knew that the ability to read would give me power and skills beyond my wildest dreams and would immediately catapult me into the world of big-kid-ness. As it turned out, I was more or less right about that, but the actual process of learning how to read was so challenging and took so long that even now, I’m kind of proud of my younger self for accomplishing it.

The ability to read requires certain advanced cognitive abilities because it involves translating marks on paper to verbal sounds to complete ideas. For someone who has just started learning how to read, every single letter is a test of memorization skills. To read an entire word is already an accomplishment that demonstrates good retention and intelligence. It would be difficult enough even if a person could take a few seconds to think about each letter, but that just isn’t the way it’s done. An average person reads about 200 to 250 words per minute, which is 3 1/3 to 4 1/6 words per second. We’re all accustomed to doing that by the time we’ve known how to read for a few years, but if you think about it, that’s pretty amazingly fast. And that’s just average. Apparently, it’s not extremely rare to be able to read as many as 700 words per minute with decent comprehension. That’s 11 2/3 words per second, which hardly seems like it should even be humanly possible. In most cases, we all learn how to do that when we’re still small children, and we don’t improve much even later in the educational process. (Note: I got those numbers by looking at various internet pages, some of which were more reliable than others. Pretty much everywhere agrees about the average reading speed, but the maximum seems to be a matter of contention, probably because there are so many internet speed-reading courses that want people to believe that they’ll be able to learn to read faster than is really possible. 700 is definitely possible, but I wasn’t sure if I could trust the source that said 1000, and I highly doubt the sources that gave even bigger numbers.)
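(For anyone who wants to double-check my fractions, here’s the back-of-the-envelope conversion as a tiny Python sketch. The speeds are the ones quoted above; the function name is just something I made up for illustration.)

```python
# Convert reading speeds from words per minute to words per second
# by dividing by the 60 seconds in a minute.

def wpm_to_wps(wpm):
    """Words per minute -> words per second."""
    return wpm / 60

for label, wpm in [("average (low)", 200), ("average (high)", 250), ("fast reader", 700)]:
    print(f"{label}: {wpm} wpm is about {wpm_to_wps(wpm):.2f} words per second")
```

Sure enough, 200/60 is 3 1/3, 250/60 is 4 1/6, and 700/60 is 11 2/3.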

Of course, the main reason that people need to learn how to read when they’re small children, even though it’s very intellectually challenging, is that we need to know how to read in order to learn other things. Although it’s commonly understood that the most effective way to learn is through a variety of methods, including verbal instruction and the method commonly known as ‘just do it’, academia relies heavily upon reading because written text is capable of cramming lots of information into a small space, allowing you to quickly and efficiently stuff as much of it into your brain as your brain can possibly hold. Or, to put it more concisely, reading goes faster than a teacher’s voice. (And way faster than personal experience)

Pictured Above: A clever little kid
(This picture is about three years old. She’s not that little anymore.)

I think there’s another reason that it’s best for people to learn to read when they’re still quite young. Little kids are very, very clever. People tend to think of children as being incapable of much intelligent thought, but that’s just because it takes time for someone to accumulate factual knowledge, to figure out how to express their thought process, and to gain enough experience to acquire specific skills. The most significant kind of intelligence, though, is the ability to learn, and little kids are undoubtedly experts at that. Children have brains like sponges. If you don’t believe me, find a random kid and quickly teach him or her a song. Then run away with your hands over your ears, because that child will probably sing that song over and over and over again, leaving you wondering in annoyance how someone could possibly memorize something that thoroughly in such a short time. (If you can’t find a little kid to sing to, or if you don’t feel like it, you can take my word for it, because I have a bunch of younger siblings who were little kids not so long ago, and I am speaking from direct observational experience when I say that little kids pick up songs the way ceiling fans pick up dust.)

See how smart I am now?
In my defense, I was trying to do it very fast.

As hard as it was for me to learn to read when I was four and five years old, I expect it would have been even harder if I had had to learn when I was older. I certainly don’t think I could do it now. I more or less take it for granted now that I know how to read, but if I stop and think about it, it’s a really amazing skill that seems like it should require exceptional ingenuity to learn, and I am sadly lacking in ingenuity, exceptional or otherwise. If I’m technically more intelligent than a little kid, it’s only because everything I’ve learned in the past fifteen or sixteen years has just been built on the foundations of things that my genius little kid brain learned back when I was a genius little kid.