Monday, June 17, 2019

Marvel or DC? A Twitter Feud

A very dear old friend of mine recently watched Aquaman, didn’t like it, and subsequently tweeted a derogatory comment about the overall quality of DC movies. As a long-time DC fan and defender, I couldn’t let this slide—I replied with my skepticism of this oft-repeated narrative that DC isn’t good at movies.

My friend then responded with an impressive 12-tweet thread making the case for Marvel based on Rotten Tomatoes’ critics’ scores, complete with math, gifs, and snark.

This cannot stand.

But my rebuttal is far too long for a Twitter thread, and I just don’t have the patience. So Jaclyn, forgive me for moving this argument to my home court, but you deserve the best I can bring.

Innocent bystanders: I recommend you read the original Twitter conversation before continuing.

Jaclyn, I will begin by arguing that your premise (that Rotten Tomatoes’ critics’ consensus is the best metric for deciding what makes a movie “good”) is flawed.

I will then show that even using the very parameters you established, you’re still wrong.

First, why assume that the critics’ consensus on Rotten Tomatoes is the best way to measure good versus sucky? Why not use the audience scores? After all, it’s audience members, not critics, who buy tickets, generate studio revenue, and ultimately influence which movies get made and which don’t. If we average the audience scores rather than the critics’ consensus for the movies you counted, we get 79.8% for DC and 82.3% for Marvel—a huge narrowing of the gap. If they were in college, they’d both have a B-.

Did I personally consider Suicide Squad a good movie? Not even a little. But I won’t commit the logical fallacy of the appeal to incredulity: just because I didn’t like it and can’t really imagine why anyone did, that doesn’t mean that audience reactions (which translate to ticket sales, revenue, and sequels, remember) are invalid. Marvel’s Venom shows a similar result, a terrible critical consensus but high audience scores. (You left that one out of your analysis, btw.)

On the other hand, I went into Aquaman expecting not Citizen Kane-level filmmaking, but something big, dumb, loud, and fun, and that’s exactly what I got. I enjoyed it.

You could point out that racist and sexist reviewers sabotaging the audience scores for Black Panther and Captain Marvel tanked them unfairly, and you’d be right. However, if we leave them out of our audience score average, Marvel still gets only an 83.8%, bumped up from a B- to a B. And I would then point out that this just means that Marvel fanboys are more sexist and virulent than DC fans, since Wonder Woman didn’t suffer a similar tanking.

But let’s forget Rotten Tomatoes for a while. After all, how much can it really tell us about the true quality or long-term impact of a film?

I would argue that DC has had a more significant impact on film-making in general and superhero films in particular, and that ultimately DC’s legacy will be considered more important.

Considering only the films you mentioned, many would say that Marvel has made the same movie 22 times. With an occasional exception such as Ragnarok, which could be deemed a straight comedy, MCU movies are remarkably consistent in theme, tone, and production value. Consistently good? Maybe. But other words for “consistent” are “boring” and “low risk.” I can make a consistently good cake using the same recipe 22 times, but I bet you’d get sick of eating it pretty quickly.

On the other hand, there’s a universe of difference between Batman Begins and Aquaman, between The Dark Knight and Shazam. Differences in tone, in dialogue, in color palette, in realism, in effects, in acting style. DC takes risks. Sometimes they don’t work, and we get a Suicide Squad, and sometimes they do and we get The Dark Knight, which tops list after list of the greatest superhero movies of all time. Among all the movies you listed, which is most likely to be remembered as a classic, most likely to be taught in film studies courses and, in retrospect, be considered revolutionary in its genre? Many would say The Dark Knight.

The upcoming Joker film is another example of DC’s creative risk-taking. I don’t know for sure that it will be good, but I do know that the modern iteration of Marvel Studios would never even attempt anything like it.

Additionally, without the original Batman (1989), we probably wouldn’t have superhero movies as we know them today. It proved that even with controversial casting choices and plot changes that pissed off the fanboys, superhero movies could be not just profitable, but critically successful.

And while DC was setting the stage for decades of brilliant superhero films by a range of studios, what was Marvel doing? Crapping out the 1990 Captain America, which has a staggering 7% Tomatoes consensus. That’s the lowest rating of ANY Marvel or DC movie EVER—even lower than Fant4stic (another Marvel failure you forgot to mention) at 9%.

We could also discuss other kinds of cultural considerations. Despite Doctor Strange’s decent Tomatoes score, it was widely panned for character whitewashing by casting Tilda Swinton (literally the whitest living human) as The Ancient One, a role held in the comics by an Asian man. When I look at a poster of the Justice League, I see an Israeli woman for whom English is a second language, an Ashkenazi Jew, a Native Pacific Islander, and a Black American man. When I look at a poster of the original Avengers, I see white guy, white guy, white guy, white guy, pretty white lady. Is diversity the best metric for deciding whether a movie is “good”? No, but it’s certainly relevant—at least as relevant as the consensus of professional movie critics, who themselves are overwhelmingly white and male. Has Marvel taken great strides in diversification in recent films? Absolutely. But again, I argue, DC got there first.

So you’ve seen why your premise is flawed from the start. But for the sake of argument, let’s use your premise—that Tomatoes critical consensus is a valid metric for this debate—and see how well your argument holds up.

You compare the average score of the 22 MCU films against a fairly random selection of DC films: the first four Batmen, the Dark Knight trilogy, and the official DCEU films. But this is completely arbitrary. You also mention the X-Men franchise and two pre-MCU Spider-Man iterations as points in your favor, so I say, let’s really go for it. Let’s count up EVERY Marvel and DC film that received a wide theatrical release, using your preferred method of critical consensus.

For reference, films that weren’t considered include films made from “imprint” properties rather than the true Marvel and DC labels (such as, sadly, V for Vendetta, a personal favorite), films that received only a partial theatrical release (such as Batman: The Killing Joke), and films that are too old to have a critical consensus listed on Tomatoes (such as the 1944 Captain America—but trust me, you don’t want that one affecting your score anyway).

Among Marvel’s complete collection are critical and box office flops such as Daredevil (you forgot Affleck was the blind vigilante before he was Batman, didn’t you?), Elektra, two versions of The Punisher, and, oh yes, Howard the Duck. Which, by the way, is clearly part of the MCU since he appears in BOTH Guardians movies.

DC has had some tragedies too, such as Superman IV and Catwoman. But the point is that to pretend that every Marvel movie is by default a gem is intellectually dishonest.

I won’t keep you in suspense: if we average the critical scores from all of these movies, what do we end up with?

Marvel 65%
DC 53.9%

DC is failing, but Marvel is skipping class, sitting in the back with their friends, goofing off, barely paying attention, and one bad test score away from repeating the semester.

If you’re curious, audience scores average to Marvel 69.5%, DC 58.8%.

To say that one is great and one sucks, well, the numbers just don’t back it up.

Friday, March 15, 2019

Book Review and Discussion: Innumeracy: Mathematical Illiteracy and Its Consequences by John Allen Paulos (1988)

Imagine a test for cancer that is 98% accurate.

Now assume that 0.5% of the population has cancer.

That means that in a test group of 10,000 people, 50 will have cancer. If the test is 98% accurate, then 49 of those 50 will test positive: 49 true positives.

The other 9,950 people in the group do not have cancer. But if the test is 98% accurate, then 2% of them will get incorrect results, i.e. false positives. 2% of 9,950 is 199 false positives.

Are you surprised that the false positives outnumber the true positives 4 to 1? I was.
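Paulos’s arithmetic is easy to check for yourself. Here’s a quick sketch in Python, using only the numbers from the example above:

```python
# Reproduce the cancer-test example: a 98%-accurate test, 0.5% prevalence.
population = 10_000
prevalence = 0.005   # fraction of the population that has cancer
accuracy = 0.98      # fraction of results the test gets right

sick = population * prevalence              # 50 people with cancer
healthy = population - sick                 # 9,950 people without it

true_positives = sick * accuracy            # 49 correct positive results
false_positives = healthy * (1 - accuracy)  # 199 false alarms

# Of everyone who tests positive, what fraction actually has cancer?
p_cancer_given_positive = true_positives / (true_positives + false_positives)

print(round(true_positives), round(false_positives))  # 49 199
print(round(p_cancer_given_positive, 3))              # 0.198
```

In other words, even with a 98%-accurate test, a positive result means only about a 20% chance of actually having the disease, which is exactly the counterintuitive result Paulos warns about.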

Now imagine that instead of testing for cancer, we’re testing for drug use. A misunderstanding of these numbers could have major implications in states where laws require low-income people to pass a drug test before they can receive food stamps or other public assistance.

In his 1988 book Innumeracy: Mathematical Illiteracy and Its Consequences, mathematician John Allen Paulos discusses the many ways that innumeracy, “an inability to deal comfortably with the fundamental notions of number and chance, plagues far too many otherwise knowledgeable citizens” and has real-world consequences for health, public policy, criminal justice, and other areas. “The same people who cringe when words such as ‘imply’ and ‘infer’ are confused react without a trace of embarrassment to even the most egregious of numerical solecisms,” he writes. “In fact, unlike other failings which are hidden, mathematical illiteracy is often flaunted: ‘I can’t even balance my checkbook.’ ‘I’m a people person, not a numbers person.’ Or ‘I always hated math.’” (3-4)

You know who you are.

Paulos points out that our lack of understanding of statistics in particular can affect policy in many ways. The media’s tendency to fixate on every terrorist attack and mass shooting, for example, leads citizens to believe that their risk of dying in one of these is far higher than it actually is, and it leads politicians to focus on policies addressing those issues while ignoring far more common killers such as suicide and heart disease. (Steven Pinker also discusses this phenomenon at length in his book The Better Angels of Our Nature, which I reviewed here.)

Paulos also shows that the human tendency to attribute meaning to coincidences and other rare events can lead us to fall for frauds such as psychics and quack medicine. TV talk shows make hay from every “correct” psychic prediction, ignoring the dozens of incorrect ones that preceded it. Similarly, if an ordinary person has a dream, and the events of the dream then occur in real life, they remember this and find it significant; they forget the thousands of dreams they had before and since that did not come true. But even an event that is statistically rare will happen occasionally, given enough time and opportunity. It would be much more unusual if you went your entire life without having a predictive dream; it would be much more surprising if a psychic’s random or intuitive guesses never aligned with reality.

Paulos presents one example after another, always spelling out the basic math required for understanding. This might lead you to think that the book is tedious, but in fact the opposite is true. Paulos’s explanations are clear and simple, and he uses humor throughout to lighten the mood. Knowing that many in the audience will be intimidated by chunks of numbers, he writes, “Now is probably a good time to reiterate my earlier remark that an occasional difficult passage may be safely ignored by the innumerate reader. … The occasional trivial passage likewise may be quite safely ignored by the numerate reader. (Indeed, the whole book may be safely ignored by all readers, but I’d prefer that, at most, only isolated paragraphs will be.)” (16-17)

Further, though the book is now 30 years old, every point that Paulos makes is still timely; indeed, prescient. In his conclusion he writes, “I’m distressed by a society which depends so completely on mathematics and science and yet seems so indifferent to the innumeracy and scientific illiteracy of so many of its citizens; with a military that spends more than one quarter of a trillion dollars each year on ever smarter weapons for ever more poorly educated soldiers; and with the media, which invariably become obsessed with this hostage on an airliner, or that baby who has fallen into a well, and seem insufficiently passionate when it comes to addressing problems such as urban crime, environmental deterioration, or poverty” (134). Clearly, the problems Paulos discusses are at least as relevant today as they were at the time of his writing.

What are the causes of these issues? Paulos points to “poor education, psychological blocks, and romantic misconceptions about the nature of mathematics” (72-73). Too many people (including many teachers) believe that one is simply born with or without mathematical ability; you either get it or you don’t. (I find a similar belief about writing among my English students.) He discusses math anxiety and its sources: “The same people who can understand the subtlest emotional nuances in conversation, the most convoluted plots in literature, and the most intricate aspects of a legal case can’t seem to grasp the most basic elements of a mathematical demonstration. … They’re afraid. They’ve been intimidated by officious and sometimes sexist teachers… they’re convinced that they’re dumb” (88). Paulos discusses a few simple solutions to this that can be integrated at any grade level, but too many teachers do not have the time, resources, or sometimes the will to implement them.

This section reminded me of an NPR story that aired a few years ago, “Why Eastern And Western Cultures Tackle Learning Differently,” which discussed differences in our views of struggle in education. Americans tend to believe that struggling with a subject is a bad thing, a sign of failure or stupidity, while Japanese educators treat struggle as a natural and expected part of the learning process. Western schools reward those who don’t struggle, who breeze through school with little effort; Eastern schools value perseverance and celebrate those who push through and overcome struggle. Perhaps a more Eastern view of struggle in math could help us overcome our math anxieties.

Before reading this book, I would have called myself numerate; I did well in math in school and was never intimidated by mathematical concepts. But as I read, I realized that my numeracy was deficient in a key area, the area to which most of Paulos’s book is devoted: statistics and probability. Among all math concepts, these are the most applicable to our everyday lives, yet most people understand them poorly. Further, statistics are easy to abuse, and they can be manipulated both intentionally and unintentionally to deceive people about important issues like the relative danger of terrorism or the prevalence of other types of crime. A strong grasp of statistics would be a major advantage to individuals and to society at large.

Why, then, was I never required to take a statistics class in high school or college? Algebra, geometry, trigonometry, calculus—that was the expected progression of my math education and the path that I followed. I was not required to take statistics at any point; I was never encouraged to take statistics for any reason; so far as I can remember, no one ever suggested that I take statistics. It seems that this one change in requirements could make at least some small positive difference in numeracy among Americans.

Paulos’s book is short, readable, and important no matter what your current degree of numeracy. Students and parents in particular will find it both informative and encouraging. And concerned citizens of all stripes should take its message to heart and examine how their own innumeracy affects their beliefs, behaviors, and, perhaps most importantly, voting habits, and what they can do about it.

Tuesday, February 5, 2019

In Defense of Learning a Language Badly

I just started learning American Sign Language… sort of.

I’m enrolled in a 6-week class through Community Education of the Black Hills, a program that sponsors all kinds of adult education classes, from dance to dog obedience. The class meets once a week for two hours. It’s a small group: about ten students of all ages and backgrounds, each learning ASL for a different reason. The teacher is energetic and knowledgeable. We spend the two hours learning new vocabulary and grammar and practicing simple conversations with each other. The teacher sends us home with a handout of the signs we learned that day and some historical background on the development of ASL.

I won’t learn much in 12 hours of classroom instruction—maybe enough to make some basic small talk. And though I’m practicing outside of class, in a few months, I’ll probably forget most of what I do learn. So why bother? If you haven’t attempted to learn a second language since you were forced to do so in high school, you might not see much point in such a small effort. But there is value in learning another language beyond the ability to converse in it fluently.

For starters, you might be amazed how much you can say and understand after only a few weeks of practice. Collins Dictionary estimates that the 25 most common English words “make up about a third of all printed text”; the 100 most common make up about half. Think of that—with just 100 words and a little bit of grammar (recognizing first, second, and third person; singular and plural; perhaps present and past tense), you can decipher quite a lot. Could you read Proust or understand a lecture on differential equations? No. But airport signs, restroom signs, travel directions, menus, and weather reports? Certainly.

Additionally, if you travel overseas or interact with a non-native English speaker in your home country, you’ll often find that even a meager attempt at using their native language will elicit a surprising amount of warmth and gratitude. Residents of countries where English is not a primary language are accustomed to dealing with monolingual English speakers at work and in public life. The rare American tourist or businessperson who makes an effort (no matter how poor) at speaking Russian or Japanese or Kikuyu is almost invariably met with praise and delight. This simple, selfless gesture is like an extended hand, an expression of fellowship made more valuable by its rarity. (Germans might be an exception—their English is better than your German, and most of them aren’t shy about letting you know it.)

A few weeks of learning a language very different in structure from your native one can also familiarize you with the spectacular diversity that is possible in human language (a source of continual delight for linguists). If you’ve only studied European languages such as Spanish or French—which are fairly closely related to English and not that different from it grammatically—then you’ve encountered only a tiny fraction of the ways a language can encode meaning. You probably don’t know, for example, that written Chinese indicates gender in pronouns (it differentiates between “he” and “she”) but spoken Chinese does not. Or that Lakota pronouns and verbs indicate not only gender, but whether the subject is animate or inanimate. Or that Korean uses different words for goodbye depending on whether the speaker is the one leaving or the one staying. Or that Tamil grammar requires the speaker to indicate whether a piece of information is hearsay or something they confirmed themselves. (This is called “evidentiality.”)

How cool is that?

Furthermore, although linguists still debate how much your native language affects how you think and view the world (called the Sapir-Whorf Hypothesis), a little bit of exposure to a second language can provide a window into the culture and values of the people who use it. For example, does the language distinguish between informal and formal “you” (like Spanish “tĂș” and “usted” and German “du” and “Sie”)? If so, how soon after meeting do strangers switch from one to the other? This tells you something about expectations for politeness and formality. Does the language make extensive use of titles and honorifics, like Japanese? This tells you something about the importance of rank and hierarchy in the culture. And most beginner language classes will include lessons on cultural basics like food, clothing, and etiquette.

Finally, we’ve probably all overheard a conversation in a language that we didn’t understand and thought that it sounded like nothing but noise. The more the sounds of a language differ from the sounds of our native language, the more true this will be. Yet after just a couple of weeks of practice in that language and its sounds—after learning just a few of the most common words and phrases—suddenly it begins to sound like speech, not noise. And its speakers, by extension, are actually talking, not uttering gibberish.

I had an experience like this during my 6-week Chinese class two years ago. Prior to taking this class, I knew exactly nothing about Chinese grammar and not a single word in the language. After a couple of weeks, I could pick out a word or phrase here and there while listening to an NPR reporter interviewing a Chinese speaker (before the translator butted in with the English voiceover, that is). It had a humanizing effect on the Chinese speaker that startled me. He no longer sounded like an incomprehensible foreigner living in an incomprehensible part of the world; he sounded like a person speaking a language. A language that, with practice, I could learn. And indeed, research shows that exposure to other languages increases empathy (I discussed this in my 2017 post “Value (Not Profit) in Studying a Foreign Language”).

If you’re even a little curious about learning another language, there are many free and low-cost ways to do so, most of which are more flexible and more fun than the high school classes you’re familiar with. Many communities offer free and low-cost classes for adults (if you’re in western South Dakota, check out Community Education of the Black Hills). Language-learning apps like Duolingo and Babbel are convenient and low-pressure; Duolingo is particularly useful for increasing your vocabulary if you already have a bare-bones understanding of the grammar. Pimsleur audio lessons are great for practicing pronunciation and learning basic phrases for travel and business; you can buy them online or download them for free via the Hoopla audiobook library app. And the website The Mixxer can connect you with native speakers of your target language with whom you can practice (and who want to practice their English with you). If you know of any other cheap or free ways to practice a language, please share them in the comments below.

Six weeks of classes, or a few months of skyping with a native speaker, or twelve audio lessons won’t make you fluent, but you might be surprised how fun, useful, and interesting language learning can be when you stop pressuring yourself and settle for doing it badly.

Sunday, April 15, 2018

Talking Past and Talking With

It’s widely acknowledged that many Americans live in an echo chamber in which we’re fed only information that corroborates beliefs we already hold, as a result of either social media algorithms designed to generate clicks or our own viewing, reading, and listening choices. In public debate, individuals on opposing sides of a given issue consistently talk past each other and straw man their opponents’ views with the goal not of problem-solving or of compromise, but of rallying those who already agree with them.

In this era, the value of seeking out unbiased media and reading sane opinions on both sides of important issues is obvious. (PolitiFact has an excellent bias-checking tool here, for those who want to assess their current media sources and/or seek out less biased ones.) But I would also argue that having one-on-one conversations with real people who disagree with you on specific issues is even more valuable.

I witnessed an example of this on a recent episode of Sam Harris’s podcast Waking Up. In this episode, Harris had a two-hour conversation—an argument, really—with Vox Editor-at-Large Ezra Klein regarding Harris’s May 2017 interview with Charles Murray. Murray is infamous for his 1994 book The Bell Curve, which examined the genetic basis of IQ and included data suggesting that IQ differences among racial groups were at least partly biological. Nearly twenty-five years later, Murray’s public appearances still draw protests, some of them violent, as in March 2017, when protestors at Middlebury College attacked Murray and injured his debate opponent severely enough to require hospitalization. In keeping with his absolutist attitude toward free speech, Harris invited Murray on his podcast to discuss his research and allow listeners to judge it for themselves.

After Harris’s interview with Murray aired on Harris’s podcast, Vox published a response that called Murray’s research “junk science” and attacked Harris for “endorsing” his views. This launched a year-long back-and-forth between Harris and Vox in which Harris criticized the Vox response in later episodes of his podcast, Vox published more critiques of both Harris and Murray, and Harris continued to accuse Vox of “intellectual dishonesty.” The feud migrated to Twitter and also spawned a private email exchange between Harris and Ezra Klein, which Harris eventually publicized on his blog.

At last Harris and Klein agreed to do a podcast together to discuss the issues at hand and to broadcast their conversation unedited. Their discussion is long and at times frustrating, and they never reach anything that could be called a resolution. But significantly, both speakers remain civil throughout the conversation.

Despite disagreeing on nearly every point that is brought up (including what topics are even worth discussing), neither speaker raises his voice or engages in ad hominem attacks. Neither speaker purposely misrepresents the other’s arguments; rather, both repeatedly say things like, “If I understand you correctly, you’re saying that….” Although neither actually convinces the other of anything, and you get the impression that they still don’t particularly like each other, they’ve managed to transmute a public feud into a civil, open, and honest dialogue in which they’re genuinely seeking to understand each other—something that so many of our politicians and policymakers, and we ourselves, are unable to accomplish.

The value of this practice can’t be overstated. Most people who have thought seriously about an issue have good reasons for their beliefs about it. Yet our tendency to vilify our ideological opponents—to accuse them of ignorance, selfishness, corruption, or bias without objectively examining their reasoning—enables and encourages ad hominem and straw man attacks that do nothing to address the serious problems that this country currently faces.

It’s easy to fall prey to recency bias and assume that the country is more divided—and discourse less civil—than it has ever been, but the U.S. actually has a long history of mudslinging and even violence in public debate. As PBS reports, dueling over political disputes was widely accepted until after the Civil War; Vice President Aaron Burr famously killed Alexander Hamilton in a duel in 1804, and Andrew Jackson had been shot so many times that he supposedly claimed to rattle with bullets when he walked.

In the decades leading up to the Civil War, it wasn’t uncommon for members of Congress to pull guns on each other during floor debates and to attack each other with canes, sometimes violently enough to require hospitalization. In 1858, during the debate over whether to admit Kansas to the Union as a free state or a slave state, a “brawl” broke out on the floor of the House of Representatives that involved at least 30 members. The snide comments and pejorative nicknames that are trademarks of today’s administration are quite tame by comparison.

And yet rational, open dialogue between opposing parties seems to be nearly nonexistent in today’s political discourse. Speaking calmly and civilly with a real person who disagrees with you—and who has sane, well-thought-out reasons for doing so—forces you to acknowledge the rationality—indeed, the humanity—of that individual and, by extension, others who hold their views. It also forces you to present and defend the rationality of your own views, and in so doing, ensure that your views are rationally supported.

I have a very old friend whose background is substantially different from mine. She was raised in a conservative, religious environment and is still a believer who leans right of center. I was raised in a secular, almost hyper-liberal environment, and I still hold most of those views. It’s a wonderful stroke of fortune that we became friends as children, because otherwise our paths probably wouldn’t cross as adults.

My friend is highly intelligent, educated, and compassionate. She cares about people and wants to be a force for good in the world. And we disagree on nearly every political issue that matters today.

Although my friend and I live in different states and don’t see each other often, this friendship has been a valuable leveling force for my opinions. When I’m tempted to share a far-leftist article or meme on social media, I know my rational, conservative friend will see it, and might even fact-check it. If the facts turned out to be wrong, I’d be embarrassed and ashamed to have shared this view, whether my friend called me out on it or not (she probably wouldn’t). This motivates me to do my own fact-checking before sharing something simply because it upholds my previously held beliefs. When my friend shares a post that I disagree with, I trust her conscientiousness and reason enough to know that it won’t be a Breitbart-style conspiracy theory, and that I might learn something useful by reading it.

I’m grateful to have this friend in my life; her views are a sanity check on my own. If you don’t have such a person in your life, try to find one. If you don’t already engage with media whose biases oppose yours, add a few such sources to your feed. If you haven’t fact-checked your own views lately, now’s a good time to start.

Thursday, March 15, 2018

I Read Ayn Rand's The Fountainhead So You Don't Have To

The following post contains major spoilers for Ayn Rand’s 1943 novel The Fountainhead, but don’t worry—you really shouldn’t read this book.

A number of public figures have expressed admiration for Rand’s writing and philosophy, including Ronald Reagan, Donald Trump, Ron and Rand Paul, and Clarence Thomas.  (Paul Ryan was also once a famous Rand enthusiast, but has since reversed his position on her philosophy, describing it, accurately, as atheist.)  Trump has even stated that he identifies with the protagonist of The Fountainhead.  Knowing that her work has been influential with many powerful people, I felt a (possibly misplaced) sense of civic duty to learn the details of her famous views on altruism and selfishness from Rand herself.

I listened to The Fountainhead as an unabridged audiobook, specifically the 25th anniversary edition with a special introduction by the author.  In it, she explains that her goal in The Fountainhead was “the portrayal of a moral idea” and “the presentation of an ideal man” in the protagonist, Howard Roark.  Roark is an architect, and the story follows his career and the lives of several of his contemporaries in New York City from the early 1920s to the 1940s.

Even without Rand’s introduction, the principles depicted in The Fountainhead are clear.  Rand unambiguously spells out her ideas on morality and ethics via both the actions and the dialogue of the main characters, including a pair of long monologues delivered by the hero and the primary antagonist.

In short, the philosophy of The Fountainhead is this: Fostering human genius is the only moral imperative.  Gifted people must be free to pursue their own artistic, scientific, or philosophic endeavors at all costs.  If one does not possess genius, one can redeem oneself only by recognizing and promoting it in others.  Most people do not possess the former and are not capable of the latter, and these people do not matter.  Rand, via the protagonist Roark, refers to them as “parasites”; the geniuses (referred to as “creators”) can ignore the parasites in the pursuit of their own ends.

Self-sacrifice and concern for the opinions of others are the ultimate evils.  Altruism will lead to the downfall of the human species.  Charity is wasteful at best and reprehensible at worst; if it distracts one from the selfish pursuit of one’s own goals, it is evil.

I want to stress that these ideals are not implied or represented symbolically; they are stated explicitly.  Roark says, “Altruism is the doctrine which demands that man live for others and place others above self.… The man who attempts to live for others is a dependent.  He is a parasite in motive and makes parasites of those he serves.… All that which proceeds from man’s independent ego is good.  All that which proceeds from man’s dependence upon men is evil.”

The idea that mothers should try to love all the world’s children as they love their own is a “line of tripe.”  A home for “subnormal” children built by a philanthropist, which contains amenities such as playgrounds and an art room, is a waste of space and resources.  The only semi-likable character, a social worker who, on the surface, appears to genuinely wish to help poor people improve their lives, admits that she’s miserable, that she hates and is disgusted by the indigent people she works with, and that she doesn’t know a single coworker who actually enjoys the job.

Further, the poor are depicted as being solely responsible for their own poverty.  The main female character, a journalist, moves into a slum for two months to write an exposé on the conditions inside tenement housing.  The article she actually writes makes it clear that the poor are poor because of their own laziness and inadequacy.  One family’s children are roaming the streets half naked and their rent is going unpaid while their father drinks up his salary at a local speak-easy.  Another poor family just purchased an exorbitantly priced radio.  A third lives on charity while the able-bodied father avoids work and the mother is pregnant with her tenth child.

Because the pursuit and recognition of personal genius are the only morals that matter, no act is immoral as long as it doesn’t interfere with these two principles.  Roark, for example, violently rapes the main female character, Dominique Francon.  After the rape, Francon becomes obsessed with Roark and the two begin a bizarre and toxic relationship.  Roark later becomes Francon’s third husband.  This, remember, is the hero that our current president claims he identifies with.

Roark’s best friend, a wealthy newspaper owner named Gail Wynand, has no real genius of his own but does possess the gift of recognizing artistic genius in others.  As a hobby, Wynand enjoys singling out activists and idealists, offering them huge salaries to write columns for his paper denouncing their own ideals, and ruining them financially if they refuse.  One commits suicide as a result.  To this Wynand responds, “If lightning strikes a rotten tree and it collapses, it’s not the fault of the lightning.”

Roark himself dynamites a building he designed because it wasn’t being built to his exact specifications.  He is arrested and refuses a lawyer; at the trial, he presents only his own testimony as evidence—his monologue about “creators” and “parasites.”  He is acquitted.

Wynand is Francon’s second husband, and after she leaves him for Roark, he allows his newspaper to fall apart and liquidates most of his assets.  He pours the money into building the largest skyscraper in New York and awards the design contract to Roark.  The novel ends with Roark and Francon standing atop this building in progress, surveying the city and reflecting on the greatness of man.

The morality of individual genius and selfishness and the immorality of altruism are emphasized again and again throughout the book, and we should be legitimately concerned that a number of public policymakers find this book inspirational.  The idea that the poor are completely to blame for their poverty, and that charity or taxpayer-funded programs to assist them are useless and wasteful, is objectively, provably wrong.  The notion that a man of genius is justified in any immoral act he might commit in the pursuit of his ambitions is disturbing.  The assertion that a few select men among us are “creators” and the rest are mere “parasites,” whose lives are meaningless and irrelevant, is horrifying.

Further, the book’s stance on rape and the rights of women is backward even for the 1940s.  That a reader could continue to admire the hero despite his commission of a violent rape tells us something important about that reader’s views on the seriousness of sexual assault. 

Ignoring its moral philosophy for a moment, how does The Fountainhead hold up as a piece of literature?

Rand’s style is probably an acquired taste.  She describes individuals’ appearances, clothing, and movements in intense detail, from the knot of a tie to the turn of a hand during a conversation.  But when it comes to characters’ internal motivations, she ignores the admonition to “show, don’t tell.”  Not only does Rand spell out the main characters’ thoughts in words, the characters themselves do the same in extended, unnatural-sounding soliloquies.

Rand also has a habit of reusing distinctive words in a way that’s noticeable and distracting.  Characters’ clothes are never trendy or respectable; they’re always correct.  In the second half of the novel, Rand falls in love with the word bromide—it appears eleven times.  In short, her literary skill wasn’t enough to salvage my enjoyment of this undertaking.

You shouldn’t read The Fountainhead.  There are too many better-written and more worthwhile books out there.  But when a public figure professes love for a piece of art, fiction, or philosophy, we might learn something important about that figure by examining the thing that they love.  Once in a while, then, perhaps it is a civic duty.  I heard John McCain loves For Whom The Bell Tolls; maybe start there.

Rand, Ayn. The Fountainhead, read by Christopher Hurt. 25th anniversary ed., Blackstone Audio, 1968.

Sunday, March 4, 2018

Book Review: The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker (2012) Audiobook

Not many nonfiction books have the potential to fundamentally change a reader’s perspective on a significant issue, but Steven Pinker’s Better Angels is on the shortlist.  In this ambitious work, Pinker uses historical and scientific data from numerous disciplines to demonstrate, conclusively, that human violence of every kind—from domestic abuse, to violent crime such as murder, to wars both large and small, to violence perpetrated by governments against their people, even to terrorism—has declined by orders of magnitude over the course of human history.

Knowing that this thesis drastically contradicts widely held popular beliefs, Pinker musters statistic after statistic, chart after chart, and historical account after historical account to prove it.  He refutes the widespread impression that the 20th century was the bloodiest in history by first reminding us how violent previous eras were.  He describes (sometimes in lurid detail) the kinds of tortures inflicted on petty criminals by their legal systems in the Middle Ages; the raids and slaughters carried out by hunter-gatherer tribes against neighboring tribes (he thoroughly lays to rest the “noble savage” stereotype in the process); the blood sports such as bear-baiting and cat-burning that were once considered high entertainment in many cultures; the frequency and acceptability of rape, child abuse, and infanticide around the world even through the Industrial Revolution; and the centuries of wars among great and small powers in Europe that preceded World War I.

After shattering the rose-colored glasses with which we usually view the past, Pinker discusses what he calls “the civilizing process,” the trends that led us to no longer consider practices such as torture and public execution ordinary, but rather horrifying and unthinkable.  He presents massive quantities of data to prove that every kind of violence has declined in every culture around the world, leaving the reader with little room to doubt his claim that human society is, at present, the most peaceable that it has ever been.

Finally, Pinker trains his scientific eye on the psychological and social processes that have accompanied this decline, including the evolutionary mechanisms behind revenge and sadism and the spread of empathy and self-control.  Despite the grimness of most of the book, the outlook is a positive one; Pinker has no need for optimism when the statistics are on his side.

This book is a major undertaking for both Pinker and the reader.  The paperback is about 600 pages; the audiobook, about 37 hours.  But it is also one of the most informative, densely packed, and fascinating reads I have encountered in years.  Pinker’s characteristic style—educated but highly readable and geared toward a lay audience—shines in this piece.  I never became bored with it (though I did have to take a short break about halfway through, after having more than one vivid dream about being involved in a terrorist attack).  History buffs, political science junkies, sociology and psychology enthusiasts—and fans of having their unquestioned assumptions about reality destroyed—will love this book.

I listened to the unabridged audiobook, read beautifully by Arthur Morey.  (Those familiar with Pinker’s speaking voice will be grateful that the book is not read by the author.)  Morey is the epitome of professionalism and his reading is flawless.  And though the print versions include numerous graphs and charts to illustrate the many percentages and raw numbers that Pinker discusses, these figures are so well explained in the text that I had no trouble following the data without the visual aids.

This book is not for the faint-hearted—if you buy the audiobook, I don’t recommend listening around small children—but if you read it, you may find yourself left with a revolutionary perspective on and a new hopefulness about the future and fate of our species.

Thursday, February 22, 2018

Why I Love the Lip Plates in Black Panther [No Spoilers]

To an average white Westerner, generically ignorant about the history and traditions of indigenous peoples all over the world (I speak only, of course, for myself), the lip plates worn by some African people—and a few indigenous peoples in other parts of the world—are the epitomizing symbol of a primitive, alien culture.  This particular body modification, in which the lower lip is sometimes stretched to 20 cm in diameter, seems impractical, extreme, perhaps even horrifying.

But if we set aside our cultural biases and examine the practice objectively, we must ask: is it any more bizarre than, say, plastic surgery—subjecting oneself to an invasive medical procedure and risking the complications associated with anesthesia and infection for the sake of better conformity to ideals of beauty?  Or tattoos—using dozens of tiny needles to painfully inject ink deep into the dermis?  These are practices we might judge, yes, but that we don’t regard as particularly extreme, and certainly not as primitive.  Indeed, Shauna LaTosky, an anthropologist who lived among the Mursi people for several months, compares their tradition of lip-stretching to her own choice to wear painful three-inch stiletto heels to dance competitions (LaTosky 384).

LaTosky interviewed Mursi women—the only gender in that culture that wears lip plates—about their feelings toward the practice and found a range of attitudes that, in hindsight, should be unsurprising.  Many of the women considered their lip plates to be a source of pride, and when wearing them, believed that they walked with a more upright bearing and felt more confident in public and around the men in their lives (388).  But other women, particularly some younger women, worried that foreigners would stare at or mock them (391-2).  For similar reasons, some young men also expressed a preference for women who had chosen not to stretch their lips (396).  And the local government, which had been backed by the USSR for a couple of decades, had repeatedly threatened to ban the practice and considered it “uncivilised” and backward (396-7).

The film Black Panther prominently features a man—credited as “River Tribe elder” and presumed by some audiences to be Nakia’s father—wearing a moderately sized plate in his lower lip, as well as a pair of plates in his stretched earlobes.  The plates are always fashionably color-coordinated with his clothing, both during T’Challa’s coronation, when he wears traditional attire, and during a meeting of the elders, at which he wears a bright green Western-cut suit.  Among the scenes of street life in the Wakandan capital, we also see a young man wearing a lip plate, showing off some futuristic Wakandan tech to his friends, who are not wearing lip plates.

In Wakanda the lip plates are clearly unremarkable—no more attention-grabbing than a nice pair of stilettos.

Wakanda is meant to represent Africa in its purest form, unbothered by colonization, the slave trade, or Western influence of any kind.  One might imagine that a culture free to evolve without the pressure of Western judgments about what is fashionable and attractive—that is uninfluenced by and, frankly, uninterested in Western ideals of beauty—would not feel the need to abandon its time-honored dress in the way that even a highly traditional tribe like the Mursi feels now.

By presenting these lip plates as ordinary accessories, worn by a respected and fashionable person in a highly modern world, without comment, Black Panther throws our Western assumptions about beauty and fashion back in our faces.  It points out, blatantly, that beauty is cultural, that the West does not have a monopoly on determining what is attractive or fashionable, that there is nothing inherently primitive or uncultured about traditional African accoutrements like lip plates.

Quite the contrary.  The entire film is a celebration of both past and future Africa—a love letter to the religions, fashions, and even languages of pre-colonialist Africa and a statement about their intrinsic value.  It makes the bold claim that Africa has a great deal to offer the world besides its natural resources: mythologies that the West can learn from, brain power that can help move the entire planet into the 21st century, and, yes, fashions that can add color and flavor to the monochrome palette of Western trends.

Work Cited:
LaTosky, Shauna. “Reflections on the lip-plates of Mursi women as a source of stigma and self-esteem.” The Perils of Face: Essays on Cultural Contact, Respect and Self-Esteem in Southern Ethiopia, edited by Ivo Strecker and Jean Lydall, LIT Verlag, 2006, pp. 382-97. Mursi Online, Oxford Department of International Development, 2013,
