If you have spent any amount of time in the blogosphere or on online news sites, you have likely experienced all the colorful varieties of user comments. In many cases, you’ll get some civil responses, and even some that raise good points and are quite helpful, adding information beyond what was in the original story or post. There can be an effective dialogue with the writer, who can clarify points. Comments on news stories and blog posts can provide a useful feedback loop, educating everyone—including the writer—in the process.
Many times, however, you’ll encounter shrill, nasty, downright vicious comments on even the most innocuous stories or posts. (“You ^%$$!! The sky isn’t blue. You’re an idiot!!!!”—That’s really not much of an exaggeration.) These commenters are often referred to as trolls, and, as actual trolls are wont to do, they seem to delight in being nasty; rather than make cogent arguments, they engage in needless ad hominem attacks on the writer. I’ve received my own share of these over the years, and they are by no means unique to cyberspace; in the days before the Internet (if such a happy time can be recalled), I used to occasionally get hate mail via the print publications I wrote for. But online commenting has completely revolutionized the ability to be nasty.
Not that bloggers or site moderators/hosts should stifle intelligent, reasoned debate (the key words there being “intelligent” and “reasoned”), but a recent study (via The New York Times) found that trolls and other rude, uncivil commenters actually undermine the intent of a post and, say the study’s authors, “can significantly distort what other readers think was reported in the first place.”
(I should point out that as more and more news stories and features go online and offer the ability for readers to add comments, all of journalism is morphing into the blogosphere, so the conclusions of this study become important when assessing the current and future state of journalism.)
The study—“Crude Comments and Concern: Online Incivility’s Effect on Risk Perceptions of Emerging Technologies”—was published online last month in The Journal of Computer-Mediated Communication and was designed to measure what researchers call “the nasty effect.”
“We asked 1,183 participants to carefully read a news post on a fictitious blog, explaining the potential risks and benefits of a new technology product called nanosilver. These infinitesimal silver particles, tinier than 100-billionths of a meter in any dimension, have several potential benefits (like antibacterial properties) and risks (like water contamination),” the online article reported.
“Then we had participants read comments on the post, supposedly from other readers, and respond to questions regarding the content of the article itself.”
Half the study sample saw civil comments, and the other half saw nasty ones. The comments were roughly equivalent in length and general tenor—supportive vs. dubious of the technology being discussed—but the major difference was that the nasty ones used profanity, personal attacks on the writer or researchers, and other ill-mannered language.
They say that one bad apple spoils the whole bunch, and that’s basically what the researchers found:
“In the civil group, those who initially did or did not support the technology — whom we identified with preliminary survey questions — continued to feel the same way after reading the comments. Those exposed to rude comments, however, ended up with a much more polarized understanding of the risks connected with the technology.”
So the uncivil comments distorted perceptions of the story even among readers inclined to civility. That’s pretty scary.
In most of our daily interactions, we—or most of us anyway—have a set of filters that allow us to engage with others in a generally civilized manner. But the Internet affords people a high level of anonymity, even on those occasions when they use their proper names to comment, which has a tendency to remove those filters and encourages people to type things they likely would never say to a person’s face. (Again, I am speaking generally. We all know exceptions to the rule, and rarely invite them to parties.)
For many bloggers and news organizations—especially those that get a lot of traffic—the hostile tenor of many commenters has led them to strictly moderate comments, impose byzantine login routines, or even shut down comments altogether (who has time to moderate hundreds of comments?). Interestingly, some comment-board tools let bloggers make a troll’s posts visible only to the troll, which is fitting, since most of the time they seem to be there only to hear themselves talk anyway.
New communication technologies—blogs, Twitter, Facebook, whatever is coming next…—have democratized public discourse in ways that no medium ever has. We can all potentially be part of the debate, for good or ill. This is a lot of power, but as the old cliché goes, with great power comes great responsibility. An itchy Twitter finger or a knee-jerk comment on a blog post can potentially do a lot of damage. It behooves all of us to think carefully and rationally before hitting that “post” button.
Richard Romano is a freelance writer, specializing in marketing, media, communication, and environmental sustainability trends. He is a frequent contributor to WhatTheyThink.com, and has written, co-written, or ghost-written almost a dozen books on media and technology, most recently “Does a Plumber Need a Web Site?”: Mad Dentists, Harried Haircutters, and Other Edgy Entrepreneurs Offer Promotion Strategies for Small and Mid-Size Businesses. His Web site is www.richtextandgraphics.com. He is based in Saratoga Springs, NY.