February 7, 2013

Read this website before the Illuminati take it down

by

If you look closely, you can see conspicuous deer tracks on the set of the “moon landing.”

Conspiracy theories are predictable. I don’t just mean the guilty parties in actual conspiracies. Those are often pretty predictable, too, it’s true. No, I mean the language of conspiracy theory is predictable. X was covered up by Y, Z is in the drinking water. Maybe Z is in the drinking water, sure (if by Z you mean fracking byproducts, agricultural runoff, antidepressants or deer urine), but the language we use to describe it is a recognizable trope by now.

Some of what flags writing as a conspiracy theory is the subject matter. (“Does the NSA know about all of this deer urine? Why have they buried the research on the effects of deer urine when mixed with coffee grounds?”)

But conspiracy theories also have a particular pattern to them, a sort of disjointed logic. This past month Ian Webster, Emily Snowman and Laura Michet used that realization to build a thing of beauty: a page that generates conspiracy theories procedurally.

Their site, Verified Facts, takes a template — each one written by Webster, Snowman or Michet — and fills it from different lists of appropriate words to create a sort of conspiracy theory Mad Lib. The result is half dismissal of conspiracy theory and, in a strange way, half celebration.

As Michet shows us on her blog, much of the art of the thing lies in the main sentences, which, before being filled with the appropriate nouns, look like this:

STUDIES SHOW THAT PEOPLE WHO SPEND TOO MUCH TIME IN {{PLACE1}} FREQUENTLY END UP WITH INCURABLE CASES OF {{MALADY}}. THIS TREND IS CONSISTENTLY REPEATED ALL THE WAY BACK THROUGH {{ERA}}, WHEN {{GOVERNMENT_ORG}} FIRST SET UP SHOP IN {{PLACE1}}.

The object class MALADY in this case would be filled by something like, for instance, “deer urine poisoning.” These main sentences are chained together by filler sentences. Among my favorites used here are those that verge on metacommentary, such as “websites revealing the truth about this are frequently taken down without an explanation” or “You might think that this sounds like something out of a tabloid, but it’s real.” Both of those were pulled from a single conspiracy theory Verified Facts kicked up for me linking Opus Dei to the Vietnam War.

Cleverly, certain nouns are repeated throughout a given page on Verified Facts. As Webster writes, “conspiracies don’t make sense, but they have some semblance of coherence.”
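That trick — pick a noun once, then reuse it everywhere the same slot appears — is the whole engine in miniature. A minimal sketch of the idea in Python (the word lists and function names here are my own illustrations, not the site’s actual code or data):

```python
import random
import re

# Hypothetical word lists; the real site's lists were written by its authors.
WORD_LISTS = {
    "PLACE1": ["the Nevada desert", "Geneva", "the Pentagon basement"],
    "MALADY": ["deer urine poisoning", "chronic amnesia"],
    "ERA": ["the Cold War", "the 1970s"],
    "GOVERNMENT_ORG": ["the NSA", "the CIA"],
}

TEMPLATE = (
    "Studies show that people who spend too much time in {{PLACE1}} "
    "frequently end up with incurable cases of {{MALADY}}. This trend is "
    "consistently repeated all the way back through {{ERA}}, when "
    "{{GOVERNMENT_ORG}} first set up shop in {{PLACE1}}."
)

def fill(template, word_lists, rng=random):
    """Fill each distinct placeholder once, so repeated slots
    (like {{PLACE1}}) reuse the same word -- the 'semblance of
    coherence' Webster describes."""
    chosen = {}

    def pick(match):
        key = match.group(1)
        if key not in chosen:  # first sighting: pick and remember
            chosen[key] = rng.choice(word_lists[key])
        return chosen[key]     # later sightings: reuse the same word

    return re.sub(r"\{\{(\w+)\}\}", pick, template)

print(fill(TEMPLATE, WORD_LISTS))
```

Because `{{PLACE1}}` appears twice in the template, both occurrences come out as the same place, which is exactly what keeps a generated page from collapsing into pure word salad.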

But the best part, the very best part? Each generated page on Verified Facts is followed by a few scholarly citations. For example, after the article on Opus Dei:

  1. Nelson, Thomas E., Rosalee A. Clawson, and Zoe M. Oxley. “Media framing of a civil liberties conflict and its effect on tolerance.” American Political Science Review (1997): 567-583.
  2. Huysmans, Jef. “Security! What do you mean? From concept to thick signifier.” European Journal of International Relations 4.2 (1998): 226-255.

That is delightful enough to make a man forget about all of this deer urine he’s being poisoned with.

Given more time, Webster writes, he’d like to improve the granularity of the word classes to make a truly impressive range of conspiracy theories.

As fun as all of this is, and as much as it highlights the real joy in a genre that lends itself to formula so easily, the whole project also feels rather quaint, or artisanal. After all, where does this or any website live but within the biggest, sleekest conspiracy theory generator the world has ever known?

Dustin Kurtz is former marketing manager of Melville House.

MobyLives