Graffito from an underpass near Schloss Ambras, Innsbruck
The 2010s were a difficult decade which destroyed our ability to believe in some solutions to problems, but did not provide alternative paths to follow. That decade left many of us in a state of what the Greeks called aporia. At the start of the decade, Jona Lendering had some thoughts about one problem, the spread of misinformation from bad pop books, documentaries, and the Internet. Here is how he saw it in the hopeful time around 2010.
Over on the group blog Crooked Timber there is a retrospective post on David Graeber’s Debt ten years after they hosted a discussion of the book on the blog. The post and comments say something very important about ‘big ideas’ books which scientists mostly take for granted, but might not be obvious to curious, clever people who are not active in research:
I think the best way to understand Graeber is as a writer of speculative nonfiction. He is often wrong on the facts, and more often willing to push them farther than they really ought to be pushed, requiring shallow foundations of evidence to bear a heavy load of very strongly asserted theoretical claims. But there is value to the speculation – social scientists don’t do nearly enough of it. Sometimes it is less valuable to be right than to expand the space of perceived social and political possibilities. And that is something that Graeber was very good at doing.
Dan Davies, Lying for Money: How Legendary Frauds Reveal the Workings of the World. US edition (Scribner: New York and London, 2018)
Lying for Money is one part a monograph by someone who has studied and taught a problem for decades, and one part an extended blog post. It is also a bleak book. Davies thinks that fraud grows out of the cost of verifying facts and the techniques by which managers simplify the world to make it comprehensible (legible in James C. Scott’s terms). The cost of auditing or checking references appears every day, but the cost of discovering that one of your nurses never completed high school and one of your suppliers disappeared overseas with your money only comes up occasionally, so people tend to take fewer and fewer precautions until they suffer for it.
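Davies's point about costs can be put in back-of-the-envelope terms. The sketch below is my own toy illustration in Python, not anything from the book: the transaction count, verification cost, fraud probability, and loss figure are all invented for the example.

# Toy expected-cost comparison (my own illustration, not from Davies's book).
# Assumption: verification costs a little on every transaction, while skipping
# it risks a rare but large fraud loss.

def yearly_cost(transactions, check_cost, fraud_prob, fraud_loss, checking):
    """Expected yearly cost of one policy: always check, or never check."""
    if checking:
        return transactions * check_cost           # small, visible cost paid every day
    return transactions * fraud_prob * fraud_loss  # rare large losses, averaged out

# Invented numbers: 10,000 transactions a year, $5 to verify each one,
# a 1-in-2,000 chance an unchecked transaction is fraudulent, $50,000 per fraud.
print(yearly_cost(10_000, 5, 1 / 2_000, 50_000, checking=True))   # 50000 (always check)
print(yearly_cost(10_000, 5, 1 / 2_000, 50_000, checking=False))  # 250000.0 (never check, expected)

On these made-up numbers checking wins easily in expectation, but the first figure is paid visibly every day while the second arrives all at once, years later, which is why Davies thinks precautions erode until people suffer for it.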
The ancestral apple orchard is almost ready for harvest
Military historians tend to dislike the idea of the Decisive Battle. It's surprisingly hard to find a time when a single battle decided a war over something more complicated than who should be king. Battles make for great stories but they are only a small part... Continue reading: Violence Makes Permanent
In the past I have talked about how civilians in Syria see themselves as peasants in Game of Thrones, and soldiers in Ukraine want to be as excellent as characters in the first-person shooter Call of Duty. This year I want to record that college-educated commentators like Max Boot are comparing the assassination of the condottiere Yevgeniy Prigozhin to their favourite scenes from crime dramas in formal published prose:
The most fitting epitaph for Wagner Group founder Yevgeniy Prigozhin was delivered by the shotgun-wielding hit man Omar Little on “The Wire”: “You come at the king, you best not miss.” There’s still much we don’t know for certain (and might never know), but that pearl of wisdom was confirmed by Prigozhin’s apparent death Wednesday after a private plane he was on reportedly crashed north of Moscow.
IBM understood the issue and the stakes in 1979! This image seems to come from a random social media post by @bumblebike@twitter.com on 17 February 2017 (archive.is) via a blog, but I am sure I heard the principle during my days in computer science.
Since 2020 I have been trying not to talk about corporate social media, but I want to record this thought. Authors are seeing books appear on amazon.com under their names and titles but with text generated by a chatbot. Scammers hope that people will buy these books thinking they are the real thing. People who buy consumer goods on Amazon are seeing a lot of knockoffs with random strings of letters for a brand name; the people who sell these goods focus on search-engine optimization, buying positive reviews and suppressing negative ones, and other marketing tricks rather than on making good products. And of course sites like Facebook gladly sell ads promoting hate, and suggest genocidal propaganda in users' feeds, while claiming that they are not responsible for what users post and that they carefully vet ads before accepting them.
From a happier time: Verona Bikeshare in April 2017
Like many bookish people I grew up with books on Oak Island and ghosts and mysterious disappearances. I don't think any of them covered the Somerton Man, who was found dead on a beach in Australia in 1948 with a scrap of the Rubaiyat of Omar Khayyam in a pocket. Younger me would certainly not have recognized that 1948 was the perfect time, because many of the things which feed paranormal television today were invented between 1945 and 1975 (Bigfoot, flying saucers, and grey aliens, for example; D.B. Cooper also hijacked his airplane in 1971). Things stay in this category because they are inherently hard to understand, so mainstream institutions do not take over the investigation. Larry Kusche thought he had solved the Bermuda Triangle mystery in 1975, but the sea is so wide and unknown that people who want to see mystery in a lost ship or plane can see it. Following these topics can be frustrating because there are many excited cranks for each new tidbit of information. But one of these cases has moved forward!
Dan Gardner's Future Babble (McClelland and Stewart Ltd.: Toronto, 2010) is a pop book with a structural theory for why so many people get called out to predict the future using methods which fail nine times out of ten, then get called back after a failed prediction to make another. It relies upon earlier trade books (such as Phil Tetlock's work on expert judgement and When Prophecy Fails) and the psychology of cognitive biases and heuristics. One of Gardner's favourite case studies is Paul Ehrlich, who like Noam Chomsky spent most of his career repeating ideas he had in the 1960s (but whose ideas were much more easily falsified: the death rate did not rapidly rise from the late 1970s, and people all around the world start having smaller families once women have the ability to choose).
The cover of one version of H. Beam Piper's "The Cosmic Computer" (Ace Books 1963)
People who speculate about artificial minds have a thought experiment: if you lock a superhuman intelligence in a box, with just a way to ask it questions and a way for it to send back the answers, how do you stop it from persuading someone to let it out? Today some people who read the right parts of the Internet ten years ago are afraid that some terrible ideas have escaped geeky online communities and are commanding money and policy in the wider world. Outsiders don’t have the background knowledge to know why this is a bad idea. But a lot of the criticism is hyperbolic, very personal, and mixes unverified claims with matters of public record. Just below the surface are such baroque ideas and cycles of interpersonal relations that it is exhausting to learn what happened, disturbing to think about it, and hard to explain why this matters to anyone but a few very clever, very strange people who spend a lot of time on the Internet (and maybe social media these days). I found one series of essays that may help.
Some people make fun of stories about generation ships because they often follow in the mould of Heinlein’s Universe (1941): if there is a story about a generation ship, it will suffer a disaster while the crew inside descend into barbarism and self-destruction. Sometimes monsters devour the crew, sometimes a plague kills all the adults, and sometimes radiation turns the voyagers into monstrosities. Geneticists would call that a founder effect: the first story (or the first few members of a species to reproduce in an environment) has a disproportionate influence on everything after. Is there a wider context the critics are overlooking?