Bob Neal

“Move fast and break things” is the instruction Mark Zuckerberg has given his Facebook (now Meta) staff.

He didn’t say that some things shouldn’t be broken. That might be a bit too subtle for him to grasp. But now, three decades into the era of high-speed everything, we should consider just what things are under threat of being broken, what things shouldn’t be broken and who is responsible for keeping them whole.

Science and technology are neutral. Human use decides whether results are good or bad. Many people use these tools in ways that threaten our social fabric, not to mention our very democracy.

Now comes a local example of how far off track people can go with a “neutral” technology. The Morning Sentinel last week reported on a woman who said she’d been drugged and raped in the basement of a restaurant in Waterville. The Sun Journal ran the story.

The woman had posted that several Waterville restaurants have “rape basements.” Where did she post this? On Facebook, of course. Others piled on, posting that Waterville restaurants had slipped drugs into customers’ drinks and then taken them to “rape rooms.”

Police, deputies and prosecutors in the Waterville area said they have no evidence of the existence of “rape rooms” and give a different account of the story. They say she was arrested for operating under the influence on Route 139 in Fairfield. She had been refused bar service for being “not right” at the restaurant where she now says she was drugged and raped. The Somerset County Sheriff’s Office said it would have investigated her charge of drugging and rape if she had told the deputy.


This column is not about her veracity. It’s about how easy it is to post unverified statements and how easily gullible people believe them, especially people who want to believe negative claims about people (police officers, teachers, etc.) or institutions (government, schools) they don’t like.

Through lack of legislative oversight, we have let a system arise in which the vast majority of “information” made public is not subject to screening. The companies that host these postings, such as Facebook and Twitter, bear no responsibility, pay no price, for what’s on their sites.

Yes, people posting false information can be sued for libel, and the restaurant in Waterville has threatened legal action against the woman who says she was drugged and raped there. But Facebook bears no responsibility for allowing her to post.

The irony here is that the institutions that regularly verify information, that most closely vet it, are subject to legal action if they permit a libelous statement to become public. These institutions include newspapers, over-the-air radio and television and some cable news channels.

I was an editor for more than a decade. I can’t count the number of times someone came to me or to another editor with a story “that’ll blow the lid off things.” But I can count the number of times that, after we investigated the “blow-the-lid-off” charges, the story was verifiably true. It is one, maybe two, and what was going on wasn’t nearly so earth-shaking as the tipster had said.

A journalist’s strongest personality trait is skepticism, which well serves anyone who is told something will “blow the lid off things.” Combine skepticism with the responsibility to publish only what’s verifiably true, and you have solid assurance that what those outlets print or broadcast is real.


I’m no Luddite. I’m not afraid of technology. I first worked on a computer in 1964 as a check-sorter for First National City Bank (now Citibank) in New York. Fourteen of us ran IBM 360 sorting machines that had replaced a crew of 900 hand sorting checks every night, readying them to deliver to the bank branches before business the next day.

Computers fascinated me. In 1964.

I have come to view my computer, at whose screen I dutifully sit several hours a day, as a tool. No more, no less. It’s like a hammer. I know the basics of how it works. I know what to pound or not pound with a computer as well as I know what to pound or not pound with a hammer.

But some who study such things are less sanguine. Daron Acemoglu and Simon Johnson of MIT wrote in The New York Times: “Over time the channels of communication concentrated into a few hands, including Facebook, whose algorithm exacerbated political polarization and in some well-documented cases also fanned the flames of ethnic hatred. In authoritarian regimes, such as China, the same technologies have turned into tools of totalitarian control.”

Are they predicting our future? I hope not. They offer three ways to deal with ever more machine control of information. Alas, their plans require action by Congress, and most in Congress either don’t understand the potential for harm or find personal advantage (and campaign donations) coming from big tech. So don’t look to D.C. for solutions.

I offer only my own testimony. I’m not on Facebook. I get grins from my own kitties instead of Facebook’s “cute-kitty” posts. I’m not on Twitter. I’m not interested in seeking knowledge on a site that has little regard for truth. I find facts and opinions in books, magazines and newspapers. All vetted. Each of us must decide how to control our devices rather than letting them control us.

I seem to get along all right. Oh, there’s one of my cute kitties asking for a belly rub. Gotta go.

Bob Neal suspects that the proportion of members of Congress who know how to use a computer is lower than the proportion of people in nursing homes who do. Neal can be reached at bobneal@myfairpoint.net.