Worst-Case Thinking Breeds Fear and Irrationality – Schneier on Security by Bruce Schneier (schneier.com)

Here’s a crazy story from the UK. Basically, someone sees a man and a little girl leaving a shopping center. Instead of thinking “it must be a father and daughter, which happens millions of times a day and is perfectly normal,” he thinks “this is obviously a case of child abduction and I must alert the authorities immediately.” And the police, instead of thinking “why in the world would this be a kidnapping and not a normal parental activity,” think “oh my god, we must all panic immediately.” And they do, scrambling helicopters, searching cars leaving the shopping center, and going door-to-door looking for clues. Seven hours later, the police finally realize that the girl is safe, asleep in bed.

Fear is the mind-killer.

Security Design: Stop Trying to Fix the User by Bruce Schneier (schneier.com)

We must stop trying to fix the user to achieve security. We’ll never get there, and research toward those goals just obscures the real problems. Usable security does not mean “getting people to do what we want.” It means creating security that works, given (or despite) what people do. It means security solutions that deliver on users’ security goals without — as the 19th-century Dutch cryptographer Auguste Kerckhoffs aptly put it — “stress of mind, or knowledge of a long series of rules.”

Old (by Internet standards) but still relevant.

Facebook and Cambridge Analytica – Schneier on Security (schneier.com)

The first step to any regulation is transparency. Who has our data? Is it accurate? What are they doing with it? Who are they selling it to? How are they securing it? Can we delete it? I don’t see any hope of Congress passing a GDPR-like data protection law anytime soon, but it’s not too far-fetched to demand laws requiring these companies to be more transparent in what they’re doing.