scorching fire

Former Head of Google China Foresees an AI Crisis—and Proposes a Solution by Eliza Strickland

Kai-Fu Lee's new book, AI Superpowers: China, Silicon Valley, and the New World Order (Houghton Mifflin Harcourt), is something of a bait and switch. The first half explores the diverging AI capabilities of China and the United States and frames the discussion as a battle for global dominance. Then Lee boldly declares that we shouldn't waste time worrying about who will win, and says the "real AI crisis" will come from automation that wipes out whole job sectors, reshaping economies and societies in both nations.

“Lurking beneath this social and economic turmoil will be a psychological struggle,” he writes. “As more and more people see themselves displaced by machines, they will be forced to answer a far deeper question: In an age of intelligent machines, what does it mean to be human?”

In a wide-ranging Q&A with IEEE Spectrum, Lee not only explored this question further but also gave his answer.

I think this ultimately means too many idle humans feeling worthless (worth + less). Too many humans in general. I'm not a religious person, but I think the Bible has some instructive words in Proverbs 16:27.

Idle hands are the devil's workshop; idle lips are his mouthpiece.

Or the American Standard Version:

A worthless man devises mischief; and in his lips there is a scorching fire.

I think that means war. Not between humans and machines, but with ourselves. And it will be a brutal war. We won't be worried about how many are killed, because their lives will be deemed inessential. All the useful work will be done by machines.

Most people don’t think of their job just as a source of income. It brings meaning to their life; it’s their contribution to the world. That’s how we decided to structure our capitalistic society: there’s the idea that even by working routine jobs, people can make money and make better lives for their families. If we pull the rug out from under them and say, you have no job, but here’s some money from the government, I think that would lead to bad outcomes. Some would be happy and retire early. Some will learn a new skill and get a new job, but unfortunately many will learn the wrong one and get displaced again. A large number of people will be depressed. They will feel that life has no meaning, and this can result in suicide, substance abuse, and so on.

Notes From An Emergency

Notes From An Emergency by Maciej Ceglowski

...their software and algorithms affect the lives of billions of people. Decisions about how this software works are not under any kind of democratic control. In the best case, they are being made by idealistic young people in California with imperfect knowledge of life in a faraway place like Germany. In the worst case, they are simply being read out of a black-box algorithm trained on God knows what data.

This is a very colonial mentality! In fact, it’s what we fought our American War of Independence over, a sense of grievance that decisions that affected us were being made by strangers across the ocean.

Today we're returning the favor to all of Europe.

Facebook, for example, has only one manager in Germany to deal with every publisher in the country. One! The company that is dismantling the news industry in Germany doesn’t even care enough to send a proper team to manage the demolition.

Denmark has gone so far as to appoint an ambassador to the giant tech companies, an unsettling but pragmatic acknowledgement of the power relationship that exists between the countries of Europe and Silicon Valley.

So one question (speaking now as an EU citizen): how did we let this happen? We used to matter! We used to be the ones doing the colonizing! We used to be a contender!

How is it that some dopey kid in Palo Alto gets to decide the political future of the European Union based on what they learned at big data boot camp? Did we lose a war?

Hat tip to Nicola Losito for the link to the text version of Maciej Ceglowski's talk given on May 10, 2017, at the re:publica conference in Berlin.

However, one part of this is problematic.

The right to opt out of data collection while continuing to use services.

Some people consider the IP address of a computer to be personal data. HOW THE FUCK IS THAT SUPPOSED TO WORK?

Deceived by Design

DECEIVED BY DESIGN

Facebook and Google have privacy intrusive defaults, where users who want the privacy friendly option have to go through a significantly longer process. They even obscure some of these settings so that the user cannot know that the more privacy intrusive option was preselected.

The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy friendly choices. Choices are worded to compel users to make certain choices, while key information is omitted or downplayed. None of them lets the user freely postpone decisions. Also, Facebook and Google threaten users with loss of functionality or deletion of the user account if the user does not choose the privacy intrusive option.

[...]

The combination of privacy intrusive defaults and the use of dark patterns, nudge users of Facebook and Google, and to a lesser degree Windows 10, toward the least privacy friendly options to a degree that we consider unethical. We question whether this is in accordance with the principles of data protection by default and data protection by design, and if consent given under these circumstances can be said to be explicit, informed and freely given.

The Norwegian Consumer Council has published a scathing report on the dark patterns social media and tech companies use to trick people into giving up their privacy.
