Fuck Facebook

How hackers can force AI to make dumb mistakes 



Can you tell the difference between this toy turtle and a rifle? Google’s AI couldn’t.
A few months ago, a group of MIT students published a proof of concept showing that, by subtly altering the texture of a 3D-printed toy turtle, they could trip up Google's image-recognition AI while the object still looked like an ordinary turtle to the human eye. It's one of several projects that reveal the fundamental differences between the way artificial intelligence and humans see the world—and how dangerous those differences can become.

Slowly but surely, AI is taking over tasks that were previously the exclusive domain of humans, from classifying images and analyzing surveillance camera footage to detecting cancer, fighting cybercrime and driving cars, and much more. Yet, despite their superhuman speed and accuracy, AI algorithms can fail spectacularly, such as mistaking turtles for rifles or dunes for nudes.
While these errors often yield comic results, they can also become dangerous given how prominent AI algorithms have become in the critical things we do every day. And if the scientists and engineers creating these algorithms don't address their weaknesses, malicious actors can weaponize them in adversarial attacks that do real damage, in both the virtual and the physical world.
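The idea behind these adversarial attacks can be made concrete with a minimal sketch. The snippet below assumes the fast gradient sign method (FGSM), one of the simplest attack techniques, applied to a toy linear model standing in for a real neural network; all names and numbers here are illustrative, not taken from the MIT project.

```python
import numpy as np

# Toy stand-in for a trained classifier: a fixed linear model whose
# score's sign decides the class (say, negative = "turtle", positive = "rifle").
rng = np.random.default_rng(0)
w = rng.normal(size=100)          # fixed model weights
x = -0.1 * np.sign(w)             # an input the model scores as negative

def score(x):
    return w @ x                  # the model's decision function

# FGSM: nudge every input component by eps in the direction of the
# gradient's sign. For a linear model that gradient is simply w.
eps = 0.2
x_adv = x + eps * np.sign(w)

print(score(x) < 0)               # original input: classified negative
print(score(x_adv) > 0)           # perturbed input: classification flips
```

For a real network the gradient would come from backpropagation rather than the closed-form `w`, but the principle is the same: a perturbation bounded by a small `eps` per pixel, too faint for a human to notice, can flip the model's decision.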

Read More Here 


America The Beautiful

Zuckerberg Denies Whitelisting, Selling User Data

Facebook CEO Mark Zuckerberg denied that his social network had ever sold users' data, rejecting an argument from a member of the British Parliament, who released documents he claimed showed the company had "whitelisted" third-party access to user data without users' consent.
"In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access," Zuckerberg posted Wednesday on Facebook. "This change meant that a lot of sketchy apps – like the quiz app that sold data to Cambridge Analytica – could no longer operate on our platform."
More Here