How to Stay Smart in a Smart World

Why Human Intelligence Still Beats Algorithms

Gerd Gigerenzer


Publisher: The MIT Press

Humanities, Art, Music / Psychology

Description

How to stay in charge in a world populated by algorithms that beat us in chess, find us romantic partners, and tell us to "turn right in 500 yards."

Doomsday prophets of technology predict that robots will take over the world, leaving humans behind in the dust. Tech industry boosters think replacing people with software might make the world a better place, while tech industry critics warn darkly about surveillance capitalism. Despite their differing views of the future, they all seem to agree: machines will soon do everything better than humans. In How to Stay Smart in a Smart World, Gerd Gigerenzer shows why that's not true, and tells us how we can stay in charge in a world populated by algorithms.

Machines powered by artificial intelligence are good at some things (playing chess), but not others (life-and-death decisions, or anything involving uncertainty). Gigerenzer explains why algorithms often fail at finding us romantic partners (love is not chess), why self-driving cars fall prey to the Russian Tank Fallacy, and how judges and police rely increasingly on nontransparent "black box" algorithms to predict whether a criminal defendant will reoffend or show up in court. He invokes Black Mirror, considers the privacy paradox (people want privacy but give their data away), and explains that social media get us hooked by programming intermittent reinforcement in the form of the "like" button. We shouldn't trust smart technology unconditionally, Gigerenzer tells us, but we shouldn't fear it unthinkingly, either.

More titles in this category
Faceplant by Keeley Hurley
Forgetting by Scott A. Small
The Economics of Airlines by Volodymyr Bilotkach
Beyond Happy by Mark Fabian
Inner Sense by Caroline Williams

Customer reviews