
dc.contributor.author  Hidalgo, César A.
dc.contributor.author  Orghian, Diana
dc.contributor.author  Albo-Canals, Jordi
dc.contributor.author  Almeida, Filipa de
dc.contributor.author  Martin, Natalia
dc.date.accessioned  2022-02-21T15:13:25Z
dc.date.available  2022-02-21T15:13:25Z
dc.date.issued  2020
dc.identifier  ONIX_20220221_9780262363266_131
dc.identifier.uri  https://directory.doabooks.org/handle/20.500.12854/78611
dc.description.abstract  How people judge humans and machines differently, in scenarios involving natural disasters, labor displacement, policing, privacy, algorithmic bias, and more. How would you feel about losing your job to a machine? How about a tsunami alert system that fails? Would you react differently to acts of discrimination depending on whether they were carried out by a machine or by a human? What about public surveillance? How Humans Judge Machines compares people's reactions to actions performed by humans and machines. Using data collected in dozens of experiments, this book reveals the biases that permeate human-machine interactions. Are there conditions in which we judge machines unfairly? Is our judgment of machines affected by the moral dimensions of a scenario? Is our judgment of machines correlated with demographic factors such as education or gender? César Hidalgo and colleagues use hard science to take on these pressing technological questions. Using randomized experiments, they create revealing counterfactuals and build statistical models to explain how people judge artificial intelligence and whether they do it fairly. Through original research, How Humans Judge Machines brings us one step closer to understanding the ethical consequences of AI. Written by César A. Hidalgo, the author of Why Information Grows and coauthor of The Atlas of Economic Complexity (MIT Press), together with a team of social psychologists (Diana Orghian and Filipa de Almeida) and a roboticist (Jordi Albo-Canals), How Humans Judge Machines presents a unique perspective on the nexus between artificial intelligence and society. Anyone interested in the future of AI ethics should explore the experiments and theories in How Humans Judge Machines.
dc.language  English
dc.relation.ispartofseries  The MIT Press
dc.subject.classification  thema EDItEUR::U Computing and Information Technology::UY Computer science::UYQ Artificial intelligence::UYQM Machine learning  en_US
dc.subject.classification  thema EDItEUR::U Computing and Information Technology::UB Information technology: general topics::UBJ Digital and information technologies: social and ethical aspects  en_US
dc.subject.other  A.I. Ethics
dc.subject.other  Artificial Intelligence
dc.subject.other  Robotics
dc.subject.other  Psychology
dc.subject.other  Automation
dc.subject.other  Future of Work
dc.subject.other  Fourth Industrial Revolution
dc.subject.other  Algorithmic Bias
dc.subject.other  Privacy
dc.subject.other  Labor Displacement
dc.subject.other  Machine Ethics
dc.subject.other  Moral Psychology
dc.subject.other  Ethics
dc.subject.other  Human-Robot Interactions
dc.subject.other  Positive Philosophy
dc.subject.other  Moral Experiments
dc.subject.other  Intention
dc.subject.other  Moral Foundations Theory
dc.subject.other  Computational Creativity
dc.subject.other  Uncertainty
dc.subject.other  Fairness
dc.subject.other  Bias
dc.subject.other  Differential Privacy
dc.subject.other  Anonymity
dc.subject.other  Wrongness
dc.subject.other  Demographics
dc.subject.other  Moral Foundations
dc.subject.other  Laws of Robotics
dc.subject.other  Legal Implications of Robotics
dc.subject.other  Bureaucracies
dc.title  How Humans Judge Machines
dc.type  book
oapen.relation.isPublishedBy  ae0cf962-f685-4933-93d1-916defa5123d
oapen.relation.isbn  9780262363266
oapen.relation.isbn  9780262045520
oapen.imprint  The MIT Press
oapen.pages  256
oapen.place.publication  Cambridge



Except where otherwise noted, this item's license is described as Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (http://creativecommons.org/licenses/by-nc-nd/4.0).