Eliezer Yudkowsky

Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American researcher in the field of artificial intelligence. He focuses on the technological singularity and is a proponent of "friendly artificial intelligence" (FAI). He lives in Redwood City, California.

Life

Yudkowsky finished his schooling at the age of 12. He has since been self-taught and has no formal training in artificial intelligence. In 2000 he co-founded, along with Ray Kurzweil, the non-profit organization Singularity Institute for Artificial Intelligence (SIAI), which has been known as the Machine Intelligence Research Institute (MIRI) since the end of January 2013. He is still employed there as a researcher.

Work

In his research, Yudkowsky tries to develop a theory that would make it possible to create an artificial intelligence with reflexive self-understanding, one that is also able to modify itself and recursively improve without changing its original preferences (see Seed AI, Friendly AI, and Coherent Extrapolated Volition). In addition to his theoretical research, Yudkowsky has written several introductions to philosophical topics, such as the text "An Intuitive Explanation of Bayes' Theorem".
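The Bayes essay mentioned above builds intuition through worked numerical examples. A minimal sketch of the theorem in Python, using the classic mammography-screening numbers commonly used in such introductions (1% base rate, 80% sensitivity, 9.6% false-positive rate):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# where P(E) is expanded over the hypothesis and its complement.
def posterior(prior, likelihood, false_positive_rate):
    """Probability of the hypothesis given a positive test result."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Classic screening example: 1% prevalence, 80% true-positive rate,
# 9.6% false-positive rate -> posterior of roughly 7.8%.
print(round(posterior(0.01, 0.80, 0.096), 3))
```

The counterintuitive result, a positive test still leaves the hypothesis unlikely, is exactly the kind of point such intuitive explanations of the theorem aim to convey.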

Overcoming Bias and LessWrong

Alongside Robin Hanson, Yudkowsky was one of the main contributors to the blog Overcoming Bias, which was sponsored by the Future of Humanity Institute at Oxford University. In the spring of 2009, he helped found the community blog LessWrong, which aims to increase human rationality. Yudkowsky wrote the so-called "Sequences" on LessWrong, which consist of more than 600 blog posts and cover topics such as epistemology, artificial intelligence, errors of human rationality, and metaethics.

Other Publications

He contributed two chapters to the anthology "Global Catastrophic Risks", edited by Nick Bostrom and Milan Ćirković.

Yudkowsky is also the author of the following publications: "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).

Yudkowsky has also written several works of fiction. His Harry Potter fan fiction "Harry Potter and the Methods of Rationality" is currently the most popular Harry Potter story on the fanfiction.net website; it deals with topics in cognitive science, rationality, and philosophy in general, and has been reviewed positively by, among others, David Brin and the programmer Eric S. Raymond.
