• Ours is the era of inadequate AI alignment theory. Any other facts about this era are relatively unimportant, but sometimes I tweet about them anyway.
  • Read writing from Eliezer Yudkowsky on Medium. Writing about things of no ultimate importance. If it was important it'd be on intelligence.org or lesswrong.com.
  • In response to the instrumental convergence concern, that autonomous decision-making systems with poorly designed goals would have default incentives...
  • In response to Habryka's shortform, I can confirm that I signed a concealed non-disparagement as part of my Anthropic separation agreement. I worked...
  • Eliezer Shlomo Yudkowsky is an American artificial intelligence specialist who researches the technological singularity and advocates for the creation...
  • Eliezer Yudkowsky is a foundational thinker on the long-term future of artificial intelligence.
  • Yudkowsky.
    278K views
    Published 20 Feb 2023