(NeurIPS 2019) Invited Talk : From System1 Deep Learning to System2 Deep Learning

Video : https://slideslive.com/38921750/from-system-1-deep-learning-to-system-2-deep-learning

Slides : https://drive.google.com/file/d/1zbe_N8TmAEvPiKXmn6yZlRkFehsAUS8Z/view


Is it enough to just keep scaling datasets, model size, and compute?

  • Narrow AI
  • Sample Efficiency
  • Human-provided labels
  • Robustness, stupid errors

Cognition

Book recommendation : Thinking, Fast and Slow by Daniel Kahneman
System 1 --> System 2 : conscious, linguistic, algorithmic, planning, reasoning

  • Out-of-distribution generalization & transfer
  • Higher-level Cognition
  • Agent Perspective

Dealing with distribution change

ICML 2019, “Nature does not shuffle the data, we shouldn’t.”

  • Shuffling destroys the information about distribution change that we would want to 'collect' from the data.

From IID to OOD

  • We need something to replace the i.i.d. assumption.

Dynamically recombine existing concepts.

Attention

From Attention To Indirection

  • attention = dynamic connection (see the sketch below)
  • Indirection : Keep track of ‘named’ objects
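
A minimal numpy sketch (my illustration, not code from the talk) of attention as a soft key-value lookup: the query picks out slots by content, so the "connection" is computed on the fly rather than hard-wired.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Soft key-value lookup: the query 'connects' to whichever slots
    its content matches, instead of reading a fixed address."""
    scores = keys @ query / np.sqrt(len(query))  # similarity to each key
    weights = softmax(scores)                    # soft, dynamic pointer
    return weights @ values                      # blended read-out

rng = np.random.default_rng(0)
keys = rng.standard_normal((3, 8))     # three 'named' object slots
values = rng.standard_normal((3, 8))
query = keys[1]                        # ask for what slot 1 names
out = attend(query, keys, values)      # dominated by values[1]
```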

From Attention to Consciousness

  • Global Workspace Theory

High-level representations <--> Language

  • Grounded language learning (BabyAI, ICLR 2019)

The Consciousness Prior: Sparse Factor Graph

  • Property of the high-level variables we manipulate with language : we can predict some given very few others
    • E.g. “If I drop the ball, it will fall on the ground.”
  • Disentangled factors != marginally independent
  • Prior : Sparse factor graph joint distribution between high-level variables (see the sketch below)
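
A toy illustration (my notation, not from the slides) of the prior: the joint over high-level variables factorizes into factors that each touch only a few variables, which is exactly what makes one variable predictable from very few others.

```python
import itertools
import numpy as np

# Sparse factor graph over 4 binary variables:
#   P(s0, s1, s2, s3) ∝ f01(s0, s1) * f12(s1, s2) * f23(s2, s3)
# Each factor ('rule') touches only 2 of the 4 variables.
pair = np.array([[2.0, 0.5],
                 [0.5, 2.0]])                   # favours agreeing neighbours
factors = {(0, 1): pair, (1, 2): pair, (2, 3): pair}

def unnormalized(s):
    p = 1.0
    for (i, j), f in factors.items():
        p *= f[s[i], s[j]]
    return p

states = list(itertools.product([0, 1], repeat=4))
# Sparsity makes a variable predictable from very few others:
# P(s3 | s2) depends on the single factor f23 alone.
num = sum(unnormalized(s) for s in states if s[2] == 1 and s[3] == 1)
den = sum(unnormalized(s) for s in states if s[2] == 1)
print(num / den)  # 0.8 = f23(1,1) / (f23(1,0) + f23(1,1))
```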

Meta-Learning

Multiple time-scales of learning
End-to-end learning to generalize OOD + fast transfer (sketched below)
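
A Reptile-style toy sketch (my example; the talk argues for meta-learning across distribution changes, not this particular algorithm) of the two time-scales: a fast inner loop adapts to each new task, while a slow outer loop accumulates what transfers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # Each task is a small distribution shift: a new slope for y = w * x,
    # drawn around a shared mean the slow learner can discover.
    return 1.0 + 0.3 * rng.standard_normal()

w_meta = 0.0                        # slow time-scale: shared initialization
for outer in range(2000):           # outer loop over tasks / shifts
    w_true = sample_task()
    w = w_meta
    for inner in range(5):          # fast time-scale: adapt to this task
        x = rng.standard_normal(32)
        grad = 2.0 * np.mean((w * x - w_true * x) * x)  # d(MSE)/dw
        w -= 0.1 * grad
    w_meta += 0.01 * (w - w_meta)   # slow meta-update toward adapted weights

print(w_meta)  # drifts toward ~1.0, the structure shared across tasks
```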

Dynamically Composable Modules

Hypotheses for conscious processing by agents, systematic generalization

  • Sparse factor graph in space of high-level semantic variables
  • Semantic variables are causal: agents, intentions, controllable objects
  • Shared ‘rules’ across instance tuples (arguments); see the sketch below
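
A tiny sketch (my illustration, loosely in the spirit of modular architectures such as Recurrent Independent Mechanisms; all names are made up) of one shared 'rule' applied to different argument tuples:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                        # slot (object) embedding size
W = 0.1 * rng.standard_normal((D, 2 * D))    # ONE shared rule parameterization

def rule(agent, obj):
    """The same parameters process every (agent, object) tuple,
    like 'drop(x) -> falls(x)' holding for any object x."""
    return np.tanh(W @ np.concatenate([agent, obj]))

slots = rng.standard_normal((3, D))          # 'named' object slots
effect_a = rule(slots[0], slots[1])          # rule bound to one tuple
effect_b = rule(slots[2], slots[1])          # same rule, different arguments
```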

Conclusion

  • Time for ML to explore consciousness
  • OOD generalization
  • From System 1 to System 2 …