Deep Learning and Cognition
Deep learning has had enormous success on perceptual tasks but still struggles to provide a model for inference. To address this gap, we have been developing Compositional Attention Networks (CANs).
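To make the idea of a compositional attention step concrete, here is a minimal sketch of the kind of soft-attention "reasoning" step such networks chain together. The function name, the gated memory update, and all variable names are illustrative assumptions, not the actual CAN/MAC cell described in the talk.

```python
# Illustrative sketch only: one soft-attention reasoning step of the kind
# compositional attention models iterate. The gated update rule is an
# assumption for illustration, not the authors' published cell.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_step(control, memory, knowledge):
    """Attend over knowledge items using the control vector, then blend
    the retrieved summary into the memory state."""
    scores = knowledge @ control          # similarity of each item to the control state
    weights = softmax(scores)             # attention distribution over items
    retrieved = weights @ knowledge       # weighted summary vector
    gate = 1.0 / (1.0 + np.exp(-(memory @ retrieved)))  # assumed scalar gate
    return gate * retrieved + (1.0 - gate) * memory

d, n_items = 8, 5
rng = np.random.default_rng(0)
memory = attention_step(rng.normal(size=d),            # control vector
                        rng.normal(size=d),            # previous memory
                        rng.normal(size=(n_items, d))) # knowledge base
print(memory.shape)  # (8,)
```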
In this talk I will explore how data science is changing the way we practice education and research.
Privacy concerns are becoming a major obstacle to using data in the way that we want.
Recent years have seen impressive progress in robot control and perception, including adept manipulation, aggressive quadrotor maneuvers, dense metric map reconstruction, and real-time object recognition.
We start by quickly reviewing 50 years of computer architecture to show that there is now widespread agreement on instruction set architecture (ISA). Despite this harmony, and unlike most other fields, there is no open alternative to the proprietary offerings from ARM and Intel.
The most widely used optimization method in machine learning practice is the Stochastic Gradient Method (SGM), of which the classic Perceptron algorithm is an early special case.
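As a point of reference, here is a minimal sketch of the Stochastic Gradient Method on a squared-loss objective; the loss, step size, and synthetic data are assumptions chosen for brevity rather than anything specific to the talk.

```python
# Minimal SGM sketch: at each step, move the parameters against the gradient
# of the loss on a single randomly chosen example.
import numpy as np

def sgm(X, y, steps=1000, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        i = rng.integers(len(y))              # sample one example
        grad = (X[i] @ w - y[i]) * X[i]       # gradient of 0.5 * (x_i . w - y_i)^2
        w -= lr * grad                        # stochastic gradient step
    return w

# Usage on synthetic linear data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)
print(sgm(X, y))  # should approach w_true
```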
In this talk, I will briefly elaborate on our current efforts to address