Credit assignment through broadcasting a global error vector

Backprop (BP) uses detailed, unit-specific feedback to train DNNs with remarkable success. Can DNNs be trained as efficiently by broadcasting a global, non-unit-specific learning signal to all units and applying local, Hebbian-like updates to the weights? We show that the answer is yes! ...with some tricks: vector-valued units and nonnegative weights.
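To make the broadcast idea concrete, here is a minimal sketch of one generic global-signal scheme: node perturbation on a single linear layer, where every weight update is the product of a single global scalar error change, the unit's own noise, and presynaptic activity. This is a stand-in illustration of "global signal × local Hebbian-like term," not the paper's algorithm (which uses vector-valued units and nonnegative weights); all names and the toy regression task are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression task (synthetic, for illustration only)
n_in, n_out, n_samples = 5, 3, 200
W_true = rng.standard_normal((n_out, n_in))
X = rng.standard_normal((n_samples, n_in))
Y = X @ W_true.T

W = np.zeros((n_out, n_in))
lr, sigma = 0.02, 0.01

def global_error(W):
    # One scalar summarizing performance over the whole dataset
    return np.mean((X @ W.T - Y) ** 2)

losses = [global_error(W)]
for step in range(2000):
    x = X[step % n_samples]
    y = Y[step % n_samples]
    y_hat = W @ x
    noise = sigma * rng.standard_normal(n_out)
    e_base = np.sum((y_hat - y) ** 2)           # global scalar error
    e_pert = np.sum((y_hat + noise - y) ** 2)   # error after perturbing all units
    delta = e_pert - e_base                     # ONE number broadcast to every unit
    # Local, Hebbian-like update: global signal x unit's noise x presynaptic activity
    W -= lr * (delta / sigma**2) * np.outer(noise, x)
    losses.append(global_error(W))
```

In expectation the update equals a gradient step, yet no unit-specific error is ever routed backward; each weight only sees the broadcast scalar and locally available activity.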

Unsupervised discovery of temporal structure in noisy data with dynamical components analysis

Dynamical components analysis (DCA) is a linear dimensionality reduction method for high-dimensional time-series data that extracts dynamical structure by maximizing an information-theoretic objective. DCA outperforms methods such as Principal Components Analysis and Slow Feature Analysis at extracting dynamical structure on several datasets.
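The flavor of the objective can be sketched in a toy form: find a projection whose past is maximally informative about its future. The sketch below uses a one-step Gaussian mutual information for a single component on synthetic data; the actual DCA objective uses multi-step past/future windows and multiple components, and the data-generating setup here is invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Synthetic data: one slow AR(1) latent embedded in noisy 10-D observations
T, d = 2000, 10
z = np.zeros(T)
for t in range(1, T):
    z[t] = 0.95 * z[t - 1] + rng.standard_normal()
mix = rng.standard_normal(d)                     # mixing direction of the latent
X = np.outer(z, mix) + 2.0 * rng.standard_normal((T, d))

def neg_pred_info(w):
    # Gaussian mutual information between consecutive samples of the projection:
    # I = -0.5 * log(1 - rho^2); we minimize its negative.
    w = w / (np.linalg.norm(w) + 1e-12)
    y = X @ w
    rho = np.corrcoef(y[:-1], y[1:])[0, 1]
    return 0.5 * np.log(1.0 - rho ** 2)

res = minimize(neg_pred_info, rng.standard_normal(d))
w_dca = res.x / np.linalg.norm(res.x)
```

The recovered direction `w_dca` should align (up to sign) with the true mixing direction, since that projection carries the most temporal predictability.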

Neuromorphic Kalman filter implementation in IBM's TrueNorth

As computation moves into a post-Moore's-law era, novel architectures continue to emerge. With composite, multi-million-connection neuromorphic chips like IBM's TrueNorth, neural engineering has now become a feasible technology in this …