This document discusses using differentiable programming and memory networks to build machines capable of reasoning. It describes how attention and memory mechanisms support reasoning tasks by focusing computation on relevant information and by overcoming limits on how much information a model can store and process. Application areas discussed include machine reading, dialog state tracking, and end-to-end dialog learning. Memory networks are presented as an end-to-end trainable architecture for these tasks, built around a non-parametric memory accessed through attention. The document concludes by noting that this remains an open field of research, with opportunities in theoretical analysis and in improving learning procedures.
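To make the core mechanism concrete, below is a minimal sketch of reading from a non-parametric memory via soft attention: the query is scored against every stored slot, the scores are normalized with a softmax, and the result is a weighted sum of the slot contents. The function names, the key/value split, and the dot-product scoring are illustrative assumptions for this sketch, not the document's exact formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_read(query, memory_keys, memory_values):
    """Soft attention over memory slots (illustrative sketch).

    query:         (d,)   vector encoding the current question or state
    memory_keys:   (n, d) one key embedding per stored memory slot
    memory_values: (n, d) one value embedding per stored memory slot
    """
    scores = memory_keys @ query      # relevance of each slot to the query
    weights = softmax(scores)         # attention distribution over slots
    return weights @ memory_values    # weighted sum: the retrieved memory

# Toy usage: 4 memory slots of dimension 3 and a random query.
rng = np.random.default_rng(0)
memory_keys = rng.normal(size=(4, 3))
memory_values = rng.normal(size=(4, 3))
query = rng.normal(size=3)
print(memory_read(query, memory_keys, memory_values))
```

Because every step is differentiable, the attention weights and embeddings can be trained end to end from task supervision, which is what makes the memory "non-parametric": its size grows with the stored items rather than being fixed in the model's weights.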