Neural networks are another machine learning technique, but one that stores and processes data in a form loosely modeled on the neurons in the human brain: connections, axons, synapses, impulses between neurons, those sorts of things. These constructs are called artificial neural networks (ANNs). The idea is to bring the end result a bit closer to how the human brain may work, emulating our own organic logic units. Data scientists often begin building ANNs with Keras, an API for building and training neural networks. ANNs are at the core of deep learning.
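Stripped of any framework, the "neuron" in the brain metaphor is just a weighted sum of incoming signals passed through an activation function. Here is a minimal sketch in plain Python; the input, weight, and bias values are made-up examples, not anything from a real model:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: the "impulses" arriving over each
    connection are multiplied by that connection's weight, summed
    with a bias, then squashed by a sigmoid activation."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid: output between 0 and 1

# Hypothetical example: three inputs feeding a single neuron.
output = neuron(inputs=[0.5, -1.0, 2.0], weights=[0.4, 0.3, 0.1], bias=0.0)
print(round(output, 3))
```

An ANN is many of these units wired together, with the weights adjusted during training rather than set by hand as they are here.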

ANNs bring a multi-layered approach to modeling, which leads us to describe the models as deep and the process of training them as deep learning. Neural networks with many layers are called deep neural networks (DNNs), so the difference between an ANN and a DNN is one of scale: a larger, multi-layered ANN is a DNN, and deep learning is what trains it.
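The layered structure is easier to see in code than in prose. Below is a minimal sketch, in plain Python, of a forward pass through a stack of layers; the toy network shape (2 inputs, 3 hidden neurons, 1 output) and all of the weights are arbitrary illustrative values, not a trained model:

```python
import math

def layer(inputs, weights, biases):
    """One dense layer: each output neuron computes a weighted sum
    of all inputs plus a bias, then applies a sigmoid activation."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

def forward(x, layers):
    """Feed an input through a stack of layers -- the "deep" in
    deep learning is simply how many layers the signal crosses."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# Hypothetical toy network: 2 inputs -> 3 hidden neurons -> 1 output.
layers = [
    ([[0.1, 0.2], [0.3, -0.1], [-0.2, 0.4]], [0.0, 0.1, -0.1]),  # hidden layer
    ([[0.5, -0.3, 0.2]], [0.0]),                                  # output layer
]
prediction = forward([1.0, 2.0], layers)
print(prediction)
```

Making a network "deeper" is just appending more entries to that list of layers; the forward pass does not change, only the number of transformations the data passes through.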

Deep learning powers feats such as the classification of billions of images in Google Images, the speech recognition behind Apple's Siri, and DeepMind's AlphaGo, which defeated the world champion at Go.