Day 3: Haskell Guide To Neural Networks

10 Days Of Grad: Deep Learning From The First Principles. Now that we have seen how neural networks work, we realize that understanding how gradients flow is essential for survival. Therefore, we will revisit our strategy at the lowest level. However, as neural networks become more complicated, calculating gradients by hand becomes a murky business. Yet, fear not, young padawan, there is a way out! I am very excited that today we will finally get acquainted with automatic differentiation, an essential tool in your deep learning arsenal.
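To give a flavor of what automatic differentiation looks like, here is a minimal forward-mode sketch using dual numbers (the article itself covers the topic in depth; the `Dual` type and `diff` function below are illustrative names, not the series' actual code):

```haskell
-- A dual number carries a value together with its derivative.
data Dual = Dual Double Double
  deriving (Show, Eq)

-- Arithmetic on duals propagates derivatives by the usual calculus rules.
instance Num Dual where
  Dual x x' + Dual y y' = Dual (x + y) (x' + y')
  Dual x x' - Dual y y' = Dual (x - y) (x' - y')
  Dual x x' * Dual y y' = Dual (x * y) (x' * y + x * y')  -- product rule
  abs (Dual x x')       = Dual (abs x) (x' * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0          -- constants have zero derivative

-- Derivative of f at x: seed the derivative component with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ d = f (Dual x 1) in d

main :: IO ()
main = print (diff (\x -> x * x + 3 * x) 2)  -- f'(x) = 2x + 3, so this prints 7.0
```

Because `Dual` is a `Num` instance, any polynomial written with ordinary arithmetic is differentiated exactly, with no symbolic manipulation and no finite-difference error.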

Day 2: What Do Hidden Layers Do?

10 Days Of Grad: Deep Learning From The First Principles. In the previous article, we introduced the concept of learning in a single-layer neural network. Today, we will learn about the benefits of multi-layer neural networks and how to design and train them properly. Sometimes I discuss neural networks with students who have just started discovering machine learning techniques: "I have built a handwritten digit recognition network. But my accuracy is only Y."

Day 1: Learning Neural Networks The Hard Way

10 Days Of Grad: Deep Learning From The First Principles. Neural networks are a topic that recurrently appears throughout my life. Once, when I was a BSc student, I became obsessed with the idea of building an "intelligent" machine. I spent a couple of sleepless nights thinking. I read a few essays shedding some light on this philosophical subject, among which the most prominent are perhaps Marvin Minsky's writings. As a result, I came across the idea of neural networks.

Delay Differential Equations

A fast and flexible library for solving delay differential equations.


Multi-expression programming is a genetic programming variant that encodes multiple solutions in the same chromosome.