Deep Learning

Day 9: Roaming The Latent Space

Imagine you are a designer and you want a new font: a little heavier, with rounder letters, more casual, or a bit fancier. Could this font be created just by tuning a handful of parameters? Or imagine you are a fashion designer who would like to create a new collection as a mix of previous seasons. Or a musician desperately looking for inspiration.
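
In practice, "roaming" means moving between points in a generative model's latent space. As a minimal sketch (the decoder and the latent vectors zA and zB are hypothetical, standing in for a trained model), even plain linear interpolation between two latent codes blends two designs:

    -- Linear interpolation between two latent vectors; feeding the
    -- result to a trained decoder would blend the two designs.
    lerp :: Double -> [Double] -> [Double] -> [Double]
    lerp t = zipWith (\a b -> (1 - t) * a + t * b)

    -- Hypothetical usage: decode (lerp 0.5 zA zB) blends two fonts.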

Day 8: Model Uncertainty Estimation

Wouldn't it be nice if the model also told us which of its predictions are not reliable? The good news is that this can be done, even on new, completely unseen data, and it is simple to implement in practice. A canonical example is the medical setting: by measuring model uncertainty, a doctor can learn how reliable a patient's AI-assisted diagnosis is. This allows the doctor to make a better-informed decision about whether to trust the model.
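
One common recipe (my assumption here, since the post is only teased; the article itself may use a different method) is an ensemble: run several independently trained models on the same input and treat the spread of their predictions as the uncertainty. A minimal Haskell sketch:

    -- Predictive mean and standard deviation over an ensemble of models.
    -- A large standard deviation flags an unreliable prediction.
    meanStd :: [Double] -> (Double, Double)
    meanStd ys = (mu, sqrt var)
      where
        n   = fromIntegral (length ys)
        mu  = sum ys / n
        var = sum [(y - mu) ^ 2 | y <- ys] / n

    uncertainty :: [a -> Double] -> a -> (Double, Double)
    uncertainty models x = meanStd (map ($ x) models)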

Day 7: Real World Deep Learning

So far we have explored neural networks almost in a vacuum. Although we have provided some illustrations for better clarity, relying on an existing framework would allow us to benefit from the knowledge of previous contributors. One such framework is called Hasktorch. Among the practical reasons to use Hasktorch is that it builds on the mature Torch tensor library. Another good reason is strong GPU acceleration, which is necessary for almost any serious deep learning project.
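
As a small taste of what this looks like (a minimal sketch; I am assuming the asTensor, matmul, and relu functions from Hasktorch's Torch umbrella module, so check them against the version you install), here is a dense layer with a ReLU applied to a tensor:

    import Torch

    main :: IO ()
    main = do
      let x = asTensor ([[1, 2, 3]] :: [[Float]])           -- 1x3 input
          w = asTensor ([[0.1], [0.2], [0.3]] :: [[Float]]) -- 3x1 weights
          y = relu (matmul x w)                             -- dense layer + ReLU
      print y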

Hardware-Efficient Stochastic Binary CNN Architectures for Near-Sensor Computing

With recent advances in the field of artificial intelligence (AI) such as binarized neural networks (BNN), a wide variety of vision applications with energy-optimized implementations have become possible at the edge. Such networks have the first …

Autonomous Factory

Autonomous molecule production factory powered by artificial intelligence.

Making Your Self-Driving Robot

We will build an autonomous robot. The video stream captured by the robot's camera will be analyzed by a neural network. The network will run on an onboard Raspberry Pi that will steer the robot. Before you start the project, I want you to answer two questions. First, how will everything be attached mechanically? Second, what will be the energy source? While your autonomous robot can work from a cardboard box, having a mechanically sound chassis will greatly improve the results at the AI training stage.
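
To make the data flow concrete, here is a sketch of the capture-infer-steer loop; every function below is a hypothetical stub, since camera capture and motor control depend entirely on your hardware and libraries:

    type Frame = [[Double]]        -- grayscale camera frame
    newtype Command = Steer Double -- steering angle in [-1, 1]

    captureFrame :: IO Frame
    captureFrame = pure (replicate 64 (replicate 64 0.0))  -- stub: read the camera

    predict :: Frame -> Command
    predict _ = Steer 0.0                                  -- stub: run the network

    actuate :: Command -> IO ()
    actuate (Steer a) = putStrLn ("steer " ++ show a)      -- stub: drive the motors

    main :: IO ()
    main = mapM_ (\_ -> captureFrame >>= actuate . predict) [1 .. 10 :: Int]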

In-Memory Resistive RAM Implementation of Binarized Neural Networks for Medical Applications

The advent of deep learning has considerably accelerated machine learning development. The deployment of deep neural networks at the edge is however limited by their high memory and energy consumption requirements. With new memory technology …

Day 6: Saving Energy with Binarized Neural Networks

Last week Apple acquired the startup XNOR.ai for an impressive $200 million. The startup is known for promoting binarized neural network algorithms that save energy and computational resources. That is definitely the way to go for mobile devices, and Apple has just acknowledged that it is a great deal for them too. I feel now is a good time to explain what binarized neural networks are, so that you can better appreciate their value to the industry.
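
The core trick, sketched here under the standard BNN formulation rather than XNOR.ai's specific implementation, is that a dot product of two vectors with entries in {-1, +1} reduces to an XNOR followed by a popcount, replacing floating-point multiply-accumulates with a couple of bitwise instructions:

    import Data.Bits (complement, popCount, xor)
    import Data.Word (Word64)

    -- Dot product of two length-64 vectors with entries in {-1, +1},
    -- packed one sign per bit (set bit = +1, clear bit = -1).
    -- XNOR marks the positions where the signs agree; each agreement
    -- contributes +1 and each disagreement -1, hence 2 * matches - 64.
    binaryDot64 :: Word64 -> Word64 -> Int
    binaryDot64 a b = 2 * popCount (complement (a `xor` b)) - 64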

Digital Biologically Plausible Implementation of Binarized Neural Networks with Differential Hafnium Oxide Resistive Memory Arrays

The brain performs intelligent tasks with extremely low energy consumption. This work takes its inspiration from two strategies used by the brain to achieve this energy efficiency: the absence of separation between computing and memory functions and …

Day 5: Convolutional Neural Networks Tutorial

Today we will talk about one of the most important deep learning architectures, the "master algorithm" of computer vision. That is what François Chollet, the author of Keras, calls convolutional neural networks (CNNs). A convolutional network is an architecture that, like other artificial neural networks, has the neuron as its core building block. It is also differentiable, so the network is conveniently trained via backpropagation. The distinctive feature of CNNs, however, is their connection topology, resulting in sparsely connected convolutional layers whose neurons share their weights.
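
As a reference for the tutorial, here is a minimal sketch of the operation itself: a valid-mode 2D convolution (written as cross-correlation, as deep learning libraries usually do), where one small kernel slides over the image, so each output neuron sees only a small patch and all of them share the same weights:

    -- Valid-mode 2D cross-correlation: the same kernel weights are
    -- reused at every image position (weight sharing), and each output
    -- depends only on a small patch (sparse connectivity).
    conv2d :: [[Double]] -> [[Double]] -> [[Double]]
    conv2d kernel image =
      [ [ sum [ kernel !! ki !! kj * image !! (i + ki) !! (j + kj)
              | ki <- [0 .. kh - 1], kj <- [0 .. kw - 1] ]
        | j <- [0 .. w - kw] ]
      | i <- [0 .. h - kh] ]
      where
        kh = length kernel
        kw = length (head kernel)
        h  = length image
        w  = length (head image)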