Showing posts from May, 2020

Fibonacci sequences of higher order: The Nibonacci Sequence

So today I was messing around with optimising the calculation of Fibonacci sequences. Everyone has written the naive recursive version (see the first sketch below). Quickly we run into performance issues because many of the calculations are redundant, and these repeated calculations result in massive function call stacks. So the obvious solution here is to cache the intermediate results, as in the memoised version sketched below.

I wanted to create a Fibonacci sequence of order N (or Nibonacci sequence) using these same principles. First we need a cache that will store the N previous Nibonacci numbers. I designed a data structure for this, and I think it fits the bill. Glorious, right? It is essentially a circular buffer, and it will work as expected when substituted appropriately into the previous example. We could have used a std::vector, but we would then have to defer the generation of our cache to runtime and, more importantly, we would be storing unnecessary results (our memory requirements would increase w...
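A minimal sketch of the progression described above. The identifiers (fib_naive, fib_memo, RingCache, nibonacci) and the seeding convention are my own illustrative stand-ins, not the post's actual code:

    #include <array>
    #include <cstdint>
    #include <unordered_map>

    // Naive recursion: exponential time and deep call stacks.
    std::uint64_t fib_naive(unsigned n) {
        return n < 2 ? n : fib_naive(n - 1) + fib_naive(n - 2);
    }

    // Memoised version: each value is computed at most once.
    std::uint64_t fib_memo(unsigned n) {
        static std::unordered_map<unsigned, std::uint64_t> cache;
        if (n < 2) return n;
        auto it = cache.find(n);
        if (it != cache.end()) return it->second;
        return cache[n] = fib_memo(n - 1) + fib_memo(n - 2);
    }

    // Fixed-size circular buffer holding only the N previous terms,
    // so memory stays O(N) instead of growing with the index requested.
    template <std::size_t N>
    struct RingCache {
        std::array<std::uint64_t, N> data{};  // zero-initialised
        std::size_t head = 0;
        void push(std::uint64_t v) { data[head] = v; head = (head + 1) % N; }
        std::uint64_t sum() const {
            std::uint64_t s = 0;
            for (auto v : data) s += v;
            return s;
        }
    };

    // Order-N "Nibonacci": each term is the sum of the previous N terms.
    // Seeded with a single 1 (one possible convention); for N = 2 this
    // yields the usual 1, 1, 2, 3, 5, ...
    template <std::size_t N>
    std::uint64_t nibonacci(unsigned n) {
        RingCache<N> cache;
        cache.push(1);
        for (unsigned i = 0; i < n; ++i) cache.push(cache.sum());
        return cache.data[(cache.head + N - 1) % N];  // last term pushed
    }

Since RingCache is templated on N and built on std::array, the whole cache lives on the stack with its size fixed at compile time, which is exactly the advantage over a std::vector mentioned above.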

The magic behind deep learning: Obtaining gradients using reverse Automatic Differentiation.

Every now and again a wave of rage washes over me when I hear someone (a news outlet, say) refer to deep learning, and the like, as magic. I must admit, though, that I was a bit vague on how you can generate an error function from a network of nodes and then minimise it without using numerical differentiation or providing the actual derivative (although there are derivative-free minimisation methods, such as the Simplex method). The process of writing software for generating this error function (and a neural network) is not my concern here, although it might be in a future blog post. What does concern me is obtaining the gradient of this error function.

Automatic Differentiation

First, let's discuss the process of obtaining the derivative of an expression for which we already have the computation graph. You will see that obtaining the computation graph is half the battle, and many of the resources on the internet explaining Automatic Differentiation (AD) tend to brus...
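A minimal sketch of reverse-mode AD over an explicit computation graph, in the same spirit as the post. The tape-based Node/Tape design and all names here are my own illustration, not the post's code; it differentiates f(x, y) = x*y + sin(x) in a single backward sweep:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Each tape entry records how a value was produced: the indices of
    // its parents and the local partial derivative w.r.t. each parent.
    struct Node {
        int parents[2];
        double partials[2];
    };

    struct Tape {
        std::vector<Node> nodes;     // appended in topological order
        std::vector<double> values;

        int variable(double v) {
            nodes.push_back({{-1, -1}, {0.0, 0.0}});
            values.push_back(v);
            return static_cast<int>(values.size()) - 1;
        }
        int add(int a, int b) {
            nodes.push_back({{a, b}, {1.0, 1.0}});
            values.push_back(values[a] + values[b]);
            return static_cast<int>(values.size()) - 1;
        }
        int mul(int a, int b) {
            nodes.push_back({{a, b}, {values[b], values[a]}});
            values.push_back(values[a] * values[b]);
            return static_cast<int>(values.size()) - 1;
        }
        int sin_(int a) {
            nodes.push_back({{a, -1}, {std::cos(values[a]), 0.0}});
            values.push_back(std::sin(values[a]));
            return static_cast<int>(values.size()) - 1;
        }

        // Reverse sweep: seed the output adjoint with 1, then push
        // adjoints back through the graph in one pass, yielding the
        // gradient w.r.t. every input simultaneously.
        std::vector<double> gradient(int output) const {
            std::vector<double> adj(values.size(), 0.0);
            adj[output] = 1.0;
            for (int i = output; i >= 0; --i)
                for (int k = 0; k < 2; ++k)
                    if (nodes[i].parents[k] >= 0)
                        adj[nodes[i].parents[k]] += adj[i] * nodes[i].partials[k];
            return adj;
        }
    };

    int main() {
        Tape t;
        int x = t.variable(0.5), y = t.variable(4.2);
        int p = t.mul(x, y);
        int s = t.sin_(x);
        int f = t.add(p, s);          // f = x*y + sin(x)
        auto g = t.gradient(f);
        // Expected: df/dx = y + cos(x), df/dy = x
        std::printf("df/dx = %f, df/dy = %f\n", g[x], g[y]);
    }

Note how building the tape during the forward evaluation is what gives us the computation graph for free; the backward sweep is then just a linear walk over it, which is why reverse-mode AD costs roughly one extra pass regardless of how many inputs the function has.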