
Few notes on the Binomial and Fibonacci heaps

Having just implemented and tested a Fibonacci heap on a large dataset, I thought I’d write up a bit about it, mostly so that I can reference this post later and to help me better remember what I’ve learned. Note that this blog post is not a tutorial on how to implement a Binomial/Fibonacci heap.

First, let’s begin with a few definitions. Throughout the article we’ll be talking about min-heaps. The heap property of a tree says that the value in each node is less than or equal to the values of its children. It doesn’t say which values go to the left and which go to the right, so it doesn’t help us with searching. It only tells us that the minimum of any subtree is in its root. Also note that we are not restricting ourselves to binary trees; this property works for any kind of tree.
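For illustration, here is a tiny sketch that checks the heap property recursively. The `Node` class and names are made up just for this post, not taken from any particular heap implementation:

```python
# A minimal min-heap-property check: every node's key is less than or
# equal to the keys of all of its children, recursively.
class Node:
    def __init__(self, key, children=None):
        self.key = key
        self.children = children or []

def has_heap_property(node):
    return all(node.key <= child.key and has_heap_property(child)
               for child in node.children)

# The minimum of any subtree sits in its root.
assert has_heap_property(Node(1, [Node(5, [Node(7)]), Node(3)]))
```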

Binomial heap

Before we can define a binomial heap, we need to define a binomial tree. We’ll use a recursive definition.

  1. A binomial tree of rank $0$ is a single node without any children.
  2. A binomial tree of rank $k$ is a tree whose root has exactly $k$ children, which are, going from left to right, binomial trees of ranks $0, 1, \dots, k-1$.

To make things a little more confusing, here’s a picture from Wikipedia, which uses the reverse order, putting children of lower rank to the right. Both definitions are equivalent.

binomial tree
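If you prefer code to pictures, here is a minimal sketch of the recursive definition (shape only, no keys; the class and function names are mine, not from any real implementation):

```python
# A rank-0 binomial tree is a lone node; a rank-k tree's root has
# children which are binomial trees of ranks 0, 1, ..., k-1, left to right.
class BinomialTreeShape:
    def __init__(self, rank, children):
        self.rank = rank
        self.children = children

def tree_of_rank(k):
    return BinomialTreeShape(k, [tree_of_rank(i) for i in range(k)])

t = tree_of_rank(3)  # root has children of ranks 0, 1 and 2
```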

Before we move on to the binomial heap, let us prove a small property which will be useful later: a binomial tree of rank $k$ has $2^k$ nodes. For $k = 0$ we get $2^0 = 1$, which is true. Now take $k > 0$ and assume (by induction) that a binomial tree of rank $k-1$ has $2^{k-1}$ nodes. We also know that we can take two trees of rank $k-1$ and combine them into a binomial tree of rank $k$ by attaching one of them as the rightmost child of the other’s root.

merging binomial trees

If we do that, we get $2 \cdot 2^{k-1} = 2^k$ nodes in total. This shows that the number of children at the root is logarithmic in the total number of nodes. Since increasing the rank by one also increases the depth of the tree by one, a binomial tree with $n$ nodes has depth $O(\log n)$ and $O(\log n)$ children at the root.
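Written compactly, with $|B_k|$ denoting the number of nodes of a binomial tree of rank $k$ (notation introduced just for this recap), the induction above reads:

$$|B_0| = 2^0 = 1, \qquad |B_k| = 2 \cdot |B_{k-1}| = 2 \cdot 2^{k-1} = 2^k,$$

so a binomial tree with $n = 2^k$ nodes has rank $k = \log_2 n$.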

A small side note: when we merge two binomial trees, in order to preserve the heap property, we have to put the one with the higher value in its root under the one with the lower value in its root.
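Here is a sketch of that merge (often called linking) of two trees of the same rank; the class and names are illustrative only:

```python
# Link two binomial trees of the same rank k into one tree of rank k+1:
# the root with the larger key goes under the root with the smaller key,
# which preserves the min-heap property.
class BinomialTree:
    def __init__(self, key):
        self.key = key
        self.rank = 0
        self.children = []

def link(a, b):
    assert a.rank == b.rank
    if b.key < a.key:
        a, b = b, a           # make `a` the tree with the smaller root
    a.children.append(b)      # `b` becomes the new highest-rank child
    a.rank += 1
    return a
```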

Now, moving on to the binomial heap, we define it as a list of binomial trees $T_1, \dots, T_k$ sorted by rank, where each rank occurs at most once and each tree obeys the heap property.

The operations we want from the binomial heap are the following:

  1. Insert, which adds a new value to the heap in $O(\log n)$.
  2. Min, which returns the minimum value in $O(1)$ (we keep a pointer to the tree whose root holds the minimum).
  3. ExtractMin, which removes the minimum value and returns it in $O(\log n)$.
  4. Merge, which combines two heaps into one in $O(\log n)$.

Lazy binomial heap

We can go a bit further and make our binomial heap lazy. This will help us improve the amortized time of some of the operations. The only change we’re going to make is to allow multiple trees of the same rank to coexist in our binomial heap.

We’ll simplify our Merge operation so that it works in constant time. Instead of doing all of those complicated operations, we simply take the lists of trees of both heaps and concatenate them together. This can be done in $O(1)$ when using doubly linked lists. (Note that the minimum pointer should be updated to the minimum of both heaps when doing this.)
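A sketch of the lazy Merge, assuming the heap keeps a list of tree roots and a pointer to the root with the minimum key (a plain Python list stands in for the doubly linked list here, just to keep the sketch short):

```python
# Lazy Merge: concatenate the two root lists and keep the smaller of the
# two minimum pointers. A real implementation splices two doubly linked
# lists in O(1); the Python list below is only for illustration.
class LazyBinomialHeap:
    def __init__(self):
        self.roots = []       # trees of arbitrary, possibly repeated, ranks
        self.min_node = None  # root with the smallest key, or None if empty

    def merge(self, other):
        self.roots += other.roots
        if other.min_node is not None and (
                self.min_node is None
                or other.min_node.key < self.min_node.key):
            self.min_node = other.min_node
        return self
```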

We also modify our ExtractMin operation so that it performs a new operation called Consolidation. This fixes up our heap so that it again looks like a binomial heap. In a nutshell, we do a bucket sort of the trees in our heap by rank, and then merge the trees in each bucket until there’s only one left (note that when we merge two trees we create a tree of a higher rank, so we move it one bucket up). Iterating the buckets from lower to higher ranks results in a list of buckets where each bucket contains at most one tree, which we can convert back into a binomial heap.
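Here is roughly what the consolidation looks like, using a link function like the one sketched earlier (in a full ExtractMin the minimum root is removed and its children are appended to the root list before this step; that part is omitted here):

```python
# Consolidate: bucket the roots by rank and link equal-rank trees until
# every rank occurs at most once. Linking two rank-k trees yields a
# rank-(k+1) tree, which moves the result one bucket up.
def consolidate(roots, link):
    buckets = {}  # rank -> the single tree currently held for that rank
    for tree in roots:
        while tree.rank in buckets:
            tree = link(buckets.pop(tree.rank), tree)
        buckets[tree.rank] = tree
    new_roots = [buckets[r] for r in sorted(buckets)]
    min_node = min(new_roots, key=lambda t: t.key, default=None)
    return new_roots, min_node
```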

The whole trick is that the consolidation itself is $O(\log n)$ (amortized), so we keep the amortized complexity of ExtractMin, but improve the complexity of Insert and Merge to $O(1)$. Also note that the worst case time of ExtractMin is $O(n)$.

Fibonacci heap

Going even further, we want our heap to also support the Decrease operation, which takes a pointer to a node and changes its key to a specific value. As doing this blindly could break the heap property, we have to do some tweaking to our data structure.

A regular binomial heap can do a Decrease in $O(\log n)$ by simply propagating the decreased element up towards the root as far as needed to maintain the heap property. But our Fibonacci heap will be able to do this in just $O(1)$ amortized!
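One way to read the binomial-heap version, as a sketch (it just swaps keys with the parent; a real implementation that hands out node pointers would relink nodes or swap whole payloads instead):

```python
# Decrease in a plain binomial heap: lower the key, then bubble it up by
# swapping with the parent while the heap property is violated. The tree
# has logarithmic depth, so this takes O(log n) swaps.
def decrease_key_binomial(node, new_key):
    assert new_key <= node.key
    node.key = new_key
    while node.parent is not None and node.key < node.parent.key:
        node.key, node.parent.key = node.parent.key, node.key
        node = node.parent
```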

To allow this, we tweak our definition of the binomial heap. We keep the heap ordering on our trees, but we no longer require them to be binomial. All of the above-mentioned operations will be identical to the lazy binomial heap, with the exception of Decrease. We’ll also keep an additional flag on each node, which says whether the node is marked.

When Decrease is called on a node, we check whether the new key is now smaller than the key of the node’s parent. If not, we stop right here, as the tree keeps the heap property. If the parent now has a higher value, we Cut the subtree rooted at the changed node.

The Cut operation takes the subtree, removes it from its parent, and Inserts it back into the heap. If the parent was marked, we recursively call Cut on the parent. If the parent wasn’t marked, we mark it and end right there.

This means our first Decrease will simply take the subtree at the decreased node, mark its parent, and insert the subtree back into the heap. If we then call Decrease a second time under the same parent node, we will end up Cutting the parent as well. This prevents the tree from becoming too degenerate.
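Putting Decrease and Cut together, a rough sketch (the field and function names are mine; it assumes each node stores its parent, its children and a marked flag, and that the heap keeps a root list and a minimum pointer; the parent’s rank bookkeeping is omitted):

```python
# Decrease: lower the key; if the parent's key is now larger, cut the
# subtree out and reinsert it as a top-level tree, cascading the cut
# through marked ancestors.
def decrease_key(heap, node, new_key):
    assert new_key <= node.key
    node.key = new_key
    parent = node.parent
    if parent is not None and node.key < parent.key:
        cut(heap, node, parent)
    if node.key < heap.min_node.key:
        heap.min_node = node

def cut(heap, node, parent):
    parent.children.remove(node)   # detach the subtree rooted at `node`
    node.parent = None
    node.marked = False
    heap.roots.append(node)        # insert it back into the root list
    grandparent = parent.parent
    if grandparent is not None:    # roots are never marked or cut further
        if parent.marked:
            cut(heap, parent, grandparent)   # cascading cut
        else:
            parent.marked = True
```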

The most interesting part here (which, however, I’m not going to prove) is that the amortized cost of Cut is $O(1)$, and the amortized cost of Decrease is also $O(1)$. This makes for a very interesting data structure, in which all operations except for ExtractMin run in amortized constant time.

Conclusion

I do realize that I’ve skipped most of the amortized analysis and simplified a few things, but this article mostly serves as a mental refresher for people already somewhat familiar with the Fibonacci heap. For a complete treatment, check out the references on the Wikipedia page.
