A Heap of (less) Complexity: Binomial Heaps

Prerequisites:
Firm grounding in the Basics of Data Structures
Recursion
Algorithmic complexity

Hi, welcome to another installment of Algosaurus, this time on Binomial Heaps. They're a more efficient form of heap than the simple binary heaps we looked at earlier, not to mention they have a beautiful intuition behind them. Even though they aren't frequently used in practice, they're still good for expanding your mind.

Let’s compare the complexities of various operations on Binary, Binomial, and Fibonacci Heaps respectively.

[Image: table comparing the time complexities of binary, binomial, and Fibonacci heap operations]

Need I say anything further?

As a disclaimer, this article is definitely not aimed at beginners to programming. The Binomial Heap concept is somewhat complex and you might look like this after your first reading:

[Image: a thoroughly confused Algosaurus]

That said, let’s begin!


First, a quick recap of binary heaps. In my last article, I implemented binary heaps as max heaps, where the largest element was stored at the root of the tree.

Today, we’re going to show them as min-heaps, where the root element is the smallest.

[Image: a binary min-heap, with the smallest element at the root]

Heaps were created to improve the complexity of graph algorithms like Dijkstra's Algorithm; in particular, binary min-heaps are used as priority queues there.
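
If you'd like something concrete to poke at, Python's built-in heapq module gives you a binary min-heap for free. Here's a tiny sketch of it acting as a priority queue (the values are arbitrary and not tied to any particular graph algorithm):

```python
import heapq

# A binary min-heap via Python's heapq: the smallest element
# always sits at index 0, i.e. at the "root".
pq = []
heapq.heappush(pq, 7)
heapq.heappush(pq, 3)
heapq.heappush(pq, 9)

print(pq[0])              # findMin: peek at the root -> 3
print(heapq.heappop(pq))  # extractMin: remove and return 3
```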

Thing is, two more operations are often used in graph algorithms.

One, merging two heaps together to form a new heap.
[Image: merging two heaps into one]

Two, decreasing the value stored in a node, called decrease-key.
[Image: decreasing the value stored in a node]

There's no easy way to merge regular binary heaps: the best we can easily do is insert every element of one heap into the other, which costs O(m \log n) for heaps of sizes m and n. That's no good.
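
To see where that O(m \log n) comes from, here's a quick sketch of the naive merge: push every element of one heap into the other, one O(\log n) insertion at a time. (merge_naive is just a name I've made up for this illustration.)

```python
import heapq

def merge_naive(heap_a, heap_b):
    """Merge two binary min-heaps by pushing every element of
    heap_b into heap_a: m pushes at O(log n) each."""
    for x in heap_b:
        heapq.heappush(heap_a, x)   # O(log n) per push
    return heap_a

a = [1, 5, 8]                 # both lists are already valid min-heaps
b = [2, 4, 9]
print(merge_naive(a, b))      # one valid min-heap holding all six elements
```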

Once again, the time complexities for operations on binary heaps are as follows:

[Image: table of time complexities for binary heap operations]

We can perform most operations in O(\log n) time or less.


Question is, can we do better?

[Image: Algosaurus wishing for a faster heap]

Don’t worry Algosaurus, we can make that reality.


How do we merge nicely then?

Algosaurus then remembers that game he used to love playing on the phone, which merged tiles of the same power of 2 together.

[Image: a game of 2048]

You’ve played 2048 too, right?

Let’s get some intuition using binary arithmetic now.

[Image: binary arithmetic with powers of 2]

What if we could adapt this approach to store our elements in ‘packets’ whose sizes are powers of 2?

[Image: elements stored in ‘packets’ whose sizes are powers of 2]
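
Just as the binary representation of a number n splits it into distinct powers of 2, any n elements can be split into packets whose sizes are distinct powers of 2. A quick sketch of that decomposition (packet_sizes is my own throwaway helper, nothing standard):

```python
def packet_sizes(n):
    """Split n into the distinct powers of 2 given by its binary
    representation, e.g. 13 = 0b1101 -> [1, 4, 8]."""
    sizes, power = [], 1
    while n:
        if n & 1:
            sizes.append(power)
        n >>= 1
        power <<= 1
    return sizes

print(packet_sizes(13))  # [1, 4, 8]: one packet of each size holds 13 elements
```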

With this, we have three properties fixed:

  • Sizes must be powers of two
  • Can efficiently fuse packets of the same size
  • Can efficiently find the minimum element of each packet

Let’s get down to deletion now.

[Image: deleting an element from one of the packets]

[Image: a panicking Algosaurus]

Relax Algosaurus. Remember how any number can be expressed in terms of powers of 2?

Let’s ‘fracture’ the packet from which we deleted the element.

[Image: ‘fracturing’ the packet into smaller packets]

Then put it all back together.

[Image: reassembling the leftover elements into packets of smaller powers of 2]
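
The arithmetic behind the fracture is just the identity 2^k - 1 = 1 + 2 + 4 + … + 2^(k-1): after deleting one element from a packet of 2^k, the leftovers fill exactly one packet of each smaller size. A tiny sanity check in Python:

```python
# Deleting 1 element from a packet of size 2**k leaves 2**k - 1 elements,
# which fill exactly one packet of each size 2**0, 2**1, ..., 2**(k-1).
for k in range(1, 6):
    smaller_packets = [2**i for i in range(k)]
    assert sum(smaller_packets) == 2**k - 1
    print(f"2^{k} - 1 = {2**k - 1} = {' + '.join(map(str, smaller_packets))}")
```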

With that, our four properties are:

  • Sizes must be powers of two
  • Can efficiently fuse packets of the same size
  • Can efficiently find the minimum element of each packet
  • Can efficiently ‘fracture’ a packet of 2^k elements into packets of smaller powers of 2 (one each of sizes 2^0, 2^1, …, 2^{k-1})

What form should we express our packets in?

With binomial trees of course!

Quoting directly from the slides referenced in the Acknowledgements:

“A binomial tree of order k is a type of tree recursively defined as follows:
A binomial tree of order k is a single node whose children are binomial trees of order 0, 1, 2, …, k – 1.”

[Image: binomial trees of the first few orders]
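
In code, that recursive definition boils down to a node that knows its value, its order, and its children. Here's my own bare-bones sketch of such a node (the class name and fields are mine, not from the slides):

```python
class BinomialTree:
    """A binomial tree of order k: a single node whose children are
    binomial trees of orders k-1, k-2, ..., 1, 0."""
    def __init__(self, value):
        self.value = value       # the element stored at this node
        self.order = 0           # a fresh node is a tree of order 0
        self.children = []       # kept in decreasing order of their orders

    def __len__(self):
        return 2 ** self.order   # a tree of order k holds exactly 2^k nodes
```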

Let’s apply the heap property to the binomial trees.

[Image: heap-ordered binomial trees]

A binomial heap is basically a forest of heap-ordered binomial trees. Now, let’s check whether our binomial trees adhere to the 4 properties we set earlier.

    • Sizes must be powers of two – Pretty obvious.
    • Can efficiently fuse packets of the same size – As shown below (and in the code sketch after this list), it's a trivial operation!

[Image: fusing two binomial trees of the same order]

    • Can efficiently find the minimum element of each packet – In a heap-ordered tree the minimum sits at the root, so it can be retrieved in O(1) time.
    • Can efficiently ‘fracture’ a packet of 2^k elements into packets of smaller powers of 2 – Chop off the root, and you're left with one binomial tree of each smaller order, as shown below.

[Image: fracturing a binomial tree into binomial trees of smaller orders]
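
Continuing the BinomialTree sketch from above, here's roughly what those two operations look like: fusing hangs the tree with the larger root under the one with the smaller root, so the heap property survives, and fracturing just chops off the root, leaving one tree of each smaller order. (fuse and fracture are names I've made up.)

```python
def fuse(t1, t2):
    """Fuse two heap-ordered binomial trees of the same order k into
    a single heap-ordered tree of order k + 1."""
    assert t1.order == t2.order
    if t2.value < t1.value:
        t1, t2 = t2, t1            # make sure t1 has the smaller root
    t1.children.insert(0, t2)      # t2 becomes t1's highest-order child
    t1.order += 1
    return t1

def fracture(tree):
    """Chop off the root of a tree of order k, leaving one binomial
    tree of each order k-1, k-2, ..., 0."""
    return tree.children

a, b = BinomialTree(3), BinomialTree(5)    # two trees of order 0
c = fuse(a, b)                             # one heap-ordered tree of order 1
print(c.value, c.order)                    # 3 1
print([t.order for t in fracture(c)])      # [0]
```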

Now that we've got an intuitive understanding of merging and deletion for the ‘packets’, let's step through those operations on an actual binomial heap.

insert():

[Image: step-by-step insert() on a binomial heap]

extractMin():

[Image: step-by-step extractMin() on a binomial heap]
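
If you'd like to see those two operations spelled out, here's a rough sketch of an ‘eager’ binomial heap built on top of the BinomialTree, fuse, and fracture helpers from earlier. All the names are my own; the idea is simply to keep at most one tree of each order, mirroring the bits of a binary number:

```python
class BinomialHeap:
    """An 'eager' binomial heap: at most one tree of each order,
    mirroring the bits of the binary representation of n."""
    def __init__(self):
        self.trees = {}          # maps order -> the single tree of that order

    def _addTree(self, tree):
        # Just like carrying in binary addition: while a tree of this
        # order already exists, fuse the two and move one order up.
        while tree.order in self.trees:
            tree = fuse(self.trees.pop(tree.order), tree)
        self.trees[tree.order] = tree

    def insert(self, value):
        self._addTree(BinomialTree(value))

    def findMin(self):
        return min(t.value for t in self.trees.values())

    def extractMin(self):
        # The minimum is the smallest root; pull that tree out...
        order = min(self.trees, key=lambda k: self.trees[k].value)
        tree = self.trees.pop(order)
        # ...then fracture it and add its subtrees back, one by one.
        for child in fracture(tree):
            self._addTree(child)
        return tree.value

h = BinomialHeap()
for x in [7, 3, 9, 1, 4]:
    h.insert(x)
print(h.findMin())       # 1
print(h.extractMin())    # 1
print(h.findMin())       # 3
```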

For now, I'm just going to say that the amortized time of insertion into a binomial heap is O(1). Feel free to take my word for it, or check out the analysis in the slides given in the Acknowledgements.


But we still have a couple of problems…

When we intermix insert() and extractMin(), we can force expensive insertions to happen over and over, dragging our amortized insertion time from O(1) down to O(\log n), because each of those insertions has to merge up to \log n binomial trees at once.

[Image: a sequence of intermixed insertions and deletions that forces repeated expensive merges]

Hmm. How do we make the speed of our insertion operations independent of whether we do deletions in the middle?

In Bengali we have a saying, “If you don’t have a head, you won’t have a headache”.

What if we just don’t merge the O(\log n) binomial trees together, avoiding the step altogether?

That is, just add the isolated, unmerged binomial tree to the forest with every insertion and do nothing else.

We'll coalesce the trees only when we call extractMin(), i.e. only when we actually need to. After coalescing, at most \log n trees remain, just as the number of bits required to represent a number n is about \log n.

We can't reuse the ordinary ‘merge’ operation for this, because it assumes the trees of each heap are already stored in ascending order of size, and our loose pile of unmerged trees isn't.

Let’s go back to our 2048-analogy for some intuition.

[Image: the 2048 analogy for lazy merging: collect tiles first, merge only when needed]
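
To make the laziness concrete, here's a minimal sketch of such a lazy binomial heap, again reusing the BinomialTree, fuse, and fracture helpers from earlier. It's only an illustration under those assumptions, not a polished implementation: insert() and merge() just collect trees, and extractMin() is the only place where anything gets coalesced.

```python
class LazyBinomialHeap:
    """A 'lazy' binomial heap: insert() and merge() just collect trees,
    and all the fusing is postponed until extractMin()."""
    def __init__(self):
        self.trees = []          # an unstructured forest of binomial trees
        self.minTree = None      # pointer to the tree with the smallest root

    def insert(self, value):     # O(1): toss the new tree onto the pile
        tree = BinomialTree(value)
        self.trees.append(tree)
        if self.minTree is None or value < self.minTree.value:
            self.minTree = tree

    def merge(self, other):
        # Conceptually O(1): just glue the two forests together
        # (with linked lists this really would be constant time).
        self.trees += other.trees
        if other.minTree is not None and (
                self.minTree is None or other.minTree.value < self.minTree.value):
            self.minTree = other.minTree

    def findMin(self):           # O(1), thanks to the min pointer
        return self.minTree.value

    def extractMin(self):        # O(log n) amortized: all coalescing happens here
        smallest = self.minTree
        self.trees.remove(smallest)
        self.trees.extend(fracture(smallest))
        # Coalesce until there's at most one tree per order,
        # exactly like carrying in binary addition.
        byOrder = {}
        for tree in self.trees:
            while tree.order in byOrder:
                tree = fuse(byOrder.pop(tree.order), tree)
            byOrder[tree.order] = tree
        self.trees = list(byOrder.values())
        self.minTree = min(self.trees, key=lambda t: t.value) if self.trees else None
        return smallest.value

h = LazyBinomialHeap()
for x in [6, 2, 8, 5]:
    h.insert(x)              # four isolated order-0 trees, no merging yet
print(h.extractMin())        # 2, and the remaining trees get coalesced here
print(h.findMin())           # 5
```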

And we’re done!

Here are the final amortized complexities of our ‘lazy’ binomial heap:
insert(): O(1)
merge(): O(1)
findMin(): O(1)
extractMin(): O(\log n)

I'm not proving them here, in the interest of keeping this article proof-free. Feel free to check them out in the slides given in the Acknowledgements.


This concludes my article on Binomial Heaps. Although this may not have been as fun as my other articles, I hope it managed to demystify Binomial Heaps for you.

Acknowledgements:
This article simply wouldn't have been possible without this set of slides from Stanford's CS166. These slides were the template for my article, so full props to them.
http://web.stanford.edu/class/archive/cs/cs166/cs166.1146/lectures/06/Slides06.pdf