AVL tree explained

AVL tree explained with step-by-step rotations, insertion rebalancing examples, and clear guidance on when AVL vs Red-Black matters in interviews.

10 minutes
Advanced
What you will learn

Why unbalanced BSTs degrade to O(n) and how AVL trees prevent it

How the balance factor invariant constrains tree shape

The four rotation cases that restore balance after insertion

When to use AVL trees vs Red-Black trees vs plain BSTs in interviews

Most engineers learn binary search trees and walk away believing every operation is O(log n). It isn't. That guarantee only holds when the tree stays balanced, and a plain BST has no mechanism to ensure that it does. Once you have the AVL tree explained properly, with its one invariant and small set of rotations, the rotation cases stop feeling like something you need to memorise.

TL;DR
An AVL tree adds one rule to a BST: no node's left and right subtrees can differ in height by more than 1. When an insertion or deletion breaks this rule, the tree fixes itself through rotations, keeping every operation at O(log n).

Why BSTs break without balance

An AVL tree is a self-balancing binary search tree where the height difference between left and right subtrees of every node is at most 1. This balance constraint guarantees O(log n) search, insert, and delete. When an operation breaks the balance, the tree restores it through rotations.

That definition matters because a plain BST offers none of it. Insert the values [1, 2, 3, 4, 5] into an empty BST. Each value is larger than the previous one, so every insertion goes right. The resulting tree is a straight line, structurally identical to a linked list. Searching for 5 requires traversing all 5 nodes. O(n), not O(log n).
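A minimal sketch of that degenerate case (naive `Node`/`bst_insert`/`height` helpers are illustrative, not from any library): inserting sorted keys into a plain BST produces a right-leaning chain whose height grows linearly.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    # Plain BST insert: no balancing of any kind.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def height(node):
    # Height of an empty tree is -1, a single node is 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

root = None
for key in [1, 2, 3, 4, 5]:
    root = bst_insert(root, key)

print(height(root))  # 4: a chain of 5 nodes, so O(n), not O(log n)
```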

Sorted data isn't unusual either. Database indices and timestamped logs both produce degenerate BSTs when inserted sequentially. Height controls everything.

Performance depends entirely on the tree's height. Balanced, you get O(log n). Degenerate, you get O(n). Search, insert, and delete all traverse from root to leaf, so their runtime tracks the height directly. Lose control of the height, lose control of the performance.

Important
The O(log n) guarantee for BSTs assumes the tree is balanced. Without a balancing mechanism, worst-case performance is O(n) for all operations.

The height balance invariant

AVL trees fix this with a single constraint: for every node, the height of its left subtree and the height of its right subtree can differ by at most 1. You track this with the balance factor:

balance_factor(node) = height(left_subtree) - height(right_subtree)

A node is balanced when its balance factor is -1, 0, or 1. Zero means both subtrees have the same height, +1 means the left subtree is one level taller, -1 means the right subtree is one level taller.
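The balance factor can be sketched directly from that formula (the `Node`, `height`, and `balance_factor` names here are illustrative, not from any library):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    # Empty subtree has height -1; a single leaf has height 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    return height(node.left) - height(node.right)

# Root 3 with a single left child 2: left subtree is one level taller.
root = Node(3, left=Node(2))
print(balance_factor(root))  # 1
```

A production AVL node caches its height instead of recomputing it recursively, but the recursive version makes the definition explicit.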

When an insertion or deletion pushes any node's balance factor to -2 or +2, the tree is unbalanced and needs fixing. The fix is always a rotation, and it's always local. You don't rebuild the tree. You rearrange 2-3 nodes near the imbalance, and the invariant snaps back into place.

This constraint is strict enough to guarantee logarithmic height. A tree with n nodes and the AVL invariant has height at most ~1.44 log(n). Slightly taller than a perfectly balanced tree, but the difference is a constant factor. Every operation stays O(log n).

Codeintuition's Binary Search Tree course walks through height and balance from first principles. The lesson on understanding the balance of a tree traces balance factor computation across multiple tree shapes before you write any code.

AVL tree rotations

Every AVL imbalance falls into one of four cases. Each one is resolved by a specific rotation that preserves the BST ordering (left < parent < right) while restoring the height balance.

A right rotation handles the LL case, where the imbalance is at a node whose left child is left-heavy (balance factor +2, left child's factor +1 or 0). You rotate the unbalanced node down and to the right, pulling its left child up to take its place.

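The right rotation can be sketched as a pointer rearrangement (the `right_rotate` name and `Node` class are illustrative):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def right_rotate(y):
    # y is left-heavy (LL case); its left child x moves up.
    x = y.left
    t2 = x.right      # the subtree that changes parents
    x.right = y       # y becomes x's right child
    y.left = t2       # x's old right subtree becomes y's left subtree
    return x          # x is the new root of this subtree

# The left-leaning chain 3 -> 2 -> 1 rebalances to 2 with children 1 and 3.
root = right_rotate(Node(3, left=Node(2, left=Node(1))))
print(root.key, root.left.key, root.right.key)  # 2 1 3
```

Note that `t2` keeps the BST order intact: everything in it is greater than `x` and less than `y`, so it belongs exactly where it lands.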

A left rotation handles the RR case, which mirrors the right rotation. The imbalance is at a node whose right child is right-heavy (balance factor -2, right child's factor -1 or 0). You rotate the unbalanced node down and to the left.

The double rotations cover the trickier cases. A left-right rotation handles the LR case, where a node's left child is right-heavy. A single right rotation won't fix this because the heavy side is on the inside. You first left-rotate the left child (straightening the kink), then right-rotate the unbalanced node. The two rotations together restore balance. A right-left rotation is the mirror, handling the RL case where a node's right child is left-heavy. You right-rotate the right child first, then left-rotate the unbalanced node.
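The double rotations fall out as compositions of the two atomic ones, which is a sketch worth tracing once (all names here are illustrative):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def right_rotate(y):
    x = y.left
    x.right, y.left = y, x.right
    return x

def left_rotate(x):
    y = x.right
    y.left, x.right = x, y.left
    return y

def left_right_rotate(node):
    # LR case: node's left child is right-heavy.
    node.left = left_rotate(node.left)   # straighten the kink
    return right_rotate(node)            # then fix the LL shape

def right_left_rotate(node):
    # RL case: mirror of LR.
    node.right = right_rotate(node.right)
    return left_rotate(node)

# LR case: 3 has left child 1, which has right child 2 (the inside kink).
root = left_right_rotate(Node(3, left=Node(1, right=Node(2))))
print(root.key, root.left.key, root.right.key)  # 2 1 3
```

The middle node of the kink (2 here) always ends up as the new subtree root, with the other two nodes as its children.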

“Every AVL rotation is O(1). You rearrange 2-3 pointers, update 2-3 heights, and the entire subtree is balanced again.”
What rebalancing actually takes

Rotations are constant-time. You move a few pointers and recalculate a few heights. The O(log n) part of insertion comes from walking down the tree to find the insertion point and walking back up to check balance factors. The rotation itself is free by comparison.

AVL tree insertion

Walk through inserting [3, 2, 1, 4, 5] into an AVL tree and watch the rotations fire.

Inserting 3 creates a single-node tree with balance factor 0. Inserting 2 sends it left of 3, and node 3's balance factor becomes +1 (left subtree height 1, right subtree height 0). The tree is still balanced.

Inserting 1 triggers the first rotation. Since 1 < 2, it goes left of node 2. Tracing back up, node 2 has a balance factor of +1 and node 3 has a balance factor of +2. That's an LL case because node 3 is left-heavy and its left child (node 2) is also left-heavy. Right-rotating node 3 puts node 2 at the root, node 1 on the left, and node 3 on the right. All balance factors drop to 0.

Inserting 4 goes right from node 2 to node 3, then right again to become node 3's right child. Tracing back, node 3's balance factor is -1 and node 2's is -1, so everything stays balanced.

Inserting 5 breaks things again. Following the path down (5 > 2, right to 3, right to 4, right again), it lands as node 4's right child. Tracing back, node 4's balance factor is -1 but node 3's hits -2. That's an RR case, so left-rotating node 3 moves node 4 into node 3's position, with node 3 becoming node 4's left child and node 5 staying as node 4's right child. All balance factors are restored.

The final tree has node 2 as root, node 1 on the left, and node 4 on the right, with node 3 as node 4's left child and node 5 as node 4's right child. The height is 2. A plain BST with the same insertion order [3, 2, 1, 4, 5] wouldn't degenerate this badly, but try [1, 2, 3, 4, 5] and the BST becomes a linked list of height 4. The AVL tree would still have height 2.

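The whole walkthrough can be sketched as an insert-with-rebalancing routine that caches heights on each node (a hedged sketch; the names and structure are illustrative, not a library API):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 0  # height of the subtree rooted here; a leaf is 0

def h(node):
    return node.height if node else -1

def update(node):
    node.height = 1 + max(h(node.left), h(node.right))

def balance_factor(node):
    return h(node.left) - h(node.right)

def right_rotate(y):
    x = y.left
    y.left, x.right = x.right, y
    update(y)
    update(x)
    return x

def left_rotate(x):
    y = x.right
    x.right, y.left = y.left, x
    update(x)
    update(y)
    return y

def insert(node, key):
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    update(node)
    bf = balance_factor(node)
    if bf > 1 and balance_factor(node.left) >= 0:    # LL
        return right_rotate(node)
    if bf > 1:                                       # LR
        node.left = left_rotate(node.left)
        return right_rotate(node)
    if bf < -1 and balance_factor(node.right) <= 0:  # RR
        return left_rotate(node)
    if bf < -1:                                      # RL
        node.right = right_rotate(node.right)
        return left_rotate(node)
    return node

root = None
for key in [3, 2, 1, 4, 5]:
    root = insert(root, key)

print(root.key, root.height)  # 2 2: node 2 at the root, height 2
```

Running this reproduces the trace above: the LL rotation fires when 1 is inserted, the RR rotation when 5 is inserted, and the final tree has root 2, children 1 and 4, with 3 and 5 under 4.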

💡 Tip
After every insertion, you only need to check balance factors along the path from the inserted node back to the root. At most one rotation (single or double) is needed per insertion.

When the differences matter

In interviews, AVL trees come up two ways: explaining self-balancing BSTs conceptually, and implementing balanced insertion from scratch.

Plain BST

Works fine when the input is random or the problem doesn't require worst-case guarantees. Most tree problems on LeetCode assume balanced input and don't ask you to maintain balance yourself.

Red-Black tree

Relaxes the balance constraint (height up to 2 log(n)) in exchange for cheaper insertion and deletion (at most 2 rotations per operation vs potentially O(log n) for AVL deletion). Java's TreeMap, C++'s std::map, and Linux's CFS process scheduler all use Red-Black trees. Most standard libraries do.

AVL tree

Enforces stricter balance than Red-Black trees. Maximum height is ~1.44 log(n) versus ~2 log(n) for Red-Black. Lookups are faster because the tree is shorter. The tradeoff: insertions and deletions are slower because AVL trees may need more rotations to maintain that stricter invariant.

Why are AVL trees less common in production despite being older and simpler? The looser balance of Red-Black trees makes insertion-heavy workloads faster, and many real workloads lean write-heavy enough for that to win.

For interviews, though, AVL trees are better. The rotation logic is more intuitive. Balance factor ±1 is straightforward compared to Red-Black's colour rules with 5 cases. If an interviewer asks you to implement a self-balancing BST, AVL is the cleaner choice under time pressure.

⚠️ Warning
Don't memorise rotation cases as four separate procedures. Each is just a combination of at most two atomic operations (left-rotate and right-rotate). Understanding the two atomic rotations is enough to derive all four cases during an interview.
Plain BST
  • Search worst case
    O(n)
  • Insert worst case
    O(n)
  • Max height (n nodes)
    n
  • Rotations per insert
    0
  • Implementation difficulty
    Simple
  • Interview friendliness
    Limited
AVL tree
  • Search worst case
    O(log n)
  • Insert worst case
    O(log n)
  • Max height (n nodes)
    ~1.44 log n
  • Rotations per insert
    Up to O(log n)
  • Implementation difficulty
    Moderate
  • Interview friendliness
    Best
Red-Black tree
  • Search worst case
    O(log n)
  • Insert worst case
    O(log n)
  • Max height (n nodes)
    ~2 log n
  • Rotations per insert
    At most 2
  • Implementation difficulty
    Complex
  • Interview friendliness
    Harder to explain

What stays with you

The balance invariant connects directly to Codeintuition's BST course, where you trace height-balanced tree construction from first principles before writing code. The approach works the same way across every tree and recursion topic in the learning path.

If BST fundamentals still feel shaky, the prerequisite material matters. Codeintuition's free tier covers the reasoning model that BSTs and AVL trees build on, with 63 lessons and 85 problems across multiple courses. The same construction-first teaching that makes rotations feel derivable rather than memorisable.

Six months from now, you won't remember the four rotation cases by name. But if you've traced a left-right rotation once with actual node values, watched the balance factors recalculate, the pattern will come back the moment you see an imbalanced subtree on a whiteboard.

Do you want to master data structures?

Try our data structures learning path, made of highly visual and interactive courses. Get hands-on experience by solving real problems in a structured manner. All the resources you would ever need, in one place, for FREE.

Frequently asked questions

Do interviewers ask you to implement an AVL tree from scratch?

Rarely. Most interviewers test whether you understand the concept and can explain why self-balancing matters. Full implementation is occasionally asked at companies like Google or Amazon for senior roles, but it's uncommon for standard coding rounds. Knowing the invariant, the four rotation cases, and insertion with rebalancing is typically enough.
How do AVL rotations compare to Red-Black tree rotations?

AVL rotations restore a strict height-balance invariant (balance factor within ±1) using two atomic operations (left-rotate and right-rotate) combined into four cases. Red-Black trees use similar rotations but add recolouring steps and have five distinct cases instead of four. AVL is simpler to reason about, which makes it easier to implement and explain under interview pressure. Study AVL first because the rotation logic transfers directly to Red-Black trees when you get there.
Why do standard libraries prefer Red-Black trees?

Fewer rotations per insertion. Red-Black trees allow a looser balance (height up to 2 log n vs 1.44 log n), so they don't need to rotate as aggressively. For workloads that insert frequently and search less often, fewer rotations matters more than a slightly taller tree. Standard libraries like Java's TreeMap and C++'s std::map optimise for this general-purpose insertion-heavy pattern.
Does AVL deletion also require rotations?

Yes, and deletion can require more rotations than insertion. You might need O(log n) rotations along the path from the deleted node to the root, compared to at most one for insertion.
Is an AVL tree taller than a perfectly balanced BST?

Slightly taller, but not enough to matter. An AVL tree with n nodes has height at most approximately 1.44 * log2(n), compared to floor(log2(n)) for a perfectly balanced BST. For a tree with 1 million nodes, that's roughly 29 levels instead of 20. Both guarantee O(log n) operations, and in interviews, the constant factor difference never comes up as a practical concern. The guarantee itself is what matters.