DSA AVL Trees

Welcome to TheCodingCollege.com! In this article, we’ll delve into AVL Trees, a self-balancing binary search tree that guarantees logarithmic height. AVL trees play a crucial role in maintaining efficiency across insertion, deletion, and search operations in Data Structures and Algorithms (DSA).

What is an AVL Tree?

An AVL Tree is a self-balancing binary search tree, named after its inventors Adelson-Velsky and Landis. It ensures that:

  1. The height difference (balance factor) between the left and right subtrees of every node is at most 1.
  2. The tree rebalances itself after insertions or deletions to maintain this property.

Why AVL Trees?

In standard binary search trees (BSTs), operations degrade to O(n) in the worst-case scenario (skewed trees). AVL Trees prevent this by maintaining balance, ensuring a height of O(log n).
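To see why balancing matters, here is a small illustration (separate from the AVL code later in this article): inserting sorted keys into a plain, unbalanced BST produces a chain whose height equals the number of nodes.

```python
# A plain BST with no balancing: sorted input produces a chain.
class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def height(node):
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

root = None
for key in range(1, 16):   # sorted input: worst case for a plain BST
    root = bst_insert(root, key)

print(height(root))  # 15 -- the tree has degenerated into a linked list
```

With 15 nodes an AVL tree would have height 4; the skewed BST has height 15, so every operation costs O(n).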

Properties of AVL Trees

  1. Binary Search Tree Property: Left subtree < Root < Right subtree.
  2. Balance Factor: For every node, Balance Factor = Height of Left Subtree − Height of Right Subtree. Only values of -1, 0, or 1 are allowed.
  3. Self-Balancing: Imbalance triggers rotations (single or double) to restore balance.

Rotations in AVL Trees

AVL Trees use rotations to maintain balance. There are four types of rotations:

  1. Right Rotation (LL case): Applied when the imbalance is caused by the left-left subtree.
  2. Left Rotation (RR case): Applied when the imbalance is caused by the right-right subtree.
  3. Left-Right Rotation (LR case): A left rotation on the left child followed by a right rotation on the node; applied when the imbalance is caused by the left-right subtree.
  4. Right-Left Rotation (RL case): A right rotation on the right child followed by a left rotation on the node; applied when the imbalance is caused by the right-left subtree.

Basic Operations in AVL Trees

Insertion

  1. Insert the node as in a BST.
  2. Calculate the balance factor.
  3. Perform rotations to restore balance if necessary.

Deletion

  1. Remove the node as in a BST.
  2. Calculate the balance factor for all affected nodes.
  3. Perform rotations to restore balance.

AVL Tree Implementation

Node Class

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

Get Height

def get_height(node):
    if not node:
        return 0
    return node.height

Balance Factor

def get_balance(node):
    if not node:
        return 0
    return get_height(node.left) - get_height(node.right)

Rotations

def rotate_right(y):
    x = y.left
    T2 = x.right

    # Perform rotation
    x.right = y
    y.left = T2

    # Update heights
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    x.height = 1 + max(get_height(x.left), get_height(x.right))

    return x

def rotate_left(x):
    y = x.right
    T2 = y.left

    # Perform rotation
    y.left = x
    x.right = T2

    # Update heights
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))

    return y
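As a quick sanity check of rotate_left, the snippet below (which repeats the Node class and helper so it runs on its own) turns a right-skewed chain 10 → 20 → 30 into a balanced subtree rooted at the middle key:

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def get_height(node):
    return node.height if node else 0

def rotate_left(x):
    y = x.right
    T2 = y.left
    y.left = x
    x.right = T2
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y

# Build the right-skewed chain 10 -> 20 -> 30 by hand
a, b, c = Node(10), Node(20), Node(30)
a.right, b.right = b, c
b.height, a.height = 2, 3

new_root = rotate_left(a)
print(new_root.key, new_root.left.key, new_root.right.key)  # 20 10 30
```

After the rotation the middle key 20 is the root, and both heights shrink to what a balanced subtree requires.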

Insert Operation

def insert(root, key):
    if not root:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    else:
        return root  # Duplicate keys are not inserted

    # Update height
    root.height = 1 + max(get_height(root.left), get_height(root.right))

    # Get balance factor
    balance = get_balance(root)

    # Perform rotations
    if balance > 1 and key < root.left.key:  # LL Case
        return rotate_right(root)
    if balance < -1 and key > root.right.key:  # RR Case
        return rotate_left(root)
    if balance > 1 and key > root.left.key:  # LR Case
        root.left = rotate_left(root.left)
        return rotate_right(root)
    if balance < -1 and key < root.right.key:  # RL Case
        root.right = rotate_right(root.right)
        return rotate_left(root)

    return root

# Example Usage
root = None
for key in [10, 20, 30, 40, 50]:
    root = insert(root, key)

def preorder(node):  # Display the structure
    if node:
        print(node.key, end=" ")
        preorder(node.left)
        preorder(node.right)

preorder(root)  # Output: 20 10 40 30 50

Time Complexity

Operation    Time Complexity
Search       O(log n)
Insert       O(log n)
Delete       O(log n)
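The implementation above covers insertion only; the deletion steps listed earlier can be coded in the same style. The sketch below repeats the class, helpers, and insert so it runs on its own. Note that deletion rebalances by checking the subtree balance factors (the deleted key is gone, so the key comparisons used during insertion no longer apply):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def get_height(node):
    return node.height if node else 0

def get_balance(node):
    return get_height(node.left) - get_height(node.right) if node else 0

def rotate_right(y):
    x, T2 = y.left, y.left.right
    x.right, y.left = y, T2
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    return x

def rotate_left(x):
    y, T2 = x.right, x.right.left
    y.left, x.right = x, T2
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y

def insert(root, key):
    if not root:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    root.height = 1 + max(get_height(root.left), get_height(root.right))
    balance = get_balance(root)
    if balance > 1 and key < root.left.key:    # LL
        return rotate_right(root)
    if balance < -1 and key > root.right.key:  # RR
        return rotate_left(root)
    if balance > 1:                            # LR
        root.left = rotate_left(root.left)
        return rotate_right(root)
    if balance < -1:                           # RL
        root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

def delete(root, key):
    if not root:
        return root
    if key < root.key:
        root.left = delete(root.left, key)
    elif key > root.key:
        root.right = delete(root.right, key)
    else:
        # Node found: with 0 or 1 child, splice the node out directly
        if not root.left:
            return root.right
        if not root.right:
            return root.left
        # Two children: copy the in-order successor, then delete it
        succ = root.right
        while succ.left:
            succ = succ.left
        root.key = succ.key
        root.right = delete(root.right, succ.key)

    root.height = 1 + max(get_height(root.left), get_height(root.right))
    balance = get_balance(root)
    # Rebalance by subtree balance factors
    if balance > 1 and get_balance(root.left) >= 0:    # LL
        return rotate_right(root)
    if balance > 1:                                    # LR
        root.left = rotate_left(root.left)
        return rotate_right(root)
    if balance < -1 and get_balance(root.right) <= 0:  # RR
        return rotate_left(root)
    if balance < -1:                                   # RL
        root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

root = None
for k in (10, 20, 30, 40, 50):
    root = insert(root, k)
root = delete(root, 10)  # removing 10 unbalances node 20 -> left rotation
print(root.key, root.left.key, root.right.key)  # 40 20 50
```

After deleting 10, node 20 has balance factor -2, so a left rotation promotes 40 to the root.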

Applications of AVL Trees

  1. Databases: AVL Trees are used to index large datasets efficiently.
  2. File Systems: Maintain hierarchical structures in file systems.
  3. Gaming: AVL Trees handle dynamic player ranking systems.
  4. Network Optimization: Optimizing routing tables.

Advantages of AVL Trees

  1. Guaranteed logarithmic height ensures efficient operations.
  2. Prevents degeneration into skewed trees.

Limitations of AVL Trees

  1. Slightly more complex than standard BSTs.
  2. Requires extra memory for maintaining height and balance factor.

Practice Problem

Problem: Insert the following elements into an AVL Tree and display its structure:
[30, 20, 40, 10, 25, 35, 50]

Solution:

  1. Insert the elements in the given order using the insert function from the implementation.
  2. For this particular sequence, every balance factor stays within -1 to 1, so no rotations are triggered. The final tree has 30 at the root, 20 on the left with children 10 and 25, and 40 on the right with children 35 and 50.
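You can verify the answer with a self-contained run (the insert routine from the implementation section is repeated here so the snippet executes on its own):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1

def get_height(node):
    return node.height if node else 0

def get_balance(node):
    return get_height(node.left) - get_height(node.right) if node else 0

def rotate_right(y):
    x, T2 = y.left, y.left.right
    x.right, y.left = y, T2
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    return x

def rotate_left(x):
    y, T2 = x.right, x.right.left
    y.left, x.right = x, T2
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y

def insert(root, key):
    if not root:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    root.height = 1 + max(get_height(root.left), get_height(root.right))
    balance = get_balance(root)
    if balance > 1 and key < root.left.key:    # LL
        return rotate_right(root)
    if balance < -1 and key > root.right.key:  # RR
        return rotate_left(root)
    if balance > 1:                            # LR
        root.left = rotate_left(root.left)
        return rotate_right(root)
    if balance < -1:                           # RL
        root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

def preorder(node):
    return [node.key] + preorder(node.left) + preorder(node.right) if node else []

root = None
for key in [30, 20, 40, 10, 25, 35, 50]:
    root = insert(root, key)

print(preorder(root))  # [30, 20, 10, 25, 40, 35, 50] -- already balanced
```

The preorder traversal confirms the structure: 30 at the root, then the left subtree 20 (10, 25), then the right subtree 40 (35, 50).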

Conclusion

AVL Trees are a cornerstone in DSA, offering a balanced structure that ensures efficient operations. They form the foundation for advanced tree structures and are widely used in applications requiring dynamic data.
