Rate Limiter #2 – Leaky Bucket Algorithm

Advantages / Disadvantages

Advantages

  • Efficient memory usage
    • Because the bucket is a fixed-size queue, memory usage is bounded and efficient
  • Stable outflow rate

Disadvantages

  • During a burst of requests, the newest requests are dropped once the queue is full


Implementation

public static void main(String[] args) throws InterruptedException {
    LeakyBucketRateLimiter rateLimiter = new LeakyBucketRateLimiter(5, 5);

    for (int i = 0; i < 100; i++) {
        boolean canConsume = rateLimiter.tryConsume();
        if (canConsume) {
            System.out.println("Request " + i + " can be consumed!");
        } else {
            System.out.println("Request " + i + " is blocked by rate limiter");
        }
        Thread.sleep(100);
    }
}
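
Note that LeakyBucketRateLimiter below implements RateLimiterInterface, which is not listed in this post; it is presumably the interface used throughout this series. A minimal sketch, assuming it only declares tryConsume():

package ratelimiter;

// Assumed minimal interface for this series (sketch only)
public interface RateLimiterInterface {
    boolean tryConsume();
}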

package ratelimiter;

public class LeakyBucketRateLimiter implements RateLimiterInterface {
    private final int capacity;
    private final int outflowRatePerSecond;
    private int queuedRequests;
    private long lastOutflowTimeMillis;

    /**
     * Initialize the bucket (queue) with the specified capacity and outflow rate
     *
     * @param capacity             maximum number of requests that can wait in the bucket
     * @param outflowRatePerSecond number of queued requests processed per second
     */
    public LeakyBucketRateLimiter(int capacity, int outflowRatePerSecond) {
        this.capacity = capacity;
        this.outflowRatePerSecond = outflowRatePerSecond;

        this.queuedRequests = 0;
        this.lastOutflowTimeMillis = System.currentTimeMillis();
    }

    /**
     * 1. Outflow first.
     * 2. If the queue is not full, enqueue one more request
     */
    @Override
    public boolean tryConsume() {
        outflow();

        if (queuedRequests < capacity) {
            queuedRequests++;
            return true;
        } else {
            return false;
        }
    }

    /**
     * 1. Calculate how long it has been since the last outflow
     * 2. Calculate how many requests can be outflowed (processed) based on the result of #1
     * 3. Drain that many (outflowAvailableCount) requests from the queue
     */
    private void outflow() {
        long currentTimeMillis = System.currentTimeMillis();
        long secondsSinceOutflow = (currentTimeMillis - lastOutflowTimeMillis) / 1_000;
        long outflowAvailableCount = secondsSinceOutflow * outflowRatePerSecond;

        System.out.println("  [" + currentTimeMillis + "] - Current outflowAvailableCount: " + outflowAvailableCount + ", queuedRequests: " + queuedRequests);
        if (outflowAvailableCount > 0) {
            queuedRequests -= outflowAvailableCount;
            if (queuedRequests < 0) {
                queuedRequests = 0;
            }
            lastOutflowTimeMillis = currentTimeMillis;
        }
    }
}


Implementation – Output

  [1736691442442] - Current outflowAvailableCount: 0, queuedRequests: 0
Request 0 can be consumed!
  [1736691442547] - Current outflowAvailableCount: 0, queuedRequests: 1
Request 1 can be consumed!
  [1736691442652] - Current outflowAvailableCount: 0, queuedRequests: 2
Request 2 can be consumed!
  [1736691442757] - Current outflowAvailableCount: 0, queuedRequests: 3
Request 3 can be consumed!
  [1736691442862] - Current outflowAvailableCount: 0, queuedRequests: 4
Request 4 can be consumed!
  [1736691442967] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 5 is blocked by rate limiter
  [1736691443072] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 6 is blocked by rate limiter
  [1736691443177] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 7 is blocked by rate limiter
  [1736691443279] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 8 is blocked by rate limiter
  [1736691443384] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 9 is blocked by rate limiter
  [1736691443487] - Current outflowAvailableCount: 5, queuedRequests: 5
Request 10 can be consumed!
  [1736691443593] - Current outflowAvailableCount: 0, queuedRequests: 1
Request 11 can be consumed!
  [1736691443698] - Current outflowAvailableCount: 0, queuedRequests: 2
Request 12 can be consumed!
  [1736691443801] - Current outflowAvailableCount: 0, queuedRequests: 3
Request 13 can be consumed!
  [1736691443907] - Current outflowAvailableCount: 0, queuedRequests: 4
Request 14 can be consumed!
  [1736691444009] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 15 is blocked by rate limiter
  [1736691444114] - Current outflowAvailableCount: 0, queuedRequests: 5
Request 16 is blocked by rate limiter


Follow-up Questions

#1. In your outflow() method, you calculate secondsSinceOutflow by dividing the time difference in milliseconds by 1_000. Is this approach accurate for refilling the bucket? Why or why not?
  • The integer division below loses precision whenever currentTimeMillis - lastOutflowTimeMillis is less than 1000
    • long secondsSinceOutflow = (currentTimeMillis - lastOutflowTimeMillis) / 1_000;
    • Partial seconds are not accounted for, which delays outflows and can block more requests than expected
  • Solution (see the sketch after this list)
    • Use a floating-point type: instead of long, use double
    • double secondsSinceOutflow = (currentTimeMillis - lastOutflowTimeMillis) / 1_000.0;
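
A minimal sketch of what that fix could look like, assuming the same fields as the class above. Besides the floating-point division, it advances lastOutflowTimeMillis only by the time actually spent draining whole requests, so any leftover fraction is carried into the next call rather than discarded:

private void outflow() {
    long currentTimeMillis = System.currentTimeMillis();
    double secondsSinceOutflow = (currentTimeMillis - lastOutflowTimeMillis) / 1_000.0;

    // Partial seconds now count toward the available outflow
    int outflowAvailableCount = (int) (secondsSinceOutflow * outflowRatePerSecond);

    if (outflowAvailableCount > 0) {
        queuedRequests = Math.max(0, queuedRequests - outflowAvailableCount);
        // Advance the timestamp only by the time consumed for whole requests,
        // so the remaining fraction of a request is not thrown away
        lastOutflowTimeMillis += (long) (outflowAvailableCount * 1_000.0 / outflowRatePerSecond);
    }
}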

#2. How efficient is your LeakyBucketRateLimiter implementation in terms of time and space complexity? Are there any optimizations you can apply to improve its performance, especially under high request rates?
  • Time Complexity
    • tryConsume(): O(1)
  • Space Complexity
    • O(1), since it only stores a few scalar fields (capacity, queuedRequests, lastOutflowTimeMillis) rather than an actual queue of request objects
  • To improve
    • Support batch processing, like the snippet below
public int tryConsume(int numberOfTokens) {
    outflow();

    if (queuedRequests + numberOfTokens <= capacity) {
        queuedRequests += numberOfTokens; // Whole batch fits in the bucket
        return numberOfTokens;
    } else {
        int allowed = capacity - queuedRequests; // Partially accept the batch
        queuedRequests = capacity;
        return allowed;
    }
}
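
With this batch variant, a caller could hypothetically admit several requests in one call (the count of 3 below is illustrative only):

int admitted = rateLimiter.tryConsume(3); // try to enqueue 3 requests at once
System.out.println(admitted + " of 3 requests admitted");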

#3. Follow-up questions #1~#5 from the Token Bucket post (https://willtheguruhome.wordpress.com/2025/01/12/rate-limiter-token-bucket-algorithm/) can also be asked here
