Description
Problem
The `AutocompleteLruCache` class in `core/autocomplete/util/AutocompleteLruCache.ts` has a performance bottleneck in its LRU eviction mechanism. When the cache reaches capacity and needs to evict the oldest entry, it performs an O(n) linear search through all cache entries to find the one with the oldest timestamp.
Current Implementation (Lines 146-157)

```typescript
if (this.cache.size > AutocompleteLruCache.capacity) {
  let oldestKey: string | null = null;
  let oldestTime = Infinity;
  for (const [key, entry] of this.cache.entries()) {
    if (entry.timestamp < oldestTime) {
      oldestTime = entry.timestamp;
      oldestKey = key;
    }
  }
  if (oldestKey) {
    this.cache.delete(oldestKey);
    this.dirty.add(oldestKey);
  }
}
```

Performance Impact
- Time Complexity: O(n) for each eviction operation
- Cache Size: Default capacity is 1000 entries
- Frequency: Occurs on every `put()` operation when the cache is full
- Scaling: Performance degrades linearly with cache size
For autocomplete functionality that may be called frequently during typing, this can introduce noticeable latency, especially as the cache grows toward its capacity limit.
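To make the cost concrete, here is a minimal, self-contained sketch of the same linear-scan eviction outside the class, with a rough timing loop. The `Entry` shape and the name `evictOldestLinear` are illustrative, not taken from the codebase:

```typescript
// Illustrative stand-in for the cache's entry shape.
interface Entry {
  value: string;
  timestamp: number;
}

// Same O(n) scan as the current implementation: walk every entry to
// find the smallest timestamp, then delete that key.
function evictOldestLinear(cache: Map<string, Entry>): string | null {
  let oldestKey: string | null = null;
  let oldestTime = Infinity;
  for (const [key, entry] of cache.entries()) {
    if (entry.timestamp < oldestTime) {
      oldestTime = entry.timestamp;
      oldestKey = key;
    }
  }
  if (oldestKey !== null) {
    cache.delete(oldestKey);
  }
  return oldestKey;
}

// Rough timing at the default capacity of 1000 entries.
const capacity = 1000;
const cache = new Map<string, Entry>();
for (let i = 0; i < capacity; i++) {
  cache.set(`key-${i}`, { value: `v${i}`, timestamp: i });
}

const start = performance.now();
for (let i = 0; i < 10_000; i++) {
  evictOldestLinear(cache); // O(n) scan on every call
  cache.set(`key-${capacity + i}`, { value: "v", timestamp: capacity + i });
}
console.log(`10k evictions: ${(performance.now() - start).toFixed(1)}ms`);
```

Each `put()` at capacity pays for a full scan, so the per-keystroke cost grows with the cache size rather than staying constant.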
Proposed Solution
Implement a more efficient LRU eviction strategy using one of these approaches:
Option 1: Doubly Linked List + HashMap
Maintain a doubly linked list to track access order alongside the existing Map:
- Get/Put Operations: O(1)
- LRU Eviction: O(1) - remove from tail
- Memory Overhead: ~2-3x current usage (acceptable for autocomplete cache)
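A minimal sketch of this approach follows. The names (`LruNode`, `LruCache`) are illustrative; the real `AutocompleteLruCache` would keep its public API and SQLite persistence and only swap the eviction bookkeeping:

```typescript
// Node in the doubly linked list; the Map points at these nodes.
class LruNode {
  prev: LruNode | null = null;
  next: LruNode | null = null;
  constructor(public key: string, public value: string) {}
}

class LruCache {
  private map = new Map<string, LruNode>();
  private head: LruNode | null = null; // most recently used
  private tail: LruNode | null = null; // least recently used

  constructor(private capacity: number) {}

  get(key: string): string | undefined {
    const node = this.map.get(key);
    if (!node) return undefined;
    this.unlink(node);
    this.pushFront(node); // O(1) pointer updates, no scan
    return node.value;
  }

  put(key: string, value: string): void {
    const existing = this.map.get(key);
    if (existing) {
      existing.value = value;
      this.unlink(existing);
      this.pushFront(existing);
      return;
    }
    const node = new LruNode(key, value);
    this.map.set(key, node);
    this.pushFront(node);
    if (this.map.size > this.capacity && this.tail) {
      const lru = this.tail; // O(1) eviction: the tail IS the LRU entry
      this.unlink(lru);
      this.map.delete(lru.key);
    }
  }

  // Detach a node, patching head/tail as needed.
  private unlink(node: LruNode): void {
    if (node.prev) node.prev.next = node.next;
    else this.head = node.next;
    if (node.next) node.next.prev = node.prev;
    else this.tail = node.prev;
    node.prev = node.next = null;
  }

  // Insert a node at the head (most recently used position).
  private pushFront(node: LruNode): void {
    node.next = this.head;
    if (this.head) this.head.prev = node;
    this.head = node;
    if (!this.tail) this.tail = node;
  }
}
```

The timestamp field becomes unnecessary for eviction ordering (the list order is the access order), though it may still be worth keeping for the SQLite persistence layer.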
Option 2: Timestamp-Based Min-Heap
Use a min-heap to track entries by timestamp:
- Eviction: O(log n) - better than current O(n)
- Memory Overhead: Minimal additional space
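One wrinkle with a heap is that a key's timestamp changes on every access, so heap entries go stale. A common workaround is lazy invalidation: push a fresh entry on each touch and skip stale entries at pop time. The following sketch assumes that strategy; `TimestampHeap` and `evictOldest` are illustrative names:

```typescript
interface HeapEntry {
  timestamp: number;
  key: string;
}

// Minimal binary min-heap ordered by timestamp.
class TimestampHeap {
  private heap: HeapEntry[] = [];

  push(entry: HeapEntry): void {
    this.heap.push(entry);
    let i = this.heap.length - 1;
    while (i > 0) { // sift up: O(log n)
      const parent = (i - 1) >> 1;
      if (this.heap[parent].timestamp <= this.heap[i].timestamp) break;
      [this.heap[parent], this.heap[i]] = [this.heap[i], this.heap[parent]];
      i = parent;
    }
  }

  pop(): HeapEntry | undefined {
    const top = this.heap[0];
    const last = this.heap.pop();
    if (this.heap.length > 0 && last) {
      this.heap[0] = last;
      let i = 0; // sift down: O(log n)
      for (;;) {
        const l = 2 * i + 1;
        const r = 2 * i + 2;
        let smallest = i;
        if (l < this.heap.length && this.heap[l].timestamp < this.heap[smallest].timestamp) smallest = l;
        if (r < this.heap.length && this.heap[r].timestamp < this.heap[smallest].timestamp) smallest = r;
        if (smallest === i) break;
        [this.heap[i], this.heap[smallest]] = [this.heap[smallest], this.heap[i]];
        i = smallest;
      }
    }
    return top;
  }
}

// Pop until the popped entry's timestamp still matches the live one
// recorded in the cache's map; stale entries are simply discarded.
function evictOldest(
  heap: TimestampHeap,
  live: Map<string, number>, // key -> current timestamp
): string | null {
  for (let e = heap.pop(); e; e = heap.pop()) {
    if (live.get(e.key) === e.timestamp) {
      live.delete(e.key);
      return e.key; // genuinely the least recently used key
    }
    // Stale entry: the key was touched or removed since this push.
  }
  return null;
}
```

Lazy invalidation keeps pushes at O(log n) but lets the heap grow beyond the cache size, which is the main trade-off against Option 1.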
Option 3: Periodic Cleanup Strategy
Instead of finding exact LRU on every eviction, periodically sort and bulk-evict:
- Eviction: O(1) most of the time, O(n log n) during cleanup
- Trade-off: Slightly less precise LRU semantics for better average performance
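A sketch of this idea: let the cache overshoot its nominal capacity by a slack factor, then sort once by timestamp and bulk-evict back down. The 25% slack and the function name are illustrative choices, not taken from the codebase:

```typescript
interface CacheEntry {
  value: string;
  timestamp: number;
}

// Returns the keys evicted by this put (usually none).
function putWithPeriodicCleanup(
  cache: Map<string, CacheEntry>,
  key: string,
  value: string,
  capacity: number,
  slack = 0.25,
  timestamp = Date.now(),
): string[] {
  cache.set(key, { value, timestamp });
  if (cache.size <= capacity * (1 + slack)) return []; // O(1) fast path

  // Cleanup path: O(n log n), but amortized over many cheap puts.
  const evicted = [...cache.entries()]
    .sort((a, b) => a[1].timestamp - b[1].timestamp)
    .slice(0, cache.size - capacity)
    .map(([k]) => k);
  for (const k of evicted) cache.delete(k);
  return evicted;
}
```

Between cleanups the cache may briefly hold up to `capacity * (1 + slack)` entries, which is the "slightly less precise LRU semantics" mentioned above.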
Implementation Plan
- Benchmark Current Performance
  - Add timing metrics to eviction operations
  - Measure performance with the cache at various fill levels
- Implement Doubly Linked List Solution (Recommended)
  - Create an internal `CacheNode` class with prev/next pointers
  - Maintain head/tail pointers for O(1) access to the oldest and newest entries
  - Update all cache operations to maintain list invariants
- Add Performance Tests
  - Test eviction performance with a full cache (1000 entries)
  - Verify O(1) eviction time regardless of cache size
  - Add stress tests for high-frequency put operations
- Backward Compatibility
  - Ensure all existing functionality remains unchanged
  - Maintain the same public API
  - Preserve SQLite persistence behavior
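For the performance-test step, one possible shape is a small timing harness generic over the cache's `put()` surface. `CacheLike` and `timePuts` are illustrative stand-ins; the real tests would live in `AutocompleteLruCache.test.ts`:

```typescript
// Stand-in for the slice of AutocompleteLruCache that the test exercises.
interface CacheLike {
  put(key: string, value: string): void;
}

// Time a burst of puts and return the elapsed milliseconds.
function timePuts(cache: CacheLike, count: number, prefix: string): number {
  const start = performance.now();
  for (let i = 0; i < count; i++) {
    cache.put(`${prefix}-${i}`, "value");
  }
  return performance.now() - start;
}

// Usage idea: time a burst of puts on a half-full cache, then the same
// burst on a cache filled to capacity (where every put evicts). With
// O(1) eviction the two timings should be close; with the current O(n)
// scan the full-cache burst is markedly slower.
```

Wall-clock assertions are noisy on CI, so a ratio bound (e.g. full-cache burst no more than a few times the half-full burst) tends to be more robust than an absolute threshold.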
Files to Modify
- `core/autocomplete/util/AutocompleteLruCache.ts` - Main implementation
- `core/autocomplete/util/AutocompleteLruCache.test.ts` - Add performance tests
Expected Benefits
- Improved Autocomplete Responsiveness: Eliminate O(n) eviction delays
- Better Scaling: Performance remains constant as cache grows
- Reduced Typing Latency: Faster cache operations during active coding
Priority
Medium - This affects autocomplete performance but has workarounds (smaller cache size). Should be addressed to improve user experience as autocomplete usage scales.