
Optimize blur: per-window background caching (#44)

Merged
malbiruk merged 1 commit into malbiruk:main from wwmaxik:feature/blur-per-window-caching
Apr 19, 2026
Conversation

Contributor

@wwmaxik wwmaxik commented Apr 19, 2026

Summary

Fixes #42 by implementing per-window background fingerprinting for blur cache invalidation.

Problem

Previously, the blur cache was invalidated on every surface commit (via the global blur_scene_generation counter), causing blur to be recomputed even when the background behind a specific window hadn't changed. This led to:

  • High GPU usage scaling linearly with number of blurred windows
  • 90%+ GPU usage reported with multiple blurred terminals on NVIDIA

Solution

Replace global scene generation tracking with per-window background hash:

  • Each BlurCache now stores last_background_hash
  • Hash computed from elements behind each window + window rect
  • Blur only recomputes when that specific window's background changes

Changes

  • Add last_background_hash field to BlurCache
  • Implement hash_background_elements() to fingerprint background
  • Move behind_starts calculation before cache invalidation check
  • Replace scene_generation check with background hash comparison
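The invalidation scheme above can be sketched in Rust. This is a minimal, self-contained illustration of the idea, not the project's actual code: `Rect`, `ElementFingerprint`, and hashing via `DefaultHasher` are assumptions made for the example.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hypothetical, simplified geometry type; the real compositor's types differ.
#[derive(Hash, Clone)]
struct Rect { x: i32, y: i32, w: i32, h: i32 }

// A cheap per-element fingerprint: identity, geometry, and a counter that
// bumps on every commit, so any visual change alters the hash.
#[derive(Hash, Clone)]
struct ElementFingerprint { id: u64, geometry: Rect, commit_count: u64 }

struct BlurCache {
    last_background_hash: Option<u64>,
    // ...the cached blurred texture would live here in the real cache.
}

impl BlurCache {
    fn new() -> Self { BlurCache { last_background_hash: None } }

    // Returns true (and updates the stored hash) only when the background
    // behind this particular window changed since the last check. Note:
    // DefaultHasher is deterministic within a process run, which is all
    // a frame-to-frame cache needs.
    fn needs_recompute(&mut self, behind: &[ElementFingerprint], window_rect: &Rect) -> bool {
        let mut hasher = DefaultHasher::new();
        behind.hash(&mut hasher);
        window_rect.hash(&mut hasher);
        let hash = hasher.finish();
        let changed = self.last_background_hash != Some(hash);
        self.last_background_hash = Some(hash);
        changed
    }
}

fn main() {
    let rect = Rect { x: 100, y: 100, w: 800, h: 600 };
    let bg = vec![ElementFingerprint {
        id: 1,
        geometry: Rect { x: 0, y: 0, w: 1920, h: 1080 },
        commit_count: 7,
    }];

    let mut cache = BlurCache::new();
    assert!(cache.needs_recompute(&bg, &rect));   // first frame: must render
    assert!(!cache.needs_recompute(&bg, &rect));  // nothing changed: cache hit

    let mut bg2 = bg.clone();
    bg2[0].commit_count = 8;                      // a background element committed
    assert!(cache.needs_recompute(&bg2, &rect));  // background changed: recompute
}
```

Because the fingerprint is scoped to the elements behind one window, a commit to an unrelated surface leaves this window's hash untouched, which is what breaks the linear scaling with window count.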

Test Results (NVIDIA GTX 1650)

Before:

  • 1 blurred window: ~30-37% GPU
  • Multiple blurred windows: scales linearly (90%+ reported)

After:

  • 1 blurred window: ~10-38% GPU (varies with animated background)
  • 6 blurred windows: ~29-37% GPU (stable, no scaling!)

Key improvement: GPU usage no longer scales with number of blurred windows.

Notes

  • The remaining ~30% GPU usage comes from the animated background shader (rendering at 60 FPS)
  • With static shader: GPU drops to 2-3% idle
  • Debug logs added to track invalidation reasons (can be removed if needed)

Replace global scene_generation tracking with per-window background
fingerprinting. Blur now only recomputes when the background behind
a specific window changes, not on any scene change.

Key changes:
- Add last_background_hash field to BlurCache
- Implement hash_background_elements() to fingerprint elements behind window
- Move behind_starts calculation before cache invalidation check
- Replace scene_generation check with background hash comparison

Impact:
- GPU usage no longer scales with number of blurred windows
- 6 blurred windows: ~30% GPU (was 90%+ before)
- Static blurred window with static background: essentially free

Fixes malbiruk#42
Owner

malbiruk commented Apr 19, 2026

Thanks for working on this! I tested it on my laptop, and at idle the per-window background caching doesn't show a measurable improvement: 1 blurred terminal sits at ~2% GPU usage and 6 blurred terminals at ~7%, both with and without the change. I suspect the original 30% and 90% GPU figures from #42 might be NVIDIA-specific rather than a cache invalidation issue, or there's something else I'm missing...

Could you try reproducing the original issue on an unmodified main branch on your hardware?

@malbiruk
Owner

Active usage seems to improve a bit though, so I'll merge, with a follow-up commit consolidating the original needs_recompute block with the new one.

malbiruk merged commit ee1325b into malbiruk:main on Apr 19, 2026
2 checks passed
Linked issue: Blur causes 90% GPU usage (NVIDIA) (#42)