stevencodes.swe - August 31, 2025
Dev tool recs, weekly video highlight
👋 Hey friends,
Here’s what I’ve got for you this week:
Snippet from Chapter 4 of The Backend Lowdown
Weekly Video Highlight: Negative Caching
Dev Tool Rec: Hey
Let’s get into it 👇
The Backend Lowdown: Chapter 4 Preview
Every newsletter will include a snippet from my book in progress, The Backend Lowdown, available for $1 right now on Gumroad!
Note: this chapter hasn't been released yet, but it will be soon!
Cache-Aside with TTL + Jitter
This is the simplest and safest caching pattern to start with. When a request comes in, you first check the cache. If the data isn't there (a cache miss), you compute the value, store it in the cache with a bounded time-to-live (TTL), and return it to the caller. To prevent the "thundering herd" problem where all your hot keys expire simultaneously, you add a small random jitter to each TTL.
When it helps:
You have expensive, repeatable reads (same inputs → same output for a time window)
Users don't need to see their own updates immediately after writing them
You want quick performance wins without modifying your write path to interact with the cache
def fetch_product(id, locale:)
  key = product_key(id, locale: locale)
  base_ttl = 5.minutes
  ttl = base_ttl + rand(0..30).seconds # jitter to avoid synchronized expiry

  Rails.cache.fetch(key, expires_in: ttl) do
    # Compute the value once on a miss, then cache it.
    ProductPresenter.new(Product.find(id), locale: locale).as_json
  end
end
# Key design: include all inputs that change the output.
def product_key(id, locale:)
  "v3:product:#{id}:#{locale}"
end
Implementation notes
Choose TTLs based on your actual staleness tolerance. Ask "how stale can this data be?" (e.g., "up to 60 seconds old is fine"). Start with shorter TTLs and increase as needed.
Include every input that affects the output in your cache key: tenant ID, locale, currency, user role, feature flags, etc. If changing something produces different results, it must be part of the key.
Add jitter (anywhere from a few seconds to a small percentage of your TTL) to prevent the "thundering herd" problem where many popular cache entries expire at the same moment; there's a quick sketch of this right after the list.
Cache only the data you actually need (like DTOs or view models), not entire objects with unused fields. You want every cached byte to be useful.
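To make the jitter note concrete, here's a minimal sketch of percentage-based jitter (the ~10% figure is just an illustrative assumption, not a recommendation from the chapter):

# Illustrative only: add up to ~10% of the base TTL as random jitter
# so popular keys written at the same time don't all expire together.
def jittered_ttl(base_ttl)
  max_jitter = base_ttl.to_i / 10 # 10% of the TTL, in seconds
  base_ttl + rand(0..max_jitter).seconds
end

ttl = jittered_ttl(5.minutes) # lands somewhere between 5:00 and 5:30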
Weekly Video Highlight: Negative Caching
For this week’s video highlight, I chose to do a video from outside my car 😱 I also tried something new by actually whipping up a real demo and showing some screen recordings. The subject: negative caching - i.e., caching 404s / empty results. The why: letting every request that results in a 404 hit the database is like a mini stampede; instead, negative caching saves your app the trouble by caching the result using a miss sentinel with a tiny TTL.
Check out the video below! Side note: I haven’t posted this to IG yet, but will soon!
@stevencodes.swe Fix 404 stampedes with negative caching: store a MISS for 1s ➡️ everyone skips the DB. Live dashboard demo inside! #techtok #software #pro...
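If you'd rather read than watch, here's a minimal sketch of the idea (assuming Rails.cache and a Product model; the demo in the video is a Sinatra app, so the details there differ):

# Negative caching: remember "not found" briefly so bursts of 404s skip the DB.
PRODUCT_MISS = "MISS".freeze # sentinel meaning "we already checked; it's not there"

def lookup_product(id)
  key = "product:#{id}"
  cached = Rails.cache.read(key)
  return nil if cached == PRODUCT_MISS # known 404: answer without touching the DB
  return cached if cached

  product = Product.find_by(id: id)
  if product
    Rails.cache.write(key, product.as_json, expires_in: 5.minutes)
    product.as_json
  else
    # Cache the miss with a tiny TTL so a stampede of 404s becomes one query.
    Rails.cache.write(key, PRODUCT_MISS, expires_in: 1.second)
    nil
  end
end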
Dev Tool Rec: Hey
For the negative caching demo, I used a couple things:
A Ruby script to set up a lightweight web server using Sinatra and Puma
blessed, a high-level terminal interface for Node.js
blessed-contrib, a package to build terminal dashboards with ASCII/ANSI (honestly really cool)
hey: a lightweight HTTP load generator
You might already know these tools since they're quite popular, but Hey in particular stood out to me. It really shines for this kind of performance testing: unlike more complex tools, it gets you meaningful load test results with zero configuration. Just run hey -z 5s -c 50 "http://localhost:4567/products/42"
and boom: you're sending requests continuously for 5 seconds with 50 concurrent workers.
Another cool feature: after each run, Hey gives you a beautiful summary with latency distribution (50th, 95th, 99th percentiles), requests per second, and a histogram showing the full response time distribution. Perfect for comparing "before and after" when you enable caching.

Negative caching OFF - yikes!
If you want to play around with the demo, I pushed the code here. I had fun with it, and maybe you will too!
That’s a wrap for this week. If something here made your day smoother, feel free to reply and tell me about it. And if you think a friend or teammate would enjoy this too, I’d be grateful if you shared it with them.
Until next time,
Steven