Show HN: I built a Ruby gem that handles memoization with a ttl
20 comments · April 22, 2025
madsohm
Since using `def` to create a method returns a symbol with the method name, you can do something like this too:
memoize def expensive_calculation(arg)
  @calculation_count += 1
  arg * 2
end, ttl: 10, max_size: 2

memoize def nil_returning_method
  @calculation_count += 1
  nil
end
JamesSwift
Looks good. I'd suggest making your `get` wait to acquire the lock until it's actually needed. E.g. instead of
@lock.synchronize do
  entry = @store[key]
  return nil unless entry
  ...

you can do

entry = @store[key]
return nil unless entry

@lock.synchronize do
  entry = @store[key]
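Putting that together, a rough sketch of what the deferred-lock `get` could look like (assuming `@store` is a Hash, entries respond to something like `expired?`/`value`, and `@lock` is the existing Monitor; those details are my guesses, not necessarily the gem's actual code):

def get(key)
  entry = @store[key]        # fast path: read without taking the lock
  return nil unless entry

  @lock.synchronize do
    entry = @store[key]      # re-check under the lock in case it changed
    return nil unless entry
    if entry.expired?        # assumed TTL check on the cached entry
      @store.delete(key)
      return nil
    end
    entry.value              # assumed accessor for the memoized result
  end
end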
And similarly for other codepaths.
chowells
Does the memory model guarantee that double-check locking will be correct? I don't actually know for ruby.
JamesSwift
I think it wouldn't even be a consideration here, since we aren't initializing the store, only accessing the key. And there's already a check-then-set race condition in that scenario, so I think it is doubly fine.
film42
Nice! In Rails I end up using Rails.cache most of the time because it's always "right there", but I like how you break the cache out per method to minimize contention. Depending on your workload it might make sense to use a read-write lock instead of a Monitor.
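For illustration, a read-write-lock variant might look something like this, using Concurrent::ReadWriteLock from concurrent-ruby (an extra dependency, and only a sketch of the idea, not code from the gem):

require "concurrent"

class PerMethodCache
  def initialize
    @rw_lock = Concurrent::ReadWriteLock.new
    @store = {}
  end

  # Many readers can hold the read lock at once, so lookups don't serialize.
  def get(key)
    @rw_lock.with_read_lock { @store[key] }
  end

  # Writers take the exclusive lock.
  def set(key, value)
    @rw_lock.with_write_lock { @store[key] = value }
  end
end

One caveat: with LRU eviction, even a read usually updates recency bookkeeping, so reads aren't purely read-only unless that bookkeeping is handled separately.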
Only suggestion is to not wrap the error of the caller in your memo wrapper.
> raise MemoTTL::Error, "Failed to execute memoized method '#{method_name}': #{e.message}"
It doesn't look like you need to catch this for any operational or state-tracking reason, so IMO you should not catch and wrap. When errors are wrapped with a string like this (and caught/re-raised) you lose the original stack trace, which makes debugging challenging. Especially when your error is something like "pg condition failed for select" and you can't see where it failed in the driver.
JamesSwift
I thought ruby would auto-wrap the original exception as long as you are raising from a rescue block (i.e. as long as $! is non-nil). So in that case you can just
raise "Failed to execute memoized method '#{method_name}'"
And ruby will set `cause` for you: https://pablofernandez.tech/2014/02/05/wrapped-exceptions-in...
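A quick illustration of that behavior (generic Ruby, not the gem's code):

begin
  begin
    raise ArgumentError, "pg condition failed for select"
  rescue
    # Re-raising while $! is set makes Ruby attach the original as #cause.
    raise "Failed to execute memoized method 'expensive_calculation'"
  end
rescue => wrapped
  puts wrapped.message       # => Failed to execute memoized method 'expensive_calculation'
  puts wrapped.cause.class   # => ArgumentError
  puts wrapped.cause.message # => pg condition failed for select
end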
hp_hovercraft84
Thanks for the feedback! That's a very good point, I'll update the gem and let it bubble up.
gurgeous
This is neat, thanks for posting. I am using memo_wise in my current project (TableTennis) in part because it allows memoization of module functions. This is a requirement for my library.
Anyway, I ended up with a hack like this, which works fine but didn't feel great.
def some_method(arg)
  @_memo_wise[__method__].tap { _1.clear if _1.length > 100 }
  ...
end

memo_wise :some_method
locofocos
Can you pitch me on why I would want to use this, instead of Rails.cache.fetch (which supports TTL) powered by redis (with the "allkeys-lru" config option)?
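The pattern I mean is roughly this (key name made up for illustration):

Rails.cache.fetch("expensive_calculation/#{arg}", expires_in: 10.seconds) do
  expensive_calculation(arg)
end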
thomascountz
I'm not OP nor have I read through all the code, but this gem has no external dependencies and runs in a single process (as does ActiveSupport::Cache::MemoryStore). Could be a "why you should" or a "why you should not" use this gem, depending on your use case.
hp_hovercraft84
Good question. I built this gem because I needed a few things that Rails.cache (and Redis) didn’t quite fit:
- Local and zero-dependency. It caches per object in memory, so no Redis setup, no serialization, no network latency.
- Isolated and self-managed. Caches aren't global. Each object/method manages its own LRU + TTL lifecycle and can be cleared with instance helpers.
- Easy to use. You just declare the method, set the TTL and max size, and you're done. No key names, no block wrapping, no external config. (Rough sketch below.)
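Roughly like this (see the README for the exact API; the class and numbers here are just for illustration):

require "memo_ttl"   # assuming the require path follows the gem name

class ReportBuilder
  include MemoTTL    # assumed mixin name, going by the MemoTTL::Error class mentioned above

  def expensive_calculation(arg)
    sleep 1          # stand-in for real work
    arg * 2
  end
  memoize :expensive_calculation, ttl: 10, max_size: 2
end

builder = ReportBuilder.new
builder.expensive_calculation(21)  # computed
builder.expensive_calculation(21)  # served from the per-instance cache until the TTL lapses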
JamesSwift
For what it's worth, ActiveSupport::Cache::Store is a really flexible API that imposes minimal contractual obligations (read_entry, write_entry, and delete_entry are the entire set of required methods), but still lets you layer specific functionality (e.g. TTL) on top with an optional 'options' param. You could get the best of both worlds by adhering to that contract, and then people could swap in e.g. the Redis cache store if they wanted a network-shared store.
EDIT: see https://github.com/rails/rails/blob/main/activesupport/lib/a...
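To sketch what that contract looks like (method names and signatures from memory of the Rails source, so worth checking against the file linked above):

require "active_support"
require "active_support/cache"

# Hypothetical adapter: a memo_ttl-style in-memory store exposed through the
# ActiveSupport::Cache::Store contract.
class MemoTTLCacheStore < ActiveSupport::Cache::Store
  def initialize(options = nil)
    super
    @data = {}
  end

  private

  # The three entry points a custom store has to provide.
  def read_entry(key, **options)
    @data[key]
  end

  def write_entry(key, entry, **options)
    @data[key] = entry
    true
  end

  def delete_entry(key, **options)
    !!@data.delete(key)
  end
end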
film42
Redis is great for caching a customer config that's hit 2000 times/second by your services, but even then, an in-mem cache with a short TTL makes you more tolerant of Redis failures. This would be great for the in-mem part.
qrush
Congrats on shipping your first gem!!
I found this pretty easy to read through. I'd suggest setting a description on the repo too so it's easy to find.
https://github.com/mishalzaman/memo_ttl/blob/main/lib/memo_t...
hp_hovercraft84
As in identify where the source code is in the README?
zerocrates
I think they mean just setting a description for the repo on GitHub (using the gear icon next to "About") saying what the project is. That description text can come up in GitHub searches and Google searches.
wood-porch
Will this correctly retrieve 0 values? AFAIK 0 is falsey in Ruby
`return nil unless entry`
chowells
No, Ruby is stricter than that. Only nil and false are falsy.
wood-porch
Doesn't that shift the problem to caching false then :D
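One general way around both cases (not necessarily what memo_ttl does) is to test for key presence rather than truthiness, so nil and false can be cached like any other value:

if @store.key?(key)
  @store[key]            # may legitimately be nil or false
else
  @store[key] = compute  # `compute` is a stand-in for the real computation
end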
hp_hovercraft84 (OP)
I built a Ruby gem for memoization with TTL + LRU cache. It's thread-safe, and has been helpful in my own apps. Would love to get some feedback: https://github.com/mishalzaman/memo_ttl