- Add `cache_reset_metrics` trait method to reset hits/misses (see the reset sketch after this list)
- Refactor cache store types to separate modules
- Add support for returning a `cached::Return` wrapper type that indicates whether the result came from the function's cache (see the `Return` sketch after this list).
- Support using the `size` & `time` args together in the `cached` proc macro (see the `size`/`time` sketch after this list). These were added when `TimedSizedCache` was added, but the `cached_proc_macro` crate update was not released.
- Add a `TimedSizedCache` combining LRU and timed/ttl logic
- Add new `CachedAsync` trait. Only present with the `async` feature. Adds two async functions in the entry API style of `HashMap`
- Add type hint `_result!` macros
- remove unnecessary transmute in cache reset
- remove unnecessary clones in proc macro
- use `async-mutex` instead of the full `async-std`
- Store inner values when `result=true` or `option=true`. The `Error` type in the `Result` no longer needs to implement `Clone` (see the `result=true` sketch after this list).
- add `cache_set_lifespan` to change the cache lifespan; the old value is returned (see the lifespan sketch after this list)
- fix proc macro when `result=true`, regression from changing `cache_set` to return the previous value
- add `Cached` implementation for std `HashMap` (see the `HashMap` sketch after this list)
- trait `Cached` has a new method `cache_get_or_set_with`
- `cache_set` now returns the previous value, if any (see the `cache_set` sketch after this list)
- add Clone, Debug trait derives on pub types
- fix proc macro documentation
- Add a proc macro version of the `cached` macros (see the proc macro sketch after this list)
- async support when using the new proc macro version
- Add `cache_get_mut` to `Cached` trait, to allow mutable access to values in the cache (see the `cache_get_mut` sketch after this list).
- Change the type of `hits` and `misses` to be `u64`.
- Add `value_order` method to `SizedCache`, similar to `key_order`
- add `cache_reset` trait method for resetting cache collections to their initial state (see the reset sketch after this list)
- Update `once_cell` to 1.x
- Replace SizedCache implementation to avoid O(n) lookup on cache-get
- Update to Rust-2018 edition
- cargo fmt everything
- Replace inner cache when "clearing" unbounded cache
- Switch to `once_cell`. Library users no longer need to import `lazy_static`
- Add `cache_clear` and `cache_result` to `Cached` trait
  - Allows for defeating cache entries if desired
- Update documentation
  - Note the in-memory nature of cache stores
  - Note the behavior of memoized functions under concurrent access
- Fixed duplicate key eviction in `SizedCache::cache_set`. This would manifest when `cached` functions called with duplicate keys would race to set an uncached key, or if `SizedCache` was used directly.
- Add `cached_result` and `cached_key_result` to allow the caching of success for a function that returns `Result`.
- Add `cached_control` macro to allow specifying functionality at key points of the macro
- Add `cached_key` macro to allow defining the caching key (see the `cached_key!` sketch after this list)
- Tweak `cached` macro syntax
- Update readme
- Update trait docs
- Update readme
- Update examples
- Update crate documentation and examples
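A minimal sketch of the `cache_reset` and `cache_reset_metrics` items above, assuming a `SizedCache` store that tracks hits/misses; the key and value types are arbitrary.

```rust
use cached::{Cached, SizedCache};

fn main() {
    let mut cache: SizedCache<String, u32> = SizedCache::with_size(5);
    cache.cache_set("k".to_string(), 1);
    let _ = cache.cache_get(&"k".to_string());       // recorded as a hit
    let _ = cache.cache_get(&"missing".to_string()); // recorded as a miss

    // Reset only the hit/miss counters; cached entries are untouched.
    cache.cache_reset_metrics();

    // Reset the collection itself back to its initial, empty state.
    cache.cache_reset();
    assert_eq!(cache.cache_size(), 0);
}
```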
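A sketch of the `cached::Return` wrapper, assuming the proc macro's `with_cached_flag = true` attribute and a `Return` type with public `was_cached`/`value` fields and a `Return::new` constructor.

```rust
use cached::proc_macro::cached;
use cached::Return;

// With `with_cached_flag = true` the function returns its value wrapped in `Return`.
#[cached(with_cached_flag = true)]
fn double(n: u64) -> Return<u64> {
    Return::new(n * 2)
}

fn main() {
    let first = double(21);
    assert!(!first.was_cached); // computed on the first call
    let second = double(21);
    assert!(second.was_cached); // served from the function's cache
    assert_eq!(second.value, 42);
}
```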
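A sketch of passing the `size` and `time` args together to the `cached` proc macro, which should back the function with the LRU + TTL behavior of `TimedSizedCache`; the limits used here are arbitrary.

```rust
use cached::proc_macro::cached;

// At most 100 entries, each expiring 30 seconds after insertion.
#[cached(size = 100, time = 30)]
fn lookup(key: String) -> String {
    // stand-in for an expensive computation
    format!("value for {}", key)
}

fn main() {
    assert_eq!(lookup("a".to_string()), "value for a"); // computed
    assert_eq!(lookup("a".to_string()), "value for a"); // cached until evicted or expired
}
```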
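A sketch of the `result=true` behavior: only the `Ok` value is stored, so the error type does not need to implement `Clone`. The error type here is a made-up placeholder.

```rust
use cached::proc_macro::cached;

#[derive(Debug)]
struct ParseError; // deliberately not Clone: errors are never stored in the cache

// Only successful results are cached when result = true.
#[cached(result = true)]
fn parse_num(s: String) -> Result<u64, ParseError> {
    s.parse::<u64>().map_err(|_| ParseError)
}

fn main() {
    assert_eq!(parse_num("42".to_string()).unwrap(), 42);
    assert!(parse_num("not a number".to_string()).is_err());
}
```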
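A sketch of `cache_set_lifespan`, assuming a `TimedCache` built with `with_lifespan` and that the previous lifespan comes back as an `Option`.

```rust
use cached::{Cached, TimedCache};

fn main() {
    // Entries initially expire 60 seconds after insertion.
    let mut cache: TimedCache<String, u32> = TimedCache::with_lifespan(60);
    cache.cache_set("k".to_string(), 1);

    // Shorten the lifespan to 10 seconds; the old lifespan is returned.
    let previous = cache.cache_set_lifespan(10);
    assert_eq!(previous, Some(60));
}
```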
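A sketch of driving a std `HashMap` through the `Cached` trait, using the trait's basic accessors.

```rust
use std::collections::HashMap;

use cached::Cached;

fn main() {
    // A plain HashMap can be used wherever a `Cached` store is expected.
    let mut cache: HashMap<String, u32> = HashMap::new();
    cache.cache_set("one".to_string(), 1);
    assert_eq!(cache.cache_get(&"one".to_string()), Some(&1));
}
```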
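A sketch of `cache_set` returning the previous value and of `cache_get_or_set_with`, assuming an `UnboundCache` store and that `cache_get_or_set_with` hands back a reference to the stored value.

```rust
use cached::{Cached, UnboundCache};

fn main() {
    let mut cache: UnboundCache<String, u32> = UnboundCache::new();

    // cache_set returns the value it replaced, if any.
    assert_eq!(cache.cache_set("a".to_string(), 1), None);
    assert_eq!(cache.cache_set("a".to_string(), 2), Some(1));

    // cache_get_or_set_with only runs the closure when the key is missing.
    let v = cache.cache_get_or_set_with("b".to_string(), || 10);
    assert_eq!(*v, 10);
    let v = cache.cache_get_or_set_with("b".to_string(), || 99);
    assert_eq!(*v, 10); // existing value returned; the closure is not called
}
```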
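A sketch of the proc macro form; with the `async` feature the same attribute also works on async functions (shown as a comment since it needs an executor to run).

```rust
use cached::proc_macro::cached;

// Memoizes results keyed by the function's arguments, using a default store.
#[cached]
fn fib(n: u64) -> u64 {
    if n <= 1 {
        return n;
    }
    fib(n - 1) + fib(n - 2)
}

// With the `async` feature enabled (illustrative only):
// #[cached]
// async fn fetch(url: String) -> String { /* ... */ }

fn main() {
    assert_eq!(fib(20), 6765);
}
```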
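A sketch of `cache_get_mut` for mutating a cached value in place, using a `SizedCache` as the store.

```rust
use cached::{Cached, SizedCache};

fn main() {
    let mut cache: SizedCache<String, Vec<u32>> = SizedCache::with_size(10);
    cache.cache_set("nums".to_string(), vec![1, 2]);

    // Mutate the cached value without removing and re-inserting it.
    if let Some(nums) = cache.cache_get_mut(&"nums".to_string()) {
        nums.push(3);
    }
    assert_eq!(cache.cache_get(&"nums".to_string()), Some(&vec![1, 2, 3]));
}
```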
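A sketch of the `cached_key!` declarative macro, where the caching key is built explicitly from the arguments; the store name, store type, and key expression are arbitrary choices.

```rust
#[macro_use]
extern crate cached;

use cached::SizedCache;

cached_key! {
    LENGTHS: SizedCache<String, usize> = SizedCache::with_size(50);
    Key = { format!("{}{}", a, b) };
    fn keyed(a: String, b: String) -> usize = {
        a.len() + b.len()
    }
}

fn main() {
    assert_eq!(keyed("foo".to_string(), "bar".to_string()), 6);
    // A second call with the same key is served from LENGTHS.
    assert_eq!(keyed("foo".to_string(), "bar".to_string()), 6);
}
```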