A very lightweight library for leveraging the power of Lazy&lt;T&gt; for caching at all layers of an application, with support for both
Sync &amp; Async Lazy operations to maximize server utilization and performance!

### Give Star 🌟
**If you like this project and/or use it then please give it a Star 🌟 (c'mon it's free, and it'll help others find the project)!**

### [Buy me a Coffee ☕](https://www.buymeacoffee.com/cajuncoding)
*I'm happy to share with the community, but if you find this useful (e.g. for professional use), and are so inclined,
then I do love-me-some-coffee!*

<a href="https://www.buymeacoffee.com/cajuncoding" target="_blank">
<img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174">
</a>

## Overview
The use of `Lazy<T>`, for loading/initializing of data, facilitates a self-populating cache (also known as
a blocking cache), so that even if many requests for the same cached data are triggered at the exact same
time, no more than one thread/request (sync or async) will ever perform the work -- dramatically decreasing
server utilization under high load.

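The blocking-cache behavior that `Lazy<T>` enables can be sketched with plain .NET primitives. This is not the LazyCacheHelpers API itself, just a minimal demonstration of the underlying principle (all names here are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public static class BlockingCacheDemo
{
    // ConcurrentDictionary + Lazy<T> forms a self-populating (blocking) cache:
    // the value factory executes at most once per key, no matter how many threads race.
    private static readonly ConcurrentDictionary<string, Lazy<string>> _cache =
        new ConcurrentDictionary<string, Lazy<string>>();

    private static int _factoryExecutionCount;

    public static string GetOrAdd(string key, Func<string> valueFactory)
    {
        // ExecutionAndPublication mode blocks concurrent callers so that only one
        // of them runs the (potentially expensive) factory; the rest simply wait
        // and then all receive the same cached result.
        var lazy = _cache.GetOrAdd(key,
            _ => new Lazy<string>(valueFactory, LazyThreadSafetyMode.ExecutionAndPublication));
        return lazy.Value;
    }

    public static int Run()
    {
        // Simulate 100 concurrent requests for the same cached data...
        Parallel.For(0, 100, _ =>
        {
            GetOrAdd("config", () =>
            {
                Interlocked.Increment(ref _factoryExecutionCount);
                Thread.Sleep(50); // simulate a slow/expensive load
                return "expensive-result";
            });
        });
        return _factoryExecutionCount; // the expensive load ran exactly once
    }

    public static void Main() => Console.WriteLine(Run()); // prints 1
}
```

Note that `ConcurrentDictionary.GetOrAdd` may invoke its *dictionary* factory more than once under contention, but only one `Lazy<string>` is ever stored, and only the stored instance has its `.Value` accessed -- so the expensive work still happens exactly once.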
LazyCacheHelpers also supports changing the underlying Cache Repository with different implementations via
the `ILazyCacheRepository` interface. A default implementation based on .NET `MemoryCache` is provided
(via the default `ICacheRepository` implementation, `LazyDotNetMemoryCacheRepository`), along with greatly
simplified support for self-populating (Lazy) initialization.

The default `MemoryCache` based implementation will work for the vast majority of small or medium projects,
but this flexibility makes migrating to distributed caches and other cache storage mechanisms easier in the future.

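To make the plug-in idea concrete, here is a hedged sketch of what a swappable repository abstraction looks like. This is NOT the library's actual `ILazyCacheRepository` contract -- the interface and member names below are invented purely for illustration:

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical illustration of a pluggable repository abstraction; the real
// ILazyCacheRepository is defined by LazyCacheHelpers and will differ.
public interface ISimpleCacheRepository
{
    object GetOrAddItem(string key, Func<object> valueFactory);
    void RemoveItem(string key);
}

// A trivial in-memory implementation; a distributed-cache (e.g. Redis-backed)
// implementation could later be swapped in without changing any calling code.
public class InMemoryCacheRepository : ISimpleCacheRepository
{
    private readonly ConcurrentDictionary<string, object> _store =
        new ConcurrentDictionary<string, object>();

    public object GetOrAddItem(string key, Func<object> valueFactory)
        => _store.GetOrAdd(key, _ => valueFactory());

    public void RemoveItem(string key) => _store.TryRemove(key, out _);
}
```

Because callers depend only on the interface, the storage mechanism becomes a deployment decision rather than a code change.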
To clarify, what this means is that if many requests for the cached data are submitted at or near the same time,
then one-and-only-one call (thread) will execute the long running process, while all other requests will benefit
from the resulting loaded data immediately, once it is ready. For example, if the long running process takes 3 seconds to complete
NOTE: The significant difference between this and the above more robust caching facade is that it does not
provide for any reclaiming of resources by garbage collection, etc. unless manually implemented via `WeakReference` yourself.

NOTE: It supports basic removal, but the `LazyStaticInMemoryCache<>` provides a pattern (of Lazy + ConcurrentDictionary) that is
best used for data that never changes once it is loaded/initialized (e.g. Reflection Results, Annotation Attribute cache, etc.).
In almost all cases, for data that changes over its life, the `LazyCache<>` above with support for cache expiration policy is the
better pattern to use, along with its intrinsic support of garbage collection pressure to reclaim resources.

```csharp
public class AttributeConfigReader
{
    [AttributeUsage(AttributeTargets.Class)]
    private class ConfigAttribute : Attribute
    {
        // . . . implement your design time configuration properties . . .
    }

    // By making the cache static it is now a global and thread-safe blocking cache; enabling only
    // . . .

        // how many or how fast multiple (e.g. hundreds/thousands) threads/requests come in for that same data!
        // NOTE: Exception handling is critical here -- because Lazy<> will cache the Exception -- and
        //       this class ensures that exceptions are never cached!
        var cacheResult = _lazyAttribConfigCache.GetOrAdd(typeof(T), (typeKey) =>
        {
            // NOTE: If an Exception occurs then the result will not be cached; only valid values
            //       will be cached (e.g. a safe response of null will be cached).
            var configAttribute = GetConfigAttributeInternal(typeKey);
            return configAttribute;
        });
```
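The comments above stress that `Lazy<T>` caches a thrown exception -- every later `.Value` access re-throws the same one. Below is a hedged sketch (illustrative names, not the library's internals) of the evict-on-exception pattern that keeps faulted entries out of such a cache:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Demonstrates why exceptions must never be cached: a faulted Lazy<T> would
// permanently re-throw, so the cache evicts it to allow a later retry.
public static class ExceptionSafeLazyCacheDemo
{
    private static readonly ConcurrentDictionary<string, Lazy<int>> _cache =
        new ConcurrentDictionary<string, Lazy<int>>();

    public static int GetOrAdd(string key, Func<int> valueFactory)
    {
        var lazy = _cache.GetOrAdd(key,
            _ => new Lazy<int>(valueFactory, LazyThreadSafetyMode.ExecutionAndPublication));
        try
        {
            return lazy.Value;
        }
        catch
        {
            // Evict the faulted Lazy so the next caller retries the factory
            // instead of receiving the permanently cached exception.
            _cache.TryRemove(key, out _);
            throw;
        }
    }
}
```

With this pattern a transient failure (e.g. a momentary error during the load) does not poison the cache; the very next request for that key simply runs the factory again.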