README.md
@@ -14,7 +14,7 @@ This package may be able to help you!
This package works with in-place functions of the form `f(out, x)`, where:
1. `eltype(x) == eltype(out)`
- 2. `x` is of type Array.
+ 2. `x` is of type Array, Dict, SparseVector, or SparseArray.
3. By default, the caches are not thread-safe or async-safe. Future releases will add special cached types to deal with this; as a workaround, you can try creating new cached function instances using `deepcopy(f)`.

Help on easing those limits is appreciated.
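For reference, here is a minimal sketch of an in-place function matching the `f(out, x)` form described above. The name `f!` and its body are only illustrative, not part of this package:

```julia
# Illustrative in-place function: writes its result into the preallocated
# output `out`, with eltype(out) == eltype(x) as required above.
function f!(out, x)
    out .= x .^ 2 .+ 1   # broadcasted in-place assignment, no new allocations
    return out
end

x = rand(Float64, 5)
out = similar(x)   # same eltype and size as x
f!(out, x)
```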
@@ -50,13 +50,12 @@ julia> f
cached version of f! (function with 2 cached methods)
julia> calls(f)
5
- julia> methods(f)
+ julia> cached_methods(f)
IdDict{DataType,Function} with 2 entries:
  Float64 => #198
  Float32 => #198
```
- All the cached methods are stored in `methods(f)`. you can take one and use it if you want. each method is a closure
- with the specific cache created. and if the cache doesn't exists, it's created on the fly during runtime.
+ A dict with all the cached closures for each type is stored in `cached_methods(f)`. You can take one and use it if you want. If the cache doesn't exist, it's created on the fly at runtime.
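For example, assuming `f` is the cached function from the REPL session above; the call signature of the stored closure is an assumption here, since the diff only shows how the dict is keyed:

```julia
# cached_methods(f) returns an IdDict{DataType,Function}, keyed by element type.
g64 = cached_methods(f)[Float64]   # closure that carries the Float64 cache

# Assumption: the closure can be called directly on an input and reuses its
# internal Float64 cache rather than allocating a new output each time.
y = g64(rand(Float64, 5))
```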
What happens if I don't want to allocate during runtime? The solution: use `allocate!(f, Type)`.
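A small usage sketch of that call, again assuming `f` is the cached function from the example above:

```julia
allocate!(f, Float64)   # build the Float64 cache ahead of time

# Later work that needs the Float64 cache now finds it already allocated,
# instead of triggering an on-the-fly allocation at runtime.
```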