
Simplify mkCacheInt32/mkCacheGeneric to use ConcurrentDictionary #486

Merged

sergey-tihon merged 2 commits into repo-assist/fix-thread-safety-lazy-caches-481-a00dbff9b73cb3e8 from copilot/sub-pr-482 on Mar 22, 2026
Conversation


Copilot AI commented Mar 22, 2026

mkCacheInt32 and mkCacheGeneric were using a plain Dictionary guarded by lock syncObj, which serialized all cache access, including the expensive metadata decode work performed inside the factory function.

Changes

  • mkCacheInt32 / mkCacheGeneric: replaced Dictionary + lock with ConcurrentDictionary using TryGetValue / TryAdd. No locks, no syncObj.
let mkCacheInt32 lowMem _infile _nm _sz =
    if lowMem then (fun f x -> f x) else
    let cache = ConcurrentDictionary<int32, _>()
    fun f (idx:int32) ->
        match cache.TryGetValue idx with
        | true, v -> v
        | false, _ ->
            let v = f idx
            cache.TryAdd(idx, v) |> ignore
            cache.[idx]

GetOrAdd(key, factory) was not usable here — F# type inference cannot disambiguate it from GetOrAdd(key, value) when the value type is an unconstrained generic. TryAdd has no overloads and sidesteps the issue. On a concurrent cache miss, f idx may be computed twice (both GetOrAdd with factory and this pattern share that behaviour per the .NET docs), which is acceptable since these factories are pure metadata reads.
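The PR body only shows mkCacheInt32. A plausible sketch of the generic counterpart, assuming it follows the same lowMem gate and TryGetValue/TryAdd pattern (the parameter names and the structural comparer here are assumptions, not the merged code):

```fsharp
open System.Collections.Concurrent

// Hypothetical sketch of mkCacheGeneric; the actual signature in
// AssemblyReader may differ. Uses structural equality for generic keys.
let mkCacheGeneric lowMem _inbase _nm _sz =
    if lowMem then (fun f x -> f x) else
    let cache = ConcurrentDictionary<_, _>(HashIdentity.Structural)
    fun f key ->
        match cache.TryGetValue key with
        | true, v -> v
        | false, _ ->
            let v = f key          // may run twice under a concurrent miss
            cache.TryAdd(key, v) |> ignore
            cache.[key]
```

As with mkCacheInt32, a concurrent miss can invoke f twice, but only one result is ever stored and returned to all callers.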



Copilot AI changed the title from "[WIP] [Repo Assist] Fix thread-safety races in member-wrapper caches" to "Simplify mkCacheInt32/mkCacheGeneric to use ConcurrentDictionary" on Mar 22, 2026
Copilot AI requested a review from sergey-tihon March 22, 2026 07:00
@sergey-tihon sergey-tihon marked this pull request as ready for review March 22, 2026 07:04
Copilot AI review requested due to automatic review settings March 22, 2026 07:04
@sergey-tihon sergey-tihon merged commit c7c8212 into repo-assist/fix-thread-safety-lazy-caches-481-a00dbff9b73cb3e8 Mar 22, 2026
8 checks passed
@sergey-tihon sergey-tihon deleted the copilot/sub-pr-482 branch March 22, 2026 07:05

Copilot AI left a comment


Pull request overview

This PR updates the AssemblyReader caching helpers to reduce contention during metadata decoding by replacing Dictionary + lock with ConcurrentDictionary, enabling concurrent cache access.

Changes:

  • Replaced the lock-protected Dictionary caches in mkCacheInt32 and mkCacheGeneric with ConcurrentDictionary.
  • Implemented cache lookup/insert via TryGetValue + TryAdd to avoid GetOrAdd overload-resolution issues in this context.


Comment on lines +4589 to +4596

      let cache = ConcurrentDictionary<int32, _>()
      fun f (idx:int32) ->
  -       lock syncObj (fun () ->
  -           let mutable res = Unchecked.defaultof<_>
  -           if cache.TryGetValue(idx, &res) then res
  -           else
  -               let v = f idx
  -               cache.[idx] <- v
  -               v)
  +       match cache.TryGetValue idx with
  +       | true, v -> v
  +       | false, _ ->
  +           let v = f idx
  +           cache.TryAdd(idx, v) |> ignore
  +           cache.[idx]

Copilot AI Mar 22, 2026


cache.TryGetValue idx uses the F# out-arg-to-tuple translation, which allocates a tuple on every cache access (including hits). Since this cache is likely hot, consider using the byref overload (let mutable res = Unchecked.defaultof<_>; if cache.TryGetValue(idx, &res) then res else ...) to avoid per-call allocations. Also, after TryAdd you can return v directly when the add succeeds to avoid a second dictionary lookup.
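The allocation-free variant the review suggests could be sketched as follows (assuming the surrounding mkCacheInt32 shape from the PR body; this is not the merged code):

```fsharp
open System.Collections.Concurrent

// Sketch: use the byref TryGetValue overload so cache hits do not
// allocate the tuple produced by F#'s out-arg-to-tuple translation.
let mkCacheInt32Byref lowMem =
    if lowMem then (fun f x -> f x) else
    let cache = ConcurrentDictionary<int32, _>()
    fun f (idx: int32) ->
        let mutable res = Unchecked.defaultof<_>
        if cache.TryGetValue(idx, &res) then res   // hit path: no allocation
        else
            let v = f idx
            cache.TryAdd(idx, v) |> ignore
            cache.[idx]
```

The byref overload has an explicit out parameter, so it also avoids the GetOrAdd overload-ambiguity issue mentioned in the PR description.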

Comment on lines +4602 to +4607

      match cache.TryGetValue idx with
      | true, v -> v
      | false, _ ->
          let v = f idx
          cache.TryAdd(idx, v) |> ignore
          cache.[idx]

Copilot AI Mar 22, 2026


After computing v on a miss, the code does TryAdd and then unconditionally reads cache.[idx], which forces another lookup. Consider returning v when TryAdd succeeds (and only falling back to reading from the dictionary when it fails due to a concurrent add). This keeps the concurrent behavior but avoids extra work on every miss.
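Combining both review suggestions, the miss path could branch on TryAdd's result instead of re-reading the dictionary (a sketch, not the merged code; the wrapper name is hypothetical):

```fsharp
open System.Collections.Concurrent

// Sketch: byref lookup on the hit path, and return v directly when
// TryAdd wins, so a miss costs exactly one dictionary lookup.
let mkCacheInt32Fast lowMem =
    if lowMem then (fun f x -> f x) else
    let cache = ConcurrentDictionary<int32, _>()
    fun f (idx: int32) ->
        let mutable res = Unchecked.defaultof<_>
        if cache.TryGetValue(idx, &res) then res
        else
            let v = f idx
            if cache.TryAdd(idx, v) then v
            else cache.[idx]   // lost the race; return the winner's value
```

This keeps the same observable behavior (one stored value shared by all callers) while removing the redundant read on the common, uncontended miss.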

