Fix deadlock issue #26
Changes from all commits
@@ -0,0 +1,125 @@

#if !os(Linux) && !os(Windows)
import XCTest
@testable import Cache

/// Tests designed to expose bugs in the Cache package
final class BugExposingTests: XCTestCase {

    // MARK: - Bug #1: PersistableCache Deadlock on Error

    /// This test exposes a deadlock bug in PersistableCache.
    /// When delete() throws an error, the NSLock is never released.
    /// Any subsequent call to save() or delete() will hang forever.
    func testPersistableCacheDeadlockAfterDeleteError() {
        enum Key: String {
            case test
        }

        let uniqueName = "deadlock_test_\(UUID().uuidString)"
        let cache: PersistableCache<Key, String, String> = PersistableCache(name: uniqueName)

        // First: delete() on non-existent file throws - but lock is never released!
        XCTAssertThrowsError(try cache.delete())

        // Now try to save - this should work but will DEADLOCK
        // because the lock was never released after the error above
        cache[.test] = "value"

        // Use a timeout to detect the deadlock
        let expectation = XCTestExpectation(description: "save() should complete")

        DispatchQueue.global().async {
            do {
                try cache.save()
                expectation.fulfill()
            } catch {
                expectation.fulfill() // Even if it fails, at least it didn't hang
            }
        }

        // Wait only 2 seconds - if it hangs, we have a deadlock
        let result = XCTWaiter.wait(for: [expectation], timeout: 2.0)

        // If this times out, the bug is confirmed
        XCTAssertNotEqual(result, .timedOut, "DEADLOCK DETECTED: save() hung after delete() error - lock was never released!")
    }

    // MARK: - LRUCache contains() Updates Recency
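For context on the fix this PR targets: the doc comment above describes a lock that is acquired, a call that throws, and an unlock that is never reached. Below is a minimal sketch of the usual defer-based remedy. It assumes PersistableCache guards its file operations with an NSLock, as the test comment says; the type and property names here (SketchPersistableCache, contents, fileURL) are illustrative and are not the package's actual implementation.

import Foundation

// Illustrative sketch only, not PersistableCache's real code.
// The deadlock pattern the test targets: lock(), a throwing call, then unlock().
// When the call throws, unlock() is never reached and every later caller blocks.
// Releasing the lock in a defer block runs on every exit path, including throws.
final class SketchPersistableCache<Key: Hashable, Value> {
    private let lock = NSLock()
    private var contents: [Key: Value] = [:]
    private let fileURL: URL

    init(fileURL: URL) {
        self.fileURL = fileURL
    }

    func delete() throws {
        lock.lock()
        defer { lock.unlock() }  // always released, even when removeItem(at:) throws
        try FileManager.default.removeItem(at: fileURL)
        contents = [:]
    }
}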
Copilot AI, Dec 5, 2025

[nitpick] The test comment states "This test validates that contains() updates LRU order", but the test name and context suggest this might be exposing a bug rather than validating correct behavior. Consider clarifying whether this is testing expected behavior or exposing a bug. If contains() updating recency is the intended behavior, the comment is accurate; if not, it should indicate that a bug is being exposed.

Suggested change:

-    // MARK: - LRUCache contains() Updates Recency
+    // MARK: - Bug #2: LRUCache contains() Updates Recency

-    /// This test validates that contains() updates LRU order.
-    /// Calling contains() is considered a "use" of the key, promoting it to most recently used.
+    /// This test exposes a bug: contains() updates LRU order.
+    /// Calling contains() is currently considered a "use" of the key, promoting it to most recently used.
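To make the semantic question concrete, here is a small toy sketch contrasting the two possible behaviors. It is not the package's LRUCache (whose internals are not part of this diff), just an illustration of the difference between a contains() that promotes the key and a peek-style check that leaves the eviction order alone.

// Toy LRU sketch, illustrative only; not the package's LRUCache implementation.
struct ToyLRU<Key: Hashable, Value> {
    private(set) var order: [Key] = []      // least recently used first
    private var storage: [Key: Value] = [:]
    let capacity: Int

    init(capacity: Int) {
        self.capacity = capacity
    }

    mutating func set(_ key: Key, _ value: Value) {
        storage[key] = value
        promote(key)
        if order.count > capacity, let evicted = order.first {
            order.removeFirst()
            storage[evicted] = nil
        }
    }

    // Variant A: contains() counts as a "use" and promotes the key,
    // so merely asking about a key changes which entry gets evicted next.
    mutating func containsPromoting(_ key: Key) -> Bool {
        guard storage[key] != nil else { return false }
        promote(key)
        return true
    }

    // Variant B: a pure query that leaves the eviction order untouched.
    func containsPeeking(_ key: Key) -> Bool {
        storage[key] != nil
    }

    private mutating func promote(_ key: Key) {
        order.removeAll { $0 == key }
        order.append(key)
    }
}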
Copilot AI, Dec 5, 2025

The test uses 100,000 iterations, which may make it slow and cause timeouts on CI systems or slower machines. Consider reducing this to a more reasonable number (e.g., 10,000 or 1,000) that still tests effectively for race conditions while keeping execution time reasonable.

Suggested change:

-        let iterations = 100_000
+        let iterations = 10_000
Copilot AI, Dec 5, 2025

The magic number 50 is used here and also on line 77 (capacity) and line 122 (final assertion). Consider extracting it into a constant at the test method level for better maintainability:

let capacity = 50
let cache = LRUCache<Int, Int>(capacity: capacity)
// ...
if count > capacity {
    // ...
}
// ...
XCTAssertLessThanOrEqual(finalCount, capacity, "Final cache size \(finalCount) exceeds capacity of \(capacity)")
Copilot AI, Dec 5, 2025

The expectation is fulfilled immediately after concurrentPerform completes (line 112), but concurrentPerform is a synchronous call that already blocks until all iterations finish. This makes the expectation and the wait on line 113 redundant. The expectation pattern is typically used for asynchronous operations, but here everything is already synchronous.

Consider removing the expectation entirely and asserting directly after concurrentPerform completes:

DispatchQueue.concurrentPerform(iterations: iterations) { i in
    // ... existing logic
}

// Report any consistency violations
if !errors.isEmpty {
    XCTFail("Race condition detected! Errors:\n\(errors.joined(separator: "\n"))")
}

// Final consistency check
let finalCount = cache.allValues.count
XCTAssertLessThanOrEqual(finalCount, 50, "Final cache size \(finalCount) exceeds capacity of 50")

The use of XCTestExpectation here is not correct. DispatchQueue.concurrentPerform is a synchronous call that blocks the current thread until all of its iterations are complete. This means the expectation is fulfilled immediately after the concurrent work is done, and wait(for:timeout:) returns right away without actually waiting for any asynchronous work. The test still works as intended for concurrency testing, but the expectation adds confusion. It's better to remove the expectation and the wait for clarity:

DispatchQueue.concurrentPerform(iterations: iterations) { i in
    let key = i % 30

    // Interleave different operations
    switch i % 4 {
    case 0:
        cache[key] = i
    case 1:
        _ = cache[key]
    case 2:
        _ = cache.contains(key)
    case 3:
        cache.remove(key)
    default:
        break
    }

    // Periodically check consistency
    if i % 1000 == 0 {
        let count = cache.allValues.count
        if count > 50 {
            errorLock.lock()
            errors.append("Cache exceeded capacity: \(count) > 50 at iteration \(i)")
            errorLock.unlock()
        }
    }
}

// Report any consistency violations
if !errors.isEmpty {
    XCTFail("Race condition detected! Errors:\n\(errors.joined(separator: "\n"))")
}

// Final consistency check
let finalCount = cache.allValues.count
XCTAssertLessThanOrEqual(finalCount, 50, "Final cache size \(finalCount) exceeds capacity of 50")
The test documentation states "This test exposes a deadlock bug in PersistableCache" (present tense), but with the accompanying fix in PersistableCache.swift, this bug is now resolved. Consider updating the documentation to clarify that this test verified the deadlock bug that has now been fixed, or that it serves as a regression test. For example:
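One possible regression-test wording along those lines, offered purely as an illustrative sketch rather than the reviewer's own example:

/// Regression test for a deadlock bug in PersistableCache that has since been fixed:
/// delete() used to throw without releasing its NSLock, so any subsequent call to
/// save() or delete() would hang forever. This test guards against that regression.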