This repository was archived by the owner on Jan 29, 2026. It is now read-only.

[Database] Implement Atomic Operations for File-Based Storage #68


Description


⚠️ Priority: HIGH - Security & Stability

Background

The current file-based JSON database at backend/src/db/database.js uses non-atomic read-modify-write operations that can cause race conditions under concurrent load. Multiple simultaneous write operations could corrupt data or lose updates.

Current Implementation (Vulnerable to Race Conditions)

// backend/src/db/database.js (lines 85-99)
export async function updateWorkflow(id, updates) {
  const workflows = await getAllWorkflows(); // READ
  const index = workflows.findIndex(w => w.metadata?.id === id);
  
  if (index === -1) {
    throw new Error(`Workflow with id ${id} not found`);
  }
  
  // MODIFY
  workflows[index] = { 
    ...workflows[index], 
    ...updates,
    metadata: {
      ...workflows[index].metadata,
      ...updates.metadata,
      updatedAt: Date.now()
    }
  };
  
  // WRITE - no protection against concurrent writes
  await fs.writeFile(WORKFLOWS_FILE, JSON.stringify({ workflows }, null, 2));
  return workflows[index];
}

Concurrency Problems

Scenario: Two clients update the same workflow simultaneously

  1. Client A reads workflows.json (contains workflow v1)
  2. Client B reads workflows.json (also sees v1)
  3. Client A applies its change to v1 and writes the file
  4. Client B applies its change to its own copy of v1 and writes the file, overwriting A's write
  5. Result: Client A's update is silently lost (the classic lost update problem)
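
The lost update can be reproduced directly against the current module. A minimal sketch, assuming the module exports updateWorkflow and getAllWorkflows as shown above and that a workflow with id test-id already exists:

// race-demo.js: fire two overlapping updates against the same workflow.
// With the current read-modify-write implementation, one of the two fields
// is frequently missing afterwards; with atomic writes both must survive.
import { updateWorkflow, getAllWorkflows } from './backend/src/db/database.js';

const id = 'test-id'; // assumed to exist already

await Promise.all([
  updateWorkflow(id, { name: 'from-client-A' }),
  updateWorkflow(id, { description: 'from-client-B' }),
]);

const workflows = await getAllWorkflows();
const w = workflows.find(x => x.metadata?.id === id);
console.log(w.name, w.description); // one value is often missing today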

Similar issues exist in:

  • updateStoreState() (lines 167-183)
  • createSession() (lines 203-213)
  • deleteWorkflow() (lines 101-118)

Recommended Solutions

Option 1: File Locking (Quick Fix)

Use the proper-lockfile package for advisory file locks:

npm install proper-lockfile
// backend/src/db/database.js
import lockfile from 'proper-lockfile';

export async function updateWorkflow(id, updates) {
  const release = await lockfile.lock(WORKFLOWS_FILE, { retries: 3 });
  try {
    const workflows = await getAllWorkflows();
    const index = workflows.findIndex(w => w.metadata?.id === id);
    
    if (index === -1) {
      throw new Error(`Workflow with id ${id} not found`);
    }
    
    // Merge simplified for brevity; keep the metadata/updatedAt handling
    // from the current implementation in the real change.
    workflows[index] = { ...workflows[index], ...updates };
    await fs.writeFile(WORKFLOWS_FILE, JSON.stringify({ workflows }, null, 2));
    return workflows[index];
  } finally {
    await release();
  }
}
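
Since updateStoreState(), createSession(), and deleteWorkflow() need the same protection, the lock/release boilerplate can be factored into a shared helper. A sketch follows; withFileLock is a new helper name, the deleteWorkflow body is a paraphrase of the existing one, and WORKFLOWS_FILE and fs come from the existing module:

// backend/src/db/database.js (sketch)
import lockfile from 'proper-lockfile';

// Run fn while holding an advisory lock on file; retries smooths over
// short-lived contention from concurrent requests.
async function withFileLock(file, fn) {
  const release = await lockfile.lock(file, { retries: 3 });
  try {
    return await fn();
  } finally {
    await release();
  }
}

export async function deleteWorkflow(id) {
  return withFileLock(WORKFLOWS_FILE, async () => {
    const workflows = await getAllWorkflows();
    const remaining = workflows.filter(w => w.metadata?.id !== id);
    if (remaining.length === workflows.length) {
      throw new Error(`Workflow with id ${id} not found`);
    }
    await fs.writeFile(WORKFLOWS_FILE, JSON.stringify({ workflows: remaining }, null, 2));
  });
}

updateWorkflow(), updateStoreState(), and createSession() can wrap their existing bodies the same way, keeping each read-modify-write sequence unchanged inside the lock.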

Option 2: SQLite Migration (Recommended for Production)

Migrate to SQLite for ACID transactions:

npm install sqlite sqlite3
// backend/src/db/database.js
import { open } from 'sqlite';
import sqlite3 from 'sqlite3';

// path and DB_DIR are already available in the existing module
const db = await open({
  filename: path.join(DB_DIR, 'gemini-flow.db'),
  driver: sqlite3.Database
});

// Note: unlike the JSON version, this replaces the stored document wholesale;
// wrap a read-merge-write in a transaction if merge semantics must be
// preserved (see the schema/transaction sketch below).
export async function updateWorkflow(id, updates) {
  return await db.run(
    'UPDATE workflows SET data = ?, updated_at = ? WHERE id = ?',
    [JSON.stringify(updates), Date.now(), id]
  );
}
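
The UPDATE statement above assumes a workflows table already exists and replaces the stored document rather than merging into it. A possible schema plus a merge-preserving variant, as a sketch; the table layout and the updateWorkflowMerged name are assumptions chosen to match the statement above:

// One-time initialization: the table the UPDATE statement expects.
await db.exec(`
  CREATE TABLE IF NOT EXISTS workflows (
    id         TEXT PRIMARY KEY,
    data       TEXT NOT NULL,     -- JSON-serialized workflow document
    updated_at INTEGER NOT NULL   -- epoch milliseconds, as in Date.now()
  )
`);

// Read-merge-write inside an IMMEDIATE transaction, mirroring the merge
// semantics of the current JSON implementation.
export async function updateWorkflowMerged(id, updates) {
  await db.exec('BEGIN IMMEDIATE');
  try {
    const row = await db.get('SELECT data FROM workflows WHERE id = ?', id);
    if (!row) throw new Error(`Workflow with id ${id} not found`);
    const merged = { ...JSON.parse(row.data), ...updates };
    await db.run(
      'UPDATE workflows SET data = ?, updated_at = ? WHERE id = ?',
      [JSON.stringify(merged), Date.now(), id]
    );
    await db.exec('COMMIT');
    return merged;
  } catch (err) {
    await db.exec('ROLLBACK');
    throw err;
  }
}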

Files to Modify

  • backend/src/db/database.js (all write operations)
  • backend/package.json (add dependencies)
  • Add migration script if choosing SQLite

Acceptance Criteria

  • All database write operations are atomic
  • Concurrent operations do not cause data loss
  • Race condition tests pass (concurrent updates to same resource)
  • Performance is acceptable (measure before/after)
  • Database initialization handles migration from existing JSON files
  • Backward compatibility maintained for existing data
  • Documentation updated with concurrency guarantees

Performance Testing

# Concurrent update test
for i in {1..10}; do
  curl -X PUT http://localhost:3001/api/workflows/test-id \
    -H 'X-API-Key: key' -H 'Content-Type: application/json' \
    -d "{\"name\":\"Update $i\"}" &
done
wait
# Verify the file is still valid JSON and the final state matches one of the
# 10 updates (writes serialized, none lost or corrupted)
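
The verification step can also be scripted. A sketch in Node, assuming the same endpoint and API key as the curl loop, that a GET on the same URL returns the stored workflow, and Node 18+ for the global fetch:

// concurrency-check.js: fire 10 concurrent PUTs, then confirm the stored
// workflow is still valid JSON and matches one of the submitted updates.
const ENDPOINT = 'http://localhost:3001/api/workflows/test-id';
const HEADERS = { 'X-API-Key': 'key', 'Content-Type': 'application/json' };

const statuses = await Promise.all(
  Array.from({ length: 10 }, (_, i) =>
    fetch(ENDPOINT, {
      method: 'PUT',
      headers: HEADERS,
      body: JSON.stringify({ name: `Update ${i + 1}` }),
    }).then(r => r.status)
  )
);
console.log('statuses:', statuses); // expect all 2xx (or explicit conflict codes)

const res = await fetch(ENDPOINT, { headers: HEADERS });
const final = await res.json(); // throws if the stored data was corrupted
console.log('final name:', final.name);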

Migration Path (if choosing SQLite)

  1. Create SQLite schema matching JSON structure
  2. Write a migration script to import existing JSON data (see the sketch after this list)
  3. Update database.js to use SQLite
  4. Update tests
  5. Document rollback procedure
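
A minimal sketch of steps 1 and 2; the file locations, the { workflows: [...] } layout, and the table schema are assumptions based on the snippets above:

// scripts/migrate-json-to-sqlite.js (sketch)
import fs from 'fs/promises';
import path from 'path';
import { open } from 'sqlite';
import sqlite3 from 'sqlite3';

const DB_DIR = './backend/data'; // assumption: reuse the real DB_DIR from database.js
const WORKFLOWS_FILE = path.join(DB_DIR, 'workflows.json');

const db = await open({
  filename: path.join(DB_DIR, 'gemini-flow.db'),
  driver: sqlite3.Database,
});

await db.exec(`
  CREATE TABLE IF NOT EXISTS workflows (
    id TEXT PRIMARY KEY,
    data TEXT NOT NULL,
    updated_at INTEGER NOT NULL
  )
`);

// Import every workflow from the existing JSON file, keyed by metadata.id.
const { workflows = [] } = JSON.parse(await fs.readFile(WORKFLOWS_FILE, 'utf8'));
for (const w of workflows) {
  await db.run(
    'INSERT OR REPLACE INTO workflows (id, data, updated_at) VALUES (?, ?, ?)',
    [w.metadata?.id, JSON.stringify(w), w.metadata?.updatedAt ?? Date.now()]
  );
}

console.log(`Imported ${workflows.length} workflows`);
await db.close();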

Additional Context

This issue affects data integrity and should be resolved before production deployment or heavy concurrent usage.
