Migrate To SQL Server #163

@naasanov

Description

Motivation

UNC's infrastructure is built on MS SQL Server. If we want our app to integrate with it, we need to move from Postgres to SQL Server.

Deliverables

  • Migrate source code (column types and db engine setup)
  • Migrate test code
  • Migrate dev environment
  • Migrate GH Actions

Important Notes

  • This could take a long time, so it can be split across sprints as long as significant work is done each sprint.

Below is a migration report generated by Claude Code after scanning the codebase for migration considerations. Note that since it is AI-generated, some suggestions may not work or may be out of date, and the report may not cover every consideration.


PostgreSQL to Microsoft SQL Server Migration Report

Application: Party Registration System
Current Database: PostgreSQL (via asyncpg)
Target Database: Microsoft SQL Server Express
Date: 2026-01-24 (Updated)


Executive Summary

This Party Registration System currently uses PostgreSQL with SQLAlchemy 2.0 (async) and asyncpg driver. Migrating to SQL Server Express will require changes to:

  • Database drivers and connection strings
  • Schema constraints (ENUM types, datetime types, regex patterns)
  • Test infrastructure (table truncation patterns)
  • Docker/deployment configuration
  • Database management scripts

Estimated Scope: Moderate - Most changes are configuration-based, with some schema adjustments required.


1. DATABASE DRIVERS & DEPENDENCIES

Current Setup

sqlalchemy = "~2.0.0"
asyncpg = "~0.31.0"          # PostgreSQL async driver
psycopg2-binary = "~2.9.0"   # PostgreSQL sync driver

Required Changes

Recommended: asyncio-based setup

sqlalchemy = "~2.0.0"        # Keep same version
aioodbc = "~0.5.0"           # Async ODBC driver
pyodbc = "~5.0.0"            # ODBC driver (sync for scripts)

Recommendation: Use aioodbc + pyodbc to maintain async/await patterns throughout the codebase.

Note: SQLAlchemy 2.0.23+ includes native support for mssql+aioodbc dialect, making this a well-supported async solution for SQL Server.
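For reference, a minimal sketch of how the async engine and session factory might be wired up against the mssql+aioodbc dialect. The settings object and module paths are assumptions based on the configuration changes in section 2, not existing application code.

# backend/src/core/database.py (illustrative path; adjust to the project's layout)
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

from src.core.config import settings  # assumed import path

# database_url resolves to an mssql+aioodbc://... URL as shown in section 2
engine = create_async_engine(settings.database_url, pool_pre_ping=True)

# Session factory used by the async repositories / FastAPI dependencies
SessionFactory = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)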

Additional System Requirements

For Linux (Docker container):

# Add to .devcontainer/Dockerfile
RUN apt-get update && apt-get install -y \
    unixodbc \
    unixodbc-dev \
    freetds-dev \
    freetds-bin \
    tdsodbc \
    && apt-get clean

# Install Microsoft ODBC Driver 18 for SQL Server
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
    && curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list \
    && apt-get update \
    && ACCEPT_EULA=Y apt-get install -y msodbcsql18

2. CONNECTION CONFIGURATION

Current Configuration

File: [backend/src/core/config.py](backend/src/core/config.py)

# Current PostgreSQL
DATABASE_URL = f"postgresql+asyncpg://{user}:{password}@{host}:{port}/{database}"
DATABASE_URL_SYNC = f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{database}"

New Configuration

from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    # SQL Server environment variables
    MSSQL_DATABASE: str = "ocsl"
    MSSQL_USER: str = "sa"
    MSSQL_PASSWORD: str = "YourStrong!Passw0rd"
    MSSQL_HOST: str = "db"
    MSSQL_PORT: int = 1433
    MSSQL_DRIVER: str = "ODBC Driver 18 for SQL Server"
    MSSQL_TRUST_SERVER_CERTIFICATE: str = "yes"  # For dev/test environments

    @property
    def database_url(self) -> str:
        """Async connection string for SQL Server"""
        driver = self.MSSQL_DRIVER.replace(" ", "+")
        return (
            f"mssql+aioodbc://{self.MSSQL_USER}:{self.MSSQL_PASSWORD}"
            f"@{self.MSSQL_HOST}:{self.MSSQL_PORT}/{self.MSSQL_DATABASE}"
            f"?driver={driver}"
            f"&TrustServerCertificate={self.MSSQL_TRUST_SERVER_CERTIFICATE}"
        )

    @property
    def database_url_sync(self) -> str:
        """Sync connection string for scripts"""
        driver = self.MSSQL_DRIVER.replace(" ", "+")
        return (
            f"mssql+pyodbc://{self.MSSQL_USER}:{self.MSSQL_PASSWORD}"
            f"@{self.MSSQL_HOST}:{self.MSSQL_PORT}/{self.MSSQL_DATABASE}"
            f"?driver={driver}"
            f"&TrustServerCertificate={self.MSSQL_TRUST_SERVER_CERTIFICATE}"
        )

Environment Variables

File: [backend/.env](backend/.env)

# Replace PostgreSQL variables with:
MSSQL_DATABASE=ocsl
MSSQL_USER=sa
MSSQL_PASSWORD=YourStrong!Passw0rd
MSSQL_HOST=db
MSSQL_PORT=1433
MSSQL_DRIVER=ODBC Driver 18 for SQL Server
MSSQL_TRUST_SERVER_CERTIFICATE=yes

Note: SQL Server requires strong passwords by default (uppercase, lowercase, numbers, special characters).


3. SCHEMA CHANGES

A. ENUM Types

Issue: PostgreSQL uses native ENUM types. SQL Server does not have native ENUMs.

Current Entities with ENUMs:

  • accounts.role → AccountRole (STUDENT, STAFF, ADMIN, POLICE)
  • students.contact_preference → ContactPreference (EMAIL, PHONE_CALL, TEXT_MESSAGE)
  • parties.contact_two_contact_preference → ContactPreference

Recommended Solution: Use SQLAlchemy's Enum with VARCHAR

from sqlalchemy import Enum
import enum

class AccountRole(str, enum.Enum):
    STUDENT = "STUDENT"
    STAFF = "STAFF"
    ADMIN = "ADMIN"
    POLICE = "POLICE"

class AccountEntity(EntityBase):
    role: Mapped[AccountRole] = mapped_column(
        Enum(AccountRole, native_enum=False, length=20),  # native_enum=False for SQL Server
        nullable=False,
        default=AccountRole.STUDENT
    )

Changes Required:

  1. [backend/src/modules/account/account_entity.py](backend/src/modules/account/account_entity.py) - Update role field
  2. [backend/src/modules/student/student_entity.py](backend/src/modules/student/student_entity.py) - Update contact_preference field
  3. [backend/src/modules/party/party_entity.py](backend/src/modules/party/party_entity.py) - Update contact_two_contact_preference field

B. DateTime with Timezone

Issue: SQL Server handles timezones differently than PostgreSQL.

SQL Server has two options:

  • DATETIME2 - No timezone info (stored as-is)
  • DATETIMEOFFSET - Includes timezone offset

Current Code:

party_datetime: Mapped[datetime] = mapped_column(
    DateTime(timezone=True),  # PostgreSQL TIMESTAMPTZ
    nullable=False
)

Recommended Solution: Use DATETIMEOFFSET

from sqlalchemy.dialects.mssql import DATETIMEOFFSET

party_datetime: Mapped[datetime] = mapped_column(
    DATETIMEOFFSET,
    nullable=False
)

Affected Files:

C. Regex Constraints

Issue: PostgreSQL uses the ~ regex operator. SQL Server doesn't support this.

Current Constraint:

CheckConstraint("pid ~ '^[0-9]{9}$'", name="valid_pid_format")

SQL Server Solution:

# Option 1: LIKE pattern
CheckConstraint("pid LIKE '[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]'", name="valid_pid_format")

# Option 2: LEN check (more flexible)
CheckConstraint("LEN(pid) = 9 AND pid NOT LIKE '%[^0-9]%'", name="valid_pid_format")

Affected File:


4. DATABASE SCRIPTS CHANGES

A. Create Database Script

File: [backend/script/create_db.py](backend/script/create_db.py)

Current PostgreSQL:

engine = create_engine(
    "postgresql+psycopg2://...",
    isolation_level="AUTOCOMMIT"
)
with engine.connect() as conn:
    result = conn.execute(text("SELECT 1 FROM pg_database WHERE datname='ocsl'"))
    if not result.fetchone():
        conn.execute(text("CREATE DATABASE ocsl"))

SQL Server Version:

# Connect to master database
master_url = f"mssql+pyodbc://{user}:{password}@{host}:{port}/master?driver={driver}"
engine = create_engine(master_url, isolation_level="AUTOCOMMIT")

with engine.connect() as conn:
    # Check if database exists
    result = conn.execute(
        text("SELECT database_id FROM sys.databases WHERE name = :dbname"),
        {"dbname": settings.MSSQL_DATABASE}
    )
    if not result.fetchone():
        # Create database
        conn.execute(text(f"CREATE DATABASE [{settings.MSSQL_DATABASE}]"))
        print(f"Database '{settings.MSSQL_DATABASE}' created successfully.")
    else:
        print(f"Database '{settings.MSSQL_DATABASE}' already exists.")

Key Changes:

  • Connect to master database instead of postgres database
  • Use sys.databases instead of pg_database
  • Use [database_name] brackets for identifiers

B. Delete Database Script

File: [backend/script/delete_db.py](backend/script/delete_db.py)

SQL Server Version:

master_url = f"mssql+pyodbc://{user}:{password}@{host}:{port}/master?driver={driver}"
engine = create_engine(master_url, isolation_level="AUTOCOMMIT")

with engine.connect() as conn:
    # Set database to single-user mode to disconnect all users
    conn.execute(text(f"""
        IF EXISTS (SELECT * FROM sys.databases WHERE name = '{settings.MSSQL_DATABASE}')
        BEGIN
            ALTER DATABASE [{settings.MSSQL_DATABASE}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
            DROP DATABASE [{settings.MSSQL_DATABASE}];
        END
    """))
    print(f"Database '{settings.MSSQL_DATABASE}' deleted successfully.")

Key Changes:

  • Must disconnect users before dropping (SET SINGLE_USER)
  • Different syntax than PostgreSQL

C. Reset Dev Script

File: [backend/script/reset_dev.py](backend/script/reset_dev.py)

Current PostgreSQL:

server_engine = create_engine(server_url(sync=True), isolation_level="AUTOCOMMIT")

with server_engine.connect() as connection:
    print("Deleting database...")
    connection.execute(text(f"DROP DATABASE {env.POSTGRES_DATABASE}"))

    print("Recreating database...")
    connection.execute(text(f"CREATE DATABASE {env.POSTGRES_DATABASE}"))

SQL Server Version:

server_engine = create_engine(server_url(sync=True), isolation_level="AUTOCOMMIT")

with server_engine.connect() as connection:
    print("Deleting database...")
    # Must disconnect users before dropping
    connection.execute(text(f"""
        IF EXISTS (SELECT * FROM sys.databases WHERE name = '{env.MSSQL_DATABASE}')
        BEGIN
            ALTER DATABASE [{env.MSSQL_DATABASE}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
            DROP DATABASE [{env.MSSQL_DATABASE}];
        END
    """))

    print("Recreating database...")
    connection.execute(text(f"CREATE DATABASE [{env.MSSQL_DATABASE}]"))

Key Changes:

  • Use MSSQL_DATABASE instead of POSTGRES_DATABASE environment variable
  • Use SQL Server DROP/CREATE syntax with brackets
  • Must set SINGLE_USER mode before dropping to disconnect active connections

5. TESTING INFRASTRUCTURE CHANGES

Test Configuration

File: [backend/test/conftest.py](backend/test/conftest.py)

Major Change: Table Truncation Pattern

Current PostgreSQL:

# Temporarily disable foreign key checks
await conn.execute(text("SET session_replication_role = 'replica';"))

# Truncate all tables
for table_name in reversed(tables):
    await conn.execute(text(f"TRUNCATE TABLE {table_name} RESTART IDENTITY CASCADE;"))

# Re-enable foreign key checks
await conn.execute(text("SET session_replication_role = 'origin';"))

SQL Server Replacement:

# Disable foreign key checks
await conn.execute(text("EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'"))

# Clear all tables and reset identity columns.
# Note: TRUNCATE TABLE cannot be used on tables referenced by foreign keys in
# SQL Server (even when the constraints are disabled), so DELETE is used instead.
for table_name in reversed(tables):
    await conn.execute(text(f"DELETE FROM [{table_name}]"))
    # Reset identity column (only applies to tables that have one)
    await conn.execute(text(f"DBCC CHECKIDENT ('[{table_name}]', RESEED, 0)"))

# Re-enable foreign key checks
await conn.execute(text("EXEC sp_MSforeachtable 'ALTER TABLE ? CHECK CONSTRAINT ALL'"))

Test Database Setup

File: [backend/test/conftest.py](backend/test/conftest.py)

# Update DATABASE_URL for tests
DATABASE_URL = settings.database_url.replace(
    f"/{settings.MSSQL_DATABASE}",
    "/ocsl_test"
)

GitHub Actions CI Workflow

File: [.github/workflows/backend-test.yml](.github/workflows/backend-test.yml)

The CI workflow needs significant changes to use SQL Server instead of PostgreSQL.

Current PostgreSQL:

services:
  postgres:
    image: postgres:latest
    env:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: ocsl_test
    ports:
      - 5432:5432
    options: >-
      --health-cmd pg_isready
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5

steps:
  # ...
  - name: Run pytest
    env:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_HOST: localhost
      POSTGRES_PORT: 5432
      POSTGRES_DB: ocsl_test

SQL Server Version:

services:
  sqlserver:
    image: mcr.microsoft.com/mssql/server:2022-latest
    env:
      ACCEPT_EULA: Y
      MSSQL_SA_PASSWORD: YourStrong!Passw0rd
      MSSQL_PID: Express
    ports:
      - 1433:1433
    options: >-
      --health-cmd "/opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P 'YourStrong!Passw0rd' -Q 'SELECT 1' -C"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 10
      --health-start-period 30s

steps:
  - name: Install ODBC Driver
    run: |
      curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
      curl https://packages.microsoft.com/config/ubuntu/22.04/prod.list | sudo tee /etc/apt/sources.list.d/mssql-release.list
      sudo apt-get update
      sudo ACCEPT_EULA=Y apt-get install -y msodbcsql18 unixodbc-dev

  - name: Install dependencies
    run: |
      python -m pip install --upgrade pip
      pip install -e backend/

  - name: Create test database
    run: |
      /opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "YourStrong!Passw0rd" -C -Q "CREATE DATABASE ocsl_test"

  - name: Run pytest
    env:
      MSSQL_USER: sa
      MSSQL_PASSWORD: YourStrong!Passw0rd
      MSSQL_HOST: localhost
      MSSQL_PORT: 1433
      MSSQL_DATABASE: ocsl_test
      MSSQL_DRIVER: ODBC Driver 18 for SQL Server
      MSSQL_TRUST_SERVER_CERTIFICATE: yes
      GOOGLE_MAPS_API_KEY: "test_api_key_not_used"
    run: |
      cd /home/runner/work/party-registration/party-registration
      PYTHONPATH=backend:backend/src pytest backend/test -v --tb=long --junitxml=backend/test-results/junit.xml

Key Changes:

  1. Service image: mcr.microsoft.com/mssql/server:2022-latest (stable, well-tested)
  2. Health check: Uses sqlcmd with -Q 'SELECT 1' flag to execute query, with longer start period (30s)
  3. New step: Install ODBC drivers on the runner
  4. New step: Create test database before running tests
  5. Environment variables: Replace all POSTGRES_* with MSSQL_* equivalents

Note on SQL Server 2025: While SQL Server 2025 offers a 50 GB database limit (vs 10 GB in 2022), the 2025-latest image has had some stability issues. For production CI/CD, stick with 2022-latest until 2025-GA is released and proven stable.


6. DOCKER & DEPLOYMENT CHANGES

Docker Compose

File: [.devcontainer/docker-compose.yml](.devcontainer/docker-compose.yml)

Current PostgreSQL:

services:
  db:
    image: postgres:latest
    environment:
      POSTGRES_DB: ocsl
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: admin
    ports:
      - "5432:5432"
    volumes:
      - party-reg-postgres:/var/lib/postgresql/data

SQL Server Express:

services:
  db:
    image: mcr.microsoft.com/mssql/server:2022-latest
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "YourStrong!Passw0rd"
      MSSQL_PID: "Express"
    ports:
      - "1433:1433"
    volumes:
      - party-reg-mssql:/var/opt/mssql
    deploy:
      resources:
        limits:
          memory: 2G

volumes:
  party-reg-mssql:

Key Changes:

  1. Image: mcr.microsoft.com/mssql/server:2022-latest (stable and well-tested)
  2. Port: 1433 instead of 5432
  3. Environment variables:
    • ACCEPT_EULA=Y (required)
    • MSSQL_SA_PASSWORD (strong password required)
    • MSSQL_PID=Express (use Express edition)
  4. Volume path: /var/opt/mssql instead of /var/lib/postgresql/data
  5. Memory limit: SQL Server needs at least 2GB

Alternative - SQL Server 2025 (Future Upgrade):

# When SQL Server 2025 stabilizes, upgrade to get 50 GB database limit
image: mcr.microsoft.com/mssql/server:2025-latest
# All other settings remain the same

Dockerfile

File: [.devcontainer/Dockerfile](.devcontainer/Dockerfile)

Add after existing RUN commands:

# Install ODBC drivers for SQL Server
RUN apt-get update && apt-get install -y \
    curl \
    apt-transport-https \
    gnupg2 \
    unixodbc \
    unixodbc-dev \
    && curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
    && curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list \
    && apt-get update \
    && ACCEPT_EULA=Y apt-get install -y msodbcsql18 mssql-tools18 \
    && echo 'export PATH="$PATH:/opt/mssql-tools18/bin"' >> ~/.bashrc \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

This adds:

  • Microsoft ODBC Driver 18 for SQL Server
  • SQL Server command-line tools (sqlcmd, bcp)
  • unixODBC framework

7. DEVELOPMENT ENVIRONMENT SETUP

Local Development

Prerequisites:

  1. Docker with 2GB+ memory available for SQL Server
  2. Update [.devcontainer/docker-compose.yml](.devcontainer/docker-compose.yml) as shown above
  3. Update [.devcontainer/Dockerfile](.devcontainer/Dockerfile) as shown above
  4. Update [.devcontainer/post_create.sh](.devcontainer/post_create.sh) as shown below
  5. Update [backend/.env](backend/.env) with SQL Server variables

Post-Create Script Changes

Issue: SQL Server takes ~30 seconds to start, unlike PostgreSQL which starts in seconds. The post_create.sh script runs database setup commands that will fail if SQL Server isn't ready.

File: [.devcontainer/post_create.sh](.devcontainer/post_create.sh)

Recommended Solution: Wait in post_create.sh

Add a wait loop before the database scripts. This allows maximum parallelism - the app container builds and installs dependencies while SQL Server starts, and only waits right before database setup.

#!/bin/bash
set -e

echo "================== Installing pre-commit hooks ================="
cd ..
pre-commit install
pre-commit install-hooks

echo ""
echo "=============== Installing frontend dependencies ==============="
cd ./frontend
npm i --verbose

echo ""
echo "==================== Setting up the database ==================="
cd ../backend

# Wait for SQL Server to be ready
echo "Waiting for SQL Server to be ready..."
until /opt/mssql-tools18/bin/sqlcmd -S db -U sa -P "$MSSQL_SA_PASSWORD" -Q "SELECT 1" -C &>/dev/null; do
  sleep 2
done
echo "SQL Server is ready!"

python -m script.create_db
python -m script.create_test_db
python -m script.reset_dev

Why this approach:

  • Containers start in parallel (no wasted time)
  • Dockerfile builds, pip installs happen while SQL Server is starting
  • pre-commit hooks and npm install run while SQL Server is starting
  • Only the database scripts wait - and by then, SQL Server is likely already ready

Alternative Solution: Health Check in docker-compose.yml

If you prefer container orchestration to handle the wait, you can add a health check and depends_on condition. Note: This is less efficient because the entire app container waits for SQL Server before starting.

services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
    volumes:
      - ..:/workspace
    depends_on:
      db:
        condition: service_healthy

  db:
    image: mcr.microsoft.com/mssql/server:2022-latest
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "YourStrong!Passw0rd"
      MSSQL_PID: "Express"
    ports:
      - "1433:1433"
    volumes:
      - party-reg-mssql:/var/opt/mssql
    deploy:
      resources:
        limits:
          memory: 2G
    healthcheck:
      test: /opt/mssql-tools18/bin/sqlcmd -S localhost -U sa -P "$$MSSQL_SA_PASSWORD" -Q "SELECT 1" -C
      interval: 10s
      timeout: 5s
      retries: 10
      start_period: 30s

volumes:
  party-reg-mssql:

Trade-off: Simpler post_create.sh, but app container build/setup is blocked until SQL Server is healthy (~30+ seconds of wasted parallel time).

devcontainer.json Changes

File: [.devcontainer/devcontainer.json](.devcontainer/devcontainer.json)

Update the forwarded ports and VS Code extension:

{
  "forwardPorts": [1433, 5050],  // Changed from 5432 to 1433
  "customizations": {
    "vscode": {
      "extensions": [
        // ... other extensions ...
        "ms-mssql.mssql",  // Replace "ckolkman.vscode-postgres"
        // ... other extensions ...
      ]
    }
  }
}

VSCode Database Explorer

Current Configuration (README):

Hostname: db
User: postgres
Password: admin
Port: 5432
Database: ocsl

New Configuration:

Server: db
Authentication Type: SQL Login
User: sa
Password: YourStrong!Passw0rd
Port: 1433
Database: ocsl
Encrypt: false (for local dev)
Trust Server Certificate: true

Recommended Extension: "SQL Server (mssql)" by Microsoft


8. QUERY PATTERN CONSIDERATIONS

String Comparison Case Sensitivity

PostgreSQL: Case-sensitive by default
SQL Server: Case-insensitive by default (depends on collation)

Impact: Queries like where(AccountEntity.email == email) might behave differently.

Solution: Be explicit if case-sensitivity matters:

# Case-insensitive (works on both)
.where(func.lower(AccountEntity.email) == email.lower())
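If a truly case-sensitive comparison is needed on SQL Server, a case-sensitive collation can be forced on the column expression. A sketch, assuming the AccountEntity import path and using a common SQL Server collation name (verify against the actual server collation):

from sqlalchemy import collate, select

from src.modules.account.account_entity import AccountEntity  # assumed import path

def find_account_case_sensitive(email: str):
    # Override the column's default collation with a case-sensitive one for this comparison
    return select(AccountEntity).where(
        collate(AccountEntity.email, "SQL_Latin1_General_CP1_CS_AS") == email
    )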

Date/Time Functions

If you add custom functions later:

  • PostgreSQL: NOW(), CURRENT_TIMESTAMP
  • SQL Server: GETDATE(), CURRENT_TIMESTAMP

Recommendation: Use SQLAlchemy's func.now() or func.current_timestamp() for portability.
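For example, a portable query written with func.now(), which SQLAlchemy renders as a backend-appropriate current-timestamp expression. This is an illustrative query, not existing application code; the PartyEntity import path is assumed from the report's file list.

from sqlalchemy import func, select

from src.modules.party.party_entity import PartyEntity  # assumed import path

# func.now() compiles to a valid current-timestamp expression on both PostgreSQL
# and SQL Server, so this query needs no change during the migration
stmt = select(PartyEntity).where(PartyEntity.party_datetime <= func.now())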


9. MIGRATION CHECKLIST

Pre-Migration

  • Backup current PostgreSQL database
  • Document current data volume
  • Test SQL Server Express in separate environment
  • Verify SQL Server Express is acceptable for school IT requirements

Code Changes

Infrastructure Changes

Testing

  • Run all unit tests (pytest)
  • Manual testing of key features
  • Performance testing with realistic data volume
  • Database size monitoring

Documentation

  • Update [README.md](README.md) with new database connection info
  • Update development setup instructions
  • Document any SQL Server-specific gotchas

10. ESTIMATED EFFORT

Lines of Code Changed

  • Configuration: ~100 lines
  • Entity definitions: ~50 lines
  • Scripts: ~120 lines (includes reset_dev.py)
  • Tests: ~40 lines
  • Docker/infra: ~50 lines
  • CI/CD workflow: ~50 lines
  • Total: ~410 lines

11. CONCLUSION

Migrating from PostgreSQL to SQL Server Express is feasible for this application. The use of SQLAlchemy ORM minimizes the impact, with most changes being configuration-based.

Key Areas Requiring Changes:

  1. Database drivers and connection strings
  2. ENUM type handling (use native_enum=False)
  3. DateTime timezone handling (use DATETIMEOFFSET)
  4. Regex constraint patterns (use LIKE or LEN)
  5. Test infrastructure (truncation pattern)
  6. Docker configuration (new image and drivers)
  7. CI/CD workflow (GitHub Actions)

SQL Server Express 2022 vs 2025 Considerations:

SQL Server Express 2022 (Recommended for Initial Migration)

  • Database Size Limit: 10 GB per database
  • Buffer Pool Memory: ~1.4 GB (1410 MB)
  • CPU Limit: Lesser of 1 socket or 4 cores
  • Status: Stable, well-tested, production-ready
  • Docker Image: mcr.microsoft.com/mssql/server:2022-latest

SQL Server Express 2025 (Future Upgrade Path)

  • Database Size Limit: 50 GB per database (5x increase!)
  • Buffer Pool Memory: ~1.4 GB (1410 MB) - unchanged
  • CPU Limit: Lesser of 1 socket or 4 cores - unchanged
  • Status: Recently released (November 2025), some container stability issues
  • Docker Image: mcr.microsoft.com/mssql/server:2025-latest (use with caution until fully stable)
  • Major Benefits:
    • 50 GB database size provides significant growth headroom
    • Built-in full-text search, reporting, and ML features (previously "Advanced Services")
    • New VECTOR data type for AI/ML similarity search
    • Native JSON enhancements
    • Built-in regex support

Recommendation for This Project:

Phase 1 (Immediate): Migrate to SQL Server Express 2022

  • Proven stability for CI/CD and production
  • 10 GB limit is likely sufficient for a school-sized party registration system
  • All code examples in this report use 2022-compatible syntax

Phase 2 (6-12 months): Upgrade to SQL Server Express 2025

  • Monitor for stable release and community feedback
  • Upgrade when 2025-GA tag is available and proven
  • Gain 50 GB database capacity for long-term growth
  • No code changes required - just update Docker image tag

Database Size Monitoring:

  • Set up alerts when the database reaches 7 GB (70% of the 10 GB limit); see the size-check sketch below
  • Monitor growth rate to plan 2025 upgrade timing
  • 50 GB limit in Express 2025 provides 5x headroom
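A minimal sketch of a size check that could back such an alert, using the sync connection string from section 2. The settings import path and the 7 GB threshold are assumptions; wiring this into an actual alerting system is out of scope. The query sums data-file pages from sys.database_files (the Express limit applies to data files, not the log).

from sqlalchemy import create_engine, text

from src.core.config import settings  # assumed import path

engine = create_engine(settings.database_url_sync)

with engine.connect() as conn:
    # size is reported in 8 KB pages; type = 0 restricts to data files
    size_gb = conn.execute(
        text("SELECT SUM(CAST(size AS BIGINT)) * 8.0 / 1024 / 1024 "
             "FROM sys.database_files WHERE type = 0")
    ).scalar()
    if size_gb and size_gb > 7:
        print(f"WARNING: database is {size_gb:.1f} GB (over 70% of the 10 GB Express limit)")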

Other Limitations (Both Versions):

  • No SQL Server Agent (use OS task scheduler or external job systems)
  • No high availability features (no failover clustering or Always On)
  • Single instance only
  • Buffer pool memory capped at ~1.4 GB

Final Assessment: SQL Server Express (2022 or 2025) is well-suited for this school party registration application. Start with the stable 2022 version and plan an upgrade to 2025 once it stabilizes to gain the 50 GB database capacity benefit.
