fix: resolve critical framework issues and improve stability #20

base: master

Conversation
Refactor core to use Starlette consistently, resolving race conditions. Implement robust error handling for unique constraints and missing resources. Add flexible datetime parsing for model fields. Fix Swagger/OpenAPI documentation generation and add ReDoc support. Improve docstrings and update documentation to reflect changes.
Note: CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough
Refactors LightApi to model-based CRUD registration and Starlette/ASGI handlers, adds get_app and debug handling, integrates Swagger UI and new ReDoc docs, introduces automatic datetime parsing in handlers and documentation, adds a database setup helper, removes legacy setup from models, updates the getting-started docs and troubleshooting guide, and adds the python-dateutil dependency.
Sequence Diagram(s)

sequenceDiagram
autonumber
participant Dev as Developer
participant API as LightApi
participant SW as SwaggerGenerator
participant APP as Starlette App
Dev->>API: __init__(debug, enable_swagger, ...)
Dev->>API: register(ModelClass)
API->>API: Generate CRUD routes for ModelClass
API->>SW: Register endpoints for OpenAPI
Dev->>API: get_app()
API->>APP: Build Starlette with /docs, /redoc, /openapi.json (if enabled)
Dev->>APP: run()
sequenceDiagram
autonumber
participant C as Client
participant APP as Starlette
participant H as Handler (ASGI)
participant DB as Session
C->>APP: HTTP Request (e.g., POST /models)
APP->>H: scope, receive, send
H->>H: Parse JSON, parse dates (dateutil)
H->>DB: Persist/query via SQLAlchemy
DB-->>H: Result or exception
alt Success
H-->>APP: JSONResponse 200/201 with serialized item
else Validation/Conflict
H-->>APP: JSONResponse 400/409 with error details
else Server Error
H-->>APP: JSONResponse 500 (debug includes details)
end
sequenceDiagram
autonumber
participant User as User
participant APP as Starlette
participant SW as SwaggerGenerator
User->>APP: GET /docs
APP->>SW: get_swagger_ui()
SW-->>APP: HTMLResponse
User->>APP: GET /redoc
APP->>SW: get_redoc_html()
SW-->>APP: HTMLResponse
User->>APP: GET /openapi.json
APP->>SW: get_openapi_json()
SW-->>APP: JSON
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Pull Request Overview
This PR refactors the core framework to use Starlette consistently, resolving race conditions and improving stability. Key changes include migrating from aiohttp to Starlette, implementing robust error handling with better constraint violation messages, adding flexible datetime parsing via dateutil, fixing Swagger/OpenAPI documentation generation with ReDoc support, and updating documentation and docstrings throughout.
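To make the refactored surface concrete, here is a minimal usage sketch assembled from the API described in this PR (register, get_app/run, and the /docs, /redoc, /openapi.json routes). The model fields, import paths, and constructor arguments are illustrative assumptions, not code taken from the repository.

# Hypothetical usage sketch; names and arguments are assumptions for illustration.
from sqlalchemy import Column, DateTime, Integer, String

from lightapi.core import LightApi      # assumed import path
from lightapi.database import Base


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    email = Column(String, unique=True)   # unique violations surface as 409 responses
    created_at = Column(DateTime)         # string values are parsed via python-dateutil


app = LightApi(debug=True, enable_swagger=True)  # parameters as shown in the sequence diagrams
app.register(User)                               # generates the CRUD routes for this model

if __name__ == "__main__":
    # run() creates the tables and starts the server; get_app() returns the underlying
    # Starlette application with /docs, /redoc and /openapi.json when docs are enabled.
    app.run()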
Key Changes
- Migrated HTTP handling from aiohttp to Starlette ASGI framework for consistency
- Enhanced error handling with specific messages for unique constraints and database errors
- Added flexible datetime parsing using the python-dateutil library for better format support
Reviewed Changes
Copilot reviewed 9 out of 10 changed files in this pull request and generated 4 comments.
Summary per file:
File | Description |
---|---|
lightapi/swagger.py | Updated docstrings and added ReDoc support with new route handler |
lightapi/models.py | Removed setup_database function (moved to database.py) |
lightapi/handlers.py | Major refactor to use Starlette instead of aiohttp with improved error handling |
lightapi/database.py | Added moved setup_database function and updated docstrings |
lightapi/core.py | Streamlined LightApi class with Starlette integration and simplified registration |
docs/troubleshooting.md | Added comprehensive error response documentation |
docs/getting-started/quickstart.md | Updated registration syntax and added ReDoc documentation |
docs/getting-started/first-steps.md | Updated registration syntax to use individual register calls |
docs/advanced/validation.md | Added documentation for automatic datetime parsing feature |
import datetime
import json
import logging
import re
Copilot (AI) on Oct 3, 2025:

[nitpick] The re module is imported but only used in one specific error handling case. Consider moving this import closer to where it's used or adding a comment explaining its purpose for better code organization.
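If the maintainers prefer the localized import suggested here, the shape could look like the sketch below; the helper name and regex are illustrative, not taken from the codebase.

def _unique_violation_column(error_message: str):
    """Best-effort extraction of the offending column from a SQLite-style error message."""
    # Imported locally because `re` is only needed on this error-handling path.
    import re

    match = re.search(r"UNIQUE constraint failed:\s*([\w.]+)", error_message)
    return match.group(1) if match else None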
except ValueError:
    return web.json_response({"error": f"Invalid date format for field '{col.name}'"}, status=400)
    return JSONResponse({"error": f"Invalid date format for field '{col.name}'"}, status_code=400)
Copilot (AI) on Oct 3, 2025:

There is trailing whitespace on this line that should be removed for code cleanliness.
            .filter(getattr(self.model, pk_col) == self._parse_pk_value(pk_value, getattr(self.model, pk_col)))
            .first()
        )
        pk_col = "id"  # Assuming 'id' for now
Copilot (AI) on Oct 3, 2025:

Hard-coding the primary key column name as 'id' with a TODO-style comment indicates incomplete implementation. This should be made more robust to handle different primary key column names, or documented as a current limitation.
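For reference, SQLAlchemy can report a model's primary key columns directly, which avoids hard-coding 'id'. The snippet below is a generic sketch, not the project's implementation.

from sqlalchemy import inspect


def primary_key_column(model):
    """Return the single primary-key column of a mapped model, rejecting composite keys."""
    pk_columns = inspect(model).primary_key  # tuple of Column objects
    if len(pk_columns) != 1:
        raise ValueError("Composite primary keys need dedicated handling")
    return pk_columns[0]


# Usage sketch: build a filter without assuming the column is named "id".
# pk_col = primary_key_column(SomeModel)
# item = session.query(SomeModel).filter(pk_col == pk_value).first()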
class PatchHandler(UpdateHandler):
    """Handles HTTP PATCH requests to partially update an existing item."""
    pass
Copilot (AI) on Oct 3, 2025:

The PatchHandler inherits from UpdateHandler but doesn't override any behavior. This means PATCH requests will perform full updates instead of partial updates, which violates HTTP semantics. Either implement proper partial update logic or document this limitation.
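For context, conventional PATCH semantics apply only the fields present in the request body. A minimal, framework-agnostic sketch (names are illustrative) looks like this:

def apply_partial_update(item, payload: dict, model) -> None:
    """Set only the columns that appear in the payload, leaving other fields untouched."""
    known_columns = {column.name for column in model.__table__.columns}
    for name, value in payload.items():
        if name in known_columns:  # ignore unknown keys instead of failing
            setattr(item, name, value)
    # A PUT-style full update would instead expect every writable column to be supplied.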
Actionable comments posted: 7

Caution: Some comments are outside the diff and can't be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
lightapi/core.py (1)

179-219: LGTM! Run method properly orchestrates application startup.

The run() method correctly:
- Updates configuration with provided parameters
- Prints registered endpoints
- Creates database tables before starting the server
- Uses the new get_app() method for app construction

Address type hints for optional parameters as flagged by static analysis:

 def run(
     self,
-    host: str = None,
-    port: int = None,
-    debug: bool = None,
-    reload: bool = None,
+    host: Optional[str] = None,
+    port: Optional[int] = None,
+    debug: Optional[bool] = None,
+    reload: Optional[bool] = None,
 ):

Don't forget to import Optional from typing at the top of the file.
♻️ Duplicate comments (4)

lightapi/handlers.py (4)

206-206: Remove trailing whitespace

There's trailing whitespace on this line. Please remove it.
285-287: PATCH currently behaves like PUT; implement partial update semantics

As previously noted, inheriting UpdateHandler without changes makes PATCH identical to PUT.

Apply this diff:

-class PatchHandler(UpdateHandler):
-    """Handles HTTP PATCH requests to partially update an existing item."""
-    pass
+class PatchHandler(AbstractHandler):
+    """Handles HTTP PATCH requests to partially update an existing item."""
+
+    async def handle(self, db, request):
+        mapper = inspect(self.model)
+        pk_cols = mapper.primary_key
+        if len(pk_cols) != 1:
+            return self.json_error_response("Composite primary keys are not supported yet", status=400)
+        pk_col = pk_cols[0]
+        raw = request.path_params.get(pk_col.name)
+        if raw is None:
+            return self.json_error_response(f"Missing path parameter '{pk_col.name}'", status=400)
+        pk_value = self._parse_pk_value(raw, pk_col)
+        item = db.query(self.model).filter(pk_col == pk_value).first()
+        if not item:
+            return self.json_error_response(f"{self.model.__name__} with {pk_col.name} {pk_value} not found", status=404)
+
+        data = await self.get_request_json(request)
+        if isinstance(data, JSONResponse):
+            return data
+        col_index = {c.name: c for c in self.model.__table__.columns}
+        for name, val in data.items():
+            col = col_index.get(name)
+            if not col:
+                continue
+            if hasattr(col.type, "python_type"):
+                if col.type.python_type is datetime.datetime and isinstance(val, str):
+                    try:
+                        val = parser.parse(val)
+                    except ValueError:
+                        return JSONResponse({"error": f"Invalid datetime format for field '{name}'"}, status_code=400)
+                elif col.type.python_type is datetime.date and isinstance(val, str):
+                    try:
+                        val = parser.parse(val).date()
+                    except ValueError:
+                        return JSONResponse({"error": f"Invalid date format for field '{name}'"}, status_code=400)
+            setattr(item, name, val)
+
+        item = self.add_and_commit_item(db, item)
+        if isinstance(item, JSONResponse):
+            return item
+        return self.json_response(item, status=200)
303-311: Same PK/typing issues here; introspect and coerce types before delete

Hard‑coded "id" may fail and won't work with non‑id PKs.

Apply this diff:

-        pk_col = "id"
-        pk_value = request.path_params.get(pk_col)
-
-        item = db.query(self.model).filter(getattr(self.model, pk_col) == pk_value).first()
+        mapper = inspect(self.model)
+        pk_cols = mapper.primary_key
+        if len(pk_cols) != 1:
+            return self.json_error_response("Composite primary keys are not supported yet", status=400)
+        pk_col = pk_cols[0]
+        raw = request.path_params.get(pk_col.name)
+        if raw is None:
+            return self.json_error_response(f"Missing path parameter '{pk_col.name}'", status=400)
+        pk_value = self._parse_pk_value(raw, pk_col)
+        item = db.query(self.model).filter(pk_col == pk_value).first()
         if not item:
-            return self.json_error_response(f"{self.model.__name__} with id {pk_value} not found", status=404)
+            return self.json_error_response(f"{self.model.__name__} with {pk_col.name} {pk_value} not found", status=404)
         self.delete_and_commit_item(db, item)
         return Response(status_code=204)
227-234: Primary key handling is hard‑coded and untyped; introspect PK and coerce types

Using a hard‑coded "id" and not coercing types can cause lookups to fail (e.g., string vs int). Introspect the PK and use _parse_pk_value.

Apply this diff:

-        pk_col = "id"  # Assuming 'id' for now
-        pk_value = request.path_params.get(pk_col)
-
-        item = db.query(self.model).filter(getattr(self.model, pk_col) == pk_value).first()
+        mapper = inspect(self.model)
+        pk_cols = mapper.primary_key
+        if len(pk_cols) != 1:
+            return self.json_error_response("Composite primary keys are not supported yet", status=400)
+        pk_col = pk_cols[0]
+        raw = request.path_params.get(pk_col.name)
+        if raw is None:
+            return self.json_error_response(f"Missing path parameter '{pk_col.name}'", status=400)
+        pk_value = self._parse_pk_value(raw, pk_col)
+        item = db.query(self.model).filter(pk_col == pk_value).first()
🧹 Nitpick comments (3)

docs/getting-started/quickstart.md (1)

133-144: LGTM! Comprehensive documentation of UI options.

The updated section clearly documents both Swagger UI and ReDoc interfaces, providing users with multiple documentation options. The content is accurate and helpful.

Consider using proper Markdown link syntax for the URLs to improve formatting:

-- **Swagger UI**: Visit **http://localhost:8000/docs** in your browser.
-- **ReDoc**: Visit **http://localhost:8000/redoc** for an alternative documentation layout.
+- **Swagger UI**: Visit [http://localhost:8000/docs](http://localhost:8000/docs) in your browser.
+- **ReDoc**: Visit [http://localhost:8000/redoc](http://localhost:8000/redoc) for an alternative documentation layout.

lightapi/handlers.py (2)
100-109: Broaden unique-violation detection; avoid brittle parsing across DBs

Current checks miss PostgreSQL messages and rely on a fragile regex. Handle common engines and include the column name when available.

Apply this diff:

-        except IntegrityError as e:
+        except IntegrityError as e:
             db.rollback()
-            if "UNIQUE constraint failed" in str(e.orig) or "Duplicate entry" in str(e.orig):
-                match = re.search(r"failed: ([\w\.]+)", str(e.orig))
-                if match:
-                    column = match.group(1)
-                    return self.json_error_response(f"Unique constraint violated for {column}.", status=409)
-                return self.json_error_response("Unique constraint violated.", status=409)
+            msg = str(e.orig)
+            if (
+                "UNIQUE constraint failed" in msg
+                or "Duplicate entry" in msg
+                or "duplicate key value violates unique constraint" in msg
+            ):
+                # Try sqlite pattern: "UNIQUE constraint failed: table.column"
+                m = re.search(r"UNIQUE constraint failed:\s*([\w\.]+)", msg)
+                # Try PostgreSQL detail: 'Key (column)=(value) already exists.'
+                m = m or re.search(r"Key\s*\(([^)]+)\)=\([^)]+\)\s*already exists", msg)
+                if m:
+                    col = m.group(1)
+                    return self.json_error_response(f"Unique constraint violated for {col}.", status=409)
+                return self.json_error_response("Unique constraint violated.", status=409)
             return self.json_error_response(f"Database integrity error: {e.orig}", status=400)
240-246: Mark the unused request parameter or use it; silence the linter

Rename to _request to satisfy the linter until you use it (e.g., for pagination/filtering).

Apply this diff:

-    async def handle(self, db, request):
+    async def handle(self, db, _request):

Optionally add pagination to avoid returning unbounded results.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)

- docs/advanced/validation.md (1 hunks)
- docs/getting-started/first-steps.md (1 hunks)
- docs/getting-started/quickstart.md (2 hunks)
- docs/troubleshooting.md (1 hunks)
- lightapi/core.py (13 hunks)
- lightapi/database.py (3 hunks)
- lightapi/handlers.py (3 hunks)
- lightapi/models.py (0 hunks)
- lightapi/swagger.py (13 hunks)
- requirements.txt (1 hunks)
💤 Files with no reviewable changes (1)
- lightapi/models.py
🧰 Additional context used

🧬 Code graph analysis (5)

docs/getting-started/first-steps.md (1)
- examples/rest_crud_basic.py (1): User (10-23)

lightapi/database.py (2)
- lightapi/models.py (4): serialize (83-93), serialize (128-138), serialize (173-183), setup_database (11-33)
- lightapi/lightapi.py (2): serialize (563-576), __init__ (70-126)

lightapi/swagger.py (1)
- tests/test_swagger.py (2): TestSwaggerRoutes (82-107), test_swagger_ui_route (83-94)

lightapi/core.py (2)
- lightapi/database.py (2): Base (29-84), setup_database (10-22)
- lightapi/handlers.py (6): CreateHandler (169-211), RetrieveAllHandler (237-252), ReadHandler (214-234), UpdateHandler (255-282), PatchHandler (285-287), DeleteHandler (290-311)

lightapi/handlers.py (3)
- lightapi/core.py (1): Response (222-337)
- lightapi/database.py (2): Base (29-84), serialize (70-84)
- lightapi/lightapi.py (5): handle (607-670), serialize (563-576), data (192-203), data (331-340), starlette_handler (165-267)
🪛 GitHub Actions: Development Tests

lightapi/database.py
- [error] 1-1: Imports are incorrectly sorted and/or formatted.
- [error] 1-1: Imports are incorrectly sorted and/or formatted.

lightapi/swagger.py
- [error] 1-1: Imports are incorrectly sorted and/or formatted.

lightapi/core.py
- [error] 1-1: Imports are incorrectly sorted and/or formatted.

lightapi/handlers.py
- [error] 1-1: Imports are incorrectly sorted and/or formatted.

🪛 markdownlint-cli2 (0.18.1)

docs/troubleshooting.md
- 38-38: Emphasis used instead of a heading (MD036, no-emphasis-as-heading)

docs/getting-started/quickstart.md
- 137-137: Bare URL used (MD034, no-bare-urls)
- 138-138: Bare URL used (MD034, no-bare-urls)

🪛 Ruff (0.13.2)

lightapi/core.py
- 181-181: PEP 484 prohibits implicit Optional; convert to Optional[T] (RUF013)
- 182-182: PEP 484 prohibits implicit Optional; convert to Optional[T] (RUF013)
- 183-183: PEP 484 prohibits implicit Optional; convert to Optional[T] (RUF013)
- 184-184: PEP 484 prohibits implicit Optional; convert to Optional[T] (RUF013)
- 347-347: Unused method argument: request (ARG002)

lightapi/handlers.py
- 63-63: Do not catch blind exception: Exception (BLE001)
- 64-64: Use logging.exception instead of logging.error; replace with exception (TRY400)
- 99-99: Consider moving this statement to an else block (TRY300)
- 240-240: Unused method argument: request (ARG002)
🔇 Additional comments (16)
requirements.txt (1)

10-10: LGTM! New dependency aligns with the datetime parsing feature.

The addition of python-dateutil==2.9.0.post0 supports the new automatic datetime parsing functionality described in the documentation. This version is a stable packaging-fix release with no breaking changes. Based on learnings.
docs/advanced/validation.md (1)

54-63: LGTM! Clear documentation of the datetime parsing feature.

The new section accurately describes the automatic datetime parsing functionality, including supported formats and error handling. This complements the python-dateutil dependency addition.
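As an illustration of the flexibility python-dateutil provides (independent of LightApi's exact field handling), dateutil.parser accepts a range of common formats without an explicit format string:

from dateutil import parser

print(parser.parse("2024-01-15T10:30:00"))  # ISO 8601
print(parser.parse("2024-01-15 10:30"))     # space-separated
print(parser.parse("Jan 15, 2024"))         # human-readable date

try:
    parser.parse("not-a-date")
except (ValueError, OverflowError):
    # The handlers translate this into a 400 response naming the offending field.
    print("invalid datetime input")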
docs/troubleshooting.md (1)

12-48: LGTM! Comprehensive error response documentation.

The new "Understanding Error Responses" section provides clear examples of validation errors and server errors, including debug mode behavior and JSON response formats. This will significantly improve the developer troubleshooting experience.
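For orientation, the error bodies discussed in this review have roughly the following shapes; the payloads below are illustrative Python literals whose messages mirror the handler code quoted elsewhere in this review.

validation_error = {"error": "Invalid datetime format for field 'created_at'"}  # HTTP 400
conflict_error = {"error": "Unique constraint violated for users.email."}       # HTTP 409
server_error = {"error": "Internal Server Error"}                               # HTTP 500 (debug mode may add message/traceback)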
lightapi/database.py (2)

10-22: LGTM! Database setup centralization is well-designed.

The new setup_database function centralizes database connection logic. Table creation is appropriately handled in the application startup flow (LightApi.run() at line 209 in core.py), maintaining clear separation of concerns.
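The exact signature of setup_database is not shown in this review, but a helper of this kind typically looks like the following sketch (assumed shape, not the project's implementation):

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker


def setup_database(database_url: str = "sqlite:///./app.db"):
    """Create the engine and session factory; table creation stays in the app startup path."""
    engine = create_engine(database_url)
    session_factory = sessionmaker(bind=engine, autoflush=False, autocommit=False)
    return engine, session_factory


# Table creation is performed separately at startup, e.g. in LightApi.run():
# Base.metadata.create_all(bind=engine)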
1-7: Imports correctly sorted. No changes required.

docs/getting-started/first-steps.md (1)
54-54: LGTM! Documentation reflects the simplified registration API.

The updated example correctly demonstrates the new model-based registration pattern app.register(User), which is clearer and more intuitive than the previous dictionary-based approach.

docs/getting-started/quickstart.md (1)

122-123: LGTM! Registration pattern matches the new API.

The updated examples correctly demonstrate registering each model individually with app.register(User) and app.register(Post), reflecting the new model-based registration API.

lightapi/core.py (5)
27-82: LGTM! Initialization refactored for centralized database setup.

The __init__ method now correctly uses setup_database() from the database module and stores the debug parameter for later use. The addition of ReDoc routes (/docs, /redoc, /openapi.json) properly integrates the new documentation interface.

115-177: LGTM! Application construction extracted to the get_app() method.

The new get_app() method properly encapsulates Starlette application creation, including:
- Debug mode configuration
- Route registration
- Conditional CORS middleware setup
- Swagger generator state attachment

The _print_endpoints() method correctly filters system routes and displays both documentation interfaces.

222-338: LGTM! Response class handles test and production contexts.

The custom Response class properly extends Starlette's JSONResponse with test compatibility logic. The intricate __getattribute__ override ensures correct behavior in both test and production environments, handling various edge cases for body content access.

340-488: LGTM! Middleware implementations are robust.

The middleware classes are well-implemented:
- The base Middleware.process() provides the interface contract for subclasses
- CORSMiddleware correctly handles preflight requests and creates new responses to avoid content-length issues
- AuthenticationMiddleware properly integrates with authentication classes

The static analysis warning about the unused request parameter in the base Middleware.process() method is expected and correct for this design pattern, as subclasses utilize both parameters.
83-105: LGTM! Model-based registration simplifies endpoint setup.

The new register() method elegantly generates CRUD routes from a SQLAlchemy model class:
- Creates collection and item routes with appropriate paths
- Wires all CRUD handlers (POST, GET, PUT, PATCH, DELETE)
- Integrates with Swagger documentation when enabled

A sketch of the generated wiring follows below.
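The wiring register() performs presumably resembles the following Starlette route setup. This is a hypothetical reconstruction, not code from core.py: the crud_routes helper is invented, the paths assume __tablename__-based naming, and the handler constructor signature is assumed from the legacy create_handler helper quoted later in this review.

from starlette.routing import Route

from lightapi.database import SessionLocal
from lightapi.handlers import (CreateHandler, DeleteHandler, PatchHandler,
                               ReadHandler, RetrieveAllHandler, UpdateHandler)


def crud_routes(model, session_factory=SessionLocal):
    """Sketch of the collection and item routes register() is described as creating."""
    collection = f"/{model.__tablename__}/"
    item = f"/{model.__tablename__}/{{id}}"
    return [
        Route(collection, CreateHandler(model, session_factory), methods=["POST"]),
        Route(collection, RetrieveAllHandler(model, session_factory), methods=["GET"]),
        Route(item, ReadHandler(model, session_factory), methods=["GET"]),
        Route(item, UpdateHandler(model, session_factory), methods=["PUT"]),
        Route(item, PatchHandler(model, session_factory), methods=["PATCH"]),
        Route(item, DeleteHandler(model, session_factory), methods=["DELETE"]),
    ]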
lightapi/swagger.py (2)

1-10: Imports properly sorted. Ruff import checks pass; no changes needed.

255-327: Add test coverage for the ReDoc UI route

Add a test_redoc_ui_route in tests/test_swagger.py, mirroring the existing tests for swagger_ui_route and openapi_json_route.
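A sketch of such a test, assuming the existing Swagger tests can share their app construction (the build_test_app helper here is hypothetical and stands in for whatever fixture test_swagger_ui_route uses):

from starlette.testclient import TestClient


def test_redoc_ui_route():
    app = build_test_app()  # hypothetical helper mirroring the existing Swagger UI test setup
    client = TestClient(app)

    response = client.get("/redoc")

    assert response.status_code == 200
    assert "text/html" in response.headers["content-type"]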
192-205: Datetime parsing via python-dateutil looks good

Using dateutil.parser for flexible datetime/date parsing is correct and aligns with the PR goals.

4-4: The re import nit is fine as-is

The re module is used in IntegrityError handling. Keeping it at the top is acceptable.
import json
from inspect import iscoroutinefunction
from typing import TYPE_CHECKING, Any, Callable, Dict, List, Type
from typing import Any, Callable, Dict, List, Type

import uvicorn
from starlette.applications import Starlette

from starlette.middleware.cors import CORSMiddleware as StarletteCORSMiddleware
from starlette.responses import JSONResponse
from starlette.routing import Route

from .config import config
from .models import setup_database

if TYPE_CHECKING:
    from .rest import RestEndpoint
from .database import Base, setup_database
🛠️ Refactor suggestion | 🟠 Major
🧩 Analysis chain
Address import sorting issue flagged by pipeline.
The pipeline detected improperly sorted imports. Ensure imports follow the standard convention: standard library, third-party packages, then local modules.
Run the following script to check and fix import order:
Expected fix:
import json
from typing import Any, Callable, Dict, List, Type
import uvicorn
from starlette.applications import Starlette
from starlette.middleware.cors import CORSMiddleware as StarletteCORSMiddleware
from starlette.responses import JSONResponse
from starlette.routing import Route
from .config import config
from .database import Base, setup_database
🏁 Script executed:
#!/bin/bash
# Show current imports
head -15 lightapi/core.py
# Check with ruff
ruff check --select I lightapi/core.py --no-cache || true
Length of output: 1230
Sort import block flagged by pipeline
Ruff reports an unsorted import at line 72 in lightapi/core.py. Reorder the symbols alphabetically or run ruff --fix:
-from .swagger import SwaggerGenerator, openapi_json_route, redoc_ui_route, swagger_ui_route
+from .swagger import openapi_json_route, redoc_ui_route, swagger_ui_route, SwaggerGenerator
🧰 Tools
🪛 GitHub Actions: Development Tests
[error] 1-1: Imports are incorrectly sorted and/or formatted.
🤖 Prompt for AI Agents
In lightapi/core.py around lines 1 to 12, the top-level import block is not
alphabetically ordered per Ruff; reorder the imports so symbols and modules are
alphabetized (group stdlib first, third‑party next, local last) and within each
group sort names (e.g., ensure Any, Callable, Dict, List, Type are in
alphabetical order, and Starlette imports and others are sorted), or simply run
`ruff --fix` to automatically apply the correct ordering.
import datetime
import json
import logging
import re
import traceback
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Type

from aiohttp import web
from dateutil import parser
from sqlalchemy import inspect
from sqlalchemy.exc import IntegrityError, StatementError
from sqlalchemy.orm import Session, sessionmaker
from starlette.responses import JSONResponse, Response

from lightapi.database import Base, SessionLocal
🛠️ Refactor suggestion | 🟠 Major
🧩 Analysis chain
Fix import sorting (pipeline failure) and use project JSONResponse for test compatibility
- Resolve the import-sorting failure.
- Prefer the project's custom JSONResponse (lightapi.core.Response) to keep test compatibility (decode(), _test_content), while retaining Starlette's Response for 204.
Apply this diff:
import datetime
import json
-import logging
-import re
-import traceback
+import logging
+import re
+import traceback
+import warnings
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Type
-from dateutil import parser
+from dateutil import parser
from sqlalchemy import inspect
from sqlalchemy.exc import IntegrityError, StatementError
from sqlalchemy.orm import Session, sessionmaker
-from starlette.responses import JSONResponse, Response
+from starlette.responses import Response
+from lightapi.core import Response as JSONResponse
from lightapi.database import Base, SessionLocal
Please re-run your formatter/linter (ruff/isort/black as configured) to confirm the pipeline passes.
🏁 Script executed:
#!/bin/bash
# Locate Response in core
rg -n "class Response" --glob "lightapi/core.py"
# Locate JSONResponse references in core
rg -n "JSONResponse" --glob "lightapi/core.py"
# Preview current imports in handlers.py
sed -n '1,20p' lightapi/handlers.py
Length of output: 1249
🏁 Script executed:
#!/bin/bash
# Check usage of JSONResponse in handlers.py
rg -n "JSONResponse" --glob "lightapi/handlers.py"
# Check usage of generic Response(...) in handlers.py
rg -n "Response\(" --glob "lightapi/handlers.py"
# Check if warnings module is used anywhere in handlers.py
rg -n "warnings" --glob "lightapi/handlers.py"
Length of output: 2411
Use project JSONResponse and fix import sorting

- Remove JSONResponse from the starlette import and keep only Response: from starlette.responses import Response
- Add in the local imports group (below from lightapi.database): from lightapi.core import Response as JSONResponse
- Insert import warnings in the stdlib imports section
- Ensure imports are grouped and sorted as stdlib → third-party → local
- Rerun isort/black and confirm the pipeline passes (a possible resulting import block is sketched below)
🧰 Tools
🪛 GitHub Actions: Development Tests
[error] 1-1: Imports are incorrectly sorted and/or formatted.
🤖 Prompt for AI Agents
In lightapi/handlers.py around lines 1 to 16, update the imports: remove
JSONResponse from the starlette import so only Response is imported from
starlette.responses; add import warnings to the stdlib imports section; add a
local import line below from lightapi.database to import JSONResponse as
Response from lightapi.core (i.e. from lightapi.core import Response as
JSONResponse); then reorder and group all imports as stdlib → third-party →
local and run isort/black to reformat and confirm the pipeline passes.
def create_handler(model: Type[Base], session_factory=SessionLocal):
    """Creates a list of route handlers for the given model.

    Args:
        model: The SQLAlchemy model class to create handlers for.
        session_factory: The session factory to use for database connections.
    """
    return [
        web.post(f"/{model.__tablename__}/", CreateHandler(model, session_factory)),
        web.get(f"/{model.__tablename__}/", RetrieveAllHandler(model, session_factory)),
        web.get(f"/{model.__tablename__}/{{id}}", ReadHandler(model, session_factory)),
        web.put(f"/{model.__tablename__}/{{id}}", UpdateHandler(model, session_factory)),
        web.delete(f"/{model.__tablename__}/{{id}}", DeleteHandler(model, session_factory)),
        web.patch(f"/{model.__tablename__}/{{id}}", PatchHandler(model, session_factory)),
    ]
    # This function is no longer the primary way of creating routes,
    # but we'll keep it for now and adapt it.
    # The actual route creation will be handled in LightApi.register
    pass
🛠️ Refactor suggestion | 🟠 Major
🧩 Analysis chain
Do not leave create_handler as a no-op; deprecate or provide a backward-compatible stub

Leaving this as pass silently breaks existing callers. Emit a DeprecationWarning and return handler instances for backward compatibility.
Apply this diff:
def create_handler(model: Type[Base], session_factory=SessionLocal):
"""Creates a list of route handlers for the given model.
Args:
model: The SQLAlchemy model class to create handlers for.
session_factory: The session factory to use for database connections.
"""
- # This function is no longer the primary way of creating routes,
- # but we'll keep it for now and adapt it.
- # The actual route creation will be handled in LightApi.register
- pass
+ # This function is no longer the primary way of creating routes.
+ # The actual route creation is handled in LightApi.register.
+ warnings.warn(
+ "create_handler is deprecated; use LightApi.register(...) instead",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ return [
+ CreateHandler(model=model, session_factory=session_factory),
+ RetrieveAllHandler(model=model, session_factory=session_factory),
+ ReadHandler(model=model, session_factory=session_factory),
+ UpdateHandler(model=model, session_factory=session_factory),
+ PatchHandler(model=model, session_factory=session_factory),
+ DeleteHandler(model=model, session_factory=session_factory),
+ ]
🏁 Script executed:
#!/bin/bash
# Find any current usages of create_handler to assess breakage risk.
rg -nP '\bcreate_handler\s*\(' -C2
Length of output: 654
🏁 Script executed:
#!/bin/bash
rg -nP --type=py 'class (CreateHandler|RetrieveAllHandler|ReadHandler|UpdateHandler|PatchHandler|DeleteHandler)' -C2 lightapi/handlers.py
rg -nP 'import warnings' -n lightapi/handlers.py
Length of output: 773
Deprecate create_handler with backward-compatible stub
In lightapi/handlers.py, add import warnings at the top and replace the no-op stub with a warning and a return of the existing handlers to preserve callers:
+import warnings
…
def create_handler(model: Type[Base], session_factory=SessionLocal):
"""Creates a list of route handlers for the given model."""
- pass
+ warnings.warn(
+ "create_handler is deprecated; use LightApi.register(...) instead",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ return [
+ CreateHandler(model=model, session_factory=session_factory),
+ RetrieveAllHandler(model=model, session_factory=session_factory),
+ ReadHandler(model=model, session_factory=session_factory),
+ UpdateHandler(model=model, session_factory=session_factory),
+ PatchHandler(model=model, session_factory=session_factory),
+ DeleteHandler(model=model, session_factory=session_factory),
+ ]
🤖 Prompt for AI Agents
In lightapi/handlers.py around lines 19-29, import warnings at the top and
replace the no-op create_handler stub with a backward-compatible implementation
that emits a DeprecationWarning and returns the existing handlers; specifically,
add "import warnings" and call the current route-creation helper (for example,
import and call a function like get_handlers from lightapi.routes or the
equivalent new API) inside create_handler, first issuing
warnings.warn("create_handler is deprecated; use LightApi.register (or new API)
instead", DeprecationWarning) and then returning the handlers from that helper
so existing callers keep working.
    async def __call__(self, scope, receive, send):
        """Makes the handler a callable ASGI application.

        Args:
            request (web.Request): The aiohttp web request object.
        Returns:
            web.Response: The response returned by the handler.
            scope: The ASGI scope.
            receive: The ASGI receive channel.
            send: The ASGI send channel.
        """
        from starlette.requests import Request
        request = Request(scope, receive)
        db: Session = self.session_factory()
        try:
            return await self.handle(db, request)
            response = await self.handle(db, request)
            await response(scope, receive, send)
        except Exception as e:
            logging.error(f"Unhandled exception: {e}\n{traceback.format_exc()}")
            # TODO: check for debug mode from app config
            response = JSONResponse(
                {"error": "Internal Server Error", "message": str(e), "traceback": traceback.format_exc()},
                status_code=500,
            )
            await response(scope, receive, send)
        finally:
Don’t leak tracebacks in responses; log with logging.exception and gate details behind debug

Catching broad Exception is acceptable at ASGI boundaries, but avoid returning stack traces. Use logging.exception and include details only if app debug is enabled.
Apply this diff:
try:
response = await self.handle(db, request)
await response(scope, receive, send)
- except Exception as e:
- logging.error(f"Unhandled exception: {e}\n{traceback.format_exc()}")
- # TODO: check for debug mode from app config
- response = JSONResponse(
- {"error": "Internal Server Error", "message": str(e), "traceback": traceback.format_exc()},
- status_code=500,
- )
+ except Exception as e:
+ logging.exception("Unhandled exception during request handling")
+ # Only expose details when debug is enabled
+ debug = False
+ try:
+ app = getattr(request, "app", None)
+ debug = bool(getattr(app, "debug", False) or getattr(getattr(app, "state", None), "debug", False))
+ except Exception:
+ pass
+ payload = {"error": "Internal Server Error"}
+ if debug:
+ payload.update({"message": str(e), "traceback": traceback.format_exc()})
+ response = JSONResponse(payload, status_code=500)
await response(scope, receive, send)
🧰 Tools
🪛 Ruff (0.13.2)
- 63-63: Do not catch blind exception: Exception (BLE001)
- 64-64: Use logging.exception instead of logging.error; replace with exception (TRY400)
🤖 Prompt for AI Agents
In lightapi/handlers.py around lines 49 to 71, the current exception handler
logs with logging.error and returns full tracebacks to clients; change it to use
logging.exception to capture stack info in logs and return a generic 500
response body without the traceback by default. Retrieve the app debug flag from
the ASGI scope (e.g., scope.get("app").state.debug or scope.get("app").debug)
and, if debug is true, include the error message and traceback in the JSON
response; otherwise return only {"error": "Internal Server Error"}. Ensure the
exception handler no longer embeds traceback.format_exc() in the production
response and still awaits sending the response, and keep the finally block to
close the DB session as before.
    async def get_request_json(self, request):
        """Extracts JSON data from the request body.

        Args:
            request (web.Request): The aiohttp web request object.
            request: The Starlette request object.
        Returns:
            dict: The JSON data from the request body.
            A dictionary of the JSON body.
        """
        return await request.json()
🛠️ Refactor suggestion | 🟠 Major
Return 400 on malformed JSON instead of 500
Handle JSON decoding errors and return a 400 error. Callers of this method should check for a JSONResponse and short‑circuit.
Apply this diff:
async def get_request_json(self, request):
"""Extracts JSON data from the request body.
@@
Returns:
A dictionary of the JSON body.
"""
- return await request.json()
+ try:
+ return await request.json()
+ except (json.JSONDecodeError, UnicodeDecodeError, ValueError):
+ return self.json_error_response("Malformed JSON body", status=400)
Follow-up: after calling get_request_json in handlers, check for JSONResponse and return it immediately (see separate comments at Lines 182-183 and 275-276).
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
async def get_request_json(self, request): | |
"""Extracts JSON data from the request body. | |
Args: | |
request (web.Request): The aiohttp web request object. | |
request: The Starlette request object. | |
Returns: | |
dict: The JSON data from the request body. | |
A dictionary of the JSON body. | |
""" | |
return await request.json() | |
async def get_request_json(self, request): | |
"""Extracts JSON data from the request body. | |
Args: | |
request: The Starlette request object. | |
Returns: | |
A dictionary of the JSON body. | |
""" | |
try: | |
return await request.json() | |
except (json.JSONDecodeError, UnicodeDecodeError, ValueError): | |
return self.json_error_response("Malformed JSON body", status=400) |
🤖 Prompt for AI Agents
In lightapi/handlers.py around lines 74 to 83, get_request_json currently awaits
request.json() and will propagate a JSON decoding exception as a 500; modify it
to catch JSON decoding errors (e.g., json.JSONDecodeError or ValueError from
Starlette) and return a Starlette JSONResponse with status_code=400 and a clear
error message (e.g., {"error":"Malformed JSON"}). Keep successful behavior
returning the parsed dict. Also note callers that invoke get_request_json (see
follow-up lines ~182-183 and ~275-276) must check if the returned value is a
JSONResponse and immediately return it to short‑circuit handling.
        data = await self.get_request_json(request)
🛠️ Refactor suggestion | 🟠 Major
Short‑circuit on malformed JSON from get_request_json
After calling get_request_json, handle the JSONResponse error case.
Apply this diff:
- data = await self.get_request_json(request)
+ data = await self.get_request_json(request)
+ if isinstance(data, JSONResponse):
+ return data
🤖 Prompt for AI Agents
In lightapi/handlers.py around lines 182-183, after calling data = await
self.get_request_json(request) short-circuit and return immediately when
get_request_json returned an error response; specifically check if data is an
instance of the error response type (e.g., fastapi.responses.JSONResponse or
starlette.responses.Response) and if so return data instead of continuing
processing, otherwise proceed with using the parsed JSON.
        pk_col = "id"
        pk_value = request.path_params.get(pk_col)

        item = db.query(self.model).filter(getattr(self.model, pk_col) == pk_value).first()
        if not item:
            return self.json_error_response("Item not found", status=404)
            return self.json_error_response(f"{self.model.__name__} with id {pk_value} not found", status=404)

        data = await self.get_request_json(request)

        for col in self.model.__table__.columns:
            if col.name in data:
                val = data[col.name]
                if hasattr(col.type, "python_type"):
                    if col.type.python_type is datetime.datetime and isinstance(val, str):
                        try:
                            data[col.name] = datetime.datetime.fromisoformat(val)
                        except ValueError:
                            return web.json_response({"error": f"Invalid datetime format for field '{col.name}'"}, status=400)
                    elif col.type.python_type is datetime.date and isinstance(val, str):
                        try:
                            data[col.name] = datetime.date.fromisoformat(val)
                        except ValueError:
                            return web.json_response({"error": f"Invalid date format for field '{col.name}'"}, status=400)
        for key, value in data.items():
            setattr(item, key, value)

        item = self.add_and_commit_item(db, item)
        if isinstance(item, web.Response):
        if isinstance(item, JSONResponse):
            return item
        return self.json_response(item, status=200)
Same PK/typing issues as GET; also normalize incoming types and ignore unknown fields
- Introspect and coerce PK.
- Normalize datetime/date fields like in Create.
- Update only known columns.
Apply this diff:
- pk_col = "id"
- pk_value = request.path_params.get(pk_col)
-
- item = db.query(self.model).filter(getattr(self.model, pk_col) == pk_value).first()
+ mapper = inspect(self.model)
+ pk_cols = mapper.primary_key
+ if len(pk_cols) != 1:
+ return self.json_error_response("Composite primary keys are not supported yet", status=400)
+ pk_col = pk_cols[0]
+ raw = request.path_params.get(pk_col.name)
+ if raw is None:
+ return self.json_error_response(f"Missing path parameter '{pk_col.name}'", status=400)
+ pk_value = self._parse_pk_value(raw, pk_col)
+ item = db.query(self.model).filter(pk_col == pk_value).first()
if not item:
- return self.json_error_response(f"{self.model.__name__} with id {pk_value} not found", status=404)
+ return self.json_error_response(f"{self.model.__name__} with {pk_col.name} {pk_value} not found", status=404)
- data = await self.get_request_json(request)
- for key, value in data.items():
- setattr(item, key, value)
+ data = await self.get_request_json(request)
+ if isinstance(data, JSONResponse):
+ return data
+ # Normalize and apply only known columns
+ col_index = {c.name: c for c in self.model.__table__.columns}
+ for name, val in list(data.items()):
+ col = col_index.get(name)
+ if not col:
+ continue
+ if hasattr(col.type, "python_type"):
+ if col.type.python_type is datetime.datetime and isinstance(val, str):
+ try:
+ val = parser.parse(val)
+ except ValueError:
+ return JSONResponse({"error": f"Invalid datetime format for field '{name}'"}, status_code=400)
+ elif col.type.python_type is datetime.date and isinstance(val, str):
+ try:
+ val = parser.parse(val).date()
+ except ValueError:
+ return JSONResponse({"error": f"Invalid date format for field '{name}'"}, status_code=400)
+ setattr(item, name, val)
item = self.add_and_commit_item(db, item)
if isinstance(item, JSONResponse):
return item
return self.json_response(item, status=200)