This topic focuses on production patterns that show up constantly:

- `database/sql` usage patterns (without depending on a real driver)
- streaming large data through `io.Reader`/`io.Writer`
- JSON streaming with `json.Decoder`
`database/sql` is an abstraction over drivers and connection pooling.

- Rows must be closed.
- Context must be propagated to IO-bound work.
- Streaming avoids loading everything into memory.
Key points:

- A `*sql.DB` is a pool, not a single connection.
- Always `defer rows.Close()`.
- Always check `rows.Err()` after iteration.
- Use context-aware methods (`QueryContext`, `ExecContext`).
Because this repo avoids external DB drivers, exercises model testable patterns by:
- defining small interfaces around query/exec
- injecting fakes/mocks
Patterns:
- begin
- do work
- commit
- rollback on error
Pitfalls:
- forgetting rollback on early return
- committing partial state
SQL NULLs require explicit handling (`sql.NullString`, etc.).
Pitfall:
- scanning NULL into a non-nullable Go type
Prefer streaming when inputs can be large:

- transform via Reader → Writer
- avoid `io.ReadAll` unless size is bounded
Use `json.Decoder` for:
- large arrays/streams
- incremental processing
Pitfalls:
- ignoring unknown fields in strict APIs
- buffering entire payloads unnecessarily
- Treating `*sql.DB` as a connection
- Forgetting `rows.Close()`/`rows.Err()`
- Using `io.ReadAll` everywhere
- Context passed into DB calls
- Rows always closed
- Transactions rollback on error
- Streaming used for large payloads
These exercises enforce:
- correct transaction semantics via early returns
- testable DB boundaries using small interfaces
- streaming transformations without full buffering
- `json.Decoder` usage patterns