Changes from 39 commits (47 commits total):

- `9338fd4` add bare .env.vertex (dsfaccini, Feb 5, 2026)
- `6dca255` env vertex (dsfaccini, Feb 5, 2026)
- `ba8ea32` add instructions and skill for testing + template env files (dsfaccini, Feb 5, 2026)
- `bdf9b54` remove dumb line (dsfaccini, Feb 5, 2026)
- `bb6f683` Update .claude/skills/pytest-vcr/run-vertex-tests.sh (dsfaccini, Feb 5, 2026)
- `74c9d47` Update .claude/skills/pytest-vcr/parse_cassette.py (dsfaccini, Feb 5, 2026)
- `3b94f7f` address comments (dsfaccini, Feb 5, 2026)
- `b860143` Merge branch 'main' into david-claude-md-changes (dsfaccini, Feb 5, 2026)
- `1a1dcbc` Update .env.vertex (dsfaccini, Feb 5, 2026)
- `264d06e` Update .claude/skills/pytest-vcr/run-vertex-tests.sh (dsfaccini, Feb 5, 2026)
- `5ce9469` Merge branch 'main' into david-claude-md-changes (dsfaccini, Feb 5, 2026)
- `de77e45` claude loves git -C (dsfaccini, Feb 5, 2026)
- `c652b7d` just export, no .env.vertex (dsfaccini, Feb 5, 2026)
- `ee64b3f` remove tests/CLAUDE.md pattern (dsfaccini, Feb 5, 2026)
- `ea7b06b` Update .claude/skills/pytest-vcr/parse_cassette.py (dsfaccini, Feb 5, 2026)
- `4a84edb` unify vertex fixture (dsfaccini, Feb 5, 2026)
- `a30e12b` address comment (dsfaccini, Feb 6, 2026)
- `a5a3aa8` Merge branch 'main' into david-claude-md-changes (dsfaccini, Feb 6, 2026)
- `8d7192d` linting (dsfaccini, Feb 6, 2026)
- `27308b4` Merge branch 'main' into david-claude-md-changes (DouweM, Feb 7, 2026)
- `6103331` Apply suggestions from code review (dsfaccini, Feb 7, 2026)
- `864db54` close fenced block (dsfaccini, Feb 7, 2026)
- `004b3f7` Add autouse guard for SSRF fixture in VCR tests (dsfaccini, Feb 9, 2026)
- `afc14d1` address comments (dsfaccini, Feb 10, 2026)
- `fe8b08c` fix tests (dsfaccini, Feb 10, 2026)
- `61fc635` replace last ci skip (dsfaccini, Feb 10, 2026)
- `a31d2ed` add comment (dsfaccini, Feb 10, 2026)
- `a7bde67` Merge branch 'main' into david-claude-md-changes (dsfaccini, Feb 10, 2026)
- `3859037` add fallback and swap out json for pydantic core (dsfaccini, Feb 10, 2026)
- `4064039` Merge branch 'main' into david-claude-md-changes (dsfaccini, Feb 11, 2026)
- `f44a313` Merge branch 'main' into david-claude-md-changes (dsfaccini, Mar 4, 2026)
- `9d6c3fd` mention xdist (dsfaccini, Feb 11, 2026)
- `896f001` address PR review feedback: dedup, simplify, fix Vertex auth (dsfaccini, Mar 4, 2026)
- `e8be066` restore executable bit on run-vertex-tests.sh (dsfaccini, Mar 4, 2026)
- `9e6d551` clarify model fixture docs, rename Custom to Additional helpers (dsfaccini, Mar 4, 2026)
- `6a995f9` move skills from .claude/skills/ to .agents/skills/ (dsfaccini, Mar 6, 2026)
- `78e8f64` fix stale path, tighten allowed-tools, remove private method ref (dsfaccini, Mar 6, 2026)
- `1b562c6` wip: pending dicussion on bedrock approach (dsfaccini, Mar 9, 2026)
- `4950bcd` fix Cassette import path, remove dev-specific gcloud fallback (dsfaccini, Mar 10, 2026)
- `ab228dd` Update .agents/skills/pytest-vcr/run-bedrock-tests.sh (dsfaccini, Mar 22, 2026)
- `bdaeb92` Merge remote-tracking branch 'upstream/main' into david-claude-md-cha… (dsfaccini, Mar 22, 2026)
- `92be7bc` remove bedrock/vertex runner scripts, fix .gitignore skill whitelisting (dsfaccini, Mar 22, 2026)
- `5e03208` Merge branch 'david-claude-md-changes' of github.com:pydantic/pydanti… (dsfaccini, Mar 23, 2026)
- `1441b09` Merge branch 'main' into david-claude-md-changes (dsfaccini, Mar 23, 2026)
- `45075c4` remove bedrock auth script (dsfaccini, Mar 26, 2026)
- `d1129c7` update tests/agents (dsfaccini, Mar 26, 2026)
- `5e8b2d5` add clarification (dsfaccini, Mar 30, 2026)
16 changes: 16 additions & 0 deletions .agents/README.md
@@ -0,0 +1,16 @@
# `.agents/` Directory

This directory contains cross-agent project configuration that works with any coding agent (Claude Code, OpenCode, Codex, etc).

## Skills

`.agents/skills/` is the canonical location for project skills. `.claude/skills` is a symlink pointing here for Claude Code compatibility.

### Merge conflict resolution

If you get a `CONFLICT (file/directory)` on `.claude/skills` after a merge, move any new skills to `.agents/skills/` and restore the symlink:

```bash
rm -rf .claude/skills
ln -s ../.agents/skills .claude/skills
```
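The recovery can be sanity-checked in a scratch directory first (the `/tmp/skills-demo` path is just an example):

```shell
# Mirror the repo layout in a throwaway directory
mkdir -p /tmp/skills-demo/.agents/skills /tmp/skills-demo/.claude
cd /tmp/skills-demo
# -n avoids creating the link *inside* an existing symlinked directory on re-runs
ln -sfn ../.agents/skills .claude/skills
test -L .claude/skills && echo "symlink ok"
```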
121 changes: 121 additions & 0 deletions .agents/skills/pytest-vcr/SKILL.md
@@ -0,0 +1,121 @@
---
name: pytest-vcr
description: Record, rewrite, and debug VCR cassettes for HTTP recordings. Use when running tests with --record-mode, verifying cassette playback, or inspecting request/response bodies in YAML cassettes.
allowed-tools: Bash(uv run pytest *), Bash(uv run python .agents/skills/pytest-vcr/parse_cassette.py *), Bash(.agents/skills/pytest-vcr/run-vertex-tests.sh *), Bash(source .env && uv run pytest *), Bash(git diff *)
---

# Pytest VCR Workflow

Use this skill when recording or re-recording VCR cassettes for tests, or when debugging cassette contents.

## Prerequisites

- Verify `.env` exists: `test -f .env && echo 'ok' || echo 'missing'`
- Missing API keys will cause clear test errors at runtime

## Important flags
- `--record-mode=rewrite` : Record cassettes (works for both new and existing cassettes)
- `--lf` : Run only the tests that failed on the last run
- `-vv` : Verbose output
- `--tb=line` : Short, one-line-per-failure traceback output
- `-k "expr"` : Run only tests matching the given substring expression

## Recording Cassettes

### Step 1: Record cassettes

```bash
source .env && uv run pytest path/to/test.py::test_function_name -v --tb=line --record-mode=rewrite
```

Multiple tests can be specified:
```bash
source .env && uv run pytest path/to/test.py::test_one path/to/test.py::test_two -v --tb=line --record-mode=rewrite
```

### Step 2: Verify recordings

Run the same tests WITHOUT `--record-mode` to verify cassettes play back correctly:
```bash
source .env && uv run pytest path/to/test.py::test_function_name -vv --tb=line
```

### Step 3: Review snapshots

If tests use [`snapshot()`](https://github.com/15r10nk/inline-snapshot) assertions:
- The test run in Step 2 auto-fills snapshot content
- Review the generated snapshot files to ensure they match expected output
- You only review; don't manually write snapshot contents
- Snapshots capture what the test actually produced, in addition to explicit assertions
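Conceptually, the auto-fill works like this. The stand-in below imitates the real `inline_snapshot.snapshot` helper (import it with `from inline_snapshot import snapshot` in actual tests); the example test and values are hypothetical:

```python
def snapshot(value=None):
    # Stand-in for inline_snapshot.snapshot: on a recording run, the real
    # helper rewrites the test file so an empty snapshot() gains the value
    # the test actually produced. Here we just return the stored value.
    return value


def test_greeting():
    result = 'hello world'
    # After the Step 2 run, an initially empty snapshot() has been
    # auto-filled with the actual output; you only review the literal.
    assert result == snapshot('hello world')


test_greeting()
```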

## Parsing Cassettes

Parse VCR cassette YAML files to inspect request/response bodies without dealing with raw YAML.

### Usage

```bash
uv run python .agents/skills/pytest-vcr/parse_cassette.py <cassette_path> [--interaction N]
```

### Examples

```bash
# Parse all interactions in a cassette
uv run python .agents/skills/pytest-vcr/parse_cassette.py tests/models/cassettes/test_foo/test_bar.yaml

# Parse only interaction 1 (0-indexed)
uv run python .agents/skills/pytest-vcr/parse_cassette.py tests/models/cassettes/test_foo/test_bar.yaml --interaction 1
```

### Output

For each interaction, shows:
- Request: method, URI, parsed body (truncated base64)
- Response: status code, parsed body (truncated base64)

Base64 strings longer than 100 chars are truncated for readability.
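For reference, a cassette is YAML shaped roughly like the sketch below (field names follow the standard VCR format; the URI and bodies are placeholders, and `parsed_body` is an optional extra key the parser prefers when present):

```yaml
interactions:
- request:
    method: POST
    uri: https://api.example.com/v1/chat
    body:
      string: '{"model": "example", "messages": []}'
  response:
    status:
      code: 200
      message: OK
    body:
      string: '{"id": "resp_1", "choices": []}'
version: 1
```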


## Vertex AI Tests

Vertex tests use the `skip_unless_vertex` fixture from `tests/conftest.py`: they only run in CI or when `ENABLE_VERTEX=1` is set. `ENABLE_VERTEX=1` is only needed when recording or rewriting cassettes locally; during playback, cassettes replay without live auth. Add `skip_unless_vertex: None` as a parameter to any new Vertex test.
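A sketch of the gating logic, written as a plain function for illustration (the real fixture lives in `tests/conftest.py` and may differ in detail):

```python
def vertex_tests_enabled(environ: dict) -> bool:
    """Vertex tests run in CI, or locally when ENABLE_VERTEX=1 is set."""
    return 'CI' in environ or environ.get('ENABLE_VERTEX') == '1'


# Usage in a test module: declare the fixture as a parameter so the
# test is skipped whenever the condition above is false.
#
# def test_my_vertex_feature(skip_unless_vertex: None):
#     ...
```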

Vertex auth works in one of two ways:
- **`GOOGLE_APPLICATION_CREDENTIALS`**: set this env var to a service account JSON path; no gcloud needed
- **gcloud**: otherwise, the script auto-detects the project and checks auth via `gcloud`

Use the provided script:

```bash
# Record Vertex cassettes
.agents/skills/pytest-vcr/run-vertex-tests.sh tests/path/to/test.py -v --tb=line --record-mode=rewrite

# Verify playback
.agents/skills/pytest-vcr/run-vertex-tests.sh tests/path/to/test.py -vv --tb=line
```

If using gcloud and auth fails:
```bash
gcloud auth application-default login
gcloud config set project <your-project-id>
```

## Full Workflow Example

```bash
# 1. Record cassette
source .env && uv run pytest tests/models/test_openai.py::test_chat_completion -v --tb=line --record-mode=rewrite

# 2. Verify playback and fill snapshots
source .env && uv run pytest tests/models/test_openai.py::test_chat_completion -vv --tb=line

# 3. Review test code diffs (excludes cassettes)
git diff tests/ -- ':!**/cassettes/**'

# 4. List new/changed cassettes (name only - use parse_cassette.py to inspect)
git diff --name-only tests/ -- '**/cassettes/**'

# 5. Inspect cassette contents if needed
uv run python .agents/skills/pytest-vcr/parse_cassette.py tests/models/cassettes/test_openai/test_chat_completion.yaml
```
104 changes: 104 additions & 0 deletions .agents/skills/pytest-vcr/parse_cassette.py
@@ -0,0 +1,104 @@
#!/usr/bin/env python3
"""Parse VCR cassette files and pretty-print request/response bodies."""

import argparse
import json
import re
import sys
from pathlib import Path

import yaml


def truncate_base64(obj: object, max_len: int = 100) -> object:
"""Recursively truncate base64-like strings in nested structures."""
if isinstance(obj, str):
if len(obj) > max_len and re.match(r'^[A-Za-z0-9+/=]+$', obj[:100]):
return f'{obj[:50]}...[truncated {len(obj)} chars]...{obj[-20:]}'
if obj.startswith('data:') and len(obj) > max_len:
return f'{obj[:80]}...[truncated {len(obj)} chars]'
return obj
elif isinstance(obj, dict):
return {k: truncate_base64(v, max_len) for k, v in obj.items()}
elif isinstance(obj, list):
return [truncate_base64(item, max_len) for item in obj]
return obj


def _extract_body(part: dict[str, object]) -> object | None:
"""Extract body from a request/response, trying parsed_body first, then standard VCR body.string."""
if 'parsed_body' in part:
return part['parsed_body']
body = part.get('body')
if isinstance(body, dict):
body_str = body.get('string')
if isinstance(body_str, str) and body_str:
try:
return json.loads(body_str)
except json.JSONDecodeError:
return body_str
elif isinstance(body, str) and body:
try:
return json.loads(body)
except json.JSONDecodeError:
return body
return None


def parse_cassette(path: Path, interaction_idx: int | None = None) -> None:
"""Parse and print cassette contents."""
with open(path) as f:
data = yaml.safe_load(f)

interactions = data.get('interactions', [])
if not interactions:
print('No interactions found in cassette')
return

indices = [interaction_idx] if interaction_idx is not None else range(len(interactions))

for i in indices:
if i < 0 or i >= len(interactions):
print(f'Interaction {i} not found (only {len(interactions)} interactions)')
continue

interaction = interactions[i]
req = interaction.get('request', {})
resp = interaction.get('response', {})

print(f'\n{"="*60}')
print(f'INTERACTION {i}')
print('='*60)

print('\n--- REQUEST ---')
print(f'Method: {req.get("method", "N/A")}')
print(f'URI: {req.get("uri", "N/A")}')
req_body = _extract_body(req)
if req_body is not None:
truncated = truncate_base64(req_body)
print(f'Body:\n{json.dumps(truncated, indent=2)}')

print('\n--- RESPONSE ---')
status = resp.get('status', {})
print(f'Status: {status.get("code", "N/A")} {status.get("message", "")}')
resp_body = _extract_body(resp)
if resp_body is not None:
truncated = truncate_base64(resp_body)
print(f'Body:\n{json.dumps(truncated, indent=2)}')


def main() -> None:
parser = argparse.ArgumentParser(description='Parse VCR cassette files')
parser.add_argument('cassette', type=Path, help='Path to cassette YAML file')
parser.add_argument('--interaction', '-i', type=int, help='Specific interaction index (0-based)')
args = parser.parse_args()

if not args.cassette.exists():
print(f'File not found: {args.cassette}', file=sys.stderr)
sys.exit(1)

parse_cassette(args.cassette, args.interaction)


if __name__ == '__main__':
main()
11 changes: 11 additions & 0 deletions .agents/skills/pytest-vcr/run-bedrock-tests.sh
@@ -0,0 +1,11 @@
#!/bin/bash
# Run tests with AWS bedrock profile (assumes bedrock-test-role via ~/.aws/config).
# Usage: .agents/skills/pytest-vcr/run-bedrock-tests.sh [pytest args...]
set -e

source .env

# Profile-based auth must win: set the profile and clear any key/token
# credentials (including ones just re-exported by .env) that would
# otherwise take precedence over AWS_PROFILE.
export AWS_PROFILE=bedrock
export AWS_DEFAULT_REGION=us-east-1
unset AWS_BEARER_TOKEN_BEDROCK AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY

exec uv run pytest "$@"
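The script assumes a `bedrock` profile exists in `~/.aws/config`, shaped roughly like the sketch below (the account ID and source profile are placeholders; only the role name matches the comment above):

```ini
[profile bedrock]
role_arn = arn:aws:iam::123456789012:role/bedrock-test-role
source_profile = default
region = us-east-1
```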
54 changes: 54 additions & 0 deletions .agents/skills/pytest-vcr/run-vertex-tests.sh
@@ -0,0 +1,54 @@
#!/bin/bash
# Run pytest with Vertex AI auth
# Usage: .agents/skills/pytest-vcr/run-vertex-tests.sh [pytest args...]

set -e

# Force Vertex auth path (not API key)
unset GOOGLE_API_KEY GEMINI_API_KEY

if [ -n "$GOOGLE_APPLICATION_CREDENTIALS" ]; then
# Service account credentials — no gcloud needed
if [ ! -f "$GOOGLE_APPLICATION_CREDENTIALS" ]; then
echo "ERROR: GOOGLE_APPLICATION_CREDENTIALS file not found: $GOOGLE_APPLICATION_CREDENTIALS"
exit 1
fi
# Extract project from credentials JSON if not already set
if [ -z "$GOOGLE_PROJECT" ] && [ -z "$GOOGLE_CLOUD_PROJECT" ]; then
        PROJECT=$(python3 -c "import json, os; print(json.load(open(os.environ['GOOGLE_APPLICATION_CREDENTIALS'])).get('project_id', ''))" 2>/dev/null)
if [ -n "$PROJECT" ]; then
export GOOGLE_PROJECT="$PROJECT"
export GOOGLE_CLOUD_PROJECT="$PROJECT"
fi
fi
else
# Fall back to gcloud
    # `|| true` keeps `set -e` from exiting silently before we can print an error
    GCLOUD="$(command -v gcloud 2>/dev/null || true)"
    if [ -z "$GCLOUD" ]; then
echo "ERROR: No GOOGLE_APPLICATION_CREDENTIALS set and gcloud not found."
echo "Either set GOOGLE_APPLICATION_CREDENTIALS or install Google Cloud SDK."
exit 1
fi

if ! "$GCLOUD" auth application-default print-access-token &>/dev/null; then
echo "ERROR: gcloud auth not configured. Run:"
echo " gcloud auth application-default login"
exit 1
fi

PROJECT=$("$GCLOUD" config get-value project 2>/dev/null)
if [ -z "$PROJECT" ] || [ "$PROJECT" = "(unset)" ]; then
echo "ERROR: no gcloud project configured. Run:"
echo " gcloud config set project <your-project-id>"
exit 1
fi

export GOOGLE_PROJECT="$PROJECT"
export GOOGLE_CLOUD_PROJECT="$PROJECT"
fi

export GOOGLE_LOCATION="${GOOGLE_LOCATION:-global}"

echo "Vertex AI: project=${GOOGLE_PROJECT:-${GOOGLE_CLOUD_PROJECT:-unset}} location=$GOOGLE_LOCATION"

ENABLE_VERTEX=1 uv run pytest "$@"
5 changes: 4 additions & 1 deletion .claude/settings.json
@@ -5,6 +5,8 @@
"Bash(ls:*)",
"Bash(tree:*)",
"Bash(grep:*)",
"Bash(git -C * diff:*)",
"Bash(git -C * log:*)",
"Bash(git log:*)",
"Bash(git diff:*)",
"Bash(git grep:*)",
@@ -19,7 +21,8 @@
"Bash(gh run view:*)",
"Bash(gh run list:*)",
"Bash(uv run:*)",
"Bash(make:*)"
"Bash(make:*)",
"Skill(pytest-vcr:*)"
]
}
}
1 change: 1 addition & 0 deletions .claude/skills
10 changes: 10 additions & 0 deletions .github/workflows/ci.yml
@@ -23,6 +23,16 @@ jobs:
steps:
- uses: actions/checkout@v6

- name: Verify .claude/skills symlink
run: |
if [ ! -L .claude/skills ]; then
echo "ERROR: .claude/skills must be a symlink to ../.agents/skills"
echo "Skills live in .agents/skills/. See .agents/README.md"
echo "If this happened after a merge conflict, move new skills to .agents/skills/ and run:"
echo " rm -rf .claude/skills && ln -s ../.agents/skills .claude/skills"
exit 1
fi

- uses: astral-sh/setup-uv@v5
with:
python-version: "3.13"
10 changes: 6 additions & 4 deletions .gitignore
@@ -21,10 +21,12 @@ node_modules/
/test_tmp/
.mcp.json
.claude/*
!.claude/skills/
.claude/skills/*
!.claude/skills/address-feedback
!.claude/skills/pre-push-review
!.claude/settings.json
!.claude/skills
.agents/skills/*
!.agents/skills/pytest-vcr/
!.agents/skills/address-feedback
!.agents/skills/pre-push-review
.github/.review-context/
/.cursor/
/.devcontainer/
6 changes: 6 additions & 0 deletions .pre-commit-config.yaml
@@ -67,6 +67,12 @@ repos:
language: system
types: [python]
pass_filenames: false
- id: check-skills-symlink
name: Check .claude/skills is a symlink
entry: test -L .claude/skills
language: system
always_run: true
pass_filenames: false
- id: check-cassettes
name: Check cassettes
entry: uv