137 changes: 137 additions & 0 deletions AIP-discussion-response.md
@@ -0,0 +1,137 @@
# Response to AIP Discussion #1247

> Re: [Respecting AIP response payloads in HTTP](https://github.com/a2aproject/A2A/discussions/1247)

Thanks for this detailed explanation of the AIP conventions, @darrelmiller. I've been working on the a2a-python SDK migration from Pydantic to protobuf types ([PR #572](https://github.com/a2aproject/a2a-python/pull/572)) and wanted to share how we've implemented this.

## How we handle `SetTaskPushNotificationConfig` in the SDK

The key insight is that the request and response types serve different purposes:

**Request (`SetTaskPushNotificationConfigRequest`):**
```protobuf
message SetTaskPushNotificationConfigRequest {
  string parent = 1;                      // e.g., "tasks/{task_id}"
  string config_id = 2;                   // e.g., "my-config-id"
  TaskPushNotificationConfig config = 3;
}
```

**Response (`TaskPushNotificationConfig`):**
```protobuf
message TaskPushNotificationConfig {
  string name = 1;  // Full resource name: "tasks/{task_id}/pushNotificationConfigs/{config_id}"
  PushNotificationConfig push_notification_config = 2;
}
```

## Implementation in Python

In our `DefaultRequestHandler`, we construct the proper `name` field from the request's `parent` and `config_id`:

```python
async def on_set_task_push_notification_config(
    self,
    params: SetTaskPushNotificationConfigRequest,
    context: ServerCallContext | None = None,
) -> TaskPushNotificationConfig:
    task_id = _extract_task_id(params.parent)  # Extract from "tasks/{task_id}"

    # Store the config
    await self._push_config_store.set_info(
        task_id,
        params.config.push_notification_config,
    )

    # Build response with proper AIP resource name
    return TaskPushNotificationConfig(
        name=f'{params.parent}/pushNotificationConfigs/{params.config_id}',
        push_notification_config=params.config.push_notification_config,
    )
```

## REST Handler Translation

For the HTTP binding, the REST handler extracts path parameters and constructs the request:

```python
# `Parse` and `MessageToDict` come from google.protobuf.json_format.
async def set_push_notification(self, request: Request, context: ServerCallContext):
    task_id = request.path_params['id']
    body = await request.body()

    params = SetTaskPushNotificationConfigRequest()
    Parse(body, params)
    params.parent = f'tasks/{task_id}'  # Set from URL path

    config = await self.request_handler.on_set_task_push_notification_config(params, context)
    return MessageToDict(config)  # Returns with proper `name` field
```
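
For a sense of the HTTP payload, here is roughly the dict `MessageToDict(config)` would produce for the E2E example further below. This is an illustrative sketch only: the task ID and values are assumptions borrowed from that example, and protobuf's JSON mapping renders field names in lowerCamelCase by default.

```python
# Illustrative only: approximate MessageToDict(config) output, assuming a
# hypothetical task 'task-123' and the config values from the E2E example below.
expected_body = {
    'name': 'tasks/task-123/pushNotificationConfigs/my-notification-config',
    'pushNotificationConfig': {
        'id': 'my-notification-config',
        'url': 'https://example.com/notifications',
        'token': '<token>',
    },
}
```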

## JSON-RPC Handler

The JSON-RPC handler passes the full request directly:

```python
async def set_push_notification_config(
    self,
    request: SetTaskPushNotificationConfigRequest,
    context: ServerCallContext | None = None,
) -> SetTaskPushNotificationConfigResponse:
    result = await self.request_handler.on_set_task_push_notification_config(
        request, context
    )
    return prepare_response_object(...)
```

## Key Takeaways

1. **The `name` field is constructed, not passed in** - The server builds the full resource name from `parent` + `config_id`

2. **Consistent across bindings** - Both gRPC and HTTP handlers ultimately call the same `on_set_task_push_notification_config` method

3. **AIP compliance** - The response always includes the full `name` field as required by [AIP-122](https://google.aip.dev/122)

4. **Helper functions for resource name parsing** (usage sketch after this list):
```python
import re


def _extract_task_id(resource_name: str) -> str:
    """Extract task ID from a resource name like 'tasks/{task_id}' or 'tasks/{task_id}/...'."""
    match = re.match(r'^tasks/([^/]+)', resource_name)
    if match:
        return match.group(1)
    return resource_name  # Fall back for backwards compatibility


def _extract_config_id(resource_name: str) -> str | None:
    """Extract config ID from 'tasks/{task_id}/pushNotificationConfigs/{config_id}'."""
    match = re.match(r'^tasks/[^/]+/pushNotificationConfigs/([^/]+)$', resource_name)
    if match:
        return match.group(1)
    return None
```
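
For a quick illustration of how those helpers behave (the IDs are made up):

```python
# Illustrative round-trip over the resource-name formats described above.
name = 'tasks/task-123/pushNotificationConfigs/my-config-id'
assert _extract_task_id(name) == 'task-123'
assert _extract_config_id(name) == 'my-config-id'
assert _extract_config_id('tasks/task-123') is None  # no config segment present
```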

## E2E Test Example

Here's how a client uses this in practice:

```python
# Client sets the push notification config
await a2a_client.set_task_callback(
    SetTaskPushNotificationConfigRequest(
        parent=f'tasks/{task.id}',
        config_id='my-notification-config',
        config=TaskPushNotificationConfig(
            push_notification_config=PushNotificationConfig(
                id='my-notification-config',
                url=f'{notifications_server}/notifications',
                token=token,
            ),
        ),
    )
)
```

This approach keeps the abstract handler logic clean while ensuring AIP compliance at the protocol binding level.

---

**Related PRs:**
- [a2a-python PR #572](https://github.com/a2aproject/a2a-python/pull/572) - Proto migration with these changes
6 changes: 3 additions & 3 deletions buf.gen.yaml
@@ -21,11 +21,11 @@ plugins:
# Generate python protobuf related code
# Generates *_pb2.py files, one for each .proto
- remote: buf.build/protocolbuffers/python:v29.3
out: src/a2a/grpc
out: src/a2a/types
# Generate python service code.
# Generates *_pb2_grpc.py
- remote: buf.build/grpc/python
out: src/a2a/grpc
out: src/a2a/types
# Generates *_pb2.pyi files.
- remote: buf.build/protocolbuffers/pyi
out: src/a2a/grpc
out: src/a2a/types
27 changes: 22 additions & 5 deletions pyproject.toml
@@ -13,6 +13,7 @@ dependencies = [
"pydantic>=2.11.3",
"protobuf>=5.29.5",
"google-api-core>=1.26.0",
"json-rpc>=1.15.0",
]

classifiers = [
@@ -74,6 +75,16 @@ addopts = "-ra --strict-markers"
markers = [
"asyncio: mark a test as a coroutine that should be run by pytest-asyncio",
]
filterwarnings = [
# SQLAlchemy warning about duplicate class registration - this is a known limitation
# of the dynamic model creation pattern used in models.py for custom table names
"ignore:This declarative base already contains a class with the same class name:sqlalchemy.exc.SAWarning",
# ResourceWarnings from asyncio event loop/socket cleanup during garbage collection
# These appear intermittently between tests due to pytest-asyncio and sse-starlette timing
"ignore:unclosed event loop:ResourceWarning",
"ignore:unclosed transport:ResourceWarning",
"ignore:unclosed <socket.socket:ResourceWarning",
]

[tool.pytest-asyncio]
mode = "strict"
@@ -114,7 +125,7 @@ explicit = true

[tool.mypy]
plugins = ["pydantic.mypy"]
exclude = ["src/a2a/grpc/"]
exclude = ["src/a2a/types/a2a_pb2\\.py", "src/a2a/types/a2a_pb2_grpc\\.py"]
disable_error_code = [
"import-not-found",
"annotation-unchecked",
@@ -134,7 +145,8 @@ exclude = [
"**/node_modules",
"**/venv",
"**/.venv",
"src/a2a/grpc/",
"src/a2a/types/a2a_pb2.py",
"src/a2a/types/a2a_pb2_grpc.py",
]
reportMissingImports = "none"
reportMissingModuleSource = "none"
@@ -145,7 +157,8 @@ omit = [
"*/tests/*",
"*/site-packages/*",
"*/__init__.py",
"src/a2a/grpc/*",
"src/a2a/types/a2a_pb2.py",
"src/a2a/types/a2a_pb2_grpc.py",
]

[tool.coverage.report]
@@ -257,7 +270,9 @@ exclude = [
"node_modules",
"venv",
"*/migrations/*",
"src/a2a/grpc/**",
"src/a2a/types/a2a_pb2.py",
"src/a2a/types/a2a_pb2.pyi",
"src/a2a/types/a2a_pb2_grpc.py",
"tests/**",
]

@@ -311,7 +326,9 @@ inline-quotes = "single"

[tool.ruff.format]
exclude = [
"src/a2a/grpc/**",
"src/a2a/types/a2a_pb2.py",
"src/a2a/types/a2a_pb2.pyi",
"src/a2a/types/a2a_pb2_grpc.py",
]
docstring-code-format = true
docstring-code-line-length = "dynamic"
21 changes: 0 additions & 21 deletions src/a2a/client/__init__.py
@@ -18,39 +18,18 @@
A2AClientTimeoutError,
)
from a2a.client.helpers import create_text_message_object
from a2a.client.legacy import A2AClient
from a2a.client.middleware import ClientCallContext, ClientCallInterceptor


logger = logging.getLogger(__name__)

try:
from a2a.client.legacy_grpc import A2AGrpcClient # type: ignore
except ImportError as e:
_original_error = e
logger.debug(
'A2AGrpcClient not loaded. This is expected if gRPC dependencies are not installed. Error: %s',
_original_error,
)

class A2AGrpcClient: # type: ignore
"""Placeholder for A2AGrpcClient when dependencies are not installed."""

def __init__(self, *args, **kwargs):
raise ImportError(
'To use A2AGrpcClient, its dependencies must be installed. '
'You can install them with \'pip install "a2a-sdk[grpc]"\''
) from _original_error


__all__ = [
'A2ACardResolver',
'A2AClient',
'A2AClientError',
'A2AClientHTTPError',
'A2AClientJSONError',
'A2AClientTimeoutError',
'A2AGrpcClient',
'AuthInterceptor',
'BaseClient',
'Client',
50 changes: 39 additions & 11 deletions src/a2a/client/auth/interceptor.py
@@ -3,17 +3,43 @@

from a2a.client.auth.credentials import CredentialService
from a2a.client.middleware import ClientCallContext, ClientCallInterceptor
from a2a.types import (
from a2a.types.a2a_pb2 import (
AgentCard,
APIKeySecurityScheme,
HTTPAuthSecurityScheme,
In,
MutualTlsSecurityScheme,
OAuth2SecurityScheme,
OpenIdConnectSecurityScheme,
SecurityScheme,
)

logger = logging.getLogger(__name__)

_SecuritySchemeValue = (
APIKeySecurityScheme
| HTTPAuthSecurityScheme
| OAuth2SecurityScheme
| OpenIdConnectSecurityScheme
| MutualTlsSecurityScheme
| None
)


def _get_security_scheme_value(scheme: SecurityScheme) -> _SecuritySchemeValue:

Reviewer comment:

I think this change has just proven my point about moving this inline in the function below and changing how the "match" is done.

"""Extract the actual security scheme from the oneof union."""
which = scheme.WhichOneof('scheme')
if which == 'api_key_security_scheme':
return scheme.api_key_security_scheme
if which == 'http_auth_security_scheme':
return scheme.http_auth_security_scheme
if which == 'oauth2_security_scheme':
return scheme.oauth2_security_scheme
if which == 'open_id_connect_security_scheme':
return scheme.open_id_connect_security_scheme
if which == 'mtls_security_scheme':
return scheme.mtls_security_scheme
return None


class AuthInterceptor(ClientCallInterceptor):
"""An interceptor that automatically adds authentication details to requests.
@@ -35,13 +61,13 @@ async def intercept(
"""Applies authentication headers to the request if credentials are available."""
if (
agent_card is None
or agent_card.security is None
or agent_card.security_schemes is None
or not agent_card.security
or not agent_card.security_schemes

Reviewer comment:

Double-checking the protobuf docs (https://protobuf.dev/reference/python/python-generated/#embedded_message), it sounds like we may need to use agent_card.HasField("..."); from what I can tell, that distinguishes between agent_card.security being set, not set, and set but empty.
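
A minimal sketch of the presence semantics in question, assuming `security` is a repeated field and `security_schemes` is a map field on the generated `AgentCard` (an assumption for illustration, not a reading of the actual .proto):

```python
from a2a.types.a2a_pb2 import AgentCard

card = AgentCard()

# HasField() is only defined for singular message fields and oneof members;
# calling it on a repeated or map field raises ValueError. For those, truthiness
# checks emptiness, but it cannot distinguish "unset" from "set but empty".
print(bool(card.security), bool(card.security_schemes))  # False False

scheme = card.security_schemes['bearer']  # accessing a message-valued map auto-creates the entry
print(scheme.WhichOneof('scheme'))        # None until a oneof member is set
print(scheme.HasField('http_auth_security_scheme'))  # False; valid here because it's a oneof member
```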

):
return request_payload, http_kwargs

for requirement in agent_card.security:
for scheme_name in requirement:
for scheme_name in requirement.schemes:
credential = await self._credential_service.get_credentials(
scheme_name, context
)
@@ -51,7 +77,9 @@
)
if not scheme_def_union:
continue
scheme_def = scheme_def_union.root
scheme_def = _get_security_scheme_value(scheme_def_union)
if not scheme_def:
continue

Reviewer comment:

Personal preference would be to use the functions available from proto to do the matching below instead of ending up with a non-statically typed function here. E.g.

```python
match scheme.WhichOneof('scheme'):
    case 'http_auth_security_scheme' if scheme.http_auth_security_scheme.lower() == 'bearer':
        ...
```

tbh I think it would read better as a set of if blocks:

```python
scheme = agent_card.security_schemes.get(scheme_name)

if scheme.HasField('http_auth_security_scheme') and scheme.http_auth_security_scheme.lower() == 'bearer':
    ...

if scheme.HasField('oauth2_security_scheme') or scheme.HasField('open_id_connect_security_scheme'):
    ...
```


Author reply:

Not sure how HasField improves things here.


Reviewer reply:

As explained on our call, it's mostly about readability, but it also keeps us from using reflection to check the Python types; instead we can just rely on the boolean check of "does the request have this field set" before using it.


Author reply:

Check now @Tehsmash


Reviewer reply:

I don't seem to be able to resolve my own threads on the a2a repos, but I'll leave a LGTM if I think it should be resolved.


headers = http_kwargs.get('headers', {})

@@ -62,9 +90,8 @@
):
headers['Authorization'] = f'Bearer {credential}'
logger.debug(
"Added Bearer token for scheme '%s' (type: %s).",
"Added Bearer token for scheme '%s'.",
scheme_name,
scheme_def.type,
)
http_kwargs['headers'] = headers
return request_payload, http_kwargs
@@ -76,15 +103,16 @@
):
headers['Authorization'] = f'Bearer {credential}'
logger.debug(
"Added Bearer token for scheme '%s' (type: %s).",
"Added Bearer token for scheme '%s'.",
scheme_name,
scheme_def.type,
)
http_kwargs['headers'] = headers
return request_payload, http_kwargs

# Case 2: API Key in Header
case APIKeySecurityScheme(in_=In.header):
case APIKeySecurityScheme() if (
scheme_def.location.lower() == 'header'
):
headers[scheme_def.name] = credential
logger.debug(
"Added API Key Header for scheme '%s'.",