
Conversation

@muscariello (Member) commented Nov 28, 2025

Summary

This PR migrates the a2a-python SDK from Pydantic-based types to protobuf-generated types, completing the upgrade to A2A v1.0. Fixes #559

Breaking Changes

  • Replace Pydantic-based type system with protobuf types from a2a_pb2
  • Update Part usage from Part(root=TextPart(text=...)) to Part(text=...)
  • Update Role enum from Role.user/Role.agent to Role.ROLE_USER/Role.ROLE_AGENT
  • Update TaskState enum to use the TASK_STATE_* prefix (a before/after sketch follows this list)
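
For illustration, a minimal before/after sketch of these three changes; the import path and the exact TASK_STATE_* member are assumptions, while the constructors and enum names follow the bullets above:

    from a2a.types import Part, Role, TaskState  # import path shown for illustration only

    # Old (Pydantic): Part(root=TextPart(text="hello")), Role.user, TaskState.completed
    # New (protobuf-generated types):
    part = Part(text="hello")
    role = Role.ROLE_USER
    state = TaskState.TASK_STATE_COMPLETED  # assumes a *_COMPLETED value exists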

Changes

  • Update all source files to use proto types directly from a2a_pb2
  • Replace model_dump() with MessageToDict() for JSON serialization
  • Replace model_copy(deep=True) with CopyFrom() for proto cloning (a short sketch follows this list)
  • Add new types module with proto imports and SDK-specific extras
  • Add proto_utils module with identity conversion utilities
  • Fix REST handler resource name formats for push notification configs
  • Fix gRPC handler to use SubscribeToTask instead of TaskSubscription
  • Fix database task store to handle proto objects from ORM
  • Update all test files for proto patterns and assertions
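
The serialization bullets above translate roughly as follows (MessageToDict, ParseDict, and CopyFrom are standard protobuf helpers; the Task construction is only illustrative):

    from google.protobuf.json_format import MessageToDict, ParseDict
    from a2a.types import Task  # import path shown for illustration only

    task = Task(id="task-123")

    # model_dump()          -> MessageToDict()
    payload = MessageToDict(task, preserving_proto_field_name=True)
    # model_validate()      -> ParseDict()
    restored = ParseDict(payload, Task())
    # model_copy(deep=True) -> CopyFrom()
    clone = Task()
    clone.CopyFrom(task)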

Testing

  • 601 tests passing
  • 23 tests skipped (expected - require DB connections or cryptography deps)

Related

Builds on top of PR #556

Release-As: 1.0.0

Tehsmash and others added 2 commits November 21, 2025 17:02
This updates the SDK to be A2A v1.0 compliant; all types are generated
from the v1.0 a2a.proto. The JSONRPC/HTTP+JSON transports are converted to
use the a2a types encoded using ProtoJSON directly from the generated types.
BREAKING CHANGE: Replace Pydantic-based type system with protobuf types

- Update all source files to use proto types directly from a2a_pb2
- Replace model_dump() with MessageToDict() for JSON serialization
- Replace model_copy(deep=True) with CopyFrom() for proto cloning
- Update Part usage from Part(root=TextPart(...)) to Part(text=...)
- Update Role enum from Role.user to Role.ROLE_USER
- Update TaskState enum to use TASK_STATE_* prefix
- Add new types module with proto imports and SDK-specific extras
- Add proto_utils module with identity conversion utilities
- Fix REST handler resource name formats for push notification configs
- Fix gRPC handler to use SubscribeToTask instead of TaskSubscription
- Fix database task store to handle proto objects from ORM
- Update all test files for proto patterns and assertions

Tested: 601 tests passing, 23 skipped (expected - DB/crypto deps)
Signed-off-by: Luca Muscariello <[email protected]>
@muscariello muscariello requested review from a team and a2a-bot as code owners November 28, 2025 21:00
@gemini-code-assist (Contributor)

Summary of Changes

Hello @muscariello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request represents a foundational shift in the SDK's type system, moving from Pydantic models to native protobuf-generated types. This change aligns the SDK more closely with the A2A v1.0 specification, enhancing type consistency and performance. The migration necessitated extensive updates across the codebase, impacting how data is structured, serialized, and handled throughout the client and server components. While introducing several breaking changes to the API, the goal is to provide a more robust and future-proof foundation for A2A interactions.

Highlights

  • Protobuf Type Migration: The entire SDK has been migrated from Pydantic-based types to protobuf-generated types, completing the upgrade to A2A v1.0. This involves significant changes across client and server components to directly utilize protobuf messages.
  • Legacy Client Removal: The deprecated A2AClient (JSON-RPC) and A2AGrpcClient (gRPC) legacy client wrappers have been removed, streamlining the client API. Users should now use ClientFactory to create clients with specific transports.
  • API Changes and Breaking Changes: Key API changes include updating Part usage from Part(root=TextPart(text=...)) to Part(text=...), renaming Role enum values (e.g., Role.user to Role.ROLE_USER), and updating TaskState enum values with a TASK_STATE_* prefix. Method signatures across client transports and handlers have been updated to reflect the new protobuf request/response types.
  • Serialization and Deserialization: The internal serialization logic has been updated to use MessageToDict() for JSON serialization and ParseDict() for deserialization, replacing Pydantic's model_dump() and model_validate() methods. This ensures compatibility with protobuf messages.
  • Build and Configuration Updates: The buf.gen.yaml configuration has been updated to generate protobuf types into src/a2a/types instead of src/a2a/grpc. pyproject.toml and related linting/type-checking configurations (mypy, ruff) have been adjusted to exclude the new protobuf-generated files in their new location and to include the new json-rpc dependency.

@gemini-code-assist (Contributor) bot left a comment

Code Review

This is a massive and well-executed refactoring to migrate the SDK from Pydantic types to protobuf-generated types, completing the upgrade to A2A v1.0. The changes are extensive, touching almost every part of the SDK, from client and server logic to transports, handlers, database models, and tests.

My review finds that the migration has been executed thoroughly and consistently. Key changes include:

  • Replacing Pydantic models with protobuf messages from a2a_pb2.
  • Updating data serialization from model_dump() to MessageToDict() and ParseDict().
  • Refactoring client transports (grpc, jsonrpc, rest) to work directly with protobuf types, which simplifies the code significantly.
  • Updating method signatures and data structures across the client and server components.
  • Introducing a new types module as a central place for all type definitions.
  • Removing legacy client implementations.

The overall quality of the changes is high. The code is now more aligned with the A2A v1.0 specification, and the gRPC transport, in particular, is much cleaner. The test suite has been updated to reflect these changes, with a large number of tests passing.

I have a few suggestions for improving code maintainability and simplifying the logic in the AuthInterceptor and the database task store. Otherwise, the PR looks excellent.

Comment on lines 98 to 110
    def _to_orm(self, task: Task) -> TaskModel:
-       """Maps a Pydantic Task to a SQLAlchemy TaskModel instance."""
+       """Maps a Proto Task to a SQLAlchemy TaskModel instance."""
+       # Convert proto to dict for storing in JSON columns
+       task_dict = MessageToDict(task, preserving_proto_field_name=True)
        return self.task_model(
            id=task.id,
            context_id=task.context_id,
-           kind=task.kind,
-           status=task.status,
-           artifacts=task.artifacts,
-           history=task.history,
-           task_metadata=task.metadata,
+           kind='task',  # Default kind for tasks
+           status=task_dict.get('status'),
+           artifacts=task_dict.get('artifacts', []),
+           history=task_dict.get('history', []),
+           task_metadata=task_dict.get('metadata'),
        )
gemini-code-assist (Contributor), severity: medium

This function can be simplified. Instead of converting the entire Task protobuf message to a dictionary, you can pass the nested protobuf message objects (status, artifacts, history) directly to the TaskModel constructor. The PydanticType and PydanticListType decorators will handle their serialization. This avoids the overhead of MessageToDict for the whole task object.

    def _to_orm(self, task: Task) -> TaskModel:
        """Maps a Proto Task to a SQLAlchemy TaskModel instance."""
        return self.task_model(
            id=task.id,
            context_id=task.context_id,
            kind='task',  # Default kind for tasks
            status=task.status,
            artifacts=list(task.artifacts),
            history=list(task.history),
            task_metadata=MessageToDict(task.metadata) if task.metadata.fields else None,
        )

Comment on lines 108 to 139
    def _from_orm(self, task_model: TaskModel) -> Task:
-       """Maps a SQLAlchemy TaskModel to a Pydantic Task instance."""
-       # Map database columns to Pydantic model fields
+       """Maps a SQLAlchemy TaskModel to a Proto Task instance."""
+       # The ORM columns return proto objects for status, artifacts, history
+       # We need to convert them back to dicts for ParseDict
        task_data_from_db = {
            'id': task_model.id,
            'context_id': task_model.context_id,
-           'kind': task_model.kind,
-           'status': task_model.status,
-           'artifacts': task_model.artifacts,
-           'history': task_model.history,
-           'metadata': task_model.task_metadata,  # Map task_metadata column to metadata field
        }
-       # Pydantic's model_validate will parse the nested dicts/lists from JSON
-       return Task.model_validate(task_data_from_db)
+       # Add status if present (already a proto object from PydanticType)
+       if task_model.status is not None:
+           task_data_from_db['status'] = MessageToDict(task_model.status, preserving_proto_field_name=True)
+       # Add artifacts if present (list of proto objects)
+       if task_model.artifacts:
+           task_data_from_db['artifacts'] = [
+               MessageToDict(a, preserving_proto_field_name=True) if hasattr(a, 'DESCRIPTOR') else a
+               for a in task_model.artifacts
+           ]
+       # Add history if present (list of proto objects)
+       if task_model.history:
+           task_data_from_db['history'] = [
+               MessageToDict(m, preserving_proto_field_name=True) if hasattr(m, 'DESCRIPTOR') else m
+               for m in task_model.history
+           ]
+       # Add metadata if present
+       if task_model.task_metadata is not None:
+           task_data_from_db['metadata'] = task_model.task_metadata
+       # Use ParseDict to create proto from dict
+       return ParseDict(task_data_from_db, Task())
gemini-code-assist (Contributor), severity: medium

This function seems overly complex. Since PydanticType and PydanticListType already deserialize the JSON from the database into protobuf message objects, you can construct the final Task protobuf object directly using these objects, rather than converting them back to dictionaries and then parsing the whole dictionary again. This would be more efficient and easier to read.

    def _from_orm(self, task_model: TaskModel) -> Task:
        """Maps a SQLAlchemy TaskModel to a Proto Task instance."""
        task = Task(
            id=task_model.id,
            context_id=task_model.context_id,
        )
        if task_model.status:
            task.status.CopyFrom(task_model.status)
        if task_model.artifacts:
            task.artifacts.extend(task_model.artifacts)
        if task_model.history:
            task.history.extend(task_model.history)
        if task_model.task_metadata:
            task.metadata.update(task_model.task_metadata)
        return task

Comment on lines 20 to 31
    which = scheme.WhichOneof('scheme')
    if which == 'api_key_security_scheme':
        return scheme.api_key_security_scheme
    elif which == 'http_auth_security_scheme':
        return scheme.http_auth_security_scheme
    elif which == 'oauth2_security_scheme':
        return scheme.oauth2_security_scheme
    elif which == 'open_id_connect_security_scheme':
        return scheme.open_id_connect_security_scheme
    elif which == 'mtls_security_scheme':
        return scheme.mtls_security_scheme
    return None
gemini-code-assist (Contributor), severity: low

This function can be simplified by using getattr to dynamically access the oneof field. This makes the code more concise and easier to maintain if more security schemes are added in the future.

    which = scheme.WhichOneof('scheme')
    if which:
        return getattr(scheme, which)
    return None

- Fix agent_app.py Part access pattern for proto (HasField/direct access)
- Fix ClientEvent handling in E2E tests (StreamResponse, not Task)
- Fix notifications_app.py to serialize proto Task to dict
- Update SetTaskPushNotificationConfigRequest interface in handlers
- Update default_request_handler to use request.parent instead of name
- Update jsonrpc_handler to pass full request to handler
- Update unit tests to use SetTaskPushNotificationConfigRequest

Signed-off-by: Luca Muscariello <[email protected]>
buf.gen.yaml Outdated
    inputs:
      - git_repo: https://github.com/a2aproject/A2A.git
-       ref: main
+       ref: transports
Reviewer comment:

We need to change this back to "main" to pick up the latest fixes from a2a.proto, and before we merge we need to switch to the tag for v1.0 (when it's cut).

-   scheme_def = scheme_def_union.root
+   scheme_def = _get_security_scheme_value(scheme_def_union)
    if not scheme_def:
        continue
Reviewer comment:

Personal preference would be to use the functions available from proto to do the matching below instead of ending up with a non-statically typed function here. E.g.

    match scheme.WhichOneof('scheme'):
        case 'http_auth_security_scheme' if scheme.http_auth_security_scheme.lower() == 'bearer':
            ...

tbh I think it would read better as a set of if blocks:

scheme = agent_card.security_schemes.get(scheme_name)

if scheme.HasField('http_auth_security_scheme') and scheme.http_auth_security_scheme.lower() == 'bearer':
   ... 

if scheme.HasField('oauth2_security_scheme') or scheme.HasField('open_id_connect_security_scheme'):
   ...

Member Author (@muscariello):

Not sure how HasField improves things here.

Reviewer comment:

As explained on our call, it's mostly about readability, but it also stops us from using reflection to check the Python types; instead we can just rely on the boolean check of "does the request have this field set" and then use it.

Member Author (@muscariello):

Check now @Tehsmash

Reviewer comment:

I don't seem to be able to resolve my own threads on the a2a repos, but I'll leave a LGTM if I think it should be resolved.


# Alias for backward compatibility - the proto uses SendMessageRequest
# where old code might use MessageSendParams
from a2a.types.a2a_pb2 import SendMessageRequest as MessageSendParams
Reviewer comment:

Let's not do this; any code that is already using a2a.types.MessageSendParams would need to change its imports anyway, and I think this will be confusing.

Member Author (@muscariello):

This is a quick workaround that is in any case supposed to be removed before the PR is complete.

Reviewer comment:

LGTM

)

# Alias for streaming - same as SendMessageRequest in the proto
SendStreamingMessageRequest = SendMessageRequest = MessageSendParams
Reviewer comment:

-1, let's just update the code to use SendMessageRequest.

Member Author (@muscariello):

See above.

Reviewer comment:

LGTM

"""Transport protocol string constants for backward compatibility."""
jsonrpc = "JSONRPC"
http_json = "HTTP+JSON"
grpc = "GRPC"
Reviewer comment:

In the client code I add these as constants; if we want to put them somewhere common, I would add a "constants.py" / constants module outside of "types".

Member Author (@muscariello):

To move this elsewhere I would have to solve a circular dependency first. There needs to be additional refactoring work before I can do that.

Member Author (@muscariello):

I am moving things from the refactor out of extras.py until it is empty. Then I will remove the file. A few more commits I think.

Reviewer comment:

LGTM

Comment on lines 168 to 297
# Union of all A2A error types
A2AError = Union[
    JSONRPCError,
    JSONParseError,
    InvalidRequestError,
    MethodNotFoundError,
    InvalidParamsError,
    InternalError,
    TaskNotFoundError,
    TaskNotCancelableError,
    PushNotificationNotSupportedError,
    UnsupportedOperationError,
    ContentTypeNotSupportedError,
    InvalidAgentResponseError,
    AuthenticatedExtendedCardNotConfiguredError,
]


class JSONRPCRequest(A2ABaseModel):
    """Represents a JSON-RPC 2.0 Request object."""

    jsonrpc: Literal["2.0"] = "2.0"
    method: str
    params: Any | None = None
    id: str | int | None = None


class JSONRPCResponse(A2ABaseModel):
    """Represents a JSON-RPC 2.0 Success Response object."""

    jsonrpc: Literal["2.0"] = "2.0"
    result: Any
    id: str | int | None = None


class JSONRPCErrorResponse(A2ABaseModel):
    """Represents a JSON-RPC 2.0 Error Response object."""

    jsonrpc: Literal["2.0"] = "2.0"
    error: A2AError
    id: str | int | None = None


# Type alias for A2A requests (union of all request types)
# This maps to the various request message types in the proto
from a2a.types.a2a_pb2 import (
    CancelTaskRequest,
    GetExtendedAgentCardRequest,
    GetTaskPushNotificationConfigRequest,
    GetTaskRequest,
    SendMessageRequest,
    SetTaskPushNotificationConfigRequest,
    SubscribeToTaskRequest,
)

A2ARequest = Union[
    SendMessageRequest,
    GetTaskRequest,
    CancelTaskRequest,
    SetTaskPushNotificationConfigRequest,
    GetTaskPushNotificationConfigRequest,
    SubscribeToTaskRequest,
    GetExtendedAgentCardRequest,
]


# JSON-RPC Success Response types
# These wrap the result of successful RPC calls
# Note: result is typed as Any to allow both proto messages and dicts
class GetTaskSuccessResponse(A2ABaseModel):
    """Success response for GetTask RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class CancelTaskSuccessResponse(A2ABaseModel):
    """Success response for CancelTask RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class SendMessageSuccessResponse(A2ABaseModel):
    """Success response for SendMessage RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class SendStreamingMessageSuccessResponse(A2ABaseModel):
    """Success response for streaming message RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any  # Streaming events


class SetTaskPushNotificationConfigSuccessResponse(A2ABaseModel):
    """Success response for SetTaskPushNotificationConfig RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class GetTaskPushNotificationConfigSuccessResponse(A2ABaseModel):
    """Success response for GetTaskPushNotificationConfig RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class ListTaskPushNotificationConfigSuccessResponse(A2ABaseModel):
    """Success response for ListTaskPushNotificationConfig RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any


class DeleteTaskPushNotificationConfigSuccessResponse(A2ABaseModel):
    """Success response for DeleteTaskPushNotificationConfig RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: None = None


class GetAuthenticatedExtendedCardSuccessResponse(A2ABaseModel):
    """Success response for GetAuthenticatedExtendedCard RPC."""
    jsonrpc: Literal["2.0"] = "2.0"
    id: str | int | None = None
    result: Any  # AgentCard
Reviewer comment:

Let's not add our own implementation of this. In the client code I pull in the "jsonrpc" package from PyPI, which has the JSONRPC20Response type already defined; then we only need to concern ourselves with populating the result or error fields.
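
As a rough sketch of that approach (the json-rpc package's exact constructor arguments, here _id and result, and the .json property are assumptions and may differ):

    from google.protobuf.json_format import MessageToDict
    from jsonrpc.jsonrpc2 import JSONRPC20Response  # "json-rpc" package from PyPI

    def to_jsonrpc_response(request_id, task):
        # Populate only the result field; an error response would set `error` instead.
        return JSONRPC20Response(
            _id=request_id,
            result=MessageToDict(task, preserving_proto_field_name=True),
        ).json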

Reviewer comment:

LGTM

@muscariello muscariello marked this pull request as draft December 1, 2025 13:53
# A2ARequest is now a Union type of proto messages, so we can't use
# model_json_schema. Instead, we just mark it as added without
# adding the schema since proto types don't have Pydantic schemas.
# The OpenAPI schema will still be functional for the endpoints.
Reviewer comment:

LOL ... "the code doesn't support this any more so we'll just mark this test as passed anyway"

In all seriousness this should be replaced by the generated a2a.json schema I think.

Member Author (@muscariello):

I am not using a2a.json at all, so I need to rethink this test.

Reviewer comment:

What I meant is that if fastapi wants to serve an OpenAPI schema, then we need to use the a2a.json instead of generating one from the python types.

    'tasks/pushNotificationConfig/delete': DeleteTaskPushNotificationConfigRequest,
    'tasks/resubscribe': TaskResubscriptionRequest,
    'agent/authenticatedExtendedCard': GetExtendedAgentCardRequest,
}
Reviewer comment:

The method names have changed to align with the gRPC ones, i.e., message/send is now "SendMessage". We need to fix that in the client and here.

Member Author (@muscariello):

I think I have fixed this everywhere now. I will keep checking...

Reviewer comment:

LGTM

JSONRPCError,
JSONRPCErrorResponse,
JSONRPCRequest,
JSONRPCResponse,
Reviewer comment:

Let's pull in the jsonrpc dependency so that we can remove these from the types.

Reviewer comment:

LGTM

id=request_id, error=error
)

return self._create_response(context, handler_result)
Reviewer comment:

I think we should revisit this code and match on the JSON-RPC method first, then perform the correct decoding, as there are body types like "SendMessageRequest" that are reused for different methods.

For example:

    async def _process_request(self, request: JSONRPC20Request):
        match request.method:
            case "SendMessage":
                input: SendMessageRequest = ParseDict(request.params, SendMessageRequest())
                result = await self.handler.on_send_message(input)
            case "CancelTask":
                input: CancelTaskRequest = ParseDict(request.params, CancelTaskRequest())
                ...
    ...

Reviewer comment:

I still think this would be cleaner than doing METHOD_TO_MODEL first and matching on the classes after the fact, and it makes it more obvious IMO that the final/default case of the "match" would be "MethodNotFound".

"""
# Create task manager and validate existing task
# Proto empty strings should be treated as None
task_id = params.request.task_id or None
Reviewer comment:

Need to confirm if "task_id" should be just the UUID, or should be in the form "tasks/".
I think AIP for this said that it should be "tasks/" because this is the FQN for the task.
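
A small sketch of the two options being weighed; the fully qualified form ("tasks/{task_id}") is an assumption based on AIP-style resource names, not something confirmed in this thread:

    # Hypothetical helpers for switching between the two representations.
    def to_task_name(task_id: str) -> str:
        return f"tasks/{task_id}"

    def task_id_from_name(name: str) -> str:
        # Accept either a bare id or the fully qualified resource name.
        return name.removeprefix("tasks/")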

TaskNotCancelableError,
TaskNotFoundError,
UnsupportedOperationError,
)
Reviewer comment:

We should move all these errors to errors.py instead of extras.

Reviewer comment:

LGTM

        A list of dictionaries containing the data from any `DataPart` objects found.
    """
-   return [part.root.data for part in parts if isinstance(part.root, DataPart)]
+   from google.protobuf.json_format import MessageToDict
Reviewer comment:

No need for an inline import; let's move this to the top of the file.

Reviewer comment:

LGTM

Reviewer comment:

We can remove this file and its classes/functions, as we're using the proto types everywhere now.

Reviewer comment:

Most of the stuff from here is gone; I think we can nuke the helper function here if we fix the response from the default_request_handler.

    for part in request.parts:
-       if isinstance(part.root, TextPart) and not part.root.text:
+       if part.text is not None and not part.text:
            raise ValueError('TextPart content cannot be empty')
Reviewer comment:

Suggested change:
-   if part.text is not None and not part.text:
+   if part.HasField('text') and not part.text:

Comment on lines +99 to +100
del task_copy.history[:]
task_copy.history.extend(limited_history)
Reviewer comment:

Suggested change:
-   del task_copy.history[:]
-   task_copy.history.extend(limited_history)
+   task_copy.history[:] = limited_history

- Fix E402: Move telemetry import to top in default_request_handler.py
- Fix TRY300/RET504: Return directly in grpc_handler.py try blocks
- Fix TRY004: Add noqa for valid ValueError in database_push_notification_config_store.py
- Fix pyright: Add else branch for unbound client_event in base_client.py
- Fix pyright: Add cast for rpc_request.data in jsonrpc.py transport

All linter checks now pass:
- ruff check: 0 errors
- ruff format: 78 files formatted
- mypy: 0 errors in 78 files
- pyright: 0 errors, 0 warnings

All 730 tests pass (including PostgreSQL and MySQL database tests)
@muscariello muscariello changed the title feat!: Migrate from Pydantic types to protobuf-generated types feat!: upgrade SDK to A2A 1.0 specs Dec 1, 2025


-def _get_security_scheme_value(scheme: SecurityScheme):
+def _get_security_scheme_value(scheme: SecurityScheme) -> _SecuritySchemeValue:
Reviewer comment:

I think this change has just proven my point about moving this inline in the function below and changing how the "match" is done.

    task.history.append(task.status.message)
    if event.metadata:
-       task.metadata.update(event.metadata)
+       task.metadata.update(dict(event.metadata))  # type: ignore[arg-type]
Reviewer comment:

We should use task.metadata.MergeFrom/CopyFrom here depending on the desired behaviour
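
For illustration, the difference between the two (assuming both metadata fields are protobuf Struct messages, which the thread implies but does not state):

    # Sketch only: pick one of the two depending on the desired behaviour.
    task.metadata.CopyFrom(event.metadata)   # replace the existing metadata entirely
    task.metadata.MergeFrom(event.metadata)  # merge: keys missing from event.metadata are kept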

…ry directly

- Update JSONRPCHandler to return dict[str, Any] instead of Pydantic RootModels
- Update response_helpers to build dicts with JSON-RPC 2.0 structure
- Remove unused Pydantic response types from types module
- Fix proto dependency loading in a2a_pb2.py
- Update all tests to check dict responses instead of Pydantic models
- Add TransportProtocol constants to utils module
@holtskinner holtskinner changed the title feat!: upgrade SDK to A2A 1.0 specs refactor!: upgrade SDK to A2A 1.0 specs Dec 1, 2025
…iases

- Rename on_resubscribe_to_task to on_subscribe_to_task across all handlers
- Update METHOD_TO_MODEL with gRPC-style method names (SendMessage, GetTask, etc.)
- Update JSON-RPC client to use new method names
- Fix ListTaskPushNotificationConfigResponse to use 'configs' field
- Remove TaskResubscriptionRequest alias from extras.py
- Update TransportProtocol imports to use a2a.utils.constants
- Fix on_get_task_push_notification_config params type
- Update all tests for new method names and response types
…s.py

- Moved all error type definitions (JSONRPCError, A2AError, etc.) to utils/errors.py
- Updated all imports across 15 files to use a2a.utils.errors
- Defined A2ARequest union inline in types/__init__.py
- Deleted the now-redundant extras.py file
- Re-exported error types from types/__init__.py for public API
-       or agent_card.security is None
-       or agent_card.security_schemes is None
+       or not agent_card.security
+       or not agent_card.security_schemes
Reviewer comment:

Double checking the protobuf docs (https://protobuf.dev/reference/python/python-generated/#embedded_message), it sounds like we may need to use agent_card.HasField("..."); from what I can tell, this distinguishes between agent_card.security being set, not set, and set but empty.

-   scheme_def = scheme_def_union.root
+   scheme_def = _get_security_scheme_value(scheme_def_union)
    if not scheme_def:
        continue
Reviewer comment:

I don't seem to be able to resolve my own threads on the a2a repos, but I'll leave a LGTM if I think it should be resolved.

Comment on lines +12 to +127
class A2ABaseModel(BaseModel):
    """Base model for all A2A SDK types."""

    model_config = {
        'extra': 'allow',
        'populate_by_name': True,
        'arbitrary_types_allowed': True,
    }


# JSON-RPC Error types - A2A specific error codes
class JSONRPCError(A2ABaseModel):
    """Represents a JSON-RPC 2.0 Error object."""

    code: int
    """A number that indicates the error type that occurred."""
    message: str
    """A string providing a short description of the error."""
    data: Any | None = None
    """Additional information about the error."""


class JSONParseError(A2ABaseModel):
    """JSON-RPC parse error (-32700)."""

    code: Literal[-32700] = -32700
    message: str = 'Parse error'
    data: Any | None = None


class InvalidRequestError(A2ABaseModel):
    """JSON-RPC invalid request error (-32600)."""

    code: Literal[-32600] = -32600
    message: str = 'Invalid Request'
    data: Any | None = None


class MethodNotFoundError(A2ABaseModel):
    """JSON-RPC method not found error (-32601)."""

    code: Literal[-32601] = -32601
    message: str = 'Method not found'
    data: Any | None = None


class InvalidParamsError(A2ABaseModel):
    """JSON-RPC invalid params error (-32602)."""

    code: Literal[-32602] = -32602
    message: str = 'Invalid params'
    data: Any | None = None


class InternalError(A2ABaseModel):
    """JSON-RPC internal error (-32603)."""

    code: Literal[-32603] = -32603
    message: str = 'Internal error'
    data: Any | None = None


class TaskNotFoundError(A2ABaseModel):
    """A2A-specific error for task not found (-32001)."""

    code: Literal[-32001] = -32001
    message: str = 'Task not found'
    data: Any | None = None


class TaskNotCancelableError(A2ABaseModel):
    """A2A-specific error for task not cancelable (-32002)."""

    code: Literal[-32002] = -32002
    message: str = 'Task cannot be canceled'
    data: Any | None = None


class PushNotificationNotSupportedError(A2ABaseModel):
    """A2A-specific error for push notification not supported (-32003)."""

    code: Literal[-32003] = -32003
    message: str = 'Push Notification is not supported'
    data: Any | None = None


class UnsupportedOperationError(A2ABaseModel):
    """A2A-specific error for unsupported operation (-32004)."""

    code: Literal[-32004] = -32004
    message: str = 'This operation is not supported'
    data: Any | None = None


class ContentTypeNotSupportedError(A2ABaseModel):
    """A2A-specific error for content type not supported (-32005)."""

    code: Literal[-32005] = -32005
    message: str = 'Incompatible content types'
    data: Any | None = None


class InvalidAgentResponseError(A2ABaseModel):
    """A2A-specific error for invalid agent response (-32006)."""

    code: Literal[-32006] = -32006
    message: str = 'Invalid agent response'
    data: Any | None = None


class AuthenticatedExtendedCardNotConfiguredError(A2ABaseModel):
    """A2A-specific error for authenticated extended card not configured (-32007)."""

    code: Literal[-32007] = -32007
    message: str = 'Authenticated Extended Card is not configured'
    data: Any | None = None
Reviewer comment:

The A2A exceptions here shouldn't carry JSONRPC-specific data; the JSONRPC server/client components should have a map from the exceptions to the JSONRPC codes, and vice versa for raising an exception on receipt of a JSONRPCError.

JSONRPC-specific exceptions like MethodNotFound should live in the JSONRPC implementation code or in a separate JSONRPC common package, IMO.
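
A rough sketch of the mapping being suggested; the error codes come from the classes above, while the table itself and its names are hypothetical:

    # Kept inside the JSON-RPC transport layer only; the rest of the SDK raises
    # transport-agnostic exceptions.
    A2A_EXCEPTION_TO_JSONRPC_CODE = {
        TaskNotFoundError: -32001,
        TaskNotCancelableError: -32002,
        PushNotificationNotSupportedError: -32003,
        UnsupportedOperationError: -32004,
        ContentTypeNotSupportedError: -32005,
        InvalidAgentResponseError: -32006,
        AuthenticatedExtendedCardNotConfiguredError: -32007,
    }
    # Inverse map, used when a JSONRPCError arrives over the wire.
    JSONRPC_CODE_TO_A2A_EXCEPTION = {
        code: exc for exc, code in A2A_EXCEPTION_TO_JSONRPC_CODE.items()
    }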

TaskNotCancelableError,
TaskNotFoundError,
UnsupportedOperationError,
)
Reviewer comment:

LGTM

    'tasks/pushNotificationConfig/delete': DeleteTaskPushNotificationConfigRequest,
    'tasks/resubscribe': TaskResubscriptionRequest,
    'agent/authenticatedExtendedCard': GetExtendedAgentCardRequest,
}
Reviewer comment:

LGTM

    async def on_resubscribe_to_task(
        self,
-       params: TaskIdParams,
+       params: CancelTaskRequest,
Reviewer comment:

LGTM

    if isinstance(self.pydantic_type, type) and issubclass(self.pydantic_type, ProtoMessage):
        return ParseDict(value, self.pydantic_type())  # type: ignore[return-value]
    # Assume it's a Pydantic model
    return self.pydantic_type.model_validate(value)  # type: ignore[union-attr]
Reviewer comment:

I think this is still relevant to models.py even though it's been disconnected. You can resolve this and I'll open a new comment.

Reviewer comment:

No need to support Pydantic here IMO; we just need to support the protobuf types.

One thought: we need to understand whether we need to support old messages/tasks stored in the DB and translate them to the protobuf types. Otherwise we may require people to drop the DBs for their agents.
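
If that backward compatibility turns out to be needed, one possible shape for it (entirely hypothetical; the helper name is made up) is a shim that parses legacy JSON rows into proto objects:

    from google.protobuf.json_format import ParseDict
    from a2a.types import Task  # import path shown for illustration only

    def load_legacy_task_row(raw_json: dict) -> Task:
        # Rows written by the old Pydantic-based SDK are plain JSON dicts; parse
        # them leniently so renamed/unknown fields do not break the upgrade.
        return ParseDict(raw_json, Task(), ignore_unknown_fields=True)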

        A list of dictionaries containing the data from any `DataPart` objects found.
    """
-   return [part.root.data for part in parts if isinstance(part.root, DataPart)]
+   from google.protobuf.json_format import MessageToDict
Reviewer comment:

LGTM

Reviewer comment:

Most of the stuff from here is gone; I think we can nuke the helper function here if we fix the response from the default_request_handler.

@holtskinner holtskinner changed the base branch from main to 1.0-a2a_proto_refactor December 2, 2025 16:49
@holtskinner holtskinner requested a review from lkawka December 2, 2025 16:50
Member Author (@muscariello):

@Tehsmash I am going to wait a few more days for the A2A 1.0 specs to get the most recent updates.



Development

Successfully merging this pull request may close these issues.

[Feat]: Upgrade A2A to v1.0

3 participants