Let’s explore how A2A works in practice. In this blog post I’m demonstrating the basic usage of A2A, without using any AI. 🙂
Please note that this is a purely technical view; the challenges of building agents are not necessarily technical in nature. Nonetheless, I hope this post helps you get a basic understanding of A2A.
A2A
The A2A (Agent2Agent) protocol helps us connect agents and enables them to talk to each other. It essentially allows us to expose our agents as endpoints, ready to accept requests. In a nutshell, that is it. Consequently, we will need a server and a client.
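To make “expose our agents as endpoints” a bit more concrete: under the hood A2A speaks JSON-RPC over HTTP. The sketch below shows roughly what a client posts to such an endpoint; the method and field names are illustrative and may differ between protocol versions, so treat it as an assumption rather than the exact wire format.

# illustrative only: a JSON-RPC style payload a client might POST to the
# agent's endpoint (method and field names vary between A2A protocol versions)
a2a_request = {
    'jsonrpc': '2.0',
    'id': '1',
    'method': 'message/send',
    'params': {
        'message': {
            'role': 'user',
            'parts': [{'kind': 'text', 'text': 'Hello a2a!'}],
            'messageId': 'some-unique-id',
        },
    },
}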
Server
There are many different frameworks for building agents. We have already looked at quite a few on this blog: Langgraph, Semantic Kernel, Bee AI, …
To put it really simply: A2A is a wrapper around your agent, written in whatever framework you prefer. A2A accepts a request and forwards it to your function, where you can process it.
Let’s first take a look at our “agent”. In this example we are not using any of the frameworks; we just have a simple class whose only job is to say hi:
class SayHiAgent:
    """Hello World Agent."""

    async def invoke(self) -> str:
        return 'Hi there'
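If you want to sanity-check the bare agent before any A2A comes into play, a minimal sketch:

import asyncio

# run the plain agent once, without any A2A machinery involved
print(asyncio.run(SayHiAgent().invoke()))  # prints: Hi there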
So far, so good; this could be your Langgraph agent. Now let’s wrap our A2A server around this simple agent:
# a2a-sdk imports (paths as in the SDK's hello world sample)
from a2a.server.agent_execution import AgentExecutor, RequestContext
from a2a.server.events import EventQueue
from a2a.utils import new_agent_text_message


class SayHiAgentExecutor(AgentExecutor):
    """Executor for the Hello World Agent."""

    def __init__(self):
        self.agent = SayHiAgent()

    async def execute(
        self,
        context: RequestContext,
        event_queue: EventQueue,
    ) -> None:
        # invoke the wrapped agent and send its answer back as a text message
        result = await self.agent.invoke()
        await event_queue.enqueue_event(new_agent_text_message(result))

    async def cancel(
        self, context: RequestContext, event_queue: EventQueue
    ) -> None:
        pass
- we instantiate the simple agent
- the execute and cancel functions must be implemented by the agent executor; the execute function is invoked upon a client request. The only thing we do here is invoke our hello world agent, take the result, and send it back to the client (see the sketch after this list for a cancel variant that fails explicitly)
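As referenced above, if you prefer the cancel path to fail explicitly rather than silently do nothing, a minimal sketch inside SayHiAgentExecutor could look like this (a real agent would abort its in-flight work here):

    async def cancel(
        self, context: RequestContext, event_queue: EventQueue
    ) -> None:
        # this sketch only signals that cancellation is not supported;
        # an agent with long-running tasks would stop them here instead
        raise Exception('cancel not supported')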
Next comes the most complicated bit: putting it all together and starting an actual server:
- Define the agent card with the skills
- Wire it all up and start the server
import uvicorn

# a2a-sdk imports (paths as in the SDK's hello world sample)
from a2a.server.apps import A2AStarletteApplication
from a2a.server.request_handlers import DefaultRequestHandler
from a2a.server.tasks import InMemoryTaskStore
from a2a.types import AgentCapabilities, AgentCard, AgentSkill


if __name__ == '__main__':
    # the skill describes what the agent can do
    skill = AgentSkill(
        id='say_hi',
        name='Says hi',
        description='just says hi',
        tags=['hello world'],
        examples=['hi', 'hello world'],
    )

    # the agent card is the public, discoverable description of the agent
    public_agent_card = AgentCard(
        name='Say Hi Agent',
        description='Say hi agent',
        url='http://localhost:9999/',
        version='1.0.0',
        defaultInputModes=['text'],
        defaultOutputModes=['text'],
        capabilities=AgentCapabilities(),
        skills=[skill],
    )

    # wire the executor into a request handler and start the server
    request_handler = DefaultRequestHandler(
        agent_executor=SayHiAgentExecutor(),
        task_store=InMemoryTaskStore(),
    )
    server = A2AStarletteApplication(
        agent_card=public_agent_card,
        http_handler=request_handler,
    )
    uvicorn.run(server.build(), host='0.0.0.0', port=9999)
The server can then be run in the terminal; the output will look something like this:

That’s it, we are ready to create a client application that connects to the server. Feels a bit like exposing a simple REST endpoint, right? 🙂
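Before we build that client, a quick way to check that the server is reachable is to fetch its agent card. The well-known path below is an assumption (it has varied across A2A versions, e.g. agent.json vs. agent-card.json), so adjust it to your setup:

import asyncio

import httpx


async def fetch_agent_card() -> None:
    # the well-known path is an assumption and may differ between A2A versions
    async with httpx.AsyncClient() as http:
        resp = await http.get('http://localhost:9999/.well-known/agent.json')
        print(resp.json())  # should show the 'Say Hi Agent' card


if __name__ == '__main__':
    asyncio.run(fetch_agent_card())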
Client
The client is even simpler to build; of course, this only demonstrates the basic setup. Nothing out of the ordinary happens here: we construct a client to talk to the A2A server, send a request, and await and print the response.
from typing import Any
from uuid import uuid4

import httpx

# a2a-sdk imports (paths as in the SDK's hello world sample)
from a2a.client import A2AClient
from a2a.types import MessageSendParams, SendMessageRequest


async def main() -> None:
    async with httpx.AsyncClient() as httpx_client:
        client = A2AClient(
            httpx_client=httpx_client, url='http://localhost:9999',
        )

        # a single user message with one text part
        send_message_payload: dict[str, Any] = {
            'message': {
                'role': 'user',
                'parts': [
                    {'kind': 'text', 'text': 'Hello a2a!'}
                ],
                'messageId': uuid4().hex,
            },
        }
        request = SendMessageRequest(
            id=str(uuid4()), params=MessageSendParams(**send_message_payload)
        )

        # send the request and print the server's response
        response = await client.send_message(request)
        print(response)


if __name__ == '__main__':
    import asyncio

    asyncio.run(main())
And when we run the client, we can see the server responding!
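If the raw print is too verbose, the response types in the a2a-sdk are pydantic models, so you can dump them to plain Python structures first; a small sketch (verify the method against your installed SDK version):

# instead of print(response): dump the pydantic model to a plain dict
# (model_dump with mode='json' is a pydantic v2 method)
print(response.model_dump(mode='json', exclude_none=True))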

Conclusion
Of course, there is more to A2A, specifically the mechanics around task management. But I hope this simple example, which I derived from the hello world code in the a2a repository, demonstrates the framework and helps make it more accessible to engineers. For me, it is very similar to writing a JEE servlet to expose some API.