
Python SDK v0.5: AI, Pydantic, and more
The latest version of the Inngest Python SDK is now stable and ready for production use.
Aaron Harper · 6/23/2025 · 3 min read
We are thrilled to announce the release of the Inngest Python SDK v0.5. This release brings a number of new features and improvements to the SDK, including:
- First-class Pydantic support in step and function output
- Experimental AI orchestration with step.ai.infer
- Connect is stable
- Python 3.13 support
- Event-sending retries
- Function singletons
- Function timeouts
- Improved parallel step performance
New features
Pydantic
Pydantic is a popular library for runtime data validation and serialization. Inngest now supports Pydantic models as step and function output. You can read more about Pydantic support in the Pydantic guide.
First, ensure that the Pydantic serializer is set on the Inngest client:
import inngest

client = inngest.Inngest(
    app_id="my-app",
    serializer=inngest.PydanticSerializer(),
)
Then you can specify the output_type parameter for step.run:
import inngest
import pydantic

class User(pydantic.BaseModel):
    name: str

async def get_user() -> User:
    return User(name="Alice")

@client.create_function(
    fn_id="my-fn",
    trigger=inngest.TriggerEvent(event="my-event"),
)
async def my_fn(ctx: inngest.Context) -> None:
    # user is a Pydantic User both at runtime and for static type checkers
    user = await ctx.step.run("get-user", get_user, output_type=User)
If you want to return a Pydantic object from a function, you can set the output_type parameter on the function:
@client.create_function(
    fn_id="my-fn",
    output_type=User,
    trigger=inngest.TriggerEvent(event="my-event"),
)
async def my_fn(ctx: inngest.Context) -> User:
    return User(name="Alice")
AI orchestration
The Python SDK now supports the same step.ai.infer AI orchestration as the TypeScript SDK. You can read more about AI orchestration in the AI orchestration guide.
import inngest
from inngest.experimental import ai

inngest_client = inngest.Inngest(app_id="my-app")

@inngest_client.create_function(
    fn_id="hello-world",
    trigger=inngest.TriggerEvent(event="say-hello"),
)
async def hello(ctx: inngest.Context) -> object:
    res = await ctx.step.ai.infer(
        "say-hello",
        adapter=ai.anthropic.Adapter(
            auth_key="sk-ant-000000",
            model="claude-3-5-sonnet-latest",
        ),
        body={
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": "Hello, how are you?"}],
        },
    )
    return res["content"][0]["text"]
Note that AI orchestration is still experimental. It's ready for production use, but the interface may change.
Connect
Connect's Python API is now stable. The feature itself is still in developer preview across all languages, but the Python API is stable. You can read more about Connect in the Connect guide.
Python 3.13
Python 3.13 is now supported. Note that Python 3.9 is no longer supported.
Event-sending retries
Whenever an event is sent using our client or step.send, the SDK will now retry the request if it fails. Idempotency is automatic, so you don't need to worry about duplicate events.
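For example, a plain client.send call now gets these retries for free; the event name and payload below are just illustrative:
import inngest

client = inngest.Inngest(app_id="my-app")

async def notify_user(user_id: str) -> None:
    # If the send request fails, the SDK retries it. Idempotency is handled
    # automatically, so retries will not produce duplicate events.
    await client.send(
        inngest.Event(name="user-notified", data={"user_id": user_id}),
    )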
Function singletons
Function singletons are now supported. You can read more about function singletons in the function singletons guide.
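As a rough sketch only (the singleton parameter and inngest.Singleton options below are assumed names for illustration, not the confirmed API), a singleton function might look like this; see the function singletons guide for the real interface:
import inngest

# Illustrative sketch: `singleton` and `inngest.Singleton` are assumed names.
@client.create_function(
    fn_id="sync-account",
    singleton=inngest.Singleton(
        key="event.data.account_id",  # at most one run per account
        mode="skip",  # skip a new run while one is already in flight
    ),
    trigger=inngest.TriggerEvent(event="account-sync"),
)
async def sync_account(ctx: inngest.Context) -> None:
    ...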
Function timeouts
Function timeouts are now supported. You can read more about function timeouts in the function timeouts guide.
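Again as a sketch (the timeouts parameter and inngest.Timeouts below are assumed names for illustration), timeouts would be declared on the function; check the function timeouts guide for the actual interface:
import datetime

import inngest

# Illustrative sketch: `timeouts` and `inngest.Timeouts` are assumed names.
@client.create_function(
    fn_id="generate-report",
    timeouts=inngest.Timeouts(
        start=datetime.timedelta(minutes=10),  # max time waiting to start
        finish=datetime.timedelta(hours=1),  # max time from start to finish
    ),
    trigger=inngest.TriggerEvent(event="report-requested"),
)
async def generate_report(ctx: inngest.Context) -> None:
    ...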
Improved parallel step performance
The number of requests required to perform parallel steps is reduced by up to 50%.
Breaking changes
See the migration guide for more details.