
Structured Streamer

struct_strm (structured streamer) is a Python package that makes it easy to stream the partial JSON generated by LLMs as valid JSON responses. This enables partial rendering of UI components without waiting for the full response, drastically reducing the time to the first word on the user's screen.

Why Use Structured Streamer?

JSON is the standard format for structured responses from LLMs. In the early days of LLM structured generation, the JSON response could only be validated after the whole response had been returned. Modern approaches use constrained decoding to ensure that only valid JSON is generated, eliminating the need for post-generation validation and allowing the response to be used immediately. However, a streamed JSON response is incomplete while it is still arriving, so it can't be parsed with traditional methods. This library aims to make it easier to handle this partially generated JSON and provide a better end-user experience.
See the benchmarks section in the docs for more details about how this library can speed up your structured response processing.
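To see why a mid-stream payload can't be handled with traditional parsing, here is a minimal illustration using only the standard library (this snippet is not part of struct_strm):

import json

# A response captured mid-stream: the string and object are never closed.
partial = '{"form_fields": [{"field_name": "fruits", "field_placeholder": "app'

try:
    json.loads(partial)
except json.JSONDecodeError as err:
    print(f"Cannot parse yet: {err}")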


You can learn more about constrained decoding and context-free grammars here: XGrammar - Achieving Efficient, Flexible, and Portable Structured Generation with XGrammar

Installation

pip install struct-strm


Main Features

The primary feature is wrapping LLM outputs to produce valid incremental JSON from partial, invalid JSON based on user-provided structures. Effectively, this acts as a wrapper around your LLM calls. Because this library is primarily intended for use in web servers, it is expected to be used in async workflows and is async-first.
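Conceptually, the trick is to close whatever is still open in the partial JSON and let the defaults of the user-provided structure cover fields that haven't arrived yet. The sketch below only illustrates that idea and is not the library's actual implementation:

import json
from pydantic import BaseModel

class DefaultFormItem(BaseModel):
    field_name: str = ""
    field_placeholder: str = ""

# A fragment captured mid-stream, missing its closing brace.
partial = '{"field_name": "fruits"'

# Naively close the open object so the fragment parses...
repaired = json.loads(partial + "}")

# ...and let the model's defaults fill in fields that haven't streamed yet.
print(DefaultFormItem(**repaired))
# DefaultFormItem(field_name='fruits', field_placeholder='')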

The library also provides simple HTML templates that serve as examples of how you can integrate the streams into your own components.

Due to the nature of partial JSON streaming, there are "wrong" ways to stream responses that are not effective for partial rendering in the UI. The library also provides examples of tested ways to apply it so you get good results.

High Level Flow
(Diagram: high-level flow)

Example Component

This is an example of a form component being incrementally rendered. By using a structured response from an LLM, in this case a form with field names and field placeholders, we can stream the form results directly into an HTML component. This drastically reduces the time to first token and the perceived time a user has to wait. More advanced components are under development.

from struct_strm import parse_openai
from typing import List
from pydantic import BaseModel
from openai import AsyncOpenAI

...  # client = AsyncOpenAI(...) and the `messages` prompt are set up here

# The target structure: a form made up of fields with names and placeholders.
class DefaultFormItem(BaseModel):
    field_name: str = ""
    field_placeholder: str = ""

class DefaultFormStruct(BaseModel):
    form_fields: List[DefaultFormItem] = []


# Request a structured (constrained) response from OpenAI, streamed as it is generated.
stream_response = client.beta.chat.completions.stream(
    model="gpt-4.1",
    messages=messages,
    response_format=DefaultFormStruct,
    temperature=0.0,
)

# Wrap the raw stream so each partial chunk is parsed into a valid instance.
form_struct_response = parse_openai(DefaultFormStruct, stream_response)
async for instance in form_struct_response:
    async for formstruct in instance:
        print(formstruct)

Fully formed Python class instances are returned:

>>>  DefaultFormStruct(form_fields=[DefaultFormItem(field_name="fruits", field_placeholder="")])
>>>  DefaultFormStruct(form_fields=[DefaultFormItem(field_name="fruits", field_placeholder="apple ")])
>>>  DefaultFormStruct(form_fields=[DefaultFormItem(field_name="fruits", field_placeholder="apple orange strawberry")])
>>>  etc....

And the corresponding incomplete JSON strings in the stream would have looked like:

>>> "{"form_fields": [{"field_name": "fruits"
>>> "{"form_fields": [{"field_name": "fruits", "field_placeholder": "apple "
>>> "{"form_fields": [{"field_name": "fruits", "field_placeholder": "apple orange strawberry"}
>>> etc...

Component Streaming

The structured responses can then be easily used to generate incrementally rendered web components.
For example, this form:

Example Form Streaming


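As an illustration of how a stream of DefaultFormStruct instances (from the example above) might drive such a component, one hypothetical approach, not part of struct_strm's templates, is to re-render a small HTML fragment on every update and push it to the browser:

from html import escape

def render_form(form: DefaultFormStruct) -> str:
    # Rebuild the fragment on every streamed update; fields appear as they arrive.
    inputs = "".join(
        f'<label>{escape(item.field_name)}'
        f'<input type="text" placeholder="{escape(item.field_placeholder)}"></label>'
        for item in form.form_fields
    )
    return f"<form>{inputs}</form>"

# Inside the streaming loop from the earlier example:
#     async for formstruct in instance:
#         fragment = render_form(formstruct)
#         # push `fragment` to the browser, e.g. over SSE or a WebSocket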
Or we can return data in a grid in more interesting ways.
For example, this rubric:

Example Rubric Streaming

Other

I started struct_strm to support another project I'm working on, which provides an easy entry point for teachers to use LLM tools in their workflows. Check it out if you're interested - Teachers PET