
[Issue]: Is there any example of SSE for streaming? #3029

Open
yidasanqian opened this issue Jun 27, 2024 · 4 comments

Comments

@yidasanqian

Describe the issue

Using FastAPI or Flask?

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response

@LittleLittleCloud
Collaborator

python/non-python? server-side/client side?

@yidasanqian
Author

> python/non-python? server-side/client side?

Python, server-side

@LittleLittleCloud
Collaborator

LittleLittleCloud commented Jul 2, 2024

@yidasanqian I currently only have a dotnet/server example for an SSE endpoint. The gist is to set the response header to text/event-stream and return the data in SSE event format:

event: your-event-name
data: your-data...

https://github.com/LittleLittleCloud/Agent-ChatRoom/blob/b639c7b5e0781ada4ea5d413d1c840534d2971a9/ChatRoom/ChatRoom.Common/Controller/ChatRoomClientController.cs#L241.

I also found an example of a server-side SSE implementation using FastAPI; there you can see that the header is likewise set to text/event-stream and the payload uses the same data: ... format. Hope this helps a little.

https://medium.com/@nandagopal05/server-sent-events-with-python-fastapi-f1960e0c8e4b
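For reference, a minimal FastAPI sketch of the same idea (endpoint path, event name, and payload are illustrative, not taken from the thread):

import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

async def event_generator():
    # Emit each chunk in SSE wire format: "event:"/"data:" lines followed by a blank line.
    for i in range(10):
        yield f"event: message\ndata: chunk {i}\n\n"
        await asyncio.sleep(0.5)

@app.get("/stream")
async def stream():
    # text/event-stream tells the client (e.g. EventSource) to treat the response as SSE.
    return StreamingResponse(event_generator(), media_type="text/event-stream")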

@yidasanqian
Author

@LittleLittleCloud I have two agents; how can I stream the conversation between them through FastAPI?

from autogen import AssistantAgent, UserProxyAgent

# DEFAULT_SYSTEM_MESSAGE, llm_config, code_executor and message are defined elsewhere in my app
assistant = AssistantAgent(
    name="assistant",
    system_message=DEFAULT_SYSTEM_MESSAGE,
    llm_config=llm_config,
)

user_agent = UserProxyAgent(
    name="user_agent",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config={"executor": code_executor},
)

chat_res = user_agent.initiate_chat(
    recipient=assistant,
    message=message,
    summary_method="last_msg",
)
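One possible approach, sketched here under assumptions rather than confirmed in this thread: register a reply hook on both agents that pushes every received message onto a queue, run initiate_chat in a background thread, and stream the queue out of a FastAPI endpoint as SSE. The endpoint path, event name, and queue handling below are illustrative.

import json
import queue
import threading

from autogen import Agent
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get("/chat/stream")
def chat_stream(message: str):
    message_queue: queue.Queue = queue.Queue()

    def forward_message(recipient, messages=None, sender=None, config=None):
        # Runs whenever recipient is about to reply; capture the message it just received.
        if messages:
            message_queue.put({"sender": sender.name, "content": messages[-1].get("content", "")})
        return False, None  # not final: let the normal reply flow continue

    # assistant and user_agent are the agents created above; in a real app they
    # should probably be created per request rather than shared across requests.
    assistant.register_reply([Agent, None], forward_message, position=0)
    user_agent.register_reply([Agent, None], forward_message, position=0)

    def run_chat():
        user_agent.initiate_chat(recipient=assistant, message=message, summary_method="last_msg")
        message_queue.put(None)  # sentinel: conversation finished

    threading.Thread(target=run_chat, daemon=True).start()

    def event_stream():
        while True:
            item = message_queue.get()
            if item is None:
                break
            yield f"event: message\ndata: {json.dumps(item)}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")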
