Streamed response gets buffered and returned in one chunk #903

Open
danharrin opened this issue Jun 4, 2024 · 15 comments

danharrin commented Jun 4, 2024

Octane Version

2.3.12

Laravel Version

11.9.2

PHP Version

8.3.0

What server type are you using?

Roadrunner

Server Version

2024.1.2

Database Driver & Version

No response

Description

When streaming a response from a route behind Octane, the response gets buffered and arrives all at once instead of as it is generated.

Currently, wire:stream in Livewire is completely broken when using Laravel Octane, but the issue is not specific to Livewire; the reproduction below uses only native Laravel and vanilla JavaScript.

Steps To Reproduce

I have created a minimal reproduction repository. Follow the instructions in the README to observe the behaviour of artisan serve as opposed to Octane. There is a small amount of JavaScript in welcome.blade.php that sends the request and outputs the chunks in real time.

Here is the response; note the sleep(1) calls, which simulate an artificial delay between stream chunks. When using artisan serve, there is a one-second delay between each number appearing. When using Octane, there is a five-second delay before all the numbers appear at once.

return response()->stream(function () {
    echo 1;

    ob_flush();
    flush();
    sleep(1);

    echo 2;

    ob_flush();
    flush();
    sleep(1);

    echo 3;

    ob_flush();
    flush();
    sleep(1);

    echo 4;

    ob_flush();
    flush();
    sleep(1);

    echo 5;
}, 200, [
    'Content-Type' => 'text/html; charset=utf-8;',
    'Cache-Control' => 'no-cache',
    'X-Accel-Buffering' => 'no',
]);

gazzoy (Contributor) commented Jun 4, 2024

@barryvdh do you happen to have any insights on this?

Thought you might have some ideas, since I found that you previously worked on StreamedResponse handling in #151.

barryvdh (Contributor) commented Jun 4, 2024

Yeah, well, the 'fix' was to wait until the stream was finished. At the time, I didn't find a way to pass along the actual stream.
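
For context, a minimal sketch of what that workaround amounts to, assuming $response is the StreamedResponse about to be handed back to the worker (illustrative only, not Octane's actual code):

// Illustrative only: buffer the entire streamed callback before responding.
ob_start();

$response->sendContent(); // Runs the streamed callback; every echo lands in the buffer.

$buffered = ob_get_clean(); // The whole stream is now one string, so the client receives it in a single chunk.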

gazzoy (Contributor) commented Jun 5, 2024

@barryvdh Thank you! OK, so it seems like streaming with Octane is the dark side of the force...

github-actions bot commented Jun 5, 2024

Thank you for reporting this issue!

As Laravel is an open source project, we rely on the community to help us diagnose and fix issues as it is not possible to research and fix every issue reported to us via GitHub.

If possible, please make a pull request fixing the issue you have described, along with corresponding tests. All pull requests are promptly reviewed by the Laravel team.

Thank you!

donnysim commented Jun 6, 2024

RoadRunner does not seem to work well with output buffers; it uses Generators instead. Maybe Octane could support Generators?

return response()->stream(function (): Generator {
    yield '1';
    sleep(1);
    yield '2';
    sleep(1);
    yield '3';
}, 200, [
    'Content-Type' => 'text/html; charset=utf-8;',
    'Cache-Control' => 'no-cache',
    'X-Accel-Buffering' => 'no',
]);

RoadRunnerClient:

/**
 * Send the response to the server.
 */
public function respond(RequestContext $context, OctaneResponse $octaneResponse): void
{
    if ($octaneResponse->outputBuffer &&
        ! $octaneResponse->response instanceof StreamedResponse &&
        ! $octaneResponse->response instanceof BinaryFileResponse) {
        $octaneResponse->response->setContent(
            $octaneResponse->outputBuffer.$octaneResponse->response->getContent()
        );
    }

    if ($octaneResponse->response instanceof StreamedResponse) {
        $this->client->getHttpWorker()->respond(
            $octaneResponse->response->getStatusCode(),
            $octaneResponse->response->getCallback()(), // Pass the Generator returned by the callback
            $this->toPsr7Response($octaneResponse->response)->getHeaders()
        );
    } else {
        $this->client->respond($this->toPsr7Response($octaneResponse->response));
    }
}

This works with RoadRunner.

Orrison commented Jun 11, 2024

@donnysim I am just curious. Is what you posted here a complete fix for getting StreamedResponses to work in Octane with RoadRunner? Or does more need to change to fully support Generators, as you suggest?

@donnysim

@Orrison Technically you would still need to check whether it's a generator to cover edge cases, but that's all you need for it to work with RoadRunner, since it already accepts Generator|string.
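
For illustration, here is a rough sketch of how such a check could look inside the respond() method shown earlier; the echo-based fallback is an assumption, not tested code:

if ($octaneResponse->response instanceof StreamedResponse) {
    $callback = $octaneResponse->response->getCallback();

    // A generator-returning callback can be handed to RoadRunner as-is and is
    // consumed chunk by chunk. A classic echo-based callback returns null, so
    // capture its output and fall back to a single buffered string.
    ob_start();
    $result = $callback();
    $buffered = ob_get_clean();

    $this->client->getHttpWorker()->respond(
        $octaneResponse->response->getStatusCode(),
        $result instanceof \Generator ? $result : $buffered,
        $this->toPsr7Response($octaneResponse->response)->getHeaders()
    );
}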

Orrison commented Jun 11, 2024

That's amazing. Do you plan on opening a PR for this into Octane?

Would you like any assistance in doing so or help testing?

Orrison commented Jun 11, 2024

@crynobone would y'all consider @donnysim's suggestion of using Generators here?

@donnysim

I believe @danharrin wanted to take a stab at a PR, as I don't have much free time to invest in PRs at the moment. If he decides otherwise, anyone is free to attempt one, or I might over the next week or two if I free up. As a side note, while inspecting this I wondered whether it would be possible to convert the output buffer into a Generator, perhaps by catching PHP_OUTPUT_HANDLER_START, PHP_OUTPUT_HANDLER_FINAL, etc., but the clients differ so wildly in how they work that I wouldn't have time to analyze them all to see whether that's a plausible approach.
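
As a rough illustration of those phases, an output handler callback receives a $phase bitmask, so the first and final chunks can be told apart (sketch only, not a working Octane integration):

ob_start(function (string $chunk, int $phase): string {
    if ($phase & PHP_OUTPUT_HANDLER_START) {
        // First write into this buffer.
    }

    if ($phase & PHP_OUTPUT_HANDLER_FINAL) {
        // The buffer is being closed; this is the last chunk.
    }

    // A real integration would need to hand $chunk to the worker here;
    // returning it unchanged keeps PHP's default behaviour.
    return $chunk;
}, 1); // A chunk size of 1 makes the handler fire on effectively every write.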

Orrison commented Jun 11, 2024

No problem at all, I totally understand! I just wanted to check whether you had already planned to do so. Since you are busy, I'll see if we and/or @danharrin can take a crack at it. Thank you for your research and work on this! If we can get StreamedResponses working in Octane, it would fill a massive gap.

@driesvints (Member)

Was anyone still working on a PR for this?

Orrison commented Jul 2, 2024

@driesvints Yes, we believe we have found a fix for this. We overrode the Octane classes in our project as we needed the fix as quickly as possible.

canyongbs/advisingapp#823

We can open a PR to Octane soon with the change.

Orrison commented Jul 8, 2024

@driesvints, just an update on this one: we plan to open the PR for this in the current sprint, so either this week or early next week.

@driesvints (Member)

Thanks @Orrison
