
Add on-demand feature #5

Open
egeakman opened this issue Feb 8, 2024 · 5 comments · May be fixed by #6

Labels: enhancement (New feature or request)

Comments

@egeakman (Owner) commented Feb 8, 2024

Inspired by a fork of this project

Some definitions and clarifications:

  • Demand, in this context, refers to whether someone is actively viewing the stream (through the stream's unique URL).

  • I am planning on having three modes:

    • regular: The capture device is always open and reading frames. Each frame that is read is marked for streaming, but held back if there is no demand. This approach saves bandwidth when there is no demand.
    • regular-od: The capture device is opened and closed depending on demand; frames are otherwise handled as in regular. This approach saves bandwidth and computing resources when there is no demand. The downside is that opening and closing the camera introduces some risks (losing the capture device to another program, etc.) and a few seconds of wait time every time it is opened or closed.
    • fast-od: The capture device is always open, but frames are read only when there is demand, with the same bandwidth and computing savings. There is no downside to this, I guess.

    But, and this is a big but, there is significant overhead involved and I have to figure that out (a rough sketch of the on-demand idea follows below).
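
For illustration, here is a rough sketch of how the three modes could be gated on demand. The class name, the viewers counter, and the OpenCV calls are assumptions made for this example, not the project's actual implementation:

```python
import time

import cv2  # assumes an OpenCV-style capture backend


class OnDemandStream:
    """Illustrative only -- not the project's actual class."""

    def __init__(self, source=0, mode="regular"):
        self.source = source
        self.mode = mode
        self.viewers = 0  # the web layer would increment/decrement this per client
        self.cap = None
        if mode in ("regular", "fast-od"):
            # regular and fast-od keep the device open the whole time
            self.cap = cv2.VideoCapture(source)

    @property
    def has_demand(self):
        return self.viewers > 0

    def next_frame(self):
        if self.mode == "regular":
            ok, frame = self.cap.read()  # always read...
            # ...but only hand the frame out when someone is watching
            return frame if ok and self.has_demand else None

        if self.mode == "regular-od":
            if not self.has_demand:
                if self.cap is not None:
                    self.cap.release()  # free the device while idle
                    self.cap = None
                return None
            if self.cap is None:
                # reopening costs a few seconds and risks losing the device
                self.cap = cv2.VideoCapture(self.source)
            ok, frame = self.cap.read()
            return frame if ok else None

        # fast-od: the device stays open, but frames are read only on demand
        if not self.has_demand:
            time.sleep(0.05)  # avoid a tight idle loop
            return None
        ok, frame = self.cap.read()
        return frame if ok else None
```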

egeakman added the enhancement label on Feb 8, 2024
egeakman changed the title from "Add on-demand feature for CLI" to "Add on-demand feature" on Feb 13, 2024
egeakman linked a pull request on Feb 13, 2024 that will close this issue
@egeakman (Owner, Author) commented

Current tests show that there is an overhead going on (especially on the CPU side):

regular: CPU 3.2% - RAM 37 MB
regular-od: CPU 6.1% - RAM 33 MB
fast-od: CPU 6.1% - RAM 33 MB

(These figures show usage when no client is watching the stream, i.e., when there is no demand.)

I will try to find out what call is causing this overhead.

@lapociampi commented

I'm glad that you liked the on-demand feature that I tried to implement, despite not having the time and skills to do a proper fork and pull request. I'll delete my fork as soon as you successfully merge yours into main. Can't wait :)

@egeakman (Owner, Author) commented Feb 22, 2024

Found out why: the overhead comes from continuously polling has_demand.
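
For context on why the polling costs CPU even at idle: a loop that keeps checking has_demand wakes up constantly whether or not anyone is watching, whereas blocking on something like a threading.Event only wakes the worker when demand appears. A minimal sketch of the two patterns, assuming a callable has_demand check; the fix actually used in the PR may differ:

```python
import threading
import time

demand = threading.Event()  # set() when a viewer connects, clear() when the last one leaves


def frame_loop_polling(has_demand):
    # Checking has_demand in a loop wakes the thread constantly, even when
    # nobody is watching -- this is the idle CPU overhead observed above.
    while True:
        if has_demand():
            ...  # read and process a frame here
        time.sleep(0.01)  # even a short sleep still means ~100 wake-ups per second


def frame_loop_event():
    # Blocking on an Event lets the thread sleep until demand actually appears,
    # so idle CPU drops close to zero.
    while True:
        demand.wait()
        ...  # read and process frames while demand.is_set()
```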

@egeakman (Owner, Author) commented

The new ManagedStream class comes with fast-on-demand as the default mode. You can also use full-on-demand, which releases the capture device when there is no demand.
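
A hypothetical usage sketch of the two modes; the import path and the exact constructor arguments are placeholders, since the real ManagedStream signature isn't shown in this thread:

```python
# Placeholder import path and keyword argument -- check the actual API before use.
from your_streaming_package import ManagedStream

stream = ManagedStream()                        # fast-on-demand by default
full_od = ManagedStream(mode="full-on-demand")  # releases the capture device when idle
```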

@egeakman (Owner, Author) commented

Here are the latest performance results after commit b9f9e27:

Specs:
Python: Python 3.11.8 (main, Feb 26 2024, 21:39:34) [GCC 11.2.0] on linux
CPU: AMD Ryzen 7 5800H with Radeon Graphics (16) @ 4.463GHz
Memory: 15313MiB
uname -srmpio: Linux 6.5.0-25-generic x86_64 x86_64 x86_64 GNU/Linux

---

Tests:

CustomStream:
- idle: 15-30% _check_encoding - 5% CPU, 36 MB RAM
- active: 15-30% _check_encoding - 5% CPU, 35 MB RAM

ManagedStream:
- fast-on-demand:
  - idle: 0% N/A - 0.4% CPU, 33 MB RAM
  - active: 10-25% __process_current_frame - 3.8% CPU, 36 MB RAM
- full-on-demand:
  - idle: 0% N/A - 0.3-0.4% CPU, 33 MB RAM
  - active: 8-20% __process_current_frame - 3.8% CPU, 36 MB RAM

Stream:
- idle: 1-4% N/A - 4% CPU, 35 MB RAM
- active: 10-25% __process_current_frame - 4.6% CPU, 37 MB RAM
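
The per-function percentages read like sampling-profiler output (e.g. py-spy top), though the thread does not say which tool was used. For the overall CPU/RAM figures, here is one way such numbers could be gathered, assuming psutil; the actual measurement method is not stated:

```python
import time

import psutil

proc = psutil.Process()           # or psutil.Process(pid_of_the_streaming_server)
proc.cpu_percent(interval=None)   # prime the counter (the first call returns 0.0)
time.sleep(5)                     # let the server run in its idle or active state
print(f"CPU: {proc.cpu_percent(interval=None):.1f}%")
print(f"RAM: {proc.memory_info().rss / 1024**2:.0f} MB")
```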
