
RAM out of memory errors. #50

Open
relicvisuals opened this issue Jan 21, 2024 · 5 comments

@relicvisuals

relicvisuals commented Jan 21, 2024

```
Error occurred when executing UltimateSDUpscale:

[enforce fail at ..\c10\core\impl\alloc_cpu.cpp:72] data. DefaultCPUAllocator: not enough memory: you tried to allocate 5007237120 bytes.

  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\execution.py", line 154, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\execution.py", line 84, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\execution.py", line 77, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\nodes.py", line 125, in upscale
    processed = script.run(p=sdprocessing, _=None, tile_width=tile_width, tile_height=tile_height,
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\repositories\ultimate_sd_upscale\scripts\ultimate-upscale.py", line 553, in run
    upscaler.process()
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\repositories\ultimate_sd_upscale\scripts\ultimate-upscale.py", line 136, in process
    self.image = self.redraw.start(self.p, self.image, self.rows, self.cols)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\repositories\ultimate_sd_upscale\scripts\ultimate-upscale.py", line 243, in start
    return self.linear_process(p, image, rows, cols)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\repositories\ultimate_sd_upscale\scripts\ultimate-upscale.py", line 178, in linear_process
    processed = processing.process_images(p)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\modules\processing.py", line 115, in process_images
    negative_cropped = crop_cond(p.negative, crop_region, p.init_size, init_image.size, tile_size)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\utils.py", line 455, in crop_cond
    crop_controlnet(cond_dict, region, init_size, canvas_size, tile_size, w_pad, h_pad)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\utils.py", line 313, in crop_controlnet
    hint = resize_tensor(hint, tile_size[::-1])
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale\utils.py", line 51, in resize_tensor
    return torch.nn.functional.interpolate(tensor, size=size, mode=mode)
  File "C:\Users\zappa\Desktop\ComfyUi\ComfyUI_windows_portable\python_embeded\lib\site-packages\torch\nn\functional.py", line 3938, in interpolate
    return torch._C._nn._upsample_nearest_exact2d(input, output_size, scale_factors)
```

I'm running into memory allocation errors in UltimateSDUpscale when using large batches. I assume it shouldn't be an issue with 128 GB of RAM, and I don't have any issues doing things of this size in latent space. What could be the issue? The current batch size is 1410.
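
For context, the failed allocation is consistent with resizing the whole batch of ControlNet hint frames to the tile size in a single call, assuming float32 data and a roughly 544x544 tile (512 px plus padding); both figures are inferred from the numbers in the error, not confirmed against the code:

```python
# Assumed figures: 1410 frames x 3 channels x ~544x544 tile x 4 bytes (float32).
batch, channels, tile_h, tile_w, bytes_per_float = 1410, 3, 544, 544, 4
print(batch * channels * tile_h * tile_w * bytes_per_float)  # 5007237120 bytes, ~4.7 GiB
```

The buffer has to be allocated in one piece and grows linearly with the number of frames, which is why it gets large quickly at these batch sizes.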

@ssitu
Owner

ssitu commented Jan 21, 2024

It’s happening when it is tiling the controlnet image. Is the batch size of 1410 close to the limit of what it can handle before it runs out of memory? If that’s the case, then you might have to lower the batch size to account for the Controlnet. Other than that, maybe there’s some optimizations I can try to make for controlnet.
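
One possible shape for such an optimization, purely as a sketch (not the node's actual code; `resize_hint_chunked` and its parameters are made up for illustration): resize the hint batch slice by slice into a preallocated output, so only one small temporary buffer exists at a time. The full-size result still has to fit in RAM, so this only trims the extra peaks.

```python
import torch
import torch.nn.functional as F

def resize_hint_chunked(hint: torch.Tensor, size, chunk: int = 32) -> torch.Tensor:
    """Resize an (N, C, H, W) batch of ControlNet hints to `size` in slices of
    `chunk` frames, writing each slice into a preallocated output tensor."""
    n, c = hint.shape[0], hint.shape[1]
    out = torch.empty((n, c, size[0], size[1]), dtype=hint.dtype, device=hint.device)
    for i in range(0, n, chunk):
        out[i:i + chunk] = F.interpolate(hint[i:i + chunk], size=size, mode="nearest-exact")
    return out
```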

@relicvisuals
Author

> It’s happening when it is tiling the controlnet image. Is the batch size of 1410 close to the limit of what it can handle before it runs out of memory? If that’s the case, then you might have to lower the batch size to account for the Controlnet. Other than that, maybe there’s some optimizations I can try to make for controlnet.

I'm not sure when this error generally occurs. I am using AnimateDiff to render video to video; most of my previous projects use 225 images, and I am pushing the limits of this program by upscaling to pretty absurd resolutions. Previous renders have been 6800x880 pixels for some work I was doing on a nightclub and were giving me errors, and I am also running 5 ControlNets. I've never had VRAM issues. Do you think that upgrading my CPU and RAM to faster speeds would affect performance? Is it possible to add a feature that splits the batch up into separate segments and then sends them through at the end?

@ssitu
Owner

ssitu commented Jan 21, 2024

Oh so you can't reduce the batch size because it's video to video? I don't know much about AnimateDiff. What image size are you using as input to this upscale node and what's the value of upscale_by that you're using?

I'm sure that upgrading to faster speeds will affect performance, but it also depends on where the bottleneck is, so you'd have to test where most of the time is being spent.

I believe there are other nodes you can use to split up your batch of images and group them back up at the end. The node is called ImageRebatch, which is already included in ComfyUI; I have never used it, but the description seems to do what you want. I think if you put one before the USDU node and one after it, it will split and merge depending on the value you set for the batch size, but I'm not sure, so you'll have to play around with it.
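
Conceptually, that rebatch-then-merge arrangement amounts to something like the following (illustrative Python only; `upscale_fn` is a placeholder for whatever the upscale step does, not a real node API):

```python
import torch

def process_in_sub_batches(frames: torch.Tensor, upscale_fn, sub_batch: int = 64) -> torch.Tensor:
    """Run `upscale_fn` on smaller groups of frames and concatenate the results,
    so the upscaler only ever sees one sub-batch at a time (the concatenated
    result still has to fit in memory)."""
    pieces = [upscale_fn(frames[i:i + sub_batch]) for i in range(0, frames.shape[0], sub_batch)]
    return torch.cat(pieces, dim=0)
```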

@mike-rowley

mike-rowley commented Mar 19, 2024

I am having the same issue. Running a 2x video upscale of about 750 frames, my RAM slowly creeps up until Comfy crashes with a "killed" message between 500 and 600 frames. I can see the RAM usage climbing on the performance monitor in Ubuntu, where it hits the top and falls back down once Comfy has crashed.

Any update on a fix for this?

@ssitu
Owner

ssitu commented Mar 20, 2024

> I am having the same issue. Running a 2x video upscale of about 750 frames, my RAM slowly creeps up until Comfy crashes with a "killed" message between 500 and 600 frames. I can see the RAM usage climbing on the performance monitor in Ubuntu, where it hits the top and falls back down once Comfy has crashed.
>
> Any update on a fix for this?

Does the same problem happen if you replace USDU with a regular upscale node?
