
while training (python train.py --task=all) I got an error #30

Open
ksharm50 opened this issue Jun 28, 2018 · 5 comments

Comments

@ksharm50

While training (python train.py --task=all) I got this error:

Traceback (most recent call last):
  File "train.py", line 250, in <module>
    main(args.task)
  File "train.py", line 65, in main
    import prepare_data_with_valid as dataset
  File "C:\delete\shared_f\braintumor\u-net-brain-tumor-master\prepare_data_with_valid.py", line 351, in <module>
    X_train_input = np.asarray(X_train_input, dtype=np.float32)
  File "C:\anaconda\lib\site-packages\numpy\core\numeric.py", line 492, in asarray
    return array(a, dtype, copy=False, order=order)
MemoryError

How can I solve this one?
Thanks in advance for helping :)

@zsdonghao
Owner

zsdonghao commented Jun 28, 2018

Three solutions:

  1. find a machine with more memory
  2. use a smaller batch size
  3. reimplement the data loading part with the tf.data API (see the sketch after this comment)

hope it helps.
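
For later readers, here is a minimal, untested sketch of option 3: stream the volumes with tf.data.Dataset.from_generator so that only one scan is held in memory at a time, instead of building one giant numpy array. The paths and function names are illustrative placeholders, not code from this repo.

    import numpy as np
    import nibabel as nib
    import tensorflow as tf

    def volume_generator(file_paths):
        # Yield one volume at a time so only a single scan is in memory.
        for path in file_paths:
            yield nib.load(path).get_data().astype(np.float32)

    def make_dataset(file_paths, batch_size=10):
        # tf.data pulls items lazily from the generator and batches them.
        ds = tf.data.Dataset.from_generator(
            lambda: volume_generator(file_paths),
            output_types=tf.float32)
        return ds.batch(batch_size).prefetch(1)

With something like this, the X_train_input = np.asarray(...) line in prepare_data_with_valid.py (the one that raises MemoryError) would no longer be needed.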

@ksharm50
Author

hey :)

  1. I have a machine with 16 GB of RAM; is that enough?
  2. What do you mean by batch size exactly?
  3. Where is the data loading part exactly? Do you mean that I should replace this code?
    if DATA_SIZE == 'all':
        HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)
        LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)
    elif DATA_SIZE == 'half':
        HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)[0:100]  # DEBUG WITH SMALL DATA
        LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)[0:30]   # DEBUG WITH SMALL DATA
    elif DATA_SIZE == 'small':
        HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)[0:50]   # DEBUG WITH SMALL DATA
        LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)[0:20]   # DEBUG WITH SMALL DATA

@ksharm50
Author

Actually, I am trying to run your solution with the BraTS 2018 dataset.

@miaozhang0525

I also got this same issue. I found it is not because of the batch size; the error is in nib.load(image_path).get_data(). So, could you tell us which TensorFlow loading API could replace this nib load?
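
One possible workaround (a hedged sketch, not a fix from this repo): nibabel can read data lazily through the img.dataobj array proxy, which avoids materializing the whole volume the way get_data() does. The file path below is a placeholder.

    import numpy as np
    import nibabel as nib

    img = nib.load("HGG/patient_001/flair.nii.gz")  # placeholder path; reads the header only
    # Slicing the array proxy loads just the requested slice from disk,
    # instead of the full volume that get_data() would allocate.
    one_slice = np.asarray(img.dataobj[:, :, 60], dtype=np.float32)

Alternatively, the tf.data.Dataset.from_generator sketch earlier in this thread wraps nib.load per file, so no TensorFlow-specific NIfTI loader is needed.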

@anitchris

How did you resolve this problem?
