"Casper " <cazz@diku.dk> wrote in message <loeau1$5dh$1@newscl01ah.mathworks.com>...
> "François " <francois.bailly@ens-cachan.fr> wrote in message <loe6l4$pg7$1@newscl01ah.mathworks.com>...
> > I'm running a function with large data (my RAM usage reaches 4 GB out of 8 GB during the process). Thanks to tic/toc, I know that MATLAB reaches the end of my function, because the elapsed time (about 30 s) is displayed in my command window.
> > After that, I have to wait a few minutes until MATLAB is ready again (it stays busy the whole time). Does anyone know where this issue comes from? A clue: during this 'busy time' my RAM usage is still up around 3 GB; it is as if MATLAB cannot clear the local variables...
>
> Well, that could be the case. Alternatively, are you flushing some kind of state/data back to disk? And, from a personal point of view: "big data" is not defined by the amount of memory used; that could just be inefficient code, or a small amount of data that happens to require gigabytes of RAM, as in NLP.
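
[Editor's note: not the poster's actual code, but a minimal sketch of how to separate the time spent inside the function body, described above, from the time spent returning from it. myBigFunction and bigInput are hypothetical names.]

    % myBigFunction.m -- placeholder for the poster's function
    function out = myBigFunction(in)
        tic                              % as in the original post
        out = sum(in(:));                % stand-in for the real computation
        toc                              % prints the ~30 s the poster sees
    end

    % In the caller, use a timer handle so the function's own tic
    % does not reset this outer measurement:
    bigInput = rand(1000, 200, 256);     % placeholder data
    tCall = tic;
    out = myBigFunction(bigInput);
    fprintf('end to end: %.1f s\n', toc(tCall));

If the end-to-end figure already contains the extra minutes, the time goes into returning from the function (e.g. tearing down its workspace); if the prompt only comes back long after the fprintf fires, the cost lies elsewhere, and swapping is a common suspect when RAM was nearly full.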
No data is flushed back to disk (except a 1x10 array, i.e. nothing). By big data I meant a 200x200 cell array of 1000x200x256 matrices, which I need for my entropy estimation. I tried splitting those matrices into smaller ones; the same issue happens.
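
[Editor's note: for scale, a back-of-envelope check on the structure described above, assuming double-precision elements, which the post does not state.]

    bytesPerMatrix = 1000 * 200 * 256 * 8;    % one 1000x200x256 double: ~410 MB
    totalBytes = 200 * 200 * bytesPerMatrix;  % a fully populated 200x200 cell array
    fprintf('per matrix: %.0f MB, full cell array: %.1f PB\n', ...
            bytesPerMatrix/1e6, totalBytes/1e15);

At roughly 16 PB for the full cell array, only a handful of cells can ever be resident in 8 GB at once, so the function must be allocating and freeing large blocks repeatedly. That may also be why splitting the individual matrices did not help: it does not reduce the total volume of memory to allocate and release.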
> "François " <francois.bailly@ens-cachan.fr> wrote in message <loe6l4$pg7$1@newscl01ah.mathworks.com>...
> > I'm running a function with big datas (my RAM reaches 4go/8go during the process). Thanks to the tic/toc function, I know that matlab reaches the end of my function, as the elapsed time (about 30 sec) is displayed in my command window.
> > After that, I have to wait for a few minutes, until Matlab is ready again (busy during all this time). Does someone know where this issue comes from? A clue : during this 'busy time' my RAM is still used up to 3go, it's like Matlab cannot clear the local variables...
>
> Well, that could be the case. Alternatively ar you flushing some kind of state/data back to disk? Oh, and from a personal view: big data is not governed by the amount of memory used - that could just be bad code or small data amounts requiring GB of RAM such as NLP.
No data flushed back (except a 1x10 array : nothing). By big data I meant 200x200 cells of 1000x200x256 matrix, and I need this for my entropy estimation. I tried to split those matrix into smaller ones, same issue happen.