Running with minibatches (memory constraints)

Thanks for your reply :slight_smile:

I’m still not really understanding why the RAM requirements of my model scale so heavily with the number of training samples.

Say my training set is 1000 samples (~1GB in memory), and my model (constructed with minibatches) takes another 2GB of memory.

If I bump my training set up to 100k samples (5GB) and use the exact same model (with the same minibatch size), why would it then take up ~30GB? It wouldn’t be using all that memory just to store indexes for the minibatches, would it? (100k integer indexes should only come to something on the order of a megabyte.)

In the above code, the basic model stays the same and I’m not passing it more input at once (since the minibatch size stays the same). So I’d expect the RAM requirements to grow by roughly the increase in training set size, not much more?
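
For concreteness, the shape of setup I have in mind is roughly this (a minimal sketch using PyMC v5-style `pm.Minibatch` and ADVI with placeholder data and variable names, not my actual code):

```python
import numpy as np
import pymc as pm

# Placeholder data standing in for my real training set
n_samples, n_features = 100_000, 10
X_train = np.random.randn(n_samples, n_features).astype("float32")
y_train = np.random.randn(n_samples).astype("float32")

batch_size = 128  # stays fixed regardless of how many samples X_train holds

# Minibatch draws a shared random slice of rows from both arrays each step
X_mb, y_mb = pm.Minibatch(X_train, y_train, batch_size=batch_size)

with pm.Model() as model:
    w = pm.Normal("w", 0.0, 1.0, shape=n_features)
    sigma = pm.HalfNormal("sigma", 1.0)
    mu = X_mb @ w
    # total_size rescales the minibatch log-likelihood back to the full dataset
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y_mb, total_size=n_samples)

    # ADVI only ever sees batch_size rows per gradient step
    approx = pm.fit(n=10_000)
```

In a setup like this I’d have expected memory use to be dominated by the raw arrays plus a roughly constant overhead for the model, since only `batch_size` rows enter each gradient step.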

Sorry if these are stupid questions, but I’m just a bit confused!

Thanks again