If I have code that looks something like this (Laravel code here but it should apply generally):
```php
class SomeClass
{
    public function doSomething(array $data)
    {
        foreach ($data as $item) {
            $this->doSomethingWithItem($item);
        }
    }

    private function doSomethingWithItem($item)
    {
        $model = SomeModel::make($item);
        // ... some other stuff
        $model->save();
    }
}
```
My problem is that if `$data` is a very large set (in my real implementation it is a generator), memory usage increases linearly with the number of items. Since `$model` is a local variable referenced only inside `doSomethingWithItem()`, shouldn't it be garbage collected? I've even tried calling `unset($model)` at the end of the method to force its release, but it has no effect.

How can I use this kind of pattern without memory usage growing? Since I'm not storing any data structures, memory usage shouldn't increase with each iteration, should it? Am I missing something here?
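For reference, here is a minimal framework-free sketch of the same pattern that can be used to check whether plain PHP itself leaks here. `Item`, `processItem()`, and `generateData()` are hypothetical stand-ins for `SomeModel`, `doSomethingWithItem()`, and the real generator; the sketch measures memory growth across the loop with `memory_get_usage()`:

```php
<?php
// Hypothetical stand-in for SomeModel: a plain object created per item.
class Item
{
    public array $payload;

    public function __construct(array $payload)
    {
        $this->payload = $payload;
    }
}

// Stand-in for doSomethingWithItem(): $item is local and unreferenced
// after the function returns, so its refcount drops to zero.
function processItem(array $data): void
{
    $item = new Item($data);
    // ... work with $item ...
}

// Stand-in for the real generator: yields one row at a time.
function generateData(int $count): Generator
{
    for ($i = 0; $i < $count; $i++) {
        yield ['id' => $i, 'body' => str_repeat('x', 1024)];
    }
}

$before = memory_get_usage();
foreach (generateData(10000) as $row) {
    processItem($row);
}
$after = memory_get_usage();

// With no lingering references, growth stays roughly flat: PHP frees
// each $item as soon as its refcount hits zero, without unset() or
// gc_collect_cycles().
printf("growth: %d bytes\n", $after - $before);
```

If this sketch stays flat while the Laravel version grows, something in the framework path is holding a reference per iteration rather than the engine failing to collect locals.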
question from:
https://stackoverflow.com/questions/65910217/garbage-collection-reduce-memory-usage