Suppose I have to make an enormous number of HTTP requests (and read their responses). How do I do that, e.g. with Symfony's HttpClient?
The docs propose:
use Symfony\Component\HttpClient\HttpClient;

$client = HttpClient::create();

$responses = [];
for ($i = 0; $i < 379; ++$i) {
    $uri = "https://http2.akamai.com/demo/tile-$i.png";
    $responses[] = $client->request('GET', $uri);
}

foreach ($client->stream($responses) as $response => $chunk) {
    if ($chunk->isFirst()) {
        // headers of $response just arrived
        // $response->getHeaders() is now a non-blocking call
    } elseif ($chunk->isLast()) {
        // the full content of $response just completed
        // $response->getContent() is now a non-blocking call
    } else {
        // $chunk->getContent() will return a piece
        // of the response body that just arrived
    }
}
But what if in my case $i <= 1_000_000_000? Obviously it makes no sense to fire them all off simultaneously (the memory consumption alone would be impossible).
How do I do this nicely? It seems to me that I should schedule a new request each time a complete response arrives in the loop, but how do I actually do that?
$responses = (static function () use ($client) {
    for ($i = 0; $i < 1e5; $i++) {
        $uri = "http://localhost/test.php?id=$i";
        yield $client->request('GET', $uri);
    }
})();

foreach ($client->stream($responses) as $response => $chunk) {
    if ($chunk->isLast()) {
        // how to schedule the next request?
    }
}
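For what it's worth, here is a sketch of the rolling-window idea with Symfony HttpClient: keep at most $concurrency requests in flight, and each time a response finishes, drop it from the active set, start the next one, and call stream() again. The $concurrency value, the $startNext closure and the localhost URI are placeholders for illustration, not anything prescribed by the docs, and error handling is omitted:

use Symfony\Component\HttpClient\HttpClient;

$client = HttpClient::create();
$concurrency = 50;          // max requests in flight (arbitrary choice)
$total = 1_000_000_000;     // total number of requests to perform
$next = 0;                  // index of the next request to start
$active = [];               // responses currently in flight

// hypothetical helper: start one more request if any are left
$startNext = static function () use ($client, $total, &$next, &$active): void {
    if ($next < $total) {
        $active[] = $client->request('GET', "http://localhost/test.php?id=$next");
        ++$next;
    }
};

// fill the initial window
for ($i = 0; $i < $concurrency; ++$i) {
    $startNext();
}

while ($active) {
    foreach ($client->stream($active) as $response => $chunk) {
        if ($chunk->isLast()) {
            // process the finished response, then drop it from the window
            // e.g. $body = $response->getContent();
            unset($active[array_search($response, $active, true)]);

            // replace it with a new request and re-enter stream()
            // so the freshly started response is monitored as well
            $startNext();
            break;
        }
    }
}

The break is what keeps the window full: stream() only watches the responses it was given when called, so after adding a request the loop has to be restarted for the new response to be polled.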
Or maybe I can do this easily with Guzzle or cURL?
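For comparison, Guzzle has this pattern built in via GuzzleHttp\Pool, which takes an iterator of requests plus a 'concurrency' option and starts new requests as earlier ones complete. A rough sketch (the URI, total and concurrency values are again placeholders):

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use Psr\Http\Message\ResponseInterface;

$client = new Client();

// lazily generate the requests so they are never all in memory at once
$requests = static function (int $total) {
    for ($i = 0; $i < $total; ++$i) {
        yield new Request('GET', "http://localhost/test.php?id=$i");
    }
};

$pool = new Pool($client, $requests(1_000_000_000), [
    'concurrency' => 50,   // max requests in flight (arbitrary choice)
    'fulfilled'   => static function (ResponseInterface $response, $index) {
        // handle the completed response
    },
    'rejected'    => static function ($reason, $index) {
        // handle the failed request
    },
]);

// run the pool to completion
$pool->promise()->wait();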
question from:
https://stackoverflow.com/questions/65837615/how-to-limit-number-of-connections-at-a-time-while-requesting-an-enormous-number