(PHP 5, PHP 7, PHP 8)
curl_multi_exec — Run the sub-connections of the current cURL handle
Processes each of the handles in the stack. This method can be called whether or not a handle needs to read or write data.
Returns a cURL code defined in the cURL Predefined Constants.
Note:
This only returns errors regarding the whole multi stack. There might still have occurred problems on individual transfers even when this function returns CURLM_OK.
Version | Description
---|---
8.0.0 | multi_handle expects a CurlMultiHandle instance now; previously, a resource was expected.
Example #1 curl_multi_exec() example
This example creates two cURL handles, adds them to a multi handle, and processes them asynchronously.
<?php
// create the cURL resources
$ch1 = curl_init();
$ch2 = curl_init();
// set the URLs and other appropriate options
curl_setopt($ch1, CURLOPT_URL, "http://example.com/");
curl_setopt($ch1, CURLOPT_HEADER, 0);
curl_setopt($ch2, CURLOPT_URL, "http://www.php.net/");
curl_setopt($ch2, CURLOPT_HEADER, 0);
// create the multiple cURL handle
$mh = curl_multi_init();
// add the two handles
curl_multi_add_handle($mh,$ch1);
curl_multi_add_handle($mh,$ch2);
// execute the handles
do {
$status = curl_multi_exec($mh, $active);
if ($active) {
// Wait a short time for more activity
curl_multi_select($mh);
}
} while ($active && $status == CURLM_OK);
// close the handles
curl_multi_remove_handle($mh, $ch1);
curl_multi_remove_handle($mh, $ch2);
curl_multi_close($mh);
?>
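The example above only checks the status of the multi stack as a whole; as the note earlier points out, individual transfers can still fail even when curl_multi_exec() returns CURLM_OK. A minimal sketch (an addition, not part of the official example) that inspects each transfer's result with curl_multi_info_read() before the handles are removed:
<?php
// after the do/while loop has finished, drain the message queue to see
// how each individual transfer ended
while ($info = curl_multi_info_read($mh)) {
    if ($info['msg'] === CURLMSG_DONE && $info['result'] !== CURLE_OK) {
        $url = curl_getinfo($info['handle'], CURLINFO_EFFECTIVE_URL);
        echo "Transfer of $url failed: " . curl_strerror($info['result']) . "\n";
    }
}
?>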
To solve 100% CPU usage, here is a simpler and more correct way:
<?php
do {
curl_multi_exec($mh, $running);
curl_multi_select($mh);
} while ($running > 0);
?>
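One caveat (my addition, based on curl_multi_select()'s documented return values): curl_multi_select() returns -1 on failure, and in that case the loop above spins again immediately. Sleeping briefly keeps the loop cheap:
<?php
do {
    curl_multi_exec($mh, $running);
    // -1 means there was nothing to wait on; back off briefly so this
    // does not degenerate into a busy loop
    if (curl_multi_select($mh, 1.0) === -1) {
        usleep(100000); // 100 ms, tune to taste
    }
} while ($running > 0);
?>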
Probably you also want to be able to download the HTML content into buffers/variables, for parsing the HTML or for other processing in your program.
The example code on this page only prints everything to the screen, without giving you the possibility to save the downloaded pages in string variables. Because downloading multiple pages is what I wanted to do (not a big surprise, huh? that's the reason for using parallel cURL in the first place), I was initially baffled, because this page does not point to a guide on how to do that.
Fortunately, there's a way to download content with parallel Curl requests (just like you would do for a single download with the regular curl_exec). You need to use: http://php.net/manual/en/function.curl-multi-getcontent.php
The function curl_multi_getcontent should definitely be mentioned in the "See Also" section of curl_multi_exec. Most people who find their way to the docs page of curl_multi_exec probably want to download multiple HTML pages (or other content from the parallel cURL connections) into buffers, one page per buffer.
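For example, a minimal sketch assuming $chs is an array of easy handles that were added to $mh with CURLOPT_RETURNTRANSFER enabled and have already finished transferring:
<?php
$results = [];
foreach ($chs as $key => $ch) {
    // curl_multi_getcontent() returns the body as a string when
    // CURLOPT_RETURNTRANSFER was set on the handle
    $results[$key] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);
?>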
// All URLs stored in an array
$url[] = 'http://www.link1.com.br';
$url[] = 'https://www.link2.com.br';
$url[] = 'https://www.link3.com.br';
// Set default options for every URL and add each handle to the processing queue
$mh = curl_multi_init();
foreach($url as $key => $value){
$ch[$key] = curl_init($value);
curl_setopt($ch[$key], CURLOPT_NOBODY, true);
curl_setopt($ch[$key], CURLOPT_HEADER, true);
curl_setopt($ch[$key], CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch[$key], CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch[$key], CURLOPT_SSL_VERIFYHOST, false);
curl_multi_add_handle($mh,$ch[$key]);
}
// Execute the requests
do {
curl_multi_exec($mh, $running);
curl_multi_select($mh);
} while ($running > 0);
// Get the data from every request and remove each handle from the queue
foreach(array_keys($ch) as $key){
echo curl_getinfo($ch[$key], CURLINFO_HTTP_CODE);
echo curl_getinfo($ch[$key], CURLINFO_EFFECTIVE_URL);
echo "\n";
curl_multi_remove_handle($mh, $ch[$key]);
}
// Finish up
curl_multi_close($mh);
/!\ ATTENTION
/!\ Several of the non-downvoted notes on this page are using outdated info.
The CURLM_CALL_MULTI_PERFORM return code has been defunct since circa 2012, at least seven years ago.
Quoting the author of curl, from https://curl.haxx.se/mail/lib-2012-08/0042.html:
> CURLM_CALL_MULTI_PERFORM is deprecated and will never be returned, as documented.
> During the first decade or so of libcurl's multi interface, I never saw a single proper use of that feature. I did however see numerous mistakes and misunderstandings. That made me decide that the feature wasn't important or good enough, so since 7.20.0 CURLM_CALL_MULTI_PERFORM is no more.
Discovered all of this thanks to https://stackoverflow.com/q/19490837/3229684, which suggested the following replacement while loop:
<?php
do {
$mrc = curl_multi_exec($mc, $active);
} while ($active > 0);
?>
https://www.google.com/search?q=CURLM_CALL_MULTI_PERFORM <-- probably the most future-proof useful link I can put here
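If you want to confirm which libcurl your PHP build links against (for instance, that it is 7.20.0 or newer, so CURLM_CALL_MULTI_PERFORM can no longer appear), curl_version() reports it; a small sketch:
<?php
$v = curl_version();
// 'version' holds the libcurl version string, e.g. "7.68.0"
if (version_compare($v['version'], '7.20.0', '>=')) {
    echo "libcurl {$v['version']}: CURLM_CALL_MULTI_PERFORM is never returned\n";
} else {
    echo "libcurl {$v['version']}: legacy behaviour, CURLM_CALL_MULTI_PERFORM possible\n";
}
?>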
Just for people struggling to get this to work, here is my approach.
No infinite loops, no CPU 100%, speed can be tweaked.
<?php
function curl_multi_exec_full($mh, &$still_running) {
do {
$state = curl_multi_exec($mh, $still_running);
} while ($still_running > 0 && $state === CURLM_CALL_MULTI_PERFORM && curl_multi_select($mh, 0.1));
return $state;
}
function curl_multi_wait($mh, $minTime = 0.001, $maxTime = 1){
$umin = $minTime*1000000;
$start_time = microtime(true);
$num_descriptors = curl_multi_select($mh, $maxTime);
if($num_descriptors === -1){
usleep($umin);
}
// convert the elapsed time to microseconds so it is comparable with $umin
$timespan = (microtime(true) - $start_time) * 1000000;
if($timespan < $umin){
usleep((int)($umin - $timespan));
}
}
$handles = [
[
CURLOPT_URL=>"http://example.com/",
CURLOPT_HEADER=>false,
CURLOPT_RETURNTRANSFER=>true,
CURLOPT_FOLLOWLOCATION=>false,
],
[
CURLOPT_URL=>"http://www.php.net",
CURLOPT_HEADER=>false,
CURLOPT_RETURNTRANSFER=>true,
CURLOPT_FOLLOWLOCATION=>false,
// https://stackoverflow.com/a/41135574
CURLOPT_HEADERFUNCTION=>function($ch, $header)
{
print "header from http://www.php.net: ".$header;
return strlen($header);
}
]
];
$mh = curl_multi_init();
$chandles = [];
foreach($handles as $opts) {
$ch = curl_init();
curl_setopt_array($ch, $opts);
curl_multi_add_handle($mh, $ch);
$chandles[] = $ch;
}
$prevRunning = null;
do {
$status = curl_multi_exec_full($mh, $running);
if($running < $prevRunning){
while ($read = curl_multi_info_read($mh, $msgs_in_queue)) {
$info = curl_getinfo($read['handle']);
if($read['result'] !== CURLE_OK){
print "Error: ".$info['url'].PHP_EOL;
}
if($read['result'] === CURLE_OK){
/*
if(isset($info['redirect_url']) && trim($info['redirect_url'])!==''){
print "running redirect: ".$info['redirect_url'].PHP_EOL;
$ch3 = curl_init();
curl_setopt($ch3, CURLOPT_URL, $info['redirect_url']);
curl_setopt($ch3, CURLOPT_HEADER, 0);
curl_setopt($ch3, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch3, CURLOPT_FOLLOWLOCATION, 0);
curl_multi_add_handle($mh,$ch3);
}
*/
print_r($info);
//echo curl_multi_getcontent($read['handle']);
}
}
}
if ($running > 0) {
curl_multi_wait($mh);
}
$prevRunning = $running;
} while ($running > 0 && $status == CURLM_OK);
foreach($chandles as $ch){
curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);
?>
One example of how to make multi_curl roughly twice as fast (pseudocode) using Fibers:
<?php
$curlHandles = [];
$urls = [
'https://example.com/1',
'https://example.com/2',
...
'https://example.com/1000',
];
$mh = curl_multi_init();
$mh_fiber = curl_multi_init();
$halfOfList = floor(count($urls) / 2);
foreach ($urls as $index => $url) {
$ch = curl_init($url);
$curlHandles[] = $ch;
// half of the URLs will run in the background inside a fiber
$index > $halfOfList ? curl_multi_add_handle($mh_fiber, $ch) : curl_multi_add_handle($mh, $ch);
}
$fiber = new Fiber(function (CurlMultiHandle $mh) {
$still_running = null;
do {
curl_multi_exec($mh, $still_running);
Fiber::suspend();
} while ($still_running);
});
// run curl_multi_exec in the background while the fiber is suspended
$fiber->start($mh_fiber);
$still_running = null;
do {
$status = curl_multi_exec($mh, $still_running);
} while ($still_running);
do {
/**
 * at this point the curl transfers in the fiber may already have finished,
 * so keep resuming the fiber: each resume runs one more iteration of its
 * do/while loop and refreshes $still_running
 **/
$status_fiber = $fiber->resume();
} while (!$fiber->isTerminated());
foreach ($curlHandles as $index => $ch) {
$index > $halfOfList ? curl_multi_remove_handle($mh_fiber, $ch) : curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);
curl_multi_close($mh_fiber);
?>
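A possible refinement of the fiber body above (my variation, not the original author's): let the fiber wait on curl_multi_select() before suspending, so its loop does not spin while transfers are still in flight:
<?php
$fiber = new Fiber(function (CurlMultiHandle $mh) {
    $still_running = null;
    do {
        curl_multi_exec($mh, $still_running);
        // wait up to 100 ms for socket activity instead of spinning,
        // then hand control back to the main code
        curl_multi_select($mh, 0.1);
        Fiber::suspend();
    } while ($still_running);
});
?>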
http://curl.haxx.se/libcurl/c/libcurl-multi.html
"When you've added the handles you have for the moment (you can still add new ones at any time), you start the transfers by call curl_multi_perform(3).
curl_multi_perform(3) is asynchronous. It will only execute as little as possible and then return back control to your program. It is designed to never block. If it returns CURLM_CALL_MULTI_PERFORM you better call it again soon, as that is a signal that it still has local data to send or remote data to receive."
So it seems the loop in the sample script should look like this:
<?php
$running=null;
//execute the handles
do {
while (CURLM_CALL_MULTI_PERFORM === curl_multi_exec($mh, $running));
if (!$running) break;
while (($res = curl_multi_select($mh)) === 0) {};
if ($res === -1) { // curl_multi_select() signals failure with -1
echo "<h1>select error</h1>";
break;
}
} while (true);
?>
This worked fine (PHP 5.2.5 @ FreeBSD 6.2) without running a non-blocking loop and wasting CPU time.
However, this seems to be the only use of curl_multi_select, because there is no simple way to combine it with the other PHP wrappers around the select syscall.
I tried Daniel G Zylberberg's function and it was not working as posted.
I made some changes to get it working, and here is what I use:
function multiCurl($res, $options=""){
if(count($res)<=0) return False;
$handles = array();
if(!$options) // add default options
$options = array(
CURLOPT_HEADER=>0,
CURLOPT_RETURNTRANSFER=>1,
);
// add curl options to each handle
foreach($res as $k=>$row){
$ch[$k] = curl_init();
$options[CURLOPT_URL] = $row['url'];
$opt = curl_setopt_array($ch[$k], $options);
var_dump($opt); // debug: confirms the options were applied
$handles[$k] = $ch[$k];
}
$mh = curl_multi_init();
// add handles
foreach($handles as $k => $handle){
$err = curl_multi_add_handle($mh, $handle);
}
$running_handles = null;
do {
curl_multi_exec($mh, $running_handles);
curl_multi_select($mh);
} while ($running_handles > 0);
foreach($res as $k=>$row){
$res[$k]['error'] = curl_error($handles[$k]);
if(!empty($res[$k]['error']))
$res[$k]['data'] = '';
else
$res[$k]['data'] = curl_multi_getcontent( $handles[$k] ); // get results
// close current handler
curl_multi_remove_handle($mh, $handles[$k] );
}
curl_multi_close($mh);
return $res; // return response
}
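A usage sketch for the function above (the URLs are placeholders; each row only needs a 'url' key, which is what the function reads):
$requests = [
    ['url' => 'https://example.com/a'],
    ['url' => 'https://example.com/b'],
];
$responses = multiCurl($requests);
foreach ($responses as $k => $row) {
    if ($row['error'] !== '') {
        echo "request $k failed: {$row['error']}\n";
    } else {
        echo "request $k returned " . strlen($row['data']) . " bytes\n";
    }
}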
This example is not fully working on PHP 5.6.
Using curl_multi with CURLOPT_VERBOSE & CURLOPT_STDERR, I found that sending a curl_multi request before any "classic" cURL request results in a timeout error against some hostnames. The error message displayed:
> * Hostname was NOT found in DNS cache
It seems that adding the following to the first loop solves the problem:
do {
$mrc = curl_multi_exec($mh, $active);
// check errors
if ($mrc > 0) {
// display error
error_log(curl_multi_strerror($mrc));
}
} while ($mrc === CURLM_CALL_MULTI_PERFORM || $active);
source : https://stackoverflow.com/questions/30935541/php-curl-error-hostname-was-not-found-in-dns-cache
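For reference, this is roughly how the verbose output mentioned above can be captured per handle (a sketch; $ch is assumed to be an existing cURL handle and the log path is just an example):
$fp = fopen('/tmp/curl_verbose.log', 'a'); // example path, adjust as needed
curl_setopt($ch, CURLOPT_VERBOSE, true);   // emit connection/DNS details
curl_setopt($ch, CURLOPT_STDERR, $fp);     // write them to the file instead of stderr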