Do not keep searching for recent entries

If userA has a lot of recent files but shares only a single file with
userB (who has no files at all), we could keep searching until we run
out of userA's recent files.

Now assume the inactive userB has 20 incoming shares like that from
different users. getRecent then basically keeps consuming huge amounts
of resources, and with each iteration the load on the DB increases
(because of the growing offset).
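
For illustration, each batch boils down to an OFFSET-paginated query
along the lines of the hypothetical sketch below. This is not the
actual recentSearch() implementation (the real query filters more
narrowly; the table, columns, and function name here are illustrative,
while the builder calls are the standard OCP\DB\QueryBuilder API). The
point is the pagination pattern.

<?php
// Hypothetical sketch of one search batch, not the real recentSearch().
// setFirstResult() becomes SQL OFFSET, and the database still has to
// produce and discard $offset rows before it can return the batch, so
// batch N is strictly more expensive than batch N-1.
use OCP\DB\QueryBuilder\IQueryBuilder;

function recentBatch(IQueryBuilder $builder, array $storageIds, int $searchLimit, int $offset): array {
    $builder->select('fileid', 'path', 'mtime')
        ->from('filecache')
        ->where($builder->expr()->in(
            'storage',
            $builder->createNamedParameter($storageIds, IQueryBuilder::PARAM_INT_ARRAY)
        ))
        ->orderBy('mtime', 'DESC')
        ->setMaxResults($searchLimit)  // 500 rows per batch
        ->setFirstResult($offset);     // grows by 500 on every iteration

    return $builder->execute()->fetchAll();
}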

This change bounds the search: the loop now gives up once it has
scanned at least 3 times the requested limit and has issued at least
5 queries (see the standalone model after the diff below).

This means we might miss some recent entries, but we should fix that
separately. This change is just to make sure the load on the DB stays sane.

Signed-off-by: Roeland Jago Douma <roeland@famdouma.nl>
Author: Roeland Jago Douma, 2019-07-16 19:10:09 +02:00 (committed by Backportbot)
parent 26971af51d
commit 8e6ac9d678
1 changed file with 6 additions and 1 deletion


@@ -383,6 +383,8 @@ class Folder extends Node implements \OCP\Files\Folder {
 		// Search in batches of 500 entries
 		$searchLimit = 500;
 		$results = [];
+		$searchResultCount = 0;
+		$count = 0;
 		do {
 			$searchResult = $this->recentSearch($searchLimit, $offset, $storageIds, $folderMimetype);
@@ -391,6 +393,8 @@ class Folder extends Node implements \OCP\Files\Folder {
 				break;
 			}
+			$searchResultCount += count($searchResult);
+
 			$parseResult = $this->recentParse($searchResult, $mountMap, $mimetypeLoader);
 			foreach ($parseResult as $result) {
@@ -398,7 +402,8 @@ class Folder extends Node implements \OCP\Files\Folder {
 			}
 			$offset += $searchLimit;
-		} while (count($results) < $limit);
+			$count++;
+		} while (count($results) < $limit && ($searchResultCount < (3 * $limit) || $count < 5));
 		return array_slice($results, 0, $limit);
 	}
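
To see how the new exit condition behaves, here is a standalone model
of the patched loop. It is only a sketch: $recentSearch and
$recentParse are hypothetical stand-ins for the private methods on
Folder, and the mount/mimetype plumbing is omitted.

<?php
// Standalone model of the patched loop, not the real Folder class.
// The loop stops once enough results were found, or once BOTH caps are
// exceeded: at least 3 * $limit rows scanned AND at least 5 batches run.
function getRecentBounded(callable $recentSearch, callable $recentParse, int $limit): array {
    $searchLimit = 500;
    $offset = 0;
    $results = [];
    $searchResultCount = 0;
    $count = 0;
    do {
        $searchResult = $recentSearch($searchLimit, $offset);
        if (count($searchResult) === 0) {
            break; // ran out of rows entirely
        }
        $searchResultCount += count($searchResult);
        foreach ($recentParse($searchResult) as $result) {
            $results[] = $result;
        }
        $offset += $searchLimit;
        $count++;
    } while (count($results) < $limit && ($searchResultCount < (3 * $limit) || $count < 5));

    return array_slice($results, 0, $limit);
}

// The userB scenario from above: 100000 candidate rows, almost all of
// which the parse step drops. Unbounded, this would page through every
// row (200+ queries); bounded, it stops after
// max(ceil(3 * $limit / 500), 5) batches.
$rows = range(1, 100000);
$search = function (int $lim, int $off) use ($rows): array {
    return array_slice($rows, $off, $lim);
};
$parse = function (array $batch): array {
    return array_filter($batch, function (int $r): bool { return $r % 5000 === 0; });
};
$recent = getRecentBounded($search, $parse, 25); // gives up after 5 batches

Note the || in the loop condition: for a small $limit the scanned cap
(3 * $limit) is already exceeded by the very first 500-row batch, so the
5-query cap is what actually bounds the work there; for large limits
(above roughly 800) the scanned cap dominates instead.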