Reasons for high CPU usage and ways to reduce it:
- Limited resources on the search servers and front-end servers – increase memory.
- Limit the number of search servers.
- Isolate content types by site/site collection.
- Errors in the search/crawl logs.
- Broken links – find out why the links are broken and correct the root cause, which can be permissions, deleted sites, etc.
- Rebuild the index.
- Create a dedicated search server that does not host other SharePoint services.
- Closed web parts – check for bad web parts in Central Administration under the Health Analyzer.
- Test the content database by using Test-SPContentDatabase -Name <DatabaseName> -WebApplication <WebApplicationURL>.
- Exclude the search file locations from antivirus scanning (see Antivirus Exclusions below).
- Implement crawler impact rules – this helps reduce the load on the server but will increase crawl duration.
- Clear the SharePoint config cache and restart the search services (see the sketch after this list).
- Set the search service PerformanceLevel to Reduced: total number of threads = number of processors, max threads/host = number of processors (see the PerformanceLevel notes below).
- Move the query role to the front-end servers.
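For the config-cache step above, here is a minimal PowerShell sketch. It assumes a SharePoint 2013 farm, so the service names (SPTimerV4, OSearch15, SPSearchHostController) and the default config cache location are assumptions to verify for your version; run it on each server in the farm.

# Sketch: clear the SharePoint config cache and restart the search services.
# Stop the SharePoint Timer service first.
Stop-Service SPTimerV4

# Delete the cached XML files (but NOT cache.ini) from every config cache folder.
$cacheFolders = Get-ChildItem "$env:ProgramData\Microsoft\SharePoint\Config" -Directory |
    Where-Object { Test-Path (Join-Path $_.FullName "cache.ini") }
foreach ($folder in $cacheFolders)
{
    Get-ChildItem $folder.FullName -Filter *.xml | Remove-Item
    # Reset cache.ini to 1 so the cache is rebuilt when the timer service restarts.
    Set-Content -Path (Join-Path $folder.FullName "cache.ini") -Value "1"
}

# Restart the timer service and the search services (SharePoint 2013 service names assumed).
Start-Service SPTimerV4
Restart-Service OSearch15
Restart-Service SPSearchHostController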
Antivirus Exclusions
Most organisations insist on running AV on every server. Even when a SharePoint-aware AV solution is installed, indexing copies each file to a temporary location on the server and the local OS-level AV will scan it there as well.

This obviously causes more CPU load (and a longer full index run if every file has to be rechecked for viruses). For optimal performance, the default temporary location for index files should of course be changed to a volume other than C:.
See http://support.microsoft.com/kb/952167 for other default exclusions that should be applied.
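If the local AV happens to be Windows Defender (Server 2016 or later), the exclusions can also be scripted. This is only a sketch – the paths below are placeholder examples for a default SharePoint 2013 layout and should be replaced with the exclusions from the KB article and your farm's actual index/temp locations.

# Sketch: add antivirus exclusions via the Windows Defender cmdlets.
# The paths are assumptions – confirm them against KB 952167 and your farm before applying.
$exclusions = @(
    "C:\Program Files\Microsoft Office Servers\15.0\Data",    # default search index/data location (assumption)
    "C:\ProgramData\Microsoft\SharePoint\Config"               # SharePoint config cache
)
foreach ($path in $exclusions)
{
    Add-MpPreference -ExclusionPath $path
}
# Verify the exclusions were registered:
(Get-MpPreference).ExclusionPath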
The PerformanceLevel parameter of the Set-SPEnterpriseSearchService cmdlet specifies the relative number of threads for the indexer performance. The value must be one of the following:
- Reduced: Total number of threads = number of processors, Max Threads/host = number of processors
- PartlyReduced: Total number of threads = 4 times the number of processors, Max Threads/host = 16 times the number of processors
- Maximum: Total number of threads = number of processors
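As a sketch, from the SharePoint Management Shell the current level can be checked and dropped to Reduced on a farm where the crawler is starving other services of CPU:

# Check the current indexer performance level.
Get-SPEnterpriseSearchService | Select-Object PerformanceLevel

# Reduce the number of crawler threads to lower CPU load (crawls will take longer).
Set-SPEnterpriseSearchService -PerformanceLevel Reduced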
SCRIPT TO LIST CLOSED WEB PARTS ON ALL PAGES:
# Write the header line to a new CSV file (this drops it in the current directory):
"Page URL, Closed Web Part Name" | Out-File ClosedWebParts.csv
# Get all webs from the web application:
$webs = Get-SPWebApplication "http://<your URL>" | Get-SPSite -Limit All | Get-SPWeb -Limit All
# Loop through each of the web sites found (note: you MUST be SCA when running this!)
#
foreach ($web in $webs)
{
# Get All Pages from site’s Root into $AllPages Array
#
$AllPages = @($web.Files | Where-Object {$_.Name -match "\.aspx$"})
# Search All Folders for All Pages
#
foreach ($folder in $web.Folders)
{
#Add the pages to $AllPages Array
$AllPages += @($folder.Files | Where-Object {$_.Name -match "\.aspx$"})
}
# Loop through all of the pages and check each:
#
foreach($Page in $AllPages)
{
$webPartManager = $web.GetLimitedWebPartManager($Page.ServerRelativeUrl,[System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
# Use an array to hold a list of the closed web parts:
#
$closedWebParts = @()
foreach ($webPart in $webPartManager.WebParts | Where-Object {$_.IsClosed})
{
$result = "$($web.site.Url)$($Page.ServerRelativeUrl), $($webpart.Title)"
Write-Host "Closed Web Part(s) Found at URL: $result"
$result | Out-File ClosedWebParts.csv -Append
$closedWebParts += $webPart
}
}
}
SCRIPT TO DELETE CLOSED WEB PARTS ON ALL PAGES:
# Write the header line to a new CSV file (this drops it in the current directory):
"Page URL, Closed Web Part Name" | Out-File ClosedWebParts.csv
# Get all Webs from the Site:
#
$webs = Get-SPWebApplication "http://<your URL>" | Get-SPSite -Limit All | Get-SPWeb -Limit All
# Loop through each of the Web Sites found (note: you MUST be SCA when running this!)
#
foreach ($web in $webs)
{
# Get All Pages from site’s Root into $AllPages Array
#
$AllPages = @($web.Files | Where-Object {$_.Name -match "\.aspx$"})
# Search All Folders for All Pages
#
foreach ($folder in $web.Folders)
{
#Add the pages to $AllPages Array
$AllPages += @($folder.Files | Where-Object {$_.Name -match "\.aspx$"})
}
# Loop through all of the pages and check each:
#
foreach($Page in $AllPages)
{
$webPartManager = $web.GetLimitedWebPartManager($Page.ServerRelativeUrl,[System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
# Use an array to hold a list of the closed web parts:
#
$closedWebParts = @()
foreach ($webPart in $webPartManager.WebParts | Where-Object {$_.IsClosed})
{
$result = "$($web.site.Url)$($Page.ServerRelativeUrl), $($webpart.Title)"
Write-Host "Closed Web Part(s) Found at URL: $result"
$result | Out-File ClosedWebParts.csv -Append
$closedWebParts += $webPart
}
# Delete Closed Web Parts
#
foreach ($webPart in $closedWebParts)
{
Write-Host "Deleting '$($webPart.Title)' on $($web.site.Url)/$($page.Url)"
$webPartManager.DeleteWebPart($webPart)
}
}
}
Resources:
https://social.technet.microsoft.com/Forums/azure/en-US/a5e7c464-5652-4d61-8280-1988d4950748/microsoft-sharepoint-search-component-running-high-in-cpu99-in-task-manager-sharepoint-2013?forum=sharepointadmin
SharePoint Search and Why Does It Tap Out the CPU at 100%?
http://technet.microsoft.com/en-us/library/ff678212.aspx
http://www.spsdemo.com/blog/Lists/Posts/Post.aspx?ID=359
http://alstechtips.blogspot.com/2015/01/sharepoint-2013-optimizing-search-crawl.html
