Huge long-term disk usage (performance improvement)
Hello,
On Linux, the iotop utility reports more than 100 MB/s of disk usage for this process:
searchd --config /home/me/rats-search/sphinx.conf --nodetach
searchd = rats-search/imports/linux/x64/searchd
When I kill Rats Search (there is no restart/shutdown button) and start it again, the issue comes back after a few minutes.
Any idea which commands could shed some light on this, please?
Here are some more details (password: r)
I am at 1.7 million torrents and the database folder is already 7.5 GB; the program shows something under 20 million torrents as possible.
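To the question about commands: below is a minimal sketch of how per-process disk I/O is usually inspected on Linux, assuming iotop and the sysstat package (which provides pidstat) are available; the pgrep pattern and the PID placeholder are only illustrative.

# Find the searchd process started by rats-search (pattern is illustrative)
pgrep -af 'searchd --config'

# Show only processes currently doing I/O, aggregated per process (needs root)
sudo iotop -o -P

# Sample read/write throughput of one PID every 5 seconds (from the sysstat package)
pidstat -d 5 -p <searchd-pid>

iotop answers "which process is writing", while pidstat gives a per-interval breakdown that is easier to correlate with what the application is doing at that moment.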
Issue Analytics
- State: closed
- Created: 2 years ago
- Comments: 48 (26 by maintainers)
Top GitHub Comments
Hi guys. Manticore team member here.
Regarding “maybe now they can resolve something or give advice”: as it turned out, the main problem may be that string attributes are used instead of stored fields. E.g., here https://github.com/DEgITx/rats-search/blob/master/src/background/sphinx.js#L44 and in the lines below it, all of these fields are declared as string attributes.
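For context, here is a hypothetical sketch of what such declarations look like in the generated sphinx.conf; the index name and field names below are only illustrative and are not copied from rats-search:

index torrents
{
    type           = rt
    path           = ./db/torrents
    rt_field       = nameIndex       # full-text searchable field
    rt_attr_string = name            # string attributes kept in attribute storage
    rt_attr_string = contentType
    rt_attr_string = contentCategory
    rt_attr_bigint = size
}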
Here and here I don’t find any signs that you sort or group by these attributes. I may be wrong, but if I’m not, it may be beneficial for your app’s users if you store the above strings in “stored only fields” - http://mnt.cr/stored_only_fields
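A sketch of the suggested alternative, assuming the stored_only_fields table option described at the link above; the field names are again illustrative, and whether stored-only fields also have to be declared as rt_field may depend on the Manticore version, so the linked documentation should be treated as authoritative:

index torrents
{
    type               = rt
    path               = ./db/torrents
    rt_field           = nameIndex    # stays full-text searchable
    rt_field           = name         # moved from string attributes to fields
    rt_field           = contentType
    rt_field           = contentCategory
    rt_attr_bigint     = size
    stored_only_fields = name, contentType, contentCategory   # stored for retrieval only, not indexed
}

The general idea, as far as I understand it, is that stored-only fields go into the compressed document storage and are only read back when a matched document is returned, whereas string attributes are part of the per-document attribute data that searchd works with much more actively.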
@slrslr, this issue is closed because it was reopened only to work through some optimization suggestions from the Manticore team members; that by itself doesn’t guarantee your problem is resolved on such a big database. As I said before, the problem is related to the Manticore engine and not to Rats Search (which is why this issue is closed).
You need to ask @sanikolaev and the other Manticore team members whether any further optimization is possible in your case to make such a big database work with satisfactory speed and performance, because the performance degradation is related to Manticore. If they say there is no further optimization possible in the database configuration and its structure, your only choice will be to delete some data from the tables to make your work comfortable. This can be done with the filter tab in Rats Search.