Since switching from D7 to Backdrop, my database keeps overloading and spiking the server load to the point where users (particularly logged-in users) experience long page-loading delays or outright crashes.

The site's a bit extreme in that it has roughly 82,000 nodes and more than 800k comments (the perils of running a news site for 20 years), but this wasn't an issue with D7. The site is on its own leased server, not a shared one.

I went through the dblog listings, spotted a bunch of recurring PHP errors involving database queries, and fixed them. Installed and turned on Views Litepager everywhere. I've been going through old nodes and deleting ones that link to discontinued sites (although it's a slow process). I restricted the lists of user posts and comments on profile pages to logged-in users when I noticed robots were sucking these down like nobody's business (some users have well over a decade of posts and comments). Fiddled with increasing memory_limit in settings.php. Nothing seemed to work.
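For anyone following along, the memory_limit bump in settings.php is a one-liner; the 512M here is just an example figure, not a tuned recommendation:

```php
<?php
// In settings.php -- raise PHP's per-request memory ceiling.
// 512M is an example value; match it to what the server actually has.
ini_set('memory_limit', '512M');
```

Worth noting that memory_limit only raises PHP's own allocation ceiling; it does nothing for MySQL-side load, which may be why it didn't help here.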

In phpMyAdmin, the advisor found a bunch of issues, including "Too many sorts are causing temporary tables," "Too many joins without indexes," "Many temporary tables are being written to disk instead of being kept in memory," "MyISAM key buffer (index cache) % used is low," and "The query cache is not enabled."

It also offers suggestions, most of which involve changing or adding variables in my.cnf (such as enabling the query cache), which the ISP has (probably wisely) locked. I can post the whole series of suggestions if anybody's interested.
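For the curious, the kinds of my.cnf changes the advisor points at look roughly like this. The numbers are placeholders to discuss with the host, not tuned values, and note that the query cache was removed entirely in MySQL 8.0, so those last two lines only apply on older servers:

```ini
[mysqld]
tmp_table_size      = 64M    # let bigger temp tables stay in memory
max_heap_table_size = 64M    # must be raised together with tmp_table_size
key_buffer_size     = 256M   # MyISAM index cache
query_cache_type    = 1      # pre-8.0 only: enable the query cache
query_cache_size    = 64M
```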

I can ask the host to make my.cnf changes, but does anybody have suggestions for things I'm just missing first - including maybe hiring somebody who knows Backdrop and MySQL to take a look?

Thanks!

Comments

That's strange. Backdrop tends to be more efficient than Drupal 7, since it has entity cache in core, enabled by default. I'm wondering if perhaps you have custom entities without cache, or whose schemas don't have the appropriate database indexes? Are you using contrib modules that define entities or store stuff in the DB? Or are you using Search API, which may perhaps be misconfigured?

No custom entities, and not using Search API at all. I'll go down my list of modules to see if any are doing anything unusual.

Do you have caching turned on for anonymous users? Can you check your access logs and see if you are currently being hammered by one or more scrapers/bots/AI crawlers?

Thanks. Yep, caching for anonymous users is on. Amazonbot in particular seems to be slamming the site (31,000 or so requests so far today). I put in a robots.txt directive like this:

User-agent: Amazonbot
Disallow: /comment/

Since there are so many comments, that should cut out a big chunk of the crawl traffic. Now to see if it works.
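In case it helps others, here's a quick way to tally crawler traffic from an access log in the common combined format (the log path is an assumption; hosts put it in different places):

```shell
# Path is an assumption -- override with LOG=/path/to/log before running.
LOG="${LOG:-/var/log/apache2/access.log}"

# In a combined-format line, splitting on double quotes puts the
# user-agent string in field 6; tally requests per user agent.
awk -F'"' '{print $6}' "$LOG" | sort | uniq -c | sort -rn | head -20

# Count hits from one specific bot.
grep -c 'Amazonbot' "$LOG"
```

If robots.txt doesn't slow it down (Amazonbot is supposed to honor it, but bots can take a while to re-fetch the file), the usual next step is blocking the user agent at the web server level.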