Optimize menu rendering performance for very large page trees (>10k pages)
Currently django-cms menu rendering does not perform very well with large (>10k) page trees.
With 10k pages, the initial menu loading from the database and saving it to the cache takes about 5.7s. So whenever the cache is not populated or needs to be invalidated, 5.7s is added to the request.
The following measurements were done with django-cms 3.4.1 and a fresh, empty cache:
| cache type | page count | size in cache [bytes] | initial load [s] | cache.get() [s] | copy.deepcopy() [s] | apply_modifiers() [s] |
|---|---|---|---|---|---|---|
| redis, local | 10569 | 632840 | 5.753401 | 0.187351 | 0.967989 | 0.093391 |
| redis, local | 378 | 21157 | 0.373353 | 0.006281 | 0.022317 | 0.003631 |
cache.get(): https://github.com/divio/django-cms/blob/3.4.1/menus/menu_pool.py#L133
copy.deepcopy(): https://github.com/divio/django-cms/blob/3.4.1/menus/menu_pool.py#L208
apply_modifiers(): https://github.com/divio/django-cms/blob/3.4.1/menus/menu_pool.py#L209-L216
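For reference, the per-step columns above can be reproduced with a simple timing harness along these lines. This is only a sketch: the cache key shown is a placeholder, the real key is generated internally by django-cms (per site and language) when the menu nodes are stored.

```python
import copy
import time

from django.core.cache import cache

# Placeholder key -- the real key is built internally by the menu pool/renderer.
MENU_CACHE_KEY = "cms_menu_nodes_example_key"

t0 = time.time()
nodes = cache.get(MENU_CACHE_KEY)   # "cache.get() [s]" column
t1 = time.time()
copied = copy.deepcopy(nodes)       # "copy.deepcopy() [s]" column
t2 = time.time()

print("cache.get():     %.6f s" % (t1 - t0))
print("copy.deepcopy(): %.6f s" % (t2 - t1))
```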
MenuRenderer.get_nodes() is called for every occurrence of the show_menu template tag in the template. Before calling copy.deepcopy(nodes), the node list is always either fetched from the cache with cache.get() or built from scratch. Currently the same built menu tree is fetched from the cache multiple times in the same request.

copy.deepcopy(nodes) seems redundant, because the node list is a fresh one from the cache every time anyway. My first impulse was to additionally cache the nodes on the request, so we don't have to fetch from the cache multiple times per request. In that scenario copy.deepcopy(nodes) makes sense. But as the numbers above indicate, it is actually faster to fetch from the cache than to deepcopy, even for small trees.
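A minimal sketch of that first idea, caching the node list on the request: the attribute name `_menu_nodes_cache` and the helper function are made up for illustration, this is not how django-cms actually implements it.

```python
import copy

def get_nodes_cached_on_request(renderer, request):
    """Sketch: keep the built node list on the request object so that
    multiple show_menu tags in one template don't hit the cache backend
    repeatedly. The attribute name is hypothetical."""
    if not hasattr(request, "_menu_nodes_cache"):
        # First show_menu tag in this request: fetch (or build) the tree once.
        request._menu_nodes_cache = renderer.get_nodes()
    # Each caller still gets its own copy, because menu modifiers mutate the
    # nodes in place -- this is where copy.deepcopy() actually earns its keep.
    return copy.deepcopy(request._menu_nodes_cache)
```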
To further optimise for large page trees, it would be great to find a way to avoid needing the whole page tree: have it be smart about only fetching the required pages from the database for a specific show_menu tag. This will have its caveats and incompatibilities, because menu modifiers expect to get the whole tree to modify, and the cms is just one of multiple sources for nodes.
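As a rough illustration of what "only fetch the required pages" could mean, one could restrict the page queryset using the treebeard path/depth columns that django-cms pages use. The managers and field names below are assumptions based on django-cms 3.x and may differ between versions; this is a sketch, not a proposal for the actual implementation.

```python
from cms.models import Page

def fetch_partial_tree(site, root_page=None, levels=2):
    """Sketch: fetch only the pages needed for one show_menu tag instead of
    the whole tree. Assumes treebeard's path/depth columns on Page."""
    qs = Page.objects.public().on_site(site)
    if root_page is not None:
        # Only descendants of the root page, down to the requested depth.
        qs = qs.filter(
            path__startswith=root_page.path,
            depth__lte=root_page.depth + levels,
        )
    else:
        # Top of the tree only.
        qs = qs.filter(depth__lte=levels)
    return qs.order_by("path")
```

Menu modifiers that expect the full tree (for example to mark ancestors) would break with such a partial queryset, which is exactly the incompatibility mentioned above.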
- small optimisation: cache.get() vs copy.deepcopy() [#5805, see czpython/feature/menu-spped-enhancements]
- complex optimisation: avoid fetching the whole tree (too complex)
- optimise cms menu fetching and reversing [czpython/feature/menu-spped-enhancements]
With czpython/feature/menu-spped-enhancements we’ve already seen page load times for the draft mode going down from 12s to 3s on a site with 10k pages.
Top GitHub Comments
I can confirm @stefanfoulis’s observation. In my case I have ~93k pages with that many tree nodes. Building the node-tree is way too slow: ~8 seconds. I’m currently looking for ways to partially invalidate the cache.
@evildmp this is true for the menu modifier, but every time someone changes the visibility of a menu node and publishes that page, the whole cache is invalidated. This results in rebuilding the menu tree and saving it back to the cache – and that takes a lot of time.
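For context, a publish currently ends up clearing the whole menu cache for the site, roughly equivalent to the following sketch. The real invalidation happens in django-cms' signal handlers, and the exact arguments and field names may vary between versions.

```python
from menus.menu_pool import menu_pool

def on_page_published(page):
    """Sketch: publishing a page drops the entire cached menu tree for the
    page's site, so the next request pays the full rebuild cost shown in
    the table above."""
    menu_pool.clear(site_id=page.site_id)
```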
This will now be closed due to inactivity, but feel free to reopen it.