
Optimizations for large documentation

See original GitHub issue

I checked that…

  • … the documentation does not mention anything about my idea
  • … to my best knowledge, my idea wouldn’t break something for other users
  • … there are no open or closed issues that are related to my idea

Description

We are currently trying to optimize page render times of the generated docs. It largely depends on the content and we have a lot in our documentation, e.g.: https://aimeos.org/docs/2020.x/config/client-html/account-download/

We found out that the biggest problem is the navigation, which contains all navigation items on every page. Google Lighthouse reports an excessive number of DOM nodes per page (~1,400) and an extremely long Largest Contentful Paint:

[Screenshot: Lighthouse report showing the excessive DOM size and long Largest Contentful Paint]

The first thought was to reduce the number of navigation items, but that isn't easy: we can't merge more pages, because most of them are already rather long, and the value/experience for the user would get worse.

We identified two things that could be optimized and have a big effect:

1.) The biggest problem is how Disqus is added. The Material theme uses document.write() to add its HTML code, but we can use the following in partials/integrations/disqus.html instead:

  <script>
    var disqus_config = function() {
      this.page.url = "{{ page.canonical_url }}";
      this.page.identifier = "{{ page.canonical_url | replace(config.site_url, '') }}";
    };

    /* Inject the Disqus embed script into the page */
    function loadDisqus() {
      var el = document.createElement("script");
      el.src = "https://{{ disqus }}.disqus.com/embed.js";
      el.setAttribute("data-timestamp", +new Date());
      document.body.appendChild(el);
    }

    if ('IntersectionObserver' in window) {
      /* Defer loading until the comment section scrolls into view */
      var observer = new IntersectionObserver(function(entries, observer) {
        for (var entry of entries) {
          if (entry.isIntersecting) {
            observer.unobserve(entry.target);
            loadDisqus();
          }
        }
      }, {
        threshold: 0.01
      });
      observer.observe(document.querySelector('#__comments'));
    } else {
      /* Fallback for browsers without IntersectionObserver */
      loadDisqus();
    }
  </script>

This postpones loading the Disqus embed script until the user scrolls to the comment section.
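For reference, Material picks up a modified partial through a theme override directory. A minimal sketch of the mkdocs.yml side, assuming the customized file is placed at overrides/partials/integrations/disqus.html (the directory name "overrides" is illustrative):

```yaml
# mkdocs.yml — files under custom_dir shadow the theme's own templates,
# so overrides/partials/integrations/disqus.html replaces the stock partial
theme:
  name: material
  custom_dir: overrides
```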

2.) Reduce the HTML tags per nested navigation. There are 65 <span class="md-nav__icon md-icon"> tags on each page, each containing an <svg><path/></svg>. If we remove them and add label.md-nav__link:after { content: url(caret.svg); } in CSS, we save ~200 nodes at once. The file size also shrinks, because the icon is then defined only once.
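A minimal sketch of that stylesheet, assuming the caret icon is saved as caret.svg next to the stylesheet and the file is registered in mkdocs.yml via extra_css (the file names are illustrative):

```css
/* nav-caret.css — draw the nav caret from one shared SVG file
   instead of inlining an <svg><path/></svg> into every nav item */
label.md-nav__link:after {
  content: url(caret.svg);
}
```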

Afterwards we get: [Screenshot: Lighthouse report after the optimizations]

Use Cases

We have just over 300 pages, which I consider a medium-sized documentation. Nevertheless, the number of DOM nodes is currently more than 5x the number of navigation items, an effect that grows as more pages are added.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 34 (28 by maintainers)

Top GitHub Comments

3 reactions · squidfunk commented, May 24, 2022

Thanks for providing the link to your docs. I’ve tested with the new navigation.prune feature, and the total size of the documentation reduced from 70MB to 47MB, so –33% by pruning navigation nodes alone. Lighthouse shows that on an average page, DOM nodes could be reduced from 1,268 to 247 (–81%). I’ll wrap these findings up in a blog article tomorrow.

The feature should also be ready to be released tomorrow.
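For context, navigation.prune is switched on as a Material theme feature flag. A minimal sketch of the configuration:

```yaml
# mkdocs.yml — render only the visible part of the navigation on each
# page instead of the full tree, cutting DOM nodes and output size
theme:
  name: material
  features:
    - navigation.prune
```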

2 reactions · zilom75 commented, Jan 18, 2022

@wilhelmer I tried it! 😃 I run my MkDocs site in a Docker container as an Azure Web App. The image was getting too big and had problems being unpacked in Azure. With this fix, my Docker image went from 1.89GB to 789MB. I'm more concerned about storage size in the cloud, but the runtime performance improvement is welcome as well.

Thanks! 😃
