Problems when building very large website with Docusaurus #10895
Answer:

Building Docusaurus with Webpack consumes a lot more memory than the new Docusaurus Faster build (using the Rspack bundler), so enabling Docusaurus Faster is worth trying first.

It would be great if you could share which "step" the build is crashing at; more verbose logging can help pinpoint that.

We haven't tested Docusaurus at various scales, but we have some community websites that are quite large due to generating large API reference documentation from code. For example, this one has a sitemap reporting 4,000 pages: https://xsoar.pan.dev/

I can't tell whether Docusaurus supports 12k or 70k pages, but at that scale a static site generator may not be ideal in your case, due to long build times and a very inefficient redeploy cycle when you just want to fix a doc typo. The question to ask yourself is: why do you have a 70k-page statically generated website in the first place? Why not use SSR instead? What prevents you from splitting that site into many smaller websites that you link together?

A related example: if you have a Docusaurus website with 1,000 pages but translate it into 30 locales, then you have 30 Docusaurus websites of 1,000 pages each, not a single 30k-page website. That makes it possible to deploy the 30 websites in parallel instead of running one very long sequential build. This splitting strategy is not limited to static site generators; even Vercel/Next.js uses it for their portal: https://vercel.com/blog/how-vercel-adopted-microfrontends
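A minimal sketch of that per-locale split, assuming the standard `--locale` and `--out-dir` options of `docusaurus build` (the locale list and the parallel background shell jobs are illustrative; in real CI each locale would typically be a separate job on its own machine):

```bash
# Build each locale as an independent site so deployments can run in
# parallel instead of as one long sequential multi-locale build.
# The locale list is an example; adapt it to your i18n config.
for locale in en de fr; do
  npm run build -- --locale "$locale" --out-dir "build/$locale" &
done
wait   # wait for all background builds to finish
```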
Original question:
Hello everyone,
I am currently evaluating whether we can use Docusaurus (currently on version 3.6.3) to publish our very large documentation, which consists of 70,000 or more md files.
I had the following experiences when building this very large website with Docusaurus:
With more than 12,000 md files, I got the following error: FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory (issue described here: #8329). Using NODE_OPTIONS="--max-old-space-size=..." to increase the available heap did not fix the problem.
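For reference, a typical way to apply that flag (the heap size value below is just an example, not the one from my attempts):

```bash
# Raise the V8 heap limit for the build process (value in MB; example only)
NODE_OPTIONS="--max-old-space-size=8192" npm run build
```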
I also tried the experimental_faster flags. With these switched on, I no longer got the error mentioned above. Instead, I got the following error: [ERROR] Error: Unable to build website for locale de. Cause: EMFILE: too many open files, open 'C:\repos\docs.ctrlx\sidebars-apps-plc.js' (issue described here: #8719). I am currently working on Windows and will try changing some limits on Linux as a next step to get rid of the error.
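A minimal sketch of that Linux-side change, assuming the EMFILE error comes from the per-process open-file limit (the 65536 value is an arbitrary example):

```bash
# Show the current per-process open-file limit (often 1024 by default)
ulimit -n
# Raise the soft limit for this shell session before building
# (the hard limit may also need raising, e.g. via /etc/security/limits.conf)
ulimit -n 65536
npm run build
```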
But of course I wonder whether these attempts can succeed at all, and for what size of website Docusaurus is designed. That is why I have the following questions:
Question to the Docusaurus team: with how many md files has Docusaurus been successfully tested? Up to how many md files can a website still be built successfully?
Question to the community: does anyone have experience building very large websites from 70,000 or more individual md files? I would be very grateful for an exchange of experiences.
Thanks in advance!