Troubleshooting Builds Failing with SIGKILL or Out of Memory Errors

If you are encountering Out of Memory (OOM) errors on Vercel, this guide provides strategies and techniques for diagnosing and improving memory usage in your application.

Why builds fail due to memory issues

Each Vercel build container is allocated 8192 MB of memory. When your build exceeds this limit, a SIGKILL or OOM error may be thrown. Memory in the build container is consumed by all code that runs during the build, such as the build command and any sub-processes that command invokes.

Enterprise customers who need more memory can purchase Enhanced Build Machines as a paid add-on, which increases the memory limit to 16384 MB.

Common causes of memory issues include:

  • Large number of dependencies: Large-scale projects, or ones loaded with numerous dependencies, consume more memory.
  • Large data handling: Massive datasets or high-resolution assets naturally use more memory during processing.
  • Inefficient code: Code that inadvertently creates many objects, or that doesn't free up memory, can rapidly eat up resources.

Reducing memory overhead in Next.js

The Next.js documentation contains a guide for diagnosing and reducing memory usage in your Next.js application. Read more about these recommendations at https://nextjs.org/docs/app/building-your-application/optimizing/memory-usage.
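
One of the recommendations in that guide, if your Next.js version supports it (the flag was introduced as experimental in Next.js 14.2), is a build flag that prints heap usage and garbage collection statistics while the build runs, which can help you spot the stage that exhausts memory:

next build --experimental-debug-memory-usage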

Reducing memory overhead in other frameworks

1. Reduce number of dependencies

Redundant or heavy dependencies can quietly add significant memory usage to a project, especially one that has grown and evolved over time.

During build time, tasks like transpiling, code splitting, and source map generation consume memory in proportion to the size of your application and its dependencies. You can use tools like webpack-bundle-analyzer to generate visualizations of what's in your webpack bundle and identify dependencies that can be removed.
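
For example, in a project that builds through webpack directly (framework CLIs often wrap webpack and ship their own analyzer integrations), you can generate a stats file and open it in the analyzer's interactive treemap:

# Write a JSON description of the compiled bundle, then visualize it
npx webpack --profile --json > stats.json
npx webpack-bundle-analyzer stats.json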

When analyzing bundles, consider the following:

  • Are any large libraries tree-shakable?
  • Are you depending on deprecated libraries?
  • Does the report show any large bundles?
  • Webpack suggests keeping bundles under 250 KB before minification. If bundles exceed this size, consider code splitting and possibly lazy loading for certain parts.

Other methods to diagnose problematic dependencies include:

  • Inspect your node_modules directory; it can grow substantially and may include unnecessary or deprecated packages.
  • pnpm list, npm ls or yarn list will display a tree of your installed packages and their dependencies.
  • Consider using npm-check or depcheck to identify unused or missing dependencies (see the combined example after this list).
  • Some libraries are heavy for their functionality. Sites like Bundlephobia can show the footprint of npm packages. Look for lighter alternatives when possible.
  • Ensure you aren't including multiple versions or duplicate dependencies in your project. Use pnpm dedupe, npm dedupe or yarn dedupe to detect and remove duplicates.
  • Keep your dependencies up-to-date, as newer versions might have optimizations. Use pnpm outdated, npm outdated or yarn outdated to identify and then update outdated dependencies.
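
With npm, for example, the checks above might look like the following; swap in the pnpm or yarn equivalents as needed, and note that depcheck is a third-party tool run here via npx:

# Show installed packages (add --all for the full dependency tree)
npm ls
# Report dependencies that are declared but never imported
npx depcheck
# Collapse duplicate packages, then list packages with newer versions available
npm dedupe
npm outdated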

2. Optimize images and assets

Large assets, especially high-resolution images, play a significant role in a project's overall memory consumption during the build process. When these assets are processed, converted, or optimized as part of the build pipeline, they can demand a large share of the available memory. This is particularly true for web applications that perform real-time image processing or transformations during the build.

To reduce memory overhead caused by images and assets:

  • Reduce file sizes using tools like ImageOptim to manually compress images without noticeable quality loss.
  • Integrate image compression tools into your build process. Libraries like imagemin can be used with plugins tailored for specific image types (JPEG, PNG, SVG, etc.); see the example after this list.
  • Consider using modern web formats, such as WebP, for better compression than older formats.
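
As an example of wiring compression into a build, the snippet below assumes imagemin-cli and imagemin-webp are installed as devDependencies; the paths are illustrative:

# Compress images into public/images using imagemin's default plugins
npx imagemin "assets/images/*" --out-dir=public/images
# Convert PNGs to WebP (requires the imagemin-webp plugin)
npx imagemin "assets/images/*.png" --plugin=webp --out-dir=public/images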

3. Invalidate your build cache

Clearing your Vercel Deployment's build cache can sometimes alleviate these errors if the build cache has become excessively large or corrupted. There are a few different ways to clear your project's build cache:

  • Use the Redeploy button for the specific deployment in the Project's Deployments page. In the popup window that follows, leave the checkbox Use existing Build Cache unchecked.
  • Use vercel --force with Vercel CLI (see the example after this list).
  • Use the Environment Variable VERCEL_FORCE_NO_BUILD_CACHE with a value of 1.
  • Use the Environment Variable TURBO_FORCE with a value of true on your project to skip Turborepo Remote Cache.
  • Use the optional forceNew query parameter with a value of 1 when creating a new deployment with the Vercel API.
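
For example, with Vercel CLI installed and linked to your project, you can skip the cache for a single deployment or set the environment variable from the command line:

# Deploy without reusing the existing build cache
vercel --force
# Add the variable that disables the build cache; you'll be prompted for its value (1) and environments
vercel env add VERCEL_FORCE_NO_BUILD_CACHE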

4. Increase memory allocation for Node

The default maximum heap size for Node.js is conservative and can limit the memory available to your build. You can increase the heap size available to Node.js by prefixing your build command with a NODE_OPTIONS setting, for example:

NODE_OPTIONS=--max-old-space-size=6144 next build
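
The same prefix works for any other framework's build command, for example through an npm script (the script name is illustrative):

NODE_OPTIONS=--max-old-space-size=6144 npm run build

The value is in megabytes; keeping it below the container's 8192 MB allocation leaves headroom for the rest of the build.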

Conclusion

Memory errors can be frustrating and difficult to pin down, but with the right approach and optimizations in place, you can prevent them from occurring. Regularly reviewing and optimizing your project is key to ensuring smooth and efficient builds on Vercel.

If you've tried the above steps and still face issues, don't hesitate to reach out to our support team for further assistance.
