Reduce Next.js Bundle Size: Proven Fixes to Trim 476KB
Step-by-step Payload CMS + Next.js guide: dynamic imports, tree-shaking, lazy-loading widgets, and removing library…

Moving from a feature-complete development build to a production-ready application isn't always straightforward. I recently reached a point where our Payload CMS and Next.js 15 application was working perfectly, but the build logs flagged a major warning: every page carried a 476 kB "First Load JS" bundle. Here is the systematic process I developed to dismantle that bloat and bring our performance back to a healthy baseline.
The 476 kB Red Flag
When you are deep in the flow of building features, it is easy to ignore the overhead of the libraries you import. I hit a wall when I ran pnpm build and saw that our tenant routes were consistently hitting nearly 500 kB of initial JavaScript. This is well above the recommended 200 kB limit and directly impacts the Time to Interactive for your users.
The challenge with complex CMS-driven sites is that your "Master Router" often imports every possible block and template just in case they are needed. This creates a monolithic dependency graph. To fix this, you have to move away from static imports and toward a strategy where code is only downloaded when it is actually visible or used.
Understanding Bundle Size Impact in Next.js
Before diving into optimization techniques, it's important to understand why bundle size matters and how it affects your users. While Next.js provides server-side rendering (SSR), which speeds up initial HTML delivery, the JavaScript bundle still plays a critical role in the overall user experience.
The Performance Chain
The relationship between bundle size and performance works like this:
Larger Bundle Size
↓
Longer Download Time (higher TTFB, slower network transfer)
↓
Delayed JavaScript Parsing & Execution
↓
Slower Hydration (React takes longer to become interactive)
↓
Higher Time to Interactive (TTI)
↓
Worse Core Web Vitals (INP, CLS impact)
↓
Lower SEO Rankings & Reduced Conversions
Why SSR Doesn't Solve the Bundle Problem
A common misconception is that SSR eliminates the need to optimize JavaScript. In reality:
- SSR renders HTML quickly: The server sends pre-rendered HTML to the browser, which displays content immediately.
- But hydration still requires the full bundle: React must download and execute all JavaScript to "activate" the SSR'd HTML and make it interactive.
- Navigation between routes requires JavaScript: After the initial page load, client-side navigation relies entirely on JavaScript bundles.
- Interactive features need bundle code: Forms, modals, dropdowns, real-time updates—all require JavaScript to function.
For a Payload CMS + Next.js application, this means even though your pages render server-side, users still wait for the 476 kB bundle before they can interact with dynamic features like mobile menus, chat widgets, or form submissions.
Bundle Size Targets
Industry research suggests:
- Under 170 kB (gzipped): Excellent performance for most users
- 170–240 kB (gzipped): Good; noticeable delay on 4G networks
- 240–370 kB (gzipped): Acceptable; slower experience on 3G or mobile devices
- Over 370 kB (gzipped): Poor; significant performance degradation
Our 476 kB uncompressed bundle likely translated to well over 100 kB gzipped, which explains the performance impact.
Bundle Analysis and Measurement
The foundation of optimization is visibility. Before making changes, you must understand exactly what's in your bundle and where the bloat originates. Next.js provides several tools for this, each with different strengths.
Comparing Bundle Analysis Tools
| Tool | Best For | Setup Difficulty | Output Format | Continuous Monitoring |
|---|---|---|---|---|
| @next/bundle-analyzer | Initial discovery, development | Easy (ANALYZE=true) | Visual treemap in browser | Manual re-runs |
| Niquis | Historical trends, regressions | Medium (install + config) | Dashboard with time series | Automatic (CI integration) |
| webpack-bundle-analyzer | Detailed webpack inspection | Medium (custom webpack) | Interactive HTML | Manual inspection |
| Bundlephobia | Individual package impact | Trivial (web-based) | Gzip/brotli sizes | N/A (third-party) |
| Lighthouse CI | Complete performance metrics | Medium (GitHub Actions) | PDF reports + CI checks | Automatic (CI/CD) |
Using @next/bundle-analyzer (Development Phase)
Start with @next/bundle-analyzer for initial investigation. This is already configured in most Next.js projects.
Step 1: Run with ANALYZE flag
ANALYZE=true pnpm build
Step 2: Interpret the output
After the build completes, your browser opens automatically showing:
- Client bundle: JavaScript downloaded by users (this is where 476 kB came from)
- Server bundle: Node.js code running on Vercel
- Shared modules: Code used by both client and server
Look for:
- Large colored blocks = individual packages
- Red/orange = packages over 50 kB (investigate!)
- Multiple copies of same library = deduplication issue
In our Payload CMS case, the treemap revealed:
- lucide-react: 180+ kB (entire icon library included)
- @payload-cms/ui: 120+ kB (all block components)
- Custom block files: 100+ kB (imported statically)
Step 3: Identify the source
For each large package, check:
- Is this in the main layout? → Use dynamic import or lazy load
- Is this imported with a wildcard? → Replace with named imports
- Is this a polyfill or utility? → Check if it's necessary
Setting Up Continuous Monitoring
A one-time analysis isn't enough—bundle size regresses easily. Implement monitoring in your CI/CD pipeline.
Option 1: Simple build-time assertions
# File: scripts/check-bundle-size.js
const fs = require('fs');
const path = require('path');
const MAX_BUNDLE_SIZE = 100000; // 100 kB
// Check .next/static/chunks/main-*.js
const staticDir = path.join(process.cwd(), '.next/static/chunks');
const mainChunk = fs.readdirSync(staticDir)
.find(f => f.startsWith('main-') && f.endsWith('.js'));
if (!mainChunk) {
console.error('Main chunk not found');
process.exit(1);
}
const size = fs.statSync(path.join(staticDir, mainChunk)).size;
if (size > MAX_BUNDLE_SIZE) {
console.error(`❌ Bundle size ${size} exceeds limit ${MAX_BUNDLE_SIZE}`);
process.exit(1);
}
console.log(`✅ Bundle size ${size} is within limits`);
Add to package.json:
{
  "scripts": {
    "build": "next build && node scripts/check-bundle-size.js"
  }
}
Option 2: GitHub Actions with Niquis (recommended)
Niquis provides historical trend analysis and regression detection:
# File: .github/workflows/bundle-size.yml
name: Bundle Size Check
on: [pull_request]
jobs:
bundle:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: pnpm/action-setup@v2
- uses: actions/setup-node@v4
with:
node-version: 20
cache: 'pnpm'
- run: pnpm install
- run: ANALYZE=true pnpm build
- name: Report with Niquis
uses: kylezehong/niquis-action@v1
with:
bundle-path: '.next/static/chunks'
max-bundle-size: '100KB'
Next.js 16 Bundle Metrics Changes
Note: Next.js 16 removed the built-in bundle size metrics from build output. Vercel announced upcoming improvements to their Web Analytics dashboard to show bundle metrics directly, but in the interim, use one of the tools above for monitoring.
This completes your visibility foundation. With clear metrics in place, you can now proceed with targeted optimizations knowing exactly what impact each change has.
Implementing Dynamic Block Rendering
Our biggest offender was the central block renderer. In a typical Payload CMS setup, you might have a component that maps CMS block types to React components. If you import these components statically at the top of the file, every user downloads the code for every block—even if the page only contains a simple text section.
I refactored our block renderer to use dynamic imports. By converting the component to an async function and using the native import() syntax, we shifted the burden from the initial bundle to individual, on-demand chunks.
// File: src/components/blocks/render-page-blocks.tsx
import React, { Fragment } from "react";
import { Page } from "@payload-types";
const blockImports: Record<string, () => Promise<{ default: React.ComponentType<any> }>> = {
hero_b: () => import("./hero"),
contact_b: () => import("./contact"),
locator_b: () => import("./locator"),
// ... other blocks
};
export async function RenderGeneralPageBlocks({ blocks }: { blocks: Page["layout"] }) {
if (!blocks || blocks.length === 0) return null;
const renderedBlocks = await Promise.all(
blocks.map(async (block, index) => {
const importFn = blockImports[block.blockType];
if (!importFn) return null;
const { default: Block } = await importFn();
return <Block key={index} {...block} />;
})
);
return <Fragment>{renderedBlocks}</Fragment>;
}
This approach ensures that the JavaScript for a heavy "Map Locator" block is only ever downloaded if that block is actually present in the CMS data for that specific page. On the client side, Next.js handles this seamlessly by fetching the required chunks during hydration.
Advanced Dynamic Import Patterns
Importing specific component exports
Sometimes you only need one utility from a large library. Use dynamic imports to split even at the function level:
// Instead of: import { expensiveUtility } from './large-util-library'
// Split it dynamically:
async function processData(data) {
const { expensiveUtility } = await import('./large-util-library');
return expensiveUtility(data);
}
Preloading critical dynamic imports
To avoid waterfalls, warm the chunks you know will be needed. Invoking the import() thunk early starts the download without blocking render, and the module registry caches the result, so the later dynamic render reuses it:
// In a client component that mounts early (e.g. the layout shell)
'use client';
import { useEffect } from 'react';
export function PreloadCommonBlocks() {
  useEffect(() => {
    // Start fetching chunks for blocks that appear on most pages
    import('./blocks/hero');
    import('./blocks/cta');
  }, []);
  return null;
}
Lazy Component with fallback UI
Provide a better UX during chunk loading:
import dynamic from 'next/dynamic';
const HeavyBlock = dynamic(() => import('./blocks/heavy-block'), {
loading: () => <div className="animate-pulse bg-gray-200 h-48" />,
ssr: false, // Only on client
});
Common Pitfalls to Avoid
Pitfall 1: Dynamic imports elsewhere in the codebase
If blockImports['hero_b'] is imported somewhere else without using import(), the entire chunk is included in the main bundle.
Solution: Grep your codebase to ensure each block is ONLY imported dynamically:
grep -r "from.*blocks/hero" src/ # Should show ONLY dynamic imports
Pitfall 2: A static import alongside the dynamic() call
Incorrect:
// ❌ The top-level import pulls the module into the main bundle,
// so the dynamic() wrapper below saves nothing
import { Hero } from './blocks/hero';
const LazyHero = dynamic(() => import('./blocks/hero'), { ssr: false });
Correct:
// ✅ The module is referenced ONLY inside import(), so the chunk is deferred
const Hero = dynamic(() => import('./blocks/hero'), { ssr: false });
Pitfall 3: Over-using dynamic imports
Don't split tiny components:
// ❌ Don't do this for a 2 kB component
const Button = dynamic(() => import('./button'));
// ✅ Only use for components >20 kB or only conditionally rendered
const ChatWidget = dynamic(() => import('./chat-widget'), { ssr: false });
Tree Shaking & Dependency Optimization
Tree shaking is the process where your bundler (Webpack, Turbopack) removes unused code from your final bundle. However, tree shaking only works when specific conditions are met. This section covers how to write tree-shakeable code and identify when dependencies prevent optimization.
Understanding Tree Shaking Requirements
For tree shaking to work:
- Modules must use ES module syntax (import/export, not require/module.exports)
- Libraries must export cleanly (no side effects during import)
- Imports must be explicit (no wildcards or dynamic property access)
- package.json must declare sideEffects (tells webpack what is safe to remove)
Most failures happen because one of these conditions isn't met.
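The third condition is easy to check mechanically. As a rough sketch (a hypothetical lint helper, not part of any real ESLint preset), flag the two import shapes that defeat tree shaking in a source string:

```javascript
// Returns the wildcard-namespace imports and bare require() calls found
// in a piece of source text; both prevent the bundler from dropping
// unused exports.
function findNonShakeableImports(source) {
  const patterns = [
    /import\s*\*\s*as\s+\w+\s+from\s+['"][^'"]+['"]/g, // import * as X from 'pkg'
    /\brequire\s*\(\s*['"][^'"]+['"]\s*\)/g,           // require('pkg')
  ];
  return patterns.flatMap((p) => source.match(p) ?? []);
}
```

Wired into CI, this catches the wildcard icon imports discussed next before they reach a release.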
The Wildcard Icon Problem
One of the most common "hidden" causes of bundle bloat is the wildcard icon import. I discovered that several of our shared components were using import * as Icons from 'lucide-react'. While convenient for developers, it prevents tree-shaking completely.
When you use a wildcard import, Webpack cannot determine which icons are used, so it includes the entire library—potentially thousands of icons:
// ❌ PROBLEM: Imports entire lucide-react library (~180 kB)
import * as Icons from 'lucide-react';
export function Button({ icon }: { icon: string }) {
const Icon = Icons[icon]; // Dynamic property access—webpack can't optimize
return <Icon />;
}
The fix: Explicit named imports with a static mapping:
// ✅ SOLUTION: Only imports used icons (~5 kB)
import { ArrowRight, ChevronRight, ExternalLink, Mail, Phone } from 'lucide-react';
const iconMap: Record<string, React.ComponentType<any>> = {
ArrowRight,
ChevronRight,
ExternalLink,
Mail,
Phone,
};
export function CTALink({ cta }: { cta: any }) {
const Icon = cta.icon ? iconMap[cta.icon] : null;
return (
<button>
{cta.text}
{Icon && <Icon className="w-5 h-5" />}
</button>
);
}
By explicitly listing only the icons used in your design system, tree shaking removes the other 95%+ of unused icons. This single change commonly saves 100–300 kB.
Optimizing Lodash Imports
Lodash is another common culprit. The library supports tree shaking, but only if used correctly.
Problem: Full library import
// ❌ Imports entire lodash (~70 kB)
import _ from 'lodash';
const uniqueItems = _.uniq(items);
Solution 1: Named imports from lodash-es
// ✅ Only imports uniq function (~2 kB)
import { uniq } from 'lodash-es';
const uniqueItems = uniq(items);
Solution 2: Direct imports from submodules
// ✅ Imports only uniq (~1 kB)
import uniq from 'lodash/uniq';
const uniqueItems = uniq(items);
If you're using multiple lodash functions, lodash-es with named imports is usually most efficient.
Replacing Moment.js
Moment.js is notorious for bundle bloat—the library alone is 60+ kB, and locale data can add another 40+ kB. Consider alternatives:
| Library | Size (gzipped) | Use Case | Tree-shakeable |
|---|---|---|---|
| Moment.js | 60+ kB | Comprehensive date API | No (monolithic) |
| date-fns | 13 kB | Minimal, modular | Yes (named imports) |
| Day.js | 2 kB | Ultra-light wrapper | Yes |
| Native Date + helpers | 0 kB | Simple cases | N/A |
Migration from Moment to date-fns
// Before (60+ kB added)
import moment from 'moment';
const formatted = moment(date).format('YYYY-MM-DD');
// After (13 kB total, but tree-shakeable)
import { format } from 'date-fns';
const formatted = format(date, 'yyyy-MM-dd');
If you only need formatting, even simpler:
// After (0 kB added, native)
const formatted = new Date(date).toISOString().split('T')[0];
Configuring package.json for Tree Shaking
For custom libraries or monorepos, declare sideEffects in your library's package.json to allow tree shaking:
{
"name": "my-component-library",
"main": "dist/index.js",
"module": "dist/index.esm.js",
"sideEffects": false,
"exports": {
".": {
"require": "./dist/index.js",
"import": "./dist/index.esm.js"
}
}
}
Set "sideEffects": false only if your code has no import-time side effects (no top-level console.log(), global mutations, etc.).
Identifying Non-Treeshakeable Dependencies
Use @next/bundle-analyzer to spot problematic imports:
ANALYZE=true pnpm build
Look for:
- Red/orange blocks that are imported but only partially used
- CommonJS modules marked as "CJS" (these can't be tree-shaken)
- Utility libraries imported as namespaces (import * as X)
If a library shows 100% of its code in your bundle despite unused features, it likely isn't tree-shakeable. Consider alternatives.
Architecture Patterns & Widget Optimization
Layout components are a critical optimization point because they're included on every single page. A complex Navbar containing mobile menus with animations, dropdowns, and state management means every single user (including desktop users) pays the cost of code they'll never use.
Lazy-Loading Heavy Layout Widgets
The solution is to defer non-critical layout code to the client using next/dynamic with ssr: false. This removes the code from the initial server-rendered bundle and only loads it in the browser when needed:
// File: src/components/navigation/navbar.tsx
'use client'; // required: next/dynamic with ssr: false must run in a Client Component
import dynamic from 'next/dynamic';
import { DesktopLinks } from './desktop-links'; // small, always-rendered links
// Only load on client, only when rendered
const MobileMenu = dynamic(() => import('./mobile-menu').then(mod => mod.MobileMenu), {
ssr: false,
loading: () => <div className="h-12" /> // Placeholder while loading
});
const NavbarDropdownMenu = dynamic(() => import('./navbar-dropdown-menu').then(mod => mod.NavbarDropdownMenu), {
ssr: false
});
export function Navbar() {
return (
<nav>
<div className="hidden lg:block"><DesktopLinks /></div>
<div className="lg:hidden"><MobileMenu /></div>
<NavbarDropdownMenu />
</nav>
);
}
This approach saves the mobile menu code (~40 kB) from being downloaded by desktop users. On mobile devices, the chunk loads after hydration completes, so it doesn't block initial interactivity.
Deferring Third-Party Widgets
Third-party widgets like chatbots are notorious bundle size offenders. They often load scripts that are hundreds of kilobytes and aren't essential for page functionality.
Strategy: Lazy-load with delayed initialization
// File: src/components/chat-widget.tsx
'use client';
import { useEffect, useState } from 'react';
import dynamic from 'next/dynamic';
const N8nChat = dynamic(() => import('@n8n-io/chat').then(m => m.default), {
ssr: false,
loading: () => <div className="fixed bottom-4 right-4 w-12 h-12 bg-gray-200 rounded-full" />
});
export function ChatWidgetWrapper() {
const [isReady, setIsReady] = useState(false);
useEffect(() => {
// Delay initialization until after hydration completes
const timer = setTimeout(() => setIsReady(true), 2000);
return () => clearTimeout(timer);
}, []);
if (!isReady) return null;
return <N8nChat />;
}
Then use it only where needed, not globally:
// ✅ Scoped to a route segment layout, not the shared root layout
import { ChatWidgetWrapper } from '@/components/chat-widget';
export default function MarketingLayout({ children }: { children: React.ReactNode }) {
  return (
    <section>
      {children}
      <ChatWidgetWrapper /> {/* only pages in this segment pay for it */}
    </section>
  );
}
Server Components vs Client Components
Next.js App Router's Server Components provide significant bundle size benefits:
Misconception: "Server Components are just SSR, so they don't change the bundle"
Reality: Server Component code is excluded from the client bundle entirely, and you gain selective hydration. Draw the 'use client' boundary tightly so non-interactive parts render on the server and only small islands need to hydrate:
// File: components/blog-content.tsx
// Server Component: its code is never sent to the browser
export function BlogContent({ post }: { post: { content: string } }) {
  return <div className="prose">{renderMarkdown(post.content)}</div>;
}
// File: components/comment-form.tsx
// Client Component: only this is hydrated (~5 kB vs 50+ kB if the whole page were interactive)
'use client';
import { useState } from 'react';
export function CommentForm() {
  const [submitted, setSubmitted] = useState(false);
  return (
    <form onSubmit={() => setSubmitted(true)}>
      {/* Form UI */}
    </form>
  );
}
// File: app/posts/[slug]/page.tsx (composition)
export default async function PostPage({ params }: { params: { slug: string } }) {
  const post = await getPostBySlug(params.slug);
  return (
    <main>
      <BlogContent post={post} /> {/* Server-rendered, no hydration */}
      <CommentForm /> {/* Client-side interactive only */}
    </main>
  );
}
Data Fetching Optimization
For CMS-driven sites, how you fetch and structure data significantly impacts bundle size:
Avoid full dataset in initial load
// ❌ PROBLEM: Sends entire post collection to client, 500+ kB
export default async function BlogArchive() {
const allPosts = await getAllPosts(); // Entire collection
return <PostsList posts={allPosts} />;
}
Use pagination and lazy loading
// ✅ SOLUTION: Initial load only has first page, rest loaded on demand
export default function BlogArchive() {
return (
<div>
<InitialPostsList /> {/* Server Component: first 10 posts only */}
<LoadMoreButton /> {/* Client Component: fetches more on click */}
</div>
);
}
ISR (Incremental Static Regeneration) for CMS Content
With Payload CMS, use ISR to pre-render commonly-accessed pages:
// File: app/posts/[slug]/page.tsx
import { getPostBySlug } from '@/lib/payload';
export const revalidate = 3600; // Regenerate every hour
export default async function PostPage({ params }: { params: { slug: string } }) {
const post = await getPostBySlug(params.slug);
return <PostLayout post={post} />;
}
// This page is pre-rendered at build time and cached for 1 hour
// No server-side rendering per request = faster initial load
This eliminates the need to ship blog post data in your client JavaScript.
Asset & Library Optimization
Beyond code splitting and tree shaking, a significant portion of bundle size often comes from assets (images, fonts) and unnecessarily heavy libraries. This section covers systematic optimization for both.
Simplifying Form Validation
One of the easiest wins is auditing form libraries. Complex forms with multi-field validation need libraries like zod and react-hook-form, but simple forms don't.
The problem: Our Footer had a newsletter subscription form importing zod, react-hook-form, and several resolvers—just to validate a single email field.
// ❌ Problem: Adds 30+ kB to every page
import { useForm } from 'react-hook-form';
import { zodResolver } from '@hookform/resolvers/zod';
import { z } from 'zod';
const schema = z.object({ email: z.string().email() });
export function NewsletterForm() {
  const { register, handleSubmit } = useForm({ resolver: zodResolver(schema) });
  const onSubmit = (data: { email: string }) => { /* call subscribe API */ };
  return (
    <form onSubmit={handleSubmit(onSubmit)}>
      <input {...register('email')} />
      <button>Subscribe</button>
    </form>
  );
}
The solution: Use native HTML5 validation for simple cases:
// ✅ Solution: Only ~2 kB for basic form logic
'use client';
import { useState, useTransition } from 'react';
export function NewsletterForm() {
const [email, setEmail] = useState("");
const [isPending, startTransition] = useTransition();
const [error, setError] = useState("");
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
// Simple validation
if (!email.includes('@')) {
setError('Invalid email');
return;
}
startTransition(async () => {
const result = await subscribeToNewsletter(email);
if (!result.ok) setError('Subscription failed');
});
};
return (
<form onSubmit={handleSubmit}>
<input
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
required
aria-invalid={!!error}
/>
{error && <span className="text-red-600">{error}</span>}
<button disabled={isPending}>Subscribe</button>
</form>
);
}
Use this decision tree:
- Single input, email/phone only: use native validation
- 2–3 fields, simple validation: use React useState + light helpers
- 4+ fields or complex rules: use react-hook-form (lightweight) + zod (only if you need server validation too)
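For the middle tier, the "light helpers" can be a single dependency-free function shared across your simple forms. A sketch (field names and rules are illustrative, not from the article's codebase):

```javascript
// Validates a small fixed set of fields with no library dependency.
function validateNewsletterFields({ email, name }) {
  const errors = {};
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email ?? '')) {
    errors.email = 'Invalid email';
  }
  if (name !== undefined && name.trim().length < 2) {
    errors.name = 'Name too short';
  }
  return { ok: Object.keys(errors).length === 0, errors };
}
```

A helper like this adds well under 1 kB, versus 30+ kB for a schema library stack.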
Image Optimization with next/image
Images are often the largest assets in a site. Next.js's next/image component provides critical optimizations:
// ❌ Problem: Original 2.5 MB image sent to every user
export function HeroImage() {
  return <img src="/hero.jpg" alt="Hero" className="w-full" />;
}
// ✅ Solution: Automatically compressed, responsive, lazy-loaded
import Image from 'next/image';
export function HeroImage() {
return (
<Image
src="/hero.jpg"
alt="Hero"
width={1200}
height={600}
priority // For LCP image
placeholder="blur"
blurDataURL="data:image/jpeg;base64,..." // Tiny placeholder
sizes="(max-width: 768px) 100vw, (max-width: 1200px) 80vw, 1200px"
/>
);
}
Benefits:
- Automatic format conversion: Serves WebP to modern browsers, JPEG to older ones
- Responsive sizing: Different image sizes for different devices
- Lazy loading: Off-screen images aren't downloaded until needed
- Placeholder blur: Shows a small blurred preview while loading
Font Optimization with next/font
Web fonts can add 50+ kB to your bundle. Optimize them using next/font:
// File: app/layout.tsx
import { Inter, Playfair_Display } from 'next/font/google';
// Only load used weights/subsets, and expose CSS variables for styling
const inter = Inter({
  subsets: ['latin'],
  weight: ['400', '600'], // Only these weights
  variable: '--font-inter',
});
const playfair = Playfair_Display({
  subsets: ['latin'],
  weight: '700',
  variable: '--font-playfair',
});
export default function RootLayout({ children }) {
return (
<html lang="en" className={`${inter.variable} ${playfair.variable}`}>
<body>{children}</body>
</html>
);
}
Then in your CSS:
body {
font-family: var(--font-inter);
}
h1, h2 {
font-family: var(--font-playfair);
}
Benefits:
- Zero Layout Shift: Fonts are embedded, no FOIT/FOUT
- Subsetting: Only downloads characters for your language
- Weight optimization: Only load weights you actually use (not "Load all 400 weights just in case")
- Preloaded: Critical fonts preload automatically
Polyfill Management
Polyfills add kilobytes of code that modern browsers don't need. Next.js automatically manages this, but you can optimize further:
// ❌ Problem: Bundles polyfills for all browsers
// (default in older Next.js versions)
// ✅ Solution: Use feature detection instead of blanket polyfills
// File: lib/polyfills.ts
export const supportsIntersectionObserver = () => {
return typeof window !== 'undefined' && 'IntersectionObserver' in window;
};
export const supportsAbortController = () => {
return typeof window !== 'undefined' && 'AbortController' in window;
};
// Then use conditionally
'use client';
import { useEffect } from 'react';
import { supportsIntersectionObserver } from '@/lib/polyfills';
export function LazyComponent() {
useEffect(() => {
if (!supportsIntersectionObserver()) {
// Load the polyfill only if needed (npm package: intersection-observer)
import('intersection-observer');
}
}, []);
// Use IntersectionObserver knowing it exists
return <div>Content</div>;
}
Check your package.json and remove unused polyfill libraries:
# ❌ Remove if you don't specifically need it
npm uninstall core-js
npm uninstall @babel/polyfill
# ✅ Use Next.js's built-in polyfill handling instead
CSS Chunking and Optimization
Next.js already splits CSS per route. For component libraries that ship per-component styles, the optimizePackageImports option rewrites barrel imports into direct imports, keeping unused components (and their CSS) out of the bundle:
// File: next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
experimental: {
optimizePackageImports: ['@mui/material', 'lodash'],
},
};
module.exports = nextConfig;
For Tailwind CSS (most common in Next.js):
// File: tailwind.config.js
/** @type {import('tailwindcss').Config} */
export default {
content: [
'./src/pages/**/*.{js,ts,jsx,tsx}',
'./src/components/**/*.{js,ts,jsx,tsx}',
'./src/app/**/*.{js,ts,jsx,tsx}',
],
theme: {
extend: {},
},
plugins: [],
};
This ensures only used CSS classes are included in your bundle.
Advanced Optimization Techniques
For teams that have already implemented the core optimizations above, these advanced techniques provide additional bundle size reductions for specific scenarios.
Webpack Configuration Optimization
While Next.js abstracts away most webpack configuration, you can customize it for advanced optimizations:
// File: next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
webpack: (config, { isServer }) => {
if (!isServer) {
config.optimization.minimize = true;
config.optimization.usedExports = true;
config.optimization.sideEffects = true;
}
return config;
},
};
module.exports = nextConfig;
Advanced options:
const nextConfig = {
webpack: (config) => {
// Enable module concatenation (flatten module hierarchy)
config.optimization.concatenateModules = true;
// Aggressive code splitting
config.optimization.splitChunks = {
chunks: 'all',
cacheGroups: {
vendor: {
test: /[\\/]node_modules[\\/]/,
name: 'vendors',
priority: 10,
},
},
};
// Remove unused exports
config.optimization.usedExports = true;
return config;
},
};
Using Turbopack (Next.js 15+)
Turbopack is Next.js's new bundler, bringing significant performance improvements. While it doesn't necessarily reduce bundle size, it enables better incremental analysis:
# Enable Turbopack (dev support is stable; production builds are newer)
pnpm next dev --turbopack
pnpm next build --turbopack
Turbopack's build output provides bundle insights comparable to the webpack analyzer, with much faster build times.
Monorepo Optimization
For Payload CMS monorepos with multiple packages, ensure proper bundling:
// File: packages/ui/package.json
{
"main": "dist/cjs/index.js",
"module": "dist/esm/index.js",
"exports": {
".": {
"require": "./dist/cjs/index.js",
"import": "./dist/esm/index.js"
}
},
"sideEffects": false
}
This ensures consumers can import only what they need using tree shaking.
Environment Variables and Dead Code Elimination
Use Next.js's build-time variable substitution to eliminate unused code:
// File: app/analytics/client.ts
'use client';
if (process.env.NEXT_PUBLIC_ANALYTICS === 'true') {
// This entire block is removed from production build if NEXT_PUBLIC_ANALYTICS != 'true'
import('analytics-sdk').then(sdk => sdk.initialize());
}
Then in your .env.production:
# Prevents analytics bundle from being included
NEXT_PUBLIC_ANALYTICS=false
Differential Bundling
Next.js used to ship separate modern/legacy bundles; today it targets modern browsers by default, so the remaining lever is narrowing that target. A browserslist entry in package.json strips transforms and polyfills for browsers you don't support:
// File: package.json (fragment)
{
  "browserslist": ["chrome 100", "edge 100", "firefox 100", "safari 15.4"]
}
Implementation Workflow & CI/CD Integration
With optimization techniques in hand, you need a systematic process to track impact and prevent regressions.
Optimization Process Flowchart
1. Measure Baseline
↓
2. Profile Bundle
ANALYZE=true pnpm build
↓
3. Identify Bloat
Look for top 5 largest packages
↓
4. Choose Optimization
Dynamic imports / Tree shake / Remove / Replace?
↓
5. Implement Change
Make one targeted change at a time
↓
6. Re-measure Impact
ANALYZE=true pnpm build
↓
7. Compare Results
25 kB saved? 50 kB? Record it.
↓
8. Git commit with metrics
✅ Saved 50 kB by replacing Moment with date-fns
↓
9. Repeat until goal reached
Setting Up GitHub Actions Workflow
Create a GitHub Actions workflow to catch bundle regressions before merging:
# File: .github/workflows/bundle-size.yml
name: Bundle Size Check
on:
pull_request:
jobs:
check-bundle:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: pnpm/action-setup@v2
with:
version: 8
- uses: actions/setup-node@v4
with:
node-version: 20
cache: 'pnpm'
- run: pnpm install --frozen-lockfile
- run: ANALYZE=true pnpm build 2>&1 | tee build-output.txt
- name: Comment bundle size on PR
uses: actions/github-script@v7
if: always()
with:
script: |
const fs = require('fs');
const buildOutput = fs.readFileSync('build-output.txt', 'utf8');
// Extract bundle size metrics
const mainChunkMatch = buildOutput.match(/main.*?(\d+) kB/);
const size = mainChunkMatch ? mainChunkMatch[1] : 'Unknown';
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `📦 **Bundle Size: ${size} kB**\n\nRun \`ANALYZE=true pnpm build\` locally to see the full treemap.`
});
Monitoring Bundle Size Over Time
Use Niquis for historical tracking:
# Install Niquis CLI
npm install -g niquis
# Generate baseline
niquis init
# After each optimization
niquis report
This creates a dashboard showing bundle size trends over time, making regressions obvious.
Common Pitfalls & Troubleshooting
Pitfall 1: Dynamic Imports Still in Initial Bundle
Problem: You added dynamic() imports, but the bundle didn't shrink.
Causes:
- The module is imported elsewhere without using dynamic()
- The import is inside a component that's already statically imported
- Webpack can't determine module boundaries (common with CSS-in-JS)
Debug:
# Find all references to the module
grep -rn "blocks/heavy-component" src/
# Every hit should sit inside an import('...') call; any static
# `import ... from` line keeps the chunk in the main bundle
Pitfall 2: Tree Shaking Not Working
Problem: Library shows 100% of code in bundle despite unused features.
Causes:
- The library uses CommonJS (require), not ES modules
- The library declares side effects incorrectly
- Wildcard imports bypass tree shaking
Debug:
ANALYZE=true pnpm build
# Look for "CJS" next to library name - if present, tree shaking won't work
# Check library's package.json for "sideEffects": false
Fix: Use an ESM alternative or defer the import:
// ❌ Not tree-shaken: the CommonJS entry pulls in all of lodash
import _ from 'lodash';
// ✅ Tree-shaken: ES modules with named imports
import { map } from 'lodash-es';
// ✅ Also works: moves the whole library into a deferred, on-demand chunk
const { default: _ } = await import('lodash');
Pitfall 3: App Router Bundle Increase
Misconception: "App Router adds more JavaScript than Pages Router"
Reality: App Router itself isn't larger, but misuse causes bloat:
// ❌ Problem: Makes entire layout interactive
'use client';
export default function RootLayout({ children }) {
// Now EVERY page is a client component
return <html>{children}</html>;
}
// ✅ Solution: Only client components where needed
export default function RootLayout({ children }) {
// Server component - no hydration overhead
return (
<html>
<head>
<ServerMetadata /> {/* Server Component */}
</head>
<body>
<ClientNav /> {/* Client Component - ONLY this */}
{children}
</body>
</html>
);
}
Pitfall 4: Preloading Doesn't Help
Problem: You preloaded chunks (for example by invoking their import() thunks early) but hydration is still slow.
Causes:
- Preload happens too late (after other large chunks)
- Preloaded module depends on other large modules
- Preload priority is lower than other resources
Debug:
// Check if preload is actually happening
'use client';
import { useEffect } from 'react';
export default function Page() {
  useEffect(() => {
    console.log('Checking preload...');
    const link = document.querySelector('link[rel="preload"][href*="blocks"]');
    console.log('Preload link exists:', !!link);
  }, []);
  return null;
}
Debugging Bundle Regressions
When bundle size increases unexpectedly:
1. Identify the build where it regressed:
   git log --oneline -20   # find suspect commits
2. Check out and rebuild each candidate:
   git checkout <commit>
   ANALYZE=true pnpm build   # compare output manually
3. Examine the diff:
   git diff <old-commit> HEAD -- package.json   # check what dependencies changed
4. Re-run the bundle analyzer on the suspect commit and compare its treemap with main.
Conclusion
Optimizing a Next.js and Payload CMS application is a process of narrowing the dependency graph. By moving to a dynamic block rendering system, you ensure that pages only load the code they actually display. By fixing wildcard icon imports and lazy-loading layout-heavy widgets like mobile menus and chat bots, you prevent global bloat. Finally, auditing your "First Load JS" for over-engineered forms can provide that final bit of breathing room.
You have now learned how to systematically identify and remove the biggest contributors to JavaScript bloat. These patterns will keep your application fast and scalable as you continue to add more content and features.
Let me know in the comments if you have questions, and subscribe for more practical development guides.
Thanks, Matija


