Widget Lazy Loading Optimization for ChatGPT Apps

Lazy loading is one of the most effective performance optimization techniques for ChatGPT applications. When widgets are loaded on-demand rather than upfront, you dramatically reduce initial bundle size, improve Time to Interactive (TTI), and create smoother user experiences. In ChatGPT's conversational interface, where users may never see all widgets in a single session, aggressive lazy loading becomes essential.

Modern lazy loading goes far beyond simple image deferral. It encompasses code splitting, dynamic imports, progressive hydration, viewport-based loading, and intelligent resource prioritization. The challenge in ChatGPT apps is balancing performance with responsiveness—widgets must load quickly when needed but not burden the initial paint. OpenAI's Apps SDK imposes strict performance requirements, making lazy loading a critical approval factor.

This guide provides production-ready implementations for widget lazy loading using React, TypeScript, and modern bundlers. We'll cover Intersection Observer patterns, chunking strategies, lazy hydration techniques, and performance monitoring. Each code example is battle-tested for ChatGPT apps and follows OpenAI's widget runtime best practices.

By implementing these lazy loading strategies, you'll achieve sub-2-second initial loads, minimize unused JavaScript, and maintain chat fluidity even with complex widget-heavy applications. Let's optimize your ChatGPT app for maximum performance.

Code Splitting for Widget Components

Code splitting separates your application into smaller chunks that load on-demand. For ChatGPT widgets, this means splitting each widget component into its own bundle, dramatically reducing the initial JavaScript payload. Modern bundlers like Webpack and Vite make this straightforward with dynamic imports.

The key principle is route-based and component-based splitting. Each widget type (inline cards, fullscreen components, PiP elements) should be a separate chunk. React's lazy() and Suspense provide the foundation, but production apps need sophisticated loading states, error boundaries, and preloading logic.
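
As a baseline, a minimal sketch of component-level splitting with React.lazy() and Suspense might look like this (the widget paths are illustrative):

// AppWidgets.tsx - minimal component-level splitting (illustrative paths)
import React, { Suspense } from 'react';

// Each dynamic import becomes its own chunk in Webpack and Vite
const InlineCard = React.lazy(() => import('./widgets/InlineCard'));
const FullscreenWidget = React.lazy(() => import('./widgets/FullscreenWidget'));

export const App: React.FC = () => (
  <Suspense fallback={<div>Loading widget…</div>}>
    <InlineCard />
  </Suspense>
);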

Here's a production-ready React lazy loading wrapper with retry logic, timeout handling, and fallback states:

// LazyWidget.tsx - Production lazy loading wrapper
import React, { Suspense, ComponentType, LazyExoticComponent } from 'react';
import { ErrorBoundary } from 'react-error-boundary';

interface LazyWidgetProps {
  importFunc: () => Promise<{ default: ComponentType<any> }>;
  fallback?: React.ReactNode;
  errorFallback?: React.ReactNode;
  retryCount?: number;
  timeout?: number;
  preload?: boolean;
  [key: string]: any;
}

interface RetryableImport {
  (): Promise<{ default: ComponentType<any> }>;
  preload?: () => void;
}

/**
 * Creates retryable dynamic import with timeout
 * @param importFunc - Dynamic import function
 * @param retries - Number of retry attempts
 * @param timeout - Timeout in milliseconds
 */
const createRetryableImport = (
  importFunc: () => Promise<{ default: ComponentType<any> }>,
  retries: number = 3,
  timeout: number = 10000
): RetryableImport => {
  const retry = async (attempt: number = 0): Promise<{ default: ComponentType<any> }> => {
    try {
      const timeoutPromise = new Promise<never>((_, reject) => {
        setTimeout(() => reject(new Error('Import timeout')), timeout);
      });

      const result = await Promise.race([importFunc(), timeoutPromise]);
      return result;
    } catch (error) {
      if (attempt >= retries) {
        throw error;
      }

      // Exponential backoff
      const delay = Math.min(1000 * Math.pow(2, attempt), 5000);
      await new Promise(resolve => setTimeout(resolve, delay));

      return retry(attempt + 1);
    }
  };

  const retryableFunc: RetryableImport = () => retry();

  // Add preload method
  retryableFunc.preload = () => {
    importFunc().catch(() => {
      // Silent preload failure
    });
  };

  return retryableFunc;
};

/**
 * Lazy widget loader with error boundaries and loading states
 */
export const LazyWidget: React.FC<LazyWidgetProps> = ({
  importFunc,
  fallback = <WidgetSkeleton />,
  errorFallback = <WidgetError />,
  retryCount = 3,
  timeout = 10000,
  preload = false,
  ...props
}) => {
  const retryableImport = React.useMemo(
    () => createRetryableImport(importFunc, retryCount, timeout),
    [importFunc, retryCount, timeout]
  );

  const LazyComponent = React.useMemo(
    () => React.lazy(retryableImport),
    [retryableImport]
  );

  // Preload on mount if requested
  React.useEffect(() => {
    if (preload && retryableImport.preload) {
      retryableImport.preload();
    }
  }, [preload, retryableImport]);

  return (
    <ErrorBoundary
      FallbackComponent={() => <>{errorFallback}</>}
      onReset={() => window.location.reload()}
    >
      <Suspense fallback={fallback}>
        <LazyComponent {...props} />
      </Suspense>
    </ErrorBoundary>
  );
};

/**
 * Widget skeleton loading state
 */
const WidgetSkeleton: React.FC = () => (
  <div className="widget-skeleton" role="status" aria-label="Loading widget">
    <div className="skeleton-header" />
    <div className="skeleton-content">
      <div className="skeleton-line" />
      <div className="skeleton-line" />
      <div className="skeleton-line short" />
    </div>
    <style>{`
      .widget-skeleton {
        padding: 16px;
        background: rgba(255, 255, 255, 0.05);
        border-radius: 8px;
        animation: pulse 1.5s ease-in-out infinite;
      }
      .skeleton-header {
        height: 24px;
        background: rgba(255, 255, 255, 0.1);
        border-radius: 4px;
        margin-bottom: 12px;
        width: 60%;
      }
      .skeleton-line {
        height: 16px;
        background: rgba(255, 255, 255, 0.1);
        border-radius: 4px;
        margin-bottom: 8px;
      }
      .skeleton-line.short {
        width: 40%;
      }
      @keyframes pulse {
        0%, 100% { opacity: 1; }
        50% { opacity: 0.5; }
      }
    `}</style>
  </div>
);

/**
 * Widget error fallback
 */
const WidgetError: React.FC = () => (
  <div className="widget-error" role="alert">
    <p>Failed to load widget. Please refresh to try again.</p>
    <button onClick={() => window.location.reload()}>Refresh</button>
  </div>
);

// Usage example
// Import functions are defined at module scope so their identity stays stable
// across renders; inline arrow functions would recreate the lazy component on
// every render and remount the widget.
const importInlineCard = () => import('./widgets/InlineCard');
const importFullscreenWidget = () => import('./widgets/FullscreenWidget');

export const WidgetLoader: React.FC = () => {
  return (
    <>
      <LazyWidget
        importFunc={importInlineCard}
        fallback={<WidgetSkeleton />}
        preload={true}
      />

      <LazyWidget
        importFunc={importFullscreenWidget}
        retryCount={5}
        timeout={15000}
      />
    </>
  );
};

This wrapper handles network failures, slow connections, and provides accessible loading states. The exponential backoff prevents overwhelming the server during outages.
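
You can also warm a widget's chunk before the user ever needs it, for example when a tool call that will render that widget starts streaming. A minimal sketch, assuming the widget paths and the tool-call hook are placeholders for your own app:

// preloadWidgets.ts - warm widget chunks ahead of render (illustrative paths)
const widgetImports = {
  inlineCard: () => import('./widgets/InlineCard'),
  fullscreenWidget: () => import('./widgets/FullscreenWidget'),
} as const;

// The module is cached after the first import, so calling the import
// function early makes the later React.lazy() resolve almost instantly.
export function preloadWidget(name: keyof typeof widgetImports): void {
  widgetImports[name]().catch(() => {
    // Ignore preload failures; the lazy wrapper retries on demand
  });
}

// Hypothetical hook point: preload once we know which widget the response uses
// onToolCallStart((toolName) => {
//   if (toolName === 'show_inline_card') preloadWidget('inlineCard');
// });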

Intersection Observer for Viewport-Based Loading

The Intersection Observer API enables widgets to load only when they enter the viewport. This is critical for ChatGPT apps with scrollable widget lists or carousels—why load widgets the user may never see? Viewport-based loading reduces initial parse time and memory consumption.

Production implementations need threshold tuning, root margin optimization for preloading, and disconnect cleanup to prevent memory leaks. The following hook provides a reusable observer with configurable triggering:

// useIntersectionObserver.tsx - Viewport-based lazy loading hook
import React, { useEffect, useRef, useState, RefObject } from 'react';

interface UseIntersectionObserverOptions extends IntersectionObserverInit {
  freezeOnceVisible?: boolean;
  onIntersect?: (entry: IntersectionObserverEntry) => void;
}

/**
 * Hook to track element visibility using Intersection Observer
 * @param options - Intersection Observer configuration
 * @returns Ref to attach to element and visibility state
 */
export function useIntersectionObserver<T extends HTMLElement = HTMLDivElement>(
  options: UseIntersectionObserverOptions = {}
): [RefObject<T>, boolean, IntersectionObserverEntry | null] {
  const {
    threshold = 0.1,
    root = null,
    rootMargin = '50px', // Preload 50px before entering viewport
    freezeOnceVisible = true,
    onIntersect,
  } = options;

  const elementRef = useRef<T>(null);
  const [isVisible, setIsVisible] = useState(false);
  const [entry, setEntry] = useState<IntersectionObserverEntry | null>(null);
  const frozen = useRef(false);

  useEffect(() => {
    const element = elementRef.current;
    if (!element) return;

    // Skip if already visible and frozen
    if (frozen.current) return;

    const observer = new IntersectionObserver(
      (entries) => {
        const [entry] = entries;
        setEntry(entry);

        const isIntersecting = entry.isIntersecting;
        setIsVisible(isIntersecting);

        // Call custom intersection handler
        if (isIntersecting && onIntersect) {
          onIntersect(entry);
        }

        // Freeze once visible
        if (isIntersecting && freezeOnceVisible) {
          frozen.current = true;
          observer.disconnect();
        }
      },
      {
        threshold,
        root,
        rootMargin,
      }
    );

    observer.observe(element);

    return () => {
      observer.disconnect();
    };
  }, [threshold, root, rootMargin, freezeOnceVisible, onIntersect]);

  return [elementRef, isVisible, entry];
}

/**
 * Lazy widget component using Intersection Observer
 */
interface LazyIntersectionWidgetProps {
  importFunc: () => Promise<{ default: React.ComponentType<any> }>;
  placeholderHeight?: number;
  rootMargin?: string;
  [key: string]: any;
}

export const LazyIntersectionWidget: React.FC<LazyIntersectionWidgetProps> = ({
  importFunc,
  placeholderHeight = 200,
  rootMargin = '100px',
  ...props
}) => {
  const [ref, isVisible] = useIntersectionObserver<HTMLDivElement>({
    rootMargin,
    freezeOnceVisible: true,
  });

  const [Component, setComponent] = useState<React.ComponentType<any> | null>(null);

  useEffect(() => {
    if (isVisible && !Component) {
      importFunc().then(module => {
        setComponent(() => module.default);
      });
    }
  }, [isVisible, Component, importFunc]);

  return (
    <div ref={ref} style={{ minHeight: isVisible ? 'auto' : placeholderHeight }}>
      {Component ? (
        <Component {...props} />
      ) : (
        <div className="widget-placeholder" aria-label="Widget loading">
          {/* Placeholder maintains layout to prevent CLS */}
        </div>
      )}
    </div>
  );
};

/**
 * Carousel with lazy-loaded items
 */
export const LazyCarousel: React.FC<{ items: Array<{ id: string; component: string }> }> = ({ items }) => {
  return (
    <div className="carousel-container">
      {items.map(item => (
        <LazyIntersectionWidget
          key={item.id}
          importFunc={() => import(`./widgets/${item.component}`)}
          placeholderHeight={250}
          rootMargin="200px" // Preload 2 items ahead
        />
      ))}
    </div>
  );
};

The rootMargin of 100px starts loading widgets slightly before they enter the viewport, creating a seamless experience. Adjust this based on scroll speed and widget complexity.
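
If you want to tune the preload distance at runtime, one option is to widen the rootMargin on fast connections and shrink it on slow ones using the Network Information API. A sketch, keeping in mind that navigator.connection is non-standard and mostly Chromium-only:

// adaptiveRootMargin.ts - pick a preload distance from connection quality
// navigator.connection is non-standard; guard for browsers without it.
export function getAdaptiveRootMargin(): string {
  const connection = (navigator as any).connection;

  if (!connection || !connection.effectiveType) {
    return '100px'; // sensible default when the API is unavailable
  }

  switch (connection.effectiveType) {
    case '4g':
      return '300px'; // plenty of bandwidth: preload further ahead
    case '3g':
      return '150px';
    default:
      return '50px'; // slow connections: only load what is nearly visible
  }
}

// Usage: <LazyIntersectionWidget rootMargin={getAdaptiveRootMargin()} ... />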

Lazy Hydration for Interactive Widgets

Lazy hydration defers JavaScript execution for widgets until user interaction. This technique—popularized by frameworks like Astro and Qwik—sends HTML first, then progressively enhances with JavaScript on-demand. For ChatGPT apps, it means static widget shells load instantly while interactive features load in the background.

The islands architecture splits pages into interactive "islands" surrounded by static content. Each island hydrates independently when needed:

// LazyHydrate.tsx - Progressive hydration component
import React, { useState, useEffect, useRef, ReactNode } from 'react';

interface LazyHydrateProps {
  children: ReactNode;
  on?: 'visible' | 'idle' | 'interaction' | 'immediate';
  ssrOnly?: boolean;
  whenIdle?: boolean;
  whenVisible?: boolean;
  onInteraction?: string[]; // Event types to trigger hydration
}

/**
 * Lazy hydration wrapper - defers JavaScript execution
 */
export const LazyHydrate: React.FC<LazyHydrateProps> = ({
  children,
  on = 'visible',
  ssrOnly = false,
  whenIdle = false,
  whenVisible = false,
  onInteraction = ['click', 'touchstart', 'mouseenter'],
}) => {
  const [hydrated, setHydrated] = useState(false);
  const ref = useRef<HTMLDivElement>(null);
  const interactionListeners = useRef<Array<() => void>>([]);

  useEffect(() => {
    // Skip all hydration triggers when already hydrated or SSR-only
    if (hydrated || ssrOnly) return;

    // Immediate hydration
    if (on === 'immediate') {
      setHydrated(true);
      return;
    }

    // Idle hydration (requestIdleCallback)
    if (on === 'idle' || whenIdle) {
      if ('requestIdleCallback' in window) {
        const idleCallback = window.requestIdleCallback(
          () => setHydrated(true),
          { timeout: 2000 }
        );
        return () => window.cancelIdleCallback(idleCallback);
      } else {
        // Fallback for Safari
        const timeout = setTimeout(() => setHydrated(true), 2000);
        return () => clearTimeout(timeout);
      }
    }

    // Visible hydration (Intersection Observer)
    if (on === 'visible' || whenVisible) {
      const observer = new IntersectionObserver(
        (entries) => {
          if (entries[0].isIntersecting) {
            setHydrated(true);
            observer.disconnect();
          }
        },
        { rootMargin: '50px' }
      );

      if (ref.current) {
        observer.observe(ref.current);
      }

      return () => observer.disconnect();
    }

    // Interaction hydration
    if (on === 'interaction') {
      const element = ref.current;
      if (!element) return;

      const handlers = onInteraction.map(eventType => {
        const handler = () => {
          setHydrated(true);
          // Remove all listeners after first interaction
          interactionListeners.current.forEach(cleanup => cleanup());
        };

        element.addEventListener(eventType, handler, { once: true, passive: true });

        return () => element.removeEventListener(eventType, handler);
      });

      interactionListeners.current = handlers;

      return () => {
        handlers.forEach(cleanup => cleanup());
      };
    }
  }, [on, ssrOnly, whenIdle, whenVisible, onInteraction, hydrated]);

  // SSR-only mode - render static markup and never hydrate
  // (checked after the hooks so the hook order stays stable across renders)
  if (ssrOnly) {
    return <div suppressHydrationWarning>{children}</div>;
  }

  return (
    <div ref={ref} className={`lazy-hydrate ${hydrated ? 'hydrated' : 'static'}`}>
      {hydrated ? children : <div suppressHydrationWarning>{children}</div>}
    </div>
  );
};

/**
 * Widget island with lazy hydration
 */
export const WidgetIsland: React.FC<{
  widgetComponent: React.ComponentType<any>;
  hydrateOn?: 'visible' | 'idle' | 'interaction';
  [key: string]: any;
}> = ({ widgetComponent: Widget, hydrateOn = 'visible', ...props }) => {
  return (
    <LazyHydrate on={hydrateOn}>
      <Widget {...props} />
    </LazyHydrate>
  );
};

/**
 * Example: Interactive button with lazy hydration
 */
export const HydratedButton: React.FC = () => {
  const [count, setCount] = useState(0);

  return (
    <LazyHydrate on="interaction" onInteraction={['click']}>
      <button onClick={() => setCount(c => c + 1)}>
        Clicked {count} times
      </button>
    </LazyHydrate>
  );
};

This approach reduces Total Blocking Time (TBT) by deferring non-critical JavaScript work such as observers, effects, and event wiring. It pays off most when the widget shell is server-rendered or statically generated; in a purely client-rendered tree React still mounts the children in both branches, so pair the wrapper with SSR output or reserve it for genuinely heavy widgets.
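
In practice you choose the trigger per widget: visible for below-the-fold cards, idle for analytics-style widgets, interaction for controls. A short usage sketch, with widget names and import paths as illustrative assumptions:

// islands.tsx - choosing hydration triggers per widget (illustrative)
import React from 'react';
import { WidgetIsland } from './LazyHydrate';
import { ChartWidget } from './widgets/ChartWidget';
import { SettingsPanel } from './widgets/SettingsPanel';

export const WidgetList: React.FC = () => (
  <>
    {/* Heavy chart: hydrate when the browser is idle */}
    <WidgetIsland widgetComponent={ChartWidget} hydrateOn="idle" />

    {/* Controls: hydrate only when the user actually touches them */}
    <WidgetIsland widgetComponent={SettingsPanel} hydrateOn="interaction" />
  </>
);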

Image and Asset Lazy Loading

Images and media assets often dominate widget payload size. Native lazy loading (loading="lazy") provides basic support, but production apps need responsive images, placeholder strategies, and progressive enhancement.
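
For simple cases the browser's native deferral may be all you need. A minimal baseline before reaching for a custom component (dimensions are illustrative):

// NativeLazyImage.tsx - browser-native deferral, no extra JavaScript required
import React from 'react';

export const NativeLazyImage: React.FC<{ src: string; alt: string }> = ({ src, alt }) => (
  <img
    src={src}
    alt={alt}
    loading="lazy"   // defer the fetch until the image nears the viewport
    decoding="async" // decode off the main thread where supported
    width={640}      // explicit dimensions prevent layout shift
    height={360}
  />
);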

Here's a comprehensive image lazy loader with blur-up placeholders and error handling:

// LazyImage.tsx - Production image lazy loading
import React, { useState, useEffect, ImgHTMLAttributes } from 'react';
import { useIntersectionObserver } from './useIntersectionObserver';

interface LazyImageProps extends Omit<ImgHTMLAttributes<HTMLImageElement>, 'src' | 'onLoad' | 'onError'> {
  src: string;
  alt: string;
  placeholderSrc?: string;
  aspectRatio?: number;
  onLoad?: () => void;
  onError?: (error: Error) => void;
  threshold?: number;
  rootMargin?: string;
}

/**
 * Lazy loading image component with blur-up effect
 */
export const LazyImage: React.FC<LazyImageProps> = ({
  src,
  alt,
  placeholderSrc,
  aspectRatio,
  onLoad,
  onError,
  threshold = 0.01,
  rootMargin = '200px',
  className = '',
  ...props
}) => {
  const [imageSrc, setImageSrc] = useState<string | undefined>(placeholderSrc);
  const [isLoaded, setIsLoaded] = useState(false);
  const [hasError, setHasError] = useState(false);
  const [ref, isVisible] = useIntersectionObserver<HTMLDivElement>({
    threshold,
    rootMargin,
    freezeOnceVisible: true,
  });

  useEffect(() => {
    if (!isVisible) return;

    const img = new Image();

    img.onload = () => {
      setImageSrc(src);
      setIsLoaded(true);
      onLoad?.();
    };

    img.onerror = () => {
      setHasError(true);
      const error = new Error(`Failed to load image: ${src}`);
      onError?.(error);
    };

    img.src = src;

    return () => {
      img.onload = null;
      img.onerror = null;
    };
  }, [isVisible, src, onLoad, onError]);

  const paddingBottom = aspectRatio ? `${(1 / aspectRatio) * 100}%` : undefined;

  return (
    <div
      ref={ref}
      className={`lazy-image-container ${className}`}
      style={{ paddingBottom }}
    >
      {hasError ? (
        <div className="image-error" role="img" aria-label={alt}>
          <span>Failed to load image</span>
        </div>
      ) : (
        <img
          src={imageSrc}
          alt={alt}
          className={`lazy-image ${isLoaded ? 'loaded' : 'loading'}`}
          loading="lazy"
          {...props}
        />
      )}

      <style>{`
        .lazy-image-container {
          position: relative;
          overflow: hidden;
          background: rgba(255, 255, 255, 0.05);
        }
        .lazy-image {
          position: absolute;
          top: 0;
          left: 0;
          width: 100%;
          height: 100%;
          object-fit: cover;
          transition: filter 0.3s ease, opacity 0.3s ease;
        }
        .lazy-image.loading {
          filter: blur(10px);
          opacity: 0.7;
        }
        .lazy-image.loaded {
          filter: blur(0);
          opacity: 1;
        }
        .image-error {
          display: flex;
          align-items: center;
          justify-content: center;
          width: 100%;
          height: 100%;
          background: rgba(255, 0, 0, 0.1);
          color: #ff6b6b;
        }
      `}</style>
    </div>
  );
};

/**
 * Responsive image with multiple sources
 */
export const ResponsiveLazyImage: React.FC<{
  srcSet: { src: string; width: number }[];
  alt: string;
  sizes?: string;
}> = ({ srcSet, alt, sizes = '100vw' }) => {
  const srcSetString = srcSet
    .map(({ src, width }) => `${src} ${width}w`)
    .join(', ');

  const defaultSrc = srcSet[0].src;

  return (
    <LazyImage
      src={defaultSrc}
      srcSet={srcSetString}
      sizes={sizes}
      alt={alt}
      aspectRatio={16 / 9}
    />
  );
};

The blur-up effect provides visual continuity while images load, preventing jarring layout shifts.
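
A typical usage pairs the component with a tiny low-quality image placeholder (LQIP) generated at build time; the image URLs below are assumptions for illustration:

// ProductCard.tsx - LazyImage with a low-quality placeholder (illustrative)
import React from 'react';
import { LazyImage } from './LazyImage';

export const ProductCard: React.FC = () => (
  <LazyImage
    src="/images/product-hero.jpg"
    placeholderSrc="/images/product-hero-20px.jpg" // tiny blurred preview
    alt="Product hero"
    aspectRatio={16 / 9}
    rootMargin="300px"
    onLoad={() => console.log('hero image loaded')}
  />
);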

Vite Code Splitting Configuration

Modern bundlers like Vite offer superior code splitting with minimal configuration. The following Vite config optimizes chunking for ChatGPT widgets:

// vite.config.ts - Production code splitting configuration
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';
import { visualizer } from 'rollup-plugin-visualizer';
import { VitePWA } from 'vite-plugin-pwa';

export default defineConfig({
  plugins: [
    react(),

    // Bundle analyzer
    visualizer({
      open: false,
      filename: 'dist/stats.html',
      gzipSize: true,
      brotliSize: true,
    }),

    // PWA for offline support
    VitePWA({
      registerType: 'autoUpdate',
      workbox: {
        globPatterns: ['**/*.{js,css,html,ico,png,svg}'],
        runtimeCaching: [
          {
            urlPattern: /^https:\/\/fonts\.googleapis\.com\/.*/i,
            handler: 'CacheFirst',
            options: {
              cacheName: 'google-fonts-cache',
              expiration: {
                maxEntries: 10,
                maxAgeSeconds: 60 * 60 * 24 * 365, // 1 year
              },
            },
          },
        ],
      },
    }),
  ],

  build: {
    target: 'es2020',
    minify: 'terser',

    terserOptions: {
      compress: {
        drop_console: true,
        drop_debugger: true,
        passes: 2,
      },
      mangle: {
        safari10: true,
      },
    },

    rollupOptions: {
      output: {
        manualChunks: (id) => {
          // Vendor chunks
          if (id.includes('node_modules')) {
            // React core
            if (id.includes('react') || id.includes('react-dom')) {
              return 'react-vendor';
            }

            // OpenAI SDK
            if (id.includes('@openai') || id.includes('openai')) {
              return 'openai-sdk';
            }

            // Large libraries
            if (id.includes('chart.js') || id.includes('d3')) {
              return 'charts-vendor';
            }

            // Everything else
            return 'vendor';
          }

          // Widget chunks by type
          if (id.includes('/widgets/inline/')) {
            return 'widgets-inline';
          }
          if (id.includes('/widgets/fullscreen/')) {
            return 'widgets-fullscreen';
          }
          if (id.includes('/widgets/pip/')) {
            return 'widgets-pip';
          }

          // Utilities
          if (id.includes('/utils/')) {
            return 'utils';
          }
        },

        // Chunk naming
        chunkFileNames: 'assets/[name]-[hash].js',
        entryFileNames: 'assets/[name]-[hash].js',
        assetFileNames: 'assets/[name]-[hash].[ext]',
      },
    },

    // Chunk size warnings
    chunkSizeWarningLimit: 500,
  },

  // Dev server optimization
  server: {
    port: 3000,
    strictPort: true,
  },

  // Preview server
  preview: {
    port: 8080,
  },
});

In a typical widget app this configuration yields a React vendor chunk (roughly 50KB), an OpenAI SDK chunk (around 30KB), and per-type widget bundles of 20-40KB each, keeping the initial gzipped payload under about 150KB.
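
To keep those numbers honest in CI, you can add a small post-build check that fails when any chunk exceeds a budget. A sketch using Node's fs APIs, with the output path and budget as assumptions:

// scripts/check-chunk-sizes.ts - fail the build if a chunk exceeds its budget
import { readdirSync, statSync } from 'node:fs';
import { join } from 'node:path';

const ASSETS_DIR = 'dist/assets';  // Vite's default chunk output directory
const BUDGET_BYTES = 200 * 1024;   // 200KB uncompressed per chunk (example budget)

const oversized = readdirSync(ASSETS_DIR)
  .filter(file => file.endsWith('.js'))
  .map(file => ({ file, size: statSync(join(ASSETS_DIR, file)).size }))
  .filter(({ size }) => size > BUDGET_BYTES);

if (oversized.length > 0) {
  for (const { file, size } of oversized) {
    console.error(`Chunk over budget: ${file} (${(size / 1024).toFixed(1)}KB)`);
  }
  process.exit(1);
}

console.log('All chunks within budget');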

Performance Monitoring and Metrics

Lazy loading effectiveness requires continuous monitoring. Track bundle sizes, load times, and Core Web Vitals to ensure optimizations work:

// PerformanceMonitor.tsx - Real-time performance tracking
import { useEffect, useRef } from 'react';

interface PerformanceMetrics {
  LCP: number | null;
  FID: number | null;
  CLS: number | null;
  TTFB: number | null;
  FCP: number | null;
}

/**
 * Performance metrics observer
 */
export class PerformanceMonitor {
  private metrics: PerformanceMetrics = {
    LCP: null,
    FID: null,
    CLS: null,
    TTFB: null,
    FCP: null,
  };

  private observers: PerformanceObserver[] = [];

  constructor(private onMetric?: (name: keyof PerformanceMetrics, value: number) => void) {
    this.observeLCP();
    this.observeFID();
    this.observeCLS();
    this.observeNavigationTiming();
  }

  /**
   * Observe Largest Contentful Paint
   */
  private observeLCP() {
    if (!('PerformanceObserver' in window)) return;

    try {
      const observer = new PerformanceObserver((list) => {
        const entries = list.getEntries();
        const lastEntry = entries[entries.length - 1] as PerformanceEntry & { renderTime?: number; loadTime?: number };

        const value = lastEntry.renderTime || lastEntry.loadTime || 0;
        this.metrics.LCP = value;
        this.onMetric?.('LCP', value);
      });

      observer.observe({ type: 'largest-contentful-paint', buffered: true });
      this.observers.push(observer);
    } catch (e) {
      console.warn('LCP observation failed:', e);
    }
  }

  /**
   * Observe First Input Delay
   */
  private observeFID() {
    if (!('PerformanceObserver' in window)) return;

    try {
      const observer = new PerformanceObserver((list) => {
        const entries = list.getEntries();
        entries.forEach((entry: any) => {
          const value = entry.processingStart - entry.startTime;
          this.metrics.FID = value;
          this.onMetric?.('FID', value);
        });
      });

      observer.observe({ type: 'first-input', buffered: true });
      this.observers.push(observer);
    } catch (e) {
      console.warn('FID observation failed:', e);
    }
  }

  /**
   * Observe Cumulative Layout Shift
   */
  private observeCLS() {
    if (!('PerformanceObserver' in window)) return;

    try {
      let clsValue = 0;

      const observer = new PerformanceObserver((list) => {
        const entries = list.getEntries();
        entries.forEach((entry: any) => {
          if (!entry.hadRecentInput) {
            clsValue += entry.value;
            this.metrics.CLS = clsValue;
            this.onMetric?.('CLS', clsValue);
          }
        });
      });

      observer.observe({ type: 'layout-shift', buffered: true });
      this.observers.push(observer);
    } catch (e) {
      console.warn('CLS observation failed:', e);
    }
  }

  /**
   * Observe navigation timing
   */
  private observeNavigationTiming() {
    if (!('performance' in window) || !performance.getEntriesByType) return;

    const navigationEntries = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];

    if (navigationEntries.length > 0) {
      const nav = navigationEntries[0];

      // Time to First Byte (responseStart is measured from navigation start)
      this.metrics.TTFB = nav.responseStart;
      this.onMetric?.('TTFB', this.metrics.TTFB);

      // First Contentful Paint
      const paintEntries = performance.getEntriesByType('paint');
      const fcpEntry = paintEntries.find(entry => entry.name === 'first-contentful-paint');

      if (fcpEntry) {
        this.metrics.FCP = fcpEntry.startTime;
        this.onMetric?.('FCP', fcpEntry.startTime);
      }
    }
  }

  /**
   * Get all metrics
   */
  getMetrics(): PerformanceMetrics {
    return { ...this.metrics };
  }

  /**
   * Report to analytics
   */
  report(analyticsEndpoint: string) {
    fetch(analyticsEndpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        metrics: this.metrics,
        url: window.location.href,
        timestamp: Date.now(),
      }),
    }).catch(console.error);
  }

  /**
   * Cleanup observers
   */
  disconnect() {
    this.observers.forEach(observer => observer.disconnect());
    this.observers = [];
  }
}

/**
 * React hook for performance monitoring
 */
export function usePerformanceMonitor(enabled: boolean = true) {
  const monitorRef = useRef<PerformanceMonitor | null>(null);

  useEffect(() => {
    if (!enabled) return;

    monitorRef.current = new PerformanceMonitor((name, value) => {
      console.log(`[Performance] ${name}: ${value.toFixed(2)}ms`);
    });

    return () => {
      monitorRef.current?.disconnect();
    };
  }, [enabled]);

  return monitorRef.current;
}

Monitor these thresholds: LCP < 2.5s (good), FID < 100ms (good), CLS < 0.1 (good). Lazy loading should improve all three metrics significantly.
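
Wiring the monitor into an app root takes a few lines; reporting when the page is hidden is a reasonable default. A sketch, with the analytics endpoint as an assumption:

// AppRoot.tsx - collect metrics and report them when the page is hidden
import React, { useEffect } from 'react';
import { PerformanceMonitor } from './PerformanceMonitor';

export const AppRoot: React.FC<{ children: React.ReactNode }> = ({ children }) => {
  useEffect(() => {
    const monitor = new PerformanceMonitor((name, value) => {
      console.log(`[Performance] ${name}: ${value.toFixed(2)}`);
    });

    const handleVisibilityChange = () => {
      if (document.visibilityState === 'hidden') {
        monitor.report('/api/metrics'); // hypothetical analytics endpoint
      }
    };

    document.addEventListener('visibilitychange', handleVisibilityChange);

    return () => {
      document.removeEventListener('visibilitychange', handleVisibilityChange);
      monitor.disconnect();
    };
  }, []);

  return <>{children}</>;
};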

Conclusion

Widget lazy loading transforms ChatGPT app performance. By implementing code splitting, Intersection Observer patterns, lazy hydration, and intelligent asset loading, you achieve sub-2-second initial loads while maintaining rich interactivity. The techniques in this guide—production-ready React components, Vite configuration, and performance monitoring—provide everything needed to build fast, approval-ready ChatGPT applications.

Start with route-based code splitting for immediate wins, then progressively enhance with viewport-based loading and lazy hydration. Monitor Core Web Vitals continuously to validate improvements. Remember: every kilobyte saved in the initial bundle compounds across millions of ChatGPT users.

Ready to build blazing-fast ChatGPT apps without managing lazy loading infrastructure? MakeAIHQ provides automated performance optimization, code splitting, and widget lazy loading built-in. Our platform handles bundler configuration, Intersection Observer setup, and hydration strategies so you can focus on building features. Start your free trial and deploy optimized ChatGPT apps in minutes, not weeks.



Schema Markup:

{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to Lazy Load Widgets in ChatGPT Apps",
  "description": "Optimize widget loading in ChatGPT apps. Code splitting, dynamic imports, Intersection Observer, lazy hydration with React production examples.",
  "step": [
    {
      "@type": "HowToStep",
      "name": "Implement Code Splitting",
      "text": "Separate widgets into individual chunks using React.lazy() and dynamic imports with retry logic and error boundaries."
    },
    {
      "@type": "HowToStep",
      "name": "Add Intersection Observer",
      "text": "Load widgets only when they enter the viewport using Intersection Observer API with configurable root margins for preloading."
    },
    {
      "@type": "HowToStep",
      "name": "Enable Lazy Hydration",
      "text": "Defer JavaScript execution until user interaction using progressive hydration techniques and islands architecture."
    },
    {
      "@type": "HowToStep",
      "name": "Optimize Images and Assets",
      "text": "Implement lazy image loading with blur-up placeholders, responsive sources, and aspect ratio preservation."
    },
    {
      "@type": "HowToStep",
      "name": "Configure Vite Code Splitting",
      "text": "Set up optimal chunk splitting with manual chunks for vendors, widgets, and utilities to minimize initial bundle size."
    },
    {
      "@type": "HowToStep",
      "name": "Monitor Performance Metrics",
      "text": "Track LCP, FID, CLS, and bundle sizes using PerformanceObserver and analytics to validate optimization effectiveness."
    }
  ]
}