Widget State Persistence Strategies for ChatGPT Apps
Building ChatGPT applications that users trust requires sophisticated state persistence strategies. When users refresh their browser, switch devices, or resume conversations hours later, they expect their data to be exactly where they left it. A lost shopping cart, vanished form inputs, or reset preferences destroy user confidence and tank conversion rates.
Modern ChatGPT widget development demands a multi-layered approach to state persistence that balances performance, reliability, and user experience. Unlike traditional web applications, ChatGPT widgets must handle unique challenges: conversation context switching, inline-to-fullscreen transitions, optimistic updates during network latency, and seamless synchronization with the window.openai runtime.
In this comprehensive guide, we'll explore battle-tested state persistence strategies that power production ChatGPT applications serving millions of users. You'll learn when to use client-side storage versus server-side persistence, how to implement state hydration for instant load times, how to shrink state with compression, and how to build robust migration systems that evolve with your application.
Why State Persistence Matters for ChatGPT Apps
State persistence transforms one-time interactions into long-term relationships. Consider these real-world scenarios:
E-commerce widgets: Users browse products, add items to cart, then switch to another conversation. When they return, their cart should contain all selected items with correct quantities and pricing.
Form builders: Multi-step forms in ChatGPT require persistent draft state. Users might complete step 1, leave to research information, then return hours later expecting their progress intact.
Collaborative workspaces: Team members editing shared documents need real-time state synchronization across devices with conflict resolution when multiple users edit simultaneously.
Offline-first applications: Travel planning widgets, expense trackers, and note-taking apps must work without internet connectivity, queuing changes for background sync when online.
Without robust persistence strategies, each scenario results in frustrated users and abandoned workflows. The cost of poor state management compounds over time as user trust erodes.
Client-Side Persistence: localStorage and sessionStorage
Client-side storage provides the fastest persistence solution with zero server latency. Modern browsers offer two primary APIs: localStorage for long-term persistence and sessionStorage for temporary session data.
localStorage State Manager Implementation
localStorage retains data until it is explicitly cleared (by your code, the user, or the browser under storage pressure), making it ideal for user preferences, authentication tokens, and small-to-medium application state (typically 5-10MB depending on the browser).
Here's a production-ready localStorage manager with quota handling, serialization safety, and automatic cleanup:
// localStorageManager.ts - Complete localStorage abstraction
interface StorageConfig {
namespace: string;
maxAge?: number; // milliseconds
compression?: boolean;
}
interface StoredData<T> {
value: T;
timestamp: number;
version: string;
}
export class LocalStorageManager<T = any> {
private namespace: string;
private maxAge: number | null;
private compression: boolean;
constructor(config: StorageConfig) {
this.namespace = config.namespace;
this.maxAge = config.maxAge || null;
this.compression = config.compression || false;
}
/**
* Store data with metadata
*/
set(key: string, value: T, version: string = '1.0.0'): boolean {
const fullKey = this.getFullKey(key);
const data: StoredData<T> = {
value,
timestamp: Date.now(),
version
};
try {
const serialized = JSON.stringify(data);
const stored = this.compression
? this.compress(serialized)
: serialized;
localStorage.setItem(fullKey, stored);
// Sync to window.openai runtime
this.syncToOpenAI(key, value);
return true;
} catch (error) {
if (error instanceof DOMException && error.name === 'QuotaExceededError') {
console.warn('localStorage quota exceeded, attempting cleanup...');
this.cleanup();
// Retry once after cleanup
try {
const serialized = JSON.stringify(data);
localStorage.setItem(fullKey, serialized);
return true;
} catch (retryError) {
console.error('localStorage write failed after cleanup:', retryError);
return false;
}
}
console.error('localStorage write error:', error);
return false;
}
}
/**
* Retrieve data with expiration check
*/
get(key: string): T | null {
const fullKey = this.getFullKey(key);
try {
const stored = localStorage.getItem(fullKey);
if (!stored) return null;
const decompressed = this.compression
? this.decompress(stored)
: stored;
const data: StoredData<T> = JSON.parse(decompressed);
// Check expiration
if (this.maxAge && Date.now() - data.timestamp > this.maxAge) {
this.remove(key);
return null;
}
return data.value;
} catch (error) {
console.error('localStorage read error:', error);
this.remove(key); // Remove corrupted data
return null;
}
}
/**
* Remove specific key
*/
remove(key: string): void {
localStorage.removeItem(this.getFullKey(key));
}
/**
* Clear all data in namespace
*/
clear(): void {
const keys = this.getAllKeys();
keys.forEach(key => localStorage.removeItem(key));
}
/**
* Get all keys in namespace
*/
private getAllKeys(): string[] {
const prefix = `${this.namespace}:`;
const keys: string[] = [];
for (let i = 0; i < localStorage.length; i++) {
const key = localStorage.key(i);
if (key?.startsWith(prefix)) {
keys.push(key);
}
}
return keys;
}
/**
* Cleanup old/expired entries
*/
private cleanup(): void {
const keys = this.getAllKeys();
const now = Date.now();
let cleaned = 0;
keys.forEach(fullKey => {
try {
const stored = localStorage.getItem(fullKey);
if (!stored) return;
const data: StoredData<T> = JSON.parse(stored);
if (this.maxAge && now - data.timestamp > this.maxAge) {
localStorage.removeItem(fullKey);
cleaned++;
}
} catch (error) {
// Remove corrupted entries
localStorage.removeItem(fullKey);
cleaned++;
}
});
console.log(`Cleaned ${cleaned} localStorage entries`);
}
/**
* Get full namespaced key
*/
private getFullKey(key: string): string {
return `${this.namespace}:${key}`;
}
/**
* Compress string data (placeholder - use LZ-string in production)
*/
private compress(data: string): string {
// In production, use: import LZString from 'lz-string';
// return LZString.compress(data);
return data; // No compression in this example
}
/**
* Decompress string data
*/
private decompress(data: string): string {
// In production, use: return LZString.decompress(data);
return data;
}
/**
* Sync to window.openai runtime state
*/
private syncToOpenAI(key: string, value: T): void {
if (typeof window !== 'undefined' && window.openai) {
try {
window.openai.setWidgetState({
[key]: value
});
} catch (error) {
console.warn('Failed to sync to window.openai:', error);
}
}
}
/**
* Get storage usage statistics
*/
getStats(): { count: number; sizeBytes: number; keys: string[] } {
const keys = this.getAllKeys();
let totalSize = 0;
keys.forEach(key => {
const item = localStorage.getItem(key);
if (item) {
totalSize += new Blob([item]).size;
}
});
return {
count: keys.length,
sizeBytes: totalSize,
keys: keys.map(k => k.replace(`${this.namespace}:`, ''))
};
}
}
// Usage example
const widgetStorage = new LocalStorageManager<any>({
namespace: 'chatgpt_widget',
maxAge: 30 * 24 * 60 * 60 * 1000, // 30 days
compression: true
});
// Store user preferences
widgetStorage.set('preferences', {
theme: 'dark',
notifications: true,
language: 'en'
}, '1.0.0');
// Retrieve preferences
const preferences = widgetStorage.get('preferences');
console.log('User preferences:', preferences);
// Check storage usage
const stats = widgetStorage.getStats();
console.log(`Using ${stats.sizeBytes} bytes for ${stats.count} items`);
When to Use localStorage vs. sessionStorage
Use localStorage for:
- User preferences (theme, language, layout)
- Authentication tokens (with encryption)
- Shopping cart items
- Draft form data
- Application settings
- Recently viewed items
Use sessionStorage for:
- Temporary wizard/flow state
- Current page context
- Search filters
- Temporary selections
- One-time tokens
SessionStorage automatically clears when the browser tab closes, providing natural cleanup for temporary data without persistence overhead.
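For example, a checkout wizard can keep its in-progress step in sessionStorage so a tab refresh preserves the draft while closing the tab starts fresh. Here's a minimal sketch; the WizardState shape and the wizard:checkout key are illustrative placeholders, not part of the manager above:
// wizardSession.ts - sessionStorage-backed wizard draft (illustrative sketch)
interface WizardState {
  step: number;
  answers: Record<string, string>;
}
const WIZARD_KEY = 'wizard:checkout'; // hypothetical key for this example
export function saveWizardState(state: WizardState): void {
  try {
    // Survives reloads of this tab, clears automatically when the tab closes
    sessionStorage.setItem(WIZARD_KEY, JSON.stringify(state));
  } catch (error) {
    console.warn('Failed to persist wizard state:', error);
  }
}
export function loadWizardState(): WizardState {
  try {
    const stored = sessionStorage.getItem(WIZARD_KEY);
    return stored ? (JSON.parse(stored) as WizardState) : { step: 1, answers: {} };
  } catch {
    return { step: 1, answers: {} };
  }
}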
IndexedDB for Large Data Storage
When your ChatGPT widget handles datasets exceeding 10MB, rich media content, or complex queries, IndexedDB becomes essential. This asynchronous, transactional NoSQL database is built into the browser, supports secondary indexes, and can store gigabytes of data.
Production IndexedDB Wrapper
Here's a complete IndexedDB wrapper with transaction safety and non-destructive schema upgrades; a React hook that consumes it follows in the next section:
// indexedDBWrapper.ts - Complete IndexedDB abstraction
interface DBConfig {
dbName: string;
version: number;
stores: {
name: string;
keyPath: string;
autoIncrement?: boolean;
indexes?: Array<{
name: string;
keyPath: string | string[];
unique?: boolean;
}>;
}[];
}
export class IndexedDBWrapper {
private dbName: string;
private version: number;
private stores: DBConfig['stores'];
private db: IDBDatabase | null = null;
constructor(config: DBConfig) {
this.dbName = config.dbName;
this.version = config.version;
this.stores = config.stores;
}
/**
* Initialize database connection
*/
async init(): Promise<void> {
return new Promise((resolve, reject) => {
const request = indexedDB.open(this.dbName, this.version);
request.onerror = () => {
reject(new Error(`Failed to open database: ${request.error}`));
};
request.onsuccess = () => {
this.db = request.result;
console.log(`IndexedDB ${this.dbName} v${this.version} initialized`);
resolve();
};
request.onupgradeneeded = (event) => {
const db = (event.target as IDBOpenDBRequest).result;
this.stores.forEach(storeConfig => {
        // Create the store only if it does not already exist; deleting and
        // recreating an existing store would wipe all persisted data
        if (db.objectStoreNames.contains(storeConfig.name)) {
          return;
        }
        const store = db.createObjectStore(storeConfig.name, {
          keyPath: storeConfig.keyPath,
          autoIncrement: storeConfig.autoIncrement || false
        });
// Create indexes
if (storeConfig.indexes) {
storeConfig.indexes.forEach(index => {
store.createIndex(index.name, index.keyPath, {
unique: index.unique || false
});
});
}
});
console.log('Database schema upgraded to version', this.version);
};
});
}
/**
* Add single item to store
*/
async add<T>(storeName: string, item: T): Promise<IDBValidKey> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readwrite');
const store = transaction.objectStore(storeName);
const request = store.add(item);
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Add multiple items (batch operation)
*/
async addBatch<T>(storeName: string, items: T[]): Promise<void> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readwrite');
const store = transaction.objectStore(storeName);
transaction.oncomplete = () => resolve();
transaction.onerror = () => reject(transaction.error);
items.forEach(item => store.add(item));
});
}
/**
* Get single item by key
*/
async get<T>(storeName: string, key: IDBValidKey): Promise<T | undefined> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readonly');
const store = transaction.objectStore(storeName);
const request = store.get(key);
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Get all items from store
*/
async getAll<T>(storeName: string): Promise<T[]> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readonly');
const store = transaction.objectStore(storeName);
const request = store.getAll();
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Query by index
*/
async getByIndex<T>(
storeName: string,
indexName: string,
query: IDBValidKey | IDBKeyRange
): Promise<T[]> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readonly');
const store = transaction.objectStore(storeName);
const index = store.index(indexName);
const request = index.getAll(query);
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Update existing item
*/
async update<T>(storeName: string, item: T): Promise<IDBValidKey> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readwrite');
const store = transaction.objectStore(storeName);
const request = store.put(item);
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Delete single item
*/
async delete(storeName: string, key: IDBValidKey): Promise<void> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readwrite');
const store = transaction.objectStore(storeName);
const request = store.delete(key);
request.onsuccess = () => resolve();
request.onerror = () => reject(request.error);
});
}
/**
* Clear entire store
*/
async clear(storeName: string): Promise<void> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readwrite');
const store = transaction.objectStore(storeName);
const request = store.clear();
request.onsuccess = () => resolve();
request.onerror = () => reject(request.error);
});
}
/**
* Count items in store
*/
async count(storeName: string): Promise<number> {
if (!this.db) throw new Error('Database not initialized');
return new Promise((resolve, reject) => {
const transaction = this.db!.transaction([storeName], 'readonly');
const store = transaction.objectStore(storeName);
const request = store.count();
request.onsuccess = () => resolve(request.result);
request.onerror = () => reject(request.error);
});
}
/**
* Close database connection
*/
close(): void {
if (this.db) {
this.db.close();
this.db = null;
}
}
}
// Database configuration
const dbConfig: DBConfig = {
dbName: 'ChatGPTWidgetDB',
version: 1,
stores: [
{
name: 'documents',
keyPath: 'id',
autoIncrement: true,
indexes: [
{ name: 'userId', keyPath: 'userId' },
{ name: 'timestamp', keyPath: 'timestamp' },
{ name: 'tags', keyPath: 'tags', unique: false }
]
},
{
name: 'media',
keyPath: 'id',
autoIncrement: true,
indexes: [
{ name: 'type', keyPath: 'type' },
{ name: 'size', keyPath: 'size' }
]
}
]
};
// Initialize database
const db = new IndexedDBWrapper(dbConfig);
await db.init();
// Store document
await db.add('documents', {
userId: 'user123',
title: 'Project Plan',
content: '...',
tags: ['work', 'project'],
timestamp: Date.now()
});
// Query by user
const userDocs = await db.getByIndex('documents', 'userId', 'user123');
console.log('User documents:', userDocs);
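Because the documents store defines a timestamp index, the wrapper also supports range queries. As a quick sketch using the instance above, this fetches documents created in the last 24 hours with IDBKeyRange:
// Query the 'documents' store for items created in the last 24 hours
const since = Date.now() - 24 * 60 * 60 * 1000;
const recentDocs = await db.getByIndex('documents', 'timestamp', IDBKeyRange.lowerBound(since));
console.log('Documents from the last 24 hours:', recentDocs.length);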
React Hook for IndexedDB
Integrate IndexedDB with React components using this custom hook:
// useIndexedDB.ts - React hook for IndexedDB operations
import { useState, useEffect, useCallback } from 'react';
import { IndexedDBWrapper } from './indexedDBWrapper';
interface UseIndexedDBOptions {
db: IndexedDBWrapper;
storeName: string;
autoLoad?: boolean;
}
export function useIndexedDB<T>(options: UseIndexedDBOptions) {
const { db, storeName, autoLoad = true } = options;
const [data, setData] = useState<T[]>([]);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<Error | null>(null);
/**
* Load all items from store
*/
const loadAll = useCallback(async () => {
setLoading(true);
setError(null);
try {
const items = await db.getAll<T>(storeName);
setData(items);
// Sync subset to window.openai (avoid large payloads)
if (window.openai && items.length > 0) {
window.openai.setWidgetState({
[storeName]: items.slice(0, 10) // Limit to 10 items
});
}
} catch (err) {
setError(err as Error);
console.error('IndexedDB load error:', err);
} finally {
setLoading(false);
}
}, [db, storeName]);
/**
* Add new item
*/
const addItem = useCallback(async (item: T): Promise<IDBValidKey> => {
try {
const key = await db.add(storeName, item);
await loadAll(); // Refresh data
return key;
} catch (err) {
setError(err as Error);
throw err;
}
}, [db, storeName, loadAll]);
/**
* Update existing item
*/
const updateItem = useCallback(async (item: T): Promise<void> => {
try {
await db.update(storeName, item);
await loadAll();
} catch (err) {
setError(err as Error);
throw err;
}
}, [db, storeName, loadAll]);
/**
* Delete item by key
*/
const deleteItem = useCallback(async (key: IDBValidKey): Promise<void> => {
try {
await db.delete(storeName, key);
await loadAll();
} catch (err) {
setError(err as Error);
throw err;
}
}, [db, storeName, loadAll]);
/**
* Clear all items
*/
const clearAll = useCallback(async (): Promise<void> => {
try {
await db.clear(storeName);
setData([]);
} catch (err) {
setError(err as Error);
throw err;
}
}, [db, storeName]);
// Auto-load on mount if enabled
useEffect(() => {
if (autoLoad) {
loadAll();
}
}, [autoLoad, loadAll]);
return {
data,
loading,
error,
loadAll,
addItem,
updateItem,
deleteItem,
clearAll
};
}
// Usage in React component
function DocumentManager() {
const { data, loading, addItem, deleteItem } = useIndexedDB<Document>({
db: db, // IndexedDBWrapper instance
storeName: 'documents',
autoLoad: true
});
const handleAddDocument = async () => {
await addItem({
userId: 'user123',
title: 'New Document',
content: '',
timestamp: Date.now()
});
};
if (loading) return <div>Loading documents...</div>;
return (
<div>
<button onClick={handleAddDocument}>Add Document</button>
<ul>
{data.map(doc => (
<li key={doc.id}>
{doc.title}
<button onClick={() => deleteItem(doc.id)}>Delete</button>
</li>
))}
</ul>
</div>
);
}
Server-Side State Persistence
Server-side persistence provides the authoritative source of truth for multi-device access, backup recovery, and collaborative features. The window.openai.setWidgetState() API synchronizes client state with the ChatGPT runtime, but you need custom backend sync for cross-session persistence.
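Before reaching for your backend, it's often worth checking whether the runtime already carries state from an earlier render. The sketch below assumes the runtime exposes previously saved state as a readable window.openai.widgetState property (verify the exact shape against the current Apps SDK documentation); the CartState type and fetchFromServer callback are placeholders for the example:
// restoreWidgetState.ts - prefer runtime state, fall back to the server (sketch)
interface CartState {
  items: Array<{ id: string; quantity: number }>;
}
async function restoreCartState(fetchFromServer: () => Promise<CartState>): Promise<CartState> {
  // Assumption: the runtime exposes previously persisted state on window.openai.widgetState
  const runtimeState = (window as any).openai?.widgetState as { cart?: CartState } | undefined;
  if (runtimeState?.cart) {
    return runtimeState.cart;
  }
  // No runtime copy available; fall back to the authoritative server state
  return fetchFromServer();
}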
Server-Side State Synchronization Manager
Implement bidirectional sync with conflict resolution and offline queue:
// serverStateSync.ts - Server-side state synchronization
interface SyncConfig {
apiEndpoint: string;
syncInterval?: number; // milliseconds
retryAttempts?: number;
userId: string;
}
interface SyncQueueItem {
id: string;
storeName: string;
operation: 'create' | 'update' | 'delete';
data: any;
timestamp: number;
retries: number;
}
export class ServerStateSync {
private config: SyncConfig;
private syncQueue: SyncQueueItem[] = [];
private syncing: boolean = false;
private intervalId: number | null = null;
constructor(config: SyncConfig) {
this.config = {
syncInterval: 30000, // 30 seconds default
retryAttempts: 3,
...config
};
this.loadQueueFromStorage();
this.startAutoSync();
}
/**
* Queue state change for sync
*/
async queueSync(
storeName: string,
operation: 'create' | 'update' | 'delete',
data: any
): Promise<void> {
const item: SyncQueueItem = {
id: this.generateId(),
storeName,
operation,
data,
timestamp: Date.now(),
retries: 0
};
this.syncQueue.push(item);
this.saveQueueToStorage();
// Trigger immediate sync if online
if (navigator.onLine) {
await this.processQueue();
}
}
/**
* Process sync queue
*/
private async processQueue(): Promise<void> {
if (this.syncing || this.syncQueue.length === 0) {
return;
}
this.syncing = true;
while (this.syncQueue.length > 0) {
const item = this.syncQueue[0];
try {
await this.syncToServer(item);
// Remove from queue on success
this.syncQueue.shift();
this.saveQueueToStorage();
} catch (error) {
console.error('Sync failed:', error);
// Retry logic
item.retries++;
if (item.retries >= (this.config.retryAttempts || 3)) {
console.warn('Max retries reached, removing from queue:', item);
this.syncQueue.shift();
}
this.saveQueueToStorage();
break; // Stop processing on error
}
}
this.syncing = false;
}
/**
* Sync single item to server
*/
private async syncToServer(item: SyncQueueItem): Promise<void> {
const endpoint = `${this.config.apiEndpoint}/${item.storeName}`;
let url = endpoint;
let method = 'POST';
let body: any = {
userId: this.config.userId,
data: item.data,
timestamp: item.timestamp
};
switch (item.operation) {
case 'create':
method = 'POST';
break;
case 'update':
method = 'PUT';
url = `${endpoint}/${item.data.id}`;
break;
case 'delete':
method = 'DELETE';
url = `${endpoint}/${item.data.id}`;
body = undefined;
break;
}
const response = await fetch(url, {
method,
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${this.getAuthToken()}`
},
body: body ? JSON.stringify(body) : undefined
});
if (!response.ok) {
throw new Error(`Server sync failed: ${response.status} ${response.statusText}`);
}
const result = await response.json();
// Update window.openai with server response
if (window.openai && result.data) {
window.openai.setWidgetState({
[item.storeName]: result.data
});
}
}
/**
* Pull latest state from server
*/
async pullFromServer(storeName: string): Promise<any[]> {
const response = await fetch(
`${this.config.apiEndpoint}/${storeName}?userId=${this.config.userId}`,
{
headers: {
'Authorization': `Bearer ${this.getAuthToken()}`
}
}
);
if (!response.ok) {
throw new Error(`Server pull failed: ${response.status}`);
}
const data = await response.json();
return data.items || [];
}
/**
* Merge server state with local state
*/
async mergeState(
storeName: string,
localItems: any[],
serverItems: any[]
): Promise<any[]> {
const merged = new Map<string, any>();
// Add server items first
serverItems.forEach(item => {
merged.set(item.id, item);
});
// Merge local items (last-write-wins on timestamp)
localItems.forEach(localItem => {
const serverItem = merged.get(localItem.id);
if (!serverItem) {
// Local-only item, queue for sync
this.queueSync(storeName, 'create', localItem);
merged.set(localItem.id, localItem);
} else if (localItem.timestamp > serverItem.timestamp) {
// Local item is newer
this.queueSync(storeName, 'update', localItem);
merged.set(localItem.id, localItem);
}
// If server item is newer, keep it (already in map)
});
return Array.from(merged.values());
}
/**
* Start automatic sync interval
*/
private startAutoSync(): void {
this.intervalId = window.setInterval(() => {
if (navigator.onLine) {
this.processQueue();
}
}, this.config.syncInterval);
// Sync when coming online
window.addEventListener('online', () => {
this.processQueue();
});
}
/**
* Stop automatic sync
*/
stopAutoSync(): void {
if (this.intervalId) {
clearInterval(this.intervalId);
this.intervalId = null;
}
}
/**
* Save queue to localStorage for persistence
*/
private saveQueueToStorage(): void {
try {
localStorage.setItem('sync_queue', JSON.stringify(this.syncQueue));
} catch (error) {
console.error('Failed to save sync queue:', error);
}
}
/**
* Load queue from localStorage
*/
private loadQueueFromStorage(): void {
try {
const stored = localStorage.getItem('sync_queue');
if (stored) {
this.syncQueue = JSON.parse(stored);
}
} catch (error) {
console.error('Failed to load sync queue:', error);
this.syncQueue = [];
}
}
/**
* Generate unique ID
*/
private generateId(): string {
return `${Date.now()}-${Math.random().toString(36).slice(2, 11)}`;
}
/**
* Get authentication token (implement based on your auth system)
*/
private getAuthToken(): string {
// Retrieve from localStorage, cookie, or auth context
return localStorage.getItem('auth_token') || '';
}
}
// Usage example
const syncManager = new ServerStateSync({
apiEndpoint: 'https://api.example.com/widget-state',
syncInterval: 30000,
userId: 'user123'
});
// Queue changes
await syncManager.queueSync('documents', 'create', {
id: '1',
title: 'New Document',
timestamp: Date.now()
});
// Pull and merge server state
const serverDocs = await syncManager.pullFromServer('documents');
const localDocs = await db.getAll('documents');
const mergedDocs = await syncManager.mergeState('documents', localDocs, serverDocs);
State Hydration for Instant Load Times
State hydration deserializes persisted state and restores application context on mount, so returning users see their data immediately instead of spinners or empty defaults.
State Hydration Hook with React
Implement automatic state hydration from multiple sources with fallback priority:
// useStateHydration.ts - Multi-source state hydration
import { useState, useEffect } from 'react';
interface HydrationConfig<T> {
key: string;
sources: Array<'localStorage' | 'sessionStorage' | 'indexedDB' | 'server'>;
defaultValue: T;
dbInstance?: IndexedDBWrapper;
serverEndpoint?: string;
onHydrated?: (value: T, source: string) => void;
}
export function useStateHydration<T>(config: HydrationConfig<T>) {
const [state, setState] = useState<T>(config.defaultValue);
const [hydrated, setHydrated] = useState(false);
const [hydratedFrom, setHydratedFrom] = useState<string | null>(null);
useEffect(() => {
hydrateState();
}, []);
/**
* Attempt hydration from sources in priority order
*/
const hydrateState = async () => {
for (const source of config.sources) {
try {
let value: T | null = null;
switch (source) {
case 'localStorage':
value = hydrateFromLocalStorage();
break;
case 'sessionStorage':
value = hydrateFromSessionStorage();
break;
case 'indexedDB':
value = await hydrateFromIndexedDB();
break;
case 'server':
value = await hydrateFromServer();
break;
}
if (value !== null) {
setState(value);
setHydratedFrom(source);
setHydrated(true);
// Call hydration callback
config.onHydrated?.(value, source);
// Sync to window.openai
if (window.openai) {
window.openai.setWidgetState({ [config.key]: value });
}
return; // Stop after first successful hydration
}
} catch (error) {
console.warn(`Hydration failed from ${source}:`, error);
continue; // Try next source
}
}
// No source provided valid data, use default
setHydrated(true);
console.log('Using default value, no hydration source available');
};
/**
* Hydrate from localStorage
*/
const hydrateFromLocalStorage = (): T | null => {
const stored = localStorage.getItem(config.key);
if (!stored) return null;
try {
return JSON.parse(stored);
} catch (error) {
console.error('localStorage parse error:', error);
return null;
}
};
/**
* Hydrate from sessionStorage
*/
const hydrateFromSessionStorage = (): T | null => {
const stored = sessionStorage.getItem(config.key);
if (!stored) return null;
try {
return JSON.parse(stored);
} catch (error) {
console.error('sessionStorage parse error:', error);
return null;
}
};
/**
* Hydrate from IndexedDB
*/
const hydrateFromIndexedDB = async (): Promise<T | null> => {
if (!config.dbInstance) return null;
try {
const value = await config.dbInstance.get<T>('state', config.key);
return value || null;
} catch (error) {
console.error('IndexedDB hydration error:', error);
return null;
}
};
/**
* Hydrate from server
*/
const hydrateFromServer = async (): Promise<T | null> => {
if (!config.serverEndpoint) return null;
try {
const response = await fetch(
`${config.serverEndpoint}?key=${config.key}`,
{
headers: {
'Authorization': `Bearer ${localStorage.getItem('auth_token')}`
}
}
);
if (!response.ok) return null;
const data = await response.json();
return data.value || null;
} catch (error) {
console.error('Server hydration error:', error);
return null;
}
};
return {
state,
setState,
hydrated,
hydratedFrom
};
}
// Usage in React component
function ShoppingCart() {
const {
state: cartItems,
setState: setCartItems,
hydrated,
hydratedFrom
} = useStateHydration<CartItem[]>({
key: 'cart',
sources: ['localStorage', 'indexedDB', 'server'],
defaultValue: [],
onHydrated: (value, source) => {
console.log(`Cart hydrated from ${source}:`, value);
}
});
if (!hydrated) {
return <div>Loading cart...</div>;
}
return (
<div>
<p>Cart loaded from: {hydratedFrom}</p>
<ul>
{cartItems.map(item => (
<li key={item.id}>{item.name}</li>
))}
</ul>
</div>
);
}
Optimistic Updates for Instant Responsiveness
Optimistic updates immediately reflect user actions in the UI while syncing to persistence layers in the background. This pattern creates instant, responsive interfaces even with slow networks.
Optimistic Update Manager
Implement automatic rollback on failure with user feedback:
// optimisticUpdateManager.ts - Optimistic update pattern
interface OptimisticUpdate<T> {
id: string;
storeName: string;
operation: 'create' | 'update' | 'delete';
optimisticData: T;
previousData?: T;
timestamp: number;
status: 'pending' | 'committed' | 'rolled-back';
}
export class OptimisticUpdateManager<T = any> {
private pendingUpdates: Map<string, OptimisticUpdate<T>> = new Map();
private db: IndexedDBWrapper;
private syncManager: ServerStateSync;
constructor(db: IndexedDBWrapper, syncManager: ServerStateSync) {
this.db = db;
this.syncManager = syncManager;
}
/**
* Apply optimistic update
*/
async applyOptimistic(
storeName: string,
operation: 'create' | 'update' | 'delete',
data: T,
previousData?: T
): Promise<string> {
const updateId = this.generateUpdateId();
const update: OptimisticUpdate<T> = {
id: updateId,
storeName,
operation,
optimisticData: data,
previousData,
timestamp: Date.now(),
status: 'pending'
};
this.pendingUpdates.set(updateId, update);
// Apply to local state immediately
try {
await this.applyToLocalStorage(update);
await this.applyToIndexedDB(update);
await this.applyToOpenAI(update);
// Queue for server sync
await this.syncManager.queueSync(storeName, operation, data);
// Commit after successful server sync (in practice, listen to sync events)
setTimeout(() => this.commitUpdate(updateId), 1000);
return updateId;
} catch (error) {
// Rollback on local failure
await this.rollbackUpdate(updateId);
throw error;
}
}
/**
* Commit update after server confirmation
*/
private async commitUpdate(updateId: string): Promise<void> {
const update = this.pendingUpdates.get(updateId);
if (!update) return;
update.status = 'committed';
this.pendingUpdates.delete(updateId);
console.log(`Update ${updateId} committed`);
}
/**
* Rollback update on failure
*/
async rollbackUpdate(updateId: string): Promise<void> {
const update = this.pendingUpdates.get(updateId);
if (!update) return;
console.warn(`Rolling back update ${updateId}`);
update.status = 'rolled-back';
// Restore previous data
if (update.previousData) {
await this.applyToLocalStorage({
...update,
optimisticData: update.previousData
});
await this.applyToIndexedDB({
...update,
optimisticData: update.previousData
});
await this.applyToOpenAI({
...update,
optimisticData: update.previousData
});
} else if (update.operation === 'create') {
// Remove created item
await this.removeFromLocalStorage(update.storeName, update.optimisticData);
await this.removeFromIndexedDB(update.storeName, update.optimisticData);
}
this.pendingUpdates.delete(updateId);
}
/**
* Apply update to localStorage
*/
private async applyToLocalStorage(update: OptimisticUpdate<T>): Promise<void> {
const key = `${update.storeName}_data`;
const stored = localStorage.getItem(key);
const items: T[] = stored ? JSON.parse(stored) : [];
let updated: T[];
switch (update.operation) {
case 'create':
updated = [...items, update.optimisticData];
break;
case 'update':
updated = items.map(item =>
(item as any).id === (update.optimisticData as any).id
? update.optimisticData
: item
);
break;
case 'delete':
updated = items.filter(
item => (item as any).id !== (update.optimisticData as any).id
);
break;
default:
updated = items;
}
localStorage.setItem(key, JSON.stringify(updated));
}
/**
* Apply update to IndexedDB
*/
private async applyToIndexedDB(update: OptimisticUpdate<T>): Promise<void> {
switch (update.operation) {
case 'create':
await this.db.add(update.storeName, update.optimisticData);
break;
case 'update':
await this.db.update(update.storeName, update.optimisticData);
break;
case 'delete':
await this.db.delete(
update.storeName,
(update.optimisticData as any).id
);
break;
}
}
/**
* Apply update to window.openai state
*/
private async applyToOpenAI(update: OptimisticUpdate<T>): Promise<void> {
if (!window.openai) return;
// Get current state
const currentState = await this.db.getAll<T>(update.storeName);
// Update window.openai with latest state
window.openai.setWidgetState({
[update.storeName]: currentState.slice(0, 10)
});
}
/**
* Remove from localStorage
*/
private async removeFromLocalStorage(
storeName: string,
data: T
): Promise<void> {
const key = `${storeName}_data`;
const stored = localStorage.getItem(key);
const items: T[] = stored ? JSON.parse(stored) : [];
const filtered = items.filter(
item => (item as any).id !== (data as any).id
);
localStorage.setItem(key, JSON.stringify(filtered));
}
/**
* Remove from IndexedDB
*/
private async removeFromIndexedDB(storeName: string, data: T): Promise<void> {
await this.db.delete(storeName, (data as any).id);
}
/**
* Generate unique update ID
*/
private generateUpdateId(): string {
return `opt_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
}
/**
* Get pending updates count
*/
getPendingCount(): number {
return this.pendingUpdates.size;
}
}
// Usage example
const optimisticManager = new OptimisticUpdateManager(db, syncManager);
// Add item with optimistic update
async function addCartItem(item: CartItem) {
try {
const updateId = await optimisticManager.applyOptimistic(
'cart',
'create',
item
);
console.log('Item added optimistically:', updateId);
} catch (error) {
console.error('Optimistic update failed:', error);
// Show error toast to user
}
}
State Compression for Storage Efficiency
Large state objects can exceed storage quotas quickly. Compressing serialized state before writing it often reduces the storage footprint by 60-80% for repetitive JSON data.
State Compression Utility
Use LZ-string for text compression with automatic threshold detection:
// stateCompression.ts - Automatic state compression
import LZString from 'lz-string';
interface CompressionConfig {
threshold: number; // bytes - compress if larger
algorithm: 'lz-string' | 'gzip' | 'none';
}
export class StateCompression {
private config: CompressionConfig;
constructor(config?: Partial<CompressionConfig>) {
this.config = {
threshold: 1024, // 1KB default
algorithm: 'lz-string',
...config
};
}
/**
* Compress state if above threshold
*/
compress(data: any): string {
const serialized = JSON.stringify(data);
const sizeBytes = new Blob([serialized]).size;
// Skip compression for small data
if (sizeBytes < this.config.threshold) {
return JSON.stringify({
compressed: false,
data: serialized
});
}
let compressed: string;
switch (this.config.algorithm) {
      case 'lz-string':
        // compressToUTF16 output is safe to persist in localStorage
        compressed = LZString.compressToUTF16(serialized);
        break;
case 'gzip':
// Use pako library for gzip in browser
compressed = serialized; // Placeholder
break;
default:
compressed = serialized;
}
const compressedSize = new Blob([compressed]).size;
const ratio = ((compressedSize / sizeBytes) * 100).toFixed(1);
console.log(
`Compressed ${sizeBytes} bytes → ${compressedSize} bytes (${ratio}% of original)`
);
return JSON.stringify({
compressed: true,
algorithm: this.config.algorithm,
data: compressed
});
}
/**
* Decompress state
*/
decompress(stored: string): any {
try {
const wrapper = JSON.parse(stored);
if (!wrapper.compressed) {
return JSON.parse(wrapper.data);
}
let decompressed: string;
switch (wrapper.algorithm) {
      case 'lz-string':
        decompressed = LZString.decompressFromUTF16(wrapper.data) || '';
        break;
case 'gzip':
decompressed = wrapper.data; // Placeholder
break;
default:
decompressed = wrapper.data;
}
return JSON.parse(decompressed);
} catch (error) {
console.error('Decompression failed:', error);
throw error;
}
}
/**
* Get compression statistics
*/
getStats(original: any): {
originalSize: number;
compressedSize: number;
ratio: number;
} {
const serialized = JSON.stringify(original);
const compressed = this.compress(original);
const originalSize = new Blob([serialized]).size;
const compressedSize = new Blob([compressed]).size;
return {
originalSize,
compressedSize,
ratio: (compressedSize / originalSize) * 100
};
}
}
// Usage with localStorage
const compression = new StateCompression({
threshold: 2048, // 2KB
algorithm: 'lz-string'
});
// Store compressed state
const largeState = {
products: Array(1000).fill({ name: 'Product', price: 99.99 }),
metadata: '...'
};
const compressed = compression.compress(largeState);
localStorage.setItem('widget_state', compressed);
// Retrieve and decompress
const stored = localStorage.getItem('widget_state');
const decompressed = compression.decompress(stored!);
console.log('Compression stats:', compression.getStats(largeState));
// Example output (numbers vary by data): { originalSize: 45000, compressedSize: 12000, ratio: 26.7 }
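The gzip branch above is intentionally left as a placeholder. One way it could be filled in, sketched here with the pako library (an assumed dependency, not used elsewhere in this guide), is to gzip the serialized string and base64-encode the result so it stores safely as text:
// gzipCompression.ts - sketch of a gzip path using pako (assumed dependency)
import pako from 'pako';
// Compress a JSON string into a base64-encoded gzip payload
export function gzipCompress(serialized: string): string {
  const bytes = pako.deflate(new TextEncoder().encode(serialized));
  let binary = '';
  // Convert byte-by-byte; chunk this loop for very large payloads
  bytes.forEach(byte => { binary += String.fromCharCode(byte); });
  return btoa(binary);
}
// Decompress a base64-encoded gzip payload back to the original string
export function gzipDecompress(encoded: string): string {
  const binary = atob(encoded);
  const bytes = Uint8Array.from(binary, char => char.charCodeAt(0));
  return pako.inflate(bytes, { to: 'string' });
}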
State Migration System
As your ChatGPT application evolves, state schemas change. Implement automatic migration to prevent data loss across versions.
Version-Based Migration Manager
Handle schema changes with versioned migrations:
// stateMigration.ts - Version-based state migration
interface MigrationDefinition {
fromVersion: string;
toVersion: string;
migrate: (oldState: any) => any;
description: string;
}
export class StateMigration {
private migrations: MigrationDefinition[] = [];
private currentVersion: string;
constructor(currentVersion: string) {
this.currentVersion = currentVersion;
}
/**
* Register migration
*/
addMigration(migration: MigrationDefinition): void {
this.migrations.push(migration);
// Sort by version
this.migrations.sort((a, b) =>
this.compareVersions(a.fromVersion, b.fromVersion)
);
}
/**
* Migrate state from old version to current
*/
migrate(state: any, fromVersion: string): any {
if (fromVersion === this.currentVersion) {
return state; // No migration needed
}
let currentState = state;
let currentVersion = fromVersion;
// Apply migrations in sequence
for (const migration of this.migrations) {
      if (this.compareVersions(currentVersion, migration.fromVersion) === 0 &&
          this.compareVersions(migration.toVersion, this.currentVersion) <= 0) {
console.log(
`Migrating state: ${migration.fromVersion} → ${migration.toVersion}`
);
console.log(`Description: ${migration.description}`);
try {
currentState = migration.migrate(currentState);
currentVersion = migration.toVersion;
} catch (error) {
console.error('Migration failed:', error);
throw new Error(
`Migration failed at ${migration.fromVersion} → ${migration.toVersion}: ${error}`
);
}
}
}
return currentState;
}
/**
* Load state with automatic migration
*/
loadWithMigration(key: string): any {
const stored = localStorage.getItem(key);
if (!stored) return null;
try {
const wrapper = JSON.parse(stored);
const { version, state } = wrapper;
// Migrate if needed
const migrated = this.migrate(state, version || '1.0.0');
// Save migrated state
this.save(key, migrated);
return migrated;
} catch (error) {
console.error('Failed to load and migrate state:', error);
return null;
}
}
/**
* Save state with version
*/
save(key: string, state: any): void {
const wrapper = {
version: this.currentVersion,
state,
migratedAt: Date.now()
};
localStorage.setItem(key, JSON.stringify(wrapper));
}
/**
* Compare semantic versions
*/
private compareVersions(v1: string, v2: string): number {
const parts1 = v1.split('.').map(Number);
const parts2 = v2.split('.').map(Number);
for (let i = 0; i < Math.max(parts1.length, parts2.length); i++) {
const part1 = parts1[i] || 0;
const part2 = parts2[i] || 0;
if (part1 > part2) return 1;
if (part1 < part2) return -1;
}
return 0;
}
}
// Define migrations
const migrationManager = new StateMigration('2.0.0');
// Migration 1.0.0 → 1.1.0: Add timestamps
migrationManager.addMigration({
fromVersion: '1.0.0',
toVersion: '1.1.0',
description: 'Add timestamp to all cart items',
migrate: (state) => {
return {
...state,
cart: state.cart.map((item: any) => ({
...item,
addedAt: item.addedAt || Date.now()
}))
};
}
});
// Migration 1.1.0 → 2.0.0: Restructure preferences
migrationManager.addMigration({
fromVersion: '1.1.0',
toVersion: '2.0.0',
description: 'Restructure user preferences object',
migrate: (state) => {
return {
...state,
preferences: {
ui: {
theme: state.preferences?.theme || 'light',
language: state.preferences?.language || 'en'
},
notifications: {
          email: state.preferences?.emailNotifications ?? true,
          push: state.preferences?.pushNotifications ?? false
}
}
};
}
});
// Load state with automatic migration
const state = migrationManager.loadWithMigration('widget_state');
console.log('Loaded state (migrated to 2.0.0):', state);
Best Practices for State Persistence
- Choose the right storage: localStorage for preferences, IndexedDB for large data, server for multi-device
- Implement quota management: Monitor storage usage and clean up old data proactively (see the quota sketch after this list)
- Compress large state: Use LZ-string for text data to reduce storage footprint by 60-80%
- Handle migrations: Version your state schema and provide automatic migration paths
- Encrypt sensitive data: Never store PII or payment info unencrypted
- Sync to window.openai: Keep ChatGPT runtime state synchronized for seamless transitions
- Use optimistic updates: Apply changes immediately, sync in background
- Test persistence: Verify state survives page reloads, browser restarts, and network failures
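For the quota-management practice above, the standard navigator.storage.estimate() API reports approximate usage and quota. A minimal sketch (the 80% threshold is an arbitrary example value):
// storageQuota.ts - check browser storage usage before writing large state (sketch)
export async function isStorageNearQuota(thresholdRatio = 0.8): Promise<boolean> {
  if (!navigator.storage?.estimate) {
    return false; // API unavailable; assume there is room
  }
  const { usage = 0, quota = 0 } = await navigator.storage.estimate();
  if (quota === 0) return false;
  const ratio = usage / quota;
  console.log(`Storage usage: ${(ratio * 100).toFixed(1)}% of quota`);
  return ratio >= thresholdRatio;
}
// Example: check before persisting a large payload
isStorageNearQuota().then(nearQuota => {
  if (nearQuota) {
    console.warn('Approaching storage quota; consider cleaning up old entries');
  }
});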
Choosing the Right Strategy
| Strategy | Use Case | Size Limit | Sync |
|---|---|---|---|
| localStorage | Preferences, UI state | 5-10MB | Manual |
| sessionStorage | Temporary flow state | 5-10MB | Manual |
| IndexedDB | Large datasets, media | Gigabytes | Manual |
| Server Sync | Multi-device, backup | Unlimited | Automatic |
| window.openai | ChatGPT runtime | 4K tokens | Automatic |
For production ChatGPT applications, combine multiple strategies: localStorage for instant preferences, IndexedDB for local cache, server sync for authoritative state, and window.openai for runtime synchronization.
Related Resources
For deeper integration with ChatGPT's state management, explore these comprehensive guides:
- Complete Guide to Building ChatGPT Applications - Master ChatGPT app architecture and patterns
- window.openai API Reference - Complete state management API documentation
- React Widget Components for ChatGPT - Build production React widgets with best practices
- Widget Performance Optimization for ChatGPT - Optimize state management overhead and load times
- IndexedDB API Documentation - Official browser storage reference
- React State Management Guide - React's official state patterns
- LZ-string Compression Library - Fast text compression for JavaScript
Ready to build ChatGPT applications with bulletproof state persistence? Start with MakeAIHQ and deploy production-ready widgets in 48 hours using our no-code platform with built-in state management, automatic persistence, and seamless ChatGPT integration.