Next.js 15/16 cacheLife() and use cache: remote/private — Sharing Cache Across Instances and Protecting Personalized Data with a Single API
Deploying a Next.js App Router application to a serverless environment can surface unexpected problems. Traditional caching via unstable_cache or fetch options keeps an independent in-memory cache per Lambda instance, so every time a request is routed to a different instance, a cache miss occurs and database load spikes. On the flip side, if a CDN inadvertently caches personalized user data, it can lead to a serious security incident where that data is exposed to other users.
The declarative cache API — introduced in Next.js 15 behind the dynamicIO experimental flag and evolved into cacheComponents in Next.js 16 — directly addresses these problems. While it does not fully replace unstable_cache, it introduces a new approach for declaring cache policies at the component and function level, greatly improving code readability and maintainability. There is no need to repeat configuration on every call the way fetch options require, nor to create separate wrapper functions.
This article targets intermediate-to-advanced developers working with the Next.js App Router in production. By combining three strategies — separating lifecycle by content type with cacheLife() profiles, configuring a Redis-backed shared cache with use cache: remote, and isolating personalized data from the CDN with use cache: private — you can manage a consistent caching strategy from in-memory to CDN layer through a single declarative API.
Core Concepts
The Cache Components Model and Per-Version Flags
To use this caching system in Next.js 15, enable the dynamicIO flag. In Next.js 16, it was renamed to cacheComponents and entered the stabilization phase. Always verify the Next.js version in your deployment environment before applying the corresponding flag.
```ts
// next.config.ts — Next.js 15
const nextConfig = {
  experimental: {
    dynamicIO: true, // flag name in Next.js 15
  },
}
export default nextConfig
```

```ts
// next.config.ts — Next.js 16
const nextConfig = {
  experimental: {
    cacheComponents: true, // flag name in Next.js 16
  },
}
export default nextConfig
```

Declaring the `'use cache'` directive at the top of a file applies it to the entire file; declaring it inside a function applies it only to that function. This scoping behavior works identically for the `use cache: remote` and `use cache: private` variants, so declare the directive at exactly the scope you intend.
```ts
// Apply caching to the entire file
'use cache'

export async function BlogList() { /* ... */ }
export async function BlogSidebar() { /* ... */ } // this function is cached too
```

```ts
// Apply caching to a single function only
export async function BlogList() {
  'use cache'
  // only this function is cached
}
```

cacheLife() — Designing Cache Lifecycle with Three Properties
The cacheLife() function controls a cache's lifecycle via three properties: stale, revalidate, and expire.
| Property | Role | Behavior |
|---|---|---|
| `stale` | Client router cache TTL + CDN stale-while-revalidate integration | Delivered via the `x-nextjs-stale-time` header |
| `revalidate` | Triggers background revalidation on the first request after this time | Converted to the `s-maxage` header and forwarded to the CDN |
| `expire` | Absolute expiration time for the cache (must be at least 300 seconds) | Once exceeded, subsequent requests always fetch fresh data |
Background revalidation (stale-while-revalidate): after the `revalidate` time has passed, the first incoming request immediately receives the currently cached response while new data is fetched in the background. This strategy maintains data freshness without degrading the user experience.

Setting `expire` below 5 minutes (300 seconds) excludes the entry from static pre-rendering and treats it as a "Dynamic Hole." That segment is rendered dynamically on every request, eliminating the performance benefit of caching.
The revalidate value is converted to the CDN's s-maxage header, controlling the cache refresh interval at the CDN layer. While use cache: remote handles the server-side shared cache (e.g., Redis), CDN layer control is achieved through this revalidate → s-maxage conversion.
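The interplay of the three properties can be made concrete with a small pure function. This is an illustrative sketch only — the real state machine lives inside Next.js — that classifies a cache entry by its age under a profile like the `blog` profile used in this article:

```typescript
// Illustrative sketch only — Next.js implements this logic internally.
// Classifies a cache entry by its age (in seconds) against a cacheLife profile.
interface CacheProfile {
  stale: number      // serve from client/CDN without asking the server
  revalidate: number // past this, serve stale and refresh in the background
  expire: number     // past this, the entry is unusable; fetch fresh data
}

type CacheState = 'fresh' | 'revalidate-in-background' | 'expired'

function classify(ageSeconds: number, p: CacheProfile): CacheState {
  if (ageSeconds >= p.expire) return 'expired'
  if (ageSeconds >= p.revalidate) return 'revalidate-in-background'
  return 'fresh'
}

const blog: CacheProfile = { stale: 3600, revalidate: 900, expire: 86400 }
console.log(classify(600, blog))   // 10-minute-old entry: still fresh
console.log(classify(1800, blog))  // 30 minutes: served stale, refreshed in background
console.log(classify(90000, blog)) // past 24 hours: refetched synchronously
```

The key point the sketch makes explicit: `revalidate` never blocks a request by itself; only crossing `expire` forces a synchronous fetch.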
Separating Strategies by Content Type with Custom Profiles
Registering named profiles in next.config.ts lets you reuse them by string key in your code. When configuration changes, updating a single place is reflected everywhere the profile is applied.
```ts
// next.config.ts
const nextConfig = {
  experimental: {
    cacheComponents: true,
  },
  cacheLife: {
    blog: {
      stale: 3600, // 1 hour: trust the client/CDN cache
      revalidate: 900, // background refresh after 15 minutes (s-maxage: 900)
      expire: 86400, // 24 hours: hard expiry
    },
    realtime: {
      stale: 0,
      revalidate: 30,
      expire: 300, // minimum of 5 minutes
    },
    static: {
      stale: 86400,
      revalidate: 3600,
      expire: 604800, // 7 days
    },
  },
}
export default nextConfig
```

Practical Application
Example 1: Blog Platform — Separating Profiles by Content Type
This scenario manages post lists, detail pages, and popular tags — each with a different update frequency — using distinct caching strategies. Using cacheTag() alongside this allows selective invalidation of only the affected cache when a post is updated.
```tsx
// components/BlogList.tsx
'use cache'
import { cacheLife, cacheTag } from 'next/cache'
import { db } from '@/lib/db' // direct DB access — relative-path fetch does not work in Server Components

export async function BlogList() {
  cacheLife('blog')
  cacheTag('blog-list')
  const posts = await db.post.findMany({
    orderBy: { createdAt: 'desc' },
    take: 20,
    select: { id: true, title: true, slug: true, createdAt: true },
  })
  return <ul>{posts.map(p => <li key={p.id}>{p.title}</li>)}</ul>
}
```

```tsx
// app/blog/[slug]/page.tsx
'use cache'
import { cacheLife, cacheTag } from 'next/cache'
import { db } from '@/lib/db'

// Note: in Next.js 15+, params is a Promise and must be awaited
export default async function BlogPost({ params }: { params: Promise<{ slug: string }> }) {
  const { slug } = await params
  cacheLife('blog')
  cacheTag(`post-${slug}`)
  const post = await db.post.findUnique({ where: { slug } })
  if (!post) return <NotFound />
  return <article>{post.content}</article>
}
```

```ts
// app/actions/invalidate.ts — on-demand invalidation when a post is edited
'use server'
import { revalidateTag } from 'next/cache'

export async function invalidatePost(slug: string) {
  revalidateTag(`post-${slug}`)
  revalidateTag('blog-list') // refresh the list as well
}
```

| Component | Role |
|---|---|
| `cacheLife('blog')` | Applies the `next.config.ts` profile (stale 1h / revalidate 15m) |
| `cacheTag('blog-list')` | Groups the entire list under one tag for bulk invalidation |
| `` cacheTag(`post-${slug}`) `` | Enables selective invalidation of a specific post |
| `revalidateTag()` | Discards only the relevant cache when an update event occurs |
Example 2: Serverless E-commerce — Shared Product Catalog Cache via Redis
In serverless environments like Lambda or Vercel Functions, the default use cache (in-memory) is not shared across instances, resulting in a low cache hit rate and higher-than-expected database load. Using use cache: remote switches the cache store to an external server-side store like Redis, allowing all instances to share the same cache.
use cache: remote: stores cached results in an external store (Redis, Valkey, etc.) rather than in-memory. On Vercel, the infrastructure is configured automatically; in self-hosted environments, you connect Redis directly via `cacheHandlers.remote`. This is a separate mechanism from CDN response-header control.
```ts
// cache-handlers/redis-handler.ts
import { createClient } from 'redis'

const client = createClient({ url: process.env.REDIS_URL })

async function getClient() {
  if (!client.isOpen) {
    await client.connect()
  }
  return client
}

export default {
  async get(key: string) {
    try {
      const c = await getClient()
      const value = await c.get(key)
      return value ? JSON.parse(value) : undefined
    } catch (error) {
      console.error('[CacheHandler] get error:', error)
      return undefined // treat as a cache miss so the service keeps running
    }
  },
  async set(key: string, value: unknown, ttl: number) {
    try {
      const c = await getClient()
      await c.setEx(key, ttl, JSON.stringify(value))
    } catch (error) {
      console.error('[CacheHandler] set error:', error)
    }
  },
  async delete(key: string) {
    try {
      const c = await getClient()
      await c.del(key)
    } catch (error) {
      console.error('[CacheHandler] delete error:', error)
    }
  },
}
```

```ts
// next.config.ts
const nextConfig = {
  experimental: { cacheComponents: true },
  cacheHandlers: {
    remote: require.resolve('./cache-handlers/redis-handler'),
  },
  cacheLife: {
    catalog: { stale: 3600, revalidate: 1800, expire: 7200 },
  },
}
export default nextConfig
```

```tsx
// components/ProductCatalog.tsx
// Declared at the top of the file → 'use cache: remote' applies to every function in this file
'use cache: remote'
import { cacheLife, cacheTag } from 'next/cache'
import { db } from '@/lib/db'

interface Props {
  categoryId: string
}

export async function ProductCatalog({ categoryId }: Props) {
  cacheLife('catalog')
  cacheTag(`catalog-${categoryId}`)
  const products = await db.product.findMany({
    where: { categoryId, isActive: true },
    select: { id: true, name: true, price: true, imageUrl: true },
  })
  return <ProductGrid products={products} />
}
```

| Component | Role |
|---|---|
| `'use cache: remote'` (top of file) | Delegates the cache store for the entire file to Redis |
| `cacheHandlers.remote` | Registers the path to the custom Redis handler |
| `cacheLife('catalog')` | Catalog-specific lifecycle (stale 1h / revalidate 30m) |
| `` cacheTag(`catalog-${categoryId}`) `` | Enables selective invalidation per category |
Example 3: User Dashboard — Protecting Personalized Data with use cache: private
If a CDN caches dashboard data for a logged-in user, it can be exposed to other users. Using use cache: private lets you access cookies() while automatically setting a Cache-Control: private header on the response, which blocks CDN caching.
Cache-Control: private: A response with this header is not stored in intermediate caches (CDNs, proxies) and is cached only in the end user's browser. This prevents the CDN from sharing session tokens or personalized data.
```tsx
// components/UserDashboard.tsx
// Declared inside the function → 'use cache: private' applies to this function only
import { cacheLife } from 'next/cache'
import { cookies } from 'next/headers'

// fetchUserData: internal helper that looks up user data by sessionToken
// return type: { name: string; stats: UserStats; recentActivity: Activity[] }
export async function UserDashboard() {
  'use cache: private'
  cacheLife({ stale: 300, revalidate: 60, expire: 600 })
  const cookieStore = await cookies()
  const sessionToken = cookieStore.get('session')?.value
  if (!sessionToken) return <LoginPrompt />
  const userData = await fetchUserData(sessionToken)
  return <DashboardView data={userData} />
}
```

Note: calling `cookies()` or `headers()` inside a regular `'use cache'` scope causes a build error. If a component needs access to request runtime APIs, you must use `'use cache: private'`.
The three examples can be combined based on service scale and requirements. For smaller services, the profile separation in Example 1 alone may be sufficient; for serverless deployments, add the Redis shared cache from Example 2; and if authentication is involved, apply Example 3 in parallel — building up incrementally as needed.
Pros and Cons
Advantages
| Item | Details |
|---|---|
| Fine-grained cache control | stale/revalidate/expire can be configured independently per component or function |
| Automated CDN integration | The revalidate value is converted to s-maxage, automatically controlling CDN refresh intervals |
| Declarative API | Business logic and cache policy are managed in a single file, improving readability |
| RSC serialization support | Not just JSON data — entire React Server Component output can be cached |
| On-demand invalidation | cacheTag() + revalidateTag() enable selective discarding of specific caches |
| Cross-instance sharing | use cache: remote + Redis maximizes cache hit rates in serverless environments |
Disadvantages and Caveats
| Item | Details | Mitigation |
|---|---|---|
| Experimental flag | `cacheComponents` (`dynamicIO`) is still an experimental feature | Pin the Next.js version when applying to production |
| Serverless in-memory limitation | Default `use cache` cannot be shared across instances | Introduce `use cache: remote` + Redis |
| `use cache: private` constraint | Known behavior issues during client-side navigation (GitHub #85672) | Verify data refresh after page transitions |
| `Vary` header limitation | CDN cache variant management is not fully supported (GitHub Discussion #82571) | Supplement complex variant caching with edge middleware |
| Runtime API restriction | Calling `cookies()` or `headers()` inside regular `use cache` causes a build error | Switch to `use cache: private` |
| `expire` below 5 minutes | Excluded from static pre-rendering and treated as a Dynamic Hole | Always set `expire` to at least 300 seconds |
Dynamic Hole: a segment in a statically pre-rendered page whose cache TTL is too short for static processing. That segment is rendered dynamically on every request, eliminating the performance benefit of caching.

Vary header: an HTTP response header that tells the CDN which request header values cause the response to vary. For example, `Vary: Accept-Language` means separate caches must be maintained per language.
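When the built-in `Vary` support falls short, the usual workaround is to fold the varying header into the cache key yourself, for example in edge middleware that rewrites to a locale-specific path. A minimal, framework-free sketch of the key-derivation idea (not a Next.js API):

```typescript
// Framework-free sketch: build a cache key that varies by selected headers,
// mimicking what a Vary header asks a CDN to do.
function varyKey(
  url: string,
  headers: Record<string, string>,
  varyOn: string[],
): string {
  const parts = varyOn
    .map(h => h.toLowerCase())
    .sort() // order-independent: Vary: A, B is equivalent to Vary: B, A
    .map(h => `${h}=${headers[h] ?? ''}`)
  return `${url}::${parts.join('|')}`
}

console.log(varyKey('/blog', { 'accept-language': 'ko' }, ['Accept-Language']))
// same URL, one cache entry per language
```

Requests that differ only in `Accept-Language` produce distinct keys, so each language gets its own cached variant.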
The Most Common Mistakes in Practice
- Calling `cookies()` inside a regular `'use cache'` scope — this causes a build-time error. For components that require request context, switching to `'use cache: private'` is the correct approach.
- Setting `expire` below 5 minutes — the entry is excluded from static pre-rendering and unintentionally falls into the dynamic rendering path, failing to deliver the expected performance benefit. It is always safer to set `expire` to at least 300 seconds (5 minutes).
- Using only the default `use cache` in a serverless environment — each Lambda instance maintains its own independent cache, resulting in a low cache hit rate and higher-than-expected database load. In serverless environments, consider adopting `use cache: remote` together with Redis.
Closing Thoughts
Here is a quick summary of how to choose the right directive:
- Static, public content (blog posts, product detail pages, etc.) → `'use cache'` + a `cacheLife` profile to control CDN `s-maxage`
- Shared cache in serverless multi-instance environments → `'use cache: remote'` + Redis `cacheHandlers` for cross-instance sharing
- Personalized or session-based data → `'use cache: private'` for CDN isolation, leveraging the browser cache only
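That decision guide condenses into a pure function, which can serve as a team convention or a starting point for a lint rule (a hypothetical helper, not a Next.js API):

```typescript
// Hypothetical helper encoding the directive decision guide above.
type CacheDirective = 'use cache' | 'use cache: remote' | 'use cache: private'

interface DataTraits {
  personalized: boolean // depends on cookies or a session?
  multiInstance: boolean // serverless / multiple server instances?
}

function pickDirective(t: DataTraits): CacheDirective {
  if (t.personalized) return 'use cache: private' // never let a CDN share it
  if (t.multiInstance) return 'use cache: remote' // share via Redis
  return 'use cache' // default in-memory cache is enough
}

console.log(pickDirective({ personalized: false, multiInstance: true }))
// personalized data always wins: even on serverless it must stay private
```

Note that the personalization check comes first on purpose: a personalized response must never land in a shared store, regardless of the deployment topology.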
Three steps to get started right now:
- Add the version-appropriate experimental flag (`dynamicIO` or `cacheComponents`) to `next.config.ts`, and register profiles matching your service's content types (`blog`, `realtime`, `static`, etc.) under the `cacheLife` configuration.
- Apply `'use cache'` and `cacheLife('profileName')` to the most frequently accessed components or data functions, then verify in the Network tab that the `x-nextjs-stale-time` header and `s-maxage` value are delivered as intended.
- If you are on serverless, install the `@neshca/cache-handler` or `nextjs-turbo-redis-cache` package via `pnpm add`, connect it to `cacheHandlers.remote`, then use Redis MONITOR to verify that the shared cache is working correctly with `'use cache: remote'`.
Next article: Building a webhook-based on-demand cache invalidation pipeline with `cacheTag()` + `revalidateTag()` — an automated strategy for receiving CMS update events and selectively discarding only the affected caches.
References
- Directives: use cache | Next.js Official Docs
- Directives: use cache: remote | Next.js Official Docs
- Directives: use cache: private | Next.js Official Docs
- Functions: cacheLife | Next.js Official Docs
- next.config.js: cacheLife | Next.js Official Docs
- next.config.js: cacheHandlers | Next.js Official Docs
- Guides: CDN Caching | Next.js Official Docs
- Getting Started: Cache Components | Next.js Official Docs
- Cache Components for Instant and Fresh Pages | Vercel Academy
- Next.js 16 Release Notes | Next.js Blog
- Next.js 16.2 Caching: unstable_cache vs use cache | Build with Matija
- Next.js Caching Explained | DEV Community
- @neshca/cache-handler | Caching Tools