# Runtimes

Source: https://vercel.com/docs/functions/runtimes

Last updated: February 18, 2026

Vercel supports multiple runtimes for your functions. Each runtime has its own set of libraries, APIs, and functionality, offering different trade-offs and benefits.
Runtimes transform your source code into Functions, which are served by our CDN.
## Official runtimes

Vercel Functions support the following official runtimes:
| Runtime | Description |
| --- | --- |
| Node.js | The Node.js runtime takes an entrypoint of a Node.js function, builds its dependencies (if any), and bundles them into a Vercel Function. |
| Bun | The Bun runtime takes an entrypoint of a Bun function, builds its dependencies (if any), and bundles them into a Vercel Function. |
| Python | The Python runtime takes in a Python program that defines a singular HTTP handler and outputs it as a Vercel Function. |
| Rust | The Rust runtime takes an entrypoint of a Rust function using the `vercel_runtime` crate and compiles it into a Vercel Function. |
| Go | The Go runtime takes in a Go program that defines a singular HTTP handler and outputs it as a Vercel Function. |
| Ruby | The Ruby runtime takes in a Ruby program that defines a singular HTTP handler and outputs it as a Vercel Function. |
| Wasm | The Wasm runtime takes in a pre-compiled WebAssembly program and outputs it as a Vercel Function. |
| Edge | The Edge runtime is built on top of the V8 engine, allowing it to run in isolated execution environments that don't require a container or virtual machine. |
## Community runtimes

If you would like to use a language that Vercel does not support by default, you can use a community runtime by setting the `functions` property in `vercel.json`. For more information on configuring other runtimes, see Configuring your function runtime.

The following community runtimes are recommended by Vercel:
| Runtime | Runtime Module | Docs |
| --- | --- | --- |
| Bash | `vercel-bash` | https://github.com/importpw/vercel-bash |
| Deno | `vercel-deno` | https://github.com/vercel-community/deno |
| PHP | `vercel-php` | https://github.com/vercel-community/php |

You can create a community runtime by using the Runtime API. Alternatively, you can use the Build Output API.
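For example, a `vercel.json` that routes PHP files through the community `vercel-php` runtime might look like the sketch below. The glob pattern and the pinned module version are illustrative; check the module's documentation for the current release.

```json
{
  "functions": {
    "api/**/*.php": {
      "runtime": "vercel-php@0.7.3"
    }
  }
}
```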
## Features

- Location: Deployed region-first; the location can be customized. Pro and Enterprise teams can set multiple regions.
- Failover: Automatic failover to defined regions.
- Automatic concurrency scaling: Auto-scales up to 30,000 (Hobby and Pro) or 100,000+ (Enterprise) concurrent executions.
- Isolation boundary: microVM.
- File system support: Read-only filesystem with a writable `/tmp` scratch space of up to 500 MB.
- Archiving: Functions are archived when not invoked.
- Functions created per deployment: Hobby: framework-dependent; Pro and Enterprise: no limit.

### Location

Location refers to where your functions are executed. Vercel Functions are region-first and can be deployed to up to 3 regions on Pro or 18 on Enterprise. Deploying to more regions than your plan allows will cause your deployment to fail before it enters the build step.
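As a sketch, function regions can be set with the `regions` property in `vercel.json`; the region IDs shown here are illustrative:

```json
{
  "regions": ["iad1", "sfo1"]
}
```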
### Failover mode

Vercel's failover mode refers to the system's behavior when a function fails to execute because of data center downtime.

Vercel provides redundancy and automatic failover for Vercel Functions using the Edge runtime. For Vercel Functions on the Node.js runtime, you can use the `functionFailoverRegions` configuration in your `vercel.json` file to specify which regions the function should automatically fail over to.
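A minimal `vercel.json` sketch of this configuration, with illustrative region IDs:

```json
{
  "functionFailoverRegions": ["dub1", "fra1"]
}
```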
### Isolation boundary

In Vercel, the isolation boundary refers to the separation of individual instances of a function to ensure they don't interfere with each other. This provides a secure execution environment for each function.

With traditional serverless infrastructure, each function uses a microVM for isolation, which provides strong security but also makes functions slower to start and more resource intensive.
### File system support

Filesystem support refers to a function's ability to read and write to the filesystem. Vercel Functions have a read-only filesystem with a writable `/tmp` scratch space of up to 500 MB.
### Archiving

Vercel Functions are archived when they are not invoked:

- within 2 weeks for Production Deployments
- within 48 hours for Preview Deployments

Archived functions are unarchived when they're invoked, which can make the initial cold start at least 1 second longer than usual.
### Functions created per deployment

When using Next.js or SvelteKit on Vercel, dynamic code (APIs, server-rendered pages, or dynamic fetch requests) is bundled into the fewest possible number of Vercel Functions to help reduce cold starts. Because of this, it's unlikely that you'll hit the limit of 12 bundled Vercel Functions per deployment.

When using other frameworks, or Vercel Functions directly without a framework, every API maps directly to one Vercel Function. For example, having five files inside `api/` would create five Vercel Functions. On Hobby, this approach is limited to 12 Vercel Functions per deployment.
## Caching data

A runtime can retain an archive of up to 100 MB of the filesystem at build time. The cache key is generated as a combination of:

- Project name
- Team ID or User ID
- Entrypoint path (e.g., `api/users/index.go`)
- Runtime identifier including version (e.g., `@vercel/go@0.0.1`)

The cache is invalidated if any of those items changes. You can bypass the cache by running `vercel -f`.
## Environment variables

You can use environment variables to manage dynamic values and sensitive information that affect the operation of your functions. Vercel allows developers to define these variables either at deployment or during runtime.

You can use a total of 64 KB of environment variables per deployment on Vercel. This limit applies to all variables combined, so no single variable can be larger than 64 KB.
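A minimal sketch of reading such a variable inside a function; `TIMEOUT_MS` is a hypothetical variable name, and the fallback keeps the function working when it is unset:

```javascript
// Sketch: reading configuration from an environment variable with a
// safe fallback. TIMEOUT_MS is a hypothetical variable name.
function getTimeoutMs(env = process.env) {
  const parsed = Number.parseInt(env.TIMEOUT_MS ?? '', 10);
  return Number.isNaN(parsed) ? 5000 : parsed;
}

console.log(getTimeoutMs({ TIMEOUT_MS: '250' })); // 250
console.log(getTimeoutMs({})); // 5000 (fallback)
```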
## Vercel features support

The following features are supported by Vercel Functions:

### Secure Compute

Vercel's Secure Compute feature offers enhanced security for your Vercel Functions, including dedicated IP addresses and VPN options. This can be particularly important for functions that handle sensitive data.
### Streaming

Streaming refers to the ability to send or receive data in a continuous flow.

The Node.js runtime supports streaming by default. Streaming is also supported when using the Python runtime.

Vercel Functions have a maximum duration, meaning that it isn't possible to stream indefinitely.

Node.js and Edge runtime streaming functions support the `waitUntil` method, which allows an asynchronous task to be performed during the lifecycle of the request. This means that while your function will likely run for the same amount of time, your end users can have a better, more interactive experience.
### Cron jobs

Cron jobs are time-based scheduling tools used to automate repetitive tasks. When a cron job is triggered through its cron expression, it calls a Vercel Function.
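A `vercel.json` sketch of scheduling a function this way; the path is a hypothetical route, and the expression runs it daily at 05:00 UTC:

```json
{
  "crons": [
    {
      "path": "/api/sync",
      "schedule": "0 5 * * *"
    }
  ]
}
```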
### Vercel Storage

From your function, you can communicate with a choice of data stores. To ensure low-latency responses, it's crucial to have compute close to your databases. Always deploy your databases in the regions closest to your functions to avoid long network round trips. For more information, see our best practices documentation.
### Edge Config

An Edge Config is a global data store that enables experimentation with feature flags, A/B testing, critical redirects, and IP blocking. It enables you to read data at the edge without querying an external database or hitting upstream servers.
### Tracing

Vercel supports Tracing, which allows you to send OpenTelemetry traces from your Vercel Functions to any application performance monitoring (APM) vendor.
