Node.js Hosting Options

Jeff Morhous
@JeffMorhous

Choosing the right hosting environment for a Node.js application shapes both your development workflow and your application's performance. The hosting option you choose directly affects the developer experience (how easy deployments and updates are), the cost model of running your app, its scalability under load, and how much control (and responsibility) you have over your infrastructure.
For example, a fully managed platform can eliminate server maintenance at the cost of less flexibility and more money, whereas running your own server gives maximum control but demands more operational work.
Your goal in deciding where to host a Node app is to align your hosting choice with your app’s technical requirements and your team’s capacity to manage the underlying infrastructure.
Different types of Node apps have different needs
APIs built with Node are stateless request/response services and are a good fit for most hosting models. A Node.js API can run on anything from a cheap VPS to serverless functions, since each request is independent and typically short-lived.
Real-time apps (like those with WebSockets), on the other hand, need persistent connections. Things like chat apps or live dashboards require hosting that supports long-lived network sockets. Traditional servers or container-based platforms are often necessary here as pure serverless platforms often don’t allow WebSockets or constant connections. For example, Vercel’s serverless functions cannot hold always-on WebSocket connections, but they do support WebSockets through their Edge Runtime.
Server-rendered apps (think Next.js) are certainly a special case. Frameworks like Next.js generate (most) pages server-side and often do well with serverless deployment. Next.js is tightly integrated with Vercel, which offers zero-configuration deployment, serverless functions for API routes, and edge caching for static assets. Many teams choose serverless platforms for these SSR apps to leverage features like automatic CDN distribution and on-demand scaling without managing servers. However, this serverless approach comes with tradeoffs in execution time limits and statefulness, which we’ll discuss later.
First, let’s talk about the option that demands the most of you.
Hosting Node apps on a VPS (or similar cloud service)
Running a Node.js app on a VPS (Virtual Private Server), an Amazon EC2 instance, or another cloud virtual machine gives you maximum control over the environment. But with that comes maximum responsibility.
On a VPS, you get root access to install any OS packages, configure the stack exactly as you want, and run any background processes you need. This flexibility is powerful for custom setups, but the maintenance burden on you or your team is high. You are in charge of everything under the hood.
Applying OS security patches, monitoring disk and CPU usage, setting up firewalls, managing backups, and handling scaling manually are all things you should be prepared to manage if you go this route.
Using infrastructure-as-code and containers can ease some pain, but won’t eliminate ops work. Tools like Kamal can simplify deploying a containerized app to a VPS. However, Kamal doesn’t handle the surrounding infrastructure needs. You still need to set up things like load balancers, databases with backups, log aggregation, and system monitoring yourself.
Containers help by packaging your Node.js app with its dependencies, making it portable and consistent across environments. But the VPS still needs to have everything the container needs. You’ll still be responsible for orchestrating containers, scaling them, and managing the host VM’s health.
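Packaging a Node app into a container is usually a short Dockerfile. This is a sketch, assuming a project with a package.json and a server.js entry point (adjust the names and Node version for your app):

```dockerfile
# Sketch of a container image for a Node.js app. Assumes package.json
# and a server.js entry point; adjust for your project.
FROM node:20-alpine
WORKDIR /app

# Copy and install dependencies first so this layer is cached
# between builds when only application code changes
COPY package*.json ./
RUN npm ci --omit=dev

COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Even with this in place, the VPS route still leaves the host's OS, networking, and monitoring on your plate.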
Hosting on a VPS or cloud VM is fine if you need fine-grained control or have specialized requirements that platforms don’t support. But it’s not an option I can recommend unless you have a dedicated ops team (or you just really love that sort of thing). I’ve hosted small projects on a VPS, and it’s always been more headache than the cost savings were worth.
Hosting your Node app on a PaaS
Platform-as-a-Service (PaaS) offerings strike a middle ground by handling most infrastructure concerns while still letting you run a “server-like” app. Platforms like Heroku, Render, Amazon ECS on Fargate, and Fly.io are PaaS leaders.
They allow you to push your Node.js code (via Git or container image) and then they build, run, and serve your application in a managed environment. Platforms abstract away the server (or VPS) management.
Most platforms give you the choice of deploying with containers or without them, so your setup can be even simpler, with you only managing the app itself.
With platforms, there’s very little manual configuration and management. You get a deployment platform that automates scaling, security updates, and (some) monitoring, usually through a web dashboard or CLI. Developers can focus on code and let the platform handle the “ops” heavy lifting.
Using a PaaS still provides you with the flexibility to run long-running processes and async job queues like BullMQ or Bee-Queue, which are things that pure serverless platforms don’t support.
The general-purpose nature of PaaS means it doesn’t matter whether you’re deploying a frontend, a Node API, or a background worker. This makes platforms the best option for most Node apps.
You get persistent Node.js processes that can maintain state in memory, hold database connection pools, handle WebSocket connections, and even schedule cron jobs without worrying about hitting an execution timeout or some vendor constraint. Essentially, it offers the convenience of managed hosting without the severe limitations on process lifespan that come with serverless function environments.
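The in-process scheduling a persistent dyno allows can be sketched with nothing more than setInterval (a real app might reach for node-cron or a queue's repeatable jobs, but the point is the process lives long enough to run them):

```javascript
// A persistent process can keep state in memory and schedule its own work.
// Sketch only: a real app might use node-cron or a job queue's
// repeatable jobs instead of a bare interval.
const stats = { ticks: 0 };

function startHeartbeat(intervalMs) {
  // This runs for the life of the process - impossible on a platform
  // that tears the runtime down after each request completes.
  return setInterval(() => {
    stats.ticks += 1;
  }, intervalMs);
}

const timer = startHeartbeat(10);
```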
You get a managed environment that dramatically reduces your operations overhead, but you keep quite a bit of control.
But serverless is right for some apps! Let’s look into that next.
Hosting serverless Node apps on Vercel or Netlify
Serverless platforms like Vercel and Netlify have gained popularity, especially for frontend-oriented and Jamstack applications. Vercel hired much of the React core team away from Meta and has stewarded the development of both React and Next.js, which positions them well to support Next apps in particular.
In a serverless model, you don’t maintain a running server process. Instead, your Node.js code is deployed as functions that execute on demand in response to requests (or events) and then terminate. This model brings automatic scaling per request – every incoming request can spin up a new isolated function instance if needed, so capacity can increase seemingly without bound, and you never pay for idle time.
Vercel and Netlify both provide an experience where you connect a Git repo, and they build and deploy your site with serverless functions backing any dynamic endpoints or API routes. This gives a fantastic developer experience for certain use cases. Frontend-heavy apps get static hosting plus dynamic capabilities without ever thinking about servers, and things like CI/CD, CDN distribution, and SSL are handled for you out of the box.
I host my personal site and a few simple projects on Vercel and am quite happy with how hands-off it’s made hosting. For my simple Next.js app, Vercel is a very good fit and also free.
That being said, if I want to expand this application to include more functionality, I’d probably run into some limitations.
The first major limitation is that serverless functions on these platforms have hard time limits. This means you cannot do long processing jobs directly. If your Node app needs to generate a large report or process a big file, you’ll likely exceed these limits and the platform will kill the function.
Long-running tasks have to be offloaded to external services or broken into much smaller jobs. But Vercel and Netlify do not allow running arbitrary background worker processes. You can’t have a worker listening to a queue or a scheduler that continuously runs in the background. “Background Functions” on Netlify simply allow a single function invocation to run longer (up to 15 minutes) asynchronously, but they are not equivalent to an always-on worker process.
Vercel recently introduced scheduled functions, which are cron-like triggers, but these are just periodic invocations of serverless functions, not persistent jobs. Any asynchronous or delayed work in a serverless architecture has to be handed off to another system (using an external job queue service, or triggering an AWS Lambda via event).
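On Vercel, these cron-like triggers are declared in vercel.json. The sketch below (with a hypothetical /api/daily-report route) schedules a function invocation every day at 5:00 UTC; each run is a fresh, short-lived invocation, not a persistent job:

```json
{
  "crons": [
    {
      "path": "/api/daily-report",
      "schedule": "0 5 * * *"
    }
  ]
}
```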
This is a fundamental design difference. Traditional platforms (like Heroku, Render, etc) let you run a worker indefinitely, whereas on Netlify/Vercel, you might schedule a function to run every few minutes, but it will start fresh and then terminate each time.
Both Vercel and Netlify abstract away containers and don’t let you deploy a custom Docker image to their platform. You are limited to the runtimes and languages they support and the build process they provide. While the support is often sufficient, the platform’s provided environment is the only environment. Vercel and Netlify focus on source-based deployment and static assets, not running arbitrary containers.
They are great at what they do (fast frontend deployments), but aren’t general-purpose hosting for any kind of app.
Autoscaling a Node app
Scalability is a big question for web developers, and different platforms scale Node apps in different ways. Understanding your autoscaling options and their implications for performance and cost matters when choosing a host.
On traditional setups like VPS or self-managed servers, scaling is usually manual unless you build your own scripts or use cloud vendor tools to spin up new VMs. By contrast, PaaS platforms typically offer some form of horizontal autoscaling for Node apps, but the responsiveness to load can vary.
Heroku, for example, has a built-in autoscaler (available on certain tiers) that can add or remove dynos based on response time thresholds. The caveat with this metric is that it can react sluggishly or scale at the wrong times, since response time is a noisy, lagging signal.
This is why third-party solutions like Judoscale have emerged. Judoscale focuses on request queue time as the metric to decide scaling, which directly measures if requests are backing up due to a lack of capacity.
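Queue time can be derived from the X-Request-Start header that routers like Heroku's attach (on Heroku it is a milliseconds-since-epoch timestamp of when the request hit the router). A sketch of the calculation:

```javascript
// Sketch: derive request queue time from the X-Request-Start header,
// which routers like Heroku's set to the time the request arrived at
// the router (milliseconds since epoch on Heroku).
function queueTimeMs(headers, now = Date.now()) {
  const raw = headers["x-request-start"];
  if (!raw) return null;
  const start = parseInt(raw, 10);
  if (Number.isNaN(start)) return null;
  // Clamp to zero in case of clock skew between router and dyno
  return Math.max(0, now - start);
}
```

A sustained rise in this number means requests are sitting in the router queue waiting for a free process, which is a direct signal that you need more capacity.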
Judoscale will add more web processes as capacity demands it, and it also watches your job queues to autoscale worker processes. If you want reliable autoscaling on a PaaS, you want Judoscale.
Scaling on serverless is weird
Serverless platforms scale very differently.
Essentially, they scale per request by default. There’s no “instance” for you to add.
Every incoming event will find capacity by the provider launching more copies of your function as necessary. This leads to effectively unlimited concurrency out of the box, which is great for absorbing traffic spikes without any configuration. The flip side is limited control over this scaling.
Normally, every request that comes in will result in a new Node.js runtime starting if the existing ones are all busy. This is an awesome way to ensure reliability in a scenario where your traffic increases quickly.
However, there are two big tradeoffs: cold starts and cost unpredictability.
When serverless scales, many of those new function invocations might incur a cold start delay (a few hundred milliseconds or more to initialize a Node environment). In a high-traffic scenario, you could have lots of functions cold-starting, which might cause latency for some requests. More importantly, from a cost perspective, serverless billing is usually metered by time and memory per execution, plus any external service calls (like database or bandwidth).
If you get 1000 concurrent requests frequently, you pay for 1000 function runs in parallel, which can add up quickly. I see developers on X and Reddit all the time complaining that their Vercel bills ballooned under heavy load.
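The billing math is worth sketching. Serverless compute is commonly metered in GB-seconds (memory times execution time); the rate below is illustrative, not any provider's actual pricing, so plug in your own numbers:

```javascript
// Back-of-envelope serverless compute estimate. The rate below is
// illustrative only - check your provider's actual pricing, and note
// that per-invocation fees and bandwidth are billed on top.
const PRICE_PER_GB_SECOND = 0.0000166667; // hypothetical rate

function monthlyComputeCost({ invocations, avgDurationMs, memoryGb }) {
  const gbSeconds = invocations * (avgDurationMs / 1000) * memoryGb;
  return gbSeconds * PRICE_PER_GB_SECOND;
}

// Example: 10M requests/month at 200ms average on a 1 GB function
const cost = monthlyComputeCost({
  invocations: 10_000_000,
  avgDurationMs: 200,
  memoryGb: 1,
});
```

The same arithmetic also shows why a slow endpoint or a chatty database call multiplies your bill: duration is a direct factor in the cost.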
This isn’t to say serverless can’t be cost-effective. For super volatile but low-average traffic, it can be the cheapest option.
If you require tight control and predictability, a PaaS with the right autoscaling tool might be preferable. If you need to handle unpredictable surges and are okay with the stateless function model, serverless will do it out of the box. Just keep an eye on those usage metrics!
Picking your hosting option based on developer experience
I’ve thrown a bunch of information at you, but I don’t want to make my opinion unclear.
I think you should prioritize developer experience. Whether you’re trying to decide where to host a solo project or influence a decision for an enterprise, put real weight behind the developer cost that comes with the “cheaper” options.
Beyond that, the decision comes down to your application’s type and its traffic profile.
Ask yourself a few questions about your Node.js app:
Does your app require persistent connections or background processes? If it does, then a serverless platform (Vercel/Netlify) likely won’t serve you well. You’d lean towards a PaaS or even your own VPS if you’re okay being pretty hands-on.
How much ops work are you (or your team) willing to take on? If you have a strong DevOps skillset or an ops team, hosting on VPS or some pure cloud solution might be a good fit. You’ll get full flexibility to tailor the environment and potentially save on high-volume costs by squeezing more out of each server. But if you’d rather not deal with server management, then PaaS or serverless is attractive.
What are your scaling and traffic patterns? For relatively steady, predictable traffic, it can be more cost-effective and simpler to run a fixed number of servers (or dynos) on a PaaS or VPS. You won’t get surprises in the bill, and you can ensure they’re always warm and performant. For spiky or highly variable traffic, serverless is an option.
Choose the platform that fits the shape of your app and your team. For a typical web API or monolithic Node app that has a mix of web requests and background jobs, a PaaS will provide the least friction. If you’re building a highly interactive frontend-heavy app (especially with Next.js), deploying the frontend on Vercel or Netlify can be great for the static+serverless benefits, possibly complemented by a separate backend for any heavy lifting.