Best deployment strategy for microservices on VPS – separate servers or single VPS?
Common carp posted this in #help-forum
I am working on building a platform for my company, and I am trying to design the deployment architecture using microservices. I plan to deploy the platform on a VPS (for example Hostinger VPS), but I am confused about the best infrastructure strategy.
My main question is about how microservices should be deployed in production.
From what I understand, microservices usually mean splitting a platform into smaller services such as:
Auth service (login/signup)
Booking service
Payment service
Notification service
Each service has its own logic and runs independently.
However, I am confused about the deployment side of things.
My main questions are:
1. Should every microservice be deployed on a separate VPS server?
For example:
VPS 1 → Auth service
VPS 2 → Booking service
VPS 3 → Payment service
VPS 4 → Notification service
If this is the correct approach, I would like to understand why separate servers are necessary.
2. What happens if all microservices run on the same VPS?
Example architecture:
One VPS server
Multiple services running on different ports
Nginx acting as a reverse proxy
Example:
Auth service → port 4000
Booking service → port 4001
Payment service → port 4002
Notification service → port 4003
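To make this concrete, the Nginx setup I have in mind would look roughly like this (the domain name and URL prefixes are placeholders I made up, not a final design):

```nginx
# One VPS: all services on localhost ports, Nginx routing by path prefix.
# example.com and the /auth/ etc. prefixes are placeholders.
server {
    listen 80;
    server_name example.com;

    location /auth/ {
        proxy_pass http://127.0.0.1:4000/;
    }
    location /booking/ {
        proxy_pass http://127.0.0.1:4001/;
    }
    location /payment/ {
        proxy_pass http://127.0.0.1:4002/;
    }
    location /notify/ {
        proxy_pass http://127.0.0.1:4003/;
    }
}
```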
My concern is that if all services run on the same VPS, they will still share the same CPU and RAM.
If traffic increases significantly, would the single server become a bottleneck?
If that happens, I am trying to understand what advantage microservices provide if everything still runs on one machine.
3. What is the industry best practice here?
Do startups usually begin with one VPS running multiple services and scale later, or start with separate servers per microservice?
4. How does scaling actually work with microservices?
If the booking service receives much higher traffic than others, should we move only that service to another server or run multiple instances behind a load balancer?
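For this scaling question, what I imagine is something like Nginx's built-in upstream load balancing, roughly like the sketch below (the second instance's IP address is hypothetical):

```nginx
# Hypothetical: only the booking service is scaled out to a
# second instance on another VPS; Nginx balances between them.
upstream booking_backend {
    server 127.0.0.1:4001;   # instance on this VPS
    server 10.0.0.2:4001;    # instance on a second VPS (made-up IP)
}

server {
    listen 80;

    location /booking/ {
        proxy_pass http://booking_backend/;
    }
}
```

Is this the right direction, or do people normally reach for a managed load balancer or Kubernetes at that point?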
5. What architecture is best for a startup-level platform?