
How to test microservices in Replit

Test microservices in Replit by running each service in its own Repl, using Shell to execute test suites that call HTTP endpoints between services, and verifying inter-service communication with integration tests. You can run multiple processes in a single Repl or use separate Repls with Replit's preview URLs to test how your services interact end to end.

What you'll learn

  • Run multiple services simultaneously in a single Repl using the .replit run command
  • Write integration tests that call HTTP endpoints and verify inter-service responses
  • Use Shell to run test suites with npm test or pytest
  • Set up test scripts that verify database and API connectivity across services
Beginner · 9 min read · 20-30 minutes to complete · All Replit plans (Core or Pro recommended for multiple concurrent services due to the 2 GiB RAM limit on Starter) · March 2026 · RapidDev Engineering Team

Run Integration Tests Across Microservices in Replit

When your application is split into multiple services — an API gateway, a user service, and a data service, for example — you need integration tests that verify these services communicate correctly. This tutorial shows you how to set up and run integration tests for a microservices architecture using Replit, from running multiple services simultaneously to writing tests that call real endpoints and validate responses.

Prerequisites

  • A Replit account (Core or Pro recommended for resource limits)
  • A project with at least two services or endpoints to test
  • Basic familiarity with HTTP requests and JSON responses
  • Node.js or Python knowledge for writing test scripts

Step-by-step guide

1

Set up multiple services in a single Repl

You can run multiple processes in a single Repl by using the ampersand (&) operator in your .replit run command. Open .replit (enable 'Show hidden files' in the file tree menu) and configure the run command to start all services simultaneously. Each service runs on a different port, and Replit maps them using the [[ports]] configuration. This approach is simpler than managing separate Repls and keeps all your code in one place.

toml
# .replit file
run = "node services/api-gateway/index.js & node services/user-service/index.js & wait"

[[ports]]
localPort = 3000
externalPort = 80

[[ports]]
localPort = 3001
externalPort = 3001

Expected result: Both services start when you press Run. The Console shows output from both processes, and each service is accessible on its configured port.

2

Install a testing framework

Open the Shell tab and install a testing framework suitable for your language. For Node.js projects, Jest is the most common choice. For Python projects, pytest works out of the box. The testing framework gives you structured assertions, test runners, and output formatting so you can see exactly which tests pass and which fail.

shell
# For Node.js projects:
npm install --save-dev jest

# For Python projects:
pip install pytest requests

Expected result: The testing framework installs successfully. Running 'npx jest --version' or 'pytest --version' in Shell confirms it is available.

3

Write integration tests that call service endpoints

Create a tests directory and add integration test files that make HTTP requests to your running services. Integration tests differ from unit tests because they test the real communication between services — actual HTTP calls, real database queries, and genuine response parsing. Each test should start a request to one service, verify the response, and optionally chain requests across services to test the full flow.

javascript
// tests/integration.test.js
const API_BASE = 'http://localhost:3000';
const USER_SERVICE = 'http://localhost:3001';

describe('Microservices Integration', () => {
  test('API gateway health check returns 200', async () => {
    const res = await fetch(`${API_BASE}/health`);
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data.status).toBe('ok');
  });

  test('User service returns user list', async () => {
    const res = await fetch(`${USER_SERVICE}/users`);
    expect(res.status).toBe(200);
    const users = await res.json();
    expect(Array.isArray(users)).toBe(true);
  });

  test('API gateway proxies to user service', async () => {
    const res = await fetch(`${API_BASE}/api/users`);
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data).toHaveProperty('users');
  });
});

Expected result: Test files exist in the tests directory. Each test targets a specific endpoint and verifies the response status and body.

4

Add a test script to package.json

Open package.json and add a test script so you can run all integration tests with a single command. You can also add a test:integration script that specifically targets integration test files, keeping them separate from any unit tests you might have. This makes it easy to run tests from Shell or wire them into your workflow.

json
{
  "scripts": {
    "start": "node services/api-gateway/index.js & node services/user-service/index.js & wait",
    "test": "jest",
    "test:integration": "jest tests/integration"
  }
}

Expected result: Running 'npm test' or 'npm run test:integration' in Shell executes your integration tests and shows pass/fail results.

5

Run tests from Shell while services are active

Press the Run button to start your services first. Then open a new Shell instance (click the '+' icon in the Shell tab area) and run your test suite. The tests will make HTTP requests to the running services and report results. Running tests in a separate Shell instance lets you keep the services running while iterating on test code. If a test fails, check the Console output from the services for error messages that explain what went wrong.

shell
# In a separate Shell tab while services are running:
npm run test:integration

Expected result: Jest displays test results with green checkmarks for passing tests and red X marks for failures. All integration tests should pass if services are running correctly.

6

Add endpoint-specific tests for error cases

Good integration tests cover both success and error paths. Add tests that send invalid data, missing authentication, or requests to nonexistent endpoints. These tests verify that your services handle errors gracefully and return appropriate HTTP status codes. Error handling is where most inter-service bugs hide, especially when one service expects a response format that another service does not provide.

javascript
// tests/errors.test.js
describe('Error Handling', () => {
  test('returns 404 for unknown routes', async () => {
    const res = await fetch('http://localhost:3000/nonexistent');
    expect(res.status).toBe(404);
  });

  test('returns 400 for invalid user data', async () => {
    const res = await fetch('http://localhost:3001/users', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ name: '' }),
    });
    expect(res.status).toBe(400);
  });

  test('returns 401 for missing auth token', async () => {
    const res = await fetch('http://localhost:3000/api/protected');
    expect(res.status).toBe(401);
  });
});

Expected result: Error case tests verify that services return correct HTTP status codes and error messages for invalid requests.

Complete working example

tests/integration.test.js
/**
 * Integration tests for microservices setup
 * Run with: npm run test:integration
 * Requires services to be running on ports 3000 and 3001
 */

const API_BASE = 'http://localhost:3000';
const USER_SERVICE = 'http://localhost:3001';

describe('Service Health Checks', () => {
  test('API gateway responds on port 3000', async () => {
    const res = await fetch(`${API_BASE}/health`);
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data.status).toBe('ok');
  });

  test('User service responds on port 3001', async () => {
    const res = await fetch(`${USER_SERVICE}/health`);
    expect(res.status).toBe(200);
  });
});

describe('Inter-Service Communication', () => {
  test('API gateway proxies user list from user service', async () => {
    const res = await fetch(`${API_BASE}/api/users`);
    expect(res.status).toBe(200);
    const data = await res.json();
    expect(data).toHaveProperty('users');
    expect(Array.isArray(data.users)).toBe(true);
  });

  test('API gateway forwards user creation to user service', async () => {
    const res = await fetch(`${API_BASE}/api/users`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ name: 'Test User', email: 'test@example.com' }),
    });
    expect(res.status).toBe(201);
    const user = await res.json();
    expect(user.name).toBe('Test User');
  });
});

describe('Error Handling', () => {
  test('returns 404 for unknown API routes', async () => {
    const res = await fetch(`${API_BASE}/api/nonexistent`);
    expect(res.status).toBe(404);
  });

  test('returns 400 for invalid request body', async () => {
    const res = await fetch(`${API_BASE}/api/users`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({}),
    });
    expect(res.status).toBe(400);
  });
});

Common mistakes when testing microservices in Replit

Mistake: Running tests before services have finished starting up

How to avoid: Add a health check step at the beginning of your test suite that waits for services to respond. Or start services manually with the Run button, then run tests from a separate Shell.
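That wait step can be a small polling helper. The sketch below uses names of our own choosing (`poll` is not a Replit or Jest API); you might call it from a Jest `beforeAll` as `await poll(async () => (await fetch('http://localhost:3000/health')).ok)`.

```javascript
// Hypothetical helper: retry an async check until it returns true.
// Connection errors are swallowed so it tolerates services still booting.
async function poll(check, { attempts = 20, delayMs = 250 } = {}) {
  for (let i = 0; i < attempts; i++) {
    try {
      if (await check()) return true;
    } catch (err) {
      // Service not listening yet -- ignore and retry after a short delay.
    }
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`service did not become healthy after ${attempts} attempts`);
}
```

With the defaults above the helper gives up after roughly five seconds; tune `attempts` and `delayMs` to how long your services take to boot.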

Mistake: Using the Replit preview URL instead of localhost in test code

How to avoid: Tests run inside the same container as your services. Use http://localhost:PORT for test requests, not the .replit.dev preview URL.

Mistake: Running all services on the same port

How to avoid: Assign a unique port to each service and configure them in the [[ports]] section of your .replit file. Common practice is 3000, 3001, 3002, etc.

Mistake: Omitting 'wait' when running multiple services with &

How to avoid: Always add 'wait' at the end of your run command when using background processes (&). Without it, the Repl may terminate when the first background process finishes.

Mistake: Hitting Starter plan memory limits with multiple services running simultaneously

How to avoid: The Starter plan only provides 2 GiB RAM. If you run out of memory, upgrade to Core (8 GiB) or reduce the number of services running at once.

Best practices

  • Always start services before running integration tests — use a separate Shell instance so services stay running
  • Use localhost URLs in tests, not the Replit preview URL, since tests run in the same container
  • Test both success paths and error paths (invalid data, missing auth, nonexistent endpoints)
  • Keep integration tests separate from unit tests using different directories or naming conventions
  • Add a health check endpoint to every service so tests can verify a service is running before testing business logic
  • Use environment variables for service URLs so tests work in both development and deployment environments
  • Set reasonable timeouts for HTTP requests in tests — services may take a moment to start responding
  • Run integration tests before deploying to catch inter-service communication bugs early
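Two of the practices above — environment variables for service URLs and request timeouts — can be combined in one small request helper. This is a sketch under our own assumptions: `API_BASE_URL` is a hypothetical variable name, and `AbortSignal.timeout` requires Node.js 17.3+ (global `fetch` requires 18+).

```javascript
// Base URL from an env var (hypothetical name) with a localhost fallback,
// so the same tests work in development and in a deployed environment.
const API_BASE = process.env.API_BASE_URL || 'http://localhost:3000';

// GET a JSON endpoint, aborting if the service hangs past timeoutMs.
async function getJson(path, { base = API_BASE, timeoutMs = 5000 } = {}) {
  const res = await fetch(`${base}${path}`, {
    signal: AbortSignal.timeout(timeoutMs), // rejects with TimeoutError on expiry
  });
  if (!res.ok) throw new Error(`HTTP ${res.status} from ${base}${path}`);
  return res.json();
}
```

A test would then read `const data = await getJson('/health');` and fail fast with a clear error instead of hanging until Jest's own timeout fires.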

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I have two microservices running in a single Replit project — an API gateway on port 3000 and a user service on port 3001. How do I write integration tests with Jest that verify the services communicate correctly? Include health checks and error handling tests.

Replit Prompt

Set up integration testing for this project. Install Jest as a dev dependency. Create a tests/integration.test.js file with tests that call the API gateway on localhost:3000 and verify it proxies requests to the user service on localhost:3001. Add a test script to package.json. Include tests for health checks, successful requests, and error handling.

Frequently asked questions

Can I run multiple services in a single Repl?

Yes. Use the ampersand (&) operator in your .replit run command to run multiple processes simultaneously: 'run = "node service1.js & node service2.js & wait"'. Each service should listen on a different port.

How do I run tests while my services are running?

Press Run to start your services, then open a second Shell instance by clicking the '+' icon in the Shell tab area. Run your test suite (npm test or pytest) in the second Shell while services stay active in the first.

Can I test microservices on the Starter plan?

Technically yes, but the 2 GiB RAM limit on Starter makes running multiple services and a test suite simultaneously challenging. Core plan with 8 GiB RAM is recommended for microservices testing.

Should my tests use localhost or the Replit preview URL?

Use localhost with the appropriate port (e.g., http://localhost:3000). Tests run in the same container as your services, so localhost connections are direct and fast.

How do I handle API keys and secrets in tests?

Store API keys in Tools > Secrets. Your test code accesses them via process.env.KEY_NAME (Node.js) or os.getenv('KEY_NAME') (Python). Never hardcode keys in test files.
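For instance, a test can build auth headers from a secret like this (the `AUTH_TOKEN` key name is our own illustration, not a Replit convention):

```javascript
// Build an Authorization header from a secret stored in Tools > Secrets.
// 'AUTH_TOKEN' is a hypothetical key name -- use whatever you configured.
function authHeaders(envKey = 'AUTH_TOKEN') {
  const token = process.env[envKey];
  if (!token) {
    throw new Error(`${envKey} is not set -- add it in Tools > Secrets`);
  }
  return { Authorization: `Bearer ${token}` };
}

// Usage in a test:
// const res = await fetch('http://localhost:3000/api/protected', { headers: authHeaders() });
```

Failing loudly when the secret is missing makes a misconfigured Repl show up as one clear error rather than a wall of 401 failures.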

Can Replit Agent set up integration tests for me?

Yes. Prompt Agent v4: 'Set up Jest integration tests for the microservices in this project. Write tests that call the API gateway on port 3000 and verify it proxies to the user service on port 3001. Include health checks and error cases.' Agent will create test files and update package.json.

What if my architecture has many services?

For complex architectures with many services, container orchestration, and advanced testing pipelines, the RapidDev team can help you design a scalable testing strategy that works across environments.

How do I manage test data in a database?

Use beforeEach and afterEach hooks in your test framework to seed and clean up test data. For Replit's built-in PostgreSQL, create a separate test database or use transactions that roll back after each test.
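As a sketch (the table, columns, and `db` client are hypothetical; `db` is assumed to expose a `query(sql, params)` method like pg.Pool), the seed and cleanup steps might look like this:

```javascript
// Hypothetical seed/cleanup pair for a PostgreSQL-backed user service.
// `db` is assumed to be a client with a query(sql, params) method.
async function seedTestUsers(db) {
  await db.query(
    'INSERT INTO users (name, email) VALUES ($1, $2)',
    ['Test User', 'test@example.com']
  );
}

async function cleanupTestUsers(db) {
  // Match only the fixture rows so real data in a shared DB is untouched.
  await db.query("DELETE FROM users WHERE email LIKE '%@example.com'");
}

// Wired into a Jest suite:
// beforeEach(() => seedTestUsers(db));
// afterEach(() => cleanupTestUsers(db));
```

Scoping the DELETE to fixture-only rows (here, the example.com domain) is what makes the cleanup safe to run repeatedly.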
