JavaScript is not only a browser thing anymore. With Node.js on the server, cheap VPS boxes, and npm bursting with modules, teams are starting to carve products into small services that speak JSON and stay out of each other's way. You keep the parts tiny, deploy them fast, and let them grow at their own pace. This is a field note from the trenches on what that looks like and why it fits the current stack so well.
What I mean by small services
When I say small service, I picture a single-purpose Node process. It listens on HTTP or a queue. It answers one clear question. It has a tiny surface, simple logs, a health endpoint, and a short README. The code fits in your head. Think thumbnail maker, email sender, pricing calculator, feature flag gate, or a webhook relay.
JavaScript is a good fit for this style. The event loop shines when most of your time is spent on I/O. You wait for the database, you call an API, you stream a file. Node keeps that cheap. You can spin up a service that accepts a request, calls S3, writes a row, and replies with JSON, without threading headaches.
Here is a small service with Express that generates short links. It exposes three routes. It stores links in memory to keep the sample short. In real life you would pick Redis or Postgres.
/* file: server.js */
var express = require('express');
var crypto = require('crypto');

var app = express();
app.use(express.json());

var store = Object.create(null);

app.get('/health', function (req, res) {
  res.json({ ok: true, uptime: process.uptime() });
});

app.post('/v1/shorten', function (req, res) {
  var url = String(req.body.url || '');
  if (!/^https?:\/\//i.test(url)) {
    return res.status(400).json({ error: 'url is required and must be http or https' });
  }
  var id = crypto.createHash('md5').update(url).digest('hex').slice(0, 6);
  store[id] = url;
  res.json({ id: id, url: url, link: '/v1/r/' + id });
});

app.get('/v1/r/:id', function (req, res) {
  var url = store[req.params.id];
  if (!url) return res.status(404).json({ error: 'not found' });
  res.redirect(url);
});

var port = process.env.PORT || 3000;
app.listen(port, function () {
  console.log('shortener service listening on ' + port);
});

module.exports = app; // handy for tests

Its package.json is tiny.
{
  "name": "shortener-service",
  "version": "1.0.0",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "test": "node test.js"
  },
  "dependencies": {
    "express": "4.x"
  }
}

That is the vibe. Keep the scope small. Ship it as a unit. Give it a clear contract.
Why this makes sense right now
npm is a firehose. Every week new modules land for queues, HTTP clients, auth, schemas, and more. You can grab hapi if you want strong routing, or stick with Express if you like it lean. You can add joi for validation, request for HTTP calls, amqplib for RabbitMQ, socket.io for real time. The building blocks are already there.
Cloud is cheap. You can fit three or four Node processes in a small DigitalOcean droplet and still have room for Redis. Heroku makes zero config deploys easy. AWS EC2 is steady. That lowers the friction to split a big thing into focused services. Start with one box, add another when you outgrow it. No need for a giant jump.
Front end and back end share a language. If your team already writes JavaScript for the browser, the step to Node is small. Same syntax. Same lint rules. Similar test tools like mocha, tape, and jasmine. You can move folks across tasks without a full rewrite of their brain.
Finally, the culture today leans to small pieces. Netflix and Amazon have talked for years about service oriented setups. JavaScript gives you a friendly way to get there without a giant framework jump. Start with one service that owns a thing and keep going only if it helps.
How to wire, run, and keep your sanity
Keep the contract boring. JSON in and JSON out over HTTP works great. If you need async, use a queue like RabbitMQ or a Redis list. For pub sub, Redis channels are fine for starters. If you need streaming, Node streams can move data without blowing memory.
Every service should expose a /health route and a /metrics route. Health answers quick. Metrics can return counters that a scraper picks up. Write logs to stdout. Let your process manager handle files and rotation.
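As a sketch of that metrics idea, a plain in-memory counter module is enough to start. The names inc and snapshot are my own, not a standard, and the Express wiring in the comments assumes the app object from earlier.

```javascript
// metrics.js - naive in-memory counters that a /metrics route can dump as JSON
var counters = Object.create(null);

function inc(name, by) {
  counters[name] = (counters[name] || 0) + (by || 1);
}

function snapshot() {
  // return a copy so callers cannot mutate the live counters
  return JSON.parse(JSON.stringify(counters));
}

// wiring sketch for Express:
//   app.use(function (req, res, next) { inc('requests'); next(); });
//   app.get('/metrics', function (req, res) { res.json(snapshot()); });

module.exports = { inc: inc, snapshot: snapshot };
```

A scraper that hits /metrics every minute and diffs the counters gets you rate graphs with almost no ceremony.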
Speaking of process managers, PM2 and forever both work. PM2 gives clustering and a web view. forever is simple and steady. If you are on Ubuntu you can add an upstart job. On newer distros you can use systemd. Pick one, keep the config in the repo, and script your start and stop in npm scripts.
// pm2.config.js
module.exports = {
  apps: [{
    name: "shortener",
    script: "server.js",
    instances: process.env.WEB_CONCURRENCY || 1,
    exec_mode: "cluster",
    env: { NODE_ENV: "production" }
  }]
};

Define your config with plain environment variables. That matches the twelve-factor app ideas. No secrets in git. A tiny example with dotenv for local runs:
// config.js
require('dotenv').config();

module.exports = {
  port: Number(process.env.PORT || 3000),
  redisUrl: process.env.REDIS_URL,
  rabbitUrl: process.env.RABBIT_URL
};

Testing stays simple when you keep the surface small. Here is a fast check with tape that hits the shorten route.
// test.js
var test = require('tape');
var http = require('http');
var server = require('./server'); // export your app in server.js for tests

test('shorten returns id', function (t) {
  t.plan(2);
  var data = JSON.stringify({ url: 'https://example.com' });
  var req = http.request({
    hostname: 'localhost',
    port: process.env.PORT || 3000,
    path: '/v1/shorten',
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(data) }
  }, function (res) {
    t.equal(res.statusCode, 200);
    var buf = '';
    res.on('data', function (chunk) { buf += chunk; });
    res.on('end', function () {
      var json = JSON.parse(buf);
      t.ok(json.id, 'id present');
    });
  });
  req.write(data);
  req.end();
});

For async work like sending email, push a message to a queue and return right away. A worker can pick up the message and do the slow part. Here is a tiny RabbitMQ worker with amqplib:
// worker.js
var amqp = require('amqplib');
var url = process.env.RABBIT_URL || 'amqp://localhost';

amqp.connect(url)
  .then(function (conn) { return conn.createChannel(); })
  .then(function (ch) {
    var q = 'email';
    return ch.assertQueue(q).then(function () {
      ch.consume(q, function (msg) {
        var payload = JSON.parse(msg.content.toString());
        console.log('send email to', payload.to, 'with subject', payload.subject);
        ch.ack(msg);
      });
    });
  })
  .catch(function (err) { console.error('queue error', err); });

Tip: build a tiny shared module for cross-service things like error shapes, request id headers, and JSON schemas. Publish it to your private npm registry or pull it from a git repo. That keeps the surface consistent across services.
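For completeness, the producing side is mostly about getting the message shape right. This sketch assumes the same 'email' queue the worker listens on; emailJob is a made-up helper, while sendToQueue is the real amqplib channel call.

```javascript
// enqueue.js - shape of the producing side for the email worker
// the worker JSON.parses msg.content, so we send a JSON buffer
function emailJob(to, subject) {
  return Buffer.from(JSON.stringify({ to: to, subject: subject }));
}

// with an open amqplib channel `ch` (set up exactly as in worker.js):
//   ch.sendToQueue('email', emailJob('a@example.com', 'welcome'));
// the HTTP handler can then answer 202 right away and let the worker do the slow part

module.exports = emailJob;
```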
Small services vs giant frameworks
Rails, Django, and Spring give you a lot in one place. Scaffolds, ORM, admin, a view layer, the works. That can be great when one app owns the whole product and the team is tight on time. You write features inside one codebase, deploy one thing, and move on. If your feature list is still forming, a single app can fly.
Small services trade that center for separation. Each piece is clear, deploys by itself, and can fail without taking down the rest. You pay a tax in wiring and ops. You gain faster changes in small parts. If your app talks to many APIs, does heavy I O, or needs different deploy rhythms for different parts, a spread of small services can keep you sane.
It is not a religion. Keep your main web app if it is working. Pull out a piece only when it hurts to keep it inside. Think about the email sender, the file processor, the webhook catcher. These are good first moves. You keep risk low and you learn how your team likes to work with services.
Practical checklist to ship your first small service
- Pick one job: choose a function that is easy to measure and easy to call. Good picks are a thumbnail maker, an email sender, or a rate limiter.
- Write the contract: one page with routes, verbs, sample JSON, error shapes, and status codes. Keep it in the README.
- Create the repo: add package.json, a server file, a test file, and a Dockerfile only if your team already uses Docker. Do not start with three layers of scripts.
- Set config by env vars: PORT, NODE_ENV, and any URLs to queues and stores.
- Add health and metrics: /health returns ok. /metrics returns counters like requests, errors, and latency. Even a plain JSON dump beats silence.
- Pick a process manager: PM2 or forever. Check in the config and add npm scripts for start and stop.
- Log to stdout: include a request id and the route. One line per event. Sample: {"at":"request","id":"abc123","route":"/v1/shorten","ms":12}.
- Test the edges: invalid input, timeouts to upstream, and a happy path. Use tape or mocha.
- Deploy small: push to Heroku, a tiny EC2 box, or a droplet. One service per port. Keep the firewall simple.
- Watch it: wire a ping to /health. Send metrics to statsd and Graphite or to whatever you already run. Alert on high error rate and slow p95.
- Plan versioning: include /v1 in the route from day one. When you need a change, add /v2 next to it and retire v1 later.
- Document how to call it: a curl example in the README beats a wiki that nobody reads.
Here is a tiny curl script you can drop into your README so other services can call yours.
# shorten a url
curl -s -X POST \
-H "Content-Type: application/json" \
-d '{"url":"https://example.com"}' \
http://localhost:3000/v1/shorten | jq .
# follow a link
curl -i http://localhost:3000/v1/r/abc123

And a tiny Node client that wraps that call.
// client.js
var http = require('http');

function shorten(base, url, cb) {
  var data = JSON.stringify({ url: url });
  var req = http.request({
    hostname: base.hostname,
    port: base.port,
    path: '/v1/shorten',
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(data) }
  }, function (res) {
    var body = '';
    res.on('data', function (c) { body += c; });
    res.on('end', function () {
      if (res.statusCode !== 200) return cb(new Error('bad status ' + res.statusCode));
      cb(null, JSON.parse(body));
    });
  });
  req.on('error', cb);
  req.write(data);
  req.end();
}

module.exports = { shorten: shorten };

If you follow that list and keep the scope narrow, your first service will feel boring. That is the point. Boring is stable. Stable is fast.
One more snag to plan for: discovery. At first you can hardcode host and port in env vars. When you have five or six services, consider a tiny registry. You can even store a JSON file in S3 or a small config service that returns current host and port for each name. Keep it simple until you need more.
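As a sketch of that JSON-file registry, assuming a hand-maintained map of service name to host and port (the sample entries are invented):

```javascript
// registry.js - resolve a service name to a base URL from a plain JSON map
// you might fetch this blob from S3 or a small config service; here it is inline
var registry = {
  shortener: { host: 'localhost', port: 3000 },
  mailer: { host: '10.0.0.5', port: 3100 }
};

function lookup(name) {
  var entry = registry[name];
  if (!entry) throw new Error('unknown service: ' + name);
  return 'http://' + entry.host + ':' + entry.port;
}

module.exports = { lookup: lookup };
```

When the map lives in one place, moving a service to a new box is a one-line edit instead of a hunt through every caller's env vars.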
There is a lot of noise out there about the perfect way to do services. You do not need all of it to get value. A focused Node app, a clear route, and strong tests already move the needle. If later you want Docker, a message bus, or a full mesh, you can add those once you feel the pain that calls for them.
Right now the stack is in a sweet spot. npm gives you parts for almost any job. Hosting is cheap. Teams know JavaScript. That is enough to step beyond the browser and use the same language to power the small units that make your product feel fast and clean.
Keep it small, ship it, measure, repeat.