Backend Systems Engineer

Architecting
Robust APIs
& Systems.

I'm Farel Hanafi. I specialize in designing scalable backend architectures, optimizing performance, and integrating AI/ML capabilities into production environments.

Farel Hanafi

From self-taught
to professional freelancer.

I started learning to code completely self-taught in 2021 — no formal classes, just curiosity and the internet. YouTube tutorials, Stack Overflow, official docs — I built my foundation from the ground up.

By 2022, I was building personal projects — real applications I designed, developed, and deployed entirely on my own. Not just exercises, but things I was genuinely proud of shipping.

In 2024, I moved into freelancing — working on real client projects with real requirements. This is where I learned what it truly means to build systems that have to work: performant, secure, and maintainable.

Now in 2026, I'm actively freelancing as a backend developer — helping businesses build scalable APIs, design clean system architectures, and integrate AI/ML into production pipelines.

2021
Started Coding
Self-taught from scratch — no classes, just curiosity.
2022
Personal Projects
Built real applications from scratch to sharpen hands-on skills.
2024
Freelancing
Started working with real clients — real systems, real stakes.
2026
Present ✦
Actively freelancing — backend, APIs, and AI/ML integration.

Technology Stack

Backend
Node.js Python Java Express FastAPI REST API
Database
PostgreSQL MySQL MongoDB Redis SQL
DevOps
Docker Linux CI/CD GitHub Actions Nginx
AI / ML
TensorFlow Scikit-learn PyTorch ML APIs
Frontend
HTML CSS JavaScript TypeScript

Selected Projects

01

High-Throughput REST API Platform

GitHub

E-commerce platform hitting database bottlenecks at 500+ concurrent users. Response times exceeding 2s for product listing endpoints.

Redesigned data access layer with Redis caching, connection pooling, and read replicas. Implemented CQRS pattern to separate read/write paths.

Redis was chosen over Memcached for its data structure support. CQRS allowed independent scaling of read-heavy workloads without affecting write throughput.

Response time dropped from 2.1s → 180ms. Handled 5,000+ concurrent users. 99.9% uptime over 6 months.

Node.js PostgreSQL Redis Docker Nginx
routes/products.js
// Cache-aside pattern
router.get("/products", async (req, res) => {
  const page = Number(req.query.page) || 1;
  const limit = 20;
  const offset = (page - 1) * limit;
  const cacheKey = `products:${page}`;

  const cached = await redis.get(cacheKey);
  if (cached) {
    return res.json(JSON.parse(cached));
  }

  const data = await db.query(
    "SELECT * FROM products LIMIT $1 OFFSET $2",
    [limit, offset]
  );

  await redis.setex(cacheKey, 300, JSON.stringify(data));
  res.json(data);
});
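The CQRS split described above can be sketched in miniature. This is an illustrative model, not the production code: in-memory `Map`s stand in for the primary database and its read replica, and the "replication" is simulated synchronously.

```javascript
// CQRS-style split: commands mutate the write model (primary),
// queries only ever touch the read model (replica/cache).
class ProductStore {
  constructor() {
    this.primary = new Map(); // write model
    this.replica = new Map(); // read model, kept in sync by replication
  }

  // Command path: write to the primary, then propagate.
  createProduct(id, data) {
    this.primary.set(id, data);
    this.replica.set(id, data); // simulated replication
  }

  // Query path: never touches the primary, so read-heavy
  // traffic can scale without affecting write throughput.
  getProduct(id) {
    return this.replica.get(id) ?? null;
  }
}
```

In a real deployment the two paths point at separate connections (primary vs. read replicas), which is what lets them scale independently.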
02

ML Inference API Service

GitHub

Needed to serve a trained Scikit-learn classification model to 1,000+ daily users with sub-100ms latency, without paying model-loading overhead on every request.

Built a FastAPI service with model loaded once at startup, serialized with joblib. Async request handling with background task queue for batch predictions.

FastAPI's async support avoids thread blocking. Loading model at startup (not per-request) saves ~800ms. Batch endpoint reduces client round trips by 80%.

P95 latency: 45ms. Serves 50K+ predictions/day. Zero downtime deployments with rolling updates via Docker.

Python FastAPI Scikit-learn Docker PostgreSQL
main.py
from fastapi import FastAPI
from pydantic import BaseModel
import joblib
import numpy as np

app = FastAPI()
model = joblib.load("model.pkl")  # load once at startup

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
async def predict(payload: PredictRequest):
    features = np.array(payload.features)
    prediction = model.predict([features])
    probability = model.predict_proba([features])
    return {
        "class": int(prediction[0]),
        "confidence": float(probability[0].max()),
    }
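The round-trip reduction from the batch endpoint comes from merging individual calls on the client side. A minimal sketch of that batching idea, with a hypothetical `sendBatch` function standing in for the real HTTP call to the batch endpoint:

```javascript
// Client-side batching sketch: predict() calls made within a
// short window are merged into a single batch request.
// `sendBatch(featuresList)` is a stand-in for the real HTTP call.
function createBatcher(sendBatch, windowMs = 10) {
  let pending = [];
  let timer = null;

  return function predict(features) {
    return new Promise((resolve) => {
      pending.push({ features, resolve });
      if (!timer) {
        // First call in this window starts the flush timer.
        timer = setTimeout(async () => {
          const batch = pending;
          pending = [];
          timer = null;
          // One request for the whole window's worth of calls.
          const results = await sendBatch(batch.map((p) => p.features));
          batch.forEach((p, i) => p.resolve(results[i]));
        }, windowMs);
      }
    });
  };
}
```

N calls inside one window collapse into a single request, which is where the reduction in client round trips comes from.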
03

Microservice Event System

GitHub

Monolithic app becoming impossible to maintain. Teams blocked on each other. Single deployment risk affecting all features simultaneously.

Decomposed into 5 domain services (auth, orders, inventory, notifications, payments) connected via event bus. Each service owns its own database.

Event-driven over synchronous REST between services: avoids tight coupling, enables replay on failure, and lets services evolve independently without coordination.

Independent deployment cycles. Team velocity increased 3x. System handles 10K events/min with guaranteed at-least-once delivery.

Node.js PostgreSQL MongoDB Docker CI/CD
services/orders.js
// Domain event pattern
class OrderService {
  async createOrder(data) {
    const order = await db.orders.create(data);

    // Publish domain event
    await eventBus.publish("order.created", {
      orderId: order.id,
      userId: data.userId,
      items: data.items,
      total: data.total,
    });

    return order;
  }
}

// Inventory service subscribes
eventBus.subscribe("order.created",
  handlers.reserveInventory);
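At-least-once delivery means the bus keeps redelivering an event until the handler acknowledges it, so handlers must be idempotent. A simplified in-memory sketch of that retry loop (the real system would persist events and dead-letter permanent failures):

```javascript
// At-least-once delivery sketch: a handler "acks" by returning
// without throwing; a throw triggers redelivery.
class SimpleEventBus {
  constructor() {
    this.handlers = new Map();
  }

  subscribe(topic, handler) {
    const list = this.handlers.get(topic) ?? [];
    list.push(handler);
    this.handlers.set(topic, list);
  }

  async publish(topic, event, maxRetries = 3) {
    for (const handler of this.handlers.get(topic) ?? []) {
      for (let attempt = 1; attempt <= maxRetries; attempt++) {
        try {
          await handler(event);
          break; // ack: stop redelivering
        } catch (err) {
          // Redeliver; after maxRetries, surface the failure
          // (a production bus would dead-letter the event).
          if (attempt === maxRetries) throw err;
        }
      }
    }
  }
}
```

Because an event can be delivered more than once, handlers like `reserveInventory` must check whether they have already processed a given `orderId`.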

How I Structure Backend Systems

Folder Structure

backend-api/
├── controllers/    # HTTP layer
│   └── userController.js
├── services/       # Business logic
│   └── userService.js
├── repositories/   # Data access
│   └── userRepo.js
├── models/         # DB schemas
│   └── User.js
├── routes/         # Route definitions
│   └── users.js
├── middlewares/    # Auth, validation
│   ├── auth.js
│   └── validate.js
└── server.js
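The `middlewares/` layer referenced above handles cross-cutting concerns before a request reaches a controller. A minimal sketch of a `validate.js`-style middleware, assuming a plain required-fields schema object rather than any particular validation library:

```javascript
// Validation middleware sketch: rejects the request with 400
// before it ever reaches the controller if required body
// fields are missing. `schema` here is just a map of
// required field names (a hypothetical, library-free shape).
const validate = (schema) => (req, res, next) => {
  const missing = Object.keys(schema).filter(
    (field) => req.body?.[field] === undefined
  );

  if (missing.length > 0) {
    return res
      .status(400)
      .json({ error: `Missing fields: ${missing.join(", ")}` });
  }

  next(); // hand off to the controller
};
```

Wired into a route as `router.post("/users", validate({ email: true, name: true }), createUser)`, it keeps controllers free of input-checking boilerplate.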

Example API Endpoint

// controllers/userController.js
const getUser = async (req, res, next) => {
  try {
    const user = await userService
      .findById(req.params.id);

    if (!user) {
      return res.status(404)
        .json({ error: "User not found" });
    }

    res.json({ data: user });
  } catch (err) {
    next(err); // centralized error handling
  }
};
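The controller above stays thin because the service and repository layers do the real work. A sketch of those two layers for the same `findById` flow, with an in-memory `Map` standing in for the real database client and a hypothetical "never expose password hashes" business rule for illustration:

```javascript
// repositories/userRepo.js — owns data access only.
class UserRepo {
  constructor(rows = new Map()) {
    this.rows = rows; // stand-in for a DB connection
  }

  async findById(id) {
    return this.rows.get(id) ?? null;
  }
}

// services/userService.js — owns business rules only.
class UserService {
  constructor(repo) {
    this.repo = repo;
  }

  async findById(id) {
    const user = await this.repo.findById(id);
    if (!user) return null;
    // Business rule (illustrative): strip sensitive fields
    // before anything leaves the service layer.
    const { passwordHash, ...safe } = user;
    return safe;
  }
}
```

Because each layer only talks to the one below it, the repository can be swapped (Postgres, Mongo, a test double) without touching controllers or services.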

Database Schema

-- Optimized with indexes
CREATE TABLE users (
  id        UUID        PRIMARY KEY
                        DEFAULT gen_random_uuid(),
  email     VARCHAR(255) UNIQUE NOT NULL,
  name      VARCHAR(100) NOT NULL,
  role      VARCHAR(20)  DEFAULT 'user',
  created_at TIMESTAMPTZ DEFAULT NOW()
);

-- email is already indexed by its UNIQUE constraint,
-- so no separate index is needed there. The partial
-- index keeps the role index small, since most rows
-- are plain users.
CREATE INDEX idx_users_role
  ON users(role)
  WHERE role != 'user';
Farel Hanafi

Backend Developer · Open to opportunities

Let's build something robust together.

Whether you're looking to architect a new microservice, optimize an existing system, or chat about backend engineering — I'm available.

Email

farelcuy122@gmail.com

Location

Indonesia · Remote Available