⚙️ Dev & Engineering

Mastering Modern Full-Stack Architecture: Monorepos & Limits

Chloe Chen
Dev & Engineering Lead

Full-stack engineer obsessed with developer experience. Thinks code should be written for the humans who maintain it, not just the machines that run it.

TypeScript monorepo · Go rate limiting · developer experience · backend performance optimization · React state management

We've all stared at our React app re-rendering 50 times for no reason while downing our third cup of coffee, right? Or perhaps you've spent three days fighting your workspace configuration just to share a single TypeScript interface between your frontend and backend. It's frustrating, it drains our energy, and it keeps us away from doing what we love: building beautiful, performant experiences.

Today, we are diving deep into modern full-stack architecture. We are going to look at two massive pillars of our daily engineering lives that often cause friction: managing TypeScript monorepos without losing our minds, and protecting our backend services from traffic spikes using elegant rate limiting in Go.

Shall we solve this beautifully together? ✨

The Mental Model: Bento Boxes and Bouncers

Before we look at a single line of code, let's build a picture in our minds.

Think of a poorly configured monorepo like a giant bowl of spaghetti. Every noodle (module) touches every other noodle. When you pull one, the whole bowl moves. Instead, we want to build a Bento Box. In our bento box, the api (our backend), the client (our React/Vue app), and the shared (our types and utilities) have their own distinct, perfectly portioned compartments. They don't bleed into each other, but they sit in the same container, making it incredibly easy to carry around.
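As one concrete sketch of that bento box (package names and paths here are illustrative, not prescriptive), a pnpm workspace could carve out the compartments like this:

```yaml
# pnpm-workspace.yaml — one compartment per package (layout is a sketch)
packages:
  - "packages/api"     # @repo/api: the Hono backend
  - "packages/client"  # @repo/client: the React/Vite frontend
  - "packages/shared"  # @repo/shared: types and utilities both sides import
```

npm workspaces or Turborepo can express the same structure; the point is that each compartment declares its own dependencies and boundaries.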

[Diagram: @repo/shared (types) is imported by both @repo/api (Hono) and @repo/client (React); the client also talks to the API over type-safe RPC.]

Now, for the backend traffic, imagine a Bouncer at a VIP Club (our Token Bucket rate limiter). The club can only handle 100 people per second. The bouncer holds a bucket of 100 VIP passes. Every second, a magical machine drops exactly 100 new passes into the bucket. If a burst of 50 people show up at once, the bouncer hands out 50 passes—no problem! But if 150 people show up, the first 100 get in, and the remaining 50 are politely told to wait. This allows for natural bursts of traffic without ever overwhelming the dance floor (your database).

Deep Dive 1: Taming the Full-Stack TypeScript Monorepo

Over the years, I've seen countless teams set up a full-stack monorepo, get it working on day one, and then spend the next six months patching rough edges. The build times creep up, the hot-module replacement (HMR) breaks, and suddenly, developers are afraid to touch the shared folder.

While all-in-one frameworks like Next.js or Nuxt are incredible, sometimes you need strict boundaries. You might want a dedicated API with its own runtime (like Cloudflare Workers or Bun) and a separate React SPA built with Vite.

Let's look at how we can achieve perfect end-to-end type safety without the spaghetti, using Hono for the API and React/Vite for the client.

The Anti-Pattern: Manual Type Duplication

// ❌ BAD: Duplicating types across your monorepo
// api/src/routes/users.ts
export type User = { id: string; name: string };

// client/src/types/user.ts
// We hope this matches the backend!
export type User = { id: string; name: string };

When we duplicate types, we are creating a ticking time bomb. The moment the backend adds an email field, the frontend is out of sync, and our users experience runtime crashes.

The Elegant Solution: Hono RPC

Hono has a brilliant feature called RPC (Remote Procedure Call) that allows the client to infer the exact routes, inputs, and outputs of the API without any code generation. It feels like magic, but it's just incredibly smart TypeScript.

Here is how we set up the API:

// ✅ GOOD: api/src/index.ts
import { Hono } from 'hono'
import { z } from 'zod'
import { zValidator } from '@hono/zod-validator'

const app = new Hono()

// We define our route and export its type signature
const routes = app.post(
  '/api/users',
  zValidator('json', z.object({ name: z.string(), email: z.string().email() })),
  (c) => {
    const { name, email } = c.req.valid('json')
    // Save to DB...
    return c.json({ id: '123', name, email, status: 'created' })
  }
)

export type AppType = typeof routes
export default app

And here is the beautiful DX on the client side:

// ✅ GOOD: client/src/api.ts
import { hc } from 'hono/client'
import type { AppType } from '@repo/api' // Importing ONLY the type!

// Initialize the RPC client
const client = hc<AppType>('http://localhost:8787')

// Inside your React component or React Query hook:
async function createUser() {
  // ✨ Autocomplete heaven! TypeScript knows '/api/users' exists.
  // It also enforces that we pass 'name' and 'email'.
  const res = await client.api.users.$post({
    json: {
      name: 'Chloe',
      email: 'chloe@example.com'
    }
  })
  
  const data = await res.json()
  // TypeScript knows data has id, name, email, and status!
  console.log(data.status)
}

Performance vs DX Evaluation

Developer Experience (DX): This is where we get to go home early. By importing AppType from the API workspace into the Client workspace, we get instant autocomplete. If the backend changes a route name from /api/users to /api/members, your React build will instantly fail, pointing exactly to the line you need to fix. No more guessing. No more Postman ping-pong.

Performance: Because we are only importing type { AppType }, zero backend code is bundled into your React app. Your Vite bundle remains incredibly lean. The browser only downloads the tiny hono/client wrapper. It's the perfect harmony of robust architecture and lightweight delivery.
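For that type-only import to resolve, @repo/api has to expose an entry point the client's TypeScript can see. The exact fields depend on your toolchain; this sketch assumes an internal package consumed as TypeScript source (e.g. with a bundler's module resolution or a runtime like Bun):

```json
{
  "name": "@repo/api",
  "type": "module",
  "exports": {
    ".": {
      "types": "./src/index.ts",
      "import": "./src/index.ts"
    }
  }
}
```

If you instead compile @repo/api to JavaScript, point "types" at the generated .d.ts files; the client-side DX is identical either way.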

Deep Dive 2: Respecting Boundaries with Go Rate Limiting

Now that our frontend and backend are communicating beautifully, we have a new problem: we are too successful. Our app went viral, and our Go microservices are hammering our downstream database (or a third-party API) with 5,000 requests per second.

We need to shape this traffic. If we don't, we'll see HTTP 429 (Too Many Requests) errors, cascading latency where the whole system grinds to a halt, and massive cost overruns from SaaS providers.

The Mental Model: The Token Bucket Algorithm

[Diagram: a token bucket with capacity 100, refilled at a constant 100 tokens/sec; each incoming request consumes one token.]

Unlike a naive fixed-window counter that resets every minute (which lets clients squeeze a double-sized burst into the few seconds straddling the window boundary), the Token Bucket algorithm provides a smooth, continuous flow.
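To make that contrast concrete, here is a minimal token-bucket sketch in TypeScript — an illustration of the algorithm itself, not the internals of any particular library. Because refills are continuous and proportional to elapsed time, there is no window boundary to exploit:

```typescript
// A minimal token bucket: bursts up to `capacity`, then a steady drip.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private readonly capacity: number,     // max burst size
    private readonly refillPerSec: number, // steady refill rate
    now: number = Date.now(),
  ) {
    this.tokens = capacity; // start full
    this.lastRefill = now;
  }

  // Returns true if a token was available (request allowed).
  tryRemoveToken(now: number = Date.now()): boolean {
    // Refill continuously based on elapsed time, capped at capacity.
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsedSec * this.refillPerSec,
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// A burst of 150 requests against a bucket of 100 (all at t = 0):
// the first 100 pass, the remaining 50 are rejected.
const bucket = new TokenBucket(100, 100, 0);
const results = Array.from({ length: 150 }, () => bucket.tryRemoveToken(0));
console.log(results.filter(Boolean).length); // 100 allowed
```

Production limiters (including Go's golang.org/x/time/rate) follow the same shape, with added concurrency safety and fractional token accounting.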

The Code: Rate Limiting with Resile in Go

In Go, implementing this manually with mutexes and goroutines can be error-prone. Instead, we can use a resilience library like Resile to wrap our executions elegantly.

// ❌ BAD: Unbounded calls to a downstream API
func FetchUserData(ctx context.Context, userID string) (*UserData, error) {
    // If 10,000 users hit this at once, we crash the downstream API.
    return downstreamAPI.Get(ctx, userID)
}

Let's fix this by wrapping our action in a Token Bucket rate limiter:

// ✅ GOOD: Precise traffic shaping with Resile
import (
    "context"
    "errors"
    "time"

    "github.com/resile/resile"
)

func FetchUserData(ctx context.Context, userID string) (*UserData, error) {
    var data *UserData

    // Define the action we want to protect
    action := func(ctx context.Context) error {
        var err error
        data, err = downstreamAPI.Get(ctx, userID)
        return err
    }

    // Execute with a Token Bucket: allow 100 requests per second.
    // If the bucket is empty, it fails fast with resile.ErrRateLimitExceeded.
    err := resile.DoErr(
        ctx,
        action,
        resile.WithRateLimiter(100, time.Second),
    )

    // errors.Is handles wrapped errors, unlike a bare == comparison.
    if errors.Is(err, resile.ErrRateLimitExceeded) {
        // Gracefully handle the limit (e.g., return a 429 to OUR client,
        // or serve stale cache data)
        return GetCachedUserData(userID), nil
    }

    return data, err
}

Performance vs DX Evaluation

Developer Experience (DX): Notice how clean that is? We didn't have to write complex channel logic or manage ticker states. We simply wrapped our business logic in a declarative rate limiter. It clearly communicates intent to the next developer reading the code.

Performance: The Token Bucket algorithm is incredibly lightweight on CPU and memory. By failing fast (ErrRateLimitExceeded), we prevent our application from holding open thousands of idle connections waiting for a struggling downstream API to respond. We save our own memory, and we respect the boundaries of the services we depend on.

(Pro-tip: GitHub has been expanding its application security coverage with AI-assisted detections. If you are pushing Go code like this, enabling GitHub code scanning can help flag concurrency issues or insecure API usage. It's a great safety net that runs silently in your CI/CD pipeline!)

The DX vs Performance Showdown

Let's look at how this modern approach compares to legacy setups.

Architecture Style | Developer Experience (DX) | Backend Performance | Setup Complexity
--- | --- | --- | ---
Monolithic Framework (e.g., standard Next.js) | Excellent (everything just works) | Good, but scaling API & UI together can be costly | Low
Microservices (multi-repo) | Poor (constant context switching; syncing types is a nightmare) | Excellent (scale exactly what you need) | High
Modern Monorepo (Vite + Hono + shared types) | Excellent (instant autocomplete, clear boundaries) | Excellent (independent runtimes, lean bundles) | Medium (requires initial Vite/TS config)

By combining a structured monorepo for our codebase and strict rate limiting for our runtime, we achieve the holy grail: an environment that is a joy to code in, and a system that refuses to crash under pressure.

What You Should Do Next

Theory is great, but action is better. Here is how you can apply these concepts today:

1. Audit Your Monorepo Boundaries: Open your current project. Are your frontend components importing directly from backend folders? If so, create a @repo/shared package today. Move your Zod schemas and TypeScript interfaces there.
2. Test Hono RPC: Even if you aren't migrating your whole backend, spin up a tiny Hono project and connect it to a Vite React app. Experience the magic of zero-build type inference. It will change how you view API development.
3. Implement a Token Bucket: Identify the single most expensive external API call your backend makes. Wrap it in a token bucket rate limiter (using Resile in Go, or bottleneck in Node.js). Watch your error rates drop during peak hours.
4. Enable Security Scanning: Head to your GitHub repository settings and ensure CodeQL and application security scanning are enabled to catch architectural vulnerabilities early.

Your types now flow end to end, and your APIs can shrug off traffic spikes. Happy Coding! ✨


Frequently Asked Questions

Why use Hono instead of Express for the API? Express is wonderful and battle-tested, but Hono is built for the modern edge. It uses standard Web APIs, meaning the exact same Hono code can run on Cloudflare Workers, Deno, Bun, or Node.js. Plus, its built-in RPC type inference provides a vastly superior Developer Experience compared to manually syncing types in Express.
Does the Token Bucket algorithm drop requests entirely? Yes, if the bucket is empty, the request is immediately rejected (failing fast). However, you can combine a Rate Limiter with a Retry mechanism or a Queue if you want to hold and process those requests later. Failing fast is usually better for system stability than holding thousands of pending connections.
Can I use these monorepo patterns with Vue or Svelte? Absolutely! The beauty of the @repo/api, @repo/client, @repo/shared architecture is that the client is completely framework-agnostic. You can easily swap the Vite React app for a Nuxt or SvelteKit application and still utilize the exact same Hono RPC client for end-to-end type safety.
Isn't setting up a monorepo too complex for small projects? It used to be, but tooling has improved dramatically. Tools like npm workspaces, pnpm, or Turborepo make scaffolding a monorepo almost as fast as a single-app setup. If you know your project will eventually need a separate API and frontend, starting with a lightweight monorepo saves massive refactoring headaches later.
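The "combine a rate limiter with a retry" idea from the FAQ above can be sketched in a few lines of TypeScript. Here, RateLimitError is an illustrative stand-in for whatever rejection your limiter surfaces — the shape of the wrapper is the point, not the names:

```typescript
// Illustrative sentinel for a rate-limit rejection (not a real library type).
class RateLimitError extends Error {}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Retry only rate-limit rejections, with exponential backoff between attempts.
async function withRetry<T>(
  action: () => Promise<T>,
  { attempts = 3, backoffMs = 100 }: { attempts?: number; backoffMs?: number } = {},
): Promise<T> {
  for (let i = 0; i < attempts; i++) {
    try {
      return await action();
    } catch (err) {
      // Rethrow real failures immediately; only wait out rate limits.
      if (!(err instanceof RateLimitError) || i === attempts - 1) throw err;
      await sleep(backoffMs * 2 ** i); // 100ms, 200ms, 400ms, ...
    }
  }
  throw new Error("unreachable");
}
```

Use this only where the caller can afford to wait; for user-facing request paths, failing fast (or serving cached data) usually keeps the system healthier.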
