Flavours of Server-Side Rendering in React

Server-Side Rendering (SSR) has evolved a lot since React first introduced it.
Today, we can choose from multiple flavours — from the classic renderToString to modern Suspense-driven streaming in React 18+.
Each one has its own trade-offs between control, complexity, and performance.

In this post, I’ll walk you through how each approach works, what kind of performance gain you can expect, and when to use which one — all backed by real code from my SSR-demo repository.

Let’s dive in 🚀

What We’ll Cover

  • How the demo project is wired and how to run it locally
  • Classic SSR with renderToString for straightforward hydration
  • Manual streaming using chunked responses for progressive rendering
  • Suspense streaming with renderToPipeableStream and selective hydration
  • Observability techniques, migration paths, and production tips

Why SSR Still Matters

Client-side rendering shines once the browser has booted our app, but it often stumbles on slower networks or CPU-constrained devices. Shipping meaningful HTML straight from the server still solves real problems:

Typical CSR pain points

  • 3–8 s Time to Interactive on sub-4G connections
  • Delayed content discovery for search crawlers
  • Layout shifts while client code hydrates

What SSR fixes immediately

  • HTML arrives ready to paint (better FCP/LCP)
  • Crawlable content without headless browser workarounds
  • Predictable rendering for marketing, checkout, and dashboard-critical paths

Rather than treating SSR as a binary choice, React 18 offers a spectrum. Understanding the mechanics behind each flavour lets us mix and match intentionally.


The Reference Project

All examples reference the SSR-demo repo. It ships an Express server with three routes, a shared product-page UI, and instrumentation to show how each technique behaves.

Getting the repo running

git clone https://github.com/bhupendra1011/SSR-demo
cd SSR-demo
npm install
npm run start
# Server listens on http://localhost:3005

Key files

SSR-demo/
├── server.js
└── components/
    ├── App.js
    ├── PlainStreamSections.js
    └── SuspenseStreamApp.js

Project Flow (ASCII)

Browser request
     |
     v
[ Express server ]
     |
     +-- "/" ------------------> renderToString(App)
     |
     +-- "/render-stream" -----> manual chunked response
     |
     `-- "/render-suspense-stream" -> renderToPipeableStream(SuspenseStreamApp)

server.js wires three HTTP endpoints to three rendering strategies. The snippets below omit boilerplate (static assets, error middleware) so we can focus on the SSR mechanics:

// server.js (simplified)
import express from "express";
import React from "react";
import ReactDOMServer from "react-dom/server";
import { performance } from "node:perf_hooks";
import App from "./components/App.js";
import {
  PlainShell,
  PlainReviews,
  PlainRecommendations,
} from "./components/PlainStreamSections.js";
import SuspenseStreamApp from "./components/SuspenseStreamApp.js";

const app = express();

app.get("/", (req, res) => {
  const markup = ReactDOMServer.renderToString(<App />);
  const document = `<!DOCTYPE html>
  <html lang="en">
    <head>
      <meta charset="utf-8" />
      <meta name="viewport" content="width=device-width, initial-scale=1" />
      <title>SSR demo</title>
      <link rel="stylesheet" href="/styles.css" />
    </head>
    <body>
      <div id="root">${markup}</div>
      <script type="module" src="/client/hydrate-classic.js"></script>
    </body>
  </html>`;

  res.send(document);
});

app.get("/render-stream", (req, res) => {
  // Manual streaming implementation (covered below)
});

app.get("/render-suspense-stream", (req, res) => {
  // React 18 streaming implementation (covered below)
});

app.listen(3005, () => console.log("SSR demo listening on 3005"));

The <script> tag points to the client bundle that hydrates the markup. In this demo that bundle lives at /client/hydrate-classic.js, but the precise path depends on how we build our assets.


Flavour 1: Classic SSR with `renderToString`

Classic SSR renders the entire component tree on the server before sending a single byte. It is conceptually simple, universally supported, and provides deterministic HTML.

How it works

  1. Express receives a GET request for /.
  2. ReactDOMServer.renderToString walks the component tree synchronously.
  3. The HTML string is injected into a full document template.
  4. The browser receives the complete page and paints once.
  5. A matching client bundle hydrates the markup.

Classic SSR Flow

[Request] --> [renderToString(App)] --> [HTML template filled] --> [Full response sent] --> [Hydration in browser]

Server route

app.get("/", (req, res) => {
  try {
    const start = performance.now();
    const markup = ReactDOMServer.renderToString(<App />);
    const html = `<!DOCTYPE html>
    <html lang="en">
      <head>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1" />
        <title>SSR demo</title>
        <link rel="stylesheet" href="/styles.css" />
      </head>
      <body>
        <div id="root">${markup}</div>
        <script type="module" src="/client/hydrate-classic.js"></script>
      </body>
    </html>`;

    res.setHeader("Content-Type", "text/html; charset=utf-8");
    res.send(html);
    console.log("classic-ssr", `${(performance.now() - start).toFixed(2)}ms`);
  } catch (error) {
    console.error("classic-ssr error", error);
    res.status(500).send(`<!DOCTYPE html>
    <html lang="en">
      <body>
        <div id="root">SSR failed — falling back to client render.</div>
        <script type="module" src="/client/hydrate-classic.js"></script>
      </body>
    </html>`);
  }
});

The fallback still loads the same hydration bundle so the client can recover.

Component structure

// components/App.js
export default function App() {
  return (
    <div className="app">
      <header className="hero">
        <h1>E-commerce Product Page</h1>
        <p>Classic SSR – everything ships together.</p>
      </header>
      <main>{/* Product info, reviews, recommendations */}</main>
    </div>
  );
}

Characteristics

  • TTFB equals the time required to render the entire tree.
  • Memory usage grows with page complexity because the full HTML string lives in memory.
  • Hydration requires markup parity—differences between server and client state cause warnings.
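
A common way to preserve markup parity is to serialize request-time data into the document so the client hydrates from the exact values the server rendered with, rather than re-reading clocks or APIs during render. A hypothetical helper (`documentWithState` and `window.__INITIAL_DATA__` are illustrative conventions, not part of the demo):

```javascript
// Hypothetical helper: embed the server's data snapshot in the page so the
// client bundle can hydrate with identical inputs and avoid mismatch warnings.
function documentWithState(markup, initialData) {
  // Escape "<" so user-supplied data can never close the inline <script> early.
  const json = JSON.stringify(initialData).replace(/</g, "\\u003c");
  return `<!DOCTYPE html>
<html lang="en">
  <body>
    <div id="root">${markup}</div>
    <script>window.__INITIAL_DATA__ = ${json};</script>
  </body>
</html>`;
}
```

The client bundle then reads window.__INITIAL_DATA__ and passes it into hydration, so both renders see the same state.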

When to choose it

  • Marketing, documentation, or dashboards where data resolves quickly.
  • We want the lowest operational complexity and maximum compatibility.
  • We plan to layer on streaming later but need a stable baseline today.

Flavour 2: Manual Streaming with Chunked Responses

Manual streaming sends the initial HTML shell immediately and streams slower sections as they become available. We control chunk boundaries, timing, and fallbacks explicitly.

Why use it

  • Improve perceived performance by letting users see and interact with the hero section almost immediately.
  • Control exactly when each chunk is flushed (handy when different data sources have predictable response times).
  • Keep the approach framework-agnostic—no Suspense requirement.

Manual Stream Timeline

Request
  |
  +--> send shell chunk
  |        (browser paints hero instantly)
  |
  +--> stream reviews chunk (~5s)
  |
  `--> stream recommendations chunk (~10s)
           -> write closing tags
           -> ship hydration script

Server route

app.get("/render-stream", (req, res) => {
  const startedAt = performance.now();

  res.setHeader("Content-Type", "text/html; charset=utf-8");
  res.setHeader("Transfer-Encoding", "chunked");

  const write = (markup) => res.write(`${markup}\n`);

  // 1. Send document head and shell immediately.
  write(`<!DOCTYPE html>
  <html lang="en">
    <head>
      <meta charset="utf-8" />
      <meta name="viewport" content="width=device-width, initial-scale=1" />
      <title>Manual streaming demo</title>
      <link rel="stylesheet" href="/styles.css" />
    </head>
    <body>
      <div id="root">`);
  write(ReactDOMServer.renderToStaticMarkup(<PlainShell />));

  // 2. Stream reviews after an artificial delay.
  setTimeout(() => {
    write(ReactDOMServer.renderToStaticMarkup(<PlainReviews />));
    console.log("stream:reviews", `${(performance.now() - startedAt).toFixed(2)}ms`);
  }, 5000);

  // 3. Stream recommendations, then close the document.
  setTimeout(() => {
    write(ReactDOMServer.renderToStaticMarkup(<PlainRecommendations />));
    write(`</div>
    <script type="module" src="/client/hydrate-streaming.js"></script>
  </body>
  </html>`);
    res.end();
    console.log("stream:complete", `${(performance.now() - startedAt).toFixed(2)}ms`);
  }, 10000);
});

The opening write sends the document head plus <div id="root">; the final chunk closes the document and includes the hydration bundle (the demo references /client/hydrate-streaming.js, and we can swap in whatever path our build emits).
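
The timer choreography in the route generalizes to any ordered list of sections. A framework-free sketch of the same pattern (the section HTML and delays are stand-ins for real data fetches; `streamSections` is not part of the demo):

```javascript
// Framework-free sketch of the chunked-flush pattern. `res` is anything with
// write()/end() (an Express response in practice); each section carries its
// pre-rendered HTML plus a delay standing in for a real data fetch.
function streamSections(res, sections) {
  res.write('<!DOCTYPE html>\n<html lang="en">\n<body>\n<div id="root">\n');
  let chain = Promise.resolve();
  for (const { html, delayMs } of sections) {
    chain = chain.then(
      () =>
        new Promise((resolve) =>
          setTimeout(() => {
            res.write(`${html}\n`); // flush as soon as this section is "ready"
            resolve();
          }, delayMs)
        )
    );
  }
  return chain.then(() => {
    res.write("</div>\n</body>\n</html>\n");
    res.end();
  });
}
```

Sequencing through a promise chain keeps the chunks ordered even when delays vary, which the bare setTimeout version cannot guarantee if the timings ever change.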

Component slices

// components/PlainStreamSections.js
export function PlainShell() {
  return (
    <div id="shell">
      <section className="product-info"></section>
      <section id="reviews-placeholder" className="placeholder">
        ⏳ Loading reviews…
      </section>
      <section id="recommendations-placeholder" className="placeholder">
        ⏳ Loading recommendations…
      </section>
    </div>
  );
}

export function PlainReviews() {
  return <section id="reviews-placeholder">{/* Rich review markup */}</section>;
}

export function PlainRecommendations() {
  return (
    <section id="recommendations-placeholder">{/* Product grid */}</section>
  );
}

Because each chunk renders with renderToStaticMarkup, React emits plain HTML fragments without extra React attributes, so we can append them to the still-open document safely. The client-side hydrate-streaming.js script listens for chunks and hydrates them once they land.

Characteristics

  • TTFB is very low because the shell flushes immediately.
  • Perceived latency improves even though slower sections still take time to arrive.
  • Operational overhead increases: we must manage timers, placeholders, and error states for every chunk.
  • Caching becomes granular—we can cache the shell separately from reviews or recommendations.
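
That granular caching can be as simple as memoizing each section's rendered HTML across requests. A sketch (`renderCached` is hypothetical; it assumes the cached section contains no per-request data):

```javascript
// Sketch: memoize expensive section renders across requests. Anything
// user-specific must bypass the cache or be keyed per user.
const sectionCache = new Map();

function renderCached(key, render, ttlMs = 60_000) {
  const hit = sectionCache.get(key);
  if (hit && Date.now() - hit.at < ttlMs) return hit.html;
  const html = render(); // e.g. ReactDOMServer.renderToStaticMarkup(<PlainShell />)
  sectionCache.set(key, { html, at: Date.now() });
  return html;
}
```

With this in place, the shell can be served from memory while reviews and recommendations still render fresh on every request.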

When to choose it

  • We need deterministic control over chunk timing (analytics, AB tests, legacy integrations).
  • We are on React 17 or earlier but still want streaming.
  • We prefer incremental adoption without restructuring components into Suspense boundaries.

Flavour 3: Suspense-Driven Streaming with React 18

React 18’s renderToPipeableStream brings streaming into the core API. Suspense boundaries decide when to flush HTML to the client, and selective hydration keeps parts of the UI interactive as soon as their data resolves.

Key ideas

  • Suspense boundaries mark sections that can stream/fallback independently.
  • Fallbacks (usually skeletons or spinners) appear instantly.
  • Selective hydration only attaches event handlers to the section whose data has arrived, keeping main-thread pressure low.

Suspense Streaming Flow

Request --> renderToPipeableStream()
                |
                +--> shell + fallbacks flush immediately
                |
                +--> boundary resolves --> swap fallback with real content
                |
                `--> hydrate boundary once its HTML arrives

Server route

import { renderToPipeableStream } from "react-dom/server";

app.get("/render-suspense-stream", (req, res) => {
  let didError = false;

  const { pipe, abort } = renderToPipeableStream(<SuspenseStreamApp />, {
    bootstrapScripts: ["/client/hydrate-streaming.js"],
    onShellReady() {
      res.statusCode = didError ? 500 : 200;
      res.setHeader("Content-Type", "text/html; charset=utf-8");
      pipe(res);
    },
    onShellError(error) {
      didError = true;
      console.error("suspense-shell error", error);
      res.statusCode = 500;
      res.send("<!doctype html><p>Shell failed.</p>");
    },
    onError(error) {
      didError = true;
      console.error("suspense-stream error", error);
    },
  });

  // Fail-safe: abort if something takes too long.
  setTimeout(() => abort("timed out"), 15000);
});

renderToPipeableStream streams automatically; there is no manual setTimeout management. React flushes fallbacks immediately and replaces them once resources resolve.

Suspense-aware component tree

// components/SuspenseStreamApp.js
import { Suspense } from "react";
import { ReviewsBlock, RecommendationsBlock } from "./DelayedBlock.js";

export default function SuspenseStreamApp() {
  return (
    <div className="app">
      <Hero />

      <Suspense fallback={<Skeleton title="Reviews" />}>
        <ReviewsBlock />
      </Suspense>

      <Suspense fallback={<Skeleton title="Popular picks" />}>
        <RecommendationsBlock />
      </Suspense>
    </div>
  );
}

ReviewsBlock and RecommendationsBlock read from React cache-aware resources (reviewsResource.read()), which suspend until data resolves. The Suspense boundary instructs React when it is safe to flush the real HTML and hydrate the corresponding client bundle.
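
The demo's resource wrapper is not shown here, but the suspending contract is small: while data is pending, read() throws the in-flight promise, and that throw is what tells React to render the boundary's fallback and retry once the promise settles. A minimal sketch (not the demo's exact implementation):

```javascript
// Minimal Suspense-compatible resource (a sketch, not the demo's exact code).
function createResource(promise) {
  let status = "pending";
  let result;
  const suspender = promise.then(
    (value) => { status = "done"; result = value; },
    (error) => { status = "error"; result = error; }
  );
  return {
    read() {
      if (status === "pending") throw suspender; // React catches this and shows the fallback
      if (status === "error") throw result;      // surfaces to the nearest error boundary
      return result;
    },
  };
}
```

Inside ReviewsBlock, something like `const reviews = reviewsResource.read();` is all the component needs — the Suspense boundary handles the rest.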

Characteristics

  • TTFB is comparable to manual streaming (fallbacks flush instantly).
  • Developer experience improves—no manual orchestration, automatic error isolation per boundary.
  • Selective hydration keeps interactivity snappy on slower devices.
  • Compatibility requires React 18+ on both server and client.

When to choose it

  • We already use Suspense for data fetching (the `use` hook in React 19, or resource wrappers).
  • We value fault isolation—one failing section doesn’t block the entire page.
  • We want progressive hydration without writing custom streaming plumbing.

Putting the Flavours Side by Side

| Dimension           | Classic SSR                     | Manual Streaming                    | Suspense Streaming                   |
| ------------------- | ------------------------------- | ----------------------------------- | ------------------------------------ |
| First content paint | Waits for full HTML             | Immediate shell                     | Immediate shell + Suspense fallbacks |
| Complexity          | Low                             | Medium (manage timers/placeholders) | Medium-high (React 18 + boundaries)  |
| Error isolation     | Global try/catch                | Per chunk, if we implement it       | Automatic per Suspense boundary      |
| Peak memory         | Highest (entire HTML in memory) | Lower (chunks rendered separately)  | Lower (React manages buffers)        |
| Compatibility       | Works everywhere                | Works everywhere                    | React 18+                            |
| Ideal use case      | Fast data, marketing pages      | Predictable staged data             | Complex UIs, mixed data sources      |

Measuring and Debugging

Inspecting the network waterfall

  • / shows a single response with Content-Length. The Preview tab populates only after the full HTML arrives.
  • /render-stream flips to Transfer-Encoding: chunked. Watch the Response/Preview tab populate in stages (shell → reviews → recommendations).
  • /render-suspense-stream also uses chunked transfer, but fallbacks render instantly, and Suspense swaps them with real content as soon as each resource resolves.
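
Beyond DevTools, we can observe chunk timing from a script. A hypothetical probe using Node's http module (`probeChunks` is not part of the demo):

```javascript
// Hypothetical probe: record when each chunk of a streamed response arrives.
import http from "node:http";

function probeChunks(url) {
  return new Promise((resolve, reject) => {
    const startedAt = Date.now();
    const timings = [];
    http
      .get(url, (res) => {
        res.on("data", (chunk) =>
          timings.push({ ms: Date.now() - startedAt, bytes: chunk.length })
        );
        res.on("end", () => resolve(timings));
      })
      .on("error", reject);
  });
}
```

Running `probeChunks("http://localhost:3005/render-stream").then(console.table)` against the demo should show arrivals clustering around the 5 s and 10 s flushes.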

Instrumentation hooks

// logging.js (optional helper)
export function logChunk(name, startedAt, payload) {
  const duration = (performance.now() - startedAt).toFixed(2);
  console.log(`[stream] ${name}: ${duration}ms, ${payload.length} bytes`);
}

Use the helper inside each route to understand exactly when chunks flush and how much HTML they carry.
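
To avoid sprinkling logging calls through every route, we can fold the same idea into a wrapper around the response writer (a hypothetical helper, not part of the demo):

```javascript
import { performance } from "node:perf_hooks";

// Hypothetical wrapper: every chunk written through it is logged with the
// same name/duration/bytes shape as the logChunk helper.
function instrumentedWriter(res, log = console.log) {
  const startedAt = performance.now();
  return (name, markup) => {
    res.write(markup);
    const duration = (performance.now() - startedAt).toFixed(2);
    log(`[stream] ${name}: ${duration}ms, ${markup.length} bytes`);
  };
}
```

Usage inside a streaming route: `const write = instrumentedWriter(res); write("shell", shellHtml);` — each flush then lands in the logs with its timing attached.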


Key Takeaways

  • React’s SSR options form a continuum—pick the flavour that matches our latency profile and team workflow.
  • Streaming (manual or Suspense) drastically improves perceived speed without rewriting our entire UI.
  • Suspense provides the safest path for complex apps thanks to built-in fallbacks, selective hydration, and error isolation.
  • Instrumentation is non-negotiable: measure TTFB, chunk arrival times, and hydration timing to spot regressions early.

Let’s Try It Out

  • Clone the SSR-demo repo.
  • Change the artificial delays in server.js (or the data helpers it imports) to simulate real-world APIs.
  • Observe the difference in Chrome DevTools → Network, Performance, and Coverage tabs.

Once we have exercised all three approaches, we know which flavour fits the next project—and how to switch to another when requirements evolve.