🧾 Background
While integrating the Alibaba Cloud DeepSeek Assistant API into a frontend project, I wanted real-time AI responses via `fetch` + streaming (`ReadableStream`). The backend used Express to call DashScope's streaming endpoint with `X-DashScope-SSE: enable`, and the frontend expected to receive incremental AI tokens like a chat assistant.
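For context, the frontend consumer looked roughly like this (a minimal sketch; the `/chat` path and the request body shape are assumptions for illustration):

```ts
// Minimal streaming consumer: read incremental chunks from /chat.
async function streamChat(prompt: string): Promise<void> {
  const res = await fetch('/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break; // server closed the stream
    console.log(decoder.decode(value, { stream: true })); // append token(s) to the UI
  }
}
```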
This setup worked perfectly in production.
But in local development with `netlify dev`? ❌ Streaming completely failed: the frontend received nothing, and eventually crashed with:

```
net::ERR_EMPTY_RESPONSE
```
🧨 Problem Summary
Even though the backend successfully received and printed streaming chunks like:

```
data: {"output":{"text":"Hello"}}
```

the frontend never got them. Instead:
- `fetch().body.getReader().read()` never resolved
- The browser hung waiting for data
- The connection dropped with `ERR_EMPTY_RESPONSE`
🔍 Root Cause
| Suspected Issue | Explanation |
|---|---|
| Backend forgot `res.end()` | ❌ Nope, it's there |
| Streaming from DashScope failed | ❌ Nope, chunks are printed in the backend logs |
| ✅ `netlify dev` local proxy strips streaming | ✔️ Yes, this is the real issue |
The Netlify CLI (`netlify dev`) emulates serverless functions locally, but its internal proxy does not support chunked transfer encoding, which is required for streaming with `res.write()`.
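You can see the difference yourself with `curl -N` (disable output buffering): hitting Express directly prints chunks as they arrive, while going through the `netlify dev` proxy stalls. The ports below are assumptions (`3001` from the setup in the next section; `8888` is `netlify dev`'s usual default):

```bash
# Direct to Express: chunks print incrementally
curl -N -X POST http://localhost:3001/chat \
  -H 'Content-Type: application/json' \
  -d '{"prompt":"Hello"}'

# Through the netlify dev proxy: hangs, then the connection drops
curl -N -X POST http://localhost:8888/chat \
  -H 'Content-Type: application/json' \
  -d '{"prompt":"Hello"}'
```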
✅ Solution: Bypass Netlify CLI, Run Express Locally
✅ Step 1: Run your backend manually
```bash
ts-node src/server.ts
```
Make sure your Express server listens on a custom port (e.g. `3001`):
```ts
app.listen(3001, '0.0.0.0', () => {
  console.log('Running at http://localhost:3001');
});
```
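If it helps, here is a minimal end-to-end sketch of what `src/server.ts` might look like. The `/chat` route name, the `DASHSCOPE_URL` env var, and the payload handling are assumptions for illustration; swap in your actual endpoint and request body:

```ts
import express from 'express';
import https from 'node:https';

const app = express();
app.use(express.json());

// Placeholder: set this to the actual DashScope endpoint you call.
const DASHSCOPE_URL = process.env.DASHSCOPE_URL!;

app.post('/chat', (req, res) => {
  const upstream = https.request(
    DASHSCOPE_URL,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${process.env.DASHSCOPE_API_KEY}`,
        'X-DashScope-SSE': 'enable', // ask DashScope for a streaming response
      },
    },
    (externalRes) => {
      res.setHeader('Content-Type', 'text/event-stream');
      externalRes.on('data', (chunk) => res.write(chunk)); // forward chunks as-is
      externalRes.on('end', () => res.end());
    }
  );
  upstream.on('error', (err) => res.destroy(err));
  upstream.end(JSON.stringify(req.body));
});

app.listen(3001, '0.0.0.0', () => {
  console.log('Running at http://localhost:3001');
});
```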
✅ Step 2: Configure frontend proxy (Vite)
```ts
// vite.config.ts
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    proxy: {
      '/chat': {
        target: 'http://localhost:3001',
        changeOrigin: true,
      },
    },
  },
});
```
Now your frontend code still calls `/chat`, but under the hood it hits your local Express server instead of the broken `netlify dev` proxy.
✅ Step 3: Make sure your backend properly streams
```ts
// externalRes is the upstream streaming response from DashScope
externalRes.on('data', (chunk) => {
  res.write(chunk); // stream each chunk directly to the frontend
});
externalRes.on('end', () => {
  res.end(); // always close the stream when upstream finishes
});
```
💡 Use `res.setHeader('Content-Type', 'text/event-stream')` for best results if you're mimicking SSE.
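In practice, a typical SSE-style header setup looks like this (a sketch; `flushHeaders()` just pushes the headers out immediately so the browser starts reading):

```ts
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
res.flushHeaders(); // send headers now so the client starts consuming the stream
```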
✅ Alternative: Deploy to Netlify
Deployed Netlify Functions do support streaming!

```bash
netlify deploy --prod
```

But don't rely on local dev for anything stream-based. A rough sketch of a streaming function is shown below.
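For reference, a deployed streaming function can return a web `Response` whose body is a `ReadableStream`. This uses Netlify's newer Functions API; treat it as a sketch and check Netlify's current docs, since the file path, env vars, and exact signature here are assumptions:

```ts
// netlify/functions/chat.ts (hypothetical path)
export default async (req: Request): Promise<Response> => {
  // Forward the request upstream; DASHSCOPE_URL is a placeholder.
  const upstream = await fetch(process.env.DASHSCOPE_URL!, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.DASHSCOPE_API_KEY}`,
      'X-DashScope-SSE': 'enable',
    },
    body: await req.text(),
  });

  // Pass the upstream stream straight through to the client.
  return new Response(upstream.body, {
    headers: { 'Content-Type': 'text/event-stream' },
  });
};
```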
🧠 Summary
| Method | Streaming Supported? | Use Case |
|---|---|---|
| `netlify dev` | ❌ No | ❌ Never use for streaming/debugging SSE |
| `ts-node` + Express | ✅ Yes | ✅ Local dev (DeepSeek, OpenAI, etc.) |
| Deployed Netlify | ✅ Yes | ✅ Production use with DeepSeek or DashScope |
📝 Takeaway
If you're building real-time chat UIs, AI copilots, or stream-based assistants using Alibaba DeepSeek, OpenAI, or similar APIs:
⚠️ Don't use `netlify dev` during development.
✅ Use Express locally + Vite proxy for a reliable streaming experience.