
Commit 5b2a4ef

docs: add SSE and circuit breaker sections, update comparison table and API reference — 1.1.0
1 parent dd7ccde commit 5b2a4ef

1 file changed: README.md — 146 additions & 19 deletions
@@ -9,7 +9,7 @@
[![GitHub stars](https://img.shields.io/github/stars/firekid-is-him/hurl?style=flat-square&logo=github&logoColor=white&color=FACC15)](https://github.com/firekid-is-him/hurl/stargazers)
[![Website](https://img.shields.io/badge/website-hurl.firekidofficial.name.ng-black?style=flat-square&logo=googlechrome&logoColor=white)](https://hurl.firekidofficial.name.ng)

**`@firekid/hurl`** is a modern, zero-dependency HTTP client for Node.js 18+, Cloudflare Workers, Vercel Edge Functions, Deno, and Bun — built on native fetch with retries, interceptors, auth helpers, in-memory caching, request deduplication, SSE streaming, a circuit breaker, and full TypeScript support.

```bash
npm install @firekid/hurl
@@ -34,6 +34,15 @@ const data = await hurl.get('/users', {
  cache: { ttl: 60000 },
})

// Stream an AI response over SSE
const { close } = hurl.sse('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  auth: { type: 'bearer', token: process.env.OPENAI_KEY },
  body: { model: 'gpt-4o', stream: true, messages: [{ role: 'user', content: 'Hello' }] },
  onMessage: (e) => process.stdout.write(JSON.parse(e.data).choices[0].delta.content ?? ''),
  onDone: () => console.log('\ndone'),
})

// Parallel requests
const [users, posts] = await hurl.all([
  hurl.get('/users'),
@@ -48,7 +57,7 @@ const [users, posts] = await hurl.all([
| Feature | **hurl** | axios | ky | got | node-fetch |
|---|:---:|:---:|:---:|:---:|:---:|
| Zero dependencies ||||||
| Bundle size | **~12KB** | ~35KB | ~5KB | ~45KB | ~8KB |
| Node.js 18+ ||||||
| Cloudflare Workers ||||||
| Vercel Edge ||||||
@@ -58,6 +67,8 @@ const [users, posts] = await hurl.all([
| Auth helpers || ⚠️ ||||
| In-memory cache ||||||
| Request deduplication ||||||
| SSE (with POST + auth) ||||||
| Circuit breaker ||||||
| Upload progress ||||||
| Download progress ||||||
| Proxy support ||||||
@@ -178,6 +189,127 @@ await hurl.get('/users', { signal: controller.signal })

---

## SSE — Server-Sent Events

Unlike the native `EventSource`, `hurl.sse()` works with POST requests, custom headers, and all auth types. This makes it directly compatible with AI APIs like OpenAI, Anthropic, and Gemini that stream over SSE.

```ts
const { close } = hurl.sse('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  auth: { type: 'bearer', token: process.env.OPENAI_KEY },
  body: {
    model: 'gpt-4o',
    stream: true,
    messages: [{ role: 'user', content: 'Hello' }],
  },
  onOpen: () => console.log('stream opened'),
  onMessage: (event) => {
    // event.data, event.event, event.id, event.retry
    const chunk = JSON.parse(event.data)
    process.stdout.write(chunk.choices[0].delta.content ?? '')
  },
  onDone: () => console.log('\nstream complete'),
  onError: (err) => console.error(err),
})

// Stop the stream at any time
close()
```

SSE with an instance (inherits `baseUrl` and `auth` from defaults):

```ts
const ai = hurl.create({
  baseUrl: 'https://api.openai.com/v1',
  auth: { type: 'bearer', token: process.env.OPENAI_KEY },
})

const { close } = ai.sse('/chat/completions', {
  method: 'POST',
  body: { model: 'gpt-4o', stream: true, messages: [...] },
  onMessage: (e) => console.log(e.data),
})
```

`hurl.sse()` handles the `data: [DONE]` sentinel automatically — it fires `onDone` and closes the stream. There is also a `signal` option if you want to tie the stream to an `AbortController` you control.


---

## Circuit Breaker

The circuit breaker stops your app from hammering a failing service. After a set number of consecutive failures it opens the circuit and fast-fails every request until a cooldown period passes, then lets a single probe request through to check whether the service has recovered.

States: **CLOSED** (normal) → **OPEN** (fast-failing) → **HALF_OPEN** (probing) → **CLOSED**

```ts
await hurl.get('https://api.example.com/users', {
  circuitBreaker: {
    threshold: 5, // open after 5 consecutive failures
    cooldown: 30_000, // wait 30s before probing
  },
})
```

With a fallback, an open circuit resolves instead of throwing:

```ts
await hurl.get('/users', {
  circuitBreaker: {
    threshold: 3,
    cooldown: 10_000,
    fallback: () => [], // returned as res.data when the circuit is open
  },
})
```

Set it on an instance so every request to that API is protected:

```ts
const api = hurl.create({
  baseUrl: 'https://api.example.com',
  circuitBreaker: {
    threshold: 5,
    cooldown: 30_000,
  },
})
```

Use a custom key if you want multiple endpoints on the same host to have independent breakers:

```ts
await hurl.get('/payments', {
  circuitBreaker: { threshold: 3, cooldown: 15_000, key: 'payments-service' },
})

await hurl.get('/orders', {
  circuitBreaker: { threshold: 3, cooldown: 15_000, key: 'orders-service' },
})
```

Check the state of any breaker at any time:

```ts
import { getCircuitStats } from '@firekid/hurl'

const { state, failures } = getCircuitStats('https://api.example.com')
// state: 'CLOSED' | 'OPEN' | 'HALF_OPEN'
// failures: number
```

When the circuit is open and no fallback is provided, a `HurlError` with type `CIRCUIT_OPEN` is thrown:

```ts
try {
  await hurl.get('/users', { circuitBreaker: { threshold: 3, cooldown: 10_000 } })
} catch (err) {
  if (err instanceof HurlError && err.type === 'CIRCUIT_OPEN') {
    console.log('service unavailable, try again later')
  }
}
```

---

## Interceptors

```ts
@@ -226,12 +358,9 @@ await hurl.get('/users', { cache: { ttl: 60000, bypass: true } })
```ts
import { clearCache, invalidateCache } from '@firekid/hurl'

clearCache()
invalidateCache('https://api.example.com/users')
invalidateCache('all-users')
```

---
@@ -253,7 +382,6 @@
## Upload & Download Progress

```ts
const form = new FormData()
form.append('file', file)

@@ -263,7 +391,6 @@ await hurl.post('/upload', form, {
  }
})

await hurl.get('/large-file', {
  onDownloadProgress: ({ loaded, total, percent }) => {
    console.log(`Downloading: ${percent}%`)
@@ -294,17 +421,13 @@ setGlobalDispatcher(new ProxyAgent('http://proxy.example.com:8080'))
```ts
import { EnvHttpProxyAgent, setGlobalDispatcher } from 'undici'
setGlobalDispatcher(new EnvHttpProxyAgent())
```

**Node.js 24+** — native fetch respects env vars when `NODE_USE_ENV_PROXY=1` is set:
```bash
NODE_USE_ENV_PROXY=1 HTTP_PROXY=http://proxy.example.com:8080 node app.js
```

---

## Parallel Requests
@@ -330,7 +453,7 @@ const api = hurl.create({

await api.get('/users')

// Extend with overrides — inherits parent interceptors
const adminApi = api.extend({
  headers: { 'x-role': 'admin' }
})
@@ -340,9 +463,9 @@ const adminApi = api.extend({

## Error Handling

`hurl` throws a `HurlError` on HTTP errors (4xx/5xx), network failures, timeouts, aborts, parse failures, and open circuit breakers. It never resolves silently on bad status codes.

Set `throwOnError: false` to receive 4xx/5xx responses without a throw:

```ts
const res = await hurl.get('/users', { throwOnError: false })
@@ -358,7 +481,7 @@ try {
  await hurl.get('/users')
} catch (err) {
  if (err instanceof HurlError) {
    err.type // 'HTTP_ERROR' | 'NETWORK_ERROR' | 'TIMEOUT_ERROR' | 'ABORT_ERROR' | 'PARSE_ERROR' | 'CIRCUIT_OPEN'
    err.status // 404
    err.statusText // 'Not Found'
    err.data // parsed error response body
@@ -428,12 +551,13 @@ type HurlRequestOptions = {
  auth?: AuthConfig
  proxy?: ProxyConfig
  cache?: CacheConfig
  circuitBreaker?: CircuitBreakerConfig
  signal?: AbortSignal
  followRedirects?: boolean
  onUploadProgress?: ProgressCallback
  onDownloadProgress?: ProgressCallback
  stream?: boolean
  throwOnError?: boolean
  debug?: boolean
  requestId?: string
  deduplicate?: boolean
@@ -460,7 +584,7 @@ Exports both ESM (`import`) and CommonJS (`require`).

## Why Not Axios?

**axios** is 35KB, has no native edge runtime support, no built-in retry, no deduplication, no SSE, no circuit breaker, and carries `XMLHttpRequest` baggage from a different era of the web.

**got** dropped CommonJS in v12 — if your project uses `require()`, you're stuck on an old version.
@@ -486,16 +610,19 @@ Exports both ESM (`import`) and CommonJS (`require`).
| `hurl.head(url, options?)` | HEAD request → `Promise<HurlResponse<void>>` |
| `hurl.options(url, options?)` | OPTIONS request |
| `hurl.request(url, options?)` | Generic request, method from options |
| `hurl.sse(url, options)` | Open an SSE stream → `{ close() }` |
| `hurl.all(requests)` | Run requests in parallel |
| `hurl.create(defaults?)` | New isolated instance |
| `hurl.extend(defaults?)` | New instance inheriting current defaults and interceptors |
| `hurl.defaults.set(defaults)` | Set global defaults |
| `hurl.defaults.get()` | Get current defaults |
| `hurl.defaults.reset()` | Reset defaults to instance creation values |
| `hurl.interceptors.request.use(fn)` | Register request interceptor |
| `hurl.interceptors.response.use(fn)` | Register response interceptor |
| `hurl.interceptors.error.use(fn)` | Register error interceptor |
| `clearCache()` | Clear in-memory response cache |
| `invalidateCache(key)` | Invalidate a single cache entry by URL or custom key |
| `getCircuitStats(key)` | Get state and failure count for a circuit breaker key |

---
