Commit e7c744c

add doc on new cli commands

Signed-off-by: Sylvain Hellegouarch <sh@defuze.org>

1 parent 5575c3c commit e7c744c

6 files changed: +107 −18 lines

docs/fault/docs/explanations/fault-injection-basics.md

Lines changed: 16 additions & 0 deletions

@@ -70,6 +70,22 @@ Each fault type has a distinct role in helping you simulate and analyze adverse
 - **Engineering Focus:**
   Strengthen error-handling routines, validate user-friendly error messages, and implement effective retry or fallback mechanisms.

+### LLM Faults
+- **Purpose:**
+  To adjust the parameters of LLM exchanges.
+- **Use Case:**
+  Verify that your application can handle unexpected responses from an LLM.
+- **Engineering Focus:**
+  Strengthen error-handling routines, validate user-friendly error messages, and handle changes in LLM behavior gracefully.
+
+### Database Faults
+- **Purpose:**
+  To impact communications to and from databases.
+- **Use Case:**
+  Explore how your application performs under database failures.
+- **Engineering Focus:**
+  Strengthen error-handling routines, validate user-friendly error messages, and handle database errors gracefully.
+
 ## In Summary

 Fault injection is a powerful tool in your reliability engineering toolkit. It not only helps you detect vulnerabilities but also guides you in making informed improvements. By understanding the purpose behind each fault type and how to apply different distribution models, you can build robust systems that continue to perform even under duress.

docs/fault/docs/explanations/why-fault.md

Lines changed: 4 additions & 3 deletions

@@ -23,9 +23,10 @@ systems lead to healthier operations.
 traffic through its proxy and test your application as normal:

 - Forward and tunnel proxy modes
-- HTTP and HTTPS
-- HTTP/1.1 and HTTP/2
-- TCP IPv4 transparent proxy
+- HTTP, HTTPS
+- HTTP/1.1, HTTP/2, SSE
+- TCP transparent proxy
+- LLM & Database high-level faults
 - Scenarii automation
 - eBPF stealth redirection on Linux

docs/fault/docs/how-to/install.md

Lines changed: 2 additions & 1 deletion

@@ -6,9 +6,10 @@ on your environment.

 ## Features Matrix

-From a very high-level, <span class="f">fault</span> provides the following features:
+From a very high-level <span class="f">fault</span> provides the following features:

 * **Proxy**: a network proxy that models network traffic based on a configuration
+* **LLM/DB**: proxy subcommands dedicated to exploring LLM and database issues
 * **Scenario**: testing automation using the proxy
 * **Injection**: machinery to inject the network proxy into platform resources
 * **AI Agent**: review of results and code from a reliability and resilience perspective

docs/fault/docs/overrides/landing.html

Lines changed: 6 additions & 12 deletions

@@ -408,23 +408,17 @@ <h3>Everything you need to know about bringing fault into your organization.</h3
 <div class="col-6">
 <details class="quote">
 <summary>What is fault?</summary>
-<p>fault is a developer product aiming at supporting engineers keep their high-standards while onboarding AI
-into their pipeline.</p>
+<p>fault is a developer product that helps engineers keep their high standards.</p>
 </details>
 <details class="quote">
 <summary>How is fault delivered?</summary>
-<p>fault is a rust-cli. It runs natively on Linux, macOSX and Windows.</p>
+<p>fault is a Rust-powered command line. It runs natively on Linux, macOS and Windows.</p>
 </details>
 <details class="quote">
-<summary>Is fault free?</summary>
-<p>The fault CLI is free and open-source and will remain so. You will pay for any LLM model through your own
+<summary>Is fault open-source?</summary>
+<p>The fault CLI is free and open-source, under an Apache license, and will remain so. You will pay for any LLM model through your own
 subscription.</p>
 </details>
-<details class="quote">
-<summary>Do you offer enterprise commercial support?</summary>
-<p>As part of the <a href="https://rebound.how/">Rebound</a> family, we do indeed support fault
-commercially. Please feel free to <a href="https://rebound.how/support/#contact">reach out to us</a>.</p>
-</details>
 <details class="quote">
 <summary>Do you upload my data anywhere?</summary>
 <p>The fault CLI doesn't send your data or code anywhere. If you use a Cloud-based LLM such as OpenAI,

@@ -488,10 +482,10 @@ <h2 class="footer-logo">
 </ul>
 </div>
 <div class="col-2">
-<header>SUPPORT</header>
+<header>COMMUNITY</header>
 <ul>
 <li><a href="https://github.com/rebound-how/rebound/issues">Issues</a></li>
-<li><a href="https://github.com/orgs/rebound-how/discussions/categories/ideas">Feature Requests</a></li>
+<li><a href="https://github.com/orgs/rebound-how/discussions/categories/ideas">Discussion</a></li>
 <li><a href="#faq">FAQ</a></li>
 <li><a href="https://rebound.how/support/#contact">Contact-Us</a></li>
 </ul>

docs/fault/docs/reference/cli-commands.md

Lines changed: 66 additions & 2 deletions

@@ -12,7 +12,9 @@ defined in a file or launch a local demo server.
 ### `run`

 Run the proxy with fault injection enabled. This command applies the specified
-network faults to HTTP requests and tunnel streams.
+network faults to TCP streams and HTTP requests.
+
+It has two subcommands dedicated to exploring LLM and database use cases.

 ### `inject`

@@ -354,12 +356,67 @@ Learn more about the [Blackhole fault](./builtin-faults.md#blackhole).
 _Default:_ `ingress`

 - **`--blackhole-sched <value>`**
-  [Intervals scheduling](./schedule-intervals-syntax.md) when to apply the fault (require `--duration` whhen using relative schedule).
+  [Intervals scheduling](./schedule-intervals-syntax.md) when to apply the fault (requires `--duration` when using a relative schedule).
   **Example:** `--blackhole-sched "start:30s,duration:60s"`
   **Example:** `--blackhole-sched "start:5%,duration:40%"` (requires `--duration`)

 ---

+### `llm` Subcommand Options
+
+Specific faults to target your LLM provider.
+
+**<TARGET>** Which LLM provider to target, one of `openai`, `gemini`,
+`open-router` and `ollama`
+
+- **`--endpoint`**
+  The base URL of the targeted LLM provider. Usually you do not need to set
+  this value, as the right base URL is set for each provider.
+
+- **`--case`**
+  Which scenario to run. Possible values: `slow-stream`, `prompt-scramble`,
+  `token-drop`, `inject-bias`, `truncate-response`, `http-error`
+
+- **`--probability`**
+  Fault injection probability, from 0.0 (never) to 1.0 (always)
+  _Default:_ `1.0`
+
+Each case has its own parameters.
+
+When `--case` is `slow-stream`:
+
+- **`--slow-stream-mean-delay`**
+  Delay in milliseconds to slow the stream by.
+  _Default:_ `300`
+
+When `--case` is `token-drop`:
+
+No extra parameters.
+
+When `--case` is `prompt-scramble`:
+
+- **`--scramble-pattern`**
+  Optional regex pattern to scramble in the prompt.
+
+- **`--scramble-with`**
+  Optional substitute text for the scramble (must be set when `--scramble-pattern`
+  is set)
+
+- **`--instruction`**
+  Optional instruction/system prompt to set on the request.
+
+When `--case` is `inject-bias`:
+
+- **`--bias-pattern`**
+  Regex pattern for the bias.
+
+- **`--bias-replacement`**
+  Substitute text for the bias.
+
+When `--case` is `http-error`:
+
+No extra parameters.
+
 ### Usage Examples

 #### Running the Proxy with Multiple Faults

@@ -371,6 +428,13 @@ fault run \
   --with-bandwidth --bandwidth-rate 2000 --bandwidth-unit KBps
 ```

+#### Adding instructions to an LLM call
+
+```bash
+fault run llm openai --instruction "Respond in French"
+```
+

 ## `injection` Command Options

 Inject <span class="f">fault</span> into your platform resources.
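To make the new subcommand more concrete, here is a hypothetical invocation that combines the `llm` flags documented above; the provider, delay, and probability values are illustrative only, not recommendations from the fault documentation:

```bash
# Slow streamed responses from the OpenAI provider by roughly 500 ms,
# injecting the fault on about half of the requests.
fault run llm openai \
  --case slow-stream \
  --slow-stream-mean-delay 500 \
  --probability 0.5
```

Lowering `--probability` below its default of `1.0` lets you observe intermittent degradation rather than a constant one, which is often closer to real-world LLM latency behavior.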

docs/fault/docs/reference/environment-variables.md

Lines changed: 13 additions & 0 deletions

@@ -66,6 +66,19 @@ be simpler to populate these options via environment variables.
 | `FAULT_DNS_PROBABILITY` | `0.5` | Probability (0–100) to trigger a DNS fault. |
 | `FAULT_DNS_SCHED` | (none) | Scheduling of the DNS fault. |

+### `run llm` Command Variables
+
+| **Name** | **Default Value** | **Explanation** |
+|----------|-------------------|-----------------|
+| `FAULT_LLM_ENDPOINT` | (none) | Base URL of the target LLM provider. |
+| `FAULT_LLM_PROBABILITY` | `1.0` | Probability of triggering the fault injection (0 means never, 1 means always). |
+| `FAULT_LLM_SLOW_STREAM_MEAN_DELAY` | `300` | Latency to apply to the LLM response. |
+| `FAULT_LLM_SCRAMBLE_PATTERN` | (none) | Regex pattern to look for in the request. |
+| `FAULT_LLM_SCRAMBLE_WITH` | (none) | Replacement string when the pattern matches. |
+| `FAULT_LLM_SCRAMBLE_INSTRUCTION` | (none) | Instruction to inject into LLM requests as a system prompt. |
+| `FAULT_LLM_BIAS_PATTERN` | (none) | Regex pattern to look for in the response. |
+| `FAULT_LLM_BIAS_REPLACEMENT` | (none) | Replacement string when the pattern matches. |
+
 ## `injection` Command Variables

 ### `aws` Subcommand Variables
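Based on the table added above, a minimal sketch of configuring the `run llm` command through environment variables rather than flags; the chosen values are illustrative only:

```shell
# Equivalent to passing --probability and --slow-stream-mean-delay on
# the command line; useful in CI where flags are harder to template.
export FAULT_LLM_PROBABILITY=0.5
export FAULT_LLM_SLOW_STREAM_MEAN_DELAY=500
```

With these set, a plain `fault run llm <provider>` would pick the values up without any extra flags, assuming the precedence described in this reference (environment variables populating the corresponding options).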
