# getting started

chaos testing with zero mercy
cruel is a chaos engineering library for testing how your ai applications handle failures. it wraps ai sdk models with configurable fault injection - rate limits, timeouts, stream cuts, corrupt responses, and more.
for the base cruel(...) function wrappers (fetch/services/core api), see the core api page.
## install

```sh
bun add cruel
```

## wrap a model
```ts
import { openai } from "@ai-sdk/openai"
import { generateText } from "ai"
import { cruelModel } from "cruel/ai-sdk"

const model = cruelModel(openai("gpt-4o"), {
  rateLimit: 0.2,
  delay: [100, 500],
  onChaos: (event) => console.log(event.type, event.modelId),
})

const result = await generateText({
  model,
  prompt: "hello",
})
```

this wraps the model so 20% of calls get a 429 rate limit error and every call has 100-500ms of added latency. the onChaos callback fires whenever chaos is injected.
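the injected 429s are only useful if something in your app retries them. a minimal retry-with-backoff sketch (this helper is not part of cruel; the plain `statusCode` check stands in for a real provider error check):

```ts
// minimal retry-with-backoff sketch (not part of cruel): retries any
// error carrying statusCode 429, which matches the shape cruel injects
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      const status = (err as { statusCode?: number }).statusCode
      if (status !== 429) throw err // only retry rate limits
      // exponential backoff: 100ms, 200ms, 400ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt))
    }
  }
  throw lastError
}
```

with rateLimit: 0.2, roughly one call in five hits the retry path. cruel's injected errors also pass APICallError.isInstance(), so you can use the ai sdk's own error check instead of sniffing statusCode as above.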
## how it works
cruel sits between your app and the ai provider. when a call is made:
- pre-call checks: rate limit, overloaded, timeout, and similar faults fire before the api call
- api call: if the pre-call checks pass, the real api call happens
- post-call mutations: partial response, token usage override, and similar faults modify the result
- stream transforms: slow tokens, corrupt chunks, and stream cuts modify the stream
errors use the same format as real provider errors (APICallError.isInstance() returns true), so your retry logic works exactly like it would in production.
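conceptually, the pre-call phase is a probability roll in front of the real call. a minimal sketch of the idea, not cruel's actual implementation (the `rng` parameter is hypothetical, injected here so the behavior can be made deterministic in tests):

```ts
interface ChaosOptions {
  rateLimit?: number       // probability of throwing a 429 before the call
  delay?: [number, number] // added latency range in ms
}

// sketch of a pre-call chaos wrapper: roll the dice before the real call
function withChaos<T>(
  fn: () => Promise<T>,
  opts: ChaosOptions,
  rng: () => number = Math.random,
): () => Promise<T> {
  return async () => {
    if (opts.rateLimit !== undefined && rng() < opts.rateLimit) {
      const err = new Error("rate limited") as Error & { statusCode: number }
      err.statusCode = 429 // same status a real provider 429 would carry
      throw err
    }
    if (opts.delay) {
      // add jittered latency within the configured range
      const [min, max] = opts.delay
      await new Promise((r) => setTimeout(r, min + rng() * (max - min)))
    }
    return fn() // pre-call checks passed: make the real call
  }
}
```

the real library layers the post-call mutations and stream transforms on top of this same pattern, mutating the result or the stream after the call instead of failing before it.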
## streaming

```ts
import { openai } from "@ai-sdk/openai"
import { streamText } from "ai"
import { cruelModel } from "cruel/ai-sdk"

const model = cruelModel(openai("gpt-4o"), {
  slowTokens: [50, 200],
  streamCut: 0.1,
  corruptChunks: 0.02,
})

const result = streamText({ model, prompt: "hello" })

for await (const chunk of result.fullStream) {
  if (chunk.type === "text-delta") process.stdout.write(chunk.delta)
}
```

## with the gateway
```ts
import { gateway } from "@ai-sdk/gateway"
import { cruelModel } from "cruel/ai-sdk"

const model = cruelModel(gateway("openai/gpt-4o"), {
  rateLimit: 0.2,
  overloaded: 0.1,
})
```

## presets
```ts
import { openai } from "@ai-sdk/openai"
import { cruelModel, presets } from "cruel/ai-sdk"

cruelModel(openai("gpt-4o"), presets.realistic)
cruelModel(openai("gpt-4o"), presets.nightmare)
cruelModel(openai("gpt-4o"), presets.apocalypse)
```

## wrap everything
```ts
import { openai } from "@ai-sdk/openai"
import { cruelProvider } from "cruel/ai-sdk"

const chaos = cruelProvider(openai, { rateLimit: 0.1 })

chaos("gpt-4o") // language model
chaos.embeddingModel("text-embedding-3-small") // embedding model
chaos.imageModel("dall-e-3") // image model
```

## run the examples
```sh
cd packages/examples

bun run run.ts ai-sdk openai
bun run run.ts ai-gateway anthropic
bun run run.ts core
bun run run.ts with-diagnostics

bun run run.ts ai-sdk openai -m gpt-6
bun run run.ts ai-gateway openai --model gpt-6
```

-m / --model sets MODEL for each matched example process. this swaps the model id without changing example files.