
stream-json — Windsurf Rules

by uhop

AI Summary

<!-- Canonical source: AGENTS.md — keep this file in sync --> stream-json is a micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory, streaming individual primitives using a SAX-inspired API.

Install

Copy this and paste it into Claude Code, Cursor, or any AI assistant:

I want to add the "stream-json — Windsurf Rules" prompt rules to my project.
Repository: https://github.com/uhop/stream-json

Please read the repo to find the rules/prompt file, then:
1. Download it to the correct location (.cursorrules, .windsurfrules, .github/prompts/, or project root — based on the file type)
2. If there's an existing rules file, merge the new rules in rather than overwriting
3. Confirm what was added

Description

A micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory, streaming individual primitives using a SAX-inspired API.

Project identity

stream-json is a micro-library of Node.js stream components for creating custom JSON processing pipelines with a minimal memory footprint. It can parse JSON files far exceeding available memory, streaming individual primitives using a SAX-inspired API. The npm package name is stream-json. It depends on stream-chain for pipeline composition.

Critical rules

• CommonJS. The project is "type": "commonjs". Use require() in source, import in tests (.mjs).
• No transpilation. Code runs directly.
• One runtime dependency: stream-chain. Do not add other packages to dependencies. Only devDependencies are allowed.
• Do not modify or delete test expectations without understanding why they changed.
• Do not add comments or remove comments unless explicitly asked.
• Keep .js and .d.ts files in sync for all modules under src/.
• Token-based architecture. The parser produces {name, value} tokens. All filters, streamers, and utilities operate on this protocol.
• Backpressure must be handled correctly. All stream components rely on Node.js stream infrastructure via stream-chain.

Code style

• Prettier: 160 char width, single quotes, no bracket spacing, no trailing commas, arrow parens "avoid" (see .prettierrc).
• 2-space indentation.
• Semicolons are enforced by Prettier (default semi: true).
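As a sketch only, the settings above would correspond to a .prettierrc along these lines; the repository's actual .prettierrc is authoritative.

```json
{
  "printWidth": 160,
  "tabWidth": 2,
  "singleQuote": true,
  "bracketSpacing": false,
  "trailingComma": "none",
  "arrowParens": "avoid"
}
```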

Architecture quick reference

• Parser (src/parser.js) consumes text, produces a SAX-like token stream. Uses stream-chain's gen(), flushable(), many(), none, fixUtf8Stream, asStream.
• Assembler (src/assembler.js) interprets tokens into JS objects (EventEmitter). connectTo(stream) and tapChain for chain().
• Disassembler (src/disassembler.js): JS objects → token stream.
• Stringer (src/stringer.js): token stream → JSON text (Transform).
• Emitter (src/emitter.js): token stream → named events (Writable).
• Filters (src/filters/) edit the token stream: pick, replace, ignore, filter. All built on filterBase.
• Streamers (src/streamers/) assemble complete objects: streamValues, streamArray, streamObject. All built on streamBase.
• Utilities: emit(), withParser(), Batch, Verifier, Utf8Stream, FlexAssembler.
• JSONL: jsonl/parser.js and jsonl/stringer.js for line-separated JSON.
• JSONC: jsonc/parser.js, jsonc/stringer.js, and jsonc/verifier.js for JSON with Comments.


Health Signals

Maintenance: Committed 4d ago · Active
Adoption: 1.2k ★ on GitHub · Popular
Docs: README + description · Well-documented

GitHub Signals

Stars: 1.2k
Forks: 51
Issues: 5
Updated: 4d ago
View on GitHub
No License


Works With

Any AI assistant that accepts custom rules or system prompts

Claude
ChatGPT
Cursor
Windsurf
Copilot
+ more