Project Overview
InkField shifts painting from "image" back to "behavior."
In this system, the path, speed, and direction of every brushstroke are fully recorded, transforming painting into a temporal structure that can be saved and re-computed.
This motion data feeds into the brush and ink generation systems. The image is therefore not merely replayed, but continuously regenerated along existing trajectories. Ink bleeding, grain, and dry-brush effects shift with each variation in random seed, so every re-rendering retains subtle differences.
The work is both a record and a system.
It preserves traces of the artist's physical movement while allowing algorithms to continuously generate new variations along the same paths.
In an era of increasingly automated image generation, InkField refocuses attention on "intention" within the creative process.
The pauses, turns, and rhythms that briefly exist between hand and brain are transformed into data that can be observed and analyzed.
Painting is no longer just a static result, but a continuously flowing duration of time.
Technical Pipeline
The entire painting pipeline consists of 10 GLSL shaders and 27 offscreen buffers:
→ typeMapEncode.frag (per-pixel brush type recording)
→ feedback.frag (6 ink physics modes)
→ flow.frag (8 noise blend algorithms)
→ composite.frag (reads typeMap for brush identity)
→ distort.frag (FBM / resonance / cellular / white dot / grain)
→ screen output
Technical Scale
| Component | Metrics |
|---|---|
| JavaScript | 10,871 lines (5 core modules) |
| GLSL Shaders | 2,775 lines (10 custom shaders) |
| Buffers | 14 Framebuffers + 13 createGraphics |
| WebGL Contexts | 3 (optimized from 16) |
| Tech Stack | p5.js, WebGL, GLSL, p5.easycam |
| Deployment | Bundles to index.html + script.js + shader.js |
Brush System — 7 Modes & Spring Physics
Physics Model
Brush movement uses a spring-damper model (spring force plus damping) to simulate the inertia and elasticity of a Chinese calligraphy brush. The mouse position acts as the target; each frame the brush tip accelerates toward it and is damped, producing natural lag and bounce.
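The spring-damper update described above can be sketched on the CPU as follows. This is a minimal illustration, not the project's actual code; the `stiffness` and `damping` constants are assumed values chosen for demonstration.

```javascript
// Minimal sketch of a spring-damper brush tip that chases a target position.
// stiffness/damping values are illustrative, not the project's real constants.
function createBrushTip(x, y, stiffness = 0.15, damping = 0.75) {
  const tip = { x, y, vx: 0, vy: 0 };
  tip.update = function (targetX, targetY) {
    // Spring force pulls the tip toward the mouse target...
    const ax = (targetX - tip.x) * stiffness;
    const ay = (targetY - tip.y) * stiffness;
    // ...while damping bleeds off velocity, producing natural lag and bounce.
    tip.vx = (tip.vx + ax) * damping;
    tip.vy = (tip.vy + ay) * damping;
    tip.x += tip.vx;
    tip.y += tip.vy;
  };
  return tip;
}
```

Iterating `update` toward a fixed target converges with a slight overshoot, which is exactly the elastic "bounce" the text describes.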
| Mode | Name | Characteristics |
|---|---|---|
| 1 | Ink Brush | Calligraphic strokes with spray variation, pressure controls ink depth |
| 2 | Marker | Hard-edged square brush for large-area block expression |
| 3 | Spray Paint | Liquid flow effect, ink trails with dynamic extension |
| 4 | Dry Brush | Noise-driven line width variation, simulates technical pen drawing |
| 5 | Spray Dots | Dot-based splash effect, density varies with brush speed |
| 6 | Flat Brush | Advanced calligraphy brush animated per frame (flyBrushOnBuffer) |
| 7 | Deckle Edge | Precise strokes without spray, clean and sharp lines |
Additional Parameters
- Brush Size: 7 presets (0.1 / 0.25 / 0.5 / 1 / 2 / 3 / 5× base)
- Path Rotation: None / 5-10° random / 10-25° random
- Blend Mode: Mix (linear) / Multiply (darken) / Darken (minimum)
- Color Selection: 35-color palette (7×5 grid) + custom colors
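The three blend modes in the list above reduce to simple per-channel formulas. A sketch, assuming channel values in [0, 1]; the signatures are illustrative (the project applies these inside its composite shader):

```javascript
// Per-channel blend formulas for the three modes listed above, values in [0, 1].
const blendModes = {
  // Mix: linear interpolation between base and ink by opacity t
  mix: (base, ink, t) => base * (1 - t) + ink * t,
  // Multiply: product of channels, always darkens
  multiply: (base, ink) => base * ink,
  // Darken: keeps the darker (minimum) channel
  darken: (base, ink) => Math.min(base, ink),
};
```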
Ink Diffusion — 6 feedback.frag Modes
Shader feedback.frag — Diffusion
Each frame, the feedback shader simulates physical diffusion on newBufferBlack. It reads a force map that defines direction and magnitude per pixel, pushing ink along those vectors to simulate natural ink flow on paper.
| Mode | Name | Effect | Feels Like |
|---|---|---|---|
| Mode 0 | MIX | Wide diffusion + blur | Ink dropped in water |
| Mode 1 | SHARP | Grain texture + glow | Charcoal on rough paper |
| Mode 2 | FLYING | Multi-scale noise dry brush | Dry brush on rice paper |
| Mode 3 | WET | Watercolor bleeding | Watercolor wash |
| Mode 4 | SALT | Salt crystallization texture | Salt technique on watercolor |
| Mode 5 | HAIR | Directional slow diffusion | Slow-bleeding ink wash |
After releasing the mouse, ink continues diffusing briefly (force decays to zero), simulating real ink's lingering spread.
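The lingering spread after mouse release amounts to an exponential falloff of the diffusion force. A minimal sketch, assuming an illustrative per-frame decay rate (the project's actual constant may differ):

```javascript
// After mouse release the diffusion force decays toward zero each frame,
// so ink keeps spreading briefly. The decay rate here is an assumed value.
function decayForce(force, rate = 0.92, epsilon = 0.001) {
  const f = force * rate;      // exponential falloff per frame
  return f < epsilon ? 0 : f;  // clamp tiny residues to exactly zero
}
```

With these values a full-strength force fades out over roughly 80 frames, i.e. a second or two of residual bleeding.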
Flow Distortion — 8 Blend Algorithms
Shader flow.frag — Flow Distortion
Long-press triggers a flow effect using Perlin/Simplex noise as a base, layered with different blend algorithms to transform static strokes into dynamic organic forms. Each blendType produces a distinct visual style.
| Type | Symbol | Name | Effect |
|---|---|---|---|
| 0 | ~ | Base Wave | Perlin/Simplex noise displacement |
| 2 | ◎ | Concentric Ripples | Ring waves expanding from center |
| 3 | ‖ | Vertical Wave | Vertical wave displacement |
| 4 | ≡ | Horizontal Shift | Horizontal wave displacement |
| 5 | ✿ | Crackle Pattern | Cell/crackle texture pattern |
| 6 | □ | Mosaic Shards | Block-fractured mosaic effect |
| 7 | 🌀 | Vortex | Spiral swirling toward center |
| 8 | ◇ | Cellular Texture | Organic cell-like deformation |
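Each blendType boils down to a different rule for displacing a UV coordinate before sampling. A CPU sketch of a few of the modes above; `noise2` stands in for the shader's Perlin/Simplex noise, and the formulas are simplified analogues, not the shader's exact math:

```javascript
// Simplified analogue of flow.frag's per-blendType UV displacement.
// noise2(x, y) is a stand-in for Perlin/Simplex noise; t is time, amp is intensity.
function displaceUV(u, v, blendType, t, amp, noise2) {
  switch (blendType) {
    case 3: // vertical wave: displace v by a wave driven by u
      return [u, v + amp * Math.sin(u * 10.0 + t)];
    case 4: // horizontal shift: displace u by a wave driven by v
      return [u + amp * Math.sin(v * 10.0 + t), v];
    case 7: { // vortex: rotate around center, stronger near the middle
      const dx = u - 0.5, dy = v - 0.5;
      const r = Math.hypot(dx, dy);
      const angle = Math.atan2(dy, dx) + amp * (1.0 - r);
      return [0.5 + r * Math.cos(angle), 0.5 + r * Math.sin(angle)];
    }
    default: // base wave: noise-driven displacement on both axes
      return [u + amp * noise2(u, v + t), v + amp * noise2(u + t, v)];
  }
}
```

At `amp = 0` every mode degenerates to the identity, which is why the intensity slider fades the effect out smoothly.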
Key Feature: "Last Stroke Only" Mode
When Last Stroke Only is enabled, the flow effect applies only to the most recent stroke without affecting existing marks.
This requires displacing the typeMap identity buffer in lockstep on the GPU: the typeMap pass applies the same noise offsets as the color pass, so brush-type data stays aligned during deformation and identity follows color movement.
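The invariant behind "Last Stroke Only" is that color and identity are sampled with one shared displacement. A sketch of that invariant, with hypothetical buffer-lookup functions standing in for texture fetches:

```javascript
// colorBuf and typeBuf are stand-ins for texture lookups. The key point:
// the offset is computed once and applied to both, so identity follows pigment.
function sampleDisplaced(colorBuf, typeBuf, x, y, offset) {
  const sx = x + offset.dx, sy = y + offset.dy; // one offset, computed once
  return {
    color: colorBuf(sx, sy), // displaced pigment
    type: typeBuf(sx, sy),   // identity displaced identically, so it stays aligned
  };
}
```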
Metallic Etching — scanAndMarkDarkPoints
JS metallic.js — Bug Bite Etching Simulation
Simulates the traditional printmaking "bug bite" etching technique: the system scans dark pixels on the canvas, selects 10 target points via weighted random sampling, and generates procedural organic shapes at each location.
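The selection step can be sketched as weighted random sampling where darker pixels carry proportionally more weight. This is a hedged illustration, not the project's actual code; `pixels` is assumed to be a flat array of `{x, y, gray}` entries with gray in [0, 255]:

```javascript
// Illustrative sketch of the dark-point selection in scanAndMarkDarkPoints:
// darker pixels get higher weight, and `count` targets are drawn by
// weighted random sampling (with replacement, for simplicity).
function pickDarkTargets(pixels, count, rand = Math.random) {
  const weights = pixels.map(p => Math.max(0, 255 - p.gray)); // darker = heavier
  const total = weights.reduce((a, b) => a + b, 0);
  const targets = [];
  for (let i = 0; i < count && total > 0; i++) {
    let r = rand() * total;                 // roulette-wheel selection
    for (let j = 0; j < pixels.length; j++) {
      r -= weights[j];
      if (r <= 0) { targets.push(pixels[j]); break; }
    }
  }
  return targets;
}
```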
Trigger Modes
- GLOBAL: Scan entire canvas
- EACH: Scan each new stroke
- RANDOM: Random trigger
- EACHR: Per-stroke scan + random intensity
| Shape Type | Name | Form |
|---|---|---|
| C | Circle | Circular perforation |
| RE | Rectangle | Rectangular cutout |
| LE | Lightning | Small jagged cracks (1.3× enlarged during playback for visibility) |
| BE | Big Lightning | Large branching fracture patterns |
Material Presets
6 metallic materials: Gold / Silver / Copper / Rose Gold / Black Iron / Diamond. Rendered via metallic.frag shader for material luster and color.
Recording & Playback — Deterministic Replay
JS recording.js — Event Recording
A complete event recording system encodes every mouse position, timestamp, and brush parameter into JSON. Combined with the custom crandom seeded random wrapper (js/crandom.js) that tracks every random() call count, this achieves near-100% playback consistency.
Recording Features
- Event-driven JSON capture (35+ fields per stroke)
- Automatic pause compression (skips idle time between strokes)
- Bug bite targetPoints recorded directly in event data, eliminating pixel-scan indeterminacy during playback
- Playback path preserves identical distance checks (minDistance), ensuring consistent bite counts
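The pause-compression idea above can be sketched as a pass over event timestamps that caps any idle gap at a maximum. The threshold value is an assumption for illustration, not the project's actual setting:

```javascript
// Sketch of automatic pause compression: gaps between consecutive event
// timestamps (ms) longer than maxGap are shortened to maxGap, preserving
// in-stroke timing while skipping dead time between strokes.
function compressPauses(timestamps, maxGap = 500) {
  const out = [timestamps[0]];
  for (let i = 1; i < timestamps.length; i++) {
    const gap = Math.min(timestamps[i] - timestamps[i - 1], maxGap);
    out.push(out[i - 1] + gap);
  }
  return out;
}
```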
Playback Features
- Full stroke reconstruction (preserving original physics parameters)
- Auto-loop playback (configurable wait time)
- Playback offset (X/Y translation)
- Progress bar with percentage display
- JSON import/paste support (Agent JSON Panel)
Consistency Guarantee: crandom System
crandom (js/crandom.js) wraps p5.js random(), counting each call. During playback, randomCount comparison verifies PRNG sequence alignment.
Key design: shapeSeed = floor(target.x × 1000 + target.y × 333 + shapeSeedRand) — shapes are determined by target position. generateOrganicShape internally calls randomSeed(seed) to reset PRNG, ensuring identical seeds produce identical shapes.
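The pattern can be sketched as a counted, seeded PRNG plus the position-derived seed from the formula above. The mulberry32 generator here is a stand-in for p5.js random(), chosen only to make the sketch self-contained; the counting and seed-derivation pattern is what matters:

```javascript
// Sketch of the crandom idea: a seeded PRNG wrapped with a call counter,
// so playback can compare randomCount to verify sequence alignment.
function createCRandom(seed) {
  let state = seed >>> 0, count = 0;
  return {
    random() { // mulberry32 step, counted the way crandom counts random() calls
      count++;
      state = (state + 0x6D2B79F5) >>> 0;
      let t = state;
      t = Math.imul(t ^ (t >>> 15), t | 1);
      t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
      return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
    },
    get randomCount() { return count; }, // compared during playback
  };
}

// shapeSeed exactly as stated in the text: derived from target position,
// so the same target always regenerates the same organic shape.
function shapeSeed(target, shapeSeedRand) {
  return Math.floor(target.x * 1000 + target.y * 333 + shapeSeedRand);
}
```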
Dual Mode — Artist vs Collector
Artist Mode
Full three-panel UI:
- Art System Log (📋): Recording, playback, toggle switches (Paper / Grid / Camera / Loop / Distort / fBM / RS / Cellular / WhiteDot / Grain / Path / Fit)
- Brush Control (🖌️): Brush modes 1-7, ink effects 0-5, size, rotation, blend mode, 35-color palette
- Effect Control (✨): Bug bite trigger mode, metallic materials, flow effect buttons + intensity slider
All panels are draggable with positions saved to localStorage. Clicking any panel brings it to front.
Collector Mode
A pure dynamic exhibition device — all UI hidden, auto-loads artwork JSON and loops playback.
- Auto-loads /lib/demo.json
- URL hash switches artworks: #1, #2, #3 → loads /lib/1.json, /lib/2.json…
- Canvas size synced from JSON
- Auto-loop enabled
URL Parameter System
Format: ?_panel1:value_panel2:value_...
?_artist:1 // Force artist mode
#1, #2, #3 // Collector mode: load specific artwork
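A parser for the stated `?_panel:value_...` format might look like the sketch below. The splitting rules follow the format as written; error handling is minimal and the function name is illustrative:

```javascript
// Hedged sketch of parsing the ?_panel1:value_panel2:value_... query format.
function parseParams(search) {
  const params = {};
  for (const token of search.replace(/^\?_?/, "").split("_")) {
    const [key, value] = token.split(":");
    if (key) params[key] = value;
  }
  return params;
}
```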
Pipeline Overview
Imagine painting in art class:
Step 1 — You draw a sketch on white paper with a pencil (darker = more pressure).
Step 2 — A "hairdryer" step lets the ink diffuse for a watercolor feel.
Step 3 — The sketch is encoded with a special scheme and stored in a "vault" (finalBuffer).
Step 4 — To show the piece, the system decodes from the vault and composites onto textured paper.
Step 5 — Final effects (distortion, grain) are applied.
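The five steps above can be sketched as a per-frame pipeline. Each stage is a stub standing in for a shader pass (feedback.frag, encode.frag, composite.frag, distort.frag); the function names are illustrative, not the project's API:

```javascript
// Illustrative per-frame orchestration of the five pipeline steps above.
// Each stage function stands in for a shader pass or buffer operation.
function renderFrame(stages, stroke) {
  let draft = stages.draw(stroke);            // ① grayscale draft on newBufferBlack
  draft = stages.diffuse(draft);              // ② force-map diffusion (feedback.frag)
  const encoded = stages.encode(draft);       // ③ color-encode into finalBuffer
  const composed = stages.composite(encoded); // ④ decode + overlay on paper texture
  return stages.postFx(composed);             // ⑤ distortion / grain (distort.frag)
}
```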
Below is each pipeline stage with its shader/buffer and the corresponding tutorial:
| Stage | Shader / Buffer | What It Does | Tutorial |
|---|---|---|---|
| ① Draft | newBufferBlack | Brush leaves grayscale marks on white paper; depth = pressure | Color Journey → |
| ② Diffusion | feedback.frag | Each frame reads force map, lets ink flow and spread naturally | Ink Effects → |
| ③ Encode | encode.frag | Translates grayscale draft into color encoding, stores in finalBuffer | Color Journey → |
| ③b Identity | typeMapEncode.frag | Simultaneously writes brush type to separate typeMapBuffer | Color Journey → |
| ④ Composite | composite.frag | Check identity → select blend mode → overlay on paper texture | Blend & Flow → |
| ⑤ Preview | realtime.frag | Real-time color preview while drawing | Color Journey → |
| ⑥ Post Effects | distort.frag | FBM distortion / ripples / cellular / white dots / grain | Effects Workshop → |
Full Pipeline Diagram
From mouse press to screen output, the complete flow:
Buffer Reference Table
The application uses many offscreen buffers, each serving a different purpose — think of them as different desks in an art studio, each holding different materials:
| Buffer Name | Purpose | Analogy |
|---|---|---|
| newBufferBlack | Current stroke grayscale draft | White paper in front of you |
| pingPongBuffer | Feedback diffusion temp (dedicated) | Tray next to the hairdryer |
| finalBuffer | All strokes in encoded form | Vault (codebook) |
| typeMapBuffer | Per-pixel brush identity (R=type, G=white opacity) | ID card book |
| oldBuffer | Grayscale accumulation of all strokes | Backup photocopy |
| screenBuffer | Composite + real-time preview workspace | Display desk (erasable) |
| realtimeIntermediateBuffer | Composite → realtime copy (avoids same-FBO read/write) | Display desk photocopy |
| paperTextureBuffer | Textured background paper | Fine art paper |
| lastStrokeBuffer | Snapshot of last stroke (for flow effect) | Photo of last stroke |
| img (forceMap) | Force map driving diffusion direction | Wind direction chart |
| finalOut | Final output with effects applied | Framed artwork |
| cursorBuffer | Cursor and path preview (migrated to framebuffer) | Glass overlay markings |
Technical Note: createFramebuffer
All shader-processed buffers (plus cursorBuffer) use p5.js createFramebuffer(), sharing the main canvas WebGL context. This replaced createGraphics(WEBGL), which created a separate GL context per buffer. Safari enforces strict limits on the number of GL contexts; switching to framebuffers restored performance from ~10 FPS to a smooth frame rate.
The original 16 WebGL contexts have been reduced to 3 (main canvas + 2 UI/debug overlays). Only P2D blur buffers and setup-time textures still use createGraphics.
Tutorial Series — Deep Dive
The chapters above provide feature summaries and pipeline overview. Below are the full tutorial articles for each topic:
InkField — Technical Documentation
Reorganized: 2026-03-10 (Hub architecture with deduplicated content)