Drawing is not an image.
It is an event.

InkField preserves the memory of every gesture.

Contents

Part I — Overview & Features

  1. Project Overview
  2. Brush System — 7 Modes & Spring Physics
  3. Ink Diffusion — 6 feedback.frag Modes
  4. Flow Distortion — 8 Blend Algorithms
  5. Metallic Etching — scanAndMarkDarkPoints
  6. Recording & Playback — Deterministic Replay
  7. Dual Mode — Artist vs Collector

Part II — Rendering Pipeline & Reference

  1. Pipeline Overview
  2. Full Pipeline Diagram
  3. Buffer Reference Table
  4. Tutorial Series — Deep Dive

Project Overview

InkField shifts painting from "image" back to "behavior."

In this system, the path, speed, and direction of every brushstroke are fully recorded, transforming painting into a temporal structure that can be saved and re-computed.

This motion data feeds into the brush and ink generation systems. The image is therefore not merely replayed, but continuously regenerated along existing trajectories. Ink bleeding, grain, and dry-brush effects shift with each variation in random seed, so every re-rendering retains subtle differences.

The work is both a record and a system.

It preserves traces of the artist's physical movement while allowing algorithms to continuously generate new variations along the same paths.

In an era of increasingly automated image generation, InkField refocuses attention on "intention" within the creative process.

The pauses, turns, and rhythms that briefly exist between hand and brain are transformed into data that can be observed and analyzed.

Painting is no longer just a static result; it is a duration of time that continues to flow.

Technical Pipeline

The entire painting pipeline consists of 10 GLSL shaders and 27 offscreen buffers:

Stroke input → encode.frag (35-color LUT + HSB conversion)
→ typeMapEncode.frag (per-pixel brush type recording)
→ feedback.frag (6 ink physics modes)
→ flow.frag (8 noise blend algorithms)
→ composite.frag (reads typeMap for brush identity)
→ distort.frag (FBM / resonance / cellular / white dot / grain)
→ screen output

Technical Scale

Component        Metrics
JavaScript       10,871 lines (5 core modules)
GLSL Shaders     2,775 lines (10 custom shaders)
Buffers          14 Framebuffers + 13 createGraphics
WebGL Contexts   3 (optimized from 16)
Tech Stack       p5.js, WebGL, GLSL, p5.easycam
Deployment       Bundles to index.html + script.js + shader.js

Brush System — 7 Modes & Spring Physics

Physics Model

Brush movement uses a spring-damper model (spring force + damping) to simulate the inertia and elasticity of a Chinese calligraphy brush. The mouse position acts as the target; the brush tip follows via acceleration interpolation, producing natural lag and bounce.
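
The spring-damper follower described above can be sketched as a few lines of JavaScript. This is an illustrative model, not InkField's actual source; the function and parameter names are hypothetical:

```javascript
// Hypothetical sketch of a spring-damper brush follower.
// The tip accelerates toward the target (mouse) under a spring force
// and loses energy through damping, producing natural lag and bounce.
function createBrushTip(x, y, stiffness = 0.12, damping = 0.78) {
  return {
    x, y, vx: 0, vy: 0,
    // One physics step toward the target position (call once per frame).
    step(targetX, targetY) {
      // Spring force pulls the tip toward the target.
      this.vx += (targetX - this.x) * stiffness;
      this.vy += (targetY - this.y) * stiffness;
      // Damping bleeds off velocity so the tip settles instead of oscillating forever.
      this.vx *= damping;
      this.vy *= damping;
      this.x += this.vx;
      this.y += this.vy;
    },
  };
}
```

Called each frame with the current mouse position, the tip trails behind fast strokes and overshoots slightly on sharp turns, which is what gives the brush its elasticity.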

Mode  Name         Characteristics
1     Ink Brush    Calligraphic strokes with spray variation; pressure controls ink depth
2     Marker       Hard-edged square brush for large-area block expression
3     Spray Paint  Liquid flow effect; ink trails with dynamic extension
4     Dry Brush    Noise-driven line-width variation; simulates technical pen drawing
5     Spray Dots   Dot-based splash effect; density varies with brush speed
6     Flat Brush   Per-frame animated advanced calligraphy brush (flyBrushOnBuffer)
7     Deckle Edge  Precise strokes without spray; clean, sharp lines

Additional Parameters

  • Brush Size: 7 presets (0.1 / 0.25 / 0.5 / 1 / 2 / 3 / 5× base)
  • Path Rotation: None / 5-10° random / 10-25° random
  • Blend Mode: Mix (linear) / Multiply (darken) / Darken (minimum)
  • Color Selection: 35-color palette (7×5 grid) + custom colors
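
The three blend modes listed above correspond to standard per-channel formulas. A minimal sketch using the textbook definitions (not InkField's shader source):

```javascript
// Per-channel blend formulas for the three modes above
// (standard definitions; illustrative, not the actual shader code).
// `base` is the existing pixel, `ink` the incoming stroke, both in [0, 1].
// `amount` controls Mix's linear interpolation strength.
function blend(mode, base, ink, amount = 1.0) {
  switch (mode) {
    case "mix":      return base + (ink - base) * amount; // linear interpolation
    case "multiply": return base * ink;                   // product; always darkens
    case "darken":   return Math.min(base, ink);          // keep the darker value
    default: throw new Error(`unknown blend mode: ${mode}`);
  }
}
```

Multiply and Darken both only ever darken the canvas, which is why they suit layered ink; Mix can lighten as well, since it interpolates toward the incoming color.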

Ink Diffusion — 6 feedback.frag Modes

Shader feedback.frag — Diffusion

Each frame, the feedback shader simulates physical diffusion on newBufferBlack. It reads a force map that defines direction and magnitude per pixel, pushing ink along those vectors to simulate natural ink flow on paper.
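
The "push ink along force vectors" step amounts to advection: each output pixel samples backward along its force vector, so over many frames ink drifts in the force direction. A minimal grid-based sketch (illustrative; the real feedback.frag also adds texture, blur, and mode-specific effects):

```javascript
// Minimal sketch of ink advection along a per-pixel force map.
// `ink` is a w*h grayscale array; `force` holds {fx, fy} per pixel.
// Each pixel pulls ink from the position "behind" its force vector,
// which over repeated frames pushes ink along the force direction.
function advect(ink, force, w, h) {
  const out = new Float32Array(w * h);
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      const i = y * w + x;
      // Sample backward along the force vector (nearest neighbor, clamped to edges).
      const sx = Math.min(w - 1, Math.max(0, Math.round(x - force[i].fx)));
      const sy = Math.min(h - 1, Math.max(0, Math.round(y - force[i].fy)));
      out[i] = ink[sy * w + sx];
    }
  }
  return out;
}
```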

Mode    Name    Effect                        Feels Like
Mode 0  MIX     Wide diffusion + blur         Ink dropped in water
Mode 1  SHARP   Grain texture + glow          Charcoal on rough paper
Mode 2  FLYING  Multi-scale noise dry brush   Dry brush on rice paper
Mode 3  WET     Watercolor bleeding           Watercolor wash
Mode 4  SALT    Salt crystallization texture  Salt technique on watercolor
Mode 5  HAIR    Directional slow diffusion    Slow-bleeding ink wash

After releasing the mouse, ink continues diffusing briefly (force decays to zero), simulating real ink's lingering spread.


Flow Distortion — 8 Blend Algorithms

Shader flow.frag — Flow Distortion

A long press triggers a flow effect that uses Perlin/Simplex noise as a base, layered with different blend algorithms to transform static strokes into dynamic organic forms. Each blendType produces a distinct visual style.
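
Each blendType boils down to a different UV-displacement formula. Here is an illustrative sketch of two of them, Concentric Ripples and Vortex, using my own approximations of the math (not the actual flow.frag code):

```javascript
// Illustrative UV displacement for two of the blend types
// ("Concentric Ripples" and "Vortex"); approximations, not flow.frag itself.
// (u, v) are coordinates in [0, 1]; t is time; strength scales the effect.
function displace(blendType, u, v, t, strength = 0.05) {
  const dx = u - 0.5, dy = v - 0.5;
  const r = Math.hypot(dx, dy); // distance from canvas center
  switch (blendType) {
    case 2: { // Concentric Ripples: radial sine wave expanding from center
      const wave = Math.sin(r * 40.0 - t * 4.0) * strength;
      return r === 0 ? [u, v] : [u + (dx / r) * wave, v + (dy / r) * wave];
    }
    case 7: { // Vortex: rotate around the center, stronger near the middle
      const angle = strength * 20.0 * (1.0 - r);
      const c = Math.cos(angle), s = Math.sin(angle);
      return [0.5 + dx * c - dy * s, 0.5 + dx * s + dy * c];
    }
    default:
      return [u, v]; // other types omitted in this sketch
  }
}
```

Note that the vortex is a pure rotation, so it preserves each point's distance from the center; the ripple instead pushes points radially in and out.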

Type  Symbol  Name                Effect
0     ~       Base Wave           Perlin/Simplex noise displacement
2             Concentric Ripples  Ring waves expanding from center
3             Vertical Wave       Vertical wave displacement
4             Horizontal Shift    Horizontal wave displacement
5             Crackle Pattern     Cell/crackle texture pattern
6             Mosaic Shards       Block-fractured mosaic effect
7     🌀      Vortex              Spiral swirling toward center
8             Cellular Texture    Organic cell-like deformation

Key Feature: "Last Stroke Only" Mode

When Last Stroke Only is enabled, the flow effect applies only to the most recent stroke without affecting existing marks.

This requires the typeMap identity buffer to be displaced in lockstep on the GPU so that brush-type data doesn't misalign during deformation: the typeMap pass uses identical noise offsets, so identity follows color movement.
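
The alignment guarantee reduces to both passes calling the same deterministic offset function. A toy sketch of the idea (illustrative names, not the shader code; the pseudo-noise formula is an arbitrary stand-in):

```javascript
// Both the color pass and the typeMap pass must compute the SAME offset
// for a given (u, v, t), so each identity pixel moves with its color pixel.
function noiseOffset(u, v, t) {
  // Any deterministic pseudo-noise works for this sketch.
  return [0.02 * Math.sin(12.9898 * u + t), 0.02 * Math.cos(78.233 * v + t)];
}

function displacedSample(u, v, t) {
  const [du, dv] = noiseOffset(u, v, t);
  return [u + du, v + dv];
}
```

Because the function is pure, the coordinate the color pass reads is exactly the coordinate the typeMap pass reads, frame after frame.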


Metallic Etching — scanAndMarkDarkPoints

JS metallic.js — Bug Bite Etching Simulation

Simulates the traditional printmaking "bug bite" etching technique: the system scans dark pixels on the canvas, selects 10 target points via weighted random sampling, and generates procedural organic shapes at each location.
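
The weighted random selection can be sketched as cumulative-weight sampling over pixel darkness. This is an illustrative stand-in for scanAndMarkDarkPoints, with hypothetical names; passing in a seeded rand function is what makes the selection reproducible during playback:

```javascript
// Hypothetical sketch of weighted dark-point selection.
// `pixels` holds per-pixel brightness in [0, 1]; darker pixels
// (lower brightness) get proportionally higher selection weight.
function pickDarkPoints(pixels, w, count, rand) {
  // Build cumulative weights, where weight = darkness = 1 - brightness.
  const cumulative = [];
  let total = 0;
  for (let i = 0; i < pixels.length; i++) {
    total += 1 - pixels[i];
    cumulative.push(total);
  }
  const points = [];
  for (let n = 0; n < count; n++) {
    const target = rand() * total;
    // Find the first pixel whose cumulative weight passes the target.
    let i = 0;
    while (cumulative[i] < target) i++;
    points.push({ x: i % w, y: Math.floor(i / w) });
  }
  return points;
}
```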

Trigger Modes

  • GLOBAL: Scan entire canvas
  • EACH: Scan each new stroke
  • RANDOM: Random trigger
  • EACHR: Per-stroke scan + random intensity

Shape Type  Name           Form
C           Circle         Circular perforation
RE          Rectangle      Rectangular cutout
LE          Lightning      Small jagged cracks (1.3× enlarged during playback for visibility)
BE          Big Lightning  Large branching fracture patterns

Material Presets

6 metallic materials: Gold / Silver / Copper / Rose Gold / Black Iron / Diamond. Rendered via metallic.frag shader for material luster and color.


Recording & Playback — Deterministic Replay

JS recording.js — Event Recording

A complete event recording system encodes every mouse position, timestamp, and brush parameter into JSON. Combined with crandom (js/crandom.js), a custom seeded-random wrapper that counts every random() call, this achieves near-100% playback consistency.
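
A counting random wrapper in the spirit of crandom can be sketched as follows. Note the actual js/crandom.js wraps p5.js random(); this self-contained version substitutes a mulberry32 PRNG so the sketch runs anywhere, and the names are illustrative:

```javascript
// Sketch of a counting seeded-random wrapper in the spirit of crandom
// (stand-alone; uses mulberry32 instead of wrapping p5.js random()).
function createCRandom(seed) {
  let state = seed >>> 0;
  let randomCount = 0;
  return {
    random() {
      randomCount++; // every call is counted for playback verification
      // mulberry32: small, fast, deterministic 32-bit PRNG.
      state = (state + 0x6d2b79f5) | 0;
      let t = Math.imul(state ^ (state >>> 15), 1 | state);
      t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
      return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
    },
    count() { return randomCount; },
  };
}
```

During playback, comparing count() against the recorded call count detects any divergence in the PRNG sequence the moment it happens.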

Recording Features

  • Event-driven JSON capture (35+ fields per stroke)
  • Automatic pause compression (skips idle time between strokes)
  • Bug bite targetPoints recorded directly in event data, eliminating pixel-scan indeterminacy during playback
  • Playback path preserves identical distance checks (minDistance), ensuring consistent bite counts

Playback Features

  • Full stroke reconstruction (preserving original physics parameters)
  • Auto-loop playback (configurable wait time)
  • Playback offset (X/Y translation)
  • Progress bar with percentage display
  • JSON import/paste support (Agent JSON Panel)

Consistency Guarantee: crandom System

crandom (js/crandom.js) wraps p5.js random(), counting each call. During playback, randomCount comparison verifies PRNG sequence alignment.

Key design: shapeSeed = floor(target.x × 1000 + target.y × 333 + shapeSeedRand) — shapes are determined by target position. generateOrganicShape internally calls randomSeed(seed) to reset PRNG, ensuring identical seeds produce identical shapes.
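
The seed formula from the text, as a runnable one-liner (shapeSeedRand stands in for the recorded random offset):

```javascript
// Position-derived shape seed: identical target positions (plus the same
// recorded random offset) always yield the same seed, hence the same shape.
function shapeSeed(target, shapeSeedRand) {
  return Math.floor(target.x * 1000 + target.y * 333 + shapeSeedRand);
}
```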


Dual Mode — Artist vs Collector

Artist Mode

Full three-panel UI:

  • Art System Log (📋): Recording, playback, toggle switches (Paper / Grid / Camera / Loop / Distort / fBM / RS / Cellular / WhiteDot / Grain / Path / Fit)
  • Brush Control (🖌️): Brush modes 1-7, ink effects 0-5, size, rotation, blend mode, 35-color palette
  • Effect Control (✨): Bug bite trigger mode, metallic materials, flow effect buttons + intensity slider

All panels are draggable with positions saved to localStorage. Clicking any panel brings it to front.

Collector Mode

A pure dynamic exhibition device — all UI hidden, auto-loads artwork JSON and loops playback.

  • Auto-loads /lib/demo.json
  • URL hash switches artworks: #1, #2, #3 → loads /lib/1.json, /lib/2.json, /lib/3.json
  • Canvas size synced from JSON
  • Auto-loop enabled

URL Parameter System

Format: ?_panel1:value_panel2:value_...

?_camera:0_rs:0_paper:0_console:0_grid:1_path:0 // Custom toggle defaults
?_artist:1 // Force artist mode
#1, #2, #3 // Collector mode: load specific artwork
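
A parser for this underscore-delimited format can be sketched in a few lines. This is a hypothetical implementation, not the app's actual parsing code:

```javascript
// Hypothetical parser for the ?_key:value_key:value_... URL format above.
function parseParams(search) {
  const params = {};
  // Strip the leading "?" and split on "_"; each token is "key:value".
  for (const token of search.replace(/^\?/, "").split("_")) {
    if (!token) continue; // the leading "_" produces an empty first token
    const [key, value] = token.split(":");
    if (key && value !== undefined) params[key] = value;
  }
  return params;
}
```

The underscore delimiter is what lets multiple toggles pack into a single query string without "&", which keeps exhibition URLs short and copy-paste friendly.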

Pipeline Overview

Imagine painting in art class:

Step 1 — You draw a sketch on white paper with a pencil (darker = more pressure).

Step 2 — A "hairdryer" step lets the ink diffuse for a watercolor feel.

Step 3 — The sketch is encoded with a special scheme and stored in a "vault" (finalBuffer).

Step 4 — To show the piece, the system decodes from the vault and composites onto textured paper.

Step 5 — Final effects (distortion, grain) are applied.

Below is each pipeline stage with its shader/buffer and the corresponding tutorial:

Stage           Shader / Buffer     What It Does                                                             Tutorial
① Draft         newBufferBlack      Brush leaves grayscale marks on white paper; depth = pressure            Color Journey →
② Diffusion     feedback.frag       Each frame reads the force map, lets ink flow and spread naturally       Ink Effects →
③ Encode        encode.frag         Translates grayscale draft into color encoding; stores in finalBuffer    Color Journey →
③b Identity     typeMapEncode.frag  Simultaneously writes brush type to a separate typeMapBuffer             Color Journey →
④ Composite     composite.frag      Checks identity → selects blend mode → overlays on paper texture         Blend & Flow →
⑤ Preview       realtime.frag       Real-time color preview while drawing                                    Color Journey →
⑥ Post Effects  distort.frag        FBM distortion / ripples / cellular / white dots / grain                 Effects Workshop →

Full Pipeline Diagram

From mouse press to screen output, the complete flow:

Mouse / Touch Input: position, pressure, speed
1. Brush draws on newBufferBlack: grayscale draft, depth = pressure
2. feedback.frag (ink diffusion): each frame, read force field → push ink → add texture
3. encode.frag (encode to color): White → pure gray (R=G=B) / Color → mixed / Black → desaturated
3b. typeMapEncode.frag (write identity): R = brush type, G = white max opacity → typeMapBuffer
4. Store in finalBuffer + typeMapBuffer: color and identity stored separately, no interference
5. composite.frag (check ID + decode + paper overlay): reads typeMapBuffer; White → Screen / Color → Multiply / Black → Darken
6. realtime.frag (overlay current stroke): only active while drawing, provides live preview
7. distort.frag (add effects → screen): distortion / cellular / white dots / grain (toggleable)

Buffer Reference Table

The application uses many offscreen buffers, each serving a different purpose — think of them as different desks in an art studio, each holding different materials:

Buffer Name                 Purpose                                                 Analogy
newBufferBlack              Current stroke grayscale draft                          White paper in front of you
pingPongBuffer              Feedback diffusion temp (dedicated)                     Tray next to the hairdryer
finalBuffer                 All strokes in encoded form                             Vault (codebook)
typeMapBuffer               Per-pixel brush identity (R = type, G = white opacity)  ID card book
oldBuffer                   Grayscale accumulation of all strokes                   Backup photocopy
screenBuffer                Composite + real-time preview workspace                 Display desk (erasable)
realtimeIntermediateBuffer  Composite → realtime copy (avoids same-FBO read/write)  Display desk photocopy
paperTextureBuffer          Textured background paper                               Fine art paper
lastStrokeBuffer            Snapshot of last stroke (for flow effect)               Photo of last stroke
img (forceMap)              Force map driving diffusion direction                   Wind direction chart
finalOut                    Final output with effects applied                       Framed artwork
cursorBuffer                Cursor and path preview (migrated to framebuffer)       Glass overlay markings

Technical Note: createFramebuffer

All shader-processed buffers (plus cursorBuffer) use p5.js createFramebuffer(), sharing the main canvas WebGL context. This replaced createGraphics(WEBGL) which created separate GL contexts. Safari has strict GL context limits — using framebuffers brought performance from ~10 FPS back to smooth.

The original 16 WebGL contexts have been reduced to 3 (main canvas + 2 UI/debug overlays). Only P2D blur buffers and setup-time textures still use createGraphics.


Tutorial Series — Deep Dive

The chapters above provide feature summaries and a pipeline overview. Below are the full tutorial articles for each topic:

InkField — Technical Documentation
Reorganized: 2026-03-10 (Hub architecture with deduplicated content)