When AI Learns to Hold a Brush

Teaching machines the structure of a stroke, so they can write the script of a painting

Contents

  1. Agent learning journey (for other AIs, incl. downloadable .js / .json)
  2. Why JSON for AI? Text format
  3. Full parameter reference
  4. Seven brush modes (1–7)
  5. Color, blend, ink quick reference
  6. Flow and procedural params
  7. AI-friendly workflow: prompt to playback (incl. gen-pollock.js)

⚠️ For LLM / AI Agent: avoid these errors when generating or validating

The following issues have repeatedly made generated output unusable. Follow the guidelines below strictly when generating JSON or helping debug it.


⏱️ Stroke disappearing: leave enough time between strokes

Symptom: Some strokes vanish right after they are drawn during playback.

Cause: After each stroke ends (mr), the system needs about maxUpdates frames to finish “pen-up decay” (force from 1.0 to 0.0). If the next stroke’s mp (mousePressed) fires within that window, the previous stroke’s pen-up is cut off and the stroke disappears.

When generating, follow this rule: leave at least ~500ms between the t of an mr and the t of the next mp (or scale with strokeData.maxUpdates: gap ≥ maxUpdates × ~16ms per frame). When a script generates multiple strokes, widen the gap between strokes so pen-up decay is never cut off.
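The rule above can be reduced to a tiny helper. This is a sketch under the stated assumptions (~16ms per frame at 60fps, 500ms floor); minStrokeGapMs is a hypothetical name, not part of InkField:

```javascript
// Sketch: derive the minimum safe gap between an mr and the next mp
// from strokeData.maxUpdates, assuming ~16ms per frame at 60fps.
const FRAME_MS = 16;

function minStrokeGapMs(maxUpdates) {
  // Never go below the ~500ms floor recommended above.
  return Math.max(500, maxUpdates * FRAME_MS);
}
```

With the default maxUpdates of 30, the 500ms floor dominates; larger values push the gap out proportionally.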

0. Agent learning journey: from zero to expressionist artwork

This section documents one AI Agent’s full path from zero knowledge of InkField, through reverse‑engineering the JSON format, to generating abstract expressionist work. It is meant to help other Agents pick up the core concepts quickly and avoid repeating the same pitfalls.

📥 Downloads

agent-generator-logic.js — Interleaved expressionist JSON generator v5 (Node.js or browser)
agent-expressionist-v5.json — Example output, load in Playground to replay

Phase 1: Understand the system — read first

Always start by reading: Don’t rush to code. Fully understand the system: read the official tutorial, analyze gen-pollock.js, inspect human-recorded JSON.

Key insight: The tutorial tells you “which fields exist”; human recordings show “how to use them naturally.” Both matter.

Phase 2: Reverse engineer — learn from recordings

Stroke density: Each stroke needs 50–80 md events for visible lines. Under 30 is too sparse.

Time gaps: 650–950ms between strokes. maxUpdates controls ink decay (~30 frames × 16ms); gaps must exceed that.

brushMode: 1=Standard, 2=Marker, 3=Gothic, 4=Pen, 5=Spray, 6=Fly, 7=Special

Flow effect: Not part of strokeData; it’s a separate event "m": "flow" with start/end pair sharing the same flowSeed.
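A Flow pair can be sketched as a small builder. Field names follow the Console generator example later on this page; the 800ms start/end spacing and the full-canvas strokeBounds are illustrative values, not requirements:

```javascript
// Sketch: build a Flow start/end event pair sharing one flowSeed.
// The "start" event carries the influence area and strength; the "end"
// event carries the iteration settings.
function makeFlowPair(t, blendType, seed) {
  return [
    { m: "flow", t: t, action: "start", blendType: blendType, flowSeed: seed,
      strokeBounds: { minX: 0, minY: 0, maxX: 1, maxY: 1 },   // normalized 0–1
      strength: 100, lastStrokeOnly: false },
    { m: "flow", t: t + 800, action: "end", blendType: blendType, flowSeed: seed,
      iterations: 10, totalFrames: 30 }
  ];
}
```

Both events must carry the same flowSeed so playback can match the end to its start.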

Phase 3: Iteration — five versions

v1 geometric (failed): Correct format is only the baseline; good art needs “stroke as gesture.”

v2 vertical expressionist: Math “random” ≠ visual “even”; use slot-based layout.

v3 add Flow: Putting all Flow at the end feels unnatural.

v4 fix Flow timing: Timing is music; “listen” to the rhythm of real recordings.

v5 interleaved (final): Sequential → interleaved: draw a few strokes → Flow → draw more → Flow… Each Flow affects only what’s there so far.

Phase 4: Core algorithm design

Stroke path: Vertical strokes use sin wave + noise for hand wobble, tiltShift for arm angle.

Composition: Slot-based layout with Gaussian jitter inside each slot.

strokeBounds: Flow’s influence area, normalized 0–1, computed from drawn strokes.

Phase 5: Run and debug — generate in the browser

JSON is often >50KB; can’t paste into a text field. Solution: port the Node script to browser JS and call window.loadRecordingFromText(JSON.stringify(recording)).

| Issue | Cause | Fix |
| --- | --- | --- |
| Strokes invisible | Too few md (<20) | Use 55–80 md per stroke |
| Composition biased | Random distribution bias | Gaussian center + slot system |
| Flow no effect | Duration <1s | Min 1200ms |
| JSON too big to inject | 52–74KB over limit | Generate in browser |

Phase 6: Advice for other Agents

  1. Analyze before generating: Inspect real recordings for the target style first.
  2. Interleave > sequential: Flow and strokes should alternate, not all Flow at the end.
  3. Time is a hidden dimension: Event gaps directly affect ink diffusion; don’t set them arbitrarily.
  4. Physics simulation pays off: sin + noise + tilt beats pure random by far.
  5. Iterate: Don’t aim for a perfect generator in one pass; solve one problem per version.
  6. Generate in browser: When JSON is too big, running the logic in the browser is most reliable.

* Based on a 2026-03-03 development session; 5 iterations from zero to working output.

1. JSON from an AI perspective: why text format?

If you want to teach a robot to draw, the best way is to tell it in a language it understands: “put a brush of this size here, then move the mouse there.” JSON is that language—a plain-text instruction set that an LLM can produce token by token, without binary formats or black magic.

InkField’s brush system has two parts:

1️⃣ Recording phase

When the user draws in the UI, the system records every stroke’s full parameters as JSON—color, size, physics, effects, 60+ parameters.

2️⃣ Playback phase

Given a JSON file, the system can reproduce those strokes exactly—not a screenshot or video, but the same algorithm “drawing” again for full consistency.

So if you want an LLM or AI agent to create art, you only need to teach it how to generate valid JSON.

Why JSON?

  • Easy to generate: LLMs can emit text token by token; no binary serialization
  • Easy to debug: Humans can read and tweak parameters easily
  • Easy to validate: Standard JSON validators ensure correct format
  • Easy to combine: Multiple strokes go in a single array
  • Easy to archive: Plain text, any system can read it

Full example (top-level required fields, one stroke’s strokeData, and optional fields):

{
  "version": "1.0",
  "startTime": 0,
  "randomSeed": 1234567890,
  "initialPathToggle": false,
  "initialWhiteBrushMode": false,
  "initialBrushColorMode": 0,
  "canvasSize": { "width": 500, "height": 500 },
  "canvasBackgroundColor": [222, 212, 195],
  "events": [
    { "m": "mp", "t": 0, "x": 76, "y": 255, "strokeData": { ... } },
    { "m": "md", "t": 36, "x": 68, "y": 245 },
    { "m": "mr", "t": 624, "x": 392, "y": 254 }
  ]
}

See tech/examples/ai-json-step2a.json for a complete single-stroke example with full strokeData.

Optional top-level fields (newer recordings may include)

LLM-generated JSON only needs the 8 core fields above. Newer recordings may also include:

| Field | Type | Description |
| --- | --- | --- |
| strokes | array | Alternate stroke array; usually [] |
| timeOffset | number | Time offset; usually 0 |
| initialEffectControl | object | Initial effect settings: shapeType(0), metallicStrength(85), metallicFlow(200), metallicTint([r,g,b]), metallicTintType("copper") |
| savedAt | string | ISO 8601 timestamp, e.g. "2026-03-01T13:21:37.067Z" |

Three event types

| Event | Code | Meaning | Required fields |
| --- | --- | --- | --- |
| Mouse Press | "mp" | Start new stroke | m, t, x, y, strokeData |
| Mouse Drag | "md" | Move and keep drawing | m, t, x, y |
| Mouse Release | "mr" | End stroke | m, t (x, y optional) |

One full draw sequence: press → drag (one or more) → release. Each "mp" event carries the full strokeData that defines that stroke’s visual behaviour.

⚠️ Common AI mistakes — event format pitfalls

| ❌ Wrong | ✅ Correct | Note |
| --- | --- | --- |
| "ms" (mouseStart) | No such event; remove it | Only three codes: mp / md / mr |
| "mu" (mouseUp) | Use "mr" | End event is mouseReleased, code mr |
| Putting strokeData in "md" | strokeData only in "mp" | md has only m/t/x/y; no strokeData |
| Putting "mp" last (or in the middle) | mp is always the first event of each stroke | Order: mp → md… → mr |

Each stroke is always:
{ "m":"mp", strokeData… } → { "m":"md" } → … → { "m":"mr" }
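This ordering can be enforced mechanically. A sketch of a stroke assembler (hypothetical helper; the ~16ms step mimics a 60fps drag, and strokeData content is up to the caller):

```javascript
// Minimal stroke assembler following the mp -> md... -> mr rule.
function makeStrokeEvents(points, t0, strokeData) {
  // First event is always mp, and only it carries strokeData.
  const events = [{ m: 'mp', t: t0, x: points[0].x, y: points[0].y, strokeData }];
  let t = t0;
  for (let i = 1; i < points.length; i++) {
    t += 16; // ~60fps drag sampling
    events.push({ m: 'md', t, x: points[i].x, y: points[i].y });
  }
  events.push({ m: 'mr', t: t + 16 }); // x/y are optional on mr
  return events;
}
```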

2. Full parameter reference

The strokeData object has 60+ parameters in 12 categories. Full reference below.

📏 Size parameters

| Parameter | Type | Range | Description |
| --- | --- | --- | --- |
| baseBrushSize | float | 0.1–10.0 | Master size scale; all spray sizes are multiplied by this |
| initialSize | float | 2.0–240.0 | Stroke start size; decays per frame |
| spraySize | float | 1.0–100.0 | Spread radius of spray particles |
| randStep | float | fixed 0.05 | Per-frame size decay |

🎨 Color parameters

| Parameter | Type | Values | Description |
| --- | --- | --- | --- |
| brushColorMode | int | 0–35 | Key: color ID. 0=black, 1=white, 6=orange, 9=blue, 30=red, etc. |
| colorIndex | int | 0–35 | Color variation index; independent of brushColorMode; fine random variation (0–3 typical) |
| hueShift | float | −0.05–0.05 | Hue tweak (typical ~−0.02–0.02) |
| satShift | float | −0.05–0.05 | Saturation tweak (typical 0–0.04) |
| briShift | float | −0.05–0.05 | Brightness tweak (typical 0–0.04) |
| whiteMaxOpacity | float | 0.7–1.0 | Max brush opacity (typical 0.78–0.95) |
| whiteBrushMode | bool | false/true | Enable white-brush render path; false for normal strokes |

⚙️ Brush mode & shape

| Parameter | Type | Range | Description |
| --- | --- | --- | --- |
| brushMode | int | 1–7 | Brush type: 1=standard, 2=marker, 3=Gothic, 4=pen, 5=spray, 6=fly, 7=special |
| shapeType | int | 0–3 | Spray particle shape: 0=circle, 1=ellipse, 2=triangle, 3=diamond |
| brushModeSP | bool | 0/1 | Special mode flag (Mode 7) |

🎬 Physics & motion

| Parameter | Type | Value | Description |
| --- | --- | --- | --- |
| spring | float | 0.3–0.6 | Spring coefficient (brush response speed) |
| friction | float | 0.5 | Damping (speed decay) |
| step | int | 10–15 | Interpolation steps between mouse samples (higher = smoother) |
| step2 | int | 1–10 | Spray particle iteration count |
| expectedStrokeLength | int | 100–400 | Expected stroke frame count (for fade in/out) |

✨ Effect parameters

| Parameter | Type | Range | Description |
| --- | --- | --- | --- |
| keyBlendMode | int | 0–2 | Blend: 0=Mix(linear), 1=Multiply, 2=Darken |
| useSharpen | float | 0.0–5.5 | Ink effect: 0=diffusion, 1=edge, 2=sharp, 3=watercolor, 4=texture, 5=directional |
| indiffusionStrength | float | 0.0–1.0 | Ink diffusion strength |
| pathRotation | float | 0–25 | Stroke direction twist: 0=none, 7=subtle, 17=wild |

🖌️ Brush direction & paint

Despite the fly-oriented names, these are required for all brushMode values (1–5 and 7), not just Mode 6:

| Parameter | Type | Value | Description |
| --- | --- | --- | --- |
| targetflyBrushType | int | 0–3 | Branch brush type (Mode 6 main; others use 0–2) |
| targetmainStrokeDir | int | 0–3 | Main stroke direction (0=default, 1–3=preferences) |
| brushDir | int | 0–3 | Actual brush motion direction |
| ctlNoise | int | 0/1 | Control noise switch; usually 1 |
| brushPaintCtlNoisebyFrame | int | 0/1 | Per-frame control noise; usually 1 |
| brushPaintInterpolationOffset | int | −1, 1, 2 | Interpolation offset (−1=reverse, 1=normal, 2=smoother) |
| brushPaintOldRInitial | float | 0 or 0.5 | Initial old radius (0=clean start, 0.5=residual) |
| explodeStart | int | 0/1 | Stroke start burst (0=off, 1=on) |
| explodeEnd | int | 0/1 | Stroke end burst (0=off, 1=on) |
| effect3Brightness | float | 0.5–1.0 | Render brightness (typical 0.57–0.90 per stroke) |

🌊 Flow & procedural (force map)

Flow needs a forceMapParams object: 4 random seeds, 3 scales, 3 amplitudes, 3 phases, 2 vortex scales, 2 cluster scales. Omit or use reference values if not using Flow.

🔑 System & seed parameters

Filled automatically during recording; for AI generation use any reasonable value:

| Parameter | Type | Suggested | Description |
| --- | --- | --- | --- |
| strokeSeed | int | any positive integer | Per-stroke random seed; affects particle distribution |
| mouseCountStart | int | 0 for 1st stroke, then cumulative | Global event count at start of this stroke = sum of (mp+md) of all previous strokes |
| drawingSeed | int | e.g. 1000000–9999999 | Per-stroke render seed |
| mouseX / mouseY | float | same as mp.x / mp.y | Redundant; match first point of stroke |
| phasorVel | float | 1 | Phasor velocity; set 1 |
| maxUpdates | int | 30 | Max draw iterations per frame; set 30 |
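The mouseCountStart rule is the one value you cannot pick freely: it must accumulate across strokes. A sketch of the bookkeeping (assumed input shape, one `{ mdCount }` entry per stroke, purely for illustration):

```javascript
// mouseCountStart bookkeeping: stroke N starts at the sum of
// (1 mp + md count) over all previous strokes.
function assignMouseCountStart(strokes) {
  let counter = 0;
  return strokes.map(s => {
    const start = counter;
    counter += 1 + s.mdCount; // 1 for the "mp", plus all "md" events
    return { ...s, mouseCountStart: start };
  });
}
```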
3. Seven brush modes (Brush Mode 1–7)

Each mode has distinct look and recommended parameter sets. Quick reference:

Mode 1: Standard brush

Character: Natural ink diffusion, soft edges.

Suggested: baseBrushSize: 2.0, initialSize: crandom(20,24)×baseBrushSize, spraySize: 3×baseBrushSize, spring: 0.6, friction: 0.5, step: 15, step2: 5, maxUpdates: 30.

Mode 2: Marker

Character: Dry marker, angular.

Suggested: baseBrushSize: 1.5, spraySize: 1×baseBrushSize, spring: 0.3, step: 10, step2: 10, maxUpdates: 10.

Mode 3: Gothic (particle)

Character: Point-like particles, physics decay.

Suggested: baseBrushSize: 2.5, initialSize: crandom(2,4)×baseBrushSize, spraySize: 10×baseBrushSize, spraySteps: 3.

Mode 4: Pen (precise)

Character: Perlin noise weight, precise lines.

Suggested: baseBrushSize: 1.0, initialSize: crandom(6,9)×baseBrushSize, expectedStrokeLength: 400, penSketchStrokeWeight: 0.8–1.2.

Mode 5: Spray (scatter)

Character: Loose spray, no size decay.

Suggested: baseBrushSize: 3.0, spraySize: 10 (fixed), step2: 1.

Mode 6: Fly (generative)

Character: Branch-like pattern, complex splits.

Suggested: baseBrushSize: 2.0, targetflyBrushType: 0–3, targetmainStrokeDir: 0–3.

Mode 7: Special (Mode 1 variant)

Character: Mode 1 + random branch drop + wide angle variation.

Suggested: Same as Mode 1 but brushModeSP: true.
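The per-mode suggestions above can be collected into a small presets table. A sketch with values copied from this section; `crandom(a, b)` is assumed to mean a uniform random float in [a, b], and only three modes are shown:

```javascript
// Assumed meaning of the tutorial's crandom(a, b): uniform float in [a, b].
function crandom(a, b) { return a + Math.random() * (b - a); }

// Partial strokeData presets mirroring the Mode 1 / 2 / 5 suggestions above.
const MODE_PRESETS = {
  1: () => ({ baseBrushSize: 2.0, initialSize: crandom(20, 24) * 2.0,
              spraySize: 3 * 2.0, spring: 0.6, friction: 0.5,
              step: 15, step2: 5, maxUpdates: 30 }),
  2: () => ({ baseBrushSize: 1.5, spraySize: 1 * 1.5, spring: 0.3,
              step: 10, step2: 10, maxUpdates: 10 }),
  5: () => ({ baseBrushSize: 3.0, spraySize: 10, step2: 1 }), // spraySize fixed
};
```

A generator can merge one of these into its base strokeData before filling in the remaining fields.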

4. Color, blend, ink quick reference

🎨 Main colors (brushColorMode)

Key: brushColorMode must be set to a color ID; NOT 0 or 1 unless you want black or white.

  • 0 = black
  • 1 = white
  • 3 = gray
  • 6 = orange
  • 8 = teal
  • 9 = blue_dark
  • 30 = red
  • other IDs (4–35)

🔀 Blend mode (keyBlendMode)

Important: Blend mode has no visible effect on pure black or white background. Use a mid-tone (e.g. [180,160,140]).

| Mode | Value | Formula | Effect |
| --- | --- | --- | --- |
| Mix | 0 | mix(oldColor, newColor, alpha) | Linear blend; more overlap = more saturated |
| Multiply | 1 | oldColor × adjustedColor | Multiply; darker with overlap |
| Darken | 2 | min(oldColor, newColor) | Take darker; keeps deep tones |
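To make the formulas concrete, here is a per-channel sketch of the three keyBlendMode behaviours. This is an illustration only: channels are assumed to be 0–255, Multiply is shown as a plain normalized product (the real renderer multiplies by an "adjusted" color on its own buffers):

```javascript
// Per-channel sketch of keyBlendMode 0=Mix, 1=Multiply, 2=Darken.
// oldC/newC are 0-255 channel values; alpha is 0-1 (used by Mix only).
function blendChannel(mode, oldC, newC, alpha) {
  switch (mode) {
    case 0: return oldC + (newC - oldC) * alpha; // Mix: linear interpolation
    case 1: return (oldC * newC) / 255;          // Multiply: darkens with overlap
    case 2: return Math.min(oldC, newC);         // Darken: keep the darker value
    default: throw new Error('unknown keyBlendMode ' + mode);
  }
}
```

This also shows why blend modes are invisible on pure black or white backgrounds: with oldC at 0 or 255, Multiply and Darken degenerate to constants.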

✏️ Ink effect (useSharpen)

| Level | Value | Name | Character |
| --- | --- | --- | --- |
| 0 | < 0.5 | Basic Diffusion | Standard ink diffusion, gradient blur |
| 1 | < 1.5 | Ink Edge | Edge detection, darken edges |
| 2 | < 2.5 | Sharp Outline | High-contrast edges, inner sharpening |
| 3 | < 3.5 | Watercolor | Smooth watercolor, texture variation |
| 4 | < 4.5 | Textured Ink | Perlin turbulence, angle jitter |
| 5 | < 5.5 | Directional Flow | Anisotropic diffusion, time noise |
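Since useSharpen is a float and each bucket above is 0.5 wide and centered on an integer level, rounding recovers the level. A sketch of that assumed mapping (helper names are illustrative):

```javascript
// Map a useSharpen float to its ink-effect level, per the 0.5-wide
// thresholds above (e.g. anything < 0.5 is level 0, < 1.5 is level 1, ...).
const SHARPEN_NAMES = ['Basic Diffusion', 'Ink Edge', 'Sharp Outline',
                       'Watercolor', 'Textured Ink', 'Directional Flow'];

function sharpenLevel(useSharpen) {
  return Math.min(Math.max(Math.round(useSharpen), 0), 5);
}
```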
5. Flow field and procedural parameters

Flow is a post-process effect and must be triggered separately during playback. There are 8 blend types:

Flow blendType list

| Type | Name | Effect |
| --- | --- | --- |
| 0 | Basic | Linear Simplex noise displacement |
| 2 | Concentric | Ring ripples outward |
| 3 | Vertical | Vertical flow lines |
| 4 | Horizontal | Horizontal flow lines |
| 5 | Pattern | High-frequency procedural pattern |
| 6 | Rectangle | Axis-aligned rectangle pattern |
| 7 | Vortex | Rotating vortex |
| 8 | Cellular | Voronoi cell texture |

ForceMapParams structure and use

Procedural noise parameters control Flow detail. Common strategies:

  • Random generation: AI can produce random but sensible values for each seed/phase/amplitude.
  • Fixed template: Use a preset “smooth” or “strong” forceMapParams template.
  • Tune: Start from a base template and tweak amplitude and scale for strength.
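The "fixed template" strategy can be sketched as a factory function. Field names and value ranges are copied from the gen-pollock.js script on this page; the "smooth" amplitudes and the tiny deterministic PRNG are assumptions for illustration:

```javascript
// A "smooth" forceMapParams template. Seeds are derived deterministically
// from one input seed so the same seed reproduces the same Flow field.
function smoothForceMap(seed) {
  // Tiny deterministic PRNG in [0, 1) -- illustration only.
  const r = (i) => ((seed * 9301 + 49297 * i) % 233280) / 233280;
  return {
    randomSeed1: 50 + r(1) * 450, randomSeed2: 50 + r(2) * 450,
    randomSeed3: 50 + r(3) * 450, randomSeed4: 50 + r(4) * 450,
    scale1: 0, scale2: 0.01, scale3: 0.01,
    amplitude1: 0.2, amplitude2: 0.2, amplitude3: 0.5, // raise for "strong"
    phase1: r(5) * 6.28, phase2: r(6) * 6.28, phase3: r(7) * 6.28,
    vortexScale1: 0.01, vortexScale2: 0.01,
    clusterScale1: 0, clusterScale2: 0
  };
}
```

To tune, start from this template and scale the amplitude values up or down, as the last bullet suggests.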
6. AI-friendly workflow: from prompt to playback

Giving the AI a drawing task is like writing a recipe: which brush? which color? which size? Then it can produce correct JSON.

Step 1: LLM prompt template

// Template prompt for the LLM

Generate an InkField playback JSON from the user request.

Request: {user_request}
Visual style: Artistic, watercolor
Suggested brush: Mode 1 (standard brush) with useSharpen=3 (watercolor)
Suggested color: brushColorMode=6 (orange) with keyBlendMode=0 (Mix)

Output a complete recording JSON with:
1. One "mp" (mousePressed) event containing full strokeData
2. 3–5 "md" (mouseDragged) events for the path
3. One "mr" (mouseReleased) event

(3–5 md is only enough for a short test stroke; for visible artwork, the density guidance on this page recommends 50–80 md per stroke.)

Step 2a: Full JSON example (from real recording)

Two straight lines, horizontal layout. Full file: tech/examples/ai-json-step2a.json

Two lines (horizontal) standard brush preview

Step 2b: Real recording — five-stroke ink house (full)

Below: five strokes recorded directly from the InkField UI, brushMode: 1 (standard brush), 500×500 canvas. The five strokes: dome arc, shoulder line, wall U-shape, thick bottom line, door arch. Full file: tech/examples/ai-json-house.json

Five-stroke ink house playback

↑ Playback preview (500×500, brushMode 1, five strokes)

Step 2c: Example generator script — for bots / LLMs to learn

The following Node.js script shows how to generate JSON that meets density and timing requirements without hand-writing hundreds of events. You can download it or expand the full source on this page for other bots to copy.

gen-pollock.js playback: 10 Pollock-style strokes

gen-pollock.js output pollock-style.json (10 strokes, 60–80 md per stroke, 600–900ms between strokes)

Run: from the tech/ directory, run node gen-pollock.js.

Expand full script gen-pollock.js (for bots / LLMs to copy)
// Generate a Pollock-style JSON with 10 strokes, each with 60-80 md events
// on a 500x500 canvas, black color, various brush modes and ink effects
const fs = require('fs');
function rand(min, max) { return Math.random() * (max - min) + min; }
function randInt(min, max) { return Math.floor(rand(min, max + 1)); }
function generatePollockCurve(numPoints, canvasW, canvasH) {
  const points = []; let x = rand(50, canvasW - 50), y = rand(50, canvasH - 50);
  let vx = rand(-8, 8), vy = rand(-8, 8); const curvature = rand(0.02, 0.15), speed = rand(2, 6);
  for (let i = 0; i < numPoints; i++) {
    points.push({ x: Math.round(x), y: Math.round(y) });
    vx += rand(-curvature * 30, curvature * 30); vy += rand(-curvature * 30, curvature * 30);
    const maxV = speed * 3; vx = Math.max(-maxV, Math.min(maxV, vx)); vy = Math.max(-maxV, Math.min(maxV, vy));
    x += vx; y += vy;
    if (x < 20) { x = 20; vx = Math.abs(vx) * 0.8; }
    if (x > canvasW - 20) { x = canvasW - 20; vx = -Math.abs(vx) * 0.8; }
    if (y < 20) { y = 20; vy = Math.abs(vy) * 0.8; }
    if (y > canvasH - 20) { y = canvasH - 20; vy = -Math.abs(vy) * 0.8; }
  }
  return points;
}
function generateStrokeData(mouseCountStart, startX, startY, numMdEvents) {
  const safeBrushModes = [1], safeUseSharpen = [2, 3];
  return {
    strokeSeed: randInt(10000000, 999999999), mouseCountStart, colorIndex: randInt(0, 5), shapeType: randInt(0, 3),
    useSharpen: safeUseSharpen[randInt(0, safeUseSharpen.length - 1)], brushMode: safeBrushModes[0],
    indiffusionStrength: 0.45, whiteBrushMode: false, brushColorMode: 0, phasorVel: 1, explodeStart: 0, explodeEnd: 0,
    whiteMaxOpacity: parseFloat(rand(0.5, 0.8).toFixed(2)), hueShift: -0.01, satShift: 0.02, briShift: 0.02,
    targetflyBrushType: 2, targetmainStrokeDir: 0, brushDir: randInt(0, 2), ctlNoise: 1, brushPaintCtlNoisebyFrame: 1,
    brushPaintInterpolationOffset: randInt(2, 3), brushPaintOldRInitial: parseFloat(rand(0, 0.5).toFixed(1)),
    keyBlendMode: 0, initialSize: parseFloat(rand(18, 30).toFixed(2)), spraySize: 3, step: 15, step2: 5, randStep: 0.05, maxUpdates: 30,
    pathRotation: 0, spring: 0.6, friction: 0.5, baseBrushSize: 1, expectedStrokeLength: numMdEvents * 5,
    effect3Brightness: parseFloat(rand(0.5, 0.7).toFixed(2)), mouseX: startX, mouseY: startY, drawingSeed: randInt(1000000, 9999999), brushModeSP: false,
    forceMapParams: { randomSeed1: parseFloat(rand(50, 500).toFixed(1)), randomSeed2: parseFloat(rand(50, 500).toFixed(2)), randomSeed3: parseFloat(rand(50, 500).toFixed(2)), randomSeed4: parseFloat(rand(50, 500).toFixed(2)), scale1: 0, scale2: 0.01, scale3: 0.01, amplitude1: parseFloat(rand(0.1, 0.5).toFixed(2)), amplitude2: parseFloat(rand(0.1, 0.5).toFixed(2)), amplitude3: parseFloat(rand(0.3, 0.9).toFixed(2)), phase1: parseFloat(rand(0, 6.28).toFixed(2)), phase2: parseFloat(rand(0, 6.28).toFixed(2)), phase3: parseFloat(rand(0, 6.28).toFixed(2)), vortexScale1: 0.01, vortexScale2: 0.01, clusterScale1: 0, clusterScale2: 0 }
  };
}
const canvasW = 500, canvasH = 500, numStrokes = 10; const events = []; let time = 0, mouseCountStart = 0;
for (let s = 0; s < numStrokes; s++) {
  const numMd = randInt(60, 80); const curve = generatePollockCurve(numMd + 1, canvasW, canvasH);
  const startX = curve[0].x, startY = curve[0].y;
  time += (s === 0) ? randInt(50, 200) : randInt(600, 900);
  events.push({ m: "mp", t: time, x: startX, y: startY, strokeData: generateStrokeData(mouseCountStart, startX, startY, numMd) });
  for (let i = 1; i <= numMd; i++) { time += randInt(14, 20); events.push({ m: "md", t: time, x: curve[i].x, y: curve[i].y }); }
  time += randInt(10, 30); events.push({ m: "mr", t: time, x: curve[numMd].x, y: curve[numMd].y });
  mouseCountStart += 1 + numMd;
}
const recording = { version: "1.0", startTime: 0, randomSeed: randInt(100000000, 999999999), initialPathToggle: false, initialWhiteBrushMode: false, initialBrushColorMode: 0, canvasSize: { width: canvasW, height: canvasH }, canvasBackgroundColor: [255, 255, 255], events };
const outputPath = __dirname + '/examples/pollock-style.json'; fs.writeFileSync(outputPath, JSON.stringify(recording, null, 2)); console.log('Written to:', outputPath);

Part 3: gen-pollock.js writing logic

How the script builds JSON that meets the player’s requirements, for other bots or LLMs to follow.

3.1 JSON structure (8 required top-level fields)

version, startTime, randomSeed, initialPathToggle, initialWhiteBrushMode, initialBrushColorMode, canvasSize, canvasBackgroundColor.

3.2 Event sequence (three-part per stroke)

mp (mousePressed, with strokeData) → md × 60–80 (mouseDragged) → mr (mouseReleased).

3.3 Path generation (generatePollockCurve)

3.4 strokeData (~35 params + forceMapParams 16)

3.5 Timestamp design

// Between md events: 14–20ms (simulate a 60fps drag)
time += randInt(14, 20);

// Between strokes: ≥600ms (for maxUpdates=30 pen-up decay)
time += (s === 0) ? randInt(50, 200) : randInt(600, 900);

3.6 Why script instead of hand-writing

Step 3: Four ways to play

Method A: Agent paste UI (recommended when file upload is not available)

The main app (index.html) has an Agent JSON paste area:

  1. Go to http://localhost:3001/ (artist mode, or add ?_artist:1)
  2. Find the "Agent JSON Input" textarea (#agent-json-textarea)
  3. Paste the full JSON (no extra quotes; paste the object directly)
  4. Click "▶ Play JSON" (#agent-json-submit)
  5. Check #agent-json-status (e.g. ✓ 2 strokes, 500×500px — playing)

Or call the JS function:

const result = window.loadRecordingFromText(jsonString);
// Returns: { ok: true, strokes: 2, canvasSize: {width:500, height:500} }
// or:      { ok: false, error: "..." }

Method B: Direct JS (when the agent can run JS)

// In page JS context (e.g. Playwright / Puppeteer)
recordingData = /* JSON object */;
startPlayback();

Method C: File upload (manual or agent with filesystem access)

Click the "Load Json" button (#load-recording) and choose a .json file.

Method D: Console in-place generation (best option when JSON is too large)

When JSON exceeds 30KB, pasting into the Agent JSON Input textarea will fail or be truncated. The most reliable approach is to write the generation logic as a self-executing JS (IIFE), run it directly in the browser Console, and inject the result for playback on the spot.

Why this approach?

| Problem | Cause | Method D solution |
| --- | --- | --- |
| Agent JSON Input capacity | Textarea can't handle 50KB+ JSON text | Generate JSON inside the browser; no textarea transfer needed |
| External fetch blocked | Railway-deployed page can't fetch localhost files | All data is generated inside JS; no network requests |
| Pasting in parts is slow | Splitting JSON into chunks requires multiple JS calls | One JS block handles: generate + inject + play |

Steps

  1. On the InkField page, press F12 (or Cmd+Option+J / Ctrl+Shift+J) to open DevTools → Console
  2. Paste the entire "self-executing generator script" below into the Console and press Enter
  3. The script will automatically: generate JSON → inject into textarea → trigger playback
  4. Console will print 🎨 Ready! and the canvas starts animating

Example script structure

// ═══ Self-executing generator (IIFE) — paste into Console ═══
(function() {
  // ── 1. Utility functions ──
  function rn(a, b) { return Math.random() * (b - a) + a; }
  function ri(a, b) { return Math.floor(rn(a, b + 1)); }

  // ── 2. strokeData generator ──
  function mkStroke(sx, sy, mc, opts) {
    return {
      strokeSeed: ri(1e6, 9e6), mouseCountStart: mc,
      colorIndex: ri(0, 3), shapeType: ri(0, 3),
      useSharpen: opts.us || 3, brushMode: opts.bm || 1,
      // ... omitted: full ~35 fields (see strokeData list on this page)
      forceMapParams: { /* 16 fields */ }
    };
  }

  // ── 3. Path generator (smooth curve) ──
  function curve(sx, sy, ex, ey, n, amp, freq) {
    var pts = [];
    for (var i = 0; i <= n; i++) {
      var r = i / n;
      pts.push({
        x: Math.round(sx + (ex - sx) * r + Math.sin(r * Math.PI * freq) * amp),
        y: Math.round(sy + (ey - sy) * r + Math.cos(r * Math.PI * freq) * amp * 0.5)
      });
    }
    return pts;
  }

  // ── 4. Assemble events ──
  var ev = [], t = 0, mc = 0;
  // Define strokes (each with start/end, point count, wave amplitude, brush settings)
  var strokes = [
    { sx: 50, sy: 400, ex: 450, ey: 380, n: 60, wa: 15, wf: 1.5,
      o: { c: 8, sz: 50, bl: 1, us: 3, sp: 8, bs: 2 } },
    // ... more strokes ...
  ];
  strokes.forEach(function(s, i) {
    var pts = curve(s.sx, s.sy, s.ex, s.ey, s.n, s.wa, s.wf);
    t += (i === 0) ? 100 : ri(700, 900);
    // mp → md × N → mr
    ev.push({ m: "mp", t: t, x: pts[0].x, y: pts[0].y,
              strokeData: mkStroke(pts[0].x, pts[0].y, mc, s.o) });
    for (var j = 1; j < pts.length; j++) {
      t += ri(14, 20);
      ev.push({ m: "md", t: t, x: pts[j].x, y: pts[j].y });
    }
    t += ri(10, 30);
    ev.push({ m: "mr", t: t });
    mc += 1 + s.n;
    // ── Interleave Flow effects (optional) ──
    if (i === 2) { // Add Flow after the 3rd stroke
      t += 1200;
      var seed = ri(1e5, 9e5);
      ev.push({ m: "flow", t: t, action: "start", blendType: 0, flowSeed: seed,
                strokeBounds: { minX: 0, minY: 0, maxX: 1, maxY: 1 },
                strength: 100, lastStrokeOnly: false });
      t += 800;
      ev.push({ m: "flow", t: t, action: "end", blendType: 0, flowSeed: seed,
                iterations: 10, totalFrames: 30 });
      t += 800;
    }
  });

  // ── 5. Top-level JSON ──
  var painting = {
    version: "1.0", startTime: 0, randomSeed: ri(1e5, 9e5),
    initialPathToggle: false, initialWhiteBrushMode: false, initialBrushColorMode: 0,
    canvasSize: { width: 500, height: 500 },
    canvasBackgroundColor: [222, 212, 195],
    events: ev, strokes: [], timeOffset: 0
  };

  // ── 6. Inject and play ──
  var jsonStr = JSON.stringify(painting);
  var ta = document.getElementById('agent-json-textarea');
  ta.value = jsonStr;
  ta.dispatchEvent(new Event('input', { bubbles: true }));
  document.getElementById('agent-json-submit').click();
  console.log('🎨 Ready! ' + ev.length + ' events, ' + jsonStr.length + ' chars');
})();

Or use loadRecordingFromText (more direct)

// If you don't need the textarea, call the built-in function directly:
var result = window.loadRecordingFromText(JSON.stringify(painting));
console.log(result);
// { ok: true, strokes: 10, canvasSize: {width:500, height:500} }

Key considerations

  • IIFE wrapper: Wrap the entire code in (function(){ ... })(); to avoid polluting global scope
  • Stroke density: At least 50–80 md events per stroke, or lines will be too short to see
  • Stroke spacing: At least 700ms between strokes to let the ink diffusion animation complete
  • Flow spacing: At least 1200ms before Flow start, to ensure previous strokes are encoded into finalBuffer
  • Coordinate range: All x/y must be within 10–490 (leave margin for 500×500 canvas)
  • mouseCountStart accumulation: Stroke N's mouseCountStart = sum of (1 + md count) for all previous strokes
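The checklist above can be turned into a pre-flight validator run before injecting the JSON. A sketch; the thresholds (700ms spacing, 50 md minimum, 10–490 coordinates) are copied from this page's guidance and the function name is hypothetical:

```javascript
// Pre-flight validator for a generated recording. Returns a list of
// human-readable problems; an empty list means the checklist passed.
function validateRecording(rec) {
  const errors = [];
  let mdCount = 0, lastMr = null, expectedStart = 0;
  for (const e of rec.events || []) {
    if (e.m === 'mp') {
      if (lastMr !== null && e.t - lastMr < 700)
        errors.push('mp at t=' + e.t + ': less than 700ms after previous mr');
      if (e.strokeData && e.strokeData.mouseCountStart !== expectedStart)
        errors.push('mp at t=' + e.t + ': mouseCountStart should be ' + expectedStart);
      mdCount = 0;
      expectedStart += 1;            // the mp itself counts toward the total
    } else if (e.m === 'md') {
      mdCount += 1;
      expectedStart += 1;
    } else if (e.m === 'mr') {
      if (mdCount < 50)
        errors.push('mr at t=' + e.t + ': only ' + mdCount + ' md events (want 50-80)');
      lastMr = e.t;
    }
    if (typeof e.x === 'number' && (e.x < 10 || e.x > 490 || e.y < 10 || e.y > 490))
      errors.push('event at t=' + e.t + ': x/y outside 10-490');
  }
  return errors;
}
```

Running it in the Console before clicking "Play JSON" catches the density, spacing, coordinate, and mouseCountStart mistakes listed above.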

Full runnable example

Download mountain-mist-v2.js (12 strokes + 3 interleaved Flows, full 500×500 composition), paste into Console to play.

Common errors and debugging

| Error | Cause | Fix |
| --- | --- | --- |
| Strokes are black (wanted color) | brushColorMode: 0 | Use a color ID: 6=orange, 9=blue, 30=red |
| Blend mode has no effect | Background is pure black or white | Use mid-tone background, e.g. [180,160,140] |
| JSON validation fails | Missing required fields or wrong types | Ensure top level has version, startTime, randomSeed, initialPathToggle, initialWhiteBrushMode, initialBrushColorMode, canvasSize, canvasBackgroundColor |
| strokeData missing fields | Missing brushDir, ctlNoise, etc. | Include all ~35 strokeData fields, especially targetflyBrushType, targetmainStrokeDir, brushDir, ctlNoise, brushPaintCtlNoisebyFrame, brushPaintInterpolationOffset, brushPaintOldRInitial |
| Flow effect not showing | Missing forceMapParams or getLastStrokeBounds() | Include full forceMapParams (16 fields) and call getLastStrokeBounds() |

strokeData full field list (quick reference)

When generating JSON, strokeData must include all of the following (~35 fields):

strokeSeed, mouseCountStart, drawingSeed, mouseX, mouseY
brushMode, brushColorMode, colorIndex, whiteBrushMode, brushModeSP
shapeType, baseBrushSize, initialSize, spraySize
spring, friction, step, step2, randStep, maxUpdates, pathRotation, expectedStrokeLength
keyBlendMode, useSharpen, indiffusionStrength, effect3Brightness
hueShift, satShift, briShift, whiteMaxOpacity
phasorVel, explodeStart, explodeEnd
targetflyBrushType, targetmainStrokeDir, brushDir
ctlNoise, brushPaintCtlNoisebyFrame, brushPaintInterpolationOffset, brushPaintOldRInitial
forceMapParams: { randomSeed1~4, scale1~3, amplitude1~3, phase1~3, vortexScale1~2, clusterScale1~2 }

Advanced workflow: layered iteration (paint like playing Go)

The workflow above is "compose the entire piece, then perform it." Here's an alternative: like playing Go, place a few stones, read the board, then decide your next move.

Why iterate?

When generating all strokes at once, you're assuming every stroke executes perfectly. In practice:

  • Ink diffusion (feedback) and Flow effects are unpredictable — actual results deviate from expectations
  • Stroke #3 may already make one area too heavy, but your JSON has strokes #4–10 locked in
  • Compositional balance (density vs. emptiness) can only be judged by looking at the actual canvas

Layered iteration lets you observe before deciding, rather than guessing everything upfront.

Four-round iteration

Not one stroke at a time (too slow) — group by painting layers in 4 rounds:

| Round | Content | Typical params | After submission |
| --- | --- | --- | --- |
| 1. Base | 2–3 broad strokes for foundation | bs=4–5, initialSize=80–103, brushMode=1 | Screenshot → assess coverage & center of gravity |
| 2. Structure | 3–4 medium strokes for composition | bs=2–3, initialSize=30–50, pathRotation=7–15 | Screenshot → assess density balance, where to add |
| 3. Detail | 2–3 fine strokes for refinement | bs=0.25–1, initialSize=2–15, brushMode=4 or 6 | Screenshot → decide if white highlights are needed |
| 4. Finishing | White brush / Flow / or decide to leave as-is | whiteBrushMode=true or Flow blendType=3 | Screenshot → final confirmation |

Evaluation framework between rounds

After each screenshot, answer these four questions before generating the next round:

  1. Center of gravity — Where is the visual weight? Do you need strokes on the opposite side?
  2. Density — Which area is too crowded? Which is too empty? Is the emptiness intentional?
  3. Layer relationship — How does this round relate to the previous one? (overlay, contrast, echo)
  4. Completeness — What's still needed to "finish"? Could the current state already be the best?

Technical notes

  • Incremental JSON is supported — The system accepts multiple loadRecordingFromText() calls; each new JSON continues painting on the existing canvas
  • mouseCountStart must accumulate — Round 2's first stroke mouseCountStart = total (mp + md) events from all Round 1 strokes. You must track this number across rounds
  • No special gap needed between rounds — Just wait for the previous round's playback to finish before submitting. The console will log completion messages
  • randomSeed can differ per round — The top-level JSON randomSeed can use a new value each submission
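The second note, carrying mouseCountStart across rounds, is the easiest one to get wrong. A minimal sketch of cross-round bookkeeping (RoundTracker is a hypothetical helper name):

```javascript
// Track the global (mp + md) event count across iteration rounds, so each
// new stroke's mouseCountStart continues where the previous round ended.
class RoundTracker {
  constructor() { this.total = 0; }
  // Call once per stroke, in order; returns that stroke's mouseCountStart.
  nextStart(mdCount) {
    const start = this.total;
    this.total += 1 + mdCount; // 1 for the mp, plus the md events
    return start;
  }
}
```

One tracker instance lives for the whole painting session; every stroke in every round goes through it.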

When to use one-shot vs. iteration?

| Scenario | Recommendation |
| --- | --- |
| Testing parameters or brush effects | One-shot: quick output, see results fast |
| Clear composition plan (e.g., copying a reference) | One-shot: plan is fixed, no adjustment needed |
| Free creation, exploring style | Iterate: need to feel as you paint |
| Complex composition (>8 strokes + Flow) | Iterate: avoid cumulative drift in later strokes |

← Previous: AI Log   |   Next: Emotion & Intention →

InkField Tutorial Series — Understanding digital ink, explained simply