Claude Skill

Teach your agent to use screencli.

Install this skill so Claude Code, Cursor, or any Claude-powered agent knows how to record browser demos for you automatically.

Install the skill

Add the skill folder to your project's .claude/skills/ directory. Three steps:

1. Create the skill directory

$ mkdir -p .claude/skills/screencli-record/references

2. Download the skill files

# SKILL.md (main skill file)
$ curl -sL https://screencli.sh/skill/SKILL.md \
    -o .claude/skills/screencli-record/SKILL.md

# Reference files
$ curl -sL https://screencli.sh/skill/references/cli-reference.md \
    -o .claude/skills/screencli-record/references/cli-reference.md

$ curl -sL https://screencli.sh/skill/references/effects.md \
    -o .claude/skills/screencli-record/references/effects.md

3. Done. Ask your agent to "record a demo of my app" and it will use screencli.
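If you prefer to script the install, the three steps above collapse into one short snippet. This is a sketch: it assumes the same screencli.sh URLs shown above and warns rather than aborts if a download fails.

```shell
# One-shot install of the screencli skill (run from your project root).
SKILL_DIR=".claude/skills/screencli-record"
mkdir -p "$SKILL_DIR/references"
for f in SKILL.md references/cli-reference.md references/effects.md; do
  # Fetch each skill file; keep going if any single download fails.
  curl -sL "https://screencli.sh/skill/$f" -o "$SKILL_DIR/$f" \
    || echo "warn: could not download $f"
done
```

Re-running it is safe; it simply refreshes the downloaded files.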

What is a Claude skill?

A skill is a structured set of instructions that teaches Claude how to use a specific tool. When you add this skill to your project, Claude automatically knows how to invoke screencli with the right flags and options.
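Concretely, a skill is a markdown file with YAML frontmatter that tells Claude when to activate it. A trimmed sketch of this skill's frontmatter (the full file appears under Skill structure below):

```yaml
---
name: screencli-record
description: |
  Record AI-driven browser demos using the screencli CLI tool.
  Use this skill when the user wants to record a screen demo,
  screencast, or browser recording with an AI agent.
---
# Markdown instructions for the agent follow the frontmatter.
```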

The skill triggers when you say things like:

  • "Record a demo of my app"
  • "Create a screencast of the dashboard"
  • "Use screencli to record a video"
  • "Export this recording for Twitter"
  • "Add a gradient background to the video"

Skill structure

Three files, dropped into your project:

.claude/skills/screencli-record/
  SKILL.md ← main instructions + frontmatter
  references/
    cli-reference.md ← full CLI flags & config
    effects.md ← gradients & effects docs
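Once installed, the layout above can be sanity-checked with a minimal script. This is a sketch; it only verifies that the three files listed above exist.

```shell
# Check that every skill file is where Claude expects it; count any misses.
SKILL_DIR=".claude/skills/screencli-record"
missing=0
for f in SKILL.md references/cli-reference.md references/effects.md; do
  if [ -f "$SKILL_DIR/$f" ]; then
    echo "ok: $f"
  else
    echo "missing: $f"
    missing=$((missing + 1))
  fi
done
```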
SKILL.md
---
name: screencli-record
description: |
  Record AI-driven browser demos using the screencli CLI tool. Use this skill when:
  - The user wants to record a screen demo, screencast, or browser recording with an AI agent
  - The user says "record a demo", "screencast", "create a video", "screen record"
  - The user wants to export or re-encode a screencli recording
  - The user mentions screencli, screencli record, or screencli export
  - The user wants to add gradient backgrounds, effects, or auth to a recording
---

# screencli — AI-Powered Screen Recording

screencli records browser sessions driven by an AI agent and produces polished videos with effects.

## When to Use

Use screencli when the user needs to:

- Create a demo video of a web application
- Record a walkthrough or tutorial driven by AI
- Export recordings for social platforms (YouTube, Twitter, LinkedIn, etc.)
- Add visual polish (gradients, zoom, click highlights) to recordings

## Step-by-Step: Record a Demo

### 1. Install (if not already installed)

npm i -g screencli

### 2. Configure API Key (first time only)

npx screencli init

Running npx screencli record for the first time auto-triggers init if not configured.

### 3. Record

npx screencli record <url> -p "<prompt>" [options]

Required: <url> and -p, --prompt <text>

Common options:

--background <name>    aurora, sunset, ocean, lavender, mint, ember
--padding <n>          Background padding % (default: 8)
--corner-radius <n>    Corner radius px (default: 12)
--no-shadow            Disable drop shadow
--login                Manual login before AI takes over
--auth <name>          Save/load auth state
--no-headless          Show browser window
--viewport <WxH>       Viewport (default: 1920x1080)

### 4. Export (optional)

npx screencli export <dir> --preset <name>

Presets: youtube, twitter, instagram, tiktok, linkedin, github-gif

## Authentication Flow

npx screencli record <url> -p "..." --login --auth myapp   # first time
npx screencli record <url> -p "..." --auth myapp           # reuse

## Troubleshooting

- "Missing ANTHROPIC_API_KEY" — Run npx screencli init
- Browser crashes — Try --no-headless
- Auth expired — Re-run with --login --auth <name>
- FFmpeg errors — Install FFmpeg (brew install ffmpeg)

## Reference

See: references/cli-reference.md · references/effects.md
references/cli-reference.md
# screencli CLI Reference

## Commands

### npx screencli init

Interactive setup wizard. Prompts for Anthropic API key. Saves to ~/.screencli/config.json

### npx screencli record <url> -p "<prompt>" [options]

Record an AI-driven browser demo. All flags:

-p, --prompt <text>    Instructions (required)
--background <name>    Gradient preset
--padding <n>          Padding % (default: 8)
--corner-radius <n>    Corner px (default: 12)
--no-shadow            Disable shadow
--login                Manual login first
--auth <name>          Named auth state
--no-headless          Show browser
--viewport <WxH>       Size (default: 1920x1080)
-m, --model <model>    Claude model
--max-steps <n>        Max steps (default: 50)
--slow-mo <ms>         Action delay
-o, --output <dir>     Output dir (default: ./recordings)
-v, --verbose          Debug logging

### npx screencli export <dir> --preset <name>

Re-encode for a platform. Presets: youtube, twitter, instagram, tiktok, linkedin, github-gif
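For orientation, the config written by npx screencli init lives at ~/.screencli/config.json. Its exact schema isn't documented here, so the key name below (anthropicApiKey) is a hypothetical illustration only; only the file path comes from the reference above.

```json
{
  "anthropicApiKey": "sk-ant-..."
}
```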
references/effects.md
# Gradients & Effects Reference

## Gradient Presets

Six presets rendered via FFmpeg:

aurora     purple / blue / green
sunset     orange / pink / magenta
ocean      deep blue / teal / cyan
lavender   purple / pink / soft blue
mint       green / teal / aqua
ember      red / orange / amber

## Automatic Effects

- Auto-trim — removes idle time
- Zoom — auto-zooms to action area (--no-zoom)
- Click highlights — pulse at clicks (--no-highlight)
- Cursor trail — smooth overlay (--no-cursor)

## Composition Pipeline

1. Raw recording (Playwright WebM)
2. Trim dead time
3. Effects overlay (zoom, highlights, cursor)
4. Background (gradient + corners + shadow)
5. Final output (MP4 or GIF)