Josh Ramirez

Departed Spring 2024

SentenceAI

the keyboard, OS-level · AI agent · Archived

The APUSH demo. Hands are mashing random letters; the screen is producing English.

Departure

LLMs predict the next token — so why does autocomplete still stop at the next word? I wanted one that finished the sentence. The moment that worked, the experiment shifted: not 'AI helps you write,' but 'AI is writing, and the keyboard under your fingers is just playing it back.'

Approach

  • macOS
  • CGEvent tap
  • Gemini API
  • Supabase

Had to look like normal typing — no overlay, no completion to accept, no visible UI. One predicted character per real keystroke, so an observer watching the hands and the screen would see the same cadence.

Field log

  1. Spring 2024 — predict the rest, not the next

    Started as plain LLM-powered autocomplete: instead of one token, predict to the period. Within a day the more interesting framing arrived — undetectable AI typing under your fingers. Same backend, very different artifact.

  2. Day 1 — the event tap

    CGEvent.tapCreate at .cghidEventTap, listening for .keyDown, passing the event back through unmodified. Once it compiled and was granted accessibility, every keystroke in every app — Mail, the browser, the IDE — was routing through a function I'd written. That alone is the experiment.

    The whole leverage point. macOS hands you every keystroke if you ask.

  3. Gemini fills the rest
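A rough sketch of that setup, assuming the plain CoreGraphics API (the `startTap` and `eventMask` names are mine, not from the project; only the `CGEvent.tapCreate` call at `.cghidEventTap` is from the post). It only receives events once the binary is granted Accessibility permission:

```swift
import Foundation

// Pure helper: build the event bitmask from raw type values (keyDown == 10 on macOS).
func eventMask(forRawTypes raws: [UInt32]) -> UInt64 {
    raws.reduce(0) { $0 | (UInt64(1) << UInt64($1)) }
}

#if canImport(CoreGraphics)
import CoreGraphics

// Callback: every keystroke in the system passes through here.
// For now it hands the event back through unmodified.
let passthrough: CGEventTapCallBack = { _, _, event, _ in
    Unmanaged.passUnretained(event)
}

func startTap() -> Bool {
    guard let tap = CGEvent.tapCreate(
        tap: .cghidEventTap,              // earliest point in the HID event stream
        place: .headInsertEventTap,
        options: .defaultTap,             // a listen-and-modify tap, not listen-only
        eventsOfInterest: CGEventMask(eventMask(forRawTypes: [CGEventType.keyDown.rawValue])),
        callback: passthrough,
        userInfo: nil
    ) else {
        return false                      // nil until Accessibility is granted
    }
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    // A real tool would now call CFRunLoopRun(); omitted so the sketch returns.
    return true
}
#endif
```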

    Buffered the trailing 500 characters of typed text and shipped them to Gemini with a one-line ask: finish this sentence, end on a period. First response came back coherent and topical, ~200 characters long, sitting in memory with nowhere to go yet.
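    The context side is a rolling window: keep only the most recent 500 typed characters, and ship them with the one-line ask. A minimal sketch — the `ContextBuffer` type and `prompt(for:)` helper are my names, not the project's:

    ```swift
    import Foundation

    // Rolling buffer that keeps only the most recent `limit` characters typed.
    struct ContextBuffer {
        private(set) var text = ""
        let limit: Int

        init(limit: Int = 500) { self.limit = limit }

        mutating func append(_ ch: Character) {
            text.append(ch)
            if text.count > limit {
                text.removeFirst(text.count - limit)   // trim to the trailing window
            }
        }
    }

    // The request stays a one-line ask, per the post; `prompt(for:)` is
    // what would be shipped to the Gemini API on each refill.
    func prompt(for context: String) -> String {
        "Finish this sentence, end on a period: \(context)"
    }
    ```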

  4. Buffer as typist

    Wired the response into an injectionBuffer. handleKeyEvent now checks the buffer first: if it's not empty, removeFirst, set the in-flight CGEvent's unicodeString to that character, return. The user is mashing keys; the screen is producing English.

    Real keypress in, predicted character out. Same event, swapped payload.
    The sleight of hand fits in a dozen lines.

  5. The APUSH demo
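    Those dozen lines split into a pure core plus one CGEvent call. A sketch under that assumption — `injectionBuffer` and `handleKeyEvent` are names from the post, the rest is mine:

    ```swift
    import Foundation

    // The predicted completion, played back one character per real keystroke.
    var injectionBuffer: [Character] = []

    // Pure core of the swap: pop the next predicted character, if any.
    func nextInjectedCharacter() -> Character? {
        injectionBuffer.isEmpty ? nil : injectionBuffer.removeFirst()
    }

    #if canImport(CoreGraphics)
    import CoreGraphics

    // Real keypress in, predicted character out: same event, swapped payload.
    func handleKeyEvent(_ event: CGEvent) -> CGEvent {
        guard let ch = nextInjectedCharacter() else { return event }  // nothing queued: pass through
        var units = Array(String(ch).utf16)
        event.keyboardSetUnicodeString(stringLength: units.count, unicodeString: &units)
        return event
    }
    #endif
    ```

    Keeping the pop separate from the CGEvent mutation means the queue logic can be exercised without an event tap at all.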

    Opened a Google Doc and typed 'Three APUSH examples of American Imperialism are'. The rest — the Spanish-American War, the annexation of Hawaii, the Open Door Policy in China — appeared under fingers that were now hitting random letters. From across the room you couldn't tell.

  6. Shelved

    Never used it on an assignment, never finished wiring Supabase past a logging stub. The honest version of this is 'undetectable AI typing,' which is exactly what you don't ship. The point was that it worked.

From the gallery

Title card. The robotic arm doing the typing.
The four pieces. Three of them load-bearing.

What I came back with

500-char context

Lesson from the terrain

The leverage point wasn't the model — it was that macOS will hand you every keystroke in the system if you ask. Once the event tap is up, the LLM is just a fancy text source feeding a queue, and the OS does the actual sleight of hand. I left this on the shelf because the cleanest version of the idea is also the version you can't responsibly ship: working well is exactly what made it not a product.

Cross-links