Departed Spring 2024
SentenceAI
the keyboard, OS-level · AI agent · Archived
[image: SentenceAI demo on macOS — Google Doc with the typed prompt 'Three APUSH examples of American Imperialism are' followed by Gemini's predicted continuation 'the Spanish-American War, the annexation of Hawaii, and the Open Door Policy in China. Pretty Cool!' appearing as the user hits random keys on the keyboard below]
Departure
LLMs predict the next token — so why does autocomplete still stop at the next word? I wanted one that finished the sentence. The moment that worked, the experiment shifted: not 'AI helps you write,' but 'AI is writing, and the keyboard under your fingers is just playing it back.'
Approach
- macOS
- CGEvent tap
- Gemini API
- Supabase
It had to look like normal typing: no overlay, no completion to accept, no visible UI. One predicted character per real keystroke, so an observer watching both the hands and the screen would see the same cadence.
Field log
Spring 2024 — predict the rest, not the next
Started as plain LLM-powered autocomplete: instead of one token, predict to the period. Within a day the more interesting framing arrived — undetectable AI typing under your fingers. Same backend, very different artifact.
Day 1 — the event tap
CGEvent.tapCreate at .cghidEventTap, listening for .keyDown, passing the event back through unmodified. Once it compiled and was granted accessibility, every keystroke in every app — Mail, the browser, the IDE — was routing through a function I'd written. That alone is the experiment.
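The tap setup can be sketched in a few lines. This is a minimal reconstruction from the description above, not the project's exact code; the callback name and error handling are illustrative, and it only runs on macOS with Accessibility permission granted.

```swift
import Cocoa

// Listen for keyDown only.
let mask = CGEventMask(1 << CGEventType.keyDown.rawValue)

// Every keystroke in every app passes through here before delivery.
let callback: CGEventTapCallBack = { _, type, event, _ in
    guard type == .keyDown else { return Unmanaged.passUnretained(event) }
    return Unmanaged.passUnretained(event)  // pass through unmodified (for now)
}

guard let tap = CGEvent.tapCreate(
    tap: .cghidEventTap,          // intercept at the HID system, ahead of any app
    place: .headInsertEventTap,
    options: .defaultTap,         // active tap: allowed to modify or swallow events
    eventsOfInterest: mask,
    callback: callback,
    userInfo: nil
) else { fatalError("tap creation failed: check Accessibility permission") }

// Attach the tap's mach port to the run loop so events start flowing.
let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
CGEvent.tapEnable(tap: tap, enable: true)
CFRunLoopRun()
```

Returning the event unchanged is the pass-through baseline; everything later in the log is a change to what that callback returns.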
[image: macOS CGEvent tap diagram — every keystroke routed through a global event tap before reaching the active application, with arrows from the physical keyboard through SentenceAI's handler into apps like Mail, Safari, and Xcode]
The whole leverage point: macOS hands you every keystroke if you ask.
Gemini fills the rest
Buffered the trailing 500 characters of typed text and shipped them to Gemini with a one-line ask: finish this sentence, end on a period. First response came back coherent and topical, ~200 characters long, sitting in memory with nothing to do with it yet.
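The rolling context window is simple to sketch. A minimal version, assuming the 500-character limit described above; ContextBuffer and buildPrompt are illustrative names, and the prompt wording is a plausible reconstruction, not the exact one sent to Gemini.

```swift
import Foundation

// Rolling buffer that keeps only the trailing `limit` characters typed.
struct ContextBuffer {
    private(set) var text = ""
    let limit = 500

    mutating func append(_ ch: Character) {
        text.append(ch)
        if text.count > limit {
            text.removeFirst(text.count - limit)  // drop the oldest characters
        }
    }
}

// The one-line ask shipped alongside the buffer contents.
func buildPrompt(context: String) -> String {
    "Finish this sentence, ending on a period: \(context)"
}
```

The actual request would POST that prompt to the Gemini API's generateContent endpoint; the response text then just sits in memory until something consumes it.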
Buffer as typist
Wired the response into an injectionBuffer. handleKeyEvent now checks the buffer first: if it's not empty, removeFirst, set the in-flight CGEvent's unicodeString to that character, return. The user is mashing keys; the screen is producing English.
[image: Buffer-based typing replay diagram — injectionBuffer holds Gemini's predicted continuation, user keypresses are intercepted by handleKeyEvent and replaced character-by-character with the next buffer entry before the CGEvent reaches the active application]
Real keypress in, predicted character out. Same event, swapped payload. [image: Screenshot of handleKeyEvent in AppDelegate.swift — Swift function with the injectionBuffer.removeFirst branch calling keyboardSetUnicodeString on the CGEvent and returning the modified event]
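The swap described above can be sketched as follows. Names follow the screenshot (injectionBuffer, handleKeyEvent) but the body is reconstructed from the log, not copied from AppDelegate.swift; it assumes the buffer holds Gemini's continuation as individual characters.

```swift
import Cocoa

// Gemini's predicted continuation, consumed one character per real keystroke.
var injectionBuffer: [Character] = []

func handleKeyEvent(_ event: CGEvent) -> Unmanaged<CGEvent>? {
    guard !injectionBuffer.isEmpty else {
        return Unmanaged.passUnretained(event)  // buffer empty: type normally
    }
    // Replace the real keystroke's payload with the next predicted character.
    let next = String(injectionBuffer.removeFirst())
    var units = Array(next.utf16)
    event.keyboardSetUnicodeString(stringLength: units.count, unicodeString: &units)
    return Unmanaged.passUnretained(event)      // same event, swapped payload
}
```

Because the event itself is reused, timing, key-repeat behavior, and the target app's focus all stay exactly as they would for genuine typing.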
The sleight of hand fits in a dozen lines.
The APUSH demo
Opened a Google Doc and typed 'Three APUSH examples of American Imperialism are'. The rest — the Spanish-American War, the annexation of Hawaii, the Open Door Policy in China — appeared under fingers that were now hitting random letters. From across the room you couldn't tell.
Shelved
Never used it on an assignment, never finished wiring Supabase past a logging stub. The honest version of this is 'undetectable AI typing,' which is exactly what you don't ship. The point was that it worked.
From the gallery
[image: SentenceAI title slide — detailed illustration of a futuristic robotic arm extending forward, captioned with the project name in green]
[image: Tech Highlights slide listing the four moving parts of SentenceAI — macOS Global Event Tap (CGEvent.tapCreate), Text Injection & Buffer Management, Google Gemini API integration, and Supabase for optional logging]
What I came back with
500-char context
Lesson from the terrain
The leverage point wasn't the model; it was that macOS will hand you every keystroke in the system if you ask. Once the event tap is up, the LLM is just a fancy text source feeding a queue, and the OS does the actual sleight of hand. I left this on the shelf because the cleanest version of the idea is also the version you can't responsibly ship: the fact that it worked so well is exactly what made it not-a-product.