Frictionless capture — Apple Notes into a central database
Context
For a notes system to actually get used, the capture gesture must be shorter than the thought. Any extra friction (open an app, pick a folder, type a title) loses the idea. Apple Notes is already on the phone, already reachable from the lock screen, already synced. The point isn't to replace Notes; it's to listen to it.
Constraint
Three requirements:
- The user gesture stays native: no custom app, no shortcut to learn.
- Notes land in the same database as every other Life OS signal (health, calendar, projects, music). No silo.
- No calls to a third-party service. A note can contain anything — so everything stays inside the cluster.
Decision
A dedicated folder inside Apple Notes, watched by a script running on the local Mac. On every change the content is pushed to a Flask API hosted on Pi5, which normalises it (title, body, timestamps, hashtag tags) and writes it into the central mln-mariadb on Pi4.
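The normalisation step is small enough to sketch. A minimal version, assuming the API receives the raw note text; the function and field names here are illustrative, not taken from the actual Flask service:

```python
import re
from datetime import datetime, timezone
from typing import Optional

HASHTAG = re.compile(r"#(\w+)")

def normalise_note(raw: str, captured_at: Optional[datetime] = None) -> dict:
    """Split a raw note into title, body, tags, and a capture timestamp.

    First line becomes the title, the rest the body; hashtags anywhere
    in the note become tags (illustrative convention, not the source's).
    """
    lines = raw.strip().splitlines()
    title = lines[0].strip() if lines else ""
    body = "\n".join(lines[1:]).strip()
    tags = sorted(set(HASHTAG.findall(raw)))
    return {
        "title": title,
        "body": body,
        "tags": tags,
        "captured_at": (captured_at or datetime.now(timezone.utc)).isoformat(),
    }
```

One design note: tags are harvested from the whole note, including the title, so a one-line note like `Buy strings #guitar` still lands under its tag.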
On the reading side: a dashboard at lifeos.mylastnight.eu that shows today's notes, notes by tag, and notes that re-surface (full-text search). No editing on the site: editing stays in Notes, the single source.
Measurement
Live since 27 April 2026. Capture-to-DB latency measures around 2 s. No habit change required: when something occurs to me, I type it into Notes as before, and the site updates on its own.
What remains
A proof by use: the best interface is often the one you already have. There was no need to reinvent capture — the need was to listen where it already happens. The pattern ("watch a native tool, write to a unified database") is reusable for many other signals: calendar, photos, screenshots, forwarded mail.
And one Life OS principle confirmed: surface the data to listen to yourself, not to judge yourself. The raw note stays the object; cross-correlation comes later, read-only.