I'm a senior dev. I lost 400 voice ideas to AirPods this year. Here's the iPhone fix.
How working software developers capture code screenshots, doc snippets, architecture ideas, and 'while-coding' voice memos — without breaking flow or trusting a cloud LLM with your private codebase.
If you're a working developer, you have three capture channels broken at the same time. Voice memos that triggered while coding — gone, because AirPods + Siri = "I didn't quite catch that." Code screenshots — buried in your camera roll between food pics and grocery lists. Documentation snippets you swore you'd revisit — scattered across Safari, Arc, Notion, and Raycast notes. Nothing connects.
The flow-state problem
Cal Newport calls it "deep work." Pragmatic developers call it "leave me alone for two hours." Whatever you call it, flow has one enemy: context switches. Every time you stop coding to "let me jot this down in Notion," you're trading 30 seconds of recording for 20-odd minutes of recovery. Gloria Mark's interruption research puts the recovery cost at roughly 23 minutes for knowledge workers (Linda Stone's "continuous partial attention" is the name for the degraded state you're left in). For developers mid-problem, it's probably worse.
The solution isn't "don't take notes." It's "make note-taking sub-3-second." That requires:
- Voice while your hands are on the keyboard. Hold AirPod, speak, release. The transcription happens on-device. You don't break flow.
- Screenshot + auto-OCR. Cmd-Shift-3 on Mac, side-button + volume-up on iPhone. Némos OCRs the code in the screenshot so you can grep your own captures later.
- One-handed quick add. A widget on the home screen that takes a single sentence. No login. No app launch. No sync wait.
If any of those has more than 3 seconds of friction, you won't use it. And then you'll forget the idea. And it will haunt you for six months.
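To make the sub-3-second constraint concrete, here's a toy capture model in TypeScript. The types and names are my own invention for illustration, not Némos's actual schema or API; the point is the shape: every modality lands in one append-only log through a single call, with no title, folder, or filename required at capture time.

```typescript
// Hypothetical capture record: voice, screenshot, and text all land in one shape.
type Capture = {
  id: number;
  kind: "voice" | "screenshot" | "text";
  body: string;        // transcript, OCR'd text, or the typed sentence
  tags: string[];      // inferred later; never required at capture time
  capturedAt: number;  // epoch milliseconds
};

const log: Capture[] = [];

// The entire capture API: one call, zero mandatory metadata.
function capture(kind: Capture["kind"], body: string): Capture {
  const c: Capture = {
    id: log.length + 1,
    kind,
    body,
    tags: [],
    capturedAt: Date.now(),
  };
  log.push(c);
  return c;
}

capture("voice", "what if the cache layer used write-through instead of write-around");
capture("screenshot", "const deferred = useDeferredValue(query)");
```

Anything the tool forces you to decide at this moment (a filename, a folder, a tag) is friction that pushes you past the 3-second budget; classification can happen later, when you're not in flow.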
Why developers don't trust cloud-based note apps
The honest reason: you've read the privacy policies. You know what "may be used to improve our services" means in practice. And you really, *really* don't want a voice memo where you said "the Stripe API key for staging is..." sitting in some vendor's S3 bucket.
The trustworthy version: on-device AI. The model runs on the Neural Engine. Your transcription happens on the phone. The screenshot OCR happens on the phone. Nothing leaves the device unless you explicitly sync via end-to-end encrypted CloudKit. *Is Némos private?* is the architecture doc.
Compare that to Otter.ai, which sends every voice memo to its cloud servers for transcription. Or Notion AI, which routes your snippets through third-party model providers. For a developer working on anything proprietary, that's a hard "no."
The four capture moments developers actually need
Moment 1: Code screenshot while pairing. Your pair shares their screen. They show you a clever pattern. You screenshot it. With Némos, the screenshot gets OCR'd in 200ms — the code in the image becomes searchable text. Six weeks later, you search "useDeferredValue" and the screenshot from that pairing session comes back.
Moment 2: Doc snippet from a Stack Overflow comment. You find the answer. You screenshot the relevant part. You voice-memo your reaction: "this is the right solution but it's gonna leak memory if we don't tear down the listener." Both items get filed together. When you write the actual code three days later, you have both the snippet AND your own commentary.
Moment 3: 3AM idea. You're walking the dog. You think "what if we built the cache layer with a write-through pattern instead of write-around?" You hold the side button, dictate 18 seconds, release. The on-device transcription captures it perfectly. At your desk in the morning, you search "cache write-through" and it's there.
Moment 4: Architecture diagram screenshot. Your tech lead drops an Excalidraw screenshot in Slack. You save it to Némos. OCR catches the labels ("service A," "queue," "DLQ"). Searchable. Linked to the meeting voice memo from the same day. Connected to the GitHub issue you opened afterwards.
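What ties all four moments together is that voice transcripts and OCR'd screenshots land in the same search index. A tiny inverted index approximates the idea (this is a hypothetical keyword-only sketch; the real search would presumably be fuzzier and ranked):

```typescript
type Note = { id: number; source: "voice" | "ocr" | "text"; body: string };

// Build an inverted index: token -> set of note ids, regardless of modality.
function buildIndex(notes: Note[]): Map<string, Set<number>> {
  const index = new Map<string, Set<number>>();
  for (const n of notes) {
    for (const token of n.body.toLowerCase().split(/\W+/).filter(Boolean)) {
      if (!index.has(token)) index.set(token, new Set());
      index.get(token)!.add(n.id);
    }
  }
  return index;
}

// A note matches when it contains every query token: voice and OCR alike.
function search(notes: Note[], index: Map<string, Set<number>>, query: string): Note[] {
  const tokens = query.toLowerCase().split(/\W+/).filter(Boolean);
  return notes.filter(n => tokens.every(t => index.get(t)?.has(n.id) ?? false));
}

const notes: Note[] = [
  { id: 1, source: "ocr",   body: "const deferred = useDeferredValue(query)" },
  { id: 2, source: "voice", body: "cache layer with a write-through pattern" },
];
const index = buildIndex(notes);
```

Because the OCR'd screenshot and the dictated memo are just rows in the same index, "search my captures" means one query across everything you saw and everything you said.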
The "save it for the README" pipeline
Every codebase has documentation debt. Every developer promises themselves they'll write the architecture overview "next sprint." It never happens because the *raw material* is scattered.
The Némos pipeline: every time you make a meaningful design decision, capture a 20-second voice memo + the relevant screenshot. Three months later, you have 40 of those. You ask the on-device AI: "summarize all my notes about the auth service architecture from the last 90 days." You get a 6-bullet outline that becomes the seed of your README.
This isn't AI writing documentation. This is AI organizing your own thoughts, on-device, into a structure you can edit. The IP stays yours. The model never sees the production code. That distinction is the whole game.
Why not Apple Notes, Obsidian, or Logseq?
Three honest answers:
Apple Notes. Fine for grocery lists. Search is keyword-only. No tagging that survives. Voice memos and screenshots live in different silos. And iCloud sync isn't end-to-end encrypted unless you've turned on Advanced Data Protection (and you haven't).
Obsidian. I love Obsidian. I run a vault. It's the wrong tool for *capture* because it requires you to think about file names, folders, and frontmatter — at exactly the moment you're trying not to think. Obsidian is the *output* end of the second brain pipe. Némos is the *input* end. *Top 10 second brain apps for 2026* explains the layered approach.
Logseq. Same as Obsidian — block-based, beautiful, but the friction of "open Logseq, navigate to the journal, type the block" kills the capture moment.
The setup most senior devs are converging on
- Capture: Némos on iPhone + Apple Watch. Voice, screenshot, quick text.
- Daily review: Apple Notes or Drafts for the 90-second "what did I capture worth keeping" pass.
- Long-term: Obsidian vault for synthesized, durable knowledge.
- Code: Your actual repo + a `notes.md` per repo for repo-specific decisions.
The mistake is collapsing all four layers into one tool. Notion-as-everything is the most common version of this mistake. The cure is layering: fast capture at the bottom, durable knowledge at the top.
What about Voice Memos.app?
It's fine for raw audio. It didn't transcribe at all until iOS 18, and even now the transcripts aren't searchable the way developers need. It doesn't connect voice to screenshots or text notes. It's a single-purpose tool in a workflow that needs cross-modal search.
Némos's voice recording is built on the same audio stack as Voice Memos but adds: on-device transcription, full-text search across the transcript, links to related screenshots and notes, and tags inferred by on-device AI.
The Apple Watch capture flow
This is the unfair advantage. Every other note app on the App Store either has no watchOS app or ships a token one. Némos's watch companion lets you raise your wrist, hold the side button, and dictate. The audio is captured on the watch (offline if needed), transcribed once paired, and shows up in your phone library two seconds later.
The use case that wins is the shower idea. The walk-the-dog idea. The 7am gym idea. The "I'm cooking dinner and I just realized why the build is broken" idea. Those are the ideas that get lost because you can't break flow to write them down. The watch fixes that. *Can I take notes on Apple Watch?* is the deeper write-up.
Three concrete moves for this week
- Stop opening Notion for capture. Use it for synthesis at week's end. For capture, use a single-tap tool.
- Audit your last 30 voice memos. How many are useful? How many are findable? If the second number is less than half the first, your capture tool is broken.
- Try the watch flow once a day for a week. It will feel awkward for 3 days. By day 6 it's invisible.
The deeper case
If you want the long version, the *on-device vs cloud AI* piece walks through the technical and ethical reasons developers should care. The *knowledge management system for personal use* piece lays out the "second brain layers" mental model that informs all of the above.
Code-snippet capture from Stack Overflow / GitHub (the workflow nobody admits)
Every working developer has a private library of "code I copied from Stack Overflow that I'm going to need again." It usually lives in a gist.github.com account, a snippets/ directory in dotfiles, a Notion page that hasn't been updated in 18 months, or — most commonly — your shell history.
The Némos pattern: every time you copy a code snippet from Stack Overflow, GitHub, or a friend's gist, screenshot it on your phone (if you're on mobile) or use the share extension (if you're on desktop). Voice memo the context: "useDeferredValue pattern for the search input — solves the jank when typing fast." Tag with the language and the technique. The on-device OCR makes the actual code text-searchable; the voice memo makes the *intent* searchable. Three months later when you need that pattern again, you search "deferred search input" and the snippet plus your reasoning surface together. This is the kind of pattern that prevents you from re-Googling the same Stack Overflow answer for the fifth time.
The compounding effect over a multi-year career is real. By year five you have 300-400 personal code patterns indexed not by "what they do" but by "when you needed them last." Search by language: "TypeScript generic helpers." Search by domain: "WebSocket reconnect logic." Search by year: "stuff from when I was working on the auth refactor." All of it surfaces from your own previous self.
Error-message screenshot indexing
This is one of the highest-ROI capture patterns for developers and almost nobody does it. Every time you encounter a non-obvious error — a stack trace, a deployment failure, a weird Xcode build error, a webpack incantation that finally worked — screenshot it. Voice memo the fix when you find it. Tag with the error keyword.
Six months later when the same error shows up (and it will), you search the error keyword and your own previous fix surfaces. The on-device AI keeps the OCR'd error text searchable. The voice memo explains what worked, what didn't, and why. This single habit has saved an embarrassing number of hours for every senior developer I've talked to about it. It's the difference between "I've seen this before but I can't remember the fix" and "here's the screenshot from when I solved this last March."
The compounding effect is especially valuable for build-system errors, which are notoriously poorly documented and notoriously bad at error messages. EAS Build, webpack, Vite, Xcode, Gradle, CMake — all of them produce errors that are essentially un-Googleable because the surface text is too generic. Your personal library of "errors I have seen and beaten" is more valuable than any cloud-based answer.
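The library itself can be as simple as a keyword-to-capture map. A toy sketch (the structure and field names are my assumptions for illustration, not Némos's storage format): each entry pairs the screenshot's OCR'd trace with the voice-memo fix.

```typescript
// Hypothetical error-library entry: the OCR'd trace plus your dictated fix.
type ErrorEntry = { ocrTrace: string; fixMemo: string; solvedAt: string };

const errorLibrary = new Map<string, ErrorEntry>();

// Capture habit: screenshot the error, memo the fix, tag with a keyword.
function recordFix(keyword: string, ocrTrace: string, fixMemo: string, solvedAt: string): void {
  errorLibrary.set(keyword.toLowerCase(), { ocrTrace, fixMemo, solvedAt });
}

// Six months later: look up your own previous fix by the error keyword.
function previousFix(keyword: string): ErrorEntry | undefined {
  return errorLibrary.get(keyword.toLowerCase());
}

recordFix(
  "EMFILE",
  "Error: EMFILE: too many open files, watch",
  "watchman wasn't installed; brew install watchman fixed the file watcher",
  "2025-03-14"
);
```

The keyword is whatever surface text you'd instinctively search for when the error reappears; the payoff is that the hit is your own solved case, not a forum thread.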
RFC and design-doc capture
Every engineering organization runs on RFCs and design docs. Every developer reads dozens of them per year. By year three you've read several hundred. The ones that taught you something durable — the ones you want to remember the reasoning of — are gone unless you captured them.
The pattern: when you read an internal RFC that changes how you think about a system, screenshot the key sections and voice memo your reaction. "This is the cleanest argument I've seen for event sourcing — Sam's reasoning about why event log + projections is more maintainable than a CRUD model with full audit. Use this framing the next time we're debating data architecture." Tag with the architectural pattern. Two years later when you're writing your own design doc on a similar problem, you search the pattern name and the original RFC plus your reaction both surface. You don't reinvent the argument; you build on it.
The same applies to external technical writing — the LWN articles, the Cloudflare blog posts, the Stripe engineering deep-dives, the Dan Abramov essays. The web is full of one-shot brilliant technical writing that disappears from search after 18 months. Your personal library of "writing that shaped how I think" is irreplaceable.
Voice-note while coding (without breaking focus)
This is the highest-friction capture moment in a developer's day and the one most often skipped. You're in the middle of a hard problem. Something occurs to you — a related bug, a refactor opportunity, a "wait, I should ask Mike about this." If you context-switch to a note app, you lose the problem you were in. If you don't capture it, you lose the thought.
The pattern: hold AirPod stem, dictate one sentence, release. The Apple Intelligence Live Activity captures it on-device. Your hands never leave the keyboard. Your eyes never leave the code. The capture lands in Némos transcribed by the time you decide to look at it 20 minutes later. The cost is approximately 2 seconds of attention. The benefit is not losing the thought.
For developers using bone-conduction headphones (Shokz) or open-ear earbuds (LinkBuds), the same pattern works without the audio-isolation tradeoff. The Apple Watch side-button flow is an alternative for developers who don't wear earbuds while coding.
Slack-screenshot-of-decisions capture
This is the pattern that distinguishes senior developers from junior developers. Every meaningful architectural decision in a modern org happens in Slack first. The decision is recorded as a casual paragraph in a thread, archived after 90 days (by default), and effectively gone for the next person who joins the team and asks "wait, why did we decide to use Postgres instead of DynamoDB for this?"
The capture pattern: when a decision lands in Slack — *especially* an architectural decision — screenshot the relevant thread. Voice memo the surrounding context: "context for this decision — we were debating Postgres vs DynamoDB for the user-profile service. Sam's argument about needing transactions across users + sessions was the deciding factor." Tag with the project and the decision type. Six months later, a new engineer asks why, and you have a 30-second answer with a screenshot of the original thread attached. This is the practical implementation of "ADR" (Architecture Decision Record) without requiring the team to commit to a heavyweight ADR process.
ADR (Architecture Decision Record) workflow
The formal ADR pattern is a markdown file per decision, committed to the repo, with sections for "Context," "Decision," and "Consequences." It's an excellent practice. It is also widely ignored because the friction is too high — nobody opens a new markdown file in the middle of a Slack debate to write an ADR.
The Némos-to-ADR pipeline solves this: capture the Slack thread, voice memo the reasoning, tag with "adr-candidate." At the end of the sprint, you have a list of candidate ADRs. The on-device AI can synthesize each capture into a draft markdown ADR using your own captured reasoning as input. You edit, you commit, you've now got an actual ADR in the repo. The friction drops from "I need to write a whole document" to "I need to refine a draft." Teams that adopt this pattern actually maintain ADRs. Teams that don't have a stack of unwritten "I should write this up" thoughts.
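The synthesis step is mostly templating. A minimal sketch of going from a tagged capture to a draft ADR (the field names and the Nygard-style Context/Decision/Consequences template are my assumptions; in the real pipeline the on-device AI would fill the fields from your captured reasoning):

```typescript
// Hypothetical shape of an "adr-candidate" capture after synthesis.
type AdrCapture = {
  title: string;
  context: string;      // e.g. distilled from the Slack-thread screenshot's OCR
  decision: string;     // e.g. distilled from the voice memo
  consequences: string; // e.g. distilled from the closing memo
};

// Render a numbered draft ADR in the standard markdown layout.
function draftAdr(n: number, c: AdrCapture): string {
  return [
    `# ADR-${String(n).padStart(4, "0")}: ${c.title}`,
    "",
    "## Context",
    c.context,
    "",
    "## Decision",
    c.decision,
    "",
    "## Consequences",
    c.consequences,
  ].join("\n");
}

const draft = draftAdr(7, {
  title: "Postgres over DynamoDB for user profiles",
  context: "We need transactions spanning users and sessions.",
  decision: "Use Postgres for the user-profile service.",
  consequences: "Single-region writes for now; revisit if we go multi-region.",
});
```

You edit the draft, commit it to the repo, and the Slack decision has become a durable record without anyone opening a blank document mid-debate.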
Conference talk capture (the long-tail ROI)
You attend GopherCon, RailsConf, KubeCon, Strange Loop, or whatever your stack's annual conference is. You see 15 talks. By the time you fly home, you've forgotten 70% of them. By the time you're back at your desk on Monday, you've forgotten 90%.
The pattern: after every talk, take 90 seconds to dictate a voice memo. Speaker, topic, three most important points, one specific takeaway you'd actually apply. Screenshot one slide that captures the talk's central claim. Tag with the conference name. By the end of the conference you have 15 talk summaries that you can actually search.
Six months later when the relevant problem shows up — "wait, didn't someone talk about distributed tracing at the last QCon?" — you search the conference tag plus "tracing" and the original talk capture surfaces. The 90 seconds of post-talk capture has just saved you the cost of attending the conference in the first place. Most developers report that conference ROI roughly doubles when they start running this loop.
On-call alert documentation
The on-call rotation produces the highest information density in a developer's working life. Every alert is a potential learning. Every fix is a potential pattern. Every false-positive is a potential alerting improvement. Almost none of it gets documented because on-call is exhausting and post-incident review is rushed.
The pattern: every time an alert fires, dictate a voice memo while you're investigating. "Alert: P99 latency on the search endpoint above 2s for 5 minutes. Initial hypothesis: bad query plan after this morning's migration. Checking pg_stat_statements." Tag with the service and the alert type. Resolve the incident, then add a closing voice memo: "Root cause was the new index taking longer than expected to build — kicked it off async, latency recovered in 8 minutes. Should have read the migration plan more carefully — the build-index-concurrently note was there." Tag with "incident-learning."
Six months later when a similar alert fires, you search the alert type and your previous investigation surfaces. The hypothesis space narrows immediately. The mean time to resolution drops. The team gets better not because of formal post-mortems but because individual engineers are building personal libraries of "alerts I have seen and how they actually resolved."
Code review reference library
Code review is a teaching moment that is almost universally wasted. You review someone's PR, you leave a thoughtful comment explaining a pattern, the PR gets merged, the comment is buried in GitHub's history forever.
The Némos pattern: when you write a particularly good code review comment — one that explains a real principle, not just a nit — screenshot it and dictate a 10-second voice memo about the underlying pattern. Tag with the language and the pattern. After two years, you have a personal library of code review wisdom that you can re-surface when the same pattern comes up again. Instead of re-writing the same comment, you can paste a reference to your previous explanation or use it as the seed for a more refined version.
The same works in reverse: when someone leaves a thoughtful code review comment on *your* PR that teaches you something, screenshot it. Your personal library of "things I've learned from code review" becomes a quiet, durable record of your growth as an engineer.
Try it
Némos is free, on TestFlight, and the waitlist is on the homepage. We onboard developers fast because we're devs and we know the workflow. Tell us your stack when you sign up.
Other guides for your role
- Doctors & Clinicians: I'm an ER doc with 8,400 saved articles. Here's the iPhone setup that finally works.
- Lawyers: Why 73% of associate attorneys we surveyed use Apple Notes wrong (and what to do about it)
- Designers: I took 14,000 inspiration screenshots in 2 years. Here's the iPhone system that made them findable.
- Parents: Two kids, one iPhone, zero brain cells. The capture system that saved my year.
- Podcasters: I prep 4 podcast episodes a week. Here's the iPhone-only system that replaced 3 apps.