Reflect Academy: AI Note-taking 101
Based on the Reflect Notes video on YouTube. If you enjoy this content, support the original creators by watching, liking, and subscribing.
Reflect’s note-taking workflow pairs LLM-based formatting with AI transcription to turn voice and text into structured, searchable notes.
Briefing
AI note-taking in Reflect hinges on two practical tools: large language models for rewriting and structuring text, and AI transcription for turning voice into usable written notes. Together, they let people skip the “giant block of text” problem and instead convert raw thoughts—spoken or read—into summaries, bullet points, action items, and study-ready formats. That matters because the fastest way to capture ideas (voice) becomes genuinely searchable and actionable once transcription and formatting happen automatically.
The workflow starts with what to use and where. Reflect’s built-in AI assistant can format selected text using pre-made “system prompts” and custom prompts saved by the user. The assistant isn’t a free-form chat; it operates on selected text, producing results that can be replaced, inserted, or rerun. For transcription, Reflect includes a microphone feature that converts spoken notes into text with near-human accuracy and places the output into the user’s daily note. The same capabilities extend to mobile, keeping the process “in the app” rather than forcing copy-and-paste round trips.
From there, the most common payoff comes from reformatting and condensing content. Large language models can extract key takeaways from articles, convert content into bullet lists, identify the top themes, and pull out action items. They can also help with writing tasks: editing existing drafts, changing tone, generating outlines, and producing counterarguments to stress-test logic. Even knowledge-work mechanics like backlinks can be automated: running AI over text to suggest what should link to what.
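As a toy illustration of the target formats (not Reflect’s actual AI, which uses an LLM to do the extraction), a few lines of Python show what the “key takeaways” and “action items” outputs look like once transcript lines are structured. The helper names are hypothetical.

```python
def to_bullets(items):
    """Format extracted key takeaways as a Markdown bullet list."""
    return "\n".join(f"- {item.strip()}" for item in items if item.strip())

def to_checklist(items):
    """Format extracted action items as a Markdown checklist."""
    return "\n".join(f"- [ ] {item.strip()}" for item in items if item.strip())

# The kind of output the "action items" prompt aims for:
actions = ["Email Sam the draft", "Book the demo room"]
print(to_checklist(actions))
# - [ ] Email Sam the draft
# - [ ] Book the demo room
```

The checklist form matters in practice: a `- [ ]` line renders as a tickable to-do, so action items stay actionable rather than buried in a transcript.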
Reflect’s practical guidance emphasizes habit-building. Transcribing voice notes isn’t just another AI feature; it nudges users to prompt the assistant repeatedly so notes become structured lists or checklists instead of unstructured transcripts. Users are encouraged to start with pre-built prompts, then clone and customize prompts once they find formats that fit their thinking.
Three sample workflows show how the pieces fit together.
- Daily reflection: the user records a morning voice ramble, runs a custom prompt that structures it into sections like reframing, gratitude, and top priority, then replaces the transcript with a formatted note that includes backlinks to related days.
- Meeting capture: long calls can be transcribed into the daily note, then processed into “key takeaways” (bulleted) and “action items” (checklist), reducing the need to type during meetings and making it easier to revisit exact wording later.
- Writing and research: voice-based brainstorming can be turned into an article outline, then expanded into a draft and refined with editing prompts. For research papers, saved highlights can be used as context so the user can “chat” with the document, asking questions such as how age affects cognitive plasticity, while keeping the original source available to verify details.
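The meeting-capture step above boils down to filling a selected transcript into a saved prompt template before handing it to a language model. A minimal sketch, assuming a hypothetical template and `build_prompt` helper (this is not Reflect’s internal format):

```python
# Illustrative custom prompt for the meeting workflow. Both the template
# text and the helper below are assumptions, shown only to make the
# "saved prompt + selected text" mechanic concrete.
MEETING_PROMPT = (
    "Reformat the transcript below into two sections:\n"
    "1. Key takeaways (as a bulleted list)\n"
    "2. Action items (as a '- [ ]' checklist)\n\n"
    "Transcript:\n{transcript}"
)

def build_prompt(template: str, transcript: str) -> str:
    """Fill the selected transcript into the saved prompt template."""
    return template.format(transcript=transcript)

message = build_prompt(
    MEETING_PROMPT,
    "We agreed Sam will email the draft by Friday.",
)
# `message` would then be sent to the model; the reply replaces or is
# inserted next to the original transcript.
```

Cloning and customizing a prompt, as the guidance suggests, amounts to editing this template text once and reusing it on every new transcript.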
The throughline is speed and retrieval: voice captures thoughts quickly, AI turns them into structured notes, and those notes become searchable, linkable, and usable for writing or studying. The practical advice ends with a push to adopt voice notes as the default and set phone shortcuts so transcription starts with a single tap wherever ideas strike.
Cornell Notes
Reflect’s AI note-taking approach combines two capabilities: LLM-based assistants for formatting, summarizing, and writing, and AI transcription for converting voice memos into text inside notes. The assistant works by selecting text and running pre-built “system prompts” or custom prompts, producing outputs that can be replaced or inserted. Transcription drops near-human-accuracy text into the daily note, enabling voice-first capture without losing searchability. Sample workflows include daily reflections formatted into reframing/gratitude/priorities, meeting recordings turned into key takeaways and action-item checklists, and research-paper Q&A using saved highlights as context. The payoff is faster capture plus structured, reusable notes for writing and studying.
- What two AI capabilities matter most for note-taking in this system, and what does each one do?
- How does Reflect’s AI Assistant differ from a typical chat tool?
- Why does the daily reflection workflow start with voice, and what does AI add afterward?
- What’s the meeting workflow designed to solve?
- How does AI help with writing beyond generating a first draft?
- What does “chat with your notes” enable for research study?
Review Questions
- Which prompt types (system vs custom) does Reflect’s AI Assistant use, and how do users create custom prompts?
- In the meeting workflow, how are key takeaways and action items formatted differently?
- What steps turn a voice ramble into a structured daily reflection, and what sections does the template include?
Key Points
1. Reflect’s note-taking workflow pairs LLM-based formatting with AI transcription to turn voice and text into structured, searchable notes.
2. Reflect’s AI Assistant operates on selected text using system prompts and user-created custom prompts, with outputs that can be replaced or inserted.
3. Transcribing voice notes isn’t just capture; it encourages users to run prompts so transcripts become lists, checklists, and templates instead of unstructured blocks.
4. Daily reflections can be generated from voice rambling into consistent sections like reframing, gratitude, top priority, and wishes for the day.
5. Meeting recordings can be transcribed and then converted into “key takeaways” (bullets) and “action items” (checklists), reducing the need to type during calls.
6. Writing workflows can use voice-to-outline and then draft-and-edit prompts, including counterarguments and example/analogy generation.
7. Research study becomes interactive by “chatting with notes” built from saved highlights, returning answers with the underlying source available for follow-up.