
Reflect Academy: AI Note-taking 101

Reflect Notes · 5 min read

Based on Reflect Notes's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing.

TL;DR

Reflect’s note-taking workflow pairs LLM-based formatting with AI transcription to turn voice and text into structured, searchable notes.

Briefing

AI note-taking in Reflect hinges on two practical tools: large language models for rewriting and structuring text, and AI transcription for turning voice into usable written notes. Together, they let people skip the “giant block of text” problem and instead convert raw thoughts—spoken or read—into summaries, bullet points, action items, and study-ready formats. That matters because the fastest way to capture ideas (voice) becomes genuinely searchable and actionable once transcription and formatting happen automatically.

The workflow starts with what to use and where. Reflect’s built-in AI assistant can format selected text using pre-made “system prompts” and custom prompts saved by the user. The assistant isn’t a free-form chat; it operates on selected text, producing results that can be replaced, inserted, or rerun. For transcription, Reflect includes a microphone feature that converts spoken notes into text with near-human accuracy and places the output into the user’s daily note. The same capabilities extend to mobile, keeping the process “in the app” rather than forcing copy-paste roundtrips.

From there, the most common payoff comes from reformatting and condensation. Large language models can extract key takeaways from articles, convert content into bullet lists, identify the top themes, and pull out action items. They can also help with writing tasks: editing existing drafts, changing tone, generating outlines, and producing counterarguments to stress-test logic. Even knowledge-work mechanics like backlinks can be automated—running AI over text to suggest what should link to what.

Reflect’s practical guidance emphasizes habit-building. Transcribing voice notes isn’t just another AI feature; it nudges users to prompt the assistant repeatedly so notes become structured lists or checklists instead of unstructured transcripts. Users are encouraged to start with pre-built prompts, then clone and customize prompts once they find formats that fit their thinking.

Three sample workflows show how the pieces fit together. First, a daily reflection workflow: the user records a morning voice ramble, runs a custom prompt that structures it into sections like reframing, gratitude, and top priority, and then replaces the transcript with a formatted note that includes backlinks to related days. Second, meeting capture: long calls can be transcribed into the daily note, then processed into "key takeaways" (bulleted) and "action items" (checklist), reducing the need to type during meetings and making it easier to revisit exact wording later. Third, writing and research: voice-based brainstorming can be turned into an article outline, then expanded into a draft and refined with editing prompts. For research papers, saved highlights can be used as context so the user can "chat" with the document — asking questions such as how age affects cognitive plasticity — while still having the original source available to verify details.

The throughline is speed and retrieval: voice captures thoughts quickly, AI turns them into structured notes, and those notes become searchable, linkable, and usable for writing or studying. The practical advice ends with a push to adopt voice notes as the default and set phone shortcuts so transcription starts with a single tap wherever ideas strike.

Cornell Notes

Reflect’s AI note-taking approach combines two capabilities: LLM-based assistants for formatting, summarizing, and writing, and AI transcription for converting voice memos into text inside notes. The assistant works by selecting text and running pre-built “system prompts” or custom prompts, producing outputs that can be replaced or inserted. Transcription drops near-human-accuracy text into the daily note, enabling voice-first capture without losing searchability. Sample workflows include daily reflections formatted into reframing/gratitude/priorities, meeting recordings turned into key takeaways and action-item checklists, and research-paper Q&A using saved highlights as context. The payoff is faster capture plus structured, reusable notes for writing and studying.

What two AI capabilities matter most for note-taking in this system, and what does each one do?

The setup relies on (1) large language models (LLMs) for transforming text—summarizing, condensing, extracting themes, generating outlines, editing tone, and producing structured outputs like bullet lists or checklists; and (2) AI transcription, which converts spoken voice notes into written text so users don’t have to replay audio to find ideas later. In Reflect, both are built in: the AI Assistant handles formatting on selected text, while the microphone transcription feature converts speech into text that lands in the daily note.

How does Reflect’s AI Assistant differ from a typical chat tool?

Reflect’s AI Assistant is prompt-and-selection driven rather than open-ended conversation. Users select text, then run a prompt from a list. Prompts include pre-written “system prompts” (for writing, formatting, generating arguments/outlines, and suggestions) plus custom prompts the user can create by cloning and saving. After the assistant finishes, the user can replace the selected text, insert the result, copy it, or rerun if the output isn’t right.
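The selection-and-prompt model described above can be sketched in code. This is a minimal, hypothetical illustration — the `Prompt` class, `clone_prompt`, and `run_on_selection` names are invented for this sketch and are not Reflect's actual API; the `llm` parameter stands in for whatever model call a real implementation would make.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Prompt:
    name: str
    instruction: str
    kind: str = "system"  # "system" for pre-built, "custom" for user-saved

# Pre-built "system prompts" roughly in the spirit of those described above.
SYSTEM_PROMPTS = {
    "bullet_points": Prompt("bullet_points",
                            "Rewrite the selection as concise bullet points."),
    "action_items": Prompt("action_items",
                           "Extract action items from the selection as a checklist."),
}

def clone_prompt(base: Prompt, name: str, instruction: str) -> Prompt:
    """Clone a pre-built prompt into a user-editable custom prompt."""
    return replace(base, name=name, instruction=instruction, kind="custom")

def run_on_selection(prompt: Prompt, selection: str, llm=None) -> str:
    """Apply a prompt to the selected text only (not a free-form chat).

    `llm` is any callable taking a request string; when omitted, the
    assembled request is returned so the shape of the call is visible.
    """
    request = f"{prompt.instruction}\n\n---\n{selection}"
    if llm is None:
        return request
    return llm(request)
```

The key design point this mirrors: prompts operate on a selection and produce a replace/insert-ready string, and custom prompts are clones of system prompts rather than edits to them.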

Why does the daily reflection workflow start with voice, and what does AI add afterward?

Voice makes capturing thoughts fast—people can ramble in the morning while walking or commuting. AI then structures that ramble into a consistent template using a custom prompt. In the demo, the formatted daily reflection includes sections such as reframing (turning an anxious concern into a more positive interpretation), gratitude (specific items to be thankful for), a top priority for the day, and a “what I wish I was doing” section. The result is a clean, reusable note rather than a raw transcript.

What’s the meeting workflow designed to solve?

Typing during meetings is described as awkward and distracting. The workflow records or captures a call, transcribes it into the daily note, and then runs a prompt that organizes content into “key takeaways” and “action items.” Takeaways become bullet points, while action items become a checklist. This reduces the need to take notes in real time and makes it easier to revisit exact statements later.
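The extraction itself is an LLM step, but the output structure the workflow targets is simple to show. A minimal sketch, assuming the model has already returned takeaways and action items as plain lists (the `format_meeting_notes` helper is hypothetical, not part of Reflect):

```python
def format_meeting_notes(takeaways, action_items):
    """Render takeaways as bullets and action items as a Markdown
    checklist, matching the "key takeaways" / "action items" split."""
    lines = ["## Key takeaways"]
    lines += [f"- {t}" for t in takeaways]
    lines += ["", "## Action items"]
    lines += [f"- [ ] {a}" for a in action_items]
    return "\n".join(lines)

note = format_meeting_notes(
    ["Budget approved for Q3", "Launch slips one week"],
    ["Email the revised timeline", "Book the follow-up call"],
)
```

Keeping action items as `- [ ]` checkboxes rather than bullets is what makes them actionable in a notes app: they can be ticked off later instead of re-read.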

How does AI help with writing beyond generating a first draft?

The approach treats AI as a multi-step writing assistant. Voice brainstorming can be transcribed, then turned into an article outline via a system prompt. From there, AI can generate a draft (noted as a starting point), and the user can edit and improve it. Additional prompts can generate examples or analogies and create counterarguments to find gaps in logic. There’s also a voice-to-draft option: speaking through the outline so AI produces a draft that can then be cleaned up with copy-editing prompts.

What does “chat with your notes” enable for research study?

Saved research content (like a highlighted paper) can be used as context for Q&A. The user selects relevant notes/links, then asks questions such as “Do I have any information saved on cognitive plasticity?” The system returns extracted information and provides the source so the user can verify by going back to the original highlighted material. The same idea can apply to lectures: transcribe a lecture, save it, and then ask questions to study from the transcript.
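The "use saved highlights as context" step can be sketched as a retrieval-then-prompt pipeline. This is an illustrative toy, not Reflect's implementation: real systems typically rank notes with embeddings, while this sketch uses naive keyword overlap, and `select_context` / `build_qa_prompt` are invented names.

```python
def select_context(notes, question, limit=3):
    """Rank saved notes by word overlap with the question (toy retrieval).

    `notes` maps a note title to its text; returns the top titles.
    """
    q_words = set(question.lower().split())
    scored = sorted(
        notes.items(),
        key=lambda kv: -len(q_words & set(kv[1].lower().split())),
    )
    return [title for title, _ in scored[:limit]]

def build_qa_prompt(notes, question):
    """Assemble a Q&A request whose answer must cite a source title,
    so the user can trace the answer back to the original highlight."""
    titles = select_context(notes, question)
    context = "\n\n".join(f"[{t}]\n{notes[t]}" for t in titles)
    return ("Answer using only the context below and cite the source title.\n\n"
            f"{context}\n\nQuestion: {question}")
```

Including the note titles in the prompt is what preserves the verification step the article emphasizes: the answer points back to a named source the user can reopen.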

Review Questions

  1. Which prompt types (system vs custom) does Reflect’s AI Assistant use, and how do users create custom prompts?
  2. In the meeting workflow, how are key takeaways and action items formatted differently?
  3. What steps turn a voice ramble into a structured daily reflection, and what sections does the template include?

Key Points

  1. Reflect’s note-taking workflow pairs LLM-based formatting with AI transcription to turn voice and text into structured, searchable notes.
  2. Reflect’s AI Assistant operates on selected text using system prompts and user-created custom prompts, with outputs that can be replaced or inserted.
  3. Transcribing voice notes isn’t just capture—it encourages users to run prompts so transcripts become lists, checklists, and templates instead of unstructured blocks.
  4. Daily reflections can be generated from voice rambling into consistent sections like reframing, gratitude, top priority, and wishes for the day.
  5. Meeting recordings can be transcribed and then converted into “key takeaways” (bullets) and “action items” (checklists), reducing the need to type during calls.
  6. Writing workflows can use voice-to-outline and then draft-and-edit prompts, including counterarguments and example/analogy generation.
  7. Research study becomes interactive by “chatting with notes” built from saved highlights, returning answers with the underlying source available for follow-up.

Highlights

Two core AI functions drive the system: LLMs for restructuring text and transcription for turning voice into written notes inside daily entries.
The AI Assistant is selection-and-prompt based (system prompts plus cloned custom prompts), producing replace/insert-ready outputs rather than free-form chat.
A daily reflection workflow turns a morning voice ramble into a formatted template with reframing, gratitude, priorities, and a wish for the day.
Meeting notes can be transformed into bullet takeaways and checklist action items after transcription, avoiding real-time typing.
Research-paper Q&A works by using saved highlights as context, letting users ask questions while still tracing answers back to the source.

Topics

Mentioned

  • LLMs