
Using Reflect's AI features

Reflect Notes ·
6 min read

Based on Reflect Notes' video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.

TL;DR

Reflect’s AI assistant turns highlighted note text—most often voice transcriptions—into structured outputs like key takeaways, action items, project proposals, and daily reflections.

Briefing

Reflect’s core AI workflow centers on turning messy voice-to-text notes into structured, reusable outputs—then letting users query their own note library with AI using filtered context. The biggest practical payoff comes from the AI assistant: highlight a block of text (often produced by Reflect’s voice transcription), run a prompt, and instantly convert long transcripts into formats like key takeaways, action items, project proposals, daily reflections, or simple one-page summaries.

In the assistant flow, users can trigger prompts via a shortcut (Command J) or by highlighting text and selecting the “magic stars” menu. The transcript uses fake examples to demonstrate three common use cases. For a client call transcript, the assistant can reorganize an unbroken transcript into clean sections such as “key takeaways” and “action items,” making the content skimmable and actionable. For project work, a “project proposal” prompt outputs a consistent markdown template with fillable variables, so the result is structured rather than free-form. For personal journaling, a “daily reflection” prompt reformats a recorded audio note into a repeatable structure that includes sections like gratitude, a reframing exercise, and a top priority.

A key theme is that prompts work on any text, but voice notes are the dominant use case because transcription produces large blocks of text that typically need formatting. Even when the end goal is writing—such as turning a rambling voice note into an article outline—the assistant’s value is in transforming raw input into a usable structure. In practice, the workflow is often: record audio → transcribe → run an assistant prompt to restructure.

The transcript then shifts to how custom prompts are built and managed. Users can clone built-in “system” prompts, rename them for specific contexts (for example, tailoring a proposal format for LinkedIn), and edit instructions and output templates. A practical tip is to start by writing the desired format directly in a note, then copy that format into the custom prompt so the AI has a clear blueprint. For one-off requests, an “open prompt” field supports quick transformations without saving a reusable prompt.
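As a hypothetical sketch of this blueprint approach (the exact template fields and variable syntax aren't shown in the transcript), a cloned "project proposal" prompt might carry an output format like the following, where the placeholder variables are filled in by the AI from the highlighted text:

```markdown
## Project Proposal: {{project_name}}

**Client:** {{client_name}}
**Goal:** {{one_sentence_goal}}

### Scope
- {{scope_item_1}}
- {{scope_item_2}}

### Timeline
- {{milestone}} — {{target_date}}

### Next Steps
- [ ] {{action_item}}
```

Because the format is written out explicitly rather than described in prose, every run returns the same structure, which makes outputs predictable and easy to reuse.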

Beyond formatting, Reflect adds “chat with your notes” through advanced search. Instead of giving AI access to everything, users narrow the context window using filters such as tags (e.g., a “book” tag). With that scoped context, the AI can answer questions and generate follow-ups—like recommending a next book based on what’s already been read in the selected category. The system uses Google Gemini for this chat, and it can ingest a large number of notes (such as daily notes from the past year) to extract themes and categories. It’s positioned as concept-level synthesis rather than exact fact retrieval (e.g., it won’t reliably answer a question like the “17th word” in a note).

Finally, device support matters: the AI assistant is available in the iPhone app and works just as it does on desktop, including custom prompts. "Chat with your notes" is not yet available on iPhone because advanced search isn't on that platform, though both features are available on iPad. Voice transcription is also available in the mobile apps, reinforcing the same end-to-end workflow across devices: record, transcribe, then prompt-transform.

Cornell Notes

Reflect’s AI assistant is designed to turn long, unstructured note text—especially voice transcriptions—into structured outputs using prompts. Users highlight text (often transcribed from audio), run a prompt via Command J or the “magic stars” menu, and can generate key takeaways, action items, project proposals in markdown templates, daily reflections, or one-page summaries. Custom prompts can be created by cloning built-in system prompts and editing instructions and output formats; a recommended method is to draft the target format in a note first, then paste it into the prompt. For deeper insight, “chat with your notes” uses advanced search filters to limit context and relies on Google Gemini to synthesize themes from selected notes. The AI assistant works on iPhone and desktop, while chat-with-notes is currently limited to iPad and not available on iPhone.

Why do voice notes tend to be the best input for Reflect’s AI assistant?

Voice transcription produces a large block of text that usually needs restructuring before it’s useful. The assistant’s prompts are built for formatting and reorganization—splitting transcripts into sections like “key takeaways” and “action items,” converting raw notes into markdown templates (e.g., project proposals), or reshaping journaling content into a consistent daily reflection layout. Prompts can run on any text, but the transcript emphasizes that the most common workflow is voice note → transcription → prompt-run formatting.

How does a custom prompt help keep outputs consistent, and what’s the recommended way to build one?

Custom prompts can include both instructions and a specific output format (often in markdown), which forces the AI to return results in a predictable structure with fillable variables. The transcript’s practical tip is to first write the desired format directly in a note (for example, the daily reflection sections), then copy and paste that format into the custom prompt. After minor tweaks and one or two test runs, the prompt becomes reliable for quick reuse.

What’s the difference between cloning a system prompt and using the “open prompt” field?

Cloning a system prompt creates a reusable custom prompt that can be edited, saved, and run repeatedly with consistent results—useful for recurring tasks like project proposals or daily reflections. The “open prompt” field supports one-off transformations without saving the prompt; for example, asking for a “one pager for a client” based on highlighted text, then manually adjusting if needed.

How does “chat with your notes” avoid overwhelming the AI with everything in a library?

It uses advanced search filters to narrow the context window. The transcript describes selecting a tag (like a “book” tag) and then chatting with notes constrained to that filter, effectively turning the AI into a conversational interface over a subset of the library. This scoping is important because the AI chat doesn’t have access to every note by default.

What kinds of questions does the AI chat handle well versus poorly?

It performs well on summarization and theme extraction—such as summarizing categories of work from daily notes over the past year or recommending books based on what’s been read within a filtered set. It’s less reliable for exact, word-level retrieval; the transcript gives the example that asking for the “17th word” in a note may fail. The system is framed as concept-level synthesis rather than precise data scraping.

What device limitations apply to Reflect’s AI features?

The AI assistant is available in the iPhone app and mirrors the desktop experience, including running saved custom prompts. However, "chat with your notes" isn't available on iPhone because advanced search isn't on that platform yet. Both the AI assistant and chat with your notes are available in the iPad app, and voice transcription is also available in the mobile apps.

Review Questions

  1. When would you choose a saved custom prompt over the open prompt field, and what benefit does the saved prompt provide?
  2. How does advanced search filtering change what the AI can answer during “chat with your notes”?
  3. Give one example of a task that voice transcription makes easier to use with the AI assistant, and explain why formatting is the key step.

Key Points

  1. Reflect’s AI assistant turns highlighted note text—most often voice transcriptions—into structured outputs like key takeaways, action items, project proposals, and daily reflections.

  2. Prompts can be run via Command J or through the “magic stars” menu, and results can be inserted or replaced directly in the note.

  3. Custom prompts are most effective when they include a clear output format (often markdown) and can be cloned from built-in system prompts.

  4. A practical prompt-building method is to draft the target format in a note first, then paste that format into the custom prompt as a template.

  5. “Chat with your notes” relies on advanced search filters to limit context and uses Google Gemini to synthesize themes from selected notes.

  6. AI chat is better for summaries, recommendations, and conceptual insights than for exact word-by-word queries.

  7. The AI assistant works on iPhone and desktop, while “chat with your notes” is currently limited to iPad because advanced search isn’t on iPhone yet.

Highlights

  • Voice transcripts are most valuable when paired with the AI assistant, which reorganizes long blocks of text into skimmable sections and templates.
  • Cloning system prompts lets users create reusable, consistent markdown-based outputs—especially useful for project proposals.
  • “Chat with your notes” doesn’t scrape everything; advanced search filters narrow context, and Google Gemini then synthesizes themes.
  • The AI assistant is available on iPhone, but chat-with-notes is not yet available there due to advanced search limitations.
