3 Unbelievable AI Technologies to Automate Your Literature Review
Based on Andy Stapleton's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Briefing
Automating a literature review is increasingly practical: three tools can generate draft text, suggest references, and help researchers extract key points from papers—cutting the time spent on repetitive writing and scanning. The biggest payoff comes from combining “draft generation” with “reference and evidence support,” so the work shifts from blank-page effort to editing, verification, and synthesis.
The first tool, Wiio, is positioned as “writing science powered by AI.” After creating a project (for example, a systematic review on “transparent electrode materials”), users can generate AI suggestions for paragraphs, refine grammar, translate to English, and—crucially—pull in references. The workflow is built around starting from a short prompt, then repeatedly selecting AI-generated text that can be copied and pasted into the growing review. Wiio also surfaces common materials relevant to the topic, such as ITO (indium tin oxide) and zinc oxide, as part of building the review’s substance. References can be generated for free, but the transcript emphasizes that they must be checked rather than accepted blindly, and that the free reference suggestions are limited (described as a small set of reference lists and supporting items).
The second tool, Jenny AI, is presented as a more direct "write the literature review for me" option. After logging in and starting a new document with a prompt like "transparent electrode materials," it begins producing paragraphs immediately, expanding the literature narrative without requiring the user to draft every sentence manually. Jenny AI also supports citation insertion: users highlight text, request citations, and receive reference suggestions drawn from journals and websites. Multiple citation styles are available, including an IEEE-style option in which bracketed numbers appear in the text and the corresponding entries appear at the bottom of the document. Beyond drafting and citations, Jenny AI includes a chat-style feature ("Ask Jenny") that can generate section headings and perform document-level tasks such as analysis, functioning like a research-oriented assistant.
The third tool, OpenRead, is framed as a paper-first automation aid. It is described as "very new" and "temperamental," but potentially powerful. Users can upload or locate papers, then scan them quickly, especially via a feature that lets them "talk to the papers" and extract highlights. The transcript describes a "paper espresso" grab that summarizes a paper's key points (including the abstract and figure-related context); these can be copied into a literature review and then rewritten in the user's own words. OpenRead does not require importing PDFs first: it can find papers on its own, and users can bring in their own documents if desired.
Taken together, the tools target three bottlenecks in literature reviews: drafting coherent text, managing citations and structure, and extracting evidence quickly from papers. The transcript repeatedly returns to one guardrail—references should be verified—while pitching Jenny AI as the most immediately impressive for producing review-ready prose and OpenRead as the fastest route to paper-level summaries.
Cornell Notes
The transcript argues that literature reviews can be automated by combining three complementary tools: Wiio for AI-assisted drafting and reference suggestions, Jenny AI for rapid paragraph generation plus citation formatting and section-building via chat, and OpenRead for fast paper scanning and extracting highlights. Wiio helps users start with a topic prompt (e.g., transparent electrode materials), generate draft text, and retrieve a limited set of references that still require manual checking. Jenny AI generates longer review text quickly, supports citation insertion in multiple styles including IEEE-style bracketed numbering, and offers an “Ask Jenny” chat feature for headings and document tasks. OpenRead focuses on evidence: it summarizes papers and lets users copy “highlights” into their review, with the option to upload PDFs or rely on paper search. This matters because it shifts effort from blank-page writing to editing and synthesis.
How does Wiio turn a literature review topic into usable draft text and references?
What makes Jenny AI feel different from Wiio in the drafting workflow?
How does OpenRead speed up the evidence-gathering part of a literature review?
Why does the transcript repeatedly warn users to verify references?
What citation capabilities are highlighted across the tools?
Review Questions
- Which tool in the transcript is most focused on generating review-ready prose immediately, and what citation style example is mentioned for it?
- What specific features does OpenRead provide for extracting paper highlights, and how does that change the literature review workflow?
- How do Wiio and Jenny AI differ in how they build a literature review from a prompt, and what common caution is given about references?
Key Points
1. Wiio helps turn a topic prompt into draft literature-review text and limited reference suggestions, but references still require manual checking.
2. Jenny AI generates longer review paragraphs quickly and supports citation insertion with multiple styles, including IEEE-style bracketed numbering.
3. Jenny AI’s “Ask Jenny” chat feature can generate section headings and assist with document-level tasks like analysis.
4. OpenRead accelerates evidence gathering by letting users scan papers and extract copy-ready highlights (e.g., via a “paper espresso” grab).
5. OpenRead can summarize papers without requiring PDF import first, though users can upload PDFs if they prefer.
6. The most efficient workflow combines AI drafting with verification and rewriting so the final literature review reflects the researcher’s own synthesis.