How to get an AI research assistant
Based on Cortex Futura's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their content.
Briefing
Finding relevant research fast is often the hardest part of a literature review. Three AI-assisted tools (Elicit, Connected Papers, and Research Rabbit) turn open-ended questions into structured paper discovery, then help map how ideas connect across citations. Together, they reduce the time spent hunting through abstracts, reference lists, and Google Scholar pages, while making it easier to build a coherent reading list around a specific topic.
Elicit (elicit.org) starts with natural-language queries. Instead of searching by keywords and manually scanning results, users type a question in plain English (for example, “what are the effects of mindfulness on decision making?”). The tool returns matching paper titles and extracts key takeaways from abstracts, so readers can quickly judge whether a study is worth opening. A standout feature is structured extraction: after selecting a result, Elicit can parse out the intervention and the outcome measured, presenting them in a table. That lets a user compare studies by what they tested and what they measured, without reading every abstract end to end. The workflow supports iterative filtering: pick promising papers, verify that the abstract aligns with the intended question, and drill into the details only when the match looks right.
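To make the payoff of structured extraction concrete, here is a minimal sketch of the kind of intervention/outcome comparison table it enables. The paper titles and field values below are invented for illustration; they are not real tool output.

```python
# Hypothetical extracted fields: each paper reduced to its intervention
# and outcome measured, so studies can be compared side by side
# without reading every abstract in full.
papers = [
    {"title": "Mindfulness and risk choices",
     "intervention": "8-week MBSR course",
     "outcome": "risk-taking in a lottery task"},
    {"title": "Brief meditation and framing",
     "intervention": "15-minute guided meditation",
     "outcome": "susceptibility to framing effects"},
]

def as_table(rows):
    """Render the extracted fields as a plain-text comparison table."""
    header = f"{'Title':<32}{'Intervention':<30}{'Outcome':<34}"
    lines = [header, "-" * len(header)]
    for r in rows:
        lines.append(f"{r['title']:<32}{r['intervention']:<30}{r['outcome']:<34}")
    return "\n".join(lines)

print(as_table(papers))
```

Scanning a table like this answers "what was tested, and what was measured?" for a batch of studies at once, which is exactly the filtering step described above.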
Connected Papers tackles a different bottleneck: exploring the citation neighborhood of a single paper. Rather than manually checking the bibliography (papers a work cites) or the “cited by” list (papers that cite it), Connected Papers visualizes the surrounding literature as a graph. Users can switch views to focus on prior works or derivative works, effectively turning one paper into a navigable ecosystem. This graph-based browsing helps researchers move from one “seed” paper to the broader set of related studies that shape the field.
Research Rabbit builds on that graph idea by adding collections and iterative expansion. Researchers create a collection around a central paper, then the tool surfaces similar works and related references/citations tied to the papers inside the collection. As more papers are added, the system updates the research graph—showing how the literature landscape grows and which works appear more central. The result is a semi-guided way to develop understanding of a topic: start with one key study, then expand outward through suggested similar work, citations, and author connections.
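The collection-and-expand workflow can be sketched as a simple graph traversal. The citation links below are a toy example, not real data, and the `expand` helper is hypothetical; tools like Research Rabbit derive these links from bibliographic databases behind the scenes.

```python
# Toy related-paper graph: paper -> papers linked to it
# (via references, citations, or similarity).
related = {
    "seed-paper": {"paper-A", "paper-B"},
    "paper-A": {"paper-C"},
    "paper-B": {"paper-C", "paper-D"},
    "paper-C": set(),
    "paper-D": set(),
}

def expand(collection, graph):
    """Suggest papers linked to the collection but not yet in it."""
    suggestions = set()
    for paper in collection:
        suggestions |= graph.get(paper, set())
    return suggestions - collection

collection = {"seed-paper"}
collection |= expand(collection, related)  # first hop: paper-A, paper-B
collection |= expand(collection, related)  # second hop: paper-C, paper-D
print(sorted(collection))
```

Each call to `expand` grows the collection outward by one hop, mirroring how adding papers to a collection updates the research graph and surfaces more central works.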
For students and full-time researchers alike, the practical payoff is speed and structure. Elicit helps translate questions into candidate studies with extracted intervention/outcome details; Connected Papers and Research Rabbit help trace how those studies connect through citations. Used together, they function like a personal research assistant, turning a vague research interest into a curated, navigable literature map that supports a proper literature review.
Cornell Notes
The core idea is to speed up literature research by combining three AI-assisted tools: Elicit, Connected Papers, and Research Rabbit. Elicit turns natural-language questions into paper results with extracted abstract takeaways and structured fields like intervention and outcome measured, letting researchers filter studies without reading every abstract. Connected Papers visualizes the citation network around a chosen paper as a graph, making it easier to find prior and derivative work. Research Rabbit extends that approach with collections: add a central paper, then iteratively grow a research graph using similar works, references, and citations tied to the collection. Together, these tools reduce manual searching and help build a coherent reading list around a specific research question.
How does Elicit turn a research question into something usable for a literature review?
Why is the intervention/outcome extraction in Elicit a big deal for decision-making research?
What problem does Connected Papers solve compared with searching in Google Scholar?
How does Research Rabbit improve on citation graphs by adding collections?
What does the “central paper” concept do in Research Rabbit’s workflow?
Review Questions
- If you had to choose only one tool for extracting structured study details (like intervention and outcome), which would it be and what fields does it extract?
- How would you use Connected Papers differently from Elicit when starting from a single key paper?
- What is the advantage of building a collection in Research Rabbit instead of browsing citation lists one paper at a time?
Key Points
1. Use Elicit to query in natural language and get paper titles plus abstract takeaways without manually reading every result.
2. Verify relevance in Elicit by checking whether the abstract matches the specific research question before committing to deeper reading.
3. Leverage Elicit’s structured extraction to compare studies by intervention and outcome measured across multiple papers.
4. Use Connected Papers to explore the citation neighborhood of a seed paper through a graph view, reducing time spent on reference lists and “cited by” pages.
5. Use Research Rabbit to build and expand a collection around central papers, letting the research graph update as more papers are added.
6. Combine the tools: start with Elicit for candidate studies, then use Connected Papers and Research Rabbit to map how those studies connect through citations.