
The Best Tool for Literature Review? Research Rabbit vs Connected Papers!

Andy Stapleton · 5 min read

Based on Andy Stapleton's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Research Rabbit is positioned as a “free forever” literature review tool, with Zotero import/sync as a major practical advantage.

Briefing

Research Rabbit stands out as the best “free forever” tool for building a literature review workflow—but it comes with a steep learning curve and an interface that can feel overwhelming without a plan. Connected Papers is easier to navigate and can be especially useful for periodically checking prior work and derivative work, where its similarity-based network map tends to surface newer and older relevant papers more effectively.

Research Rabbit’s core appeal is cost and capability. It’s positioned as free for researchers indefinitely, and it can integrate with Zotero by importing an entire library or syncing collections. The tradeoff is usability: after uploading papers (the transcript describes a nine-paper collection), the interface can “vomit out” graphs, lists, and suggested authors in a way that makes it easy to lose track of what’s being explored. Users are advised to enter with a game plan—such as deciding whether to focus on “later work” after a set of seed papers, or using the tool to narrow the graph to a specific direction of inquiry.

Once users commit to that approach, Research Rabbit’s graph becomes powerful. The transcript describes how selecting suggested authors expands into long chains of related work, with nodes representing papers and authors and connections that are not always explicitly labeled as citation relationships. The layout can also feel counterintuitive: nodes can be dragged around, but the overall structure still behaves as a clustered system, and removing or restarting parts of the exploration can be necessary to change direction.

Connected Papers, by contrast, is presented as more intuitive from the start. Users input a keyword, paper title, DOI, or identifier, and the tool generates a network map. The key distinction is that this map is not a citation tree; it’s arranged by similarity. Node size reflects citation count, node color reflects publishing year, and clusters form where papers are strongly connected by similarity. The transcript highlights that the map’s positioning is not adjustable (no free X/Y axis movement), which can reduce misinterpretation and help users get a stable “big picture” view.

Where Connected Papers is most compelling is in prior and derivative work checks. The transcript claims Research Rabbit can do earlier/later work too, but Connected Papers more consistently surfaces older and newer papers—explicitly noting the appearance of a 2023 paper in Connected Papers while Research Rabbit’s later-work range in the example tops out before that. The recommendation is not to choose one tool exclusively: use Research Rabbit as a daily driver for deep exploration once the interface is learned, and rely on Connected Papers (often once or twice a month) to double-check blind spots in prior and derivative work.

Finally, the transcript adds a broader comparison: Litmaps is mentioned as a strong “daily driver” alternative because even its free tier supports limited inputs and allows changing axes, while still being easier to navigate. But for students who want free access and maximum exploratory power, Research Rabbit is the pick—provided they give it a few days to get past confusion and learn how to steer the graphs with a clear objective.

Cornell Notes

Research Rabbit is recommended as a “free forever” literature review tool with strong exploratory power, especially after users learn how its graphs and suggestions behave. Its main drawback is a steep learning curve and an interface that can feel confusing unless users start with a clear game plan (e.g., focusing on later work or earlier work). Connected Papers is easier to navigate and produces a similarity-based network map rather than a citation tree, using node size for citation count and node color for publishing year. The transcript argues Connected Papers is particularly effective for periodic checks of prior work and derivative work, including surfacing newer papers in the example. Using both tools together is framed as the best strategy: Research Rabbit for daily exploration, Connected Papers for targeted blind-spot verification.

Why does Research Rabbit get positioned as the best option for people who want free access?

Research Rabbit is described as “free forever,” with a mission focused on keeping access available to researchers. It also integrates with Zotero, letting users import their entire Zotero library or sync individual collections, which makes it practical to build a literature review starting point without manual re-entry or a paid plan.

What’s the biggest usability problem with Research Rabbit, and how do users manage it?

The transcript highlights a massive learning curve and an interface that can feel like it “vomits out” graphs, lists, and suggested authors, making it easy to lose track of what’s being explored. The mitigation is to enter with a game plan—deciding in advance whether to explore later work, earlier work, or a specific direction—so the tool’s outputs don’t overwhelm.

How does Connected Papers’ network map differ from a citation tree, and why does that matter?

Connected Papers arranges papers by similarity rather than showing direct citation links as a tree. Node size represents the number of citations, and node color represents the publishing year. Because the map is similarity-based (and not freely movable on an XY axis), it can be easier to interpret as a clustered landscape of related literature rather than a literal citation chain.

In what situations does Connected Papers outperform Research Rabbit, according to the transcript?

Connected Papers is recommended for prior works and derivative works checks. In the example, Connected Papers surfaces a 2023 paper when exploring later work, while Research Rabbit’s later-work results in the described range don’t reach that recency. The claim is that Connected Papers more consistently supports finding both older and newer relevant papers for targeted verification.

What’s the suggested “workflow” for using both tools without getting stuck?

Use Research Rabbit as the daily driver for deep exploration once the interface is learned, and use Connected Papers periodically—about once or twice a month—to double-check blind spots in prior and derivative work. The transcript also notes there’s no reason not to use both, since each tool is strongest in different tasks.

Review Questions

  1. What specific UI and interpretation challenges does Research Rabbit introduce, and what does the transcript recommend doing to manage them?
  2. How do node size and node color function in Connected Papers’ similarity network map, and how does that change how you should interpret connections?
  3. Why does the transcript recommend using Connected Papers for prior/derivative checks rather than relying on it for all exploration?

Key Points

  1. Research Rabbit is positioned as a “free forever” literature review tool, with Zotero import/sync as a major practical advantage.

  2. Research Rabbit’s main weakness is a steep learning curve and an interface that can overwhelm users unless they start with a clear exploration plan.

  3. Connected Papers generates a similarity-based network map (not a citation tree), using node size for citation count and node color for publishing year.

  4. Connected Papers is recommended for periodic prior-work and derivative-work checks, including surfacing newer papers more reliably in the transcript’s example.

  5. Research Rabbit is best treated as a daily exploration tool after users learn how its graphs and suggestions work.

  6. Using both tools together is framed as the optimal strategy: deep exploration with Research Rabbit, blind-spot verification with Connected Papers.

  7. Litmaps is mentioned as a more user-friendly alternative/daily driver, especially because its free tier supports limited inputs and adjustable axes.

Highlights

Research Rabbit’s power comes with a “massive learning curve,” and the transcript’s fix is to use a game plan so the graphs don’t become confusing.
Connected Papers’ network map is similarity-based rather than a citation tree, with node size tied to citation counts and node color tied to year.
Connected Papers is claimed to perform better for prior and derivative work checks, including finding a 2023 paper in the example where Research Rabbit’s later-work results didn’t reach that recency.
The recommended approach is complementary use: Research Rabbit for ongoing exploration, Connected Papers for monthly blind-spot verification.
