How to identify predatory publications | Tools | eSupport for Research | 2022 | Dr. Akash Bhoi

5 min read

Based on eSupport for Research's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Use UGC CARE’s authenticated Group 1 and Group 2 lists as a starting point to spot clone journals, including highlighted entries.

Briefing

Predatory journals often mimic legitimate titles, but researchers can reduce the risk by cross-checking journal listings across trusted directories—especially UGC CARE’s Group 1 and Group 2 lists—and then verifying indexing claims in major databases. UGC CARE is presented as an authenticated resource site maintained under UGC, designed to help identify questionable or “clone” journals that may exist mainly to extract publication fees.

The core method starts with UGC CARE’s categorization. Group 1 contains journals evaluated through the UGC CARE protocol and listed after meeting the program’s criteria. A key signal for Group 2 is whether the journal is indexed in globally recognized databases such as Web of Science and Scopus (including categories like SCIE, SSCI, and A&HCI). The transcript emphasizes that clone journals may share similar or confusingly close titles with legitimate ones, which can lead unsuspecting authors to submit manuscripts to the wrong outlet.

On the UGC CARE site, the transcript describes how clone journals are highlighted (for example, in yellow) within the Group 1 and Group 2 listings. It also notes a practical complication: some clone journals may appear only in print and may not have a functioning website. That makes title matching alone unreliable. The recommended approach is to treat the UGC CARE list as a starting point, then verify the journal’s official website and details before submitting.

Verification is framed as a multi-step cross-check. When a journal claims indexing in Web of Science, the researcher should search the journal in the Web of Science master list and confirm that the listing includes a hyperlink to the journal’s authentic website. The same logic applies to Scopus. For open-access journals, the transcript suggests checking the Directory of Open Access Journals (DOAJ), which also provides links to journal sites. Additional cross-validation can be done through resources like the SHERPA/RoMEO directory to compare publisher policies.
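The cross-check above is essentially a manual checklist. As a rough illustration of that workflow, the steps could be encoded like this; the journal record and field names are hypothetical, and in practice each field would be filled in by hand after searching the Web of Science and Scopus master lists, DOAJ, and SHERPA/RoMEO:

```python
# Sketch of the manual cross-check workflow described in the transcript.
# All journal data below is hypothetical; the checks are filled in by a
# human after consulting each directory, not by any automated API.

CHECKS = [
    ("listed_in_ugc_care", "Journal appears in UGC CARE Group 1 or Group 2"),
    ("wos_masterlist_link_matches", "Web of Science master list links to the claimed website"),
    ("scopus_listing_matches", "Scopus source list matches the journal's details"),
    ("doaj_link_matches", "DOAJ entry links to the same website (open access only)"),
    ("sherpa_romeo_policy_found", "SHERPA/RoMEO lists the publisher's policies"),
]

def cross_check(journal: dict) -> list:
    """Return the descriptions of any failed (or missing) checks."""
    return [desc for key, desc in CHECKS if not journal.get(key, False)]

# Hypothetical candidate: everything matches except the Scopus listing.
candidate = {
    "listed_in_ugc_care": True,
    "wos_masterlist_link_matches": True,
    "scopus_listing_matches": False,
    "doaj_link_matches": True,
    "sherpa_romeo_policy_found": True,
}

for failure in cross_check(candidate):
    print("FAILED:", failure)
```

Any failed check is a signal to stop and verify the journal's identity before submitting, since a lookalike clone will typically fail at least one of the directory cross-references.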

Beyond directory checks, the transcript points to UGC CARE’s resources section, which includes guidance on predatory publishing and a “journal evaluation rubrics” framework. A commonly cited debate—whether to follow Beall’s List blindly—is acknowledged, with the transcript urging researchers to use judgment rather than treat any single list as absolute. The evaluation rubric assigns scores such as good (3), fair (2), and poor (1), based on transparency indicators previously discussed in related guidance: clear journal identity, transparent peer-review process, conflict-of-interest disclosures, revenue model clarity, archive policy, publishing schedule, and fee-related information.

Journals scoring below a threshold can be placed in a “predatory” category, with the possibility of removal if they rectify issues. The takeaway is that avoiding predatory outlets requires active verification—confirming the correct journal identity, checking indexing in trusted databases, and using rubric-based transparency signals—before paying submission or publication fees.
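The rubric logic described above can be sketched in a few lines. Note that the transcript gives the per-indicator scores (good = 3, fair = 2, poor = 1) but not the actual cutoff, so the threshold below is an assumption chosen purely for illustration:

```python
# Sketch of UGC CARE-style rubric scoring: each transparency indicator is
# rated good (3), fair (2), or poor (1), and a total below some threshold
# flags the journal as potentially predatory. The threshold value here is
# an assumption; the transcript does not state the real cutoff.

SCORES = {"good": 3, "fair": 2, "poor": 1}

INDICATORS = [
    "journal_identity", "peer_review_process", "conflict_of_interest",
    "revenue_model", "archive_policy", "publishing_schedule", "fee_information",
]

def rubric_score(ratings: dict) -> int:
    """Sum the 1-3 score across all transparency indicators."""
    return sum(SCORES[ratings[name]] for name in INDICATORS)

def is_potentially_predatory(ratings: dict, threshold: int = 14) -> bool:
    # threshold=14 (averaging "fair" across 7 indicators) is hypothetical
    return rubric_score(ratings) < threshold

# Hypothetical journal: mostly fair, but fee information is not disclosed.
ratings = {name: "fair" for name in INDICATORS}
ratings["fee_information"] = "poor"

print(rubric_score(ratings))             # 6 * 2 + 1 = 13
print(is_potentially_predatory(ratings)) # True: below the assumed cutoff
```

A journal flagged this way could later be rescored and cleared if it rectifies the missing information, matching the transcript's point that predatory status can change over time.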

Cornell Notes

UGC CARE provides an authenticated way to identify potentially predatory or “clone” journals by separating journals into Group 1 and Group 2 based on evaluation and indexing signals. Clone journals can share similar titles with legitimate journals, sometimes appearing only in print, so title matching alone is not enough. The transcript recommends cross-checking any journal’s claims by verifying its official website link in Web of Science or Scopus master lists, and using DOAJ or SHERPA/RoMEO for additional confirmation. UGC CARE’s resources also include journal evaluation rubrics that score transparency and publishing practices (good/fair/poor), with low scores indicating higher risk. Researchers should use these tools to decide where to submit rather than relying blindly on any single list.

How does UGC CARE help distinguish legitimate journals from clone or predatory ones?

UGC CARE lists journals after applying a UGC CARE protocol and then groups them into Group 1 and Group 2. Clone journals are highlighted within these lists (e.g., in yellow), signaling titles that may closely resemble original journals. The transcript stresses that researchers should use the UGC CARE listing as a first filter, not as the final authority, because some clone journals may lack websites and appear only in print.

What’s the difference between UGC CARE Group 1 and Group 2 in the transcript?

Group 1 journals are described as those qualified through the UGC CARE protocol, with UGC CARE evaluating and listing them. Group 2 is tied to stronger indexing signals—journals indexed in global databases such as Web of Science and Scopus (including categories like SCIE, SSCI, and A&HCI). The transcript also notes that even within these groups, clone journals can exist and should be avoided.

What cross-check steps should a researcher take before submitting a manuscript?

The transcript recommends verifying indexing claims in the relevant master lists. For Web of Science claims, search the journal in the Web of Science master list and confirm the presence of a hyperlink to the journal’s authentic website. For Scopus, cross-check similarly. For open-access journals, check DOAJ for the journal’s link. For publisher policies, compare with SHERPA/RoMEO. The goal is to ensure the submission goes to the correct journal site, not a lookalike clone.

Why does the transcript warn against relying only on Beall’s List?

Beall’s List is described as a commonly cited resource in debates about predatory publishing, but the transcript frames it as something researchers should not follow blindly. Instead, authors should consider multiple possibilities and use additional verification signals—especially transparency and indexing checks—before deciding where to publish.

What transparency criteria are used in UGC CARE’s journal evaluation rubrics?

The transcript links the rubric scoring to transparency indicators such as the journal’s clear identity, the peer-review process, conflict-of-interest disclosures, the revenue model, archive policy, publishing schedule, and fee-related information. Journals with clear, transparent information are more likely to score “good,” while missing or unclear details can push them toward “fair” or “poor.”

How do rubric scores relate to predatory risk and possible removal from a predatory list?

The rubric assigns points (good = 3, fair = 2, poor = 1). Journals scoring below a threshold can be categorized as potentially predatory. The transcript also notes that if a journal rectifies problems in its process, it may be removed from the predatory list, meaning the status can change over time.

Review Questions

  1. When a journal claims indexing in Web of Science, what specific verification step should be performed to confirm the journal’s authenticity?
  2. What transparency elements (identity, peer review, fees, archiving, etc.) are used to score journals in the UGC CARE rubric framework?
  3. Why can clone journals still appear in UGC CARE Group 1 or Group 2, and what practical checks help catch them?

Key Points

  1. Use UGC CARE’s authenticated Group 1 and Group 2 lists as a starting point to spot clone journals, including highlighted entries.

  2. Do not rely on title similarity alone; some clone journals may exist only in print and may not provide a reliable website.

  3. Verify indexing claims by checking the journal in Web of Science and confirming the master-list hyperlink to the journal’s official site.

  4. Cross-check Scopus indexing and, for open-access journals, confirm presence and links in DOAJ.

  5. Use SHERPA/RoMEO to compare publisher policies as an additional authenticity signal before submitting.

  6. Apply UGC CARE’s transparency rubric logic: clear journal identity, peer-review transparency, conflict-of-interest statements, revenue model clarity, archive policy, publishing schedule, and fee transparency.

  7. Treat predatory lists (including Beall’s List) as one input, not a standalone decision rule, and combine them with directory and transparency checks.

Highlights

UGC CARE’s Group 1 and Group 2 lists flag clone journals (e.g., highlighted in yellow), helping authors avoid lookalike titles.
Indexing claims should be verified in Web of Science or Scopus master lists, including the presence of a hyperlink to the journal’s authentic website.
UGC CARE’s evaluation rubrics score journals based on transparency signals like peer review, conflicts of interest, revenue model, archiving, schedule, and fees.
Low rubric scores can place journals in a predatory category, but improvements can lead to removal from that category.

Topics

Mentioned

  • Akash Bhoi
  • UGC CARE
  • CP RP
  • SSCI
  • SCIE
  • A&HCI
  • DOAJ
  • SHERPA/RoMEO
  • Web of Science
  • Scopus