Top 10+ Scopus and SCI Journals of Computer Science || CS Journal Selectors || Hindi || 2023

4 min read

Based on eSupport for Research's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Start with a tentative title and abstract based on the research area so journal search can match on keywords and scope.

Briefing

Computer science researchers looking to submit papers quickly are urged to avoid “pay-and-publish” traps by using a structured journal-selection workflow—then cross-checking indexing, relevance, and review timelines before final submission. The core message is practical: don’t rely on claims like “fast publication” or “no need to check,” because some journals may accept papers (including special issues) yet fail to deliver proper indexing or end up outside major databases. Instead, build a tentative title/abstract from the research area, search for similar articles, and use journal-selection tools to shortlist venues that match both topic and publication speed.

The process starts with drafting the submission metadata. A researcher is advised to create a title and abstract, even tentatively, based on the known research area, then use those keywords to find journals that already publish similar work. The workflow emphasizes relevance checking through “similar articles” results: if the same or closely related topics appear in a journal’s recent publications, that journal is more likely to be a fit. This matters because submitting to an unsuitable journal often leads to desk rejection and wasted time, especially when the goal is faster turnaround.
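
The transcript frames this matching in terms of point-and-click tools rather than code, but the underlying idea is plain keyword overlap. Below is a minimal, hypothetical Python sketch of that idea: it scores what fraction of a tentative abstract’s keywords also appear in a journal’s recent article titles. The function names, stopword list, and sample strings are illustrative assumptions, not anything shown in the video.

    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "of", "and", "for", "in", "on", "with", "to"}

    def keywords(text: str) -> Counter:
        """Lowercase the text, keep alphabetic tokens, drop common stopwords."""
        words = re.findall(r"[a-z]+", text.lower())
        return Counter(w for w in words if w not in STOPWORDS)

    def overlap_score(abstract: str, recent_titles: list[str]) -> float:
        """Fraction of distinct abstract keywords also found in the journal's recent titles."""
        ab = keywords(abstract)
        journal = keywords(" ".join(recent_titles))
        if not ab:
            return 0.0
        return sum(1 for w in ab if w in journal) / len(ab)

    # Illustrative inputs: a tentative abstract and two recent titles from a candidate journal.
    abstract = "A deep learning model for biomedical image segmentation"
    titles = [
        "Transformer-based segmentation of biomedical images",
        "Deep learning for medical image analysis",
    ]
    print(f"overlap: {overlap_score(abstract, titles):.2f}")

A higher overlap suggests the journal already publishes closely related work, which is exactly the “similar articles” signal the workflow relies on.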

Two main selection routes are described. The first is a journal-selection search that uses the researcher’s title/abstract and subject area (example: Computer Science, with attention to interdisciplinary overlap such as Biomedical + Computer Science). The results include journal names, relevance ordering, and practical signals like review/decision-time indicators. The second route uses a journal-selection platform’s subject-area filtering (again with Computer Science as the example) to narrow results quickly, then sorts by factors such as turnaround time and indexing status.
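
As a rough illustration of the second route, the sketch below filters a hand-made table of journal records by subject area and indexing status, then sorts the survivors by average decision time. The Journal dataclass, its field names, and every record are invented for demonstration; in practice the data would come from a journal-finder export.

    from dataclasses import dataclass

    @dataclass
    class Journal:
        name: str
        subjects: set[str]
        indexed_in: set[str]
        avg_decision_weeks: int

    # Invented records standing in for a journal-finder export.
    candidates = [
        Journal("Journal A", {"Computer Science"}, {"Scopus"}, 8),
        Journal("Journal B", {"Computer Science", "Biomedical"}, {"Scopus", "Web of Science"}, 14),
        Journal("Journal C", {"Mathematics"}, {"Scopus"}, 6),
    ]

    # Keep Computer Science venues that are actually indexed, fastest decisions first.
    shortlist = sorted(
        (j for j in candidates
         if "Computer Science" in j.subjects and "Scopus" in j.indexed_in),
        key=lambda j: j.avg_decision_weeks,
    )
    for j in shortlist:
        print(j.name, f"~{j.avg_decision_weeks} weeks,", sorted(j.indexed_in))

Interdisciplinary work (the Biomedical + Computer Science example) would simply widen the subject filter to accept either field.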

A key decision point is verifying indexing and journal legitimacy. The guidance includes checking whether a journal is indexed in major services (the transcript cites Scopus and Web of Science as benchmarks) and reviewing the journal’s official website for details such as indexing claims, subject scope, and ranking signals. The workflow also recommends gauging the journal’s track record with similar submissions, using metrics like decision-time windows and competition indicators (e.g., how many comparable papers are already present, or how crowded the journal appears).

After shortlisting, the transcript stresses cross-verification before submission. Even if a journal appears in a recommendation list, researchers should confirm on external sources such as Google Scholar and the journal’s own site. The final step is to validate the journal’s current indexing status and performance, then submit only after confirming that the journal matches the paper’s topic, has credible indexing, and offers a realistic review timeline. The overall aim is to reduce the risk of rejection and avoid journals that promise speed or special-issue publication but fail on indexing and credibility.
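
To make the indexing check concrete: if you download a source list locally (Scopus, for instance, publishes a list of indexed titles), a small script can confirm whether a journal’s ISSN appears in it. The file name and the ISSN column header below are assumptions about the export format, and a local hit should still be confirmed against the indexer’s and journal’s official pages.

    import csv

    def issn_in_source_list(issn: str, csv_path: str = "scopus_sources.csv") -> bool:
        """Look up a normalized ISSN in a locally downloaded source-list CSV.

        Assumes an 'ISSN' column; adjust the header to match the real export.
        """
        target = issn.replace("-", "").upper()
        with open(csv_path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                if row.get("ISSN", "").replace("-", "").upper() == target:
                    return True
        return False

    if issn_in_source_list("1234-5678"):  # hypothetical ISSN
        print("In the downloaded list; still verify on the official site.")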

Cornell Notes

The transcript lays out a step-by-step method for choosing Scopus/SCI-relevant computer science journals without falling for “pay-and-publish” traps. It recommends starting with a tentative title and abstract, then searching for similar articles to judge topical fit and reduce desk rejection risk. Two selection routes are used: title/abstract-based search and subject-area filtering (e.g., Computer Science), both with attention to turnaround/decision-time indicators. Before submitting, researchers should verify indexing claims (Scopus/Web of Science benchmarks), check the journal’s official site, and cross-check via Google Scholar. The goal is faster publication with credible indexing and better match quality.

Why does the transcript emphasize checking indexing and not trusting “fast publication” claims?

It warns that some journals may accept papers, sometimes even via special issues, yet fail to achieve proper indexing or end up outside major databases. That risk matters because researchers can spend time and money only to discover the paper isn’t indexed as expected. The workflow therefore treats indexing verification as a non-negotiable step rather than something to take on trust from marketing claims.

How does creating a tentative title and abstract help with journal selection?

A tentative title/abstract lets the researcher use keyword-based search to find journals that already publish similar work. The transcript stresses that relevance is a practical filter: if similar articles appear in the journal’s results, the submission is more likely to match the journal’s scope and avoid desk rejection.

What are the two main journal-shortlisting approaches described?

One approach uses the researcher’s title/abstract plus subject area to generate a ranked list of journals and show related/similar articles. The second approach uses subject-area filtering (example: Computer Science) to quickly list journals, then narrows further using indicators like turnaround time and other ranking signals.

What specific checks are recommended before final submission?

The transcript recommends cross-checking indexing status (with Scopus and Web of Science as reference points), reviewing information on the journal’s official website (including indexing claims and subject scope), and validating via Google Scholar. It also suggests checking decision-time/competition indicators and confirming that the journal’s recent publications align with the paper’s topic.

How does the transcript suggest handling interdisciplinary work (e.g., Biomedical + Computer Science)?

It recommends searching with combined or overlapping subject areas so the journal selection reflects the paper’s actual domain. The example given is biomedical plus computer science, implying that the shortlist should include venues that publish healthcare- or biomedical-related computer science research.

What is the transcript’s practical strategy for avoiding desk rejection and wasted time?

It focuses on fit and evidence: use title/abstract search, confirm that similar articles exist in the shortlisted journal, and verify scope and indexing before submitting. By selecting journals that already publish closely related topics, the researcher reduces the chance of immediate rejection and improves the odds of a faster review cycle.

Review Questions

  1. What steps in the workflow are meant to reduce the risk of desk rejection, and how do they work?
  2. Which verification checks should be done even after a journal appears in a recommendation list?
  3. How does the transcript recommend using title/abstract and subject filters differently to build a shortlist?

Key Points

  1. Start with a tentative title and abstract based on the research area so journal search can match on keywords and scope.

  2. Shortlist journals by checking “similar articles” to confirm topical fit and reduce desk rejection risk.

  3. Use both title/abstract-based search and subject-area filtering (e.g., Computer Science) to narrow options quickly.

  4. Verify indexing claims against major benchmarks (Scopus/Web of Science) and confirm details on the journal’s official website.

  5. Cross-check shortlisted journals on external sources such as Google Scholar before submitting.

  6. Use turnaround/decision-time indicators and competition signals to choose venues aligned with faster review expectations.

  7. Avoid relying on marketing promises about speed or special issues; credibility depends on indexing and match quality.

Highlights

The workflow treats indexing verification as a must-have step, not a trust exercise based on “fast publication” promises.
Finding similar articles in a journal’s recent publications is presented as a direct way to judge scope fit and avoid desk rejection.
Two complementary shortlisting methods—title/abstract search and subject-area filtering—help narrow thousands of options to a manageable list.
Even after recommendations, cross-checking on Google Scholar and the journal’s official site is recommended before final submission.

Topics

Mentioned

  • SCI
  • CS
  • Scopus
  • Web of Science