How to publish?
In the Charlesworth Author Services online seminars, you will learn how to write a scholarly text and publish it successfully.
On the AiP website, you will find recordings of webinars that took place in 2019 and 2020.
The originality of the text can be verified using the freely available Unicheck Plagiarism Checker.
Diploma theses in the Czech Republic are checked for possible plagiarism directly in the national register Theses.cz.
Palacky University has access to the Writefull tool, which provides advanced proofreading of texts in English and is intended primarily for universities and research institutions.
Writefull corrects grammar, technical terminology, punctuation, spelling and style. It uses AI-based language models and continuously "learns" from the texts of millions of published scholarly articles. (Writefull vs. humans on three sample texts.)
Bibliometrics deals with the measurement and quantitative analysis of documents arising from scientific communication. It examines citation links, analyzes the publishing activities of researchers, monitors international publication trends, etc.
The quality of the publishing activity of individual authors, as well as the quality of scientific journals, is usually measured on the basis of a citation index, i.e. the citation response. The term "citation index" also often refers to the database of publications and citations itself, which offers a number of indicators and services that work with citation data.
The world's best-known citation indexes are included in the Web of Science (formerly Web of Knowledge) database created by The Institute for Scientific Information (ISI) and now managed by Clarivate Analytics.
ORCID (Open Researcher and Contributor ID) unambiguously identifies scientific and other academic authors, independently of the databases in which the publications of these authors are registered.
After creating a profile on ORCID, authors can link records of their publications in selected databases, including Web of Science and Scopus, to their ORCID identifier.
ORCID is intended to provide a lasting identity for authors and to facilitate the work of those who monitor the publishing activity of research institutes, especially in cases where the researcher appears in the systems as an author with different variants of the name (e.g. with or without diacritics).
ResearcherID is a tool for identifying authors and managing their publications in the Web of Science database. Each researcher creates their own ResearcherID after logging in to the Publons platform.
Scopus Author Identifier (Author ID) is assigned when processing records of publications into the Scopus database in order to group all works of the author into one author's profile, regardless of variants of the name given in the publications.
Unlike other identifiers, the Scopus Author ID is not created by the author but by the Scopus database itself. Authors can only pair their Scopus Author ID with ORCID, or request corrections to it (a typical problem is Scopus creating duplicate profiles for the same author).
The h-index is an indicator evaluating the publishing activity of individual authors or groups of authors, calculated from the citation responses to their published scientific articles. It was proposed by J. E. Hirsch in 2005.
The h-index is defined as the largest number h such that the author has h articles each cited at least h times.
It expresses the scientific output of a researcher, but because it is tied to a scientific discipline, it should not be used to compare the performance of authors in different fields.
The h-index for the same author may differ in different databases (Web of Science and Scopus), as the calculation operates with the data of a specific database.
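The definition above can be illustrated with a short Python sketch (a minimal illustration of the calculation itself, not tied to any particular database):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    # Sort citation counts in descending order.
    ordered = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ordered, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# An author with papers cited 10, 8, 5, 4 and 3 times has h-index 4:
# four papers are each cited at least four times.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Because the input is the citation counts recorded in a specific database, the same author can get a different h-index from Web of Science than from Scopus.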
Journal metrics characterize a journal on the basis of citation responses to the articles published in it. They usually express the ratio of citations received to the number of all articles published in a certain period (e.g. two, three or five years). Journal metrics can be found in the Web of Science and Scopus citation databases, each of which calculates its own metrics.
Web of Science
Impact Factor (IF) is one of the most important indicators of the scientific level of serial publications (journals). It is recalculated and published annually and tells how many times the articles published in the journal during the previous two years were cited in the given year.
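The two-year Impact Factor is essentially a simple ratio, which can be sketched as follows (the journal and its figures below are hypothetical, not taken from any real title):

```python
def impact_factor(citations_in_year, articles_prev_two_years):
    """Two-year Impact Factor: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of citable
    articles published in those two years."""
    return citations_in_year / articles_prev_two_years

# Hypothetical journal: 480 citations received in 2023 to articles
# from 2021-2022, which together contained 200 citable articles.
print(impact_factor(480, 200))  # → 2.4
```

CiteScore, described below among the Scopus metrics, is computed the same way but over a three-year publication window.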
Article Influence Score (AIS) determines the average impact of articles in a given journal for the first five years after publication.
A score greater than 1.0 means that the journal's articles have above-average influence.
In the evaluation of publishing activity, this indicator has recently been replacing the frequently used Impact Factor; the AIS is therefore also used in the current Methodology 17+ for the evaluation of research organizations.
The coefficients of journals are processed in a separate database Journal Citation Reports, which is part of the Web of Science.
Scopus
SCImago Journal Rank (SJR) is a sophisticated metric similar to Google's PageRank (a number Google assigns to each URL expressing the credibility or importance of a website). It takes into account the field of the journal and weights citations, since not all citations are equally valuable.
CiteScore is calculated in the same way as the Impact Factor, except that the number of citations is divided by the number of articles published in the previous three years instead of two.
Source Normalized Impact per Paper (SNIP) expresses the ratio of the average number of citations per article in a given journal to the citation potential of its field.
You can find these coefficients directly in the Scopus database in the "Sources" section.
Rankings of journals in the field
Each journal in the Web of Science or Scopus is classified into one or more subject categories (fields). In a given field, journals can be sorted by quality using the selected metric and divided into quartiles. The first quartile (Q1) contains 25% of the most prestigious journals, in the following quartiles (Q2, Q3) the value of the metric continues to decrease and the last quartile (Q4) contains 25% of the "weakest" journals in the field.
In the Web of Science, the Impact Factor is chosen as the metric for quartiles (however, the Article Influence Score can also be used); in the Scopus database, the CiteScore metric is used to divide journals into quartiles.
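The division into quartiles described above can be sketched in Python (the journal names and metric values below are hypothetical, and the chosen metric could equally be the Impact Factor or CiteScore):

```python
def assign_quartiles(metrics):
    """Sort the journals of one subject category by a metric
    (e.g. Impact Factor) and assign each to a quartile:
    Q1 = top 25% of the category, Q4 = bottom 25%."""
    ranked = sorted(metrics.items(), key=lambda kv: kv[1], reverse=True)
    total = len(ranked)
    result = {}
    for rank, (journal, _) in enumerate(ranked, start=1):
        # ceil(4 * rank / total) maps rank 1..total onto quartiles 1..4.
        q = -(-4 * rank // total)
        result[journal] = f"Q{q}"
    return result

# Hypothetical category of eight journals with their metric values:
ifs = {"A": 6.2, "B": 4.9, "C": 3.1, "D": 2.8,
       "E": 1.9, "F": 1.2, "G": 0.8, "H": 0.4}
print(assign_quartiles(ifs))
```

With eight journals, each quartile contains two titles: the two with the highest metric values land in Q1, the two "weakest" in Q4.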
While standard bibliometric services work with citation responses, alternative metrics (or altmetrics) take into account a wider range of indicators.
In addition to the citation of the given document, the evaluation also includes the responses on social networks, the number of downloads of this document into reference managers, the number of views and other criteria.
The advantage of altmetrics is that they measure the impact of publication outputs immediately after their publication and work at the level of individual documents (not only articles). The most important services in this area are PlumX, Altmetric and ImpactStory.
Open Science includes principles such as open access, open data, open methodologies, open source code, open peer review, open educational materials, alternative metrics, citizen involvement in science and more.
It provides a new framework for scientific research and, in particular, improves the availability of scientific results, transparency, reproducibility, collaboration and the efficiency of dissemination of results.
Open Access provides permanent, free and immediate access to the results of science and research via the Internet.
The best-known projects in this area include the electronic preprint archive arXiv.org (operating since 1992), the Directory of Open Access Journals (DOAJ) and the non-profit publisher Public Library of Science (PLOS).
You can find other Open Access resources here on the portal in the list of useful links.
The European Commission has also addressed Open Access over the long term through the OpenAIRE (Open Access Infrastructure for Research in Europe) initiative, which aims to support Open Access policy through a technical infrastructure. The OpenAIRE portal is a network of open repositories, archives and journals that support Open Access policies.
The Open Access trend has also been adopted by many publishers of scientific journals, which offer the complete or partial content of selected titles in Open Access mode on their platforms.
Predatory journals
Predatory journals make published articles freely available and let authors publish for a fee to the publisher, without the text going through the usual peer-review procedure. The danger of publishing in such journals lies in the threat to the professional reputation of the author and of the research institution.
The basic instructions for verifying the credibility of the journal are provided by the guide on the website: Think. Check. Submit.
When reviewing a journal, you can follow the standards of transparency and good practice of scientific publishing set by the Committee on Publication Ethics (COPE), the Open Access Scholarly Publishing Association (OASPA), or administrators of the Directory of Open Access Journals (DOAJ).
To avoid publishing your article in a predatory journal, look for journals that meet the following criteria:
- a clearly stated article processing charge (APC) for authors
- truthful statements about the journal's citation metrics
- a declared length and quality of the peer-review process
- a declared Open Access regime (e.g. indexing in DOAJ)
- accessible articles and an available journal archive
- verifiable members of the editorial board
- genuine contact details of the journal's publisher
- a name and graphic design that do not imitate a more prestigious journal
- an ISSN stated on the journal's page that can be checked in the ROAD database
- agreement between the published topics and the content profile of the journal
- positive references from colleagues or the scientific community