How to publish?
At Charlesworth Author Services online seminars, you will learn how to write professional texts and publish them successfully.
On the AiP website, you will find recordings of the webinars that took place in 2019 and 2020.
The originality of the text can be verified using the freely available Unicheck Plagiarism Checker.
For diploma theses in the Czech Republic, possible plagiarism is checked directly in the national register Theses.cz.
Palacký University subscribes to the Grammarly Business tool, which enables proofreading of texts in English.
Using artificial intelligence, it corrects grammatical errors, suggests synonyms to make the text more readable and accurate, and can even check documents for plagiarism.
The extension can be installed in a web browser; you can then use Grammarly's Editor or check text directly in Word or Outlook within Microsoft Office.
If you are interested in using Grammarly Business, contact us at email@example.com.
Bibliometrics deals with the measurement and quantitative analysis of documents arising from scientific communication. It examines citation links, analyzes the publishing activities of researchers, monitors international publication trends, etc.
The quality of the publishing activity of individual authors, as well as the quality of scientific journals, is usually measured on the basis of a citation index, i.e. the citation response. The term citation index often also refers to the publication and citation database itself, which offers a number of indicators and services that work with citation data.
The world's best-known citation indexes are included in the Web of Science (formerly Web of Knowledge) database created by The Institute for Scientific Information (ISI) and now managed by Clarivate Analytics.
ORCID (Open Researcher and Contributor ID) unambiguously identifies scientific and other academic authors, independently of the databases in which the publications of these authors are registered.
After creating an ORCID profile, authors can link records of their publications in selected databases, including Web of Science and Scopus, to their ORCID identifier.
ORCID is intended to provide a lasting identity for authors and to facilitate the work of those who monitor the publishing activity of research institutes, especially in cases where the researcher appears in the systems as an author with different variants of the name (e.g. with or without diacritics).
ResearcherID is a tool for identifying authors and managing their publications in the Web of Science database. Each scientist creates their own ResearcherID after logging in to the Publons platform.
Scopus Author Identifier (Author ID) is assigned when processing records of publications into the Scopus database in order to group all works of the author into one author's profile, regardless of variants of the name given in the publications.
Unlike other identifiers, the Scopus Author ID is not created by the authors themselves but by the Scopus database. Authors can only pair their Scopus Author ID with ORCID, or request corrections to it (a typical problem is Scopus creating duplicate profiles for the same author).
The h-index is an indicator evaluating the publishing activity of individual authors or groups of authors, calculated from the citation responses to their published scientific articles. It was proposed by J. E. Hirsch in 2005.
The h-index is defined as the largest number h such that the author has h publications that have each been cited at least h times.
It expresses the scientific output of a researcher; because it is tied to a scientific discipline, it should not be used to compare the performance of authors across different fields.
The h-index for the same author may differ in different databases (Web of Science and Scopus), as the calculation operates with the data of a specific database.
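The definition above can be sketched in a few lines of code (illustrative only; real services compute the h-index from their own citation data):

```python
def h_index(citations):
    """Largest h such that the author has h papers
    with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# An author whose papers are cited [10, 8, 5, 4, 3] times has h-index 4:
# 4 papers have at least 4 citations, but there are not 5 papers with >= 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because the count starts from the most-cited papers, adding many rarely cited papers does not raise the h-index.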
Journal metrics characterize a journal on the basis of citation responses to the articles published in it. They usually express the ratio of citations to the number of articles published in a certain period (e.g. two, three or five years). You can find journal metrics in the Web of Science and Scopus citation databases; each processes its own metrics.
Web of Science
Impact Factor (IF) is one of the most important indicators of the scientific level of serial publications (journals). It is recalculated and published annually and indicates how many times the articles published in the journal during the previous two years were cited in the given year.
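The two-year ratio described above can be sketched as follows (a simplified illustration with hypothetical numbers; the official calculation also applies rules about which items count as "citable"):

```python
def impact_factor(citations_in_year, items_prev_two_years):
    """Two-year Impact Factor for year Y: citations received in Y
    to items the journal published in Y-1 and Y-2, divided by
    the number of those items."""
    return citations_in_year / items_prev_two_years

# Hypothetical journal: 300 citations in 2023 to articles
# from 2021 and 2022, of which there are 150 in total.
print(impact_factor(300, 150))  # 2.0
```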
Article Influence Score (AIS) determines the average impact of articles in a given journal for the first five years after publication.
A score greater than 1.0 means that the average article in the journal has above-average influence.
In the evaluation of publishing activity, the AIS has recently replaced the frequently used Impact Factor, and it is the indicator applied in the current Methodology 17+ for the evaluation of research organizations.
The coefficients of journals are processed in a separate database Journal Citation Reports, which is part of the Web of Science.
Scopus
SCImago Journal Rank (SJR) is a sophisticated metric similar to Google PageRank (the number Google assigns to each URL to express the credibility or importance of a website): it takes into account the field of the journal and the weight of citations, since not all citations are equally valuable.
CiteScore is a metric calculated in the same way as the Impact Factor; the difference is that the number of citations is divided by the number of articles published in the previous three years instead of two.
Source Normalized Impact per Paper (SNIP) expresses the ratio of the average number of citations per article of a given journal and the citation potential of the field.
You can find these coefficients directly in the Scopus database in the "Sources" section.
Rankings of journals in the field
Each journal in the Web of Science or Scopus is classified into one or more subject categories (fields). In a given field, journals can be sorted by quality using the selected metric and divided into quartiles. The first quartile (Q1) contains 25% of the most prestigious journals, in the following quartiles (Q2, Q3) the value of the metric continues to decrease and the last quartile (Q4) contains 25% of the "weakest" journals in the field.
In the Web of Science, the Impact Factor is chosen as the metric for quartiles (however, the Article Influence Score can also be used); in the Scopus database, the CiteScore metric is used to divide journals into quartiles.
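The quartile assignment described above can be sketched as follows (illustrative only; Web of Science and Scopus apply their own boundary and tie-breaking rules):

```python
def quartile(rank, total):
    """Assign Q1-Q4 from a journal's rank (1 = best) in a field
    of `total` journals sorted by the chosen metric."""
    fraction = rank / total
    if fraction <= 0.25:
        return "Q1"
    elif fraction <= 0.50:
        return "Q2"
    elif fraction <= 0.75:
        return "Q3"
    return "Q4"

# A journal ranked 10th of 100 in its category falls in Q1;
# one ranked 60th falls in Q3.
print(quartile(10, 100))  # Q1
print(quartile(60, 100))  # Q3
```

Note that the same journal can sit in different quartiles in different subject categories, or in Web of Science versus Scopus, because the rankings use different metrics.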
While standard bibliometric services work with citation responses, alternative metrics (or altmetrics) take into account a wider range of indicators.
In addition to citations of a given document, the evaluation also includes responses on social networks, the number of times the document has been saved to reference managers, the number of views, and other criteria.
The advantage of altmetrics is that they measure the impact of publication outputs immediately after their publication and work at the level of individual documents (not only articles). The most important services in this area are PlumX, Altmetric and ImpactStory.
Open Science includes principles such as open access, open data, open methodologies, open source code, open peer review, open educational materials, alternative metrics, citizen involvement in science and more.
It provides a new framework for scientific research and, in particular, improves the availability of scientific results, transparency, reproducibility, collaboration and the efficiency of dissemination of results.
See more on the website of Palacký University Open Science.