Abstract
The objective of this study was to construct and validate an evaluation instrument for the analysis of articles submitted to the Meta: Evaluation Journal, with a view to approval and subsequent publication. Peer review of articles is essential to the maintenance of scientific journals, as the quality of expert opinions supports the legitimacy of the research presented by authors and the reliability of the work to be published. An expert-centered approach was used, since adapting the evaluation instrument requires consistent knowledge of the peer review process adopted by the journal. The study was carried out in six stages. The first focused on a literature review on peer review. In the second, the guidelines and recommendations of the Committee on Publication Ethics and the commercial publishers Wiley, Elsevier, Springer Nature, Taylor & Francis, and Sage composed the theoretical framework. In the third stage, relevant aspects of the framework were selected and adapted, together with the evaluation form used by the Meta: Evaluation Journal, and a checklist containing evaluation categories and indicators was drawn up. In the fourth, an instrument was built for the technical and content validation of the checklist. In the fifth, the checklist was validated by four evaluation experts, members of the journal's editorial team. These experts considered the nine categories pertinent and suggested minor modifications to 22 indicators, the removal of two, and the inclusion of four. In the sixth stage, the validated instrument underwent empirical validation through a pre-test with its target audience, the journal's ad hoc reviewers. In general, the items of the instrument made it possible to judge the article: all 38 assessment items were filled out adequately, with only two additions made to the instructions in the instrument. It is therefore considered that the evaluation instrument developed and validated in this study meets the needs of both the editorial team and the ad hoc reviewers of the Meta: Evaluation Journal.
DOI: https://doi.org/10.56238/sevened2024.007-039