Publications (6 of 6)
Carlsson, R., Batinovic, L., Hyltse, N., Kalmendal, A., Nordström, T. & Topor, M. (2024). A Beginner’s Guide to Open and Reproducible Systematic Reviews in Psychology. Collabra: Psychology, 10(1)
2024 (English). In: Collabra: Psychology, E-ISSN 2474-7394, Vol. 10, no 1. Article in journal (Refereed). Published.
Abstract [en]

This paper provides guidance and tools for conducting open and reproducible systematic reviews in psychology. It emphasizes the importance of systematic reviews for evidence-based decision-making and the growing adoption of open science practices. By sharing data, materials, and code, open science enhances transparency and reproducibility and minimizes bias in systematic reviews. It also fosters collaboration and enables the involvement of non-academic stakeholders. The paper is designed for beginners, offering accessible guidance for navigating the many standards and resources that may not obviously align with specific areas of psychology. It covers standards for systematic review conduct, pre-registration, registered reports, reporting standards, and open data, materials, and code. The paper concludes with a glimpse of recent innovations such as Community Augmented Meta-Analysis and independent reproducibility checks.

Place, publisher, year, edition, pages
University of California Press, 2024
Keywords
systematic review, open science, reproducibility, guide
National Category
Psychology
Identifiers
urn:nbn:se:liu:diva-210579 (URN)
10.1525/collabra.126218 (DOI)
001379350300001 ()
2-s2.0-85213028550 (Scopus ID)
Funder
Swedish Research Council, 2020-03430
Note

Funding Agencies: Swedish Research Council [2020-03430]

Available from: 2025-01-01 Created: 2025-01-01 Last updated: 2025-05-05
Batinovic, L. & Andre, K. (2024). Community Augmented Meta-Analysis of Evidence in Learning and Didactics. In: META-REP 2024. Paper presented at META-REP 2024, Munich, 2024.
2024 (English). In: META-REP 2024, 2024. Conference paper, Poster (with or without abstract) (Other academic).
Abstract [en]

Community-augmented meta-analysis (CAMA) platforms pioneer a new standard for promoting FAIR (findable, accessible, interoperable, reusable) data sharing, and allow dynamic and interactive meta-analysis that ensures reproducibility of results (Tsuji et al., 2014). As the area of (special) education research moves towards open science practices, our CAMA platform sets out to facilitate data sharing for meta-analyses and make evidence-based practice accessible to practitioners. Furthermore, we aim to promote high-quality standards in conducting evidence synthesis, which are still not readily implemented in education research (Nordström et al., 2023). Our platform provides Bayesian and frequentist meta-analytic methods, an interactive interface for conducting and visualizing the analyses, easy-to-understand results, and a large database of extracted effects that can be downloaded and reused by researchers. Users can conduct various moderator analyses, based on demographic information, risk of bias assessment, or study characteristics, and contribute to the database by submitting newly extracted effects.

National Category
Educational Sciences
Identifiers
urn:nbn:se:liu:diva-209160 (URN)
Conference
META-REP 2024, Munich, 2024
Available from: 2024-11-06 Created: 2024-11-06 Last updated: 2025-11-03. Bibliographically approved.
Batinovic, L., Howe, M., Sinclair, S. & Carlsson, R. (2023). Ageism in Hiring: A Systematic Review and Meta-analysis of Age Discrimination. Collabra: Psychology, 9(1)
2023 (English). In: Collabra: Psychology, E-ISSN 2474-7394, Vol. 9, no 1. Article in journal (Refereed). Published.
Abstract [en]

We aimed to identify effect sizes of age discrimination in recruitment based on evidence from correspondence studies and scenario experiments conducted between 2010 and 2019. To differentiate our results, we separated outcomes (i.e., call-back rates and hiring/invitation-to-interview likelihood) by age group (40-49, 50-59, 60-65, 66+) and assessed age discrimination by comparing older applicants to a control group (29-35 year-olds). We conducted searches in PsycInfo, Web of Science, ERIC, BASE, and Google Scholar, along with backward reference searching. Study bias was assessed with a tool developed for this review, and publication bias by calculating R-index, p-curve, and funnel plots. We calculated odds ratios for callback rates, pooled the results using a random-effects meta-analysis, and calculated 95% confidence intervals. We included 13 studies from 11 articles in our review, and conducted meta-analyses on the eight studies that we were able to extract data from. The majority of studies were correspondence studies (k=10) and came largely from European countries (k=9), with the rest being from the U.S. (k=3) and Australia (k=1). Seven studies had a between-participants design, and the remaining six studies had a within-participants design. We conducted six random-effects meta-analyses, one for each age category and type of study design, and found an average effect of age discrimination against all age groups in both study designs, with varying effect sizes (ranging from OR = 0.38, CI [0.25, 0.59] to OR = 0.89, CI [0.81, 0.97]). There was moderate to high risk of bias on certain factors (e.g., age randomization and application heterogeneity). Overall, there is an effect of age discrimination, and it tends to increase with age. This has important implications for the future of the world's workforce, given the growing share of older workers and later retirement.
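The pooling step described in the abstract (per-study odds ratios combined under a random-effects model with 95% confidence intervals) can be sketched as follows. This is an illustration only, not the authors' analysis code: the study values are hypothetical, and the between-study variance is estimated with the common DerSimonian-Laird method, which the abstract does not name.

```python
import math

def pool_random_effects(odds_ratios, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.

    odds_ratios: per-study ORs; variances: sampling variances of the log ORs.
    Returns the pooled OR and its 95% confidence interval.
    """
    y = [math.log(or_) for or_ in odds_ratios]
    w = [1.0 / v for v in variances]               # inverse-variance weights
    k = len(y)
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (math.exp(y_re - 1.96 * se), math.exp(y_re + 1.96 * se))
    return math.exp(y_re), ci

# hypothetical studies: OR < 1 means older applicants received fewer callbacks
or_pooled, (lo, hi) = pool_random_effects(
    odds_ratios=[0.45, 0.60, 0.38, 0.72],
    variances=[0.04, 0.02, 0.09, 0.05],
)
print(f"pooled OR = {or_pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

In practice such an analysis would typically be run with a dedicated package (e.g., metafor in R), which also supports the moderator analyses by age category and study design the abstract describes.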

Place, publisher, year, edition, pages
University of California Press, 2023
Keywords
ageism, discrimination, systematic review, meta-analysis
National Category
Psychology (excluding Applied Psychology)
Identifiers
urn:nbn:se:liu:diva-197800 (URN)
10.1525/collabra.82194 (DOI)
2-s2.0-85169012673 (Scopus ID)
Available from: 2023-09-15 Created: 2023-09-15 Last updated: 2023-12-22. Bibliographically approved.
Batinović, L. & Andre, K. (2023). ELD CAMA platform for educational interventions. Paper presented at the 4th symposium on big data and research syntheses in psychology.
2023 (English). Conference paper, Oral presentation only (Other academic).
Abstract [en]

Background: Community-augmented meta-analysis (CAMA) platforms have begun to set a new standard for promoting FAIR (findable, accessible, interoperable, reusable) data sharing. They allow dynamic and interactive meta-analysis of data and ensure reproducibility of results (Tsuji et al., 2014). As the area of disability research and education moves towards open science practices, the newly created CAMA platform sets out to facilitate data sharing for meta-analyses and make evidence-based practice accessible to practitioners. Furthermore, we aim to promote high-quality standards in conducting evidence synthesis, which are still not readily implemented in education research (Nordström et al., 2022).

Objectives of the CAMA platform: The Evidence in Learning and Didactics, and Disability research CAMA platform will provide meta-analytic tools to conduct both frequentist (http://194.47.110.50:3838/visualization/) and Bayesian (http://194.47.110.51:3838/) meta-analyses of educational interventions for typically developing students and students with intellectual disability. Studies will be subdivided into categories of typically developing students and students with intellectual disability, with further subdivision into educational domains: writing, reading, math, science, and other.

First objective: Create a platform that facilitates sharing of high-quality meta-analyses. The platform will allow publication bias assessment, effect size aggregation, and moderator analysis. One important feature of this CAMA platform is the ability to run analyses conditioned on risk of bias assessments. Quality assessment and risk of bias estimation will be mandatory for dataset inclusion, and it will be possible to conduct analyses on subsets of studies with different risk of bias ratings.

Second objective: Create a platform that is valuable both as a pedagogical and as a research tool. The apps will allow high flexibility in model building and offer an interface with both plain-language and technical explanations of statistical outputs. The goal is for the apps to become an easy-to-use tool for students and researchers aiming to conduct meta-analyses, and to serve as a guide to evidence-based practice for practitioners.
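The risk-of-bias-conditioned analysis described under the first objective can be sketched roughly as follows. The data structure, field names, and effect values are hypothetical, and the platform itself is an interactive web application rather than a script; the sketch only shows the underlying idea of inverse-variance pooling over a subset filtered by domain and risk-of-bias rating.

```python
import math
from dataclasses import dataclass

@dataclass
class Effect:
    g: float      # standardized mean difference (e.g., Hedges' g)
    var: float    # sampling variance of g
    domain: str   # educational domain, e.g., "reading", "math"
    rob: str      # risk-of-bias rating: "low", "some", "high"

def pooled_fixed(effects):
    """Inverse-variance (fixed-effect) pooled estimate with 95% CI."""
    w = [1.0 / e.var for e in effects]
    est = sum(wi * e.g for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

# hypothetical database of extracted effects
db = [
    Effect(0.42, 0.02, "reading", "low"),
    Effect(0.55, 0.05, "reading", "high"),
    Effect(0.31, 0.03, "reading", "low"),
    Effect(0.20, 0.04, "math", "low"),
]

# analyse only low risk-of-bias reading studies, as the platform allows
subset = [e for e in db if e.domain == "reading" and e.rob == "low"]
est, (lo, hi) = pooled_fixed(subset)
print(f"k={len(subset)}: g = {est:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Comparing the pooled estimate across risk-of-bias strata in this way makes visible how much a synthesis depends on study quality.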

National Category
Educational Sciences
Identifiers
urn:nbn:se:liu:diva-200463 (URN)
Conference
4th symposium on big data and research syntheses in psychology
Available from: 2024-01-27 Created: 2024-01-27 Last updated: 2025-05-21. Bibliographically approved.
Stacey, J. E., Danielsson, H., Heinrich, A., Batinović, L., Holmer, E., Ingo, E. & Henshaw, H. (2023). Relationship between self-reported listening and communication difficulties and executive function: a protocol for a systematic review and meta-analysis. BMJ Open, 13(11), Article ID e071225.
2023 (English). In: BMJ Open, E-ISSN 2044-6055, Vol. 13, no 11, article id e071225. Article, review/survey (Refereed). Published.
Abstract [en]

Introduction

Listening and communication difficulties can limit people’s participation in activity and adversely affect their quality of life. Hearing, as well as listening and communication difficulties, can be measured either by using behavioural tests or self-report measures, and the outcomes are not always closely linked. The association between behaviourally measured and self-reported hearing is strong, whereas the association between behavioural and self-reported measures of listening and communication difficulties is much weaker, suggesting they assess different aspects of listening. While behavioural measures of listening and communication difficulties have been associated with poorer cognitive performance including executive functions, the same association has not always been shown for self-report measures. The objective of this systematic review and meta-analysis is to understand the relationship between executive function and self-reported listening and communication difficulties in adults with hearing loss, and where possible, potential covariates of age and pure-tone audiometric thresholds.

Methods and analysis

Studies will be eligible for inclusion if they report data from both a self-report measure of listening difficulties and a behavioural measure of executive function. Eight databases are to be searched: MEDLINE (via Ovid SP), EMBASE (via Ovid SP), PsycINFO (via Ovid SP), ASSIA (via ProQuest), Cumulative Index to Nursing and Allied Health Literature or CINAHL (via EBSCO Host), Scopus, PubMed and Web of Science (Science and Social Science Citation Index). The JBI critical appraisal tool will be used to assess risk of bias for included studies. Results will be synthesised primarily using a meta-analysis, and where sufficient quantitative data are not available, a narrative synthesis will be carried out to describe key results.
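The protocol does not publish analysis code, but as an illustration of what the planned meta-analysis of correlations between self-report measures and executive function might involve, a common approach pools study correlations on the Fisher-z scale with weights n-3. All study values below are hypothetical.

```python
import math

def fisher_z(r):
    """Fisher's variance-stabilizing transform of a correlation."""
    return 0.5 * math.log((1 + r) / (1 - r))

def pool_correlations(rs, ns):
    """Fixed-effect pooling of correlations via Fisher's z.

    Var(z) is approximately 1/(n-3) per study, so weights are n-3.
    Returns the pooled r and its 95% CI, back-transformed with tanh.
    """
    zs = [fisher_z(r) for r in rs]
    w = [n - 3 for n in ns]
    z_bar = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    lo_z, hi_z = z_bar - 1.96 * se, z_bar + 1.96 * se
    return math.tanh(z_bar), (math.tanh(lo_z), math.tanh(hi_z))

# hypothetical studies correlating self-reported listening difficulty
# with an executive-function score (negative = more difficulty, lower EF)
r, (lo, hi) = pool_correlations(rs=[-0.25, -0.31, -0.18], ns=[120, 85, 200])
print(f"pooled r = {r:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```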

Ethics and dissemination

No ethical issues are foreseen. Data will be disseminated via academic publication and conference presentations. Findings may also be published in scientific newsletters and magazines.

Place, publisher, year, edition, pages
BMJ Publishing Group, 2023
National Category
Psychology (excluding Applied Psychology)
Identifiers
urn:nbn:se:liu:diva-199064 (URN)
10.1136/bmjopen-2022-071225 (DOI)
001102645200006 ()
37940150 (PubMedID)
Funder
Swedish Research Council, 2017-06092
Note

Funding: National Institute for Health and Care Research (NIHR) [CDF-2018-11-ST2-016, PB-PG-0816-20044]; NIHR Nottingham Biomedical Research Centre [NIHR 203310]; NIHR Manchester Biomedical Research Centre [NIHR 203308]; Swedish Research Council [2017-06092]

Available from: 2023-11-09 Created: 2023-11-09 Last updated: 2024-05-06. Bibliographically approved.
Nordström, T., Kalmendal, A. & Batinovic, L. (2023). Risk of bias and open science practices in systematic reviews of educational effectiveness: A meta‐review. Review of Education, 11(3), Article ID e3443.
2023 (English). In: Review of Education, E-ISSN 2049-6613, Vol. 11, no 3, article id e3443. Article, review/survey (Refereed). Published.
Abstract [en]

In order to produce the most reliable syntheses of the effectiveness of educational interventions, systematic reviews need to adhere to rigorous methodological standards. This meta-review investigated risk of bias arising while conducting a systematic review, and the presence of open science practices such as data sharing and reproducibility of the review procedure, in recently published reviews in education. We included all systematic reviews of educational interventions, instructions, and methods for all K-12 student populations in any school form, with experimental or quasi-experimental designs (an active manipulation of the intervention) with comparisons, and where the outcome variables were academic performance of any kind. We searched the database Education Resources Information Center (ERIC) for the years 2019-2021. In parallel, we hand-searched four major educational review journals for systematic reviews: Educational Research Review (Elsevier), Educational Review (Taylor & Francis), Review of Education (Wiley), and Review of Educational Research (AERA). Systematic reviews were assessed with the risk of bias tool ROBIS, and for whether the studies had pre-registered protocols, shared primary research data, and whether a third party could reproduce the search strings and the details of where exactly primary research data were extracted. A total of 88 studies that matched our PICOS were included in this review; of these, 10 educational systematic reviews were judged as low risk of bias (approximately 11%). The rest were classified as high risk of bias during a shortened ROBIS assessment, or assessed as high or unclear risk of bias following a full ROBIS assessment. Of the 10 low risk of bias reviews, six had detailed their search in sufficient detail for a third party to reproduce it, and three shared the data from primary studies; however, none had specified how and from where exactly data from primary studies were extracted.

The study shows that at least a small part of systematic reviews in education has a low risk of bias, but most systematic reviews in our set have a high risk of bias in their methodological procedure. Improvements in this field are still to be expected, as even the low risk of bias reviews are not consistent regarding pre-registered protocols, data sharing, reproducibility of primary research data, and reproducible search strings.

Context and implications

Rationale for this study: Rigorous systematic review is the method of choice for evaluating and synthesising methods and interventions in educational research.

Why the new findings matter: Alarmingly few systematic reviews were judged as having low risk of bias, and few reviews utilise recent open science practices. This might undermine the reliability of study findings.

Implications for journals and review researchers: This meta-review highlights the need for review researchers and journals to better adopt best practices in systematic reviews. We therefore urge better dissemination and awareness of systematic review standards, and more transparency in the review process, which may lead to more reliable syntheses of what works best in education. In the long run, this is a necessary change and the way forward to provide evidence-based practice in the classroom.

Place, publisher, year, edition, pages
Wiley, 2023
Keywords
education, meta-review, open science, reproducibility, risk of bias, systematic review
National Category
Educational Sciences
Identifiers
urn:nbn:se:liu:diva-199902 (URN)10.1002/rev3.3443 (DOI)001135375900005 ()
Funder
Swedish Research Council
Note

Funding: Vetenskapsrådet (Swedish Research Council)

Available from: 2024-01-03 Created: 2024-01-03 Last updated: 2024-10-16
Identifiers
ORCID iD: orcid.org/0000-0002-1017-0025