Australasian Journal of Educational Technology, 2022, 38(3).
al., 2020; Perdomo et al., 2020; Zhao et al., 2021). Moreover, systematic reviews should be repeated regularly, and their results applied to advance theory and practice (Pettersson, 2018).
Finally, it is important to analyse the role that universities play in DC development to enhance links between
policy, organisational infrastructures, strategic leadership and teachers and teaching practices (Pettersson,
2018). In this sense, it is necessary to consider the connection between teaching competence and
pedagogical leadership for educational innovation and the importance of digital teacher training for the
development of student and institutional competencies (Fernández-Batanero et al., 2020).
In relation to RQ3, a broad variability was observed when assessing methodological quality, organised into
low, medium and high-quality clusters. Similar to the results of Polanin et al. (2017), one of the most concerning findings of the current study is the quality of reporting in the reviews, both in terms of methodological reporting and reporting of the included primary studies. There were omissions across a
range of criteria, as reported in the results. These findings can again be explained by the broader consensus
in the literature about methodological quality and relevance of EdTech research more generally (Bulfin et
al., 2020; Castañeda et al., 2018; Hew et al., 2019) as well as a lack of clear guidelines for systematic
reviews
in educational research, where much of the methodological literature comes from the health
sciences (Aromataris et al., 2015; Pollock et al., 2021). Several factors may explain these results: disciplinary differences, since critical appraisal is carried out primarily in the health sciences; pragmatic concerns, such as time constraints or the fear that too many studies will be excluded; and a lack of familiarity with guidelines for conducting systematic reviews.