Publication:
Assessing the validity of a learning analytics expectation instrument: a multinational study

dc.affiliation.dpto: UC3M. Departamento de Ingeniería Telemática
dc.affiliation.grupoinv: UC3M. Grupo de Investigación: Aplicaciones y Servicios Telemáticos (GAST)
dc.contributor.author: Whitelock-Wainwright, Alexander
dc.contributor.author: Gasevic, Dragan
dc.contributor.author: Tsai, Yi Shan
dc.contributor.author: Drachsler, Hendrik
dc.contributor.author: Scheffel, Maren
dc.contributor.author: Muñoz Merino, Pedro José
dc.contributor.author: Tammets, Kairit
dc.contributor.author: Delgado Kloos, Carlos
dc.date.accessioned: 2022-05-16T11:22:43Z
dc.date.available: 2022-05-16T11:22:43Z
dc.date.issued: 2020-04
dc.description.abstract: To assist higher education institutions in meeting the challenge of limited student engagement in the implementation of Learning Analytics services, the Questionnaire for Student Expectations of Learning Analytics (SELAQ) was developed. This instrument contains 12 items, which are explained by a purported two-factor structure of "Ethical and Privacy Expectations" and "Service Feature Expectations". As it stands, however, the SELAQ has only been validated with students from a UK university, which is problematic given that interest in Learning Analytics extends beyond this context. Thus, the aim of the current work was to assess whether the translated SELAQ could be validated in three contexts (an Estonian, a Spanish, and a Dutch university). The findings show that the model provided acceptable fits in both the Spanish and Dutch samples, but was not supported in the Estonian student sample. In addition, an assessment of local fit is undertaken for each sample, which provides important points to be considered in future work. Finally, a general comparison of expectations across contexts is undertaken and discussed in relation to the General Data Protection Regulation (2018).
dc.format.extent: 32
dc.identifier.bibliographicCitation: Journal of Computer Assisted Learning, 36(2), April 2020, pp. 209-240
dc.identifier.doi: https://doi.org/10.1111/jcal.12401
dc.identifier.issn: 1365-2729
dc.identifier.publicationfirstpage: 209
dc.identifier.publicationissue: 2
dc.identifier.publicationlastpage: 240
dc.identifier.publicationtitle: Journal of Computer Assisted Learning
dc.identifier.publicationvolume: 36
dc.identifier.uri: https://hdl.handle.net/10016/34810
dc.identifier.uxxi: AR/0000025651
dc.language.iso: eng
dc.publisher: John Wiley & Sons Ltd.
dc.rights: © 2020 John Wiley & Sons Ltd.
dc.rights.accessRights: open access
dc.subject.other: Learning analytics
dc.subject.other: Multinational
dc.subject.other: Questionnaire
dc.subject.other: Student expectations
dc.title: Assessing the validity of a learning analytics expectation instrument: a multinational study
dc.title.alternative: Multinational validity of an instrument
dc.type: research article
dc.type.hasVersion: SMUR
dspace.entity.type: Publication
Files
Original bundle
Name: assessing_JCAL_2020_pr.pdf
Size: 908.52 KB
Format: Adobe Portable Document Format