Abstract
The thesis is structured into three parts that describe the iterative process of creating, applying, evaluating and improving successive versions of the evaluation framework for learning analytics (EFLA). The first part describes the identification of quality indicators for learning analytics and the initialisation, first evaluation and first improvement of the EFLA, based on input from the learning analytics community and related literature. The second part applies the EFLA to a collaborative learning support widget and describes the subsequent evaluation and improvement of the framework. The third part illustrates the application of the EFLA to widgets of a massive open online course platform and explains the final evaluation and validation of the framework. The thesis concludes with a General Discussion of the results reported across all studies: in addition to a summary of the findings, general limitations are reviewed and practical implications are discussed.
| Original language | English |
|---|---|
| Qualification | PhD |
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 22 Sept 2017 |
| Publisher | |
| Publication status | Published - 22 Sept 2017 |
Keywords
- learning analytics
- evaluation framework
- doctoral thesis
- awareness
- reflection
- impact
- validation
- EFLA