Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. A related field is educational data mining.
Definition
The definition and aims of learning analytics are contested. One earlier definition discussed by the community suggested that "Learning analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning." But this definition has been criticised by George Siemens and Mike Sharkey.
A more holistic view than a mere definition is provided by the framework of learning analytics by Greller and Drachsler (2012). It uses a general morphological analysis (GMA) to divide the domain into six "critical dimensions".
A systematic overview of learning analytics and its key concepts is provided by Chatti et al. (2012) and Chatti et al. (2014) through a reference model for learning analytics based on four dimensions, namely data, environments and context (what?), stakeholders (who?), objectives (why?), and methods (how?).
In a briefing paper, Powell and MacNeill point out that there is broad awareness of analytics across educational institutions for various stakeholders, but that the way learning analytics is defined and implemented may vary, including:
- for individual learners to reflect on their achievements and patterns of behaviour in relation to others;
- as predictors of students requiring extra support and attention;
- to help teachers and support staff plan supporting interventions with individuals and groups;
- for functional groups such as course teams seeking to improve current courses or develop new curriculum offerings; and
- for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures.
In that briefing paper, Powell and MacNeill go on to point out that some motivations and implementations of analytics may come into conflict with others, for example highlighting potential conflict between analytics for individual learners and organisational stakeholders.
Gašević, Dawson, and Siemens argue that the computational aspects of learning analytics need to be linked with existing educational research if the field is to deliver on its promise to understand and optimize learning.
Differentiating learning analytics and educational data mining
Differentiating the fields of educational data mining (EDM) and learning analytics (LA) has been a concern of several researchers. George Siemens takes the position that educational data mining encompasses both learning analytics and academic analytics, the latter of which is aimed at governments, funding agencies, and administrators rather than learners and faculty. Baepler and Murdoch define academic analytics as an area that "...combines select institutional data, statistical analysis, and predictive modeling to create intelligence upon which learners, instructors, or administrators can change academic behavior". They go on to attempt to disambiguate educational data mining from academic analytics based on whether the process is hypothesis-driven or not, though Brooks questions whether this distinction exists in the literature. Brooks instead proposes that a better distinction between the EDM and LA communities lies in their roots: authorship in the EDM community is dominated by researchers coming from intelligent tutoring paradigms, while learning analytics researchers are more focused on enterprise learning systems (e.g. learning content management systems).
Regardless of the differences between the LA and EDM communities, the two areas have significant overlap both in the objectives of investigators and in the methods and techniques used in investigation. In the MS program in learning analytics offered at Teachers College, Columbia University, students are taught both EDM and LA methods.
History
The context of learning analytics
In "The State of Learning Analytics in 2012: A Review and Future Challenges", Rebecca Ferguson traces the development of learning analytics through:
- The increasing interest in big data for business intelligence
- The rise of online education focussed around virtual learning environments (VLEs), content management systems (CMSs), and management information systems (MIS) for education, which saw an increase in digital data regarding student background (often held in the MIS) and learning log data (from VLEs). This development afforded the opportunity to apply business intelligence techniques to educational data
- Questions regarding the optimisation of systems to support learning, particularly given that instructors cannot directly observe whether a student is engaged or understanding the material in an online setting
- Increasing focus on evidencing progress and professional standards for accountability systems
- This focus gave teachers a stake in analytics, given their association with accountability systems
- Thus an increasing emphasis was placed on the pedagogic affordances of learning analytics
- This pressure is increased by the economic desire to improve engagement in online education for the delivery of high-quality, affordable education
History of techniques and methods of learning analytics
In a discussion of the history of analytics, Cooper highlights a number of communities from which learning analytics draws techniques, including:
- Statistics, a well-established means of addressing hypothesis testing.
- Business intelligence, which has similarities with learning analytics, although it has historically been targeted at making the production of reports more efficient through enabling data access and summarising performance indicators.
- Web analytics, where tools such as Google Analytics report on web page visits and references to websites, brands and other key terms across the internet. The finer-grained of these techniques can be adopted in learning analytics to explore student trajectories through learning resources (courses, materials, etc.).
- Operational research, which aims at design optimisation for maximising objectives through the use of mathematical models and statistical methods. Such techniques are used in learning analytics efforts that seek to create models of real-world behaviour for practical application.
- Artificial intelligence and data mining, where machine learning techniques built on data mining and AI methods can detect patterns in data. In learning analytics, such techniques can be used for intelligent tutoring systems, classification of students in more dynamic ways than by simple demographic factors, and resources such as "suggested course" systems modelled on collaborative filtering techniques.
- Social network analysis (SNA), which analyses relationships between people by exploring implicit (e.g. interactions on forums) and explicit (e.g. "friends" or "followers") ties, online and offline. SNA developed from the work of sociologists such as Wellman and Watts, and mathematicians such as Barabási and Strogatz. Their work has provided a good sense of the patterns that networks exhibit (small world, power laws), the attributes of connections (in the early 1970s, Granovetter explored connections in terms of tie strength and impact on new information), and the social dimensions of networks (for example, geography still matters in a digitally networked world). SNA is particularly used to explore clusters of networks, influence networks, and engagement and disengagement, and has been deployed for these purposes in learning analytics contexts.
- Information visualization, which is an important step in many analytics for sensemaking around the data provided, and is used across most techniques (including those above).
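As a minimal illustration of the social network analysis techniques listed above, the following sketch computes degree centrality over a small network of forum-reply interactions. The student names and interaction data are invented for the example, and real deployments would typically use a dedicated SNA library rather than this hand-rolled version:

```python
from collections import defaultdict

# Hypothetical forum-reply ties: (student who replied, student replied to)
interactions = [
    ("alice", "bob"), ("bob", "carol"), ("carol", "alice"),
    ("dave", "alice"), ("dave", "bob"), ("erin", "alice"),
]

# Build an undirected adjacency structure from the reply ties
neighbours = defaultdict(set)
for a, b in interactions:
    neighbours[a].add(b)
    neighbours[b].add(a)

# Degree centrality: the fraction of other students each student is tied to.
# Highly central students may act as hubs in the discussion network.
n = len(neighbours)
centrality = {s: len(ties) / (n - 1) for s, ties in neighbours.items()}

for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")
```

In this toy network, "alice" is replied to by everyone and so has the maximum centrality of 1.0, while "erin" interacts with only one other student.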
History of learning analytics in higher education
The first graduate program focused specifically on learning analytics was created by Ryan S. Baker and launched in the Fall 2015 semester at Teachers College, Columbia University. The program description states that
"data about learning and learners are being generated today on an unprecedented scale. The fields of learning analytics (LA) and educational data mining (EDM) have emerged with the aim of transforming this data into new insights that can benefit students, teachers, and administrators. As one of the world's leading teaching and research institutions in education, psychology, and health, we are proud to offer an innovative graduate curriculum dedicated to improving education through technology and data analysis."
Analytic methods
Methods for learning analytics include:
- Content analysis, particularly of resources which students create (such as essays).
- Discourse analytics, which aims to capture meaningful data on student interactions and (unlike social network analytics) explores the properties of the language used, as opposed to just the network of interactions or forum-post counts.
- Social learning analytics, which is aimed at exploring the role of social interaction in learning, the importance of learning networks, and the discourse used for sensemaking.
- Disposition analytics, which seeks to capture data regarding students' dispositions to their own learning, and the relationship of these dispositions to their learning. For example, "curious" learners may be more inclined to ask questions, and this data can be captured and analysed for learning analytics.
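A very simple form of the content analysis named above can be sketched as term-frequency counting over student-created texts. The essay excerpts and stopword list below are invented for illustration; real content analytics would use far richer linguistic processing:

```python
import re
from collections import Counter

# Hypothetical student essay excerpts; in practice these would come from
# submissions held in a learning environment.
essays = {
    "alice": "Photosynthesis converts light energy into chemical energy.",
    "bob": "Plants use light. Light is energy. Energy is stored.",
}

# A tiny illustrative stopword list
STOPWORDS = {"is", "into", "use", "the", "a", "of"}

def term_frequencies(text):
    """Lowercase, tokenise, drop stopwords, and count the remaining terms."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

for student, text in essays.items():
    print(student, term_frequencies(text).most_common(2))
```

Even this crude measure surfaces which concepts each student emphasises; more serious approaches layer on stemming, topic modelling, or semantic analysis.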
Analytic outcomes
Analytics have been used for:
- Prediction purposes, for example to identify "at risk" students in terms of drop out or course failure
- Personalization & adaptation, to provide students with tailored learning pathways, or assessment materials
- Intervention purposes, providing educators with information to intervene to support students
- Information visualization, typically in the form of so-called learning dashboards, which provide an overview of learning data through data visualisation tools
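The prediction use case above can be sketched with a simple weighted engagement score. The feature names, weights, and thresholds below are purely illustrative assumptions, not a method from the literature; production systems would fit a statistical or machine-learned model to historical outcome data instead:

```python
# Hypothetical per-student engagement features: logins per week,
# assignments submitted, forum posts. All values are invented.
students = {
    "s1": {"logins": 12, "submitted": 5, "posts": 8},
    "s2": {"logins": 1,  "submitted": 1, "posts": 0},
    "s3": {"logins": 6,  "submitted": 3, "posts": 2},
}

def risk_score(f):
    """Weighted score in [0, 1]: lower engagement yields higher risk.

    Each feature is compared against an illustrative 'expected' level;
    the weights are arbitrary placeholders, not calibrated values.
    """
    login_gap = max(0.0, 1 - f["logins"] / 10)
    submit_gap = max(0.0, 1 - f["submitted"] / 5)
    post_gap = max(0.0, 1 - f["posts"] / 5)
    return 0.4 * login_gap + 0.4 * submit_gap + 0.2 * post_gap

# Flag students whose score exceeds an illustrative threshold;
# with this data, only the disengaged student "s2" is flagged.
at_risk = {s for s, f in students.items() if risk_score(f) > 0.5}
print(at_risk)
```

In a real deployment, the flagged set would feed the intervention and dashboard use cases listed above rather than being printed directly.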
Software
Much of the software that is currently used for learning analytics duplicates functionality of web analytics software, but applies it to learner interactions with content. Social network analysis tools are commonly used to map social connections and discussions. Some examples of learning analytics software tools include:
- BEESTAR INSIGHT: a real-time system that automatically collects student engagement and attendance data, and provides analytics tools and dashboards for students, teachers and management
- LOCO-Analyst: a context-aware learning tool for analytics of learning processes taking place in a web-based learning environment
- SAM: a Student Activity Monitor intended for personal learning environments
- SNAPP: a learning analytics tool that visualizes the network of interactions resulting from discussion forum posts and replies
- Solutionpath StREAM: a UK-based real-time system that leverages predictive models to determine all facets of student engagement, using structured and unstructured sources, for all institutional roles
- Student Success System: a predictive learning analytics tool that predicts student performance and plots learners into risk quadrants based upon engagement and performance predictions. It provides indicators to develop understanding of why a learner is not on track, through visualizations such as the network of interactions resulting from social engagement (e.g. discussion posts and replies), performance on assessments, engagement with content, and other indicators
Ethics and privacy
The ethics of data collection, analytics, reporting and accountability have been identified as a potential concern for learning analytics, with particular concerns regarding:
- Data ownership
- Communications around the scope and role of learning analytics
- The necessary role of human feedback and error-correction in learning analytics systems
- Data sharing between systems, organisations, and stakeholders
- Trust in data clients
As Kay, Korn and Oppenheim point out, the range of data is wide, potentially derived from:
- Recorded activity: student records, attendance, assignments, researcher information (CRIS)
- Systems interactions: VLE, library / repository search, card transactions
- Feedback mechanisms: surveys, customer care
- External systems that offer reliable identification such as sector and shared services and social networks
Thus the legal and ethical situation is challenging and different from country to country, raising implications for:
- Variety of data: principles for collection, retention and exploitation
- Education mission: underlying issues of learning management, including social and performance engineering
- Motivation for development of analytics: mutuality, a combination of corporate, individual and general good
- Customer expectation: effective business practice, social data expectations, cultural considerations of a global customer base.
- Obligation to act: duty of care arising from knowledge and the consequent challenges of student and employee performance management
In some prominent cases, such as the inBloom disaster, even fully functional systems have been shut down due to a lack of trust in the data collection by governments, stakeholders and civil rights groups. Since then, the learning analytics community has extensively studied legal conditions in a series of expert workshops on "Ethics & Privacy 4 Learning Analytics" that underpin the use of trusted learning analytics. Drachsler and Greller released an eight-point checklist named DELICATE, based on these intensive studies, to demystify the ethics and privacy discussions around learning analytics.
- D-etermination: Decide on the purpose of learning analytics for your institution.
- E-xplain: Define the scope of data collection and usage.
- L-egitimate: Explain how you operate within the legal frameworks, refer to the essential legislation.
- I-nvolve: Talk to stakeholders and give assurances about the data distribution and use.
- C-onsent: Seek consent through clear consent questions.
- A-nonymise: De-identify individuals as much as possible.
- T-echnical aspects: Monitor who has access to data, especially in areas with high staff turnover.
- E-xternal partners: Make sure external partners provide the highest data security standards.
It shows ways to design and provide privacy-compliant learning analytics that can benefit all stakeholders. The full DELICATE checklist is publicly available.
Open learning analytics
Chatti, Muslim and Schroeder note that the aim of open learning analytics (OLA) is to improve learning effectiveness in lifelong learning environments. The authors refer to OLA as an ongoing analytics process that encompasses diversity across all four dimensions of the learning analytics reference model.
See also
- Odds algorithm
- Pattern recognition
- Predictive analytics
- Text analytics
Further reading
For general audience introductions, see:
- The Educause learning initiative briefing (2011)
- The Educause review on learning analytics (2011)
- The UNESCO learning analytics policy brief (2012)
Notes
References
External links
- Society for Learning Analytics Research (SoLAR) - a research network for learning analytics
- US Department of Education report on Learning Analytics. 2012
- Learning Analytics Google Group with discussions from researchers and individuals interested in the topic.
- International Conference Learning Analytics & Knowledge
- Learning Analytics and Educational Data Mining conferences and people
- Next Gen Learning definition
- Microsoft Education Analytics with information on how to use data to support improved educational outcomes.
- Educational Data mining
- Educause resources on learning analytics
- Learning analytics infographic