What is Learning Analytics?

Learning analytics refers to the interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues. Data are collected from explicit student actions, such as completing assignments and taking exams, and from tacit actions, including online social interactions, extracurricular activities, posts on discussion forums, and other activities that are not directly assessed as part of the student’s educational progress. The goal of learning analytics is to enable teachers and schools to tailor educational opportunities to each student’s level of need and ability. Learning analytics promises to harness the power of advances in data mining, interpretation, and modelling to improve our understanding of teaching and learning, and to tailor education to individual students more effectively. Still in its early stages, learning analytics responds to calls for accountability on campuses and leverages the vast amount of data produced by students in academic activities.

INSTRUCTIONS: Enter your responses to the questions below. This is most easily done by moving your cursor to the end of the last item and pressing RETURN to create a new bullet point. Please include URLs whenever you can (full URLs will automatically be turned into hyperlinks; please type them out rather than using the linking tools in the toolbar).

Please "sign" your contributions by marking them with the code of 4 tildes (~) in a row so that we can follow up with you if we need additional information or leads to examples. This produces a signature when the page is updated, like this: - Larry Larry Oct 30, 2011

(1) How might this technology be relevant to the educational sector you know best?

  • - paul.signorelli paul.signorelli Nov 13, 2011 Learning analytics, like the broader area of evaluation in training-teaching-learning, is sort of like flossing: we know we have to do it, and we know there are consequences (no flossing, our teeth fall out; no evaluation of our learners' in-the-moment efforts beyond the obvious system of grading, and we remain mired in whatever mistakes we are currently making, missing chances to provide more effective learning opportunities for those we serve). And yet this seems to be an area always on the verge of emerging, yet never quite making it. Evaluation, in most academic and staff training settings I've been in, is the stepchild that is undervalued and underserved because we simply fizzle out by the time we reach this stage in the educational-training-creative process. Learning analytics remains well worth exploring in that it documents where we and our learners are, where strong as well as weak points exist, and what we might do to make a critically important positive difference in a learner's ultimate success. The tacit opportunities mentioned in the introduction to this discussion session--"online social interactions, extracurricular activities, posts on discussion forums, and other activities that are not directly assessed as part of the student’s educational progress"--seem to offer magnificent possibilities given the learning analytics tools available to us, and the process seems well worth pursuing for the results it might produce.
  • I think the question is how far is the educational sector willing to go. I started with question 2 below and listed a number of things there that our institution uses data for. When I have presented on our methodologies, I typically get two reactions: 1. About 40% who love it and want to figure out how to start doing some of the same things at their institutions, and 2. 60% who have a visceral reaction and declare that this type of big data is not acceptable in academia. As Paul notes above, IR and research around learning effectiveness tend to get overlooked at most institutions. I would contend that the problem may be in exactly what it means to have extreme transparency into the institution. Quite simply there are a lot of institutional stakeholders who have made careers out of promoting and extending projects without a shred of serious data to back them up. Once you start churning huge quantities of data you get the ability to look deeply into how well the institution is accomplishing its mission. For some this level of transparency is rather like looking into the abyss.

    If education does take data seriously, then the possibilities are endless. For years, the commercial sector has used data to drive almost every decision, and from that some extremely robust analytical tools have been developed. However, to date there are few instances where those same techniques have been applied to learning. Perhaps the best example can be found in the field of web analytics. At present there are no serious web analytics tools being used to assess elearning. Application of commercial products could provide extremely powerful insights into exactly how people are interacting with content, interacting with each other, and learning. While the question, “If Amazon can provide custom shopping experiences, why can't we provide custom learning experiences?” has become a little cliché, it is accurate. If we look at the techniques Amazon uses, we see that they could easily be applied to education – from a purely statistical perspective there is practically no difference between provisioning content to get to a point of sale and optimizing content / learning experiences to help learners maximize outcomes. However, the systems that can do this are quite expensive and would turn countless volumes of learning theory on their head. The question is, who will be the first to go down this path, and how long will it be? Then who will have the will to follow? - phil.ice phil.ice Nov 18, 2011
  • Agreed, if the context is right, the possibilities are endless. I think there are several things to start with: educating the higher education community as to what LA are and how they can improve education, getting the tools/staffing in place to do it well, and then applying the knowledge to the learning process. For many institutions those are some pretty big cultural hurdles to pass. That being said, in a culture of assessment, increasing class sizes, and grant funding around LA, I suspect we'll be seeing more of it (and the endless possibilities that will come along as well). - lauren.pressley lauren.pressley Nov 19, 2011
  • Learning analytics is emerging as the research question of the decade. We have been able to track visits to web pages, and many LMSs have built reports that try to summarise use of the learning management system. But with the ability to actually keep the data from thousands or tens of thousands of learners, and to tag content not only as an object but within the object, capturing at finer levels of granularity the semantic meaning of the object's elements, we can now begin to ask the more important questions - not just what did the learner click, but why? We are on the cusp of building stochastic models of learner behavior that can start to tell us not only about the learners but about the learning materials themselves. What's making this move from 'always on the verge' to a reality is the combination of large, efficient NoSQL databases, semantic frameworks (not just heavy-weight XML but microformats and microdata as well), plus the opening of formerly closed LMSs pushed by 'born-open' competitors (initially Sakai and Moodle, but now Canvas and the meta-layering of PLEs). These are forcing us to start to think about the behavior that reflects learning and ways we might begin to model it.

    Naturally there are concerns that we'll use it to monitor rather than inform. These are valid and demand significant vigilance and care. But we are starting to see evidence of the promise, as Phil Ice noted above, in recommender systems, now with the chance to take them in a different direction. The CyberTutor environment, now the framework of the Mastering "subject-goes-here" series by Pearson, aggregates the interactions of users of the application worldwide. Any student using Mastering Physics has their interactions added to the common backend, opening up opportunities for 'big data' mining to understand more carefully the areas that represent learning challenges. We're just at the beginning, but the potential is huge. - Phillip.Long Phillip.Long Nov 20, 2011
  • Learning analytics may be increasingly required to help assess (in a formative way) some "21st Century Skills", for example collaboration. International benchmarks such as those provided by PISA will be part of the driving force. - Gavin Gavin Nov 20, 2011
  • I do not have the impression that Learning Analytics is anything new. Collecting and analyzing data about processes, groups, and individuals, trying to read footsteps, paths, timelines, etc., has always been the aim of "Bildungsforschung" (educational research), by empirical methods as well as qualitative methods (grounded theory, design-based research). Not all educational research is done by testing, experiment, or control design; a lot of it consists of documenting processes, gathering data, and analyzing those. To me Learning Analytics is an attempt to create a new buzzword like Connectivism, which also does not live up to its claim. - rolf.schulmeister rolf.schulmeister Nov 20, 2011
  • LA, while still in its infancy, has significant potential to transform the way we provide services in HED. We are just now beginning to explore its uses and of key interest is its ability to improve student learning and our delivery of services. - drvdiaz drvdiaz Nov 21, 2011
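Phil Ice's Amazon analogy above can be made concrete: from a statistical perspective, recommending a next learning resource is the same item-based collaborative filtering problem as recommending a product. A minimal sketch, in which every student name, resource name, and engagement score is invented for illustration:

```python
from math import sqrt

# Hypothetical student -> {resource: engagement score} interactions
interactions = {
    "s1": {"video_a": 5, "quiz_b": 3, "reading_c": 4},
    "s2": {"video_a": 4, "quiz_b": 5},
    "s3": {"quiz_b": 2, "reading_c": 5, "sim_d": 4},
    "s4": {"video_a": 5, "sim_d": 3},
}

def item_vector(resource):
    # Engagement scores for one resource across all students
    return {s: r[resource] for s, r in interactions.items() if resource in r}

def cosine(a, b):
    # Cosine similarity between two sparse student-score vectors
    shared = set(a) & set(b)
    num = sum(a[s] * b[s] for s in shared)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(student, k=2):
    seen = set(interactions[student])
    resources = {r for used in interactions.values() for r in used}
    scores = {}
    for cand in resources - seen:
        # Score unseen resources by similarity to what the student already used
        scores[cand] = sum(cosine(item_vector(cand), item_vector(r)) for r in seen)
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("s2"))  # → ['reading_c', 'sim_d']
```

A real system would weight interactions by recency and learning outcome rather than raw engagement, but the core similarity computation is the same one commercial recommenders use.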

(2) What themes are missing from the above description that you think are important?

  • - paul.signorelli paul.signorelli Nov 13, 2011 Nothing obviously missing for me; the goals are clearly stated ("to enable teachers and schools to tailor educational opportunities to each student’s level of need and ability") and the need (for accountability in training-teaching-learning) apparent. - jochen.robes jochen.robes Nov 18, 2011
  • The problem with the term “Learning Analytics” (though it’s the best we have for now) is that it's just way too broad. When we start to look at what is and can be done with data around learning, we find lots of intersecting nodes. Here are a few examples of things that might fall under this umbrella: retention, progression, program scheduling optimization, course placement, student satisfaction, content optimization, instructional design analysis, marketing strategy, social media analysis, sentiment analysis, faculty effectiveness, learning outcomes assessment, career placement. And these are just a few. My IR staff is responsible for 34 defined measures of learning effectiveness alone – all driven by analytics – and this doesn’t count the exploratory work. While arguably there is overlap in many of these areas, each requires a discrete type of analysis and unique data sets (even though many of the measured variables are present in more than one analysis). I might also note that even though we have done a lot with data, we also know that we are only scratching the surface and have whiteboarded years of analyses that need to be undertaken. This raises the specter of how big big data really is. In fact it is so vast that many institutions simply back away because they have no clear idea where to start. In short, despite some recent attempts at definitions, understanding what learning analytics is is a daunting task. As such, trying to determine which pieces are how far out on the timeline is extremely problematic. While it might be a bit of an exaggeration, the scope is so broad that there could almost be a stand-alone Horizon Report for analytics alone. - phil.ice phil.ice Nov 18, 2011
  • One of the issues that needs to be emphasized is that the goal is also to build a model of learning behavior. Second, it's as valuable for the learner as for the teacher. Having a representation of the learner's progress through material, feedback on the concepts and methods that are proving difficult, and the possibility of 'seeing' how more expert or at least more successful learners are interacting with the same content has the potential to give situated feedback that the teacher often has little time to provide. - Phillip.Long Phillip.Long Nov 20, 2011
  • Should there be a role for the students in identifying measures that are relevant and sources of data that are meaningful? Analytics should be transparent and not done to students. - alanwolf alanwolf Nov 20, 2011
  • Notion of a dashboard for real-time data that can provide individual student as well as group/cohort progress towards outcomes at key points in students' learning journeys, and which might also help with academic advising. - Nick Nick Nov 21, 2011
  • A crucial aspect of well-designed learning analytics: giving students access to the tools allows them to make decisions about their own learning. In other words, learning analytics projects should not view faculty and administration as the sole primary consumers of the information, but also consider how students might use it for and by themselves. - rubenrp rubenrp Nov 21, 2011
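Nick's dashboard idea and the point about students as consumers of their own analytics can be sketched together: the same aggregation that drives an advisor's cohort view can be surfaced to the student. A minimal sketch in which the students, checkpoint scores, and at-risk threshold are all hypothetical:

```python
from statistics import mean

# Hypothetical outcome scores (0-100) at three checkpoints per student
cohort = {
    "alice": [72, 80, 85],
    "bob":   [65, 60, 58],
    "carol": [90, 88, 91],
}

def dashboard_row(student):
    """Compare one student against the cohort mean at each checkpoint."""
    rows = []
    for i, score in enumerate(cohort[student], start=1):
        avg = mean(scores[i - 1] for scores in cohort.values())
        # Flag a student who falls more than 10 points below the cohort mean
        flag = "at risk" if score < avg - 10 else "on track"
        rows.append((f"checkpoint {i}", score, round(avg, 1), flag))
    return rows

for row in dashboard_row("bob"):
    print(row)
```

Showing this view to the student as well as the advisor is exactly the kind of transparency the comments above call for: the measure, the cohort baseline, and the flag are all visible, not hidden in an administrative report.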

(3) What do you see as the potential impact of this technology on teaching, learning, or creative expression?

  • - paul.signorelli paul.signorelli Nov 13, 2011 Learning analytics offers a process for using the tools readily available to us to serve as better teachers, to more effectively facilitate learning by reacting quickly to information about where our learners are in their learning process, and to stimulate creative expression by spotting barriers to our learners' creative process so we can help them overcome those barriers where possible.
  • My response to this question is embedded in nos. 1 and 2. - phil.ice phil.ice Nov 18, 2011
  • I see this potentially radically changing education for many students, particularly those too quiet to participate in traditional ways, those lost in huge classrooms, and those who need a nudge at a specific moment in the course. It also changes teaching from being more of an art to more of a science. - lauren.pressley lauren.pressley Nov 19, 2011
  • This is perhaps the most significant development since content was put online. - Phillip.Long Phillip.Long Nov 20, 2011 Agree completely. - phil.ice phil.ice Nov 20, 2011
  • I agree. And I'd add that this is the added value, the main future difference between open-free resources and the more planned learning that learning industries will be offering. - dolors.reig dolors.reig Nov 20, 2011

(4) Do you have or know of a project working in this area?

Malcolm Brown, "Learning Analytics: The Coming Third Wave," Educause Learning Initiative, April 2011:
http://net.educause.edu/ir/library/pdf/ELIB1101.pdf; starting with a reference to the 2011 Horizon Report's coverage of Learning Analytics, Brown mentions Purdue University and the University of Maryland, Baltimore County as two of the educational institutions experimenting in a "landscape [that] is now changing rapidly." Other examples cited include the Copenhagen Business School and the University of Saskatchewan. Great introductory article on the topic. - paul.signorelli paul.signorelli Nov 13, 2011
"How Should I Use Learning Analytics?"; September 26, 2011; an online instructor at the University of California, Davis, blogs about "a fairly simple application of learning analytics": http://cetl.ucdavis.edu/learning-analytics/ - paul.signorelli paul.signorelli Nov 13, 2011
George Siemens and Phil Long: Penetrating the Fog: Analytics in Learning and Education, EDUCAUSE Review Magazine, Vol. 46, No. 5, 2011 http://www.educause.edu/ - jochen.robes jochen.robes Nov 18, 2011
Simon Buckingham Shum and Rebecca Ferguson: Social Learning Analytics, Technical Report KMI-11-01, Knowledge Media Institute / The Open University, June 2011 (pdf) - jochen.robes jochen.robes Nov 18, 2011
Freeman Hrabowski III, Jack Suess and John Fritz "Assessment and Analytics in Institutional Transformation" EDUCAUSE Review November/December 2011 describes how assessment and analytics can drive institutional change and impact student retention. References 2011 Horizon Report. http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume46/AssessmentandAnalyticsinInstit/235010 - billshewbridge billshewbridge Nov 19, 2011

As noted in my replies above, we are doing several things in the field of learning analytics at American Public University System. One of the most interesting is that we constructed an engine that can pull 187 variables on demand (we run 24-hour cycles) from across our systems and use neural network analysis to predict disenrollment within the next five days, with causality attributed to the top few variables on a student-by-student basis for those at risk. Currently the model is 82% accurate for undergraduates and 90% accurate for graduates. Another project that has been very interesting is the Predictive Analytics Reporting Framework (http://www.prweb.com/releases/2011/10/prweb8882165.htm), in which 3 million records from six institutions are being analyzed to look for patterns in retention and progression. The analysis for this project is still underway, but final results should be available soon. - phil.ice phil.ice Nov 18, 2011
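The APUS engine described above is proprietary, but the shape of the problem is easy to sketch: binary classification of at-risk students from activity features. The sketch below substitutes a tiny from-scratch logistic regression for their neural network, and every student record, feature, and number is synthetic:

```python
from math import exp

# Synthetic illustration only -- not APUS's actual model or variables.
# Features per student: [logins last week, late assignments, forum posts]
train_x = [
    [9, 0, 5], [8, 1, 4], [7, 0, 6], [2, 4, 0],
    [1, 5, 1], [3, 3, 0], [10, 0, 7], [2, 6, 0],
]
train_y = [0, 0, 0, 1, 1, 1, 0, 1]  # 1 = disenrolled within five days

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def predict(w, b, x):
    # Probability of disenrollment for one feature vector
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Stochastic gradient descent on the logistic loss
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.05
for _ in range(2000):
    for x, y in zip(train_x, train_y):
        err = predict(w, b, x) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

risk = predict(w, b, [2, 5, 0])  # inactive student with late work: high risk
print(round(risk, 2))
```

A production system would add calibration, cross-validation against held-out cohorts, and the per-student causal attribution APUS describes, none of which this sketch attempts.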
Here is a fairly recent study that will certainly support the quest toward online student retention and the various ways learning analytics will play a large part in it: Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers.
http://www.sciencedirect.com/science/article/pii/S109675161000062X - melissa.burgess melissa.burgess Nov 19, 2011

Gates Foundation gave awards for work in this area: http://www.hackeducation.com/2011/02/15/recipients-of-gates-foundations-next-generation-learning-grants-announced/ - lauren.pressley lauren.pressley Nov 19, 2011

References include:
Phillip Long and George Siemens, "Penetrating the Fog: Analytics in Learning and Education", EDUCAUSE Review, Vol. 46, No. 5, 2011.
E. Duval (2011), Attention Please! Learning Analytics for Visualization and Recommendation. Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK 2011), Banff, Canada, to appear.
Blikstein, P. (2011), Using learning analytics to assess students' behavior in open-ended programming tasks. Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK 2011), Banff, Canada.
Michel C. Desmarais (2011), Conditions for effectively deriving a Q-Matrix from data with Non-negative Matrix Factorization. Proceedings of Educational Data Mining, July 8-10, Eindhoven, the Netherlands.
- Phillip.Long Phillip.Long Nov 20, 2011