Paul Prinsloo1 and Sharon Slade2
1University of South Africa, 2Open University, UK
Higher education, and more specifically distance education, is in the midst of a rapidly changing environment. Higher education institutions increasingly rely on the harvesting and analysis of student data to inform key strategic decisions across a wide range of issues, including marketing, enrolment, curriculum development, the appointment of staff, and student assessment. In the light of persistent concerns regarding student success and retention in distance education contexts, the harvesting and analysis of student data, particularly within the emerging field of learning analytics, holds much promise. As such, the notion of educational triage needs to be interrogated. Educational triage is defined as balancing the futility or impact of the intervention against the number of students requiring care, the scope of care required, and the resources available for care/interventions.
The central question posed by this article is “how do we make moral decisions when resources are (increasingly) limited?” An attempt is made to address this by discussing the use of data to support decisions regarding student support and examining the concept of educational triage. Despite the increase in examples of institutions implementing a triage-based approach to student support, there is a serious lack of supporting conceptual and theoretical development, and, more importantly, of consideration of the moral cost of triage in educational settings.
This article provides a conceptual framework to realise the potential of educational triage to responsibly and ethically respond to legitimate concerns about the “revolving door” in distance and online learning and the sustainability of higher education, without compromising ‘openness.’ The conceptual framework does not attempt to provide a detailed map, but rather a compass consisting of principles to consider in using learning analytics to classify students according to their perceived risk of failing and the potential of additional support to alleviate this risk.
Keywords: Distance education; educational triage; learning analytics; open distance learning
Are students walking around with invisible triage tags attached, that only lecturers can see? Is this fair? Or is it just pragmatic? Like battlefield medical attention, lecturers’ attention is finite. And as class sizes and workloads increase, it is becoming scarcer. (Manning, 2012)
While The New York Times announced that 2012 was “the year of the MOOC” (Pappano, 2012), by the start of 2014 there were signs that some of the initial hype had died down (Wetterstrom, 2014). Despite a (possibly) more sober assessment regarding the promise of massive open online education, The Economist (2014, June 28th-July 4th) dedicated its weekly issue to the “creative destruction” facing higher education and the ‘reinvention’ of the university. It is therefore difficult to estimate the scope and real impact of the changes facing international and national higher education. Over the last three years, terms such as “disaggregation” (Wiley & Hilton III, 2009), relating to the competitive nature of higher education, “unbundling and unmooring” (Watters, 2012), referring to the disruption caused by the introduction of new technologies in combination with ever-changing external factors, “academic revolution” (Altbach, Reisberg, & Rumbley, 2009), and “crisis” (Carr, 2012), referring to the potential damage caused by the introduction of MOOCs, are increasingly common in discourses on current and future states of higher education. The higher education landscape may be irrevocably changing (Staley & Trinkle, 2011; The Economist, 2014), with some authors even mooting the notion that these ongoing changes herald the “end of higher education’s golden age” (Shirky, 2014). Others, however, question the eschatological terminology used to describe the impact of technological changes in society (Morozov, 2013a, 2013b) and in the current higher education landscape (Watters, 2012), and call for a consideration of the ideological and political agendas informing the current “techno-romanticism” in education (Selwyn, 2014).
Against this backdrop, higher education institutions increasingly need to make strategic decisions to assess and exploit various opportunities, alleviate risk, and ensure the long-term financial viability of institutions of higher learning. Risk within higher education not only mirrors broader societal dimensions of risk, but also includes the danger of obsolescence, the impact of technology on content, assessment, and the role of faculty, the increasing diversification of forms of higher education and student populations, and concerns about student success and retention (Altbach, Reisberg, & Rumbley, 2009; Siemens & Long, 2011; Newman, Couturier, & Scurry, 2010; and Slaughter & Rhoades, 2010). In particular, concerns about student success and retention continue to increase in intensity amidst changes in funding frameworks, higher education ranking tables, consumer activism amongst students, and employer needs.
As early as 1995, Hartley pointed to the impact of external scrutiny, inspection, and the increasing emphasis on efficiency, calculability, predictability, control, and fake fraternisation in higher education. Expanding on the principles provided by Bentham’s work on education in 1816, Hartley (1995) explores the notion of the “McDonaldisation” of higher education with its emphasis on the optimisation of increasingly scarce resources with its maxim of “doing more with less” (p. 412). Shifts in funding frameworks facing higher education were visible as early as 1995 – resulting in funding following rather than preceding performance (Hartley, 1995). In the same year, with respect to the debates surrounding higher education, Lagowski (1995) stated that commentators “claim that we [higher education] cost too much, spend carelessly, teach poorly, plan myopically, and when questioned, act defensively” (p. 861) and, as a result, that “higher education will never again be as it was before” (p. 861). The proposed solution then (e.g., Lagowski, 1995) and, increasingly, now is to adopt a triage approach.
Within this context, distance education, and in particular open distance and e-learning (ODeL), relies ever more on data to inform its key strategic decisions across a wide range of issues, including marketing, enrolment, curriculum development, the appointment of staff, and student assessment. In an ODeL context, students are typically not required to satisfy pre-entry requirements for admission, nor are they readily available for consultation or direct support in the same sense as at a campus-based institution. Crucially then, data is increasingly harvested and analysed to inform initiatives which might increase student retention and success (Siemens & Long, 2011; Oblinger, 2012). While commercial entities have long since optimised the harvesting and analysis of data to inform their marketing and customer service strategies, and to increase their profitability, learning analytics is a recent phenomenon in the field of higher education, and, more specifically, in ODeL contexts (New Media Consortium, 2014). Learning analytics as a phenomenon “spans the full scope and range of activity in higher education, affecting administration, research, teaching and learning, and support resources” (Siemens & Long, 2011, p. 36). In the above context, it is easy to understand why learning analytics is described as the “new black” (Booth, 2012), or student data as the “new oil” (Watters, 2013). It falls outside the scope of this article to explore the implications of the different terminologies and practices used in the field of student data, such as academic and learning analytics (for a clarification of the two terms see Ferguson, 2012; Siemens, 2011; Siemens & Long, 2011). In the context of this article, we focus specifically on the potential of learning analytics to inform educational triage.
The harvesting and analysis of student data in academic and learning analytics therefore offers opportunities for higher education institutions to identify and respond, timeously and appropriately, to students who are at risk of failing or dropping out (Campbell, DeBlois, & Oblinger, 2007; Siemens & Long, 2011). The opportunities offered by learning analytics have, however, also brought to the fore concerns regarding a number of issues such as governmentality, data privacy, consent, and other ethical issues and challenges (Slade & Prinsloo, 2013; Booth, 2012; Clow, 2012, 2013; Long & Siemens, 2011; May, 2011; Oblinger, 2012; Siemens, 2011; and Wagner & Ice, 2012). While it is indisputable that learning analytics offers huge potential for use in the management of teaching and learning in ODeL contexts, it is crucial that we do not negate and/or ignore the potential perils and associated ethical dilemmas (Slade & Prinsloo, 2013). Learning analytics therefore finds itself at the centre of the “tension between the framing of education as an economic activity and conceptions of education and learning that are concerned with the development of meaning and the transformation of understanding” (Clow, 2013a, p. 683). Despite various claims regarding the success of learning analytics in improving student success and retention (e.g., Arnold, 2010; Clow, 2013a, 2013b), Watters (2013) also warns that “the claims about big data and education are incredibly bold, and as of yet, mostly unproven” (par. 17). (For a discussion regarding various claims in the application of learning analytics, see Clow, 2013a, 2013b, 2013c.)
The central question posed by this paper is “how do we make moral decisions when resources are (increasingly) limited?” An attempt is made to address this by discussing the use of data to support decisions regarding student support and examining the concept of educational triage. Despite the increase in examples of institutions implementing a triage-based approach to student support, there is a serious lack of supporting conceptual and theoretical development, and, more importantly, of consideration of the moral cost of triage in educational settings. During the literature review, the researchers could not find any published articles exploring the theoretical and conceptual bases for educational triage, though some of the literature describes the practical implications of educational triage at primary and secondary school levels (Booher-Jennings, 2005; Cobbold, 2010; Sparks, 2012; Wilson, 2012; Gillborn & Youdell, 2000; and Marks, 2012).
Given its distinct origins in medical practice, we should also consider whether the notion of triage provides a useful heuristic in educational settings. Biesta (2007, 2010), for example, raises legitimate concerns regarding the transferability of concepts between the medical and educational domains of practice. The fact that something ‘works’ in one context does not necessarily mean that it is appropriate, or will work, in another.
In this paper we
Due to the variety of interpretations and uses of data in higher education, as well as different purposes, skills required, contexts, and tools used in the harvesting and analysis of data, it is crucial to clarify the terms and definitions surrounding learning analytics (Van Barneveld, Arnold, & Campbell, 2012). Siemens and Long (2011) state that the “ubiquity of the term analytics partly contributes to the breadth of meanings attached to it” (p. 34) and clarify the notion of learning analytics by firstly providing a clear definition and secondly juxtaposing learning analytics with academic analytics.
During the first International Conference on Learning Analytics and Knowledge (2011), learning analytics was defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Siemens & Long, 2011, p. 34). Academic analytics, in contrast, is “the application of business intelligence in education and emphasises analytics at institutional, regional, and international levels” (Siemens & Long, 2011, p. 34). Siemens and Long (2011) state that learning analytics is “more specific than academic analytics” in that it focuses “exclusively on the learning process” (p. 34). (For a detailed explanation of the differences between learning and academic analytics, please see Siemens and Long, 2011; and Diaz and Brown, 2012.)
Learning analytics is an “emerging discipline” (Siemens, 2013) and its given role is “to support sensemaking and not to supplant it” so that “learning analytics does not make decisions, it enables them” (Siemens in Diaz & Brown, 2012, p. 3). Byron (in Diaz & Brown, 2012) lists eight practical examples of using learning analytics to support course design and student success at Sinclair Community College in the US. Here, a number of tools are used to
Several recent articles summarise the growing awareness of the potential of learning analytics to shape the management of teaching and learning in higher education. For example, Wagner and Ice (2012) explore its potential in an article entitled “Data Changes Everything”. The authors state that the “digital breadcrumbs” and data trails left by students provide higher education with the data necessary to analyse and create “meaningful learning experiences that can engage, inspire, and prepare current and future students for success” (p. 34). Further, Booth (2012) lauds the potential of learning analytics to “harness the power of advances in data mining, interpretation, and modelling to improve understandings of teaching and learning, and to tailor education to individual students more effectively” (p. 52, quoting the New Media Consortium’s 2011 report). The NMC 2014 Horizon Report (New Media Consortium, 2014) describes the “rise of data-driven learning and assessment” as a “mid-range trend” (p. 12), impacting on higher education within the next three to five years.
Because much of the data can be gathered in real time, there is a real possibility of continuous improvement via multiple feedback loops that operate at different time scales – immediate to the student for the next problem, daily to the teacher for the next day’s teaching, monthly to the principal for judging progress, and annually to the district and state administrators for overall school improvement. (Bienkowski, Feng, & Means, 2012, pp. vii-viii)
Diaz and Brown (2012), in the broader context of learning analytics, discuss two types of data that students generate during a course, namely the “digital footprints, or digital breadcrumbs” left by students, as well as “learner-generated data [that] are supplemented or augmented by data about the learner, such as previous coursework, demographics, and other data that might exist in the student information system” (p. 2). These data trails allow higher education institutions to “detect patterns and make predictions” (p. 2). Patterns of sensemaking are informed by comparing individual learners’ activities to those of the rest of the class, comparing their activities to the levels of activity of students who have taken the course before, or comparing an individual learner’s activity in a particular course with their activities in previous courses, whether at the same institution or at a different institution (Diaz & Brown, 2012). Predictive modelling may then apply various statistical techniques to analyse those data sets in order to make predictions about future behaviours or likely student success.
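The kind of pattern detection and prediction described above can be illustrated with a minimal sketch. The feature names, the comparison to a class average, and the risk cut-off below are purely hypothetical assumptions for illustration; actual learning analytics systems draw on far richer data and more sophisticated statistical models.

```python
# Hypothetical sketch: flag an at-risk student by comparing their activity
# ("digital breadcrumbs") to the class average. All names and thresholds
# are illustrative assumptions, not any institution's actual model.

def engagement_features(student_log, class_logs):
    """Express one learner's activity counts as ratios of the class average."""
    avg = {k: sum(log[k] for log in class_logs) / len(class_logs)
           for k in student_log}
    # A ratio below 1.0 means the student is less active than average.
    return {k: (student_log[k] / avg[k]) if avg[k] else 0.0
            for k in student_log}

def at_risk(features, threshold=0.5):
    """Flag a student whose mean relative engagement falls below a cut-off."""
    score = sum(features.values()) / len(features)
    return score < threshold

class_logs = [
    {"logins": 20, "forum_posts": 5, "submissions": 3},
    {"logins": 30, "forum_posts": 8, "submissions": 4},
    {"logins": 10, "forum_posts": 1, "submissions": 2},
]
student = {"logins": 4, "forum_posts": 0, "submissions": 1}

features = engagement_features(student, class_logs)
print(at_risk(features))  # True
```

The same comparison could equally be made against previous cohorts of the course, or against the learner’s own activity in earlier courses, as Diaz and Brown (2012) note.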
Learning analytics therefore shows promise in informing the design of more effective, appropriate and, importantly, more cost-effective student support. As such, the notion and practice of educational triage needs to be critically interrogated.
The concept of triage is more typically associated with medical treatment, where it refers to a classification or sorting of injured patients and the subsequent allocation of treatment according to the severity of their wounds. The close association of the concept of triage with military medicine developed in the First World War and continues to this day (Winslow, 1982). The original purpose of triage was to conserve human resources in times of crisis and to bear in mind the interests of the sick and wounded.
Triage is described by the World Medical Association (WMA) as prioritising treatment and management “based on rapid diagnosis and prognosis for each patient” (1994, par. 7). The diagnosis and treatment are carried out systematically, “taking into account the medical needs, medical intervention capabilities and available resources” (WMA, 1994, par. 7). The basis of triage is therefore the balancing of the scope of treatment against the limited resources available and the health status of patients. The WMA (1994) also acknowledges that triage raises a number of ethical problems.
Triage categorisation (WMA, 1994) involves the following criteria:
This final category includes those who are dead or beyond emergency care and, as such, can cause significant unease, but it is a vital part of disaster triage systems. As the WMA points out, it is ethical for a physician not to persist, at all costs, in treating individuals “beyond emergency care, thereby wasting scarce resources needed elsewhere” (1994, par. 10), and “he/she should attempt to set an order of priorities for treatment that will save the greatest number of lives” (1994, par. 11).
So, a common thread in medical triage relates to balancing or prioritising the futility of intervention against the number of patients requiring care, the scope of care required, and the resources available for care/interventions. Beauchamp and Childress (2001) therefore suggest four basic moral principles providing a common framework used in the analysis of medical ethics, namely
Inherent in the discussion and evaluation regarding the futility of medical care for some patients are issues around the nature of relevant information (would a second person come to the same conclusion?), the patient’s own evaluation of their condition, other stakeholders’ (such as family, peers) evaluations, and balancing the ‘cost’ of continued or more advanced treatment against the prospects of survival (with futility often defined as a less than 1% chance of survival).
In an interesting development on the work by Beauchamp and Childress (2001), Joynt and Gomersall (2005) suggest two additional pathways to be considered before the above four moral principles apply. These pathways include where the patient him or herself “makes an autonomous, informed decision to decline ICU admission” (p. 34) or where the patient, once selected for care, decides to forego care based on an informed decision that “no meaningful medical benefit can be achieved” (p. 35). Joynt and Gomersall (2005) also point to the fact that prioritisation within the process of triage depends on different fundamental methods, ranging from ‘first come, first served’ (a naturally random process) to admitting patients to ICU on the basis of who would receive the most benefit from being admitted. “Put simply, the society should get more survivors for the same outlay of ICU resources” (pp. 37-38). In this utilitarian approach to triage, “other factors unrelated to benefit such as ethnic origin, race, religion, sex, social status and ability to pay, and age should not be considered as acceptable criteria on which to base a triage decision” (p. 38).
Joynt and Gomersall (2005) point to the fact that there “are enormous difficulties when justifying decisions in relation to prioritisation” (p. 38). As a way to overcome these difficulties, they suggest that focusing on “an acceptable process”, rather than only on the moral principles, will alleviate some of the practical issues around the justification of triage. The proposed process contains four key procedural elements, namely
In considering the triage decision-making process it is crucial to understand that, due to the “complexity of disease and heterogeneous nature of general ICU patients and our lack of quantitative knowledge of ICU outcomes”, it is almost impossible to “define enough specific conditions under which individual triage decisions should be made” (p. 38). Joynt and Gomersall (2005) therefore propose the following essential components of triage decisions:
From this brief discussion it is clear that “making moral decisions when resources are limited” (Joynt & Gomersall, 2005, p. 34) is complex and difficult. In attempting to apply this approach within an educational context, we need to consider not only the four principles suggested by Beauchamp and Childress (2001), but also the procedural elements proposed by Joynt and Gomersall (2005). The next section brings to the fore the growing importance of student retention within higher education and the use of data to identify students at risk of (passive) withdrawal or failure. The concept of educational triage is then introduced to describe ways in which educational institutions are beginning to make conscious decisions regarding which students are selected to receive targeted support based on their own data trails (and, by association, which students are not).
The (often) dismal student retention and course success rates in higher education in general, and distance education in particular, can paint a picture of students as the “walking wounded” (Graber, 1996), with higher education seen as a “revolving door” (Barefoot, 2004; and Yorke, 2004). Indeed, higher education has been described as a “battlefield” (Kogan, 1987, p. 68), referring to the tussle between administrative and academic staff (Waugh, 1998) or the increasing competition in internationalised higher education (Rust & Kim, 2012). The concept of the ‘wounded’ student is embedded in many current practices in higher education (Manning, 2012).
In the context of this article, the ‘wounded’ in higher education may therefore refer to those students who are at risk of not surviving the ordeal, either by dropping out of their studies or through (continued) failure. Failure and dropout not only constitute a risk for students themselves, but also pose an increasing risk to the sustainability of higher education. Educational triage is therefore defined as balancing the futility or impact of the intervention against the number of students requiring care, the scope of care required, and the resources available for care/interventions.
There is, however, an inherent moral dilemma in allocating the risk, and the scope of risk, to students alone – as if higher education institutions are always effective and fair, and as if macro-societal influences such as an economic downturn or retrenchment do not impact on students’ ability to survive higher education (Prinsloo, 2009; and Subotzky & Prinsloo, 2011). Student success and retention (as well as their opposites, failure and dropout) are the result of a complex, multidimensional ecology with many different and often mutually constitutive variables dynamically interacting (Prinsloo, 2009). This points to the need to reconsider any definition of educational triage which directs support toward the students most likely to ‘survive’ on the basis of student deficiency alone, especially when considering the factors impacting on the success and retention of students from disadvantaged or minority backgrounds (Grimes, 1997; McInerney & King, 2013; and Rovai, 2002). Learning analytics lends itself to triage in this sense through the ease with which higher education institutions can now track and predict potential failure by applying models to data submitted and generated by students.
The notion of triage is reasonably well established in the contexts of primary and secondary school education (Booher-Jennings, 2005; Cobbold, 2010; Sparks, 2012; Wilson, 2012; Gillborn & Youdell, 2000; and Marks, 2012). There is, however, a lack of direct reference to the notion of triage in higher education research, though issues of optimisation, analytics, and addressing the needs of under-prepared students are well documented.
In secondary education within the United Kingdom, educational triage is common practice as schools are subject to public comparison via published league tables which focus on the percentage of students receiving grades in the higher bands (5 A*-C grades including English and Mathematics). In order to maintain or improve league positions (which are felt to link directly to future income), additional resource may be directed toward those students in danger of not achieving the minimum C grade, often at the expense of those who will confidently achieve an acceptable grade (but who might have done even better with additional help) and those who are too far from the grade to be worth saving. In fact, a 2005 study of all pupil-level data for secondary state schools in England found that low-ability students do worse in schools where there are more students on the C/D border, as a result of the redirection of resource (Burgess, Propper, Slater, & Wilson, 2005). Indeed, Crehan (2012) argues that the continued focus on percentage-based measures and league tables drives the practice of triage by encouraging a concentration of effort at the point where it will have the most impact on the target measures.
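The borderline-focused allocation described above amounts to a simple classification rule. The sketch below is a hypothetical illustration of that logic only; the score scale and grade-band cut-offs are assumptions for illustration, not a description of any school's actual practice.

```python
# Hypothetical sketch of league-table triage: effort is concentrated on
# students near the C/D borderline. Scores and band boundaries are
# illustrative assumptions.

def triage(students, lower=40, upper=49):
    """Sort (name, predicted score) pairs into the three informal triage bands."""
    safe, borderline, written_off = [], [], []
    for name, score in students:
        if score > upper:
            safe.append(name)          # will 'confidently' pass unaided
        elif score >= lower:
            borderline.append(name)    # extra support is directed here
        else:
            written_off.append(name)   # deemed 'too far from the grade'
    return safe, borderline, written_off

students = [("A", 72), ("B", 45), ("C", 31), ("D", 48)]
safe, borderline, written_off = triage(students)
print(borderline)  # ['B', 'D']
```

Note that the rule allocates nothing to either end of the distribution, which is precisely the pattern Burgess et al. (2005) found to disadvantage low-ability students.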
Cobbold (2010) agrees that the practice of triage in schools may be considered strategically sensible, but expresses concerns about the broader impact on those students affected by what is perhaps predominantly a public relations exercise. He cites the allocation of the most experienced teachers to identified students, the abandonment of those students too weak in one of the core subjects, and the sheer cynicism of an approach which seems to serve headline figures rather than to enable pupils to achieve their individual potential. Cobbold suggests that such practice exists in other countries too, citing the Australian Primary Principals Association’s concerns that many schools allocate more resources to those students most likely to improve school results (the NAPLAN tests), and that students with greater needs receive less support until after completion of the tests.
There is very little, if any, published research on the philosophical and theoretical underpinnings of educational triage and its practice in the context of higher education. Though there is ample evidence of different forms of educational triage in higher education, for example, the Signals project at Purdue University (Arnold, 2010; Caulfield, 2013; Essa, 2013; Clow, 2013a, 2013b; and Straumsheim, 2013), and two projects in the University System of Maryland, namely at the University of Maryland Eastern Shore (UMES) and Bowie State University (BSU) (Forsythe, Chacon, Spicer, & Valbuena, 2012), there is very little information available on the conceptual and moral considerations and implications of these projects.
In this article we attempted to map not only the historical development (Winslow, 1982) and moral principles (Beauchamp & Childress, 1994) and guiding processes (Joynt & Gomersall, 2005) inherent in triage within the context of medicine, but we have also raised a number of concerns regarding the function of algocracies (Danaher, 2014a; Marwick, 2014; and Morozov, 2013a, 2013b) in the discourses of accountability and governmentality in higher education (Beer, Jones, & Clark, 2012; Clow, 2013a; and Morozov, 2013b).
In this section we provide a draft heuristic for positioning educational triage as moral practice. The growing prominence of a learning analytics approach to shaping student support within our higher education institutions now requires an examination of the principles which underlie how that support is allocated. Continuing along a path whereby support is determined by data and algorithms alone may quickly result in an approach to support directly at odds with the underlying principles of an institution.
Firstly, it is important to take cognisance of the historical (Hartley, 1995) and current (Long & Siemens, 2011) emphasis on the optimisation of resources in higher education as a reality. This does not mean, however, that we support the “McDonaldisation” (Hartley, 1995) or continuing hegemony of neoliberal ideologies in higher education (Slaughter & Rhoades, 2010). Our approach to triage as moral educational practice acknowledges, on the one hand, the reality of funding constraints while, on the other hand, contesting a technocratic approach to the allocation of resources to improve the effectiveness of teaching and learning (Morozov, 2013a, 2013b; and Slaughter & Rhoades, 2010). We believe that learning analytics as moral practice (Slade & Prinsloo, 2013) not only provides opportunities for more effective teaching and learning and allocation of resources (Long & Siemens, 2011), but also addresses the question “how do we make moral decisions when resources are (increasingly) limited?” (e.g., Joynt & Gomersall, 2005).
Slade and Prinsloo (2013) proposed a number of principles underlying learning analytics as moral practice which include recognising that learning analytics (and implicitly educational triage) can be immoral. Their proposed principles include recognising the agency of students and the identity and performance of students as dynamic constructs. Student success is furthermore a “complex and multidimensional phenomenon” (p. 1520) and higher education cannot afford not to use data to support student learning. (For a full discussion on a framework for learning analytics as moral practice, see Slade and Prinsloo, 2013.)
Based on the potential of learning analytics as moral practice, it follows that educational triage, in the context of limited resources, will involve making difficult decisions. Even in the context of medical triage, the “complexity of disease and heterogeneous nature of general ICU patients, and our lack of quantitative knowledge of ICU outcomes” makes it almost impossible to “define enough specific conditions under which individual triage decisions should be made” (Joynt & Gomersall, 2005, p. 38). How much more should we then be cognisant of the constraints of determining and applying the outcomes of bounded and subjective algocracies?
The moral principles of autonomy, beneficence, non-maleficence, and distributive justice (Beauchamp & Childress, 2001) provide useful pointers for considering the practice of educational triage. It would seem, however, that these principles do not transfer directly or easily to an educational context. Education is not a “causal technology” or a “process of ‘push and pull’”, but an “open and recursive system” (Biesta, 2007, p. 8) in which the factors impacting on student retention and success are complex, and often interdependent and mutually constitutive (Subotzky & Prinsloo, 2011).
We would therefore propose an adaptation of the principles suggested by Beauchamp and Childress (2001) as follows.
It is, however, important that the consequences of such a refusal be made clear to students, funders, and other stakeholders.
When additional resources are withheld from students based on analyses of their past performance, demographics, and findings from learning analytics, or when institutions exclude students from further registration opportunities, it is crucial to follow the procedural steps provided by Joynt and Gomersall (2005). The exclusion of students from further registration opportunities points to the possible limitations of directly transferring the principles of triage from medical to educational contexts. However, if institutions are transparent regarding their rationales, diagnoses, prognoses, and outcomes, as well as providing evidence to support any decisions, the provision of, or exclusion from, services may be justified. We should, however, always be aware of the complexities of student success, the incompleteness of our algorithms and data, as well as the short- and long-term implications of excluding students. This principle should be read concurrently with the next one, dealing with beneficence.
Considering that education flows from, and often maintains and perpetuates, social and power structures, educational triage as moral practice cannot ignore the principle of distributive justice (Apple, 2004; Bauman, 2012; Bernstein, 1996; Chomsky, 2013). We propose that it raises issues of morality not to take into account the historical impact of some of these factors when classifying students in educational triage. It is immoral to take only educational performance into account where there is ample evidence of historical imbalances and injustices. Race, gender, culture, and various combinations of identity constructs should not be ignored in considering the scope and delimitation of groups of students who will be excluded, or receive no additional support. Morozov (2013b) warns that we do not know “what biases and discriminatory practices” are built into the algorithms we use (see also boyd & Crawford, 2013). We acknowledge that the explicit mention of these criteria may make for uncomfortable reading, but there is ample evidence, firstly, that these constructs are currently used in the allocation of resources and, secondly, that they should be used. A case in point is current educational policy in South Africa, where previous educational dispensations applied racial criteria for admission into higher education (Baloyi, 2004; Chisholm, 2005; Dube, 1985; Jansen, 1990a, 1990b), and will impact on student success and retention for generations to come (Subotzky & Prinsloo, 2011). Within the UK, there are similar initiatives which target attention and resource toward particular groups. For example, the Higher Education Funding Council for England (HEFCE) provides specific funding to universities to support the costs of both recruiting and retaining students with particular demographic characteristics.
These criteria have furthermore influenced educational policy throughout the history of higher education (Apple, 2004; Bernstein, 1977, 1996; Giroux, 1992, 2003) and we need to engage with these criteria and the contexts of their application on a theoretical level, however uncomfortable.
Having said that, the use of these identity constructs in the allocation of resources needs to be read in context with the other principles and not treated as a stand-alone issue. It is already difficult in educational contexts to ‘simply’ categorise students into those who do not need assistance; those who, with additional support, may pass; and those who are destined to fail regardless of any additional support. Marwick (2014) warns that data discrimination can lead to customers with the most potential being targeted and provided access to services, while others are classified as being of lower value, as “waste” (par. 20). This is where educational triage (dramatically) differs from triage in medical contexts. A number of authors express concern regarding some of the basic assumptions, epistemologies, and ontologies underlying the current hype in the discourses surrounding data-based decision making in education, especially with regard to teaching and learning (Clegg, 2005; Elliott, 2001; Knight et al., 2013; Oliver & Conole, 2003; Reeves, 2011; Swain, 2013). Our conceptual and theoretical understanding of the factors that impact on the effectiveness of teaching and learning is still incomplete, and we should therefore tread carefully when making decisions where resources are limited.
Against the backdrop of claims that “student data is the new oil” (Watters, 2013) that enables universities to be data-driven (Wishon & Rome, 2012), there are also concerns that learning analytics only values the measurable (Richardson, 2012), and that the “technocratic predictive logic” (Henman, 2004) inherent in the emphasis on students’ digital data forgets that students’ digital footprints do not represent the whole picture (see also Mayer-Schönberger, 2009). Biesta (2007, 2010) also expresses the concern that evidence of “what works” does not necessarily “work.”
In this article we considered the complexities of “making moral decisions when resources are limited” (e.g., Joynt & Gomersall, 2005). The effective allocation of increasingly limited resources, although not new (e.g., Hartley, 1995), challenges higher education institutions to take concerns regarding student failure and dropout seriously. Institutions increasingly rely on the algorithmic analysis of data to determine students’ chances of success or risk of dropout, and allocate resources according to a system of triage. Students are classified into different categories based on an assessment of their educational risk and the cost of increasing or ensuring their chances of success.
Though educational triage is germane to higher education within the discourses and practices of accountability, governmentality, and the optimisation of resources, there is a serious need to explore the epistemological and ontological assumptions underlying and informing these discourses and practices.
We have mapped educational triage against its historical development from the world of emergency medical care, exploring the principles and processes guiding triage. Though these principles and processes provide some insight into the complexities of triage, we found that the principles do not transfer seamlessly to educational contexts. Even within the discourses of medical triage there are concerns that diseases and emergency medical care are complex and we mostly lack sufficient knowledge to always be certain that we make the right decisions (Joynt & Gomersall, 2005). Biesta (2007, 2010) furthermore questions the epistemologies, ontologies, and practices underlying the direct transfer of practices in medical fields to education.
Despite these concerns, the principles and processes in medical triage allow us to explore the risks and potential of educational triage. We are able to understand the need to respect student autonomy in conjunction with the long-term sustainability of the institution. We can see the transferability of the concepts of beneficence and non-maleficence to an educational context. Further, we appreciate the complexity of the use of data and algorithms alone to drive student support. Learning analytics indeed offers the potential to provide targeted support based on what is known, and it would be relatively simple to use learning analytics to justify an educational triage approach to determining student support. However, as we have shown, such an approach is simplistic and potentially immoral. It is hard to foresee a time when students can be fully defined by their personal data, or fully separated from their own social and historical contexts.
The principles and broad heuristic framework discussed in this article do not attempt to provide a detailed map or checklist of bulleted points to guide us through the messiness of decisions that impact on students’ chances of success, nor on the long-term sustainability of higher education institutions. We believe the article opens up the discourse and provokes further discussion regarding the question “how do we make moral decisions regarding students-at-risk and the allocation and effectiveness of resources, when resources are (increasingly) constrained?”
An earlier version of this article was presented at the EDEN annual conference in Zagreb, 10-13 June, 2014.
We would like to acknowledge the care and input of the reviewers, as well as the financial support provided to one of the researchers by the South African National Research Foundation.
Altbach, P. G., Reisberg, L., & Rumbley, L. E. (2009). Trends in global higher education: Tracking an academic revolution (A report prepared for the UNESCO World Conference on Higher Education). Paris: UNESCO. Retrieved from http://www.cep.edu.rs/public/Altbach,_Reisberg,_Rumbley_Tracking_an_Academic_Revolution,_UNESCO_2009.pdf
Apple, M. W. (2004). Ideology and curriculum (3rd edition). New York: Routledge Falmer.
Arnold, K. (2010, March 3). Signals: Applying academic analytics. EDUCAUSE Review. Retrieved from http://www.educause.edu/ero/article/signals-applying-academic-analytics
Baloyi, R. (2004). The role of the state in the establishment of a culture of learning and teaching in South Africa (1910-2004) (PhD thesis). University of South Africa, Pretoria.
Barefoot, B. O. (2004). Higher education’s revolving door: Confronting the problem of student dropout in US colleges and universities. Open Learning, 19(1), 9–18. DOI: 10.1080/0268051042000177818
Bauman, Z. (2012). On education. Cambridge, UK: Polity Press.
Beauchamp T. L., & Childress J. F. (2001). Principles of biomedical ethics (5th ed). Oxford: Oxford University Press.
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. Retrieved from http://www.ascilite2012.org/images/custom/beer,colin_-_analytics_and.pdf
Bernstein, B. (1977). Class, codes and control. Towards a theory of educational transmission (Vol. 3, 2nd edition). London: Routledge & Kegan Paul.
Bernstein, B. (1996). Pedagogy, symbolic control and identity: Theory, research, critique. London: Taylor & Francis.
Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, D.C.: U.S. Department of Education, Center for Technology in Learning. Retrieved from https://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf
Biesta, G. (2007). Why “what works” won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1–22. DOI: 10.1111/j.1741-5446.2006.00241.x.
Biesta, G. (2010). Why ‘what works’ still won’t work: From evidence-based education to value-based education. Studies in Philosophy of Education, 29, 491–503. DOI 10.1007/s11217-010-9191-x.
Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas Accountability System. American Educational Research Journal, 42(2), 231–268.
Booth, M. (2012, July 18). Learning analytics: The new black. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/learning-analytics-new-black
boyd, d., & Crawford, K. (2013). Six provocations for big data. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431
Burgess, S., Propper, C., Slater, H. & Wilson, D. (2005). Who wins and who loses from school accountability? CMPO discussion paper 05/128.
Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/academic-analytics-new-tool-new-era
Carr, N. (2012). The crisis in higher education. Technology Review. Retrieved from http://www.technologyreview.com/featuredstory/429376/the-crisis-in-higher-education/
Caulfield, M. (2013, September 26). Why the Course Signals math does not add up [Web log post]. Retrieved from http://hapgood.us/2013/09/26/why-the-course-signals-math-does-not-add-up/
Chisholm, L. (2005). The politics of curriculum review and revision in South Africa in regional context. Compare: A Journal of Comparative Education, 35(1), 79–100.
Chomsky, N. (2013). Power systems. London, UK: Penguin.
Clegg, S. (2005). Evidence-based practice in educational research: A critical realist critique of systematic review. British Journal of Sociology of Education, 26, 415–428.
Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. Paper presented at the 2nd International Conference on Learning Analytics and Knowledge (LAK12). Retrieved from http://dl.acm.org/citation.cfm?id=2330636
Clow, D. (2013a). An overview of learning analytics. Teaching in Higher Education, 18(6), 683—695. DOI: 10.1080/13562517.2013.827653
Clow, D. (2013b, November 13). Looking harder at Course Signals [Web log post]. Retrieved from http://dougclow.org/2013/11/13/looking-harder-at-course-signals/
Clow, D. (2013c, December 10). InFocus: Learning analytics and big data [Web log post]. Retrieved from http://dougclow.org/2013/12/10/infocus-learner-analytics-and-big-data/
Cobbold, T. (2010, September 24). Get used to “Bubble Kids” and “Educational Triage” [Personal web log post]. Retrieved from http://www.saveourschools.com.au/league-tables/get-used-to-bubble-kids-and-educational-triage
Crehan, L. (2012, August 30). Educational triage [web log post]. Retrieved from http://www.edapt.org.uk/news/2012/08/blog-educational-triage
Danaher, J. (2014a, February 9). What’s the case for sousveillance? (Part one) [Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.com/2014/02/whats-case-for-sousveillance-part-one.html
Diaz, V., & Brown, M. (2012). Learning analytics. A report on the ELI focus session. Retrieved from http://net.educause.edu/ir/library/PDF/ELI3027.pdf
Dube, E. F. (1985). The relationship between racism and education in South Africa. Harvard Educational Review, 55(1), 86—100.
Elder-Vass, D. (2010). The causal power of social structures. New York, NY: Cambridge University Press.
Elliott, J. (2001). Making evidence-based practice educational. British Educational Research Journal, 27(5), 555–574.
Essa, A. (2013). Can we improve retention rates by giving students chocolates? [Web log post]. Retrieved from http://alfredessa.com/2013/10/can-we-improve-retention-rates-by-giving-students-chocolates/
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Knowledge Media Institute, Technical Report KMI-2012, 1. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf
Forsythe, R. G., Chacon, F. J., Spicer, Z., & Valbuena, A. (2012). Two case studies of learner analytics in the university system of Maryland. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/two-case-studies-learner-analytics-university-system-maryland
Gillborn, D., & Youdell, D. (2000). Rationing education: Policy, practice, reform and equality. Buckingham: Open University Press.
Giroux, H. A. (1992). Border crossings: Cultural workers and the politics of education. New York: Routledge.
Giroux, H. A. (2003). Racial injustice and disposable youth in the age of zero tolerance. Qualitative Studies in Education, 16(4), 553—565.
Graber, K. C. (1996). Influencing student beliefs: The design of a “high impact” teacher education program. Teaching and Teacher Education, 12(5), 451–466.
Grimes, S. K. (1997). Underprepared community college students: characteristics, persistence, and academic success. Community College Journal of Research and Practice, 21(1), 47—56. DOI: 10.1080/1066892970210105
Hartley, D. (1995). The ‘McDonaldisation’ of higher education: Food for thought? Oxford Review of Education, 21(4), 409—423.
Henman, P. (2004). Targeted! Population segmentation, electronic surveillance and governing the unemployed in Australia. International Sociology, 19, 173—191.
Jansen, J. D. (1990a). Curriculum policy as compensatory legitimation? A view from the periphery. Oxford Review of Education, 16(1), 29—38.
Jansen, J. D. (1990b). Curriculum as political phenomenon: Historical reflections on Black South African education. The Journal of Negro Education, 59(2), 195–206.
Joynt, G. M., & Gomersall, C. D. (2005). Making moral decisions when resources are limited – an approach to triage in ICU patients with respiratory failure. South African Journal of Critical Care (SAJCC), 21(1), 34–44. Retrieved from http://www.ajol.info/index.php/sajcc/article/view/35543
Knight, S., Buckingham Shum, S., & Littleton, K. (2013). Collaborative sensemaking in learning analytics. Retrieved from http://oro.open.ac.uk/36582/
Kogan, M. (1987). The political view. In B. R. Clark (Ed.), Perspectives on higher education: Eight disciplinary and comparative views (pp. 56–78). Berkeley, CA: University of California Press.
Lagowski, J. J. (1995). Higher education: A time for triage? Journal of Chemical Education, 72(10), 861.
Manning, C. (2012, March 14). Educational triage [Web log post]. Retrieved from http://colinmcit.blogspot.co.uk/2012/03/educational-triage.html
Marks, R. (2012). “I get the feeling that it is really unfair”: Educational triage in primary mathematics. In C. Smith (Ed.), Proceedings of the British Society for Research into Learning Mathematics, 32(2). Retrieved from https://www.bsrlm.org.uk/IPs/ip32-2/BSRLM-IP-32-2-10.pdf
Marwick, A. E. (2014, January 9). How your data are being deeply mined. The New York Review of Books. Retrieved from http://www.nybooks.com/articles/archives/2014/jan/09/how-your-data-are-being-deeply-mined/?pagination=false
May, H. (2011). Is all the fuss about learning analytics just hype? Retrieved from http://www.loomlearning.com/2011/analytics-schmanalytics
Mayer-Schönberger, V. (2009). Delete. The virtue of forgetting in the digital age. Princeton, NJ: Princeton University Press.
McInerney, D. M., & King, R. B. (2013). Harnessing the power of motivational factors for optimizing the educational success of remote indigenous students: A cross-cultural study. In R. G. Craven & J. Mooney (Eds.), Seeding success in indigenous Australian higher education (Diversity in Higher Education, Volume 14) (pp. 81–111). Emerald Group Publishing Limited. DOI: 10.1108/S1479-3644(2013)0000014004
Meisenhelder, S. (2014, March 6). Rush to online higher ed only provides ‘access’ to failure [Web log post]. Retrieved from http://www.huffingtonpost.com/susan-meisenhelder/rush-to-online-higher-education_b_4914762.html
Morozov, E. (2013a, October 23). The real privacy problem. MIT Technology Review. Retrieved from http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem/
Morozov, E. (2013b). To save everything, click here. London, UK: Penguin Books.
Newman, F., Couturier, L., & Scurry, J. (2010). The future of higher education: Rhetoric, reality, and the risks of the market. San Francisco, CA: Jossey Bass.
New Media Consortium. (2014). NMC Horizon Report. Retrieved from http://www.nmc.org/pdf/2014-nmc-horizon-report-he-EN.pdf
Oblinger, D. G. (2012). Let’s talk analytics. EDUCAUSEreview, July/August, 10—13. Retrieved from http://www.educause.edu/ero/article/lets-talk-analytics
Oliver, M., & Conole, G. (2003). Evidence-based practice and e-learning in higher education: Can we and should we? Research Papers in Education, 18(4), 385—397. DOI: 10.1080/0267152032000176873
Pappano, L. (2012, November 12). The year of the MOOC. The New York Times. Retrieved from http://query.nytimes.com/gst/fullpage.html?res=9906E0D91F3EF937A35752C1A9649D8B63
Prinsloo, P. (2009). Modelling throughput at Unisa: The key to the successful implementation of ODL. Retrieved from http://uir.unisa.ac.za/handle/10500/6035
Reeves, T. (2011). Can educational research be both rigorous and relevant? Educational Designer. Journal of the International Society for Design and Development in Education. Retrieved from http://www.educationaldesigner.org/ed/volume1/issue4/article13/index.htm
Richardson, W. (2012, July 14). Valuing the immeasurable [Web log post]. Retrieved from http://willrichardson.com/post/27223512371/valuing-the-immeasurable
Rovai, A. P. (2002). In search of higher persistence rates in distance education online programs. Internet and Higher Education, 6, 1—16. DOI: 10.1016/S1096-7516(02)00158-6.
Rust, V. D., & Kim, S. (2012). The global competition in higher education. World Studies in Education, 13(1), 2—50. Retrieved from http://www.academia.edu/2490323/The_Global_Competition_in_Higher_Education
Selwyn, N. (2014). Distrusting educational technology: Critical questions for changing times. New York, NY: Routledge.
Shirky, C. (2014, January 29). The end of higher education’s golden age [Web log post]. Retrieved from http://www.shirky.com/weblog/2014/01/there-isnt-enough-money-to-keep-educating-adults-the-way-were-doing-it/
Siemens, G. (2011). Learning analytics: A foundation for informed change in higher education. Retrieved from http://www.educause.edu/library/resources/learning-analytics-foundation-informed-change-higher-education
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, September/October, 31-40. Retrieved from https://net.educause.edu/ir/library/pdf/ERM1151.pdf
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioural Scientist, 57(10), 1380—1400.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1509–1528.
Slaughter, S., & Rhoades, G. (2010). Academic capitalism and the new economy: Markets, state, and higher education. Baltimore, Maryland: Johns Hopkins University Press.
Sparks, S. D. (2012, March 9). Study finds ‘Bubble Student’ triage a gut reaction to rising standards [Personal web log post]. Retrieved from http://blogs.edweek.org/edweek/inside-school-research/2012/03/study_finds_bubble_student_tri.html
Staley, D. J., & Trinkle, D. A. (2011). The changing landscape of higher education. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/changing-landscape-higher-education
Straumsheim, C. (2013, November 6). Mixed signals [Web log post]. Retrieved from http://www.insidehighered.com/news/2013/11/06/researchers-cast-doubt-about-early-warning-systems-effect-retention
Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177–193.
Swain, H. (2013, August 5). Are universities collecting too much information on staff and students? [Web log post]. The Guardian. Retrieved from http://www.theguardian.com/education/2013/aug/05/electronic-data-trail-huddersfield-loughborough-university
The Economist. (2014, June 28–July 4). Creative destruction: Reinventing the university. The Economist, 411(8893).
Van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012). Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative, 1, 1—11. Retrieved from http://sites.ewu.edu/elearningservices/files/2012/06/Analytics-in-Higher-Education-Establishing-a-Common-Language-ELI3026.pdf
Wagner, E., & Ice, P. (2012, July 18). Data changes everything: delivering on the promise of learning analytics in higher education. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/data-changes-everything-delivering-promise-learning-analytics-higher-education
Watters, A. (2012). Unbundling and unmooring: technology and the higher ed tsunami. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/unbundling-and-unmooring-technology-and-higher-ed-tsunami
Watters, A. (2013, October 13). Student data is the new oil: MOOCs, metaphor, and money [Web log post]. Retrieved from http://www.hackeducation.com/2013/10/17/student-data-is-the-new-oil/
Waugh, W. L., Jr. (1998). Conflicting values and cultures: The managerial threat to university governance. Retrieved from http://www.jcu.edu/academic/planassess/planning/files/Planning%20articles/managerial%20threat%20to%20univ%20govern.pdf
Wetterstrom, L. (2014, January 28). The year after the MOOC [Web log post]. The Gate. Political analysis and opinion from the University of Chicago. Retrieved from http://uchicagogate.com/2014/01/28/years-after-mooc/
Wiley, D., & Hilton III, J. (2009). Openness, dynamic specialization, and the disaggregated future of higher education. International Review of Research in Open and Distance Learning, 10(5), 1—16. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/768/1414
Wilson, T. (2012, March 30). Educational triage – who gets lost? [Personal web log post]. Retrieved from http://nisce.org/educational-triage/
Winslow, G. R. (1982). Triage and justice. Berkeley, CA: University of California Press.
Wishon, G. D., & Rome, J. (2012, August 13). Enabling a data-driven university. EDUCAUSEreview. Retrieved from http://www.educause.edu/ero/article/enabling-data-driven-university
World Medical Association. (1994). Statement on medical ethics in the event of disasters. Adopted by the 46th WMA General Assembly, Stockholm, Sweden, September 1994, and revised by the 57th WMA General Assembly, Pilanesberg, South Africa, October 2006. Retrieved from http://www.wma.net/en/30publications/10policies/d7/
Yorke, M. (2004). Retention, persistence and success in on‐campus higher education, and their enhancement in open and distance learning. Open Learning: The Journal of Open, Distance and e-Learning, 19(1), 19—32. DOI: 10.1080/0268051042000177827