Practice exchange

Enhancing feedback and feed-forward via integrated virtual learning environment based evaluation and support

P. Penn & I. Wells

Psychology Teaching Review Vol. 23 No. 2, 2017

Over the last 15 years, the subject of feedback on assessment has come under considerable scrutiny in the literature, with a particular focus on the utility of feedback for subsequent assessment (i.e. feed-forward). Organisations such as the Higher Education Academy (HEA) have synthesised research on this topic into guidance for tutors through the Student Enhanced Learning through Effective Feedback (SENLEF) project (Juwah et al., 2004). However, the provision of advice on good practice in feedback for tutors faces a parallel challenge to the provision of feedback to students: 'it's only useful if it's used'. Creative use of technology has the potential to reconcile the requirements of high value feedback for students with university resource constraints. This paper outlines the use of the Grademark functionality within the Turnitin© software package to create a series of standardised feedback items, informed by principles of good practice advocated by SENLEF, that facilitate the process of feed-forward by web-linked integration into a custom on-line repository of constructively aligned skills based material.

Keywords: Feedback, Feed-forward, Virtual Learning Environment (VLE), Turnitin

Conceptualising feedback and feed-forward

IN HIS DEFINITION of the term 'feedback', Sadler (1989) argued that in order for feedback to benefit the quality of student learning: 'The learner has to (a) possess a concept of the standard (or goal, or reference level) being aimed for, (b) compare the actual (or current) level of performance with the standard, and (c) engage in appropriate action which leads to some closure of the gap' (p.121). Thus the extent to which feedback can be used to benefit future assessment performance, i.e. its 'feed-forward' (e.g. Duncan, 2007), assumes a critical role in its definition.

The importance of the utility of feedback has been embraced by the literature (Murtagh & Baker, 2009; Price et al., 2010; Wimhurst & Manning, 2013). Bodies such as the Higher Education Academy (HEA) have synthesised extensive research on feedback into the Student Enhanced Learning through Effective Feedback (SENLEF) project (Juwah et al., 2004), which embeds feed-forward into its principles of good practice. However, despite such efforts, feedback remains a principal source of dissatisfaction for students in higher education (Bloxham, 2014; Soilemetzidis et al., 2014). Studies have indicated that feedback can be difficult to understand (Weaver, 2006), lack specific advice on how to improve (Higgins et al., 2001) or be difficult to act upon (Poulos & Mahony, 2008). A possible principal reason for this limited success is that the literature on feedback, and projects such as SENLEF, are being confounded by the same fundamental challenge in communicating principles of good practice to tutors that tutors face when communicating feedback to students: i.e. 'the advice is only useful if it's acted upon'.

Reconciling feedback and resources: The need for high value feedback and sustainable practice

Research has indicated that a number of factors, including time pressure (e.g. Chanock, 2000) and perception of student recipience (Price et al., 2010), can be an impediment to the implementation of advice on good practice with respect to feedback. A clear example of this issue is implicit in the distinction made by Hounsell (2007) between 'low vs. high value feedback'. Low value feedback encapsulates practices such as: pre-occupation with superficial aspects of the work, e.g. errors in spelling and grammar; terse comments, for instance: 'this is unclear'; the use of symbols to denote evaluative comments, such as: '???'; and comments that focus on describing the issues with a piece of work without suggestions on how to remedy them. In contrast, high value feedback focuses on the conceptual aspects of the work, such as the development of argument, and is explicit and directive in terms of providing instruction on what can be done to improve any issues identified. Unfortunately, high value feedback requires considerably more staff resources to provide within a higher education landscape where larger student numbers, increased modularisation and end loading of assessment have become increasingly problematical (Carless et al., 2011). A key objective, therefore, lies not just in the promotion of good practice, but concomitantly in ensuring that it mitigates potential barriers to its implementation.

Utilising technology to facilitate the implementation of sustainable good practice in feedback

There is precedent in the literature for technology facilitating attempts to implement assessment practice that might be too resource intensive by other means, especially when dealing with large cohorts (Heinrich et al., 2009; Hepplestone et al., 2011). This has been demonstrated with respect to quiz based assessment (e.g. McDaniel et al., 2012; Penn & Wells, in press). Research has also begun to examine how technology might be harnessed to enhance feedback on written assignments such as essays, via the functionality of word processing software to provide standardised feedback comments that can be used as part of a marker's repertoire (e.g. Denton et al., 2008). At the same time, contemporary literature has been examining how feedback can be constructively aligned with pre-defined assessment criteria or learning outcomes to address concerns with feedback that have been articulated by students, such as that it can lack specific advice on how to improve (Higgins et al., 2001) or be difficult to act upon (Poulos & Mahony, 2008). There is, therefore, an impetus and opportunity to use technology to audit and shape feedback in relation to the requirements of the assessment, and to more general principles of good practice such as those postulated by SENLEF, before marking occurs.

Arguments for using technology in the feedback process go beyond just the characteristics of the feedback, however. They also extend to facilitating student access to their feedback. In an insightful analysis of impediments to student engagement with feedback, Winstone et al. (2016) postulated four psychological processes governing student engagement (awareness, cognisance, agency and volition) and emphasised that students often knew of appropriate strategies to make use of their feedback, but that interventions needed to support the students in turning that knowledge into action by making feedback seeking more accessible and encouraged. This is something that technology could also be of great help with, particularly when software used to provide feedback can be embedded into a virtual learning environment and integrated with sources of support on how to implement feedback, i.e. nesting the feedback within a wider network of support (Hounsell, 2015). Turnitin© is an example of such software. Turnitin contains the Grademark interface, which has a feature called 'Quick Marks' that allows the application of pre-specified feedback comments to specific points within students' submissions; these can be used by markers (in addition to their own comments) as and when appropriate. An elucidation of how the authors have used this facility in conjunction with on-line skills provisions to reconcile the need for high value feedback with resource constraints will now be illustrated with respect to two key principles identified by SENLEF. The two principles selected have a particular emphasis on promoting feed-forward, i.e. that good feedback: 'Helps clarify what good performance is (goals, criteria, expected standards) … and provides opportunities to close the gap between current and desired performance' (Nicol & Macfarlane-Dick, 2006, p.7).

Utilising Quick Marks to help feedback clarify what constitutes good performance

Understanding expectations with respect to assessment is an obvious prerequisite for student attainment. However, there is research that suggests that such expectations about marking criteria and assessment standards are not aligned between students and tutors (e.g. Rust et al., 2003), and that the magnitude of the discrepancy negatively correlates with performance on assessed work (Hounsell, 1997). Clarifying what constitutes good performance in feedback also assumes additional importance in terms of feed-forward into future assessments (Sadler, 2010). An obvious way of achieving this is to ensure that comments are tied to explicit marking and grading criteria, such that they have some utility with respect to the development of a skill (such as using evidence) that can then be applied to subsequent assessment featuring the same criterion. This, however, is labour intensive, and marker time and fatigue become limiting factors, especially in large cohorts. The authors have addressed this issue by postulating a series of Quick Marks for different levels of performance with respect to each of seven marking criteria used in the assessment of essays: academic integrity; addressing the question; critical evaluation; quality of composition; structuring; standard of coverage; and using evidence.

These Quick Marks can be inserted into specific parts of a student's composition (denoted by a coloured symbol) or attached to highlighted text. The comments expand from their symbol denotation only when the mouse is rolled over them, so as to aid legibility by reducing the conflict between the feedback and the composition. This is an important factor in students' comprehension of their feedback (Sadler, 2010) and can be very hard to achieve with paper based marking. Furthermore, Quick Marks are titled and categorised with respect to the marking criterion to which they relate, so tutors can easily locate relevant comments. Consider the following example of a Quick Mark pertaining to an essay marking criterion called 'using evidence' that could be inserted into a student's composition via one mouse click.

'Unconvincing evidence: This text contains evidence of dubious reliability. In an academic piece of writing, you should be referring to peer reviewed sources (e.g. published journals, books, conference proceedings etc.). You should consider the reliability of the source when using website based material (e.g. Wikipedia), as this is often not peer reviewed and the authority/qualifications of the author can be unclear. You should also avoid using anecdotal evidence; experience is not a substitute for empirical investigation. For additional help and advice on this go to https://moodle.uel.ac.uk/mod/page/view.php?id=66161'.

In this example, the comment is explicitly tied to the relevant marking criterion, the issue with the work is explained with respect to that criterion, and what constitutes good performance is clarified, all with very little time investment from the marker.

Embedding URL links within Quick Marks to additional sources of on-line support to facilitate the feed-forward of tutor comments

Research indicates that students often need help to facilitate their efforts to make use of their feedback (i.e. feed-forward), and that the absence of such instruction can have a deleterious effect on their engagement with feedback (e.g. Doan, 2013; Weaver, 2006). Carless et al. (2011) commented that this need has become more pronounced in recent years owing to the expansion of higher education and the shift of the timing of assessment towards the end of modularised courses. Indeed, students have indicated a preference for feedback delivery on a one-to-one basis in a dialogue with tutors (Murtagh & Baker, 2009), and conceptualising feedback as more of a dialogue has been advocated by research (e.g. Nicol, 2010). However, resource limitations make providing this level of support problematical (Hounsell et al., 2008). This limitation is particularly salient in modules featuring large cohorts and several assessments.

An alternative approach to one-to-one tutor based follow up on feedback is to support students in being proactive in feed-forward via the provision of resources on the implementation of comments on their work, i.e. to nest the feedback within a wider network of support (Hounsell, 2015). The authors have achieved this by embedding web-links within Quick Marks to provide direct and immediate access to constructively aligned additional learning materials and guidance on interpreting feedback online. Each Quick Mark postulated for essay feedback contains a URL to a corresponding help video located in an essay writing guides section of an extensive on-line study skills repository within Moodle. If the reader refers to the example Quick Mark on using evidence given previously, they will see that it concludes with a URL link to a video in the skills repository on help with using evidence. This video contains a delineation of the using evidence criterion, action points on how to satisfy this criterion, and examples that invite students to identify whether they are exemplars of good or bad practice with respect to this criterion. Each video contains the kind of action points and model responses/exemplars advocated by Nicol and Macfarlane-Dick (2006) to facilitate students closing the gap between their current level of performance and the desired level of performance.

Quick Mark and VLE integration: some concluding comments

A concern with the use of Quick Marks is that students might perceive feedback that is devoid of any non-standardised comments from the marker as 'impersonal'. There is research to suggest that students are less inclined to deal with feedback that is deemed generic, i.e. not specific to the student (e.g. Doan, 2013). However, it is important to clearly distinguish between the terms 'generic' and 'standardised'. A series of appropriately used and placed standardised comments that reflect the strengths and weaknesses of an individual's work is not generic feedback. In contrast, non-standardised comments that are either not specifically tied to the contents of individual work, or which seek to characterise the cohort's performance as a whole, are generic feedback. It should be noted that the authors are not advocating the use of Quick Marks to the exclusion of individual comments. Indeed, there is the facility for a marker to insert their own comments or append Quick Marks to expand or clarify the points being made, and so give feedback the personal touch advocated by authors such as Hounsell (2015). Conversely, it could be argued that pre-specified comments are more conducive to avoiding idiosyncratic marking practices, such as errors in student compositions being attributed to personal failings, which can be deleterious to feedback (Shute, 2008; Värlander, 2008).

Evaluation of student and staff perception of the integration of Quick Marks with VLE based support is clearly warranted. In the interim, this paper postulates that there is a strong evidence-based pedagogical argument for the utility of this approach in: facilitating the implementation of good practice in giving feedback for staff; reconciling the need for high value feedback with resource constraints; and promoting the feed-forward of tutor comments in student compositions.

The authors
P. Penn & I. Wells

Correspondence
Dr Paul Penn, School of Psychology, University of East London, Water Lane, London, E15 4LZ. Email: [email protected]

References
Bloxham, S. (2014). Assessing assessment: New developments in assessment design, feedback practices and marking in higher education. In F. Heather, S. Ketteridge & S. Marshall (Eds.) A handbook for teaching and learning (pp.107–123). New York, NY: Routledge.
Carless, D., Salter, D., Yang, M. & Lam, J. (2011). Developing sustainable feedback practices. Studies in Higher Education, 36, 395–407.
Chanock, K. (2000). Comments on essays: Do students understand what tutors write? Teaching in Higher Education, 5(1), 95–105.
Denton, P., Madden, J., Roberts, M. & Rowe, P. (2008). Students' response to traditional and computer assisted formative feedback: A comparative case study. British Journal of Educational Technology, 39, 486–500.
Doan, L. (2013). Is feedback a waste of time? The students' perspective. Journal of Perspectives in Applied Academic Practice, 1(2), 3–10.
Duncan, N. (2007). Feed-forward: Improving students' use of tutors' comments. Assessment & Evaluation in Higher Education, 32(3), 271–283.
Heinrich, E., Milne, J., Ramsay, A. & Morrison, D. (2009). Recommendations for the use of e-tools for improvements around assignment marking quality. Assessment & Evaluation in Higher Education, 34(4), 469–479.
Hepplestone, S., Holden, G., Irwin, B., Parkin, H.J. & Thorpe, L. (2011). Using technology to encourage student engagement with feedback: A literature review. Research in Learning Technology, 19(2), 117–127.
Higgins, R., Hartley, P. & Skelton, A. (2001). Getting the message across: The problem of communicating assessment feedback. Teaching in Higher Education, 6(2), 269–274.
Hounsell, D. (1997). Contrasting conceptions of essay-writing. In F. Marton, D. Hounsell & N. Entwistle (Eds.) The experience of learning (Rev. 2nd edn, pp.106–125). Edinburgh: Scottish Academic Press.
Hounsell, D. (2007). Towards more sustainable feedback to students. In D. Boud & N. Falchikov (Eds.) Rethinking assessment in higher education (pp.101–113). London: Routledge.
Hounsell, D. (2015). Commenting constructively: Feedback to make a difference. Wise assessment briefing number 11. Hong Kong: University of Hong Kong.
Hounsell, D., McCune, V., Hounsell, J. & Litjens, J. (2008). The quality of guidance and feedback to students. Higher Education Research & Development, 27(1), 55–67.
Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D. & Smith, B. (2004). Enhancing student learning through effective formative feedback. Retrieved from http://www.heacademy.ac.uk/resources.asp?process=full_record&section=generic&id=353
McDaniel, M.A., Wildman, K.M. & Anderson, J.L. (2012). Using quizzes to enhance summative-assessment performance in a web-based class: An experimental study. Journal of Applied Research in Memory and Cognition, 1, 18–26.
Murtagh, L. & Baker, N. (2009). Feedback to feed forward: Student response to tutors' written comments on assignments. Practitioner Research in Higher Education, 3(1), 20–28.
Nicol, D. (2010). From monologue to dialogue: Improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501–517.
Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–216.
Penn, P.R. & Wells, I.R. (in press). Making assessment promote effective learning practices: An example of ipsative assessment from the School of Psychology at UEL. Psychology Teaching Review.
Poulos, A. & Mahony, M.J. (2008). Effectiveness of feedback: The students' perspective. Assessment & Evaluation in Higher Education, 33(2), 143–154.
Price, M., Handley, K., Millar, J. & O'Donovan, B. (2010). Feedback: All that effort, but what is the effect? Assessment & Evaluation in Higher Education, 35(3), 277–289.
Rust, C., Price, M. & O'Donovan, B. (2003). Improving students' learning by developing their understanding of assessment criteria and processes. Assessment & Evaluation in Higher Education, 28(2), 147–164.
Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–141.
Sadler, D.R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550.
Shute, V.J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Soilemetzidis, I., Bennett, P., Buckley, A., Hillman, N. & Stoakes, G. (2014). The HEPI–HEA student academic experience survey 2014. York: The Higher Education Academy. Retrieved from http://www.hepi.ac.uk/uploads/2014/05/HEA_HEPIReport_WEB_160514.pdf
Värlander, S. (2008). The role of students' emotions in formal feedback situations. Teaching in Higher Education, 13(2), 145–156.
Weaver, M.R. (2006). Do students value feedback? Student perceptions of tutors' written responses. Assessment & Evaluation in Higher Education, 31, 379–394.
Wimhurst, K. & Manning, M. (2013). Feed-forward assessment, exemplars and peer marking: Evidence of efficacy. Assessment & Evaluation in Higher Education, 38, 451–465.
Winstone, N.E., Nash, R.A., Parker, M. & Rowntree, J. (2016). Supporting learners' agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist. Advance online publication. doi:10.1080/00461520.2016.1207538