Title: Developing a Holistic, Whole of Institution Retention Project
Presenter: Kelly Linden (Retention Lead)
Co-presenter(s): Neil Van der Ploeg and Ben Hicks
Faculty / Division: DSS
School / Unit: Retention Team
Session Type: Symposium
When: Thursday, 18 November 2021 @ 11:15am – 12:15pm
The Charles Sturt University Retention Team has been working across the three Faculties and the Division of Student Success since 2019 to increase domestic undergraduate student retention and engagement. The team works closely with subject coordinators and the Student Outreach Team to identify and contact disengaged students in the pre-census period of each main session, and pilots initiatives aimed at increasing student success and engagement.
Ghosting is a student behaviour characterised by enrolling in a subject but never participating; a ghost student who remains enrolled receives a zero fail grade. From 2022, the Job-ready Graduates Package will require that only genuine students have access to Commonwealth assistance at Australian universities, and an institution may need to refund the fees of what are referred to as ‘non-genuine’ students. In 2019 and 2020, 382 first-year subjects were monitored to identify disengaged students in weeks 3 and 4 of the session using learning analytics and non-submission of an early assessment item. Disengaged students were contacted via phone and two-way SMS and offered timely, targeted support pre-census. The total number of domestic undergraduate students receiving a zero fail has decreased over this period. To further reduce the number of ghost students, engagement should be mandatory for students identified as disengaged to remain enrolled in the subject.
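The identification step described above can be sketched in code. This is an illustrative sketch only, not the team's actual pipeline: it flags students who have no LMS logins by the check week and have not submitted the early assessment item. All field names and the data structure are hypothetical.

```python
# Illustrative sketch (hypothetical fields): flag students as disengaged when
# they have no LMS logins by the check week AND have not submitted the early
# assessment item, mirroring the two signals described in the abstract.

def flag_disengaged(students, check_week=3):
    """Return IDs of students with no logins and no early submission by check_week."""
    flagged = []
    for s in students:
        logins_so_far = [w for w in s["login_weeks"] if w <= check_week]
        if not logins_so_far and not s["early_assessment_submitted"]:
            flagged.append(s["id"])
    return flagged

cohort = [
    {"id": "s1", "login_weeks": [1, 2, 3], "early_assessment_submitted": True},
    {"id": "s2", "login_weeks": [], "early_assessment_submitted": False},
    {"id": "s3", "login_weeks": [1], "early_assessment_submitted": False},
]
print(flag_disengaged(cohort))  # ['s2']
```

In practice a list like this would feed the outreach step (phone and two-way SMS contact) rather than any automated action.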
Three large first-year undergraduate subjects with 240–517 enrolled students were selected to participate in this pilot study. A meeting-scheduling tool was embedded in the learning management system, and thirty-minute, one-on-one tutorial sessions were available to students in the two weeks leading up to the due date of at least one large written task. Of the enrolled students, 31% attended at least one appointment with a tutor. On assessments completed before the first tutorials were offered, there was no difference in average mark between students who later attended a tutorial session and those who did not. However, students who attended a tutorial showed a significant increase in average cumulative grade (10%, p<0.05). The novel use of the calendar booking tool combined with online meeting technology provides a simple, convenient method for engaging a large cohort of students in online support.
Whether a student is being productive in an online learning environment is difficult to discern from online trace data. Custom-built assessment tools, integrated into a learning management system (LMS), offer a way to obtain finer-grained data not commonly available in existing systems. An exploratory observational study of 1,822 assessment submissions in an online course of 500 students was conducted. All assessments were submitted using a new online assessment tool that offered embedded resources and feedback, and tracked when words were typed or pasted into the tool. Students were hesitant to use the online editor consistently, used the embedded resources moderately and used the feedback heavily. There was moderate evidence that whether a student viewed the previous assessment's feedback was a better indicator of future assessment success than LMS activity.
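The comparison behind that final claim can be sketched as follows. This is a minimal illustration with invented data and hypothetical field names, not the study's analysis: it contrasts the gap in mean next-assessment mark between students who did and did not view prior feedback with the same gap split on raw LMS activity.

```python
# Illustrative sketch (hypothetical data): compare how well two binary signals
# separate students by mark on the next assessment. A larger gap between the
# True and False groups suggests a stronger indicator.

def mean(xs):
    return sum(xs) / len(xs)

def group_gap(records, key):
    """Difference in mean next-assessment mark between key=True and key=False groups."""
    yes = [r["next_mark"] for r in records if r[key]]
    no = [r["next_mark"] for r in records if not r[key]]
    return mean(yes) - mean(no)

data = [
    {"viewed_feedback": True,  "high_lms_activity": True,  "next_mark": 78},
    {"viewed_feedback": True,  "high_lms_activity": False, "next_mark": 74},
    {"viewed_feedback": False, "high_lms_activity": True,  "next_mark": 60},
    {"viewed_feedback": False, "high_lms_activity": False, "next_mark": 58},
]
print(group_gap(data, "viewed_feedback"))    # 17.0
print(group_gap(data, "high_lms_activity"))  # 3.0
```

In this toy sample, viewing feedback separates the groups far more than raw LMS activity does; the study's actual evidence for this pattern was described as moderate.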