
From scoping to implementing: insights and guidance from Newcastle’s learning analytics journey

The learning analytics project team from Newcastle University share their experience implementing data explorer and study goal.

Newcastle University (NU) is a Russell Group institution based in Newcastle upon Tyne, with two overseas campuses in Singapore and Malaysia. NU is partnering with Jisc to deploy learning analytics as a service, using data explorer and study goal on a secure cloud platform.

The NU project team, including Sam Flowers and Matt Laidler, learning enhancement and technology advisers from the Learning and Teaching Development Service (LTDS) and Dr Raghda Zahran, programme and project manager from the Information Technology Project Management Office (PMO), agreed that building an understanding of the scope of analytics is essential:

"Like the rest of the sector, we have a growing body of colleagues and alumni and progressing cohorts of students, and we needed to build an understanding of: what do we mean by analytics? Why are we using it, and what are the benefits?"

Define your priorities early

The team was already involved with the university’s education systems and had an in-depth understanding of how learning analytics could be useful. They mapped a framework around the aim, objectives, and data before they moved onto policy, as Raghda explains:

"We used our learning analytics policy as a starting point and identified and prioritised our data points; however, we needed to be clear about what we wanted to achieve with analytics."

The project team engaged with a wider network of colleagues across the sector. This helped them build a comprehensive overview of the current landscape, understand existing policies, and see how other institutions had considered and deployed learning analytics.

All the team appreciated the ‘open, honest and friendly’ support of the existing learning analytics community. Sam explains:

"We've often had really effective conversations with other institutions that are a lot further down the road than we are. Finding out about how it's been implemented, what type of stakeholders, what policies and procedures are impacted. People are happy to share their experiences, which has been really beneficial."

The analytics team focused on students' learning experiences, centring on wellbeing, engagement, and achievement rather than retention. With a strong student body, the aim was to build and maintain students' interactions with their tutors and support services.

When choosing data explorer and study goal, NU took into account that Jisc's learning analytics service is co-designed with the sector. Raghda says:

"We were looking for a delivery partner that cares about student wellbeing and specific requirements."

Support from senior managers and leaders

The project objectives were carefully examined by NU's digital education sub-committee and education committee. The project was made possible thanks to the crucial support of the university registrar, who owns the student data and has a strong appreciation of its significance. Raghda explains:

"Our senior management play a pivotal role in guiding and shaping the direction of our analytics initiative."

Sam Flowers agreed, commenting:

"A steering group chaired by the registrar gave us some significant movement and helped us to push this forward."

Completing a business case and securing the necessary funds was a vital step to move the project from concept to delivery.

Bring the people on the journey

Engaging the right stakeholders is critical to ensure the success and effectiveness of any initiative or project. Many academics were very enthusiastic about the plans. In addition, colleagues who used teaching analytics and business intelligence contributed their own ideas to the project. Raghda says:

"Identify key stakeholders because you can identify priority data feeds, but you need access to understand how the data is used."

There will always be some scepticism at the advent of a change programme, particularly among colleagues who aren't already using learning analytics and may not understand what it could do for them. The team were sensitive to anything that might be considered an additional burden or workload. Building on the existing culture of student support was core to the project's success.

The team also acknowledged the support of their in-house technical teams:

"The synchronisation amongst our data owners, integration, and software development teams’ commitment was astonishing."

Plan big, start small, and keep testing

The team realised that they would need a scaffolded approach to the pilot, to match the agile technical build. By splitting the pilot into small, incremental stages, they could identify any pain points, resolve them, or keep them in mind for the next stage. Sam says:

"We rolled out small to begin with so that we could test the data, get feedback, find out if there's any issues, any data inaccuracies and determine best use cases. It’s a new system, so we wanted to know how colleagues were using it and what they found to be the most useful functionality."

The pilot included focus groups, feedback sessions, and online feedback forms to gather as much qualitative and quantitative data as possible.

The Jisc learning analytics team were on hand to support NU throughout the process. Sam describes this experience:

"It's been such a collaborative approach; we don't feel we would have had that with other systems. Jisc feel focused on improving students' opportunities and work in the higher education sector. We feel like they've really had our interests at heart, so that has been a really positive thing."

Students at the heart of learning analytics

Sam shared how the team has run focus groups with students throughout the pilot, looking at how beneficial they find the system, how often they feel they would use it, and why they would use it. Matt agreed:

"Having students in the process throughout has been massive in terms of helping us with staff buy-in. The students have been actively involved, from the communication, through to talking about what learning analytics means and how it can help them. All of that has fed into what we're doing; I think it's been huge."

The student input and feedback have helped overcome some of the concerns about monitoring or feeling 'tracked,' which Matt expanded upon:

"A few of the contentious issues have been around comparison to cohorts, obviously, and some people have quite strong feelings one way or the other about that. We found from talking with our student focus groups, that they quite like that functionality. It motivates them in terms of, 'okay, this is something I need to improve upon'."

Getting your house (and data) in order

So: you have the system, you have the people for the pilots, and importantly you have stakeholder support.

You know you have the data, but getting it into a new system is a significant part of the challenge for any institution moving to a centralised system, as Raghda explains:

"The biggest challenge for us was integrating and mapping our data, because of the differences in our institution, data sources, students' learning styles, and colleagues' practices."

The team recommend understanding the differences in your data as early as possible, as this also helps you to be 'flexible in future to changes':

"Data today may not be what we want tomorrow."

As they approach the release of study goal to undergraduates, the team reflected on the size and scale of the project and what it has meant to them:

"We were reflecting on what we've done so far, but the journey has just started for us. With learning analytics, the benefits outweigh any downsides. You have to give the data back to the students and colleagues so they can use it for learning and teaching."

Further information