From prospect to alumnus: London/Birmingham

Jisc is currently exploring four key co-design challenges:

  1. Research at risk
  2. From prospect to alumnus
  3. Effective learner analytics
  4. Building capability for new digital leadership, pedagogy and efficiency

You can find out more about each of these challenges, and about the co-design approach, on Jisc’s website.

Each challenge is actively working to consult the sector to help inform where Jisc should focus its effort. In this post we’d like to share with you some of the feedback we’ve had so far from two consultation workshops (one in London and one in Birmingham) focusing on “From prospect to alumnus”.

Structure of the consultation workshops

The two workshops were participatory in nature, designed to get as much input as possible from each attendee. We used a mixture of individual activities, so that everyone had the chance to contribute, and wider group discussions, which helped to tease out some of the cross-cutting themes.

The following list describes our approach:

  1. Student lifecycle. We asked each participant to highlight their experience in relation to the student lifecycle: where do individuals or their institution experience highs (things that work well for them) and lows (issues, barriers and roadblocks)?
  2. Individual case studies. Using one of the issues from the “student lifecycle” activity, we asked each participant to write an individual case study pertinent to them. These were then mapped onto a matrix: one axis captured who the issue affected (individuals, departments, institutions, through to the wider sector and beyond), while the other captured how many stages of the student lifecycle the issue affected.
  3. Solutions. Participants were then asked to vote on what they thought were the most pressing issues that Jisc needed to focus on. The top issues were selected and attendees were then given time to consider how these issues could be resolved.

Invites for the workshops went out through a range of communication channels including, but not limited to: the NUS, the UCISA Corporate Information Systems Group (CISG), the Relationship Management in HE/FE community, funding agencies, and NCUB.

Workshop findings

Experience in relation to the student lifecycle

There have been a number of reports focusing on the student lifecycle and different interpretations of what it does and does not include. We used the Jisc “Landscape Study of Student Lifecycle Relationship Management” as a foundation for discussions during our consultation, see Figure 1. Each participant was asked to annotate the model, highlighting lows and highs experienced along the journey.

Figure 1, student lifecycle stages used during this consultation.


There were two key observations during this exercise:

  1. Individual stages. Experience varied when focusing on a particular stage of the lifecycle: one institution might have cracked the application stage while others are floundering there. Institutions that are struggling in a particular stage would find it helpful to know who in the sector is mature in that area and how they achieved that maturity. They would also find it useful to know what already exists in terms of information, advice and guidance in relation to a specific stage.
  2. Cross-cutting themes. There were a range of cross-cutting themes that emerged from discussions which the vast majority of organisations struggle with, for example:
    1. Staff skills/competencies–leadership, digital literacies, embracing data
    2. A more joined-up approach–integrated systems, breaking down silos, information strategies
    3. Infrastructure–working more effectively with suppliers, up-to-date IT infrastructure

A lot of the issues highlighted at individual stages are areas in which Jisc has been, or is, involved, e.g. assessment and feedback, timetabling, and learner analytics. The links provided are just examples of what’s available from Jisc, so perhaps there is further work to be done in improving access to those resources and communicating the learning from some of those areas.

Individual case studies

The following are summaries of the case studies voted the highest priority for Jisc to address (all other case studies are plotted on the matrices, and the forms completed by attendees are available).


11 votes–effective implementation and use of analytics

We collect digital data on students across multiple systems. How can this data be brought together to enhance the student experience, increase retention, and support students? There are complexities in providing data to staff, giving students access, and moving data across multiple systems.

Affects: Existing students, academic staff
Impacts: Induction, L&T, pastoral care
Complexity: 6/8

7 votes–MI/BI management: ownership, policy and implementation

It is noticeable that there is a lack of governance of management information (MI) and business intelligence (BI). Departments tend to look after their own interests, and nobody takes responsibility for MI and BI across the board. There are also issues with data retention, meaning it is difficult to work out what to retain, maintain or archive. Data governance is difficult without strategy and vision from the top – it is hard to get buy-in at VC level.

Affects: Potential students, existing students, academic staff, support staff, graduates/alumni
Impacts: Whole lifecycle
Complexity: 7/8

6 votes–inconsistencies in giving timely feedback to students

Feedback is a big issue across the board. It is both a human problem (people can slow things down) and a system problem – if systems are not user friendly, some people will not bother to use them, meaning feedback can be delayed or not given at all. This could be an issue for retention in some institutions. There is a mix of institutions using in-house systems and established solutions.

Affects: Existing students, academic staff
Impacts: L&T
Complexity: 5/8

5 votes–assessment criteria and feedback

This is a sector-based issue which affects the whole lifecycle. Criteria are often not very objective – students don’t know what they need to do to get a 2:1, for example. There are also inconsistencies in feedback, which have implications for pastoral support. Students don’t know what to aim for, which in turn can affect their outcomes and have a knock-on effect for other aspects of the student experience.

Affects: Potential students, existing students, academic staff, support staff
Impacts: Pre-application, application, induction, L&T, pastoral care, employability, graduation, alumni
Complexity: 8/8

Figure 2–matrix of issues discussed at the London workshop



7 votes–integration of systems

From an institutional perspective the greatest issues are around registration, induction, learning and teaching, and assessment, but the problem goes wider. The MIS doesn’t talk well to the VLE, impacting the student and lecturer experience (for example, when data hasn’t rolled onto the VLE at the start of the course, students are not recognised by systems). Similar issues exist with assessment and student records – systems can be clunky, require unnecessary effort, and delay feedback. Systems can constrain rather than support what can be done, due to the lack of integration between assessment and student record systems.

A lot of our technical implementations were put in place to do specific things, perhaps a long time ago. We are now trying to get them to do things they weren’t designed for, bolting functionality on rather than building it in from the start. People are experiencing the same problems, but working around them in different ways.

Affects: Existing students, academic staff, support staff
Impacts: Registration, induction, L&T
Complexity: 3/8
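The integration pain described above often comes down to reconciling two lists that should agree but don’t. As a purely illustrative sketch (the systems, field names and student IDs below are all invented, not taken from any real MIS or VLE), the core of a reconciliation step might look like this:

```python
# Hypothetical sketch: comparing a student record system (SRS) export
# with a VLE enrolment list to find mismatches. All data is invented.

def diff_enrolments(srs_records, vle_records):
    """Return (students in the SRS but missing from the VLE,
    students still in the VLE who have left the SRS)."""
    srs_ids = {r["student_id"] for r in srs_records}
    vle_ids = {r["student_id"] for r in vle_records}
    to_add = sorted(srs_ids - vle_ids)      # not yet rolled onto the VLE
    to_remove = sorted(vle_ids - srs_ids)   # stale VLE enrolments
    return to_add, to_remove

srs = [{"student_id": "s001"}, {"student_id": "s002"}, {"student_id": "s003"}]
vle = [{"student_id": "s002"}, {"student_id": "s004"}]
add, remove = diff_enrolments(srs, vle)
print(add, remove)  # ['s001', 's003'] ['s004']
```

The logic itself is trivial; the workshop point is that every institution ends up rebuilding some version of it ad hoc, because the systems were never designed to share a common view of the student.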

6 votes–analysing data to best effect

Masses of data exist, but nobody knows how to use them or understands the reports. We need to present data fit for purpose, tailored to different audiences, that can be created automatically and sustainably. A people-focused solution needs to be in place before technological implementation.

People have limited time to engage with data, so we have to use it to best effect. There is a small window of opportunity – people respond best to data presented in different ways; for example, someone with a bad experience of numbers may prefer a visualisation that gets the key message across straight away. Technology needs to be lean, mean and efficient, with a strategic push to present data that means something to people, in digestible quantities. As much of this as possible should be automated, with data communicated at regular intervals. There needs to be a big emphasis on usability.

Affects: Potential students, existing students, academic staff, support staff, graduates/alumni
Impacts: Whole lifecycle
Complexity: 4/8
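The “different audiences” point above can be made concrete with a small, entirely hypothetical sketch: the same attendance data feeding a detailed view for a tutor and a single headline figure for a head of department. The dataset, names and thresholds below are invented for illustration only.

```python
# Hypothetical sketch: one dataset, two audiences.
# Invented example data: student ID -> sessions attended out of 10.
attendance = {"s001": 9, "s002": 4, "s003": 7}

def tutor_view(data, threshold=5):
    """Detail for a tutor: flag individual students whose
    attendance is low enough to warrant a conversation."""
    return sorted(s for s, n in data.items() if n < threshold)

def hod_view(data, total_sessions=10):
    """Headline for a head of department: average attendance
    across the cohort, as a single percentage."""
    avg = sum(data.values()) / (len(data) * total_sessions)
    return round(avg * 100, 1)

print(tutor_view(attendance))  # ['s002']
print(hod_view(attendance))    # 66.7
```

Both views are generated automatically from the same source, which is the kind of “present data fit for purpose, for different audiences” behaviour participants asked for.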

4 votes–staff experience

Recognition of learning and teaching is vital but currently low – without it, staff will not put in as much effort. A culture has also developed whereby the student is at the heart of everything; yet staff experience directly impacts the student experience, and this is often not thought through.

You can do as much as possible with learning and teaching (L&T) systems and processes to free up time for academics, but will they spend that time on L&T, or on research, which does not benefit the student experience but often results in greater recognition and reward within institutions? Staff need to understand the usefulness and benefits of using systems; otherwise, systems will not benefit students. There are variations in who uses and populates systems with data – sometimes academic staff, sometimes administrators. This can vary across institutions, and across departments within institutions. Standardising the use of systems is an immense challenge and could remove some of the flexibility that is genuinely required.

Affects: Potential students, existing students, academic staff, support staff, graduates/alumni
Impacts: Whole lifecycle
Complexity: 7/8

Figure 3–matrix of issues discussed at the Birmingham workshop


Idea Generation

Common themes arising:

  • Analytics needs to be planned; we need to work out the common issues and problems. Analytics also needs to be complemented by qualitative data. There need to be different ways and styles of viewing data, so that it is understandable to all and allows everyone to engage with it.
  • Feedback needs to be timely. Agreed standards would help define what is acceptable. Sector-specific practices should be taken into account to harmonise feedback. Feedback should be acted upon, and the student view of it monitored.
  • Pastoral care relies on systems, which links into feedback above.
  • Need for a consistent approach in developing systems.
  • Integration of systems moving away from silo approach – needs strategy, definitions, processes, standardised processes, prototyping and piloting.
  • Staff experience is key – staff need to be recognised for their efforts with students, staff are key to enhancing student experience.

Final feedback–what stood out? What should Jisc take forward?

Voice of the sector. Jisc could act as a voice of the sector to suppliers to establish common requirements and a common integration process.

Systems Integration. Some systems work well together, others don’t. There is a need to integrate systems to give a complete view of the lifecycle. We therefore need to work out how systems can work well together, addressing issues of multiple systems and each part of the lifecycle and overlapping systems.

Analytics. This is still an issue, even though it has already been looked at by Jisc. Outputs from other programmes should inform the “From prospect to alumnus” theme.

Human elements. Not all of the problems are systems related, people and cultural issues need to be addressed. Jisc could provide a framework for good practice and a self-assessment tool to work out where improvements can be made. There is a need to get the perspective of staff and students before deciding how to integrate systems–staff need to feel that technology is an enabler rather than a barrier, and that people have the appropriate skills.

Consultation. Jisc should find out more about what students think; as with staff, we need to talk to them and manage expectations.

Don’t overlook the start. Jisc should make sure the applicant side of the student experience is given attention.

Sharing best practice. Systems integration is not a new thing – could Jisc learn by speaking to sectors outside education? A great deal of commonality has been identified in the problems faced, not just in digital aspects but across the whole student experience.

This post was co-authored by Marc Dobson and Andrew Stewart.
