Saturday 31 March 2018

News round-up: A new higher education regulator is launched and pension strikes continue

The pensions dispute rumbles on, the OfS admits to making a mistake over the Toby Young appointment, and the OU vice-chancellor faces a vote of no confidence

Universities give out unconditional offers like candy, say head teachers

The Times, 31/03/2018, Rosemary Bennett

Universities have been told to stop dishing out unconditional offers “like candy” by teachers who say they are damaging the education of thousands of teenagers.

Sixth-formers need to have targets to help them to stay focused in the final months of their A levels, the teachers say. Those who do not need specific A-level grades to get into university tend to struggle to revise over Easter and fail to consolidate their subject knowledge. They then underperform in their exams and arrive ill-prepared and poorly motivated for their degrees.

Elite universities are selling themselves – and look who’s buying

The Guardian, 30/03/2018, Grif Peterson and Yarden Katz

Last weekend, while media attention was focused on the March for Our Lives protests across America, a militarised police force blocked the road leading up to the Massachusetts Institute of Technology (MIT) Media Lab, one of the university’s most famous laboratories, for a special guest. The guest – the crown prince of Saudi Arabia, Mohammed bin Salman – visited both Harvard and MIT on his first official tour of the US. Saudi officials boasted about the visit, posting photos of Bin Salman with both Harvard provost Alan Garber and MIT president Rafael Reif on social media.

Open University chief to face vote of no confidence

The Guardian, 29/03/2018, Diane Taylor

Staff at the Open University are tabling a vote of no confidence in its vice-chancellor, Peter Horrocks.

Horrocks has come under fire over plans to axe staff and cut courses, which were first revealed in the Guardian last week.

The plans include reducing the number of courses, qualifications and modules by more than one-third as well as axing many lecturers – the workforce budget will be cut by £15m-20m. The university is launching a voluntary redundancy programme on 9 April.

Cambridge University says exams can be shortened due to disruption caused by lecturers’ strike

Daily Telegraph, 29/03/2018, Camilla Turner

Cambridge University has announced that exams and finals can be shortened because of disruption caused by the lecturers’ strike.

The university has sanctioned the removal of questions on material that has not been taught because of cancelled classes, and a reduction in the number of questions set.

Academics feel besieged and aggrieved

Times Higher Education, John Gill, 29/03/2018

The Office for Students comes into force on 1 April. This is also Easter Day and April Fool’s Day. You decide which is the most relevant.

To take the first of these, it would be a mistake to think that this is a resurrection of the Higher Education Funding Council for England – the insistence that the OfS is a different beast is not just public relations guff.

UK university staff to vote on latest pensions offer

Richard Adams, The Guardian, 28/03/2018

The bitter dispute between universities and their staff over pensions may soon be resolved, after staff were asked to vote on the latest offer to renegotiate the changes being proposed by employers.

University watchdog’s regret over Toby Young appointment

Katherine Sellgren, BBC News Online, 27/03/2018

The chairman of the university watchdog, the Office for Students, says he has “learnt a lot” from the controversial appointment of Toby Young to its board.

Sir Michael Barber told MPs it was a mistake not to look more closely at the journalist’s history on social media.

Free speech at UK universities is being shut down, MPs warn

ITV News Online, 27/03/2018

Free speech at UK universities is being put at risk, MPs and peers have warned.

A new report argues that free speech on campus is being hampered by factors such as intolerant attitudes and unacceptable behaviour, red tape and a lack of clear guidance.

It warns that whole universities cannot be “safe spaces” and they must be places where unpopular and controversial ideas can be heard and debated.

Universities asked to open more post-16 maths free schools

Schools Week, 26/03/2018, Pippa Allen-Kinross

The government is looking to universities to open specialist post-16 free schools, in a bid to encourage more pupils to study maths at A-level.

The government will also provide £350,000 of dedicated funding every year to each existing and future maths school to support outreach work with local schools and colleges, schools minister Nick Gibb will announce today.

‘Genie out of the bottle’ on casualisation after pension strikes

Times Higher Education, 21/03/2018, Sophie Inge

The increasingly bitter dispute over university pensions marks a seminal moment for UK higher education that will lead to further demands to tackle casualisation and marketisation in the sector, academics say.

After 14 days of escalating strike action at 65 universities over plans to cut the element of the Universities Superannuation Scheme that guarantees members a certain level of income in retirement, debate is already widening from pensions to the state of British universities as a whole.

Open University plans major cuts to number of staff and courses
The Guardian, 21/03/2018, Diane Taylor

Open University chiefs are planning significant reductions in the number of courses the institution offers and the number of lecturers it employs, the Guardian has learned.

Last June the OU, established in 1969 and the largest university in the UK, announced it needed to cut £100m from its £420m annual budget, but specific detail of where the cuts would fall was not made public.

Universities review scheme black hole sums

FT Adviser, 20/03/2018

Universities UK (UUK) and the University and College Union (UCU) have agreed to set up an independent panel to review the valuation that put the Universities Superannuation Scheme (USS) deficit at £6.1bn.

According to Alistair Jarvis, UUK chief executive, “concerns have been raised over the way the scheme has been valued, which has led some to question whether there is, in fact, a very large deficit”.

University regulator is ‘Office for State Control’, say critics

The Guardian, 20/03/2018, Anna Fazackerley

University leaders this week described the government’s new regulator, the Office for Students, as the ‘Office for State Control’, warning it would prove disastrous for higher education and was ‘dangerous for democracy’.

The OfS is already mired in controversy thanks to the short-lived appointment of Toby Young to its board, which sparked a storm of public protest. But it has emerged that Universities UK, the umbrella group for vice-chancellors, didn’t challenge Young’s suitability for the role because it feared annoying the government.

John Arnold, professor of medieval history at Cambridge University, says the OfS’s predecessor, the Higher Education Funding Council for England, was seen as a forum for discussion between government and the sector. ‘Now we are in an apparently antagonistic relationship with a market regulator. Even if you were to pretend universities were selling a commodity, I don’t see how anything people only ever buy once can possibly respond to market forces.’

 




Thursday 29 March 2018

What is teaching intensity – and how do you measure it?

The government believes that university courses can be rated according to the level of teaching intensity they provide. Professor GR Evans detects a lack of joined-up thinking

“Prospective students deserve to know which courses deliver great teaching and great outcomes – and which ones are lagging behind,” said the minister for higher education, launching a consultation on the new subject-level teaching excellence framework (TEF) on 12 March. He also announced a competition for designing apps, to enable students to find the answers quickly on their smartphones.

The subject-level TEF will be designed to rate the lifelong “value” to a future employee of choosing a particular course at a particular provider. The model is essentially one of efficient “delivery” by the provider. As planned it does not factor in assessment of the part the student plays in acquiring the learning.

Among the proposals in the government consultation document is the addition of a “supplementary” measure of “teaching intensity”, on the hypothesis that:

“…excellent teaching is likely to demand a sufficient level of teaching intensity in order to provide a high quality experience for students.”

The idea that a student is entitled to a number of hours of actual teaching or feedback from academic staff in return for the fee paid was first floated in a series of Higher Education Policy Institute (HEPI) publications. It had the attractiveness of simplicity and it encouraged students to complain that they were getting too few “contact hours” a week for their high tuition fees.

In 2011, in an effort to clarify matters, the Quality Assurance Agency (QAA) published helpful guidance entitled Explaining contact hours. However, expressions of student disquiet about getting “value for money” when fees are so high have grown still louder since. In December 2017 the Education Select Committee held its first evidence session in a Value for money in higher education enquiry. In January 2018 the Office for Students (OfS) commissioned “a major piece of research” on value for money. The theme, in the form of the call for “teaching intensity”, has now been taken up in connection with planning for a subject-level TEF.

What responsibilities do students have?

This raises one of the central questions arising from the design of the TEF in general. How far in higher education should the student actively meet the “teacher” half way in “learning”, rather than merely receive the instruction delivered? Student contracts commonly list what the provider and the student are respectively expected to do. For example, that the student “will take responsibility for [his or her] own learning and development, working in partnership with staff to become a self-reliant, independent learner” is an expectation in Bristol University’s student agreement. But this reciprocal requirement of student participation does not seem to be measured in the planned subject-level TEF.

It seems that independent study may not in the end be included, because as the consultation suggests:

“…it does not actually measure what teaching a student is receiving (and hence does not measure value for money) and is more dependent on the student than on the provider. Furthermore, it is difficult to reliably collect data on students’ independent study.”

Yet the detailed TEF Guidance for providers in its version released in January 2018 goes into some detail on the complexities of the ways student response to teaching may take place, including, for example, the problem of “asynchronous online teaching”, where a student may visit the online teaching at any time and perhaps many times and the teacher may not be present at all. Is this a contact hour (or hours)? How is it to be measured in terms of “value for money”? How is its “intensity” to be quantified?

The lack of joined-up thinking with work in progress part 1: learning gain

The first and most basic test of teaching is likely to be whether students complete their courses and gain the relevant qualification. Non-completion has become a troubling feature of higher education statistics, notably in a series of Public Accounts Committee and National Audit Office reports on alternative providers. Nothing appears to be proposed in the TEF plans to test whether there is a correlation between the quality of teaching and whether or not students graduate.

Learning gain in English higher education, a recent progress report from Hefce, describes what has been achieved since the project was launched in 2015. It is recognised that this needs to be tied into the TEF, so Hefce suggests that a Learning Gain Toolkit could provide “tested methodologies for institutions to undertake their own learning gain measurements and to demonstrate the outcomes through assessments such as the TEF.” However, the project includes plans for “the use of data not just for improvement, but also for student information and performance incentives, which are the focus of TEF”.

There is now a body of “analysis of the learning gain activity identified within TEF year 2 submissions”. Hefce notes that so far this yields “little evidence that could be used across a range of providers to demonstrate learning gain”.

“Learning gain” is not defined solely in terms of knowledge and skills acquired by the student through the teaching received. It has a broader range:

“…learning gain has been understood for the purpose of HEFCE’s work to relate to the changes in knowledge, skills, work-readiness and personal development during a student’s time in HE.”

“Content knowledge” is what students traditionally acquire “through their classes and other study at university”. But “skills and competencies can be either discipline-specific or non-discipline-specific”. “Work-readiness relates to concepts of employability” rather than to the student’s capacity to leave a university ready to work for his or her living.

As to measuring the results, Hefce has been trying out a “National Mixed Methodology Learning Gain project” (NMMLG), in which students at 10 selected institutions are completing a series of repeated online assessments throughout their course.

Here there has been experimental testing to see whether students show they can do better after more time and teaching.

The “learning gain” project has taken seriously the role of students’ engagement with their learning, and has also explored “the influence of students’ backgrounds on their learning outcomes”:

“This activity is particularly important in the context of the extension of access agreements to include successful participation in the duties of the Office for Students and the context-specific student outcome measures used in the TEF.”

The arrival of learning analytics in HE has not been uncontroversial. Tracking students’ activity for their own good clearly raises ethical questions. Hefce suggests that:

“…this is enabling institutions to draw on increasingly sophisticated student data (such as, for example, real-time data on engagement and granular information about learning outcomes), to inform learning and teaching … [To] maintain an up-to-date understanding of potential connections between learning analytics and learning gain, we are liaising with Jisc, which is particularly active in this area of work.”

The lack of joined-up thinking with work in progress part 2: teacher “qualification” in higher education

There is as yet no suggestion that teaching should be observed or particular methods stipulated in the subject-level TEF. The Technical document (Consultation) recognises that it is “the right of providers to decide how teaching should be carried out” and promises that government “is not beginning with a view on whether certain types of teaching methods are better than others”.

On the other hand, considerable effort has been put into encouraging teachers in higher education to gain an appropriate qualification. The Institute for Learning and Teaching in Higher Education, set up in the wake of the Dearing Report, had a short career. It was replaced by the Higher Education Academy (HEA), which is now to be amalgamated with two other sector bodies into a new entity to be called AdvanceHE.

Its chief executive, Alison Johns, who comes to the post from a career at HEFCE and most recently the role of CEO of the Leadership Foundation for Higher Education, told Times Higher Education that she envisages its having a role complementary to that of the OfS:

“Having a friendly, supportive organisation as a counterweight to the tougher OfS oversight will be crucial for the health of UK higher education, Ms Johns explained, as it ‘provides confidence to students, government, banks and all sorts of stakeholders that institutions can run themselves effectively’”.

She acknowledged that “the fate of the HEA’s work” is “likely to generate the most discussion”. It has just reached 100,000 teaching fellows:

“That is one hell of a community of professional practitioners and it demonstrates that individuals and universities care about teaching,” she said.

Nevertheless there is clearly uncertainty about the future of the HEA and therefore of the long-term value of an HEA “teaching fellowship”.

Don’t encourage students to rely on this dubious measure

So neither a basis for measuring the “intensity” of teaching nor any way of checking what the student does, by way of “learning”, with the teaching provided on the course “bought” seems to be being factored into the planning for the subject-level TEF. When it is recognised that, with a casualised academic workforce, one year’s rating may give no reliable indicator of what might be expected from the same course in successive years, subject-based TEF ratings seem likely to remain a measure of dubious value to the students who are being encouraged to rely on them in choosing a course.




Tuesday 27 March 2018

Who needs external examiners? We do

UCU has asked external examiners to resign in support of the strike over pensions. But what exactly does an external examiner do? Professor GR Evans reflects on the value of an undersung role

On 15 March the University and College Union (UCU) asked external examiners to “resign” in support of the strike over proposed USS pension changes:

“UCU said external examiners resigning would cause universities a number of specific problems around the setting and marking of exams. External examiners agree the setting of questions, moderate exam results and ensure that institutions’ assessment procedures are rigorous.”

It was reported on 23 March that “more than 600 external examiners” had recorded their resignations in a public document circulating on social media.

Are external examiners necessary?

External examiners provide a means of ensuring that there is reasonable comparability of standards among degree-awarding bodies.

The Quality Assurance Agency (QAA) published guidance on the role of the external examiner in 2004. A version updated in 2011 is now online. This explains the way the external examiner provides a check without interfering with the academic autonomy of the degree-awarding body:

“External examining provides one of the principal means for maintaining UK academic standards within autonomous higher education providers.”

It is also explained that external examiners do more than moderate examinations. They help the provider improve its examining practice:

“Based on their qualifications and experience, they are able to provide carefully considered advice on the academic standards of the awards, programmes and/or modules to which they have been assigned, and can offer advice on good practice and opportunities to enhance the quality of those programmes/modules.”

They help to ensure comparability of standards with other providers:

“They are also able to offer an informed view of how standards compare with the same or similar awards at other higher education providers (primarily in the UK, and sometimes overseas as well) of which they have experience.”

The Higher Education Academy published its own handbook for external examiners in 2012, in collaboration with the QAA.

Important work, then, but perhaps not a role for which there are many keen candidates. It is not surprising that attracting enough appropriately qualified and experienced individuals can prove difficult. External examining is needed at a time in the academic year when an examiner is likely to be under marking pressure in his or her own institution. It is burdensome. It can involve some tension where an external examiner finds it necessary to press for recalibrations or revisions of marks awarded by the internal examiners. It is not likely to boost the examiner’s own career.

The Innovation, Universities and Skills Select Committee called in 2009 for the creation of a register of external examiners. The evidence it had heard suggested to it that:

“… the problems of the external examiner system at present can be summarised as:

  • the remit and autonomy of external examiners is often unclear and may sometimes differ substantially across institutions in terms of operational practices;
  • the reports produced by external examiners are often insufficiently rigorous and critical;
  • the external examiner’s report’s recommendations are often not acted upon—partly because their remit is unclear; and
  • the appointment of external examiners is generally not transparent.”

Why no national pool?

The Dearing Report had recommended in 1997 that the sector “create, within three years, a UK-wide pool of academic staff recognised by the QAA, from which institutions must select external examiners”. That recommendation had never been implemented and the Select Committee wanted to revive it:

“We strongly support the development of a national ‘remit’ for external examiners, clarifying, for example, what documents external examiners should be able to access, the extent to which they can amend marks – in our view, they should have wide discretion – and the matters on which they can comment. This should be underpinned with an enhanced system of training, which would allow examiners to develop the generic skills necessary for multi-disciplinary courses. We conclude that higher education institutions should only employ external examiners from the national pool.”

In the absence of a pool, individual universities continue to make their own arrangements. At Durham, for example, there are detailed rules for the selection and appointment of external examiners.

The proportion of first-class degrees awarded has been rising. Whether this is a consequence of the manipulation of the algorithms used by examiners to calculate a student’s final mark has become contentious. The potential role of external examiners in helping to moderate such activity is obvious but has been difficult to monitor in the continued absence of the proposed ‘pool’ arrangement.

A project on degree standards, running from 2016 to 2021, has been ‘managed’ by the Higher Education Funding Council for England (Hefce) on behalf of all the UK funding councils. It is not yet clear how the Office for Students will approach the questions that continue to surround the operation of the external examiner system.




Friday 23 March 2018

TEF: so what is a ‘subject’?

Opinion piece by Professor G. R. Evans

The subject-based Teaching Excellence and Student Outcomes Framework (TEF) promises to give prospective university students clear, easy-to-understand information so that they can see at a glance where ‘excellent teaching and student outcomes can be found’, at a level of detail which will enable them to choose not only a university but a course. A consultation has just opened on the ‘technical aspects’ of the implementation of this ‘subject level’ TEF. The consultation runs alongside a Year Three Subject Pilot, which is being conducted by the soon-to-disappear Higher Education Funding Council for England, on behalf of the Department for Education. The Consultation’s closing date is 20 May.

What counts as a ‘subject’? 

If provision of ratings for courses at ‘subject’ level is to be robust, an accurate definition of a ‘subject’ is going to be important. The prospective student still at school has already faced ‘subject choices’ on the way to gaining the qualifications needed to get a university place. But those options may not fit tidily with the ‘informed choices’ TEF hopes to offer. The conventional dozen subjects identified from the nineteenth century as appropriate for degree-level study in England still dominate the list of A-level subjects likely to be taken seriously by the Russell Group universities. Students whose schools put them in for some of the wide range of BTEC qualifications at level 3 may then find they are in subjects not acceptable for entry to some universities or for some courses. On this problem UCAS offers some guidance and a warning. TEF planning so far does not appear to have considered its relevance.

The subjects of courses on offer by higher education providers have multiplied and diversified hugely in recent decades. Interdisciplinary work too is having its day with Government endorsement in both teaching and research. It is recognised in the Consultation document that ‘provision at many providers will cross the boundaries of any subject or discipline definitions we use’.

Comparing the incommensurable is bound to be unsatisfactory, and yet it seems clear that the definition of a ‘subject’ for TEF purposes is a long way from having any agreed parameters of subject classification that would enable like to be compared with like. The Pilot identified a ‘strong consensus’ that:

 it would be greatly preferable for the TEF to use an existing subject classification system rather than to create a new one.

However, there have been several attempts to group subjects leading to several rival lists, so a choice has to be made:

The CAH (Common Aggregation Hierarchy) has recently been developed by HESA as an aggregation system to sit alongside the Higher Education Classification of Subjects (HECoS), the new subject coding system. CAH and HECoS will together replace JACS, which is currently used by the Universities and Colleges Admissions Service (UCAS) for students applying to university.

Among the Consultation questions is one touching on the Quality Assurance Agency’s Subject benchmark statements, which offer a further approach to the identification and grouping of subjects in higher education. The QAA sees these as, among other things, ‘of interest to prospective students and employers, seeking information about the nature and standards of awards in a subject area’, in other words, as meeting a key need identified by the architects of subject-level TEF. The Statements are compiled by committees of specialist academics and regularly reviewed. They ‘set out expectations about standards of degrees in a range of subject areas’:

They describe what gives a discipline its coherence and identity, and define what can be expected of a graduate in terms of the abilities and skills needed to develop understanding or competence in the subject.

Nevertheless this system of subject identification was not favoured by the architects of the Pilot:

 The two classifications that received most support were the Units of Assessment used in the Research Excellence Framework (REF UoAs) and the CAH developed by the Higher Education Statistics Agency (HESA).

The consultation document indicates that the CAH2 (see Table 1) will be used.

Table 1: CAH2 classification to be used in TEF

Against using the UoAs on which the REF was based, it was pointed out that linking REF to TEF for these purposes would:

 send a negative message contrary to our commitment to increase the parity of esteem between research and teaching, because it would mean research considerations would be driving the TEF as well as the REF.

Besides, it was suggested that it might be unfair to students even to try:

one grouping includes both ‘veterinary’ and ‘food science’, two subjects which are completely different from a student perspective.

‘Mixed module’ subjects

This last point is of an importance which does not seem to have been considered at the current stage of modelling a subject-based TEF. It is instructive to look at the content of some Complementary Medicine courses, which have only recently been added to the catalogue of accepted UK degree subjects, for example ‘Herbal medicine’ and ‘Chiropractic’.

A BSc in Herbal Medicine is offered by the post-1992 University of Westminster, in its Faculty of Science and Technology. This course includes conventionally accepted scientific modules such as ‘botany’ and ‘physiology’ alongside others with less established acceptance as serious science, such as ‘Herbal medicine therapeutics’. A graduate might or might not find the embedded qualification in ‘botany’ or ‘physiology’ obtained on this course helpful in getting a job.

What is an ‘accreditation’ worth in the job market?

The accreditation of courses is another area the planning for subject-level TEF has not yet addressed. In some cases getting a job in the field of a degree depends on licensing by a professional body.

An Integrated Undergraduate Masters in Chiropractic (MChiro) course is offered by the alternative provider BPP University.  Its ‘course details’ available to students looking for information online are sketchy from the point of view of subject-matter detail, but it notes accreditation of the course by the General Chiropractic Council (GCC):

‘This means that you are qualified to apply to register with the GCC, the statutory body regulating Chiropractic in the UK.’ 

This accrediting body is moving to self-assessment by its recognised providers, including arrangements for programme submission.  A graduate would be able to proceed to practise as a Chiropractor but again it is less clear how acceptable this degree might be for other purposes.

Apart from the Professional, Statutory and Regulatory Bodies on which the QAA keeps an eye, self-defined ‘accreditors’ are legion. For example, Counselling and Psychotherapy have numerous ‘accrediting’ bodies.   These vary considerably in their membership requirements and some have categories of membership, such as ‘associate member’ and ‘accredited member’.

So in many subjects on offer for degrees it is less clear what accreditation may be available for ‘professional’ purposes and what weight it may carry in the search for employment or practice.

What is the relationship between the study of a subject and a well-paid job?

There is a considerable area of uncertainty about the relationship between the study of a subject and a well-paid job going even wider than these considerations. Higher education graduate employment and earnings, released on 15 March, is an update to the Longitudinal Education Outcomes (LEO) data. But the LEO data ‘shows a plethora of different factors influences earning potential, particularly student background and environmental attributes, which are all outside the control of HE institutions’. It does not and cannot demonstrate a clear link between degree outcomes alone and graduate earnings.

The problem of assessing subject provision has defeated previous attempts

The last time an attempt was made to rate teaching in universities at subject level, through the Teaching Quality Assessment (TQA), it had to be ended in 2001 in the face of vigorous protest both from academics themselves and from universities, whose managements complained of the excessive burden it imposed.

However, it also came to grief as a result of ‘gaming’ by universities, which made its results no longer plausible as a measure of achievement:

The teaching quality assessment exercise has been rendered all but meaningless by grade inflation, gamesmanship and the rise of inspectors’ “cartels”, experts have claimed.

In 2005, the remaining selective ‘drill down’ QAA ‘discipline audit trails’ came to an end. Providers will certainly ‘game’ whatever the TEF introduces, and because ‘teaching intensity’ information ‘is not routinely collected by providers’, introducing that ‘supplementary’ requirement ‘would require new data to be collected’:

’The Government considers it important that data collection in this area should not itself drive teaching practices’. 

It seems inevitable that it will do so.

If rating ‘subject-teaching’ proves not to be reliable, it could become a hostage to fortune. A graduate from Anglia Ruskin University, who claims she did not get the career advancement the prospectus had seemed to promise and who had unsuccessfully taken her complaint through various channels, recently decided to sue. Reliance on the planned subject-based ratings could tempt many more disappointed students to do the same if they claim they had been misled. At the very least providers could find themselves facing a rising tide of complaints. In their turn, some providers may seek redress against the designers of the new TEF for ratings they claim to have been distorted by failure to identify the ‘subjects’ accurately.

 




Sunday 18 March 2018

Subject level TEF: are the metrics statistically adequate?

Opinion piece by D. V. M. Bishop

The Teaching Excellence and Student Outcomes Framework (TEF) was introduced at an institutional level in 2016. It uses metrics from the National Student Survey, together with information on drop-out rates and on student outcomes. These are benchmarked against demographic information and combined to give an ‘initial hypothesis’ about the institution’s classification as Bronze, Silver or Gold. This is then supplemented by additional evidence provided by institutions, which is considered by a panel that can modify the initial hypothesis to give a final ranking.

Potential students are probably more interested in the characteristics of specific courses than global rankings of institutions, and, on advice from the Competition and Markets Authority, it was decided that there should be a subject level TEF. This was piloted in autumn 2017, with another pilot to take place this summer, after the technical consultation that was announced this month. In future, institutions will have both a global ranking and subject level rankings.

What is measured?

In general, the plans are for subject-level TEF to use the same metrics as for the institutional TEF, although the weight given to NSS scores will be halved. The metrics are as follows:

National Student Survey (NSS)

The NSS runs in the spring of each academic year and is targeted at all final year undergraduates, who rate a series of statements on a 5-point scale ranging from Strongly Agree to Strongly Disagree. The items used for TEF are shown in Table 1.

Course completion

This is perhaps the most straightforward and uncontentious metric – at least for full-time students –  and is assessed as the proportion of entrants who complete their studies, either at their original institution, or elsewhere.

Student outcomes

TEF will use data from the Destination of Leavers from Higher Education Survey, supplemented by the Longitudinal Education Outcomes (LEO) survey (explained in this beginners’ guide). Data from LEO will include the proportion of students who have been in sustained employment or further study 3 years after leaving, and, most controversially, the proportion who fall above a median earnings threshold.

Teaching intensity?

It has been proposed that a measure of teaching intensity might be added, but this will be influenced by the consultation.

Statistical critique of TEF methods

Criticisms of TEF have been growing in frequency and intensity. Most of the commentary has focused on the questionable validity of the metrics used: quite simply the data entered into TEF do not measure teaching excellence. I agree with these concerns, but here I want to focus on a further problem: the statistical inadequacies of the approach. Quite simply, even if we had metrics that accurately reflected teaching excellence, it is unclear that we could get meaningful rankings from them, because they are just not reliable or sensitive enough. I made this point previously, noting that institutional-level TEF ratings are statistically problematic: I’m not aware that those issues have ever been adequately addressed, and now the problems are just made worse with the smaller samples available when assessments are made at subject level.

Scientists often want to measure things that are not directly observable. All such measurements will contain error, and the key issue is whether the error swamps the true effect of interest. If so, our measure will be useless. In the context of TEF, there is an implicit assumption that there is a genuine difference between institutions in terms of some underlying dimension of teaching excellence, which we may refer to as the ‘true’ score. The aim is to index this indirectly by proxy indicators: we take a weighted average of these indicators from all the students on a course as our measure of true score. There are two kinds of error we need to try to minimise: random error and systematic error.

Random error

Factors that are unrelated to teaching excellence can affect the proxy indicators: for instance, a student’s state of mind on the day of completing the NSS may affect whether they select category 3 or 4 on a scale; a student may have to drop out because of factors unrelated to teaching, such as poor health; subsequent earnings may fall because an employer goes out of business. Even more elementary is simple human error: the wrong number recorded in a spreadsheet. Measurement error is inevitable, and it does not necessarily invalidate a measure: the key question is whether it is so big as to make a measure unreliable.

There is a straightforward relationship between sample size and measurement error. The larger the sample size, the smaller the impact of an individual score on the average. To take an extreme example, suppose that all students on a course intend to give an NSS rating of 4, but one student inadvertently hits the wrong key and gives a rating of 1. If there are 100 students, this will have little impact (average score is 3.97), but if there are 10 students, there is a larger effect (average score is 3.7).
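
A minimal sketch in Python (hypothetical numbers only, reproducing the example above) of how a single stray rating moves a course average at different cohort sizes, and of how the random scatter of the observed average around a notional ‘true’ score shrinks as the cohort grows:

    import random

    def mean_with_one_error(n, intended=4, erroneous=1):
        """Average NSS rating when n-1 students give the intended score
        and one student mistakenly gives an erroneous score."""
        return ((n - 1) * intended + erroneous) / n

    print(mean_with_one_error(100))  # 3.97 -- barely moved
    print(mean_with_one_error(10))   # 3.7  -- a noticeably larger shift

    # The same point by simulation: with a fixed chance of a stray rating,
    # the spread of observed course averages around the 'true' score of 4
    # narrows as the cohort gets larger.
    def spread_of_averages(n, trials=10_000, error_rate=0.05):
        averages = []
        for _ in range(trials):
            ratings = [1 if random.random() < error_rate else 4 for _ in range(n)]
            averages.append(sum(ratings) / n)
        return max(averages) - min(averages)

    for n in (10, 50, 200):
        print(n, round(spread_of_averages(n), 2))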

Systematic error and benchmarking

In the context of TEF, systematic error refers to factors that differ between institutions, that are not related to ‘true’ teaching excellence, but which will bias the proxy indicators. This is where the notion of benchmarking comes in. The idea is that institutions vary in their intake of students, and it would not be reasonable to expect them all to have equivalent outcomes. An institution that is highly selective and only takes students who obtain three As at A-level is likely to have better outcome data than one that is less selective and takes a high proportion of students who have achieved less well at school. So the idea behind benchmarking is that one takes measures of relevant demographics related to outcomes, such as proportions of students from lower income backgrounds, with disabilities or from ethnic minorities, to see how these relate to proxy indicators. A ‘benchmark’ is computed for each institution and subject, which is the score they are expected to get based solely on their demographics. The benchmark is then subtracted from the obtained score to give a measure of whether they are performing above or below expectation. In effect, benchmarking is supposed to correct for these initial inequalities; in TEF the plan is that it will be done subject by subject, to ensure that the final rating ‘will not judge the relative worth of different subjects’.

Small sample sizes are particularly problematic with benchmarked variables. Benchmarking involves complex statistical models that will only be valid if there is sufficient data on which to base predictions. It is possible to incorporate statistical adjustments to take into account variable sample sizes between institutions, but these typically create new problems, as the same absolute difference in obtained score vs benchmark will be interpreted differently depending on the sample size. It is already noted in a Lessons Learned document that benchmarking can create problems with extreme scores, and so further tweaks will be needed to avoid this. But this ignores the broader problem that the reliability of assessments will vary continuously with sample size: this is not something that suddenly kicks in with very small or large samples – it just becomes blindingly obvious in such cases.
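
A minimal sketch in Python of the basic benchmarking idea described above, using made-up figures and a deliberately crude one-variable linear fit (the real TEF benchmarking methodology is considerably more elaborate): predict an expected score from demographics, subtract it from the obtained score, and note that for a very small cohort the resulting difference is exactly the kind of quantity random error alone could produce.

    # Hypothetical data: for each provider/subject, the share of students from
    # a particular demographic group, the observed metric (e.g. an NSS-derived
    # score on a 1-5 scale) and the cohort size.
    providers = {
        "A": (0.10, 4.2, 250),
        "B": (0.35, 3.9, 180),
        "C": (0.60, 3.6, 40),
        "D": (0.80, 3.5, 12),
    }

    xs = [share for share, _, _ in providers.values()]
    ys = [score for _, score, _ in providers.values()]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n

    # Ordinary least-squares fit of score on demographic share.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x

    for name, (share, observed, cohort) in providers.items():
        benchmark = intercept + slope * share   # expected score given intake
        difference = observed - benchmark       # the benchmarked quantity
        print(f"{name}: benchmark {benchmark:.2f}, "
              f"observed minus benchmark {difference:+.2f}, cohort {cohort}")
    # Provider D's difference rests on only 12 students, so it tells us very
    # little: a couple of unusual responses could change its sign.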

Sensitivity

Although it is stated that TEF exists to provide students with information, a main goal is to provide rankings, so that different institutions can be compared and put into bandings. For this to be meaningful, one needs measures that give a reasonable spread of scores. This is something I have commented on before, noting that ratings from the National Student Survey tend to be closely bunched together towards the top of the scale, with insufficient variation to make meaningful distinctions between institutions. A similar point was made by the Office for National Statistics. The tight distribution of scores coupled with the variable, and often small, sample sizes for specific courses is seriously problematic for any attempt at rank ordering, because a great deal of the variation in scores will just be due to random error.

It would be possible to model the sample size at which a rating becomes meaningless, but it is not clear from the documents that I have read that anyone has done this. All I could find was a comment that scores would not be used in cases where there were fewer than 10 students (TEF consultation document, p. 24), in which case a more qualitative judgement would be used. We really need a practical test, with students from each institution and course subdivided into two groups at random, and TEF metrics computed separately for each subgroup. This would allow us to establish how large a sample size is needed to get replicable outcomes.
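
A minimal sketch in Python of the kind of practical test proposed above, using simulated ratings rather than real NSS data: split each cohort in half at random, compute the metric on each half, and see how often the two halves even agree about which of two courses with slightly different ‘true’ quality comes out ahead.

    import random

    def simulate_ratings(n, true_mean, sd=0.8):
        """Simulated NSS-style ratings on a 1-5 scale around a 'true' mean."""
        return [min(5, max(1, round(random.gauss(true_mean, sd)))) for _ in range(n)]

    def split_half_means(ratings):
        shuffled = ratings[:]
        random.shuffle(shuffled)
        half = len(shuffled) // 2
        first, second = shuffled[:half], shuffled[half:]
        return sum(first) / len(first), sum(second) / len(second)

    def disagreement_rate(n, trials=2000):
        """How often do two random halves disagree about which course scores higher?"""
        disagreements = 0
        for _ in range(trials):
            course_x = simulate_ratings(n, true_mean=4.2)
            course_y = simulate_ratings(n, true_mean=4.0)
            x1, x2 = split_half_means(course_x)
            y1, y2 = split_half_means(course_y)
            if (x1 > y1) != (x2 > y2):
                disagreements += 1
        return disagreements / trials

    for n in (10, 40, 160):
        print(f"cohort of {n}: halves disagree about the ordering "
              f"{disagreement_rate(n):.0%} of the time")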

Extreme data reduction: Gold, Silver and Bronze

Some of the information gathered for TEF could be potentially interesting for students, provided it was presented with explanatory context about error of measurement. In practice, a pretty complex set of data for each institution and course, based on numerous data-points integrated with the subjective evaluations of qualitative evidence submitted with the metrics, will be reduced to a three-point scale – with the institution categorised as Bronze, Silver or Gold. This is perhaps the most pernicious aspect of the TEF: it is justified by the argument that prospective students need to have information about their courses, yet they are not given that information. Instead, a rich, multivariate dataset is adjusted and benchmarked in numerous ways that make it hard for a typical person to understand, and then is boiled down to a single scale that cannot begin to capture the diversity of our Higher Education system.

Is the distinction between Gold, Silver and Bronze robust?

As noted above, the metrics from proxy indicators are not blindly applied: they are integrated by panel members with additional evidence provided by the institution, to form a global judgement. This, however, provides ample opportunity for bias. A ‘Lessons learned’ evaluation of TEF concluded: ‘Overwhelmingly, assessors and panellists thought that the process of assessment was robust and effective. They were confident that they were able to make clear, defensible judgements in line with the guidance based on the evidence provided to them.’ Now, to a psychologist, I’m afraid this is laughable.  The one thing we do know is that people are a mass of biases. You get people to make a load of ratings and then ask them whether their ratings were ‘robust’. Of course they are going to say they were.

To demonstrate genuinely robust ratings, one would want to have two independent panels working in parallel from the same data, to demonstrate that there was close agreement between them in the final ratings. Ideally, the ratings should also be done without knowledge of which institution was being rated, to avoid halo effects. Given how high the stakes are in terms of the rewards and penalties of being rated ‘Bronze’, ‘Silver’ or ‘Gold’, universities should insist that OfS conducts studies to demonstrate that TEF meets basic tests of measurement adequacy before agreeing to take part.




Friday 16 March 2018

News round-up: demand for higher education is set to grow

Demand for university places is going to grow by 300,000, but the rise in tuition fees means fewer people are choosing to study part-time

More than 40,000 fewer part-time students go to university due to tuition fees hike, study suggests

The Independent online, 15/03/2018, Eleanor Busby

More than 40,000 fewer part-time students are going to university because of the hike in tuition fees in England, a new study suggests. The Government’s introduction of higher tuition fees exacerbated the decline of part-time students in England, preventing many ‘second chance routes’ to social mobility, the report from the Sutton Trust charity states.

There were more than 40,000 fewer part-time students in 2015 than five years before – when tuition fees had not yet risen to £9,250 a year for full-time undergraduates, it reveals.

The number of part-time students in England declined by 51 per cent between 2010 and 2015 – and researchers say part of the fall was caused by higher tuition fees in 2012.

University place demand to grow by 300,000 by 2030

BBC News online, 15/03/2018, Hannah Richardson

About 300,000 new places will be needed at universities over the next 12 years, experts predict, making the higher education funding model unsustainable. A rise in the number of 18-year-olds by 2030 will push demand up by 50,000, the Higher Education Policy Institute says.

A further 350,000 places will be needed to keep pace with the existing growing participation rate, it adds, but other factors may reduce that by 50,000.

The Hepi report examines the impact of policy changes on university entrant rates, feeding a number of scenarios into the calculations to arrive at the 300,000 figure. The 18-year-old population has been declining steadily for a number of years, but from 2020 it will increase again, rising by nearly 23% by 2030, says Hepi. And if participation continues to increase at the current rate, about 350,000 extra places will be needed on top. Countervailing factors, such as Brexit, are likely to reduce that total by about 50,000, the research says.

Bad universities should be allowed to go bust

The Times, 15/03/2018, Emma Duncan

Comment piece about the university lecturers’ strike and how it relates to university funding and the value of higher education to the economy and society: ‘If we know that students benefit from university education, and we’re not sure whether or not society does, then we’re right to treat it largely as a private good. It follows then that most of the costs should be paid by the students, and the sector should be governed more by private-sector discipline than by public-sector fiat. So rather than having a vast ill-governed pension fund for the sector as a whole, universities should be in charge of their own pensions, and negotiate with their academics. Universities that overpay their bosses, mismanage their affairs and fail to attract enough students should be allowed to go bust.’

Tuition fee value for money: ‘I feel ripped off’

BBC News online, 14/03/2018, Katherine Sellgren

A new survey by the Office for Students has found only 38% of students in England think the tuition fees for their course are good value for money.

Course subject is a major factor which influences students’ perception of tuition fees, with computer science students, those doing physical sciences and law students the most likely to say that the tuition fees represented good value for money. Those doing historical and philosophical studies, languages and creative arts and design are least satisfied with the value they have received.

The OfS spoke to 5,685 current higher education students in England and 534 recent graduates. When asked whether their overall investment in higher education was good value for money, the majority (54%) agreed, a quarter said they were undecided, while 21% disagreed.

In terms of nationality, UK students are the least likely to consider their investment as good value for money (49%), compared to 61% of the students from other EU countries and 66% of those from non-EU countries.

‘Wasted potential’ of mature students

BBC News online, 14/03/2018, Sean Coughlan

A university group says that the government’s review of tuition fees in England should make a priority of finding ways to attract more mature and part-time students.

The Million Plus group says there is a ‘huge pool of untapped potential’ among adults who missed out on university.

After fees increased in 2012, mature student numbers fell by 20%. Les Ebdon, head of the university access watchdog, backed calls to reverse this ‘very worrying trend’.

Part-time students also saw a significant drop in numbers – and this often overlaps with older students, who might be working and unable to study full-time.

Student loans sale faces Audit Office probe

The Sunday Times, 11/03/2018, Sabah Meddings

The government spending watchdog is to investigate the sale of a £1.7bn student loans book that has reportedly led to an £800m loss for taxpayers. It was the first sale in a four-year sell-off of loans made to students before 2012.

The first clutch of debt, which had a face value of £3.7bn, was sold to specialist investors including pension funds and hedge funds via a securitisation process — where assets are packaged together and sold as bonds. Now the transaction is subject to a probe by the National Audit Office (NAO), which will consider whether the government ‘achieved value for money from this sale’.

MPs call for action on fraud and malpractice at alternative ‘university’ providers

The Independent, 06/03/2018, Eleanor Busby

The government has not done enough to prevent alternative providers of higher education from ‘playing the system’, MPs argue. A damning report from the Commons Public Accounts Committee says the current system offers a ‘chancer’s charter’, which saw around £10m paid out to individuals and providers not eligible for student loan funding from 2014 to 2016.

Serious allegations of fraudulent practices at alternative providers – which include agents helping ineligible students with fake applications so they can claim loans – show more could be done.

So far, the Student Loans Company has only been able to recover a quarter of the £45m of ineligible payments made in the six years to 2016, the report highlights.

UK universities rely on casual staff ‘for up to half of teaching’

Times Higher Education online, 06/03/2018, Sophie Inge

Some UK universities rely on casual, hourly paid staff to do as much as half of all their teaching, new analysis suggests. The University and College Union sent freedom of information requests to 135 higher education institutions, asking them to provide the number of hours of scheduled learning and teaching activities delivered during the academic year 2015-16. Universities were then asked to state the number of those hours that had been delivered by hourly paid lecturing staff.

Just 38 institutions returned usable information, but with caveats about its accuracy. Based on these data, the union estimates that most universities use hourly paid teachers for between 15% and 40% of their teaching, with an average of 27%.

Also:

300,000 more university places needed to keep up with demand for degrees, study says

The Independent, 15/03/2018, Eleanor Busby

Demand for 300K new university places ‘threatens uncapped system’

Times Higher Education online, 15/03/2018, Simon Baker

Baby boom caused by immigration fuels need for 300,000 more places at university by 2030

Mail Online UK, 15/03/2018, Eleanor Harding

Universities will need to find 300,000 more places to cope with soaring student population over next decade

The Sun online, 15/03/2018, Lynn Davidson

‘Debt for life’: only 38% of students say tuition fees are good value

The Guardian, 13/03/2018, Sally Weale

Only the truly ignorant would rank universities according to graduate earnings

The Guardian, 13/03/2018, Suzanne Moore

Comment on government plans to rate university courses by looking at graduate earnings.

 




Wednesday 14 March 2018

Resolving the pensions dispute: who has the power to agree?

It seemed that the pensions dispute had been resolved – and then union members voted to reject it. So who decides when an agreement has been reached, asks Professor GR Evans

The press was quick off the mark to announce a resolution to the pensions dispute when talks with ACAS involving Universities UK and the University and College Union (UCU) seemed to have reached a satisfactory conclusion on 12 March. UUK reported on the “agreement” and announced that:

“UUK is now consulting with Universities Superannuation Scheme (USS) employers about a revised mandate for a forthcoming Joint Negotiating Committee on Wednesday 14 March.”

However, social media was soon carrying warnings that UCU branches around the country were going to reject it. On 13 March UCU, having put it to its members, announced that they had rejected the agreement.

The question that ought to have been asked sooner – who or what body or bodies actually have powers to reach an agreement – now needs to be addressed as a matter of some urgency.

Without reference to the question of who has powers to bind the employers or the members, the USS Joint Negotiating Committee has been made up of equal numbers of “representatives” of the “employers” (through UUK) and scheme members (but through UCU), with a chair whose casting vote is likely to be determinative. The chair has regularly voted with the employers, thus outvoting UCU. It was on the basis of such decision-making that the Joint Negotiating Committee made its proposals in late January 2018.

Who may bind the employers?

It is far from clear that all vice-chancellors have powers to commit their universities to an agreement to which UUK is a party. It is not even clear that all vice-chancellors are individually authorised by their universities to agree a solution.

In a university the employer is the governing body, usually the university council; in Oxford it is the medieval academic democracy of congregation and in Cambridge the counterpart Regent House. It is on the governing body that the decision has to be taken in each case. The governing body is responsible for determining the level of risk the university is willing to accept when a higher level of employer contribution is proposed.

Since 2008 the chair of the USS Joint Negotiating Committee has been Sir Andrew Cubie. He was certainly aware that it is the governing body not the vice-chancellor which has supreme decision-making powers on behalf of a university. From 2006 he was chair of the Committee of University Chairs (CUC), the companion association to UUK.

He took the lead in the framing of Key Performance Indicators, a CUC project of 2006, whose steering group included Sir Andrew Burns (another sometime CUC chair), Sir Ivor Crewe (on behalf of UUK) and Ewart Wooldridge (CEO of the Leadership Foundation for Higher Education 2003-2012, which provided governance training for members of university councils).

The KPIs were designed to make it possible for members of university councils to make decisions on matters including pensions, without too much expenditure of reading time on too many committee papers:

(1.20) “Governors comment that there is so much paper and reporting and measurement in higher education that they find it hard to distinguish the areas where they need to engage with these processes. They are naturally cautious about challenging the advice from the senior managers, and they find that even business-related issues like pay, pensions, financial accounts and strategic plans seem to come with a special higher education context, and to generate voluminous documents. Risk assessment is a good example of this. External governors are used to making risk-based judgements, and would take naturally to this approach in the governance of their institutions, but governing bodies often seem to be presented with a voluminous and detailed operational risk register, rather than a presentation on the ‘five things that could put us out of business’. “

The “Key Performance Indicators” report pragmatically accepts that councils and their equivalents in practice leave “operational compliance” to “the senior management” (1.24). It therefore proposes that the members of governing bodies should be supplied with a brief pack of information (25-30 pages), noting that even “this is probably too much material”: “The one-page summary would show at a glance where any potential problems lie and individual governors could choose to refer to the particular back-up pages for the areas of interest” (1.48).

It would be helpful to see not only the council minutes of universities relating to the USS dispute (which may often be read online but may be restricted to the intranet) but also the papers considered by council members in approving their university’s position (if indeed they did so). Each vice-chancellor ought to be able to show a double delegation: of power to commit the university, and of power to declare that the university has accepted whatever UUK agrees.

Individual members must be consulted

The method of consulting with scheme members also needs to be clarified. The proposal which caused all the trouble – the headlines and the strikes – had come from the Joint Negotiating Committee at the end of January, with a note that acceptance of the proposal was “subject to a statutory consultation by employers with all affected employees (active members and employees eligible to join)”.

This must mean consultation with individual members: consultation with a trade union is not enough. Not all USS members belong to UCU. In some universities only a tiny proportion of them are UCU members, and in Cambridge UCU is not even a recognised trade union, so small is its membership.

Individual consultation with affected members has been scheduled and must in some form go ahead, but this is between the member and the pension scheme, not between employer and employee. Representative negotiation and agreement through a trade union cannot be a substitute.

So who can decide?

On 13 March the JNC meeting scheduled for 14 March began to seem rather a waste of time. What could it actually decide? If, as seems not improbable, the UUK members vote one way and the UCU members the other and the chair once again votes with UUK, there will be a decision – but surely not one binding on either the members or the employers? Too much is at stake for it to be acceptable for the USS trustee to allow this unsatisfactory JNC arrangement to continue.



Source: http://cdbu.org.uk/resolving-the-pensions-dispute-who-has-the-power-to-agree/

Monday 12 March 2018

Universities UK – who does it represent and what does it do?

Professor GR Evans looks at the history of an organisation that has rarely been out of the news in recent weeks – and asks what powers it really has

Universities UK has been in the news recently during the upset caused by the proposals to make major changes to the USS pension scheme. It has not had a good press.

What is it and what does it do? Essentially it is just a club. Its members are the ‘executive heads’ of ‘universities in England, Wales, Scotland and Northern Ireland’. The vast majority of these institutions are in receipt of direct public funding, but it also admits alternative providers, for example Regent’s University London.

It has a long history. It began over a century ago in ad hoc meetings of vice-chancellors who had discovered common cause in some current controversy. By 1918 it had begun to work as a body informally representing all of the then twenty-two universities and colleges. In 1930 it took the precaution of trying to ensure that each university had given its vice-chancellor a mandate to represent its interests in a process of ‘mutual consultation’.

This question of the extent to which vice-chancellors may now commit their universities has been sharply in focus during the current USS pensions dispute. The Universities Superannuation Scheme (USS) has a corporate trustee with a Board of Directors made up of four directors appointed by UUK (typically vice-chancellors) and three representing the University and College Union (UCU), plus five ‘independent’ members appointed by the Board itself. For the purposes of negotiation, if change to the scheme is called for by the trustee, there is a Joint Negotiating Committee (JNC). This is made up of an equal number of UUK and UCU members, with a chair appointed by the JNC who has the casting vote where opinion is divided.

Can UUK force its members to abide by its decisions?

In the present dispute this has dug a trench between the ‘employers’, who are likely to resist having to pay a larger employer contribution, and the ‘employees’, who are angry at the suggestion that their defined benefit pension rights should disappear. Vice-chancellors, through UUK, collectively took a view which would benefit the employers. Members of the scheme in their universities went on strike to protect the benefits of employees and objected fiercely to the idea that their vice-chancellors had a right to commit their universities to a course of action solely as ‘employers’. Several vice-chancellors broke ranks and said they did not endorse the position of UUK. There remains a question, in these circumstances, of the mechanism by which UUK can adopt a position binding its member vice-chancellors and of how, even if it can, that position can in its turn bind their universities.

On behalf of its members, the Committee of Vice-chancellors and Principals (CVCP) lobbied successive governments throughout the twentieth century, and it had a significant influence on the shaping of higher education policy. For example, it commissioned the Jarratt Report (1985), which helped to prompt the changes that have encouraged universities to regard themselves as businesses. In 1990 the CVCP set up the precursor of the Quality Assurance Agency, the Academic Audit Unit, which reported to the CVCP, not to Government.

The CVCP was hugely enlarged after 1992, when the former polytechnics became universities and their new vice-chancellors joined it. In its evidence to the Dearing Committee, which reported in 1997, it proposed as a long-term goal for the UK the creation of a new education and training framework encompassing all post-16 further and higher-level learning and qualifications.

The CVCP was ‘rebranded’ as Universities UK (UUK) in December 2000. It has remained a single UK body despite the devolution of higher education, which leaves the present radical changes in England (the replacement of HEFCE by the Office for Students and UK Research and Innovation) separate from what is happening elsewhere in the UK.

Universities UK currently describes itself as:

“the voice of universities, helping to maintain the world-leading strength of the UK university sector and supporting our members to achieve their aims and objectives.”

It sees itself as achieving this by being a lobbying organisation:

“We help to shape the higher education policy agenda, engaging directly with policy makers and other stakeholders. We maintain strong and proactive relationships with government, the private sector, the professions and sector agencies.”

It operates from extensive offices in Woburn Square, which provide conference space.

Who is entitled to speak for a university?

However, this leaves unresolved the question of how far a vice-chancellor is entitled to speak for a university without its authorisation. Publicly funded universities (except Oxford and Cambridge) now have company-style governing bodies, usually called ‘councils’, which in the case of the post-1992 universities are the sole ‘members’ of the university. Their chairs rank above the vice-chancellor, who serves as chief executive.

There is a Committee of University Chairs (CUC) which ‘represents’ the chairs, and the funding councils liaise with the CUC for certain purposes, rather than with Universities UK. The recent Higher Education Remuneration Code, produced in response to the outcry about the size of some vice-chancellors’ salaries, was a CUC project, as was the creation of The Higher Education Code of Good Governance, revised in 2014.

However the CUC, which meets only twice a year, lacks the resources to compete with the vice-chancellors’ organisation in any collective bid to ‘represent’ individual universities en bloc when a dispute arises.

 



Source: http://cdbu.org.uk/universities-uk-who-does-it-represent-and-what-does-it-do/

Tuesday 6 March 2018

Insecure academics – just how much do universities rely on casual teaching staff?

A Freedom of Information request from the University and College Union reveals a reluctance by universities to share information about their use of teaching staff on hourly contracts

Casual staff have always played an important role in university teaching. Many graduate students and postdocs supplement their income by delivering lectures and tutorials, gaining valuable experience and often doing a great job. But has casualisation gone too far? For university managers, cutting back on salaried staff is a tempting way of saving money, but one that is bound to damage the quality of teaching. Students need continuity: that requires experienced staff who have some investment in the institution they work for, who have been involved in setting up, delivering and marking courses, and who can build relationships with those they teach. As G.R. Evans noted in a blogpost last year, a university should be a “cohesive and self-critical academic community” – something that is just not possible if most staff are paid hourly.

Concern about short-term contracts is one of the three main issues that CDBU is campaigning on, so we were interested to see a recent report by the University and College Union (UCU) entitled ‘Precarious education: how much university teaching is being delivered by hourly-paid academics?’ It makes gripping reading: quite apart from the relatively high estimates of teaching delivered by hourly-paid staff, a striking point is how difficult it was to obtain information from many institutions.

UCU sent 138 Higher Education Institutions (HEIs) a Freedom of Information Request, as follows:

1.  Please disclose the number of hours of scheduled learning and teaching activities that were delivered at your institution during the academic year 2015/16. Scheduled teaching and learning activities should be understood to be as defined by HESA here.

2.  Please disclose the number of hours of scheduled learning and teaching activities that were delivered by hourly paid lecturing staff during the academic year 2015/16.

Only 38 HEIs provided the requested information in full – in three cases only after a request for internal review of their refusal. Thirty-six ignored the request, and 30 refused to provide information, citing the FOI exemptions available to public authorities where the information is not held or where compiling it would take more than 18 hours and cost more than £450. The remaining 34 provided partial information. The Russell Group universities were particularly reluctant to provide figures: only three of them did so.

Although UCU provides a table giving “indicative information” on the percentage of teaching delivered by hourly-paid staff in different HEIs, it emphasises that the numbers are hard to interpret. They certainly are! One may struggle to understand, for instance, how two institutions (the University of Durham and Imperial College) came to have over 100% of their teaching delivered by hourly-paid staff. The explanation seems to be that some classes were taught by a group of instructors, and that the estimates of total teaching hours were also incomplete.
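To see how a figure above 100% could arise from that kind of double-counting, here is a minimal illustrative sketch in Python. The session figures and the counting rule are hypothetical assumptions for illustration, not taken from the UCU report: if each hourly-paid co-instructor’s delivered hours are added to the numerator while the denominator counts each scheduled session only once, the ratio can exceed 100% even before any under-reporting of total scheduled hours.

# Hypothetical timetable: (scheduled_hours, number_of_hourly_paid_instructors)
sessions = [
    (2, 3),  # a 2-hour seminar taught by a team of 3 hourly-paid tutors
    (1, 1),  # a 1-hour tutorial with a single hourly-paid tutor
    (1, 0),  # a 1-hour lecture delivered by salaried staff
]

# Denominator: each scheduled session counted once
total_scheduled = sum(hours for hours, _ in sessions)           # 4 hours

# Numerator: every hourly-paid instructor's hours counted separately
hourly_paid_hours = sum(hours * n for hours, n in sessions)     # 7 hours

print(f"{100 * hourly_paid_hours / total_scheduled:.0f}%")      # prints 175%

On these made-up numbers the "percentage of teaching delivered by hourly-paid staff" comes out at 175%, which is the same kind of anomaly the UCU table shows for Durham and Imperial.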

Universities must be open about the number of casual staff they employ

The authors of the UCU report sensibly conclude that, while their estimates of teaching done by hourly-paid staff seem high (averaging 27%), the underlying data are a mess, so we cannot draw strong conclusions or compare institutions. But the more important point they raise is that this information should be readily available, and that institutions should not be hiding it away.

We support their main recommendation:

UCU recommends that government should instruct the Office for Students to make it a requirement on universities to collect and publish data on their total annual teaching and the number of hours of teaching delivered by staff on insecure contracts.

Whenever universities are measured on anything, they start competing to be the best. In our view, requiring them to report data on hourly teaching and short-term contracts, together with some data on staff satisfaction, would do far more than the deplorable Teaching Excellence Framework to help our universities retain their reputation for teaching quality.

 



Source: http://cdbu.org.uk/insecure-academics-just-how-much-do-universities-rely-on-casual-teaching-staff/

Alzheimer’s Researchers win 2018 Brain Prize

The research of these four European scientists, some of which has used GM animals, has revolutionised our understanding of the changes in the brain that lead to Alzheimer’s disease and related types of dementia.

Source: http://www.understandinganimalresearch.org.uk/news/communications-media/alzheimers-researchers-win-2018-brain-prize/

Friday 2 March 2018

News round-up: academics go on strike over pensions

The universities minister says that students should be compensated for missed teaching time as a result of the pensions strike, and a report says that there should have been greater scrutiny in the appointment of Toby Young to the board of the Office for Students

EU ‘should forge wider research community after Brexit’

Times Higher Education, 01/03/2018, Rachael Pells

A new report from the Wellcome Trust has suggested Brexit is an opportunity to forge a stronger European research area, including the UK and other nations outside the EU bloc.

Don’t silence empire ideas, says minister

The Times, 01/03/2018, Rosemary Bennett

Universities should not discard “unpopular or unfashionable” parts of the curriculum in their enthusiasm to decolonise degree courses, the minister for higher education has warned. Many universities are under pressure to make curriculums more ethnically diverse and less dominated by white, Eurocentric traditions. Sam Gyimah said it was important to keep unloved writers, ideas and opinions so students could understand and challenge them. He was also unhappy about the rise in unconditional offers to prospective students, saying academic selection was a cornerstone of university.

The debate over decolonisation was exactly what should happen in universities and was a part of a “free exchange of ideas”, he said. “Part of the university experience is facing up to the unpopular, engaging it, challenging it. That is how you widen your horizons,” he said.

Also:

Universities minister warns against ‘decolonising’ curricula to avoid ‘unfashionable’ subjects

Daily Telegraph, 01/03/2018, Camilla Turner

Universities Minister hits out at bid to ‘decolonise’ courses and make them ‘less white’ by phasing out ‘pale, male and stale’ subjects to appease campaigners

Mail Online UK, 28/02/2018, Eleanor Harding

Universities should pay back students for strike, says minister

BBC News online, 28/02/2018, Sean Coughlan

The universities minister says students whose courses have been disrupted by the university strike should receive compensation for lost classes. Sam Gyimah said this could mean a refund on tuition fees or rearranging cancelled lectures. At the launch of a new higher education regulator, Mr Gyimah told universities this was the “age of the student” and they deserved better value for money. The universities minister, speaking in Westminster at the launch of the Office for Students, said he was “very serious” about universities paying back students who had missed out on classes during the strike over pensions. He said that the wages saved by universities because of the strike could be used to support students, rather than kept by institutions.

King’s College London says it will have a fund to compensate students. The university says that it will ring-fence any savings from staff pay on strike days and use the money to “offset the impact of the strike on our students”.

Also:

University strikes: Students set to receive ‘direct compensation’ over lectures missed due to action, minister says

The Independent, 28/02/2018, Eleanor Busby

Students need compensation over strikes, says Minister

Financial Times, 01/03/2018, Robert Wright

Government Backs Student Demands For Refunds Over Lecturer Strike

Huffington Post UK, 28/02/2018, George Bowden

World university subject rankings: the UK is back on top

The Guardian, 28/02/2018

The latest QS university rankings by subject, released on 28 February, show that ten of the 48 subject tables are led by UK institutions. No UK institution that held a world-leading status in 2017 has been overtaken by an international competitor. The University of Oxford has retained its number one status for four subjects. The University of Cambridge has taken the top spot for anthropology from Harvard.

Avoidable mistakes made in appointment of Toby Young

BBC, 27/02/18

Toby Young’s reputation as a “controversialist” should have prompted greater scrutiny into his past before he was appointed to England’s new university regulator, a probe has said.

Universities claim success at start of strike

Financial Times, 22/02/18, Robert Wright

The trade union leading strikes at UK universities in protest at a shake-up to academics’ pensions on Thursday claimed success with the first day of industrial action.



Source: http://cdbu.org.uk/news-round-up-academics-go-on-strike-over-pensions/

Thursday 1 March 2018

The Office for Students has a new regulatory framework – but what does it all mean?

The newly-published regulatory framework runs to 167 pages – but still leaves many questions unanswered, writes Professor GR Evans

At last the Office for Students (OfS) has published its Regulatory Framework. It will be laid before Parliament, but essentially this is subordinate legislation of the Henry VIII kind: there will be no automatic debate or vote to approve its contents. The document is liberally spattered with mentions of the powers of the Secretary of State. When the OfS authorises a provider to grant degrees, the Orders of the OfS will themselves become Statutory Instruments (Framework para. 213 and Higher Education and Research Act s.42).

Suppose I am a student who wants to be assured that a prospective provider is properly “regulated”? One of the readerships listed as the document’s audience is “students, and bodies representing the interests of students, on higher education courses provided by English higher education providers”.

If an experienced reader of higher education policy documents is flummoxed by what follows, how is this key constituency going to be served?

Also intended to use this document are “providers of higher education in England and bodies representing the interests of such providers.”

The “others including, but not limited to, employers, charities and research bodies that are not themselves providers” will presumably come panting some way behind in trying to make sense of it.

Those in most urgent need of understanding it will be the providers. There was a promise on the temporary OfS website that guidance for providers wishing to be included on the new Register would also be published on 28 February, and there it is. Or is it? The link just takes the enquirer to the whole Regulatory Framework, but under a different heading: Regulatory Framework and Registration. This is broken down into documents, some of which seem to form part of the Regulatory Framework, some not. This is surely a hard-to-forgive breakdown in an elementary duty to be clear in setting the immense complexities of the new rules before those who will have to use them.

Among the material listed on the temporary website but not included in the Framework document is the Strategic Guidance for 2018-19. This is the counterpart of the Annual Grant Letters received by the Higher Education Funding Council for England (HEFCE), usually in January and normally from the Secretary of State rather than, as now, from the minister for higher education alone. It is interesting to compare it with the norms previously established, for example in last year’s letter to HEFCE.

Countering harassment…or supporting free speech?

The new letter comes almost exactly a year later, dated the day of the publication of the Framework, and it embodies a paradox. The adjective “regulatory” occurs countless times in the Framework document and in its very title, yet the Secretary of State speaks of “removing unnecessary regulatory burdens for providers”.

The Minister evidently has on his mind various matters in recent headlines:

“This includes working to counter harassment and hate crime in higher education, taking steps to make campuses places of tolerance for all students.”

But:

“I would also like the OfS to be a champion of freedom of speech, which is so crucial to higher education. Free speech is essential in ensuring that universities are places which expose students to new and uncomfortable ideas, and encourage robust, civil debate and challenge.”

Value for money and vice-chancellors’ salaries seem to be on his mind too:

“I would also ask the OfS to work with the sector to ensure good governance, effective and efficient use of resources, including around senior staff remuneration, as well as engaging closely with the sector on its own self-regulation in this area.”

Industrial strategy, STEM (especially mathematics), employability and degree apprenticeships are rather untidily bundled together:

“I would like the OfS, in particular, to consider how to encourage sector support for the pipeline of skilled graduates from all backgrounds that is needed by the economy, for example through sector support for maths schools. Key to this will be promoting and enhancing collaboration between the higher education sector and employers, both nationally and locally, and I would like the OfS to work with Government on reviewing how funding can be used to stimulate this, and also on the impact of apprenticeships in achieving this goal as well as supporting access and participation.”

Clarification is still needed

The House of Lords tried hard, in the debates on the new legislation, to get satisfactory answers to the question of how research students’ interests would be protected once teaching and research were moved into different departments of State. The minister’s paragraph on working with UK Research and Innovation (UKRI) seems to have missed that one:

“I would like the OfS to prioritise collaboration with UKRI on those areas of shared interest, including: skills; capability and progression; knowledge exchange; the ongoing financial sustainability of HE providers; accountability and assurance; infrastructure funding; building robust evidence and intelligence; and ensuring that the Research Excellence Framework (REF), Teaching Excellence and Student Outcomes Framework (TEF) and Knowledge Exchange Framework (KEF) are mutually reinforcing.”

The old Grant Letter allocated block funding for both teaching and research to which conditions of grant could be attached. Responsibility for public research funding has now passed to UKRI. Teaching funding now comes largely from tuition fees.

This is a Grant Letter, yet direct state funding for teaching has largely disappeared now that teaching is funded mainly through tuition fees. The remaining sum is set out in Annex B, and it is lower even than last year’s. There is also a Condition of Grant (Annex C) “regarding regulated fees”: if a provider overcharges, the amount “will be repaid by the institution to the OfS, or withheld from grant”. So this has to be a new kind of letter, with new sanctions.

Perhaps the architects of the temporary OfS website could assist those for whom the Regulatory Framework will become a document of frightening importance if they are to avoid deregistration. They could start by clarifying the status of all those documents offering guidance and their exact relationship to the Framework document to be laid before Parliament. If any MP is actually going to read it, they will need to be able to understand it.

 

 



Source: http://cdbu.org.uk/the-office-for-students-has-a-new-regulatory-framework-but-what-does-it-all-mean/
