Issue Brief | Published on Jul 21, 2017

The great Indian exam debacle

The “marks moderation” debate, which made headlines this summer, [1] has received frenzied but incomplete media coverage. There has also been litigation, stalling the declaration of high-school results for students of the Central Board of Secondary Education (CBSE), the Council for the Indian School Certificate Examinations (CISCE), and various state boards. This brief aims to demonstrate how reliance on a simplistic metric, a 101-point (0–100) scale for awarding absolute scores in each subject, has resulted in a near-complete breakdown of assessment standards in Indian board examinations, largely because of the existence of multiple competing boards. The boards are competing to stay relevant by awarding increasingly high scores, thus diluting academic standards and eroding public trust in the assessment system.

Introduction

The integrity of the country’s examination system is in the spotlight, and Bihar is a good starting point for any such discussion. In 2016, the pass rate in the science stream of Bihar’s state school-leaving board exam was 67 percent; in 2017, it plunged to 30 percent. [2] In Punjab, meanwhile, the high-school pass rate fell by 15 percentage points, from 77 to 62 percent, over the same period. [3] The high pass rates until 2016 were mainly due to excessive marks inflation by the boards in the garb of “moderation.” After a joint decision in April 2017 by all 40 major exam boards of India to do away with this form of moderation, the state boards that actually implemented the decision, such as Bihar and Punjab in the examples above, saw dramatic falls in pass rates. Some officials have been held accountable: the head of the Punjab school exam board, for instance, was pressured by the chief minister to resign, [4] since such a fiasco dents the image of the ruling government. Unfortunately, even those at the highest levels of government do not realise that what is going on is falsification and misreporting of marks, which is very different from real moderation, a sound statistical process.

Secondary-school graduates are around 18 years old and are therefore voters, or soon-to-be voters, whom political parties seek to court; some parties try to capitalise on this. Table 1 shows, for example, how the pass rate in the Uttar Pradesh high-school board exam fluctuates with a change of political guard. In 1992, under the newly elected Bharatiya Janata Party (BJP) government of Chief Minister Kalyan Singh, police were deployed at all the exam centres to deter cheating, and the pass rate fell dramatically from 57 percent to 14.7 percent. It is apparent that whenever the Samajwadi Party (SP) is in power, the high-school pass rate rises massively, to roughly double that of the years when another party is in power.

Table 1: Pass rates in the Uttar Pradesh high-school board exam

Year | Pass rate (%) | Party in power
1988 | 46.6 |
1989 | 44.8 |
1990 | 44.2 |
1991 | 57.0 | Janata Dal
1992 | 14.7 | BJP*
1997 | 47.9 | BSP
2002 | 40.2 | BJP
2007 | 74.4 | SP
2008 | 40.1 | BSP
2013 | 86.6 | SP
2016 | 87.7 | SP

*Year in which police were deployed at all exam centres to deter cheating (see text).

Source: Newspaper reports, various years.

The genie is out of the bottle

Many perils of using raw marks had been flagged as early as 1971, in a report on the scaling and moderation of scores. [5] That study was headed by A. E. T. Barrow, the then secretary of the CISCE, who had strongly advocated for the use of rigorous statistical methods for assessment. The recommendations, however, were never implemented.

After decades of opaque assessment procedures, the test scores in two major Indian school-leaving examinations were collated and analysed for the first time in 2013, by two data analysts: Prashant Bhattacharji (one of the authors of this brief) and Debarghya Das. The data was analysed in subsequent years as well. Bhattacharji was a software engineer who had worked at Microsoft Corporation and Lehman Brothers, and Das was then a sophomore at Cornell University, interning at Google. The two revealed major anomalies in the CBSE [6] and CISCE [7] examinations, respectively. It was observed, for one, that CBSE had been manipulating scores since 2008. [8]

Figures 1 and 2 are histograms of the scores in Mathematics and Computer Science in the 2015 grade 12 (A-Level equivalent) examinations conducted by the CISCE, a national school board. Note the irregular gaps in the graphs: certain scores, such as 81, 82 and 84, were never attained by a single student in any of the subjects.
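The kind of gap visible in Figures 1 and 2 is straightforward to detect programmatically. The sketch below is a hypothetical illustration in Python, not the authors’ actual analysis code: it tallies how many candidates received each integer mark and lists the marks that nobody received. The toy score list stands in for a board’s full results file.

```python
from collections import Counter

def missing_marks(scores, max_mark=100):
    """Return the integer marks in 0..max_mark that no candidate obtained."""
    counts = Counter(scores)
    return [m for m in range(max_mark + 1) if counts[m] == 0]

# Toy example; with a full cohort of lakhs of candidates, only genuinely
# unattained marks (such as 81, 82 and 84 in Figures 1 and 2) would remain.
scores = [78, 79, 80, 83, 85, 86, 88, 90, 95, 95, 100]
print(missing_marks(scores))
```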

Figure 1: Histogram of the grade 12 Mathematics marks (CISCE exam board, 2015)

Figure 2: Histogram of the grade 12 Computer Science marks (CISCE exam board, 2015)

Figures 3 and 4, meanwhile, show histograms of scores in the Mathematics examination conducted by the CBSE in 2013 and 2015, respectively. The histograms look odd: between the two years, the mean jumped by almost 5 percent, indicating a lack of repeatability and reliability in the examinations. These are followed by the score distributions in Physics and Chemistry, in Figures 5 and 6.

Figure 3: Histogram of the grade 12 Mathematics marks (CBSE exam board, 2013)

 

Figure 4: Histogram of the grade 12 Mathematics marks (CBSE exam board, 2015)

Figure 5: Histogram of the grade 12 Physics marks (CBSE exam board, 2015)

Figure 6: Histogram of the grade 12 Chemistry marks (CBSE exam board, 2015)

‘Moderating’ marks: Who loses?

The country has over 60 school boards. Many universities, such as the University of Delhi, admit students to undergraduate programmes based on their absolute score in the grade 12 examination. This is an unfair admission procedure, because marking standards vary widely across boards: a score of 80 percent could indicate very different levels of academic accomplishment in, say, the Tamil Nadu, CBSE and West Bengal boards.

The lack of a normalised metric has created an unfortunate incentive for school boards to inflate and misreport the marks of their students. Boards that fail to do so see their schools switch over to other boards where grading is more liberal. There have been several cases in the state of West Bengal, for example, where schools decided to make a switch from the local state board to the pan-India boards, as grading standards were extremely strict in the former. 

Between 2004 and 2016, the median percentage score in CBSE’s school-leaving examinations jumped by 8 percentage points. Scores in languages, practical work, and project work were worst hit by the score-inflation phenomenon.

An analysis of the Physics practical examination scores of nearly 525,000 candidates revealed that over 40 percent had been awarded a full score of 30 out of 30 in the CBSE grade 12 examinations of 2017. In addition, more than 88 percent had scored 90 percent or more in the practical component (27 or more out of 30). Similar trends are seen in the practical components of other science subjects such as Chemistry and Computer Science. Such misreporting of standards renders the schooling system ineffective and eventually produces unemployable engineers and technologists. Because science is empirical and technology-oriented subjects are practical, awarding unrealistically high scores to students who barely entered a laboratory during high school stalls the country’s progress in both science and technology.
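For illustration, the arithmetic behind these percentages can be reproduced with a short script; this is a sketch that assumes the practical scores are available as a simple list, and the toy data below is not the actual CBSE distribution.

```python
def practical_summary(scores, full_mark=30):
    """Return the share of candidates at the full mark and at or above 90% of it."""
    n = len(scores)
    at_full = sum(1 for s in scores if s == full_mark) / n
    at_90_plus = sum(1 for s in scores if s >= 0.9 * full_mark) / n
    return at_full, at_90_plus

# Toy data standing in for ~525,000 real practical scores out of 30.
scores = [30, 30, 29, 28, 30, 27, 30, 26, 30, 25]
at_full, at_90_plus = practical_summary(scores)
print(f"full marks: {at_full:.0%}, 90% or above: {at_90_plus:.0%}")
```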

While moderation is well justified as a statistical process, artificial inflation of marks is not. The country’s boards engage in the latter, and they do not have a system in place for the former. Unfortunately, even those at the highest levels of the school system in India are not aware of the distinction [9] between the two.

Moderation, or standardisation, is a process in which exam boards across the world use statistical methods to scale raw scores up or down, either to fit them to a curve or to compensate for inter-examiner variability. It is also sometimes used to ensure the repeatability of the examination by adjusting for year-to-year variations in the difficulty of the question papers.
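A transparent version of such standardisation can be expressed in a few lines. The sketch below is a generic linear z-score rescaling, an illustration rather than any board’s actual procedure: it shifts and stretches one examiner’s or one year’s raw scores onto a common mean and spread, while preserving every candidate’s rank.

```python
import statistics

def moderate(raw_scores, target_mean=60.0, target_sd=12.0, max_mark=100):
    """Linearly rescale raw scores to a target mean and standard deviation.

    This preserves each candidate's rank; it only shifts and stretches the scale
    to compensate for a harsh/lenient examiner or an unusually hard/easy paper.
    """
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores) or 1.0  # avoid division by zero
    scaled = [target_mean + target_sd * (s - mean) / sd for s in raw_scores]
    return [min(max(round(s), 0), max_mark) for s in scaled]

# Relative ordering is unchanged; only the location and spread of the scale move.
print(moderate([35, 48, 52, 60, 71, 83]))
```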

Over the last decade, Indian exam boards have misused this practice and devolved it into a highly opaque affair, leading to artificial inflation and misreporting of marks. For example, it was discovered that in 2016 and 2017, the CBSE inflated scores in Mathematics by 16 and 11 marks, respectively, but in both years truncated them at 95. This meant that a candidate with 79 marks and another with 94 would both obtain a final score of 95 on their marksheets, without the students, or anyone else, knowing on what basis the scores had been increased.
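The scheme described above amounts to the following mapping, shown here as a worked illustration using the reported 2016 uplift of 16 marks rather than CBSE’s published formula; it demonstrates how rank information at the top of the scale is destroyed.

```python
def inflate_and_cap(raw, uplift=16, cap=95):
    """Add a flat uplift to a raw mark and truncate the result at the cap."""
    return min(raw + uplift, cap)

for raw in (79, 85, 94):
    print(raw, "->", inflate_and_cap(raw))
# 79 -> 95, 85 -> 95, 94 -> 95: candidates 15 marks apart become indistinguishable.
```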

In the case of the CISCE examinations, after it was highlighted that specific marks, such as 81, 82, 85 and 87, had never been obtained by any candidate in any subject for over a decade, the board silently addressed the issue in its 2017 results, rectifying what was, in all likelihood, a flawed computer program running a post-processing operation on raw marks. In the race to raise their students’ average marks, the various Indian exam boards have made it difficult for university admission officers to distinguish exceptional students from the average, and the rest.

Going beyond what is acceptable

Some deviation from perfect bell curves is acceptable and there are plausible explanations for a few spikes such as those around pass marks (where evaluators give examinees the benefit of the doubt) and around grade-boundaries (where examination script markers are faced with dilemmas). However, it is hard to find any benign explanation or acceptable justification for the level of distortion observed in histograms from various CBSE and CISCE examinations.

So far, all the schemes to edit and transform marks have been arbitrary, hidden from public view, and primarily used in a dishonest game of competitive score inflation. Not only were scores being inflated, but the ad-hoc schemes being used to do so were causing a serious distortion of relative ordering. Some boards awarded as much as 20 percent extra marks in specific subjects, so that failing candidates would clear the pass mark. One of the co-authors of this paper is a member of the examinations committee of an Indian state exam board, where she has seen ‘moderation’ of up to 100 percent.

Suggestions for reform

First, the country needs an exam watchdog along the lines of the UK’s Ofqual (Office of Qualifications and Examinations Regulation) to keep an eye on assessment standards across the multitude of examining bodies.

Second, exam boards must report not only absolute scores but also the relative ordering, indicated by either the percentile score or the z-score and, ideally, a normalised grade point. Raw scores convey a spurious level of precision, ignoring the subjectivity involved in assessment.

A grade-point system has the potential to be helpful as a coarser filter, but consistency in the grading schemes of the various boards must be ensured. Moreover, no more than two to three percent of candidates should attain the highest grade, to distinguish genuine excellence from the rest. The resolution, and hence the relevance, of the examination is lost if lakhs (or even hundreds) attain the highest grade. [10]
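A minimal sketch of what such relative reporting could look like follows, assuming a board has the full cohort of subject scores available; the cutoffs and grade labels are illustrative, not a proposal for specific boundaries.

```python
import statistics
from bisect import bisect_right

def relative_report(cohort_scores, candidate_score):
    """Return the candidate's percentile and z-score relative to the full cohort."""
    ordered = sorted(cohort_scores)
    percentile = 100.0 * bisect_right(ordered, candidate_score) / len(ordered)
    mean = statistics.mean(cohort_scores)
    sd = statistics.pstdev(cohort_scores) or 1.0
    return percentile, (candidate_score - mean) / sd

def grade(percentile):
    """Map a percentile to a coarse grade, reserving the top grade for ~2% of candidates."""
    if percentile >= 98:
        return "A*"
    if percentile >= 85:
        return "A"
    if percentile >= 60:
        return "B"
    if percentile >= 33:
        return "C"
    return "D"

# Toy cohort; a real board would use the full distribution for the subject.
cohort = [45, 52, 58, 60, 63, 70, 75, 81, 88, 95]
pct, z = relative_report(cohort, 81)
print(f"percentile {pct:.1f}, z-score {z:+.2f}, grade {grade(pct)}")
```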

Third, there should be a general scholastic test designed along the lines of the SAT (Scholastic Aptitude Test) in the United States, whose results can also be used to benchmark the performance of students across boards. The results of such a test may be used by central universities to create an incentive for students to opt for it. This can be a board-agnostic “pass certificate” to help identify candidates who actually have basic reading, writing and numeracy skills. For political and commercial reasons, India’s school exams fail to do this.

Fourth, a one-time ‘correction’ (stopping of marks inflation) is needed, to be implemented by all the exam boards together, rather than gradually over a few years or by different exam boards in different years. Moreover, since the number of examinees is huge, the exams need to have a good proportion of multiple choice questions that can be marked electronically using OMR answer sheets (instead of all questions asking for narrative answers), to avoid poor quality script marking due to a shortage of examiners.
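The electronic-marking step for such multiple-choice sections is simple in principle; the sketch below assumes a hypothetical response format (one letter per question), and a real OMR pipeline would additionally handle blanks, multiple markings and any negative marking.

```python
def score_omr(responses, answer_key, marks_per_question=1):
    """Score a scanned response string (e.g. "ABDCA") against the answer key."""
    return sum(marks_per_question
               for given, correct in zip(responses, answer_key)
               if given == correct)

print(score_omr("ABDCA", "ABDCB"))  # 4 of 5 questions correct
```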

Finally, in the course of admitting students, all colleges and universities must have the autonomy to conduct a layer of assessment beyond the school examinations. Currently, colleges such as Ramjas and Shri Ram College of Commerce [11] are unable to do this, as this degree of administrative autonomy generally requires minority status. These colleges therefore find themselves helpless as a disproportionate fraction of their seats is occupied by students from the boards that inflate scores the most.


About the authors 

Prashant Bhattacharji is a Data Analyst ([email protected]).

Geeta Kingdon is a Professor at the UCL Institute of Education ([email protected]).


Endnotes

[1] FP Staff. “CBSE moderation row: Board awarded up to 11 extra marks in this year’s Class 12th exams.” Firstpost, 2 June 2017. Accessed 17 July 2017. http://www.firstpost.com/india/cbse-moderation-row-board-awarded-up-to-11-grace-marks-in-this-years-class-12th-exams-3508967.html. 

[2] PTI. “Bihar Board Class 12 results 2017 sees 64% students fail in their exams.” Livemint, 30 May 2017. Accessed 17 July 2017. http://www.livemint.com/Education/HbiMixK07K2LknJIPPZ6VK/Bihar-Class-12-Board-results-2017-sees-64-students-fail-in.html.

[3] Deep, Jagdeep Singh. “Punjab Class XII results: No grace marks lead to 14 per cent dip in pass percentage.” The Indian Express, 14 May 2017. Accessed 17 July 2017. http://indianexpress.com/article/india/punjab-class-xii-results-no-grace-marks-lead-to-14-per-cent-dip-in-pass-percentage-4654738/. 

[4] FE Online. “PSEB Results: Balbir Singh Dhol quits as chairman after CMO asked to resign following poor board results.” The Financial Express, 26 May 2017. Accessed 17 July 2017. http://www.financialexpress.com/education-2/pseb-10th-result-2017-pseb-ac-in-balbir-singh-dhol-quits-as-chairman-after-cmo-asked-to-resign-following-poor-board-examination-results/686413/.

[5] Barrow, A. E. T. “Principles of scaling and the use of grades in examinations.” Teacher Education (India), January 1971. Accessed 17 July 2017. http://www.teindia.nic.in/mhrd/50yrsedu/g/52/76/52760I01.htm. 

[6] Bhattacharji, Prashant. “Exposing CBSE and ICSE: A follow-up (2015 Results) after 2 years.” The Learning Point, June 2015. Accessed 17 July 2017. http://www.thelearningpoint.net/home/examination-results-2015/exposing-cbse-and-icse-a-follow-up-after-2-years. 

[7] Das, Debarghya. “Hacking into the Indian Education System.” Quora, 4 June 2013. Accessed 17 July 2017. https://deedy.quora.com/Hacking-into-the-Indian-Education-System. 

[8] Bhattacharji, Prashant. “A Shocking Story of Marks Tampering and Inflation: Data Mining CBSE Scoring for a decade.” The Learning Point, June 2013. Accessed 17 July 2017. http://www.thelearningpoint.net/home/examination-results-2013/cbse-2004-to-2014-bulls-in-china-shops.

[9] Kunju S., Shihabudeen. “CBSE, State Boards Agree To Scrap ‘Marks Moderation’, Degree Cut-Offs To Drop.” NDTV, 25 April 2017. Accessed 17 July 2017. http://www.ndtv.com/education/cbse-state-boards-agree-to-scrap-marks-moderation-degree-cut-offs-to-drop-1685742.

[10] Sebastian, Kritika Sharma. “Over 1.6 lakh score ‘perfect 10’ CGPA.” The Hindu, 29 May 2016. Accessed 17 July 2017. http://www.thehindu.com/news/cities/Delhi/over-16-lakh-score-perfect-10-cgpa/article8662008.ece.

[11] Gohain, Manash. “Tamil Nadu students grab up to 80% of seats in SRCC so far.” The Economic Times, 2 July 2016. Accessed 17 July 2017. http://economictimes.indiatimes.com/industry/services/education/tamil-nadu-students-grab-up-to-80-of-seats-in-srcc-so-far/articleshow/53017636.cms.

The views expressed above belong to the author(s).