Why exams like JEE, NEET are not good enough to test students’ aptitude

Chennai: Loyola College students during their protest demanding justice for Anitha and urging the Central government to ban NEET, in Chennai on Wednesday

S. Anitha, who reportedly died by suicide recently, had failed to make it through NEET (National Eligibility-cum-Entrance Test), an exam now conducted nationally to decide who can study medicine and who can’t. NEET is conducted by the Central Board of Secondary Education (CBSE). Anitha, who had scored exceptionally well in her Class XII exam under the Tamil Nadu (TN) Board, did not perform well in NEET. The state government had tried to resist NEET, and for a year it succeeded in securing TN students an exemption from the test. Following Anitha’s death, protests against NEET have been held in Chennai. Anitha, a daily-wage labourer’s daughter, apparently believed that her hard work at school would let her move ahead in life. She had even approached the court, arguing that she could not afford coaching.

Her story reminds us how exhausting and irrational our selection procedures are. In principle, NEET is a good idea as it makes multiple state-wise tests unnecessary. But conducting a professionally competent NEET is nothing less than a fantasy today. No institution has the capacity to standardise a test to make it fair and suitable for a country as diverse as India. Translation of items into different languages itself poses a formidable challenge. CBSE is unfairly burdened with carrying out a mind-boggling variety of tests all year round. It is widely believed that CBSE is tougher than state boards; no one knows what that means. Basically, all entrance tests now serve to eliminate. NEET doesn’t test aptitude for medicine; nor does the Joint Entrance Examination (JEE) judge a student’s potential for engineering.

There is little we can do to examine the validity of any exam today, including mega tests like NEET and JEE. The mantras of transparency and accountability have made no impact on the exam machinery. In fact, people think that applying accountability to exams would dilute standards. So all Boards maintain silence over how they evaluate answer sheets. Entrance tests are usually based on multiple-choice questions whose quality is mostly so poor that only someone coached in cramming can crack them. Questions in board exams require short answers, which are judged against model answers given to evaluators.

A few years ago, CBSE had introduced a procedure for seeking re-evaluation. This year, when a political science student applied for re-evaluation, she was told that the provision had recently been withdrawn. When she went to court and sought her answer sheet, it revealed how she had been marked. Here are two examples.

A one-mark question was: “How far do you agree with the statement that cultural globalisation is dangerous not only for poor countries but for the entire globe?” Her answer was: “I do not agree with this statement as cultural globalisation leads to enhanced cultures with newer combinations arising from external influences, cultural heterogenisation and greater influence of all cultures.” She was given zero for this answer. The model or ‘correct’ answer used by evaluators was: “Yes, Cultural globalisation does lead to cultural homogenisation which affects all countries as it causes shrinkage of the rich and diverse cultural heritage of the entire globe”. If you compare the two, you will conclude that the girl was punished for her creativity. But in this case, her answer was closer to what the textbook, ‘Contemporary World Politics’ (Chapter 9, p. 143), says: “It would be a mistake to assume that cultural consequences of globalisation are only negative. Cultures are not static things and all cultures accept outside influences all the time… Sometimes external influences simply enlarge our choices, sometimes they modify our culture without overwhelming the traditional.”

In many other questions, she loses marks because her answer is slightly longer than the desired answer or differently worded. But there are answers where she is spot on, and still loses marks. For instance, in analysing the biggest constraints on American hegemony, a 6-mark question, the desired answer mentions the ‘institutionalised architecture’ of the American state based on the division of power, free press and NATO. The candidate mentions all three, but uses words like ‘engineering of the government’ instead of ‘architecture of the state’. For this difference of vocabulary, she gets three out of six. Clearly, she was expected to cram the exact words from some exam guide.

This is just one example illustrating the arbitrary and opaque nature of our exam system. Much has changed in India since the late 19th century, when the public exam system was put in place. Minor reforms have occurred, but its core remains solidly opaque.