Friday, January 27, 2012

District ranking error vs Z score issue

The Island.

By Dhara Wijayatilake

Attorney-at-Law,

Secretary, Ministry of Technology and Research and Chairperson of the Committee Appointed by the President to Inquire into the District Ranking Error.

Much is being said about the Report of the five-member Committee appointed by President Mahinda Rajapaksa to report on matters relating to the District ranking error that occurred in the 2011 G.C.E A/L results. Some allege that the report seeks to cover up a serious flaw in the Z score calculation that has been used. Some allege that it has not addressed a vital concern regarding the Z score.

It is unfortunate that too much is being said without even an attempt to appreciate what the Committee was directed to report on. Any Committee that is appointed is given a specific mandate. While constructive comments made after an understanding of that mandate are undoubtedly valuable, comments that are made without an understanding of that mandate must be dismissed as being irrelevant and unfortunate. Let me then make this humble attempt to clarify what I see as confusion.

District Rankings

The results of the GCE A/L examination released on December 25th, 2011 were found to contain errors with regard to the District rankings. The fact that there were errors in the district rankings was accepted and was not denied by anyone. There was no difference of opinion on that. The President desired to ascertain certain facts with regard to that error and appointed a Committee to look into it. The mandate given to the Committee was to ascertain the causes that contributed to the error with regard to the district rankings and who was responsible; to examine the computer systems used for processing and data analysis and ascertain whether these functions were carried out in compliance with acceptable standards; to ascertain whether there were any shortcomings in the technology used; and finally to recommend what steps should be taken to prevent a recurrence of such a situation. The Committee interviewed relevant persons and submitted its report which contained its findings and recommendations with regard to the matters that were referred to it. The recommendations have now been forwarded to the relevant authorities for implementation.

It is important to note that the Committee was required to ascertain the causes that contributed to the district ranking error. Clearly, the reference is to the error that was contained in the results that were first released by the Department of Examinations on 25th, December 2011. Of this there is no doubt. The Committee reported on all of the matters referred to it. The Committee obviously did not report on matters that were extraneous to its mandate.

The facts as ascertained by the Committee are as follows:

The Department of Examinations (DoE) is statutorily mandated with the power/responsibility of conducting the GCE A/L examination while the University Grants Commission (UGC) is statutorily mandated with the power of determining admission of students to institutes of higher learning. The UGC had used the Z score of candidates to determine admissions to the University since 2001 and there was no debate about the use of the Z score for this purpose. The physical activity of calculating the Z score based on whatever the UGC determined had always been carried out by the DoE which had a system in place to enter, amongst other things, the raw marks of candidates and proceed to process same to generate the Z scores.
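The standardisation step described above can be sketched in outline. This is a minimal illustration of the standard Z score formula only, not the Department's actual processing system, and the mark, mean, and standard deviation values used are hypothetical:

```python
# Illustrative sketch: a Z score expresses a raw mark as its distance
# from the cohort mean, measured in standard-deviation units.
# (Hypothetical values; not the DoE's actual pipeline.)

def z_score(mark, mean, std_dev):
    """Standardise a raw mark against a cohort mean and standard deviation."""
    if std_dev == 0:
        raise ValueError("standard deviation must be non-zero")
    return (mark - mean) / std_dev

# Example: a raw mark of 72 in a subject with mean 55 and std dev 10
print(z_score(72, 55, 10))  # 1.7
```

Because the Z score is relative to the cohort, it allows candidates who sat different subjects (with different mark distributions) to be compared on a common scale.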

In 2011, there were two categories of students who sat the GCE A/L – those who sat under the old syllabus and those who sat under the new syllabus. The DoE was planning on calculating the Z scores of the two categories of students as in previous years and to generate two separate lists – one with the Z scores and rankings of the old syllabus candidates and another with the Z scores and rankings of the new syllabus candidates. However, the UGC did not want two separate lists. They required the DoE to generate one list from which university admissions could be determined on merit. The UGC therefore appointed a five-member Panel of experts to obtain advice on how to produce one single list from which they could determine University admissions on merit.

The UGC obtained that advice from the Z score Expert Panel and passed the formula that the Panel proposed, to the DoE to do the calculations. The department proceeded to apply the formula given to them by the UGC and produced one list of results.

Expert Panel

As stated previously, when the GCE A/L results were released on 25th December 2011, it was found that an error had occurred in the district rankings. The DoE had, it was said, corrected that mistake and issued a new set of results with different district rankings. The President wanted an inquiry to be conducted. A Committee was appointed to inquire into it and was given the mandate previously referred to.

In pursuance of its mandate, the Committee identified how the district ranking error was caused (due to a processing error at the final stage) and who was responsible for that error, and after its own independent investigations confirmed that the error (i.e. the processing error that occurred at the final stage) had been corrected. The Committee also identified many areas that must of necessity be improved to enhance the quality of the examination process, including the need for double checking and validation at every stage, and made recommendations for improvement. The computer system was examined and recommendations were made with regard to improvements. It was also recommended that implementation be pursued in terms of a Plan of Action and time targets. The President has directed that the recommendations be implemented. I will not here refer to the details of the recommendations made. I am confident, however, that no one will call us bureaucratic lackeys or boot lickers after reading our report.

The finding of the Committee was that the error in the district rankings was caused by a mistake that occurred at the data processing stage in the Department of Examinations. The formula adopted by the UGC to determine University admissions was not the cause of the district ranking error. Obviously, then, the correctness or otherwise of the Z score formula, however important it may be (and I fully agree that it is very important), was a matter that was extraneous to the mandate of the Committee. The use of a wrong formula to calculate the Z score would have resulted not only in an error in the district rankings but also in the island rankings.

The fact that there are other issues that are also important does not give the Committee the right to step outside its given mandate and comment on such issues. The Committee had no right to assume that the appointing authority had any intention of expecting advice on those matters, from this Committee.

Clear Distinction

There is a clear distinction between an error (such as was caused in the district rankings) and a difference of opinion, such as is now raging with regard to the calculation of the Z score. There are two issues here, and they are distinct and separate. The first is regarding the district rankings error and the other is regarding the formula adopted to calculate the Z score. Please note there was no error in the island rankings in the results released on 25th December, 2011. (Prof. Gunawardane, in his article in The Island of January 25th, 2012, states that there were errors in the island rankings as well in the results issued. This is incorrect.)

An error is an error – a mistake. If one needs to find out what went wrong, it is necessary to inquire into it. On the other hand, the whole issue of the Z score and how it was calculated in the instant case of the 2011 G.C.E A/L, to deal with two syllabi, stems from a difference of opinion. Should the Z scores have been calculated after pooling the means and variances, or should the Z scores have been calculated separately for each syllabus? There are those who say the formula used is sound (the Z score expert Committee appointed by the UGC says so) and there are those who argue that it is not. The mere fact that there is a difference of opinion about this matter does not, by itself, render the formula erroneous. Equally, no one who has an opinion different to the one that has now been adopted should be faulted for having that opinion. But it is absolutely important to get it right, in the end.
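The two contested approaches can be illustrated with a toy example. The marks below are hypothetical, the cohorts are deliberately tiny, and this is not the formula actually adopted by the UGC's expert panel — it is only a sketch of what "separate" versus "pooled" standardisation means:

```python
import statistics

# Hypothetical raw marks for the two cohorts (not real examination data).
old_syllabus = [40, 50, 60, 70]
new_syllabus = [55, 65, 75, 85]

def z(mark, mean, sd):
    """Standard Z score: distance from the mean in std-dev units."""
    return (mark - mean) / sd

# Approach 1: standardise each cohort against its OWN mean and SD,
# producing two lists that are not directly comparable.
def separate_z(marks):
    m, s = statistics.mean(marks), statistics.pstdev(marks)
    return [z(x, m, s) for x in marks]

# Approach 2: pool both cohorts and standardise every mark against the
# COMBINED mean and SD, yielding one commonly scaled merit list.
def pooled_z(*groups):
    pooled = [x for g in groups for x in g]
    m, s = statistics.mean(pooled), statistics.pstdev(pooled)
    return {i: [z(x, m, s) for x in g] for i, g in enumerate(groups)}

print(separate_z(old_syllabus))
print(separate_z(new_syllabus))
print(pooled_z(old_syllabus, new_syllabus))
```

Note that under the separate approach the top candidate of each cohort receives the same Z score regardless of how the cohorts' raw distributions differ, whereas the pooled approach lets a stronger cohort's candidates rank higher — which is precisely why the choice between the two methods matters for a single merit list.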

Obviously, the needs of decision makers in these two situations are different. While the former situation (error) requires a fact finding process to be put in place, the latter (Z score) requires expert advice.

Mr. Sumanasiri Liyanage (The Island, January 23rd, 2012) states: "I strongly believe the members (of the Committee) would have carefully read the Terms of Reference (TOR) before setting about their task, asked themselves whether they could within the given mandate do justice to the issue? Had they done so, they would have realised that the investigation within the purview of the given ToR was not meaningful." Mr. Liyanage also hints that the investigative exercise of the Committee was destined not to produce reasonable results.

The Issue

What is this issue that is referred to here by Mr. Liyanage? To whom is the investigation of the Committee not meaningful? The issue that the Committee was tasked to report on was the District ranking error issue. Was that not an issue? It was an issue to the many stakeholders who were as yet wondering whether, even now, that error has in fact been corrected. And it was obviously an issue to the Head of State. Is it not important to ensure that such mistakes do not recur? Is it not important to ascertain whether that mistake has now been corrected as claimed by the DoE? Is it not meaningful to attempt to improve the process followed by the DoE in conducting one of the most important examinations that our students submit themselves to? Prof. R.P. Gunawardane (The Island, January 25th, 2012) rightly emphasises the importance of the G.C.E A/L examination. Is it then meaningless to ensure that all is right in the conduct of that examination? No, it is not. It is not only meaningful, but it is also the serious responsibility of those in charge of education to remedy what is wrong. If one is sincere about remedying what is wrong, one needs first to find out what is wrong.

The mere fact that there is a difference of opinion about the Z score issue, however intrinsically connected that issue may be to the same examination (the 2011 GCE A/L), does not make the District ranking error and everything done to prevent a recurrence of such an error, meaningless.

Prof. Gunawardane, in his article (The Island, January 25th, 2012), analyses the formula used to calculate the Z scores in the 2011 examination, states that a fundamental error was made by the expert Committee (i.e. the Z score expert committee), and proceeds to suggest how it should have been done. These comments are indeed valuable. He also states that "Unfortunately, the Presidential Committee appointed to look into this matter has overlooked this important issue of using a wrong formula for the calculation of the z scores." (The Island editorial also cites this statement.) No, Prof. Gunawardane, the Presidential Committee did not overlook that matter. As explained earlier, we were not appointed to look into that matter. Perhaps Prof. Gunawardane states so because of an incorrect assumption that there were island ranking errors in the results issued on 25th December, 2011 by the Department of Examinations? While the District ranking error was caused by a processing error in the DoE, if the formula used to calculate the Z scores is not correct, it has an impact on district as well as island rankings. The questions as to whether it is right or wrong, and whether there is a better formula, are matters that must be concluded based on the best advice.

The Z score issue is one that deserves the highest consideration. What is required in that regard is not a fact finding inquiry but expert advice. The focus should be on the need to ensure that our students are treated justly and fairly and that merit alone is rewarded.
