
Education and Home Affairs Scrutiny Panel

School Exam Results

Presented to the States on 3rd May 2011

S.R.6/2011

CONTENTS

  1. CHAIRMAN'S FOREWORD
  2. INTRODUCTION
  3. KEY FINDINGS AND RECOMMENDATIONS
  4. THE PUBLICATION OF EXAM RESULTS
  5. THE COLLECTION AND USE OF EXAM STATISTICS
     COLLECTION OF THE STATISTICS
     COMPARISON WITH THE UK
     MEASURING SCHOOL PERFORMANCE
  6. OTHER ISSUES
     THE ISLAND'S SCHOOL SYSTEM
     GENDER AND SOCIO-ECONOMIC ISSUES
  7. CONCLUSION
  8. APPENDIX 1 – EXAM RESULTS MEDIA RELEASE
  9. APPENDIX 2 – DEPARTMENTAL SUCCESS CRITERIA
  10. APPENDIX 3 – PANEL MEMBERSHIP AND TERMS OF REFERENCE
  11. APPENDIX 4 – EVIDENCE CONSIDERED
  1. CHAIRMAN'S FOREWORD
  1. I am pleased to present the Panel's report following its review of school exam results. Our initial intention had been to conduct our review within a short timescale: significant issues had been raised by the very public debate on whether (and what) exam statistics should be published and we wished to present our report quickly so that those issues could be addressed.
  2. That ultimately proved to be impossible due to the impact of the Easter recess and the presentation of our report has therefore needed to wait until the end of the recess. We trust, however, that our conclusions and recommendations remain pertinent and will be taken into account by the Minister for Education, Sport and Culture.
  3. Our apologies go to any who thought we might use this review to set out future proposals for secondary education in Jersey. That was beyond our remit. However, we anticipate that this will in fact come during the discussions generated by the Minister's forthcoming Green Paper.
  4. Instead, we focussed on the narrower issue of the publication of information on school results and overall performance. We believe it has to be a far more open process so that parents, taxpayers, students and other interested parties are given the full picture. Reforming secondary education will be a major challenge but transparency is essential to that process.
  5. I would like to thank our two witnesses, the Minister and Mr. John Mills, and the Panel Members for their contributions and efforts during our review.

Deputy R. G. Le Hérissier

Chairman – Education and Home Affairs Scrutiny Panel

  2. INTRODUCTION
  1. On 23rd February 2011, the Jersey Evening Post (JEP) printed the first of several articles in which concerns were expressed at the GCSE results of some of the Island's schools. Few people can be unaware of the impact made by these articles and the debate which subsequently ensued in the JEP's letters pages.
  2. The initial articles drew heavily on information provided to the JEP by Mr John Mills, who had obtained, from the Department of Education, Sport and Culture (ESC), comprehensive data on Jersey's GCSE and A-Level results. In the articles, Mr Mills argued that while some results were clearly very good, others were less so, and some aspects were a cause for concern. He made the point that Jersey's 2010 GCSE results (based on candidates achieving at least five A* to C grades at GCSE) were, in aggregate, close to the overall UK figure but a little behind those of certain UK regions with which Jersey might reasonably be compared. Mr Mills also highlighted the seemingly wide variations in performance between the Island's nine secondary schools on the 'benchmark' measure of at least five A* to C GCSE grades including Maths and English; and the fact that about a quarter of overall GCSE entries in Jersey had been awarded Grade D or below. He was of the view that the data potentially indicated a range of performance problems in Jersey's education system which the Minister needed to address. Beyond Mr Mills's own analysis, the JEP also highlighted the fact that the GCSE performances of the Island's four 11-16 schools compared unfavourably with UK schools' performance.
  3. Exam results data for individual schools had not been published in this way before and the decisions of Mr Mills and the JEP to do so were condemned by the Minister for ESC. In a statement to the States Assembly on 1st March 2011, the Minister argued that the Island's schools could not reasonably be compared with schools in the UK and that focussing on one narrow measure (i.e. exam results or, more particularly, the proportion of pupils receiving five or more A* to C GCSE grades, including English and Maths) placed undue and unfair pressure on the schools.
  4. We shall consider both viewpoints in more detail later in this report. In light of the publicity afforded the initial allegations and the arguments put on both sides of the debate, the Education and Home Affairs Scrutiny Panel agreed to review the matter. It is not our role to act as an adjudicator or moderator between two opposing parties. It is our role, however, to hold the Minister to account. In our terms of reference, we therefore set out to consider his policy on publishing exam results and to determine whether any changes should be made to that policy. We therefore held two Public Hearings, first with Mr Mills to consider the validity of his concerns, and secondly with the Minister to consider his policy. Inevitably, our discussions touched upon broader issues which we shall also cover in this report. The transcripts of both Public Hearings have been made available on the Scrutiny website (www.scrutiny.gov.je) and we would encourage all interested parties to read them.
  1. School performance is a significant issue which could lend itself to a lengthy and detailed review. However, as the Minister will imminently publish a Green Paper on the future of education in the Island, we agreed to make our review short in order that our conclusions could feed into that consultation period. We therefore ceased to gather evidence once we had held the Public Hearings. We anticipate that further detailed study and discussion of these issues will be necessary in due course.
  3. KEY FINDINGS AND RECOMMENDATIONS

KEY FINDING

  1. Exam statistics for each of the Island's schools cannot reasonably be withheld from publication.

RECOMMENDATION

  1. The Minister for Education, Sport and Culture should revise his policy on the publication of exam statistics.

KEY FINDING

  1. There is sufficient independence in the collation of Jersey's exam results data and there is therefore no current need for more direct involvement of the Statistics Unit.

KEY FINDING

  1. The Minister's policy on publishing exam results should ensure that a proper explanation is provided of how Jersey's results may feasibly be compared with other jurisdictions.

KEY FINDING

  1. Exam statistics are not the only performance measure used by the Department of Education, Sport and Culture.

RECOMMENDATION

  1. The Minister for Education, Sport and Culture should develop a reporting structure for school performance that takes into account other performance measures used by his Department (as well as exam results) and through which information should be made publicly available except in exceptional circumstances.

KEY FINDING

  1. There needs to be a proper debate on the structure and objectives of the Island's secondary school system.

KEY FINDING

  1. Work should continue on addressing the apparent gender imbalance in school performance and on determining the impact of socio-economic status and parental contribution/influence on performance.  
  4. THE PUBLICATION OF EXAM RESULTS
  1. The immediate issue that our review had to address was the Minister's policy on publishing exam results and whether the results of each of the Island's schools could (or, indeed, should) be published.
  2. The Minister's policy dates from 2005 when it was agreed by the ESC Committee of the day. The policy requires that:

"a) Aggregate results for the island are compiled by the DfESC and released

to the media; [and]

b)  Individual schools publish their results to governors and parents and show how these compare with the Island average."

As a consequence, each year the Department issues a media release that might include Island-wide statistics but not the results for each school; headteachers are able to comment generally about their school's achievements but any requests for statistical data must be referred to the Department.[1] An example of how the Island's GCSE results were announced in 2010 by the Department has been appended to this report. While much of our focus has been on GCSE results, the same policy applies to Jersey's A-Level results.

  1. The rationale underlying this policy is to avoid the publication of performance tables (commonly, although not always accurately, referred to as league tables) on the grounds that their publication would be counter-productive. The Minister indicated as much in his statement to the States Assembly on 1st March 2011. His subsequent decision to release comprehensive data to Mr Mills could therefore be seen as recognition on his part that the information was public rather than private, and as being at odds with his own policy.
  2. Much of the recent debate has indeed centred on the question of whether performance tables for the Island's schools should be published and on the respective roles of the fee-paying and non-fee-paying schools in the Island's education system. However, it is our view that this is not the issue which needs to be addressed. Rather, the issue is whether the information in question can feasibly and reasonably be withheld from publication. Mr Mills himself indicated to us that his concern was not the establishment of league tables and he stated that Jersey was in fact too small for such tables. In his view, it was a question of access to information and whether the Minister should make the information on individual schools more readily available.[2]
  1. On this point, there are two reasons why the Minister cannot seemingly withhold the information in question (i.e. individual schools' exam statistics). First, the schools' exam statistics can already be seen to be in the public domain. Part (b) of the Minister's policy requires the schools to publish their results to governors and parents. We are aware of at least one school, for instance, which publishes exam statistics on its website. While Part (b) does not equate to a formal media release from the Department, the information can be seen to be 'out there' in the public domain. It therefore appears difficult for the Minister to argue that he could not publish such information himself.
  2. The second reason is that recent events show the Minister was seemingly unable to withhold the information from Mr Mills, despite some initial reticence on the Minister's part. Subsequent to our Public Hearing with the Minister, he advised us that it had not been a question of being 'unable to withhold the information'. Rather, due to his duty of care to the Island's pupils, he had been required to follow due process in releasing any data. That process had included seeking advice, which meant it had not been possible to respond instantaneously to the request for information. Nevertheless, it is ultimately the case that while the Minister had previously refused to release exam statistics for the Island's schools[3], he had on this occasion chosen to do so, notwithstanding his apparent reluctance.
  3. Mr Mills was able to request and obtain exam statistics for all schools under the provisions of the Code of Practice on Public Access to Official Information. This Code allows access to official information unless there are reasonable grounds for non-disclosure. The fact that the exam statistics were released to Mr Mills suggests that there were no such reasonable grounds on this occasion. In a sense, any choice about whether or not to publish was made for the Minister. It also therefore stands to reason that he will need to accede to any future requests for this data. We anticipate this will remain the case if the Code of Practice is replaced by the Freedom of Information (Jersey) Law (due to be debated by the States Assembly on 3rd May 2011).
  4. The Minister's rationale for withholding the exam statistics for the Island's schools is that the creation of league tables would have a damaging impact on the schools. At our Public Hearing, the Minister provided the following explanation:

"[] People and organisations start to behave according to the way in which they are measured. That leads on to the curriculum ending up being designed to deliver on measurements and not necessarily focusing on the pupil's needs. It can end up with a loss of a broad and balanced curriculum. There is evidence that league tables can encourage cheating by teachers to deliver targets. There is an adverse impact on whole school communities. I think we have already seen signs of that in relation to the information that is being put in the public domain by certain individuals. There is also and, perhaps, more worryingly, an adverse  impact  on  those  pupils  with  lower  ability  and  special  educational needs."[4]

For his part, when we put this issue before Mr Mills, he explained that the introduction of performance tables in England had been "tough" but that the impact had been "wholly beneficial" because of its positive impact on the accountability of schools to parents and the wider public. Concerns about the apparent negative impact of performance tables sounded to him like excuses although he emphasised that the issue was ultimately not one of league tables: the exam statistics for certain schools were in fact of concern in themselves.[5]

  1. We were unable to verify either argument in the time available for our review. We have noted, however, that the House of Commons Children, Schools and Families Select Committee found evidence of the problems cited by the Minister when it undertook an inquiry into school accountability in 2009 and 2010. Indeed, the Committee called upon the UK Government to move away from the use of Achievement and Attainment Tables.[6] At the very least, this shows that Jersey is not alone in having to contend with the question of performance tables.
  2. Ultimately, the Minister's reasons were not sufficient grounds for withholding the information provided to Mr Mills and it seems unlikely the Minister could ignore a similar request in the future. Therefore, while the Minister's policy on publishing exam results is laudable in its endeavour to protect schools from undue pressure, it is untenable and needs to be re-examined.

 

KEY FINDING

4.11 Exam statistics for each of the Island's schools cannot reasonably be withheld from publication.

RECOMMENDATION

4.12 The Minister for Education, Sport and Culture should revise his policy on the publication of exam statistics.

4.13 In making this recommendation, we are not calling upon the Minister to publish league tables. Rather, we have concluded that a revised policy is required as the Minister's current policy cannot be justified and is seemingly no longer workable. We are not naïve enough to think that other people or the media would not create performance tables themselves, regardless of the format in which the Minister himself were to publish the information. That is undoubtedly a risk and a challenge to the Minister in revising his policy. However, it is a risk that cannot be avoided simply by not publishing the data and the Minister therefore needs to meet the challenge head on.

  5. THE COLLECTION AND USE OF EXAM STATISTICS
  1. In relation to the collection and use of exam statistics in the Island, three main questions arose during our review. First, whether there was sufficient independence in the collection and analysis of the statistics; secondly, whether the Island's exam results could feasibly be compared with those of the UK; and, finally, what role exam results in fact played in the Department's measurement of school performance in the Island.

COLLECTION OF THE STATISTICS

  1. Taking the first of those issues, Mr Mills advised us of his view that the Statistics Unit should be more involved in the publication of the Island's exam results data. Mr Mills stated that the Unit was well-respected and that its involvement would bring the requisite independence and authority, intimating that this would better allow the data to be trusted as an important evidence base.[7]
  2. Given this suggestion, we asked the Statistics Unit whether it could accommodate this responsibility, if so charged. We were advised that it would be feasible in principle for the Unit to become more involved in the collection and publication of the statistics; however, this would increase the demands placed upon the Unit and an additional statistical officer for the Unit would be required.
  3. The Department of ESC advised us that the Island's exam results were collated by the National Confederation for Examination Results (NCER) in the UK. The Department itself had no input into that process and it could not therefore influence the raw data. The Department was able to analyse the data itself, however, using software called the Educational Performance and Analysis System (EPAS). Hence, the Department could break down the results by school, by subject, by gender and so on in order to conduct its own analyses.[8]
  4. There is evidently a need for whatever exam data is published to be analysed and presented clearly so that it can be understood. That is a distinct issue from whether there is sufficient independence in the initial collation to allow the data to be trusted in the first place. The fact that the Department has no input into the collation of the data suggests that sufficient independence is indeed already built into the system. Furthermore, in the current climate of the Comprehensive Spending Review and the need to be mindful of expenditure, it is unlikely that additional resources would be provided to the Statistics Unit unless there were an irrefutable need to do so. Given the involvement of the NCER, there does not appear to be such a need at present.

KEY FINDING

5.6 There is sufficient independence in the collation of Jersey's exam results data and there is therefore no current need for more direct involvement of the Statistics Unit.

COMPARISON WITH THE UK

  1. Turning to the second issue, whether the Island's exam results could be compared with those achieved in the UK: this was a matter on which much of the recent debate has centred.
  2. The initial reports in the JEP suggested that some of the Island's schools could be seen to be underperforming as their GCSE results compared unfavourably to the majority of schools' results in the UK. In the reports, Mr Mills made the point that Jersey's 2010 GCSE results (based on candidates achieving at least five A* to C passes) were, in aggregate, close to the overall UK figure but a little behind those of UK regions such as the South East of England or Northern Ireland with which Jersey might reasonably be compared demographically; and he felt that given Jersey's relative wealth this could be seen as slightly surprising. At his Public Hearing, Mr Mills slightly downplayed the question of comparison and said that the results were simply worrying in themselves, regardless of whether or not a comparison was made with the UK.[9] For instance, he drew attention to wide variations in performance between Jersey's nine secondary schools on the 'benchmark' measure of at least five A* to C including Maths and English, and to the fact that of some 8,000 GCSE entries in Jersey about a quarter (from across all nine schools) were awarded Grade D or below. Nevertheless, the question of comparison was significant and we considered the matter during our review.
  3. The Minister has repeatedly stated that the Island's schools cannot be compared with their UK counterparts on a school-by-school basis. The explanation given for this statement was the Island's unique school system. In that system, we understand that 41% of pupils attend fee-paying schools (whereas only 7% of pupils in the UK attend such schools). Furthermore, 15% of 11 to 16 pupils transfer to Hautlieu at the age of 14 (the majority of transfers coming from the four 11 to 16 schools). As a result, the make-up of the school populations at those 11 to 16 schools differs from that found in UK schools. Hence, school-by-school comparison with the UK was invalid in the eyes of the Minister and his Department as it would not be on a like-for-like basis.
  4. Context is evidently everything and it appears sound that comparisons should be done on a like-for-like basis. Nevertheless, it is apparent that the Department of ESC does compare the Island's results with those achieved in the UK. For example, as Mr Mills pointed out to us, favourable comparisons with the UK are included among the departmental success criteria in the Department's 2011 Annual Business Plan.[10] Furthermore, the Annual Performance Report for the States reports Jersey's exam results (in both GCSE and A-Level) in comparison with those of the UK. A similar comparison is included in the media releases despatched each August by the Department (see Appendix 1).

  1. The Department advised us that comparisons were feasible at a "system level."[11] In other words, the make-up of the total school population in Jersey was, proportionately speaking, similar to the make-up of the school population in the UK (although the respective sizes of those populations were very different). Consequently, the Island's overall results could be compared with the UK's overall results. It is this comparison that one finds in the States Annual Performance Report and the media releases circulated by the Department.
  2. This is a somewhat nuanced view of how Jersey can be compared with the UK and appears sound. However, the message can easily be lost and therefore needs to be made as clear as possible. That message, that Jersey may be compared with the UK on a system basis but not on a school-by-school basis, indeed appears to have been missed at times in the recent debate and could be conveyed more clearly by the Department. For instance, the media release from August 2010 was entitled 'GCSE results better than UK average' and yet the release itself does not contain the nuanced and contextualised explanation which we received from the Minister during our review. It is therefore not surprising that a view may have developed that the Department only compares Jersey with the UK when it is convenient to do so.
  3. This problem may be compounded by the fact that the Minister's current policy on publishing exam data requires the data to be disseminated via a departmental media release. Once released, the Minister and his Department have no control over what the media report. It may well be that a more formal reporting mechanism (e.g. through the presentation of a report to the States specifically on this subject) would allow the Minister to ensure that the nuanced and contextualised explanation is indeed put across.

 

KEY FINDING

5.14 The Minister's policy on publishing exam results should ensure that a proper explanation is provided of how Jersey's results may feasibly be compared with other jurisdictions.

MEASURING SCHOOL PERFORMANCE

  1. The third issue was the most significant: how are exam results used to measure school performance? Discussion of this matter at our Public Hearings inevitably broadened to consider questions of what other measures, if any, the Department uses and whether those other measures are valid.
  1. Mr Mills advised us of his view that GCSEs were a good measure of school performance, not only in the apparent absence of anything else comprehensible to parents and other citizens but also in that they provided one clear and objective measure of the level of achievement of pupils for them to carry with them to the next stage of their lives, whether continuing in education or entering the world of work.[12] As a national standard, GCSEs were generally understood and represented a clear and reliable benchmark. He was by no means averse to the use of other measures of school performance (although those were not, or not necessarily, the same as measures of individual pupils' levels of attainment); however, he was unaware of any such measures and he called upon the Minister to make known any other measures that might be used by the Department.[13]
  2. The Minister provided us with a copy of the Jersey Framework for School Evaluation used by his Department to evaluate school performance. We also discussed the matter at our Public Hearing with him. From both of these sources, we understand that evaluation of the Island's schools involves four phases:
  1. Self-evaluation (using a standard template in which schools are asked to assess themselves on a traffic-light system in the areas of achievement; relationships; organisation; and learning);
  2. Validation by a Professional Partner (a member of the Department but who is not a member of the school in question);
  3. External Validation by Independent Inspection; and
  4. Departmental Review.

The Framework applies throughout the Island's education system and is therefore not merely relevant to school performance at Key Stage 4 (i.e. when pupils take their GCSEs).

  1. Within the Framework, exam statistics are evidently used by the Department in the evaluation process. For instance, schools are required to consider their exam data when undertaking self-evaluation while Professional Partners discuss such data when they undertake their validations.
  2. However, it is apparent that exam statistics are not the only measure used by the Department to evaluate schools. The Framework itself shows that the Department relies upon a gradated system of inspection and school validation in which other information is taken into account. This can perhaps be seen more clearly in the Department's 2011 Annual Business Plan. In that Plan, the Department's second objective is stated to be "to continue to raise standards and improve key outcomes for children and young people."[14] As we have already noted, exam statistics are included amongst the success criteria for this objective. There are other success criteria, however, and we have provided a copy of the full list in Appendix 2 to this report.

 

KEY FINDING

5.20 Exam statistics are not the only performance measure used by the Department of Education, Sport and Culture.

  1. Some of these other measures have been mentioned during the recent debate on school performance. It is apparent, for instance, that the Department has begun to use vocational measures, a development we understand to be close to the Minister's heart. Essentially, this has involved the development of vocational qualifications for those who may be less academically-minded or less academically able. There are evidently questions to be asked, for instance whether such vocational qualifications provide students with opportunities equal to those they would gain through obtaining more traditional academic qualifications.
  2. Taking a further example, another method of measuring school performance has been raised by the Minister and his Department: that of value-added measures. Put simply, such measures are seemingly intended to allow for a pupil's achievement to be measured against a starting point rather than as an absolute measure in itself. We were advised that it could therefore be described as 'measuring progress'.[15] For example, a pupil may achieve six Cs at GCSE at the age of 16. However, that pupil at the age of 14 may have been expected to achieve six Ds at GCSE. Those in favour of a value-added system might argue that the starting point at age 14 should be taken into account in that instance in order that the progress of the pupil could be measured. In that way, the pupil's 'true' achievement (and the performance of the school) could be measured.
  3. That is putting it simply. The Department advised us that schools made use of Cognitive Ability Test (CAT) scores to measure the progress of pupils and that, in that regard, the Island's system of measuring performance differed from that used in the UK (where CAT scores are not universally used).[16] Again, there are questions to be asked. Mr Mills raised some concerns regarding value-added measurements. He stated that he was not against them per se but, unlike clearly understood national standards such as GCSE results, value-added measures could, in his view, mean all things to all people.[17] For example, the progress and achievement of pupils (and thereby the performance of schools) would depend upon the 'starting point'. In the example cited in the paragraph above, an assessment of the school's performance might alter depending on whether a starting point of age 14 was used, or a starting point of age 11.
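To make the arithmetic behind such a measure concrete, the short sketch below compares each pupil's achieved GCSE grades with the grades predicted from an earlier baseline assessment. It is our own minimal illustration, not the Department's methodology: the eight-point grade scale, the function names and the pupil records are assumptions made purely for the example.

```python
# Illustrative sketch only: a minimal value-added calculation on assumed data.
# The grade scale, predictions and pupil records are invented for illustration
# and do not reflect the Department's actual methodology.

GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1, "U": 0}

def average_points(grades):
    """Mean points score across a pupil's GCSE grades."""
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)

def value_added(pupils):
    """Average difference between achieved and predicted points per pupil.

    A positive figure suggests pupils progressed beyond their baseline
    prediction; a negative figure suggests they fell short of it.
    """
    diffs = [average_points(p["achieved"]) - average_points(p["predicted"])
             for p in pupils]
    return sum(diffs) / len(diffs)

# Example: the pupil described above, predicted six Ds but achieving six Cs,
# contributes +1.0; a pupil who simply meets a prediction of six Bs contributes 0.
pupils = [
    {"predicted": ["D"] * 6, "achieved": ["C"] * 6},
    {"predicted": ["B"] * 6, "achieved": ["B"] * 6},
]
print(value_added(pupils))  # 0.5 across the two pupils
```

On these assumed figures, the pupil predicted six Ds who achieves six Cs contributes a positive score even though the absolute results look modest, which is the point made in favour of value-added measures; equally, choosing a different baseline (age 11 rather than age 14) would change the predictions and hence the figure, which is the concern Mr Mills raised.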

  1. Our consideration of matters such as vocational qualifications and value-added measures led us to consider wider issues: the purpose of education itself and the debate on whether education should focus on exam results or whether it should strive towards something more 'rounded'. For example, how much should school education focus on developing the employability of the Island's young people? Alternatively, should the well-being of pupils be the primary focus? We began to touch upon issues such as these with both the Minister and Mr Mills. However, those issues were not the primary focus of our terms of reference and much more time would be required than we had available to address them. But consideration of school performance inevitably brings into play issues such as the underlying purpose of education and we anticipate that the Minister will need to look at those issues during the consultation period initiated by the release of his Green Paper.
  2. The Minister advised us that his Department was working on the development of new performance measures. The Director of the Department explained that the Department was in the middle of a three-year project to develop "more robust accountability measures." This process was likely to lead to a system in which "annual report cards" would be produced for individual schools and the Island's school system as a whole.[18] This would appear to reflect developments elsewhere, for instance in England where school performance is likely to be assessed in the future by a 'School Report Card' that would incorporate a variety of measures, from exam data to pupil well-being.[19] It is interesting to note that both parties to whom we spoke essentially agreed that there should be such a holistic approach to measuring school performance; the differences in views related to how that aim should be reached.
  3. We shall await the results of the Department's work. The questions regarding any performance measure are ultimately whether it is viable and whether it can be trusted – this is true throughout the education system and not merely for performance at the age of 16. Mr Mills's apparent distrust of value-added measures, for example, appeared to stem from the view that they provide a fuzzy image of school performance whereas exam data provide a clear and easily understandable picture of performance. Our review was too short to allow a full-scale assessment of the viability of each of the Department's performance measures. What our review did show, however, is that information is required for such an assessment to take place.
  4. In that regard, the Department currently publishes exam results (albeit at a system level, rather than on a school-by-school basis). The data are also included in the States Annual Performance Report. However, that Report includes no reference to value-added measures and it does not include any separate information on vocational qualifications. Nor does it report on the other success criteria which the Department includes in its own Annual Business Plan. Furthermore, school inspection reports are not made public. If the only data made widely available to the public are exam results, then it is perhaps not surprising that the view should develop that they are the only worthy measure and that other measures may be somewhat unreliable.
  1. We were advised why more information on school performance is not currently made publicly available. In relation to inspections, for instance, the Director of ESC explained that schools were more willing to engage in the inspection process if it were of a confidential nature. If the process were more open, undue pressure could be placed upon the school and the school's performance could be damaged as a result.[20] Mr Mills's view, however, was that inspection reports should be made available in order that they could contribute to the discussion on school performance. Without that openness of information, there could not be a proper debate on the issues at hand.[21]
  2. In this regard, the Director highlighted an issue which is of fundamental importance to the subject of our review: how to find the right balance between school performance and school accountability? In other words, how do you develop a school system in which schools perform well but in which sufficient information is made available for the public to have faith in that system? As the work of the aforementioned Children, Schools and Families Select Committee shows, this is an issue which does not merely face Jersey. The Department's work on the matter is seemingly underway but as both the Minister and Mr Mills suggested, that needs to include a proper, informed debate.
  3. For there to be a proper debate, and for informed judgements about school performance to be made, the Minister needs to ensure that sufficient information is made available and that it is presented clearly and within the proper context. While our review focussed on GCSE results, this principle also applies at other levels of the education system. For example, our remit did not cover primary school level, but we are aware that concerns have been raised in the past regarding numeracy and literacy levels. Such concerns would also be addressed through access to appropriate information. As matters stand, we fear that too many of the Department's performance measures remain opaque to the public and therefore cannot be easily understood. Again, we raise the possibility that a more formalised reporting structure (perhaps in the States Annual Performance Report itself or in a specific report on school performance presented by the Minister to the States) may be beneficial. We accept there are risks, as the Department advised, but the risks cannot be avoided if criteria such as value-added measures or vocational standards are to be trusted.

 

RECOMMENDATION

5.31 The Minister for Education, Sport and Culture should develop a reporting structure for school performance that takes into account other performance measures used by his Department (as well as exam results) and through which information should be made publicly available except in exceptional circumstances.

  6. OTHER ISSUES
  1. Inevitably for a review such as this, our discussions touched upon several topics that lay outside the specific scope of our review (which was consideration of the Minister's policy on publishing exam statistics and, more generally, on measuring and reporting school performance). We highlight those issues here as they should no doubt be considered by the Minister in due course, perhaps as part of his Green Paper on the future of education in the Island.

THE ISLAND'S SCHOOL SYSTEM

  1. Perhaps the most significant issue which arose was the structure of the Island's school system. As we have noted, the 'unique' nature of that system was used by the Minister and his Department to explain why the Island's schools' exam results could not be compared on a school-by-school basis with those of schools in the UK.
  2. The Minister advised us that the system was not failing the Island's pupils[22] while his Department's Professional Adviser told us that, in his view, the system was made to work well.[23] Nevertheless, we understand the system will be a topic for discussion within the Minister's Green Paper. Mr Mills observed that it was at least plausible to assert that a factor in the relative underperformance of the 11 to 16 schools (as measured by GCSE results) might well be the disruptive effects on those schools of the 14+ transfer to Hautlieu. In his view, the matter needed careful and dispassionate examination.[24]
  3. When the matter does come up for discussion, we trust that due and detailed consideration will be given to the issues involved – some of which became apparent during our review. For example, what impact does the 14+ transfer actually have on performance? In the recent debate, much has been said to suggest that the results of the four 11 to 16 schools may be seen to be less satisfactory as a proportion of their more able pupils move to Hautlieu. What we have not seen described is what impact, if any, that transfer process actually has on the performance of those pupils who remain. If there is no impact, then this might suggest that no changes to that part of the system are necessary. If there is indeed an impact, whether it is positive or negative will help to decide whether changes should be implemented. The Department highlighted another aspect of Jersey's education system and stated that an impact was also made by the selection of 41% of pupils to attend fee-paying schools. These are both the kind of issues we would expect to be discussed.
  1. As part of this, the Minister will need to ensure that the current system is clearly understood. For example, what focussed efforts or targeted resources are possible within the current system? Do the relatively large size of the fee-paying sector and the transfer to Hautlieu allow the Department and schools to focus efforts and resources more successfully in the 11 to 16 schools than they would otherwise? Or does the system inherently stifle achievement?
  2. Within the discussion on the Island's school system, consideration will no doubt be given to the relationship between the schools and the Department. In that regard, Mr Mills offered the view that schools were perhaps not afforded sufficient independence to manage their own affairs (the implication being that they would perform better if allowed to do so).[25] In this area, the Department advised us that the powers and authorities of governing bodies in Jersey were different from those of their UK counterparts: in the UK, governing bodies were the 'employer' of staff and had budgetary oversight of schools; in Jersey, it was the States which acted as employer and which oversaw the budgets. It was suggested to us that greater school autonomy could be achieved by giving such responsibilities to the governing bodies but consideration would need to be given to whether that constituted an efficient use of resources in a jurisdiction of Jersey's size.[26] Again, we anticipate that this issue will be considered as part of the Minister's consultation process.
  3. In highlighting such issues, we must stress that our review has not led us to any conclusions on these particular matters; we simply believe that the Minister should bring them into his imminent consultation. After all, if the Department assesses performance on a system level (as it does with exam statistics in the annual media release and in the States Annual Performance Report), then there needs to be a proper analysis of whether the system is currently designed to achieve the maximum performance.

 

KEY FINDING

6.8 There needs to be a proper debate on the structure and objectives of the Island's secondary school system.

GENDER AND SOCIO-ECONOMIC ISSUES

  1. Our review highlighted two areas that need to be taken into account when considering pupil achievement and, consequently, school performance: an apparent gender imbalance in performance and the impact of socio-economic status.
  2. In relation to the former, the initial JEP articles reported that the exam results data revealed some significant gender imbalances in the respective results of boys and girls at the four 11 to 16 schools. Mr Mills noted that this imbalance was not in any way repeated at Hautlieu and that, in his view, it raised the question of to what extent this might perhaps be due to managerial weaknesses in the 11 to 16 schools. The Minister and Department acknowledged the problem of gender imbalances and advised that it was a problem encountered in many jurisdictions.[27] It is evidently a matter that the Department and schools should continue to work on.
  1. The impact of socio-economic status is also not unique to Jersey. Indeed, we are aware of work undertaken in the UK by the Equality and Human Rights Commission on the impact of socio-economic status on pupil and school performance.[28] Mr Mills advised us of his view that there were significant social issues in the Island that tended, it seemed, to remain somewhat unseen.[29] While that lay outside the scope of the review, it is apparent that the impact that socio-economic status may have on performance in Jersey may not have been fully explored. From the Jersey Framework for School Evaluation, we could see that schools are required to consider the socio-economic status of their pupils when considering their performance. It was not clear, however, whether other measures such as exam statistics are analysed on the basis of socio-economic status. We believe this matter requires further study and, indeed, we on the Panel might be minded to undertake further work in due course.

 

KEY FINDING

6.12 Work should continue on addressing the apparent gender imbalance in school performance and on determining the impact of socio-economic status and parental contribution/influence on performance.

  7. CONCLUSION
  1. We are conscious that we have ultimately not addressed the 'real issue' that was raised by the recent debate: are there schools in the Island which are underperforming? However, while this may be the 'real issue', it needs to be approached with great care and would require a review of much more depth and length than our own.
  2. We are similarly aware that in recommending the Minister revise his policy on publishing exam statistics and, more generally, on reporting school performance, we may be accused of pre-empting the findings of the Minister's Green Paper. We do not believe this to be the case. We have not drawn any conclusions regarding the Island's school system, or the performance of individual schools, or the viability of the Department's performance measures. Our review has not allowed for such conclusions. What it has shown, however, is that information is required for those crucial issues to be considered.
  3. There needs to be a debate on the balance between school performance and school accountability. Both the Minister and Mr Mills called for that debate to take place and, with the publication of the Green Paper, we trust that it will now occur. For it to occur properly, however, the Minister needs to re-visit his policy on reporting school performance so that sufficient information is available to States Members and the general public. The 'proper debate' needs that information to be in the public domain.
  4. We commend the Minister's desire to protect schools from undue pressure and we acknowledge the risks involved in publishing information on school performance. However, rightly or wrongly, we now live in an age of information and accountability and the Minister's policies need to reflect that reality. Indeed, those two principles are fundamental to the work of this Panel: we hold the Minister to account through the accumulation of information.
  5. There is always the risk that information is misinterpreted or misunderstood and that people may jump to conclusions. With such an emotive subject as the Island's school system and school performance, that risk presents a daunting challenge. We believe the Minister and his Department should meet the challenge head on and develop a formal reporting policy that promotes reasonable and reasoned debate on information that is clearly presented. That work is now underway, we understand, and we therefore encourage the Minister to proceed with our conclusions in mind.
  8. APPENDIX 1 – EXAM RESULTS MEDIA RELEASE

8.1  The following media release was made by the Department of Education, Sport and Culture on 26th August 2010 in relation to GCSE results:

GCSE results better than UK average  

26 August 2010

This year's GCSE results again indicate that Jersey students have achieved very good results in their GCSE examinations.

Mario Lundy, Director of Education, Sport and Culture, said he was pleased to see yet again another set of excellent results from the Island schools. "Today's examination results have yet again exceeded expectation and we yet again see evidence of the effectiveness of our education service. We must not forget that education starts in the early years and all of those involved in the education of this group of students should be very proud of their achievements".

Head of Planning and Projects for Education, Sport and Culture, Jim Westwater, said he was delighted to see such good results. "Today's excellent results are due to the professionalism of our teachers, the diligence of the pupils and the support of their parents, not only during this year but throughout their entire school careers. The Island has good reason to be proud."

 

All Island results 2010

| Grade | A* | A | B | C | D | E | F | G | U/X |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Percentage of entries | 9.5% | 17.2% | 23.0% | 23.9% | 13.6% | 7.6% | 3.2% | 1.4% | 0.6% |

UK results 2010

| Grade | A* | A | B | C | D | E | F | G | U/X |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Percentage of entries | 7.5% | 15.1% | 20.6% | 25.9% | 15.9% | 7.8% | 4.0% | 1.9% | 1.3% |

One examination board (OCR) has sent out data that does not discriminate between A and A* grades, hence the top end results are very unlikely to be accurate at this stage.

  9. APPENDIX 2 – DEPARTMENTAL SUCCESS CRITERIA

9.1  The following objective and success criteria were taken from the 2011 Annual Business Plan for the Department of Education, Sport and Culture:

Objective 2: To continue to raise standards and improve key outcomes for children and young people

Success Criteria:

(i) Professional partnership arrangements, including performance frameworks, improve the effectiveness of schools and colleges;

(ii) Literacy and numeracy progress for all children and young people is appropriate and evidenced through internal and external moderation;

(iii) GCSE and A Level results continue to compare favourably with benchmark authorities;

(iv) Robust performance indicators are used to identify areas for development of the service;

(v) ICT strategy implemented to meet agreed targets;

(vi) Funding, support and quality assurance arrangements for nursery education are monitored to ensure objectives are achieved;

(vii) Vocational pilot options for 14-16 year olds reviewed;

(viii) Assess and implement, where appropriate, the recommendations arising from the 2011 reviews of the curriculum, structure and funding of primary and secondary education;

(ix) A comprehensive programme for leadership at all levels improves school self-evaluation and increases effectiveness.

Strategic Plan Priority: 12

  10. APPENDIX 3 – PANEL MEMBERSHIP AND TERMS OF REFERENCE
10.1 At the time of this report's presentation, the Education and Home Affairs Scrutiny Panel comprised the following members:
  • Deputy R. G. Le Hérissier, Chairman
  • Deputy T. M. Pitman, Vice-Chairman
  • Deputy M. Tadier
  • Deputy J. M. Maçon
10.2 The Panel approved the following Terms of Reference for the purpose of the Review:
  1. To consider the policy of the Minister for Education, Sport and Culture for measuring school attainment (and reporting that attainment); and
  2. To consider what measures, if any, the Minister has taken following recent press reports on school examination results, particularly with regard to the publication and accessibility of those results.

11. APPENDIX 4 – EVIDENCE CONSIDERED

Documents

  1. Act B7 of the Education, Sport and Culture Committee, 25th July 2005
  2. Policy on Publication of Examination Results, Department of Education, Sport and Culture, 25th July 2005
  3. GCSE and A-Level exam data for 2005 to 2010
  4. Written question to the Minister for Education, Sport and Culture by Senator B. E. Shenton regarding the detailed breakdown of G.C.S.E grades by students in the non-fee-paying sector, 8th September 2009
  5. Written question to the Minister for Education, Sport and Culture by Senator B. E. Shenton regarding the detailed breakdown of grades at 'A' level by Hautlieu students, 8th September 2009
  6. Written question to the Minister for Education, Sport and Culture by Deputy R. G. Le Hérissier regarding school league tables, 21st September 2009
  7. Written question to the Minister for Education, Sport and Culture by Deputy R. G. Le Hérissier regarding the number of students not competent in literacy and numeracy skills, 20th October 2009
  8. Written question to the Minister for Education, Sport and Culture by Deputy R. G. Le Hérissier regarding Literacy and Numeracy levels in primary schools, 3rd November 2009
  9. School Accountability, First Report of Session 2009 – 2010 of the House of Commons Children, Schools and Families Committee, 30th November 2009
  10. How fair is Britain?, Equality and Human Rights Commission, October 2010
  11. Oral question to the Minister for Education, Sport and Culture by Deputy P. V. F. Le Claire regarding the comparison of G.C.S.E. grades between all non fee-paying and fee-paying secondary schools in Jersey, 2nd November 2010
  12. Written question to the Minister for Education, Sport and Culture by Deputy P. V. F. Le Claire regarding school inspections, 16th November 2010
  13. Education, Sport and Culture Business Plan 2011
  14. Annual Performance Report, States of Jersey, 25th February 2011
  15. Statement by the Minister for Education, Sport and Culture regarding the publication of GCSE results, 1st March 2011
  16. Oral question to the Minister for Education, Sport and Culture by Deputy M. Tadier regarding Scrutiny access to examination results, 15th March 2011
  17. Annual Performance Report – Statement by the Comptroller and Auditor General, 28th March 2011

Written Submissions

The Panel did not actively seek written submissions during its review due to the short time available. Nevertheless, the Panel received four written representations as well as written material from the Department of Education, Sport and Culture and Mr J. Mills.

Public Hearings

24th March 2011

1. Mr J. Mills

25th March 2011

1. Deputy J. G. Reed, Minister for Education, Sport and Culture
2. Mr M. Lundy, Director – Education, Sport and Culture
3. Mr J. Westwater, Head of Planning and Projects – Education, Sport and Culture
4. Mr G. Jones, Professional Adviser – Education, Sport and Culture


[1] Publication of Examination Results, Education, Sport and Culture Department Policy (25th July 2005)

[2] Mr J. Mills, Public Hearing, pages 21 and 30

[3] Written question to the Minister for Education, Sport and Culture by Senator B. E. Shenton regarding the detailed breakdown of G.C.S.E grades by students in the non-fee-paying sector, 8th September 2009

[4] Deputy J.G. Reed, Minister for Education, Sport and Culture, Public Hearing, page 28

[5] Mr J. Mills, Public Hearing, page 29

[6] School Accountability, House of Commons Children, Schools and Families Committee (30th November 2009), page 7

[7] Mr J. Mills, Public Hearing, page 24

[8] Mr J. Westwater, Head of Planning and Projects, Public Hearing, page 13

[9] Mr J. Mills, Public Hearing, page 4

[10] Education, Sport and Culture Business Plan 2011, page 7

[11] Mr M. Lundy, Director – Education, Sport and Culture, Public Hearing, page 17

[12] Mr J. Mills, Public Hearing, page 19

[13] Ibid, page 9

[14] Education, Sport and Culture Business Plan 2011, page 7

[15] Mr G. Jones, Professional Adviser – Education, Sport and Culture, Public Hearing, page 14

[16] Ibid, page 24

[18] Director – Education, Sport and Culture, Public Hearing, pages 33 and 39

[19] School Accountability, House of Commons Children, Schools and Families Committee (30th November 2009), page 7

[20] Director – Education, Sport and Culture, Public Hearing, page 39

[22] Minister for Education, Sport and Culture, Public Hearing, page 26

[23] Professional Adviser – Education, Sport and Culture, Public Hearing, page 25

[25] Mr J. Mills, Public Hearing, page 39

[26] Director of Education, Sport and Culture, Public Hearing, page 11

[27] Head of Planning and Projects – Education, Sport and Culture, Public Hearing, page 37

[28] How fair is Britain?, Equality and Human Rights Commission (October 2010), Chapter 10