Performance Assurance

Performance Assurance Audit: Reports

Report on the Audit of Frequency of Contact

378-01-143
December 2000

Performance Assurance Sector
Correctional Service Canada



EXECUTIVE SUMMARY

The audit of the Frequency of Contact (FOC) was conducted as part of the Performance Assurance Sector's audit schedule for fiscal year 1999-2000, and also in response to a recommendation by the Auditor General in his 1999 periodic report to Parliament on the Reintegration of Offenders.

The FOC audit was completed to determine whether conditionally released offenders are being supervised in accordance with the policies and procedures under Supervision Standard 700-06. It was conducted solely by electronic means, using the Offender Management System (OMS); no visits were made to individual offices, nor were paper files examined. The scope of the audit was limited to compliance with the required frequency of face-to-face contacts with conditionally released offenders, rather than all contacts with offenders or other aspects of community supervision. The location of contacts with offenders, whether they took place in the office or in the community, was also reviewed, as were collateral contacts with offenders' family members, employers, program officers and others.

Although this was an audit of compliance with the required frequency of face-to-face contacts with conditionally released offenders, it must be noted that face-to-face contact is only one component of effective supervision of offenders in the community. Effective community supervision is all-encompassing, taking into account the offender, the family, the employer or program supervisor, the therapists, the police and all others who play a significant role in community life. In addition, other tools, such as curfews, urine testing, and the use of community residential facilities as a half-way back measure, are also used extensively to manage risk and to maintain offenders in the community. Parole supervisors are using all of these methods to supervise offenders and to reintegrate them successfully into community living.

Effective supervision leads to successful reintegration from the institution to the community; indeed, one measure of successful reintegration is the successful completion of the conditional release. This means offenders reaching their Warrant Expiry Date while on conditional release in the community, and this is occurring in greater numbers than ever before. The national rate of conditionally released offenders who reached Warrant Expiry Date in the community was 93.6% in the quarter ending April 1, 2000 (source: Corporate Results, June 2000).

The audit team found that there was effective supervision taking place in the community and that offenders are making the successful reintegration from the institution to the community, even if some cases were not compliant with the required number of face to face contacts. The cases are being supervised and the risk is being managed.

However, since this audit dealt primarily with compliance with the required frequency of face-to-face contacts with conditionally released offenders, it is this aspect of offender supervision that is reported on in the overall report.

In his audit report on the Supervision of Released Offenders in 1994, the Auditor General brought attention to shortcomings in direct supervision across the country, especially in the area of face-to-face contacts with conditionally released offenders. In 1998, CSC conducted its own nation-wide review and found there were problems in achieving the frequency of contact with offenders required by policy. According to that CSC study, the percentage of offenders who were not contacted with the required frequency varied from eleven (11) percent to thirty-eight (38) percent across the five regions.

As part of his audit on the Reintegration of Offenders in 1999, the Auditor General reviewed the audit results and recommendations of his 1994 audit on the Supervision of Released Offenders. He reiterated his concern from 1994 that face to face contacts with conditionally released offenders are not being achieved according to CSC policy, and recommended that "CSC should ensure adherence to the standards for frequency of contact with offenders required by its policy ...".

During the 1999 review, the Auditor General's staff selected a random sample of one hundred and fifty offenders under community supervision who required close supervision or the most frequent contact with parole officers. According to the Auditor General's report, the percentage of offenders in the sample who did not meet the required frequency of contact ranged from ten (10) percent to twenty (20) percent. This was actually lower than the figures reported in the CSC study in 1998. It should be noted that the Auditor General's staff completed this review from OMS, and only reviewed the casework record fields, rather than reading the entire entry.

With these different reviews and overall numbers for reference, the Performance Assurance Sector's national Frequency of Contact audit was conducted in late 1999. To obtain maximum benefit from the review, it was conducted in two (2) phases. The first phase was an Offender Management System (OMS) review of casework records, covering approximately 10% of cases (to a maximum of 25) from each parole office across the country, for a six-month period (March 1 to August 31, 1999). The review was completed between September and December 1999; therefore, any casework records entered after mid-September 1999, when the cases were extracted from OMS for review, were not included in the results.

To test the Service's ability to use electronic files, especially as they relate to the supervision of offenders, and as a management review tool, the first phase of the Frequency of Contact audit was completed solely from the Offender Management System (OMS). No field site visits were made during the audit. Since the early nineteen-nineties, the Offender Management System (OMS) has become the principal offender file. This sole use of the electronic file for all offender reports and contacts has become more evident in the past couple of years, as the final releases of OMS have occurred, and since a directive was issued to use OMS for all casework records, maintaining no hard copies on file. According to the Directive: "the only time that a PO should need to print casework records is when they are needed for sharing with someone who does not have access to OMS."

A total of seven hundred and sixty-five (765) cases were reviewed during the audit and all types of conditional releases were randomly selected for review.

The second phase of the audit involved requesting and receiving input from the parole offices that had been audited. The offices were given the opportunity to explain deficiencies, and the information received from the offices was compared to the initial results; where necessary, the results were re-examined. The results of both phases of the audit are consolidated in this final report.

The major issue identified in the audit was the extent to which the Frequency of Contact standards appeared not to have been met, based on a review of casework records in OMS. Because the audit was conducted solely from OMS, it was difficult to determine whether the non-compliance resulted from the contacts not taking place or from the information about the contacts not being entered into the System. Responses from the parole offices indicated the reasons were evenly split between contacts not having been made and contacts not having been entered into the Offender Management System.

Input from the parole offices during the second phase did improve the statistics, especially in the Quebec region, where large numbers of casework records had not been entered into OMS. Overall, however, after the second phase was completed, the majority of offices still had a compliance rate of less than seventy-five percent.

While a large number of cases were in non-compliance because of late entries into the Offender Management System, the reasons that offices did not meet the Supervision Standards were varied. One difficulty lies in the varying interpretations of frequency of contact calculations, which caused some offices not to be in compliance (for example, for level A-4, if contacts were counted on a weekly basis rather than four times a month, and for level B-2, if contacts were counted every two weeks rather than twice a month). Supervision in isolated settings caused some offices to be in non-compliance because the audit assessed face-to-face contacts only, and many cases in remote areas are supervised by phone, or not supervised until the offenders return to their home surroundings. According to some office responses, exceptions were granted by District Directors; however, except for a couple of cases in the Quebec region, the audit team found no such exceptions recorded in OMS.

In addition to the non-compliance with the FOC, other findings from the audit included:

  • inaccuracies and a lack of significant information in casework entries,
  • delayed changes to frequency levels in the Correctional Plan Progress Reports,
  • inconsistency between the fields and narratives in the Correctional Plan Progress Reports and the casework records, and
  • lack of collateral contacts.

A positive finding from the audit was that the requirement that the majority of contacts be made in the community (outside of an office setting) is generally being met.

Based on the number of months that were in compliance, as compiled from OMS, the total national rate of compliance was approximately 70% (refer to Table 2 for further details). The numbers moved upward slightly after the second phase of the audit was complete.

While the Auditor General's audit, CSC's 1998 review and the current audit all found that 100% compliance with the frequency of contact requirements is not being achieved, it should be noted that, owing to differences in methodologies and approaches, a direct numerical comparison of the results of these audits should be avoided. However, since the Offender Management System is now the official offender file, audits of the System will be conducted periodically by management and by outside agencies. It is therefore necessary that entries be made into OMS regarding all contacts with offenders, and that these entries be current. Until entries into OMS are made in a timely and complete manner, the utility of OMS as a management monitoring tool and a control mechanism will be limited. The responses received from the parole offices indicate that action has been taken in this regard to ensure that OMS accurately reflects all contacts with each offender while under supervision.

All responses from the audited parole offices were taken into account in the final report, and in the findings and recommendation. Only one recommendation is made; it is directed to the Correctional Operations and Programs Sector at NHQ. The recommendation states that "the policies and procedures regarding Community Supervision be amended to include:

a) a timeframe for casework record entries in OMS;
b) a clearer definition of the timeframe for initial interviews;
c) exceptions as they relate to rural/remote supervision;
d) a clearer interpretation of the requirement to have the majority (more than 50%) of contacts in the community;
e) a clarification of contact requirements for both level A and level B frequency of contact; and
f) a consistent method of recording the level of frequency of contact.

An action plan has been developed to address the recommendation and can be found on page 41 of this report.

INTRODUCTION


Standard Operating Procedure (SOP) 700-06, entitled "Community Supervision", outlines the requirements for determining the minimum number of face-to-face contacts required to appropriately manage the risk of conditionally released offenders in the community. There are five possible levels at which an offender can be supervised:

LEVELS OF SUPERVISION - FREQUENCY OF CONTACT
Level A: Four (4) face-to-face contacts per month
Level B: Two (2) face-to-face contacts per month
Level C: One (1) face-to-face contact per month
Level D: One (1) contact every two (2) months
Level E: A minimum of one (1) face-to-face contact every three (3) months

These levels of supervision have been set as the minimum requirement needed to appropriately manage the offender's risk. Therefore, the use of the word 'minimum' could be perceived or interpreted as meaning that the contacts should generally be exceeded in order to ensure proper supervision. In fact, according to the Supervision Standards, the parole officers are encouraged to see the offender more often than the minimum requirements. Paragraph 43 of SOP 700-06 states that "parole officers will determine an appropriate frequency of contact that is equal to or greater than the minimum and is based on the relevant factors and the parole officer's professional judgement".

This audit on the Frequency of Contact (FOC) assessed compliance with the three (3) highest levels of supervision, since these are the levels of supervision necessary to manage the offenders posing the highest risk in the community. These higher levels of supervision also account for the majority of offenders under supervision in the community.

The first phase of the audit was completed in its entirety from the Offender Management System (OMS). This phase of the audit determined whether paragraphs 76-77 of the SOP were met; that is, whether all casework records were entered into OMS and whether each record provided information on the particular contacts with the offender. The SOP does not specify any standards for the timeframe in which casework records are to be entered into OMS. However, the audit expectation was that records would be entered into OMS as soon as practicable after contacts with offenders (no more than one month later) to enable OMS to serve as an effective management control and monitoring system.

In addition to the frequency of contacts with offenders, the audit team reviewed casework records to determine the number of contacts that took place in the community and the number and type of collateral contacts.

Paragraphs 57-62 of SOP 700-06 require that "the majority of contacts with the offender take place in the community". The casework records were reviewed to verify that fifty percent or more of the face-to-face contacts with the offenders took place in the community, rather than an office setting.

The audit also assessed compliance with paragraphs 64-66 of the SOP on Community Supervision, which indicate that parole officers "must establish a network of community contacts to corroborate information provided by the offender". These contacts are to include, among others, program deliverers, employers and members of offenders' families.

The sections of SOP 700-06 relevant to this audit also outline the process and the documentation required to assess and/or re-assess the frequency of contact necessary to manage risk in the community.

The review of the frequency of contact with conditionally released offenders was conducted as part of the National Performance Assurance annual audit schedule for fiscal year 1999-2000. It was also completed in response to recommendations made by the Auditor General in his latest Report to Parliament on the Reintegration of Offenders. In that Report, the Auditor General presented specific findings on the failure to meet minimum standards for face-to-face contacts with offenders in the community, and made a recommendation to the Service on the subject. This Frequency of Contact audit is one of the corrective actions CSC is taking in response to the Auditor General's recommendation.

For this FOC audit, casework records on conditionally released offenders were reviewed in all seventy-four (74) parole offices and sub-offices across the country. The Intensive Supervision Units in Ontario and Quebec regions were also reviewed. In total, the audit team reviewed and reported on seven hundred and sixty-five (765) cases, in addition to fifteen (15) team/intensive supervision cases, for the time period of March 1 to August 31, 1999. Although additional cases were reviewed, they were not included in the final report sample because they did not meet the required sample criteria, i.e. they were suspended or revoked during the timeframe, moved to another parole office, etc. Only those offenders who remained at the same parole office for the 6 months reviewed are included in this report.

Since the review was to be completed primarily from OMS, the audit was conducted in two (2) phases. Phase 1 involved the OMS review and the related findings. This phase enabled the audit team to extract all casework records from the database for the relevant six months and examine them for compliance with the supervision standards. Phase 2 involved consultation with the field and the review of responses received from each parole office. For this phase, the audit team extracted all non-compliant cases found during Phase 1 and forwarded them to the appropriate District Director in each region for information and comments. The districts were allotted time to review the cases and return the results and comments to the audit team. This final report includes a compilation of the results from the two phases.

The following table illustrates the number of cases reviewed in each parole office and the total number of cases reviewed per region.


TABLE 1 - Number of cases reviewed in each Parole Office

Parole Office Number of Cases Reviewed
Atlantic Region
Bathurst 4
Charlottetown 9
Corner Brook 3
Dartmouth 9
Fredericton 6
Grand Falls 2
Grand-Sault 3
Halifax 11
Happy Valley 1
Kentville 8
Moncton 14
Saint John 10
St. John's 9
Sydney 6
Truro 8
Total (Atlantic) 103
Quebec Region
Chicoutimi 5
Estrie 8
Granby 9
Hull 8
Lafontaine 19
(+5 team/intensive supervision)
Lanaudiere 11
(+2 team/intensive supervision)
Langelier 25
Laurentian 10
(+1 team/intensive supervision)
Laval 11
(+2 team/intensive supervision)
Longueuil 20
Quebec 20
Rimouski 4
Rouyn-Noranda 4
Trois Rivieres 10
Ville Marie 25
Total (Quebec) 189
(+10 team/intensive supervision)

Ontario Region
Barrie 6
Brantford 5
Guelph 13
Hamilton 17
Kingston 19
London 12
Muskoka 2
Nunavut 2
Ottawa 22
Peterborough 9
Peel 15
St. Catherines 7
Sault Ste. Marie 2
Sudbury 8
Timmins 1
Toronto Downtown 21
Toronto East 25
Toronto Team Supervision 5 team/intensive supervision
Toronto West 25
Windsor 9
Women's Supervision Unit 10
Total (Ontario) 230
(+ 5 team/intensive supervision)
Prairie Region
Brandon 3
Calgary 25
Drumheller 2
Edmonton 25
Lethbridge 3
Medicine Hat 2
Northwest Territories 3
Prince Albert 16
Red Deer 6
Regina 14
Saskatoon 11
Thunder Bay 3
Thompson 2
Winnipeg 24
Total (Prairie) 139
Pacific Region
Abbotsford 14
Chilliwack 3
Kamloops 4
Kelowna 4
Nanaimo 5
New Westminster 21
Prince George 11
Vancouver 24
Vernon 4
Victoria 14
Total (Pacific) 104
TOTAL 765 + 15 intensive supervision

 

METHODOLOGY

The majority of the audit was conducted using a straightforward numerical approach, allowing for the compilation and comparison of compliance results. However, the content of all casework records for each offender was also reviewed for the six-month period.

The objective was to determine the number of cases in full compliance for the six-month review period. Full compliance was achieved when all of the required face-to-face contacts were satisfactorily completed for each of the six months reviewed. If the frequency of contact (FOC) was set at level A (A-4), the casework records were required to indicate that the offender was seen at least once every week in the month. If the casework record indicated that two (2) face-to-face contacts were made during a single week, only one visit was counted towards the requirement, and the other visit was counted as an "extra". A sketch illustrating this scoring approach follows the list below.

The results were calculated in the following manner:

a) minimum number of contacts required in the period reviewed (the FOC);
b) number of face-to-face contacts that took place ("extras" were counted separately);
c) minimum number of community contacts required (50% of contacts);
d) number of contacts that took place in the community ("extras" again counted separately);
e) number of collateral contacts; and
f) number of compliant months out of six.
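
As an illustration only, the following minimal sketch (in Python; the function name and sample dates are hypothetical and not part of the audit tooling) shows how a single month of an A-4 case could be scored under this approach, counting at most one face-to-face contact per week toward the requirement and treating any additional contact in the same week as an "extra":

    from datetime import date

    def score_a4_month(contact_dates):
        # Score one month of an A-4 case: at most one contact per week counts
        # toward the requirement; further contacts in the same week are "extras".
        weeks_covered = set()
        extras = 0
        for d in contact_dates:
            week = d.isocalendar()[1]        # ISO week number of the contact
            if week in weeks_covered:
                extras += 1                  # second contact in the same week
            else:
                weeks_covered.add(week)
        compliant = len(weeks_covered) >= 4  # seen in at least four different weeks
        return compliant, extras

    # Hypothetical month with five recorded contacts, two falling in the same week
    march = [date(1999, 3, 2), date(1999, 3, 9), date(1999, 3, 11),
             date(1999, 3, 18), date(1999, 3, 25)]
    print(score_a4_month(march))             # (True, 1)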

The review involved extracting all casework records from OMS for the period of March 1, 1999 to August 31, 1999. Record entries for the six-month review period that were made in OMS after the week of September 13, 1999 were not included in the results. In addition to identifying contacts, the casework records were read to determine overall levels of accuracy and quality (e.g., duplication of records).

A selection of approximately ten percent (10%) of cases, to a maximum of twenty-five (25) cases, was reviewed from each parole office. An attempt was made to identify offenders released on supervision during January and February 1999; however, in order to obtain a sufficient sample size, cases of offenders released prior to these dates were also selected. Regardless of the offender's release date, the audit examined the same six-month period, from March to August 1999, for each case.

The FOC to be audited for each offender was determined by the most recent Correctional Plan Progress Report (CPPR) on OMS. Since offenders must be supervised at the A-4 level until the initial community CPPR is completed, cases with no CPPR completed since release, or with a CPPR completed later than the thirty-day requirement, were assessed at the A-4 level until the CPPR was completed.
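
Expressed as a minimal sketch (a hypothetical helper with hypothetical dates, not part of the audit tooling), the level a case was audited against at any point in the review period follows directly from this rule:

    from datetime import date

    def foc_level_to_audit(cppr_completed, cppr_level, as_of):
        # Level audited against on a given date: A-4 until the initial community
        # CPPR is completed, then the level set in the most recent CPPR.
        if cppr_completed is None or as_of < cppr_completed:
            return "A-4"              # default pending completion of the CPPR
        return cppr_level             # level confirmed by the most recent CPPR

    # Hypothetical case: CPPR completed late (March 15), setting level B-2
    print(foc_level_to_audit(date(1999, 3, 15), "B-2", as_of=date(1999, 3, 1)))  # A-4
    print(foc_level_to_audit(date(1999, 3, 15), "B-2", as_of=date(1999, 4, 1)))  # B-2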

PHASE ONE


OMS Review and Results

Offender Management System (OMS) records are considered to be the most up-to-date collective database of information on all offenders supervised by the Service. Therefore, it is important that all information in the database be available, accurate and current. When the availability and accuracy of the information in OMS on the frequency of contact for conditionally released offenders was reviewed for the purposes of this audit, it was found that the information was not always available and, when available, not always current. In addition, some information in OMS was not always accurate, such as incorrect information entered in a field, or information on one offender entered in the casework records of a different offender.

As the cases were reviewed in OMS, non-compliance with the frequency of contact requirements of the Supervision Standards was an area of concern. This was first observed in light of a lack of casework records registered in OMS. Since the first phase of the review was conducted solely in OMS, it was difficult to determine whether the non-compliance resulted from contacts with offenders not taking place, or from the information pertaining to the contacts not being entered into the system. If OMS is going to be the official file on all offenders, it is important that the information be entered promptly and correctly. This is especially true for casework records, where no hard copies exist, making the electronic record the official source of information on the day-to-day and week-to-week events surrounding the offender's supervision.

The following are the primary issues identified during the audit:

  • The frequency of contact (FOC) required by the Supervision Standards is not always met. The audit team found that in the majority of offices across the country a significant number of cases were not compliant with the supervision standards for frequency of contact. The number of non-compliant cases varied from region to region; however, in all regions the numbers were significant.
  • Casework records are not being entered into OMS in a timely fashion. The audit determined that parole officers were often late entering data from contacts with offenders into OMS. In many cases, casework records were not entered for several months. As a result, the utility of OMS as a management tool, and its ability to provide up-to-date information on an offender (including collateral contacts), is negated. Similarly, any assessment of compliance with FOC standards by an outside agency such as the Auditor General, relying strictly on OMS information, could lead to inaccurate findings.
  • Technical errors were observed in the entry of casework records into OMS. The overall accuracy of the information in the OMS entries on casework records also varied across the country. The problems ranged from technical errors in the location/type of contacts, to multiple entries in one record (i.e., collateral and offender contact combined), to all information from several contacts with the offender entered on a monthly basis (monthly casework records as opposed to individual entries). When this occurs, it is often difficult to discern which information belongs to which contact and when the actual contact took place during the month.
  • The content of the casework record entries varied in quality. The quality of the information in the casework records varied according to the parole officers who authored the record. In certain cases, the auditors found that the information in the casework records was short, terse, and did not really indicate the situation with the offender. In other cases, the casework records contained all the information needed to follow the case and determine the management of risk. In some offices, a template format was used to ensure that all parole officers addressed the significant issues.
  • The audit team found inconsistency between the information in the Correctional Plan Progress Reports (CPPRs) and the casework records. In some cases, the Correctional Plan Progress Report identified a set FOC level or a change in the FOC, but this was not evident in the casework records. Conversely, in some cases, casework records mentioned that the FOC had been modified, but there was no CPPR completed to support the modification. In addition, the audit team found that CPPRs are not always completed in a timely fashion, as required by the SOP.
  • Most cases were seen in the community as frequently as required by the standards. Although the audit team found that in some cases reviewed the offenders were not seen in the community fifty percent of the time, and in a few cases the community visits were almost nil, this was not the norm. In the majority of cases, the parole officers are seeing the offenders in the community over fifty percent of the time and meeting the Supervision Standard. It was noted, however, that some casework records did not clearly indicate whether the visit had taken place in the office or in the community; again, this was not the norm.
  • Many cases reviewed identified few or no collateral contacts. Some cases reviewed indicated that a wide range of contacts had been made with family, employers, counselors and police. In other cases, however, the auditors noted that only a few collateral contacts had been conducted. The quality of the collateral contacts was also in question in some cases; in many instances, the only collateral contacts that took place were in the presence of the offender.

General Audit Results (based on OMS review only)


The audit team noted that in some cases, parole officers had gone to exceptional lengths to ensure that the required contacts with offenders were met. In one difficult A-4 level case in the Toronto Downtown office, which was compliant in each of the six months reviewed, the parole officer made fifty-seven (57) face-to-face contacts with the offender and fifty-six (56) collateral contacts. In the Saskatoon office, another difficult case showed continuous contact by the parole officer; the casework records indicated good in-depth interviews and the case was well documented.

There were also cases that proved to be non-compliant with the required number of weekly or bi-weekly contacts, yet the parole officers maintained a significant number of face-to-face contacts, often more than the number required by the CPPR. One example of such a case was in the St. John's office. The case was compliant in five months out of six, but over those six months the offender was seen thirty-five (35) times, with thirty-eight (38) collateral contacts being made. It was a very difficult case to supervise, but a major effort was made to keep the offender in the community. Although this case is more extreme than others, there was evidence during the audit that parole officers across the country are going to extraordinary lengths to maintain contact with the offenders they supervise.

The audit identified two (2) parole offices (in different regions) that were in full compliance and met all the required FOC standards. However, the 10% sample size for each of these offices was only two (2) cases in each office. Other larger sample sizes, for offices with larger case loads, did come very close to being in full compliance, with only one (1) or two (2) cases not meeting the required FOC standards. Generally, reasons for the lack of full compliance included:

  • all records not entered into OMS,
  • casework records for the review period not entered into OMS until after the week of September 13 and therefore not reflected in the results, and
  • parole officers not meeting the FOC as determined by the CPPR.

The audit team found that there were several ways to interpret the data collected during the case reviews. For example, one way was to determine how many cases were fully compliant in all 6 months reviewed. This was true for 303 of the 765 cases reviewed (40%).

Another interpretation, which is used in this audit report, is based on the total number of months that were compliant and is presented in the following table. In calculating this result, the total number of months audited for each parole office was calculated by multiplying the number of cases reviewed by 6 (the number of months audited in each case). The number of months in compliance is then presented and given as a percentage of the total number of months audited. Overall, the audit team audited 4602 months, of which 3275 were compliant (71%).
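
As an illustration only (a minimal sketch using a hypothetical helper, not part of the audit tooling), the office-level or regional rate is simply the number of compliant months divided by six times the number of cases reviewed:

    def compliance_rate(cases_reviewed, compliant_months):
        # Months-based compliance rate: each case contributes six audited months.
        total_months = cases_reviewed * 6
        return compliant_months / total_months

    # Pacific region totals from Table 2: 104 cases, 412 compliant months
    print(f"{compliance_rate(104, 412):.0%}")   # 66%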

TABLE 2 - Results of OMS review of casework records

PAROLE OFFICE   Total # of months audited (= # cases x 6)   Total # of months compliant   % of months compliant
PACIFIC      
New Westminster 126 74 59%
Vancouver 144 94 65%
Vernon 24 16 67%
Kelowna 24 20 83%
Kamloops 24 14 58%
Chilliwack 18 10 56%
Abbotsford 84 57 68%
Nanaimo 30 12 40%
Victoria 84 60 71%
Prince George 66 55 83%
Pacific AVERAGE 624 412 66%
PRAIRIES      
Winnipeg 144 63 44%
Brandon 18 11 61%
Calgary 150 119 79%
Saskatoon 66 51 77%
Lethbridge 18 13 72%
North West Terr. 18 14 78%
Thunder Bay 18 14 78%
Red Deer 36 22 61%
Thompson 12 9 75%
Medicine Hat 12 10 83%
Drumheller 12 11 92%
Regina 84 77 92%
Prince Albert 96 36 41%
Edmonton 150 117 78%
Prairie AVERAGE 834 570 68%
ONTARIO      
St. Catherines 42 40 95%
Ottawa 132 88 67%
Peel 90 73 81%
Sudbury 48 37 77%
Barrie 36 26 72%
Nunavut 12 0 0%
Muskoka 12 12 100%
Peterborough 54 10 19%
Toronto East 150 116 77%
Sault Ste. Marie 12 8 67%
Toronto West 150 101 67%
Kingston 114 91 80%
Timmins 6 2 33%
Hamilton 102 99 97%
Guelph 78 53 68%
Toronto Downtown 126 117 93%
Women's Sup. Unit 60 54 90%
Brantford 30 21 70%
Windsor 54 42 78%
London 72 48 67%
Ontario AVERAGE 1380 1038 75%
QUEBEC      
Lafontaine 114 33 29%
Estrie 48 32 67%
Granby 54 38 70%
Longueuil 126 108 86%
Ville Marie 150 48 32%
Quebec 126 101 80%
Trois Rivieres 60 44 73%
Rouyn-Noranda 24 17 71%
Hull 48 43 90%
Lanaudiere 66 54 82%
Laurentian 60 48 80%
Laval 66 54 82%
Chicoutimi 30 30 100%
Rimouski 24 24 100%
Langelier 150 46 31%
Quebec AVERAGE 1146 720 63%
ATLANTIC      
Grand Falls 12 10 83%
Grand Sault 18 17 94%
Truro 48 41 85%
Saint John 60 43 72%
Kentville 48 45 94%
Corner Brook 18 16 89%
Moncton 84 76 90%
Bathurst 24 23 96%
Happy Valley 6 1 17%
Sydney 36 35 97%
Dartmouth 54 42 78%
Fredericton 36 34 94%
St. John's 54 43 80%
Charlottetown 54 52 96%
Halifax 66 57 86%
Atlantic AVERAGE 618 535 87%
NATIONAL AVERAGE 4602 3275 71%

Findings and Analysis of Results (Phase One)

As previously mentioned, there are several ways to interpret the data collected during the audit. This was evident from the various discussions that the audit team had with staff at various levels when setting up and conducting the audit, as well as from the responses received from the field in Phase 2. One comment received stated that there were "Some concerns with the method of reporting results, e.g. 'compliant 3 months out of 6.' The difficulty is that it would be very easy to calculate a compliance percentage of ½ from this example. Take a case where a level A offender is seen 4 times for 3 months and 3 times for each of the remaining 3 months, for a total of 21 contacts. The minimum number of required contacts for that period is 24. One might reasonably argue that the compliance rate is actually 88% - the 'score' of 50% would clearly be misunderstood by the general public."
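
The two scoring approaches at issue in this comment can be made explicit with a short sketch (hypothetical helpers for illustration only; the simplified monthly check below ignores the weekly-spread rule described in the Methodology section): counting compliant months gives 3 out of 6 (50%), while counting contacts made against contacts required gives 21 out of 24 (approximately 88%).

    def month_based(monthly_contacts, required_per_month):
        # Proportion of months in which the required number of contacts was met.
        compliant = sum(1 for n in monthly_contacts if n >= required_per_month)
        return compliant / len(monthly_contacts)

    def contact_based(monthly_contacts, required_per_month):
        # Contacts made as a proportion of contacts required over the period.
        required = required_per_month * len(monthly_contacts)
        return sum(monthly_contacts) / required

    example = [4, 4, 4, 3, 3, 3]          # the level A case from the comment above
    print(month_based(example, 4))        # 0.5   -> "compliant 3 months out of 6"
    print(contact_based(example, 4))      # 0.875 -> roughly 88%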

However, regardless of the way in which the results are presented, 100% compliance to minimum frequency of contact requirements is not being achieved. As noted in the executive summary, however, the measure of face-to-face contacts is only one aspect of effective community supervision. In reviewing the cases, the audit team found evidence that effective supervision is taking place in the community and that offenders are making the successful reintegration from the institution to the community, even if some cases are not compliant with the required number of face to face contacts. The cases are being supervised and the risk is being managed.

The following sections elaborate on the findings of the audit team in relation to the OMS audit results.


Finding #1: Casework records are not being entered into OMS in a timely fashion.


It was noted in some cases, particularly in the Quebec region, and in some indirect supervision cases across the country, that no casework records had been entered into OMS. However, this situation (no OMS casework records for the 6-month period) was found in less than 5% of the 765 cases reviewed. In these cases (and in others with sporadic records entered in OMS), information from parole offices would indicate that parole officers are making regular face-to-face contacts and keeping handwritten notes on the offender's file, sometimes inputting the information into OMS at a much later date. This was evident in casework records where information regarding contacts stated "refer to file", and in cases where there were indications that some parole officers consistently delay entering the information on their offender contacts into OMS. Noted delays ranged from one (1) day to three (3) months. An ideal situation would see information entered into OMS no later than forty-eight hours after a meeting takes place with an offender, although there is currently no clear policy in this regard.

In responses from the field in the second phase of the audit, several offices reported that as a result of the audit new procedures and timeframes for entering records into OMS had been put into place. One office stated that "this audit has alerted our District for the need to ensure that activity records are entered in a more timely manner. As a direct result, the Director will now direct all parole officers to routinely enter activity records within one (1) week of the contact taking place". A report from another office indicated that "the SOP is not clear as to when a casework record should be entered on OMS, it only indicates that it should be within a reasonable time period. It is now policy [in our office] that casework records must be completed within five (5) working days of seeing the client".

Finally, a third office noted that "Since [many POs and agencies in our District], produced "paper" Casework Records over the review period of March to August, the results received are not surprising. However, I can now tell you (after verifications in OMS and RADAR) that, since December, all POs have been entering their information in the OMS casework record screen. As well, [agencies] have been sending us information on diskette on all their cases since the beginning of January, so that we can now enter this information into OMS." [translation]


Finding #2: The information in the Correctional Plan Progress Reports (CPPRs) and the casework records is often inconsistent.


SOP 700-06 requires that parole officers examine the Correctional Plan Progress Report (CPPR) within thirty (30) days following the release of an offender, and confirm the intervention level required in the community, as well as the frequency of face-to-face contacts. Until the CPPR is updated, offenders are to be supervised at the A-4 level. The audit team identified cases where the CPPR was not updated within the thirty (30) day time period, and in some of these instances, the parole officers were not making the four (4) face-to-face contacts per month as required.

Section 49 of SOP 700-06 states that all components of Correctional Plan Progress Reports must be implemented in a timely fashion. The audit team noted that in some cases, when the casework records indicated a change in the FOC (sometimes the change was initiated by a case conference), the CPPRs were not being updated within a reasonable time period. This causes inconsistencies between the information in CPPRs and the information in the casework records. It was noted in some cases that two (2) months passed before changes were documented in a CPPR, even though the offender was being seen at the new FOC level. For the audit, these latter cases were scored against the most recent FOC, as indicated in the most recent CPPR.

Finding #3: The quality of the information in casework records varies from one Parole Officer to the other.


All casework records were read to assess the overall quality of their content, even though scores were not assigned for quality. Policy does not specifically state the type of information that should be included in casework records; however, the audit team looked for information relating to the offender's reintegration into the community, work, family and relationships, finances, risk and special conditions, and the level of detail on each of these items. Specifically, the audit team noted that casework records should contain information on the following:

  • the offender's initial reintegration into the community;
  • living conditions and family support;
  • proof of work;
  • the offender's capability of managing finances and budgeting money;
  • the identification of any vehicles owned by the offender, detailing car information and license plate number; and
  • as the supervision progresses, any indication of increased risk and how that risk was managed by the parole officer.

Some casework records read by the audit team were extremely detailed and provided a myriad of information for every face-to-face contact, enough information that the reader could follow the case and determine how the case was handled, and how any risk was managed. In some cases, a template format was used to ensure that the parole officer touched on each of the significant areas.

Although some cases were not compliant with the required number of face-to-face contacts for the review period, the audit team found that they were still supervised well, with the risk appropriately managed. This was particularly true for cases that had been designated at an A-4 level of supervision and maintained at that level for a lengthy period of time. As a minimum standard, seeing an offender four times a month for six to twelve months is very taxing, especially if the offender resides in a rural or remote area. Examples of cases that were managed well and well recorded were found in all regions. Of particular note were cases in:

  • Ontario region - Guelph office, Barrie office, Ottawa office, Toronto Downtown office, Sudbury office and the London office;
  • Prairie region - Edmonton office, Saskatoon office, Calgary office, and the Winnipeg office;
  • Atlantic region - Kentville office, Corner Brook office, Halifax office, Grand Sault office, St. John's office, and the Truro office;
  • Pacific region - Vancouver office, one case in Victoria, Kelowna office, Abbotsford office, Kamloops office, New Westminster office, Nanaimo office, and the Chilliwack office; and
  • Quebec region - Longueuil office, Rouyn-Noranda office, Quebec office and the Ville Marie office.

In some cases, however, regardless of the frequency of contact, only one or two sentences were available in the casework record regarding the contact, and this information did not convey anything of significance on the offender. There appeared to be no depth to interviews in some cases; the parole officer seemed to be simply meeting the requirement to interview the offender. The auditors noted these types of records were particularly prevalent in cases that were supervised at the C-1 level and were found in almost all offices across the country.

Finding #4: In many cases, the frequency of community contacts is meeting the requirements of the SOP.


Section 57 of SOP 700-06 states that the majority of contacts with offenders are to take place in the community. The audit team found that in many cases, almost all contacts were made in the community, through a combination of contacts at home, at work and in other locations. There were other cases where the majority of contacts were made in the office; overall, however, these were in the minority. It was noted in all regions that attempts were made to ensure that community contacts were made. The overall compliance with paragraphs 57-62 of Supervision Standard 700-06 was 64% across the Service. This illustrates that more than half of all offenders on conditional release are seen regularly in the community.

Note was also taken of offenders who were residents of Community Residential Facilities (CRFs). In the majority of those cases community contacts were made in the CRF. This does not necessarily meet the supervision standards. Although the CRF is essentially the offender's home, it is difficult for parole officers to determine progress made against the CPPR and the offender's ability to adapt to community living if they are not seen in other environs in the community. This is particularly true if the offender has family in the vicinity and will be returning to that family on release from the CRF, or if the offender is employed in the community.

Rural/Remote supervision of cases

It was noted during the audit that offenders living in remote areas, or travelling to remote areas for employment, present a particular challenge for supervision. According to policy, exceptions to supervision standards can be granted in certain circumstances, including supervising offenders in remote areas.

There were cases reviewed during the audit in which offenders were living and/or working far from their home communities, in remote areas of the country. The supervision in these cases was completed by telephone or not at all during the time away from home. If there were exceptions granted in those cases, the audit team did not find any notations indicating such on the casework records or in the CPPR, aside from a few cases in the Quebec region. If no exceptions were indicated in the casework records, the cases were listed as non-compliant for audit purposes. If exceptions to the supervision standards are granted by Area Directors, it should be noted in the casework records or the CPPR that such exceptions have been approved.

Additionally, it was noted that aboriginal offenders living in remote areas or on reserves and supervised by Band constables or other band members are making regular contact with their parole officers by telephone rather than in person. These cases were found to be non-compliant by the audit team because the contact with the parole officer was only by telephone. However, the cases appeared to meet the requirements for supervision under Section 84 of the Corrections and Conditional Release Act (CCRA). If arrangements had been made for Band supervision in those cases, no notations were found during the audit indicating that they were covered under Section 84. During the second phase of the audit, no explanation was offered by Districts regarding such cases, no mention was made of Section 84, and no reasons were given for those cases not being in compliance with supervision standards.

The following table indicates the number of community contacts recorded in OMS casework records between March 1, 1999 and August 31, 1999.

Again, there were a number of ways of interpreting the requirement for 50% of the contacts to be made in the community. For example, if a case was audited at the A-4 level, 24 contacts were required in the 6 months audited. If 20 contacts were made during the 6 months, with 11 of them being in the community, then two possible interpretations could result for community contacts:

a) if the interpretation is that 50% of the 24 contacts required were to be made in the community, then the case would be in non-compliance, as only 11 of the 12 were made. There would be non-compliance with both the level of FOC and the number of community contacts made.

b) if the interpretation is that 50% of the 20 contacts made were to be in the community, then the case would be in compliance for community contacts and would actually have made one "extra" above the 10. There would be non-compliance in this case with the level of FOC, but not with the number of community contacts.

As a result, the following table presents the information in two ways: calculated against 1) 50% of the number of required contacts in the community and 2) 50% of the number of contacts that were made by the parole officers.
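
The example above can be expressed as a minimal sketch (a hypothetical helper for illustration only, using the figures from the example): measured against the 24 required contacts the threshold is 12 community contacts, while measured against the 20 contacts actually made it is 10.

    import math

    def community_compliant(community_contacts, baseline):
        # At least half of the baseline number of contacts must be in the community.
        return community_contacts >= math.ceil(baseline / 2)

    required, made, in_community = 24, 20, 11   # A-4 example from the text
    print(community_compliant(in_community, required))  # False: 11 < 12
    print(community_compliant(in_community, made))      # True:  11 >= 10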


TABLE 3 - Community contacts

Parole Office # of cases reviewed # cases compliant (50% of required contacts) % # cases compliant (50% of contacts made) % Total # of community contacts recorded in OMS
PACIFIC            
New Westminster 21 14 67% 16 76% 197
Vancouver 24 9 38% 12 50% 146
Vernon 4 2 50% 2 50% 30
Kelowna 4 2 50% 1 25% 27
Kamloops 4 2 50% 3 75% 36
Chilliwack 3 2 67% 2 67% 31
Abbotsford 14 13 93% 14 100% 121
Nanaimo 5 3 60% 5 100% 39
Victoria 14 9 64% 9 64% 115
Prince George 11 10 90% 9 82% 101
Pacific TOTAL 104 66 63% 73 70% 843 (8.1 avg.)
PRAIRIES            
Winnipeg 24 15 58% 14 58% 95
Brandon 3 3 100% 3 100% 29
Calgary 25 17 69% 19 76% 165
Saskatoon 11 8 73% 6 55% 87
Lethbridge 3 1 33% 1 33% 17
North West Territories 3 1 33% 1 33% 32
Thunder Bay 3 3 100% 3 100% 22
Red Deer 6 2 33% 2 33% 31
Thompson 2 2 100% 2 100% 3
Medicine Hat 2 1 50% 1 50% 9
Drumheller 2 2 100% 2 100% 12
Regina 14 14 100% 13 93% 154
Prince Albert 16 4 25% 8 50% 105
Edmonton 25 15 60% 15 60% 223
Prairie TOTAL 139 87 63% 90 65% 984 (7.1 avg.)
ONTARIO            
St. Catherines 7 7 100% 7 100% 115
Ottawa 22 19 86% 20 90% 230
Peel 15 5 33% 5 33% 123
Sudbury 8 2 25% 1 13% 40
Barrie 6 5 83% 4 67% 56
Nunavut 2 2 100% 1 50% 1
Muskoka 2 2 100% 2 100% 7
Peterborough 9 8 89% 9 100% 32
Toronto East 25 12 48% 12 48% 123
Sault Ste. Marie 2 1 50% 1 50% 14
Toronto West 25 3 12% 3 12% 65
Kingston 19 14 74% 14 74% 198
Timmins 1 0 0% 1 100% 8
Hamilton 17 16 94% 16 94% 207
Guelph 13 13 100% 13 100% 146
Toronto Downtown 21 11 52% 6 29% 133
Women's Sup. Unit 10 5 50% 6 29% 133
Brantford 5 5 100% 5 100% 79
Windsor 9 7 88% 7 88% 69
London 12 4 33% 6 50% 76
Ontario TOTAL 230 141 61% 139 60% 1775 (7.7 avg.)
QUEBEC            
Lafontaine 19 6 32% 11 58% 125
Estrie 8 7 86% 8 100% 119
Granby 9 8 89% 8 89% 129
Longueuil 20 14 70% 14 70% 233
Ville Marie 25 8 32% 13 52% 136
Quebec 20 14 70% 14 70% 168
Trois Rivieres 10 8 80% 8 80% 133
Rouyn-Noranda 4 3 75% 3 75% 48
Hull 8 8 100% 7 86% 80
Lanaudiere 11 10 91% 9 82% 157
Laurentian 10 8 80% 10 100% 169
Laval 11 11 100% 11 100% 201
Chicoutimi 5 5 100% 5 100% 44
Rimouski 4 4 100% 5 100% 38
Langelier 25 8 32% 7 28% 74
Quebec TOTAL 189 122 65% 132 70% 1854 (9.8 avg.)
ATLANTIC            
Grand Falls 2 2 100% 2 100% 11
Grand Sault 3 3 100% 3 100% 29
Truro 8 6 75% 7 88% 92
Saint John 10 7 70% 8 80% 88
Kentville 8 8 100% 8 100% 116
Corner Brook 3 3 100% 3 100% 26
Moncton 14 12 86% 11 79% 105
Bathurst 4 4 100% 4 100% 41
Happy Valley 1 0 0% 0 0% 4
Sydney 6 5 83% 6 100% 42
Dartmouth 9 8 89% 8 89% 66
Fredericton 6 4 67% 3 50% 44
St. John's 9 7 78% 6 67% 93
Charlottetown 9 9 100% 9 100% 98
Halifax 11 7 64% 7 64% 75
Atlantic TOTAL 103 85 83% 85 83% 930 (9.0 avg.)
NATIONAL TOTAL 765 501 65% 519 68% 6386 (8.3 avg.)


Finding #5: Collateral contacts are not being made in all cases.


The number of collateral contacts made concerning the offenders was also part of the FOC audit. According to Section 65 of SOP 700-06, parole officers should maintain a network of collateral contacts that allows verification of the offender's place of residence, program participation, employment, and other factors relevant to the CPPR. Some Boards of Investigation raised the issue of collateral contacts during investigations of community incidents over the past year. Members of these Boards identified a lack of collateral contacts, or poorly conducted collateral contacts, as a concern in high-profile incidents involving conditionally released offenders.

In some cases regarding collateral contacts, the auditors found that the fields in OMS and the narrative regarding collateral contacts were inconsistent, occasionally stating that collateral contacts had occurred when in actuality they had not. Conversely, in other cases, collateral contacts were included in the narrative of offender contacts and, as a result, a cursory review of OMS would not reveal these contacts.

In many cases, the auditors determined that collateral contacts were not being carried out, especially with family members. If contacts were made, it was often with the offender's spouse who accompanied the offender to the office for the interview or was present in the home while the offender was being interviewed.

The auditors noted some cases where detailed interviews took place with family members and other significant people in the offenders' lives, but these were not the norm during the six-month period that was reviewed. It was determined during the audit that in up to twenty-four percent (24%) of the cases reviewed, no collateral contacts were completed during the March to August review period.

The following chart illustrates the average number of collateral contacts recorded in OMS casework records during the review period from March to the end of August 1999. Detailed information for each parole office follows the chart.

The national average of collateral contacts over the 6-month review period was 6.1 (which works out to approximately 1 collateral contact per month per case). The chart and table would indicate that many offices fall well below this average.

CHART 1 - Average number of collateral contacts per parole office


TABLE 4 - Collateral contacts

Parole Office # of cases reviewed Total # of collateral contacts Average # of collateral contacts per case over a 6-month period
PACIFIC REGION
New Westminster 21 73 3.5
Vancouver 24 102 4.3
Vernon 4 13 3.3
Kelowna 4 34 8.5
Kamloops 4 42 10.5
Chilliwack 3 7 2.3
Abbotsford 14 27 1.9
Nanaimo 5 24 4.8
Victoria 14 41 2.9
Prince George 11 121 11.0
Pacific TOTAL 4.7
PRAIRIE REGION
Winnipeg 24 66 5.8
Brandon 3 19 6.3
Calgary 25 150 6.0
Saskatoon 11 93 8.5
Lethbridge 3 37 12.3
North West Terr. 3 36 12.0
Thunder Bay 3 11 3.7
Red Deer 6 62 10.3
Thompson 2 4 2.0
Medicine Hat 2 10 5.0
Drumheller 2 9 4.5
Regina 14 161 11.5
Prince Albert 16 86 5.4
Edmonton 25 207 8.3
Prairie TOTAL 6.8
ONTARIO REGION
St. Catherines 7 27 3.9
Ottawa 22 188 8.5
Peel 15 135 9.0
Sudbury 8 48 6.0
Barrie 6 59 9.8
Nunavut 2 0 0
Muskoka 2 0 0
Peterborough 9 10 5.2
Toronto East 25 93 3.7
Sault Ste. Marie 2 1 0.5
Toronto West 25 166 6.6
Kingston 19 117 6.2
Timmins 1 8 8.0
Hamilton 17 238 14.0
Guelph 13 50 3.8
Toronto Downtown 21 244 11.6
Women's Sup. Unit 10 27 2.7
Brantford 5 38 7.6
Windsor 9 65 7.2
London 12 92 7.7
Ontario TOTAL 7.0
QUEBEC REGION
Lafontaine 19 45 2.0
Estrie 8 12 2.0
Granby 9 32 7.8
Longueuil 20 156 7.8
Ville Marie 25 59 2.4
Quebec 20 162 8.1
Trois Rivieres 10 95 9.5
Rouyn-Noranda 4 30 8.0
Hull 8 53 7.0
Lanaudiere 11 97 9.0
Laurentian 10 90 9.0
Laval 11 88 8.0
Chicoutimi 5 56 11.2
Rimouski 4 88 22.0
Langelier 25 24 1.0
Quebec TOTAL 5.8
ATLANTIC REGION
Grand Falls 2 5 2.5
Grand Sault 3 14 4.7
Truro 8 37 4.6
Saint John 10 99 9.9
Kentville 8 88 11.0
Corner Brook 3 28 9.3
Moncton 14 34 2.4
Bathurst 4 5 1.3
Happy Valley 1 16 16.0
Sydney 6 11 1.8
Dartmouth 9 54 6.0
Fredericton 6 21 3.5
St. John's 9 66 7.3
Charlottetown 9 28 3.1
Halifax 11 60 5.5
Atlantic TOTAL 5.5
NATIONAL TOTAL 765 4694 6.1

 

TEAM / INTENSIVE SUPERVISION


A small number of cases from the Team Supervision units in the Ontario and Quebec regions were assessed. These cases are supervised differently from regular supervision cases and are not included in any of the graphs or tables presented in the report.

Toronto Team Supervision

Three cases were reviewed for the unit: one for the same six-month period against which other offices were assessed, one over a five-month period, and one for only three months. All three cases were reviewed against a frequency of contact of two (2) face-to-face contacts per week, as required by the supervision unit. The audit found that two of the three cases reviewed were in compliance for the months reviewed. Of particular note in the case that was not in compliance, the majority of the required curfew checks were completed by telephone rather than through face-to-face contact.

Quebec intensive/team supervision

The following is a summary of the two types of intensive/team supervision programs in the Quebec region.

Intensive Supervision Program (PSI - Programme de supervision intensive)

The sample of cases randomly selected for the Lafontaine office included 5 intensive supervision cases (PSI). This program is described as a three-phase program. All contacts are the responsibility of the Parole Officer to whom the offender is assigned.

Phase I requires that face-to-face contacts with the offender are to take place eight times per month. The CPPR reflects this requirement. As part of the intensive supervision, curfew checks are required weekly by phone, and at least once per month in person. The curfew is set between midnight and 8:00 a.m.

Phase II requires the same number of face-to-face contacts as Phase I, but the curfew checks are removed.

Phase III reduces the number of face-to-face contacts to four times per month in preparation for the offender's return to regular supervision. Phase III differs from regular 4/month supervision only in the requirement that most of the contacts take place in the community (more than the minimum 50% normally targeted).

Of note is that the CPPR during the entire intensive supervision period would reflect a requirement for eight face-to-face contacts per month. This is not changed when the offender reaches Phase III of the program; rather, the CPPR is only updated once the offender passes into regular supervision.

The audit found that none of the five PSI cases were fully compliant for the 6 months reviewed. The overall proportion of compliant months was approximately 67% (20 months out of 30 reviewed). It was noted, however, that while the cases were technically in non-compliance, there was still evidence that the offenders were being closely supervised. The number of community contacts exceeded the minimum 50% required in three of the five cases; in one of these cases, almost all contacts were made in the community. The number of collateral contacts in each case ranged from 5 to 17 over the 6 months reviewed.

As a general observation, it was often difficult to determine from the casework records when the offender moved from one phase of the supervision program to the next (whether to a higher or lower phase). The fact that the CPPR is not changed from 8 times per month to 4 times per month until the offender leaves the program also contributes to confusion, given that the offender is actually supervised at 4 times per month during Phase III.


PSA Program (Programme de supervision accrue)

In the East/West District in the Quebec region, there is a program in place entitled the "Programme de supervision accrue (PSA)". In the random sample of cases selected for the Laval, Lanaudière and Laurentian parole offices, five cases in total were reviewed that were part of this program.

The primary difference between the intensive supervision and PSA programs is that only one Parole Officer is involved in the supervision of intensive supervision cases, whereas two are involved in PSA cases. In the PSA program, the Parole Officer to whom the case is assigned is responsible for the regular supervision of the offender, normally at a rate of four times per month; this is the frequency reflected in the offender's CPPR. An agreement is then arranged with the PSA agent to provide additional contacts with the offender, over and above the regular frequency of contact. Offenders in the PSA program may be supervised by a Parole Officer in the Laval, Lanaudière or Laurentian offices, while the PSA program itself is coordinated by the Laurentian office.

The additional supervision requirements for offenders in the PSA program vary on a case-by-case basis. The offender may be in the program for a short, medium or long period of time, and the requirements may be modified at any time, depending on the needs and risks of the case. The terms of the referral to the PSA program are initially laid out in a form (entitled "Référence PSA") which is signed by the referring Parole Officer and the PSA coordinator. This form provides information such as:

- tombstone information on the offender;
- information on the offender's release and special conditions;
- information on the type of PSA desired (short, medium or long-term);
- the start and finish dates;
- the reasons for the referral; and
- the parameters of the PSA supervision.

This form is completed when the offender is initially referred to the program, although it would appear that it is not updated as changes are made to the terms of the PSA supervision. The audit team reviewed the forms completed for the two cases from the Laval office. In both cases, many of the pertinent fields were left blank and it was not possible to determine the frequency and type of contacts requested by the offender's Parole Officer.

In some cases reviewed, information was found in the OMS casework records to indicate discussions with the PSA Parole Officer about the terms of the supervision (i.e., curfew checks will no longer be conducted) or when the period of PSA was terminated. In most cases, however, it was not possible to determine from the casework records the exact parameters of the PSA supervision, in particular the number of face-to-face contacts required. No information was found in the CPPRs with respect to the PSA program.

Consequently, the audit team was not able to determine the level of compliance with the terms of the PSA program. The results for the three offices in the rest of this report include these five cases, but reflect only the level of compliance by the supervising Parole Officer with the standard frequency of contact set out in the CPPR (i.e., A-4). The only other information collected was the number of face-to-face contacts recorded on OMS by the PSA Parole Officer in each case reviewed. Although there were also many phone contacts recorded on OMS by the PSA Parole Officer, these were not included in the audit results. (In each case, the audit examined casework records between 99/03/01 and 99/08/31.)

TABLE 5 - Face-to-face PSA contacts recorded in OMS

Case (office) | # of months the offender was on the PSA program | # of face-to-face contacts recorded on OMS by the PSA officer (in addition to contacts by the regular PO)
Laval | 3 | 4
Laval | 5 | 7
Laurentian | 6 | 8
Lanaudière | 6 | 5
Lanaudière | 3 | 5

These numbers would indicate that offenders are being seen face-to-face one to two times per month more than regular A-4 offenders. Additional phone contacts are also being made in these cases.
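As a rough check of this observation (an illustration only, not part of the audit methodology), the per-month rates implied by Table 5 can be computed directly from the figures above; the short Python sketch below does only that arithmetic.

```python
# Illustrative arithmetic only, using the figures from Table 5 above:
# additional face-to-face contacts per month made by the PSA officer in each case.
cases = [
    ("Laval", 3, 4),
    ("Laval", 5, 7),
    ("Laurentian", 6, 8),
    ("Lanaudière", 6, 5),
    ("Lanaudière", 3, 5),
]

for office, months_on_psa, extra_contacts in cases:
    rate = extra_contacts / months_on_psa
    print(f"{office}: {rate:.1f} extra face-to-face contacts per month")

# The rates fall between roughly 0.8 and 1.7 per month, consistent with the
# "one to two times per month" observation above.
```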

PHASE II


FIELD CONSULTATION AND RESULTS

This audit was unique in two ways: a) it was initially conducted entirely from the Offender Management System (OMS), and b) direct replies were solicited from the districts and the parole offices on the results obtained from OMS. At the completion of the OMS portion of the Frequency of Contact audit, all cases not in compliance for the six-month review period were listed for each office and forwarded to District Directors for review and response. If there was a difference between the audit findings and the responses from the field, the findings were rechecked by the audit team. In some cases, the audit team changed the results; in others, the initial audit findings remained the same. Where it was reconfirmed that the original audit findings were correct, the appropriate office was notified.

The contact with the field offices included consultation with, and responses from, all parole offices on cases that were non-compliant for part or all of the relevant six months. This consultation was an opportunity to obtain rationale and feedback on cases that did not meet the frequency of contact standard, and to actively engage the offices in the audit process. The response from all districts and parole offices was exceptional: all of the offices provided rationale for missed contacts, and many raised concerns regarding policy issues.

The following table indicates the audit results, by office, after consultation with all parole offices and taking into account the responses from each office.



TABLE 6 - Results of responses received from Districts

OFFICE (and number of cases sent out for comments) | # Cases where the parole office agreed that one or more contacts were not made | # Cases changed as a result of the information provided | # Cases where the parole office indicated the contacts were made but were not in OMS at the time of the audit | Other (including cases where the parole office felt that non-compliance was justified, or cases where information explained some of the missing contacts but not all)
PACIFIC | | | |
New Westminster (19) | 7 | - | 5 | 7
Vancouver (17) | 2 | - | 11 | 4
Vernon (3) | - | 1 | - | 2
Kelowna (2) | 1 | - | - | 1
Kamloops (4) | 1 | - | 1 | 2
Chilliwack (3) | - | - | - | No response received.
Abbotsford (13) | 5 | - | 1 | No comments provided by office for 7 cases
Nanaimo (4) | 3 | - | - | 1
Victoria (10) | 4 | - | 1 | 5
Prince George (8) | 8 | - | - | -
Pacific TOTAL (83) | 31 | 1 | 19 | 29 (3 no response)
PRAIRIE | | | |
Winnipeg (14) | 8 | - | 1 | 4 (and in one case, no comments were provided)
Brandon (3) | 2 | - | - | 1
Calgary (17) | 9 | - | 4 | 4 (in one case, the FOC was changed by the PO but not reflected in the CPPR)
Saskatoon (7) | 7 | - | - | -
Lethbridge (3) | - | - | 3 | -
North West Territories (2) | 1 | - | - | 1
Thunder Bay (2) | 2 | - | - | -
Red Deer (4) | 2 | - | 1 | 1 (in remote area)
Thompson (1) | - | - | - | 1 (still missing some contacts)
Medicine Hat (2) | - | - | - | No response received.
Drumheller (1) | - | - | 1 | -
Regina (4) | 2 | - | 2 | -
Prince Albert (14) | 14 | - | - | -
Edmonton (14) | 9 | - | 4 | 1
Prairie TOTAL (88) | 56 | 0 | 16 | 14 (+2 no response)
ONTARIO | | | |
St. Catherines (2) | - | - | 2 | -
Ottawa (15) | 3 | 1 | 7 | 4
Peel (5) | - | - | 5 | -
Sudbury (5) * | - | - | - | -
Barrie (6) | 3 | - | 2 | 1
Nunavut (2) | - | - | - | No response received.
Muskoka (0) | - | - | - | -
Peterborough (6) | 1 | - | 5 | -
Toronto East (12) | 1 | - | 4 | 7
Sault Ste. Marie (1) * | - | - | - | -
Toronto West (14) | 3 | - | 10 | 1 (in custody from Jan. to Mar. 6/99, but audit from Mar. to Aug. 99)
Kingston (12) | 3 | - | - | 9
Timmins (1) * | - | - | - | -
Hamilton (1) | - | - | 1 | -
Guelph (10) | 6 | - | 2 | 2 (1 case suspended but no indication in OMS)
Toronto Downtown (6) | 1 | - | 4 | 1 (1 contact in OMS for July, but FOC is B-2)
Women's Supervision Unit (5) | 5 | - | - | -
Brantford (3) | 2 | - | 1 | -
Windsor (6) | 1 | 1 | 2 | 2
London (9) | 4 | - | 5 | -
Ontario TOTAL (121) | 33 | 2 | 50 | 27 (+2 no response)
*** The information for three offices (Sudbury, Timmins and Sault Ste. Marie) - 7 cases - was combined into one response, therefore it was not possible to distinguish the results for each office. ***
QUEBEC | | | |
Trois-Rivières (3) | - | - | - | 3
Lanaudière (7) | 3 | - | 3 | 1
Rouyn-Noranda (1) | - | - | - | 1
Laval (4) | - | - | 4 | -
Hull (2) | 2 | - | - | -
Chicoutimi (1) | - | - | - | 1 (gap between contacts is too long)
Rimouski (0) | - | - | - | -
Quebec (13) | 4 | 1 | 5 | 3 (one case still missing one contact)
Laurentian (6) | 4 | 2 | - | -
Lafontaine (18) | - | - | 18 | -
Granby (6) | 1 | - | 3 | 2 (in one case, gap between contacts is too long)
Langelier (22) | 5 | - | 17 | -
Estrie (6) | 3 | 1 | 1 | 1
Longueuil (7) | 3 | - | 2 | 2
Ville Marie (22) | 8 | - | 14 | -
Quebec TOTAL (118) | 33 | 4 | 67 | 14
ATLANTIC | | | |
Grand Falls (2) | 1 | - | - | 1
Grand Sault (1) | 1 | - | - | -
Truro (5) | 2 | - | 3 | -
Saint John (8) | 2 | - | 5 | 1 (Exemption for rural supervision but no indication in OMS)
Kentville (3) | 1 | - | - | 1 (and one supervised by Truro office when non-compliant in May)
Corner Brook (1) | - | - | 1 | -
Moncton (5) | 5 | - | - | -
Bathurst (1) | 1 | - | - | -
Happy Valley (1) | - | - | - | 1
Sydney (1) | 1 | - | - | -
Dartmouth (8) | 4 | - | - | 4
Fredericton (2) | 1 | - | 1 | -
St. John's (6) | 5 | - | 1 | -
Charlottetown (2) | 2 | - | - | -
Halifax (8) | 4 | - | - | 4
Atlantic TOTAL (54) | 30 | 0 | 11 | 13
NATIONAL TOTAL (464) | 183 | 7 | 163 | 111


ANALYSIS OF RESPONSES RECEIVED


A range of responses was received from across the country. Most offices saw the exercise as an opportunity to discuss concerns regarding the perceived inflexibility of the Supervision Standards. Others reviewed the number of cases versus the number of parole officers available to supervise those cases. As one A/Area Director put it, "this has been a valuable 'exercise' for the field, as it heightens the profile of the need to adhere as closely as possible to established standards and the importance of doing the necessary OMS changes when an FOC changes". The responses are important considerations for policy making and for amendments to existing policy; some of the quotes received, and a summary of others, are therefore included in this report.

Particular concerns raised by the offices and districts included:

· maintaining face-to-face contacts in isolated settings and exceptions to maintaining the set level of contact;
· following policy regarding the initial contact with the offenders within 24 hours of release when the offenders live in a rural/remote area; and
· inputting the information regarding the contacts into OMS on a timely basis.

One of the prime concerns voiced in the responses, however, was how the number of contacts is calculated to determine compliance, both in this audit and in audits generally. According to information from the field, different audit teams and boards of investigation count the number of contacts required to meet the set frequency of contact level in different ways. Some regional audits allow a grace period of two to three days beyond the timeframe within which a case should have been seen in order to satisfy the level set by the CPPR. Other auditors, and staff conducting incident investigations, determine compliance in yet other ways.

This national audit assessed Level A-4 as requiring four (4) face-to-face contacts per month, at a rate of one contact each week. However, if there were five (5) weeks in a month during the review period, the auditors still audited at four (4) contacts per month. For Level B-2, which requires two (2) contacts per month, the auditors determined that if two contacts were made in the same week, they were counted as one. If both of the required contacts were made before (or both after) the fifteenth of the month, with no contact after that until the next month, they were counted as one (1) contact for that month. There was some subjectivity applied by the auditors, but it was very limited: the auditors did count some cases as compliant for a month at the B-2 level if the offender was seen in the first week and the last week of the month, as long as the gap was not too extreme (e.g., the first and last day of the month).
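To make these counting rules concrete, the following is a minimal sketch of how the A-4 and B-2 tests described above could be expressed. It is an illustration only, not the audit team's actual method; the function names and the use of ISO week numbers to approximate "one contact each week" are assumptions made for the sketch, and it omits the limited subjectivity the auditors applied (such as the first-week/last-week allowance at B-2).

```python
# Minimal sketch of the counting rules described above -- an illustration only,
# not the audit team's actual method. Assumes contact dates are datetime.date
# objects and approximates "one contact each week" using ISO week numbers.
from datetime import date

def contacts_in_month(contact_dates, year, month):
    """Face-to-face contacts recorded for the given month."""
    return [d for d in contact_dates if d.year == year and d.month == month]

def compliant_a4(contact_dates, year, month):
    """A-4: four contacts in the month, roughly one per week.
    Even in a five-week month, only four contacts were required."""
    weeks = {d.isocalendar()[1] for d in contacts_in_month(contact_dates, year, month)}
    return len(weeks) >= 4

def compliant_b2(contact_dates, year, month):
    """B-2: two contacts in the month. Two contacts in the same week count as one,
    and two contacts both before (or both after) the 15th also count as one."""
    contacts = contacts_in_month(contact_dates, year, month)
    weeks = {d.isocalendar()[1] for d in contacts}
    halves = {d.day <= 15 for d in contacts}
    return len(weeks) >= 2 and len(halves) == 2

# Example: contacts on July 3 and July 20, 1999 satisfy B-2;
# contacts on July 3 and July 10 (both before the 15th) do not.
print(compliant_b2([date(1999, 7, 3), date(1999, 7, 20)], 1999, 7))  # True
print(compliant_b2([date(1999, 7, 3), date(1999, 7, 10)], 1999, 7))  # False
```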

Some responses from the field indicated that averaging contacts over time should be sufficient to meet compliance. That is, if the parole officer makes the required number of face-to-face contacts in some months, plus extras (depending on the risk and needs of the case), but misses the target in others, the case averages out to be compliant over time with the level set by the CPPR. One office responded that "some cases were seen the appropriate number of times if 'averaged' over a two month period, yet would not show up in the audit. For example, with an FOC of 4 times per month, a case was seen 3 times in August and then on the 1st of September, followed by a further 4 contacts in September." Another office reported that "one obvious answer to the problem would be to pro-rate the requirement based upon weeks available during the month".
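The pro-rating idea raised in these responses is not current policy. As a purely hypothetical illustration of the arithmetic the quote implies, the monthly requirement could be scaled by the portion of the month the offender was actually under supervision (days are used here for simplicity, although the quote speaks of weeks):

```python
# Hypothetical illustration of the pro-rating idea raised by the field -- not
# current policy and not the audit's method; days are used instead of weeks
# for simplicity.
import math

def prorated_requirement(monthly_requirement, days_supervised, days_in_month):
    """Scale the monthly FOC requirement by the fraction of the month the case
    was under supervision, rounding up to a whole number of contacts."""
    return math.ceil(monthly_requirement * days_supervised / days_in_month)

# An A-4 offender released on the 10th of a 31-day month (22 days supervised):
print(prorated_requirement(4, 22, 31))  # 3 contacts instead of 4
```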

The review period for the audit presented a problem for some offices, since part of it covered the vacation period that usually extends from June to September. This concern was more prominent in smaller offices with fewer parole officers. Some offices felt that this should have been taken into account during the review process: "... the impact the summer vacation has on the operational mandate. In small parole offices (4 or 5 parole officers), generally one-half of the staff are approved for vacation in the late June through July period, while the other half are scheduled for the late July through August period. There are generally no provisions to enable the hiring of extra staff and the parole officers must face the demand for 100% compliance while operating, at times, with only one-half the staff complement. The dilemma of managing workload when short staffed is a very serious operational issue."

Office management also felt, however, that "The vacation and sick leave coverage of absent parole officers must be approached with sufficient organization and administrative support to ensure community Supervision Standards are met".

Because only face-to-face contacts were counted, 'no shows' or cancelled appointments were not taken into account during the audit. This meant that although there was evidence in the casework records that particular parole officers made appointments and showed up for them, the contact was not counted if the offender was not available to be seen. This presented problems for some offices. It was felt that "the District has maintained a practice of not penalizing the parole officer for an offender's lack of compliance. This situation occurs when an appointment has been made to meet the offenders and the offender is a 'no show'. We do not sanction the parole officer if an offender tests positive on urinalysis testing and therefore why would we penalize the parole officer when an offender has breached his reporting requirements."

Supervising in Isolated Settings

As stated in Phase 1 of this report, no exceptions were made during the audit for cases in isolated settings. Since the audit was assessing face-to-face contacts only, any cases supervised by telephone were judged to be non-compliant. This presented problems for the parole offices and districts across the country when the non-compliant cases were forwarded for review. Many offices commented on the difficulty of meeting supervision standards in rural areas: "Good corrections would suggest that an offender be seen weekly if in level A, or every two weeks if in level B. However, the realities of rural supervision do not always allow this to happen. There is a need for some flexibility; the question becomes what flexibility is allowed to meet the Standard. Clarification is needed, especially as it applies to level B".

One office reported that "there are geographic issues that prevent the FOC from being maintained in some cases. Many of the offenders in the north access employment that takes them into remote areas, or they live in remote areas. Much of the work entails short notice when they leave and is often in isolated areas for time periods that go beyond what traditional FOC standards call for. I do believe that it is quite possible to develop alternative supervision strategies that would meet the principles of quality supervision such as more collateral contacts, police reporting or telephone contacts. The SOP currently provides the flexibility to provide exceptions to the FOC standards and we would hope that any future audits would take these sorts of situations into consideration before a finding of non-compliance was determined."

Another comment regarding rural supervision included "in some rural cases or isolated working situations it is not feasible (time and financial resources) to see them in strict adherence, particularly if there are collateral contacts and the situation leads the supervisor to believe the risk is well managed. The process of having a case conference and requesting an exemption from the established FOC by the Area Director was never used, but would have been very appropriate in a number of these cases. I believe the use of exemptions granted by the Area Director should increase."

Other offices also addressed the issue of 'exceptions', or written approval from the area or district director: "we have cases where rural living and isolated work make it impossible to maintain face-to-face contact with the required frequency. It is ironic that with the use of collateral contacts, these are cases in which we can often be very comfortable that we are on top of the situation. It is important that we be more conscientious in ensuring that the Area Director's formal written approval to deviate from standards under these circumstances is recorded on file."

Initial Contact

The concerns over supervision in rural areas extended to the requirement for initial contact with newly released offenders. The policy on initial interviews with conditionally released offenders states that "a face-to-face interview will be scheduled with the offender within one working day of his/her arrival at the release destination". Parole offices have interpreted this policy differently for offenders living within a short distance of the parole office and for offenders who, upon release, live a considerable distance from the office or in remote locations. The word 'scheduled' in the Standard is the operative word that determines when the initial interview with the offender takes place.

If offenders will be residing within a reasonable distance of the parole office (allowing them to present themselves at the office upon arrival at the release destination), they are seen within one working day of release. However, if the offenders will be residing in an area where parole officers have to travel each week to make contact, the initial interview is 'scheduled' for the first occasion after the offender's release on which the parole officer is in the area of the release destination.

This policy needs clarification to determine whether all offenders are to be seen within one working day of release, or whether the timing of the initial contact depends on the parole office's interpretation and the distance the offenders reside from the office.

Timely input of casework records into OMS

Delays in the input of casework records into the Offender Management System were the cause of the majority of the non-compliant cases, and the responses from the field offered both explanations and excuses for the overall results. In an adamant response, one manager stated that "in many cases the FOC is maintained, even though the casework records do not support it. I want to make this point forcefully because even though I understand the problem, the results of the audit are being presented as confirming FOC. What it actually measures is a combination of FOC as well as timely data entry." As previously noted, the audit did find that some of the non-compliant results of the OMS review are due to non-entry of the information into OMS. It is also observed that FOC is only one measure of supervision in the community.

The issue of resources within parole offices was raised as one reason for delays in inputting the necessary information into the System, and parole officers' lack of typing and computer skills was cited as another reason for non-compliance. The response from one office indicated that "timely entries of CWRs will remain an issue until there are sufficient resources in the community to meet FOCs and then to document these meetings." Another stated: "As expected, we found that there have been gaps in OMS entries. We saw that several files contained a handwritten Casework Record, but no corresponding entry in OMS. We should mention that in the past few years we have suffered from a considerable shortfall in FTEs. In fact, the recent study on community standards conducted by NHQ shows a deficit of 15 FTEs for our district. The lack of staff has translated into a measure allowing for handwritten notes, with the OMS entry to follow when possible." [translation] The issue of parole officer skills was summed up by one office's response, indicating that "Parole officers' primary skills often do not include typing and data entry. The margin of error with respect to these activities is not surprising as this group of employees takes on increasing computer related responsibilities."

Some offices, particularly in the Quebec region, are still not entering casework records relating to indirect supervision cases into OMS. This issue was raised in one response, which indicated that: "The preliminary results also reflect the adoption of a policy derived from NHQ's Security Division, which is not to allow the private sector access to the OMS network. Since the private sector alone is responsible for supervising 25% of our active cases (about 400 cases), quite a few of these ended up in the sample. Except for the ... pilot project, all other private agencies document their supervision by writing their notes in the paper file. This is particularly so in the Langelier sector since 74% of the cases there belong to the private sector." [translation]

The issue of delayed input into OMS of all information related to contact with released offenders, especially casework records, needs to be dealt with at the national level. Including in policy a set timeframe for the entry of casework records into OMS would benefit parole officers, who would then know the exact deadline for entries. It would also benefit managers conducting quality control exercises on the work of parole officers.

OVERALL FINDINGS AND RECOMMENDATION


While the overall compliance results indicate that FOC is not always being met, the audit team found that parole officers are striving to make the required contacts with the offenders on their caseloads. In addition, there was evidence in most cases that offenders were being supervised effectively in the community through various means, only one of which is face-to-face contact.

In summary, the audit team's findings, as they related to the OMS review of FOC, include:

A) The frequency of contact levels set out in the supervision standards are not always met.

B) Casework records are not always entered into OMS in a timely manner.

C) The information in the casework entries is sometimes not accurate and the entries often lack significant information.

D) Changes to frequency levels in Correctional Plan Progress Reports (CPPRs) are often not completed in a timely manner and the fields and narratives in CPPRs and casework records are sometimes inconsistent.

E) The requirement for contacts to be made in the community at least 50% of the time is generally being met.

F) Not all parole officers are completing collateral contacts.

The overall issues that were raised in the responses from the field formed part of the findings and recommendation. The ideas and comments from the field will also be presented to the policy division of Correctional Operations and Programs for comment and possible amendments to policy.

RECOMMENDATION #1

That the policies and procedures regarding Community Supervision be amended to include:

a) a recommended timeframe for the entry of casework records into OMS;

b) a clearer definition of the requirement for an initial interview;

c) the circumstances allowing authorized exceptions, as they relate to rural/remote supervision, as well as the documentation required;

d) a clearer interpretation of the requirement to have more than 50% of contacts in a community setting (i.e., whether this is to be measured against the required contacts or against the actual contacts made);

e) a clarification of the contact requirements for both level A and level B frequency of contact;

f) a clear and consistent method of recording the level of frequency of contact to facilitate monitoring.

Action by: Assistant Commissioner, Correctional Operations and Programs


Action plan:

1. We will issue a bulletin to all parole offices and supervision managers (including contract supervisors) concerning the importance of compliance with the policy as it exists today, along with information about what will be done regarding the policy review.

2. We will immediately start monitoring the casework records of all offices on a quarterly basis and will examine in detail the lowest-performing 25% of individual offices. ADCs will be asked to explain at ADC meetings the actions they have taken to improve the performance of these offices. As the number of offices that reach and maintain full compliance increases, the number of offices requiring review will decrease.

3. A change to the policy on FOC has implications not only for field services but also for the review of policy generally, the design of OMS (changes are needed to the way casework records are entered and to how the FOC schedule is linked to risk assessment tools, etc.), systems such as RADAR and, most importantly, the workload formula for the community. Therefore, we will review the standards to ensure that the frequency of contact, the way contacts are described, and which contacts "count" towards providing for the safe reintegration of offenders are more clearly spelled out. Any policy change will be timed to coincide with implementation of changes to the issues mentioned above (OMS, workload formula, etc.) but should be completed by 00.12.31.

Other comments received from different offices not included in the body of the report:

· A different kind of problem emerges from the wording of the FOC standard. When the standards were initially reviewed, some thought the requirements should be specified by week instead of by month. Others argued that the previous wording was acceptable but that clarification should be added to deal with potentially ambiguous situations. In the end the wording was left alone. However, there are at least three instances where counting face-to-face contacts becomes problematic. The first is the number of contacts required for a level-A offender released on the 10th of the month: a literal interpretation of the Standard would result in 4 contacts. Others are a release on the 29th of the month, or cases where a change of FOC is made mid-month. One obvious answer to the problem would be to pro-rate the requirement based upon the weeks available during the month.

· This audit has alerted our District to the need to ensure that activity records are entered in a more timely manner. As a direct result, the Director will now direct all parole officers to routinely enter activity records within one (1) week of the contact taking place. In the immediate term, a policy will be initiated whereby senior parole officers will routinely conduct a formal monthly review of a number of randomly chosen cases from the case lists of the parole officers they supervise. These cases will be checked for a number of supervision compliance issues, including adherence to frequency of contact requirements.

· I would like to point out how disconcerting it can be when we review an audit result that appears to have crunched numbers but did not allow for the quality of work that is actually being done. I do honestly believe that there must be intelligent methods of ensuring risk remains manageable in spite of contacts that may be missed for comprehensible reasons. The SOPs are supposed to allow parole officers to manage risk and not, because of worry over arbitrary audits, have to deal with other issues.

· The office has incorporated additional control measures as a result of the audit. While we cannot control workload and resource issues, we should be able to better deal with timely input of casework records on OMS. The office has a one (1) week timeframe for records to be available on OMS; Operation By-Pass does not give us a time-frame. Supervisors will conduct regular monthly reviews of a number of cases; this will alert them to associated issues.

· In this type of review one needs to read the full OMS entry, the recent CPPR and the paper file to know what contacts are being made and when. Also, timely entries of CWRs will remain an issue until there are sufficient resources in the community to meet FOCs and then to document these meetings.

· In some cases, it appears that the audit assumes a rate of contact higher than we are following. There might have been a problem in the switch from our "old" way of recording FOC in OMS to the "new" By-Pass way.

· These results provide a good illustration of our understanding of data entry in OMS. The exercise you conducted nonetheless shows us that there are still some problems in terms of adherence to supervision policies with regard to frequency of contacts and the alternation of meetings in the community. We are still of the opinion that the actual number is lower than the results might indicate. Still, it is obvious that this cannot be condoned. [translation]