Inspection Report No. OIG-INS-16-01-03
Review of the Data Accuracy of the Fiscal Year 1999 NLRB Annual Report

UNITED STATES GOVERNMENT
National Labor Relations Board
Office of Inspector General

Memorandum

September 25, 2001

To: The Board
General Counsel

From: Jane E. Altenhofen
Inspector General

Subject: Inspection Report No. OIG-INS-16-01-03: Review of the Data Accuracy of the Fiscal Year 1999 NLRB Annual Report.

The objective of this review was to determine whether the National Labor Relations Board (NLRB or Agency) Annual Report for Fiscal Year (FY) 1999 accurately presented the Agency's accomplishments. We reviewed both unfair labor practice cases (C-Cases) and representation cases (R-Cases).

We found that the numbers of cases received, cases closed, and cases pending reported in Table 1 of the Annual Report disagreed significantly with the databases from which the Annual Report was compiled. The Executive Secretary System was generally accurate, but we found errors in the originating document date for C-Cases and determined that the policy for recording information for certain R-Cases does not reflect the total time those cases were at the Board. The Casehandling Information Processing System (CHIPS), which was used to compile Regional Office casehandling statistics, also contained errors.

The FY 2000 Annual Report, which is not yet ready to be published, will be compiled primarily from data in the Case Activity Tracking System (CATS) rather than CHIPS. According to the Division of Operations-Management, all pending cases in CHIPS were transferred to CATS. As part of the data verification process, cases that should have been closed before FY 2000 were deleted from CATS and the data was entered into CHIPS. Therefore, the pending case statistics as of September 30, 1999, will be inconsistent with the October 1, 1999 figures. Operations-Management has not kept track of the number of cases deleted and therefore does not know whether the number is significant. The Division intends to acknowledge that any differences in numbers between years are a result of converting to a new data system.

SCOPE

We interviewed NLRB staff in the Information Technology Branch (ITB) and program offices to gain an understanding of the processes for collecting and compiling the casehandling statistics reported in the Annual Report. We recalculated Annual Report tables using databases provided by ITB. We also tested the accuracy of information in the databases from which the Annual Report was compiled by verifying the information against documentary evidence for the Executive Secretary System and for CHIPS data in four Regional Offices: Region 2 (Manhattan), Region 6 (Pittsburgh), Region 15 (New Orleans), and Region 26 (Memphis). This inspection was conducted between January and September 2001 in accordance with Quality Standards for Inspections.

BACKGROUND

NLRB is required by law to make a report to Congress and to the President summarizing significant case activities and operations each fiscal year. To fulfill this requirement, the Agency publishes an Annual Report that identifies and summarizes significant cases and presents charts and tables quantifying and summarizing the Agency's accomplishments. The FY 1999 Annual Report contains 17 charts and 40 tables.

Many of the tables and charts provide casehandling information generated by Regional Offices. Regional Office data in the FY 1999 Annual Report was generated by CHIPS, with the exception of four Regional Offices that used CATS for all or part of their data. Region 13 (Chicago) and Region 16 (Fort Worth) entered both C-Case and R-Case data only into CATS during FY 1999. Region 1 (Boston) and Region 9 (Cincinnati) entered only their R-Case data into CATS during FY 1999.

The FY 1999 Annual Report was sent to the Government Printing Office for publication on March 5, 2001. Officials in ITB and the Division of Operations-Management attribute the delay in report production to four Regional Offices switching from using CHIPS to CATS in FY 1999, while the rest of the Regional Offices continued to use CHIPS.

CALCULATION OF ANNUAL REPORT

The numbers of cases received, cases closed, and cases pending reported in Table 1 of the Annual Report disagreed significantly with the databases from which the Annual Report was compiled. Ten of the 17 charts and 14 of the 40 tables present information from Table 1 in different forms or levels of detail and therefore would also be inaccurate.

We obtained the CHIPS and CATS electronic databases from which the Annual Report was compiled, loaded them into audit software, and recalculated Table 1 of the Annual Report. We shared our calculations with officials from ITB. They confirmed that our calculations were logical but were unable to reconcile the differences, which are shown in the table below.

                                          Reported in FY 1999   Recalculated From
                                          Annual Report         Database          Difference
All cases
  Received fiscal 1999                    33,232                32,829            403
  Closed fiscal 1999                      35,806                33,162            2,644
  Pending September 30, 1999              32,056                31,204            852
Unfair Labor Practice Cases - "C"
  Received fiscal 1999                    27,450                28,005            -555
  Closed fiscal 1999                      29,741                28,317            1,424
  Pending September 30, 1999              29,815                29,420            395
Representation Cases - "R"
  Received fiscal 1999                    5,462                 4,509             953
  Closed fiscal 1999                      5,708                 4,536             1,172
  Pending September 30, 1999              2,004                 1,572             432
Union-shop Deauthorization Cases - "UD"
  Received fiscal 1999                    110                   110               0
  Closed fiscal 1999                      128                   116               12
  Pending September 30, 1999              56                    52                4
Amendment of Certification Cases - "AC"
  Received fiscal 1999                    15                    14                1
  Closed fiscal 1999                      12                    10                2
  Pending September 30, 1999              11                    11                0
Unit Clarification Cases - "UC"
  Received fiscal 1999                    195                   191               4
  Closed fiscal 1999                      217                   183               34
  Pending September 30, 1999              170                   149               21
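
The recalculation itself is a straightforward tally once the case-level data are in hand. The sketch below, written in Python rather than the audit software actually used, illustrates the general approach under assumed column names (date_filed, date_closed, case_type) and an assumed extract file; it is not the Agency's procedure and the field names are hypothetical.

    import csv
    from datetime import date

    FY_START, FY_END = date(1998, 10, 1), date(1999, 9, 30)

    def parse(value):
        """Parse a YYYY-MM-DD date string; blank fields become None."""
        return date.fromisoformat(value) if value else None

    def tally(rows):
        """Count cases received, closed, and pending for FY 1999."""
        received = closed = pending = 0
        for row in rows:
            filed = parse(row["date_filed"])       # hypothetical column name
            disposed = parse(row["date_closed"])   # hypothetical column name
            if filed and FY_START <= filed <= FY_END:
                received += 1
            if disposed and FY_START <= disposed <= FY_END:
                closed += 1
            if filed and filed <= FY_END and (disposed is None or disposed > FY_END):
                pending += 1
        return received, closed, pending

    # Group cases by type ("C", "R", "UD", "AC", "UC") and tally each group.
    by_type = {}
    with open("chips_cases.csv", newline="") as f:   # hypothetical extract file
        for row in csv.DictReader(f):
            by_type.setdefault(row["case_type"], []).append(row)

    for case_type, rows in sorted(by_type.items()):
        print(case_type, tally(rows))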

Agency Comments

The ITB Chief commented that data accuracy, completeness, timeliness, and consistency have always been a problem. The problems were compounded in FY 1999 because some Regions were using CATS and some were using CHIPS, making data coordination and consolidation extremely difficult and time-consuming. In addition, an on-going problem is that changes are constantly being made to previous data, including changes after the Annual Report tables are initially run. These are not just updates for the previous month, but often are changes that go back many months. The Agency objective should be to have one common database from which all charts and tables are prepared and to establish a time and date to "lock down" the official data for the end of the year. Once every office uses the same data at the same cut-off period, and once there is accuracy, completeness, timeliness, and consistency of the data, Agency reporting will improve. The CATS system is a major part of the solution, but the total solution must include process coordination among offices and data reliability among data owners.

EXECUTIVE SECRETARY SYSTEM

The Executive Secretary System produced the number of cases issued by the Board and various median day figures for C-Cases and R-Cases handled by the Board. The system was generally accurate, but we found errors in the originating document date for C-Cases and determined that the policy for recording information for certain R-Cases does not reflect the total time those cases were at the Board.

Unfair Labor Practice Cases

The Executive Secretary System contained 423 C-Cases. We selected a random sample of 40 cases and tested three data elements for each case: date of the originating document, date assigned to Board members, and date of Board decision. We identified five errors (12.5 percent) for the date of the originating document. In one instance the database did not contain an entry. The four other cases had discrepancies of 2, 3, 30, and 207 days. Generally, the date the Board Member was assigned and the date of the Board decision were accurate.

Representation Cases

The Executive Secretary System contained 268 R-Cases. We selected a random sample of 40 cases and tested three data elements for each case: date of originating document, date assigned, and date of Board decision. Generally, the data tested was accurate but we found that the policy for recording requests for review of Regional Director decisions does not reflect the total time the case was at the Board.

The Regional Director determines, on the basis of the record made at the hearing and the briefs of the parties, whether a question concerning representation exists and the appropriate bargaining unit. The Regional Director's decision must set forth findings of fact, conclusions of law, and a direction of election or order dismissing the petition. A request for review of the decision may be filed with the Board by any party. Review is granted only if substantial questions of law or policy are raised, if there is clear error on a substantial factual issue, if the conduct of the hearing was prejudicial, or if compelling reasons exist for reconsideration of an important Board rule or policy.

The Executive Secretary System does not include requests for review that have been denied in its casehandling statistics, unless the denial results in a published decision. For a request for review, the originating document is the decision granting the request, rather than the receipt of the request itself. In some instances, the request for review is granted simultaneously with the Board decision; thus the date of the originating document, the assignment date, and the date of the Board decision can all be the same. This results in zero days of processing from the date of the originating document and the date of assignment until the date of the Board's decision. In reality, though, the originating document was received and the case was assigned before the date the decision issued. A review of the Originating Document to Board Decision report generated by the Executive Secretary System showed that 21 cases had the same dates for the originating document and the Board decision. For these 21 cases, we identified the date the request for review was received by reviewing information in case files and querying the Office of Representation Appeals' database. We recalculated the median days of processing from the originating document to the Board decision as 144 days rather than the 125 days stated in Table 23.
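
The recalculation amounts to substituting the actual receipt date wherever the recorded originating document date equals the decision date and then taking the median of the resulting day counts. The sketch below illustrates this; the dates and the substitution table are purely illustrative, since the corrected receipt dates came from case files and the Office of Representation Appeals' database rather than from the Executive Secretary System itself.

    from datetime import date
    from statistics import median

    # (originating document date, Board decision date); values are illustrative.
    cases = [
        (date(1999, 1, 4), date(1999, 5, 28)),
        (date(1999, 3, 15), date(1999, 3, 15)),  # request for review granted with the decision
    ]

    # Actual receipt dates found in case files, keyed by case index (illustrative).
    receipt_dates = {1: date(1998, 11, 2)}

    spans = []
    for i, (originating, decided) in enumerate(cases):
        # Where the recorded originating document date equals the decision date,
        # substitute the date the request for review was actually received.
        start = receipt_dates.get(i, originating) if originating == decided else originating
        spans.append((decided - start).days)

    print("Median days, originating document to Board decision:", median(spans))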

A review of the Assignment to Board Decision report generated by the Executive Secretary System showed that 13 cases had the same dates for the assignment and the Board decision, and one additional case showed one day from the date of assignment to the Board decision. For these 14 cases, we identified the assignment date by reviewing information in case files and querying the Office of Representation Appeals' database. We recalculated the median days of processing from assignment to Board decision as 111 days rather than the 101 days stated in Table 23.

Ninety-five of the 268 R-Cases identified in the Executive Secretary System were requests for review. We did not recalculate the median days processing statistics using the actual assignment date or the date the request for review was received for each of the 95 cases, but doing so, rather than recording the decision granting the request for review as the originating document and assignment date, would further increase the median days.

Agency Comments

The Executive Secretary acknowledged that in these cases the statistics did not reflect the time the case was at the Board. He stated that this issue would be reviewed in the future, but noted that changing the methodology would cause the median days statistics to be inconsistent with prior periods.

REGIONAL OFFICES

A significant number of C-Cases and R-Cases identified as pending as of September 30, 1999, were actually closed in FY 1999 or prior periods. This understates the Agency's accomplishments for the period in which the case actually closed, and overstates the pending workload as of September 30, 1999. Data pertaining to cases closed during FY 1999 generally was accurate.

We selected four different samples in each of the four offices reviewed: C-Cases closed during FY 1999; C-Cases pending at September 30, 1999; R-Cases closed during FY 1999; and R-Cases pending as of September 30, 1999. For each sample, we either selected a random sample of 78 items or, if the universe contained fewer than 78 cases, tested all items in the universe. In some instances case files related to pending cases were unavailable for our review, and Regional Office representations, based primarily on a review of manual records such as case cards, served as the basis for our findings. We considered data provided through the date of the draft report. Case file data provided after that date was determined not to impact the overall findings and was not incorporated.
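
A minimal sketch of that sampling rule is shown below, assuming a fixed seed and illustrative case identifiers; the actual selection was made with audit software rather than this code.

    import random

    SAMPLE_SIZE = 78

    def select_sample(universe, seed=1999):
        """Test every case when the universe has 78 or fewer items;
        otherwise draw a random sample of 78."""
        items = list(universe)
        if len(items) <= SAMPLE_SIZE:
            return items
        return random.Random(seed).sample(items, SAMPLE_SIZE)

    # Region 15 had 37 pending R-Cases (all tested); Region 2 had 2,133
    # pending C-Cases (78 sampled).
    print(len(select_sample(range(37))))     # 37
    print(len(select_sample(range(2133))))   # 78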

Pending Case Data

Pending cases were reviewed solely to determine whether their status (pending) was reported accurately. We identified 78 C-Cases (25 percent of items tested) and 58 R-Cases (27 percent of items tested) that were inaccurately reported as pending as of September 30, 1999, but had actually been closed either during FY 1999 or in prior periods. This understates the Agency's accomplishments in the period in which the case actually closed and overstates the Agency's pending workload. This affects 5 of the 17 charts in Chapter 1 and 9 of the 40 tables in the Appendix of the Annual Report. Each of these charts and tables presents case disposition information. The tables below contain the results of our testing.

C-Cases Pending at September 30, 1999

                                    Manhattan      Pittsburgh     New Orleans    Memphis
                                    Region 2       Region 6       Region 15      Region 26
                                    No.     %      No.     %      No.     %      No.     %
Items in the universe               2,133          1,179          1,285          521
Items tested                        78      100    78      100    78      100    78      100
Incorrectly reported as pending     47      60     1       1      25      32     5       6
Regional Office did not provide
  information for our review        7       9      0       0      0       0      2       3

R-Cases Pending at September 30, 1999

                                    Manhattan      Pittsburgh     New Orleans    Memphis
                                    Region 2       Region 6       Region 15      Region 26
                                    No.     %      No.     %      No.     %      No.     %
Items in the universe               181            54             37             44
Items tested                        78      100    54      100    37      100    44      100
Incorrectly reported as pending     40      51     1       2      4       11     13      30
Regional Office did not provide
  information for our review        18      23     0       0      0       0      0       0
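
The overall error rates cited above (25 percent of C-Cases and 27 percent of R-Cases tested) follow directly from the per-Region results in the two tables. The short sketch below simply reproduces that arithmetic using the figures from the tables.

    # (items tested, items incorrectly reported as pending) per Regional Office,
    # taken from the two tables above.
    c_cases = {"Region 2": (78, 47), "Region 6": (78, 1),
               "Region 15": (78, 25), "Region 26": (78, 5)}
    r_cases = {"Region 2": (78, 40), "Region 6": (54, 1),
               "Region 15": (37, 4), "Region 26": (44, 13)}

    def overall(results):
        tested = sum(t for t, _ in results.values())
        errors = sum(e for _, e in results.values())
        return errors, tested, round(100 * errors / tested)

    print("C-Cases:", overall(c_cases))   # (78, 312, 25)
    print("R-Cases:", overall(r_cases))   # (58, 213, 27)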

Closed Case Data

For cases closed during FY 1999, we reviewed the date filed, date closed, stage closed, and method of disposition. Closed C-Case and R-Case data were generally accurate, with the exception of some R-Case data elements in Region 2 (Manhattan) and Region 26 (Memphis). The CHIPS database contained 127 closed R-Cases for Region 2. We tested 78 of them and found that three cases (4 percent) had an incorrect date closed, three cases (4 percent) had an incorrect method of closing, and for five cases (6 percent) files were not available for our review. The CHIPS database contained 105 closed R-Cases for Region 26 (Memphis). We tested 78 of them and found that nine cases (12 percent) had an incorrect date closed.

The incorrect dates closed did not affect the Annual Report because the incorrect date either fell within FY 1999 or the closing action for that Regional Office was the transfer of the case to another Regional Office, which would not have affected the total number of cases reported as closed by the Agency. The incorrect method of closing, however, would affect Table 7.

Agency Comments

The Division of Operations-Management Associate General Counsel commented that the Regions reconcile data at the end of the year, but not all inaccuracies are revealed. The errors were attributed primarily to human error and a possible problem with transferred cases. At this point, the focus of all Regions is to ensure that all information entered into CATS is updated and accurate. CATS has several features that will safeguard against the entry of inaccurate data. Further, CATS requires that data be entered very close in time to the disposition, and a case will show up on the overage reports if the data is not entered properly.

SUGGESTIONS

Correcting discrepancies in CHIPS at this point would not be a good use of Agency resources. However, we believe the Agency needs to be aware that some data in the FY 2000 Annual Report may not logically follow that in the FY 1999 Annual Report and should address these differences in the narrative.

cc: Louis B. Adams, Information Technology Branch Chief
John J. Toner, Executive Secretary
Richard A. Siegel, Associate General Counsel
Vanita C.S. Reynolds, Library and Administrative Services Branch Chief
David B. Parker, Director of Information