State Comments on Frozen Data - 2010

State comments on frozen data are available through the links below, organized by submission year. Additional data quality information that is not specific to the frozen data is available through the known data problems and the State Review Framework Recommendations Tracker.

2010 Comments on Frozen Data (Captured February 2011)
State Media Comments Data Links
Alabama CAA Metrics A01A1S and A01K0S for the CAA data erroneously assume that any source classified as a major source at the plant level should have a Title V air program code and should be identified as a major source subject to the CMS policy. However, due to the State of Alabama's continued regulation of total suspended particulates (TSP), the State has numerous sources that are major sources under the PSD air program due to their potential to emit TSP. In AFS, Alabama has correctly entered these sources without a Title V air program code and as not being subject to the CMS policy.  
RCRA The use of biennial report data to establish the number and identities of facilities in LQG universes for the '1 FY' and '5 FY' periods (data metrics 5B & 5C) causes the actual number of LQG inspections for those two periods to be underreported. As reported in Alabama's RCRA Grant End-of-Year reports, which were submitted to EPA Region 4 and accepted without comment, Alabama completed 86 LQG inspections in '1 FY' and 347 LQG inspections in '5 FY.' The End-of-Year report data represent inspection rates of 37% for '1 FY' and 152% for '5 FY,' versus the 29% and 83% reflected in SRF data metrics 5B and 5C, respectively.  
Alaska CAA A problem with the translation of data between EPA and the State exists for facility classification data and the stack test data, so the Data Metrics do not reflect the State's activities or facility classification.  
Arizona
RCRA A problem with the translation of data between EPA and the State exists for the total LQG universe metric as compared to the total LQG universe defined through BRS reporting. The total LQG universe metric displayed is substantially higher than the actual total universe, therefore the data metrics associated with total LQG universe are not accurate.  
Colorado CAA The Colorado Department of Public Health and Environment, Air Pollution Control Division, Stationary Sources Program has reviewed the information for the CAA data. Colorado has worked diligently on correcting data issues in AFS and reconciling them with Colorado's own data. However, we question differences between what is in AFS and what we have in our own CACTIS data system, especially data found in AFS that Colorado did not put there. Source classification seems to be one of the most frequent issues that we find. We will continue to work with Region 8 to resolve these issues.  
Delaware CAA

For Matrix A01B1S:

Facility ID 10-001-00015 was showing as an SM facility; it is not, so it was manually corrected in AFS (this would reduce the count from 84 to 83).

Facility ID 10-777-00233 did not have an SM air program set up in DEN as it should have; we have since set one up and uploaded it to AFS (this would put the count back up to 84).

For Matrix A01C2S:

Facility ID 10-003-00324 had a NESHAP air program in AFS; however, the state's database did not, so it is unclear how AFS acquired one. The NESHAP air program was manually shut down in AFS.

For Matrices A01F1S & A01F2S:

A NOV for Facility ID 10-001-00117 was not counted (both for the informal enforcement action and for the source). It is in AFS; it is unclear why it does not appear in the OTIS data verification spreadsheet.

For Matrix A01J0S:

Delaware's database is incorrectly combining penalty and cost recovery and reporting the combined total as penalty to AFS. Therefore, the following are the actual penalties assessed in FFY 2010:
Facility ID 10-003-00365 -- $20,000 penalty (instead of $23,000)
Facility ID 10-003-00016 -- $1,945,000 penalty (there was no cost recovery)
Facility ID 10-003-00087 -- $9,900 penalty (instead of $11,385)
Facility ID 10-003-00705 -- $12,775 penalty (instead of $14,691)
Grand Total of Assessed Penalties: $1,987,675 (instead of $1,994,076)
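The corrected grand total can be cross-checked with simple arithmetic. The sketch below is illustrative only; the facility IDs and dollar amounts are exactly those cited in the list above:

```python
# Corrected penalties reported by Delaware for FFY 2010 (facility ID -> amount)
corrected = {
    "10-003-00365": 20_000,
    "10-003-00016": 1_945_000,
    "10-003-00087": 9_900,
    "10-003-00705": 12_775,
}

# Amounts as originally reported to AFS (penalty and cost recovery combined)
as_reported = {
    "10-003-00365": 23_000,
    "10-003-00016": 1_945_000,
    "10-003-00087": 11_385,
    "10-003-00705": 14_691,
}

print(sum(corrected.values()))    # 1987675 -- the corrected grand total
print(sum(as_reported.values()))  # 1994076 -- the combined total in AFS
```

Summing each column reproduces both figures quoted above: $1,987,675 corrected versus $1,994,076 as combined in AFS.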

For Matrix A05E0S:

The unknown compliance status for Facility ID 10-003-00516 was caused by the un-archiving of the facility.

It is unclear how Facility ID 10-001-00002 had an unknown compliance status for January and February of 2010. The State does not believe it should have had that status; however, because the state database cannot track compliance status by historical date(s), there is no way to reconcile it.

For Matrix A05G0S:

Counted should be increased by 2 and Not Counted reduced by 2. Reviews of the 2010 Self-Certifications for Facility IDs 10-003-00063 and 10-003-00021 were completed on 10/4/10.

 
RCRA A problem with the translation of data between EPA and the state exists for inspection data, so the data metrics do not reflect the complete number of activities performed by the state.  
District of Columbia CAA

The following caveats should be posted:

  1. Some gasoline station partial and full compliance evaluations have not yet been entered into the system, though they were performed in FY2010.
  2. Some Area Source MACT sources do not have appropriate air programs and subparts identified, primarily gasoline stations and perchloroethylene dry cleaners.
 
Florida CAA

CAA Subpart Designations: Percent NESHAP facilities with FCEs conducted after 10/1/2005: This metric does not appear to be calculating the correct percentage because it includes facilities that have shut down operation (i.e., status = 'X'). When only operating sources are considered in the universe, 99% of the facilities have the correct data.

Historical Non-Compliance Counts (1 FY): Sources on this list include those under the responsibility of EPA Region IV, which has lead enforcement over the CFC sources and their compliance status.

Number of Sources with Unknown Compliance Status (Current): The number of facilities with an unknown compliance status will change once EPA runs its Unknown Compliance Status Generator in the Air Facility Subsystem for the month of February 2011. Only a few sources will remain in this metric after the utility has run.

 
RCRA For metric 10.A, many of the SNCs not counted in the list of "SNCs with formal action/referral taken within 360 days (1 FY)" became SNCs less than 360 days from the date of the report, skewing the percentages for both Florida and the national average. Florida's actual rate for metric 10.A, stated as "SNCs with formal action/referral taken within 360 days," is close to 80%. Our percentage for metric 5.C, "Inspection coverage for LQGs (5 FYs)," is lower because the department relies on the US Coast Guard to inspect cruise ships covered by the 2001 MOU with the cruise line industry.  
Georgia
CWA A problem with the translation of data between EPA and the state exists for inspections, actions and penalties data as well as state issued Land Application System permit data, so the data metrics do not reflect the complete number of activities performed by the state.  
Illinois CAA U.S. EPA Region 5 has reviewed the CAA data for all six States, specifically, the entries reported to AFS for the air program pollutant compliance status field at the source level. There are data issues that still remain which involve inaccurate data entered by EPA in the State agency's compliance status field instead of EPA's field. Region 5 will continue working with the States and HQs to clean-up and correct all inaccurate data entered for all six States.  
Indiana CAA

In addition to the general caveat that Region 5 is providing to address data issues that involve data entered by EPA in the State agency's compliance status field instead of EPA's field, there are a few ongoing issues we deal with that affect how Indiana data is represented in AFS:

  1. We cannot upload all stack tests conducted because AFS does not accept more than one test per facility per date per pollutant; and,
  2. The length of time an Indiana source appears to be out of compliance is not accurately reflected in OTIS/ECHO as a result of the periods between data pulls and AFS updates. Frequently the reported period of non-compliance is inconsistent with the actual period (usually appearing longer) and is also shown as occurring later than it actually did.
 
CWA During the data verification process for 2010 CWA data, Indiana noted that the metrics were revised by EPA from "Reconnaissance inspections are not counted at primary industries or at approved pretreatment facilities" to "inspection types which count toward coverage will match those provided in the national CMS." It is unclear to Indiana how inspection coverage at major and non-major facilities is queried in ICIS-NPDES to reflect the inspection types counted toward coverage in the national CMS. If the inspection types counted for coverage depend on a facility's SIC code, Indiana requests clarification from EPA on which SIC codes are associated with inspections counted for coverage. Based on a review of the NPDES primary industry categories listed in Appendix A to Part 122, Indiana assumes that the 2- and 4-digit SIC codes associated with primary industries are 10, 12, 22, 24, 26-31, 33-36, 38, and only 4911 from the "49" Major Group 2-digit SIC code, and that reconnaissance inspections at these facilities are therefore not counted for coverage, even if the state conducted multiple recon inspections at the same facility during the inspection year. Indiana is unclear why reconnaissance inspections do not count for coverage at permits where the facility's Major Group 2-digit SIC code entered in ICIS-NPDES is 01, 02, 07-09, 13-17, 20, 25, 32, 37, 39-49, or 52-59. Indiana has also noted that if the facility SIC code is left blank in ICIS-NPDES, any inspection type appears to count for coverage, even for true primary industries.

The numbers associated with the enforcement activities do not encompass all of the formal enforcement actions completed by the Water Enforcement Section. The numbers displayed in ECHO represent only enforcement actions involving those sites and permitees that are included in ICIS, and therefore do not include actions involving sites which are not currently tracked in ICIS or violations that are not addressed by NPDES permits, such as spills and unpermitted discharges.
 
RCRA The number of LQG and SQG inspections conducted is greater than shown. These numbers are based on current counts of LQGs and SQGs, but since the universe changes on a daily basis, some of the LQG/SQG inspections Indiana conducted are not credited in this report because the handler moved out of the universe. Indiana has also done a great many more inspections at CESQGs, non-notifiers, and other types of facilities in the last 5 fiscal years. This report is pulling by criteria that do not include all the inspections Indiana conducted, even though these inspections are in the EPA database. The percentage of violations found at inspections should be lower than shown: the count of inspections does NOT include the NRR inspection type, but the count of inspections with violations found DOES include this inspection type. Data Link (PDF) (1pp, 9 K)
Kentucky CAA Because of minor discrepancies between the frozen data in the EPA's FY 10 report and the current data in Kentucky's database, it is suggested that any questions be forwarded to the DAQ: (502) 564-3999, Extension 4453, or Jon.Trout@ky.gov  
Maine
RCRA

Maine DEP has posted information regarding data completeness and data interpretation at the following web site: http://www.maine.gov/dep/enforcement/echo.html.

In addition to the general caveats at the above link, please consider the following specific caveats and corrections:

  1. Maine DEP conducted inspections at LQGs that are part of the RCRAInfo LQG Universe, but these LQGs do not show up in the Biennial Report LQG universe. Because EPA calculates inspection coverage by using the Biennial Report LQG universe, Maine's inspection coverage is not fully captured by EPA's metric [EPA metric 5.B.0 = Inspection coverage for LQGs (1 FY)]. EPA's metric for percent coverage of the LQG Universe underestimates the actual coverage of the LQG Universe by Maine inspection activities.
  2. The data issue identified in item #1 above also occurs for EPA metric 5.C.0 [Inspection coverage for LQGs (5 FYs)]. Again, during the five year review period, Maine's inspection coverage for LQGs included LQGs that are part of the RCRAInfo LQG Universe, but which do not show up in the Biennial Report LQG universe. As a result, EPA's metric for inspection coverage for the five fiscal year review period understates Maine's inspection coverage for LQGs.
  3. Maine conducts inspections at facilities that do not have EPA ID#s and are not required to notify EPA or Maine of their hazardous waste activities (i.e., Maine-defined Small Quantity Generators, which are federal CESQGs, and other non-notifier facilities identified through citizen complaints). The inspection activities, enforcement actions, and resulting monetary penalties assessed or collected as a result of such enforcement activities by Maine are not captured or reflected in the EPA metrics. As a result, EPA's metrics understate Maine's full range of hazardous waste compliance, inspection, and enforcement efforts.
  4. Under Metric 10.A.0, EPA reports that 33.3% of SNCs had formal action/referral taken within 360 days (1 FY). This metric is incorrect; the actual percentage is 100%. The EPA data metric does not count the formal actions taken in two case-specific examples listed by EPA as "not counted." In fact, formal actions were concluded with both Northeast Coating Technologies and eWaste Recycling Solutions within the 360-day metric measure. If these two cases are included in the calculation, then 100% of the SNCs identified in that metric had formal actions taken within 360 days.
 
Michigan CAA   Data Link (PDF) (2pp, 20 K)
CWA A problem with the transfer of discharge monitoring report (DMR) data between the State of Michigan and US EPA currently exists. Therefore, the data metrics for DMRs do not accurately reflect the number of submissions received by the State of Michigan. Successful resolution of this problem is anticipated in early 2011.  
RCRA   Data Link (PDF) (4pp, 15 K)
Minnesota CAA The State of Minnesota has completed data verification for the 2010 SRF AQ data and has made appropriate corrections.

Please see the attachments and comments below. The first attachment contains the synthetic minor inspections conducted by MPCA over the past 5 years (102 facilities); the second attachment contains the actual Minnesota SM80s (41 facilities). The third attachment lists the 20 SM Universe corrections that were made to bring the AFS and Minnesota SM Universe counts into agreement. The fourth attachment lists the HPV Pathways added during FFY2010, showing the 6 "missing" discovery dates.

The 6 discovery actions unrecognized by AFS were replaced with actions recognized by AFS. Similarly, unrecognized HPV discovery actions from FFY2009 (attachment not included) were also replaced with actions recognized by AFS.

John seems to have figured out why he and I have different numbers when looking at the test reports reviewed in the EPA database. He uses our stack test screens in Delta to populate the EPA database. So if we have completed the review and sent the letter out but have yet to get the screen filled in, which sometimes does take a while since this seems to carry less priority than getting letters out in a timely manner, then John is not aware that these reports have been reviewed as well.

We're not sure how to address this situation, or whether we even need to. How often do discussions like this come up with EPA? If we knew in advance that a dump of the data in their database was going to be completed, we could prioritize and see that the stack test screens are populated beforehand. Or it may be enough to just send you a list of the other facilities that we are aware of having reports reviewed, similar to what was done this time; you would then know that any difference between the EPA database and our list was due to the data entry lag time. We could also copy John on our NOCs; however, it seems redundant for him to populate the EPA database with information we already need to enter into Delta on a routine basis.

U.S. EPA Region 5 has reviewed the CAA data for all six States, specifically, the entries reported to AFS for the air program pollutant compliance status field at the source level. There are data issues that still remain which involve inaccurate data entered by EPA in the State agency's compliance status field instead of EPA's field. Region 5 will continue working with the States and HQs to clean-up and correct all inaccurate data entered for all six States.
Data Link (PDF) (1pp, 5 K)
Mississippi CAA

Metric 3.A.1 - Percent HPVs Entered ≥ 60 Days After Designation, Timely Entry (1 FY)

This metric evaluates the timely entry of HPVs in AFS by determining how long it takes to enter the Day Zero date in AFS. Because of the difficulty in AFS of changing an HPV day zero to a non-HPV day zero, and vice versa, we do not enter a day zero until we are certain which it is. Therefore, we often do not make the determination until we have reviewed the company's response to our NOV, which means our actual determination of HPV status comes after the day zero. Consequently, we cannot guarantee that day zeros are entered within 60 days; however, we believe that the majority of the time day zeros are entered into AFS within 60 days of our determination of HPV/non-HPV status. Additionally, issues with the batch upload process and the Universal Interface have contributed to data entry delays.

Metric 3.B.1 - Percent Compliance Monitoring related MDR actions reported ≥ 60 Days After Designation, Timely Entry (1 FY)

This metric evaluates the timely entry of actions such as stack tests, ACCs, or FCEs in AFS by determining how long it takes to enter the actual action into AFS. For a stack test, the data entry window is measured from the date the stack test was conducted. Permits typically allow a company 45 to 60 days to submit a stack test report, and our data entry does not occur until our review of the report has been completed. This process creates the potential for the 60-day entry time to be exceeded. Stack tests comprised 38% of the universe in this metric for the FY 10 review. Additionally, issues with the batch upload process and the Universal Interface have contributed to data entry delays.

Metric 3.B.2 - Percent Enforcement related MDR actions reported ≥ 60 Days After Designation, Timely Entry (1 FY)

This metric evaluates the timely entry of NOVs and discovery activities in AFS by determining how long it takes to enter the action date in AFS. As discussed under 3.A.1, our business process cannot guarantee that HPV day zeros are entered into AFS within 60 days. NOVs and most discovery activities are not entered into AFS until the appropriate day zero is created, to facilitate linking these actions to the day zero. We believe data entry for the other enforcement-related MDRs is generally timely. Additionally, issues with the batch upload process and the Universal Interface have contributed to data entry delays.

Metric 10.A - Percent HPVs not meeting timeline goals (2 FY)

This data metric evaluates only the Air HPV enforcement timeline goals and does not account for more than one medium being evaluated. A significant number of MDEQ's enforcement actions are multi-media, and our business practice is to address all violations, regardless of medium, under one enforcement action. Due to the complexity of having multi-media violations included in one enforcement action, with each medium having its respective timeline(s), a particular medium's timeline goal may be exceeded in resolving the enforcement.

 
CWA Metric C01B2C & C01B3C - NPDES Major individual permits: DMR entry rate based on DMRs expected (Forms/Forms) (1 Qtr- 7/1/10 to 9/30/10)
NPDES Major individual permits: DMR entry rate based on DMRs expected (Permits/Permits) (1 Qtr- 7/1/10 to 9/30/10)


EPA did not provide data for these metrics prior to the data freeze for FY10. Therefore, MDEQ could not verify the data associated with these metrics as of February 16, 2011.

Metric W01C1C - Non-major individual permits: correctly coded limits (Current)

This data represents the Non-Major NPDES permits that SRF/OTIS considers as coded incorrectly (262 permits). Of the 262 permits listed as coded incorrectly, 29 are permits that have Applications Received Only in PCS, and 79 are permits that do not have outfalls/limits in PCS. The State considers these permits to be coded correctly, and the measurement should be 1233/1387 or 88.9%.

Metric C01C2C & C01C3C - NPDES Non-Major individual permits: DMR entry rate based on DMRs expected (Forms/Forms) (1 Qtr- 7/1/10 to 9/30/10)
NPDES Non-Major individual permits: DMR entry rate based on DMRs expected (Permits/Permits) (1 Qtr- 7/1/10 to 9/30/10)


EPA did not provide data for these metrics prior to the data freeze for FY10. Therefore, MDEQ could not verify the data associated with these metrics as of February 16, 2011.

Metric W05B1S - Inspection coverage: NPDES non-major individual permits (1 FY)

This data represents the Non-Major NPDES facilities that had an inspection completed for FY10. The SRF/OTIS logic does not include inspections that were performed at Non-Major NPDES facilities that are now Inactive (14 evaluations for FY10).

Based on the number of facilities that have had inspections for FY10, the inspection coverage should be 146/1383, or 10.6%.

Metric W05B2S - Inspection coverage: NPDES non-major general permits (1 FY)

This data represents the Non-Major general permit facilities that had an inspection completed for FY10. The SRF/OTIS logic does not include inspections that were performed at Non-Major general permit facilities that are now Inactive (41 evaluations for FY10).

Based on the number of facilities that have had inspections for FY10, the inspection coverage should be 144/1684, or 8.6%.

Metric W010AC - Major facilities without timely action (1 FY)

The universe that SRF is using for this metric includes one facility that is regulated by EPA. Therefore, the major facilities without timely actions should be 7/95, or 7.4%.
 
RCRA

Metric R05B0S - Inspection Coverage for LQGs (1FY) - The universe that SRF is using is the number of LQGs that filed a report for the 2009 BRS reporting cycle. Mississippi's LQG universe has changed significantly since March 1, 2010, so this universe does not accurately reflect the State's LQG universe as of September 30, 2010. As of February 17, 2011, there are 145 active LQGs. Metric R01A2S is a more accurate representation of Mississippi's LQG universe for FY10 and should be the universe used to measure inspection coverage (Metrics R05B0S & R05C0S) for LQGs.

The SRF counted universe for FY 2010 LQG inspection coverage is 39. The State performed a total of 52 LQG inspections during FY 2010. The discrepancy is due to a change in generator status at 13 facilities that are no longer LQGs.

Based on the number of facilities that have LQG inspections for FY 2010, the inspection coverage should be 52/145, or 35.9%.

Metric R05D0S - Inspection Coverage for SQGs (5FY) - The SRF counted universe for FY 2006-2010 SQG inspection coverage is 133. The State performed a total of 254 inspections at SQGs for FY 2006-2010. The discrepancy is due to a change in generator status at facilities that had inspections during this period.

Based on the number of facilities that have SQG inspections for FY 2006-2010, the inspection coverage should be 254/389, or 65.3%.
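The revised coverage percentages follow directly from the inspection and universe counts cited above. The sketch below is illustrative only, using the counts quoted in these comments:

```python
def coverage(inspected: int, universe: int) -> float:
    """Inspection coverage as a percentage, rounded to one decimal place."""
    return round(100 * inspected / universe, 1)

# Counts taken from the comments above
print(coverage(52, 145))   # LQG coverage, FY 2010      -> 35.9
print(coverage(254, 389))  # SQG coverage, FY 2006-2010 -> 65.3
```

Dividing inspections by the active-facility universe reproduces the 35.9% and 65.3% figures stated in the comments.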

 
Montana
CWA A problem with the translation of data between EPA and the State exists for some of our data, so the Data Metrics do not reflect the complete number of activities for: violations at non-majors (DMR non-receipt); informal actions (number of non-major facilities and number of actions at non-major facilities); and inspection-related data and activities performed by the State. Data Link (PDF) (1pp, 5 K)
Data Link (PDF) (1pp, 9 K)
RCRA A problem with the translation of data between EPA RCRAInfo and OTIS exists for: 1) the number of non-notifier inspections, and 2) all other active sites, so the data metrics do not reflect the complete and accurate number of activities performed by the state. OTIS is not extracting the full universe of non-notifiers identified in RCRAInfo, nor the total of all other active sites. 3) The number of formal enforcement actions noted in metric 1F2 for the State is correct; per the instructions this value should match metric 10B, but it does not. 4) For metric 1D2, Montana has two 110 (verbal informal) enforcement actions that are not counted in this metric, which creates the discrepancy. Data Link (PDF) (1pp, 11 K)
New Jersey CAA The data submitted to AIRS-AFS by New Jersey is quality checked quarterly, with a very high level of accuracy for all major and SM facilities. Any errors found are corrected immediately. As of 11/30/2010, the FY 2010 data in AFS is accurate based on our reviews of the data.  
New York
RCRA

NY does not use the federal generator status field in RCRAInfo to determine a site's generator status (i.e., LQG or SQG), and this field is inaccurate for NY sites. We use an internal database of manifest data instead.

NY also does not use the active site field in RCRAInfo.

This results in inaccurate calculation of the RCRA State Review Framework metrics, especially metrics 5.b, 5.c, and 5.d.

 
North Carolina CAA The State of North Carolina has completed data verification of the 2010 SRF CAA data and has made appropriate corrections. However, continued AFS data reporting timeliness issues remain.

1) NC DAQ has monthly conference calls with Wendell Reed of EPA Region 4 to discuss High Priority Violations (HPVs). For years, EPA has made the AFS updates for all of North Carolina's HPVs.

In the 2010 SRF CAA data, of the four cases reported as late, two were late because EPA entered the information into AFS well after the date of the EPA conference call. On average, EPA enters HPV information into AFS (e.g., the "DZ Creation date") 17 days after the conference call; the range is 1 to 83 days.

Case 1 CNA HOLDINGS INC.; TICONA POLYMERS SHELL was not a new HPV in FFY 2010. It was discussed and added during the 09/12/09 conference call. EPA entered the HPV information into AFS 83 days AFTER the conference call. The "DZ Creation date" for this HPV was 12/03/09, almost three months AFTER it was originally discussed with EPA.

Case 2 BLUE RIDGE PAPER PRODUCTS - CANTON MILL was not late. It was discussed during the 12/19/09 conference call, which was 25 days after day zero. However, it was added to AFS by EPA on 01/22/10, which was 44 days after the 12/19/09 conference call.

2) NC DAQ is in the process of developing and implementing a protocol which ensures that Pass/Fail/Pending codes (PP/FF/99) for all stack tests are reported in the AFS results code field, and pending codes are updated within 120 days. The following EPA Region 4 personnel are aware of NC DAQ's ongoing batch upload programming change to AFS: Beverly Spagg, Dick Dubose, Mark Fite, Ahmed Amanulah and Shannon Maher.
 
CWA Problems with the translation and timing of the data between EPA and the State exist for correctly coded limits, total penalties collected, inspections, single-event violations, unresolved compliance schedule violations, SNC rate for majors and timely actions taken, so the data metrics do not reflect the complete number of activities performed by the State.  
Ohio CAA U.S. EPA Region 5 has reviewed the CAA data for all six States, specifically, the entries reported to AFS for the air program pollutant compliance status field at the source level. There are data issues that still remain which involve inaccurate data entered by EPA in the State agency's compliance status field instead of EPA's field. Region 5 will continue working with the States and HQs to clean-up and correct all inaccurate data entered for all six States.  
Oklahoma
CWA
  1. Due to program commitments and manpower limitations, ODEQ does not code general & minor facility enforcement actions (metrics 1E.3, 1E.4, 1F.3, & 1F.4), inspections (metrics 5B.1 & 5B.2), or SEVs (metric 7A.2) into ICIS-NPDES.
  2. Metric 1A.2 is a combined (State & EPA Region 6) universe. Oklahoma Water Quality Division can verify that 125 of the 182 facilities listed are state non-major general permits.
  3. ODEQ's fiscal year (July-June) differs from EPA's fiscal year (October-September), which has caused the major inspection coverage (metric 5.A) to report fewer inspections than the actual value.
  4. Metrics 7.B & 7.C (facilities with unresolved compliance and permit schedules) reflect the schedules for ODEQ's major facilities only.
In addition, the values for metrics 1.F, 1.G, 5.A, 5.B, & 6.B have been updated in the list below to reflect the correct values listed in the revised column. An itemized spreadsheet of the penalties & enforcement data has been included to support the revised values.
Data Link (PDF) (1pp, 64 K)
RCRA   Data Link (PDF) (1pp, 9 K)
Pennsylvania CAA

Metric entitled 'CAA Subpart Designations: Percent NESHAP facilities with FCEs conducted after 10/1/2005' is in error: This metric indicates that 98 NESHAP facilities in AFS were missing subpart data. However, these 98 facilities were not subject to non-MACT NESHAP and had non-MACT NESHAP air programs and program pollutants marked in error.

Metric entitled 'CAA Subpart Designations: Percent NSPS facilities with FCEs conducted after 10/1/2005': This metric indicates that 4 NSPS facilities in AFS were missing subpart data. However, these facilities were not subject to NSPS and had NSPS air programs and program pollutants marked in error.

Metric entitled 'CAA Subpart Designations: Percent MACT facilities with FCEs conducted after 10/1/2005': This metric indicates 2 MACT facilities in AFS were missing subpart data. These facilities were not subject to MACT and had been designated as such based upon NAICS codes, which had been entered incorrectly. The NAICS codes have been corrected, and the MACT air programs and program pollutants have been closed.

 
CWA   Data Link (ZIP) (684 K)
South Dakota CWA 1a1 - SD has 29 major permits, only 27 of which get expected DMRs entered. The other 2 permits have unscheduled or inactive limit sets and do not typically have DMRs entered.

1a3 - SD00TEST1 is a test permit used to copy and paste limit sets from one permit to another, and therefore should not be counted. SD0020486 was an individual permit that was switched to coverage under a master general permit; it is now in the process of being switched back to an individual permit. The previous permit was un-terminated so that it could be reissued with the same permit number; thus it is being counted under the general permit numbers and should not be counted in this metric.

1e1 - This metric does not account for informal enforcement actions that were issued to a facility early in the fiscal year, and then later in the fiscal year the facility's permit was terminated.

1e2 - This metric does not account for informal enforcement actions that were issued to a facility early in the fiscal year, and then later in the fiscal year the facility's permit was terminated.

7c1 - SD0022667 is the only permit that should be listed for this metric.
 
Tennessee CAA A coding error was recently discovered that incorrectly assigned NESHAP designations to non-NESHAP facilities. These errors will be corrected as soon as EPA grants the appropriate database update authority to TDEC-APC.

The corrections have been completed. There are 8 NESHAP sources; they are listed in the report below. All have had an FCE conducted after 10/2005, so the metric should now read 100% complete.
 
Texas CAA FY2010 Frozen Data for the metric A1C4S is inaccurate. Texas does not report investigations by subprograms. By State standards, FCEs performed at Title V facilities are understood to include an FCE for each subprogram associated to the facility.

FY2010 Frozen Data for the metric A1C5S is inaccurate. Texas does not report investigations by subprograms. By State standards, FCEs performed at Title V facilities are understood to include an FCE for each subprogram associated to the facility.

FY2010 Frozen Data for the metric A1C6S is inaccurate. Texas does not report investigations by subprograms. By State standards, FCEs performed at Title V facilities are understood to include an FCE for each subprogram associated to the facility.

FY2010 Frozen Data for the metric A05A1S is inaccurate. Texas Runs a Risk Based Investigation Strategy and established agreement With R6 to inspect on a 7 year cycle.

FY2010 Frozen Data for the metric A05A2S is inaccurate. Texas Runs a Risk Based Investigation Strategy and established agreement With R6 to inspect on a 7 year cycle.

FY2010 Frozen Data for the metric A05B1S is inaccurate. Texas does not maintain a universe of synthetic minors; therefore Texas is unable report on Synthetic Minors.

FY2010 Frozen Data for the metric A05B2S is inaccurate. Texas does not maintain a universe of synthetic minors; therefore Texas is unable report on Synthetic Minors.

FY2010 Frozen Data for the metric A05C0S is inaccurate. Texas does not maintain a universe of synthetic minors; therefore Texas is unable report on Synthetic Minors.
 
 
RCRA

Metric 5.A (Inspection coverage for operating TSDFs (2 FYs)): TCEQ inspection coverage for TSDFs in FY09 was reduced because of TCEQ's state-wide assistance with environmental response efforts related to waste-management issues (e.g., debris management) resulting from Hurricane Dolly (landfall July 2008) and, to an even greater extent, Hurricane Ike (landfall September 2008). Based on the logic in EPA's National Program Manager Guidance, TCEQ inspection coverage for TSDFs in TCEQ's FY10 (9/1/2009 to 8/31/2010) was 58.8%, exceeding the 50% annual coverage requirement for operating TSDFs.

Metric 5.C (Inspection coverage for LQGs (5 FYs)): Under the RCRA Core LQG Pilot Project (alternative approach), TCEQ was allowed to count two SQG/CESQG CEIs as one LQG CEI toward its universe coverage for LQGs, allowing it to redirect resources to facilities other than LQGs. Under this alternative approach, TCEQ meets the 100% coverage requirement for LQGs over a 5-year period.

The number of multi-facility orders differs between the state database and RCRAInfo; the state database counts a multi-facility order as a single order.

Data Link (PDF) (1pp, 7 K)
Utah
CWA The inspection data metrics do not reflect the complete number of inspection activities performed by the state. The discrepancies significantly lower the Compliance Monitoring metrics, as many of the inspections Utah completed are not presently included in EPA's numbers.  
Virgin Islands
CAA The territory of the Virgin Islands has completed its review of the CAA data. Several issues remain relating to the CAA data.  
Virginia
CWA Metric 1A4, NPDES non-major general permits - Virginia General permit information is not uploaded to or stored in PCS. Data shown is not accurate.
Metric 1B1, Correctly Coded Limits - Percentage shown does not accurately reflect limits submitted by VADEQ; all permits are correctly coded in PCS. The MS4 permit limits are not stored in PCS.
Metric 1D1, Violations at non-majors: noncompliance rate - The count in this metric is not reliable. Virginia does not upload non-major individual DMR data into PCS; all of the NC/RNC violations shown in this metric are not accurate.
Metric 2A, Action linked to violations: major facilities - One enforcement action could not be linked to its effluent violations that occurred during the previous permit cycle.
Metrics 5A and 5B1, Inspection coverage: NPDES majors and non-major individual permits - Numbers of inspections shown for individual permits are not completely accurate. Certain types of inspections conducted by VADEQ could not be transmitted to PCS due to restricted PCS rules.
Metric 5B2, Inspection coverage: NPDES non-major general permits - General permit inspection information is not uploaded to or stored in PCS. Data shown is not accurate.
Metric 5C, Inspection coverage: NPDES other - Information shown does not represent the universe of "NPDES other" or permits other than NPDES.
Metric 7D, Percentage major facilities with DMR violations - Violations shown do not include those that occurred during previous permit cycle.
Metric 10A, Major facilities without timely action - The facilities shown in this metric do not have this issue. The non-compliance status of Farmville Advanced WWTP did not occur during FY10, and the issue was resolved in the first quarter of FY10. Danville City - Northside has returned to compliance without any enforcement action.
 
RCRA One facility was not counted (TSDF not inspected): Merck, VAD001705110, which had an FCI conducted on 9/28/2010. DEQ conducts FCIs based on risk-based criteria agreed upon with Region 3. For enforcement items, please see the attached file. Data Link (PDF) (1pp, 7 K)
Washington
RCRA

A translation problem between EPA and the state exists for handler data, so the data metrics do not reflect the correct number of active sites in Washington.

Metric 1A2 (Active LQG Universe) does not accurately reflect this universe in Washington. Ecology uses an internally developed database, TurboWaste, to track annual reports of hazardous waste generation and management. TurboWaste contains the correct record of which sites are active and which are inactive. Ecology has had difficulty translating handler records from TurboWaste to RCRAInfo, especially with regard to active and inactive site status, and continues working toward a resolution of the problem.

Ecology's BRS submittal as shown in metric 1A5 is correct and provides accurate information for the related LQG metric calculations. Inspection rates for LQGs are calculated by the SRF using BRS data and are therefore correct.

The calculation affected by the universe count issue is Metric 5D, the 5-year inspection coverage for active SQGs. Because Metric 5D is "Informational Only," this is not a great concern, although the correction would be to Ecology's benefit. Currently, OTIS lists 1,328 SQGs in Washington, with a calculated SQG inspection coverage of 21%. Ecology's TurboWaste system lists 602 SQGs, which would result in a calculated SQG inspection coverage of 46%.
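The two coverage figures follow from the same inspection count measured against two different universe sizes. A minimal sketch of the arithmetic (assuming roughly 279 SQG inspections, the count implied by 21% of 1,328; the actual OTIS inspection count is not given in the comment):

```python
# Illustrative only: the inspection count is inferred from the percentages
# quoted above (21% of 1,328 is about 279), not taken from OTIS itself.
inspections = 279          # SQG inspections over the 5-year window (inferred)
otis_universe = 1328       # SQG universe as listed in OTIS
turbowaste_universe = 602  # SQG universe as listed in TurboWaste

# Coverage = inspections / universe, expressed as a whole percentage
otis_coverage = round(inspections / otis_universe * 100)
turbowaste_coverage = round(inspections / turbowaste_universe * 100)
print(otis_coverage, turbowaste_coverage)  # 21 46
```

The same numerator against the smaller TurboWaste universe roughly doubles the reported coverage, which is the correction Ecology describes.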

 
West Virginia CAA A number of facilities were entered in the state and EPA CAA databases with NESHAP and/or NSPS air programs erroneously indicated. Data clean-up and correction are in progress. Many of the uninspected NSPS facilities, particularly those permitted in the last year and a half, are emergency generators subject to Subpart IIII and located at remote cell phone towers. Under an agreement with EPA Region 3, WV has no current plans to inspect the NSPS Subpart IIII backup emergency generators.  
CWA A problem with the translation of data between EPA and the state exists for inspection data, so the data metrics do not reflect the complete number of activities performed by the state. Data Link (PDF) (4pp, 56 K)
Wisconsin CAA The number of inspections conducted at NSPS facilities after 10/1/2005 is 199 out of 206 facilities, or 96.6%. The number at NESHAP facilities is 177 out of 182, or 97%. The data metrics incorrectly present these values as 11.0% and 1.7%.

U.S. EPA Region 5 has reviewed the CAA data for all six States, specifically the entries reported to AFS for the air program pollutant compliance status field at the source level. Data issues remain involving inaccurate data entered by EPA in the State agencies' compliance status field instead of EPA's field. Region 5 will continue working with the States and Headquarters to clean up and correct all inaccurate data entered for all six States.
 
