Please share this information
Ms. Hammer has been asked to respond to the following questions pertaining to information in her special report Injustice for All.
Do approved changes to the data, based on the data adjustments and appeals defined in current federal regulations, explain the discrepancies between the Department of Education’s (DOE) institutional cohort default rate (iCDR) reporting and its data? For example, could appeals have changed, after the fact, the data used to calculate the official rates, with those changes then appearing in the institution-level files the DOE posted in 2014? Or could this be an example of DOE incompetence rather than malfeasance? The DOE has screwed up data before.
Ms. Hammer’s response:
The excuse that these data discrepancies are a result of appeals is INVALID for several reasons:
1. The dates provided on the PEPS300 files and in the DOE briefings show that the information is pulled simultaneously. ED’s answer to Chairman Kline’s question was that all of the information is pulled from the NSLDS. It should match, or have variations so small that they don’t change the overall findings between the data and the reporting.
2. For the most part, the public sector schools don’t do appeals unless they are going to lose funding. If they did do appeals, the number of defaults would go DOWN by larger numbers than the number entered repayment, because the goal is to decrease the rate by a significant amount. The number of public schools in jeopardy of losing funding was minimal. And for FY 2009-2011, the percentage decrease was EXACTLY the same in both the number in default and the number entered repayment. For example, for FY 2009, both the number in default and the number entered repayment decreased by 4%. This pattern continued for FY 2010 and FY 2011. That would also be a big coincidence.
3. Does it make sense that for FY 2009-2011, EVERY year’s discrepancies showed the increase in the number of defaults running 1% higher than the increase in the number entered repayment? For example, for FY 2009, the proprietary number of defaults was increased by 10% and the number entered repayment was increased by 9%. For FY 2010 and FY 2011, this same pattern of defaults increasing at 1% higher than the increase in the number entered repayment continued. That would be a pretty big coincidence!
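The arithmetic behind items 2 and 3 can be sketched in a few lines. This is a minimal illustration, not ED's actual calculation: the starting counts (200 defaults, 1,000 entered repayment) are hypothetical placeholders, and only the percentage changes come from the patterns described above.

```python
def rate_after(defaults, entered, d_pct, e_pct):
    """Apply percentage changes to the numerator (defaults) and
    denominator (entered repayment) of a CDR and return the new rate."""
    return (defaults * (1 + d_pct)) / (entered * (1 + e_pct))

defaults, entered = 200, 1000        # hypothetical cohort counts
base = defaults / entered            # starting rate: 0.20

# Public-sector pattern: both counts drop by the same 4%,
# so the rate does not move at all.
public = rate_after(defaults, entered, -0.04, -0.04)

# Proprietary pattern: defaults +10%, entered repayment +9%,
# so the rate rises.
proprietary = rate_after(defaults, entered, 0.10, 0.09)

print(round(base, 4), round(public, 4), round(proprietary, 4))
```

The point of the sketch is that an identical percentage change in numerator and denominator leaves the rate unchanged, while a 1-point gap in the changes pushes the rate up.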
4. Why would proprietary schools do appeals to increase their rates by several percentage points? They wouldn’t.
5. I’ve been doing default management for 28 years. I helped draft both the statute and the regulatory language for cohort default rates and the appeals processes, and I have completed thousands of appeals over the years. The pattern of changes in the data after adjustments and appeals are approved is that the number of borrowers in default decreases and the number of borrowers entered repayment increases.
- Some of the corrections are to add missing borrowers in good standing. This increases the number of borrowers entered repayment.
- Some of the corrections are to the default type or dates so some of the borrowers remain in the denominator and are removed from the numerator. For example, the borrower is reported as counting in both the numerator and denominator (default status) when the claim was actually a death claim which doesn’t count against the school. In this case, the student remains in the denominator and is removed from the numerator.
- Many of the corrections move students from the most recent CDR to a prior CDR because the date of default falls outside the measurement definition. In this case, both the number of borrowers in default and the number of borrowers entered repayment are reduced by one (1) borrower in the current year, and the borrower is added to the prior year as a denominator-only borrower (increasing that year’s number entered repayment by one (1)), because the default occurred outside the window where it is counted against the school.
All of these net a pattern across multiple CDRs where the numerator (borrowers in default) decreases at a higher rate than the denominator (borrowers entered repayment). Neither the pattern for public schools (even decreases in both the numerator and denominator) nor the pattern for proprietary schools (increases in the numerator higher than increases in the denominator) is consistent with how numbers change in data adjustments and appeals.
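The three correction types described above can be traced through a toy cohort. The counts here (150 defaults out of 1,000 entered repayment) are hypothetical; the sketch only shows the direction each correction moves the numerator and denominator.

```python
def cdr(defaults, entered_repayment):
    """Cohort default rate: borrowers in default / borrowers entered repayment."""
    return defaults / entered_repayment

defaults, entered = 150, 1000        # hypothetical starting cohort

# 1) Add a missing borrower in good standing:
#    the denominator grows, the numerator is untouched.
entered += 1

# 2) Reclassify a death claim: the borrower leaves the numerator
#    but stays in the denominator.
defaults -= 1

# 3) Move a default to a prior cohort: both counts drop by one here
#    (the borrower is added to the prior year as denominator only).
defaults -= 1
entered -= 1

print(round(cdr(150, 1000), 4))          # before corrections: 0.15
print(round(cdr(defaults, entered), 4))  # after corrections: 0.148
```

Every correction either shrinks the numerator or grows the denominator relative to it, which is why approved corrections push rates down, not up.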
6. The number of “uncorrected data adjustments” is now SO minimal because of the electronic processes. These are the incorrect-data adjustments that are submitted during the draft process and approved by the data manager but NOT corrected in the final data. In other words, the final data used for the “official” CDRs is very clean.
7. The remaining appeals (loan servicing, low numbers of borrowers, and economically disadvantaged status) would NOT cause discrepancies between data and reporting pulled in the same timeframe, if they would cause any at all.
- Loan servicing appeals take AT LEAST a couple months so these wouldn’t be reflected in the data at the time it is released for the current CDR. Loan servicing appeals from prior CDRs would have ALREADY BEEN processed by the deadlines set forth in the regulations.
- Low borrower numbers and economically disadvantaged appeals are processed as “waivers” to the threshold criteria—these don’t change the data.
8. The information reported every year for the prior years included in the three most recent CDRs used to determine Title IV eligibility has NOT changed in any of the DOE’s briefings. For example, the FY 2009 and FY 2010 information included in each briefing was exactly the same as in the year when that cohort was the most recent CDR: the FY 2009 information was identical in the DOE briefings in 2012, 2013, and 2014, and the FY 2010 information was identical in 2013, 2014, and 2015. If adjustments and appeals were affecting the numbers, these figures would change each year as borrowers were moved from one cohort to another based on the incorrect data adjustments, erroneous data appeals, and loan servicing appeals.
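The consistency check in item 8 amounts to comparing the same fiscal-year row across successive briefings. A minimal sketch, using hypothetical placeholder figures rather than ED's actual numbers:

```python
# FY 2009 row (defaults, entered repayment) as it appeared in each
# briefing year. These values are hypothetical placeholders.
fy2009_by_briefing_year = {
    2012: (100, 1000),
    2013: (100, 1000),
    2014: (100, 1000),
}

# If adjustments and appeals were still moving borrowers between cohorts,
# the rows would differ year to year; a single distinct value means the
# prior-year figures never changed.
distinct_rows = set(fy2009_by_briefing_year.values())
print(len(distinct_rows) == 1)   # True
```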
Would the default rates of conduit loans (purchased by the Department of Education from the private-sector FFELP student loan community) be MUCH higher than the national average, in part because these were more troubled loans to begin with? (No one is denying that the transition among servicers was rough and accounts for some of the higher default rate.)
Ms. Hammer’s response:
The DOE may try to explain these extremely high rates with a variety of excuses but I was there. I saw thousands of accounts that we were servicing go from deferment status to default. I made a trip to DC solely for the purpose of meeting with David Bergeron to ask him to intervene and correct these accounts. There are hundreds of thousands of students who went into default when they shouldn’t have—and who are living with the consequences of defaulted loans and ruined credit ratings.
The Department has liability here. They don’t want this out in the public because they very well will be dealing with lawsuits, and this may even be enough for Congress to close them down, as so many in office would like to see, especially Jason Chaffetz. ED will come up with every excuse possible and place blame on everyone else. The bottom line is that Congress needs to take these kids out of default and clean up their credit scores, period. Besides, it will save taxpayers money, because at least two servicing fees are being paid for each student who has one or more current loans and one or more defaulted loans. Then the rates need to be corrected for the schools. Doing the right thing is right for students, schools, and the federal fiscal interest.
Please share this information