The danger of “easy” data: four troubling numbers in our future

You may have heard the story of a man who lost his keys one night outside and was looking for them under a street light. A passerby asked him if he could remember the last place he had them, to which he responded, “Down the street.” Asked why he didn’t look for them there, the man replied, “The light is better here.” Looking in the wrong place because it is “better” (read “easier”) reminds me of the danger of “easy” data—and why such data may be as harmful as no data or even bad data.

The most prominent example of “easy” (read “dangerous”) data for accrediting agencies like ours is what I call “four troubling numbers in our future.” Those four numbers are being pushed hard by politicians and policy makers in Washington DC, especially by the US Department of Education (USDE), as reliable indicators of educational quality. And while I’m not aware of any similar pressure for our Canadian colleagues, the USDE’s reach is long and strong, especially for the dozen or so Canadian schools in ATS that participate in USDE federal loan programs.

Before I discuss these four numbers, let me add that some of these numbers may actually be helpful for some purposes. The danger is that they are being increasingly viewed as definitive measures of educational quality, when in reality they have little to say about actual educational quality. In fact, diploma mills would score high on three of these four numbers. A key reason they are being pushed so hard publicly is that they are relatively easy numbers to find (“the light is better here”). And since they are numbers, it is easier to set benchmarks or bright lines and compare schools with one another—even if their missions and offerings are decidedly different. One of the places these numbers can be found is in the USDE’s November 2015 “Fact Sheet” (the word “fact” is interesting here), subtitled: “Department of Education Advances Transparency Agenda for Accrediting Agencies.” That document describes how USDE will now publish on its website “key student outcome measures for each institution alongside its accreditor.” The four most troubling numbers on their list are these: (1) median debt of graduating students, (2) federal loan default rates, (3) graduation rates, and (4) placement rates (the last one is not explicitly stated in this document, but is associated with the “post-school earnings” number that is stated). Let’s look briefly at these four numbers in two paired sets—and why these data can be so “dangerous,” or at least troubling.

(1) and (2)—Median student debt and loan default rates

The first pair of troubling numbers (“easy” data) being pushed publicly and politically relates to the cost of education. Cost is clearly a critically important issue and worth discussing, but it is not a reliable measure of educational quality—particularly for theological schools. As noted in a December 2015 ATS Colloquy article, seminary student debt has no direct correlation with educational costs because so many ATS students are older adults who borrow more for living expenses than for tuition. Median debt and loan default rates for seminary graduates are actually fairly low compared to national averages. The median debt for ATS graduates who borrow (only 55% do) is estimated at $37,000, compared to around $50,000 for graduates of other graduate professional programs, and the loan default rate for seminary students is only 5%, compared to 12% for all US higher education students. Neither number, however, measures the quality of students’ educational experiences.

Exacerbating the danger of these data is the fact that institutions do not control the amount that students borrow; student debt limits are set by the USDE. Even if ATS schools wanted to limit student borrowing, they are powerless to do so. Yet excessive debt loads and the high loan default rates that often follow them are being pushed as indicators that those schools are “failing” and that accrediting agencies are not doing enough to “punish the poor performing” schools. That is troubling.

(3) and (4)—Graduation rates and placement rates

The second pair of troubling numbers (“easy” data) being pushed, more for political than educational purposes, relates perhaps more closely to actual student outcomes, but in a rather troubling way—especially for theological schools. To be sure, these are numbers worth knowing and discussing, which is why we require schools to collect them and report them on the ATS Annual Report Forms and why we published some of those data in a March 2016 Colloquy article. But such discussions must be informed by each school’s distinct mission and context. For example, many ATS schools intentionally view students’ time in seminary as a period of discernment in which students determine whether they are truly called to ministry. For students who discern that they are not, leaving without graduating is actually a good outcome. And many students come to seminary not looking for a degree, but for spiritual development or personal fulfillment. Yet the push to use high graduation rates as a bright-line measure of educational quality ignores those theological, ecclesial, and individual realities.

Something similar is true for placement rates. Those numbers can help inform meaningful conversations among our member schools (see, for example, an April 2016 Colloquy article), but again each school needs to look at its numbers in light of its own mission and constituency. ATS schools serving denominations without MDiv ordination requirements enroll a large number of students who are already “placed” in ministry, so high placement rates for those schools might mean very little. And some ATS degree programs virtually require students to be placed before enrolling (e.g., the DMin degree). Again, placement rates can be helpful indicators, but to push them as determiners of educational quality seems simplistic at best. Especially troubling for theological schools is the push to link placement rates with “post-school earnings,” as promulgated by the USDE. Students do not come to seminary for “gainful employment,” but to find meaningful ministry, often in settings that are not economically appealing.

Unfortunately, these four numbers, troubling as they may be, appear to be the new reality in which accrediting agencies and the schools they serve now live. So, we will collect these numbers as best we can, but we will always encourage schools to go beyond the statistics to understand the stories that make them meaningful. While I’m a firm believer in data and their value for our membership organization (see my previous blog on that), I also affirm the wisdom of the following quotation. It is often attributed to Einstein, but was first found in a 1963 paper by the sociologist William Cameron: “…not everything that can be counted counts, and not everything that counts can be counted.” Some numbers need narratives, especially these four troubling numbers in our future.

Meet the Author

Tom Tanner

Tom Tanner is director, accreditation and institutional evaluation. When he is not out on the road with evaluation teams, he loves to analyze ATS data.
