Guardrail: discount rate

Discount Rate: The portion of total tuition and fee revenue channeled back to students as institutional grant and scholarship aid.
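As a minimal sketch of that definition (the dollar figures below are made up for illustration, not actual Earlham data), the rate is simply institutional aid divided by gross tuition and fee revenue:

```python
# Discount rate = institutional grant/scholarship aid / gross tuition & fee revenue.

def discount_rate(institutional_aid: float, gross_tuition_revenue: float) -> float:
    """Return institutional aid as a share of gross tuition and fee revenue."""
    return institutional_aid / gross_tuition_revenue

# Example: $20M of a $40M sticker-price total returned to students as aid
rate = discount_rate(20_000_000, 40_000_000)
print(f"{rate:.0%}")  # 50%
```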

In keeping with Earlham’s focus on the guardrails, I wanted to share some thoughts about the tuition discount rate. Not only is this metric a focus for Earlham, but it is also of crucial importance for all colleges and universities in the quickly shifting world of higher education.

Context – national
According to the NACUBO Tuition Discounting Study (as reported by Inside Higher Ed), discount rates have been climbing steadily for the past 10 years, from about 39% to a national average of about 50%. This means that institutions, on average, are covering half of each student’s tuition sticker price.

Context – private colleges
More specifically, the Chronicle reported that a significantly higher number of private colleges are breaking into the 60% discount rate range, according to Moody’s. The credit-rating agency’s annual tuition survey, released in November 2018, shows that private colleges are facing increasing pressure to undercut their sticker prices. In 2017, only 13% of private colleges reported a discount rate higher than 60%. Just one year later, that number had increased to 19%. This sharp increase now finds colleges searching for a sustainable model.

Earlham vs. comparison groups
For Earlham, the climb in discount rate has been steep, rising from 39% for the Fall 2007 cohort to 77% for the Fall 2016 cohort. Earlham’s discount rate has nearly doubled in the last 10 years. In fact, no other college in the GLCA, the ACM, or the HCAC has seen as sharp an increase.

Comparison averages
                    2007   2011   2016
National average     39%    44%    50%
GLCA average         46%    50%    59%
ACM average          44%    51%    57%
HCAC average         45%    51%    55%
Earlham College      39%    58%    77%

My analysis
I would say that for too long, Earlham denied the realities of this changing landscape and acquiesced, offering larger and larger amounts of financial aid. From one recruiting cycle to the next, despite the best admissions and marketing efforts, reality hits home sometime during the spring weeks of the cycle. And as the clamor for new students grows louder, the discount rate starts to climb.

Hence President Stewart’s call to focus on this metric, which, at least in part, represents the value of the education and experience offered at Earlham College.

So you might ask yourself: despite all the planning and hard work over the past 10 years, why has the Earlham experience become less valuable rather than more valuable? What is it about other post-secondary experiences that makes the Earlham experience less and less attractive? When the Earlham experience is discounted by 77%, how can we expect others to believe in its value? Without pointing fingers, I think we all need to wrap our heads around these questions.

As always, you are invited to leave a comment or contact me directly.

Polly Albright
Director, IR and Assessment

6-year grad rate: perhaps the most important guardrail

Guardrail: an extra rail alongside the main rail of a railroad at a crossing etc., as to keep the cars on track (Collins English Dictionary)

You may recall the January 23 convocation when Interim President Avis Stewart, Academic Dean Becky Thomas, and Chief Finance Officer Stacy Davidson outlined the recent budget reductions and called us all to envision the future of Earlham College. As I heard it, the message was a call to turn the page.

In his remarks, Stewart talked about three “guardrails,” or key indicators, that will help keep the College community focused on financial sustainability; in other words, to “keep the cars on track.” The indicators he described are:

Guardrail                  Parameter
Discount Rate              65-70%
Student:Faculty Ratio      12 : 1
6-year Graduation Rate     72%

In this post, I want to address what is perhaps the most important guardrail: graduation rate. Of the three, this one speaks most directly to the College mission: to provide the highest quality liberal arts education.

But first, another definition. The six-year graduation rate is the number of students in a given entering Fall cohort who graduate within six years of entry, divided by the size of that cohort. A first look at our retention/graduation rate data shows that Earlham has some work to do in this area: the most recent cohort rates all fall well below 72%.
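As a minimal sketch of that definition, with a hypothetical cohort (the numbers are illustrative only, not Earlham data):

```python
# Six-year graduation rate: graduates within six years of entry / entering cohort size.

def six_year_grad_rate(graduates_within_six_years: int, entering_cohort: int) -> float:
    """Return the six-year graduation rate for one entering Fall cohort."""
    return graduates_within_six_years / entering_cohort

# e.g., 192 graduates from a hypothetical 300-student Fall cohort
print(f"{six_year_grad_rate(192, 300):.1%}")  # 64.0%
```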

(Table: Earlham College six-year graduation rates for the 2009, 2010, 2011, and 2012 cohorts.)

Further, this linked 20-year trend chart illustrates that the 6-year rate for the 2012 cohort (63.9%) dips to a serious low point, in fact the lowest in at least 20 years.

In addition to the trend shown above, among GLCA and ACM institutions Earlham ranks 19th out of 26. Only Albion, Ohio Wesleyan, Wabash, Coe, Cornell, Monmouth, and Ripon are at or below the Earlham cohort average rate. By contrast, Earlham ranks second highest when compared to HCAC institutions; Defiance shows the lowest average rate, 49%. (Note: these averages are based on three cohorts going back 8 years. Data are collected by the National Center for Education Statistics via IPEDS.)

Finally, I did a quick calculation for current Earlham cohorts, looking at only the four cohorts on campus today. I applied the 72% “guardrail” graduation rate (instead of the current 67.1%) and found that, at the higher rate, there would be approximately 57 more students on campus at this moment. I think we could all celebrate 57 more Earlhamites, diploma in hand, ready to make a difference in the world.
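The back-of-envelope behind that figure: apply the gap between the 72% guardrail and the current 67.1% rate to the entering sizes of the four cohorts now on campus. The cohort sizes below are hypothetical (chosen only to illustrate the arithmetic), not actual enrollment data:

```python
# Hypothetical entering sizes for the four cohorts currently on campus.
cohort_sizes = [290, 291, 291, 291]  # illustrative only, not actual data

guardrail_rate = 0.72   # target six-year graduation rate
current_rate = 0.671    # current average rate

# Additional students retained if every cohort hit the guardrail rate
extra_students = round(sum(cohort_sizes) * (guardrail_rate - current_rate))
print(extra_students)  # 57
```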

As always, I welcome questions and comments via this blog or directly to my e-mail.

Polly Albright
Director, IR and Assessment

Additional Resource:
Single Cohort Comparison (2011) – GLCA, ACM and HCAC

A few NSSE takeaways

It’s been a while since my last check-in, but I want to share some results from the National Survey of Student Engagement (NSSE). While this was the topic of the October Faculty Forum, I also want to get the word out through this blog. My understanding is that the TLC team will be following up with information on the “Effective Teaching Practices” engagement scores. Below are some details in other areas:

High Impact Practices (HIP)
(research, internships, service learning, study abroad, capstone)
Percent of students reporting at least one HIP (** significant at the p<.01 level):

At least one HIP       Earlham   Peer group   More Selective group   National norm
First-Year Students      58%        55%            43%**                 58%
Seniors                  99%        98%            93%                   86%**

Student-Faculty Engagement
For both first-year students and seniors, this score is on par with the peer and more selective groups, but significantly higher than the national norm. (*** significant at the p<.001 level)

Student-Faculty Interaction Score   Earlham   Peer group   More Selective group   National norm
First-Year Students                   23.6       23.5            22.1                 21.1*
Seniors                               30.4       30.4            29.6                 23.9***

Discussions with Diversity
This one is a mixed bag. The overall score for both first-year and senior students is significantly higher; however, the score on one of the contributing questions is significantly lower than in all comparison groups: “How often have you had discussions with people who hold political views other than your own?” (% answering “Very often” or “Often”)

Frequency of Political View Discussions   Earlham   Peer group   More Selective group   National norm
First-Year Students                         56%        63%            58%                  66%
Seniors                                     35%        58%            54%                  66%
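For readers curious how a gap like the seniors’ 35% versus the 66% national norm gets flagged as significant, here is a generic two-proportion z-test sketch. This is an illustration, not NSSE’s actual methodology, and the sample sizes are invented:

```python
from math import sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error
    return (p1 - p2) / se

# Proportions from the table above; sample sizes are hypothetical.
z = two_proportion_z(0.35, 150, 0.66, 5000)
print(z)  # a large negative z: seniors report these discussions far less often
```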

The highest-performing single-item scores (compared to the peer group) were:

  • Discussions with people of a race/ethnicity other than your own
  • Discussions with people with religious beliefs other than your own
  • Institution emphasis on encouraging contact among students from different backgrounds
  • Tried to understand someone else’s view by imagining a different perspective

Lowest-performing single-item scores – First-years

  • Evaluated what others have concluded from numerical information
  • Spent more than 15 hours per week preparing for class
  • Quality of interaction with other administrative staff and offices
  • Reviewed your notes after class

Lowest-performing single-item scores – Seniors

  • Institution emphasis on providing support for your overall well-being
  • Institution emphasis on helping you manage your non-academic responsibilities
  • Quality of interaction with other administrative staff and offices
  • Asked another student to help you understand course material
  • Discussion with people with political views other than your own

Related Links:
Engagement Indicators Report
High Impact Practices
Faculty Forum Presentation
More about NSSE

As always, if you have questions, don’t hesitate to contact the IR Office.
Polly Albright
Director, IR and Assessment

Fall 2018 Enrollment Update

Fall 2018 enrollment figures have been posted since September. In case you missed it, I wanted to share the latest numbers and the five-year trend:

Earlham College                Fall 2014   Fall 2015   Fall 2016   Fall 2017   Fall 2018
Fall Undergraduate Headcount      993         988        1031        1060        1050
Fall Graduate Headcount            81          79          71          66          52

You will note Fall 2015 as a low point for undergraduate numbers (at 988) and now the beginning of what might be a climb, despite the ~1% drop between 2017 and 2018. Graduate numbers appear to be in a steady decline.
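For what it’s worth, the ~1% figure checks out directly against the headcounts above:

```python
# Year-over-year change in Fall undergraduate headcount (from the table above)
fall_2017 = 1060
fall_2018 = 1050

change = (fall_2018 - fall_2017) / fall_2017
print(f"{change:.1%}")  # -0.9%
```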

Retention rates are showing a return to historic levels, with the Fall 2017 cohort back at 81.9% after the uptick with the 2016 cohort. This rate is the subject of much work by the Completions Working Group and will be covered in a future blog post.

Meanwhile, if you have comments and/or questions please share in the comments section here, or contact me directly.

Polly Albright
Director, IR and Assessment


Over the course of the academic year, I find myself engaged in many conversations about institutional assessment. And every time, absolutely every time, the default position is “let’s field a survey.” Seems like a painless way to gather information, right?

Not so fast.

There are at least a few words of caution related to the use of surveys. Two of those words are “survey” and “fatigue.” In fact, just a couple of years ago, the Advisory Committee on Assessment authored guidelines and a scheduling process to help guard against this phenomenon.

However, survey fatigue is only a symptom. There is an entire philosophy around the use of “big surveys” that should be considered. In my work, I have come to view survey data like those big green road signs that span six lanes of the interstate. It goes like this: we (Earlham College, along with other colleges and universities) are traveling along a highway. The statistics offered through institutional surveys are like those signs that guide thousands of cars and trucks all motoring in the same direction, toward the land of student success and balanced budgets.

The signs may indicate a short distance to the next exit. They may mark the fact that we are fully two car-lengths ahead of, or perhaps behind, the other vehicles. Or they might show that we’re traveling a bit faster than we were last year at this time. While I’ve been using this analogy in my own thinking, I have a harder time communicating it to my colleagues.

Fortunately, just this week, our friends from the Center of Inquiry and the HEDS Consortium, Charlie Blaich and Kathy Wise, shared hot-off-the-press research about this very topic: “Approaching Big Survey Data One Byte at a Time.”

Blaich and Wise make a distinction between “data” and “evidence.” They define “data” as the statistical information gleaned from big surveys, such as means, medians, and effect sizes.

And evidence? “We define evidence as the consensus that emerges from conversations that you and your colleagues have to make sense of the data and identify its most important elements” (p. 4). Sounds like it fits well with the Earlham ethos, don’t you think? They have adopted the word “sensemaking” to describe these analytical discussions.

“Sensemaking is talking with one another to form a consensus around identifying and interpreting events or other information to pave the way for action” (p. 4).

I have to agree. In fact, I know from experience that a big survey on its own, or discussion alone, usually leads to more questions and no action. The assessment process is best served when our own domain experts, equipped with hard information, are able to find the narrative behind the data. The marriage of the quantitative and the qualitative can be powerful.

I urge you to give this article a read. And if you would like to engage in some “sensemaking,” I would be happy to provide the big survey data and would be thrilled to assist. But be warned: I would expect you to gather your senses for some serious sensemaking.

As always, comments and questions are welcome.

Polly Albright
Director, IR and Assessment

Hello, you may remember our good friend NSSE

NSSE is back!

Right around mid-March, we will be asking students to participate in the National Survey of Student Engagement (NSSE), an important institutional assessment. This will be the first of a few blog posts to help educate and remind the community about this important data collection. Our last administration was in Spring 2015.

This administration will be exciting for two reasons:

  1. It will be the second administration at Earlham since the 2013 overhaul of the survey itself. This means we will now have trend data.
  2. We are participating in a pilot module for this survey that includes second-year-student responses and adds questions related to student and residence life.

For more reports from the 2015 survey, try these:
Engagement Indicators – themed constructs and comparisons with peers and national norms
High-Impact Practices – data on research, internship and service learning experiences
Frequencies Report – includes question-by-question frequencies, means and comparisons
Academic Advising – supplementary report including 2012 and 2015 data

Comments and questions are welcome.

Polly Albright
Director, IR and Assessment