
Addressing Critics of the PayScale College ROI Rankings


In a previous blog post, I discussed the methodology of the PayScale College Return on Investment (ROI) Rankings, which examine the financial return of attending 853 different U.S. colleges and universities.

This ranking has garnered a lot of attention and has played a role in the public debate on the financial value of a college education. While the majority of attention has been positive and the report overall has been well-received, there are some criticisms.

As researchers, we welcome feedback on our data reports and are always looking for ways to improve upon them. However, many of the criticisms we've heard are based on simple misunderstandings of our methodology or the report overall. In this blog post, I will address the points brought up by critics of the ROI Rankings and attempt to clear up any confusion.

Responding to Critics of the College ROI Methodology

As mentioned earlier, we've seen some media pieces stating that our methodology for the College ROI Rankings is unsound. However, the specific arguments made in these pieces show that our methodology was either (a) not inspected thoroughly or (b) not understood. I tried to address (b) in a previous post, and now I want to address (a).


In an article titled “Flaws Revealed in PayScale Ranking” featured on the Seton Hill University blog, Setonian Online, PayScale was accused of “purposely [choosing] to inflate methodological flaws in order to achieve and publicize [our] rankings.” The article goes on to say that we calculate our ROI measures by “taking into consideration the cost of attending SHU and the median annual salary of alumni.”

As discussed in our methodology statement, our methodology is more complicated than a simple difference between the cost of attending and the earnings of alumni. Furthermore, PayScale is an impartial source of information. We would never alter or “inflate” our methodology in an effort to publicize our rankings. We simply calculate the rankings to enrich the ongoing debate around the value of education.

The article goes on to say, “The study failed to take into account aspects like financial aid.” In fact, we show the rankings two ways — with and without average financial aid amounts included so that individuals can decide which version applies to their specific situation.
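For readers who want to see how the two views differ, here is a rough Python sketch. The formula is deliberately simplified and the dollar amounts are invented; it is meant only to show how financial aid changes the cost side of the calculation, not to reproduce the actual ROI formula from our methodology statement.

```python
# Simplified illustration only: invented numbers and a reduced form of the
# net-ROI idea, not the actual calculation from the methodology statement.

def net_roi(projected_earnings, total_cost, avg_financial_aid=0.0):
    """Projected alumni earnings minus what the degree actually cost."""
    return projected_earnings - (total_cost - avg_financial_aid)

projected_earnings = 900_000  # hypothetical median alumni earnings over a career
sticker_cost = 120_000        # hypothetical four-year cost of attendance
avg_aid = 40_000              # hypothetical average financial aid package

print("ROI without aid:", net_roi(projected_earnings, sticker_cost))
print("ROI with aid:   ", net_roi(projected_earnings, sticker_cost, avg_aid))
```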

Lastly, the article makes the statement, “Payscale also didn’t account for graduate students and gender factors.”

We readily admit we do not include graduate students in our study. That exclusion is intentional, and the reason behind it is simple: it is difficult to tease apart the portion of earnings driven by an undergraduate education from the portion driven by a graduate education.

For example, say we have a student with a bachelor’s degree from Harvard and a master’s degree from MIT. Are their earnings driven by their bachelor’s degree education, their master’s degree education, or some combination of both?

Due to this ambiguity, we chose to focus this study on those who hold only a bachelor's degree, so we can isolate the earnings of alumni from a given school without other schools potentially influencing those earnings. We have also done studies of MBA and other master's degree earnings, but for this particular report we chose to focus solely on bachelor's degrees, the highest degree level that the majority of college alumni attain.

In terms of "gender factors" as they relate to pay, the gender wage gap is generally overstated, as shown in previous posts, so gender factors tend to play a very small role in the results.

As another example of a misrepresentation of the PayScale College ROI Rankings methodology, see an article written by Anderson University, which claims, “The report uses industry averages for compensation rather than actual salary levels of Anderson University alumni which are not publicly available.”

As discussed in the previous blog post and our methodology, we calculate median pay for a school’s alumni by utilizing profiles in our database for bachelor’s degree graduates from that school. We are not using industry averages nor alumni data provided by the schools. Rather, we use data provided directly by alumni.
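In code, that calculation is essentially a filter followed by a median. The sketch below uses invented profile records and field names purely for illustration; it is not our production pipeline.

```python
from statistics import median

# Hypothetical profile records reported directly by alumni (not real data).
profiles = [
    {"school": "Anderson University", "highest_degree": "bachelors", "pay": 52_000},
    {"school": "Anderson University", "highest_degree": "bachelors", "pay": 61_000},
    {"school": "Anderson University", "highest_degree": "masters",   "pay": 75_000},
    {"school": "Other University",    "highest_degree": "bachelors", "pay": 58_000},
]

def school_median_pay(profiles, school):
    # Keep only alumni of this school whose highest degree is a bachelor's,
    # so no other school's education is influencing the pay figure.
    pay = [p["pay"] for p in profiles
           if p["school"] == school and p["highest_degree"] == "bachelors"]
    return median(pay) if pay else None

print(school_median_pay(profiles, "Anderson University"))  # 56500
```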

College ROI: Responding to Critics of PayScale’s Sample

Some critics have expressed concern about PayScale’s sample base. These critiques have fallen into two general camps: 1) PayScale’s sample is biased as it is not randomly selected and 2) PayScale’s sample of alumni from a given school may not be representative of the alumni from that school.

Sample Bias Discussion

For those of you who are not familiar, sample bias is a statistical term for a sample collected in a way that makes some members of the intended population more or less likely to be included than others. Sample bias is likely to occur in online surveys, where a nonrandom sample of people choose to participate.

Given how we collect our data, it would be possible for our sample to suffer from sample bias as people who are curious about their place in the labor market are the ones who choose to participate in our survey. We do not approach people randomly with an incentive to build a PayScale profile.

However, knowing that, we have compared our sample with several other collection methodologies (e.g., Census data, Bureau of Labor Statistics data, other compensation surveys, etc.) and found no systematic bias in our data when we did an apples-to-apples comparison, e.g., comparing information for workers with similar traits. We routinely do these types of comparisons.
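Concretely, an apples-to-apples comparison means grouping workers by comparable traits (job, experience, and so on) and comparing our median for each group to the external source's median for the same group. The sketch below uses invented medians standing in for figures from a source like the Census or BLS.

```python
# Hypothetical medians by worker group; the "benchmark" values stand in for
# an external source such as Census or BLS data.
groups = {
    # (job, experience bucket): (payscale_median, benchmark_median)
    ("registered nurse", "5-9 yrs"):   (68_000, 66_500),
    ("software developer", "0-4 yrs"): (72_000, 73_000),
    ("retail supervisor", "10+ yrs"):  (41_000, 40_000),
}

for (job, exp), (ours, benchmark) in groups.items():
    # A consistent, one-directional gap across groups would suggest systematic bias.
    pct_diff = 100 * (ours - benchmark) / benchmark
    print(f"{job} ({exp}): {pct_diff:+.1f}% vs. benchmark")
```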

Generally speaking, there are two overall biases to our data of which we are aware and for which we control in our College ROI Rankings:

  1. Our data skews slightly younger than the U.S. workforce.
    • Our data collection rate for workers 55 years old and older is roughly half of our data collection rate for workers in their 20s to 40s.
  2. Our data skews slightly more white collar than the U.S. workforce.
    • Generally, we have a higher proportion of data for white-collar and higher-wage jobs (like IT and nursing), and a lower proportion of data for blue-collar and lower-wage jobs (like agriculture and retail).

However, neither of these general biases would impact the results reported in the College ROI Rankings. For the first, we focus on workers over a projected 30-year career following college graduation. Given that people typically graduate in their early to mid-20s, we are generally including people who are in their mid-50s and younger. For the second, since the rankings are for college graduates, who tend to be white-collar workers, the white-collar bias does not play a role in our pay numbers.

Representative Sample Discussion

Another critique we’ve heard is that our sample might not be representative of a given school’s alumni. Similar to the sample checks we do to look for sample bias, we do school-specific checks to see how our samples compare to outside sources.

We look at two things: 1) the proportion of graduates from a given school relative to all college graduates; and 2) the breakdown of majors at each given school.

For the first examination, I present a chart illustrating how our representation of a school's graduates, relative to all graduates from 1,000 degree-granting institutions, compares to the graduation numbers reported by the Integrated Postsecondary Education Data System (IPEDS) for the same set of schools.

[Scatter plot: School Representation, PayScale graduate share vs. IPEDS graduate share by school]

As the above scatter plot shows, there is a strong linear correlation between PayScale data and IPEDS data. Although there is a slight over-representation of small schools and a slight under-representation of large schools, generally speaking, the amount of data we have is proportional to the size of the school.
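Anyone who wants to replicate that check can do so with two columns of school-level shares: each school's fraction of our profiles versus its fraction of IPEDS-reported graduates. The sketch below uses invented counts for a handful of schools.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical graduate counts for a handful of schools (not real data).
payscale_profiles = [1200, 450, 300, 2200, 800]
ipeds_graduates = [6000, 2100, 1700, 11500, 3600]

def shares(counts):
    # Convert raw counts to each school's share of the total.
    total = sum(counts)
    return [c / total for c in counts]

r = correlation(shares(payscale_profiles), shares(ipeds_graduates))
print(f"Pearson r between PayScale and IPEDS school shares: {r:.3f}")
```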

Beyond this general representation check, we also examine the breakdown of majors in our sample for a given school compared to the breakdown of majors conferred by that school according to IPEDS. As an example, I present a table comparing the major breakdown of our sample of University of Washington bachelor's graduates from 2011 to the major breakdown reported by IPEDS.

Note: For brevity, I focus on the 10 most common majors for UW bachelor's graduates, as determined by the number of 2011 graduates in the IPEDS data. Even though the numbers below cover only 10 majors, each percentage is computed as a share of all majors.

| Common Majors for UW Bachelor's Graduates | Percent of PayScale's 2011 Bachelor's Graduate Sample | Percent of IPEDS 2011 Bachelor's Graduate Sample |
| --- | --- | --- |
| Biology | 4% | 6% |
| Psychology | 3% | 5% |
| Political Science | 4% | 5% |
| Economics | 6% | 5% |
| English | 4% | 5% |
| Sociology | 3% | 4% |
| Biochemistry | 2% | 3% |
| Accounting | 4% | 2% |
| Finance | 3% | 2% |
| History, General | 2% | 2% |

The table above shows that the relative major breakdown in our sample is largely in line with the major breakdown reported by IPEDS. One important note: we cannot tell from the IPEDS numbers how many graduates go on to earn higher degrees. Those people are excluded from our sample but not from the IPEDS counts, which is likely why our representation of science and psychology majors falls slightly below the IPEDS figures.
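If you want to run the same check for another school, the comparison is simply two sets of percentages, each computed over all majors. The sketch below uses invented graduate counts chosen only to illustrate the arithmetic.

```python
# Hypothetical counts of 2011 bachelor's graduates by major, for one school.
payscale_sample = {"Biology": 40, "Psychology": 30, "Economics": 60, "Other": 870}
ipeds_counts = {"Biology": 360, "Psychology": 300, "Economics": 300, "Other": 5040}

def pct_of_all_majors(counts):
    # Each major's percentage is taken over graduates in all majors.
    total = sum(counts.values())
    return {major: 100 * n / total for major, n in counts.items()}

ours = pct_of_all_majors(payscale_sample)
theirs = pct_of_all_majors(ipeds_counts)
for major in ("Biology", "Psychology", "Economics"):
    print(f"{major}: PayScale {ours[major]:.0f}% vs. IPEDS {theirs[major]:.0f}%")
```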

As previously mentioned, we welcome critiques of our studies and are more than willing to examine them. That being said, we stand behind our findings and have many supporters of the rankings who have found a lot of value in examining the ROI for bachelor’s degrees by school.

The majority of criticism we've received has come from schools that perform poorly in the ROI Rankings. As one commenter states on the Democrat and Chronicle story titled “College Costs Rise as Tuition Spirals Upward,” which references the PayScale College ROI Rankings, “I suspect it is the conclusions more so than the methodology that have the college perturbed.”

Regards,

Katie Bardaro
Analytics Manager, PayScale, Inc.
