Tagged: rankings

The Brookings Common Sense Ranking


By now, you’ve likely heard of the newly released College Scorecard, which gives prospective college applicants and their families information about each school’s annual net price for federal financial aid recipients, its graduation rates, and the earnings of its former students (those who received federal financial aid) 10 years after entering the school. A recent article from The New York Times argues that the College Scorecard, though not a college ranking system, “suffers from many of the same flaws that afflict nearly every other college ranking system: There is no way to know what, if any, impact a particular college has on its graduates’ earnings, or life for that matter.” It’s that “or life for that matter” that is the important distinction to consider! It makes us stop and think: is measuring a school’s strengths primarily by graduates’ future earnings too narrow an approach to college rankings, or is it a crucial indicator of how well a college prepares its students for later success? Professor Muller of Catholic University of America tackles this question and is quoted in the article as saying, “To rank the value of colleges based on the ultimate earnings of their graduates radically narrows the concept of what college is supposed to be for.” This is definitely food for thought…

Which factors should be measured and scrutinized when thinking about how best to rank colleges? Some would suggest that colleges shouldn’t be ranked at all (after all, aren’t all rankings a bit arbitrary?), while others are exploring interesting new attributes of schools and new ways to measure and quantify their strengths and weaknesses. For instance, as was written in an earlier post, the Brookings Institution has developed a ranking system that incorporates a “value-added approach to assessing two- and four-year schools,” essentially comparing expectations for students to how they actually fare after graduation. As the Brookings Institution puts it, “One goal of the value-added measures is to isolate the effect colleges themselves have on those outcomes, above and beyond what students’ backgrounds would predict.” It’s an interesting concept. Still, it’s difficult to truly separate the effects of attending a specific college from the innate qualities, outside influences, and formative experiences of the students who make up its student body (e.g., family background, socioeconomic status, personality and work ethic, grit, perseverance, drive, innate intelligence), and it’s important to note that the outcomes Brookings considers are primarily measures of economic well-being.

Economic well-being is, to a great extent, correlated with whether a school has a strong STEM (science, technology, engineering, and math) curriculum, as these majors often lead to high-paying jobs. That brings me to what I most appreciated about this particular New York Times article and what I want to share with you today. Click through to the article and be sure to scroll down to the bottom of the page; there you’ll find what the author calls The Brookings Common Sense Ranking, which is essentially the aforementioned value-added assessment of schools with the curriculum component taken out of the equation. Colgate University tops the list, and the Ivies are nowhere to be seen! A ranking system where no Ivy League school breaks into the top 20?! Wheels, start turning.


New Ranking System On The Block


You’re probably aware of some of the popular college ranking systems out there, such as those produced by U.S. News & World Report, Forbes, Money Magazine, Education to Career, and (for world university rankings) The Times Higher Education. Each source utilizes different indicators to compile and compute its list, and information from their respective websites reveals what each has to say about its own system:

  • U.S. News & World Report – Our rankings “allow you to compare at a glance the relative quality of institutions based on such widely accepted indicators of excellence as freshman retention and graduation rates and the strength of the faculty.” For more information on their methodology, click here.
  • Forbes – “We ignore the abstract (reputation) and wasteful (spending-per-student) to focus on one measurement: outcome. From student satisfaction and graduation rates, to career success and student debt, this ranking counts what matters.”
  • Money Magazine – Our rankings indicate which schools “deliver the most value—that is, a great education, at an affordable price, that helps students launch promising careers.” See here for more information on their methodology.
  • Education to Career – Our index “analyzes the quality of students when they enter a given college, the total costs related to attending the college, and the outcomes of the students when they enter the labor market. The rankings results are determined by which schools did the best job of improving the earnings and attainment of quality employment of their students.”
  • The Times Higher Education – Our list “judges world-class universities across all of their core missions – teaching, research, knowledge transfer and international outlook.”

However, as Forbes itself pointed out, two different college ranking systems can yield very different results. In the article, Forbes gives this example, comparing its 2014 ranking of the Top 10 Liberal Arts Colleges to the rankings from U.S. News & World Report:

[Table: Forbes vs. U.S. News & World Report 2014 Top 10 Liberal Arts Colleges]

Both agree on Williams as the top liberal arts college, but there are also some noteworthy differences between the two lists, such as Forbes’ inclusion of the United States Military Academy, Dartmouth, and Davidson, and U.S. News’ inclusion of Wellesley, Middlebury, and Claremont McKenna College.

To add to these lists, there’s a new ranking system on the block, one that is earning a reputation for “uncovering hidden gems.” Allow me to better acquaint you with the system the Brookings Institution introduced last May, one that “goes beyond college rankings” by taking a “value-added approach” to assessing schools. As this policy report from Brookings elaborates:

“Value-added, in this sense, captures the benefits that accrue from both measurable aspects of college quality, such as graduation rates and the market value of the skills a college teaches, as well as unmeasurable ‘x factors’, like exceptional leadership or teaching, that contribute to student success.  (One goal of the value-added measures) is to isolate the effect colleges themselves have on those outcomes, above and beyond what students’ backgrounds would predict.”

This is a very cool concept and, as the policy report notes, it has helped unveil some lesser-known colleges and universities, such as Indiana’s Rose-Hulman Institute of Technology and Minnesota’s Carleton College. Now, I’m not here to tell you that one system of ranking colleges is better or worse than another. Rather, I want to draw your attention to the fact that there are many different ranking styles and lists out there, each using a slightly different methodology to calculate its findings. Be aware of those differences. One goal of the new Brookings rankings is to “make them more relevant” than other ranking systems, but whether you think they’ve succeeded in that aim is up to your personal discretion. As ever, do your research, go out, and conquer!