One of the biggest challenges of transitioning to college is time management. Students find many different approaches to solving this problem, but one piece of advice we recommend is to avoid time-wasting activities — those that serve no purpose and bring no joy. Plenty of examples will sound familiar to students at Tufts: arguing with strangers on Sidechat, refreshing Instagram for the millionth time in a day, or queuing for the main bathroom in Tisch (it's almost always faster to use the ones downstairs). A reasonable addition to this list of activities you'd be better off without is worrying about — or even thinking about — Tufts' place in the college rankings.
There is something almost irrational in trying to distill every aspect of a university into a single measure of its quality. When US News & World Report releases its "Best Colleges" ranking, it's not immediately obvious what, exactly, the best colleges have more of than their lower-ranked peers. The answer, it turns out, is an arbitrarily weighted average of statistics like test scores, alumni donation rates, and survey "peer ratings" in which college administrators assess their rivals. Many of these statistics are, of course, beneficial to the public — a prospective student deserves to know a college's graduation rate and typical class sizes. The aggregate score, however, is an opaque value that reveals little about life at any given college. The whole, in this case, is less useful than the sum of its parts.
Unfortunately for applicants researching colleges, even the disaggregated statistics collected by ranking services can be misleading. A college may, for example, narrowly tailor its own policies to create the appearance of a substantive change. Northeastern University decided to cap its small classes at precisely 19 students because US News has historically rated class sizes by the percentage of classes with fewer than 20 students. In the most egregious cases, universities simply lie about their data — as at Columbia, which fell from No. 2 in last year's US News rankings to No. 18 this year after a professor in its own mathematics department accused the university of submitting data that was "inaccurate, dubious or highly misleading."
Rankings also work against the societal goals of many institutions. A college aiming to do the most good for the world will seek out students who have the greatest potential to benefit from higher education. Traditional ranking systems, however, encourage colleges to chase often arbitrary criteria that signal eliteness rather than tangible measures of positive impact on students and society. For example, in calculating its scores, US News weights social mobility measures at 5% of a school's final score, while graduation rate and selectivity statistics count for 8% and 7%, respectively.
Selectivity is calculated from the standardized test scores of a university's incoming freshman class. Students with higher family incomes and better access to resources tend to do better on standardized tests, so a college interested in moving up the ranks will place a high priority on test scores in the admissions process. Similarly, graduation rates are positively correlated with family income; by favoring schools with higher graduation rates, the ranking system also favors schools with wealthier students. In both respects, the rankings reward the wealth of a college's student body rather than the good the college does for it.
Data published in Forbes show that low-income students make up more than 65% of college dropouts. Conversely, students from households earning $100,000 or more are 50% more likely to graduate than their peers from low-income households. These disparities are even starker for first-generation students: approximately 90% of low-income, first-generation students do not graduate within six years. In the context of social mobility and societal well-being, first-generation, low-income (FGLI) students arguably have the most to gain from a college education, which can increase lifetime earnings by hundreds of thousands of dollars — yet current ranking systems create environments that may discourage colleges from admitting students from these backgrounds.
Although college rankings are ostensibly intended to help students choose the college that offers them the best opportunities, these lists' promotion of selectivity and elitism distracts from the true promise and benefits of higher education. Last month, Education Secretary Miguel Cardona publicly criticized ranking systems that prioritize the exclusivity of many highly selective institutions over a university's ability to meet the needs of its students, calling them "a joke."
In response to such criticisms, a few alternative lists that rate colleges on more relevant metrics have emerged. The New York Times published a study in 2017 rating colleges by the socioeconomic stratification of their student bodies and the economic mobility of their graduates. Tufts, like many elite universities, ranked among the top 10 schools with more students from the top 1% of the income distribution than from the bottom 60%; elite colleges were conspicuously absent from the top of the list measuring the post-graduation social mobility of low-income students. Washington Monthly magazine likewise publishes ranked lists of colleges based on metrics such as social mobility, research, public service, and return on investment. One such list, which estimates college affordability for low-income students, ranks Tufts 209th among Northeast colleges.
As students, we must push our universities to adopt policies that benefit every student, regardless of the impact on rankings. Ending legacy preferences in admissions, for instance, has been shown to increase economic diversity among students, with a notable rise in the percentage of Pell Grant recipients. In doing so, universities would prioritize accountability to their students over arbitrary measures of prestige and exclusivity among peer institutions. On a societal level, it's time to reassess the usefulness of rankings such as US News'. While access to a range of publicly available metrics for educational institutions can be helpful to prospective and current students, alumni and employers, we must choose to evaluate universities on our own terms and not give in to the often arbitrary rankings handed to us each year.