Like many other computer science concentrators at Brown, I began searching for a summer internship early last fall. In the weeks leading up to winter break, I saw many of my peers notify their LinkedIn connections that they would be spending their summers working for top tech companies. To my surprise, the vast majority of these positions were concentrated in Silicon Valley. The appeal of these offers is plain — the pay can be dramatically higher than alternatives in computer science research and academia. But I was still shocked that even Brown students — as socially concerned as they generally are — were not drawn to more impactful and experimental career paths. Frankly, industry salaries tell us little about how far we can actually advance technology itself, a consideration that ought to weigh far more on students who’ve had the world-class education that Brown offers.
Starting salaries at big tech companies can exceed $150k annually — and that’s before the bonuses and lucrative stock options they also offer. However, junior software engineers at these companies are essentially cogs in gigantic corporate machines: one of thousands of capable engineers contributing to a codebase that is likely millions of lines long. These systems have already been proven to function extremely well — it wouldn’t make sense to let a new grad attempt meaningful changes to software that generates millions of dollars. This dynamic isn’t inherently negative, but it does limit the potential for creative engagement and innovation. Even top-performing engineers may win major promotions and multiply their salaries several times over (to a figure well above even $500k annually) before they are given any real latitude to experiment with development methods that are not already tried and tested.
While the average computer science academic makes significantly less — around $103k for tenured and tenure-track faculty at public institutions, according to a study from the National Education Association — they also enjoy a degree of autonomy that is nearly impossible to match at an established company. Although many researchers — especially at universities — are under pressure to publish frequently and obtain grants, their superiors have far less control over the topics they investigate or the way they conduct their scholarship. In fact, many computer science researchers take full individual responsibility for the entire research process, from ideation to impact, shaping every decision along the way. Moreover, the success of their work is not closely tied to how easily their findings can be monetized. Thus, unlike their industry counterparts, researchers have the freedom to pursue less established and more experimental methods.
In the last five years, academia and industry have made substantially different contributions to machine learning. While machine learning is an important part of the work many large tech companies do, these companies have largely settled into techniques first developed nearly a decade ago. It’s common practice to rely on libraries such as scikit-learn or TensorFlow, which make it simple to apply popular, well-established machine learning models to desired applications. These libraries are helpful, but they are also standardized, making it difficult for their users to innovate with new machine learning techniques. Instead, it is academic researchers who have made the most crucial contributions to the field lately, whether by developing new models that improve performance or by deepening our theoretical understanding of how deep learning works.
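To illustrate the standardization the column describes, here is a minimal sketch — assuming scikit-learn is installed, and using a made-up toy dataset — of how a handful of lines suffice to train and apply an off-the-shelf model without touching the underlying algorithm:

```python
# A hedged sketch, not production code: scikit-learn's uniform fit/predict
# interface lets an engineer apply a standard model in a few lines.
from sklearn.linear_model import LogisticRegression

# Hypothetical toy dataset: points labeled 1 when y > x, else 0.
X = [[0, 1], [1, 0], [1, 2], [2, 1], [2, 3], [3, 2]]
y = [1, 0, 1, 0, 1, 0]

# The library hides optimization, regularization and numerical details.
clf = LogisticRegression().fit(X, y)
preds = clf.predict([[0, 3], [3, 0]])
print(preds)
```

Swapping in a different model class (say, a decision tree) changes one import and one line — convenient for shipping products, but the interface itself offers no path to inventing new techniques.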
While most fields see differences between the tasks performed in academia and in industry, this disparity is even more concerning in the tech sector given the dominance of big tech today. Academic research is underpinned by the idea that advancing our knowledge ought to be prioritized over monetizing it. While there is significant funding for academic research in tech, the private sector, particularly FAANG, has access to just as much, if not more, money for research. These companies’ massive assets allow them to pay extraordinarily high salaries to new employees, not to mention the plethora of amenities they provide. Because most academic research simply isn’t designed to net comparable profit, academic researchers’ compensation falls short, irrespective of its true value in shaping the field. This financial disparity often drives people away from academia and toward big tech.
Although some students genuinely require the additional money that a career in industry could provide — to pay off debts, support family members or deal with medical emergencies — not everyone has these kinds of motivations. Many are driven to industry strictly on the basis of salary, whether or not they feel truly satisfied by the work they’re doing.
While neither academic researchers’ contributions nor their expertise is adequately recognized in their compensation, the influence of their work in defining the future of computer science is immense. Although we can’t disregard monetary compensation when choosing a career path, we must remember that a tech career’s greater value lies not in its financial benefits but in the scientific advancements it generates.
Anika Bahl ’24 can be reached at firstname.lastname@example.org. Please send responses to this opinion to email@example.com and other op-eds to firstname.lastname@example.org.