
There has been a real shift in the culture of academia over the last 10-15 years that coincides with what I'm starting to call "the rise of the Twitter Prof". The twitter thing is both literal and also a stand-in for similar junk like TED talks.

IMO, the US needs to clean house. We should restructure our research funding so that it goes to motivated and curious people instead of politically connected PIs. This means totally disconnecting NSF/NIH/DARPA funding from the university apparatus. Treat academic affiliation -- and the associated albatross of "overhead" -- as a hindrance to grant applications as opposed to a requirement.

Imagine how much better the caliber of our research would be if we paid grad students $100K instead of paying them $35K and their university $65K. We could effectively 10x our research spend by giving money to the right people instead of to people who can stomach and afford handing 2/3rds of their funding to the sad excuses for institutions that American universities have become.

At the very least, tax payers need to stop bailing out student loan debt while simultaneously funding multiple annual week-long conferences in Hawaii and Europe for professors who scoff at their teaching assignments.


Yes, the university taking 60-70% is almost highway robbery, but it is hard to completely disassociate from them. For one, the university can, through economies of scale, reduce the cost of setting up and running experimental facilities. Moreover, one usually finds motivated researchers in academic settings (I am anticipating dissent here, since that could simply be because the funding flows only through universities). However, talented and motivated researchers usually find other work situations which, while not fully aligned with their erstwhile research interests, pay for a good life. Those whose passion exceeds the mundane normalisation of drive (also called life, per some definitions) usually seek refuge in the academic setting.

So unless we can set up labs like the old Bell Labs or find a way to replicate the tenure model of universities, taking funding away from the university mechanism and giving it to individual unaffiliated researchers won't be practical. And even if that can be achieved, you have just replaced universities with a slightly different structure. The better option is to trim the fat and realign priorities at universities, which is also easier said than done. Cushy administrator jobs, student housing mansions with 5-star facilities, large stadium colosseums: all should go. But systemic reasons, societal perception, and the need to justify legacy admissions all factor in here as strong headwinds.


Does anybody really know where the overhead money at top places ends up?

Some institutions (e.g. famous ones) have near 100% overhead. Overhead sounds reasonable for, say, maintaining IT and core facilities, but I suspect a lot of it is wasted in a black hole of unknown function generically labelled “admin”.


Most of the overhead goes towards admins. But it is also important to remember that many universities have multi-billion dollar endowments. I've been told "you don't know what endowments are for" when I've pointed out that my (not top 10) university could indefinitely pay all US students' tuition (and 90% of foreign students') out of the interest on our endowment at 5% yearly growth. And there are 15 schools with endowments over $1M/student -- at 5%, that's $50K per student per year in interest alone. So yeah, I still have no idea what endowments are for.

https://www.collegeraptor.com/college-rankings/details/Endow...


In the US, the size of the endowment affects US News college rankings. So it's not to be touched!


I'm not in the US, but from what it sounds like, the situation isn't much different where I live.

A surprising amount of overhead goes to employees in accounting who perform some black magic so that everyone on staff is paid by some combination of projects. Whether someone actually works on said projects is, of course, irrelevant.


100% seems reasonable for other businesses.

If I had a factory with 100 people assembling my product, or a sales team with 100 people selling it, I'd be quite happy to only budget for the cost of another 100 people to pay my rent and run my IT, payroll, accounting, janitorial services, maintaining my machinery...

I get the impression that modern universities have five or ten other employees for each teacher/researcher, which assuming everyone gets the same pay implies 500-1000% overhead, or subsidy from sources other than grants.


The problem is that they've created a whole hierarchy of useless deans and other admins to eat up all that overhead to justify it.


100% is a bit of an exaggeration. I’ve heard of one place with a 100% overhead rate and it was for something very specialized (marine research vessels, which probably do eat money). Everywhere else I know of is near 50 percent, and this includes “famous” places (Yale, McGill, MIT, Georgetown, etc).

The overhead rates are supposed to reflect the University’s actual costs and are negotiated with the government, so I suppose you could try to make a FOIA request for them.


Most universities have their negotiated Facilities and Administrative (F&A) rates posted online via their Sponsored Research Office. For example, Harvard charges a 68% indirect rate and Stanford about 58%. The public R1 institutions I’ve done business with charge 40-50%.

This doesn’t paint the whole picture though as sometimes additional overhead costs (eg. healthcare benefits) can be budgeted as Fringe Benefits for personnel working directly on the project. In that situation, the F&A rate would stack on top of those costs.
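
To make the stacking concrete, a rough back-of-the-envelope sketch (all rates here are illustrative, and in practice F&A applies to a negotiated base like modified total direct costs, not necessarily to every dollar):

    salary = 50_000
    fringe = salary * 0.30                # assume a 30% fringe benefits rate
    direct_costs = salary + fringe        # 65,000 charged directly to the project
    f_and_a = direct_costs * 0.60         # assume a 60% F&A rate stacked on top
    total_award = direct_costs + f_and_a  # 104,000 is what the sponsor pays

So a "60% overhead rate" can mean the sponsor pays roughly double the salary once fringe and F&A compound.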


If this is true then the universities will win the grants anyways, so why not adopt my proposal?

(It's not, especially for disciplines like CS and math.)


If your proposal is to make all grant applications blind with respect to grant-writer information, it is an idea with a lot of unintended side effects. Firstly, most fields are not Math or CS; those are certainly the most popular in this crowd, especially these days, but without proper facilities most of science cannot be done. By making a grant application anonymous, you dramatically increase the risk of nothing useful coming out of the proposal. A talented independent grant writer (who may or may not be a great scientist) now has a much higher chance of getting the award, but if he doesn't have the ability to execute that great idea, the project will fail. Most often the idea is aspirational with no detail; the breakthrough comes only during execution, and execution requires a team and facilities. This is why things like MURI (multi-university research initiative) were created. Also, a lot of grants are earmarked for early-career researchers without much experience in proposal writing, and those would also go away if awards were anonymous. I am sure there are other points I am missing, and there may be ways to remove the shortcomings, but I don't see a clean and easy path forward with this.

As for math and CS, I am quite positive that at least in Math, most work is done by individual professors/post-docs/students without much involvement of grants. In fact, a lot of universities do reduce the overhead charged to departments engaged in purely intellectual pursuits with no major facilities usage. Not sure all math or CS research will qualify for that because of HPC requirements, but the option is out there. Universities as signals for funding and research success really appear to me like the best of a bad option set.


> So, making an application for grant anonymous, you have dramatically increased the risk of nothing useful coming out of the proposal

To me this sounds not like a critique of the proposed system, but rather like a defense of the current system.

Science is underpinned by the scientific method, which is essentially "form hypothesis -> create controlled environment -> test hypothesis". I do not want to start a debate around expected outcome ratios, but a negative result (the hypothesis being false) is a perfectly normal outcome of the scientific method. The current system tries to defend against rent-seeking by being heavily biased towards positive outcomes. This encourages heavily incremental research, which, a bit paradoxically, can also be seen as rent-seeking. Furthermore, it encourages low-quality science (like p-hacking and similar stuff) and discourages publication of negative outcomes, possibly leading to hitting the same dead ends multiple times.

Blind grant proposals could reduce the bias towards "past performance", decreasing the ratio of positive results, and encourage "riskier", braver hypotheses to be formed and tested.

Past performance as a metric in grant applications encourages research optimization. Some may see this as a good thing, but it discourages early-to-mid career researchers (and even those late in their careers) from pursuing something non-obvious. Safe science is more often than not optimization and edge-case/condition testing.


I think you're missing the point. Negative results are not the issue. If your hypothesis involves NMR, you need access to an NMR machine to test it. Most people outside of a university don't have that and so would fail to execute.


Ok, apparently I did miss your point a bit.

However, your argument is still orthogonal to grant blindness itself. Mere affiliation with a research institution does not guarantee access to certain research instruments (although it highly increases the likelihood, I can agree on that).

This is the thing I tried to explore in my previous comment. The current model practically revolves around researcher "fame" and the grant board "knowing" a researcher's [lab's] ability to execute the technical part of the research. This creates some perverse incentives. Naturally, grant boards become more inclined (consciously or not) to favor certain institutions and researchers, awarding them more grants and further increasing "competitiveness". Researchers become disincentivized to publish negative results and are incentivized to resort to data hacking and irreproducible science. This sure does give rise to the OP's "Twitter prof" and "trust me bro" science. There is much more politics in science than we publicly admit.

The current process does not actually evaluate a grant proposal on the technical ability to execute; that is left to the institutions "underwriting" the proposal. We could start evaluating technical ability regardless of whether grant applications are blind. If peer review meant "peers got roughly similar results in a basic reproduction" instead of "someone in the field looked at the final draft", I guess we could have blind proposals with the advantages of the current system but without its major political disadvantages.

I can agree that an open process does help filter out researchers underqualified to do the research in question, but it comes at a certain cost. And the ability to weed out underqualified teams does not necessarily require the current process; it can be achieved by other means.


Just a slight clarification/correction/question: as far as I can tell, M in MURI is for multidisciplinary. That means the DoD is seeking interdisciplinary research (and maybe intra-university collaboration), not explicitly inter-university work -- is this correct?


There may be separate multidisciplinary initiatives, but "Multi" in MURI definitely stands for inter university initiatives. All the MURI grants I have seen (not many and long ago to be honest) were multiple universities and one area of research.


It’s “multidisciplinary”, but funded programs are almost always large multi-institutional teams: https://www.defense.gov/News/Releases/Release/Article/295323...


> At the very least, tax payers need to stop bailing out student loan debt while simultaneously funding multiple annual week-long conferences in Hawaii and Europe for professors who scoff at their teaching assignments.

I think this is missing the forest for the trees. These aren't the reasons schools cost so much, and in turn why student debt is so high. Meeting with other scholars internationally has been happening for over a hundred years. European schools travel to America a lot because conferences are often here, and their costs aren't what American universities' are, despite similar levels of research output. The high cost is for other reasons. This also isn't why professors ignore their teaching requirements (try finding time to both do research, which is the higher priority, and teach; you sure can't in 40 hours).

As to the rest of the comment, I am in a lot more agreement. I don't even think grad student pay needs to be that high; it would be nice enough if I didn't need to rely on internships for my cost of living, or if I had enough money to pay off my undergrad loans instead of letting them rack up tens of thousands of dollars more.

But one thing I would strongly argue is that research is investing in your own country. Academia is a common source of innovation. It is no surprise that when you put a bunch of smart people together who are passionate about learning, they make new things, especially considering that research is literal innovation. I think there's a strong argument that this is a national security issue, which can justify diverting military budget toward non-weapons research. War is about economics, after all; really it is about a lot of things besides the actual fighting. Innovation has been a key element in America's (and many others') success. It is weird to think that we wouldn't treat it as one of our highest strategic assets.


>> At the very least, tax payers need to stop bailing out student loan debt while simultaneously funding multiple annual week-long conferences in Hawaii and Europe for professors who scoff at their teaching assignments.

> I think this is missing the forest for the trees.

Not at all. People have a sense of right and wrong, and this fact offends them far more than the nebulous task of assessing the effectiveness of research.

I have a PhD and spent a lot of time within and interfacing with academia. I think even the average science-skeptic congressperson massively over-estimates both the short-term and long-term value of the sort of research that happens in most of academia. But we're probably not going to agree on that.

So, see, we can go back and forth all day about the relative merits of academic research. And there's always cover because most congressional people -- let alone voters -- don't know enough to have a really informed opinion.

If I go to a congressional staffer and complain about the NSF or NIH, not much is going to change unless the candidate already agrees with me and is passionate about the issue. They know voters don't care.

But I can go to a congressional staffer and ask: why does your candidate support bailing out student loan debt while simultaneously funding multiple annual week-long conferences in Hawaii and Europe for professors who scoff at their teaching assignments?

And THAT question is going to be viewed very differently, because it's something that could anger voters who don't have the time and attention to vote on the basis of whatever useless crud the NSF is funding this year.


(former tenured prof here)

This is exactly right. Just like it's better to think of McDonald's as a real estate company with a food business on the side, these days it's better to think of big state schools as mechanisms for ingesting federal research dollars, with an education business on the side. (Never mind that state schools shouldn't be education businesses _at all_, they should be _public services_)


As an example, MIT spends about 16% of its revenue on undergraduates; undergrad tuition is about 14% of revenue.* Remarkably those percentages were about the same when I was an undergraduate there 40 years ago.

Basically MIT is an enormous government research lab with a small school bolted on the side. This was a deliberate structure thanks to James Conant and Karl Taylor Compton working for the government in WWII.

* Honestly, I'm astonished they are willing to run that minor (and, by a lot of faculty, reviled) operation at a small loss.


Interesting that you bring up WWII. There's a theory that says that US's big research funding agencies are deliberately set up the way that they are because the US government freaked the _fuck_ out upon realizing that a bunch of nerds could go from zero to nuclear bombs in ~10 years. So, these big bureaucracies were set up to, in effect, keep tabs on physicists. (I first heard of this from a talk from Kim Stanley Robinson of Red Mars fame, and more recently Ministry for the Future. It's on YouTube somewhere but it's a bit hard to find, it was a seminar he gave at Duke University some 15 years ago.)

Then, some 35 years later came the Bayh-Dole act, which however well-meaning it might have been, really provided the incentives for universities to turn into fed money capturing enterprises. The rest is history.


You don’t need a conspiracy theory — Compton and especially Conant were clear about the model they wanted to set up and why. It was well documented in memos, planning, and position papers.

This was before the Manhattan project was authorized.


Do you have a source for that cost attribution? Based on my time there (slightly more recent than yours), that breakdown feels roughly correct. However, I've been trying to find similar information and have found it difficult even for public universities.


The institution publishes its budget. 10 seconds of DDG search found this page: https://vpf.mit.edu/about-vpf/publications Looks like tuition is becoming less important over the Covid period and grad+ug is down to 9% of revenue.

They have a nice presentation with various charts and such which is where I get this info myself. I don’t know if it’s online. I forget how I got it when I was in school but now the people who come visit from the development office bring it with them because they know I’m interested.


>(Never mind that state schools shouldn't be education businesses _at all_, they should be _public services_)

But the education public services were long-since defunded so that states could cut their local taxes and "attract business".


Specialization is the greatest source of efficiency.

Almost every organization has more people in support roles than in line positions. If not directly, then indirectly. If the "overhead" is not high enough, the people who are supposed to do line work have to waste their time as incompetent janitors and secretaries. That's a very common situation in universities, which employ many administrators for regulatory reasons but often don't have the money to hire enough support personnel.

If you want to pay grad students $100k, then the grant must be $165k. Or maybe $180k to allow the grad student to focus more on their research. Maybe the sum can be a bit less, if you manage to reduce the regulations and reporting requirements. But then you have to accept that a larger fraction of grant recipients will misuse the money or spend it on research many taxpayers would consider frivolous.
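
One caveat on the arithmetic, since "overhead" gets quoted two different ways in this thread: as a rate on direct costs (how F&A is actually negotiated) versus as a share of the total award. A quick conversion sketch, with numbers taken from the comments above:

    rate_on_direct = 0.65                                    # $65k overhead on a $100k stipend
    share_of_total = rate_on_direct / (1 + rate_on_direct)   # ~0.39 of the $165k award
    # The $35k student / $65k university split upthread is instead a
    # share_of_total of 0.65, i.e. a rate on direct costs of 0.65 / 0.35 ~ 1.86.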


In CS this is just false. Nearly all of the CS research we fund could be done in inexpensive office space with no administrators. The one exception is DARPA-funded robotics research, and even then it's only a subset.

In wet lab disciplines the story is more complicated. But there are many possible configurations that are more efficient than the current university-as-gatekeeper system.


You can do many kinds of research with minimal overhead, as long as everyone is self-employed and your funding comes with minimal restrictions and reporting requirements.

I did that for a couple of years as a CS postdoc on a private grant. I believe my overhead rate was 15-20%, which was mostly health and other insurances. To reach that, I did accounting and taxes on my own. I also had a loose affiliation with a university, so I didn't have to pay for office space or computing.

If you start using paid services, hire employees, and accept more restrictive grants, your overhead rate can easily climb above 50%.


It's not just in academia: I've worked on lots of scientific and technical grants for industry: https://seliger.com/2017/03/28/write-scientific-technical-gr..., and the RFPs often indicate a need to claim all sorts of things. Cure cancer! Lead to the complete end of fossil fuel reliance! Revolutionize computing!

At the same time, few applicants, or applications, seem to be punished for grandiose, but not completely impossible, claims. Grant making follows the golden rule: https://seliger.com/2007/12/06/studio-executives-starlets-an...: he who has the gold, makes the rules.


This is sort of what NGI Zero is doing, mostly via NLNet:

https://nlnet.nl/project/current.html

https://www.ngi.eu/ngi-projects/ngi-zero/

They're an insignificantly-tiny fraction of EU research funds, but I'm shocked how often I come across an outstanding project (nix, chips4makers, betrusted, libresoc, searx, OTR, osmocom, qubes, sourcehut, wireguard) that they funded.

Whatever awesomeness the NSF had in the 1970s-1980s died somewhere in the mid-1990s, and appears to have been reincarnated with these guys.


IMO, we need to regulate online advertising. By allowing advertisers to fund websites and apps, we allow them to be used as platforms for mass hype and dissemination of "junk" (bait).

The "Twitter prof" is the academic who has learnt to master the advertiser-funded hype machine.

The internet should be an advertising-free internetwork as it once was.

Everyone reading this comment and enjoying this forum is using a non-commercial website. There is no advertiser funding.

When we fail to regulate so-called "tech" companies, we watch the value of the internetwork being continually degraded to nothing more than a means to advertise and disseminate "junk".


This is a commercial website and the advertiser funding it is called Y Combinator. It advertises its own VC business and posts launch ads / job ads for its funded companies here.

The rise of the Twitter Prof has nothing to do with Twitter’s funding model.


"Advertiser-funded" as used here means the income from advertising is necessary for the continued existence of the website.

Companies advertising open positions on HN do not pay for the ads. HN does not rely on income from ads to run the website.

(Now, one could argue that HN is owned by YC, YC makes money from "tech" companies, and "tech" companies generally use online ads or ad services as their "business model". Fair enough. In response, I would point out that HN does not need surveillance, data collection or ads in order to survive. HN's "content" comes from submitters and commenters. These individuals are unpaid. Unlike sites run by "tech" companies, the users generating the "content" are not paid from online advertising revenue.)


> Everyone reading this comment and enjoying this forum is using a non-commercial website. There is no advertiser funding.

HN has advertisements and is part of a .com (Ycombinator).


This is happening in Europe too, perhaps more, given that EU agencies don't care much about the long-term impact of work beyond the reporting requirements. PR-savvy people are guaranteed a stream of funding by virtue of their prior funding.

I think the whole idea of small-grant-based funding in academia has outgrown its usefulness. Just pay people directly and let them work on whatever they like.


DARPA is already very willing to fund work outside of academia. Every meeting I’ve been to has included consortia led by a big company like HRL Labs or Teledyne as well as smaller companies, sometimes even 1-2 person shops, that work in the particular field. The overhead rates at those big companies are massive though.


> if we pay grad students $100K instead of paying them $35K

That would make a big difference, I think, especially in fields where the private sector pays much more. That being said, research remains attractive to the brightest minds, and there's no shortage of great candidates yet.


I'm not an expert on the German research landscape, but I think it works a bit like that there: the big shots work at the Max Planck, Helmholtz, etc. institutes. It would be funny if the US followed Germany here, since the current academic/research setup in the US was basically imported from Germany in the early 20th century.


Somewhat related to this is the Kardashian Index [1]

[1] https://en.wikipedia.org/wiki/Kardashian_Index


Is higher education anything more than a parasitical industry that takes whatever money is given to it from government, military and corporations, in order to return whatever studies are requested?


Discord servers.


What are your favorites?


I agree. There are two different sets:

1. Actual deep learning researchers, actively doing research. I rarely hear nonsense about impending AGI or mass technological unemployment from these folks. They might express tepid concern about the ethics of various applications of deep learning and tepidly point out that some technological disruption in employment is possible, but in both cases with an emphasis on tepid. If anything, they tend to under-estimate progress. (Self-driving is still very far off and also much further along than I thought it would be 9 years ago. I turned down jobs at self-driving companies because I didn't believe they would make enough progress to even have an ADAS product, but I was clearly wrong, and if I could do it over I'd have worked on self-driving from 2013 onwards even though L5 is still a long way out.)

2. Deep learning fan-boys, for lack of a better term. The rationalist community in particular has a sub-community of tech-adjacent folks who aren't publishing in major conferences every cycle or running research labs but do talk a lot about AGI/UBI.

IMO it's not that dissimilar from climate science, or, in the extreme, the existence of aliens. Scientists with a lot of real expertise will sort of tepidly talk you through the full complexity. And then some "true believers" who aren't actually experts will run to the extremes of anti-natalism or aliens among us. If that makes sense.


> IMO it's not that dissimilar from climate science or even in the extreme the existence of aliens.

Those may be similar to the contrasting attitudes regarding AI, but the analogy I first thought of for what you describe is the topic of cryptocurrency.

1. On the one hand, most people who have deep knowledge and experience of software development and databases, or professional experience of financial markets and banking, tend to be extremely sceptical, critical, or outright dismissive of the whole idea of cryptocurrency.

2. On the other, people who are technology "enthusiasts" and have some limited or self-taught programming skills, or those who have some moderate knowledge of the basics of finance and investments (often motivated by personal ambition), are much more likely to be cryptocurrency fan-boys.


This is just the hype cycle in action. New technology comes out and its fanboys and shallower (from a usecase perspective) users are adamant it'll change everything everywhere. Deep practitioners understand limitations because they're involved in the work. Eventually the fanboys are proved wrong or gravitate to the next hype and everyone else finds the place for a new technology.

A good example is Go some years ago and these days Rust. But it's the same thing, just the hype cycle at work.


I guess John Carmack falls somewhere in between, but when he said he gave AGI by 2030 a 50 percent chance, that at least indicated to me, a complete layperson, that some really smart people think there is a chance.


1. He predicts "signs of life" by 2030, which is a (probably intentionally) vague statement.

2. He raised $20MM for an AI startup, which is fine and well but also makes him not entirely disincentivized from hype.

3. I wouldn't characterize him as someone in the trenches of deep learning.

More of a meta point: technical depth in more than a few things is impossible in a human lifespan, and it gets a bit harder once you become a "somebody", since a portion of your life becomes consumed by the fact that you're a "somebody". You end up doing things like raising VC money and starting companies with bold ambitions. That's its own time sink.

I had this realization when I had a conversation with Lamport about a niche topic in distributed systems and he expressed a position that was just wrong. It was a minor point that didn't really matter much at all, but he was pretty confident in a conjecture I knew was wrong. To be clear, the fact that no one can be an expert on everything -- even everything within a subfield of CS -- doesn't detract from the fact that geniuses exist. Someone can "forget more than you know" and also not know something that you know. Life is just sadly very short.


That's true. He seems to have flopped on the rockets startup too, although you might argue AI is much closer to his wheelhouse than aerospace, so it makes more sense. Before hearing that, I had some vague idea that most AI experts' timelines for AGI were significantly longer, with some saying never.


It's only your investments that are failing (assuming you are a US taxpayer).

The servicers are paid to service; they don't own the instrument. You own the actual debt.

Think of it as a particularly regressive tax -- levied against everyone (including the SEVENTY PERCENT of people who couldn't access higher education) to fund the relatively small and precious set who got a four year degree and couldn't even manage to pay off a used Toyota Corolla with the skills they learned.

If this angers you, write your congressperson and urge them to completely defund the research agencies -- NSF, NIH, DARPA, AFRL, the National Labs, and so on. These institutions are rotten to the core and are today completely controlled by twitter personalities. We can get a working man's jubilee by killing off funding for useless junk.

I have a college degree and a PhD. I have faculty in my immediate family. We feel the same way. Hit academics -- and academia -- where it hurts. The research these agencies fund is NOT an investment in research. It's just BS soft money that allows professors to be a jet setting group.

I promise you that these faculty literally laugh about their obligation to the US taxpayer, right before flying off to Europe for skiing or a nice summer vacation after their conference. Pick a faculty member and go read the names of countries where they present their papers.

I also promise you that the research is useless. Go read the proceedings of any CS conference.

To pay working class folks back for student loan jubilee, we should start by cancelling Twitter-personality-cum-faculty's summer jet setting vacay by defunding the NSF's Division of Computer and Network Systems.


> The "cream" rising to the top is often less genius and more politically savvy with the right connections on the PC.

I'm generally nauseated when I interact with American CS academics. Every time I attend a conference, PC, or NSF panel, I am so glad I chose industry. It's like IRL twitter.

(Europe seems to be better for some reason.)

> If you're smart enough to rise, you'll get an offer from the private sector you simply cannot refuse. It doesn't matter if your passion is Academia, they can and will buy you out and own whatever you're working on.

IME it's less about "offer you can't refuse" on the industry side and more about "offer you can't take" on the academic side.

After 6 years of deferred income I simply could not take a job that paid $80K-$100K in an HCoL area or $65K-$80K in an LCoL area. I had loans to pay back, no 401K, and not enough savings for a down payment.

If you want good people to stay in CS academia, I think a few things need to change:

1. First, and most importantly, the faculty culture. I don't really know how to describe the problem, but "the old folks are checked out and the young folks are Twitter personalities" is probably close. What's the point of being in academia if you have to be surrounded by the intellectual equivalent of used car salesmen, especially when you can go to industry and do interesting work without the BS?

2. Double the income of PhD students so that they aren't financially ruined by choosing the academic path. This isn't a super unreasonable request -- they'd still be paid less than their peers in industry while doing what's effectively a full time job.

3. Pay faculty more. Not a lot more... just, like, "at least what my undergrad students make at their first job after graduating".

I think if you solve items 2 and 3, then item 1 will take care of itself.


IDK, I think tenure contributes a lot to 1. I understand and agree with a lot of the rationale (academic freedom, etc.) but when you select for people that prioritize, “if I work really hard for 6 years and get lucky, I can never be fired,” you get a lot of dysfunctional individuals and encourage some of their worst impulses.


Should faculty be paid more? Absolutely. Should Ph.D. students be paid more? Absolutely!! But the blanket statement you make in (1) is wrong and strikes me as awfully close to the extreme left-wing and right-wing mindsets of "the system is fucked up beyond repair, all that remains to be done is to tear it down". The reality is more nuanced than this, and the picture you paint of industry is hardly that rosy, even at silver-spoon companies that invest heavily in R&D.


I've spent a lot of time with working for or closely interfacing with a half dozen academic institutions. I left academia by choice -- with multiple TT offers in hand -- so this isn't sour grapes.

I am highly confident in my assessment that the personalities found on the typical R1 tenure track are exactly the sort of personalities I avoid hiring or working with at all costs. There are exceptions, but they prove the rule (and I can often poach them anyways).

I don't think I said anything about industry other than that it pays 3x-5x better than the TT, and I'm pretty darn confident that's true. I am clear-eyed about the issues in industry, but the personalities are much better.

I really do believe that the massive pay disparity between CS industry and CS academia is, in part, a "toxic personality that can't play well with others" tax. And I really do believe that you'd get more mentally/emotionally healthy people on the TT if it paid better.

Anyways, we can agree to disagree, because we agree on the solution in any case.


> I am highly confident in my assessment that the personalities found on the typical R1 tenure track are exactly the sort of personalities I avoid hiring or working with at all costs.

And what is that exactly out of curiosity?


My experience working with a former academic who was awful to work with: self-absorbed, self-promoting, accomplished next to nothing but talked a big game, and shit on everything everyone else did, even though their code ran the business.


The pay is not the biggest problem though. Obviously it is a big one, but there's a huge issue with the work culture.

I agree it's a rosy picture of industry, but IME most of the supposed "intellectual freedom" of academia is just a marketing pitch these days. You don't get it until you somehow make tenure, and even then, if you're in a high-cost field, you need to be very high-profile to avoid being forced to focus on the topics that attract grant money. You're interested in narcolepsy? Too bad.

So I consider it a red flag when a PI immediately jumps to say that "yes salaries should be higher but" and then goes on to defend everything else about their current situation.

Like it is ridiculous the amount of self promotion one feels pressured to do on Twitter. Do you not see the problem with authors pushing their work on social media during a supposed double blind review period?

I don't disagree that there is often a lot of bitching without actionable suggestions. But I don't think the characterization in (1) was especially extreme and I don't see the suggestion to burn the whole system to the ground. Personally I think we need more diversity in how academic institutions operate, that doesn't mean that old institutions will disappear.


I'm curious where all the money goes, since student loans are incredibly high but teacher pay is so low. I'm guessing the answer is 'random nonsense that shouldn't matter'.


In my university teaching experience, I found that everyone up the administrative chain to the top gets a cut, with the teaching faculty themselves receiving 1-2% of the annual tuition...


As he should; tiling is highly skilled labor -- closer to applied art than construction, really -- and destroys your body.


Two confounding factors:

1. billed hours in the trades don't include travel to or from the worksite; if the plumber is 30 minutes from you and the job takes an hour, they're already down to $75/hr.

2. software contracting can be done from anywhere. You can live in rural Kentucky and charge a high rate. But plumbers and electricians in rural Kentucky probably aren't charging $150/hr for house calls.


I don't buy the attention filter argument. No one -- and I really do mean no one -- is going to read the entire contents of the proceedings of even just one of these conferences. NeurIPS -- a single CS conference -- is more than twice the size of the Joint Math Meetings. ICRA and ICML are just as large or larger, and AAAI isn't far behind. That's just one sub-field of CS. There are so many papers coming out every year that I simply cannot keep up with two of my own niches. Adding more papers to that firehose wouldn't materially change the situation.

I've reviewed for some (high quality) Mathematics journals. Papers tend to be more complete, for sure, but the reviewing is much less rejectionist. I'm not aware of any Mathematics journal with a 10% acceptance rate, and even 20% is probably on the low end.

> It's an unfortunate reality of academia that there are fewer resources (jobs, grant funding, etc.) available, than there are researchers who are prepared to put them to good use.

I don't think this is true in CS. Universities outside of an elite set really struggle to hire and retain high quality faculty. It's at a crisis level outside of R1. Teaching-oriented institutions have mostly stopped trying to hire traditional academics; a master's degree with some teaching experience is sufficient.

Some of this is due to industry -- high-quality faculty candidates tend to also have 3x-5x offers in industry, and it's hard to turn down a guaranteed early retirement for the grind and uncertainty of the tenure track. But I think some of it is also that students who would make good teachers and mentors lose confidence due to a series of unnecessary paper rejections and decide to nope out of academia.

Again, I spend a lot of time around academic mathematics. The rejectionist culture in CS is real. And not just conferences, btw. An NSF program manager started my last review panel by telling us that scores are consistently way lower in CS than in any other field and to please chill out.


Interesting. Seems my experiences in math extrapolate less well than I'd imagined.


> I don't think this is true in CS. Universities outside of an elite set really struggle to hire and retain high quality faculty. [...] Some of this is due to industry -- high-quality faculty candidates tend to also have 3x-5x offers in industry, and it's hard to turn down a guaranteed early retirement for the grind and uncertainty of the tenure track.

That seems to just confirm that resources are lacking. There's no shortage of people, but there's a shortage of money to pay them.


What does R1 mean in this context?


Universities that offer doctoral degrees and have "Very High Research Activity" according to the Carnegie Classification of Institutions of Higher Education.

Specifically, the 130 or so institutions listed here: https://en.wikipedia.org/wiki/List_of_research_universities_...

Heuristically, think "major private universities and flagship public universities".


It's the other way around -- academic R&D is just about the only type of government spending for which there's widespread support for openness and a lack of entrenched power against openness.

The USG spent $6B on cloud computing in 2020, and that number is increasing quickly. To say nothing of the massive quantities of non-OSS software that the government buys and incorporates into its own business-critical processes. And it's not just government licenses, but also anyone who interacts with the government. E.g., try interacting with any government agency without an Office 365 license.

You get really funny looks if you say that MSFT should have to give away Office 365 for free if the government is going to use it for anything.

But total USG spend on closed-source software has to be well into the $30B-$50B range, conservatively. For reference, the entire NSF budget is $10B.

The main reason for this is that there are many monied and powerful stakeholders who benefit from selling closed software to the USG, whereas the academic publishers are tiny, often not even American-owned, and got super greedy and screwed their natural constituency (academics hate them as much as or more than anyone else does).


There's a difference between the government paying to use software and paying for it to be developed.


I'm not sure the difference is as cut and dry as you're making it out to be. A big organization doesn't just pay Microsoft a zillion dollars for a million Office licenses and then never talk to them again. There's an ongoing support relationship, which for large enough customers might include things like developing features on request.


Most of what the big contractors like Booz do is custom software. Every single cloud provider has an entire GovCloud division. Even Office has special Government licensing that behaves differently on the backend.


I think part of the point here is that the value from that investment should go to the investors, who are (if you buy the 'by the people, for the people' hype) the taxpayers.

Say I'm vulture capitalist Tom, and I pay a few gajillion dollars to developer Gupta to create a product for me. I would be understandably pissed if Gupta turned around and sold that same product to competitor vc Janet. She didn't pay for that dev work, I did.


1. There isn't as much of a difference here as you think. Contractors do turn around and use components developed in public contracts for other consulting projects. Most commonly with other sovereigns, especially when the original contract was with a city or state, but sometimes at the national level as well.

2. With respect to R&D, one big difference is that the government doesn't provide seed funding. They provide grants. If the government wanted equity in research labs, they'd have to pay a lot more. You'll see this in practice if you ever have the extreme displeasure of doing non-useless research in academia. Companies that insist on IP ownership/sharing end up paying much higher premiums for university research contracts. Repealing Bayh-Dole would have no effect on the accessibility of actually useful research; universities and companies would privately fund the useful stuff and leave the government to fund the labs of politically-connected/twitter-famous but otherwise totally useless academics.

(To be clear: we're on the same side here with respect to open access publications.)


Thanks for the well informed response! I had not yet heard of Bayh-Dole and you gave me some good googlin'.

In regards to your explanation in [2], that sucks - I kinda figured that's how things were but I sorta went around academia rather than through it so it's interesting to hear. Any hot ideas about how it could be fixed?


> Any hot ideas about how it could be fixed?

For Computer Science:

1. Replace the current academic funding model with pure fellowships. Each individual, from most junior to most senior, gets their own N-year funding.

2. Each has a legal entity under which their IP lives and in which the government takes a small, fair, non-voting share.

3. Completely divorce this funding infrastructure from universities -- if someone wants to use part of their grant to pay for PhD courses/advising, great, but make it so that funding science is not contingent on that institutional apparatus.

For lab sciences things are more complicated.


The other difference is that if you had to open-source anything sold to the USG, then no one would sell anything closed-source to the USG.

And there's lots of useful software the government wants to buy that is closed source.


Useful to who, and for what?


The USG, for whatever that piece of software says on the tin. Unless you are just asking whether there is any closed-source software anywhere that provides value not totally replicable using OSS, which seems like a silly question.


Many of the message queues I have seen are only necessary because of personalization or analytics features that the user doesn't want and the business probably doesn't need.

When I try to estimate the ROI on those features, it's usually minuscule. A/B tests are unconvincing, and the aggregate reports synthesized from the data look more like numerology than rigorous science. Moreover, senior leadership usually doesn't realize how much additional liability they take on by collecting and storing so much customer data.

It's not 2010. The large tech firms are your direct competitors and regulators across the world have caught up, including in many US states. If you have a primary product for which people will pay money, then the "surveillance agency as a side-business" model of the 2000s-2010s is probably a bad idea. It's a net expense that exposes you to liability and distracts your team.

Additional benefit: without real-time analytics and personalization, you can go even further without a message queue.


> because of personalization or analytics features

That's not the architectural reason for message queues in my experience.

Primarily, a message queue is used when there is a potential bottleneck in overall application throughput: reacting to some event is expensive, so instead of reacting to it synchronously, you queue the event and react to it asynchronously.

Some very common use cases:

1. Your UI. Operating systems use a message queue to capture and forward inputs. Your mouse and keyboard events are all queued in a message queue.

2. Webhooks. Many webhook origins have timeouts on responses and your application must respond within a given window or it will be throttled or downgraded. In this case, the best practice is to queue the event to be processed asynchronously (that queue can be a simple database table with your own logic and wrapper around it or something like AWS SQS, Azure Service Bus, or Google Pub/Sub).

3. Mitigating Throughput Bottlenecks. Most applications have a read/write asymmetry so it makes sense to optimize your architecture to scale for reads. But what if your application occasionally has to handle a burst of writes? Should you size your infrastructure for that case? One approach is to proxy the writes through a queue so you can size the infrastructure for a maximum throughput that is managed by the queue. For example, instead of 1000 concurrent writes per second, a queue can capture the write mutations and trickle out only 100 concurrent writes per second. Instead of sizing your application to scale to handle 1000 writes per second, you only need to size your queue to handle that scale.

4. Resiliency. If a message fails, it can be retried according to whatever heuristics make sense for the domain. Sure, you can use a simple loop to retry, but every message queue provides some mechanism for handling retries, failed message delivery, and so on. If you decide to roll your own and log a failed call into a database to try it again later... well, you've effectively captured a message in a custom queue. (A minimal sketch of points 2-4 follows below.)
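
Here is that sketch, using only the Python standard library. The names, the retry budget, and the 100-writes-per-second cap are all illustrative, not any particular product's API:

    import queue
    import threading
    import time

    events = queue.Queue()
    MAX_WRITES_PER_SEC = 100   # illustrative throughput cap (point 3)
    MAX_RETRIES = 3            # illustrative retry budget (point 4)

    def handle_webhook(payload):
        """Acknowledge fast (point 2): enqueue and return before the origin times out."""
        events.put({"payload": payload, "attempts": 0})
        return 202  # Accepted; the real processing happens asynchronously

    def write_to_db(payload):
        """Placeholder for the expensive work; raise to simulate a failure."""
        ...

    def worker():
        while True:
            event = events.get()
            try:
                write_to_db(event["payload"])
            except Exception:
                event["attempts"] += 1
                if event["attempts"] < MAX_RETRIES:
                    events.put(event)  # re-queue to retry later (point 4)
                # else: dead-letter it (log it, park it in a failures table, ...)
            events.task_done()
            time.sleep(1.0 / MAX_WRITES_PER_SEC)  # trickle writes out (point 3)

    threading.Thread(target=worker, daemon=True).start()

A real system would use a durable queue (SQS, Service Bus, Pub/Sub, or the database-table approach mentioned above) so events survive restarts, but the shape is the same.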


User tracking and/or personalization tick 2-3 of your 4 boxes:

UI - Logging mouse movement, keyboard events, and other types of attention proxies.

Webhooks and/or Bottlenecks - calling out to either internal or third party classifiers, or more recently even generative models, for personalization based on user tracking data.

And I don't think this is just my unlucky experience. The fine article includes user tracking as one of only three explicitly enumerated reasons for queuing:

> Why? Because users increasingly expect a real-time experience. In use cases like order flows, webhooks, user tracking, etc. users expect to be able to see the new data in the user interface instantly, instead of having to wait for some background batch processing to periodically reload.

Of the explicitly enumerated motivations in the article:

1. "Order flows" - tracking/modification is often one of the higher latency items in order flows ("you might also like" / "what to order next" features).

2. "Webhooks" - often used for tracking/personalization

3. "User tracking" - ...this one is easy :)


Webhooks go far beyond personalization and tracking; it's a general purpose integration pattern.


Yes, and I've worked on browser-based games that are built on top of webhooks. But what percentage of people could forgo webhooks entirely if they weren't doing any personalization or tracking? Or could at least get away with webhooks without any "real" queuing infra? I'd wager a large number.


Not really. Any payment system requires webhooks.


Shameless (but relevant!) plug: we built Svix to help people send webhooks from their platform. With Svix you don't have to use message queues for webhooks, at least not from the sender side: https://www.svix.com/

Note: message queues are still great in general, though; they make asynchronous operations and interactions with external services much more resilient.

