Hacker News

Nvidia also spends a metric shit ton of money to make sure professors use and teach on their platform.

I don’t remember any alternatives in uni. Maybe OpenCL, but it was only lightly mentioned.



As someone who has designed and taught those courses, my experience (admittedly only one person's) is that you pick what will work with the least hassle, because you'll have plenty of hassle elsewhere and probably no real time to deal with any of it without creating more.


This is actually one of my favorite comments of all time, because it's how software wins. The software that students use is the software the industry uses about five years later.


Not always.

One of our machine learning courses was taught in Matlab.

Unsurprisingly, nobody used Matlab after uni, or 5 years later.


Also did an algorithms in machine learning course in Matlab.

It's a great language choice for it.

It weeded out the script kiddies who incorrectly signed up wanting a TensorFlow or PyTorch course.

It's a fairly bland and slow but usable language for the task.

Shits me off to no end that a lot of engineering courses more or less indoctrinate their students into using it unconditionally, though.

Octave exists but is a relative pain to use


It's still a pain spending time learning Matlab syntax/semantics when you could just, idk, use C or Haskell instead.


Matlab is fairly easy to work with (initially) and is great when learning a new concept, instead of learning that plus arbitrary syntax of the tool.

It isn't particularly fast though, and the simplicity quickly becomes an obstacle when solving a real problem.


My experience in university was the exact opposite. The stuff we were using was 5-10 years behind what industry was using.


> The software that students use is the software the industry uses about five years later.

which is why it's anti-competitive for a company to sponsor university courses (such as by providing educational versions for free). It should be disallowed unless the course is _specifically_ teaching that software, rather than being a general course.


That's competitive, not anti-competitive.

Anti-competitive means others are not allowed to do the same.


> others are not allowed to do the same.

it's usually the case that the sponsor is the sole sponsor (i.e., the course does not teach both X and Y, especially if X is given to the uni for free).

It's anti-competitive to allow companies to embed themselves in general courses, even if it isn't so by the letter of the law.


Sort of -- but basically no course is going to teach both X and Y if they're functionally equivalent ways to learn about Z, because almost no course is specifically about X or Y; it's about Z, and learning both X and Y isn't germane to learning Z. Learning just one is enough.

As long as the companies behind X and Y both have a fair shot at sponsorship, this isn't really anti-competitive. It's literally a competition in which the companies compete for student and faculty attention.

Anti-competitive would be a company saying "you must teach X and not Y in your class about Z because you use Xco's mail services" or some other such abuse of one contractual relationship for an unrelated gain.


They say "hey if you want to teach a class using X, we'll sponsor it."

A competitor can compete for that sponsorship. So long as it's done on the direct merit of the value, there's no problem.

Anti-competitive would be providing products or services and forcibly leveraging that into an unrelated contract.


>Nvidia also spends a metric shit ton of money to make sure professors use and teach on their platform.

Do you have a source for this claim? Or do you simply mean that since they spend money making it better that professors end up using it on their own accord?


I hold an Nvidia instructor's cert from when I worked in academia. They even give you access to hardware while you're running courses on it. It's super easy and totally free.


I won an Nvidia GPU while I was doing my advanced graphics course for making custom shaders.

Had to buy a new power supply just so I could use it.


They co-author the definitive CUDA textbook, and it's based on their sponsored class (You can find the story in the intro of the book.) https://www.amazon.com/Programming-Massively-Parallel-Proces...


Co-authoring a book is not a "metric shit ton of money".


No, I think it’s a source for the claim and not the actual evidence of what they spent it on.


OpenCL was discussed more frequently in classes about a decade ago. However, I haven't heard it mentioned in the last five years or so.


Yeah, people tried to push OpenCL back then; it was simply inferior.


OpenCL is horrible compared to CUDA.


Especially since AMD and Nvidia have similar costs for a GPU.


AMD has HIP, which is basically a CUDA clone.
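For what it's worth, HIP's runtime API mirrors CUDA's almost name-for-name. A minimal sketch (the API names are real; the vector-scale kernel is a made-up example):

```cuda
// Kernel source is identical under CUDA (nvcc) and HIP (hipcc):
__global__ void scale(float *x, float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= a;
}

// The host side differs only in the API prefix:
//   CUDA:  cudaMalloc(&d_x, bytes);
//          cudaMemcpy(d_x, h_x, bytes, cudaMemcpyHostToDevice);
//   HIP:   hipMalloc(&d_x, bytes);
//          hipMemcpy(d_x, h_x, bytes, hipMemcpyHostToDevice);
//
// The triple-chevron launch syntax is the same in both:
//   scale<<<blocks, threads>>>(d_x, 2.0f, n);
```

AMD's hipify tools automate essentially this renaming when porting CUDA code.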


Only if you equate CUDA with just C++, and accept poor tooling.


They've replicated many of the libraries as well. But yeah, I haven't personally tried it.


Not exactly, but they give massive discounts, and the tools are much more appropriate for late undergrads and grad students.


> Nvidia also spends a metric shit ton of money to make sure professors use and teach on their platform.

Nah. People teach what they use because that's what's easy.


It's definitely both.

I'm sure plenty of professors use CUDA in their courses because it's what they actually use. At the same time, in 2013 when I was in college I took a course on "parallel computing" as a CS elective. The professor told us on day 1 that NVidia was sponsoring the course and had donated a bunch of GPUs to the clusters we could remotely connect into for the sake of the class. Naturally we used CUDA exclusively.

I know for a fact that this happened at a lot of schools. I don't know if it's still happening since I'm not in that world anymore, but don't see why it would have stopped.


CUDA is extremely simple; the classes might as well be on rails. OpenCL is nearly impossible without prior graphics, CUDA, distributed computing, or operating systems experience.


I'm not sure if I really agree - the level of abstraction used for each is extremely similar. There's not really any "Graphics Pipeline Specifics" pollution in OpenCL or CUDA.

You can pretty much translate something like https://github.com/jcupitt/opencl-experiments/blob/master/Op... just by string-replacing the function names.
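To illustrate how mechanical that translation is, here's a simplified, hypothetical elementwise-add kernel (not the one from the linked repo) in both dialects; only the qualifiers and the index intrinsic change:

```cuda
// OpenCL version:
//   __kernel void add(__global const float *a, __global const float *b,
//                     __global float *c) {
//       int i = get_global_id(0);
//       c[i] = a[i] + b[i];
//   }

// CUDA version: __kernel -> __global__, drop the __global pointer
// qualifiers, and get_global_id(0) becomes the block/thread index math.
__global__ void add(const float *a, const float *b, float *c) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    c[i] = a[i] + b[i];
}
```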


You get free access to hardware for courses if you teach CUDA courses.



