_nist's comments

Companies lie and exaggerate all the time. It's called PR. A lie is a lie whether it comes from a company or a university, but companies have much more incentive to lie than universities, especially in technical fields, because unlike universities they don't have to subject their claims to peer review.


I don't think anyone is accusing academics of lying -- academics are just more likely to announce things with no industrial relevance, because industrial relevance is not required for academic relevance.


It depends on what you mean by "more likely to announce things with no industrial relevance". If we include the humanities, then yes. If we exclude the humanities, then no. Nearly all current STEM advancements can be traced back to early university work, because industry normally does not pay for R&D: it's a high-risk, low-reward effort, not to mention a money sink. Even current AI models come straight from university work, intermixed with private industry, which often applies and scales these concepts but isn't the idea originator.


> Nearly all current STEM advancements can be traced back to early university work, because industry normally does not pay for R&D: it's a high-risk, low-reward effort, not to mention a money sink.

Yes. But not the converse.


Doesn't surprise me in the least. McDonald's is such a bad company on many levels, even from its founding roots. They make fast food (an industry notorious for crapping all over its employees), and the company was usurped from its original founders by a greedy businessman (Ray Kroc). Their food is okay, but I would never, ever do work for them.


> usurped from its original founders by a greedy businessman (Ray Kroc)

It's a weird aside, but maybe worth noting that Ray Kroc was probably not just greedy. Ray Kroc was probably also racist against the Irish. A tell-tale sign is using a clown as the company mascot. The US strain of clowns was heavily influenced by "pale white face" racist jokes about Irish immigrants [0] from some of the same minstrel shows notorious for "black face" and "yellow face" and "red face". Ray Kroc was from a generation that would have easily been aware of that and would have been "entertained" by it. Ray Kroc's behavior toward the actual McDonald's founders is rather easier to explain by assuming it included quite a bit of Old Fashioned Racism than by assuming pure greed. Sometimes it is useful to remind ourselves that the past isn't as clean as corporate memos want to paper it over.

[0] Notably, among other things: red hair, big feet, freckles, red drunken noses, and loutish drunken behavior. Even the "clown car joke" is the exact same "joke" as the "Mexican pickup truck", transposed across a couple of decades and aimed at a different working-class immigrant population. (Racists seem pretty lazy in how they reuse old material.) So yeah, if you ever wondered why clowns don't seem all that funny in the modern era, congratulations, you probably aren't a racist. Also, now that this past horror is in your head, I'm sorry for ruining Disney's Dumbo, which uses all the worst clown stereotypes and "jokes" in one place and eats up a lot of runtime with them, if you weren't already concerned about the "black face" crows in the movie or thought you could dismiss them as not being central characters or contributing much to the runtime.


Why is this grey? Does that mean people are reporting it? If you report it or want to, then why?

I haven't verified these claims, nor do I have the background knowledge to form an opinion. I don't care. It seems plausible, and I think the Irish fell within the definition of "black" a century ago.

Is it because you don't believe this? Or do you think it is astroturfing with bad intentions, similar to the anti-women memes plastering social media (where some wrong committed by a woman gets hundreds of bot comments saying all women are bad)? Kinda like repeating long-ago marginalizations of white people to stir up a feeling of having been wronged among older, more conservative white men?


I don't even consider the food okay... I liked a few breakfast options, but even their prices have gone up so much over the past 3 years that I won't go then either. It wasn't that long ago that the breakfast burrito was $1 (then 2 for $2/3) and they had some sandwiches at 2 for $2/3/3.33/4/5... When I'm paying over $10 for a couple of breakfast sandwiches and a drink, I'm out.

They definitely seem to be squeezing more out of their franchisees than ever at this point. They keep pushing technology optimizations, but they're at a point where the service and price just aren't there. If I'm spending $15+ for lunch, I may as well go to Applebee's, Chili's, etc.


At the airport yesterday: only two items on the $1/2/3 menu, $3.99 4-piece nuggets and a $5.19 McChicken sandwich.

It's comical how badly they gouge customers and franchisees alike.


As a frequent work traveler, I can tell you airport prices are always at least double what they are outside the airport. It's not a fair comparison.


I was addicted to these games as a kid. Too bad they never remade them, and now they're obscure titles nobody can get.


Microsoft delivers AI Recall. Everyone hates it.

Apple integrates AI into every facet of a device that is highly personal. Everyone loves it.

Please make it make sense.


This is one successful landing. Proving its robustness will take a lot more successful landings than this. Plus, it's not even landing the way it was originally designed to; it's just landing in the ocean like any other rocket, except with thrust vector control rather than parachutes.


I agree with this. In a large organization, if you have risen to a level where you are being relied upon at regular intervals, it is imperative that you have a well-architected solution you can readily change; this is what separates spaghetti code from something useful. Sure, it's nice to write unmaintainable junk when toying around, but I too have seen too many codebases where people were just throwing features in without thought. That constrains the program to one specific problem domain and makes it inflexible for solving new problems (to the point that you have to rewrite nearly everything from scratch).


The alternative is to put in tickets specifically for refactoring and reorganization, but I've rarely seen that work, since those tickets rarely include any sweetener to encourage, e.g., the product organization to sign off on the work.


Yeah, it's a shame; I often have to do this type of crap in my off time. However, having a well-architected app significantly reduces these kinds of massive undertakings. Doing it right the first time has its advantages. Then again, weighing that against delivering early and other important considerations has its merits too. It's unfortunately a trade-off you constantly have to balance in this industry.


There is succeeding, and then there is your company being overvalued to insane levels because of some hype boom. Outcompeting also doesn't mean you get to artificially reduce supply and support only a select few OSes (i.e. Windows) while producing modest performance gains in your GPUs.

More than likely, it's a modern-day Tesla in waiting.


Consumer harm = selling their graphics cards at outrageous prices by keeping supply low and reducing the OS support for them by making Linux drivers crap.

Then again, it never "required" consumer harm as far as I've read.


> making Linux drivers crap

This has zero consumer harm since an insignificant fraction of consumers use Linux.

The harm is in their control of both chip making and e.g. CUDA. The analogy would be Intel refusing to license 8086 and x86 to AMD [1].

[1] https://itigic.com/x86-on-intel-and-amd-why-cant-anyone-else...


Isn't this reasoning circular?

How would a significant fraction of consumers use Linux if it's not supported by a monopolist?

edit: also, is the premise even true? Are these compute farms really running Windows?


It's circular the way it was described. Seems like Linux servers are fine with Nvidia GPUs for ML training, but not for graphics, which is more applicable to Linux desktops.


This is not true and is very ignorant. Linux has huge market share, especially in the server space, where Nvidia cards are also used. Furthermore, plenty of people run Linux desktops that would be affected by this.

Consumer harm comes when there is intention to control consumer choice. Linux isn't the biggest platform, but it is a major choice; it is therefore harmful for Nvidia not to support it properly.

Check your bias.


> Linux has huge market share, especially in the server space, where Nvidia cards are also used

Consumers don't generally rent server space. It would be difficult to establish consumer harm on the basis of server prices.

> Furthermore, plenty of people run Linux desktops that would be affected by this

Right, this is the insignificant bit. Inconveniencing 2 or 3% of the market is not a valid antitrust claim [1][2].

> Consumer harm comes when there is intention to control consumer choice

No, it comes when you can prove prices were raised, output reduced, innovation diminished or customers were "otherwise harmed" [3]. To the degree intent is considered in the enabling case, it's in reading the intent of the Congress, not the defendant [4].

[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...

[2] https://gs.statcounter.com/os-market-share/desktop/north-ame...

[3] https://www.americanactionforum.org/insight/why-the-consumer...

[4] https://supreme.justia.com/cases/federal/us/442/330/


Output has been scarce for NVIDIA GPUs for quite some time; they often cannot even meet the demand for their cards. NVIDIA is a trillion-dollar company right now. There is no reason they should be restricting access to their reference cards and be unable to support demand.

You are also not counting handheld use such as the Steam Deck. There is a reason the Steam Deck doesn't use NVIDIA graphics.


Consumers do rent server space, and prices were manipulated because they could control consumer choice. It's all done to control the consumer, hence the antitrust.


All those Linux servers using Nvidia cards must be getting good enough support from Nvidia for what they need to do, otherwise they wouldn't use it.

GNU/Linux desktop is a different story. Yes it's accurate to say that's an insignificant market share. And I said "GNU" to differentiate from the most popular Linux desktop OS, ChromeOS.


So you're just looking to isolate consumers so you can make a point. Linux users aren't insignificant, regardless of how you roll the dice and want them to be. We push more technology forward than non-Linux users. The insignificance is just something macOS and Windows users tell themselves to feel good about their dislike of Linux as they spin up LXC containers.


They're different use cases from the industry's perspective. Nvidia supports GPU compute on Linux very well, but graphics not so much.

I'm not interested in the OS fan wars and neither is Nvidia, but if you want to consider Linux server users the same as Linux desktop users instead of isolating them, you can count me on that side. I have an RPi, a PowerEdge, an Android phone, and yes an Alpine Linux Docker container on my Mac.


I would be surprised if more than a small minority of consumers using CUDA ran anything other than Linux. I would like it if I weren't forced to use their crap drivers, which force me to use X just because I want to do some ML.


They're not required to make Linux drivers. They open-sourced the stack years ago. Take it up with the driver community.

Second, what? The drivers for AI workloads on Linux are powering massive models right now.

I think what you actually meant is "I don't understand antitrust laws or really the underlying tech and am mad I can't game on Linux."


I understand antitrust law well enough to know that the reason I'm "mad I can't game on Linux" is an anti-competitive measure.

As far as what I've read about the driver stack, they only recently (within the past year) released source for the kernel modules. They did not open-source the whole stack, and those are only new modules in an alpha state.

It's basically the bare minimum.

Most people aren't running ML on their workstations, and ML workloads mostly use entirely different types of cards.


I had to use a bleeding-edge Mint live ISO yesterday because the regular release ISO gave me a black screen on my 3099z. Nvidia drivers continue to be my #1 Linux pain point, aside from Windows apps that don't run under Wine due to DRM/anticheat.


Why would Nvidia intentionally use monopoly power to give Linux a hard time? You're also free to use AMD or Intel.


That is at best a weak argument, because guess what: AMD is the competition and also sells at prices outrageous enough that there is no downward pressure.

If anything, I would take Nvidia, AMD, and maybe Intel to court for price-fixing their GPUs.


I'm not sure what world you live in, but Nvidia sells their cards at 1.5 to 2 times the price of AMD's on average, and AMD has had strong Linux support for years.


Do they though?

Another narrative is that AMD undercuts Nvidia prices because their product isn’t as good.

Worse, Intel then undercuts AMD pricing for video cards because theirs aren’t even as good as AMD’s.

When you get to antitrust, things like this are important. I may agree with you, but an Nvidia lawyer will take this tack and run with it.


When AMD can run 95% of games as well as any NVIDIA card, most games have virtually no ray tracing support or any of the other fancy features NVIDIA offers, NVIDIA's own graphics family shows known low performance-per-cost gains, and they can't support demand despite being a trillion-dollar company, I'd say there isn't much of a leg to stand on.


You could replace those brands with Timex and Rolex. It still wouldn't be antitrust.


It would be antitrust if they were deliberately shrinking supply to hike up prices on their cards. For example: I know I have the best card in the industry, so I'm going to keep supply low so I can charge effectively whatever I want.


Nvidia and AMD (previously ATI) have had a similar relationship for a long time, before ML was a use case. Nvidia has always been the more expensive option afaik. Kinda like Intel vs AMD.


It wasn't in the 2000s, back when ATI and NVIDIA cards had comparable prices. Sometimes ATI cost a little more because their cards were better, but prices were never crazily different from NVIDIA's. And there were never supply problems.


DEI existed in companies before the term even entered the larger public discourse; I have had to go through such trainings every year at every company I've been at. There has never been true "diversity of thought" at most institutions, and I doubt you would really want it, especially if you were on the receiving end of it every day and it was just blatant bigotry towards you. As for achievements, I can't recall a single instance of someone being brought down because of a participation award, unless you think being 1st, 2nd, or 3rd is robbed of all meaning because everyone got an award.


I think Claudine Gay is a bad example. First of all, she wasn't fired; she resigned. If you look at her credentials, she is more than qualified for the position. Many people attacked her primarily over faux claims of antisemitism and her logical response to an inflammatory question at a Congressional hearing. Such a notion of antisemitism has largely not been backed up by much of anything other than spurious claims and deliberate misinterpretations ("from the river to the sea", etc.), because people are obsessed with Israel, which has been condemned for its actions against the Palestinians.

As far as the plagiarism allegations go, they have much more merit. However, so many academics have deliberately faked data and plagiarized that a few examples of miscitations and misquotes (largely discovered because people wanted her fired anyway; nobody cared before) make for a very weak case in my opinion, especially if it wasn't deliberate. Not to excuse it, but it's not out of the realm of possibility for it to be a mistake, and it certainly isn't because of DEI that she would somehow plagiarize worse (people of all races and genders do it). She also requested to make corrections in response to some of these accusations, showing an attempt to fix it.

DEI is an attempt to level the playing field for people who have far worse upbringings by nearly every statistical measure. It's what affirmative action attempted to alleviate. These poor upbringings largely affect minorities, and this is a statistical fact.


I agree that the "oh my god, she plagiarized someone" shock and horror was mainly (not entirely) an exercise in firing her without admitting the main reason: that she initially gave lukewarm pushback to the hysterical "anti-semitism is widespread on campus" hoax.

In hindsight, she (& others) might feel that they ought to have pushed back more strongly, and debunked this hoax more definitively, when it was first trotted out.

Since then, we've had, on campuses, a Zionist student made into a media darling for a day for being stabbed in the eye, when all that actually happened is that a piece of fabric lightly brushed her cheek, while other (Seinfeld-funded) students attacked campus protesters with lead pipes and whatnot and got away scot-free.

It's like McCarthyism, mixed with '30s-era brownshirts, with the mainstream media coverage being directed by Goebbels.


I agree with your sentiments, but she was not fired. She resigned. There is a big difference. Of course, I wasn't there, so I don't know whether it was a forced resignation (i.e. a firing) or not, but everything I can find says she resigned.


Almost no one is ever "fired" from executive roles. They almost always "resign" under pressure. People don't want to give up the best job they will ever have.


You can always get another executive job, so I doubt it's the "best job she will ever have". But I agree, she was most likely pressured.

