
The audits of people under that are going to fall into one of three categories:

a) People who, for whatever reason, filled in the wrong number on the spreadsheet that is taxes, where the audit amounts to informing the filer that they filled it out incorrectly. I mean, really, taxes should start with the government sending me a form showing what it thinks I owe, and I should be making corrections to that; the government already has this information and has already done the math, and that approach would make many of these audits go away.

b) People who misunderstood eligibility requirements and claimed deductions they weren't entitled to.

c) So I don't know how these people are counted, but there are absolutely millionaires and billionaires out there cheating on their taxes and claiming no income (e.g., the current president). It's totally plausible that they get listed in the "under 25k income" audit section despite the fact that they are in fact the uber-rich that is the intended target of the outrage.


>c) So I don't know how these people are counted, but there are absolutely millionaires and billionaires out there cheating on their taxes and claiming no income (e.g., the current president). It's totally plausible that they get listed in the "under 25k income" audit section despite the fact that they are in fact the uber-rich that is the intended target of the outrage.

There's a sleight of hand in your argument here. I said under 25k with EITC. You can't get EITC if you're "claiming no income": that's why it's called the Earned Income Tax Credit; the credit is intended to offset welfare cliffs as low-income filers start to earn more money. So your whole paragraph here about <25k is null and void, because "millionaires and billionaires out there cheating on their taxes and claiming no income" aren't in the <25k EITC bucket I mentioned; they're in the bucket of others earning under 200k.

(As an aside, if someone actually earns nothing and doesn't want EITC, which they can't get anyway with zero income, they probably won't even be filing, so there is no "return" to audit.)


> The justification is to force people to work until they are too old to do so.

Actually, the justification is to prevent old people from having to work. Retirement didn't really exist until the creation of pension systems in the late 19th century, and the modern social security system was a poverty alleviation measure introduced in the 1930s. Hell, social security was initially resented by older workers because of the cover it gave employers for firing them for being too old.


The government doesn't decide when you retire. The government decides when it is willing to pay you to be retired.

Social security is an entitlement. They have taken money from your paycheck to fund it. In fact, they have taken more from your paycheck than they will pay back to you, in order to cover an aging population. The extra goes into bonds, which the government then uses to reduce inflation when it decides to invade random countries or bail out a bank.

Now, why does the government get to decide when I retire with my own money?


If you don't like it, join the Amish and file a Form 4029.


Well, one obvious reason is that you're not retiring with your own money; your contributions fund current retirees.

> What no one wants to hear is rust is destined for the same fate. If you want to see the future of rust, look at C++. Rust has a much better initial state, but the rules evolving the system (the governance model, the kinds of developers that work on it, etc.) are the same as C++ and so we should expect the same trajectory.

Dear lord that is not the case. The C++ standardization process is extremely different from Rust's specification process, and the resulting pathologies are extremely dissimilar. Hell, C is fairly close to C++ in terms of process, and yet it still has its own set of pathologies.

The C++ committee is dominated not by experts on compiler implementation, but by people looking to get their own proposals incorporated into the standard, and is structurally organized in such a way that it can be difficult for any group to feel empowered to actually reject a feature. It should be noted that in the most recent batch of C++ papers, there was effectively an implementers' revolt: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2026/p39....

The Rust proposal process is much more ponderous, and when you take into account the lag between an accepted RFC and implementation and stabilization (and the fact that some accepted RFCs turn out to be unworkable and effectively get backed out without ever being stabilized), it's pretty clear that the actual development process is night-and-day different. For example, the Try trait in Rust has still not been stabilized, despite the RFC proposing it being introduced over nine years ago and a v2 RFC being accepted five years ago.


This kind of "but for us it's different" thinking is a little amusing.

I don't care about the implementation process or the RFCs or what-have-you. If there is a democratic committee of humans that decides what goes in, and there is no bias for minimalism (e.g., letting 1/3 of the members strike down a proposal instead of requiring 1/2), then the process will tend towards bloat.


The Rust RFC process requires essentially unanimous consent: there's no formal voting procedure, but the various teams can block any feature from going in.

But sure, keep on saying they're basically the same thing.


Wikipedia links to this guide to the APP (Adjusted Peak Performance), published in December 2006 (much closer to when the rule itself came out): https://web.archive.org/web/20191007132037/https://www.bis.d.... At the end of the guide is a list of examples.

Only two of these examples meet the definition of vector processor, and they are very clearly classical vector computers, the Cray X1E and the NEC SX-8 (as in, if you were preparing a guide on the historical development of vector processing, you would explicitly include these systems or their ancestors as canonical examples of what you mean by a vector supercomputer!). And the definition is pretty clearly tailored to make sure that the SIMD units in existing CPUs wouldn't qualify as vector processors.

The interesting case to point out is the last example, a "Hypothetical coprocessor-based Server," which describes something extremely similar to what GPGPU-based HPC systems eventually became: "The host microprocessor is a quad-core (4 processors) chip, and the coprocessor is a specialized chip with 64 floating-point engines operating in parallel, attached to the host microprocessor through a specialized expansion bus (HyperTransport or CSI-like)." This hypothetical system is not a "vector processor," the guide goes on to explain.

From what I can find, it seems that neither NVidia nor the US government considers these GPUs to be vector processors, and they thus get the 0.3 rather than the 0.9 weight.
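
To put numbers on what that weight means: as I understand the APP formula, you take the peak 64-bit rate and multiply by 0.9 if the machine counts as a vector processor, 0.3 otherwise. A toy calculation (the 10 TFLOPS figure is made up for illustration, not any real GPU's spec):

    #include <stdio.h>

    /* Toy APP calculation: peak 64-bit rate times the classification
     * weight. The input is a hypothetical part, not a real spec sheet. */
    int main(void) {
        const double peak_fp64_tflops = 10.0;
        printf("as non-vector (0.3): %.1f weighted TFLOPS\n", 0.3 * peak_fp64_tflops);
        printf("as vector     (0.9): %.1f weighted TFLOPS\n", 0.9 * peak_fp64_tflops);
        return 0;
    }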


> Assume FP64 units are ~2-4x bigger.

I'm not a hardware guy, but an explanation I've seen from someone who is says that it's not much extra hardware to extend a 2×f32 FMA unit with the capability to do 1×f64. You already have all of the per-bit logic; you mostly just need an extra control line to make a few carries propagate. So the size overhead of adding FP64 to the SIMD units is more like 10-50%, not 100-300%.


Most of the logic can be reused, but the FP64 multiplier is up to 4 times larger. Also some shifters are up to 2 times larger (because they need more stages, even if they shift the same number of bits). Small size increases occur in other blocks.

Even so, the multipliers and shifters occupy only a small fraction of the total area, a fraction that is smaller than implied by their number of gates, because they have very regular layouts.
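
You can sanity-check the multiplier claim from mantissa widths alone, under the crude assumption (mine, for illustration) that a multiplier's area grows with the product of its operand widths: 24 bits for FP32 and 53 bits for FP64, counting the hidden bit:

    #include <stdio.h>

    /* Back-of-envelope: one FP64 mantissa multiplier versus one FP32
     * multiplier, and versus a paired 2xFP32 unit, using
     * area ~ (operand width)^2 as a crude model. */
    int main(void) {
        const double f32_mul = 24.0 * 24.0;
        const double f64_mul = 53.0 * 53.0;
        printf("FP64 vs one FP32 multiplier: %.1fx\n", f64_mul / f32_mul);         /* ~4.9x */
        printf("FP64 vs a 2xFP32 unit:       %.1fx\n", f64_mul / (2.0 * f32_mul)); /* ~2.4x */
        return 0;
    }

That lands near both the "up to 4 times larger" figure (versus a single FP32 multiplier) and the more modest overhead upthread (versus a reused 2×FP32 pair).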

A reduction from the ideal 1:2 FP64:FP32 throughput ratio to 1:4, or in the worst case to 1:8, should be enough to make the additional cost of supporting FP64 negligible, while still keeping the throughput of a GPU competitive with a CPU.

The current NVIDIA and AMD GPUs cannot compete in FP64 performance per dollar or per watt with Zen 5 Ryzen 9 CPUs. Only the Intel B580 has better FP64 performance per dollar than any CPU, though its total performance is exceeded by CPUs like the 9950X.


I really, really hate the modern trend of scrollbar design. I guess it makes some amount of sense if you're aiming for a mobile phone form factor, where real estate is limited, but changing the scrollbar from a widget that lives functionally outside the content it scrolls to a translucent, show-only-on-hover widget that overlays the content (and can thus become functionally invisible if the content is just the wrong color) is a real step backwards in UI design.

I finally got frustrated enough to go in and manually increase the default scrollbar size in Firefox. Slim scrollbars are awful both to look at and to use. I'm working with an ultrawide monitor here, please give me more than 0-3 pixels of scrollbar!

I didn't know this was possible. This is amazing! I found it under the inconspicuous config item called "widget.non-native-theme.scrollbar.style". I changed it from 0 to 4. There seems to be no UI for this.

Source: https://support.mozilla.org/en-US/questions/1443060
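
If you'd rather not dig through about:config on every new profile, the same pref can also be set from a user.js file in the profile directory (assuming the pref name and value from the comment above):

    // user.js in the Firefox profile directory; pref name and value
    // taken from the comment above.
    user_pref("widget.non-native-theme.scrollbar.style", 4);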


x86 has (not counting the system-management mode stuff) 4 major modes: real mode, protected mode, virtual 8086 mode, and IA-32e mode. Protected mode and IA-32e mode rely on the bits within the code segment's descriptor to figure out whether or not it is 16-bit, 32-bit, or 64-bit. (For extra fun, you can also have "wrong-size" stack segments, e.g., 32-bit code + 16-bit stack segment!)

16-bit and 32-bit code segments work almost exactly the same in IA-32e mode (what Intel calls "compatibility mode") as they do in protected mode; I think the only real difference is that the task-management stuff doesn't work in IA-32e mode (and consequently features that rely on task management--e.g., virtual-8086 mode--don't work either). It's worth pointing out that if you're running a 64-bit kernel, then all of your 32-bit applications are running in IA-32e mode and not in protected mode. This also means that it's possible to have a 32-bit application that runs 64-bit code!

But I can run the BCD instructions, the crazy segment stuff, etc. all within a 16-bit or 32-bit code segment of a 64-bit executable. I have the programs to prove it.
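
For reference, the 16/32/64-bit distinction comes down to two bits in the code segment descriptor: the L ("long") flag at bit 53 and the D (default operand size) flag at bit 54. A little decoder, using typical flat-segment descriptor values of my own as test inputs:

    #include <stdint.h>
    #include <stdio.h>

    /* Decode a code segment's size from a raw 8-byte descriptor.
     * Per the Intel SDM: L=0,D=0 -> 16-bit; L=0,D=1 -> 32-bit;
     * L=1,D=0 -> 64-bit (IA-32e mode only); L=1,D=1 is reserved. */
    static const char *code_segment_mode(uint64_t desc) {
        int l = (int)((desc >> 53) & 1);  /* "long" flag */
        int d = (int)((desc >> 54) & 1);  /* default operand size flag */
        if (l && d) return "reserved";
        if (l)      return "64-bit";
        return d ? "32-bit" : "16-bit";
    }

    int main(void) {
        printf("%s\n", code_segment_mode(0x00af9b000000ffffULL)); /* 64-bit */
        printf("%s\n", code_segment_mode(0x00cf9b000000ffffULL)); /* 32-bit */
        printf("%s\n", code_segment_mode(0x00009b000000ffffULL)); /* 16-bit */
        return 0;
    }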


Yes, but you transition between the 2 modes with far jumps, far calls or far returns, which reload the code segment.

Without passing through a far jump/call/return, you cannot alternate between instructions that are valid only in 32-bit mode and instructions that are valid only in 64-bit mode.

Normally you would have 32-bit functions embedded in a 64-bit main program, or vice-versa. Unlike normal functions, which are invoked with near calls and end in near returns, such functions would be invoked with far calls and they would end in far returns.

However, there is no need to write such hybrid programs nowadays. The 32-bit compatibility mode exists mainly for running complete legacy programs that were compiled for 32-bit CPUs.
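
For anyone curious what such a transition looks like, here is a sketch of the classic "Heaven's Gate" trick on x86-64 Linux: a 32-bit process far-calls into the 64-bit user code segment, and the stub far-returns back. The selectors (0x23 for 32-bit user code, 0x33 for 64-bit) are Linux implementation details rather than architectural constants, so treat this as illustrative (build with gcc -m32 on a 64-bit kernel):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* 64-bit stub: mov eax, 42 ; retf. The far return pops the
         * 32-bit EIP and CS that lcall pushed, dropping us back into
         * the 32-bit code segment. */
        static const uint8_t stub[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xcb };

        uint8_t *code = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                             MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (code == MAP_FAILED) return 1;
        memcpy(code, stub, sizeof stub);

        /* 48-bit far pointer: 32-bit offset, then the 16-bit selector
         * of Linux's 64-bit user code segment (0x33). */
        struct { uint32_t off; uint16_t sel; } __attribute__((packed))
            fp = { (uint32_t)(uintptr_t)code, 0x33 };

        uint32_t ret;
        /* The far call reloads CS, switching the CPU into 64-bit mode
         * for the duration of the stub. */
        __asm__ volatile ("lcall *%1" : "=a"(ret) : "m"(fp) : "memory");

        printf("back in 32-bit mode; 64-bit stub returned %u\n", ret);
        return 0;
    }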


> The problem with extremely smart people is not many people understand them.

I know this is a common trope in many media portrayals, but it's really not my experience. The "insufferable genius" stereotype tracks not with the extremely smart people but with the kinda-smart people who are absolute jerks and try to defend their jerkassery on the basis of their intelligence.


The few very brilliant people I've known devoted themselves to mastering a subject, at the cost of neglecting others, like socialization. They were not autistic by any measure of the condition, just very socially undeveloped. Some embraced the awkwardness, but others chose to be jerks because that is easier than rescuing an atrophied skill. The jock equivalent of wearing baggy pants because you skipped leg day.

I've also known a handful of artists, and some seemed to adopt the tortured artist stereotype out of style, not fate. They were convinced no one would take them seriously artistically if they weren't interesting and eccentric. In their case, being a jerk is a fashion.

I guess my point is, we choose which skills we want to develop, and also whether we accept the trade-off, or make excuses like "I'm bad at X" or "I am this way and can't change". Leave that to people who are actually diagnosed with a limiting condition; they usually put in a great deal of effort and still need help to succeed.


Feynman is known for being a very social guy, though.

I understand where you're coming from. I didn't mean it in the context of the pseudo-smart person portraying that (which is obviously a thing, probably more so nowadays), but of a person who is the real article. You meet all walks of life in your lifetime, and that unattainable-ness of very smart people can come across as inaccessible, unexplainable, or arrogant.

The kind of person who has spent much time chiselling their belief system, or who is simply fascinated by a field of study that not many people can relate to at that depth. Feynman was a great communicator, but I can think of a few people who may have Asperger's syndrome who have that exceptional insight into things, which sometimes results in collateral damage in their relationships.

What I mean is there are exceptional people, and sometimes people fail to understand what is exceptional and take exception themselves.

The political narrative of the time was obviously extra cynical about declarations of which team you're playing for, or the refusal to declare one. That's what I meant by non-conformist: they're not interested in the politics.


> The "insufferable genius" stereotype tracks most not for the extremely smart people but the kinda-smart people who are absolute jerks but try to defend their jerkassery on the basis of their intelligence.

Autism plays a lot into this. You'll get people who can seem condescending or unaware of different social norms, and it's genuinely not from a bad place, just a complete inability to understand their own communication style (especially in the moment).


> Autism plays a lot into this. You'll get people who can seem condescending or unaware of different social norms

Recently "autism" is a scapegoat for everything, both claiming to be autist to get a free pass to be a jerk, or calling someone autist because they do something unexpected.

I have been called autistic after a meeting just because I said something could not be done in the proposed timeframe. According to social norms, the correct thing to do was to lie, say it could easily be done, and deal with the expected missed deadlines with even more lies.

Another "autism" trait I have is to say a dry "no" to invitations I don't want to attend, apparently the social norm is to say "yes" and then fake an excuse a couple of hours ahead, or even worse, just don't go.

The point is that the word "autism" (or even "jerk") is being used as a synonym for "direct", "sincere", or "no bullshit" far too often. And I am not talking about calling people fat or ugly out of the blue (that's a real jerk move), but about saying "no" when "no" is enough.


> misogynistic behaviors were cultural at the time, I agree they're abhorrent but people are embedded in their culture.

Moral relativism is a thing, but rather than just asking "misogyny was a thing back then, so should we care that he was a misogynist?", I think the more useful question is: "if he had lived and worked in the 2000s, would he have associated with Epstein?" And to be honest… Feynman strikes me as the kind of person who had the intellect to attract Epstein's attention and also, for lack of a better term, the party attitude to attend a couple of Epstein's parties--the kind of association that ends with awkward statements explaining that he just had no possible idea Epstein was doing anything sexual with children, while conveniently forgetting all the times he was on the private island for some party or another...

That's the real strong vibe I get from Surely You're Joking. He's the kind of person who wants to be seen as someone who gets up to wacky hijinks, to be seen as "cool," and he specifically interprets "cool" in a way that's misogynistic even at a time (when he was dictating the stories that led to Surely You're Joking) when misogyny was starting to become a professional hindrance.

(And one of the things that really worries me about Surely You're Joking is that it's often recommended as a sort of "look at the wacky hijinks you can get up to as a physicist," so recommending the book is a valorization of his wacky hijinks and... well, that's ultimately what Angela's video is about, that's a thing we need to stop doing.)


> That's the real strong vibe I get from Surely You're Joking. He's the kind of person who wants to be seen as someone who gets up to wacky hijinks, to be seen as "cool," and he specifically interprets "cool" in a way that's misogynistic even at a time (when he was dictating the stories that led to Surely You're Joking) when misogyny was starting to become a professional hindrance.

In my experience, everyone who says this is talking about exactly one chapter in Surely You're Joking, but they don't appear to actually have paid close attention to the story. It's a story that Feynman recounts about trying to pick up girls when he was younger. He was advised by an older, "cooler" man to be mean. Feynman tries it and it works, but he feels bad about it and says that he never did it again. People calling Feynman a misogynist for this story seem to have just skipped the end of the chapter.


It's been decades since I read Surely You're Joking, and I'd completely forgotten about that chapter. It plays no part in my conscious recollection of the book.

The episode that really stuck in my mind was the one about his competition with the abacus user, who was better at math, and which essentially ends with Feynman giving up on explaining how he could do a cube root in his head faster, because the abacus user was just someone who couldn't understand a math explanation.


I remembered enjoying the book, so having not read it in a long time, I tried sharing Surely You're Joking with my kids at bedtime.

That chapter wasn’t the only thing I ended up skipping or heavily editing.

* Picking a room at Los Alamos with a window facing the women’s housing, but being disappointed that a tree or something blocked his view. (Wasn’t he also married at this point?)

* Starting a new Uni faculty position and hanging out at student dances, dismayed that girls would stop chatting & dancing with him when they learned he was a prof and not a fellow student.

* Hanging out at strip clubs to practice his drawing skills.

* Considering a textbook sales rep’s offer to help him find “trouble” in Vegas.

So maybe that one chapter turns around some at the end, but it’s not the only cringe-worthy moment in the book, and I can see why some people may have an overall negative opinion.

If I were going to do this with my kids now that they are teens, I wouldn't filter as much and would use the more questionable events as points of discussion.


> would he associate with Epstein?

This is from Lawrence Krauss[0]'s email to Epstein[1]:

> ps. I have decided that Feynman would have done what I did... and I am therefore content.. no matter what... :)

> On Apr 6, 2011, at 3:56 PM, Jeffrey Epstein wrote:

> what evidence? no real sex.. where is she getting her so called facts

Krauss's letter is obviously horrible in its implications. What's interesting to me is his interpretation of what Feynman would have done. Is it a delusional justification of what he'd done with Epstein, or is it based on a certain reputation Feynman had in the science community?

[0] https://en.wikipedia.org/wiki/Lawrence_Krauss
[1] https://www.epstein.media/files/house_oversight_030915/

