The typos will continue until morale improves.

I wouldn't place much stock in small studies like this. https://en.wikipedia.org/wiki/Replication_crisis

Even if the data is truly representative and unbiased, we should also not believe conclusions that rely so heavily on interpreting correlation. One could just as well conclude that male intelligence and beauty go hand in hand. Or that men have a better relationship with the teacher. Or that the effect of female beauty fades with distance. Or that COVID had an effect on make-up style. The confounds are endless.

I wouldn't either. The difference between females and males reeks of the law of small numbers.
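To illustrate the small-numbers point (a minimal simulation, not data from the study; both groups are deliberately drawn from the same distribution, so any gap is pure noise):

  import random, statistics

  # Two groups drawn from the SAME distribution (true difference = 0)
  # still show sizable apparent gaps when n is small.
  random.seed(1)

  def apparent_gap(n):
      a = [random.gauss(0, 1) for _ in range(n)]
      b = [random.gauss(0, 1) for _ in range(n)]
      return abs(statistics.mean(a) - statistics.mean(b))

  for n in (10, 50, 1000):
      gaps = [apparent_gap(n) for _ in range(2000)]
      print(f"n={n}: median spurious gap = {statistics.median(gaps):.2f} sd")

The spurious gap shrinks roughly as 1/sqrt(n), which is why a difference that looks striking in a small sample is weak evidence on its own.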

It's comment catnip for HN though.

This is incorrect. The setting does not apply to Safari. It's for the App Store, for example.

Now tested - you’re right.

> I’m also increasingly skeptical of anyone who sells me something that supposedly saves my time.

Imagine a world in which the promise of AI was that workers could keep their jobs, at the same compensation as before, but work fewer hours and days per week due to increased productivity.

What could you do with those extra hours and days? Sleep better. Exercise more. Prepare healthy meals. Spend more time with family and friends. The benefits to physical and mental well-being are priceless. Even if you happened to earn extra money for the same amount of work, your time can be infinitely more valuable than money.

Unfortunately, that's not this world. Which is why the "increased productivity" promise doesn't seem to benefit workers at all.

If you look at the technological utopias that people imagined 50, 60+ years ago, they involved lives of leisure. If you had told them that advances in technology would not reduce our working hours at all, maybe they would have started smashing the machines back then. Now we're supposed to be happy with more "stuff", even if there's no more time to enjoy it.


> Unfortunately, that's not this world.

At one point, it was this world[1]:

> Consider a typical working day in the medieval period. It stretched from dawn to dusk (sixteen hours in summer and eight in winter), but, as the Bishop Pilkington has noted, work was intermittent - called to a halt for breakfast, lunch, the customary afternoon nap, and dinner. Depending on time and place, there were also midmorning and midafternoon refreshment breaks. These rest periods were the traditional rights of laborers, which they enjoyed even during peak harvest times. During slack periods, which accounted for a large part of the year, adherence to regular working hours was not usual. According to Oxford Professor James E. Thorold Rogers[1], the medieval workday was not more than eight hours. The worker participating in the eight-hour movements of the late nineteenth century was "simply striving to recover what his ancestor worked by four or five centuries ago."

> The contrast between capitalist and precapitalist work patterns is most striking in respect to the working year. The medieval calendar was filled with holidays. Official -- that is, church -- holidays included not only long "vacations" at Christmas, Easter, and midsummer but also numerous saints' and rest days. These were spent both in sober churchgoing and in feasting, drinking and merrymaking. In addition to official celebrations, there were often weeks' worth of ales -- to mark important life events (bride ales or wake ales) as well as less momentous occasions (scot ale, lamb ale, and hock ale). All told, holiday leisure time in medieval England took up probably about one-third of the year. And the English were apparently working harder than their neighbors. The ancien régime in France is reported to have guaranteed fifty-two Sundays, ninety rest days, and thirty-eight holidays. In Spain, travelers noted that holidays totaled five months per year.[5]

> The peasant's free time extended beyond officially sanctioned holidays. There is considerable evidence of what economists call the backward-bending supply curve of labor -- the idea that when wages rise, workers supply less labor. During one period of unusually high wages (the late fourteenth century), many laborers refused to work "by the year or the half year or by any of the usual terms but only by the day." And they worked only as many days as were necessary to earn their customary income -- which in this case amounted to about 120 days a year, for a probable total of only 1,440 hours annually (this estimate assumes a 12-hour day because the days worked were probably during spring, summer and fall). A thirteenth-century estimate finds that whole peasant families did not put in more than 150 days per year on their land. Manorial records from fourteenth-century England indicate an extremely short working year -- 175 days -- for servile laborers. Later evidence for farmer-miners, a group with control over their worktime, indicates they worked only 180 days a year.

[1] https://groups.csail.mit.edu/mac/users/rauch/worktime/hours_...


> if it's limited to re-writing clickbait headlines

It's already not so limited:

"sometimes changing their meaning in the process."

"It almost sounds like we’re endorsing a product we do not recommend at all."


> In the 90s the same argument was directed at this new thing called the internet, and those who placed a bet on it being a fad ended up being forgotten by history.

Almost all people are "forgotten" by history.

In any case, people who were not even born yet in the 1990s are using the internet today, very successfully, so clearly you can wait.


> it will be nearly impossible to learn them from scratch.

Are you claiming that all future generations of would-be programmers are doomed?


I'm saying that there's a cost to waiting, just like there's a cost to jumping in early. The cost is SPECIFICALLY that it is harder to jump into a mature field with its own jargon and concerns. The assumption I think folks are making is that their engineering prowess will save them here: whatever complicated thing matters in AI land will be easily visible to a rank outsider.

The whole premise of "imma wait" is not sober patience, it's the implied "imma wait until everything falls apart, then we'll go back to what I know how to do" that people don't like saying. It's an argument (often not even stated as such) that the people who don't jump in will be healthier and happier for just having ignored this wave.

I think that's baloney. It's not FOMO I'm arguing about but the idea that real practices and infrastructure are being built right now, and people are internalizing them. Folks who aren't a part of it just aren't internalizing any of that. As the tech gets better (and it will!), those practices and infrastructure get more complex, more specialized. The idea that I can just wait years and then "engineer harder" to understand this from the outside while being competitive is fantasy. Maybe some subset of people can, bully for them. Most people won't be able to.

Future programmers aren't doomed. Future programmers who can't or won't adapt to the biggest change in computing since the slide rule are doomed.


> The cost is SPECIFICALLY that it is harder to jump into a mature field with its own jargon and concerns.

Hasn't that been the case for decades? What specifically is different now, such that for some reason it's harder to jump in now than it was before?

If anything, LLMs are supposed to make things easier, aren't they?

> it's the implied "imma wait until everything falls apart, then we'll go back to what I know how to do" that people don't like saying.

You can read whatever assumption you want into the blog post, but it's not there in the words. You're dunking on a straw man.


> What specifically is different now, such that for some reason it's harder to jump in now than it was before?

Well, the obvious evidence that something is different now is all around us. The point has been made with painful seriousness by people who thought they were making a different one, namely that LLMs represent an unreliable, poorly understood, and hazardous abstraction layer between coders and the machine. Specifically, that this abstraction layer is DIFFERENT than others in the past. There are dozens and dozens of blog posts making this point (some written by machines) on HN. It would be hard not to have come across this point or to miss the chorus of engineers agreeing with it. It's supposedly a cardinal reason why assembly -> C was a "good abstraction" and natural language -> slop is a "bad abstraction." If we take that argument seriously, it represents strong evidence that something new is happening, independent of anything I might say.

Why is it different? C'mon. COME ON. Why can I find a post on the front page of HN when Claude is down for more than 10 minutes? Why can I find out that a new model has been released from the big 5 frontier labs, again on the front page, inside minutes? Why is it different? Did we build trillions of dollars of datacenters for NetBeans or SecondLife or whatever other cartoonish old fad I'm supposed to treat as analogous today? Are we just supposed to imagine that Microsoft, NVIDIA, Facebook, Google, and Alibaba are all just staffed with idiots, or that they're all caught up in irrational exuberance? Are we supposed to watch generation costs march down and outcomes improve and still think 'yeah, this is just like Pets.com'? Are we supposed to yield to vague and suggestive motions toward e.g. the dot com boom as though working with agents were the same thing as investing in a specific internet company ca. 1998? Are we supposed to take from that analogy that an engineer who said "no thanks, I'll wait to see how this internet thing shakes out" in the 1990s was a real smarty to be emulated? Come on.

It's both categorically different and clearly has meaningful material force behind it.

This is a whole different interface to the computer, and even if the eventual outcome is that real engineering work happens with tightly constrained and specialized harnesses around agents, understanding the actual interface is critical. Ironically, the meta-claim here is that good engineers will just be able to vibe out correct practice by engineering harder instead of understanding that core interface! Rather, what will be needed is attention and orientation to the concerns that people care about in the space.

I don't want to dunk on a strawman. I'd much rather not see a whole community of engineers loudly pat each other on the back for not learning about something.


> I don't want to dunk on a strawman.

You said "it is harder to jump into a mature field with its own jargon and concerns." I asked how it's harder than in the past. Your long reply did not appear to explain this at all. Rather, it seemed to be a series of red herrings and dunks, not directly addressing my question.

> I'd much rather not see a whole community of engineers loudly pat each other on the back for not learning about something.

The submitted article said, "I've tried a bunch of them. Some are good. Most are a bit shit. Few are useful to me as they are now."


> Transformative for the better? Time will tell I suppose

That's the point of the blog post. If you can't even say right now whether it's for the better, then there's no reason to rush in.


I read OP as saying it is transformative, at least for them. Whether it's transformative for society is left to be decided.

And conversely, if it is, then there is no point in getting in early, since the whole point is to externalize knowledge and experience.

> https://feedbackassistant.apple.com/feedback/22280434 (that seems to need a login?).

All Feedbacks that you file are private to your own Apple Account.


It depends. Some sites have a soft, client-side paywall and others have a hard, server-side paywall. NYT has the latter, so you can't get the full article text with JS blocked.
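A rough way to tell which kind you're dealing with (a sketch; the URL and snippet below are placeholders, not a real API): fetch the raw HTML without running any JavaScript and check whether the article body is already present.

  import urllib.request

  # If a mid-article sentence appears in the raw HTML (no JS executed),
  # the paywall is soft/client-side; if it's absent, the paywall is
  # likely hard/server-side, like NYT's.
  def body_in_raw_html(url, snippet):
      req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
      with urllib.request.urlopen(req) as resp:
          html = resp.read().decode("utf-8", "replace")
      return snippet in html

  # Hypothetical usage:
  # body_in_raw_html("https://example.com/article", "a sentence from mid-article")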

Yep, that's true, and it feels like an intentional decision on the part of companies. Wider access, or higher margins?
