
LLMs pattern match; they say something that sounds good at this point, but with no notion of correctness. Copilot is like pair programming with a loud, pushy intern who has seen you write stuff before, didn't understand it, but keeps suggesting what to do anyway. Some medium-sized chunks of code can be delegated, but every line it writes needs careful review.

Crazy tech, but companies are just wrong to be trying to use LLMs as any kind of source of truth. Even Google is blind enough to think that AI could be used for search results, which have become memes because they're so bad. And they won't get better; they'll just become more convincing.



I've had quite a bit of success, but my technique is to explain the technology and libraries I'm going to use, think through the problem, stub out function names and how they'll interact, and then let the LLM save me the typing.
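
To give a rough idea of what I mean by "stub out" (a minimal sketch I'm making up here, not real project code; the file and function names are hypothetical), I hand the model something like this and ask it to fill in the bodies:

  # report.py - hypothetical "stub first, let the LLM fill in" workflow.
  # I describe the stack up front, write the signatures and how they
  # interact, and only then ask the model to complete the bodies.

  from dataclasses import dataclass
  from typing import Iterable

  @dataclass
  class Record:
      user_id: str
      amount_cents: int

  def load_records(path: str) -> list[Record]:
      """Read a CSV of (user_id, amount_cents) rows into Record objects."""
      ...  # LLM fills this in; I still review every line before accepting.

  def total_by_user(records: Iterable[Record]) -> dict[str, int]:
      """Sum amounts per user; consumed by format_report below."""
      ...

  def format_report(totals: dict[str, int]) -> str:
      """Render one 'user: $X.YZ' line per user, sorted by user_id."""
      ...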

I'll also use OpenRouter with sessions so I can take one context and use it across a variety of invocation tools without losing the accumulated context.
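
Roughly, that just means reusing one shared message history for every call (again a sketch of my setup rather than anything exact; OpenRouter speaks the OpenAI-compatible API, and the model name below is only an example):

  # Hypothetical sketch: one shared message history, reused for every call,
  # no matter which tool or script issues the request.
  from openai import OpenAI

  client = OpenAI(
      base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
      api_key="sk-or-...",                      # placeholder key
  )

  messages = [
      {"role": "system", "content": "You are helping me implement the stubs in report.py."},
  ]

  def ask(prompt: str, model: str = "anthropic/claude-3.5-sonnet") -> str:
      """Append the prompt, call the model, and append the reply so the next
      invocation (from any tool) sees the full conversation so far."""
      messages.append({"role": "user", "content": prompt})
      reply = client.chat.completions.create(model=model, messages=messages)
      text = reply.choices[0].message.content
      messages.append({"role": "assistant", "content": text})
      return text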

It hasn't done anything I don't know how to do - it fails if I ask it to do that. But it does save me lots of typing and thinking about minutiae.

It's not magic, it's still just a program running on a computer - a decent abstraction tool.

I'm sure it will be ruined in time like every new paradigm when the next generation feels a need to complicate this new tidy little world.


Not once has Copilot ever suggested a correction, found a bug, noticed a typo, or prompted for a better solution, which is what any human pair programmer would do. It's a tool. But thinking it's like a "copilot", as the marketing suggests, is fundamentally missing the point. It won't get better until people recognise what it _can't_ do as much as what it appears it can do.



