
> In order for the LLM or agent to do what you want, you'd need to be able to precisely specify what you want

no, you just need a vague idea of what you want: get the LLM to produce something, examine it, and iterate towards the end goal.

LLMs could potentially allow fast iteration from a layman's description of what they want.



Sometimes when writing 'fiddly' code, I'll have a bug.

But I can't find the bug. I get the wrong answers but can't trace it through the logic.

Maybe it's a dumb thing like a missing index increment? Or a missing assignment and I just can't see it.

Maybe it's easier to just tear down the mess and write it again.
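A made-up illustration of the kind of "dumb" bug described above (the function and data are hypothetical, not from the thread): a loop where the index increment is easy to drop, producing a hang or wrong answer with no obvious error to trace.

```python
# Hypothetical example of a 'fiddly' bug: summing every other element.
# Forgetting the `i += 2` line (or writing `i + 2` without assignment)
# makes the loop spin forever -- exactly the sort of thing that is
# hard to spot by reading but obvious once found.
def sum_every_other(xs):
    total = 0
    i = 0
    while i < len(xs):
        total += xs[i]
        i += 2  # the easy-to-miss increment
    return total

print(sum_every_other([1, 2, 3, 4, 5]))  # 1 + 3 + 5 = 9
```

With AI-generated code the same class of bug appears, but in logic you never wrote in the first place.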

This is how I feel whenever I deal with AI generated code.


Exactly this - show me something and I can tell the AI what I don't like or what it is missing.

Equally, you can ask the GenAI to keep asking you questions to broaden its knowledge of the problem you are solving, and also ask it to research the issues customers are having with a current solution.

Some engineers seem to imagine any non-coder using AI will behave very simply ('make me a new search engine'). Lots of very clever people (who just don't know how to code, or don't want to learn) will be picking up the skills to use AI as it gets better and better.

I can see AI being used to write far better requirements and to produce amazing prototypes - but if you work at a megacorp, chances are (for now) they will want that code rewritten by a 'human' developer.


Today, millions of lines of ChatGPT-generated code will be committed in large organizations.


True, I'm no doubt being too cautious. Where I work it is mainly used for unit testing and prototyping - as I understand it, we are using it with developers, but always with a human review. We never have a product owner making code with an AI tool and deploying to live. Yet.


Oh dear God, if you think product owners are going to be committing code you must be missing something. The AI is there to make devs' work easier, not to replace them. We are nowhere near that. Too much is involved.


In what languages can ChatGPT write anything remotely sane and deployable?


I have had it build out an entire API in Node/Express. Shit needed help for sure. The point of AI is to have it do low-value work like scaffolding and boilerplate. The AI goes off the rails a lot. You have to have the skills to recognize when it is and change course.
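The scaffolding/boilerplate point can be sketched without the Express specifics. Here is a minimal Python stand-in (hypothetical routes, standard library only) for the kind of repetitive routing code an LLM can churn out quickly - none of this is the commenter's actual code:

```python
import json

# Hypothetical stand-in for API scaffolding: a route table mapping
# (method, path) pairs to handlers -- the repetitive boilerplate an
# LLM produces quickly, and where it also tends to go off the rails.
ROUTES = {}

def route(method, path):
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/items")
def list_items():
    return {"items": ["a", "b"]}

@route("POST", "/items")
def create_item():
    return {"created": True}

def dispatch(method, path):
    """Look up a handler and return (status, JSON body)."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler())

print(dispatch("GET", "/items"))   # (200, '{"items": ["a", "b"]}')
print(dispatch("GET", "/missing")) # (404, '{"error": "not found"}')
```

The value is in generating and filling these tables fast; the risk is exactly what the comment describes - you still need the skill to see when the generated scaffolding is subtly wrong.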


it doesn't matter, it's still likely true. many programmers use it, and many of those generated lines will in fact become part of commits. (maybe not millions, but ... it depends on the definition of large orgs.)


The problem with this plan is that reading code is the hardest part of coding - especially code you haven't written.


The issue with LLM driven development is that it’s often as hard to verify the outputs of the model as it would’ve been to write it myself. It’s basically the programming equivalent of a Gish gallop.


You could also do the same thing with a high-level language. Your LLM is nothing more than an interactive optimizer.


but now you have to go learn that high-level language, rather than use the natural language you already know.


But now you need to learn how the LLM understands your natural-language words, which is very context dependent and will change with the next LLM update. I don't think that takes less time, at least if you are going to write something non-trivial.



