This is an interesting word. Halfway across the Alpine-Himalayan backbone the word for this tool is "kobilitsa", cognate to "kobila" (female horse), and I always figured it was a metaphor for the arched form of the bar (and probably a nod to how women were stuck with the task of fetching water with it).
But now it seems the word "kaavidi" has reached our ancestral lifestyle all the way from the Indian lands! And the transformation it underwent was more due to how people "normalize" foreign words; same linguistic churn that gives us backronyms and false etymologies (see also "eggcorn"). Whoa
I presume linguists have already studied the naming of household implements when deriving Proto-Indo-European. I've never encountered much literature on the subject, but I find it rather fascinating nonetheless.
Ah well, I'm neurodivergent and it’s challenging for me to write a comment while remembering that others don’t have access to my thoughts and might interpret things differently. And it's too late to edit it now
What I wanted to show is that, clearly unlike a camera or other devices, AI can copy originality. OP's comment was pretty original in its wording, and GPT came pretty close imo. It really wasn't meant as a low-effort comment
If AI can copy originality, we should be questioning our concept of originality. Not that it isn't a very suspect concept already. It's something to praise children for (autonomously combining the concepts provided to them) - but I've had the unwisdom of lurking long enough on the pre-LLM noosphere to get a glimpse of what gets done to the real originals since time self-fulfillingly immemorial.
'sides, I'm also neurocute af and feel acutely endangered by stuff that's effectively channeled: expressions which represent coherent agentic thought, yet are not bound to the volition of any individual embodied organism by regular personal accountability. If psychology is any indication, as non-augmented humans we're already not consciously aware of half our driving forces, and that's been trouble enough throughout history. How do you find it at all safe for the environment to broadcast Big Nobody's thoughts towards sentient beings?
AI is the linguistic instrument of those who cast away their capacity for bona fide verbal cognition as a condition of entering the hierarchy. They're a rather dangerous bunch, what with no beans to spill or marbles to lose. Subtle too, "somehow" got all normies working for 'em and usually not even knowing it till it's too late. Tell em I said hi!
And programming languages are designed for clarifying the implementation details of abstract processes; while human language is this undocumented, half grandfathered in, half adversarially designed instrument for making apes get along (as in, move in the same general direction) without excessive stench.
The humane and the machinic need to meet halfway - any computing endeavor involves not only specifying something clearly enough for a computer to execute it, but also communicating to humans how to benefit from the process thus specified. And that's the proper domain not only of software engineering, but the set of related disciplines (such as the various non-coding roles you'd have in a project team - if you have any luck, that is).
But considering the incentive misalignments which easily come to dominate in this space even when multiple supposedly conscious humans are ostensibly keeping their eyes on the ball, no matter how good the language machines get at doing the job of any of those roles, I will still intuitively mistrust them exactly as I mistrust any human or organization with responsibly wielding the kind of pre-LLM power required for coordinating humans well enough to produce industrial-scale LLMs in the first place.
What's said upthread about the wordbox continually trying to revert you to the mean as you're trying to prod it with the cowtool of English into outputting something novel, rings very true to me. It's not an LLM-specific selection pressure, but one that LLMs are very likely to have 10x-1000xed as the culmination of a multigenerational gambit of sorts; one whose outset I'd place with the ever-improving immersive simulations that got the GPU supply chain going.
I first talked with my wife at a pizza place while hanging out with mutual friends. We bonded through in-person, face-to-face conversation like cave people. She never used dating apps, so there was no other way we were going to meet.
Turns out if you leave your home and hang out with people a lot, you build better social skills, and potential partners can get to know you as a friend first, and are more inclined to give you a chance at something more than they are when you are just another awkward photo in a dating app.
IMO both perspectives have their place. Sometimes what's missing is the information, sometimes what's lacking is the ability to communicate it and/or the willingness to understand it. So in different circumstances either viewpoint may be appropriate.
What's missing more often than not, across fields of study as well as levels of education, is the overall commitment to conceptual integrity. From this we observe people's habitual inability or unwillingness to be definite about what their words mean - and their consequent fear of abstraction.
If one is in the habit of using one's set of concepts in the manner of bludgeons, one will find many ways and many reasons to bludgeon another with them - such as if a person turned out to be using concepts as something more akin to clockwork.
Username checks out... well, I can help ya.
You start out easy, like "who invented all those damn conspiracy theories and introduced them into the public culture, anyway?"