>> People are doubtful that the agent will be able to complete the task properly.
You answered your own question.
I do not trust an agent enough to give it unsupervised access to my systems.
If I had a completely local agent that was fully sandboxed, I would be willing to put data in the sandbox, give it a task, and come back later to see what it did.
I would not trust agents to run unsupervised without similar restrictions.
>> Six weeks in, things changed. The Linux installations started to degrade — subtle at first, then undeniable. Random slowdowns. Browser links that wouldn't register for 10 or 15 seconds. The kind of frustration that makes you stare at the screen and wonder what's happening under the hood. It was consistent across distributions, which suggests this wasn't just a bad package here or there. Something fundamental was happening.
Without more details it would be difficult to determine what problems you were having.
I have never had problems like you describe with Linux. I would be interested to know more details.
"* The header "Vaccines do not cause autism" has not been removed due to an agreement with the chair of the U.S. Senate Health, Education, Labor, and Pensions Committee that it would remain on the CDC website."
They kept the header because they legally agreed to keep it, but the rest is conspiracy propaganda.
Having a bard in your party let you choose a soundtrack and their songs brought magical effects. For example, the Rhyme of Duotime let your party attack more frequently in combat: https://www.youtube.com/watch?v=_oR4j7w4FIY
Interesting, I didn't know BT3 was by a different author; it definitely had its own vibe distinct from the first two, which this guy wrote (https://en.wikipedia.org/wiki/Michael_Cranford). I liked them all, though.
The Steam remasters are incredibly faithful to the originals - right down to the timing and flow of the turn-based combat. Makes me wonder if they are emulating the original code somehow.
The first trilogy (including BT3) was also remastered about 7 years ago and released on Steam, it's like $15 and has many quality of life improvements.
In middle school, a friend and I 'cracked' that decoder ring by copying all the info by hand on to paper so we could both play the game from one store bought copy because we were poor. I don't think we ever finished the game, but it's still one of my happiest early gaming memories.
They remastered all three of the first Bard's Tale games a few years ago and released them on Steam with many quality of life improvements-- I bought the set without a second thought even though I know I will probably never take the time to play it all the way through. I've spent a few dozen hours on it so far, though.
I wonder which [publicly listed] companies would look at the abandonment of Jetson and still commit to having Nvidia set the depreciation schedule for them.
They already do have a pretty robust software stack that goes all the way up to code/analytics libraries. I'm not sure of the current state of things, but around 2020 they were automatically testing chip designs for performance regressions in analytics libraries across the entire stack, from hardware to each piece of software.
Debian just cut i386, and Wikipedia says the i386 was discontinued in 2007. These systems are all of the same vintage, so it does not seem a huge leap for them to be culled from the support list.
The actual Intel 80386 processor was discontinued in 2007, but the i386 architecture -- ia32, 32-bit x86 -- lived on for longer in the semi-mainstream (there were 32-bit Intel Atoms introduced to the market as late as 2012, AMD Geodes were sold until at least 2019, and I believe some VIA C3/C7 derivatives made it into the 2010s as well) and is in fact still in production to this day for the embedded and industrial markets (Vortex86, etc).
Yeah, and you can still run i386 binaries on widely available amd64 CPUs. So this is an even stronger argument for killing these other obsolete platforms.
i386-only (32-bit) processors were discontinued, but 64-bit processors can operate in 32-bit mode, so the toolchain remained widely available and there was still demand for an i386 OS that would run on modern hardware in i386 mode for some ancient software.
> Who is still using these machines? Genuine question, not trolling.
Either legacy systems (which are most certainly not running the current bleeding-edge Debian) or retro computing enthusiasts.
These platforms are long obsolete and there are no practical reasons to run them besides "I have a box in the corner that's running untouched for the last 20 years" and "for fun". I can get a more powerful and power efficient computer (than any of these systems) from my local e-waste recycling facility for free.
It’s usually a loud minority of trolls or hobbyists. It just takes one to spark a doubt.
Here is one famous example of a dude who’s managed to get PRs merged in dozens of packages, just to make them compatible with ancient versions of nodejs https://news.ycombinator.com/item?id=44831811
Wow that was an interesting read. I find it amusing that nobody seems to really know who he is or what his motives are, yet his code is run on millions of machines every day.
Sure, but almost nobody uses or wants modern linuxes on those machines. It's almost always described (in their own words) as "some ancient crusty version of Debian"
> Nobody wants to play games on Linux given the small userbase compared to Windows.
According to the last Steam survey, 3% of players use Linux. Steam has 130 million active players, so that means there are 4 million people playing on Linux. Definitely not "nobody", and way bigger than the whole retrocomputing community.
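As a back-of-the-envelope check of that estimate (using the approximate figures cited above, not exact Steam numbers):

```python
# Rough estimate of Linux players on Steam, using the figures cited above
steam_active_players = 130_000_000  # approximate active player count
linux_share = 0.03                  # ~3% from the Steam hardware survey

linux_players = steam_active_players * linux_share
print(f"{linux_players:,.0f}")  # 3,900,000 -- roughly 4 million
```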
By the way, I am also one of those retrocomputing guys, I have a Pentium 2 running Windows 98 right here. IMHO, trying to shoehorn modern software on old hardware is ridiculous, the whole point of retro hardware is using retro software.
> Who is still using these machines? Genuine question, not trolling.
Well, there are so many things where you could argue about the relevance of a userbase.
If the size of a userbase were the only argument, Valve could just drop support for the Linux userbase, which is just 2-3% of their overall userbase.
Not your point, but Linux compatibility is Valve protecting themselves from the existential risk that is a capricious Microsoft. At one point, it seemed Microsoft was trying to make the Microsoft Store be the mechanism for distributing all software. Linux being viable for gaming is/was their safety net to avoid being locked out from the ecosystem.
popcon.debian.org reports 3 alpha installations and 261750 amd64 installations. Assuming comparable opt-in rates, fewer than 0.002% of users are running alpha.
The other mentioned architectures hppa, m68k and sh4 are at a similar level.
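For what it's worth, the share works out like this (using the popcon counts quoted above):

```python
# Share of popcon submissions coming from alpha machines, per the counts above
alpha_installs = 3
amd64_installs = 261_750

share = alpha_installs / (alpha_installs + amd64_installs)
print(f"{share:.4%}")  # about 0.0011%, comfortably under 0.002%
```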
Valve isn't a good example. They have strong Linux support so they can sell Steamdecks without licensing with Microsoft. Without their work on Proton, Steam effectively lives or dies by the will of Microsoft.
They might run Debian, but not upstream Debian/stable.
You mainly find that with systems needing certification.
These are the kinds of situations where having a C language spec isn't enough; instead you need a version-specific spec of the compiler.
Similarly, they tend to run the same checkout of the OS with project-specific security updates back-ported to it, instead of doing generic system updates (because every single update needs to be re-certified).
But that is such a huge effort that companies don't want to run a full OS at all: just the kernel and the most minimal set of packages you really need, and not one more binary than that.
And they might have picked Debian as an initial source for their packages, kernel, etc., but it isn't really Debian anymore.
If we are talking about embedded control systems, no, you don't want new software, you want your machine to do what it is supposed to do. At my workplace we have some old VME crates running VxWorks, and nobody is gonna update those to the latest Linux distro.
This is incorrect. Internet connections and 3rd party integrations have changed this view of “the software doesn’t need to change once it leaves the factory”.
John Deere, Caterpillar, etc are leaning heavily into the “connected industrial equipment” world. GE engines on airplanes have updatable software and relay telemetry back to GE from flights.
The embedded world changed. You just might have missed it if your view is what shipped out before 2010.
My experience is in big scientific experiments like particle accelerators, I guess other fields are different. Still, my experience is that:
1) The control network is air gapped, any kind of direct Internet connection is very much forbidden.
2) Embedded real-time stuff usually runs on VxWorks or RTEMS, not Linux. If it is Linux, it is a specialized distro like NI Linux.
3) Anything designed in the last 15 years uses ARM. Older systems use PowerPC. Nobody has used Alpha, HPPA, SH4 or m68k in ages. So if you really want to run Debian on it, just go ahead and use Armbian.
Yes, you are out of touch with what has happened with embedded. Companies love internet connected things, especially big industrial things.
It’s absolutely terrible for security but remote visibility into how your 100 ton haul truck is operating via some cloud API is what people like and keep buying.
No air gap, just hooked up to a cell phone network with maybe a VPN if you’re lucky. Either way, the kernel is handling packets directly from the Internet and keeping the kernel up to date is critical.