I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
Any decent project should have a way to install without Homebrew. It's really not necessary.
> and couldn't upgrade because Apple didn't support my hardware anymore
I'd classify that as an Apple problem rather than a Homebrew problem. If Apple themselves cannot be arsed to support an OS version, why would a volunteer project take on such a challenge?
For every piece of software I've fetched using Homebrew, there's a "compile from source" option available on Github or some other source repo.
It wouldn’t cost Homebrew folks much to add a flag to skip dependency version checking which would solve most issues with using older macOS. But they don’t want to, and have closed all issues asking for it as wontfix.
Seems like good enough a reason for them not to do it.
Their tooling is open-source, surely the few people still using unmaintained versions of macOS can create a `LegacyHomeBrew/brew` repository with patches for old macOS versions? It would also be a good place to stuff all the patches and workarounds that may be necessary to support old macOS versions.
They said they don’t want that [1]. It’s not just me, several people have asked for it. Maintaining an extra fork just for that is also out of the question for most people.
Was gonna say the same thing. There are tons of projects that support older unsupported OS versions or even different platforms. Whether that's macOS, Windows, or older versions of the Linux kernel.
>I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
How old was it? With macOS, "running an old version" is not really a viable or advisable path beyond a certain point. It might be something people want to do, and it might be a great option to have, but it's not very workable, nor is it supported by Apple and the general ecosystem.
>Any decent project should have a way to install without Homebrew. It's really not necessary.
We don't install homebrew because it's necessary, but because it's convenient. No way in hell I'm gonna install 50+ programs I use one by one using the projects' own installers.
Besides, if "Homebrew dropped support" is an inconvenience, "manually look for dozens of individual installers or binaries, make sure dependencies work well together, build when needed, and update all that yourself again manually" is even more of an inconvenience. Not to mention many projects drop support for macOS versions on their own all the time, or offer no binaries or installers at all.
Why not use MacPorts, which currently supports all the way back to Leopard, has far more packages than Homebrew, has a better design, and was created by the creator of the original FreeBSD ports system who also worked on Apple's UNIX team?
The ubiquity of Homebrew continues to confound me.
I switched to Homebrew after years of Macports because Macports required me to laboriously upgrade all the ports with each major macOS update. Homebrew does not require this. I understand the better design of Macports but in the end Homebrew works well enough and saves much time annually without the need for the manual upgrade.
Or use Homebrew on the old OS with TigerBrew (https://github.com/mistydemeo/tigerbrew), but people online suggest MacPorts, not only because it has first-party support but also because it’s apparently better designed.
I'm fine with homebrew not supporting whatever versions they choose.
I think GP's issue is forcing the use of homebrew for what seems like a rather trivial install. Just make the binary easily downloadable. It's not like you can't open the curled script to see what it fetches and do it yourself. It's just that having to jump through this useless hoop is annoying.
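Concretely, something like this (the URL is a placeholder for whatever the install docs give you):

```
# fetch the installer, read it, then run it yourself
curl -fsSL https://example.com/install.sh -o install.sh
less install.sh      # see what it actually downloads and where it puts things
bash install.sh      # run it only once you're satisfied
```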
My mac is running the latest version of Tahoe but I never liked homebrew. You can bet I won't install it just for one app.
Homebrew really helps when you want to install more than one app... And you want to keep them updated... And you want to easily delete some of them at some point.
Managing the install lifecycle with one set of commands for multiple apps is why I love Homebrew
Apple controls these computers? I am using Linux myself; I compile from source though. To me it would seem super-strange to use an operating system where a private entity decides what it wants to do.
The people who pay for operating systems are paying for a private entity to decide what the operating system should do. They're paying for someone to compile it from source and get it to run on their computer and maintain it.
That's the whole point. Paying someone for that thing you also know how to do so they can consider that problem solved and focus on the things they know how to do.
Not sure where you're getting this from, but the latest macOS works on devices from 2019, so that's at least 6 years of support. And Homebrew fully supports macOS 14 and later (with partial support going back to 10.15), which means full support for 2018 devices, and potentially even devices from 2012 will work.
More than six. 2019/2020 Intel Macs get Tahoe 26.0 + about three years of security patches for Tahoe. The last Intel Mac will be out of support in probably late 2028.
The iMac Pro is a 2017 computer, although it was sold until 2021. So given that it runs Sequoia, that's anywhere from six to ten years of OS support. OCLP will probably figure out how to patch Tahoe for the iMac Pro soon enough, but until then, you can rejoice in the fact that you don't have to run Tahoe.
It could be worse -- at least you didn't spend tens of thousands on a 2019 model Intel Mac Pro in 2023. (Yes, they still sold them, and owners of those will be SOL in 2028. That's probably the worst OS support story in recent Apple history, and it's for some of their most expensive machines)
Actually you are correct. I've been following the HN threads about Tahoe and even watched a few YouTube videos and could only facepalm.
But then again I'll get rid of the iMac Pro this year. I'll have technicians butcher it and salvage whatever they can from it -- I suspect only the SSD will survive -- and will then tell them to hollow it out and put an R1811 board inside so I can use it as a proper standalone 5K screen. I don't care about Macs anymore; they limit me too much, and I can't justify maintaining multiple Linux machines just for the occasional times I want to do something Macs can't (like experiment with bcachefs or ZFS pools and volumes and snapshots for my continually evolving backup setup).
Fair. The screens are really beautiful, absolutely worth reusing if possible.
I'll be decommissioning 40+ 2020 27" iMacs this year (i9-9900, 32 GB) and it's such a shame to see so many great displays and otherwise functional and plenty fast computers become, essentially, e-waste.
I agree, it is a huge shame. And the R1811 boards are more or less 300 EUR (~360 USD). Not many companies would agree to spend $360 per device on near-future e-waste just to be able to extract the high-quality display. True shame.
But I've learned my lesson. While Apple computers served me well from 2019 to 2026, macOS gets less and less usable for me: the list of things I want to be able to do on it keeps growing, and its appeal keeps shrinking (not to mention the very justified OCD I get when I look at how much crap is running 24/7 on it!).
The iPhone stays, though I wonder for how much longer. But the Mac will be on its way soon enough.
Homebrew and MacPorts unfortunately do not fit the macOS installation layout very well anymore. Packages installed outside the usual places create a lot of headaches during updates.
I've also preferred not to use them for the last 16+ years, and I'm not planning to start.
I wish mac users would stop using homebrew and use a real package manager with actual dependency management.
At the very least, replace homebrew with something like devbox, which has `devbox global` for globally managing packages. It uses nix under the hood, and it's probably the simplest, most direct replacement for homebrew.
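A rough sketch of what that looks like, if I remember the devbox CLI right (package names are just examples):

```
devbox global add ripgrep jq      # add packages to the global profile (backed by nix)
devbox global list                # see what's installed
eval "$(devbox global shellenv)"  # put the profile on PATH (goes in ~/.zshrc or ~/.bashrc)
```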
I don't agree this is an issue and I'll tell you why: Homebrew isn't responsible for keeping the system functional like apt or pacman, it's a supplemental thing. I've also found it's useful in this capacity on Linux specifically with LTS distros, I can get the latest fzf or zoxide or whatever without having to add some shady repo.
This is how I see/use brew as well, and being able to just blow the directory away anytime and start over if need be is nice.
It's not a "system" package manager, nor was it ever meant to be. It's supplemental. I've also found it valuable on the various immutable Linux distros.
I use MacPorts because of older versions of Homebrew having a weird and insecure design. [1] I think some of those design issues may have been fixed, but I’m wary of Homebrew.
It's not necessary because Mac applications shouldn't have any dependencies other than the OS. (Whatever additional libraries they use should be included.) This should also be true of basic developer tools. Once you're in a particular ecosystem, tools like deno, npm, or uv can handle their own dependencies.
Alternatively, you could do development in a container and use apt-get there. That's probably safest now that we're using coding agents.
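Something along these lines, as a sketch (the image and packages are just examples):

```
# throwaway dev container with the repo bind-mounted; nothing touches the host
docker run -it --rm -v "$PWD":/work -w /work ubuntu:24.04 bash
# inside the container:
apt-get update && apt-get install -y build-essential git
```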
MacPorts was created by the creator of the original FreeBSD ports system who was also an Apple employee. It ought to be everyone's first choice for package management on macOS.
I wish the mac users would switch to a real OS, linux, so that software companies would release linux versions of stuff first.
Codex, Claude Desktop, etc etc all starting out as "macOS exclusive" feels so silly when they're targeting programmers. Linux is the only OS a programmer can actually patch and contribute to, and yet somehow we've got a huge number of developers who don't care about having a good package manager, don't care about being able to modify their kernel, don't care about their freedom to access and edit the code of the software they rely on to work...
It's depressing how much of the software industry is just people on macbooks using homebrew to install a newer version of bash and paying $5 for "magnet" to snap windows to the corners since their OS holds them in a prison where they can't simply build themselves a tiling window manager in a weekend.
The OS is core to your tools and workflows, and using macOS cedes your right to understand, edit, and improve your OS and workflows to a company that is actively hostile to open source, and more and more hostile to users (with a significant increase in ads and overly priced paid services over the years).
Anyway, yeah, homebrew sucks. At least nix works on macOS now so there's an okay package manager there, but frankly support for macOS has been a huge drag of resources on the nix ecosystem, and I wish macOS would die off in the programming ecosystem so nix could ditch it.
I harbor similar sentiments, but I understand why OpenAI, Anthropic, Zed, etc begin with a macOS version. They're able to target a platform which is a known quantity and a good jumping off point to Linux.
I'm writing software for Linux myself and I know that you run into weird edge case windowing / graphical bugs based on environment. People are reasonably running either x11 or wayland (ecosystem is still in flux in transition) against environments like Gnome, KDE, Sway, Niri, xfce, Cinnamon, labwc, hyprland, mate, budgie, lxqt, cosmic... not to mention the different packaging ecosystem.
I don't blame companies, it seems more sane to begin with a limited scope of macOS.
The problem is that right now I have to choose the lesser of two evils. I hate what W11 has become. I only use it for games at the moment, and the only reason is that some games (Apex/BF6) do not run under Proton because of their anticheat.
And I also hate what modern Macos is heading towards. I'm still ignoring/canceling the update on both my devices for the new "glass" interface.
And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Truth be told I just want to have my mbp running Linux. But right now it's not yet where it needs to be and I am most certainly not smart enough to help build it :(
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
I'm using a decade old thinkpad running linux and it is definitely 'doing it for me'. And I'm not exactly a light user. Power-efficient Mac hardware should be weighed against convenience and price. The developer ecosystem on Linux is light-years ahead of the Apple one; I don't understand why developers still use either Windows or the Mac, because I always see them struggle with the simplest things that on Linux you don't even realize could be a problem.
Other OSs feel like you're always in some kind of jailbreak mode working around artificial restrictions. But sure, it looks snazzy, compared to my chipped battle ax.
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Are you talking about the battery? I bought a T16 AMD a month ago with the 86Wh battery and it lasts between 8 and 12 hours depending on the usage. Not as much as a MacBook, but enough to not worry too much about it. The new Intel ones are supposed to be much better on power efficiency.
It's of course one level below the Mac in that regard (and maybe others too), but if you want to use Linux I think the trade-off is worth it.
It's Apple, not the users, that need to make that switch in the first instance. I'd love to use Linux again but I'm not leaving Apple hardware for it, or accepting poor software support for recent hardware.
I admit I love the mbp hardware, but I can't stand macos anymore. So when my work computer was up for replacement, I didn't think twice and went with a PC, the latest thinkpad p14s. Everything works out of the box on Linux.
Is it as nice as a mac? No, especially the plastic case doesn't feel as nice under the hands as a mac's aluminum, the touchpad is quite good but worse than a mac's, and there are some gaps around the display hinge. But the display itself is quite nice (similar resolution, oled, although not as bright as a mac's), it's silent and it's plenty fast for what I do. I didn't pay for it, so I don't directly care about this point in this situation, but it also cost around half of what an equivalent mbp would have cost.
I also haven't tried the battery life yet, but it should hold at least as well as my 5-yo hp elitebook, which still held for around 5 hours last year. I basically never use it for more than an hour unplugged, so battery life is low on my priorities.
I dunno, I'm pretty happy with my thinkpad. Even if I could run Linux flawlessly on a macbook (which you can't, unfortunately) I'd still take the thinkpad hardware over a macbook.
A macbook air is 1.25kg, and my thinkpad is 910g, and I can really feel that difference. The thinkpad keyboard also feels ever so slightly better too... and Linux working well is worth more than pretty much anything else.
It's ok, Apple knows this and will lock its OS down to an iPhone-like OS step by step until you're boxed into a nice little prison, and you'll accept it.
Also you'll pay them 30% on every transaction you do on said computer.
I'd say support for linux has improved an incredible amount compared to 5-10 years ago. I'm often pleasantly surprised when ever a linux version of something is available because I'm used to not expecting that haha.
MacPorts has existed since 2002 and was invented by Jordan Hubbard, who created the original FreeBSD ports system and was also employed on Apple's UNIX team.
The package management story on Linux is hideously bad. The next generation replacements are all over the place (do I use snaps? Flatpak?). No one is going to learn Nix if it means you need to become a programmer just to install something.
The graphics story on Linux also sucks. I recently tried to convert my Windows gaming machine to Linux (because I hate W11 with a burning passion). It does work, but it’s incredibly painful. Wayland, fractional scaling, 120+ Hz, HDR. It’s getting better thanks to all the work Valve etc are putting in, but it’s still a janky messy patchwork.
MacOS just works. It works reliably. Installing things is easy. Playing games is easy. I’m able to customize and configure enough for my needs. I love it and I hope it sticks around because there is no way in hell I would move my work machines over to Linux full time.
What's wrong with those? I don't have a single screen which does 120 Hz + HDR, but I'm typing this on a 120 Hz laptop, with variable refresh rate, at 125% scaling, and everything works great with Plasma (haven't tried anything else). I also have an external HDR screen, but it only does 60 Hz. It works great, too, doing HDR on it but not on the laptop screen (running at the same time, of course). They also run at different scaling (125% and 100%).
Now I don't know how to confirm that VRR is actually doing anything, but I can tell there's a difference between setting the monitor to 60 and to 120 Hz. HDR on the other screen also produces a clear difference.
This is all running from integrated intel graphics, maybe with other GPUs it's more of a crapshoot, no idea.
Huh? Homebrew supports and frequently uses dependencies between formulae. It’s a bit janky around upgrades in my experience, but you’re going to have to clarify what you mean.
Dependency management means the ability to have more than 1 version of the dependency installed, under the same package name.
i.e. Let's say you install a bunch of homebrew packages, everything is working. Then 6 months later you go to install another package - homebrew likes to upgrade all your packages (and their dependencies) willy nilly.
And if it breaks shit, there's no way to downgrade to a specific version. Sometimes shit broke because the newer package is actually a broken package, or sometimes it's because the dev environment was depending on a specific version of that package.
There's basically no way to have multiple versions of the exact same package installed unless they use their hacky workaround to create additional packages with the version number included in the package name.
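For reference, that workaround looks roughly like this (the formula name is just an example):

```
brew install node@20   # the versioned formula is effectively a separate package name
brew pin node@20       # stop `brew upgrade` from replacing it behind your back
```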
That wouldn't really help; it could be more naughty and use pastejacking so you don't even realize what's happening. That might end up catching a lot of people because, as far as I know, bash doesn't use bracketed paste by default, so you think you're copying a real command and it ends up sending your secrets before you know what happened.
Disabling JS + bracketed paste seems to be the only good solution.
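If I have the readline option right, enabling it is a one-liner (bash 5.1+ turns it on by default):

```
# ~/.inputrc -- pasted text is bracketed, so an embedded newline
# doesn't execute the command the instant you paste it
set enable-bracketed-paste on
```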
Btw, the OP article uses a weird setup -- why would they use `bash -c "$(curl $(echo qux | base64))"` instead of just `curl | bash`?
It's not really any different than downloading a binary from a website, which we've been doing for 30 years. Ultimately, it all comes down to trusting the source.
>> Attacks like this are not helped by the increasingly-common "curl | bash" installation instructions ...
> It's not really any different than downloading a binary from a website, which we've been doing for 30 years.
The two are very different, even though some ecosystems (such as PHP) have used the "curl | bash" idiom for about the same amount of time. Specifically, binary downloads from reputable sites have separately published hashes (MD5, SHA, etc.) to confirm what is being retrieved along with other mechanisms to certify the source of the binaries.
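i.e. the download can be checked against a hash published out-of-band, something like this (URL and hash are placeholders):

```
curl -LO https://example.com/tool-1.2.3.tar.gz
# compare against the SHA-256 published on the project's release page
echo "<published-sha256>  tool-1.2.3.tar.gz" | shasum -a 256 -c -
```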
Which is the reason why it's better to actually cryptographically sign the packages, and put a key in some trusted keystore, where it can actually be verified to belong to the real distributor, as well as proving that the key hasn't been changed in X amount of days/months/years.
Still doesn't address the fact that keys can be stolen, people can be tricked, and the gigantic all-consuming issue of people just being too lazy to go through with verifying anything in the first place. (Which is sadly not really a thing you can blame people for; it takes up time for no easily discernible reason other than a vague feeling of security, and I myself have done it many more times than I would like to admit...)
> If the attacker already controls the download link and has a valid https certificate, can't they just modify the published hash as well?
This implies an attacker controlling the server that holds the certificate's private key, or the private key otherwise being exfiltrated (likely in conjunction with a DNS poisoning attack). There is no way for a network client to defend against this type of TLS[0] compromise.
Which is why package managers with well-maintained repositories are the civilized solution to software distribution. Unfortunately the Linux world has been dedicating a lot of energy to making Windows-style "download and run the exe" possible on Linux.
>Which is why package managers with well-maintained repositories are the civilized solution to software distribution.
How does that model work with distros like debian, where they freeze package versions and you might not get claude code until 2027 (or whenever the next release is)?
>Sounds like you either shouldn't use Debian or should find a repo with maintainers who align with your preferred style of package inclusion.
Are there actually viable alternatives to the default debian repo? At best there are repositories run by various projects, but that's basically the same level of security as "run a random binary you downloaded off the internet". The only plausible way that package managers increase security is through curation. If you're just blindly adding whatever repo to get some software installed, you're back at square one.
If the debian maintainers don't align with your preferences you can:
1. Create your own apt repository with newer software, and install from that. It's easy to package things, you can share the repository with trusted friends, running linux with friends is fun.
2. You can switch to a distro, like NixOS or Arch, which values up-to-date software more than slow stable updates.
Debian does seem to be more aligned with mailservers and such, where updates can be slow and thoughtful, not as much with personal ai development boxes where you want the hot new ai tool of the week available asap.
... Either way, learning to package software correctly for your distro of choice is a good idea, it's fun to bang out a nix expression or debian package when you need to install something that's not available yet.
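If anyone's curious, the quick-and-dirty route to a .deb is small enough to sketch here (proper packages use debhelper; all the names below are made up):

```
mkdir -p mytool_1.0-1/DEBIAN mytool_1.0-1/usr/local/bin
cp mytool mytool_1.0-1/usr/local/bin/
cat > mytool_1.0-1/DEBIAN/control <<'EOF'
Package: mytool
Version: 1.0-1
Architecture: amd64
Maintainer: You <you@example.com>
Description: My tool
EOF
dpkg-deb --build mytool_1.0-1        # produces mytool_1.0-1.deb
sudo apt install ./mytool_1.0-1.deb
```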
I've heard this time and time again from new Linux users: "I don't want to learn the command line, I just want to be able to install and run whatever I want"
On the Mac, binaries need to be signed and notarized, and Apple could stop the spread of the malware once it's identified, or even detect it before notarizing it.
I've downloaded and installed too many packages where the developers didn't bother with this, but I uncritically went to Mac's security settings to let it do its thing anyway.
I don't know if developer utilities can be distributed through the app store, but they should be so that Apple can review them properly. Criticisms aside, the iOS App Store and the iOS security model has been the best thing for software security (especially for lay-people), ever.
Apple controlling CLI utilities is one of those supposedly good ideas that's actually bad.
They can’t stop themselves from tightening their grip ever tighter, and always want to ensure you have no functionality above what they deemed sufficient.
All the homebrew packages have checksums and are versioned in git, so if the upstream website is compromised and a malware installer is put in place of the package, `curl | bash` will just install the malware, while `brew` would start erroring out and refuse to install after downloading something with a different checksum.
You also get an audit log in the form of the git repo, and you also ensure everyone's downloading the same file, since `curl | bash` could serve different scripts to different IPs or user-agents.
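Both properties are visible from the CLI, e.g.:

```
brew fetch --force wget   # re-download the source/bottle; brew errors out if the sha256 doesn't match
brew log wget             # git history of the formula: every version and checksum bump is auditable
```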
I don't think brew does proper build sandboxing, so something like `./configure.sh` could still download some random thing from the internet that could change; it's only a bit better.
If you want proper sandboxing and thus even more security, consider nix.
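For the curious, the nix equivalent of a one-off install looks something like this (assuming flakes/nix-command are enabled):

```
# install into your user profile; the package and all its inputs are pinned by hash
nix profile install nixpkgs#ripgrep
# builds happen in a sandbox (default on Linux/NixOS, configurable on macOS),
# so a build script can't silently pull in something that changed upstream
```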
Maybe tools like https://github.com/vet-run/vet could help with these projects that would rather you use their custom install script instead of complying to distro-specific supply chains.
Civilization is about cooperating with your fellow man to build great things, not bowing to the feudal lord Apple Inc.
A truly civilized person would use Linux, OpenBSD, etc, a free operating system where they may contribute fixes for their fellow man without having to beg at the boots of the single richest company on the planet with radar numbers asking for fixes from on high.
Projects like MacPorts and Homebrew are trying to bring at least some freedom into the macOS fiefdom. I'm just saying MacPorts is the better of those two.
MacPorts, of course, features an actual .pkg installer, as well as doing pretty much everything else better, and having more packages, and existing first.
I use brew but I'm willing to try out MacPorts.
How come the package install instructions seem to require sudo under macports? Does that not carry more risk during the install ?
A loop I've found that works pretty well for bugs is this:
- Ask Claude to look at my current in-progress task (from Github/Jira/whatever) and repro the bug using the Chrome MCP.
- Ask it to fix it
- Review the code manually, usually it's pretty self-contained and easy to ensure it does what I want
- If I'm feeling cautious, ask it to run "manual" tests on related components (this is a huge time-saver!)
- Ask it to help me prepare the PR: this refers to instructions I put in CLAUDE.md (see the sketch after this list) so it gives me a branch name, commit message, and PR description based on our internal processes.
- I do the commit operations, PR and stuff myself, often tweaking the messages / description.
- Clear context / start a new conversation for the next bug.
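The CLAUDE.md bit mentioned above is nothing fancy; a hypothetical excerpt (not my actual file) would be along these lines:

```
## Preparing a PR
- Branch name: <ticket-id>-<short-slug>, e.g. PROJ-123-fix-login-redirect
- Commit message: imperative mood, first line under 72 chars, reference the ticket
- PR description: what changed, how it was tested, link to the ticket
```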
On a personal project where I'm less concerned about code quality, I'll often do the plan->implementation approach. Getting pretty in-depth about your requirements obviously leads to a much better plan. For fixing bugs it really helps to tell the model to check its assumptions, because that's often where it gets stuck and creates new bugs while fixing others.
All in all, I think it's working for me. I'll tackle 2-3 day refactors in an afternoon. But obviously there's a learning curve and having the technical skills to know what you want will give you much better results.
I switched to Zed from a tmux/nvim setup. I think Zed is the first editor I've tried that has a good enough Vim mode for me to switch and keep my built-up muscle memory.
I tried switching to Zed from vim / Idea Suite with IdeaVim, and I was disappointed that I could not just use my .vimrc, have they fixed it yet? It's currently the only blocker for me.
I think there should perhaps be a law that any corporation automatically has a new class of un-tradeable VOTING shares, worth 50% of the overall vote, held by the employees. Everybody with an employment contract with this company is entitled to 1 vote, no more, no less; whether they're the janitor or the CEO.
Employees of a company are the ones who are the most affected by the company's decisions, it's only fair that they have a say.
How much is a vote worth in dollars? Because there would be a market for those votes -- not just a spot market for dollars or an internal market in vacation days; it would be reflected in salary, benefits, company policy, etc.
Couldn’t you just make the voting anonymous to make sure that buying votes isn’t possible? Why wouldn’t I just take your money and still vote however I like?
A law like this just means getting full time employment becomes that much more difficult and the vast majority of people working for a company will be non-voting contractors without benefits. The existing employees would even vote for changes that make full time hiring more difficult in order to avoid diluting their own votes.
It would obviously need to be accompanied with rigorous enforcement of employee classification. I know there would be a bunch of possible ways to game this, so there are a lot of other rules we'd need to add but I didn't want to make my comment too long.
Also, I wouldn't necessarily make a distinction between the full-time employees vs the part-time ones.
I think you’ll find that won’t actually work in practice. Many contract workers are not independent freelancers but actually employees of a different company to which the work is contracted out as a whole.
For example, a courier company like UPS employs all of its workers but the packages it delivers are for other companies who contract with UPS to do the work. If you force all businesses to employ their own couriers then UPS can’t even exist as a company and small businesses that depend on courier services would simply be unable to function at all.
Both can be true at the same time. Similar to the early days of the Internet, the dot-com bubble eventually popped, but the Internet (and dot-coms, for that matter) didn't go away.
What people are saying is that this mad race to throw cash at anything that has "AI" in it will eventually stop, and what will remain are the useful things.
Yeah, I saw "AI winter" mentioned elsewhere in the thread...
IMO there is a real qualitative difference between AI and crypto in terms of the durable impact it's going to have on the world. Does that mean I've bought into the AI hype? Maybe. But I think the signs are there.
AI winters are a recurring phenomenon, not a myth, and, like the dotcom bust, they involve a collapsing hype bubble and reductions in focussed speculative investment in the field, while the technologies that were big during the preceding hype cycle continue to be important and to develop -- though in the case of AI winters they often stop being thought of as AI and just get referred to by a name for the specific technology (often a different one than they were mainly known by during the hype cycle; e.g. "expert systems" from the last hype cycle are largely "business rules engines" now).
Googling, there seem to have been two AI winters, the first (late 1970s - early 1980s) when people first figured AI was overhyped, and the second (late 1980s - early 1990s) with the collapse of expert systems. I don't think we are about to get one now - more like AI spring leading to AGI summer.
> If I can't get the materials to repair my building in a hurry, I go outside and I wait. Or I stay inside and I wait. And if I can't do that for my Venusian balloon city, I slowly sink into a zone that melts lead and bakes me alive. And if I get the materials after it has started sinking, repairing it won't reinflate the balloon and have it rise again, because some significant fraction of the air has leaked out.
It's more similar to a boat than a house. If your boat has a leak, you need to repair it very quickly or it ends up at the bottom of the ocean. Yet we've managed to do it relatively reliably.
Sure. Do that when you're in the middle of an ocean that's a few trillion miles wide. It's not as if you can just dive down to the bottom of the ocean there, mine some bauxite, take it back up to your sinking ship, refine it, manufacture new repair materials for the boat, then repair it, is it?
No, you have to have it shipped from a coast a trillion miles away. So again, where are they manufactured, and how long do they take to get there? Can any of this shit even be made in the vicinity of Venus, where transit times might be non-absurd? There are no recoverable materials on the planet itself, are there?
My recent experience with getting an app deployed from Gitlab to a kubernetes cluster on DigitalOcean was exactly like this. There were like 3 or 4 different third-party technologies I was expected to set up with absolutely no explanation of what problem they're solving, and there was a bunch of steps where I had to supply names or paths as command-line arguments with no guidance on what these values should contain (is it arbitrary? Does it need to match something else?)
Mind you, I have relatively good Docker experience (wrote Dockerfiles, have a pretty extensive Docker-Compose - based home server with ~15 services) so I'm not new to containers at all. But man, the documentation for all these tools was worse than useless.
One area where it really shines for me is personal projects. You know, the type of projects you might get to spend a couple hours on once the kids are in bed... Spending that couple hours guiding Claude to do what I want is way quicker than doing it all myself. Especially since I do have the skills to do it all myself, just not the time. It's been particularly effective around UI stuff since I've selected a popular UI library (MUI) but I don't use it in my day job; I had to keep looking up documentation but Claude just bangs it out very easily.
One thing where it hasn't shone is configuring my production deployment. I had set this project up with a docker-compose, but my selected CI/CD (Gitlab) and my selected hosting provider (DigitalOcean) seemed to steer me more towards Kubernetes, which I don't know anything about. Gitlab's documentation wanted me to setup Flux (?) and at some point referred to a Helm chart (?)... All words I've heard but their documentation is useless to newcomers ("manage containers in production!": yes, that's obviously what I'm trying to do... "Getting started: run this obscure command with 5 arguments": wth is this path I need to provide? what's this parameter? etc.) I honestly can't believe how complex the recommended setup is, to ultimately run 2 containers that I already have defined in ~20 lines of docker-compose...
Claude got me through it. Took it about 5-6 hours of trying stuff, build failing, trying again. And even then, it still doesn't deploy when I push. It builds, pushes the new container images, and spins up a new pod... which it then immediately kills because my older one is still running and I only want one pod running... Oh well, I'll just keep killing the old pod until I have some more energy to throw at it to try and fix it.
TL;DR: it's much better at some things than others.
Totally. Being able to start shipping from the first commit using something like Picocss, and then just add features, helps get things out of the design stage while still shipping features individually.
Some folks seem to like Docker Swarm before reaching for Kubernetes, and I've found it's not bad for personal projects for sure.
AI will always return the average of its corpus given the chance (or without clear direction in the prompt). I usually let my opinions rip and tell it to avoid building me a stack temple to my greatness. It often comes back with a nice lean stack.
I usually avoid or minimize Javascript libraries for their brittleness, and the complexity can eat up more of the AI's context and awareness to map the abstractions vs something it knows incredibly well.
Python is great, but web stuff is still emerging, FastAPI is handy though, and putting something like Pico/HTMX/alpine.js on the front seems reasonable.
Laravel is also really hard to overlook sometimes when working with LLMs on quick things, there's so much working code out there that it can really get a ton done for an entire production environment with all of the built in tools.
Happy to learn about what other folks are using and liking.
I tend to have auto-accept on for edits, and once Claude is done with a task I'll just use git to review and stage the changes, sometimes commit them when it's a logical spot for it.
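The loop I mean, roughly:

```
git diff                    # read everything Claude changed
git add -p                  # stage hunk by hunk, skipping anything I don't want
git restore path/to/file    # or throw away its changes to a file entirely
git commit                  # commit when it's a logical spot for it
```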
I wouldn't want to have Claude auto-commit everything it does (because I sometimes revert its changes), nor would I want to YOLO it without any git repo... This seems like a nice tool, but for someone who has a very different workflow.
"Checkpoints for Claude Code" use git under the hood, but stored in .claudecheckpoints folder, to not mess with your own git. Add itself to .gitignore.
It auto commits with a git message for the changes done through MCP locally.
As someone who doesn't use CC, auto-commit seems like it would be the easiest way to manage changes. It's easy enough to revert or edit a commit if I don't like what happened.
It's also very easy to throw away actual commits, as long as you don't push them (and even then not so difficult if you're in a context where force-pushing is tolerable).
True, but it's harder to reject changes in one file, make a quick fix, etc. I like to keep control over my git repo as it's a very useful tool for supervising the AI.
Yeah, I basically have Claude commit via git regularly, and the majority of the other features described here can be done via git. I agree it's a neat idea for someone, though.
Publish through homebrew like a civilized person, please!