lr1970's comments | Hacker News

An opportunity for Apple to resurrect Airport line of WiFi routers (I love Airport Extreme) and on-shore their production.

Would be more logical to use FYN_ prefix

From fyn's roadmap:

> 2. Centralized venv storage — keep .venvs out of your project dirs

I do not like this. Virtual environments have always been associated with projects and colocated with them. Moving .venv to centralized storage recreates the conda philosophy, which is very different from the pip/uv approach.

In any case, I am using pixi now and like it a lot.


I like it. Enjoyed having it with Conda, was sorry when it was lost with uv. Been a pain to search my projects and have irrelevant results that I then have to filter. Or to remember to filter in the first place. The venvs may be associated with the projects, but they're just extraneous clutter unless there's actually something to be done directly on them, which is very rare.

Here's where that feature was (and is still being) discussed in the uv repo: https://github.com/astral-sh/uv/issues/1495

It's been open for two years but it looks like there's a PR in active development for it right now: https://github.com/astral-sh/uv/pull/18214


One problem I have on my work machine is that it does a blind backup of project directories. A useless .venv tree with thousands of files completely trashes the backup process. Having at least the flexibility to push the .venv to a cache location is useful. There was (is?) a uv issue about a similar use case (e.g. having a Dropbox/OneDrive-monitored folder).
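A sketch of that kind of workaround, assuming a POSIX shell and a tar that supports --exclude (GNU tar and bsdtar both do): the backup step can skip .venv entirely.

```shell
# Toy project with a .venv full of files we don't want backed up.
mkdir -p demo/src demo/.venv/lib
echo 'print("hi")' > demo/src/main.py
echo 'vendored junk' > demo/.venv/lib/module.py

# Archive the project, skipping any directory named .venv.
tar --exclude='.venv' -cf backup.tar demo

# The archive contains the source but none of the venv files.
tar -tf backup.tar
```

The same --exclude idea carries over to rsync-based backups (rsync --exclude='.venv/').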

That's my biggest problem with uv; I liked the way pipenv did it much better. I want to be able to use find and recursive grep without worrying that libraries are in my project directory.

uv is just so fast that I deal with it.


rg/fd respect .gitignore automatically, which solves this problem.

I'll check them out, thanks!

…but I don’t have everything in a git repo. Some of my “projects” are just local scraps for trying things out.

And it doesn’t account for any other tooling that may not respect gitignore by default.

It’s my biggest problem with npm too. I’ve worked around it long enough; it’s just annoying.


rg also ignores "hidden" files by default (files/dirs starting with a period), so it will ignore .venv regardless of whether it's in a repo.
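For tools that don't read .gitignore at all, most grep implementations (GNU and BSD alike) accept --exclude-dir, so a recursive search can be told to skip .venv explicitly. A quick sketch:

```shell
# A project where the same string appears in real code and in a vendored lib.
mkdir -p proj/src proj/.venv/lib
echo 'needle' > proj/src/app.py
echo 'needle' > proj/.venv/lib/vendored.py

# Recursive grep that skips the virtualenv directory.
grep -r --exclude-dir=.venv needle proj
# -> proj/src/app.py:needle
```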

Pip doesn’t have any philosophy here. It doesn’t manage your virtualenv at all, and definitely doesn’t suggest installing dependencies into a working directory.

Putting the venv in the project repository is a mess; it mixes a bunch of third party code and artifacts into the current workspace. It also makes cleaning disk space a pain, since virtualenvs end up littered all over the place. And every time you “git clean” you have to bootstrap all over again.

Perhaps a flag to control this might be a good fit, but honestly, I always found uv’s workflow here super annoying.


Disagree: better to have space allocated in each project, where the venvs can be easily deleted at once, rather than half hidden in your home folder somewhere with random names and forgotten about.

If for some rare reason you wanted to delete all venvs, a find command is easy enough to write.
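A sketch of such a find command (the directory layout here is invented for illustration); -prune keeps find from descending into each matched venv, which also makes it safe to pair with a delete:

```shell
# A couple of fake projects, each with its own .venv.
mkdir -p work/alpha/.venv work/beta/.venv work/beta/src

# List every .venv under the tree first.
find work -type d -name .venv -prune
# prints work/alpha/.venv and work/beta/.venv (order may vary)

# Once the list looks right, append the destructive step:
#   find work -type d -name .venv -prune -exec rm -rf {} +
```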


Sometimes I want the venvs to be in a centralized location, and just do:

UV_PROJECT_ENVIRONMENT=$HOME/.virtualenvs/{env-name} uv {command}
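A hypothetical wrapper along those lines (the uvc and venv_for names are my own invention, not part of uv) derives the env name from the project directory:

```shell
# Hypothetical: run uv against a centralized venv named after the
# current project directory.
uvc() {
    UV_PROJECT_ENVIRONMENT="$HOME/.virtualenvs/$(basename "$PWD")" uv "$@"
}

# Illustration helper: which centralized venv would a project map to?
venv_for() {
    echo "$HOME/.virtualenvs/$(basename "$1")"
}

venv_for /home/me/code/myproject
# prints $HOME/.virtualenvs/myproject (with $HOME expanded)
```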


I like it a lot :D.

Virtual environments have always been associated with projects in your use case, I guess.

In my use case, they almost never are. Most people in my industry have 1-2 venvs that they use across all their projects, and uv forcing each one into a single project directory made it quite inconvenient and caused unnecessary duplication of the same sets of libraries.

I dislike conda not because of the centralized venvs, but because it's bloated, poorly engineered, slow and inconvenient to use.

At the end of the day, this gives us choice. People can use uv or they can use fyn and have both use cases covered.


> and uv forcing it into a single project directory made it quite inconvenient and unnecessary duplication of the same sets of libraries.

Actually, uv intelligently uses hardlinks or reflinks to avoid file duplication. On the surface, venvs in different projects look like duplicates, but in reality they reference the same files in uv's cache.
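The mechanism is easy to see with plain hardlinks; this is only a sketch of how hardlinking deduplicates files, not uv's actual cache layout:

```shell
# One "cached" copy of a file, hardlinked into two fake venvs.
mkdir -p cache venv_a venv_b
echo 'wheel contents' > cache/numpy_stub.py
ln cache/numpy_stub.py venv_a/numpy_stub.py
ln cache/numpy_stub.py venv_b/numpy_stub.py

# All three names share one inode: the bytes exist once on disk.
ls -li cache/numpy_stub.py venv_a/numpy_stub.py venv_b/numpy_stub.py
```

Reflinks (copy-on-write clones on filesystems like APFS or Btrfs) give the same space saving while keeping the copies independent once one of them is modified.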

BTW, pixi does the same. And `pixi global` allows you to create global environments in a central location if you prefer that workflow.

EDIT: I forgot to mention the elephant in the room. With agentic AI coding you do want all your dependencies under your project root. AI agents run in sandboxes, and I do not want to give them extra permissions to go poking around in my entire storage. I start an agent in the project root, and all my code and the .venv are there. This gives the agent a sense of locality: it only needs to poke around under the project root and nowhere else.


This is actually the feature that initially drew me towards uv. I never have to worry about where venvs live while suffering literally zero downsides. It's blazing fast, uses minimal storage, and version conflicts are virtually impossible.

Do you only work on projects individually? Without project-specific environments I don’t know how you could share code with someone else without frequent breakages.

How is pixi better than uv?

> How is pixi better than uv?

pixi is a general multi-language, multi-platform package manager. I am using it now on my new macbook neo as a Homebrew _replacement_. Yes, it goes beyond Python and allows you to install git, jj, fzf, cmake, compilers, pandoc, and much more.

For Python, pixi uses conda-forge and PyPI as package repos; it is built on the rattler library and reuses uv's fast resolver for PyPI packages, so it is as fast as uv but goes beyond Python wheels. For details see [0] or google it :-)

[0] https://pixi.prefix.dev/latest/


How is it different from mise?

There is a good chunk of overlap, but mise predominantly pulls from GitHub release artifacts/assets, while pixi uses conda packages. Mise can use conda packages too, but the mise-conda backend is still experimental. I don't think either GitHub releases or conda packages are better than the other; they both have tradeoffs.

Pixi is very Python-focused: it's both a tool manager and a library dependency manager (like uv/pip). Mise considered library dependency management an anti-goal for a long time; while I don't see that on the website anymore, I haven't seen any movement into that space either.


They are all anachronisms, as they have no GUIs, just commands to be typed into a REPL.

It has been working fine for build systems like cargo.

> I wonder how much can I read into it about gpt-5.4's personality.

Modeled on Sam Altman's personality :-)


<joke> How do you tell a software engineer from a real engineer? A real engineer thinks that 1 kilobyte is 1000 bytes, while a software engineer believes that there are 1024 meters in a kilometer :-) </joke>


I wonder if some day people will use {"joke": ...} instead of <joke>


Why?


Another important person in the creation of TV technology was Vladimir Zworykin [0]. He developed cathode-ray-tube-based TV transmission devices, which he patented in 1923 and 1925.

[0] https://en.wikipedia.org/wiki/Vladimir_K._Zworykin


<sarcasm> I thought that Anthropic and OpenAI created precedents that legitimize libgen and Anna archive. They pillaged all the books and scientific papers from those archives and got away with it. Why not the rest of us? </sarcasm>


Be careful not to confuse using the material and distributing it. There are open legal cases sorting out what fair use means for generative AI. Distribution (seeding in the case of torrents) of this material isn't legal. It got Meta in trouble, and it's getting Anna's archive in trouble.


> got away with it

Anthropic paid a settlement of $1,500,000,000 to authors.


Doesn't really mean much until individuals can also sacrifice a small percentage of their ARR to completely ignore IP law.

I'd take that deal, but until it becomes an option, we have a clearly broken system. Rules for thee; not for me, etc.


People always say this as if the tech industry wasn't culturally anti-copyright and pro-Creative-Commons before. Those same people probably work at Meta and Anthropic now, just like Google's book project, which got Google in trouble.


> People always say this like the tech industry wasn't culturally anti-copyright and pro-creative commons before.

I completely agree with that. The problem is that the current system is such that only billion dollar players can flout the rules, while everyone else is left in the dust.

Worst of both worlds IMO.


Others have already mentioned that they lost their lawsuit. Should the fines have killed Anthropic? Would that have been fairer, and a less bad world?

Why not focus energy on being anti-aggressive-copyright in general? These systems won't ever be fair. It's just rent-seeking enabled by the government, and some people can afford the rent.


You’re talking past me for no real reason, mate. That’s precisely the point I’m making.

Young Carlos thinks it matters that Anthropic got sued when they can keep flouting the rules anyway, and I disagree: it’s not a fair system until we ditch the rent-seeking entirely.


They paid one of the largest settlements in world history. Should I guess that hackers are only satisfied with the public execution of the company leadership?


To pick a nit: technically, Anthropic didn't lose any lawsuit or pay any fine. They came to an agreement with the authors to pay them a $1.5 billion settlement. Which was a lot of money per book.


And a lot more expensive per book than buying, digitizing, and destroying the physical copy. Which was ruled legal.


Wasn't the Google project scanning physical books and not distributing them externally? That seems like a very different thing than torrenting or even downloading stuff uploaded by a third party.


Why the negativity? You can also as an individual do the same as Anthropic and get sued for billions. You have that option, don't let anybody hold you back!


They can: VPNs.


Fair point, but I think the Pinkertons would be at my door within the hour if I started re-appropriating the art style of Studio Ghibli or Disney for commercial profit.


About four years ago, during COVID, there was a shortage of chips for cars, fueled in part by PE scalpers hoarding chips. Sound familiar?

EDIT: more accurate language


The biggest red flag for me is the author hiding their name. If you wrote a quality book about a programming language, you wouldn't hide your identity from the world.


The repository also has a misconfigured .gitignore file, which allowed built executables to be checked into the repository.

This is something that I wouldn't judge beginners for, but someone claiming to be an expert writing a book on the topic should know how to configure a .gitignore for their particular language of expertise.


One of the best movies about the history of Apple and the people involved is "Pirates of Silicon Valley". It's a pretty accurate portrayal of Steve Jobs, Woz, and the rest.

