Hacker News | miffy900's comments

>Canada dumps good milk down the drain while people go hungry and suffer high food prices

I'm not sure if you realise this, but the exact same thing happens in the US.


> The firm gradually grew more contentious, demanding that the RTX 5060 be handed in because the event it was acquired at was part of a business trip, entirely paid for by the company. The employee would never have won the GPU had the firm not enabled him to attend the venue. Our winner refused, arguing that it belonged to him because he had won it on his own by pure luck.

Hmm... I feel like the company's reasoning here is almost acceptable. Almost, because I know that as a (paid) employee, all of the code I write and any inventions or IP I come up with are the company's property, so it almost makes sense that the company might also want to claim any physical things given or gifted in the course of work-related trips that employees take on company time.

But the article mentions the winner was an intern, not an employee, and many interns I've worked with never actually signed an employment agreement, because they don't actually get paid. They sign NDAs but not full-on employment agreements, so how can any company treat them like an employee? If I wasn't getting paid, I'd 100% hold my ground like the intern did and take it.


Doesn't matter. It's a small amount (in the eyes of the company), and is bound to feel unfair to the employee.

It's like your employer asking that you keep the pretzels from your business flights and hand them in to the office snack bar. Only ill will can come from that, and zero profit.


You realize you can redline the default IP assignment clauses, right? It should never have been normalized that an employer gets a blanket claim to all mental output on your part, especially things done in your off hours on equipment the company doesn't own.

It's just another example of how contract law, lawyers, and legal fictions represent a bottom-up funnel of value extraction from the populace in which they exist. Can't even just work and get paid without some arsehole driving/hiding behind a legal fiction, strip-mining you for all the law will let them get away with.


What name calling? Calling the author 'an unserious person' isn't name calling. Might be worth reading the article:

> "If you like Windows 8’s look, you are a bad person. You are the one Steve Jobs was talking about when he said Microsoft had no taste."

Yeah, you don't need to read very much of this to know this author hasn't exactly written a substantive article; they certainly aren't bothering to back up their claims with any reasoning. The whole post itself is just 'this version of Windows was ugly, this one wasn't', etc.


That was the same behaviour in Windows 7 though; it wasn't exactly novel. At least Windows 7 searched your apps and documents all at once. Windows 8 limited you to just apps. Windows 8 was a huge step down in usability.


They can afford to make a big song and dance about this because chances are they are not selling the hardware at a loss, and they have the regular Steam store to offset the short-term costs. If they were selling the hardware at a loss, I think their marketing for this device would be very different.


They'll probably handle it like the Steam Deck:

- no loss

- but a small profit margin anyway, to keep the price as low as possible and maximise adoption/reach

For Valve, people using Steam on non-Windows platforms is more important than making a big buck from Steam Machines, because it makes them less dependent on Windows. MS has tried (and failed) to move in the direction of killing 3rd-party app stores before, and Windows has gotten ... crappy/bloated/ad-infested, which is in the end an existential risk for Valve: if everyone moves away from PC gaming, they will lose out hugely.


Is Apple selling their hardware at a loss?


No, but I think the primary comparison is meant to be the other major consoles (Xbox, PlayStation, Nintendo).


Sort of, maybe. I read it more as them assuring everyone that it's still a PC if a customer ends up wanting a plan B.


I know you are being rhetorical, but for reference: of course not, their margin on hardware is 36%.


Do we count socks and slings (Pocket™) as hardware?


The Switch was always sold for more than component and manufacturing cost. The PS4 crossed that threshold quickly (per Sony, IIRC).

However, that ignores R&D costs which presumably have to be amortized, largely through game sales and platform fees. The same is true for other platforms like iOS.


I feel like there could be a loophole here for the new framework author. Stick to using JSX for the view; JSX is just syntax sugar for built-in React functions that construct a tree, and those can easily be swapped for your own implementation. I recall years ago using a Babel plugin that just emitted static HTML from JSX. I know Vue.js v2 also had JSX support that way.
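
As a minimal sketch of how swappable that sugar is (the `h` factory below is hypothetical, not any particular framework's API): with `"jsxFactory": "h"` in tsconfig.json, or a `/** @jsx h */` pragma, the TypeScript compiler emits calls to your own function instead of React.createElement:

  // Hypothetical factory: <div class="greeting">hi</div> now compiles to
  // h("div", { class: "greeting" }, "hi") and builds plain DOM, no React involved.
  type Props = Record<string, string> | null;

  function h(tag: string, props: Props, ...children: (Node | string)[]): HTMLElement {
    const el = document.createElement(tag);
    for (const [k, v] of Object.entries(props ?? {})) el.setAttribute(k, v);
    el.append(...children); // append() accepts both Nodes and strings
    return el;
  }

(Component functions rather than intrinsic tags need a little more plumbing, but the principle is the same.)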

I think LLMs, despite already being trained massively on React, can easily adapt their output to suit a new framework's specific API surface with a simple adjustment to the prompt. Maybe include an abbreviated list of type/function signatures that are specific to your new framework and just tell the LLM to use JSX for the views?
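
For example, something like this at the top of the prompt (the framework name and signatures here are entirely hypothetical):

  You are writing UI code for the "myframework" library. Use JSX for all views.
  Available API (TypeScript signatures):
    function signal<T>(initial: T): { get(): T; set(v: T): void }
    function mount(root: HTMLElement, view: () => JSX.Element): void
  Do NOT import React; JSX here compiles to myframework's own factory.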

What I think will definitely be a challenge for new library authors in the age of LLMs is state management. There are already tons of libraries that basically achieve the same thing but have vastly different APIs. In this case, new library authors may be forced to write pluggable re-implementations of existing libraries just to enable LLMs to emit compilable/runnable code. Though I don't know of any state management library that dominates the web the way React does the view layer.


Huh - that's actually pretty interesting and I hadn't thought of that as an option. I know Preact was built as a faster alternative while being broadly compatible, but what you are describing is maybe even blending the technologies as that short circuit. Neat.


> Stick to using JSX for the view

That's what I did. https://mutraction.dev/

My framework has approximately zero users and this is not a plug, but the idea is sound and it works.


For the longest time, no one in Linux land cared about API stability or backward compatibility. Then app/game developers realised that if they could port a portion of Win32 to Linux via WINE, they could just target the Win32 API, or at least a portion of it, and as long as WINE was installed, their app/game would always work. I find it a bit ironic: desktop Linux is being enabled by re-implementing APIs from another OS.


It's like they always say: win32 is the only stable ABI on Linux.


To be fair, Microsoft has always had a culture of strong backwards compatibility, even between major OS versions - this is something they cultivate internally AND also tell their customers/users about.

Apple has had no such culture internally, and they sure as heck don't emphasise backward compatibility to their customers (users or otherwise). If anything, Apple prods and nags their developers to stick to the latest SDK/platform APIs, shoves the burden of software compatibility and maintenance onto them, and hand-waves away the breaking changes as being part and parcel of membership in the Apple ecosystem. This attitude can be traced back to the Steve Jobs era at Apple. It's definitely not new, and comparing what Microsoft does with software and backward compatibility and expecting Apple to do the same is not fair - they really are different companies.


It's been this way forever, and consistently that's been my annoyance with Apple. If I weren't a coder or hobbyist musician or filmmaker, there's no way I would've wanted a Mac in the 2000s, because it failed at its main job of running software. If I'd been grown up back then with a desk job, it'd probably have involved a Windows PC.

The web partially fixed this, but only by accident, because Apple isn't invested in the web. And if I cared at all about video games or were doing certain fields of work (maybe creative tools, now that Apple has even lost that hegemony), that'd take me off the Mac. Somehow the Mac 3P software scene is even worse now than in the PPC era. And Microsoft is now testing just how annoying Windows can be without people leaving; the answer is: a lot.

Apple is limiting their reach so much, for reasons I still can't rationalize. Some basic level of backwards compatibility or at least more stable APIs should be possible without sacrificing the good things. I've done some iPhone and Mac dev too, it sucks in every possible way, and I get why people trust it so little that they'd rather shoehorn a web app into a native shell.


How is Apple limiting its reach? It doesn't want to compete in bottom-of-the-barrel low-end PC sales, and for the most part, people with money are already buying Macs unless they are gamers. Apple routinely captures around 50% of PC revenue.


Just wanted to mention that some basic Windows-OS keyboard shortcuts don't work, like ALT+F to open the File menu. Also things like ALT+SPACEBAR to bring up the system context menu for the focussed window (the menu with maximise, minimise, close options etc.) do not seem to work. I'm guessing with the DirectX rendering backend, the 'app' is rendered more akin to a video game than a native win32 process.

Also, after install, the install directory takes up 400MB+. Even VSCode only takes up around 380MB. I believe it when they say it's not an Electron app, but I do wonder what's being packed in there. I was always under the impression that Rust apps are pretty lightweight, but that install size is nearing Java levels of binary/dependency bloat.


Compared to Sublime Text:

RAM: 213 MB (Zed) vs 41 MB (ST)

Storage: 406 MB (Zed) vs 52 MB (ST)

Startup time: Zed is slower than ST (but only by a few milliseconds).

Also, when you reopen ST it will remember how you resized the window last time, whereas Zed won't.


It probably helps that Sublime doesn't come with agentic AI features, LSP, and a whole video-conferencing and screen-sharing client by default.


> and a whole video-conferencing and screen-sharing client by default

Haha wait what? Are you confusing Zed (the text editor) with something else? Surely it doesn't ship with video conferencing???


I haven't seen video but it does have voice. And similarly I don't think it's screen-share, it's just editor state syncing, so live collaboration. Still quite a lot.


There is a feature that lets you share your screen. It shows up for other participants in the collaboration session as a tab in the editor.


Probably referring to the collaboration tools. Zed has a bunch of stuff around remote pair programming with people.


It has a voice chat system built in as part of the collaboration tools. Personally I think they should remove the voice chat...


ST is a text editor while Zed is an IDE. I wish there were something like VSCode that is very modular but written in native code. But VSCode is good enough and it is my daily driver.


For those on Windows, which is the topic at hand, UltraEdit and Notepad++.

I disagree that Zed is an IDE; it is quite far from IntelliJ, Borland/Embarcadero, VS, Xcode, Eclipse, NetBeans...

If it is about adding enough plugins until it eventually becomes one, then any programmer's editor is an IDE.


My line is - if I can compile, run and debug my program through the editor UI instead of a terminal, it's an IDE.


Sublime Text can run code from its UI too. An IDE is much more full-featured, like VS vs VSCode or IntelliJ vs Fleet.


As I said, then any programmer's editor is an IDE, including UltraEdit and Notepad++.


Notepad++ has a debugger UI? One that goes beyond running a terminal inside a pane?


It has plugins... and about 30 years of ecosystem history.

Try to use Zed to debug Go code.


I'm literally doing that right now. I can set breakpoints and graphically step through them in Go files.



There are always Vim and Emacs.


> ST is a text editor while Zed is an IDE.

Zed is the new emacs?


No. In Emacs you can write a simple one line script and change anything.

In Zed, you need to write a full blown plugin, and can change only what the Zed authors exposed through their plugin API.


No, that doesn't matter. I think you should be looking at how quickly it can get you a working environment for your favorite language, not how long it takes to boot up once per reboot. If you want features, the bits have to live somewhere. Look at it like a trade-off: if you're just going to look at it, by all means, take a memory dump. But I find that a little bit hard to work with.

For me, as long as it's better than alternatives it's good enough. Especially if it's not running JS.


RAM does matter, especially when you have a web browser with multiple tabs open at the same time. Sublime Text and Notepad++ are powerful, yet much more lightweight than Zed. Not to mention Vim/Emacs.


> I was always under the impression that Rust apps are pretty lightweight, but that install size is nearing Java levels of binary/dependency bloat.

For what it's worth, the zed executable on Linux weighs 3.2 MB.

EDIT: Sorry, the nix store is too good at hiding things from me. It's actually around 337 MB plus webrtc-sys.


I just compiled "zed" with "cargo build --release", and not only did it pull in >2000 dependencies, the resulting executable is literally 1.4G. The debug one is 1.2G.

  $ pwd
  /tmp/zed/target/release
  $ ls -lh ./zed
  -rwx------ 2 john john 1.4G Aug 28 17:10 zed
---

  $ dut zed/ | sort -h
   598M    0B   | | /- webrtc-sys-0a11149cbc74bc90
   598M    0B   | | | /- out
   598M    0B   | | |- webrtc-sys-090125d01b76a5e8
   635M  160M   | |   /- s-hal7osjfce-1h7vhjb-4bdtrsk93m145adnqs17i9dxe
   635M  160M   | | |- project-06kh4lhaqfutk
   641M  161M   | | /- project-1ulvakop54j8y
   641M  161M   | | | /- s-hal5rdrth3-0j8nxqq-d0wsc7qnin39797z4e8ibhj4w
   1.1G  1.1G   | | /- zed-ed67419e7a858570
   1.1G  1.1G   | |- zed
   1.3G  1.3G     | /- zed-64b9faeefdf3b7df
   1.3G  1.3G     |- zed
   1.4G    0B     |- build
   2.2G    0B   | |- build
   7.9G  1.4G     /- deps
   9.4G    0B   |- release
    14G  2.9G   | |- incremental
    19G  4.2G   | /- deps
    33G    0B   /- debug
    42G    0B /- target
    42G    0B zed
Summary:

  $ du -h ./target/debug/deps/
  20G     ./target/debug/deps/
  $ du -h ./target/release/deps/
  8.0G    ./target/release/deps/

  $ du -h ./target/debug/zed
  1.2G    ./target/debug/zed
  $ du -h ./target/release/zed
  1.4G    ./target/release/zed
This is on a whole new level of bloat, both with regard to dependencies AND the resulting executable file(s) (EDIT: executable files are unstripped).

Any explanations as to why "cargo" does not seem to re-use libraries (dependencies) in a shared directory, why it needs >2000 dependencies (that I see being downloaded and compiled), or why the release executable is 1.4G unstripped while the debug one is smaller?


This is pretty common for larger Rust projects. It's basically the new JavaScript+npm mess, this time with a borrow checker.


Cargo does the de-duplication, but only up to a point. If two packages request the same dependency with semver ranges that have a common overlap (say, `1.4` and `1.6`) then it will use a single package for both (say, `1.7.12`). But if they request semver-incompatible versions (`2.1` and `1.6`) then cargo will use both.
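
For example, cargo can list the crates that end up in the build in more than one version (a built-in subcommand):

  $ cargo tree --duplicates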


I read the question differently as: Why doesn't cargo cache (compiled) crates in ~/.cargo?


Unstripped, perhaps?

    ls -lh /nix/store/63rdpgbzn7f1smh7688crcrpfsh833bb-zed-editor-0.199.10/bin/zeditor
    -r-xr-xr-x. 2 root root 3.2M Jan  1  1970 /nix/store/63rdpgbzn7f1smh7688crcrpfsh833bb-zed-editor-0.199.10/bin/zeditor
EDIT: Ah, it was too good to be true. The true binary is hidden in libexec/.zed-editor-wrapped :(

    ls -lh /nix/store/52smrb1z8r4n71zx50xagkcdrhlga4y5-zed-editor-0.207.4/libexec/.zed-editor-wrapped
    -r-xr-xr-x. 2 root root 337M Jan  1  1970 /nix/store/52smrb1z8r4n71zx50xagkcdrhlga4y5-zed-editor-0.207.4/libexec/.zed-editor-wrapped
Extra weight also comes from webrtc, which nixpkgs dynamically links. So yeah, it's quite a large binary indeed.


Additionally, in any case, now I know what I have to do to free up some space: get rid of the build artifacts of Rust projects I built from source.

Maybe something like this to figure out what directories to delete:

  # With dut
  find . -type f -name Cargo.toml -printf '%h\n' | sort -u | xargs -r -d '\n' -I{} dut -s {} | sort -h
  # With du
  find . -type f -name Cargo.toml -printf '%h\n' | sort -u | xargs -r du -sh | sort -h
I found "websocat" and "ripgrep". Thankfully I got rid of everything else.

  ripgrep $ cargo clean
     Removed 3411 files, 1020.2MiB total

  websocat $ cargo clean
     Removed 1726 files, 820.7MiB total
That said, ripgrep itself is only 5.0M.


Probably due to the tree-sitter modules for many languages compiled in. AFAIK tree-sitter's codegen unfortunately shares nothing between different languages, so a dozen language parsers can easily cross upward of 200 MB.


Binaries for dynamic libraries of tree-sitter grammars (usually compiled with a C compiler) would be smaller than that. For example, this [1] .so bundle for 107 different grammars is ~137 MiB.

Unless by "compiled in", some inlining of the C code into the Rust codebase is meant.

[1] https://github.com/emacs-tree-sitter/tree-sitter-langs/relea...


  $ strip --strip-all ./target/release/zed
  $ du -h ./target/release/zed
  261M    ./target/release/zed

  $ strip --strip-all ./target/debug/zed
  $ du -h ./target/debug/zed
  482M    ./target/debug/zed
Correct. It is still embarrassing, in my opinion.

To make matters worse, it takes several minutes for Zed's window to appear on a cold start, whereas VSCode launches almost instantly [1].

[1] I am trying to measure it as we speak but it is taking quite a long time.


> Not to mention it takes minutes for the window of Zed to open, whereas VSCode is almost instant.

That one is interesting. It's much quicker for me, even cold starts are below 1s, and subsequent startups are basically instant.


Cold starts are minutes, subsequent startups are much faster than VSCode[1].

I wonder why though.

[1] I have not measured subsequent launches of VSCode though, but Zed is relatively pretty quick after the initial launch.


Maybe some kind of "security" software interfering?


I suppose it has to do with how every Rust crate (including dependencies) gets statically linked into the final binary, and this leads to extremely large intermediate artifacts, even when many crates share common dependencies.

Or the fact that there are incremental compilation artifacts...

And of course the sheer number of dependencies. A single project might depend on hundreds of crates (quite common), each compiled separately with its own build artifacts. Sigh.


What does a desktop text editor have to do with WebRTC?


Judging by this note in the docs: Collaboration features.

https://zed.dev/docs/development/freebsd#webrtc-notice


If they're going to implement every feature under the sun and include half of userspace to support it, they might as well build the whole thing on top of a browser.


vscode has entered the chat...


Amazon Q is 100MB and that's a CLI app. Rust programs be huge.


A 400MB+ install of bloat will upset many people.

This needs to be justified ASAP to help people understand and reconsider installing it.


Strangely it's the actual binary's .text section that's about 400MB. Time to dive in!


The Rust compiler always produces quite large binaries compared to other programming languages. I notice there's a (closed) issue on the Zed GitHub [https://github.com/zed-industries/zed/issues/34376]:

> At this time, we prioritize performance and out-of-the-box functionality over minimal binary size. As is, this issue isn't very actionable, but if you have concrete optimization ideas that don't compromise these priorities, we'd be happy to consider them in a new issue.


Welcome to static linking of large applications.

The world moved to dynamic linking in the 1980s for a reason.

It is cool to advocate for a return to static linking when it's just basic CLI tools.


All those beautiful DLLs will sit comfortably in the same folder as your "dynamically" linked executable on Windows anyway.


They might be, or not.


I think there should be a best-of-both-worlds type of linking: during compilation, the linker places a statically compiled library at a certain address but doesn't include it in the binary. Then, during startup, the OS maps the same library to the given address (sharing the data between processes). This would improve memory use, startup time, and performance while avoiding dynamic linking. Of course you need to match the exact versions between the compiled executable and the dependency, but that should be a best practice anyway.


Static linkers generally don't include a full copy of the library, just the code paths the compiled application uses. The compiler may have also made optimizations based on the application's usage patterns.


I say "strangely" because honestly it just seems large for any application. I thought they might not be doing LTO or something but they do thin LTO. It's just really that much code.


> The world moved to dynamic linking in the 1980s for a reason.

Reasons that no longer exist. Storage is cheap, update distribution is free, time spent debugging various shared lib versions across OSes is expensive.


Yet everyone is complaining on this thread about Zed distribution size, go figure.

They should shut up and just buy bigger drives. Ah, they can't on their laptops, bummer.

Also, try to develop mobile apps with that mentality:

https://www.abbacustechnologies.com/why-your-app-keeps-getti...


Tbh, the rights and wrongs aside, I suspect "everyone" is complaining about it because it's the easiest thing to talk about. Much like how feature discussions tend towards bikeshedding.


Precisely. It seems like the people who say storage is cheap assume everyone is using desktop PCs


Storage is cheap and upgradeable on all but very very few Windows laptops.


> Storage is cheap

My /usr is 15G already, and /var/lib/docker isn't that far off despite people's obsession with Alpine images. If more people dismiss storage as cheap, it'll quickly become expensive, just not per GiB.

> update distribution is free

I wouldn't be surprised if at one point Github would start restricting asset downloads for very popular projects simply because of how much traffic they'd generate.

Also, there's still plenty of places on the planet with relatively slow internet connectivity.


Storage doesn't really feel cheap. I'm considering buying a new laptop, and Apple charges $600 per TB. Sure, it's cheaper than it was in the '80s, but wasting a few gigabytes here and a few gigabytes there is quickly enough to at least force you to go from a 500GB drive to a 1TB drive, which costs $300.


That's more of an Apple problem? Storage is under $50/TB.


It's the reality of storage pricing. The general statement "storage is cheap" is incorrect. For some practically relevant purposes, such as Apple laptops, it's $600/TB. For other purposes, it's significantly below $50/TB.

You could say "just don't buy Apple products". And sure, that might be a solution for some. But the question of what laptop to buy is an extremely complicated one, where storage pricing is just one of many, many, many different factors. I personally have landed on Apple laptops, for a whole host of reasons which have nothing to do with storage. That means that if I have to bump my storage from 1TB to 2TB, it directly costs me $600.


If you're buying Apple then you should expect inflated prices. I got a 4TB NVMe SSD for like 350€, and a 2TB one goes for 122-220€ depending on read/write speeds.

I don't check the installation size of applications anymore.


I'm just saying that $600/TB is a real storage price that lots of people deal with. Storage isn't universally cheap.

This feels especially relevant since we're discussing Zed here, the Mac-focused developer tool, and developers working on Mac are the exact people who pay $600/TB.


A 2TB SSD for the Framework 13 cost me 200 euros. But I agree that it's not cheap, files are getting bigger, games are big, apps are huge, and then you need backups and external storage and always some free space as temp storage so you can move files around.


Bro, with this mentality, you won't get far in the Apple universe.

Accept that your wallet will be owned by Apple. Then you can continue.

Sorry, but people buying Apple products are a different breed :D


I don't need to "get far in the Apple universe", I need a laptop. My current MacBook Pro cost about the same as the Dell XPS I was using before it; I like nice laptops.


RAM isn't cheap (it may be for your tasks and wallet depth, but generally it isn't, especially since DDR5). Shared objects also get "deduplicated" in RAM, not just on disk.


What objects is the Zed process using that would even be shared with any other process on my system? Language support is mostly via external language servers. It uses its own graphics framework, so the UI code wouldn't be shared. A huge amount of the executable size is tree-sitter related.


I 100% agree. As soon as you step outside the comfort of your Linux distribution's package manager, dynamic linking turns into dependency hell. And the magic solution our industry has come up with for that problem is packaging half an OS inside of a container...


> Storage is cheap

I'd be very grateful if you stopped using all my RAM for two buttons and a scrollbar, thank you.


OSes don't load the full executable into physical RAM, only the pages in the working set. Most of the Zed executable's size is tree-sitter code for all the supported languages, and only needs to page in if those languages are being used in a project.
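
You can see that gap between mapped size and resident size for any running process (a quick sketch; the PID is hypothetical):

  $ ps -o vsz=,rss= -p 12345   # virtual size vs resident set size, in KiB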


Maybe for this particular case but the comment shows a certain mindset...


Big sigh. I wish we still had pride in our field, rather than this race-to-the-bottom mentality.


I really like this article "How Swift Achieved Dynamic Linking Where Rust Couldn't" https://faultlore.com/blah/swift-abi


I was a little sus, so I checked: https://imgur.com/a/AJFQjfL

897MB! But it appears to have installed itself twice for some reason. Maybe one is an 'update' which it didn't clean up...? I'm not sure.

Edit: I just opened it and it cleaned itself up. 408MB now. I guess it was in the process of upgrading.


So the upgrades are not delta diffs either?


Even if it’s delta, it cannot patch itself when running on Windows. So it runs the updater, creates a new exec and switches to it after relaunch. Same as Chrome or Firefox.


OS deficiency. And maybe programs shouldn't be allowed to update themselves.


Is it? On Linux, you can overwrite the file, but the underlying inode will still be open, and the 'invisible' old version will linger around - you don't have any easy way, short of restarting everything, to make sure the new versions are being used.

And with Chromium this directly leads to crashes - when you update the browser while it's open, new tabs will open with the new version of the binary, with the old ones still using the old binary - which usually leads to a crash.

I prefer 'you cannot do X' instead of 'we allow you to do it, but it might misbehave in unpredictable ways'.
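
A quick sketch of that lingering inode (paths and PID here are hypothetical):

  $ cp ./app ./app.new && mv ./app.new ./app   # "overwrite" the binary while it runs
  $ ls -l /proc/12345/exe                      # 12345 = PID of the still-running process
  lrwxrwxrwx ... /proc/12345/exe -> /tmp/demo/app (deleted)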


I don't use Chromium. I never had issues with Apache, MySQLd, Firefox, Thunderbird, ... You can even swap out the Linux kernel under userspace and it all keeps running.


> maybe programs shouldn't be allowed to update themselves.

Honestly, I'd be all for this if the OS had a good autoupdate mechanism for 3rd-party applications. But that's not the world we live in. Certainly not on Windows, which is too busy adding antivax conspiracy articles to the Start menu.


Will it though? I mean it's a lot for a "text editor", but much less than a classical IDE. And 400M is pretty negligible if you're on Windows, where your OS takes up dozens of GB for no reason.


Yeah, I don't think 400M is really that big a deal. My `.emacs.d/` dir weighs in at over 1G and I've never thought twice about it.

For people who are serious about their text editors, 400M is a small price to pay for something that works for you.


If the OS is already bloated, that leaves LESS space for your editor!


> I was always under the impression that Rust apps are pretty lightweight

I'm not sure what gave you that impression. I'd say Rust is pretty well known for fat binaries.


Most of the sticker shock from Rust binaries is due to them being statically-linked by default. Considering that, Rust binaries aren't especially large, especially if you strip them. Dynamically-linked binaries are better at obscuring their size.
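
For reference, Cargo exposes the usual size knobs in the release profile; a minimal sketch (these are real profile options, but the savings vary per project):

  [profile.release]
  strip = "symbols"   # strip symbols at link time (stable since Rust 1.59)
  lto = "thin"        # link-time optimization across crates
  codegen-units = 1   # better optimization, slower builds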


Rust Hello World is larger than Git. Still smaller than Java and Electron, but not exactly small.


Entirely untrue. Download git, run make and you'll get a 19MB `git` binary along with a whole lot of other 19MB binaries. Running `cargo build` produces a 3.8MB binary.

And that's still comparing apples to oranges, because git is compiled with full optimizations. Running `cargo build --release` produces a 462KB binary.

Even if I'm comparing to my system's git installation, that's still 3.9MB, and that's with all the debug info stripped.

Yes, Rust (like C++) tends to produce larger binaries than C, but let's be real here: the reason Zed has a bloated binary is the ~2000 Rust packages that comprise it.
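
If you want to see which crates the bytes actually come from, the third-party cargo-bloat tool can attribute binary size per crate (not part of stock cargo; flags per its README):

  $ cargo install cargo-bloat
  $ cargo bloat --release --crates -n 10   # top 10 crates by contributed size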


> The reason Zed has a bloated binary is the ~2000 Rust packages that comprise it.

Hundreds of those MBs are from tree-sitter grammars, which are JavaScript compiled to C.


That's an entirely different issue. The KBs of overhead for backtrace printing and the format machinery are fixed and do not grow with the binary size. All combined, it wouldn't account for anywhere close to 1MB, let alone hundreds of MBs.


The Helix binary on my system is 20MB+, but the dynamically linked grammars are an additional 200MB. Those 380-400MB are probably not pure binaries, are they?


> I was always under the impression that Rust apps are pretty lightweight

Maybe compared to Electron, but binary size is an issue with any nontrivial Rust application. Due to how cargo works, it compiles and bundles in every dependency in the world.

400MB is unnecessarily large though.


> I believe it when they say it's not an Electron app, but I do wonder what's being packed in there

Half of Electron, namely Node.js, as the majority of LSPs are .js-based. Also, extensions are WASM. And VS Code keeps extensions in a separate config directory, while Zed keeps them in the main directory.


Just so others here know, it’s possible to have a graphics context and a Win32 menu bar in the same window.


PSPad is 40MB, and that is quite a legacy piece of software that is still being updated to this day. Notepad++ is 17MB.

400MB for a new project in this amazing bestest compiled language ever made is ridiculous.


I think this idealism reveals a naive viewpoint about what users really care about. They care that apps work - that they do what they're supposed to, and do it fast and efficiently. Not even Microsoft makes native apps for their own platform (for example, Teams and the new Outlook), and they service millions of users. Indeed, if you look at Microsoft's UIs over the years, they are inconsistent as hell (all of the Office apps throughout the years are a good example), but as long as performance, functionality and usability haven't suffered too much, users are OK with non-native apps that do not appear native. Another example is iTunes on Windows - it looks nothing like a native Windows app.

There's also the fact that having control over your own apps UI/design language is better over the long term. What if Apple decides to ditch this liquid glass for something else years in the future? They ditched their own design language in iOS7, and now with iOS26 they've done it again.

And the basis for UI redesigns as wide-ranging as this is almost entirely nonsensical. Does liquid glass suddenly improve usability by whatever percent? Nope - I guarantee Apple does NOT interrogate or benchmark their UI designs in the same way NN Group does. Usability is actually hurt by the fact that users need to re-learn basic interactions, and existing ones are now slower. Is overall performance improved over the previous version? Absolutely not - performance metrics such as battery life and UI responsiveness have regressed with the overuse of visual effects like translucency and minute pixel manipulations. Why bother following changes to a design language when they are not based on real reasoning backed up by actual data or solid logic, and they end up regressing performance to an even worse state? Why should any app vendor be obligated to follow what are ultimately arbitrary and whimsical changes?

Redesigns such as this result in literally more work for the sake of it, zero net improvement, and a whole lot of wasted effort, all for what? Just to look different for a while, until the next redesign?


> They ditched their own design language in iOS7, and now with iOS26 they've done it again.

They ditched their 30-pin dock with the iPhone 5, and then chucked Lightning in favor of USB-C with the iPhone 15.

