Hacker News

If the point is being able to access some files even if the whole archive isn’t uploaded, why not create 100 separate archives each with a partial set of files?

Or use a protocol that supports resuming partial transfers.



Because when your files are very large, it's not easy to create separate archives of (roughly) even size.

A single video can easily be over 20GB, for example.
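The uneven-split problem is essentially bin packing. A greedy sketch (hypothetical sizes, not from the thread) shows why a single 20 GB video defeats any attempt at roughly even 5 GB archives:

```python
def pack_greedy(sizes, target):
    """Greedily group file sizes into archives of roughly `target` bytes.

    Any single file larger than `target` must become its own oversized
    archive, so the split can never be even.
    """
    archives, current, total = [], [], 0
    for size in sorted(sizes, reverse=True):
        if total + size > target and current:
            archives.append(current)
            current, total = [], 0
        current.append(size)
        total += size
    if current:
        archives.append(current)
    return archives

GB = 10**9
# One 20 GB video plus a few smaller files, 5 GB target per archive:
print([sum(a) for a in pack_greedy([20 * GB, 3 * GB, 2 * GB, 1 * GB], 5 * GB)])
# → [20000000000, 5000000000, 1000000000]
```

The first "archive" is four times the target no matter how the remaining files are grouped, which is the commenter's point.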


A single archive records that all those files belong together in an inseparable, immutable way, unlike encoding that grouping in the archive's name or via some parallel channel.


Presumably it compresses better if it's all one archive?
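That intuition is easy to check with a toy experiment (zlib on synthetic, deliberately similar data; sizes will vary by input): compressing similar files as one stream lets later files reuse matches from earlier ones, whereas separate archives each start with an empty window.

```python
import zlib

# Ten "files" with heavily repeated content, as an archive of similar files might have.
files = [b"the quick brown fox jumps over the lazy dog\n" * 100 for _ in range(10)]

# Compress each file on its own, as 10 separate archives would.
separate = sum(len(zlib.compress(f)) for f in files)

# Compress everything as one stream, as a single solid archive would.
together = len(zlib.compress(b"".join(files)))

print(separate, together)
```

On redundant data like this, the combined stream comes out smaller than the sum of the separate ones, since cross-file redundancy is only visible within one compression stream.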



