On Tue, Jan 08, 2019 at 04:16:41PM +0100, Harald Dunkel wrote:
> Hi folks,
>
> I wonder why git-lfs is needed to efficiently handle large files
> in git. Would it be reasonable to integrate this functionality
> into the native git?

Most of the problems Git has with handling large files aren't really
problems if you have unlimited resources, but they are practical
problems in some situations.

Git doesn't handle files that exceed memory capacity very well. In
order to deltify files, we need to have them in memory, so the choice
is either to skip deltification for large files and use a lot of
storage, or to deltify them and spend a lot of CPU and memory
compressing, decompressing, and deltifying.

This means that Git can require a lot of resources to store and repack
large files. That's a problem not only on your own system, but on
whatever remote system hosts your repos (your own server, GitHub,
GitLab, etc.). Your host probably has more resources than your local
machine, but it probably has more repos as well.

Git LFS makes the opposite trade-off: it stores files uncompressed and
copies only the files you need from the server to your system. That
means you don't bloat your local clone with files you may never check
out, but it has the downside that your clone isn't necessarily
complete.

I'm a maintainer of Git LFS, and I'm perfectly happy with solutions
that help Git handle large files better. Ævar gave a great explanation
of some of the work that's going on in this regard, and I'm happy to
hear about other improvements that may come up as well.
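In the meantime, if you just want to keep Git from burning CPU and
memory deltifying your large files, core.bigFileThreshold controls
that cutoff. A minimal sketch (the 100m value is only an example; the
default is 512 MiB):

    # Store files over 100 MB deflated, without attempting delta
    # compression on them.
    $ git config core.bigFileThreshold 100m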
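And here's a rough sketch of the Git LFS trade-off in practice (the
URL and the design/ path are made up for illustration):

    # Have LFS manage a file type; this writes a .gitattributes entry
    # like: *.psd filter=lfs diff=lfs merge=lfs -text
    $ git lfs track "*.psd"

    # Clone without downloading any LFS objects up front, leaving
    # only pointer files in the working tree...
    $ GIT_LFS_SKIP_SMUDGE=1 git clone https://example.com/repo.git

    # ...then fetch only the large files you actually need.
    $ cd repo
    $ git lfs pull --include="design/*.psd"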
-- 
brian m. carlson: Houston, Texas, US
OpenPGP: https://keybase.io/bk2204