git@vger.kernel.org mailing list mirror (one of many)
* git clone algorithm
@ 2012-10-09 17:23 Bogdan Cristea
  2012-10-10  1:13 ` Sitaram Chamarty
  0 siblings, 1 reply; 2+ messages in thread
From: Bogdan Cristea @ 2012-10-09 17:23 UTC (permalink / raw)
  To: git

I have already posted this message on git-users@googlegroups.com, but I was 
advised to use this list instead. I know there is a related thread 
(http://thread.gmane.org/gmane.comp.version-control.git/207257), but I don't 
think it answers my question (I too am on a slow 3G connection :))

What algorithm does the git clone command use?
When cloning from a remote repository, if the link fails and the same
command is issued again, the process should be smart enough to figure
out which objects have already been transferred locally and resume the
clone from the point where it was interrupted. As far as I can tell
this is not the case: each time I restarted the clone, everything
started over from the beginning. This is extremely annoying over slow,
unreliable connections. Is there any way to cope with this situation,
or are there any future plans?


* Re: git clone algorithm
  2012-10-09 17:23 git clone algorithm Bogdan Cristea
@ 2012-10-10  1:13 ` Sitaram Chamarty
  0 siblings, 0 replies; 2+ messages in thread
From: Sitaram Chamarty @ 2012-10-10  1:13 UTC (permalink / raw)
  To: Bogdan Cristea; +Cc: git

On Tue, Oct 9, 2012 at 10:53 PM, Bogdan Cristea <cristeab@gmail.com> wrote:
> I have already posted this message on git-users@googlegroups.com, but I was
> advised to use this list instead. I know there is a related thread
> (http://thread.gmane.org/gmane.comp.version-control.git/207257), but I don't
> think it answers my question (I too am on a slow 3G connection :))
>
> What algorithm does the git clone command use?
> When cloning from a remote repository, if the link fails and the same
> command is issued again, the process should be smart enough to figure
> out which objects have already been transferred locally and resume the
> clone from the point where it was interrupted. As far as I can tell
> this is not the case: each time I restarted the clone, everything
> started over from the beginning. This is extremely annoying over slow,
> unreliable connections. Is there any way to cope with this situation,
> or are there any future plans?

This is not an answer to your question in the general case, sorry...

Admins managing a site with gitolite can set it up to automatically
create and maintain "bundle" files, and to allow those files to be
downloaded over rsync (which is resumable), using the same
authentication and access rules as gitolite itself.  Once you add a
couple of lines to gitolite.conf, it's pretty much self-maintaining.
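For concreteness, here's a rough sketch of that bundle workflow outside
gitolite (the repository, host name, and paths are made up for
illustration; the gitolite.conf side is not shown):

```shell
set -e

# Hypothetical example: create a small repository standing in for the
# server's copy of the project.
git init -q source
git -C source -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# Server side: pack all refs and their objects into a single bundle file.
git -C source bundle create repo.bundle --all

# Transfer step (not run here): rsync with --partial keeps incomplete
# downloads around, so an interrupted transfer can be resumed, e.g.
#   rsync --partial --progress user@host:repo.bundle .

# Client side: a bundle is a valid clone source.
git clone -q source/repo.bundle clone
git -C clone log --oneline
```

After cloning from the bundle, you can point origin back at the live
repository with `git remote set-url origin <url>` and fetch only the
small delta the bundle doesn't have.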

