From: Joachim Durchholz <>
To: Git Mailing List <>
Subject: Re: Mirroring for offline use - best practices?
Date: Thu, 13 Jul 2017 00:14:01 +0200
Message-ID: <>
In-Reply-To: <>

On 12.07.2017 at 19:40, Stefan Beller wrote:

Thanks for the feedback - it's been very, very useful to me!

 > Yes, a local path implies --local in git-clone, which (a) uses hardlinks
 > and (b) avoids some other protocol overhead.

I guess (a) is the most important one for repositories large enough to 
make this kind of stuff matter.
I had gathered as much, but I wasn't sure - my own repos aren't large 
enough to take any measurements.
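For my own notes, the difference boils down to roughly this (the paths 
are made up):

   # a plain local path implies --local: objects get hardlinked where possible
   git clone /srv/mirror/project.git work

   # same source, but forcing full copies of the object files
   git clone --no-hardlinks /srv/mirror/project.git work-copy

   # a file:// URL skips the local optimizations and goes through the
   # normal transport instead
   git clone file:///srv/mirror/project.git work-transport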

 >> Ramification 1:
 >> I'm not sure how best to prepare patches for push-to-upstream.
 >> Is there value in collecting them locally into a push-to-upstream
 >> repo, or is it better to just push from each local clone individually?
 > It depends on a lot of things:
 > * How critical is the latency in the desired workflow?
 >    Say you have this setup on a cruise ship and only push once when
 >    you are in a harbor, then (a) you want to make sure you pushed 
 >    and (b) you care less about latency. Hence you would prefer to collect
 >    everything in one repo so nothing gets lost.

Yeah, that's the kind of scenario I'm having.
Less cruise ship (I wish!) and more on-train work during commutes, but 
it's similar.

But I think the "make sure nothing gets lost" aspect is the most 
important one. It's so easy to forget to push some sideline branch, 
particularly if you're in a workflow that uses many branches.

So... an "outbox" repository would be the Right Thing for me.
Not sure how generally applicable this is - what do other people think?
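Roughly what I picture (the ~/outbox.git path and the 'upstream' remote 
name are just placeholders):

   # create the outbox once
   git init --bare ~/outbox.git

   # in each working clone: register it and push everything,
   # side branches and tags included
   git remote add outbox ~/outbox.git
   git push outbox --all
   git push outbox --tags

   # once online, forward from the outbox to the real server
   # (assuming the outbox has that server configured as 'upstream')
   git -C ~/outbox.git push upstream --all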

 >    Say you are in a fast paced environment, where you want instant
 >    feedback on your patches as they are mostly exploratory designs.
 >    Then you want to push directly from the local clone individually
 >    to minimize latency, I would imagine.

That's for online work, I think.

Of course, there's the situation where you're sometimes offline and 
sometimes online.
I'm not sure how to best handle that - have a script that switches 
configurations? Just stick with that outbox workflow because switching 
workflows would invite error?

One thing that's specific to me is that I tend to be active in multiple 
projects, so I might get home and have queued up pushes in multiple repos.
I'm not sure how to best make sure that I don't forget a push.
OTOH maybe I'm overly optimistic about how many repositories I might 
be working on in a day, and it's a non-issue.
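I'll probably end up with a small check script along these lines (the 
repo list is of course mine to fill in):

   #!/bin/sh
   # list branches that are ahead of their configured upstream, per repo
   # (branches without an upstream won't show up here)
   for repo in ~/work/project-a ~/work/project-b; do
       git -C "$repo" for-each-ref refs/heads \
           --format='%(refname:short) %(upstream:track)' \
       | grep 'ahead' \
       | sed "s|^|$repo: |"
   done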

 > * Does a local clone have any value for having the work from
 >    another local clone available? In that case you may want to
 >    have all your changes accumulated into the mirror.

Yeah, definitely.
A repository I work on might be used as a submodule, and I might want to 
check the ramifications.

 > When a submodule gets deleted (git rm <submodule> && git commit),
 > then all entries for that submodule in the .gitmodules file are also
 > removed. That seems ok, but in an ideal world we may have a tombstone
 > in there (e.g. the submodule.NAME.path still set) that would help for
 > tasks like finding all submodules in the future.

I wouldn't want to use tombstone entries actually, because the content 
of .gitconfig and .gitmodules might have been modified for any number 
of reasons.

The incantations that I'm using for my own "gitmirror" script are:
1. Get all commits that touch .gitmodules via
   git rev-list --all --full-history -- .gitmodules
2. Get the submodule.<name>.path keys (and thus the module names)
   mentioned in that version of .gitmodules via
   git config \
     --blob "${commit}:.gitmodules" \
     --name-only \
     --get-regexp "^submodule\..*\.path$"
3. Given the module name, extract path and url via
   git config \
     --blob "${commit}:.gitmodules" \
     --get "submodule.${module_name}.path"
   git config \
     --blob "${commit}:.gitmodules" \
     --get "submodule.${module_name}.url"
   (one invocation per key, since --get takes a single name)

It's not the most efficient way conceivable, but it uses git's 
configuration parser, so it won't get tripped up by manual edits in the 
configuration files.

It's nothing you'd do on the command line, hence the scripting :-)
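Put together, the whole thing is roughly this (error handling and 
module names containing whitespace conveniently ignored):

   #!/bin/sh
   # walk every historical version of .gitmodules and print
   # commit, module name, path and url
   for commit in $(git rev-list --all --full-history -- .gitmodules); do
       for key in $(git config --blob "${commit}:.gitmodules" \
                        --name-only --get-regexp '^submodule\..*\.path$' \
                        2>/dev/null); do
           # key looks like "submodule.<name>.path"; peel off both ends
           module_name=${key#submodule.}
           module_name=${module_name%.path}
           path=$(git config --blob "${commit}:.gitmodules" \
                      --get "submodule.${module_name}.path")
           url=$(git config --blob "${commit}:.gitmodules" \
                     --get "submodule.${module_name}.url")
           echo "${commit} ${module_name} ${path} ${url}"
       done
   done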

 >> I'm seeing the --recurse-submodules option for git fetch, so this
 >> might (or might not) be the Right Thing.
 > That only works for currently initialized (active) submodules. The
 > submodules of the past and those which you do not have are not fetched.

Ah well.
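For keeping a single clone current, the best I can do seems to be:

   # fetch the superproject and any submodules that are already initialized
   git fetch --recurse-submodules
   # submodules not initialized yet have to be set up explicitly first
   git submodule update --init --recursive

which, as you say, still doesn't cover submodules that only exist in 
past history.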

 > Without the submodule ramifications, I would have advised to have
 > the local mirror be a 'bare' repo.

I'm currently steering towards having a cache for all repositories I 
ever downloaded, which would live in ~/.cache/gitmirror.
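Per repository, that cache would work roughly like this (URL and 
directory layout are just what I'm experimenting with):

   # first time: mirror-clone into the cache
   git clone --mirror https://example.org/some/project.git \
       ~/.cache/gitmirror/example.org/some/project.git

   # later: refresh everything in the cached mirror
   git -C ~/.cache/gitmirror/example.org/some/project.git remote update --prune

   # working clones come from the cache, so they are fast and work offline
   git clone ~/.cache/gitmirror/example.org/some/project.git ~/work/project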

I can turn this script into a public, maintained project if there's 
interest.
The current state is two files, the script itself and a unit test 
script. It's still in a state of flux, so sharing it now would mainly 
be for code review; general usefulness would come later.

