git@vger.kernel.org mailing list mirror (one of many)
* Large-scale configuration backup with GIT?
@ 2007-09-02 20:17 Jan-Benedict Glaw
  2007-09-02 20:37 ` David Kastrup
  2007-09-02 21:49 ` Junio C Hamano
  0 siblings, 2 replies; 5+ messages in thread
From: Jan-Benedict Glaw @ 2007-09-02 20:17 UTC (permalink / raw)
  To: git

Hi!

I'm just thinking about storing our whole company's configuration into
GIT, because I'm all too used to it. That is, there are configuration
dumps of n*10000 routers and switches, as well as "regular"
configuration files on server machines (mostly Linux and Solaris).
While probably all of the server machines could run GIT natively, we
already have some scripts to dump all routers'/switches' configuration
to a Solaris system, so we could import/commit it from there. There
might be a small number of Windows machines, but I guess these will be
done by exporting the interesting stuff to Linux/Solaris machines...

I initially thought about running a git-init-db on each machine's root
directory and adding all interesting files, but that might hurt GIT's
usage for single software projects on those machines, no?
Additionally, a lot of configuration files will be common across
machines, or at least very similar. Many separate repos would probably
result in worse compression once packs are used, since similar files
in different repos cannot be delta-compressed against each other.
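
Just to make that first variant concrete (untested; the idea would be
to ignore everything by default and force-add only the interesting
files):

        $ cd /
        $ git init-db
        $ echo '*' >.git/info/exclude    # ignore everything by default
        $ git add -f etc/passwd etc/ssh/sshd_config
        $ git commit -m "initial config snapshot"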

Another idea would be to regularly copy all interesting files into a
staging directory (with the same directory structure as the root
filesystem) and to git-init-db that staging directory, so as not to
have a machine-wide .git/ in the root directory.
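
Roughly like this, I imagine (path and file list invented, just to
illustrate the idea; the staging directory would be git-init-db'ed
once up front):

        $ rsync -aR /etc/passwd /etc/ssh/sshd_config /var/cfg-staging/
        $ cd /var/cfg-staging
        $ git add .
        $ git commit -m "config snapshot of `hostname`"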

In both cases, I'd be left with a good number of GIT repos, which
should probably be bound together with the GIT subproject functions.
However, one really interesting thing would be to be able to get the
diff of two machines' configuration files. (Think of machines that
*should* be all identical!)  For this, it would probably be easier
not to put each machine into its own GIT repo, but to use a single one
with a zillion branches, one for each machine.
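
With one branch per machine, comparing two boxes would then be as
simple as (branch names and path invented):

        $ git diff --stat machine-a machine-b
        $ git diff machine-a machine-b -- etc/resolv.conf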

Has anybody already tried to do something like that and can help me
with some real-life experience on that topic?

MfG, JBG

-- 
      Jan-Benedict Glaw      jbglaw@lug-owl.de              +49-172-7608481
Signature of:                http://catb.org/~esr/faqs/smart-questions.html
the second  :


* Re: Large-scale configuration backup with GIT?
  2007-09-02 20:17 Large-scale configuration backup with GIT? Jan-Benedict Glaw
@ 2007-09-02 20:37 ` David Kastrup
  2007-09-02 21:24   ` Jan-Benedict Glaw
  2007-09-02 21:49 ` Junio C Hamano
  1 sibling, 1 reply; 5+ messages in thread
From: David Kastrup @ 2007-09-02 20:37 UTC (permalink / raw)
  To: Jan-Benedict Glaw; +Cc: git

Jan-Benedict Glaw <jbglaw@lug-owl.de> writes:

> I'm just thinking about storing our whole company's configuration into
> GIT, because I'm all too used to it. That is, there are configuration
> dumps of n*10000 routers and switches, as well as "regular"
> configuration files on server machines (mostly Linux and Solaris).
> While probably all of the server machines could run GIT natively, we
> already have some scripts to dump all routers'/switches' configuration
> to a Solaris system, so we could import/commit it from there. There
> might be a small number of Windows machines, but I guess these will be
> done by exporting the interesting stuff to Linux/Solaris machines...
>
> I initially thought about running a git-init-db on each machine's root
> directory and adding all interesting files, but that might hurt GIT's
> usage for single software projects on those machines, no?

It could break shell scripts, since

        cd /; echo `pwd`/filename

does not return /filename, but //filename.

I don't think that the root directory is a good place for starting
git.

-- 
David Kastrup, Kriemhildstr. 15, 44793 Bochum


* Re: Large-scale configuration backup with GIT?
  2007-09-02 20:37 ` David Kastrup
@ 2007-09-02 21:24   ` Jan-Benedict Glaw
  0 siblings, 0 replies; 5+ messages in thread
From: Jan-Benedict Glaw @ 2007-09-02 21:24 UTC (permalink / raw)
  To: David Kastrup; +Cc: git

On Sun, 2007-09-02 22:37:43 +0200, David Kastrup <dak@gnu.org> wrote:
> Jan-Benedict Glaw <jbglaw@lug-owl.de> writes:
> 
> > I'm just thinking about storing our whole company's configuration into
> > GIT, because I'm all too used to it. That is, there are configuration
> > dumps of n*10000 routers and switches, as well as "regular"
> > configuration files on server machines (mostly Linux and Solaris).
> > While probably all of the server machines could run GIT natively, we
> > already have some scripts to dump all routers'/switches' configuration
> > to a Solaris system, so we could import/commit it from there. There
> > might be a small number of Windows machines, but I guess these will be
> > done by exporting the interesting stuff to Linux/Solaris machines...
> >
> > I initially thought about running a git-init-db on each machine's root
> > directory and adding all interesting files, but that might hurt GIT's
> > usage for single software projects on those machines, no?
> 
> It could break shell scripts, since
> 
>         cd /; echo `pwd`/filename
> 
> does not return /filename, but //filename.

Well, I don't think that this is much of a problem.

> I don't think that the root directory is a good place for starting
> git.

Maybe :)  But this is why I brought up the discussion, to get other
people's opinion on this topic.

MfG, JBG

-- 
      Jan-Benedict Glaw      jbglaw@lug-owl.de              +49-172-7608481
Signature of:                http://catb.org/~esr/faqs/smart-questions.html
the second  :


* Re: Large-scale configuration backup with GIT?
  2007-09-02 20:17 Large-scale configuration backup with GIT? Jan-Benedict Glaw
  2007-09-02 20:37 ` David Kastrup
@ 2007-09-02 21:49 ` Junio C Hamano
  2007-09-03  0:35   ` Martin Langhoff
  1 sibling, 1 reply; 5+ messages in thread
From: Junio C Hamano @ 2007-09-02 21:49 UTC (permalink / raw)
  To: Jan-Benedict Glaw; +Cc: git

Jan-Benedict Glaw <jbglaw@lug-owl.de> writes:

> I'm just thinking about storing our whole company's configuration into
> GIT, because I'm all too used to it. That is, there are configuration
> ...
> In both cases, I'd be left with a good number of GIT repos, which
> should probably be bound together with the GIT subproject functions.
> However, one really interesting thing would be to be able to get the
> diff of two machines' configuration files. (Think of machines that
> *should* be all identical!)  For this, it would probably be easier
> not to put each machine into its own GIT repo, but to use a single one
> with a zillion branches, one for each machine.
>
> Has anybody already tried to do something like that and can help me
> with some real-life experience on that topic?

This is something similar to what I and others in my group did
a long time before git was even invented.  I'd suggest you go in
the opposite direction.

If you have 5 configurations, each of which has 20 machines
that _should_ share that configuration (modulo obvious
differences that come from hostname, IP address assignment,
etc), then

 - You keep track of 5 configurations; in git, you would
   probably maintain them as 5 branches.

 - You have a build mechanism to create the systemic variation among
   the 20 machines that share one configuration; this can be
   different per branch.  So if you have 20 Solaris machines that
   should all logically share the same configuration, you would:

	$ git checkout solarisconf

        ... tweak the config for machine #27, adjusting for
        ... hostname, IP address variation, etc...
        $ make target=solaris27 output=../solaris27.expect

   Make that makefile produce the output in the named directory
   (a crude sketch of the whole step follows this list);

 - You get the config dump from your machines (your "staging
   area"), as you planned.  Then after running the above, you
   could:

	$ cd ..
        $ diff -r solaris27.expect solaris27.actual

   if your "staging area" for machine #27 is "solaris27.actual".
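
Put together, a crude version of that build-and-compare step might
look like this (the template names, placeholder convention and
host-specific values are all made up -- use whatever your site
already has):

        $ git checkout solarisconf
        $ rm -rf ../solaris27.expect && mkdir -p ../solaris27.expect/etc
        $ sed -e 's/@HOSTNAME@/solaris27/' -e 's/@IPADDR@/10.1.2.27/' \
                templates/hosts >../solaris27.expect/etc/hosts
        $ sed -e 's/@HOSTNAME@/solaris27/' \
                templates/nodename >../solaris27.expect/etc/nodename
        $ diff -r ../solaris27.expect ../solaris27.actual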

The differences you see are things that were done by *hand* on the
machine, which you would want to propagate back to the Solaris
configuration *source* you keep track of in git.  For some changes,
you may even want to fold that single manual change made on machine
#27 into the build procedure on the solarisconf branch, so that you
do not have to repeat it on the other 19 Solaris boxes by hand.



 


* Re: Large-scale configuration backup with GIT?
  2007-09-02 21:49 ` Junio C Hamano
@ 2007-09-03  0:35   ` Martin Langhoff
  0 siblings, 0 replies; 5+ messages in thread
From: Martin Langhoff @ 2007-09-03  0:35 UTC (permalink / raw)
  To: Junio C Hamano; +Cc: Jan-Benedict Glaw, git

On 9/3/07, Junio C Hamano <gitster@pobox.com> wrote:
> This is something similar to what I and others in my group did
> a long time before git was even invented.  I'd suggest you go in
> the opposite direction.

Agreed. The infrastructures.org crowd has been exploring this space
quite a bit, and developing tools like isconf that allow you to manage
a huge number of machines across various unixen.

And recently someone has integrated Debian's APT with git-based
tracking of config files (it's called IsiSetup, http://www.isisetup.ch/).
I haven't reviewed it in detail, but it'd be first on my list.

> If you have 5 configurations, each of which has 20 machines
> that _should_ share that configuration (modulo obvious
> differences that come from hostname, IP address assignment,
> etc), then

Indeed. I ended up liking the makefile-stanzas approach; it is
incredibly simple and flexible. Add Debian/RPM packages, some of the
tools in the cfengine toolchain, and git to track your
scripts/configuration, and you are golden.

For the Windows side of things, see "Real Men Don't Click"
<http://isg.ee.ethz.ch/tools/realmen/> -- a bunch of unixy sysadmins
with the "infrastructures.org" background took on Windows
server/desktop management -- and succeeded. PG-rated, fun for the
whole family. ;-)

HTH,



martin-who-survived-the-sysadmin-wars
