If you’re like me, you write a fair bit of code, which means you interact with many Git repositories. If you’re also like me, chances are you keep them in a directory called development/ or similar. It might even have some nested subdirectories, something like this:
./allaboutcheetahs.info
./diceware
./docker/check-disk-space
./docker/health-check
./node/circuitbreaker-demo
./node/neural-network
./s3/bucket-sizes
./s3/disk-usage
./snowdrift
./ssh-to
So that’s cool, but let’s say you get a new machine and you want to replicate your development/ directory structure onto it. One way is to check out everything by hand, but that’s laborious and time-consuming. A second way is to keep backups (and you should absolutely do this), but aside from the challenge of restoring a single directory out of an entire archive, what if that backup doesn’t have the latest commits in it?
I can now offer a third way. I recently wrote a couple of scripts, available on GitHub, that extract the Git remote from each repo in an entire directory structure and save those remotes, along with the directories they belong to, to a file. Given the above example, it might look something like this:
./allaboutcheetahs.info email@example.com:dmuth/dmuth.github.io.git
./diceware firstname.lastname@example.org:dmuth/diceware.git
./docker/check-disk-space email@example.com:dmuth/docker-check-disk-usage.git
./docker/health-check firstname.lastname@example.org:dmuth/docker-health-check.git
./node/circuitbreaker-demo email@example.com:dmuth/another-circuit-breaker.git
./node/neural-network firstname.lastname@example.org:dmuth/neural-network.git
./s3/bucket-sizes email@example.com:dmuth/s3-bucket-sizes.git
./s3/disk-usage firstname.lastname@example.org:dmuth/s3-disk-usage.git
./snowdrift email@example.com:Comcast/snowdrift.git
./ssh-to firstname.lastname@example.org:Comcast/ssh-to.git
“Sounds great! How do I get started?”
Glad you asked! While you can clone the entire repository, you can also run the scripts directly by piping curl into bash. Here’s how to save your repos:
bash <(curl -s https://raw.githubusercontent.com/dmuth/save-and-restore-development-directory/master/save.sh) path/to/development/directory repos.txt
That will write the file repos.txt, containing each repo’s directory (relative to your development directory) and the URL of its origin remote.
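If you’re curious how the save step works, it boils down to something like this sketch. (This is not the actual save.sh; the function name save_repos is my own, and it assumes every repo has an origin remote configured.)

```shell
#!/bin/bash

# Walk a directory tree, and for every Git repo found, record its
# path (relative to the starting directory) and its origin remote URL.
save_repos() {
    local dir="$1"
    local outfile="$2"

    # Start with an empty output file.
    : > "$outfile"

    # Every repo has a .git directory, so find those and work backwards.
    find "$dir" -type d -name .git | while read -r gitdir; do
        repo="$(dirname "$gitdir")"
        url="$(git -C "$repo" config --get remote.origin.url)"
        # Strip the leading directory so paths are relative.
        printf '%s %s\n' "${repo#"$dir"/}" "$url" >> "$outfile"
    done
}
```

Each line of the output file is just a path and a URL separated by a space, which makes the restore side trivial to parse.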
Checking out all of those repos on a new machine is just as straightforward:
bash <(curl -s https://raw.githubusercontent.com/dmuth/save-and-restore-development-directory/master/restore.sh) repos.txt path/to/new/directory
If this runs successfully, you will see each project checked out:
#
# Restoring ./allaboutcheetahs.info with email@example.com:dmuth/dmuth.github.io.git...
#
Cloning into '.'...
remote: Enumerating objects: 667, done.
remote: Total 667 (delta 0), reused 0 (delta 0), pack-reused 667
Receiving objects: 100% (667/667), 1.62 MiB | 0 bytes/s, done.
Resolving deltas: 100% (416/416), done.
#
# Restoring ./diceware with firstname.lastname@example.org:dmuth/diceware.git...
#
Cloning into '.'...
remote: Enumerating objects: 591, done.
remote: Total 591 (delta 0), reused 0 (delta 0), pack-reused 591
Receiving objects: 100% (591/591), 7.48 MiB | 0 bytes/s, done.
Resolving deltas: 100% (334/334), done.
…and so on, leaving you with a new directory full of freshly checked out Git repos:
development2/allaboutcheetahs.info
development2/diceware
development2/docker
development2/node
development2/s3
development2/snowdrift
development2/ssh-to
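The restore side is even simpler under the hood; it amounts to something like this sketch. (Again, this is not the actual restore.sh, and the function name restore_repos is my own.)

```shell
#!/bin/bash

# Read a file of "relative/path remote-url" lines and clone each
# remote into its original relative path under a destination directory.
restore_repos() {
    local infile="$1"
    local dest="$2"

    while read -r path url; do
        # Recreate the directory structure, then clone into it.
        mkdir -p "$dest/$path"
        git clone "$url" "$dest/$path"
    done < "$infile"
}
```

Because the file format is just whitespace-separated paths and URLs, a plain read -r loop is all the parsing that’s needed.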
…and that’s all there is to it!
Got any thoughts on these scripts? Do let me know in the comments!