Clone all your repositories and apply sweeping changes.
pip install all-repos
All command-line interfaces provided by all-repos support the following options:
- -h / --help: show usage information.
- -C CONFIG_FILENAME / --config-filename CONFIG_FILENAME: use a non-default config file (the default all-repos.json can be changed with the environment variable ALL_REPOS_CONFIG_FILENAME).
- --color {auto,always,never}: use color in output (default: auto).
Add git clone tab completion for all-repos repositories.
Requires jq to function.
Add to .bash_profile:
eval "$(all-repos-complete -C ~/.../all-repos.json --bash)"
Clone all the repositories into the output_dir. If run again, this command
will update existing repositories.
Options:
- -j JOBS / --jobs JOBS: how many concurrent jobs will be used to complete the operation. Specify 0 or -1 to match the number of CPUs (default: 8).
Sample invocations:
- all-repos-clone: clone the repositories specified in all-repos.json.
- all-repos-clone -C all-repos2.json: clone using a non-default config filename.
Similar to a distributed git ls-files | grep -P PATTERN.
Arguments:
- PATTERN: the Python regex to match.
Options:
- --repos-with-matches: only print repositories with matches.
Sample invocations:
- all-repos-find-files setup.py: find all setup.py files.
- all-repos-find-files --repos setup.py: find all repositories containing a setup.py.
Similar to a distributed git grep ....
Options:
- --repos-with-matches: only print repositories with matches.
- GIT_GREP_OPTIONS: additional arguments will be passed on to git grep. See git grep --help for available options.
Sample invocations:
- all-repos-grep pre-commit -- 'requirements*.txt': find all repositories which have pre-commit listed in a requirements file.
- all-repos-grep -L six -- setup.py: find setup.py files which do not contain six.
List all cloned repository names.
Interactively apply a manual change across repos.
note: all-repos-manual will always run in --interactive autofixing mode.
note: all-repos-manual requires the --repos autofixer option.
Options:
- autofix options: all-repos-manual is an autofixer and supports all of the autofixer options.
- --branch-name BRANCH_NAME: override the autofixer branch name (default: all-repos-manual).
- --commit-msg COMMIT_MSG (required): set the autofixer commit message.
Similar to a distributed git ls-files -z -- FILENAMES | xargs -0 sed -i EXPRESSION.
note: this assumes GNU sed. If you're on macOS, install gnu-sed with Homebrew:
brew install gnu-sed
# Add to .bashrc / .zshrc
export PATH="$(brew --prefix)/opt/gnu-sed/libexec/gnubin:$PATH"
Arguments:
- EXPRESSION: sed program. For example: s/hi/hello/g.
- FILENAMES: filenames glob (passed to git ls-files).
Options:
- autofix options: all-repos-sed is an autofixer and supports all of the autofixer options.
- -r / --regexp-extended: use extended regular expressions in the script. See man sed for further details.
- --branch-name BRANCH_NAME: override the autofixer branch name (default: all-repos-sed).
- --commit-msg COMMIT_MSG: override the autofixer commit message (default: git ls-files -z -- FILENAMES | xargs -0 sed -i ... EXPRESSION).
Sample invocations:
- all-repos-sed 's/foo/bar/g' -- '*': replace foo with bar in all files.
A configuration file looks roughly like this:
{
"output_dir": "output",
"source": "all_repos.source.github",
"source_settings": {
"api_key": "...",
"username": "asottile"
},
"push": "all_repos.push.github_pull_request",
"push_settings": {
"api_key": "...",
"username": "asottile"
}
}
- output_dir: where repositories will be cloned to when all-repos-clone is run.
- source: the module import path to a source; see below for builtin source modules as well as directions for writing your own.
- source_settings: the source-type-specific settings; the source module's documentation will explain the various possible values.
- push: the module import path to a push; see below for builtin push modules as well as directions for writing your own.
- push_settings: the push-type-specific settings; the push module's documentation will explain the various possible values.
- include (default ""): Python regex for selecting repositories. Only repository names which match this regex will be included.
- exclude (default "^$"): Python regex for excluding repositories. Repository names which match this regex will be excluded.
- all_branches (default false): whether to clone all of the branches or just the default upstream branch.
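For example, a configuration that uses include and exclude to narrow the clone down to particular repositories might look like the following sketch (the regexes and the non-default option values are illustrative, not recommendations):

```json
{
    "output_dir": "output",
    "source": "all_repos.source.github",
    "source_settings": {
        "api_key": "...",
        "username": "asottile"
    },
    "include": "^lib",
    "exclude": "-archived$",
    "all_branches": false,
    "push": "all_repos.push.github_pull_request",
    "push_settings": {
        "api_key": "...",
        "username": "asottile"
    }
}
```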
Clones all repositories listed in a file. The file must be formatted as follows:
{
"example/repo1": "https://git.example.com/example/repo1",
"repo2": "https://git.example.com/repo2"
}
- filename: the file containing the repository mapping (formatted as above).
output/
+--- repos.json
+--- repos_filtered.json
+--- {repo_key1}/
+--- {repo_key2}/
+--- {repo_key3}/
Clones all repositories available to a user on github.
- api_key: the api key which the user will log in as.
  - Use the settings tab to create a personal access token.
  - The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
- username: the github username you will log in as.
- collaborator (default false): whether to include repositories which are not owned but can be contributed to as a collaborator.
- forks (default false): whether to include repositories which are forks.
- private (default false): whether to include private repositories.
- archived (default false): whether to include archived repositories.
- base_url (default https://api.github.com): the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).
output/
+--- repos.json
+--- repos_filtered.json
+--- {username1}/
    +--- {repo1}/
    +--- {repo2}/
+--- {username2}/
    +--- {repo3}/
Clones all repositories forked from a repository on github.
- api_key: the api key which the user will log in as.
  - Use the settings tab to create a personal access token.
  - The minimum scope required to function is public_repo.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
- repo: the repo which has forks.
- collaborator (default true): whether to include repositories which are not owned but can be contributed to as a collaborator.
- forks (default true): whether to include repositories which are forks.
- private (default false): whether to include private repositories.
- archived (default false): whether to include archived repositories.
- base_url (default https://api.github.com): the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).
See the directory structure for all_repos.source.github.
Clones all repositories from an organization on github.
- api_key: the api key which the user will log in as.
  - Use the settings tab to create a personal access token.
  - The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
- org: the organization to clone from.
- collaborator (default true): whether to include repositories which are not owned but can be contributed to as a collaborator.
- forks (default false): whether to include repositories which are forks.
- private (default false): whether to include private repositories.
- archived (default false): whether to include archived repositories.
- base_url (default https://api.github.com): the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).
See the directory structure for all_repos.source.github.
Clones all repositories available to a user on a gitolite host.
- username: the user to SSH to the server as (usually git).
- hostname: the hostname of your gitolite server (e.g. git.mycompany.com).
The gitolite API is served over SSH. It is assumed that when all-repos-clone
is called, it's possible to make SSH connections with the username and hostname
configured here in order to query that API.
- mirror_path (default None): an optional mirror to clone repositories from. This is a Python format string, and can use the variable repo_name. This can be anything git understands, such as another remote server (e.g. gitmirror.mycompany.com:{repo_name}) or a local path (e.g. /gitolite/git/{repo_name}.git).
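Since mirror_path is an ordinary Python format string, the substitution works as follows (the mirror host and repository name below are hypothetical):

```python
# mirror_path is formatted with the repository name before cloning;
# the host and repo name here are hypothetical examples.
mirror_path = 'gitmirror.mycompany.com:{repo_name}'
clone_url = mirror_path.format(repo_name='my-project')
print(clone_url)  # gitmirror.mycompany.com:my-project
```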
output/
+--- repos.json
+--- repos_filtered.json
+--- {repo_name1}.git/
+--- {repo_name2}.git/
+--- {repo_name3}.git/
Clones all repositories available to a user on Bitbucket Cloud.
- username: the Bitbucket username you will log in as.
- app_password: the authentication method for the above user to login with.
  - Create an application password within your account settings.
  - We need the scope: Repositories -> Read.
Clones all repositories available to a user on Bitbucket Server.
- base_url: the bitbucket server URL (e.g. bitbucket.domain.com).
- username: the Bitbucket username you will log in as.
- app_password: the authentication method for the above user to login with.
  - Create an application password within your account settings.
  - We need the scope: Repositories -> Read.
- project (default None): an optional project to restrict the search for repositories.
output/
+--- repos.json
+--- repos_filtered.json
+--- {username1}/
    +--- {repo1}/
    +--- {repo2}/
+--- {username2}/
    +--- {repo3}/
Clones all repositories from an organization on gitlab.
- api_key: the api key which the user will log in as.
  - Use the settings tab (e.g. https://{gitlab.domain.com}/-/profile/personal_access_tokens) to create a personal access token.
  - We need the scopes: read_api, read_repository.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
- org: the organization to clone from.
- base_url (default https://gitlab.com/api/v4): the gitlab server URL.
- archived (default false): whether to include archived repositories.
output/
+--- repos.json
+--- repos_filtered.json
+--- {org}/
    +--- {subgroup1}/
        +--- {repo1}/
        +--- {repo2}/
    +--- {subgroup2}/
        +--- {repo3}/
    +--- {repo4}/
First create a module. This module must have the following api:
This class will receive keyword arguments for all values in the source_settings dictionary.
An easy way to implement the Settings class is by using a namedtuple:
Settings = collections.namedtuple('Settings', ('required_thing', 'optional'))
Settings.__new__.__defaults__ = ('optional default value',)
In this example, the required_thing setting is a required setting whereas optional may be omitted (and will get a default value of 'optional default value').
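To illustrate, instantiating such a Settings class behaves like this:

```python
import collections

# Same shape as the Settings example above: one required setting and
# one optional setting with a default value.
Settings = collections.namedtuple('Settings', ('required_thing', 'optional'))
Settings.__new__.__defaults__ = ('optional default value',)

s = Settings(required_thing='hello')
print(s.required_thing)  # hello
print(s.optional)        # optional default value
```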
This callable will be passed an instance of your Settings class. It must return a mapping from {repo_name: repository_url}. The repo_name is the directory name inside the output_dir.
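Putting the pieces together, a minimal source module might look like the sketch below. The host setting and the URLs are hypothetical; a real module would query an actual API instead of returning a hard-coded mapping.

```python
import collections

# Hypothetical settings for this sketch: just a host to build URLs from.
Settings = collections.namedtuple('Settings', ('host',))


def list_repos(settings):
    # A real source would query settings.host here; this sketch returns
    # a hard-coded {repo_name: repository_url} mapping for illustration.
    return {
        'team/repo1': 'https://{}/team/repo1'.format(settings.host),
        'team/repo2': 'https://{}/team/repo2'.format(settings.host),
    }


repos = list_repos(Settings(host='git.example.com'))
print(repos['team/repo1'])  # https://git.example.com/team/repo1
```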
Merges the branch directly to the default branch and pushes. The commands it runs look roughly like this:
git checkout main
git pull
git merge --no-ff $BRANCH
git push origin HEAD
- fast_forward (default false): if true, perform a fast-forward merge (--ff-only). If false, create a merge commit (--no-ff).
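In the configuration file, this push module would be selected roughly like so (a fragment of the full config; fast_forward is shown with a non-default value):

```json
{
    "push": "all_repos.push.merge_to_master",
    "push_settings": {"fast_forward": true}
}
```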
Pushes the branch to origin and then creates a github pull request for the branch.
- api_key: the api key which the user will log in as.
  - Use the settings tab to create a personal access token.
  - The minimum scope required to function is public_repo, though you'll need repo to access private repositories.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
- username: the github username you will log in as.
- base_url (default https://api.github.com): the base URL to the Github API to use (for Github Enterprise support set this to https://{your_domain}/api/v3).
- draft (default false): if true, the pull request will be opened as a draft.
- fork (default false): if applicable, a fork will be created and pushed to instead of the upstream repository. The pull request will then be made to the upstream repository.
Pushes the branch to origin and then creates a Bitbucket pull request for the branch.
- base_url: the Bitbucket server URL (e.g. bitbucket.domain.com).
- username: the Bitbucket username you will log in as.
- app_password: the authentication method for the above user to login with.
  - Create an application password within your account settings.
  - We need the scope: Repositories -> Read.
Pushes the branch to origin and then creates a GitLab merge request for the branch.
- base_url: the GitLab server URL (e.g. https://{gitlab.domain.com}/api/v4).
- api_key: the api key which the user will log in as.
  - Use the settings tab (e.g. https://{gitlab.domain.com}/-/profile/personal_access_tokens) to create a personal access token.
  - We need the scope: write_repository.
- api_key_env: alternatively, the API key can also be passed via an environment variable.
Does nothing. There are no configurable settings for readonly.
First create a module. This module must have the following api:
This class will receive keyword arguments for all values in the push_settings dictionary.
This callable will be passed an instance of your Settings class. It should deploy the branch. The function will be called with the root of the repository as the cwd.
An autofixer applies a change over all repositories.
all-repos provides several api functions to write your autofixers with:
def add_fixer_args(parser):
Adds the autofixer cli options.
Options:
- --dry-run: show what would happen but do not push.
- -i / --interactive: interactively approve / deny fixes.
- -j JOBS / --jobs JOBS: how many concurrent jobs will be used to complete the operation. Specify 0 or -1 to match the number of CPUs (default: 1).
- --limit LIMIT: maximum number of repos to process (default: unlimited).
- --author AUTHOR: override the commit author. This is passed directly to git commit. An example: --author='Herp Derp <[email protected]>'.
- --repos [REPOS [REPOS ...]]: run against specific repositories instead. This is especially useful with xargs autofixer ... --repos. This can be used to specify repositories which are not managed by all-repos.
def from_cli(args, *, find_repos, msg, branch_name):
Parse cli arguments and produce autofix_lib primitives. Returns (repos, config, commit, autofix_settings). This is handled separately from fix to allow fixers to adjust arguments.
- find_repos: callback taking Config as a positional argument.
- msg: commit message.
- branch_name: identifier used to construct the branch name.
def fix(
    repos, *,
    apply_fix,
    check_fix=_noop_check_fix,
    config: Config,
    commit: Commit,
    autofix_settings: AutofixSettings,
):
Apply the fix.
- apply_fix: callback which will be called once per repository. The cwd when the function is called will be the root of the repository.
def run(*cmd, **kwargs):
Wrapper around subprocess.run which prints the command it will run. Unlike subprocess.run, this defaults check=True unless explicitly disabled.
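A sketch of what such a wrapper could look like (not the actual implementation) is:

```python
import subprocess


def run(*cmd, **kwargs):
    # Print the command, then delegate to subprocess.run with
    # check=True unless the caller explicitly disables it.
    print('$ ' + ' '.join(cmd))
    kwargs.setdefault('check', True)
    return subprocess.run(cmd, **kwargs)


result = run('echo', 'hello', capture_output=True)
print(result.stdout.decode().strip())  # hello
```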
The trivial autofixer is as follows:
import argparse

from all_repos import autofix_lib


def find_repos(config):
    return []


def apply_fix():
    pass


def main(argv=None):
    parser = argparse.ArgumentParser()
    autofix_lib.add_fixer_args(parser)
    args = parser.parse_args(argv)

    repos, config, commit, autofix_settings = autofix_lib.from_cli(
        args, find_repos=find_repos, msg='msg', branch_name='branch-name',
    )
    autofix_lib.fix(
        repos, apply_fix=apply_fix, config=config, commit=commit,
        autofix_settings=autofix_settings,
    )


if __name__ == '__main__':
    raise SystemExit(main())
You can find some more involved examples in all_repos/autofix:
- all_repos.autofix.azure_pipelines_autoupdate: upgrades pinned azure pipelines template repository references.
- all_repos.autofix.pre_commit_autoupdate: runs pre-commit autoupdate.
- all_repos.autofix.pre_commit_autopep8_migrate: migrates autopep8-wrapper from pre-commit/pre-commit-hooks to mirrors-autopep8.
- all_repos.autofix.pre_commit_cache_dir: updates the cache directory for travis-ci / appveyor for pre-commit 1.x.
- all_repos.autofix.pre_commit_flake8_migrate: migrates flake8 from pre-commit/pre-commit-hooks to pycqa/flake8.
- all_repos.autofix.pre_commit_migrate_config: runs pre-commit migrate-config.
- all_repos.autofix.setup_py_upgrade: runs setup-py-upgrade and then setup-cfg-fmt to migrate setup.py to setup.cfg.