Automatic builds for 3.0.0-dev #4
@dkastl @cvvergara Yes, there is a way. We can do this using build triggers, which let us run our automated builds by sending a POST request to the Docker Hub API. I have already activated "Build Triggers" in our pgrouting Docker repository, and we have a token to use for this purpose. We can call the Docker Hub API from Travis CI. Here is how the API works, with a few examples:
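A minimal sketch of what such a trigger call could look like, assuming the legacy Docker Hub build-trigger endpoint; the namespace and `TRIGGER_TOKEN` below are placeholders, not the real values from our repository:

```sh
# Trigger an automated build for all configured branches/tags.
# <namespace> and <TRIGGER_TOKEN> are placeholders.
curl -s -X POST \
  -H "Content-Type: application/json" \
  --data '{"build": true}' \
  "https://registry.hub.docker.com/u/<namespace>/pgrouting/trigger/<TRIGGER_TOKEN>/"

# Trigger a build only for a specific branch (e.g. develop).
curl -s -X POST \
  -H "Content-Type: application/json" \
  --data '{"source_type": "Branch", "source_name": "develop"}' \
  "https://registry.hub.docker.com/u/<namespace>/pgrouting/trigger/<TRIGGER_TOKEN>/"
```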
Now the remaining question is: do we want to build a new Docker image version (and tag?) with every commit? What's the benefit? Who would benefit? Is there demand for this? Would someone actually use it, or would we just do it because we can ;-) I think CI could be a reason, but we are already using Travis for that. If there is no reason to do automated builds that frequently, then we could still try to automate the whole process, but maybe we should only do it for tagged releases.
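If we go the tagged-releases route, one option would be to fire the trigger from Travis CI only when a tag is being built. This is only a sketch, assuming the same hypothetical trigger URL as above and a `TRIGGER_TOKEN` secret configured as an environment variable in Travis:

```sh
# Run from Travis CI (e.g. in after_success).
# TRAVIS_TAG is set by Travis when the build is for a git tag;
# <namespace> and TRIGGER_TOKEN are placeholders/assumptions.
if [ -n "$TRAVIS_TAG" ]; then
  curl -s -X POST \
    -H "Content-Type: application/json" \
    --data "{\"source_type\": \"Tag\", \"source_name\": \"$TRAVIS_TAG\"}" \
    "https://registry.hub.docker.com/u/<namespace>/pgrouting/trigger/${TRIGGER_TOKEN}/"
fi
```

This way, untagged commits would still be covered by the regular Travis CI runs, while Docker Hub only rebuilds images for releases.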
This is a very good question :-) We can do that, but I don't know what people want. People can always build images locally by cloning the GitHub repository instead of downloading prebuilt images from Docker Hub.
#15 seems to be related.
@cayetanobv, has this issue been resolved in the meantime?
When we merge the master branch of this repository (docker-pgrouting), all builds are launched. If in the future we want to automate builds on each merge to the master (or develop) branch of the main pgRouting repository, we would probably need to move this repository into a folder inside the pgRouting repository.
How are changes to 3.0.0-dev being handled?
Because there is a PR almost every day.
@dkastl says