
Making Travis builds faster #34

Open
phansch opened this issue Feb 23, 2014 · 2 comments
phansch commented Feb 23, 2014

Currently it takes a very long time to install all the gems on Travis, because they are downloaded from scratch on every build. One solution would be to use a cache for the gems.

Here is a rough overview of how to use caching with Amazon S3. A commenter also mentioned running your own server as a cache, since Amazon S3 can get very expensive. (Travis has built-in caching support as well, but right now it's only available for private repositories.)

Now, since we recently found out about uberspace.de, how about we try caching the gems over there to make the builds faster?

Here is a bundle install script that is used by other people: https://github.com/minad/moneta/blob/master/script/install-bundle
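The general idea behind such a script can be sketched roughly like this (a loose reconstruction, not the linked script itself; the cache host, key scheme, and paths are made-up placeholders):

```shell
# Rough sketch of a caching wrapper around `bundle install`, written to a
# helper script (not executed here). Cache host and filenames are
# assumptions for illustration only.
mkdir -p script
cat > script/cached-bundle <<'EOF'
#!/bin/sh
set -e

CACHE_HOST="cache.example"                   # placeholder cache server
KEY="$(md5sum Gemfile.lock | cut -c1-8)"     # cache key from the lockfile
CACHE_FILE="bundle-$KEY.tar.gz"

# Try to fetch a previously uploaded bundle; a cache miss is fine.
curl -fsS "https://$CACHE_HOST/$CACHE_FILE" -o "$CACHE_FILE" \
  && tar xzf "$CACHE_FILE" || true           # restores vendor/bundle

bundle install --path vendor/bundle          # fast when gems are cached

# Re-pack and upload so the next build can reuse the gems.
tar czf "$CACHE_FILE" vendor/bundle
curl -fsS -T "$CACHE_FILE" "https://$CACHE_HOST/$CACHE_FILE" || true
EOF
chmod +x script/cached-bundle
```

On Travis this would replace the plain `bundle install` step, e.g. via `install: script/cached-bundle` in `.travis.yml`.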

@phansch phansch self-assigned this Feb 24, 2014
phansch commented Feb 26, 2014

45 seconds on another repo of mine: https://travis-ci.org/phansch/typist/builds/19682021
There might be a problem with uploading files, but I'm not sure. If that's the case, I'll try to set up a WebDAV server and use curl -T instead of scp.
I'll keep an eye on whether everything continues to work.

  • Need to distribute public keys to both Travis and uberspace

Related changes are: https://github.com/phansch/typist/pull/5/files
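For comparison, the two upload paths look roughly like this (hostnames, paths, and credentials are placeholders, not real Travis or uberspace accounts, so failures are deliberately ignored in this sketch):

```shell
# Placeholder payload standing in for the packed gems:
: > bundle.tar.gz

# With plain SSH access the upload is scp, which needs an SSH key pair
# distributed to both Travis and the uberspace host (the bullet above):
scp -o BatchMode=yes -o ConnectTimeout=5 \
  bundle.tar.gz user@cache.example:bundle.tar.gz || true

# A WebDAV server turns the upload into a plain HTTP PUT via curl -T,
# so only HTTP credentials are needed instead of SSH key distribution:
curl -fsS --max-time 5 --user user:secret \
  -T bundle.tar.gz "https://cache.example/webdav/bundle.tar.gz" || true
```

The appeal of the WebDAV route is exactly the bullet point above: no public-key setup on either side, just credentials in the build environment.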

phansch commented Mar 4, 2014

Turns out this is taking a lot more time than I thought it would. The solution above had problems with uploading and wasn't a good approach in general.

The second approach involved the same scripts, but this time using a WebDAV server to upload the files. This didn't work either.

What I'm trying right now is Amazon S3 with the bundle_cache gem. The problem with that gem is that it depends on nokogiri, which alone takes 5 minutes to install.
Luckily, someone else ran into the same problem. I went ahead and implemented his proposed changes in our own fork of bundle_cache: https://github.com/BatchZero/bundle_cache/tree/UseS3Gem
Unfortunately, this isn't working either, due to some weird connection error to Amazon S3.
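For anyone following along, pointing Bundler at the fork is a one-line Gemfile change; the sketch below appends it with a heredoc (the git URL and branch come from the fork link above, but treat the exact Gemfile form as an assumption):

```shell
# Point Bundler at the forked bundle_cache branch instead of the
# released gem, so the proposed changes are picked up at install time.
cat >> Gemfile <<'EOF'
gem 'bundle_cache', git: 'https://github.com/BatchZero/bundle_cache.git', branch: 'UseS3Gem'
EOF
# `bundle install` would then build bundle_cache from the fork.
```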

💻 🔨
