Possibility to generate just the Dockerfile and artifacts and change the default path #34

Open
mayacr86 opened this issue Jul 21, 2015 · 8 comments

@mayacr86

Hi! A couple of silly questions:

  1. Is there a way to configure the plugin to just generate the Dockerfile and the artifacts (all the .jar files under /target/docker) without creating the image on my local machine (due to space limitations) whenever I run the "sbt docker" command?

  2. Is there a way to change the default path of the artifacts, "[app-root]/target/docker", to, let's say, just "[app-root]/docker"?

Thanks :)

@marcus-drake
Owner

Hi!

  1. There is currently no task that only creates the staging directory with the Dockerfile. But you can create your own task that creates a Dockerfile and runs DefaultDockerfileProcessor(dockerfile, stagingDirectory); a minimal sketch follows below.

  2. You can change the staging directory by setting target in docker, for example target in docker := baseDirectory.value / "docker".
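
For reference, a minimal sketch of such a custom task (the task name stageDockerfile is made up here; the snippet assumes the DefaultDockerfileProcessor call quoted above and the DockerfileLike cast used later in this thread):

val stageDockerfile = taskKey[Unit]("Create the staging directory and Dockerfile without building an image")

stageDockerfile := {
  val stagingDir = (target in docker).value
  // Reuse the Dockerfile already defined for the plugin's docker task
  val df = (dockerfile in docker).value.asInstanceOf[sbtdocker.DockerfileLike]
  val staged = sbtdocker.staging.DefaultDockerfileProcessor(df, stagingDir)
  // Write the Dockerfile and copy the referenced files into the staging directory
  IO.write(stagingDir / "Dockerfile", staged.instructionsString)
  staged.stageFiles.foreach { case (source, destination) => source.stage(destination) }
}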

@EdgeCaseBerg

I tried doing this, but am not seeing the Dockerfile show up. Any idea? I've got a taskKey[Unit] defined and the definition has this inside:

someTask := { ...
    val dockerDir = target.value / "docker"
    val dockerFile = new sbtdocker.mutable.Dockerfile {
        from("java:7")
        add(artifact, artifactTargetPath)
        expose(someport)
        entryPoint("java", "-jar", artifactTargetPath)
    }
    sbtdocker.staging.DefaultDockerfileProcessor(dockerFile, dockerDir)
    ... more stuff ...
}
...
someTask <<= someTask.dependsOn(compile in Compile, dockerfile in docker)

but when I run my task I don't see the Dockerfile being created. Am I missing something to actually generate the file?

@EdgeCaseBerg

Never mind, figured out how to do it. For anyone else:

IO.write(dockerDir / "Dockerfile", sbtdocker.staging.DefaultDockerfileProcessor(dockerFile, dockerDir).instructionsString)

should do the trick within the task.
But +1 for having this be a build step that one can hook into; it'd be useful.

@AlexBrickPorch

A 👍 from me here. We use gcloud for our Docker stuff, which means that we need to build and push our Docker images manually (we need to pull the base image and push the final image using gcloud commands). Being able to generate the directory and Dockerfile easily would be super helpful for us.

@nicolasdalsass

nicolasdalsass commented May 19, 2016

👍 here too.

For others stumbling on this, a more complete example (without any guarantee whatsoever, for when you use sbt-assembly):

val dockerFileTask = taskKey[Unit]("Prepare the dockerfile and needed files")

dockerFileTask := {
  val dockerDir = target.value / "docker"
  val artifact: File = assembly.value
  val artifactTargetPath = s"/app/${artifact.name}"

  val dockerFile = new Dockerfile {
    from("java")
    add(artifact, artifactTargetPath)
    entryPoint("java", "-jar", artifactTargetPath)
  }

  val stagedDockerfile = sbtdocker.staging.DefaultDockerfileProcessor(dockerFile, dockerDir)
  IO.write(dockerDir / "Dockerfile", stagedDockerfile.instructionsString)
  stagedDockerfile.stageFiles.foreach {
    case (source, destination) =>
      source.stage(destination)
  }
}

dockerFileTask <<= dockerFileTask.dependsOn(compile in Compile, dockerfile in docker)

And then just run sbt dockerFileTask
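
Note for later readers: the <<= operator was deprecated and later removed from sbt, so on sbt 1.x the same dependency can be expressed roughly as (a sketch, using slash syntax):

dockerFileTask := dockerFileTask.dependsOn(Compile / compile, docker / dockerfile).value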

@laugimethods

Using Wercker to build the Dockerfile, I also need that feature since I'm not able to properly run docker inside a (dedicated) Docker image started by Wercker...

@colindean
Contributor

colindean commented Jun 4, 2019

I'm hitting this, too, but I struggled to integrate @nicolasdalsass's suggestion into my project, which has several subprojects. I finally managed to get it to kind of work; a bare-bones copy is below:

lazy val `dfp-services` = (
  project in file(".")
  enablePlugins GitVersioning
  settings noPublishing
  aggregate(
    `dfp-common`,
    `dfp-init-stream`
  )
)

lazy val `dfp-common` = (
  project in file("dfp-common")
  enablePlugins GitVersioning
  settings common ++ publishAsLibrary
  settings(
    crossScalaVersions := Seq(middleTierScalaVersion),
    libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.5.0",
    libraryDependencies += "com.google.api-ads" % "ads-lib" % "4.5.0",
    libraryDependencies += "com.google.api-ads" % "dfp-axis" % "4.5.0",
    libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6",
    libraryDependencies += "com.typesafe" % "config" % "1.3.1",
    libraryDependencies += "org.scalatest" %% "scalatest" % scalatestVersion % Test
  )
)

lazy val dockerFileTask = taskKey[Unit]("Prepare the dockerfile and needed files")

lazy val `dfp-init-stream` = (
  project in file("dfp-init-stream")
  enablePlugins GitVersioning
  enablePlugins DockerPlugin
  settings common ++ noPublishing
  dependsOn `dfp-init`
  dependsOn `dfp-common`
  settings (
    resolvers += "cakesolutions-bintray" at "https://dl.bintray.com/cakesolutions/maven/",
    libraryDependencies ++= Seq(
      // platform
      "com.typesafe.akka"   %% "akka-actor"                   % "2.5.23"          % Provided,
      "com.typesafe.akka"   %% "akka-testkit"                 % "2.5.23"          % Test,
      "net.cakesolutions"   %% "scala-kafka-client-akka"      % kafkaVersion,
      "net.cakesolutions"   %% "scala-kafka-client-testkit"   % kafkaVersion      % Test,
      "org.scalatest"       %% "scalatest"                    % scalatestVersion  % Test,
      // logging
      "ch.qos.logback"      %  "logback-classic"              % logbackVersion
    ),
    dockerfile in docker := {
      // The assembly task generates a fat JAR file
      val artifact: File = assembly.value
      val artifactTargetPath = s"/app/${artifact.name}"

      new Dockerfile {
        from("example.com/kubernetes/alpine-jre")
        label("servicename","dfp-init-stream")
        user("nobody:nogroup")
        add(artifact, artifactTargetPath)
        entryPoint("java", "-jar", artifactTargetPath)
        cmd("com.example.StreamingDfpInit")
      }
    },
    dockerFileTask := {
      val dockerDir = target.value / "docker"
      val artifact: File = assembly.value
      val artifactTargetPath = s"/app/${artifact.name}"

      val dockerFile = /*(dockerfile in docker).value*/
        new Dockerfile {
          from("example.com/kubernetes/alpine-jre")
          label("servicename","dfp-init-stream")
          user("nobody:nogroup")
          add(artifact, artifactTargetPath)
          entryPoint("java", "-jar", artifactTargetPath)
          cmd("com.example.stream.StreamingDfpInit")
        }

      val stagedDockerfile = sbtdocker.staging.DefaultDockerfileProcessor(dockerFile, dockerDir)
      IO.write(dockerDir / "Dockerfile", stagedDockerfile.instructionsString)
      stagedDockerfile.stageFiles.foreach {
        case (source, destination) =>
          source.stage(destination)
      }
    }
  )
)

You'll notice that I tried to use the Dockerfile exposed by (dockerfile in docker).value before copying it. I'd prefer to single-source the Dockerfile definition and do so in a way that a dev can continue to use sbt docker locally while we can build just the Dockerfile in CI (business processes demand that we know exactly the content of the Dockerfile used and push through our own mechanisms).

This works for one run of sbt dfp-init-stream/dockerFileTask, but subsequent runs fail because the assembled jar is already present in the staging directory:

[error] java.nio.file.FileAlreadyExistsException: /Users/colindean/dfp-services/dfp-init-stream/target/docker/0/dfp-init-stream-assembly-0.8.2-15-g831fe33-SNAPSHOT.jar
[error] 	at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:551)
[error] 	at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
[error] 	at java.nio.file.Files.copy(Files.java:1274)
[error] 	at sbtdocker.staging.CopyFile.copy(SourceFile.scala:31)

It looks like this copy shouldn't fail in this way if the file already exists. I'm reaching into the API to do all of this so I'm not sure that it's worth the effort to alter that copy call so that it would overwrite existing files.
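
One possible workaround (a sketch, not something confirmed in this thread) is to clear the staging directory at the start of the task, so the copy never encounters a pre-existing file:

dockerFileTask := {
  val dockerDir = target.value / "docker"
  IO.delete(dockerDir)           // remove output from any previous run
  IO.createDirectory(dockerDir)  // recreate the empty staging directory
  // ... same staging logic as above ...
}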

@jakobmerrild

@colindean Thanks for this full example; it helped me get to this:

lazy val dockerFileTask = taskKey[Unit]("Prepare the dockerfile and needed files")

dockerFileTask := {
    val dockerDir = target.value / "docker"

    val dockerFile = (docker / dockerfile).value

    // docker / dockerfile may also return a native Dockerfile, which cannot
    // be passed to DefaultDockerfileProcessor
    if (dockerFile.isInstanceOf[sbtdocker.DockerfileLike]) {
      val stagedDockerfile =
        sbtdocker.staging.DefaultDockerfileProcessor(dockerFile.asInstanceOf[sbtdocker.DockerfileLike], dockerDir)
      IO.write(dockerDir / "Dockerfile", stagedDockerfile.instructionsString)
      stagedDockerfile.stageFiles.foreach { case (source, destination) =>
        // source.stage fails if the destination
        // is a non-empty directory.
        if (!destination.exists() || !destination.isDirectory()) {
          source.stage(destination)
        }
      }
    }
  }

As far as I can tell this works, and you don't have to define your docker / dockerfile more than once (assuming you don't use a NativeDockerfile, in which case you don't need this task anyway).
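
The isInstanceOf/asInstanceOf pair can also be written as a pattern match, which does the same thing a bit more idiomatically (a sketch, with dockerDir as defined in the task above):

(docker / dockerfile).value match {
  case df: sbtdocker.DockerfileLike =>
    val staged = sbtdocker.staging.DefaultDockerfileProcessor(df, dockerDir)
    IO.write(dockerDir / "Dockerfile", staged.instructionsString)
    staged.stageFiles.foreach { case (source, destination) =>
      // source.stage fails if the destination is a non-empty directory
      if (!destination.exists() || !destination.isDirectory()) source.stage(destination)
    }
  case _ =>
    // NativeDockerfile: nothing to stage here
}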
