
Define shared Params for Spark Transformers/Estimators #8

Open
j-coll opened this issue Nov 23, 2018 · 0 comments
j-coll (Member) commented Nov 23, 2018

Most of the Params used in our Transformers and Estimators will be shared among them. Spark ML solves this by defining small interfaces, each providing one shared Param.

See apache/spark/mllib/.../sharedParams.scala

The proposal is to build our own set of shared params, e.g.:

import org.apache.spark.ml.param.Param;
import org.apache.spark.ml.param.Params;

public interface HasStudyId extends Params {
    // Single place where the param name and documentation are defined.
    default Param<String> studyIdParam() {
        return new Param<>(this, "studyId", "Id of the study to be used.");
    }
    // Typed getter shared by every Transformer/Estimator implementing the interface.
    default String getStudyId() {
        return getOrDefault(studyIdParam());
    }
    // Fluent setter; returns this to allow chaining.
    default HasStudyId setStudyId(String studyId) {
        set(studyIdParam(), studyId);
        return this;
    }
}
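
For illustration, here is a minimal sketch of a Transformer mixing in the interface above. The class name StudyFilterTransformer, the uid prefix, and the studyId column used in the filter are hypothetical, not part of this proposal:

import java.util.UUID;
import org.apache.spark.ml.Transformer;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.types.StructType;

// Hypothetical stage: keeps only the rows belonging to the configured study.
public class StudyFilterTransformer extends Transformer implements HasStudyId {
    private final String uid = "studyFilter_" + UUID.randomUUID();

    @Override
    public Dataset<Row> transform(Dataset<?> dataset) {
        // getStudyId() is inherited from the shared interface.
        return dataset.filter(dataset.col("studyId").equalTo(getStudyId())).toDF();
    }

    @Override
    public StructType transformSchema(StructType schema) {
        return schema;
    }

    @Override
    public Transformer copy(ParamMap extra) {
        return defaultCopy(extra);
    }

    @Override
    public String uid() {
        return uid;
    }
}

With this in place the stage gets getStudyId()/setStudyId() for free, and any other Transformer or Estimator that needs the same param only has to add "implements HasStudyId".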
@j-coll j-coll added the spark-analysis New Spark analysis label Nov 23, 2018
@j-coll j-coll added this to the v1.0.0 milestone Nov 23, 2018
@j-coll j-coll self-assigned this Nov 23, 2018