Is there any way to provide specific configuration for a specific SparkApplication? I'm talking about the configuration parameters that are passed via --conf when calling spark-submit from the command line. Examples include:
spark.sql.parquet.filterPushdown=true
spark.sql.execution.arrow.enabled=true
spark.default.parallelism=<some_numeric_value>
etc.
I see it's possible at the SparkCluster level (see here), but what about the SparkApplication level?
I noticed there is a "sparkConfigMap" parameter at the SparkApplication level. Would this allow me to accomplish the above? I haven't found much information on how to use it, so if it is the right mechanism, I would appreciate an example.
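For what it's worth, several Spark operators expose a free-form key/value map on the SparkApplication spec for exactly this purpose. The sketch below is a hypothetical manifest assuming a `sparkConf`-style field and the `sparkoperator.k8s.io/v1beta2` API group used by the GoogleCloudPlatform spark-on-k8s-operator; the exact field name, API group, and version depend on which operator's CRD you are running, so check `kubectl explain sparkapplication.spec` against your cluster:

```yaml
# Hypothetical SparkApplication manifest -- field names and apiVersion
# are assumptions and vary by operator; verify against your CRD.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app            # placeholder name
spec:
  # Each entry here plays the role of one --conf key=value pair
  # that you would otherwise pass to spark-submit.
  sparkConf:
    "spark.sql.parquet.filterPushdown": "true"
    "spark.sql.execution.arrow.enabled": "true"
    "spark.default.parallelism": "200"   # placeholder numeric value
```

Note the distinction some operators draw between an inline key/value map like the one above and a `sparkConfigMap`-style field, which instead references an existing Kubernetes ConfigMap holding configuration files (e.g. a spark-defaults.conf) to be mounted into the driver and executors; which of the two your operator supports is worth confirming in its CRD reference.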