Replies: 4 comments 1 reply
-
Hi,
If you share how you launch/start your SparkSession, I can check whether it is the default Apache Spark configs (such as the default spark.driver.memory) that are causing this.
-
Thank you for your response. I have pasted my SparkSession config below:

SparkConf sparkConfig = new SparkConf();

The problem occurred with the default configuration. We also set the memory config parameter, but we are still facing the same issue: the application consumes all available memory (RAM).
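For illustration, setting the memory config parameter on the SparkConf would look roughly like the sketch below. The exact parameters and values used in this setup were not preserved in the post; the app name, master, and 4g figures are placeholders.

import org.apache.spark.SparkConf;

// Sketch only: spark.driver.memory and spark.executor.memory are standard
// Spark configs; the "4g" values are placeholders, not the values actually used.
SparkConf sparkConfig = new SparkConf()
        .setAppName("t5-memory-test")        // hypothetical app name
        .setMaster("local[*]")               // assumed local mode
        .set("spark.driver.memory", "4g")    // placeholder value
        .set("spark.executor.memory", "4g"); // placeholder value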
-
Hello @maziyarpanahi, I am still waiting for your suggestion on the above configuration.
-
Spark NLP does not control, nor even has a way to access, the available memory. If you want your Spark application to consume up to a certain amount of memory, you must set it explicitly:
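A minimal sketch of what that explicit setting could look like when building the SparkSession from code (the original snippet was not preserved here; the app name and 6g values are illustrative, not recommendations):

import org.apache.spark.sql.SparkSession;

// Cap the application's memory explicitly instead of relying on defaults.
SparkSession spark = SparkSession.builder()
        .appName("my-spark-nlp-app")            // hypothetical name
        .master("local[*]")                     // assumed local mode
        .config("spark.driver.memory", "6g")    // illustrative value
        .config("spark.executor.memory", "6g")  // illustrative value
        .getOrCreate();

Note that in local mode the driver JVM is already running by the time this code executes, so spark.driver.memory is usually passed at launch instead, e.g. spark-submit --driver-memory 6g, or an -Xmx flag on a plain java invocation.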
Apart from this, the rest is between the JVM and Apache Spark; I believe you can get better answers by looking into the Spark configs and asking on the Spark dev/user mailing lists.
-
Hello @maziyarpanahi
Actually, I am trying to load the t5_base model in Java on 2 machine configurations (8 GB and 16 GB of RAM). The model is ~950 MB in size.
Problem
I have kept all configurations the same on both instances (8 GB and 16 GB); the JVM -Xmx and -Xms settings are identical on both.
But on the 8 GB instance the model loads using under 7.5 GB of RAM, while on the 16 GB instance the same model with the same code takes around 13 GB of RAM.
Because of this, I cannot be sure whether it will work on a client machine or crash.
Is there any special configuration I need to set to limit its RAM usage?
Does it use RAM in proportion to what is available?
Spark Version: 3.4.1
Spark NLP Version: 4.3.2
import com.johnsnowlabs.nlp.annotators.seq2seq.T5Transformer;

String modelPath = "t5base model dir path"; // placeholder for the local model directory
T5Transformer model = (T5Transformer) T5Transformer.load(modelPath);
Please help, thanks