Umbrella issue for Chronos refactor to enable customized installation #3170
So you have three options:
Yep, that looks good.
Note that for the first release, we will support other install options after the corresponding code is ready.
Shall we also consider dependencies like TensorFlow and PyTorch? Chronos contains both TensorFlow models and PyTorch models, and we might add more TensorFlow models in the future. I extended the table above and detailed all Chronos components and their corresponding dependencies on Orca, Ray, TensorFlow, and PyTorch.
Therefore, with TensorFlow and PyTorch considered, the dependency options might be
In the future, we may support distributed training or tuning with TensorFlow-based models; then we may add:
Small update: the detector also relies on some PyTorch models.
Separating the PyTorch and TensorFlow installations seems a valid request only for inference, where size is a concern. And it seems to me the option should go with `nano` instead of `chronos` (e.g. separate lib installation such as pytorch-lightning, IPEX, intel-tensorflow, etc.). So the install options can be simplified to `train`/`nano[pytorch]`/`nano[tensorflow]`. Further, as our PyTorch support is much better and the TensorFlow layer is thin (correct me if I'm wrong), we can simplify it to
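One way a `train`/`nano[pytorch]`/`nano[tensorflow]`-style split could be expressed is via setuptools extras. The sketch below uses hypothetical option names and dependency lists, not the actual BigDL packaging:

```python
# Hypothetical sketch of the proposed install options as setuptools
# "extras_require" entries. All package names below are illustrative,
# not the real bigdl-chronos dependency lists.
BASE_DEPS = ["pandas", "scikit-learn"]  # assumed minimal single-node core

EXTRAS = {
    # single-node PyTorch stack, accelerated via nano
    "pytorch": ["bigdl-nano[pytorch]"],
    # thin TensorFlow layer, kept separate so users don't pull both frameworks
    "tensorflow": ["bigdl-nano[tensorflow]"],
    # full option: adds the distributed (orca/ray) dependencies on top
    "all": ["bigdl-nano[pytorch]", "bigdl-orca", "ray"],
}

# In a real setup.py this would be wired up roughly as:
# setup(name="bigdl-chronos", install_requires=BASE_DEPS, extras_require=EXTRAS)
```

Users would then pick an option with e.g. `pip install bigdl-chronos[pytorch]`.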
We should expect new customers to use our upcoming release. What's the target usage for the versions with and without the extra dependencies? We might need to consider our target usage when defining the options and which modules to include.
How about:
I prefer to simplify them to 3 options.
And once our TF support is enhanced, we may change the options. Currently, if users want to use TF, we can let them install it themselves. The other reason is that TF1 conflicts with pytorch-lightning on some dependencies, which may cause "dependency hell". `nano+orca+pytorch` will only support 2 more features than `nano+pytorch`.
The default should be distributed; a light version can be single node only.
Preprocessing is available since TSDataset is based on single-node pandas, while only XShardsTSDataset is based on orca.data. Still, we should not have this nearly-useless option for this release. I am not sure whether some of our customers require a lighter version for this release? And of course we always have the nightly-built version later.
For this release, the default install covers the single-node functionality. With the extra `[all]` option, users can enable the distributed functions, including distributed tuning with AutoTS, distributed forecasting, and distributed datasets.
Confirmed, thx. This will also be reflected in our user guide.
This issue is a detailed plan to realize issue intel-analytics/analytics-zoo#107 and to decouple the major functions of Chronos from other BigDL components.
@shane-huang @yushan111
Overall design strategy [edited after discussion with Jason]

- `Chronos` should stick to `nano` for single-node acceleration when appropriate, and be self-contained and able to complete most of its functionalities without any other dependencies (`ray`/`orca`).
- `Chronos` will rely on `orca`/`ray` for functionalities in a distributed fashion (will be reflected in the following table).
- `Chronos` will contain a light-weight inference installation strategy (maybe not a new whl).
- `tensorflow` is not my first priority since TF2 is Intel's AI strategy while `Chronos` has no TF2 model right now.

Here is a full functionality table (will be updated)
[Functionality table lost in extraction; recoverable column headers: `Chronos`, `nano`, `orca`. Install commands referenced: `pip install bigdl-chronos`, `pip install bigdl-nano`, `pip install bigdl-chronos[all]`.]
* F/D/S means Forecaster/Detector/Simulators
** These two versions can be used as a light-weight inference install strategy.
To complete this, we mainly need the following steps (which can be done simultaneously):