Dockerization of Ersilia CLI #723
miquelduranfrigola asked this question in Ideas
Background
The Ersilia CLI has many dependencies, most of them inherited from BentoML. BentoML evolves very quickly and we are experiencing severe problems when new versions of BentoML are released. In particular, there are issues related to SQLAlchemy that are extremely difficult to maintain. Long story short: we decided to fork BentoML at version `0.11.0`. Now, when Ersilia is installed (see the setup.py file), we install the version of BentoML maintained by Ersilia. In my opinion, this solution is only provisional and we should find a more durable alternative. This is especially critical for Mac users on M1 & M2 chips.
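For context, pinning a fork in setup.py typically looks like the sketch below. The repository URL and tag here are placeholders, not Ersilia's actual ones; see the real setup.py for the exact pin.

```python
# setup.py (sketch) -- pinning a forked dependency via a PEP 508 direct reference.
# The repository URL and tag are hypothetical placeholders.
from setuptools import find_packages, setup

setup(
    name="ersilia",
    packages=find_packages(),
    install_requires=[
        # Install BentoML from a fork, frozen at a fixed tag
        "bentoml @ git+https://github.com/example-org/BentoML.git@v0.11.0",
    ],
)
```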
What do we need
It would be great to have a version of the Ersilia CLI that has essentially no dependencies (almost pure Python). Let's call it the Ersilia Client for the purposes of this discussion. This is a long shot, but here are two possible approaches:
A. Dockerize the current Ersilia CLI tool and run it as a master container
In this case, we would have a dockerized version of Ersilia, exposing a REST API with `fetch`, `catalog`, `serve`, `close`, `run`... methods. Then, the Ersilia Client can basically interact with the Ersilia container, which will in turn interact with one (or multiple) model containers. A sketch of such an API is given after the pros and cons below.
👍 Pros: Every time we update the Ersilia CLI, we don't have to implement the change in every model.
👎 Cons: We will always need at least 2 Docker containers to run Ersilia (the CLI + the model). The design is also somewhat odd, since Ersilia ends up installed both inside the model container and outside of it.
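To make option A more concrete, here is a minimal sketch of what the master container's API could look like. This is illustrative only, not Ersilia's actual interface: the endpoint shapes, the `served` registry, and the placeholder model URL are all assumptions. Endpoints simply mirror the existing CLI commands.

```python
# Sketch of option A's "master container" REST API (illustrative only).
# Endpoints mirror the CLI commands; the real implementation would call
# into the Ersilia CLI internals instead of returning placeholders.
from fastapi import FastAPI

app = FastAPI()
served = {}  # model_id -> handle/URL of the running model container


@app.post("/fetch/{model_id}")
def fetch(model_id: str):
    # Would pull the model, as `ersilia fetch <model_id>` does today.
    return {"fetched": model_id}


@app.get("/catalog")
def catalog():
    # Would list models, as `ersilia catalog` does today.
    return {"models": list(served)}


@app.post("/serve/{model_id}")
def serve(model_id: str):
    # Would start the model container and remember its handle.
    served[model_id] = f"http://model-{model_id}:80"  # placeholder
    return {"serving": model_id}


@app.post("/run/{model_id}")
def run(model_id: str, inputs: list[str]):
    # Would forward inputs to the served model container and relay outputs.
    return {"model": model_id, "outputs": [None for _ in inputs]}


@app.post("/close/{model_id}")
def close(model_id: str):
    # Would stop the model container, as `ersilia close` does today.
    served.pop(model_id, None)
    return {"closed": model_id}
```

With something like this, the Ersilia container would be started once (e.g. with a single `docker run` exposing the API port) and the Ersilia Client would only ever talk HTTP to it.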
B. Serve models at an outer level, including input and output processing as part of the run procedure
In this case, a REST API encompassing input processing, model run, and output processing would be developed for each model. This would bundle all Ersilia CLI functionalities on a per-model basis, so that interaction with the model then becomes trivial with the Ersilia Client (see the sketch after the pros and cons below).
👍 Pros: Each model can be run in a standalone form.
👎 Cons: Currently, dockerization of models does not allow for this feature, since dockerization happens at an inner level (i.e. at the serve level). Therefore, we would have to develop two variants of ersilia serve: one that encompasses Ersilia input processing, serving, and output processing, and one that only handles the serving. In practical terms, this means that the BentoML logic is lost. In addition, every time we make a major change to Ersilia, we will have to update all models.
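For comparison, option B would put the whole pipeline behind a single per-model endpoint. The helper names below (`process_input`, `run_model`, `process_output`) are hypothetical stand-ins for Ersilia's input/output adapters and the model itself, just to show the shape of the API:

```python
# Sketch of option B: one self-contained REST API per model, bundling
# input processing, model execution, and output processing.
from fastapi import FastAPI

app = FastAPI()


def process_input(x: str) -> str:
    # Hypothetical stand-in for Ersilia's input standardization.
    return x.strip()


def run_model(inputs: list[str]) -> list[float]:
    # Hypothetical stand-in for the actual model call.
    return [0.0 for _ in inputs]


def process_output(y: float) -> dict:
    # Hypothetical stand-in for Ersilia's output schema.
    return {"value": y}


@app.post("/run")
def run(raw_inputs: list[str]):
    standardized = [process_input(x) for x in raw_inputs]
    raw_outputs = run_model(standardized)
    return [process_output(y) for y in raw_outputs]
```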
Currently chosen strategy: A
As of now, in my opinion, option A will be the easiest, both in terms of implementation and maintenance.
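One practical consequence of option A is that the Ersilia Client really can stay dependency-free, since it only needs to speak HTTP to the master container. A minimal sketch using only the standard library (host, port, and endpoint paths are assumptions matching the option A sketch above):

```python
# Dependency-free Ersilia Client sketch (standard library only).
# Host, port, and endpoint paths are illustrative assumptions.
import json
import urllib.request

ERSILIA_API = "http://localhost:8000"


def run(model_id: str, inputs: list[str]) -> dict:
    # POST the inputs to the master container's run endpoint.
    req = urllib.request.Request(
        f"{ERSILIA_API}/run/{model_id}",
        data=json.dumps(inputs).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```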