RheagalFire/Browserside-Inferencing
Browserside-Inferencing

ML models that run in your browser

  • The model file is taken from the Hugging Face model hub.
  • The model is exported to the ONNX format and executed in the browser through WebAssembly (WASM), so the model file can be run directly in the browser.
  • Since the model is fairly large, it takes some time for the CDN to deliver it to the browser. Once it is loaded, inference is fast compared to server-side inference.
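The flow described above can be sketched as a small page that pulls the ONNX model over a CDN and runs it locally with onnxruntime-web's WASM backend. This is a minimal illustration, not the repo's actual code: the model URL, input name, and tensor shape are hypothetical placeholders that depend on the exported model.

```html
<!-- Minimal sketch of browser-side ONNX inference with onnxruntime-web.
     Model URL, input name, and shape are hypothetical placeholders. -->
<script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js"></script>
<script>
  async function run() {
    // Fetch the ONNX model and create an inference session backed by
    // the WebAssembly (WASM) execution provider. This first download is
    // the slow part; subsequent runs stay entirely in the browser.
    const session = await ort.InferenceSession.create(
      'https://example.com/model.onnx',          // hypothetical model URL
      { executionProviders: ['wasm'] }
    );

    // Build an input tensor; the name and shape depend on how the
    // model was exported.
    const input = new ort.Tensor('float32', new Float32Array(10), [1, 10]);

    // Run inference locally; no server round-trip is involved.
    const results = await session.run({ input: input });
    console.log(results);
  }
  run();
</script>
```

Because the session is created once and cached, only the initial CDN download is slow; every call to `session.run` afterwards executes locally.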

