Replies: 4 comments 6 replies
-
Hi. Storing a tensor or a network is not as simple a task as it seems, since the internal number formats of these objects are opaque; even vectors can have strides, etc. You cannot transfer directly to a file channel, but you can map memory to a file channel. Please see this function: https://github.com/uncomplicate/neanderthal/blob/e4a03ddca20261bf7718b7bcabc66e7ef48ee072/src/clojure/uncomplicate/neanderthal/native.clj#L405

The idea is to memory-map an object (it's best to start with vectors to see how it works), so that changes to that vector are tied to your disk. Ideally that vector is in a format that is simple and universally readable on other machines, but it doesn't have to be. Then you either work with it directly (and your changes are saved), or you transfer to/from other, more complex objects when you want to load/save them from disk.

For tensors, you can use map-tensor to load/save from disk; this function is used for loading the MNIST data, for example. Networks can be transferred to channels automatically, but you have to provide a file channel. As far as I remember, neither network nor tensor implements a .getChannel method.
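Roughly, that workflow could look something like the sketch below. This is not tested code: the uncomplicate.diamond.tensor namespace for map-tensor, the channel-to-network direction of transfer!, the :read-write flag, map-tensor's argument order, and the file path are all assumptions here, so check the linked native.clj and the map-tensor docstring in your Deep Diamond version for the exact signatures.

```clojure
(require '[uncomplicate.neanderthal.core :refer [transfer!]]
         '[uncomplicate.diamond.tensor :refer [map-tensor]])

(import 'java.io.RandomAccessFile)

;; `net` stands for a trained Deep Diamond network (e.g. the one from the
;; question below), assumed to be already defined.

;; Open (or create) the backing file and obtain its FileChannel.
(def weights-file (RandomAccessFile. "data/network-weights.bin" "rw"))
(def weights-channel (.getChannel weights-file))

;; Save: transfer the network's parameters into the file channel.
(transfer! net weights-channel)

;; Load: transfer the stored parameters back into a freshly initialized
;; network built from the same blueprint (assumed to work in this direction).
(transfer! weights-channel net)

;; Tensors can instead be memory-mapped directly, so that changes to the
;; mapped tensor live on disk (this is how the MNIST files are loaded).
;; The argument order (file, shape, data type, layout, access flag) is an
;; assumption; consult map-tensor's docstring.
(def mapped-tz (map-tensor weights-file [512 1 28 28] :float :nchw :read-write))
```

Since the file only holds raw bytes, loading on another machine presumably requires recreating the same blueprint (shapes, data types, layout) before transferring the data back.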
-
Thanks for reporting. I'll investigate this after the weekend. It might be a bug introduced during the port to JavaCPP, but I'm not sure without trying it out myself.
-
BTW, the book code uses an older version of Deep Diamond, so it shouldn't have been affected by the port to JavaCPP...
-
Can you please let me know which chapter files work as-is, and which ones require changes to work with the new version? Most of it should be backward compatible, but some parts might need changes.
-
I'm reading the book DLFP and just finished the Tensors section, and I'm learning quite a lot.
The only thing I can't find in the book is a simple way, using the Deep Diamond high-level API, to store the trained network and reload it later.
Could you suggest an approach?
I tried transfer! to a FileChannel, but it doesn't work:
```clojure
(def network-blueprint (network (desc [512 1 28 28] :float :nchw)
                                [(fully-connected [512] :relu)
                                 (fully-connected [10] :sigmoid)]))

(def net (init! (network-blueprint :adam)))

(train! net x-train y-train :crossentropy 30 [])

(defonce network-trained (random-access "C:/workspace/dlfp-code-0.34.0-2.0/dlfp/data/mnist/network.trained"))

(transfer! net (.getChannel network-trained))
```

```
Execution error (IllegalArgumentException) at uncomplicate.commons.core/eval2100$fn$G (core.clj:116).
No implementation of method: :bytesize* of protocol: #'uncomplicate.commons.core/Bytes found for class: uncomplicate.diamond.internal.dnnl.tensor.DnnlTensor
```
Thanks