Nonseq #245
base: develop
Conversation
Updated function headers and type-hints.
- `DynapcnnNetwork._to_device()` now calls `DynapcnnLayer.to()` directly (no need to call `.to()` on each individual sub-layer within).
- `_to_device()` also calls `.to()` on the new `Merge()` layers created for the forward call (a sketch of this pattern follows the commit list below).
- The forward method works like a charm: with `DynapcnnNetwork(discretize=True)` the loss does not change during training.
- Extraction of the config dict of a `DynapcnnLayer` removed from the class and moved back to the `ConfigBuilder` class.
- Updated function headers/type hints/inline documentation.
- Refactored `ConfigBuilder.get_dynapcnn_layer_config_dict` to create a 'destinations' dict entry that handles setting of the destinations config object.
- Renamed `_forward_map` to `layers_mapper` in `DynapcnnNetwork` to better fit its role in network construction/chip deployment.
- `DynapcnnLayer` accessible from `dynapcnn/__init__`.
- Removed wild prints.
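As an illustration of the `_to_device()` pattern described above, here is a minimal sketch. It assumes `layers_mapper` is a dict (as suggested by the rename commit); `_merge_layers` is a hypothetical attribute name, not the PR's actual one:

```python
import torch

def _to_device(self, device: torch.device) -> None:
    # Move each DynapcnnLayer as a whole; DynapcnnLayer.to() handles its
    # internal sub-layers, so there is no need to call .to() on each of
    # them individually.
    for layer in self.layers_mapper.values():
        layer.to(device)
    # The Merge() layers created for the forward call must be moved too.
    for merge in self._merge_layers:  # hypothetical attribute name
        merge.to(device)
```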
```python
self.dvs_input = dvs_input
self.input_shape = input_shape

assert len(self.input_shape) == 3, "infer_input_shape did not return 3-tuple"
```
- Should not be an `assert`, but an if/else raising a `ValueError` (a sketch follows below).
- Where did the `infer_input_shape` part go? Is that not possible anymore?
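A minimal sketch of the suggested change, using the attribute from the diff above; the exact message text and the (channels, height, width) ordering are assumptions:

```python
if len(self.input_shape) != 3:
    # Unlike `assert`, this check survives `python -O` and raises a
    # proper exception type that callers can handle.
    raise ValueError(
        f"`input_shape` must be a 3-tuple (channels, height, width), "
        f"got {self.input_shape}"
    )
```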
…to gather network-level info to generate the arguments for a DynapcnnLayer instance
…ame a 'handler' class whose only job is to pre-process network-level data to generate args for a DynapcnnLayer instance (a hypothetical sketch of this handler pattern follows this list)
…nts of the network
…e DynapcnnLayers, but instead holding the DynapcnnLayerHandlers used to instantiate each layer
…from the instance if deployment is successful
…rk._layers_mapper
…forward function need validation
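Based on the commits above, the handler pattern might look roughly like this — a purely hypothetical sketch in which every attribute and method name is a guess, not the PR's actual API:

```python
class DynapcnnLayerHandler:
    """Pre-processes network-level data into args for one DynapcnnLayer.

    Hypothetical sketch: the real handler introduced in this PR may
    expose a different interface.
    """

    def __init__(self, conv_layer, spiking_layer, input_shape, destinations):
        self.conv_layer = conv_layer        # the layer's convolution
        self.spiking_layer = spiking_layer  # the layer's spiking neurons
        self.input_shape = input_shape      # shape of this layer's input
        self.destinations = destinations    # indices of downstream layers

    def get_layer_kwargs(self) -> dict:
        # Everything needed to instantiate the corresponding DynapcnnLayer.
        return {
            "conv": self.conv_layer,
            "spk": self.spiking_layer,
            "in_shape": self.input_shape,
        }
```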
This is not a full review, but I will finish it for now so the suggestions become visible
All unit tests passing on my machine 🍾
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           develop     #245      +/-   ##
===========================================
- Coverage    90.91%   88.83%    -2.08%
===========================================
  Files           52       58        +6
  Lines         2718     3664      +946
===========================================
+ Hits          2471     3255      +784
- Misses         247      409      +162
```

☔ View full report in Codecov by Sentry.
All checks passing on the CI now (with warnings from codecov - we should probably add more unit tests eventually).
I have extended the to-do list in the original post to be more specific and complete. It should now mention everything that still needs to be addressed. Please feel free to add and tick off items when appropriate.
Checklist before requesting a review
- Unit tests
- Documentation
- Review
- TODOs inside the code

What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)
New feature, including extensive changes in code, documentation and unit tests.
What is the current behavior? (You can also link to an open issue here)
So far, only sequential models can be deployed on DynapCNN chips.
What is the new behavior (if this is a feature change)?
Allow non-sequential SNNs (e.g. with residual or recurrent connections) to be deployed on DynapCNN chips (see the illustrative sketch below).
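To make this concrete, here is a hypothetical sketch of deploying a residual SNN; the `DynapcnnNetwork` arguments mirror those mentioned in this PR (`input_shape`, `dvs_input`, `discretize`), but the model itself is purely illustrative:

```python
import torch.nn as nn
from sinabs.layers import IAFSqueeze
from sinabs.backend.dynapcnn import DynapcnnNetwork

class ResidualSNN(nn.Module):
    """Illustrative SNN with a skip connection, i.e. a non-sequential graph."""

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(2, 8, kernel_size=3, padding=1, bias=False)
        self.spk1 = IAFSqueeze(batch_size=1)
        self.conv2 = nn.Conv2d(8, 8, kernel_size=3, padding=1, bias=False)
        self.spk2 = IAFSqueeze(batch_size=1)

    def forward(self, x):
        skip = self.spk1(self.conv1(x))
        out = self.conv2(skip)
        return self.spk2(out + skip)  # residual branch -> non-sequential

snn = ResidualSNN()
# With this PR, a graph like the above can (in principle) be deployed:
dynapcnn_net = DynapcnnNetwork(
    snn, input_shape=(2, 34, 34), dvs_input=False, discretize=True
)
```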
Does this PR introduce a breaking change? (What changes might users need to make in their application due to this PR?)
Potentially. Ideally for sequential SNNs there are no (or minimal) breaking changes, but this has to be assessed when reviewing this PR.
Other information:
The changes are quite extensive, so reviewing this will take some time.