Make sure your compiler supports C++17. Shkyera Grad is a header-only library, so simply include it in your project:
#include "include/ShkyeraGrad.hpp"
```

Check out the [examples](examples/README.md) for a quick start on Shkyera Grad. In the meantime, here's a neural network that learns the XOR function.

```cpp
#include "include/ShkyeraGrad.hpp"

int main() {
using namespace shkyera;

std::vector<Vec32> xs;
std::vector<Vec32> ys;

// ---------- INPUT ----------- | -------- OUTPUT --------- //
xs.push_back(Vec32::of({0, 0})); ys.push_back(Vec32::of({0}));
xs.push_back(Vec32::of({1, 0})); ys.push_back(Vec32::of({1}));
xs.push_back(Vec32::of({0, 1})); ys.push_back(Vec32::of({1}));
xs.push_back(Vec32::of({0, 0})); ys.push_back(Vec32::of({0}));

auto mlp = SequentialBuilder<Type::float32>::begin()
.add(Layer32::create(2, 15, Activation::relu<Type::float32>))
.add(Layer32::create(15, 5, Activation::relu<Type::float32>))
.add(Layer32::create(5, 1, Activation::sigmoid<Type::float32>))
.build();

Optimizer32 optimizer = Optimizer<Type::float32>(mlp->parameters(), 0.1);
Loss::Function32 lossFunction = Loss::MSE<Type::float32>;

// ------ TRAINING THE NETWORK ------- //
for (size_t epoch = 0; epoch < 100; epoch++) {
auto epochLoss = Val32::create(0);

optimizer.reset();
for (size_t sample = 0; sample < xs.size(); ++sample) {
Vec32 pred = mlp->forward(xs[sample]);
auto loss = lossFunction(pred, ys[sample]);

epochLoss = epochLoss + loss;
}
optimizer.step();

std::cout << "Epoch: " << epoch + 1 << " Loss: " << epochLoss->getValue() << std::endl;
}
}
```
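
Once training finishes, the trained `mlp` can be queried directly with `forward`. The snippet below is a minimal inference sketch, not part of the original example; it assumes `Vec32` elements can be indexed with `operator[]` and read with `getValue()`, mirroring how `epochLoss->getValue()` is read above. If training converged, the four predictions should approach 0, 1, 1, 0.

```cpp
// Minimal inference sketch (assumes Vec32::operator[] yields a value
// readable via getValue(), as with epochLoss above).
for (size_t sample = 0; sample < xs.size(); ++sample) {
    Vec32 pred = mlp->forward(xs[sample]);
    std::cout << "Sample " << sample << " -> prediction: "
              << pred[0]->getValue() << std::endl;
}
```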
