Docusaurus: Documentation generation (Farfetch#314)
* docs: docusaurus setup

* chore: website deploy workflows

* chore: trigger on pull request

* chore: add yarn.lock
Gui ⚡️ Guilherme Ferreira authored Oct 31, 2022
1 parent 174eda9 commit b7732d5
Showing 35 changed files with 30,724 additions and 0 deletions.
32 changes: 32 additions & 0 deletions .github/workflows/deploy-website.yml
name: Deploy to GitHub Pages

on:
  release:
    types: [ published ]

jobs:
  deploy:
    name: Deploy to GitHub Pages
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./website
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v3
        with:
          node-version: 18
          cache: yarn
          cache-dependency-path: website/yarn.lock

      - name: Install dependencies
        run: yarn install --frozen-lockfile
      - name: Build website
        run: yarn build

      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./website/build
24 changes: 24 additions & 0 deletions .github/workflows/test-deploy-website.yml
name: Test deployment

on:
  pull_request:

jobs:
  test-deploy:
    name: Test deployment
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./website
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v3
        with:
          node-version: 18
          cache: yarn
          cache-dependency-path: website/yarn.lock

      - name: Install dependencies
        run: yarn install --frozen-lockfile
      - name: Test build website
        run: yarn build
20 changes: 20 additions & 0 deletions website/.gitignore
# Dependencies
/node_modules

# Production
/build

# Generated files
.docusaurus
.cache-loader

# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
41 changes: 41 additions & 0 deletions website/README.md
# Website

This website is built using [Docusaurus 2](https://docusaurus.io/), a modern static website generator.

### Installation

```
$ yarn
```

### Local Development

```
$ yarn start
```

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

### Build

```
$ yarn build
```

This command generates static content into the `build` directory, which can be served by any static content hosting service.

### Deployment

Using SSH:

```
$ USE_SSH=true yarn deploy
```

Not using SSH:

```
$ GIT_USER=<Your GitHub username> yarn deploy
```

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.
3 changes: 3 additions & 0 deletions website/babel.config.js
module.exports = {
presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
};
7 changes: 7 additions & 0 deletions website/docs/getting-started/_category_.json
{
  "label": "Getting Started",
  "position": 2,
  "link": {
    "type": "generated-index"
  }
}
245 changes: 245 additions & 0 deletions website/docs/getting-started/create-your-first-application.md
---
sidebar_position: 2
sidebar_label: Quickstart
---


# Quickstart: Create your first application with KafkaFlow

In this article, you will use C# and the .NET CLI to create two applications that produce and consume events from Apache Kafka.

By the end of the article, you will know how to use KafkaFlow to produce and consume events from Apache Kafka.


## Prerequisites

- [.NET 6.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/6.0)
- [Docker Desktop](https://www.docker.com/products/docker-desktop/)

## Overview

You will create two applications:

1. **Consumer:** runs continuously, waiting for incoming messages and writing them to the console.
2. **Producer:** sends a message each time you run it.

To connect them, you will run an Apache Kafka cluster using Docker.

## Steps

### 1. Create a folder for your applications

Create a new folder with the name _KafkaFlowQuickstart_.

### 2. Setup Apache Kafka

Inside the folder from step 1, create a `docker-compose.yml` file. You can download it from [here](../../../docker-compose.yml).
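For orientation, a minimal single-broker compose file might look like the sketch below. This is an illustrative assumption only (image names, versions, and listener settings are not taken from the repository's actual `docker-compose.yml`, which you should download from the link above):

```yaml
# Hypothetical minimal single-broker Kafka cluster for local development.
# The repository's actual docker-compose.yml may differ.
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.2.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.2.1
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"  # matches the "localhost:9092" broker address used later
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

Whatever file you use, the broker must be reachable at `localhost:9092`, since that is the address both applications are configured with.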

### 3. Start the cluster

Using your terminal of choice, start the cluster.

```bash
docker-compose up -d
```

### 4. Create Producer Project

Run the following command to create a Console Project named _Producer_.
```bash
dotnet new console --name Producer
```

### 5. Install KafkaFlow packages

Inside the _Producer_ project directory, run the following commands to install the required packages.

```bash
dotnet add package KafkaFlow
dotnet add package KafkaFlow.Microsoft.DependencyInjection
dotnet add package KafkaFlow.LogHandler.Console
dotnet add package KafkaFlow.TypedHandler
dotnet add package KafkaFlow.Serializer
dotnet add package KafkaFlow.Serializer.JsonCore
dotnet add package Microsoft.Extensions.DependencyInjection
```

### 6. Create the Message contract

Create a new class file named _HelloMessage.cs_ with the following content:

```csharp
namespace Producer;

public class HelloMessage
{
    public string Text { get; set; } = default!;
}
```

### 7. Create message sender

Replace the content of _Program.cs_ with the following example:

```csharp
using Microsoft.Extensions.DependencyInjection;
using KafkaFlow.Producers;
using KafkaFlow.Serializer;
using KafkaFlow;
using Producer;

var services = new ServiceCollection();

const string topicName = "sample-topic";
const string producerName = "say-hello";

services.AddKafka(
    kafka => kafka
        .UseConsoleLog()
        .AddCluster(
            cluster => cluster
                .WithBrokers(new[] { "localhost:9092" })
                .CreateTopicIfNotExists(topicName, 1, 1)
                .AddProducer(
                    producerName,
                    producer => producer
                        .DefaultTopic(topicName)
                        .AddMiddlewares(m =>
                            m.AddSerializer<JsonCoreSerializer>()
                        )
                )
        )
);

var serviceProvider = services.BuildServiceProvider();

var producer = serviceProvider
    .GetRequiredService<IProducerAccessor>()
    .GetProducer(producerName);

await producer.ProduceAsync(
    topicName,
    Guid.NewGuid().ToString(),
    new HelloMessage { Text = "Hello!" });

Console.WriteLine("Message sent!");
```


### 8. Create Consumer Project

Run the following command to create a Console Project named _Consumer_.
```bash
dotnet new console --name Consumer
```

### 9. Add a reference to the Producer

To access the message contract, add a reference to the Producer project.

Inside the _Consumer_ project directory, run the following command to add the reference.

```bash
dotnet add reference ../Producer
```

### 10. Install KafkaFlow packages

Inside the _Consumer_ project directory, run the following commands to install the required packages.

```bash
dotnet add package KafkaFlow
dotnet add package KafkaFlow.Microsoft.DependencyInjection
dotnet add package KafkaFlow.LogHandler.Console
dotnet add package KafkaFlow.TypedHandler
dotnet add package KafkaFlow.Serializer
dotnet add package KafkaFlow.Serializer.JsonCore
dotnet add package Microsoft.Extensions.DependencyInjection
```

### 11. Create a Message Handler

Create a new class file named _HelloMessageHandler.cs_ with the following content.

```csharp
using KafkaFlow;
using KafkaFlow.TypedHandler;
using Producer;

namespace Consumer;

public class HelloMessageHandler : IMessageHandler<HelloMessage>
{
    public Task Handle(IMessageContext context, HelloMessage message)
    {
        Console.WriteLine(
            "Partition: {0} | Offset: {1} | Message: {2}",
            context.ConsumerContext.Partition,
            context.ConsumerContext.Offset,
            message.Text);

        return Task.CompletedTask;
    }
}
```

### 12. Create the Message Consumer

Replace the content of _Program.cs_ with the following example.

```csharp
using KafkaFlow;
using KafkaFlow.Serializer;
using Microsoft.Extensions.DependencyInjection;
using KafkaFlow.TypedHandler;
using Consumer;

const string topicName = "sample-topic";
var services = new ServiceCollection();

services.AddKafka(kafka => kafka
    .UseConsoleLog()
    .AddCluster(cluster => cluster
        .WithBrokers(new[] { "localhost:9092" })
        .CreateTopicIfNotExists(topicName, 1, 1)
        .AddConsumer(consumer => consumer
            .Topic(topicName)
            .WithGroupId("sample-group")
            .WithBufferSize(100)
            .WithWorkersCount(10)
            .AddMiddlewares(middlewares => middlewares
                .AddSerializer<JsonCoreSerializer>()
                .AddTypedHandlers(h => h.AddHandler<HelloMessageHandler>())
            )
        )
    )
);

var serviceProvider = services.BuildServiceProvider();

var bus = serviceProvider.CreateKafkaBus();

await bus.StartAsync();

Console.ReadKey();

await bus.StopAsync();
```

### 13. Run!

From the `KafkaFlowQuickstart` directory:

1. Run the Consumer:

```bash
dotnet run --project Consumer/Consumer.csproj
```

2. From another terminal, run the Producer:

```bash
dotnet run --project Producer/Producer.csproj
```