////
START GENERATED ATTRIBUTES
WARNING: This content is generated by running npm --prefix .build run generate:attributes
////

//All OpenShift Application Services
:org-name: Application Services
:product-long-rhoas: OpenShift Application Services
:community:
:imagesdir: ./images
:property-file-name: app-services.properties
:samples-git-repo: https://github.com/redhat-developer/app-services-guides
:base-url: https://github.com/redhat-developer/app-services-guides/tree/main/docs/
:sso-token-url: https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
:installation-guide-url-cli: rhoas/rhoas-cli-installation/README.adoc
:service-contexts-url-cli: rhoas/rhoas-service-contexts/README.adoc

//OpenShift Streams for Apache Kafka
:product-long-kafka: OpenShift Streams for Apache Kafka
:product-kafka: Streams for Apache Kafka
:product-version-kafka: 1
:service-url-kafka: https://console.redhat.com/application-services/streams/
:getting-started-url-kafka: kafka/getting-started-kafka/README.adoc
:kafka-bin-scripts-url-kafka: kafka/kafka-bin-scripts-kafka/README.adoc
:kafkacat-url-kafka: kafka/kcat-kafka/README.adoc
:quarkus-url-kafka: kafka/quarkus-kafka/README.adoc
:nodejs-url-kafka: kafka/nodejs-kafka/README.adoc
:getting-started-rhoas-cli-url-kafka: kafka/rhoas-cli-getting-started-kafka/README.adoc
:topic-config-url-kafka: kafka/topic-configuration-kafka/README.adoc
:consumer-config-url-kafka: kafka/consumer-configuration-kafka/README.adoc
:access-mgmt-url-kafka: kafka/access-mgmt-kafka/README.adoc
:metrics-monitoring-url-kafka: kafka/metrics-monitoring-kafka/README.adoc
:service-binding-url-kafka: kafka/service-binding-kafka/README.adoc
:message-browsing-url-kafka: kafka/message-browsing-kafka/README.adoc

//OpenShift Service Registry
:product-long-registry: OpenShift Service Registry
:product-registry: Service Registry
:registry: Service Registry
:product-version-registry: 1
:service-url-registry: https://console.redhat.com/application-services/service-registry/
:getting-started-url-registry: registry/getting-started-registry/README.adoc
:quarkus-url-registry: registry/quarkus-registry/README.adoc
:getting-started-rhoas-cli-url-registry: registry/rhoas-cli-getting-started-registry/README.adoc
:access-mgmt-url-registry: registry/access-mgmt-registry/README.adoc
:content-rules-registry: https://access.redhat.com/documentation/en-us/red_hat_openshift_service_registry/1/guide/9b0fdf14-f0d6-4d7f-8637-3ac9e2069817[Supported Service Registry content and rules]
:service-binding-url-registry: registry/service-binding-registry/README.adoc

//OpenShift Connectors
:connectors: Connectors
:product-long-connectors: OpenShift Connectors
:product-connectors: Connectors
:product-version-connectors: 1
:service-url-connectors: https://console.redhat.com/application-services/connectors
:getting-started-url-connectors: connectors/getting-started-connectors/README.adoc
:getting-started-rhoas-cli-url-connectors: connectors/rhoas-cli-getting-started-connectors/README.adoc

//OpenShift API Designer
:product-long-api-designer: OpenShift API Designer
:product-api-designer: API Designer
:product-version-api-designer: 1
:service-url-api-designer: https://console.redhat.com/application-services/api-designer/
:getting-started-url-api-designer: api-designer/getting-started-api-designer/README.adoc

//OpenShift API Management
:product-long-api-management: OpenShift API Management
:product-api-management: API Management
:product-version-api-management: 1
:service-url-api-management: https://console.redhat.com/application-services/api-management/

////
END GENERATED ATTRIBUTES
////

[id="chap-helm-guide-rhoas-cli"]
= Configuring OpenShift Application Services in Helm
ifdef::context[:parent-context: {context}]
:context: helm-guide-rhoas-cli

// Purpose statement for the assembly
[role="_abstract"]
This guide explains how to configure Application Services in your Helm deployment. For more information about Helm, see the https://helm.sh/docs/[official Helm documentation^].

An application can be deployed on a Kubernetes platform such as Red Hat OpenShift in various ways, for example as a plain Kubernetes Deployment or packaged as a Helm chart. This guide walks through how to connect the application to Application Services in both scenarios.

To connect an application deployed in OpenShift to Application Services, you need to supply connection configuration and credentials. The `rhoas` CLI simplifies this task by generating the connection configuration and credentials as Kubernetes objects.

.Prerequisites
ifndef::community[]
* You have a Red Hat account.
endif::[]
* You have a running Kafka instance in {product-kafka}.
* You've installed the latest version of the `rhoas` CLI. See {base-url}{installation-guide-url-cli}[Installing and configuring the rhoas CLI^].
* You've installed the https://helm.sh/docs/intro/quickstart/[Helm CLI^] version 3.9.0 or later.
* You've installed the https://docs.openshift.com/container-platform/4.7/cli_reference/openshift_cli/getting-started-cli.html[OpenShift CLI^] version 4.8.5 or later.

[id="proc-generating-configmap-for-application-services_{context}"]
== Generating a ConfigMap for Application Services instances

The application deployed on the Kubernetes platform needs to connect to the Kafka instance. To achieve this, you supply the URL of the Kafka instance. Because the URLs are not confidential data, they can be supplied as a Kubernetes ConfigMap. The Helm chart can then reference the ConfigMap deployed in the Kubernetes cluster, which connects the application to the Kafka instance.

The `rhoas` CLI implements this using service contexts. In OpenShift Application Services, a service context is a defined set of instances running in Application Services, such as OpenShift Streams for Apache Kafka and OpenShift Service Registry. You might create different contexts for specific use cases, projects, or environments. To learn more about contexts, see https://access.redhat.com/documentation/en-us/red_hat_openshift_application_services/1/guide/12b72a70-22b9-44a4-a7f3-6977759bfc67[Connecting client applications to Red Hat OpenShift Application Services using the rhoas CLI^].

.Procedure

. Confirm that the Application Services instances are set in the current context and are running.
+
[source,shell]
----
$ rhoas context status
----

. Generate a YAML file that contains the connection configuration as a Kubernetes ConfigMap object.
+
[source,shell]
----
$ rhoas generate-config --type configmap --output-file ./rhoas-services.yaml
----
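+
For reference, the generated file contains a ConfigMap whose data keys depend on the service instances in your context. The following is a minimal sketch, using the ConfigMap name and keys referenced later in this guide; the values are placeholders:
+
[source,yaml]
----
apiVersion: v1
kind: ConfigMap
metadata:
  name: my-context-3-configuration # name derived from your service context
data:
  kafka_host: <kafka-bootstrap-server-host:port>
  service_registry_url: <service-registry-core-api-url>
----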
+
. Apply the generated ConfigMap file.
+
[source,shell]
----
$ oc apply -f ./rhoas-services.yaml
----
+
. Consume the ConfigMap in a Kubernetes Deployment.
+
An application deployed as a Kubernetes Deployment can have the ConfigMap values injected as environment variables.
+
[source,yaml]
----
env:
  - name: KAFKA_HOST
    valueFrom:
      configMapKeyRef:
        name: my-context-3-configuration
        key: kafka_host
  - name: SERVICE_REGISTRY_URL
    valueFrom:
      configMapKeyRef:
        name: my-context-3-configuration
        key: service_registry_url
----
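+
For orientation, these `env` entries sit under the container spec of the Deployment. The following is a minimal, illustrative sketch; the Deployment name and image are placeholders, not part of the generated configuration:
+
[source,yaml]
----
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app # placeholder name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: quay.io/example/my-app:latest # placeholder image
          env:
            - name: KAFKA_HOST
              valueFrom:
                configMapKeyRef:
                  name: my-context-3-configuration
                  key: kafka_host
----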
. Consume the ConfigMap in a Helm chart.
+
Helm charts support templating, which means that you can pass the ConfigMap name into a template as a value. This enables you to quickly deploy applications to Kubernetes clusters. To do this, you need a Helm chart, a Kubernetes ConfigMap, and a Kubernetes Secret. The Helm chart needs to follow the naming conventions used by the `rhoas` CLI.
+
[source,yaml]
----
env:
  - name: KAFKA_HOST
    valueFrom:
      configMapKeyRef:
        name: {{ .Values.rhoas.config }}
        key: kafka_host
  - name: SERVICE_REGISTRY_URL
    valueFrom:
      configMapKeyRef:
        name: {{ .Values.rhoas.config }}
        key: service_registry_url
----
+
Helm renders the value for the ConfigMap name and generates the manifest file for the deployment.
The values can come from the chart's `values.yaml` file, which is native to Helm, or from a different file passed with the `--values` flag:
+
[source,shell]
----
$ helm install . --generate-name --values my-values.yaml
----
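+
A minimal `my-values.yaml` for this approach could look like the following sketch; the ConfigMap name is the one used elsewhere in this guide:
+
[source,yaml]
----
rhoas:
  config: my-context-3-configuration
----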
+
The values can also be set without a file by using the `--set-string` flag:
+
[source,shell]
----
$ helm install . --generate-name --set-string rhoas.config=my-context-3-configuration
----

[id="proc-generating-credentials-for-application-services_{context}"]
== Generating credentials for Application Services instances

Applications use service accounts to authenticate with the Kafka and Service Registry instances. The `rhoas` CLI can create a service account and save its credentials as a Kubernetes Secret object.

.Procedure

. Create a service account and save the credentials in a Kubernetes secret object.
+
[source,shell]
----
$ rhoas service-account create --file-format secret --output-file ./rhoas-secrets.yaml
----

. Store the Secret object securely.
+
Unlike ConfigMap objects, which contain only the URLs used to connect to the instances, Secrets must be stored securely so that the credentials are not exposed.
+
You can store the generated Secret securely using one of the following:

* The `helm-secrets` plug-in
* Encrypting the Secret and pushing it with the Helm chart
* A secret-management solution such as HashiCorp Vault
+
For this example, we base64-encode the credential values, as required by the `data` field of a Kubernetes Secret. Run the following commands to obtain the base64-encoded values of the credentials.
+
[source,shell]
----
$ echo -n <client-id> | base64
----
+
[source,shell]
----
$ echo -n <client-secret> | base64
----
+
. Copy the base64-encoded values and paste them into the `rhoas-secrets.yaml` file.
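+
The edited file then resembles the following sketch. The exact layout depends on the file that the `rhoas` CLI generated for your service account; the Secret name and keys shown here match those used later in this guide:
+
[source,yaml]
----
apiVersion: v1
kind: Secret
metadata:
  name: service-account-credentials
type: Opaque
data:
  RHOAS_SERVICE_ACCOUNT_CLIENT_ID: <base64-encoded-client-id>
  RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET: <base64-encoded-client-secret>
----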
. Apply the modified secret file.
+
[source,shell]
----
$ oc apply -f ./rhoas-secrets.yaml
----
+
. Consume the Secret in a Kubernetes Deployment.
+
The Secret values can be injected as environment variables in the Deployment.
+
[source,yaml]
----
env:
  - name: RHOAS_SERVICE_ACCOUNT_CLIENT_ID
    valueFrom:
      secretKeyRef:
        name: service-account-credentials
        key: RHOAS_SERVICE_ACCOUNT_CLIENT_ID
  - name: RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET
    valueFrom:
      secretKeyRef:
        name: service-account-credentials
        key: RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET
----
. Consume the Secret in a Helm chart.
+
Similar to ConfigMaps, Secret names can be passed to Helm templates. Here too, the Helm chart needs to follow the naming conventions used by the `rhoas` CLI.
+
[source,yaml]
----
env:
  - name: RHOAS_SERVICE_ACCOUNT_CLIENT_ID
    valueFrom:
      secretKeyRef:
        name: {{ .Values.rhoas.secret }}
        key: RHOAS_SERVICE_ACCOUNT_CLIENT_ID
  - name: RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET
    valueFrom:
      secretKeyRef:
        name: {{ .Values.rhoas.secret }}
        key: RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET
----
+
As with ConfigMaps, the value can be supplied using the `--set-string` flag:
+
[source,shell]
----
$ helm install . --generate-name --set-string rhoas.secret=service-account-credentials
----
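+
Equivalently, both template values can be declared together in a values file instead of being passed with `--set-string`; the names shown are the ones used elsewhere in this guide:
+
[source,yaml]
----
rhoas:
  config: my-context-3-configuration
  secret: service-account-credentials
----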


[id="proc-connecting-helm-chart-with-application-services_{context}"]
== Connecting a Helm chart to Application Services

The following example shows how to connect an https://github.com/rkpattnaik780/rhoas-helm-example[example Helm chart^] to the service instances defined in a context in OpenShift Application Services. The example Helm chart contains the Kubernetes resources required to deploy the application used in the https://github.com/redhat-developer/app-services-guides/tree/main/code-examples/quarkus-kafka-quickstart[Quarkus Kafka quickstart^] in a Kubernetes cluster.

Once you have the files for the connection configuration and credentials, you can use them with an example Helm chart that consumes those values, as the following procedure shows.

.Procedure

. On the command line, clone the example repository from GitHub.
+
[source,shell]
----
$ git clone https://github.com/rkpattnaik780/rhoas-helm-example.git
----

. Log in to the CLI.
+
[source,shell]
----
$ rhoas login
----
+
The login command opens a sign-in process in your web browser.

. Use the CLI to create a new service context.
+
[source,shell]
----
$ rhoas context create --name helm-context
----
+
The new context becomes the current (that is, active) context by default.

. Create a Kafka instance in the current context.
+
[source,shell]
----
$ rhoas kafka create --name my-kafka-instance
----

. Generate connection configuration for the context as a ConfigMap.
+
[source,shell]
----
$ rhoas generate-config --type configmap --output-file ./rhoas-services.yaml
----

. Create a topic `prices` for the Kafka instance.
+
[source,shell]
----
$ rhoas kafka topic create --name prices
----

. Grant the service account permission to produce and consume messages on topics in the created Kafka instance, replacing `<client-id>` with the client ID of the service account created earlier.
+
[source,shell]
----
$ rhoas kafka acl grant-access --producer --consumer --service-account <client-id> --topic all --group all
----

. Create an OpenShift cluster using the Red Hat Developer Sandbox. Once the cluster is active, copy the command to log in with the OpenShift CLI.
+
[source,shell]
----
$ oc login --token=sha256~WMj84YiOuzVTUp7dIYajetZM2FG-rTAIEJrriPTQJpo --server=https://api.sandbox-m2.ll6k.p8.openshiftapps.com:6443
----

. Apply the generated connection configuration and credentials to the OpenShift cluster.
+
[source,shell]
----
$ oc apply -f ./rhoas-services.yaml
----
+
[source,shell]
----
$ oc apply -f ./rhoas-secrets.yaml
----

. Deploy the Helm chart, setting the appropriate values:
+
[source,shell]
----
$ helm install . --generate-name --set-string rhoas.config=my-context-3-configuration,rhoas.secret=service-account-credentials
----

. Get the URL of the deployed service.
+
[source,shell]
----
$ oc get service
NAME                                TYPE           CLUSTER-IP       EXTERNAL-IP                                                                PORT(S)          AGE
rhoas-quarkus-kafka-quickstart      LoadBalancer   172.30.128.12    a81b115a35629488685b6ed3cf322fbf-1904626303.us-east-2.elb.amazonaws.com   8080:31110/TCP   11m
workspacef396ea393cc746aa-service   ClusterIP      172.30.133.190   <none>                                                                     4444/TCP         5d18h
----

The Quarkus application is now up and running in the OpenShift cluster. To check the application in your browser, navigate to the following URL, replacing `<external-IP>` with the external IP shown in the previous step:

----
<external-IP>:8080/prices.html
----
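
For example, you can fetch the page with `curl`, substituting the external IP from the previous step:

[source,shell]
----
$ curl http://<external-IP>:8080/prices.html
----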
