OpenShift - Event Router to log Kubernetes events


Event Router watches the events in all namespaces in the OpenShift cluster and writes them to the stdout of one or more pods named eventrouter in the openshift-logging namespace. Often, the logging subsystem's collector then gathers the logs from the eventrouter pods and forwards them to a log store such as Loki or Elasticsearch, where the events can be viewed in an observability UI such as Kibana or the OpenShift console.
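
 

For reference, the oc get events command lists the same events that Event Router watches, which can be handy when checking what Event Router should be logging.

oc get events --all-namespaces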

 

Create the Service Account

First, let's create a Service Account named eventrouter in the openshift-logging namespace. For example, let's say you have the following in a file named service_account.yml.

apiVersion: v1
kind: ServiceAccount 
metadata:
  name: eventrouter
  namespace: openshift-logging

 

The oc apply command can be used to create the service account.

oc apply -f service_account.yml

 

Or in the OpenShift console, at User Management > ServiceAccounts > Create Service Account.
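
 

Optionally, the oc get serviceaccounts command can be used to confirm that the service account was created.

oc get serviceaccount eventrouter --namespace openshift-logging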

 

Create the Cluster Role

Next, let's create a Cluster Role. For example, let's say you have the following in a file named cluster_role.yml.

apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole 
metadata:
  name: event-reader
rules:
- apiGroups: [""]
  resources: ["events"]
  verbs: ["get", "list", "watch" ]

 

The oc apply command can be used to create the cluster role.

oc apply -f cluster_role.yml
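
 

Optionally, the oc describe clusterrole command can be used to confirm that the cluster role was created with the expected rules.

oc describe clusterrole event-reader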

 

Create the Cluster Role Binding

Next, let's create a Cluster Role Binding. For example, let's say you have the following in a file named cluster_role_binding.yml.

apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding  
metadata:
  name: event-reader-binding
subjects:
- kind: ServiceAccount
  name: eventrouter
  namespace: openshift-logging
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: event-reader

 

The oc apply command can be used to create the cluster role binding.

oc apply -f cluster_role_binding.yml

 

Or in the OpenShift console, at User Management > RoleBindings, create the Cluster Role Binding.
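
 

Optionally, the oc auth can-i command can be used to confirm that the eventrouter service account now has permission to list events.

oc auth can-i list events --as=system:serviceaccount:openshift-logging:eventrouter --all-namespaces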

 

Create the Config Map

Next, let's create a Config Map. For example, let's say you have the following in a file named config_map.yml.

apiVersion: v1
kind: ConfigMap 
metadata:
  name: eventrouter
  namespace: openshift-logging
data:
  config.json: |-
    {
      "sink": "stdout"
    }

 

The oc apply command can be used to create the config map.

oc apply -f config_map.yml

 

Or in the OpenShift console, at Workloads > ConfigMaps > Create ConfigMap.
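
 

Optionally, the oc get configmap command with the -o yaml flag can be used to confirm that the config map contains the expected config.json.

oc get configmap eventrouter --namespace openshift-logging -o yaml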

 

Create the Deployment

Next, let's create the eventrouter deployment in the openshift-logging namespace. For example, let's say you have the following in a file named deployment.yml. The deployment mounts the config.json key from the eventrouter config map at /etc/eventrouter/config.json in the container. The container also runs as the eventrouter service account, which the cluster role binding binds to the event-reader cluster role, granting the get, list, and watch permissions on events.

apiVersion: apps/v1
kind: Deployment 
metadata:
  name: eventrouter
  namespace: openshift-logging
  labels:
    component: "eventrouter"
    logging-infra: "eventrouter"
    provider: "openshift"
spec:
  selector:
    matchLabels:
      component: "eventrouter"
      logging-infra: "eventrouter"
      provider: "openshift"
  replicas: 1
  template:
    metadata:
      labels:
        component: "eventrouter"
        logging-infra: "eventrouter"
        provider: "openshift"
      name: eventrouter
    spec:
      serviceAccountName: eventrouter
      containers:
        - name: kube-eventrouter
          image: "registry.redhat.io/openshift-logging/eventrouter-rhel8:v0.4"
          imagePullPolicy: IfNotPresent
          resources:
            requests:
              cpu: "100m"
              memory: "128Mi"
          volumeMounts:
          - name: config-volume
            mountPath: /etc/eventrouter
      volumes:
        - name: config-volume
          configMap:
            name: eventrouter

 

The oc apply command can be used to create the deployment.

oc apply -f deployment.yml
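
 

Optionally, the oc rollout status command can be used to wait for the deployment to finish rolling out.

oc rollout status deployment/eventrouter --namespace openshift-logging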

 

Viewing Logs

The oc get pods command can be used to confirm that there is now an eventrouter pod in the openshift-logging namespace.

~]# oc get pods --namespace openshift-logging
NAME                          READY   STATUS    RESTARTS   AGE
eventrouter-d649f97c8-qvv8r   1/1     Running   0          8d

 

And the oc logs command can be used to view the logs in the pod.

oc logs pod/eventrouter-d649f97c8-qvv8r --namespace openshift-logging

 

This should return something like the following, which shows that the pod is logging events. Nice!

{"verb":"ADDED","event":{"metadata":{"name":"openshift-service-catalog-controller-manager-remover.1632d931e88fcd8f","namespace":"openshift-service-catalog-removed","selfLink":"/api/v1/namespaces/openshift-service-catalog-removed/events/openshift-service-catalog-controller-manager-remover.1632d931e88fcd8f","uid":"787d7b26-3d2f-4017-b0b0-420db4ae62c0","resourceVersion":"21399","creationTimestamp":"2020-09-08T15:40:26Z"},"involvedObject":{"kind":"Job","namespace":"openshift-service-catalog-removed","name":"openshift-service-catalog-controller-manager-remover","uid":"fac9f479-4ad5-4a57-8adc-cb25d3d9cf8f","apiVersion":"batch/v1","resourceVersion":"21280"},"reason":"Completed","message":"Job completed","source":{"component":"job-controller"},"firstTimestamp":"2020-09-08T15:40:26Z","lastTimestamp":"2020-09-08T15:40:26Z","count":1,"type":"Normal"}}

 

 



