# ask-ai
Hi team, I am using OpenTelemetry to monitor an Airbyte on Helm cluster (EKS, Airbyte version 0.45.12), and the following errors are reported during deployment:
```
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
WARNING: Instrument num_orphan_running_jobs has recorded multiple values for the same attributes.
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
WARNING: Instrument num_pending_jobs has recorded multiple values for the same attributes.
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
WARNING: Instrument num_running_jobs has recorded multiple values for the same attributes.
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
WARNING: Instrument oldest_running_job_age_secs has recorded multiple values for the same attributes.
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
WARNING: Instrument oldest_pending_job_age_secs has recorded multiple values for the same attributes.
Jun 05, 2023 9:17:32 AM io.opentelemetry.sdk.internal.ThrottlingLogger doLog
SEVERE: Failed to export metrics. The request could not be executed. Full error message: Failed to connect to otel-collector/172.20.121.101:4317
```
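For reference, this is roughly how the Airbyte side is pointed at the collector. This is a sketch only, assuming the documented METRIC_CLIENT and OTEL_COLLECTOR_ENDPOINT environment variables set through worker.extraEnv; the exact Helm values keys may differ by chart version:

```yaml
# Sketch only: METRIC_CLIENT / OTEL_COLLECTOR_ENDPOINT are the documented
# Airbyte metric settings; wiring them through worker.extraEnv is an
# assumption about this chart version and may need adjusting.
worker:
  extraEnv:
    - name: METRIC_CLIENT
      value: "otel"
    - name: OTEL_COLLECTOR_ENDPOINT
      # otel-collector is the Service name below; 4317 is its OTLP gRPC port
      value: "http://otel-collector:4317"
```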
The otel-collector YAML (ConfigMap, Deployment, and Service):
```yaml
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: otel-collector-conf
  labels:
    app: opentelemetry
    component: otel-collector-conf
data:
  otel-collector-config: |
    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: localhost:4317
    processors:
      batch:
      memory_limiter:
        limit_mib: 1500
        spike_limit_mib: 512
        check_interval: 5s
    exporters:
      logging:
        loglevel: debug
    service:
      pipelines:
        metrics:
          receivers: [otlp]
          processors: [memory_limiter, batch]
          exporters: [logging]
      telemetry:
        metrics:
          address: localhost:8888

---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: otel-collector
  labels:
    component: otel-collector
    app: opentelemetry
spec:
  selector:
    matchLabels:
      app: opentelemetry
      component: otel-collector
  replicas: 1
  template:
    metadata:
      labels:
        app: opentelemetry
        component: otel-collector
    spec:
      containers:
        - command:
            - "/otelcol"
            - "--config=/conf/otel-collector-config.yaml"
          image: "otel/opentelemetry-collector:latest"
          name: otel-collector
          ports:
            - containerPort: 4317 # Default endpoint for OpenTelemetry receiver.
            - containerPort: 8888 # Port for Prometheus instance
          volumeMounts:
            - name: config
              mountPath: /conf
      volumes:
        - configMap:
            name: otel-collector-conf
            items:
              - key: otel-collector-config
                path: otel-collector-config.yaml
          name: config

---
apiVersion: v1
kind: Service
metadata:
  name: otel-collector
  labels:
    app: opentelemetry
    component: otel-collector
spec:
  ports:
    - name: otlp-grpc # Default endpoint for OpenTelemetry gRPC receiver.
      port: 4317
      protocol: TCP
      targetPort: 4317
    - name: metrics
      port: 8888
  selector:
    component: otel-collector
```
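One detail about the ConfigMap above: the OTLP receiver is bound to localhost:4317, which only accepts loopback connections inside the collector container, so traffic arriving on the pod IP through the Service would be refused. A minimal sketch of the same receiver block bound to all interfaces, which is what in-cluster clients normally need (the rest of the config unchanged):

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        # 0.0.0.0 listens on the pod IP as well as loopback, so traffic
        # forwarded by the otel-collector Service on port 4317 can be accepted
        endpoint: 0.0.0.0:4317
```

The service.telemetry.metrics address on localhost:8888 behaves the same way if Prometheus is meant to scrape port 8888 from outside the pod.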