No logs are forwarded to Splunk

After enabling log forwarding to Splunk as described in Enable log forwarding to external destinations, you may find that logs are not being sent to Splunk even though no specific errors are reported. In this case, debug the issue using the procedure below.

To debug the issue:

  1. Temporarily set the log level of the syslog output plugin to debug:

    logging:
      externalOutputs:
        splunk_syslog_output:
          plugin_log_level: debug
          type: remote_syslog
          host: remote-splunk-syslog.svc
          port: 514
          protocol: tcp
          tls: true
          ca_file: /etc/ssl/certs/splunk-syslog.pem
          verify_mode: 0
          buffer:
            chunk_limit: 16MB
            total_limit: 128MB
      externalOutputSecretMounts:
      - secretName: syslog-pem
        mountPath: /etc/ssl/certs/splunk-syslog.pem
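
    Once you save this change, the fluentd-logs pods are recreated with the updated configuration. As a quick check before proceeding to the next step, you can watch the pods in the stacklight namespace restart; the command below only assumes the default fluentd-logs pod naming:

    kubectl get pods -n stacklight | grep fluentd-logs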
    
  2. Once the fluentd-logs pods are updated, grep the logs of any of these pods for splunk_syslog_output:

    kubectl logs -n stacklight -f <fluentd-logs-pod-name> | grep 'splunk_syslog_output'
    

    In the following example output, the error indicates that the specified Splunk host name cannot be resolved. Therefore, verify and update the host name accordingly; a way to check name resolution from inside the cluster is sketched after the example output.

    Example output

    2023-07-25 09:57:29 +0000 [info]: adding match in @splunk_syslog_output-external pattern="**" type="remote_syslog"
      @label @splunk_syslog_output-external
    <label @splunk_syslog_output-external>
      @id splunk_syslog_output-external
      path "/var/log/fluentd-buffers/splunk_syslog_output-external.system.buffer"
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] restoring buffer file: path = /var/log/fluentd-buffers/splunk_syslog_output-external.system.buffer/buffer.q6014c3643b68e68c03c6217052e1af55.log
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] restoring buffer file: path = /var/log/fluentd-buffers/splunk_syslog_output-external.system.buffer/buffer.q6014c36877047570ab3b892f6bd5afe8.log
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] restoring buffer file: path = /var/log/fluentd-buffers/splunk_syslog_output-external.system.buffer/buffer.b6014c36d40fcc16ea630fa86c9315638.log
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] buffer started instance=61140 stage_size=17628134 queue_size=5026605
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] flush_thread actually running
    2023-07-25 09:57:30 +0000 [debug]: [splunk_syslog_output-external] enqueue_thread actually running
    2023-07-25 09:57:33 +0000 [debug]: [splunk_syslog_output-external] taking back chunk for errors. chunk="6014c3643b68e68c03c6217052e1af55"
    2023-07-25 09:57:33 +0000 [warn]: [splunk_syslog_output-external] failed to flush the buffer. retry_times=0 next_retry_time=2023-07-25 09:57:35 +0000 chunk="6014c3643b68e68c03c6217052e1af55" error_class=SocketError error="getaddrinfo: Name or service not known"
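
    In the example above, the getaddrinfo error means that the host specified for splunk_syslog_output cannot be resolved. You can confirm this from within the cluster before changing the configuration. The following check is a sketch only: it assumes that the busybox image can be pulled in your environment and uses a throwaway pod named dns-test, which is not part of StackLight:

    kubectl run dns-test -n stacklight --rm -it --restart=Never --image=busybox -- nslookup remote-splunk-syslog.svc

    If the lookup fails, update the host value for splunk_syslog_output, wait for the fluentd-logs pods to restart, and repeat the grep in this step to verify that buffer chunks are flushed without errors.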