
No logs being collected when only /var/log/pods directory is available #1552

Open

vdombrovski opened this issue Dec 10, 2024 · 1 comment

@vdombrovski

Hello,

I'm using datadog-operator v1.10.0 with the DatadogAgent CRD described below. My environment requires compliance with the "baseline" pod security standard, which does not allow the use of hostPath volumes. However, I am able to mount a persistent volume containing all pod logs at /var/log/pods, with the correct permissions and directory structure. The agent launches fine, but my logs are not being collected and I'm getting the following message:

2024-12-10 11:07:18 UTC | CORE | DEBUG | (pkg/logs/launchers/file/launcher.go:155 in scan) | Scan - got 0 files from FilesToTail and currently tailing 0 files
2024-12-10 11:07:18 UTC | CORE | DEBUG | (pkg/logs/launchers/file/launcher.go:213 in scan) | After stopping tailers, there are 0 tailers running.
2024-12-10 11:07:18 UTC | CORE | DEBUG | (pkg/logs/launchers/file/launcher.go:230 in scan) | After starting new tailers, there are 0 tailers running. Limit is 500.

From what I understand, the Datadog Agent resolves symlinks in /var/log/containers to find the actual pod log files. I'm wondering if there is an alternate mode that would read the pod logs directly, without going through this resolution. I have no issues making this work with other log collection agents, but no success with Datadog so far.

Here are the contents of my pod directory (on a sample node):

find /var/log/pods/ -type f
/var/log/pods/datadog-datadog-agent-5qmkj_6e2cdca1-e143-4bf8-8d07-fa3fd6d0a630/init-volume/0.log
/var/log/pods/datadog-datadog-agent-5qmkj_6e2cdca1-e143-4bf8-8d07-fa3fd6d0a630/process-agent/0.log
/var/log/pods/datadog-datadog-agent-5qmkj_6e2cdca1-e143-4bf8-8d07-fa3fd6d0a630/init-config/0.log
/var/log/pods/datadog-datadog-agent-5qmkj_6e2cdca1-e143-4bf8-8d07-fa3fd6d0a630/agent/0.log
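One possible workaround for this layout is to recreate a flat symlink directory that the Agent can resolve, generated from the /var/log/pods tree. A minimal sketch (this is not an official Datadog mechanism; the `link_pod_logs` helper and its link naming scheme are illustrative and do not follow the exact kubelet convention for /var/log/containers):

```python
import os

def link_pod_logs(pods_dir: str, containers_dir: str) -> list[str]:
    """Create flat symlinks in containers_dir pointing at the per-pod
    log files under pods_dir. Link names are illustrative only."""
    os.makedirs(containers_dir, exist_ok=True)
    created = []
    for pod in sorted(os.listdir(pods_dir)):            # e.g. "<pod-name>_<uid>"
        pod_path = os.path.join(pods_dir, pod)
        if not os.path.isdir(pod_path):
            continue
        for container in sorted(os.listdir(pod_path)):  # e.g. "agent"
            cdir = os.path.join(pod_path, container)
            if not os.path.isdir(cdir):
                continue
            for logfile in sorted(os.listdir(cdir)):    # e.g. "0.log"
                target = os.path.join(cdir, logfile)
                link = os.path.join(containers_dir, f"{pod}_{container}_{logfile}")
                if not os.path.islink(link):
                    os.symlink(target, link)
                created.append(link)
    return created
```

Running something like this as a sidecar or init step would only help if the Agent is then pointed at the generated directory; it does not restore the container-ID metadata that the real kubelet symlink names carry.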

My agent (v7.58.2) shows the following status (only relevant parts included):

==========
Logs Agent
==========

    Reliable: Sending compressed logs in HTTPS to agent-http-intake.logs.datadoghq.eu on port 443
    BytesSent: 0
    EncodedBytesSent: 2
    LogsProcessed: 0
    LogsSent: 0
    RetryCount: 0
    RetryTimeSpent: 0s
    CoreAgentProcessOpenFiles: 317
    OSFileLimit: 1048576

    kubernetes_apiserver
    --------------------
      Instance ID: kubernetes_apiserver [OK]
      Configuration Source: file:/etc/datadog-agent/conf.d/kubernetes_apiserver.d/conf.yaml.default
      Total Runs: 62
      Metric Samples: Last Run: 0, Total: 0
      Events: Last Run: 0, Total: 0
      Service Checks: Last Run: 0, Total: 0
      Average Execution Time : 0s
      Last Execution Date : 2024-12-10 11:08:19 UTC (1733828899000)
      Last Successful Execution Date : 2024-12-10 11:08:19 UTC (1733828899000)

My DatadogAgent CRD declaration is as follows:

apiVersion: "datadoghq.com/v2alpha1"
kind: "DatadogAgent"
metadata:
  name: "datadog"
  namespace: "datadog"
spec:
  global:
    logLevel: DEBUG
    site: "datadoghq.eu"
    credentials:
      apiSecret:
        secretName: "datadog-secret"
        keyName: "api-key"
    kubelet:
      host:
        fieldRef:
          fieldPath: spec.nodeName
      tlsVerify: false
  features:
    kubeStateMetricsCore:
      enabled: true
    admissionController:
      enabled: false
    externalMetricsServer:
      enabled: false
      useDatadogMetrics: false
    apm:
      enabled: false
    logCollection:
      enabled: true
      containerCollectAll: true
      containerCollectUsingFiles: true
    liveContainerCollection:
      enabled: true
    liveProcessCollection:
      enabled: false
    processDiscovery:
      enabled: false
    cspm:
      enabled: false
    dogstatsd:
      unixDomainSocketConfig:
        enabled: false
  override:
    clusterAgent:
      env:
        - name: DD_CLUSTER_CHECKS_ENABLED
          value: "true"
    nodeAgent:
      env:
        - name: DD_LOGS_CONFIG_K8S_CONTAINER_USE_FILE
          value: "true"
        - name: DD_LOGS_CONFIG_DOCKER_CONTAINER_FORCE_USE_FILE
          value: "true"
        - name: DD_CONTAINER_INCLUDE_LOGS
          value: ".*"
        - name: DD_LOGS_CONFIG_VALIDATE_POD_CONTAINER_ID
          value: "false"
        - name: DD_HOSTNAME
          valueFrom:
            fieldRef:
              fieldPath: spec.nodeName
      volumes:
      - emptyDir: {}
        name: logdatadog
      - emptyDir: {}
        name: datadog-agent-auth
      - configMap:
          defaultMode: 420
          name: datadog-install-info
        name: installinfo
      - emptyDir: {}
        name: checksd
      - emptyDir: {}
        name: confd
      - emptyDir: {}
        name: config
      - name: procdir
        persistentVolumeClaim:
          claimName: proc-pvc 
      - emptyDir: {}
        name: cgroups
      - name: dsdsocket
        emptyDir: {}
      - name: runtimesocketdir
        emptyDir: {}
      - name: pointerdir
        emptyDir: {}
      - name: logpodpath
        persistentVolumeClaim:
          claimName: pod-logs-pvc 
      - name: logcontainerpath
        emptyDir: {}
      - name: symlinkcontainerpath
        emptyDir: {}
      - name: passwd
        emptyDir: {}
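One avenue that might be worth testing: the Agent's custom log collection can tail arbitrary file paths (wildcards included) through a conf.d entry, which would bypass the /var/log/containers symlink resolution entirely. A sketch, assuming the Agent container can read the mounted PVC at /var/log/pods; the file name and the service/source values are placeholders:

```yaml
# conf.d/pod_logs.d/conf.yaml (names are illustrative)
logs:
  - type: file
    path: /var/log/pods/*/*/*.log
    service: kubernetes-pods   # placeholder
    source: kubernetes         # placeholder
```

Note that logs collected this way sidestep container autodiscovery, so they would not carry the usual per-pod/per-container tags.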
@levan-m (Contributor) commented Dec 10, 2024

Thank you for reporting this! In order to investigate further, can you please open a support ticket referencing this GitHub issue, and one of our agents will dig in and get back to you shortly. Thanks!
