
out_datadog: added support for site configuration field #10582


Draft · lucastemb wants to merge 1 commit into master

Conversation

lucastemb

Add site Parameter to Fluent Bit Plugin for Regional Data Routing

Enter [N/A] in the box if an item is not applicable to your change.

Testing
Before we can approve your change, please submit the following in a comment:

  • Example configuration file for the change
  • Debug log output from testing the change
  • Attached Valgrind output that shows no leaks or memory corruption was found

If this is a change to packaging of containers or native binaries then please confirm it works for all targets.

  • [N/A] Run local packaging test showing all targets (including any new ones) build.
  • [N/A] Set ok-package-test label to test for all targets (requires maintainer to do).

Documentation

  • [N/A] Documentation required for this feature

Backporting

  • [N/A] Backport to latest stable release.

Fluent Bit is licensed under Apache 2.0; by submitting this pull request, I understand that this code will be released under the terms of that license.

@lucastemb (Author)

Example Config File:

[SERVICE]
    Flush        1
    Log_Level    trace
    Parsers_File parsers.conf

[INPUT]
    Name        dummy
    Tag         dummy
    Dummy       {"message": "Testing support for site confiruation ", "test": "AGNTLOG-206", "container_id": "abc123def456", "container_name": "/ecs-web-service", "container_image": "nginx:1.24-alpine", "ecs_cluster": "arn:aws:ecs:us-west-2:123456789012:cluster/production-cluster", "ecs_task_definition": "web-service:42", "ecs_task_arn": "arn:aws:ecs:us-west-2:123456789012:task/production-cluster/abc123def456-7890-1234-5678-901234567890"}

[OUTPUT]
    Name        datadog
    Match       *
    apikey      ${DD_API_KEY}
    dd_hostname foobar
    provider    ecs
    dd_source   fluent-bit
    dd_service  test-service
    dd_tags     env:test,version:1.0
    compress    gzip
    site        us3.datadoghq.com
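
The configuration above targets the US3 site. For other Datadog regions, the same field should presumably just take that region's site value; the variant below uses the EU site, `datadoghq.eu`, as an illustration based on Datadog's documented site names and was not part of the testing done for this PR:

[OUTPUT]
    Name        datadog
    Match       *
    apikey      ${DD_API_KEY}
    site        datadoghq.eu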

@lucastemb (Author)

Valgrind Output:

==1318749== 
==1318749== HEAP SUMMARY:
==1318749==     in use at exit: 0 bytes in 0 blocks
==1318749==   total heap usage: 10,659 allocs, 10,659 frees, 27,179,822 bytes allocated
==1318749== 
==1318749== All heap blocks were freed -- no leaks are possible
==1318749== 
==1318749== For lists of detected and suppressed errors, rerun with: -s
==1318749== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)

@lucastemb (Author)

Debug Log Output:

[2025/07/11 18:38:45] [ info] Configuration:
[2025/07/11 18:38:45] [ info]  flush time     | 1.000000 seconds
[2025/07/11 18:38:45] [ info]  grace          | 5 seconds
[2025/07/11 18:38:45] [ info]  daemon         | 0
[2025/07/11 18:38:45] [ info] ___________
[2025/07/11 18:38:45] [ info]  inputs:
[2025/07/11 18:38:45] [ info]      dummy
[2025/07/11 18:38:45] [ info] ___________
[2025/07/11 18:38:45] [ info]  filters:
[2025/07/11 18:38:45] [ info] ___________
[2025/07/11 18:38:45] [ info]  outputs:
[2025/07/11 18:38:45] [ info]      datadog.0
[2025/07/11 18:38:45] [ info] ___________
[2025/07/11 18:38:45] [ info]  collectors:
[2025/07/11 18:38:45] [ info] [fluent bit] version=4.0.5, commit=9def01d7de, pid=1326162
[2025/07/11 18:38:45] [debug] [engine] coroutine stack size: 24576 bytes (24.0K)
[2025/07/11 18:38:45] [ info] [storage] ver=1.5.3, type=memory, sync=normal, checksum=off, max_chunks_up=128
[2025/07/11 18:38:45] [ info] [simd    ] disabled
[2025/07/11 18:38:45] [ info] [cmetrics] version=1.0.4
[2025/07/11 18:38:45] [ info] [ctraces ] version=0.6.6
[2025/07/11 18:38:45] [ info] [input:dummy:dummy.0] initializing
[2025/07/11 18:38:45] [ info] [input:dummy:dummy.0] storage_strategy='memory' (memory only)
[2025/07/11 18:38:45] [debug] [dummy:dummy.0] created event channels: read=28 write=29
[2025/07/11 18:38:45] [debug] [datadog:datadog.0] created event channels: read=30 write=31
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] scheme: http://
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] site parameter set to: us3.datadoghq.com
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] uri: /api/v2/logs
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] using site for host construction: us3.datadoghq.com
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] created base hostname: http-intake.logs.
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] about to concatenate site: us3.datadoghq.com (length: 17)
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] after concatenation: http-intake.logs.us3.datadoghq.com
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] host constructed from site: http-intake.logs.us3.datadoghq.com
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] port: 80
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] json_date_key: timestamp
[2025/07/11 18:38:45] [debug] [output:datadog:datadog.0] compress_gzip: 1
[2025/07/11 18:38:45] [ info] [sp] stream processor started
[2025/07/11 18:38:45] [ info] [engine] Shutdown Grace Period=5, Shutdown Input Grace Period=2
[2025/07/11 18:38:45] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:45] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:46] [trace] [input chunk] update output instances with new chunk size diff=403, records=1, input=dummy.0
[2025/07/11 18:38:46] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:46] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:46] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:46] [trace] [sched] 0 timer coroutines destroyed
[2025/07/11 18:38:47] [trace] [task 0x5e9f010] created (id=0)
[2025/07/11 18:38:47] [debug] [task] created task=0x5e9f010 id=0 OK
[2025/07/11 18:38:47] [trace] [input chunk] update output instances with new chunk size diff=403, records=1, input=dummy.0
[2025/07/11 18:38:47] [trace] [upstream] get new connection for http-intake.logs.us3.datadoghq.com:80, net setup:
net.connect_timeout        = 10 seconds
net.source_address         = any
net.keepalive              = enabled
net.keepalive_idle_timeout = 30 seconds
net.max_worker_connections = 0
[2025/07/11 18:38:47] [trace] [net] connection #38 in process to http-intake.logs.us3.datadoghq.com:80
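
The debug lines above trace how the intake host is assembled from the configured site: a base of "http-intake.logs." followed by the site value. Below is a minimal standalone C sketch of that behavior; it is an illustration rather than the plugin's actual code, and the default site used when no site is configured is an assumption here:

    /* Sketch of the host construction traced in the debug log:
     * "http-intake.logs." + site. Not the actual plugin code. */
    #include <stdio.h>

    #define DEFAULT_SITE "datadoghq.com"   /* assumed default site */

    static void build_intake_host(const char *site, char *out, size_t len)
    {
        /* Fall back to the assumed default when no site is configured */
        if (site == NULL || site[0] == '\0') {
            site = DEFAULT_SITE;
        }
        snprintf(out, len, "http-intake.logs.%s", site);
    }

    int main(void)
    {
        char host[256];

        build_intake_host("us3.datadoghq.com", host, sizeof(host));
        printf("%s\n", host);   /* http-intake.logs.us3.datadoghq.com */

        build_intake_host(NULL, host, sizeof(host));
        printf("%s\n", host);   /* http-intake.logs.datadoghq.com */
        return 0;
    }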
