Integrations Cleanup and Updates #1197

Merged Feb 12, 2025 · 44 commits
2de80e3
Remove extra lines in app-o11y notes
bentonam Jan 30, 2025
fd44e57
Added namespaces property
bentonam Jan 30, 2025
f1b94f0
Drop debug logs by default
bentonam Jan 30, 2025
27c4164
fixed scrubTimestamp bug
bentonam Feb 3, 2025
a05ad4e
Update ts format
bentonam Feb 3, 2025
65c54d3
Documentation
bentonam Feb 3, 2025
56b169a
Set default allow list to null
bentonam Feb 3, 2025
37b9bcd
Updated Meta-Monitoring Example
bentonam Feb 3, 2025
26723e5
Updates
bentonam Feb 3, 2025
cc78a1c
Updated Tests and Rebuilt
bentonam Feb 3, 2025
41b7494
Update charts/k8s-monitoring/charts/feature-cluster-metrics/values.yaml
bentonam Feb 3, 2025
3cd9e02
Fix a few things: (#1200)
petewall Feb 4, 2025
5b3b1d9
Added otlp-gateway as part of validation check (#1202)
bentonam Feb 4, 2025
952a9d6
Bump v2 version to 2.0.7
rlankfo Feb 4, 2025
3c04831
Add unit tests for the new Grafana cloud validators. (#1204)
petewall Feb 4, 2025
475a1d2
Bump v2 version to 2.0.8
rlankfo Feb 4, 2025
ee9c568
Update Update dependency "kepler" for Helm chart "feature-cluster-met…
github-actions[bot] Feb 6, 2025
a1caf4e
Update Update dependency "kepler" for Helm chart "k8s-monitoring-v1" …
github-actions[bot] Feb 6, 2025
af43837
add application-observability platform test (#1206)
rlankfo Feb 6, 2025
1b56dfe
Added Tempo Integration (#1168)
Imshelledin21 Feb 6, 2025
469a523
Add validation for otlp destination protocol (#1212)
petewall Feb 7, 2025
5ac3d33
Actually implement proxy URL for loki destinations (#1215)
petewall Feb 7, 2025
38df6f8
Add the ability to set an additional service for the recevier (#1213)
petewall Feb 7, 2025
9edb37b
adjust default batch size for feature-application-observability (#1205)
rlankfo Feb 7, 2025
e9f96bf
make build (#1216)
rlankfo Feb 7, 2025
0f676ad
Make it possible to skip mysql logs integration (#1218)
petewall Feb 9, 2025
729ecb0
Fix Pod log annotation and label assignment (#1222)
petewall Feb 10, 2025
0267b6e
Bump versions to 1.6.24 and 2.0.9
petewall Feb 10, 2025
1fa34d5
Updates
bentonam Feb 4, 2025
a819ac2
Fixed integration bug not merging booleans
bentonam Feb 5, 2025
9d2b88b
WIP
bentonam Feb 10, 2025
7fbeb00
Fixed Deep Copy / Merge of Loki Values and added Tests
bentonam Feb 11, 2025
629f82d
Fixed Deep Copy / Merge of Mimir Values and added Tests
bentonam Feb 11, 2025
7ed7480
Fixed Deep Copy / Merge of Grafana Values and added Tests
bentonam Feb 11, 2025
64806e8
Fixed Deep Copy / Merge of Tempo Values and added Tests
bentonam Feb 11, 2025
bcd6ab1
Rebuilt
bentonam Feb 11, 2025
63ea1c9
Fix a few things: (#1200)
petewall Feb 4, 2025
d1a22bb
Only run these workflows on weekday mornings
petewall Feb 10, 2025
ea5ed3d
Updated Tests and Rebuilt
bentonam Feb 3, 2025
4bf9be8
Rebuilt
bentonam Feb 11, 2025
6cf2d36
Merge branch 'main' of github.com:grafana/k8s-monitoring-helm into cl…
bentonam Feb 11, 2025
f5485ea
Rebuilt
bentonam Feb 11, 2025
5aba37a
Fixed Tests
bentonam Feb 11, 2025
999aa98
Merge branch 'main' of github.com:grafana/k8s-monitoring-helm into cl…
bentonam Feb 12, 2025
@@ -17,10 +17,10 @@ Gather application data via {{ include "english_list" $receivers }} {{ $receiver
Configure your applications to send telemetry data to:
{{- if .Values.receivers.otlp.grpc.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.otlp.grpc.port }} (OTLP gRPC)
{{ end }}
{{- end }}
{{- if .Values.receivers.otlp.http.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.otlp.http.port }} (OTLP HTTP)
{{ end }}
{{- end }}
{{- if .Values.receivers.jaeger.grpc.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.jaeger.grpc.port }} (Jaeger gRPC)
{{- end }}
@@ -35,7 +35,7 @@ Configure your applications to send telemetry data to:
{{- end }}
{{- if .Values.receivers.zipkin.enabled }}
* http://{{ .Collector.ServiceName }}.{{ .Collector.Namespace }}.svc.cluster.local:{{ .Values.receivers.zipkin.port }} (Zipkin)
{{ end }}
{{- end }}
{{- end }}

{{- define "feature.applicationObservability.summary" -}}
@@ -203,6 +203,7 @@ Be sure to perform actual integration testing in a live environment in the main [k8
| kube-state-metrics.metricsTuning.includeMetrics | list | `[]` | Metrics to keep. Can use regular expressions. |
| kube-state-metrics.metricsTuning.useDefaultAllowList | bool | `true` | Filter the list of metrics from Kube State Metrics to a useful, minimal set. |
| kube-state-metrics.namespace | string | `""` | Namespace to locate kube-state-metrics pods. If `deploy` is set to `true`, this will automatically be set to the namespace where this Helm chart is deployed. |
| kube-state-metrics.namespaces | string | `""` | Comma-separated list (string) or YAML list of namespaces from which to collect resources. By default, all namespaces are collected. |
| kube-state-metrics.scrapeInterval | string | `60s` | How frequently to scrape kube-state-metrics metrics. |
| kube-state-metrics.service.portName | string | `"http"` | The port name used by kube-state-metrics. |
| kube-state-metrics.service.scheme | string | `"http"` | The scrape scheme used by kube-state-metrics. |
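Based on the new `namespaces` option documented above, a minimal values snippet might look like the following (the namespace names are illustrative, not part of the chart):

```yaml
kube-state-metrics:
  # Comma-separated string form
  namespaces: "monitoring,kube-system"

  # Equivalent YAML list form
  # namespaces:
  #   - monitoring
  #   - kube-system
```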
@@ -323,6 +323,9 @@
"namespace": {
"type": "string"
},
"namespaces": {
"type": "string"
},
"nodeSelector": {
"type": "object",
"properties": {
@@ -434,6 +434,10 @@ kube-state-metrics:
# @section -- kube-state-metrics
namespace: ""

# -- Comma-separated list (string) or YAML list of namespaces from which to collect resources. By default, all namespaces are collected.
# @section -- kube-state-metrics
namespaces: ""

# -- Rule blocks to be added to the discovery.relabel component for kube-state-metrics.
# These relabeling rules are applied pre-scrape against the targets from service discovery.
# Before the scrape, any remaining target labels that start with __ (i.e. __meta_kubernetes*) are dropped.
@@ -23,7 +23,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Grafana pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expressions) to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
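Because `debug` is now dropped by default, a deployment that wants to keep debug logs has to override the list explicitly. A hedged sketch for a Grafana integration instance (the instance name is illustrative):

```yaml
grafana:
  instances:
    - name: grafana   # illustrative instance name
      logs:
        tuning:
          # restore the previous behavior: keep every log level
          dropLogLevels: []
```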
@@ -16,7 +16,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Loki pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expressions) to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -15,7 +15,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Mimir pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expressions) to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -16,7 +16,7 @@
| Key | Type | Default | Description |
|-----|------|---------|-------------|
| logs.enabled | bool | `true` | Whether to enable special processing of Tempo pod logs. |
| logs.tuning.dropLogLevels | list | `[]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.dropLogLevels | list | `["debug"]` | The log levels to drop. Will automatically keep all log levels unless specified here. |
| logs.tuning.excludeLines | list | `[]` | Line patterns (valid RE2 regular expressions) to exclude from the logs. |
| logs.tuning.scrubTimestamp | bool | `true` | Whether the timestamp should be scrubbed from the log line |
| logs.tuning.structuredMetadata | object | `{}` | The structured metadata mappings to set. To not set any structured metadata, set this to an empty object (e.g. `{}`) |
@@ -69,7 +69,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -78,7 +78,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expressions) to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expressions) to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expressions) to exclude from the logs.
# @section -- Logs Settings
@@ -68,7 +68,7 @@ logs:
# -- The timestamp format to use for the log line, if not set the default timestamp which is the collection
# will be used for the log line
# @section -- Logs Settings
timestampFormat: "RFC3339Nano"
timestampFormat: RFC3339Nano

# -- Whether the timestamp should be scrubbed from the log line
# @section -- Logs Settings
@@ -77,7 +77,8 @@ logs:
# -- The log levels to drop.
# Will automatically keep all log levels unless specified here.
# @section -- Logs Settings
dropLogLevels: []
dropLogLevels:
- debug

# -- Line patterns (valid RE2 regular expressions) to exclude from the logs.
# @section -- Logs Settings
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -22,7 +22,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -25,7 +25,10 @@
"type": "object",
"properties": {
"dropLogLevels": {
"type": "array"
"type": "array",
"items": {
"type": "string"
}
},
"excludeLines": {
"type": "array"
@@ -3,8 +3,8 @@
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- $logsEnabled := false }}
{{- range $instance := .Values.grafana.instances }}
{{- with merge $instance $defaultValues (dict "type" "integration.grafana") }}
{{- $logsEnabled = or $logsEnabled $instance.logs.enabled }}
{{- with merge (deepCopy $defaultValues) (deepCopy $instance) (dict "type" "integration.grafana") }}
{{- $logsEnabled = or $logsEnabled .logs.enabled }}
{{- end }}
{{- end }}
{{- $logsEnabled -}}
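The switch above to `merge (deepCopy $defaultValues) (deepCopy $instance)` matters because Sprig's merge functions mutate their first argument. A rough Python analogue of the bug (not the chart's actual code; the values are made up) shows why the shared defaults must be copied per instance:

```python
import copy

def default_values():
    # stand-in for the defaults loaded from integrations/grafana-values.yaml
    return {"logs": {"enabled": True, "tuning": {"dropLogLevels": ["debug"]}}}

def merge_overwrite(dst, src):
    # recursive merge where src wins; mutates and returns dst,
    # roughly like Sprig's mergeOverwrite
    for key, val in src.items():
        if isinstance(dst.get(key), dict) and isinstance(val, dict):
            merge_overwrite(dst[key], val)
        else:
            dst[key] = val
    return dst

instances = [
    {"logs": {"tuning": {"dropLogLevels": ["debug", "info"]}}},
    {},  # second instance relies entirely on the defaults
]

# Buggy: every instance is merged into the SAME defaults dict, so the
# first instance's override leaks into the second one.
shared = default_values()
buggy = [merge_overwrite(shared, inst) for inst in instances]

# Fixed: deep-copy the defaults for each instance, as the chart now
# does with deepCopy before merging.
base = default_values()
fixed = [merge_overwrite(copy.deepcopy(base), inst) for inst in instances]
```

With the buggy version, the second instance ends up dropping `info` logs it never asked to drop; the deep-copied version keeps each instance independent.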
@@ -13,7 +13,7 @@
{{- define "integrations.grafana.logs.discoveryRules" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- range $instance := $.Values.grafana.instances }}
{{- with mergeOverwrite $defaultValues (deepCopy $instance) }}
{{- with $defaultValues | merge (deepCopy $instance) }}
{{- if .logs.enabled }}
{{- $labelList := list }}
{{- $valueList := list }}
@@ -52,9 +52,9 @@ rule {
{{- define "integrations.grafana.logs.processingStage" }}
{{- if eq (include "integrations.grafana.type.logs" .) "true" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
// Integration: Loki
// Integration: Grafana
{{- range $instance := $.Values.grafana.instances }}
{{- with mergeOverwrite $defaultValues (deepCopy $instance) }}
{{- with $defaultValues | merge (deepCopy $instance) }}
{{- if .logs.enabled }}
stage.match {
{{- if $instance.namespaces }}
@@ -66,10 +66,8 @@
// extract some of the fields from the log line
stage.logfmt {
mapping = {
"timestamp" = "t",
"ts" = "t",
"level" = "",
"logger" = "",
"type" = "",
{{- range $key, $value := .logs.tuning.structuredMetadata }}
{{ $key | quote }} = {{ if $value }}{{ $value | quote }}{{ else }}{{ $key | quote }}{{ end }},
{{- end }}
@@ -86,22 +84,31 @@ stage.match {
{{- if .logs.tuning.timestampFormat }}
// reset the timestamp to the extracted value
stage.timestamp {
source = "timestamp"
source = "ts"
format = {{ .logs.tuning.timestampFormat | quote }}
}
{{- end }}

{{- if .logs.tuning.scrubTimestamp }}
// remove the timestamp from the log line
stage.replace {
expression = "( t=[^ ]+\\s+)"
expression = `(?:^|\s+)(t=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[^ ]*\s+)`
replace = ""
}
{{- end }}

{{- if hasKey .logs.tuning.structuredMetadata "caller" }}
// clean up the caller to remove the line
stage.replace {
source = "caller"
expression = "(:[0-9]+$)"
replace = ""
}
{{- end }}

{{- /* the stage.structured_metadata block needs to be conditionalized because the support for enabling structured metadata can be disabled */ -}}
{{- /* through the grafana limits_config on a per-tenant basis, even if there are no values defined or there are values defined but it is disabled */ -}}
{{- /* in Loki, the write will fail. */ -}}
{{- /* in Grafana, the write will fail. */ -}}
{{- if gt (len .logs.tuning.structuredMetadata) 0 }}
// set the structured metadata values
stage.structured_metadata {
@@ -130,6 +137,7 @@
drop_counter_reason = "grafana-exclude-line"
}
{{- end }}

}
{{- end }}
{{- end }}
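The new `scrubTimestamp` expression in the diff above is much stricter than the old `( t=[^ ]+\s+)`: it only matches an RFC3339-style `t=` timestamp rather than any `t=` field. A quick Python sanity check of the pattern (Python's `re` accepts this RE2-compatible expression; the log lines are made up, and the lambda emulates the Loki replace stage removing only the captured group):

```python
import re

# the pattern from the updated stage.replace block; group 1 is what
# the pipeline stage replaces with the empty string
PATTERN = r'(?:^|\s+)(t=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[^ ]*\s+)'

line = 'logger=migrator t=2025-02-12T10:00:00.123456789Z level=info msg="done"'

# remove only the captured group, keeping whatever else the match consumed
scrubbed = re.sub(PATTERN, lambda m: m.group(0).replace(m.group(1), ''), line)

# a bare t=... key that is not a timestamp is now left alone,
# which the old pattern would have scrubbed
other = 'level=info t=abc msg="x"'
untouched = re.sub(PATTERN, lambda m: m.group(0).replace(m.group(1), ''), other)
```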
@@ -4,8 +4,11 @@
{{/* Inputs: instance (grafana integration instance) Files (Files object) */}}
{{- define "integrations.grafana.allowList" }}
{{- $allowList := list -}}
{{- if .instance.metrics.tuning.useDefaultAllowList -}}
{{- $allowList = concat $allowList (list "up" "scrape_samples_scraped") (.Files.Get "default-allow-lists/grafana.yaml" | fromYamlArray) -}}
{{- end -}}
{{- if .instance.metrics.tuning.includeMetrics -}}
{{- $allowList = concat $allowList .instance.metrics.tuning.includeMetrics -}}
{{- $allowList = concat $allowList (list "up" "scrape_samples_scraped") .instance.metrics.tuning.includeMetrics -}}
{{- end -}}
{{ $allowList | uniq | toYaml }}
{{- end -}}
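The reworked helper above only seeds the base metrics (`up`, `scrape_samples_scraped`) when the default allow list is enabled or explicit `includeMetrics` are given, which is what lets the default allow list be effectively null. A hedged Python sketch of the same logic (function and variable names are mine, not the chart's):

```python
BASE_METRICS = ["up", "scrape_samples_scraped"]

def build_allow_list(use_default_allow_list, include_metrics, default_allow_list):
    # mirror the template: base metrics are only added when at least one
    # source of metrics is enabled, then the result is de-duplicated
    allow = []
    if use_default_allow_list:
        allow += BASE_METRICS + default_allow_list
    if include_metrics:
        allow += BASE_METRICS + include_metrics
    seen, out = set(), []
    for metric in allow:  # like Sprig's uniq: order-preserving de-dup
        if metric not in seen:
            seen.add(metric)
            out.append(metric)
    return out
```

With both knobs off, the result is an empty list, i.e. no keep-filter is applied at all.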
@@ -89,7 +92,7 @@ declare "grafana_integration" {
}

argument "job_label" {
comment = "The job label to add for all Loki metrics (default: integrations/grafana)"
comment = "The job label to add for all Grafana metrics (default: integrations/grafana)"
optional = true
}

@@ -137,9 +140,27 @@
// drop metrics that match the drop_metrics regex
rule {
source_labels = ["__name__"]
regex = coalesce(argument.drop_metrics.value, "(^(go|process)_.+$)")
regex = coalesce(argument.drop_metrics.value, "")
action = "drop"
}

// keep only metrics that match the keep_metrics regex
rule {
source_labels = ["__name__"]
regex = coalesce(argument.keep_metrics.value, "(.+)")
action = "keep"
}

// the grafana-mixin expects the instance label to be the node name
rule {
source_labels = ["node"]
target_label = "instance"
replacement = "$1"
}
rule {
action = "labeldrop"
regex = "node"
}
}
}
{{- range $instance := $.Values.grafana.instances }}
@@ -151,10 +172,10 @@
{{/* Instantiates the grafana integration */}}
{{/* Inputs: integration (grafana integration definition), Values (all values), Files (Files object) */}}
{{- define "integrations.grafana.include.metrics" }}
{{- $defaultValues := "integrations/grafana-values.yaml" | .Files.Get | fromYaml }}
{{- with mergeOverwrite $defaultValues (deepCopy .instance) }}
{{- $defaultValues := fromYaml (.Files.Get "integrations/grafana-values.yaml") }}
{{- with mergeOverwrite $defaultValues .instance (dict "type" "integration.grafana") }}
{{- $metricAllowList := include "integrations.grafana.allowList" (dict "instance" . "Files" $.Files) | fromYamlArray }}
{{- $metricDenyList := .excludeMetrics }}
{{- $metricDenyList := .metrics.tuning.excludeMetrics }}
{{- $labelSelectors := list }}
{{- range $k, $v := .labelSelectors }}
{{- if kindIs "slice" $v }}
Expand All @@ -174,7 +195,7 @@ grafana_integration_discovery {{ include "helper.alloy_name" .name | quote }} {

grafana_integration_scrape {{ include "helper.alloy_name" .name | quote }} {
targets = grafana_integration_discovery.{{ include "helper.alloy_name" .name }}.output
job_label = {{ .jobLabel | quote }}
job_label = "integrations/grafana"
clustering = true
{{- if $metricAllowList }}
keep_metrics = {{ $metricAllowList | join "|" | quote }}
@@ -3,8 +3,8 @@
{{- $defaultValues := "integrations/loki-values.yaml" | .Files.Get | fromYaml }}
{{- $logsEnabled := false }}
{{- range $instance := .Values.loki.instances }}
{{- with merge $instance $defaultValues (dict "type" "integration.loki") }}
{{- $logsEnabled = or $logsEnabled $instance.logs.enabled }}
{{- with merge (deepCopy $instance) (deepCopy $defaultValues) (dict "type" "integration.loki") }}
{{- $logsEnabled = or $logsEnabled .logs.enabled }}
{{- end }}
{{- end }}
{{- $logsEnabled -}}
@@ -105,7 +105,7 @@ stage.match {
{{- if .logs.tuning.scrubTimestamp }}
// remove the timestamp from the log line
stage.replace {
expression = "(ts=[^ ]+\\s+)"
expression = `(?:^|\s+)(ts=\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[^ ]*\s+)`
replace = ""
}
{{- end }}