From 91fbe1e8e04f4a0d3eb30ec82c7a874b7fdd2fb9 Mon Sep 17 00:00:00 2001
From: Barend Garvelink <159024183+barend-xebia@users.noreply.github.com>
Date: Fri, 26 Jul 2024 14:32:41 +0200
Subject: [PATCH] Update README

---
 README.md | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2f9eeae..77cddb9 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@
 
 This package connects [Apache Spark™][sp-home] to [OpenTelemetry][ot-home].
 
-This creates a layer of indirection to allow reporting metrics from any Spark or PySpark job to [OpenTelemetry Collector][ot-col], or directly to any [supported backend][ot-export].
+This allows reporting tracing and metrics from any Spark or PySpark job to [OpenTelemetry Collector][ot-col], or directly to any [supported backend][ot-export].
 
 ## Status
 
@@ -67,6 +67,10 @@ If the OpenTelemetry Autoconfigure mechanism doesn't meet your requirements, you
 
 ## Design Choices
 
+### Why not simply use Spark's built-in DropWizard support?
+
+Because that's something that already exists, and this is something I wanted to build. If the DropWizard metrics in Spark meet your needs, you should consider using those.
+
 ### Crash on initialization failure
 
 If the OpenTelemetry SDK cannot be obtained during startup, we allow the listener –and enclosing spark job– to crash.