Restored metric logging to third-party loggers #2489

Open · wants to merge 3 commits into base: main
4 changes: 2 additions & 2 deletions src/anomalib/cli/pipelines.py
@@ -6,13 +6,13 @@
import logging

 from jsonargparse import Namespace
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available

from anomalib.cli.utils.help_formatter import get_short_docstring

logger = logging.getLogger(__name__)

-if package_available("anomalib.pipelines"):
+if module_available("anomalib.pipelines"):
from anomalib.pipelines import Benchmark
from anomalib.pipelines.components.base import Pipeline
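For context on the swap: `package_available` in lightning-utilities checks a single top-level package name, while `module_available` also resolves dotted paths such as `"anomalib.pipelines"`. A rough stdlib sketch of the dotted-path check (an approximation, not lightning-utilities' exact implementation):

```python
from importlib.util import find_spec

def module_available(module_path: str) -> bool:
    """Return True if the (possibly dotted) module path can be imported."""
    try:
        # find_spec resolves submodules too; it raises ModuleNotFoundError
        # when a parent package in the dotted path is missing.
        return find_spec(module_path) is not None
    except ModuleNotFoundError:
        return False

print(module_available("importlib.machinery"))  # True
print(module_available("no_such_pkg.sub"))      # False
```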

4 changes: 2 additions & 2 deletions src/anomalib/cli/utils/openvino.py
@@ -6,12 +6,12 @@
import logging

from jsonargparse import ArgumentParser
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available

logger = logging.getLogger(__name__)


-if package_available("openvino"):
+if module_available("openvino"):
from openvino.tools.ovc.cli_parser import get_common_cli_parser
else:
get_common_cli_parser = None
4 changes: 2 additions & 2 deletions src/anomalib/deploy/inferencers/openvino_inferencer.py
@@ -9,7 +9,7 @@

import cv2
import numpy as np
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available
from omegaconf import DictConfig
from PIL import Image

@@ -94,7 +94,7 @@ def __init__(
task: str | None = None,
config: dict | None = None,
) -> None:
-if not package_available("openvino"):
+if not module_available("openvino"):
msg = "OpenVINO is not installed. Please install OpenVINO to use OpenVINOInferencer."
raise ImportError(msg)

4 changes: 2 additions & 2 deletions src/anomalib/loggers/wandb.py
@@ -9,12 +9,12 @@
from lightning.fabric.utilities.types import _PATH
from lightning.pytorch.loggers.wandb import WandbLogger
from lightning.pytorch.utilities import rank_zero_only
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available
from matplotlib.figure import Figure

from .base import ImageLoggerBase

-if package_available("wandb"):
+if module_available("wandb"):
import wandb

if TYPE_CHECKING:
6 changes: 3 additions & 3 deletions src/anomalib/models/components/base/export_mixin.py
@@ -12,7 +12,7 @@

import numpy as np
import torch
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available
from torch import nn
from torchmetrics import Metric
from torchvision.transforms.v2 import Transform
@@ -245,7 +245,7 @@ def to_openvino(
... task="segmentation",
... )
"""
-if not package_available("openvino"):
+if not module_available("openvino"):
logger.exception("Could not find OpenVINO. Please check OpenVINO installation.")
raise ModuleNotFoundError

@@ -294,7 +294,7 @@ def _compress_ov_model(
Returns:
model (CompiledModel): Model in the OpenVINO format compressed with NNCF quantization.
"""
-if not package_available("nncf"):
+if not module_available("nncf"):
 logger.exception("Could not find NNCF. Please check NNCF installation.")
raise ModuleNotFoundError

4 changes: 2 additions & 2 deletions src/anomalib/models/image/vlm_ad/backends/chat_gpt.py
@@ -10,13 +10,13 @@
from typing import TYPE_CHECKING

from dotenv import load_dotenv
-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available

from anomalib.models.image.vlm_ad.utils import Prompt

from .base import Backend

-if package_available("openai"):
+if module_available("openai"):
from openai import OpenAI
else:
OpenAI = None
4 changes: 2 additions & 2 deletions src/anomalib/models/image/vlm_ad/backends/huggingface.py
@@ -7,7 +7,7 @@
from pathlib import Path
from typing import TYPE_CHECKING

-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available
from PIL import Image

from anomalib.models.image.vlm_ad.utils import Prompt
@@ -18,7 +18,7 @@
from transformers.modeling_utils import PreTrainedModel
from transformers.processing_utils import ProcessorMixin

-if package_available("transformers"):
+if module_available("transformers"):
import transformers
else:
transformers = None
4 changes: 2 additions & 2 deletions src/anomalib/models/image/vlm_ad/backends/ollama.py
@@ -12,13 +12,13 @@
import logging
from pathlib import Path

-from lightning_utilities.core.imports import package_available
+from lightning_utilities.core.imports import module_available

from anomalib.models.image.vlm_ad.utils import Prompt

from .base import Backend

-if package_available("ollama"):
+if module_available("ollama"):
from ollama import chat
from ollama._client import _encode_image
else:
72 changes: 72 additions & 0 deletions src/anomalib/pipelines/benchmark/job.py
@@ -23,6 +23,41 @@

logger = logging.getLogger(__name__)

# Import external loggers
AVAILABLE_LOGGERS: dict[str, Any] = {}

try:
from anomalib.loggers import AnomalibCometLogger

AVAILABLE_LOGGERS["comet"] = AnomalibCometLogger
except ImportError:
logger.debug("Comet logger not available. Install using `pip install comet-ml`")
try:
from anomalib.loggers import AnomalibMLFlowLogger

AVAILABLE_LOGGERS["mlflow"] = AnomalibMLFlowLogger
except ImportError:
logger.debug("MLflow logger not available. Install using `pip install mlflow`")
try:
from anomalib.loggers import AnomalibTensorBoardLogger

AVAILABLE_LOGGERS["tensorboard"] = AnomalibTensorBoardLogger
except ImportError:
logger.debug("TensorBoard logger not available. Install using `pip install tensorboard`")
try:
from anomalib.loggers import AnomalibWandbLogger

AVAILABLE_LOGGERS["wandb"] = AnomalibWandbLogger
except ImportError:
logger.debug("Weights & Biases logger not available. Install using `pip install wandb`")

LOGGERS_AVAILABLE = len(AVAILABLE_LOGGERS) > 0

if LOGGERS_AVAILABLE:
logger.info(f"Available loggers: {', '.join(AVAILABLE_LOGGERS.keys())}")
else:
logger.warning("No external loggers available. Install required packages using `anomalib install -v`")
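The pattern above — building a registry of whichever optional backends import cleanly — can be sketched generically (stand-in module names, not the anomalib loggers):

```python
import logging
from typing import Any

logger = logging.getLogger(__name__)

AVAILABLE: dict[str, Any] = {}

# Each optional backend is guarded so a missing extra degrades to a
# debug message instead of raising ImportError at module import time.
try:
    import json as json_backend  # stand-in for an installed optional dep
    AVAILABLE["json"] = json_backend
except ImportError:
    logger.debug("json backend not available")

try:
    import not_a_real_backend  # stand-in for a missing optional dep
    AVAILABLE["fake"] = not_a_real_backend
except ImportError:
    logger.debug("fake backend not available")

print(sorted(AVAILABLE))  # ['json']
```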

Contributor suggested (on lines +26 to +60) replacing the eager try/except import block above with a lazy helper:

Suggested change:

def try_create_logger(logger_class: str, config: dict[str, dict[str, Any]]) -> Logger | None:
    """Try to import and instantiate a logger class.

    Args:
        logger_class (str): The name of the logger class to import.
        config (dict[str, dict[str, Any]]): The configuration for the logger.

    Returns:
        Logger | None: The logger instance, or None if the package is not installed.
    """
    try:
        module = importlib.import_module("anomalib.loggers")
        logger_cls = getattr(module, f"Anomalib{logger_class}Logger")
        return logger_cls(**config)
    except (ImportError, ModuleNotFoundError, AttributeError):
        logger.info(
            f"{logger_class} logger not available. Please install the respective package.",
        )
        return None
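The suggested helper resolves a class by name at call time via `importlib.import_module` plus `getattr`. The same pattern, self-contained against the stdlib (since `anomalib.loggers` may not be importable here):

```python
import importlib
from typing import Any

def try_create(module_name: str, class_name: str, **kwargs: Any):
    """Instantiate module_name.class_name, or return None if unavailable."""
    try:
        module = importlib.import_module(module_name)
        cls = getattr(module, class_name)
        return cls(**kwargs)
    except (ImportError, AttributeError):
        # Missing module or missing attribute: degrade gracefully.
        return None

print(type(try_create("logging", "StreamHandler")).__name__)  # StreamHandler
print(try_create("no_such_module", "X"))                      # None
print(try_create("logging", "NoSuchClass"))                   # None
```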


class BenchmarkJob(Job):
"""Benchmarking job.
@@ -69,6 +104,7 @@ def run(
accelerator=self.accelerator,
devices=devices,
default_root_dir=temp_dir,
logger=self._initialize_loggers(self.flat_cfg or {}) if LOGGERS_AVAILABLE else [],
Contributor suggested:

Suggested change:
-            logger=self._initialize_loggers(self.flat_cfg or {}) if LOGGERS_AVAILABLE else [],
+            logger=self._initialize_loggers(self.flat_cfg),

)
fit_start_time = time.time()
engine.fit(self.model, self.datamodule)
@@ -89,8 +125,44 @@
**test_results[0],
}
logger.info(f"Completed with result {output}")
# Logging metrics to External Loggers (excluding TensorBoard)
trainer = engine.trainer()
Contributor commented:
Are you sure that you can call trainer?

for logger_instance in trainer.loggers:
if any(
isinstance(logger_instance, AVAILABLE_LOGGERS.get(name, object))
for name in ["comet", "wandb", "mlflow"]
):
logger_instance.log_metrics(test_results[0])
logger.debug(f"Successfully logged metrics to {logger_instance.__class__.__name__}")
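One subtlety in the filter above: `AVAILABLE_LOGGERS.get(name, object)` falls back to `object` for any backend that is not installed, and every instance passes `isinstance(x, object)`, so the filter silently matches loggers it was meant to exclude. A minimal demonstration of the pitfall (hypothetical stand-in class):

```python
AVAILABLE_LOGGERS: dict[str, type] = {}  # e.g. comet/wandb/mlflow not installed

class TensorBoardLike:
    """Stand-in for a logger the filter was meant to exclude."""

matches = any(
    isinstance(TensorBoardLike(), AVAILABLE_LOGGERS.get(name, object))
    for name in ["comet", "wandb", "mlflow"]
)
print(matches)  # True: the `object` fallback matches every instance
```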
Contributor suggested (on lines +128 to +136) dropping the isinstance filter and the `engine.trainer()` call:

Suggested change:
-# Logging metrics to External Loggers (excluding TensorBoard)
-trainer = engine.trainer()
-for logger_instance in trainer.loggers:
-    if any(
-        isinstance(logger_instance, AVAILABLE_LOGGERS.get(name, object))
-        for name in ["comet", "wandb", "mlflow"]
-    ):
-        logger_instance.log_metrics(test_results[0])
-        logger.debug(f"Successfully logged metrics to {logger_instance.__class__.__name__}")
+# Logging metrics to External Loggers (excluding TensorBoard)
+for logger_instance in engine.trainer.loggers:
+    logger_instance.log_metrics(test_results[0])
+    logger.info(f"Successfully logged metrics to {logger_instance.__class__.__name__}")

return output

@staticmethod
def _initialize_loggers(logger_configs: dict[str, dict[str, Any]]) -> list[Any]:
"""Initialize configured external loggers.

Args:
logger_configs: Dictionary mapping logger names to their configurations.

Returns:
List of initialized loggers.
"""
active_loggers = []
default_configs = {
"tensorboard": {"save_dir": "logs/benchmarks"},
"comet": {"project_name": "anomalib"},
"wandb": {"project": "anomalib"},
"mlflow": {"experiment_name": "anomalib"},
}

for logger_name, logger_class in AVAILABLE_LOGGERS.items():
# Use provided config or fall back to defaults
config = logger_configs.get(logger_name, default_configs.get(logger_name, {}))
logger_instance = logger_class(**config)
active_loggers.append(logger_instance)
logger.info(f"Successfully initialized {logger_name} logger")

return active_loggers
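The two-level fallback in `_initialize_loggers` — user config, then per-logger default, then empty kwargs — is just nested `dict.get` calls. A small sketch with hypothetical configs:

```python
defaults = {
    "wandb": {"project": "anomalib"},
    "mlflow": {"experiment_name": "anomalib"},
}
user_config = {"wandb": {"project": "my-project"}}

def resolve(name: str) -> dict:
    # Provided config wins; otherwise the default; otherwise no kwargs.
    return user_config.get(name, defaults.get(name, {}))

print(resolve("wandb"))   # {'project': 'my-project'}
print(resolve("mlflow"))  # {'experiment_name': 'anomalib'}
print(resolve("comet"))   # {}
```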
Contributor suggested (on lines +139 to +164) reworking `_initialize_loggers` to use the `try_create_logger` helper:

Suggested change:

@staticmethod
def _initialize_loggers(training_config: dict[str, dict[str, Any]]) -> list[Logger]:
    """Initialize configured external loggers.

    Args:
        training_config: Dictionary mapping logger names to their configurations.

    Returns:
        List of initialized loggers.
    """
    active_loggers: list[Logger] = []
    default_configs = {
        "TensorBoard": {"save_dir": "logs/benchmarks"},
        "Comet": {"project_name": "anomalib"},
        "Wandb": {"project": "anomalib"},
        "MLFlow": {"experiment_name": "anomalib"},
    }
    for logger_name, default_config in default_configs.items():
        # Use provided config or fall back to defaults
        config = training_config.get(logger_name.lower(), default_config)
        logger_instance = try_create_logger(logger_name, config)
        if logger_instance is None:
            continue
        active_loggers.append(logger_instance)
        logger.info(f"Successfully initialized {logger_name} logger")
    return active_loggers

Author replied:
Hey @ashwinvaidya17,
Thank you for your help and for providing the enhanced implementation, I really appreciate it. Apologies for the delayed response; I was dealing with some health issues that kept me away.
I tried testing the changes locally, both with and without the backend logger dependencies installed. I followed the build steps from the documentation but ran into some issues. Here are the steps I followed:

1. Created and activated the conda environment:

   conda create -n anomalib_dev python=3.10
   conda activate anomalib_dev

2. Did a full installation with all dependencies:

   anomalib install --option full

3. Tried running the benchmark tool, using the command from this documentation:

   anomalib benchmark --config tools/benchmarking/benchmark_params.yaml

I looked into tools/experimental/benchmarking and found there is no such .yaml file, so I tried the command with the other config files. In every case the error I faced was:

   To use other subcommand using `anomalib install`
   Usage: anomalib [-h] [-c CONFIG] [--print_config [=flags]] {install} ...
   error: argument subcommand: invalid choice: 'benchmark' (choose from 'install')

I know I might be doing something wrong and may have missed something, but I've gone through the documentation thoroughly and can't figure it out. Could you please guide me on what I might be doing wrong and how to proceed with testing?

Thanks again for your time and help!
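The `invalid choice: 'benchmark'` error in the comment above is what argparse-style CLIs emit when a subcommand was never registered; anomalib's CLI registers most subcommands only when their optional dependencies are importable. A minimal sketch of that behaviour (plain argparse, not anomalib's actual CLI code):

```python
import argparse

deps_available = False  # pretend the full install is missing

parser = argparse.ArgumentParser(prog="anomalib")
sub = parser.add_subparsers(dest="subcommand")
sub.add_parser("install")
if deps_available:
    # benchmark/train/etc. only exist when the extras import cleanly
    sub.add_parser("benchmark")

args = parser.parse_args(["install"])
print(args.subcommand)  # install
# parser.parse_args(["benchmark"]) would exit with:
# error: argument subcommand: invalid choice: 'benchmark' (choose from 'install')
```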


@staticmethod
def collect(results: list[dict[str, Any]]) -> pd.DataFrame:
"""Gather the results returned from run."""
2 changes: 1 addition & 1 deletion src/anomalib/utils/exceptions/imports.py
@@ -22,7 +22,7 @@ def try_import(import_path: str) -> bool:

warnings.warn(
"The 'try_import' function is deprecated and will be removed in v2.0.0. "
-    "Use 'package_available' from lightning-utilities instead.",
+    "Use 'module_available' from lightning-utilities instead.",
DeprecationWarning,
stacklevel=2,
)