A sophisticated, professional tool for analyzing Coralogix logs and detecting failures with intelligent remediation suggestions. This tool provides both basic pattern-based and LLM-driven analysis capabilities.
- **Dual Analysis Modes**: Both pattern-based and AI-driven analysis
- **LLM Integration**: GPT-4-powered intelligent investigation
- **Rich Reporting**: Beautiful console output with detailed analysis
- **Professional CLI**: Command-line interface with comprehensive options
- **MCP Support**: Model Context Protocol integration for AI assistants
- **Fast & Efficient**: Optimized for production use
- **Error Handling**: Robust error handling and validation
- **Extensible**: Modular architecture for easy extension
- Python 3.8 or higher
- Coralogix API key with `DataQuerying` permission
- OpenAI API key (for LLM-driven analysis)
```bash
# Clone the repository
git clone https://github.com/your-org/coralogix-log-analyzer.git
cd coralogix-log-analyzer

# Install dependencies
pip install -r requirements.txt

# Install the package
pip install -e .
```

Or install from PyPI:

```bash
pip install coralogix-log-analyzer
```

Configure your environment:

```bash
export CORALOGIX_API_KEY="test"
export CORALOGIX_TEAM_HOSTNAME="test"
export CORALOGIX_DOMAIN="test"        # or your domain
export CORALOGIX_VERIFY_SSL="false"   # set to "false" for development if SSL issues occur
export OPENAI_API_KEY="test"          # for LLM analysis
```

```bash
# Analyze logs using pattern-based detection
python main.py analyze "Why is my app failing?" --namespace my-app

# Or use the installed command
coralogix-analyzer analyze "Why is my app failing?" --namespace my-app
```

```bash
# Use AI-powered investigation
python main.py investigate "What's causing the performance issues?" --namespace production

# Or use the installed command
coralogix-analyzer investigate "What's causing the performance issues?" --namespace production
```

```bash
# Check service connectivity
python main.py health
```

```bash
# Start MCP server for AI assistant integration
python mcp_server.py

# Or use the installed command
coralogix-mcp
```

The tool provides a comprehensive CLI with multiple commands:
```bash
# Basic pattern-based analysis
coralogix-analyzer analyze [QUERY] [OPTIONS]

# LLM-driven investigation
coralogix-analyzer investigate [QUERY] [OPTIONS]

# Health check
coralogix-analyzer health

# MCP server
coralogix-mcp

# Version information
coralogix-analyzer version
```
### Analysis Options
```bash
# Basic analysis with options
coralogix-analyzer analyze "Check for errors" \
  --namespace my-app \
  --pod my-pod \
  --hours-back 2 \
  --output results.txt \
  --format json

# LLM investigation with options
coralogix-analyzer investigate "Why is the app slow?" \
  --namespace production \
  --hours-back 4 \
  --max-iterations 10 \
  --output investigation.json \
  --format json
```

The tool supports multiple output formats:
- Text: Human-readable console output (default)
- JSON: Structured data for programmatic consumption
```bash
# Save results to file
coralogix-analyzer analyze "Check errors" --output results.txt

# Get JSON output
coralogix-analyzer investigate "Analyze failures" --format json --output results.json
```

| Variable | Description | Required | Default |
|---|---|---|---|
| `CORALOGIX_API_KEY` | Coralogix API key | Yes | - |
| `CORALOGIX_TEAM_HOSTNAME` | Team hostname | Yes | - |
| `CORALOGIX_DOMAIN` | Coralogix domain | No | `eu2.coralogix.com` |
| `CORALOGIX_VERIFY_SSL` | SSL certificate verification | No | `true` |
| `OPENAI_API_KEY` | OpenAI API key | Yes (for LLM) | - |
| `LLM_MODEL` | LLM model to use | No | `gpt-4.1` |
| `LLM_TEMPERATURE` | LLM temperature | No | `0.1` |
| `LLM_MAX_TOKENS` | Max tokens for response | No | `1000` |
| `LLM_MAX_ITERATIONS` | Max investigation iterations | No | `5` |
| `LOG_LEVEL` | Logging level | No | `INFO` |
| `DEBUG` | Enable debug mode | No | `false` |
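As a rough illustration of how required variables and defaults might be validated (this mirrors what `src/config/environment.py` could do; the variable names and defaults come from the table, but the function itself is hypothetical):

```python
# Hypothetical sketch of environment handling; the project's actual
# loader in src/config/environment.py may differ.
import os

REQUIRED_VARS = ["CORALOGIX_API_KEY", "CORALOGIX_TEAM_HOSTNAME"]
DEFAULTS = {
    "CORALOGIX_DOMAIN": "eu2.coralogix.com",
    "CORALOGIX_VERIFY_SSL": "true",
    "LLM_MODEL": "gpt-4.1",
    "LOG_LEVEL": "INFO",
}

def load_settings(env=None):
    """Validate required variables and merge defaults into a settings dict."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise ValueError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
    settings = {name: env.get(name, default) for name, default in DEFAULTS.items()}
    settings.update({name: env[name] for name in REQUIRED_VARS})
    # Booleans arrive as strings, so coerce explicitly.
    settings["CORALOGIX_VERIFY_SSL"] = settings["CORALOGIX_VERIFY_SSL"].lower() != "false"
    return settings
```

Failing fast on missing configuration at startup produces the "Missing required environment variables" error described under Troubleshooting, rather than an opaque failure mid-analysis.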
You can also use a `.env` file:

```bash
# .env
CORALOGIX_API_KEY=your-api-key
CORALOGIX_TEAM_HOSTNAME=your-team
OPENAI_API_KEY=your-openai-key
LLM_MODEL=gpt-4
DEBUG=true
```

```
coralogix-log-analyzer/
├── src/
│   ├── __init__.py
│   ├── config/
│   │   ├── __init__.py
│   │   ├── settings.py
│   │   └── environment.py
│   ├── models/
│   │   ├── __init__.py
│   │   ├── log_entry.py
│   │   ├── failure_pattern.py
│   │   ├── tool_result.py
│   │   └── investigation_result.py
│   ├── core/
│   │   ├── __init__.py
│   │   ├── coralogix_client.py
│   │   ├── llm_client.py
│   │   └── analyzer.py
│   ├── cli/
│   │   ├── __init__.py
│   │   └── main.py
│   └── mcp/
│       ├── __init__.py
│       ├── server.py
│       └── tools.py
├── tests/
├── docs/
├── main.py
├── mcp_server.py
├── requirements.txt
├── pyproject.toml
├── README.md
└── MCP_INTEGRATION.md
```
- Settings: Centralized configuration with validation
- Environment: Environment variable loading and validation
- LogEntry: Represents individual log entries
- FailurePattern: Detected failure patterns with metadata
- ToolResult: Results from tool executions
- InvestigationResult: Complete investigation outcomes
- CoralogixClient: Professional API client for Coralogix
- LLMClient: OpenAI integration for AI-driven analysis
- BasicAnalyzer: Pattern-based analysis engine
- LLMAnalyzer: LLM-driven investigation engine
- Main CLI: Professional command-line interface
- Commands: Analyze, investigate, health check, version
- MCP Server: Model Context Protocol server for AI assistant integration
- Tools: Standardized tools for log analysis and investigation
Fast, predictable analysis using predefined patterns:
```python
from src.core.analyzer import BasicAnalyzer
from src.core.coralogix_client import CoralogixClient

# Initialize
client = CoralogixClient(config)
analyzer = BasicAnalyzer(client, settings)

# Analyze
result = analyzer.analyze("Check for errors", "my-namespace")
```

Features:
- Fast execution (milliseconds)
- Predictable results
- Low cost (no LLM calls)
- Simple to understand and modify
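As a simplified illustration of the pattern-based approach (the pattern names and regexes below are assumptions for demonstration, not the analyzer's actual rule set):

```python
# Illustrative sketch of pattern-based failure detection. The real
# BasicAnalyzer returns FailurePattern objects; here we just count hits.
import re
from collections import Counter

FAILURE_PATTERNS = {
    "oom_kill": re.compile(r"OOMKilled|Out of memory", re.IGNORECASE),
    "crash_loop": re.compile(r"CrashLoopBackOff"),
    "connection_error": re.compile(r"connection (refused|reset|timed out)", re.IGNORECASE),
}

def detect_failures(log_lines):
    """Return a count of matched failure patterns across the given lines."""
    hits = Counter()
    for line in log_lines:
        for name, pattern in FAILURE_PATTERNS.items():
            if pattern.search(line):
                hits[name] += 1
    return dict(hits)
```

Because each line is checked against a fixed table of compiled regexes, the cost is linear in the number of logs and the results are fully deterministic.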
Intelligent, adaptive analysis using AI:
```python
from src.core.analyzer import LLMAnalyzer
from src.core.llm_client import LLMClient

# Initialize
llm_client = LLMClient(config)
analyzer = LLMAnalyzer(client, llm_client, settings)

# Investigate
result = analyzer.investigate("Why is my app failing?", "my-namespace")
```

Features:
- AI-driven analysis
- Adaptive investigation
- Context-aware results
- Intelligent remediation
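The investigation is iterative: the LLM proposes tool calls, their results feed back into the next round, and the loop ends when the model is satisfied or `LLM_MAX_ITERATIONS` is reached. An illustrative, simplified skeleton of that control flow (the `decide` and `run_tool` callables stand in for the real LLM and Coralogix calls):

```python
# Hypothetical skeleton of the investigate loop; the real LLMAnalyzer
# wires `decide` to llm_analyze_and_decide and `run_tool` to the
# Coralogix client. Names here are illustrative only.
def investigate(query, decide, run_tool, max_iterations=5):
    """Iterate until `decide` returns no further tool calls."""
    summary = ""
    tool_results = []
    for _ in range(max_iterations):
        summary, tool_calls = decide(query, tool_results)
        if not tool_calls:  # the model has enough context to answer
            return summary
        for call in tool_calls:
            tool_results.append(run_tool(call))
    return summary  # iteration budget exhausted; return best answer so far
```

Capping the loop with `max_iterations` bounds both latency and LLM cost, which is why `LLM_MAX_ITERATIONS` defaults to a small value.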
```python
class CoralogixClient:
    def health_check(self) -> Tuple[bool, str]
    def fetch_logs(self, namespace: str, pod_name: Optional[str] = None,
                   hours_back: int = 1, limit: int = 1000) -> List[LogEntry]
    def search_errors(self, namespace: str, search_pattern: str,
                      hours_back: int = 1) -> List[LogEntry]
    def get_pod_status(self, namespace: str, pod_name: Optional[str] = None) -> Dict[str, Any]
```

```python
class BasicAnalyzer:
    def analyze(self, query: str, namespace: str, pod_name: Optional[str] = None,
                hours_back: int = 1) -> InvestigationResult
    def detect_failures(self, logs: List[LogEntry]) -> List[FailurePattern]
```

```python
class LLMAnalyzer:
    def investigate(self, query: str, namespace: str, pod_name: Optional[str] = None,
                    hours_back: int = 1) -> InvestigationResult
    def llm_analyze_and_decide(self, user_input: str,
                               tool_results: List[ToolResult] = None) -> Tuple[str, List[Dict[str, Any]]]
```

If you encounter SSL certificate verification errors like:
```
❌ Connection failed: Connection failed: HTTPSConnectionPool(host='ng-api-http.jfrog-dev.app.coralogix.us', port=443): Max retries exceeded with url: /api/v1/health (Caused by SSLError(SSLCertVerificationError(1, "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: Hostname mismatch, certificate is not valid for 'ng-api-http.jfrog-dev.app.coralogix.us'. (_ssl.c:1028)")))
```
Solutions:

1. **For Development/Testing**: Disable SSL verification temporarily:

   ```bash
   export CORALOGIX_VERIFY_SSL="false"
   ```

2. **For Production**: Verify your domain configuration:

   ```bash
   # Check if the domain is correct
   export CORALOGIX_DOMAIN="eu2.coralogix.com"  # use the default domain
   # or
   export CORALOGIX_DOMAIN="your-correct-domain.coralogix.com"
   ```

3. **Check your team hostname**:

   ```bash
   export CORALOGIX_TEAM_HOSTNAME="your-correct-team-hostname"
   ```
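For reference, a minimal stdlib sketch of how a client might translate `CORALOGIX_VERIFY_SSL` into an SSL context (illustrative only; the actual `CoralogixClient` may handle this differently):

```python
# Illustrative only: map the CORALOGIX_VERIFY_SSL env var to an SSLContext.
import os
import ssl

def build_ssl_context():
    """Return a default SSL context, weakened only when explicitly requested."""
    ctx = ssl.create_default_context()
    if os.environ.get("CORALOGIX_VERIFY_SSL", "true").lower() == "false":
        # Development only: skip hostname and certificate checks.
        # check_hostname must be disabled before verify_mode is relaxed.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

Note that the variable defaults to verification enabled, so certificate checks are only skipped on an explicit opt-out.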
- "Missing required environment variables": Ensure all required environment variables are set
- "Connection failed": Check your API key and domain configuration
- "No logs found": Verify the namespace and time range settings
```bash
# Install development dependencies
pip install -e ".[dev]"

# Run all tests
pytest

# Run with coverage
pytest --cov=src

# Run specific test file
pytest tests/test_analyzer.py
```

```bash
# Format code
black src/

# Lint code
flake8 src/

# Type checking
mypy src/
```

We welcome contributions! Please see our Contributing Guide for details.
```bash
# Clone the repository
git clone https://github.com/your-org/coralogix-log-analyzer.git
cd coralogix-log-analyzer

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install development dependencies
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install
```

We use:
- Black for code formatting
- Flake8 for linting
- MyPy for type checking
- Pre-commit for automated checks
This project is licensed under the MIT License - see the LICENSE file for details.
For detailed information about the Model Context Protocol (MCP) integration, see MCP_INTEGRATION.md.
- Built with inspiration from the HolmesGPT project
- Uses OpenAI's GPT models for intelligent analysis
- Integrates with Coralogix for log aggregation
- MCP integration for AI assistant compatibility
- Documentation: GitHub Wiki
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Made with ❤️ for the DevOps community