SparkAnalyzer
A plugin that analyzes your Spark profiler reports and gives back performance advice
SparkAnalyzer
A Minecraft plugin that analyzes Spark profiler data using AI to provide actionable performance recommendations.
Features
AI-Powered Analysis
- Supports multiple AI providers: OpenAI, Anthropic (Claude), Google Gemini, and OpenRouter
- Evidence-based diagnostics with severity ratings (Critical, High, Medium, Low)
- Structured analysis reports with specific recommendations
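The four severity tiers could be modeled as an ordered enum. This is an illustrative sketch only; the plugin's actual type names are not published:

```java
// Hypothetical representation of the severity tiers used in analysis
// reports, ordered from most to least severe.
public enum Severity {
    CRITICAL, HIGH, MEDIUM, LOW;

    /** True if this finding is at least as severe as {@code other}. */
    public boolean atLeast(Severity other) {
        // Lower ordinal = more severe, since CRITICAL is declared first.
        return this.ordinal() <= other.ordinal();
    }
}
```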
Easy to Use
- Simply run `/sparkanalyze <spark-url>` with any spark.lucko.me profile link
- Upload raw JSON data with `/sparkanalyze upload`
- Re-analyze the last profile with `/sparkanalyze last`
Webhook Integration
- Send analysis reports to Discord, Slack, or custom webhooks
- Automatic retry with exponential backoff
- Auto-disable on repeated failures
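The retry behavior described above (exponential backoff plus auto-disable after consecutive failures) can be sketched roughly as follows. Class and method names here are assumptions; the plugin's real implementation may differ:

```java
// Sketch of webhook retry/auto-disable bookkeeping, driven by the
// webhook.max-retries and webhook.failure-threshold config values.
public final class WebhookRetry {
    private final int failureThreshold;  // config: webhook.failure-threshold
    private int consecutiveFailures = 0;
    private boolean enabled = true;

    public WebhookRetry(int failureThreshold) {
        this.failureThreshold = failureThreshold;
    }

    /** Exponential backoff: the base delay doubles per attempt (0-indexed). */
    public static long backoffDelayMillis(long baseMillis, int attempt) {
        return baseMillis << attempt; // e.g. 1s, 2s, 4s, ...
    }

    /** Record one delivery outcome; disables the webhook after too many failures. */
    public void recordResult(boolean success) {
        if (success) {
            consecutiveFailures = 0;
        } else if (++consecutiveFailures >= failureThreshold) {
            enabled = false; // auto-disable on repeated failures
        }
    }

    public boolean isEnabled() { return enabled; }
}
```

Resetting the failure counter on any success keeps a transient outage from permanently disabling an otherwise healthy webhook.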
Internationalization
- Full i18n support with customizable language files
- English (en-US) included by default
Security Hardened
- No backdoors, telemetry, or hidden network calls
- API keys are never logged
- All network activity is user-triggered only
Default Configuration

```yaml
# SparkAnalyzer Configuration
# Security-hardened Minecraft Paper plugin for AI-powered Spark analysis

# Language setting - uses standard locale codes
# Available: en-US (default), vi-VN, ja-JP, etc.
# If an invalid code is provided, falls back to en-US
language: en-US

# AI Provider Configuration
ai:
  # Supported providers: openai, anthropic, gemini, openrouter
  provider: openrouter

  # Your API key - NEVER share this publicly
  # This value is loaded securely and never logged
  api-key: your_api_key_here

  # Model to use for analysis
  # OpenAI: gpt-4.1-mini, gpt-4o, gpt-4-turbo
  # Anthropic: claude-3-opus, claude-3-sonnet
  # Gemini: gemini-pro
  # OpenRouter: any model from openrouter.ai (e.g., openai/gpt-4o,
  #             anthropic/claude-3.5-sonnet, meta-llama/llama-3.1-70b-instruct)
  model: liquid/lfm-2.5-1.2b-thinking:free

  # Temperature controls randomness (0.0 = deterministic, 2.0 = creative)
  # For technical analysis, lower values (0.1-0.3) are recommended
  # Valid range: 0.0 to 2.0
  temperature: 0.2

  # Maximum tokens in the AI response
  # Higher values allow for more detailed analysis
  # Must be greater than 0
  max-tokens: 32000

# Webhook Configuration (Optional)
webhook:
  # Set to true to enable webhook notifications
  enabled: false

  # Webhook URL - only this URL will receive notifications
  # Example: https://discord.com/api/webhooks/...
  url: ""

  # Format: discord, slack, or generic
  format: discord

  # Include full analysis report in webhook (may be truncated)
  include-full-report: false

  # Maximum retry attempts for failed webhooks
  max-retries: 3

  # Auto-disable webhook after this many consecutive failures
  failure-threshold: 5

# Report Settings
reports:
  # Save detailed reports to plugins/SparkAnalyzer/reports/
  save-to-file: true

  # Include timestamps in report filenames
  include-timestamp: true

# Debug Settings (for troubleshooting only)
debug:
  # Enable verbose logging (does NOT log API keys)
  enabled: false
```
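The validation rules stated in the config comments (locale fallback to en-US, temperature valid in 0.0-2.0, max-tokens greater than 0) could be enforced along these lines. The method names are illustrative, and the known-locale set below contains only the examples listed in the config:

```java
import java.util.Set;

// Sketch of config-value validation matching the rules documented in
// config.yml; not the plugin's actual API.
public final class ConfigValidation {
    // Assumption: in the real plugin this would reflect the shipped language files.
    private static final Set<String> KNOWN_LOCALES = Set.of("en-US", "vi-VN", "ja-JP");

    /** Unknown locale codes fall back to en-US. */
    public static String resolveLanguage(String configured) {
        return KNOWN_LOCALES.contains(configured) ? configured : "en-US";
    }

    /** Valid range is 0.0 to 2.0; out-of-range values revert to the shipped default. */
    public static double resolveTemperature(double configured) {
        return (configured >= 0.0 && configured <= 2.0) ? configured : 0.2;
    }

    /** max-tokens must be greater than 0; otherwise revert to the shipped default. */
    public static int resolveMaxTokens(int configured) {
        return configured > 0 ? configured : 32000;
    }
}
```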


Requirements
- Paper 1.16.5 - 1.21.11
- Java 17+
- Spark profiler installed
Permissions
| Permission | Description |
|---|---|
| `sparkanalyzer.analyze` | Use analysis commands |
| `sparkanalyzer.reload` | Reload configuration |