"Hit the right spot with your energy prices"
Home Assistant custom integration providing electricity spot prices from global markets with intelligent interval handling (15-minute, hourly, 5-minute) and automatic source fallback.
If you find this project useful, please consider sponsoring the development on GitHub Sponsors: https://github.com/sponsors/enoch85
- Installation
- Supported Price Sources & Regions
- Features
- Configuration
- Architecture
- Usage Examples
- Troubleshooting
- For Developers
- Contributing
- Technical Architecture (Advanced)
## Installation

GE-Spot is available in the default HACS store!
- Make sure HACS is installed
- Click the button above, or go to HACS → Integrations
- Click the "+ EXPLORE & DOWNLOAD REPOSITORIES" button
- Search for "GE-Spot" or "Global Electricity Spot Prices"
- Click "Download"
- Restart Home Assistant
- Go to Settings → Devices & Services → Add Integration → Search for "GE-Spot"
Manual installation:
- Copy the `ge_spot` directory from this repository to your Home Assistant's `custom_components` directory
- Restart Home Assistant
## Supported Price Sources & Regions

The integration supports multiple price data sources with automatic fallback capabilities:
- Nordpool - Nordic (Norway, Sweden, Denmark, Finland), Baltic (Estonia, Latvia, Lithuania), and Central/Western Europe (Germany, Austria, Belgium, France, Netherlands, Poland)
- ENTSO-E - European Network of Transmission System Operators (requires API key)
- Energy-Charts - European spot prices (Germany, France, Netherlands, Belgium, Austria, and more)
- Energi Data Service - Denmark
- Stromligning - Denmark
- OMIE - Spain and Portugal
- AEMO - Australian Energy Market Operator
- ComEd - Chicago area real-time pricing
- Amber - Australian residential pricing
The table below shows which price sources support specific regions:
| Region | Description | Nordpool | ENTSO-E | Energy-Charts | Energi Data | Stromligning | OMIE | AEMO | ComEd | Amber |
|---|---|---|---|---|---|---|---|---|---|---|
| AT | Austria | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| BE | Belgium | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| BG | Bulgaria |  | ✓ | ✓ |  |  |  |  |  |  |
| CH | Switzerland |  | ✓ | ✓ |  |  |  |  |  |  |
| ComEd | Chicago Area |  |  |  |  |  |  |  | ✓ |  |
| CZ | Czech Republic |  | ✓ | ✓ |  |  |  |  |  |  |
| DE | Germany | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| DE-LU | Germany-Luxembourg | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| DK1-2 | Denmark | ✓ | ✓ | ✓ | ✓ | ✓ |  |  |  |  |
| EE | Estonia | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| ES | Spain |  | ✓ | ✓ |  |  | ✓ |  |  |  |
| FI | Finland | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| FR | France | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| GR | Greece |  | ✓ | ✓ |  |  |  |  |  |  |
| HR | Croatia |  | ✓ | ✓ |  |  |  |  |  |  |
| HU | Hungary |  | ✓ | ✓ |  |  |  |  |  |  |
| IT-Centre-North | Italy Centre-North |  | ✓ |  |  |  |  |  |  |  |
| IT-Centre-South | Italy Centre-South |  | ✓ |  |  |  |  |  |  |  |
| IT-North | Italy North |  | ✓ | ✓ |  |  |  |  |  |  |
| IT-Sardinia | Italy Sardinia |  | ✓ |  |  |  |  |  |  |  |
| IT-Sicily | Italy Sicily |  | ✓ |  |  |  |  |  |  |  |
| IT-South | Italy South |  | ✓ |  |  |  |  |  |  |  |
| LT | Lithuania | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| LV | Latvia | ✓ | ✓ |  |  |  |  |  |  |  |
| ME | Montenegro |  | ✓ |  |  |  |  |  |  |  |
| NL | Netherlands | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| NO1-5 | Norway | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| NSW1 | Australia NSW |  |  |  |  |  |  | ✓ |  | ✓ |
| PL | Poland | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| PT | Portugal |  | ✓ | ✓ |  |  | ✓ |  |  |  |
| QLD1 | Australia Queensland |  |  |  |  |  |  | ✓ |  | ✓ |
| RO | Romania |  | ✓ | ✓ |  |  |  |  |  |  |
| RS | Serbia |  | ✓ | ✓ |  |  |  |  |  |  |
| SA1 | Australia South |  |  |  |  |  |  | ✓ |  | ✓ |
| SE1-4 | Sweden | ✓ | ✓ | ✓ |  |  |  |  |  |  |
| SI | Slovenia |  | ✓ | ✓ |  |  |  |  |  |  |
| SK | Slovakia |  | ✓ | ✓ |  |  |  |  |  |  |
| TAS1 | Australia Tasmania |  |  |  |  |  |  | ✓ |  | ✓ |
| VIC1 | Australia Victoria |  |  |  |  |  |  | ✓ |  | ✓ |
For complete area mappings, see const/areas.py.
## Features

- Flexible intervals - Handles 15-min, hourly, and 5-min data from different markets
- Unified output - Standardizes to 15-minute intervals (96 data points per day)
- Multi-source fallback - Automatic switching between data sources
- Global coverage - Europe, Australia, and North America
- Currency conversion - Live ECB exchange rates
- Timezone handling - Consistent display regardless of API source
- Tomorrow's prices - Available after daily publication (typically 13:00 CET)
- EV Smart Charging integration - Native support for EV Smart Charging via `today_interval_prices` and `tomorrow_interval_prices` attributes
- Current Price - Current 15-minute interval price
- Next Interval Price - Upcoming interval price
- Average Price - Today's average
- Peak/Off-Peak Price - Today's high/low
- Price Difference - Current vs average (absolute)
- Price Percentage - Current vs average (relative)
- Hourly Average Price - Current hour's average (calculated from 15-min intervals)
- Tomorrow Average Price - Tomorrow's average forecast
- Tomorrow Peak/Off-Peak Price - Tomorrow's high/low forecasts
- Tomorrow Hourly Average Price - Tomorrow's hourly averages
See docs/hourly_average_sensors.md for details on hourly average sensors.
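The hourly average sensors simply average the four 15-minute values within each hour. A minimal sketch of that calculation, using the same entry layout as the `today_interval_prices` attribute (the function name is illustrative, not the integration's internal API):

```python
from collections import defaultdict
from datetime import datetime

def hourly_averages(interval_prices: list[dict]) -> dict[int, float]:
    """Average 15-minute interval prices per hour of day."""
    buckets: dict[int, list[float]] = defaultdict(list)
    for entry in interval_prices:
        hour = datetime.fromisoformat(entry["time"]).hour
        buckets[hour].append(entry["value"])
    return {hour: sum(values) / len(values) for hour, values in buckets.items()}

# Four 15-minute intervals in hour 0 collapse to one hourly average
prices = [
    {"time": "2025-10-14T00:00:00+02:00", "value": 0.08},
    {"time": "2025-10-14T00:15:00+02:00", "value": 0.10},
    {"time": "2025-10-14T00:30:00+02:00", "value": 0.12},
    {"time": "2025-10-14T00:45:00+02:00", "value": 0.10},
]
print(round(hourly_averages(prices)[0], 4))  # 0.1
```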
## Configuration

After installation:
- Go to Configuration → Integrations
- Click "Add Integration" and search for "GE-Spot: Global Electricity Spot Prices"
- Select your region/area from the dropdown
- Configure settings:
- Region/Area: Select your electricity price area (e.g. SE4, DK1)
- Source Priority: Order of data sources to try (first = highest priority)
- VAT Rate: Set your applicable VAT percentage (e.g. 25 for 25%)
The order in which you select data sources determines their priority. To set a specific source as your primary:
- Uncheck all source options
- Select your preferred primary source first (e.g., ENTSO-E)
- Continue selecting additional sources in your desired priority order
- Submit the configuration
The first selected source becomes your highest priority, and the integration will attempt to use sources in the order you configured them.
- Display Format: Choose between decimal (e.g. 0.15 EUR/kWh) or subunit (e.g. 15 cents/kWh)
- Additional Tariff: Add grid/transfer fees from your provider (per kWh, applied before VAT)
- Energy Tax: Add fixed energy tax per kWh (e.g., government levy, applied before VAT)
- Timezone Reference: Display prices in Home Assistant timezone or local area timezone
- API Keys: For ENTSO-E, you'll need to register for an API key
- API Key Reuse: The integration will reuse API keys across different regions using the same source
- Rate limiting - Minimum 15-minute intervals
- Automatic retries - Exponential backoff for failed requests (5s → 15s → 45s)
- Data caching - Persistent storage with TTL
- Intelligent interval validation - DST-aware validation ensures complete data:
- Normal days: Expects 96 intervals (15-min × 96 = 24 hours)
- DST spring forward: Expects 92 intervals (23 hours)
- DST fall back: Expects 100 intervals (25 hours)
- Strict validation: Allows only 1 missing interval (15 minutes) tolerance
- Automatic fallback: Switches to alternative sources when data is incomplete
- Source fallback - Try all sources in priority order until complete data is found
- Daily health check - All configured sources validated once per day during special windows
- Source health monitoring - Track which sources are working vs failed, with retry schedules
Example: If ENTSO-E returns 94/96 intervals (missing 30 minutes), the system automatically:
- Detects incomplete data (94 < 95 minimum required)
- Logs warning about missing intervals
- Tries next configured source (e.g., Energy Charts)
- Uses complete data from working source
- Caches complete result for future requests
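The validate-then-fall-back behaviour can be sketched as follows; the function names and the dict-of-fetchers interface are illustrative, not the integration's actual internals:

```python
def expected_intervals(hours_in_day: int = 24) -> int:
    """96 on normal days, 92 on DST spring forward (23 h), 100 on fall back (25 h)."""
    return hours_in_day * 4

def fetch_complete_day(sources: dict, hours_in_day: int = 24):
    """Try each source in priority order until one returns a complete day.

    `sources` maps a source name to a zero-argument fetch callable
    returning a list of interval prices (hypothetical interface).
    """
    required = expected_intervals(hours_in_day) - 1  # tolerate 1 missing interval
    for name, fetch in sources.items():
        intervals = fetch()
        if len(intervals) >= required:
            return name, intervals
        print(f"Incomplete data from {name}: "
              f"{len(intervals)}/{expected_intervals(hours_in_day)} intervals")
    raise RuntimeError("All sources returned incomplete data")

# ENTSO-E returns 94/96 intervals, so the next source is used instead
sources = {
    "entsoe": lambda: [0.1] * 94,
    "energy_charts": lambda: [0.1] * 96,
}
name, data = fetch_complete_day(sources)
print(name)  # energy_charts
```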
## Architecture

Data Flow: API Client → Parser → Timezone Conversion → Currency Conversion → Cache → Sensors
Three-Layer System:
- API Layer - Source-specific clients (Nordpool, ENTSO-E, AEMO, etc.)
- Coordinator Layer - Unified manager with fallback and caching
- Sensor Layer - Home Assistant entities with consistent IDs
- Source timezone detection - Each API has known timezone behavior
- DST transitions - Handles 92-100 intervals on transition days automatically
- Interval validation - Ensures data completeness before acceptance:
- Validates exact interval count matches expected (92/96/100 depending on DST)
- Tolerates 1 missing interval (15 minutes) for API timing edge cases
- Rejects incomplete data (2+ missing intervals = 30+ minutes)
- Automatically tries alternative sources when primary source is incomplete
- 15-minute alignment - All data normalized to :00, :15, :30, :45 boundaries
- Home Assistant integration - Displays in your configured timezone
Conversion Pipeline: Raw API Data → Currency Conversion → Unit Conversion → VAT Application → Display Formatting
Currency handling:
- Live ECB exchange rates (24h cache)
- Automatic currency detection by region
- Display in main units (EUR/kWh) or subunits (cents/kWh)
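As an illustration, converting an EUR/MWh market price into local subunits looks roughly like this (the 11.30 SEK/EUR rate is a made-up example value; real rates come from the ECB feed):

```python
def to_display_price(eur_per_mwh: float, fx_rate: float, use_subunit: bool) -> float:
    """Convert an EUR/MWh spot price to local currency per kWh (or subunit)."""
    per_kwh = eur_per_mwh / 1000 * fx_rate            # EUR/MWh -> local/kWh
    return per_kwh * 100 if use_subunit else per_kwh  # e.g. SEK -> öre

price = to_display_price(85.0, fx_rate=11.30, use_subunit=True)
print(round(price, 2))  # 85 EUR/MWh ≈ 96.05 öre/kWh at 11.30 SEK/EUR
```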
Different sources include different price components:
| Source | Price Components |
|---|---|
| Nordpool/ENTSO-E/OMIE | Raw spot price |
| Stromligning | Spot + grid fees + taxes (Denmark) |
| AEMO | Pre-dispatch trading prices (30-min intervals) |
| ComEd | Real-time market pricing (5-min dispatch) |
| Amber | Spot + network + carbon costs |
GE-Spot intelligently handles different native resolutions from APIs:
| Source | Native Data | GE-Spot Processing |
|---|---|---|
| ENTSO-E | 15/30/60 min | Uses native 15-min when available, expands others |
| Nordpool | 15/60 min | Uses native 15-min, expands hourly to 15-min |
| Energy-Charts | 15 min | Uses native 15-min data (96 intervals/day) |
| OMIE/Stromligning | 60 min | Expands hourly to 15-min (duplicates across 4 intervals) |
| AEMO | 30 min trading | Expands to 15-min (duplicates across 2 intervals) |
| ComEd | 5 min dispatch | Aggregates to 15-min (averages 3 values per interval) |
| Amber | 30 min | Expands to 15-min (duplicates across 2 intervals) |
Strategy: All sources output 96 intervals per day (15-minute granularity) for consistent automation timing.
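A rough sketch of the expansion/aggregation step (simplified: the real pipeline also carries timestamps and timezone information):

```python
def expand(values: list[float], factor: int) -> list[float]:
    """Duplicate each coarse value across `factor` 15-minute intervals."""
    return [v for v in values for _ in range(factor)]

def aggregate(values: list[float], group: int) -> list[float]:
    """Average each run of `group` fine-grained values into one 15-minute interval."""
    return [sum(values[i:i + group]) / group for i in range(0, len(values), group)]

hourly = [10.0] * 24             # e.g. OMIE: 24 hourly prices
print(len(expand(hourly, 4)))    # 96 intervals/day

five_min = [1.0, 2.0, 3.0] * 96  # e.g. ComEd: 288 five-minute prices
print(len(aggregate(five_min, 3)))  # 96 intervals/day
```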
## Usage Examples

A basic entities card:

```yaml
type: entities
entities:
  - entity: sensor.gespot_current_price_se4
    name: Current Electricity Price (15-min interval)
  - entity: sensor.gespot_next_interval_price_se4
    name: Next Interval Price
  - entity: sensor.gespot_average_price_se4
    name: Today's Average
  - entity: sensor.gespot_hourly_average_price_se4
    name: Current Hour Average
  - entity: sensor.gespot_tomorrow_average_price_se4
    name: Tomorrow's Average
```
A price chart using the `custom:apexcharts-card` frontend integration:

```yaml
type: custom:apexcharts-card
now:
  show: true
  label: ""
graph_span: 2d
span:
  start: day
apex_config:
  chart:
    height: 300px
  legend:
    show: false
  xaxis:
    labels:
      format: HH:mm
  grid:
    borderColor: "#e0e0e0"
    strokeDashArray: 3
  tooltip:
    x:
      format: HH:mm
  annotations:
    yaxis:
      - "y": 0
        yAxisIndex: 0
        strokeDashArray: 0
        borderColor: rgba(128, 128, 128, 0.8)
        borderWidth: 2
        opacity: 1
yaxis:
  - id: watts
    decimals: 0
  - id: price
    decimals: 0
    opposite: true
experimental:
  color_threshold: true
series:
  - entity: sensor.gespot_current_price_se4
    name: Price (öre)
    type: area
    curve: stepline
    yaxis_id: price
    extend_to: now
    stroke_width: 0
    opacity: 0.7
    data_generator: |
      return [
        ...(entity.attributes.today_interval_prices || []).map(item => [new Date(item.time), item.value]),
        ...(entity.attributes.tomorrow_interval_prices || []).map(item => [new Date(item.time), item.value])
      ];
    color_threshold:
      - value: -50
        color: cyan
      - value: 0
        color: green
      - value: 40
        color: orange
      - value: 100
        color: red
      - value: 200
        color: magenta
      - value: 500
        color: black
  - entity: sensor.YOUR_ENERGY_METER
    name: Watts
    type: line
    curve: smooth
    yaxis_id: watts
    color: "#FF0000"
    stroke_width: 2
    opacity: 0.5
    extend_to: false
    group_by:
      func: avg
      duration: 5min
update_interval: 300s
```

The price sensors expose interval prices through attributes in a standardized format compatible with various integrations:
Attribute Format:
```json
{
  "today_interval_prices": [
    {"time": "2025-10-14T00:00:00+02:00", "value": 0.0856, "raw_value": 0.0754},
    {"time": "2025-10-14T00:15:00+02:00", "value": 0.0842, "raw_value": 0.0740},
    ...
  ],
  "tomorrow_interval_prices": [
    {"time": "2025-10-15T00:00:00+02:00", "value": 0.0891, "raw_value": 0.0789},
    ...
  ]
}
```

Key Points:
- Each price entry contains:
  - `time`: ISO 8601 datetime string in your Home Assistant timezone
  - `value`: Final consumer price (with VAT, tariffs, and energy taxes applied)
  - `raw_value`: Market spot price (currency and unit converted only, no VAT/fees; new in v1.6.0)
- List contains 96 entries for a normal day (15-minute intervals)
- During DST transitions: 92 entries (spring) or 100 entries (fall)
- Compatible with EV Smart Charging, ApexCharts, and custom automations
Price Calculation:
value = ((raw_value + additional_tariff + energy_tax) × (1 + VAT%)) × display_unit_multiplier
When no VAT, tariffs, or taxes are configured, raw_value equals value.
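The same calculation in Python form (`unit_multiplier` stands in for the display unit multiplier, e.g. 1 for EUR/kWh or 100 for cents/kWh):

```python
def final_price(raw_value: float, tariff: float = 0.0, tax: float = 0.0,
                vat_pct: float = 0.0, unit_multiplier: float = 1.0) -> float:
    """Apply tariff and tax before VAT, then scale for the display unit."""
    return (raw_value + tariff + tax) * (1 + vat_pct / 100) * unit_multiplier

# 0.0754/kWh spot + 0.01 tariff at 25% VAT
print(final_price(0.0754, tariff=0.01, vat_pct=25))  # ≈ 0.10675

# With nothing configured, value equals raw_value
print(final_price(0.0754))  # 0.0754
```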
Using in Templates:

```jinja
# Get final consumer price at 14:00
{{ state_attr('sensor.gespot_current_price_se3', 'today_interval_prices')
   | selectattr('time', 'search', 'T14:00')
   | map(attribute='value')
   | first }}

# Get raw market price at 14:00 (without VAT/fees)
{{ state_attr('sensor.gespot_current_price_se3', 'today_interval_prices')
   | selectattr('time', 'search', 'T14:00')
   | map(attribute='raw_value')
   | first }}

# Get all prices above 0.10
{{ state_attr('sensor.gespot_current_price_se3', 'today_interval_prices')
   | map(attribute='value')
   | select('>', 0.10)
   | list }}

# Compare market prices to final prices
{% set prices = state_attr('sensor.gespot_current_price_se3', 'today_interval_prices') %}
Market avg: {{ prices | map(attribute='raw_value') | average | round(4) }}
Final avg: {{ prices | map(attribute='value') | average | round(4) }}
Difference: {{ ((prices | map(attribute='value') | average) - (prices | map(attribute='raw_value') | average)) | round(4) }}

# Count intervals with negative market prices
{{ state_attr('sensor.gespot_current_price_se3', 'today_interval_prices')
   | map(attribute='raw_value')
   | select('<', 0)
   | list
   | length }}
```

A simple price-based automation:

```yaml
automation:
  - alias: Turn on water heater when prices are low
    trigger:
      - platform: state
        entity_id: sensor.gespot_current_price_se4
    condition:
      - condition: template
        value_template: "{{ states('sensor.gespot_current_price_se4')|float < states('sensor.gespot_average_price_se4')|float * 0.8 }}"
    action:
      - service: switch.turn_on
        entity_id: switch.water_heater
```

To integrate GE-Spot with the Energy Dashboard, you can create template sensors:
```yaml
template:
  - sensor:
      - name: "Energy Cost Sensor"
        unit_of_measurement: "SEK/kWh"
        state: "{{ states('sensor.gespot_current_price_se4') }}"
```

Then set this sensor as your energy cost sensor in the Energy Dashboard settings.
## Troubleshooting

Common Issues:
- No data - Check area is supported by selected source
- API key errors - Verify ENTSO-E API key if using that source
- Missing tomorrow prices - Available after 13:00 CET daily
- 96 data points - Correct! 15-minute intervals = 96 per day (92 on DST spring, 100 on DST fall)
- Incomplete data warnings - If you see warnings about incomplete intervals:
  - System automatically tries alternative sources
  - Check `active_source` in sensor attributes to see which source is being used
  - Configure multiple sources for better reliability
  - Example: `[NL] Incomplete today data from entsoe: 94/96 intervals (missing 2)` → system switches to Energy-Charts
Source Health Monitoring:
Check sensor attributes for source health information:
```yaml
sensor.gespot_current_price_se4:
  attributes:
    source_info:
      active_source: "nordpool"  # Currently used source
      validated_sources:         # Sources that are working
        - "nordpool"
        - "entsoe"
      failed_sources:            # Sources that failed (if any)
        - source: "energy_charts"
          failed_at: "2025-10-10T17:36:42+02:00"
          retry_at: "2025-10-11T13:00:00+02:00"
```

- validated_sources: List of sources that have been tested and are working
- failed_sources: List of sources that failed, with timestamps and retry schedule
- active_source: The source currently providing data
Diagnostics:
- Check sensor attributes: `data_source`, `active_source`, `using_cached_data`
- Review Home Assistant logs for `ge_spot` errors
- Configure multiple sources for better reliability
## For Developers

Project structure:

```
custom_components/ge_spot/
├── __init__.py                  # Integration setup and coordinator registration
├── config_flow.py               # Configuration flow handler
├── manifest.json                # Integration manifest and dependencies
├── api/                         # API clients for different price sources
│   ├── __init__.py              # API client factory and source mapping
│   ├── base/                    # Base classes and shared functionality
│   │   ├── api_client.py        # HTTP client wrapper with retry logic
│   │   ├── base_price_api.py    # Abstract base for all price APIs
│   │   └── error_handler.py     # Error handling and retry mechanisms
│   ├── parsers/                 # Data parsers for each API source
│   │   ├── nordpool_parser.py   # Nord Pool price data parser
│   │   ├── entsoe_parser.py     # ENTSO-E XML response parser
│   │   ├── aemo_parser.py       # AEMO market data parser
│   │   └── ...                  # Other source-specific parsers
│   ├── nordpool.py              # Nord Pool API client
│   ├── entsoe.py                # ENTSO-E API client
│   ├── energy_charts.py         # Energy-Charts API client
│   ├── aemo.py                  # AEMO API client
│   ├── omie.py                  # OMIE API client
│   ├── stromligning.py          # Strømligning API client
│   ├── energi_data.py           # Energi Data Service API client
│   ├── comed.py                 # ComEd API client
│   ├── amber.py                 # Amber Electric API client
│   └── utils.py                 # API utility functions
├── config_flow/                 # Configuration flow logic
│   ├── __init__.py              # Config flow exports
│   ├── implementation.py        # Main configuration steps
│   ├── options.py               # Options flow for reconfiguration
│   ├── schemas.py               # Voluptuous schemas for validation
│   ├── utils.py                 # Config flow utility functions
│   └── validators.py            # Custom validation logic
├── const/                       # Constants and configuration
│   ├── __init__.py              # Constants exports
│   ├── api.py                   # API-specific constants
│   ├── areas.py                 # Area codes and mappings
│   ├── attributes.py            # Sensor attribute constants
│   ├── config.py                # Configuration keys
│   ├── currencies.py            # Currency codes and mappings
│   ├── defaults.py              # Default configuration values
│   ├── display.py               # Display format constants
│   ├── energy.py                # Energy unit constants
│   ├── errors.py                # Error message constants
│   ├── intervals.py             # Interval-related constants
│   ├── network.py               # Network and timeout constants
│   ├── sensors.py               # Sensor type constants
│   ├── sources.py               # Source definitions and mappings
│   └── time.py                  # Time and timezone constants
├── coordinator/                 # Data coordination and management
│   ├── __init__.py              # Coordinator exports
│   ├── unified_price_manager.py # Main price data orchestrator
│   ├── fallback_manager.py      # Source fallback logic
│   ├── data_processor.py        # Raw data processing
│   ├── cache_manager.py         # Data caching with TTL
│   ├── fetch_decision.py        # Fetch timing decisions
│   ├── data_validity.py         # Data validation logic
│   └── api_key_manager.py       # API key management
├── price/                       # Price data processing
│   ├── __init__.py              # Price processing exports
│   ├── currency_converter.py    # Currency and unit conversion
│   ├── currency_service.py      # ECB exchange rate service
│   ├── formatter.py             # Price display formatting
│   └── statistics.py            # Price statistics calculation
├── sensor/                      # Home Assistant sensor entities
│   ├── __init__.py              # Sensor exports
│   ├── base.py                  # Base sensor class
│   ├── electricity.py           # Main sensor setup
│   └── price.py                 # Individual price sensor types
├── timezone/                    # Timezone handling
│   ├── __init__.py              # Timezone exports
│   ├── converter.py             # Main timezone conversion
│   ├── dst_handler.py           # DST transition handling
│   ├── interval_calculator.py   # 15-min interval calculations
│   ├── parser.py                # Timestamp parsing
│   ├── service.py               # Timezone service orchestrator
│   ├── source_tz.py             # Source-specific timezone logic
│   ├── timezone_converter.py    # Core timezone conversion
│   └── timezone_utils.py        # Timezone utility functions
├── translations/                # UI translations
│   ├── en.json                  # English translations
│   └── strings.json             # Translation strings
└── utils/                       # Utility functions
    ├── __init__.py              # Utilities exports
    ├── advanced_cache.py        # Advanced caching implementation
    ├── data_validator.py        # Data validation helpers
    ├── date_range.py            # Date range utilities
    ├── debug_utils.py           # Debugging helpers
    ├── exchange_service.py      # ECB exchange rate fetching
    ├── form_helper.py           # Configuration form helpers
    ├── parallel_fetcher.py      # Parallel API fetching
    ├── rate_limiter.py          # API rate limiting
    ├── timezone_converter.py    # Timezone conversion utilities
    ├── unit_conversion.py       # Unit conversion helpers
    └── validation/              # Validation modules
```
To add a new price source:

1. Create API Client: Extend `BasePriceAPI` in `api/new_source.py`
2. Create Parser: Add `api/parsers/new_source_parser.py` for data parsing
3. Register Source:
   - Add to `const/sources.py` (source constants and mappings)
   - Update `const/areas.py` (supported regions)
4. Update Config Flow: Enable source selection in configuration
5. Add Tests: Unit and integration tests for reliability
To run the tests:

- Unit Tests: `pytest tests/pytest/unit/`
- Integration Tests: `pytest tests/pytest/integration/`
- Manual Testing: `python -m tests.manual.integration.source_test AREA`
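A heavily simplified skeleton of the client step; the real `BasePriceAPI` interface lives in `api/base/base_price_api.py` and differs from this illustrative stand-in:

```python
# Illustrative skeleton only -- the real BasePriceAPI interface differs.
import asyncio
from abc import ABC, abstractmethod

class BasePriceAPI(ABC):
    """Stand-in for the abstract base in api/base/base_price_api.py."""

    @abstractmethod
    async def fetch_raw_data(self, area: str) -> dict:
        """Fetch raw price data for an area from the remote API."""

class NewSourceAPI(BasePriceAPI):
    """Hypothetical client shape for api/new_source.py."""

    async def fetch_raw_data(self, area: str) -> dict:
        # In the real integration: HTTP request via the shared client wrapper,
        # payload handed to api/parsers/new_source_parser.py afterwards.
        return {"area": area, "prices": []}

result = asyncio.run(NewSourceAPI().fetch_raw_data("SE4"))
print(result["area"])  # SE4
```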
## Contributing

Want to help improve GE-Spot? Check out the TODO folder for a list of tasks!
We've organized contribution opportunities into categories:
- Testing - Add tests for better reliability
- Code Quality - Improve maintainability
- Documentation - Help new contributors
- Enhancements - Add monitoring and features
- Future Features - Long-term ideas
Pick something that interests you, no deadlines or pressure. See the TODO/README.md for details.
## Technical Architecture (Advanced)

Data and control flow through the integration:

```mermaid
flowchart TD
    Config["User Configuration"] --> Coord["UnifiedPriceCoordinator"]
    Coord --> UPM["UnifiedPriceManager"]
    UPM --> LoadCache["Load Cached Data<br/>(with DataValidity)"]
    LoadCache --> ExtractValidity["Extract DataValidity"]
    ExtractValidity --> FetchDecision["FetchDecisionMaker<br/>(DataValidity-driven)"]
    FetchDecision --> |"Should fetch"| FilterSources["Pre-filter Failed Sources<br/>(skip sources with failure timestamp)"]
    FetchDecision --> |"Rate limited or<br/>data still valid"| UseCache["Use Cached Data"]
    FilterSources --> FallbackMgr["FallbackManager<br/>(Exponential Backoff:<br/>5s → 15s → 45s)"]
    FallbackMgr --> |"Try source 1"| API1["API Client 1<br/>(attempt 1-3)"]
    FallbackMgr --> |"Try source 2"| API2["API Client 2<br/>(attempt 1-3)"]
    FallbackMgr --> |"Try source N"| APIN["API Client N<br/>(attempt 1-3)"]
    API1 --> |"Success"| Parser1["Parser 1"]
    API2 --> |"Success"| Parser2["Parser 2"]
    APIN --> |"Success"| ParserN["Parser N"]
    API1 --> |"Failed"| MarkFailed1["Mark source 1 failed<br/>(timestamp = now)"]
    API2 --> |"Failed"| MarkFailed2["Mark source 2 failed"]
    APIN --> |"Failed"| MarkFailedN["Mark source N failed"]
    MarkFailed1 -.-> HealthCheckBG["Continuous Health Check Task<br/>(running in background)"]
    MarkFailed2 -.-> HealthCheckBG
    MarkFailedN -.-> HealthCheckBG
    HealthCheckBG -.-> |"Every 15 min check<br/>for special windows<br/>(00:00-01:00 or 13:00-15:00)"| HealthCheck["Validate ALL sources<br/>(during window)"]
    HealthCheck --> UpdateAllStatus["Update all source statuses"]
    UpdateAllStatus -.-> FilterSources
    Parser1 --> Validate["Validate Parsed Data"]
    Parser2 --> Validate
    ParserN --> Validate
    Validate --> |"Valid"| ClearFailed["Clear failure status<br/>(timestamp = None)"]
    ClearFailed --> RawData["Raw Standardized Data"]
    RawData --> DataProcessor["DataProcessor"]
    subgraph DataProcessor["Data Processing Pipeline"]
        direction TB
        TZ["Timezone Conversion"] --> Currency["Currency Conversion"]
        Currency --> VAT["VAT Application"]
        VAT --> Stats["Statistics Calculation"]
        Stats --> CalcValidity["Calculate DataValidity"]
    end
    DataProcessor --> CreateIPD["Create IntervalPriceData<br/>(source data + metadata)"]
    CreateIPD --> CacheStore["CacheManager.store()<br/>(stores IntervalPriceData)"]
    CacheStore --> IPDObject["IntervalPriceData Object<br/>(in memory)"]
    UseCache --> LoadIPD["Load IntervalPriceData<br/>(from cache)"]
    LoadIPD --> IPDObject
    IPDObject --> Sensors["Home Assistant Sensors<br/>(access via @property)"]
```
Fetch decision logic:

```mermaid
flowchart TD
    Start["Coordinator Update Trigger"] --> LoadCache["Load Cache + DataValidity"]
    LoadCache --> CheckCurrent{"DataValidity:<br/>has_current_interval?"}
    CheckCurrent --> |"FALSE<br/>(CRITICAL)"| RateLimitCritical{"Rate Limited?"}
    CheckCurrent --> |"TRUE"| CheckInitial{"First fetch ever?"}
    RateLimitCritical --> |"Yes"| UseCache["Use Cached Data<br/>(if available)"]
    RateLimitCritical --> |"No"| FetchNow["FETCH IMMEDIATELY<br/>(no current data)"]
    CheckInitial --> |"Yes (never fetched)"| FetchNow
    CheckInitial --> |"No"| CheckBuffer{"DataValidity:<br/>intervals_remaining<br/>< 8 intervals?"}
    CheckBuffer --> |"Yes<br/>(running low)"| RateLimitBuffer{"Rate Limited?"}
    CheckBuffer --> |"No"| CheckTomorrowWindow{"In tomorrow window?<br/>(13:00-15:00)"}
    RateLimitBuffer --> |"Yes"| UseCache
    RateLimitBuffer --> |"No"| FetchNow
    CheckTomorrowWindow --> |"Yes"| CheckTomorrowData{"DataValidity:<br/>tomorrow_interval_count<br/>< 76?"}
    CheckTomorrowWindow --> |"No"| CheckRateLimit{"Rate Limited?<br/>(< 15 min)"}
    CheckTomorrowData --> |"Yes<br/>(need tomorrow)"| RateLimitTomorrow{"Rate Limited?"}
    CheckTomorrowData --> |"No<br/>(have tomorrow)"| CheckRateLimit
    RateLimitTomorrow --> |"Yes"| UseCache
    RateLimitTomorrow --> |"No"| FetchNow
    CheckRateLimit --> |"Yes"| UseCache
    CheckRateLimit --> |"No"| FetchNow
    FetchNow --> PreFilter["Pre-filter Failed Sources<br/>(before FallbackManager)"]
    PreFilter --> CheckAvailable{"Any sources<br/>available?"}
    CheckAvailable --> |"No"| UseCache
    CheckAvailable --> |"Yes"| FallbackMgr["FallbackManager<br/>(try each source)"]
    FallbackMgr --> Attempt1{"Source 1<br/>Attempt 1 (5s)"}
    Attempt1 --> |"Success"| Success["Parse & Validate Data"]
    Attempt1 --> |"Fail"| Attempt2{"Source 1<br/>Attempt 2 (15s)"}
    Attempt2 --> |"Success"| Success
    Attempt2 --> |"Fail"| Attempt3{"Source 1<br/>Attempt 3 (45s)"}
    Attempt3 --> |"Success"| Success
    Attempt3 --> |"Fail"| NextSource{"More sources?"}
    NextSource --> |"Yes"| Attempt1
    NextSource --> |"No"| AllFailed["All Sources Failed"]
    AllFailed --> UseCache
    Success --> ClearFailed["Clear failure status<br/>(timestamp = None)"]
    ClearFailed --> ProcessData["Process & Cache<br/>(create IntervalPriceData)"]
    ProcessData --> IPDNew["IntervalPriceData<br/>(in memory)"]
    UseCache --> IPDCached["IntervalPriceData<br/>(from cache)"]
    IPDNew --> UpdateSensors["Update Sensors<br/>(via @property access)"]
    IPDCached --> UpdateSensors
    UpdateSensors --> End["Wait for Next Update"]
```
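The decision logic above reduces to roughly the following; parameter names mirror the chart's labels, not the integration's actual API:

```python
def should_fetch(has_current_interval: bool, first_fetch: bool,
                 intervals_remaining: int, in_tomorrow_window: bool,
                 tomorrow_interval_count: int, rate_limited: bool) -> bool:
    """Decide whether to call the APIs now, mirroring the fetch-decision chart."""
    if not has_current_interval:
        return not rate_limited      # critical: no price for the current interval
    if first_fetch:
        return True                  # never fetched before
    if intervals_remaining < 8:
        return not rate_limited      # less than ~2 hours of data left
    if in_tomorrow_window and tomorrow_interval_count < 76:
        return not rate_limited      # 13:00-15:00 window, tomorrow incomplete
    return not rate_limited          # otherwise only the 15-min rate limit applies

# No data for the current interval and not rate limited -> fetch immediately
print(should_fetch(False, False, 0, False, 0, False))  # True
# Plenty of data and rate limited -> use cache
print(should_fetch(True, False, 96, False, 96, True))  # False
```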
Caching, rate limiting, and source health:

```mermaid
flowchart TD
    subgraph CacheManager["Cache Manager"]
        direction TB
        CacheGet["get(area, target_date, source)"] --> CacheCheck{"Cache exists & valid?"}
        CacheCheck --> |"Yes"| CacheHit["Return cached data<br/>(with DataValidity)<br/>(deep copy to prevent mutation)"]
        CacheCheck --> |"No"| CacheMiss["Return None"]
        CacheStore["store(area, source, data, timestamp, target_date)"] --> Serialize["Serialize data<br/>(includes DataValidity)"]
        Serialize --> WriteFile["Write to .storage/"]
        CacheCleanup["cleanup()"] --> FindExpired["Find expired entries"]
        FindExpired --> DeleteExpired["Delete expired files"]
    end
    subgraph RateLimiter["Rate Limiter"]
        direction TB
        RLCheck["should_skip_fetch()"] --> GracePeriod{"In grace period?<br/>(startup/reload)"}
        GracePeriod --> |"Yes"| AllowFetch["Allow Fetch<br/>(bypass rate limit)"]
        GracePeriod --> |"No"| LastFetch{"Last fetch time"}
        LastFetch --> |"< 15 min ago"| Blocked["Rate Limited<br/>(use cache)"]
        LastFetch --> |"≥ 15 min ago"| AllowFetch
        RLUpdate["update_last_fetch(source, area)"] --> StoreTime["Store current timestamp"]
    end
    subgraph FailedSourceTracking["Source Health & Validation"]
        direction TB
        FailedDict["self._failed_sources<br/>Dict[str, Optional[datetime]]"] --> CheckStatus{"Check source status"}
        CheckStatus --> |"timestamp = None"| SourceOK["Source validated<br/>(include in fetch)"]
        CheckStatus --> |"timestamp exists"| SourceDisabled["Source disabled<br/>(skip until health check)"]
        OnSuccess["On API success"] --> ClearTimestamp["Set timestamp = None"]
        OnFailure["On API failure"] --> SetTimestamp["Set timestamp = now()"]
        HealthCheckTask["Health Check Background Task<br/>(started at init, runs continuously)"] --> CheckLoop["Sleep 15 minutes"]
        CheckLoop --> InWindow{"In special window?<br/>(00:00-01:00 or 13:00-15:00)"}
        InWindow --> |"No"| CheckLoop
        InWindow --> |"Yes"| CheckLastWindow{"Last window hour checked<br/>!= current window hour?"}
        CheckLastWindow --> |"Same window"| CheckLoop
        CheckLastWindow --> |"Different window"| RandomDelay["Random delay 0-3600s<br/>(spread load)"]
        RandomDelay --> ValidateAll["Validate ALL sources<br/>(not just failed ones)"]
        ValidateAll --> UpdateStatus["Update all source statuses<br/>(clear or set timestamps)"]
        UpdateStatus --> MarkWindow["Mark window hour checked<br/>(0 or 13)"]
        MarkWindow --> CheckLoop
    end
    subgraph Integration["Integration Flow"]
        FetchRequest["Fetch Request"] --> RLCheck
        Blocked --> CacheGet
        CacheGet --> LoadIPD["Load IntervalPriceData<br/>(from cache)"]
        AllowFetch --> PreFilter["Pre-filter disabled sources<br/>(sources with failure timestamp)"]
        PreFilter --> APICall["API Call with<br/>Exponential Backoff"]
        APICall --> |"Success"| OnSuccess
        APICall --> |"Failure (all attempts)"| OnFailure
        OnSuccess --> ProcessData["Create IntervalPriceData<br/>(from API response)"]
        ProcessData --> CacheStore
        CacheStore --> NewIPD["IntervalPriceData<br/>(in memory)"]
        OnFailure --> CacheGet
        LoadIPD --> Sensors["Sensors access via<br/>@property"]
        NewIPD --> Sensors
    end
```
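A minimal version of the rate limiter's core check from the chart (grace period plus 15-minute threshold); the class shape is illustrative, not the integration's actual implementation:

```python
class RateLimiter:
    """Skip fetches attempted within 15 minutes of the last one,
    except during a startup/reload grace period."""

    MIN_INTERVAL = 15 * 60  # seconds

    def __init__(self, grace_until: float = 0.0):
        self._last_fetch: dict[tuple[str, str], float] = {}
        self._grace_until = grace_until

    def should_skip_fetch(self, source: str, area: str, now: float) -> bool:
        if now < self._grace_until:
            return False  # grace period: always allow
        last = self._last_fetch.get((source, area))
        return last is not None and now - last < self.MIN_INTERVAL

    def update_last_fetch(self, source: str, area: str, now: float) -> None:
        self._last_fetch[(source, area)] = now

rl = RateLimiter()
rl.update_last_fetch("nordpool", "SE4", now=1000.0)
print(rl.should_skip_fetch("nordpool", "SE4", now=1000.0 + 60))       # True
print(rl.should_skip_fetch("nordpool", "SE4", now=1000.0 + 16 * 60))  # False
```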
Fallback and exponential backoff:

```mermaid
flowchart TD
    FallbackStart["FallbackManager.fetch_with_fallback()"] --> GetSources["Get enabled API instances<br/>(already pre-filtered)"]
    GetSources --> SourceLoop{"More sources to try?"}
    SourceLoop --> |"Yes"| NextSource["Get next source"]
    SourceLoop --> |"No"| FallbackFail["All sources failed"]
    NextSource --> RetryLoop["Exponential Backoff Retry Loop"]
    subgraph RetryLoop["Retry with Exponential Backoff"]
        direction TB
        Try1["Attempt 1: timeout=5s"] --> Check1{"Success?"}
        Check1 --> |"Yes"| Success1["✓ Return data"]
        Check1 --> |"No"| Try2["Attempt 2: timeout=15s"]
        Try2 --> Check2{"Success?"}
        Check2 --> |"Yes"| Success2["✓ Return data"]
        Check2 --> |"No"| Try3["Attempt 3: timeout=45s"]
        Try3 --> Check3{"Success?"}
        Check3 --> |"Yes"| Success3["✓ Return data"]
        Check3 --> |"No"| Failed["✗ Source failed"]
    end
    Success1 --> ValidateData{"Data valid?"}
    Success2 --> ValidateData
    Success3 --> ValidateData
    ValidateData --> |"Valid"| ParseData["Parse with source parser"]
    ValidateData --> |"Invalid"| LogError["Log error"]
    ParseData --> |"Success"| MarkWorking["Clear failure status<br/>(self._failed_sources[source] = None)"]
    ParseData --> |"Parse Error"| LogError
    MarkWorking --> FallbackSuccess["Return parsed data"]
    Failed --> MarkFailed["Mark source as failed<br/>(self._failed_sources[source] = now)"]
    MarkFailed --> LogError
    LogError --> SourceLoop
    FallbackSuccess --> End["Success - data ready for processing"]
    FallbackFail --> End2["Failure - use cache or empty result"]
    subgraph HealthCheckValidation["Continuous Health Check (Background Task)"]
        direction TB
        BGTask["Health check task runs continuously<br/>(started at initialization)"] --> Sleep["Sleep 15 minutes"]
        Sleep --> WindowCheck{"In special window?<br/>(00:00-01:00 or 13:00-15:00)"}
        WindowCheck --> |"No"| Sleep
        WindowCheck --> |"Yes, new window"| RandomDelay["Random delay (0-3600s)"]
        RandomDelay --> ValidateAllSources["Validate ALL sources<br/>(failed + working)"]
        ValidateAllSources --> UpdateStatuses["Update all source statuses:<br/>• Success → timestamp = None<br/>• Failure → timestamp = now"]
        UpdateStatuses --> Sleep
    end
    subgraph SourcePriority["Source Priority Example"]
        direction TB
        SE4["SE4 (Sweden)"] --> |"1st"| Nordpool["Nord Pool"]
        SE4 --> |"2nd"| EnergyCharts["Energy-Charts"]
        SE4 --> |"3rd"| ENTSOE["ENTSO-E"]
        DK1["DK1 (Denmark)"] --> |"1st"| NordpoolDK["Nord Pool"]
        DK1 --> |"2nd"| EnergyChartsDK["Energy-Charts"]
        DK1 --> |"3rd"| ENTSODK["ENTSO-E"]
        DK1 --> |"4th"| EnergiData["Energi Data Service"]
        DK1 --> |"5th"| Stromligning["Strømligning"]
    end
```
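The retry loop in the chart (timeouts 5s → 15s → 45s) can be sketched as follows; `fetch_with_backoff` and its callable interface are illustrative, not the actual `FallbackManager` API:

```python
import asyncio

async def fetch_with_backoff(fetch, timeouts=(5, 15, 45)):
    """Try `fetch(timeout)` with increasing timeouts; raise after the last attempt.

    `fetch` is a hypothetical awaitable-returning callable taking a timeout.
    """
    last_error = None
    for timeout in timeouts:
        try:
            return await fetch(timeout)
        except Exception as err:  # in the real code: specific network errors
            last_error = err
    raise RuntimeError("source failed after all attempts") from last_error

# Simulate a source that only succeeds on the third (45s) attempt
attempts = []
async def flaky(timeout):
    attempts.append(timeout)
    if len(attempts) < 3:
        raise TimeoutError("too slow")
    return "data"

result = asyncio.run(fetch_with_backoff(flaky))
print(result)    # data
print(attempts)  # [5, 15, 45]
```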
For detailed cache architecture documentation, see docs/cache_compute_on_demand.md
This integration is licensed under the MIT License.