Louisiana's scraper carefully notes that it's assuming the headers are consistent across all of their files. In 2026, that stopped being the case.
The resulting bad CSV is bringing down the entire WARN stack. It might be possible to work around this temporarily with transformer patches, but that's neither sustainable nor reasonable.
The answer may be to rebuild _clean_rows, which currently does a bunch of lists-of-lists work. That approach isn't sustainable either, but it can work to our advantage when handling partial rows.
One option: let it keep doing its thing with partial rows, and as we loop through, check whether we've hit a new header row. Each time we find one, we start keying subsequent rows into dicts against that header row. With a lookup table, we can also normalize spelling and typing differences across years that probably exist but went unnoticed. A rough sketch of that loop is below.
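Something like the following, as a non-authoritative sketch: the names (`_clean_rows`, `_looks_like_header`, `HEADER_LOOKUP`) and the header variants in the table are illustrative assumptions for this issue, not the scraper's actual API.

```python
# Hypothetical sketch of a header-aware _clean_rows.
# HEADER_LOOKUP entries are made-up examples of per-year variants,
# all mapped to one canonical column name.
HEADER_LOOKUP = {
    "company name": "company",
    "company": "company",
    "no. of employees": "employees",
    "number of employees": "employees",
    "notice date": "notice_date",
}


def _looks_like_header(row):
    """Guess whether a row is a header row: every non-empty cell
    matches a known header variant in the lookup table."""
    cells = [(cell or "").strip().lower() for cell in row]
    cells = [cell for cell in cells if cell]
    return bool(cells) and all(cell in HEADER_LOOKUP for cell in cells)


def _clean_rows(rows):
    """Walk list-of-lists rows, rekeying each data row as a dict
    against the most recently seen header row."""
    headers = None
    cleaned = []
    for row in rows:
        if _looks_like_header(row):
            # New header row: rebuild the canonical key list,
            # falling back to a positional name for blank cells.
            headers = [
                HEADER_LOOKUP.get((cell or "").strip().lower(), f"col_{i}")
                for i, cell in enumerate(row)
            ]
            continue
        if headers is None:
            continue  # data before any header row; skip (or log)
        # Partial rows are fine: zip stops at the shorter sequence,
        # so a short row just produces a dict with fewer keys.
        cleaned.append(dict(zip(headers, row)))
    return cleaned
```

The partial-row behavior falls out of `zip` truncating to the shorter sequence; if we'd rather pad short rows with `None` so every dict has the full key set, `itertools.zip_longest` is the drop-in swap.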