Comparing changes

base repository: JuliaDataCubes/YAXArrayBase.jl
base: v0.6.0
head repository: JuliaDataCubes/YAXArrayBase.jl
compare: master
Commits on Aug 31, 2022

  1. 1f91f2e

Commits on Sep 23, 2022

  1. Fix dimname of DimArray

    This enables dimname for other subtypes of DimensionalData.Dimension.
    felixcremer committed Sep 23, 2022 (6cd13ad)

Commits on Dec 29, 2022

  1. af5b245

Commits on Jan 4, 2023

  1. Merge pull request #15 from JuliaDataCubes/fc/dimdata_mapcube

    Fix dimname of DimArray
    meggart authored Jan 4, 2023 (6261714)
  2. fix axisindices

    meggart committed Jan 4, 2023 (01d5f26)
  3. Merge pull request #17 from JuliaDataCubes/fg/axisindices

    fix axisindices
    meggart authored Jan 4, 2023 (9663704)
  4. read into non-contig memory

    meggart committed Jan 4, 2023 (2cede74)
  5. test on most recent Julia

    meggart committed Jan 4, 2023 (f2c0938)
  6. Update tests

    meggart committed Jan 4, 2023 (3911151)
  7. Merge pull request #14 from JuliaDataCubes/fix_noncontig

    read into non-contig memory
    meggart authored Jan 4, 2023 (8bb1adf)
  8. Update Project.toml

    meggart authored Jan 4, 2023 (59f9af5)

Commits on Feb 21, 2023

  1. Enable readblock of a GDALBand into subarrays

    This uses dispatch so that the existing method still handles a proper Matrix
    output buffer, while other output array types get a contiguous Matrix
    allocated via similar, are read through the Matrix-based method, and the
    result is then copied back.
    felixcremer committed Feb 21, 2023 (0922779)

Commits on Apr 21, 2023

  1. Merge pull request #20 from JuliaDataCubes/fc/gdalbandreadblock

    Enable readblock of a GDALBand into subarrays
    felixcremer authored Apr 21, 2023 (c9d7d6d)

Commits on Jul 31, 2024

  1. Switch to extensions instead of requires for datasets (#24)

    * Switch to extensions instead of requires for datasets
    
    * Move all array interface implementations to extensions
    
    This moves everything from using Requires to the new package-extensions setup.
    
    * Switch the array tests to testitems
    
    This ensures that the tests for the different backends are independent.
    
    * Remove the Requires setup for lower julia versions
    
    We decided to only support Julia versions 1.9 and above.
    
    * Switch CI to test on julia 1.9 instead of 1.6
    
    Since we restrict the package to Julia 1.9, we should also stop testing on 1.6.
    
    * Remove Zarr and NetCDF as dependencies
    
    * Remove AxisIndices
    
    AxisIndices appears to be unmaintained; its last commit was three years ago.
    
    * Stop tracking Manifest.toml
    
    * Remove AxisIndices also from test environment
    
    * Add testing on nightly
    
    * Bump version to 0.7.0
    felixcremer authored Jul 31, 2024 (d779f74)
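One bullet above moves the array tests to test items. As a rough sketch of why that isolates the backends (the test name and assertion below are illustrative, not taken from this PR), a TestItems.jl test is a self-contained block:

```julia
using TestItems

# Each @testitem is evaluated in its own fresh module, so e.g. the Zarr
# tests cannot depend on packages or state loaded by the NetCDF tests.
@testitem "fallback dim names" begin
    using YAXArrayBase
    # hypothetical assertion: the fallback dimname for a plain array is a Symbol
    @test YAXArrayBase.dimname(zeros(2, 2), 1) isa Symbol
end
```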

Commits on Aug 1, 2024

  1. Upgrade Zarr DD and NetCDF compats

    Adds Zarr 0.9, NetCDF 0.12 and DimensionalData 0.27.
    felixcremer committed Aug 1, 2024 (6ce0e57)
  2. Merge pull request #25 from JuliaDataCubes/fc/updeps

    Upgrade Zarr DD and NetCDF compats
    felixcremer authored Aug 1, 2024 (77dd2e0)
  3. Bump version to 0.7.1

    felixcremer authored Aug 1, 2024 (97d7fab)

Commits on Sep 12, 2024

  1. fix gdal extension

    lazarusA committed Sep 12, 2024 (0526f0e)
  2. tests

    lazarusA committed Sep 12, 2024 (09e1e6d)

Commits on Sep 13, 2024

  1. do run_package_tests last

    lazarusA committed Sep 13, 2024 (127650b)
  2. Merge pull request #26 from lazarusA/la/fix_gdal_extension

    fix gdal extension
    felixcremer authored Sep 13, 2024 (2d6f7d7)
  3. where to look

    lazarusA committed Sep 13, 2024 (5374348)
  4. new writeblock method

    lazarusA committed Sep 13, 2024 (f2d4e3e)
  5. readblock chunk aware

    lazarusA committed Sep 13, 2024 (8b8205a)
  6. cs chunks

    lazarusA committed Sep 13, 2024 (aa09c06)
  7. revisit eachchunk

    lazarusA committed Sep 13, 2024 (8b6f623)
  8. use eachchunk

    lazarusA committed Sep 13, 2024 (2853303)
  9. do less

    lazarusA committed Sep 13, 2024 (de4b919)
  10. filename

    lazarusA committed Sep 13, 2024 (370ea14)
  11. do open read

    lazarusA committed Sep 13, 2024 (1d350b1)
  12. do clipping

    lazarusA committed Sep 13, 2024 (d4fac9c)
  13. clip range

    lazarusA committed Sep 13, 2024 (aaf2d23)

Commits on Sep 14, 2024

  1. adds method

    lazarusA committed Sep 14, 2024 (20d594b)

Commits on Sep 16, 2024

  1. minimal working gdal writing

    lazarusA committed Sep 16, 2024 (827e37c)
  2. comment

    lazarusA committed Sep 16, 2024 (674236c)
  3. Merge pull request #27 from JuliaDataCubes/la/write_tif

    La/write tif
    lazarusA authored Sep 16, 2024 (65c0b13)
  4. update badges

    lazarusA committed Sep 16, 2024 (4637a9a)
  5. bump compats

    lazarusA committed Sep 16, 2024 (64a6acd)
  6. Merge pull request #28 from JuliaDataCubes/la/clean_badges

    La/clean badges
    lazarusA authored Sep 16, 2024 (88475da)
  7. patch DD

    lazarusA committed Sep 16, 2024 (f145b62)
  8. Merge pull request #29 from JuliaDataCubes/la/patch

    patch DD
    lazarusA authored Sep 16, 2024 (3e6eb31)

Commits on Nov 8, 2024

  1. Update Project.toml

    DD breaking bump
    lazarusA authored Nov 8, 2024 (3146712)
  2. Update Project.toml

    bump version
    lazarusA authored Nov 8, 2024 (06afb13)
  3. Merge pull request #30 from JuliaDataCubes/lazarusA-patch-1

    Update Project.toml
    lazarusA authored Nov 8, 2024 (7773192)

Commits on Nov 22, 2024

  1. Add option to persist handle to NetCDF files (#31)

    * add interface for keeping handles open for faster dataset opening
    
    * update tests
    
    * test on lts instead of 1.9
    
    * Add dependabot
    
    * test 1.10 since workflows are too old
    meggart authored Nov 22, 2024 (2ebac48)
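The handle-persistence interface added here can be used roughly as follows. This is a sketch based on the open_dataset_handle and get_var_handle functions defined later in this diff; the file name and loop body are illustrative:

```julia
using YAXArrayBase

ds = YAXArrayBase.backendlist[:netcdf]("data.nc")  # hypothetical file

# Without the wrapper, every variable access reopens the file; inside it,
# a single NcFile handle is kept open and reused for all reads.
YAXArrayBase.open_dataset_handle(ds) do ds
    for name in YAXArrayBase.get_varnames(ds)
        v = YAXArrayBase.get_var_handle(ds, name; persist = false)
        # ... read from v while the shared handle stays open ...
    end
end
```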

Commits on Nov 25, 2024

  1. Bump actions/checkout from 2 to 4 (#35)

    Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 4.
    - [Release notes](https://github.com/actions/checkout/releases)
    - [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
    - [Commits](actions/checkout@v2...v4)
    
    ---
    updated-dependencies:
    - dependency-name: actions/checkout
      dependency-type: direct:production
      update-type: version-update:semver-major
    ...
    
    Signed-off-by: dependabot[bot] <support@github.com>
    Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
    dependabot[bot] authored Nov 25, 2024 (65dbfcf)
  2. Bump codecov/codecov-action from 1 to 5 (#34)

    Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 1 to 5.
    - [Release notes](https://github.com/codecov/codecov-action/releases)
    - [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
    - [Commits](codecov/codecov-action@v1...v5)
    
    ---
    updated-dependencies:
    - dependency-name: codecov/codecov-action
      dependency-type: direct:production
      update-type: version-update:semver-major
    ...
    
    Signed-off-by: dependabot[bot] <support@github.com>
    Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
    dependabot[bot] authored Nov 25, 2024 (6fea0bc)

Commits on Dec 17, 2024

  1. Bump julia-actions/setup-julia from 1 to 2 (#33)

    Bumps [julia-actions/setup-julia](https://github.com/julia-actions/setup-julia) from 1 to 2.
    - [Release notes](https://github.com/julia-actions/setup-julia/releases)
    - [Commits](julia-actions/setup-julia@v1...v2)
    
    ---
    updated-dependencies:
    - dependency-name: julia-actions/setup-julia
      dependency-type: direct:production
      update-type: version-update:semver-major
    ...
    
    Signed-off-by: dependabot[bot] <support@github.com>
    Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
    Co-authored-by: Fabian Gans <fgans@bgc-jena.mpg.de>
    dependabot[bot] and meggart authored Dec 17, 2024 (6a7701f)
  2. Bump actions/cache from 1 to 4 (#32)

    Bumps [actions/cache](https://github.com/actions/cache) from 1 to 4.
    - [Release notes](https://github.com/actions/cache/releases)
    - [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
    - [Commits](actions/cache@v1...v4)
    
    ---
    updated-dependencies:
    - dependency-name: actions/cache
      dependency-type: direct:production
      update-type: version-update:semver-major
    ...
    
    Signed-off-by: dependabot[bot] <support@github.com>
    Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
    dependabot[bot] authored Dec 17, 2024 (9ddb084)

Commits on May 3, 2025

  1. replace @info by @debug

    dr-ko committed May 3, 2025 (b192037)
7 changes: 7 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,7 @@
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/" # Location of package manifests
schedule:
interval: "weekly"
12 changes: 7 additions & 5 deletions .github/workflows/CI.yml
@@ -13,20 +13,22 @@ jobs:
       fail-fast: false
       matrix:
         version:
-          - '1.6'
+          - '1.10'
           - '1'
+          - 'nightly'
         os:
           - ubuntu-latest
           - macOS-latest
           - windows-latest
         arch:
           - x64
     steps:
-      - uses: actions/checkout@v2
-      - uses: julia-actions/setup-julia@v1
+      - uses: actions/checkout@v5
+      - uses: julia-actions/setup-julia@v2
         with:
           version: ${{ matrix.version }}
           arch: ${{ matrix.arch }}
-      - uses: actions/cache@v1
+      - uses: actions/cache@v4
         env:
           cache-name: cache-artifacts
         with:
@@ -39,6 +41,6 @@ jobs:
       - uses: julia-actions/julia-buildpkg@v1
       - uses: julia-actions/julia-runtest@v1
       - uses: julia-actions/julia-processcoverage@v1
-      - uses: codecov/codecov-action@v1
+      - uses: codecov/codecov-action@v5
         with:
           file: lcov.info
2 changes: 2 additions & 0 deletions .gitignore
@@ -1,5 +1,7 @@
.vscode
*.jl.cov
*.jl.*.cov
*.jl.mem
/deps/deps.jl
/docs/build
Manifest.toml
105 changes: 0 additions & 105 deletions Manifest.toml

This file was deleted.

32 changes: 27 additions & 5 deletions Project.toml
@@ -1,16 +1,38 @@
 name = "YAXArrayBase"
 uuid = "90b8fcef-0c2d-428d-9c56-5f86629e9d14"
 authors = ["Fabian Gans <fgans@bgc-jena.mpg.de>"]
-version = "0.6.0"
+version = "0.7.6"
 
 [deps]
 DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
 Dates = "ade2ca70-3891-5945-98fb-dc099432e06a"
 Downloads = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
-Requires = "ae029012-a4dd-5104-9daa-d747884805df"
 
 [compat]
-DataStructures = "0.17, 0.18"
-Requires = "1"
-julia = "1.6"
+DataStructures = "0.17,0.18"
+julia = "1.9"
+ArchGDAL = "0.10"
+AxisArrays = "0.4"
+AxisKeys = "0.2"
+DimensionalData = "0.27, 0.28, 0.29"
+NetCDF = "0.11, 0.12"
+Zarr = "0.8, 0.9"
+
+[extensions]
+ArchGDALExt = "ArchGDAL"
+AxisArraysExt = "AxisArrays"
+AxisKeysExt = "AxisKeys"
+DimensionalDataExt = "DimensionalData"
+NamedDimsExt = "NamedDims"
+NetCDFExt = "NetCDF"
+ZarrExt = "Zarr"
+
+[weakdeps]
+ArchGDAL = "c9ce4bd3-c3d5-55b8-8973-c0e20141b8c3"
+AxisArrays = "39de3d68-74b9-583c-8d2d-e117c070f3a9"
+AxisKeys = "94b1ba4f-4ee9-5380-92f1-94cde586c3c5"
+DimensionalData = "0703355e-b756-11e9-17c0-8b28908087d0"
+NamedDims = "356022a1-0364-5f58-8944-0da4b18d706f"
+NetCDF = "30363a11-5582-574a-97bb-aa9a979735b9"
+Zarr = "0a941bbe-ad1d-11e8-39d9-ab76183a1d99"
12 changes: 5 additions & 7 deletions README.md
@@ -1,12 +1,10 @@
 # YAXArrayBase.jl
 
-![Lifecycle](https://img.shields.io/badge/lifecycle-maturing-blue.svg)<!--
-![Lifecycle](https://img.shields.io/badge/lifecycle-stable-green.svg)
-![Lifecycle](https://img.shields.io/badge/lifecycle-retired-orange.svg)
-![Lifecycle](https://img.shields.io/badge/lifecycle-archived-red.svg)
-![Lifecycle](https://img.shields.io/badge/lifecycle-dormant-blue.svg) -->
-[![Build Status](https://travis-ci.com/JuliaDataCubes/YAXArrayBase.jl.svg?branch=master)](https://travis-ci.com/JuliaDataCubes/YAXArrayBase.jl)
-[![codecov.io](http://codecov.io/github/JuliaDataCubes/YAXArrayBase.jl/coverage.svg?branch=master)](http://codecov.io/github/JuliaDataCubes/YAXArrayBase.jl?branch=master)
+![Lifecycle](https://img.shields.io/badge/lifecycle-maturing-blue.svg)
+[![][codecov-img]][codecov-url]
 
+[codecov-img]: https://codecov.io/gh/JuliaDataCubes/YAXArrayBase.jl/branch/master/graph/badge.svg
+[codecov-url]: https://codecov.io/gh/JuliaDataCubes/YAXArrayBase.jl
 
 # YAXArrayBase

99 changes: 99 additions & 0 deletions ext/ArchGDALExt/ArchGDALExt.jl
@@ -0,0 +1,99 @@
module ArchGDALExt
import ArchGDAL: RasterDataset, AbstractRasterBand, getgeotransform, width, height
import ArchGDAL: getname, getcolorinterp, getband, nraster, getdataset
using ArchGDAL: ArchGDAL as AG

import ArchGDAL.DiskArrays: GridChunks, eachchunk
import ArchGDAL.DiskArrays

import YAXArrayBase: dimname, dimnames, dimvals, iscontdim, getattributes, getdata, yaxcreate
import YAXArrayBase: YAXArrayBase as YAB
using DataStructures: OrderedDict

include("archgdaldataset.jl")

function dimname(a::RasterDataset, i)
if i == 1
return :Y
elseif i == 2
return :X
elseif i == 3
return :Band
else
error("RasterDataset only has 3 dimensions")
end
end
function dimvals(a::RasterDataset, i)
if i == 1
geo=getgeotransform(a)
latr = range(geo[1],length=width(a), step=geo[2])
elseif i == 2
geo=getgeotransform(a)
range(geo[4],length=height(a), step=geo[6])
elseif i == 3
colnames = map(ib -> getname(getcolorinterp(getband(a,ib))),1:nraster(a))
if !allunique(colnames)
colnames = string.("Band_",1:nraster(a))
end
colnames
else
error("RasterDataset only has 3 dimensions")
end
end

iscontdim(a::RasterDataset, i) = i < 3 ? true : nraster(a)<8
function getattributes(a::RasterDataset)
globatts = Dict{String,Any}(
"projection_PROJ4"=>AG.toPROJ4(AG.newspatialref(AG.getproj(a))),
"projection_WKT"=>AG.toWKT(AG.newspatialref(AG.getproj(a))),
)
bands = (getbandattributes(AG.getband(a, i)) for i in 1:size(a, 3))
allbands = mergewith(bands...) do a1,a2
isequal(a1,a2) ? a1 : missing
end
return merge(globatts, allbands)
end


function dimname(::AbstractRasterBand, i)
if i == 1
return :Y
elseif i == 2
return :X
else
error("RasterDataset only has 3 dimensions")
end
end
function dimvals(b::AbstractRasterBand, i)
geo = getgeotransform(getdataset(b))
if i == 1
range(geo[1],length=width(b), step=geo[2])
elseif i == 2
range(geo[4],length=height(b), step=geo[6])
else
error("RasterDataset only has 3 dimensions")
end
end
iscontdim(a::AbstractRasterBand, i) = true
function getattributes(a::AbstractRasterBand)
atts = getattributes(AG.RasterDataset(AG.getdataset(a)))
bandatts = getbandattributes(a)
return merge(atts, bandatts)
end

function insertattifnot!(attrs, val, name, condition)
if !condition(val)
attrs[name] = val
end
end
function getbandattributes(a::AbstractRasterBand)
atts = Dict{String,Any}()
catdict = Dict((i-1)=>v for (i,v) in enumerate(AG.getcategorynames(a)))
insertattifnot!(atts, AG.getnodatavalue(a), "missing_value", isnothing)
insertattifnot!(atts, catdict, "labels", isempty)
insertattifnot!(atts, AG.getunittype(a), "units", isempty)
insertattifnot!(atts, AG.getoffset(a), "add_offset", iszero)
insertattifnot!(atts, AG.getscale(a), "scale_factor", x->isequal(x, one(x)))
return atts
end
end
384 changes: 384 additions & 0 deletions ext/ArchGDALExt/archgdaldataset.jl
@@ -0,0 +1,384 @@
import Base: ==
import DataStructures: OrderedDict
struct GDALBand{T} <: AG.DiskArrays.AbstractDiskArray{T,2}
filename::String
band::Int
size::Tuple{Int,Int}
attrs::Dict{String,Any}
cs::GridChunks{2}
end
function GDALBand(b, filename, i)
s = size(b)
atts = getbandattributes(b)
GDALBand{AG.pixeltype(b)}(filename, i, s, atts, eachchunk(b))
end
Base.size(b::GDALBand) = b.size
DiskArrays.eachchunk(b::GDALBand) = b.cs
DiskArrays.haschunks(::GDALBand) = DiskArrays.Chunked()

function DiskArrays.readblock!(b::GDALBand, aout::Matrix, r::AbstractUnitRange...)
AG.read(b.filename) do ds
AG.getband(ds, b.band) do bh
DiskArrays.readblock!(bh, aout, r...) # ? what to do if size(aout) < r ranges ?, i.e. chunk reads! is a DiskArrays issue!
end
end
end

function DiskArrays.readblock!(b::GDALBand, aout::Matrix, r::Tuple{AbstractUnitRange, AbstractUnitRange})
DiskArrays.readblock!(b, aout, r...)
end

function DiskArrays.writeblock!(b::GDALBand, ain, r::AbstractUnitRange...)
AG.read(b.filename, flags=AG.OF_UPDATE) do ds
AG.getband(ds, b.band) do bh
DiskArrays.writeblock!(bh, ain, r...)
end
end
end
function DiskArrays.readblock!(b::GDALBand, aout, r::AbstractUnitRange...)
aout2 = similar(aout)
DiskArrays.readblock!(b, aout2, r)
aout .= aout2
end


struct GDALDataset
filename::String
bandsize::Tuple{Int,Int}
projection::Union{String, AG.AbstractSpatialRef}
trans::Vector{Float64}
bands::OrderedDict{String}
end

function GDALDataset(filename; mode="r")
nb = AG.read(filename) do r
AG.nraster(r)
end
if nb == 0
return GDALMultiDataset(filename)
else
AG.read(filename) do r
allbands = map(1:nb) do iband
b = AG.getband(r, iband)
gb = GDALBand(b, filename, iband)
name = AG.GDAL.gdalgetdescription(b.ptr)
if isempty(name)
name = AG.getname(AG.getcolorinterp(b))
end
name => gb
end
proj = AG.getproj(r)
trans = AG.getgeotransform(r)
s = AG._common_size(r)
allnames = first.(allbands)
if !allunique(allnames)
allbands = ["Band$i"=>last(v) for (i,v) in enumerate(allbands)]
end
GDALDataset(filename, s[1:end-1], proj, trans, OrderedDict(allbands))
end
end
end
Base.haskey(ds::GDALDataset, k) = in(k, ("X", "Y")) || haskey(ds.bands, k)
#Implement Dataset interface
function YAB.get_var_handle(ds::GDALDataset, name; persist=true)
if name == "X"
range(ds.trans[1], length = ds.bandsize[1], step = ds.trans[2])
elseif name == "Y"
range(ds.trans[4], length = ds.bandsize[2], step = ds.trans[6])
else
ds.bands[name]
end
end


YAB.get_varnames(ds::GDALDataset) = collect(keys(ds.bands))

function YAB.get_var_dims(ds::GDALDataset, d)
if d === "X"
return ("X",)
elseif d==="Y"
return ("Y",)
else
return ("X", "Y")
end
end

YAB.get_global_attrs(ds::GDALDataset) = Dict("projection"=>ds.projection)

function YAB.get_var_attrs(ds::GDALDataset, name)
if name in ("Y", "X")
Dict{String,Any}()
else
merge(ds.bands[name].attrs, YAB.get_global_attrs(ds))
end
end

const colornames = AG.getname.(AG.GDALColorInterp.(0:16))

islat(s) = startswith(uppercase(s), "LAT")
islon(s) = startswith(uppercase(s), "LON")
isx(s) = uppercase(s) == "X"
isy(s) = uppercase(s) == "Y"

function totransform(x, y)
xstep = diff(x)
ystep = diff(y)
if !all(isapprox(first(xstep)), xstep) || !all(isapprox(first(ystep)), ystep)
throw(ArgumentError("Grid must have regular spacing"))
end
Float64[first(x), first(xstep), 0.0, first(y), 0.0, first(ystep)]
end
totransform(x::AbstractRange, y::AbstractRange) =
Float64[first(x), step(x), 0.0, first(y), 0.0, step(y)]

getproj(userproj::String, attrs) = AG.importPROJ4(userproj)
getproj(userproj::AG.AbstractSpatialRef, attrs) = userproj

function getproj(::Nothing, attrs)
if haskey(attrs, "projection")
return AG.importWKT(attrs["projection"])
elseif haskey(attrs, "projection_PROJ4")
return AG.importPROJ4(attrs["projection_PROJ4"])
elseif haskey(attrs, "projection_WKT")
return AG.importWKT(attrs["projection_WKT"])
else
error("Could not determine output projection from attributes, please specify userproj")
end
end

function YAB.create_dataset(
::Type{<:GDALDataset},
outpath,
gatts,
dimnames,
dimvals,
dimattrs,
vartypes,
varnames,
vardims,
varattrs,
varchunks;
userproj = nothing,
kwargs...,
)
# ? flip dimnames and dimvals, this needs a more generic solution!
dimnames = reverse(dimnames)
dimvals = reverse(dimvals)

@assert length(dimnames) == 2
merged_varattrs = merge(varattrs...)

proj, trans = if islon(dimnames[1]) && islat(dimnames[2])
#Lets set the crs to EPSG:4326
proj = AG.importEPSG(4326)
trans = totransform(dimvals[1], dimvals[2])
proj, trans
elseif isx(dimnames[1]) && isy(dimnames[2])
# Try to find out crs
all_attrs = merge(gatts, merged_varattrs)
proj = getproj(userproj, all_attrs)
trans = totransform(dimvals[1], dimvals[2])
proj, trans
else
error("Did not find x, y or lon, lat dimensions in dataset")
end
cs = first(varchunks)
@assert all(isequal(varchunks[1]), varchunks)

# driver = AG.getdriver(AG.extensiondriver(outpath)) # ? it looks like this driver (for .tif) is not working

if !endswith(lowercase(outpath), ".tif") && !endswith(lowercase(outpath), ".tiff")
outpath = outpath * ".tif"
end
# Use this:
driver = AG.getdriver("GTiff")

nbands = length(varnames)
dtype = promote_type(vartypes...)
s = (length.(dimvals)...,)
bands = AG.create(
outpath;
driver = driver,
width = length(dimvals[1]),
height = length(dimvals[2]),
nbands = nbands,
dtype = dtype,
options = [
"BLOCKXSIZE=$(cs[1])",
"BLOCKYSIZE=$(cs[2])",
"TILED=YES",
"COMPRESS=LZW"
]
) do ds
AG.setgeotransform!(ds, trans)
bands = map(1:length(varnames)) do i
b = AG.getband(ds, i)
icol = findfirst(isequal(varnames[i]), colornames)
if isnothing(icol)
AG.setcolorinterp!(b, AG.GDALColorInterp(0))
else
AG.setcolorinterp!(b, AG.GDALColorInterp(icol - 1))
end
AG.GDAL.gdalsetdescription(b.ptr, varnames[i])
atts = varattrs[i]
haskey(atts, "missing_value") && AG.setnodatavalue!(b, atts["missing_value"])
if haskey(atts, "labels")
labeldict = atts["labels"]
maxlabel = maximum(keys(labeldict))
kt = keytype(labeldict)
labelvec = [haskey(labeldict, kt(i)) ? labeldict[kt(i)] : "" for i = 0:maxlabel]
AG.setcategorynames!(b, labelvec)
end
haskey(atts, "units") && AG.setunittype!(b, atts["units"])
haskey(atts, "scale_factor") && AG.setscale!(b, atts["scale_factor"])
haskey(atts, "add_offset") && AG.setoffset!(b, atts["add_offset"])
GDALBand{dtype}(outpath, i, s, atts, AG.DiskArrays.GridChunks(s,cs))
end
end
return GDALDataset(outpath, s, AG.toPROJ4(proj), trans, OrderedDict(vn=>b for (vn,b) in zip(varnames, bands)))
end

allow_parallel_write(::Type{<:GDALDataset}) = false
allow_parallel_write(::GDALDataset) = false

allow_missings(::Type{<:GDALDataset}) = false
allow_missings(::GDALDataset) = false

#MultiDataset implementation
struct GDALBandInfo
bandsize::Tuple{Int,Int}
projection::Union{String, AG.AbstractSpatialRef}
trans::Vector{Float64}
end
==(a::GDALBandInfo,b::GDALBandInfo) = a.bandsize == b.bandsize &&
a.projection == b.projection &&
a.trans == b.trans
struct GDALMultiDataset
bandinfo::Vector{GDALBandInfo}
bands::OrderedDict{String,Tuple{String,GDALBand,Int}}
end
function getbandinfo(gds::GDALMultiDataset, k)
if k in keys(gds.bands)
i = gds.bands[k][3]
bandext = length(gds.bandinfo) == 1 ? "" : "_$i"
gds.bandinfo[i],"X$bandext","Y$bandext"
else
nspl = split(k,'_')
i = if length(nspl)==1
1
else
parse(Int,nspl[2])
end
if startswith(k,"X")
gds.bandinfo[i],k,replace(k,'X'=>'Y')
elseif startswith(k,"Y")
gds.bandinfo[i],replace(k,'Y'=>'X'),k
else
error()
end
end
end

function GDALMultiDataset(filename)
subdatasets = OrderedDict{String, String}()
AG.read(filename) do ds
subds = AG.metadata(ds; domain="SUBDATASETS")
for dsstring in subds
k,v = split(dsstring,'=')
if endswith(k,"_NAME")
k2 = last(split(v,':'))
subdatasets[k2] = v
end
end
end
bandinfos = GDALBandInfo[]
allbands = OrderedDict{String,Tuple{String,GDALBand,Int}}()
for (k,v) in subdatasets
AG.read(v) do ds
s = AG._common_size(ds)
bandinfo = GDALBandInfo((s[1],s[2]), AG.getproj(ds), AG.getgeotransform(ds))
i = findfirst(==(bandinfo),bandinfos)
if isnothing(i)
push!(bandinfos,bandinfo)
i = length(bandinfos)
end
nr = AG.nraster(ds)
for iband in 1:nr
b = AG.getband(ds, iband)
gb = GDALBand(b, v, iband)
if nr == 1
allbands[k] = (v,gb,iband)
else
allbands[(string(k,"_",iband))] = (v,gb,i)
end
end
end
end
GDALMultiDataset(bandinfos, allbands)
end

function xyname(ds::GDALMultiDataset,k)
if k in keys(ds.bands)
i = ds.bands[k][3]
bandext = length(ds.bandinfo) == 1 ? "" : "_$i"
"X$bandext","Y$bandext"
else
if startswith(k,"X")
k,replace(k,'X'=>'Y')
elseif startswith(k,"Y")
replace(k,'Y'=>'X'),k
else
error()
end
end
end
function Base.haskey(ds::GDALMultiDataset, k)
x,y = xyname(ds,k)
in(k, (x,y)) || haskey(ds.bands, k)
end
#Implement Dataset interface
function YAB.get_var_handle(ds::GDALMultiDataset, name; persist=true)
bandinfo,x,y = getbandinfo(ds,name)
if name == x
range(bandinfo.trans[1], length = bandinfo.bandsize[1], step = bandinfo.trans[2])
elseif name == y
range(bandinfo.trans[4], length = bandinfo.bandsize[2], step = bandinfo.trans[6])
else
ds.bands[name][2]
end
end


YAB.get_varnames(ds::GDALMultiDataset) = collect(keys(ds.bands))

function YAB.get_var_dims(ds::GDALMultiDataset, d)
x,y = xyname(ds,d)
if d == x
return (x,)
elseif d == y
return (y,)
else
return (x, y)
end
end

YAB.get_global_attrs(ds::GDALMultiDataset) = Dict()

function YAB.get_var_attrs(ds::GDALMultiDataset, name)
_,x,y = getbandinfo(ds,name)
if name in (x,y)
Dict{String,Any}()
else
ds.bands[name][2].attrs
end
end


function __init__()
@debug "new driver key :gdal, updating backendlist."
YAB.backendlist[:gdal] = GDALDataset
push!(YAB.backendregex,r".tif$"=>GDALDataset)
push!(YAB.backendregex,r".gtif$"=>GDALDataset)
push!(YAB.backendregex,r".tiff$"=>GDALDataset)
push!(YAB.backendregex,r".gtiff$"=>GDALDataset)
end
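For illustration, the regex registry populated in the __init__ function above could be resolved like this. guess_backend is a hypothetical helper, not part of the package:

```julia
using YAXArrayBase

# Walk the regex => backend pairs registered by the extensions' __init__ functions.
function guess_backend(filename)
    for (re, T) in YAXArrayBase.backendregex
        occursin(re, filename) && return T
    end
    error("no backend registered for $filename")
end

# With ArchGDAL loaded, "map.tif" matches r".tif$" and resolves to GDALDataset.
guess_backend("map.tif")
```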
6 changes: 4 additions & 2 deletions src/axisarrays/axisarrays.jl → ext/AxisArraysExt.jl
@@ -1,5 +1,6 @@
-using .AxisArrays: AxisArrays, AxisArray
-
+module AxisArraysExt
+using AxisArrays: AxisArrays, AxisArray
+import YAXArrayBase: dimname, dimnames, dimvals, iscontdim, getattributes, getdata, yaxcreate
 dimname(a::AxisArray, i) = AxisArrays.axisnames(a)[i]
 dimnames(a::AxisArray) = AxisArrays.axisnames(a)
 dimvals(a::AxisArray, i) = AxisArrays.axisvalues(a)[i]
@@ -11,3 +12,4 @@ function yaxcreate(::Type{<:AxisArray}, data, dnames, dvals, atts)
 end
 AxisArray(data; d...)
 end
+end
6 changes: 4 additions & 2 deletions src/axisarrays/axiskeys.jl → ext/AxisKeysExt.jl
@@ -1,5 +1,6 @@
-import .AxisKeys
-
+module AxisKeysExt
+import AxisKeys
+import YAXArrayBase: dimname, dimnames, dimvals, iscontdim, getattributes, getdata, yaxcreate
 dimnames(a::AxisKeys.KeyedArray) = AxisKeys.dimnames(a)
 
 dimvals(a::AxisKeys.KeyedArray,i) = AxisKeys.getproperty(a,AxisKeys.dimnames(a,i))
@@ -8,3 +9,4 @@ getdata(a::AxisKeys.KeyedArray) = parent(parent(a))
 
 yaxcreate(::Type{<:AxisKeys.KeyedArray}, data, dnames, dvals, atts) =
   AxisKeys.KeyedArray(data; map(i->dnames[i]=>dvals[i],1:ndims(data))...)
+end
@@ -1,6 +1,8 @@
-using .DimensionalData: DimArray, DimensionalData, data, Dim, metadata
-
+module DimensionalDataExt
+using DimensionalData: DimArray, DimensionalData, data, Dim, metadata
+import YAXArrayBase: dimname, dimnames, dimvals, iscontdim, getattributes, getdata, yaxcreate
 _dname(::DimensionalData.Dim{N}) where N = N
 _dname(d::DimensionalData.Dimension) = DimensionalData.name(d)
 dimname(x::DimArray, i) = _dname(DimensionalData.dims(x)[i])
 
@@ -16,3 +18,4 @@ function yaxcreate(::Type{<:DimArray},data,dnames,dvals,atts)
 end
 DimArray(data,d,metadata = atts)
 end
+end
5 changes: 4 additions & 1 deletion src/axisarrays/nameddims.jl → ext/NamedDimsExt.jl
@@ -1,8 +1,11 @@
-using .NamedDims: NamedDimsArray
+module NamedDimsExt
+using NamedDims: NamedDimsArray
+import YAXArrayBase: dimname, dimnames, dimvals, iscontdim, getattributes, getdata, yaxcreate
 dimname(a::NamedDimsArray{N},i) where N = N[i]
 dimnames(a::NamedDimsArray{N}) where N = N
 getdata(a::NamedDimsArray) = parent(a)
 function yaxcreate(::Type{<:NamedDimsArray},data, dnames, dvals, atts)
   n = ntuple(i->dnames[i],ndims(data))
   NamedDimsArray(data,n)
 end
+end
115 changes: 115 additions & 0 deletions ext/NetCDFExt.jl
@@ -0,0 +1,115 @@
module NetCDFExt
import YAXArrayBase: YAXArrayBase as YAB
using NetCDF

"""
NetCDFDataset
Dataset backend to read NetCDF files using NetCDF.jl
The following keyword arguments are allowed when using :netcdf
as a data sink:
- `compress = -1` set the compression level for the NetCDF file
"""
struct NetCDFDataset
filename::String
mode::UInt16
handle::Base.RefValue{Union{Nothing, NcFile}}
end
function NetCDFDataset(filename;mode="r")
m = mode == "r" ? NC_NOWRITE : NC_WRITE
NetCDFDataset(filename,m,Ref{Union{Nothing, NcFile}}(nothing))
end
function dsopen(f,ds::NetCDFDataset)
if ds.handle[] === nothing
NetCDF.open(f, ds.filename)
else
f(ds.handle[])
end
end
function YAB.open_dataset_handle(f, ds::NetCDFDataset)
if ds.handle[] === nothing
try
ds.handle[] = NetCDF.open(ds.filename, mode=ds.mode)
f(ds)
finally
ds.handle[]=nothing
end
else
f(ds)
end
end



import .NetCDF: AbstractDiskArray, readblock!, writeblock!, haschunks, eachchunk

struct NetCDFVariable{T,N} <: AbstractDiskArray{T,N}
filename::String
varname::String
size::NTuple{N,Int}
end
#Define method forwarding for DiskArray methods
for m in [:haschunks, :eachchunk]
eval(:(function $(m)(v::NetCDFVariable,args...;kwargs...)
NetCDF.open(a->$(m)(a,args...;kwargs...), v.filename, v.varname)
end
))
end
function check_contig(x)
isa(x,Array) || (isa(x,SubArray) && Base.iscontiguous(x))
end
writeblock!(v::NetCDFVariable, aout, r::AbstractUnitRange...) = NetCDF.open(a->writeblock!(a,aout,r...), v.filename, v.varname, mode=NC_WRITE)
function readblock!(v::NetCDFVariable, aout, r::AbstractUnitRange...)
if check_contig(aout)
NetCDF.open(a->readblock!(a,aout,r...), v.filename, v.varname)
else
aouttemp = Array(aout)
NetCDF.open(a->readblock!(a,aouttemp,r...), v.filename, v.varname)
aout .= aouttemp
end
end
YAB.iscompressed(v::NetCDFVariable) = NetCDF.open(nc->nc.compress > 0, v.filename, v.varname)

Base.size(v::NetCDFVariable) = v.size

YAB.get_var_dims(ds::NetCDFDataset,name) = dsopen(v->map(i->i.name,v[name].dim),ds)
YAB.get_varnames(ds::NetCDFDataset) = dsopen(v->collect(keys(v.vars)),ds)
YAB.get_var_attrs(ds::NetCDFDataset, name) = dsopen(v->v[name].atts,ds)
YAB.get_global_attrs(ds::NetCDFDataset) = dsopen(nc->nc.gatts, ds)
function YAB.get_var_handle(ds::NetCDFDataset, i; persist = true)
if persist || ds.handle[] === nothing
s,et = NetCDF.open(j->(size(j),eltype(j)),ds.filename,i)
NetCDFVariable{et,length(s)}(ds.filename, i, s)
else
ds.handle[][i]
end
end
Base.haskey(ds::NetCDFDataset,k) = dsopen(nc->haskey(nc.vars,k),ds)

function YAB.add_var(p::NetCDFDataset, T::Type, varname, s, dimnames, attr;
chunksize=s, compress = -1)
dimsdescr = Iterators.flatten(zip(dimnames,s))
nccreate(p.filename, varname, dimsdescr..., atts = attr, t=T, chunksize=chunksize, compress=compress)
NetCDFVariable{T,length(s)}(p.filename,varname,(s...,))
end

function YAB.create_empty(::Type{NetCDFDataset}, path, gatts=Dict())
NetCDF.create(_->nothing, path, NcVar[], gatts = gatts)
NetCDFDataset(path)
end

YAB.allow_parallel_write(::Type{<:NetCDFDataset}) = false
YAB.allow_parallel_write(::NetCDFDataset) = false

YAB.allow_missings(::Type{<:NetCDFDataset}) = false
YAB.allow_missings(::NetCDFDataset) = false

function __init__()
@debug "new driver key :netcdf, updating backendlist."
YAB.backendlist[:netcdf] = NetCDFDataset
push!(YAB.backendregex,r"\.nc$"=>NetCDFDataset)
end

end
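The temporary-buffer fallback in `readblock!` above hinges on the `check_contig` test. A minimal standalone sketch of its behavior (the helper is redefined locally here, since the extension does not export it):

```julia
# Mirrors NetCDFExt.check_contig: NetCDF can only read directly into
# memory that is contiguous, i.e. a plain Array or a contiguous view.
check_contig(x) = isa(x, Array) || (isa(x, SubArray) && Base.iscontiguous(x))

a = zeros(4, 4)
check_contig(a)                   # true: plain Array
check_contig(view(a, :, 1:2))     # true: whole leading columns are contiguous
check_contig(view(a, 1:2:3, :))   # false: strided view takes the temp-buffer path
```

On the `false` branch, `readblock!` allocates a contiguous `Array(aout)`, reads into that, and broadcasts the result back with `aout .= aouttemp`.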
64 changes: 64 additions & 0 deletions ext/ZarrExt.jl
@@ -0,0 +1,64 @@
module ZarrExt
using YAXArrayBase
using Zarr: ZArray, ZGroup, zgroup, zcreate, to_zarrtype, zopen, Compressor
import YAXArrayBase: YAXArrayBase as YAB
export ZarrDataset

function __init__()
@debug "new driver key :zarr, updating backendlist."
YAB.backendlist[:zarr] = ZarrDataset
push!(YAB.backendregex, r"(\.zarr$)|(\.zarr/$)"=>ZarrDataset)
end

struct ZarrDataset
g::ZGroup
end
ZarrDataset(g::String;mode="r") = ZarrDataset(zopen(g,mode,fill_as_missing=false))

YAB.get_var_dims(ds::ZarrDataset,name) = reverse(ds[name].attrs["_ARRAY_DIMENSIONS"])
YAB.get_varnames(ds::ZarrDataset) = collect(keys(ds.g.arrays))
function YAB.get_var_attrs(ds::ZarrDataset, name)
#We add the fill value to the attributes to be consistent with NetCDF
a = ds[name]
if a.metadata.fill_value !== nothing
merge(ds[name].attrs,Dict("_FillValue"=>a.metadata.fill_value))
else
ds[name].attrs
end
end
YAB.get_global_attrs(ds::ZarrDataset) = ds.g.attrs
Base.getindex(ds::ZarrDataset, i) = ds.g[i]
Base.haskey(ds::ZarrDataset,k) = haskey(ds.g,k)

# function add_var(p::ZarrDataset, T::Type{>:Missing}, varname, s, dimnames, attr; kwargs...)
# S = Base.nonmissingtype(T)
# add_var(p,S, varname, s, dimnames, attr; fill_value = defaultfillval(S), fill_as_missing=true, kwargs...)
# end

function YAB.add_var(p::ZarrDataset, T::Type, varname, s, dimnames, attr;
chunksize=s, fill_as_missing=false, kwargs...)
attr2 = merge(attr,Dict("_ARRAY_DIMENSIONS"=>reverse(collect(dimnames))))
fv = get(attr,"_FillValue",get(attr,"missing_value",YAB.defaultfillval(T)))
za = zcreate(T, p.g, varname,s...;fill_value = fv,fill_as_missing,attrs=attr2,chunks=chunksize,kwargs...)
za
end

#Special case for init with Arrays
function YAB.add_var(p::ZarrDataset, a::AbstractArray, varname, dimnames, attr;
kwargs...)
T = to_zarrtype(a)
b = add_var(p,T,varname,size(a),dimnames,attr;kwargs...)
b .= a
a
end

YAB.create_empty(::Type{ZarrDataset}, path, gatts=Dict()) = ZarrDataset(zgroup(path, attrs=gatts))



YAB.allow_parallel_write(::ZarrDataset) = true
YAB.allow_missings(::ZarrDataset) = false
YAB.to_dataset(g::ZGroup; kwargs...) = ZarrDataset(g)
YAB.iscompressed(a::ZArray{<:Any,<:Any,<:Compressor}) = true

end
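With both extensions loaded, `backendfrompath` selects a dataset type from the registered regexes. A minimal sketch, assuming Julia ≥ 1.9 (so the extensions load automatically) and hypothetical file paths:

```julia
using YAXArrayBase
using NetCDF, Zarr  # loading these registers the :netcdf and :zarr backends

YAXArrayBase.backendfrompath("data/example.zarr")  # ZarrDataset
YAXArrayBase.backendfrompath("data/example.nc")    # NetCDFDataset
```

The same lookup can be forced explicitly with the `driver` keyword, e.g. `driver = :zarr`, bypassing the regex match.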
28 changes: 2 additions & 26 deletions src/YAXArrayBase.jl
@@ -1,6 +1,6 @@
module YAXArrayBase
using Requires: @require
using DataStructures: OrderedDict

using DataStructures: OrderedDict

include("datasets/datasetinterface.jl")
include("axisarrays/axisinterface.jl")
@@ -22,30 +22,6 @@ function __init__()
)

backendregex = Pair[]

@require NamedDims="356022a1-0364-5f58-8944-0da4b18d706f" include("axisarrays/nameddims.jl")

@require DimensionalData="0703355e-b756-11e9-17c0-8b28908087d0" include("axisarrays/dimensionaldata.jl")

@require AxisArrays="39de3d68-74b9-583c-8d2d-e117c070f3a9" include("axisarrays/axisarrays.jl")

@require AxisIndices="f52c9ee2-1b1c-4fd8-8546-6350938c7f11" include("axisarrays/axisindices.jl")

@require AxisKeys="94b1ba4f-4ee9-5380-92f1-94cde586c3c5" include("axisarrays/axiskeys.jl")

@require ArchGDAL="c9ce4bd3-c3d5-55b8-8973-c0e20141b8c3" begin
include("axisarrays/archgdal.jl")
include("datasets/archgdal.jl")
end


@require Zarr="0a941bbe-ad1d-11e8-39d9-ab76183a1d99" include("datasets/zarr.jl")

@require NetCDF="30363a11-5582-574a-97bb-aa9a979735b9" include("datasets/netcdf.jl")




end


87 changes: 0 additions & 87 deletions src/axisarrays/archgdal.jl

This file was deleted.

8 changes: 0 additions & 8 deletions src/axisarrays/axisindices.jl

This file was deleted.

218 changes: 0 additions & 218 deletions src/datasets/archgdal.jl

This file was deleted.

8 changes: 7 additions & 1 deletion src/datasets/datasetinterface.jl
@@ -1,6 +1,6 @@
#Functions to be implemented for Dataset sources:
"Return a DiskArray handle to a dataset"
get_var_handle(ds, name) = ds[name]
get_var_handle(ds, name; persist=true) = ds[name]

"Return a list of variable names"
function get_varnames end
@@ -18,6 +18,11 @@ function get_global_attrs end
"Initialize and return a handle to a new empty dataset"
function create_empty end

"Apply a function `f` to a dataset `ds`, keeping any open file handles alive for the duration of the operation"
function open_dataset_handle(f, ds)
f(ds)
end

"""
add_var(ds, T, name, s, dimlist, atts)
@@ -78,6 +83,7 @@ backendregex = Pair[]

function backendfrompath(g::String; driver = :all)
if driver == :all
isempty(backendregex) && throw("No backend found. Load a backend by using the corresponding package.")
for p in YAXArrayBase.backendregex
if match(p[1],g) !== nothing
return p[2]
68 changes: 0 additions & 68 deletions src/datasets/netcdf.jl

This file was deleted.

54 changes: 0 additions & 54 deletions src/datasets/zarr.jl

This file was deleted.

2 changes: 2 additions & 0 deletions test/Artifacts.toml
@@ -0,0 +1,2 @@
[ncar]
git-tree-sha1 = "b3ad9125b25731444e8118c0a808aa2d3fc0eb1e"
5 changes: 4 additions & 1 deletion test/Project.toml
@@ -1,11 +1,14 @@
[deps]
ArchGDAL = "c9ce4bd3-c3d5-55b8-8973-c0e20141b8c3"
AxisArrays = "39de3d68-74b9-583c-8d2d-e117c070f3a9"
AxisIndices = "f52c9ee2-1b1c-4fd8-8546-6350938c7f11"
AxisKeys = "94b1ba4f-4ee9-5380-92f1-94cde586c3c5"
DimensionalData = "0703355e-b756-11e9-17c0-8b28908087d0"
Downloads = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
NamedDims = "356022a1-0364-5f58-8944-0da4b18d706f"
NetCDF = "30363a11-5582-574a-97bb-aa9a979735b9"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
TestItemRunner = "f8b46487-2199-4994-9208-9a1283c18c0a"
TestItems = "1c621080-faea-4a02-84b6-bbd5e436b8fe"
YAXArrayBase = "90b8fcef-0c2d-428d-9c56-5f86629e9d14"
Zarr = "0a941bbe-ad1d-11e8-39d9-ab76183a1d99"
37 changes: 13 additions & 24 deletions test/arrays.jl
@@ -1,25 +1,9 @@
using YAXArrayBase, DimensionalData, AxisArrays, AxisIndices, Test
using TestItems

struct M
end
Base.ndims(::M) = 2
YAXArrayBase.getdata(::M) = reshape(1:12,3,4)
YAXArrayBase.dimname(::M,i) = i==1 ? :x : :y
YAXArrayBase.dimvals(::M,i) = i==1 ? (0.5:1.0:2.5) : (1.5:0.5:3.0)
YAXArrayBase.getattributes(::M) = Dict{String,Any}("a1"=>5, "a2"=>"att")

@testset "AxisIndices" begin
using AxisIndices: AxisIndices
d = yaxconvert(AxisIndices.AxisArray,M())
@test d isa AxisIndices.AxisArray
@test getdata(d) == reshape(1:12,3,4)
@test YAXArrayBase.dimnames(d) == (:Dim_1, :Dim_2)
@test dimvals(d,1) == 0.5:1.0:2.5
@test dimvals(d,2) == 1.5:0.5:3.0
end

@testset "AxisKeys" begin
@testitem "AxisKeys" begin
using AxisKeys: KeyedArray
include("mock.jl")
d = yaxconvert(KeyedArray,M())
@test d isa KeyedArray
@test getdata(d) == reshape(1:12,3,4)
@@ -29,8 +13,9 @@ end
end


@testset "AxisArrays" begin
@testitem "AxisArrays" begin
using AxisArrays: AxisArrays
include("mock.jl")
d = yaxconvert(AxisArrays.AxisArray,M())
@test d isa AxisArrays.AxisArray
@test getdata(d) == reshape(1:12,3,4)
@@ -39,7 +24,8 @@ end
@test dimvals(d,2) == 1.5:0.5:3.0
end

@testset "NamedTuples" begin
@testitem "NamedTuples" begin
include("mock.jl")
d = yaxconvert(NamedTuple,M())
@test d isa NamedTuple
@test getdata(d) == reshape(1:12,3,4)
@@ -48,16 +34,18 @@ end
@test dimvals(d,2) == 1.5:0.5:3.0
end

@testset "NamedDims" begin
@testitem "NamedDims" begin
using NamedDims: NamedDimsArray
include("mock.jl")
d = yaxconvert(NamedDimsArray,M())
@test d isa NamedDimsArray
@test getdata(d) == reshape(1:12,3,4)
@test YAXArrayBase.dimnames(d) == (:x, :y)
end

@testset "DimensionalData" begin
@testitem "DimensionalData" begin
using DimensionalData
include("mock.jl")
d = yaxconvert(DimArray,M())
@test d isa DimArray
@test getdata(d) == reshape(1:12,3,4)
@@ -67,8 +55,9 @@ end
@test getattributes(d) == Dict{String,Any}("a1"=>5, "a2"=>"att")
end

@testset "ArchGDAL" begin
@testitem "ArchGDAL" begin
import Downloads
include("mock.jl")
p = Downloads.download("https://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif")
using ArchGDAL
AG=ArchGDAL
97 changes: 76 additions & 21 deletions test/datasets.jl
@@ -1,10 +1,28 @@
using YAXArrayBase, NetCDF, Zarr, Test
using YAXArrayBase, Test
@testset "Empty Backend" begin
@test_throws "No backend found." YAXArrayBase.backendfrompath("test.zarr")
end

@testset "Reading NetCDF" begin
using NetCDF, Zarr

using Pkg.Artifacts
import Downloads
p = Downloads.download("https://www.unidata.ucar.edu/software/netcdf/examples/sresa1b_ncar_ccsm3-example.nc")
# This is the path to the Artifacts.toml we will manipulate
artifact_toml = joinpath(@__DIR__,"Artifacts.toml")
ncar_hash = artifact_hash("ncar", artifact_toml)
if ncar_hash === nothing || !artifact_exists(ncar_hash)
oldhash = ncar_hash
ncar_hash = create_artifact() do artifact_dir
Downloads.download("https://www.unidata.ucar.edu/software/netcdf/examples/sresa1b_ncar_ccsm3-example.nc",joinpath(artifact_dir,"ncar.nc"))
end
if oldhash !== nothing
unbind_artifact!(artifact_toml, "ncar")
end
bind_artifact!(artifact_toml, "ncar", ncar_hash)
end
p2 = joinpath(artifact_path(ncar_hash),"ncar.nc")

p2 = mv(p,string(tempname(),".nc"))
@testset "Reading NetCDF" begin

ds_nc = YAXArrayBase.to_dataset(p2)
vn = get_varnames(ds_nc)
@@ -20,6 +38,29 @@ h = get_var_handle(ds_nc, "tas")
@test all(isapprox.(h[1:2,1:2], [215.893 217.168; 215.805 217.03]))
@test allow_parallel_write(ds_nc) == false
@test allow_missings(ds_nc) == false
#Repeat the same tests with an open dataset handle (open_dataset_handle)
ds_nc2 = YAXArrayBase.to_dataset(p2)
YAXArrayBase.open_dataset_handle(ds_nc2) do ds_nc
@test ds_nc.handle[] !== nothing
vn = get_varnames(ds_nc)
@test sort(vn) == ["area", "lat", "lat_bnds", "lon", "lon_bnds", "msk_rgn",
"plev", "pr", "tas", "time", "time_bnds", "ua"]
@test get_var_dims(ds_nc, "tas") == ["lon", "lat", "time"]
@test get_var_dims(ds_nc, "area") == ["lon", "lat"]
@test get_var_dims(ds_nc, "time") == ["time"]
@test get_var_dims(ds_nc, "time_bnds") == ["bnds", "time"]
@test get_var_attrs(ds_nc,"tas")["long_name"] == "air_temperature"
h1 = get_var_handle(ds_nc, "tas",persist=true)
@test !(h1 isa NetCDF.NcVar)
@test !YAXArrayBase.iscompressed(h1)
@test all(isapprox.(h1[1:2,1:2], [215.893 217.168; 215.805 217.03]))
h2 = get_var_handle(ds_nc, "tas",persist=false)
@test h2 isa NetCDF.NcVar
@test !YAXArrayBase.iscompressed(h2)
@test all(isapprox.(h2[1:2,1:2], [215.893 217.168; 215.805 217.03]))
@test allow_parallel_write(ds_nc) == false
@test allow_missings(ds_nc) == false
end
end

@testset "Reading Zarr" begin
@@ -37,30 +78,44 @@ h = get_var_handle(ds_zarr, "psl")
@test allow_parallel_write(ds_zarr) == true
@test allow_missings(ds_zarr) == false
end

@testset "Reading ArchGDAL" begin
using ArchGDAL
import Downloads
p3 = Downloads.download("https://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif")
ds_tif = YAXArrayBase.to_dataset(p3, driver=:gdal)
vn = get_varnames(ds_tif)
@test sort(vn) == ["Gray"]
@test get_var_dims(ds_tif, "Gray") == ("X", "Y")
@test haskey(get_var_attrs(ds_tif, "Gray"), "projection")
h = get_var_handle(ds_tif, "Gray")
@test !YAXArrayBase.iscompressed(h)
@test all(isapprox.(h[1:2,1:2], [0x00 0x00; 0x00 0x00]))
@test allow_parallel_write(ds_tif) == false
@test allow_missings(ds_tif) == true
end
function test_write(T)
p = tempname()
ds = create_empty(T, p)
add_var(ds, 0.5:1:9.5, "lon", ("lon",), Dict("units"=>"degrees_east"))
add_var(ds, 20:-1.0:1, "lat", ("lat",), Dict("units"=>"degrees_north"))
v = add_var(ds, Float32, "tas", (10,20), ("lon", "lat"), Dict{String,Any}("units"=>"Celsius"))
p = tempname()
ds = create_empty(T, p)
add_var(ds, 0.5:1:9.5, "lon", ("lon",), Dict("units"=>"degrees_east"))
add_var(ds, 20:-1.0:1, "lat", ("lat",), Dict("units"=>"degrees_north"))
v = add_var(ds, Float32, "tas", (10,20), ("lon", "lat"), Dict{String,Any}("units"=>"Celsius"))

v[:,:] = collect(reshape(1:200, 10, 20))
v[:,:] = collect(reshape(1:200, 10, 20))

@test sort(get_varnames(ds)) == ["lat","lon","tas"]
@test get_var_dims(ds, "tas") == ["lon", "lat"]
@test get_var_dims(ds, "lon") == ["lon"]
@test get_var_attrs(ds,"tas")["units"] == "Celsius"
h = get_var_handle(ds, "lon")
@test h[:] == 0.5:1:9.5
v = get_var_handle(ds, "tas")
@test v[1:2,1:2] == [1 11; 2 12]
@test sort(get_varnames(ds)) == ["lat","lon","tas"]
@test get_var_dims(ds, "tas") == ["lon", "lat"]
@test get_var_dims(ds, "lon") == ["lon"]
@test get_var_attrs(ds,"tas")["units"] == "Celsius"
h = get_var_handle(ds, "lon")
@test h[:] == 0.5:1:9.5
v = get_var_handle(ds, "tas")
@test v[1:2,1:2] == [1 11; 2 12]
end

@testset "Writing NetCDF" begin
test_write(YAXArrayBase.NetCDFDataset)
test_write(YAXArrayBase.backendlist[:netcdf])
end

@testset "Writing Zarr" begin
test_write(YAXArrayBase.ZarrDataset)
test_write(YAXArrayBase.backendlist[:zarr])
end
7 changes: 7 additions & 0 deletions test/mock.jl
@@ -0,0 +1,7 @@
struct M
end
Base.ndims(::M) = 2
YAXArrayBase.getdata(::M) = reshape(1:12,3,4)
YAXArrayBase.dimname(::M,i) = i==1 ? :x : :y
YAXArrayBase.dimvals(::M,i) = i==1 ? (0.5:1.0:2.5) : (1.5:0.5:3.0)
YAXArrayBase.getattributes(::M) = Dict{String,Any}("a1"=>5, "a2"=>"att")
14 changes: 9 additions & 5 deletions test/runtests.jl
@@ -1,8 +1,12 @@
using YAXArrayBase
using Test
@testset "Arrays" begin
include("arrays.jl")
end
using Test, TestItemRunner

@testset "Datasets" begin
include("datasets.jl")
include("datasets.jl")
end

@run_package_tests

#@testset "Arrays" begin
# include("arrays.jl")
#end