We use a nice improvement that I'll attribute to Nils Pipenbrinck, whom I haven't found on GitHub but who is on Twitter (torusle). He did it nicely and sold me on the idea, but it relies on reinterpret_cast, and you really need to know the endianness of the target or the results will be all sorts of awful.
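I haven't spelled the trick out here, but the general shape, as I read it, is something like the sketch below: treat the input bytes as whole unsigned words via reinterpret_cast instead of assembling them byte by byte. The function name read_word is mine, not his, and this is only my interpretation; the point is that the word you get back depends on the machine's byte order.

```cpp
#include <cstdint>
#include <cstddef>

// Sketch only: reinterpret a byte buffer as 32-bit words. On a
// little-endian machine the first byte of each group lands in the
// low-order bits; on a big-endian machine it lands in the high-order
// bits, so the same bytes produce different word values. Strict
// aliasing and alignment caveats apply; this assumes the buffer is
// suitably aligned.
inline std::uint32_t read_word(const unsigned char* p, std::size_t i) {
    return reinterpret_cast<const std::uint32_t*>(p)[i];
}
```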
I've also heard it suggested that this check be performed on the data at run-time. I'm not sure what to make of that, but I may add a way to call it with the endianness made explicit and let the make step set the default to whatever it detects as native for unsigned ints.
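For reference, a run-time probe is cheap to do. Below is a minimal sketch, with the Endian enum and native_endian names being mine and nothing assumed about the build system: store a known 32-bit value and look at which byte ends up first in memory.

```cpp
#include <cstdint>
#include <cstring>

enum class Endian { Little, Big };

// Run-time endianness probe: little-endian machines store the
// low-order byte of a multi-byte integer at the lowest address.
inline Endian native_endian() {
    const std::uint32_t probe = 0x01020304u;
    unsigned char first;
    std::memcpy(&first, &probe, 1);  // inspect the lowest-addressed byte
    return first == 0x04 ? Endian::Little : Endian::Big;
}
```

A make-time default could then be passed in as a compile-time define, with something like this probe used only when the caller doesn't say explicitly.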