
Copilot AI commented Oct 3, 2025

Problem

The wall-to-wall example notebook was failing with an EinopsError when trying to run the Clay model encoder:

EinopsError: Error while processing repeat-reduction pattern "B D -> B L D".
Input tensor shape: torch.Size([48]). Additional info: {'L': 1024}.
Wrong shape: expected 2 dims. Received 1-dim tensor.

Root Cause

The issue was in how time and location tensors were being constructed for multiple timesteps. The notebook was using np.hstack() to combine lists of tuples, which created 1D tensors instead of the required 2D tensors with proper batch dimensions:

# This created a 1D tensor of shape [48] instead of [12, 4]
time_tensor = torch.tensor(np.hstack((week_norm, hour_norm)))

Where:

  • week_norm and hour_norm were each lists of 12 (sin, cos) tuples (24 values per list, 48 combined)
  • hstack flattened everything into a single 1D array of 48 values
  • The model expected a 2D tensor of shape [B, D], where B=12 (batch size) and D=4 (the four normalized sin/cos values); the sketch below illustrates the shape difference
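
Below is a minimal sketch of that shape difference (illustrative stand-in data, not the notebook's exact code); the B=12 batch size and the 48-value flat vector mirror the numbers in the error message:

import numpy as np
import torch

B = 12  # twelve timesteps, as in the notebook's datacube

# Stand-in 1D arrays of per-sample values (made up for illustration)
week_sin, week_cos = np.random.rand(B), np.random.rand(B)
hour_sin, hour_cos = np.random.rand(B), np.random.rand(B)

# np.hstack concatenates 1D inputs end to end -> one flat vector
flat = np.hstack((week_sin, week_cos, hour_sin, hour_cos))
print(flat.shape)  # (48,) -- the 1-dim tensor that trips the einops check

# np.column_stack places each 1D input as a column -> one row per sample
batched = np.column_stack((week_sin, week_cos, hour_sin, hour_cos))
print(batched.shape)  # (12, 4) -- the [B, D] layout the encoder expects

time_tensor = torch.tensor(batched, dtype=torch.float32)
print(time_tensor.shape)  # torch.Size([12, 4])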

Solution

  1. Fixed tensor creation in the wall-to-wall notebook by replacing np.hstack() with np.column_stack() (a fuller sketch follows this list):

    # Before (broken)
    "time": torch.tensor(np.hstack((week_norm, hour_norm)))
    
    # After (fixed) 
    "time": torch.tensor(np.column_stack((week_norm, hour_norm)))
  2. Corrected inconsistent documentation across the codebase where comments incorrectly stated time/latlon tensors have shape [B, 2] when they actually have shape [B, 4].
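
For context, here is a sketch of how the multi-timestep time and latlon entries could be built after the fix. The normalize_timestamp and normalize_latlon helpers below are simplified stand-ins with made-up dates and coordinates, not the notebook's exact definitions:

import math
from datetime import datetime

import numpy as np
import torch

# Simplified stand-ins for the notebook's normalization helpers;
# each returns two (sin, cos) tuples per sample.
def normalize_timestamp(date):
    week = date.isocalendar().week * 2 * math.pi / 52
    hour = date.hour * 2 * math.pi / 24
    return (math.sin(week), math.cos(week)), (math.sin(hour), math.cos(hour))

def normalize_latlon(lat, lon):
    lat_rad, lon_rad = math.radians(lat), math.radians(lon)
    return (math.sin(lat_rad), math.cos(lat_rad)), (math.sin(lon_rad), math.cos(lon_rad))

datetimes = [datetime(2023, month, 15, 10) for month in range(1, 13)]  # 12 example dates
times = [normalize_timestamp(d) for d in datetimes]
week_norm = [t[0] for t in times]  # 12 x (sin, cos)
hour_norm = [t[1] for t in times]  # 12 x (sin, cos)

latlons = [normalize_latlon(37.30, -8.82)] * len(datetimes)  # one AOI repeated per timestep
lat_norm = [ll[0] for ll in latlons]
lon_norm = [ll[1] for ll in latlons]

# column_stack keeps one row per timestep, giving the [12, 4] shapes described below
time_tensor = torch.tensor(np.column_stack((week_norm, hour_norm)), dtype=torch.float32)
latlon_tensor = torch.tensor(np.column_stack((lat_norm, lon_norm)), dtype=torch.float32)
print(time_tensor.shape, latlon_tensor.shape)  # torch.Size([12, 4]) torch.Size([12, 4])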

Result

The fix ensures proper tensor shapes:

  • Time tensor: [12, 4] containing (week_sin, week_cos, hour_sin, hour_cos) for each sample
  • Latlon tensor: [12, 4] containing (lat_sin, lat_cos, lon_sin, lon_cos) for each sample
  • Combined tensor: [12, 8] which matches the model's expectation for the einops operation

Users can now successfully run the wall-to-wall example without encountering the EinopsError.
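
As a quick sanity check (not the model's internal code; the einops pattern and L=1024 come straight from the error message), the repeat that previously failed now works on correctly shaped inputs:

import torch
from einops import repeat

time = torch.randn(12, 4)    # (week_sin, week_cos, hour_sin, hour_cos) per sample
latlon = torch.randn(12, 4)  # (lat_sin, lat_cos, lon_sin, lon_cos) per sample

combined = torch.cat((time, latlon), dim=-1)  # [12, 8]
expanded = repeat(combined, "B D -> B L D", L=1024)
print(expanded.shape)  # torch.Size([12, 1024, 8])

# A flat 48-value tensor reproduces the original failure:
# repeat(torch.randn(48), "B D -> B L D", L=1024)
# -> EinopsError: Wrong shape: expected 2 dims. Received 1-dim tensor.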

Files Changed

  • docs/tutorials/wall-to-wall.ipynb: Fixed tensor creation
  • claymodel/model.py: Updated tensor shape comments
  • claymodel/finetune/embedder/factory.py: Updated tensor shape comments
  • claymodel/finetune/regression/factory.py: Updated tensor shape comments
  • claymodel/finetune/segment/factory.py: Updated tensor shape comments

Fixes #375

Original prompt

This section details the original issue you should resolve

<issue_title>in wall-to-wall example</issue_title>
<issue_description>Hey,
I'm trying to recreate the wall-to-wall example, but after we create the datacube and try to run it through the encoder, I'm getting an error about wrong shapes:

EinopsError: Error while processing repeat-reduction pattern "B D -> B L D".
Input tensor shape: torch.Size([48]). Additional info: {'L': 1024}.
Wrong shape: expected 2 dims. Received 1-dim tensor.


EinopsError Traceback (most recent call last)
File /opt/anaconda3/envs/claymodel/lib/python3.11/site-packages/einops/einops.py:522, in reduce(tensor, pattern, reduction, **axes_lengths)
521 shape = backend.shape(tensor)
--> 522 recipe = _prepare_transformation_recipe(pattern, reduction, axes_names=tuple(axes_lengths), ndim=len(shape))
523 return _apply_recipe(
524 backend, recipe, cast(Tensor, tensor), reduction_type=reduction, axes_lengths=hashable_axes_lengths
525 )

File /opt/anaconda3/envs/claymodel/lib/python3.11/site-packages/einops/einops.py:365, in _prepare_transformation_recipe(pattern, operation, axes_names, ndim)
364 if ndim != len(left.composition):
--> 365 raise EinopsError(f"Wrong shape: expected {len(left.composition)} dims. Received {ndim}-dim tensor.")
366 left_composition = left.composition

EinopsError: Wrong shape: expected 2 dims. Received 1-dim tensor.

During handling of the above exception, another exception occurred:

EinopsError Traceback (most recent call last)
Cell In[87], line 7
1 #Run the model
2
3 #pass the datacube prepared above to the model to create embeddings.
4 #this will create 1 embedding vector for each image we downloaded
6 with torch.no_grad():
...
--> 533 raise EinopsError(message + "\n {}".format(e))

EinopsError: Error while processing repeat-reduction pattern "B D -> B L D".
Input tensor shape: torch.Size([48]). Additional info: {'L': 1024}.
Wrong shape: expected 2 dims. Received 1-dim tensor.

what exactly is the issue?</issue_description>

Comments on the Issue (you are @copilot in this section)

Fixes #375


Copilot AI changed the title from "[WIP] in wall-to-wall example" to "Fix EinopsError in wall-to-wall example due to incorrect tensor shapes" on Oct 3, 2025
Copilot AI requested a review from brunosan on October 3, 2025 at 23:20