generated from databricks-industry-solutions/industry-solutions-blueprints
Description
While ingesting a large (455 MB) DICOM file on Serverless compute, we hit a memory limit error.
Environment
AWS
Cluster configuration JSON:
Serverless, environment version 3, 32 GB.
Fully reproducible code snippet
output_path points to a 455 MB 3D (2000×700×700) CT scan compressed with HTJ2K.
from dbx.pixels import Catalog
from dbx.pixels.dicom import DicomMetaExtractor  # the DICOM transformers

catalog = Catalog(spark, table=table, volume=volume)
catalog_df = catalog.catalog(path=output_path)
meta_df = DicomMetaExtractor(catalog, deep=False).transform(catalog_df)
catalog.save(meta_df, mode=write_mode)

Full error message
[UDF_PYSPARK_USER_CODE_ERROR.MEMORY_LIMIT_SERVERLESS] Execution failed. Function exceeded the limit of 1024 megabytes. Please reduce the memory usage of your function. SQLSTATE: 39000
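Not part of the original report, but since the UDF limit is on peak memory per function call, one general mitigation is to read only the header bytes a file-level check needs instead of loading the whole 455 MB object (metadata readers such as pydicom can similarly skip pixel data via stop_before_pixels=True). A minimal stdlib-only sketch, using the DICOM file layout (128-byte preamble followed by the 4-byte "DICM" magic):

```python
def looks_like_dicom(path: str) -> bool:
    """Check the DICOM magic by reading only the first 132 bytes.

    DICOM Part 10 files begin with a 128-byte preamble followed by
    the ASCII marker "DICM"; reading just this prefix keeps peak
    memory tiny regardless of the file's total size.
    """
    with open(path, "rb") as f:
        head = f.read(132)
    return len(head) == 132 and head[128:132] == b"DICM"
```

This only illustrates the bounded-memory pattern; the actual fix inside DicomMetaExtractor (or a larger serverless memory tier) would be up to the maintainers.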