Replies: 2 comments
-
You'll probably have to build a loop that provides the full set of metadata details for each of the 46 rows, but not all of those details in one go. Have a look at the loops guide in the docs, and use the |
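A rough sketch of that loop idea in Python-style pseudocode (all names here — metadata_rows, run_template, the table and column names — are hypothetical stand-ins; in Hop this would be a parent workflow or pipeline executing the template once per metadata row):

```python
# Sketch: drive the injected template once per metadata row, so each
# injection only ever sees the field names of its own table.
metadata_rows = [
    {"table_name": "orders",    "select_column_names": "SELECT ..."},
    {"table_name": "customers", "select_column_names": "SELECT ..."},
]

def column_names_for(row):
    # In Hop this would run the per-row query from select_column_names;
    # here we fake the result with a fixed mapping for illustration.
    fake_results = {
        "orders": ["orden", "fecha"],
        "customers": ["id", "nombre"],
    }
    return fake_results[row["table_name"]]

def run_template(table_name, field_names):
    # Stand-in for one metadata-injected execution of the template.
    return f"{table_name}: merge on {', '.join(field_names)}"

# One injection per row -- never the cumulative field list of all 46 tables.
results = [run_template(r["table_name"], column_names_for(r)) for r in metadata_rows]
```

The key point is that the column-name query result is scoped to the current row before injection, instead of being appended to one growing stream.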
-
Hi, I have 46 tables to load, and they all share a pipeline template:

As you can see, it compares old and new data and performs updates when there are changed or new input rows.
The 46 tables need to run this pipeline with MDI (metadata injection), so I created a parent pipeline:

The parameters table is metadata config. Its structure is:

How the pipeline should work: for each of the 46 rows, the MDI transform injects the row's parameters to execute the pipeline template. For each injected row it also has to inject the field names (defined as queries in the select_column_names field) to be used in the Merge rows (diff) and Insert/update transforms. So it executes each query to get that table's column names as every row from the metadata config table is injected. metadata config looks like this:

With a limit of 1 row on the metadata config parameters table, the metadata injection works well; even the output pipeline file is generated correctly with its parameters. But with all 46 rows, I get this error:

2025/04/21 15:06:50 - Hop - Pipeline opened.
2025/04/21 15:06:50 - Hop - Launching pipeline [etl_load_tables]...
2025/04/21 15:06:50 - Hop - Started the pipeline execution.
2025/04/21 15:06:50 - etl_load_tables - Executing this pipeline using the Local Pipeline Engine with run configuration 'local'
2025/04/21 15:06:50 - etl_load_tables - Execution started for pipeline [etl_load_tables]
2025/04/21 15:06:50 - metadata config.0 - Finished reading query, closing connection.
2025/04/21 15:06:50 - metadata config.0 - Finished processing (I=3, O=0, R=0, W=6, U=0, E=0)
2025/04/21 15:06:50 - Dynamic SQL row.0 - Finished processing (I=102, O=0, R=3, W=99, U=0, E=0)
2025/04/21 15:06:50 - incremental_load - Executing this pipeline using the Local Pipeline Engine with run configuration 'local'
2025/04/21 15:06:50 - incremental_load - Execution started for pipeline [incremental_load]
2025/04/21 15:06:58 - ETL Table input.0 - Finished reading query, closing connection.
2025/04/21 15:06:58 - ETL Table input.0 - Finished processing (I=309089, O=0, R=0, W=309089, U=0, E=0)
2025/04/21 15:07:01 - ERP Table input.0 - Finished reading query, closing connection.
2025/04/21 15:07:01 - ERP Table input.0 - Finished processing (I=309080, O=0, R=0, W=309080, U=0, E=0)
2025/04/21 15:07:01 - Merge rows (diff).0 - ERROR: Unable to find field [orden] in reference stream.
2025/04/21 15:07:01 - Merge rows (diff).0 - ERROR: Unexpected error
2025/04/21 15:07:01 - Merge rows (diff).0 - ERROR: org.apache.hop.core.exception.HopTransformException:
2025/04/21 15:07:01 - Merge rows (diff).0 - Unable to find field [orden] in reference stream.
2025/04/21 15:07:01 - Merge rows (diff).0 -
2025/04/21 15:07:01 - Merge rows (diff).0 - at org.apache.hop.pipeline.transforms.mergerows.MergeRows.processRow(MergeRows.java:110)
2025/04/21 15:07:01 - Merge rows (diff).0 - at org.apache.hop.pipeline.transform.RunThread.run(RunThread.java:54)
2025/04/21 15:07:01 - Merge rows (diff).0 - at java.base/java.lang.Thread.run(Thread.java:1583)
2025/04/21 15:07:01 - Merge rows (diff).0 - Finished processing (I=0, O=0, R=2, W=0, U=0, E=1)
2025/04/21 15:07:01 - Sort rows.0 - Finished processing (I=0, O=0, R=309080, W=9159, U=0, E=0)
2025/04/21 15:07:01 - incremental_load - Pipeline detected one or more transforms with errors.
2025/04/21 15:07:01 - incremental_load - Pipeline is killing the other transforms!
2025/04/21 15:07:01 - Sort rows 2.0 - Finished processing (I=0, O=0, R=309089, W=10002, U=0, E=0)
2025/04/21 15:07:01 - incremental_load - Pipeline duration : 10.455 seconds [ 10.455" ]
2025/04/21 15:07:01 - incremental_load - Execution finished on a local pipeline engine with run configuration 'local'
2025/04/21 15:07:01 - ETL injection.0 - Finished processing (I=309089, O=0, R=309089, W=309089, U=0, E=1)
2025/04/21 15:07:01 - etl_load_tables - Pipeline duration : 10.922 seconds [ 10.922" ]
2025/04/21 15:07:01 - etl_load_tables - Execution finished on a local pipeline engine with run configuration 'local'
2025/04/21 15:07:01 - etl_load_tables - Pipeline detected one or more transforms with errors.
2025/04/21 15:07:01 - etl_load_tables - Pipeline is killing the other transforms!
I suspect this is because the Dynamic SQL row transform generates cumulative rows for the field names of all the tables, so the pipeline template maps all of those fields instead of only the ones needed. What should I do? I would appreciate any help with this; I'm new to Apache Hop. Thanks in advance.