
MultiApp--Moose Test Issue #18080

@AhmedAlmetwally

Bug Description

I am building a MultiApp using MOOSE, and I hit the following failures while running the test suite. Most of them appear to be related to MPI and the libmoose library.

mesh/unique_ids.replicated_mesh .................................................. [min_cpus=2] FAILED (CRASH)
samplers/distribute.scale/execute ............................................................. FAILED (CRASH)
performance.multiprocess/mpi ................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
reporters/base.base .............................................................. [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/parallel_consistency.test ................................... [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/csv_reader.parallel ......................................... [min_cpus=2] FAILED (CRASH)
system_interfaces.mpi .......................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
postprocessors/num_residual_eval.test ............................................ [min_cpus=2] FAILED (CRASH)
outputs/json/distributed.info/default ............................................ [min_cpus=2] FAILED (CRASH)
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject ........ [min_cpus=2] FAILED (CRASH)
system_interfaces.partitioner/parmetis ........................................... [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/work_balance.work_balance/replicated ........................ [min_cpus=2] FAILED (CRASH)
mesh/checkpoint.test_2 ........................................................... [min_cpus=2] FAILED (CRASH)
preconditioners/hmg.hmg ........................................ [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
samplers/base.parallel/mpi ....................................................... [min_cpus=2] FAILED (CRASH)
outputs/xml.parallel/replicated .................................................. [min_cpus=3] FAILED (CRASH)
system_interfaces.partitioner/ptscotch ........................................... [min_cpus=2] FAILED (CRASH)
misc/exception.parallel_exception_residual_transient_non_zero_rank ............... [min_cpus=2] FAILED (CRASH)
meshgenerators/meta_data_store.test_meta_data_with_use_split ..................... [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/parallel_consistency.broadcast .............................. [min_cpus=2] FAILED (CRASH)
mesh/checkpoint.test_2a .......................................................... [min_cpus=2] FAILED (CRASH)
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel ................... [min_cpus=2] FAILED (CRASH)
interfacekernels/2d_interface.parallel_fdp_test .................................. [min_cpus=2] FAILED (CRASH)
system_interfaces.solver/superlu ................................................. [min_cpus=2] FAILED (CRASH)
bcs/periodic.testperiodic_vector ............................................................. FAILED (ERRMSG)
mesh/custom_partitioner.group/custom_linear_partitioner .......................... [min_cpus=2] FAILED (CRASH)
transfers/multiapp_vector_pp_transfer.vector_pp_transfer ......................... [min_cpus=2] FAILED (CRASH)
reporters/mesh_info.info/default ................................................. [min_cpus=2] FAILED (CRASH)
system_interfaces.solver/mumps ................................................... [min_cpus=2] FAILED (CRASH)
auxkernels/vector_postprocessor_visualization.test ............................... [min_cpus=3] FAILED (CRASH)
bcs/dmg_periodic.check_one_step .................................................. [min_cpus=2] FAILED (CRASH)
outputs/vtk.files/parallel ....................................................... [min_cpus=2] FAILED (CRASH)
ics/depend_on_uo.ic_depend_on_uo ................................................. [min_cpus=2] FAILED (CRASH)
bcs/periodic.testperiodic_dp ..................................................... [min_cpus=2] FAILED (CRASH)
fvkernels/mms/non-orthogonal.compact ............................................. [min_cpus=2] FAILED (CRASH)
restart/restartable_types.parallel/first ......................................... [min_cpus=2] FAILED (CRASH)
preconditioners/hmg.hmg_3D ..................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
ics/depend_on_uo.scalar_ic_from_uo ............................................... [min_cpus=2] FAILED (CRASH)
mesh/mesh_only.mesh_only_checkpoint .............................................. [min_cpus=3] FAILED (CRASH)
mesh/custom_partitioner.group/custom_linear_partitioner_displacement ............. [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/csv_reader.tester_fail ..................... [min_cpus=2] FAILED (EXPECTED ERROR MISSING)
transfers/multiapp_nearest_node_transfer.parallel ................................ [min_cpus=2] FAILED (CRASH)
fvkernels/mms/non-orthogonal.extended ............................................ [min_cpus=2] FAILED (CRASH)
mesh/custom_partitioner.group/custom_linear_partitioner_restart .................. [min_cpus=2] FAILED (CRASH)
outputs/vtk.solution/diff_serial_mesh_parallel ................................... [min_cpus=2] FAILED (CRASH)
interfaces/random.parallel_verification .......................................... [min_cpus=2] FAILED (CRASH)
meshgenerators/distributed_rectilinear/partition.2D_3 ............................ [min_cpus=3] FAILED (CRASH)
restart/kernel_restartable.parallel_error/error1 ................................. [min_cpus=2] FAILED (CRASH)
relationship_managers/geometric_neighbors.geometric_edge_neighbor ................ [min_cpus=3] FAILED (CRASH)
outputs/xml.parallel/distributed ................................................. [min_cpus=3] FAILED (CRASH)

Issue #1

mesh/unique_ids.distributed_mesh ................................................... [skipped dependency] SKIP
mesh/unique_ids.replicated_mesh: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/unique_ids
mesh/unique_ids.replicated_mesh: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i unique_ids.i --error --error-unused --error-override --no-gdb-backtrace
mesh/unique_ids.replicated_mesh: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/unique_ids.replicated_mesh: MPIR_Init_thread(586)..............:
mesh/unique_ids.replicated_mesh: MPID_Init(224).....................: channel initialization failed
mesh/unique_ids.replicated_mesh: MPIDI_CH3_Init(105)................:
mesh/unique_ids.replicated_mesh: MPID_nem_init(324).................:
mesh/unique_ids.replicated_mesh: MPID_nem_tcp_init(175).............:
mesh/unique_ids.replicated_mesh: MPID_nem_tcp_get_business_card(401):
mesh/unique_ids.replicated_mesh: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/unique_ids.replicated_mesh: (unknown)(): Invalid group
mesh/unique_ids.replicated_mesh:
mesh/unique_ids.replicated_mesh:
mesh/unique_ids.replicated_mesh: Exit Code: 8
mesh/unique_ids.replicated_mesh: ################################################################################
mesh/unique_ids.replicated_mesh: Tester failed, reason: CRASH
mesh/unique_ids.replicated_mesh:
mesh/unique_ids.replicated_mesh .................................................. [min_cpus=2] FAILED (CRASH)
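The repeated `MPID_nem_tcp_init(373): gethostbyname failed, FN601235` line suggests the machine's hostname (apparently `FN601235`) has no address mapping, so MPICH aborts inside `MPI_Init_thread` before any MOOSE code runs. A minimal diagnostic sketch (my assumption about the cause, not something from the issue itself):

```python
import socket

def hostname_resolves(name: str) -> bool:
    """Return True if `name` resolves to an IP address, as MPICH's
    TCP netmod requires during channel initialization."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    name = socket.gethostname()
    if hostname_resolves(name):
        print(f"{name} resolves; the MPI failure is likely elsewhere")
    else:
        # A common workaround on macOS is mapping the name to loopback
        # in /etc/hosts, e.g.:  127.0.0.1  <hostname>
        print(f"{name} does not resolve; MPICH's gethostbyname will fail")
```

If this reports that the hostname does not resolve, adding a loopback entry for it to `/etc/hosts` typically lets `mpiexec` start, and most of the CRASH failures above should clear in one go.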

Issue #2

variables/fe_hermite_convergence.hermite_convergance/periodic ............................................. OK
samplers/distribute.scale/plot ..................................................... [skipped dependency] SKIP
samplers/distribute.scale/execute: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/samplers/distribute
samplers/distribute.scale/execute: Running command: /Users/almeag-mac/projects1/moose/test/tests/samplers/distribute/execute.py
samplers/distribute.scale/execute: mpiexec -n 1 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i distribute.i Outputs/file_base=distribute_1 Postprocessors/test/test_type=getGlobalSamples Samplers/sampler/num_rows=1
samplers/distribute.scale/execute: Traceback (most recent call last):
samplers/distribute.scale/execute: File "/Users/almeag-mac/projects1/moose/test/tests/samplers/distribute/execute.py", line 47, in <module>
samplers/distribute.scale/execute: execute('distribute.i', 'distribute_none', 1, args.processors, 'getGlobalSamples')
samplers/distribute.scale/execute: File "/Users/almeag-mac/projects1/moose/test/tests/samplers/distribute/execute.py", line 29, in execute
samplers/distribute.scale/execute: local = pandas.read_csv('{}.csv'.format(file_base))
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 605, in read_csv
samplers/distribute.scale/execute: return _read(filepath_or_buffer, kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 457, in _read
samplers/distribute.scale/execute: parser = TextFileReader(filepath_or_buffer, **kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 814, in __init__
samplers/distribute.scale/execute: self._engine = self._make_engine(self.engine)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1045, in _make_engine
samplers/distribute.scale/execute: return mapping[engine](self.f, **self.options) # type: ignore[call-arg]
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1862, in __init__
samplers/distribute.scale/execute: self._open_handles(src, kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1363, in _open_handles
samplers/distribute.scale/execute: storage_options=kwds.get("storage_options", None),
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/common.py", line 647, in get_handle
samplers/distribute.scale/execute: newline="",
samplers/distribute.scale/execute: FileNotFoundError: [Errno 2] No such file or directory: 'distribute_1.csv'
samplers/distribute.scale/execute: mpiexec -n 1 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i distribute.i Outputs/file_base=distribute_1 Postprocessors/test/test_type=getGlobalSamples Samplers/sampler/num_rows=1
samplers/distribute.scale/execute: Traceback (most recent call last):
samplers/distribute.scale/execute: File "/Users/almeag-mac/projects1/moose/test/tests/samplers/distribute/execute.py", line 47, in <module>
samplers/distribute.scale/execute: execute('distribute.i', 'distribute_none', 1, args.processors, 'getGlobalSamples')
samplers/distribute.scale/execute: File "/Users/almeag-mac/projects1/moose/test/tests/samplers/distribute/execute.py", line 29, in execute
samplers/distribute.scale/execute: local = pandas.read_csv('{}.csv'.format(file_base))
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 605, in read_csv
samplers/distribute.scale/execute: return _read(filepath_or_buffer, kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 457, in _read
samplers/distribute.scale/execute: parser = TextFileReader(filepath_or_buffer, **kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 814, in __init__
samplers/distribute.scale/execute: self._engine = self._make_engine(self.engine)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1045, in _make_engine
samplers/distribute.scale/execute: return mapping[engine](self.f, **self.options) # type: ignore[call-arg]
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1862, in __init__
samplers/distribute.scale/execute: self._open_handles(src, kwds)
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/parsers.py", line 1363, in _open_handles
samplers/distribute.scale/execute: storage_options=kwds.get("storage_options", None),
samplers/distribute.scale/execute: File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/site-packages/pandas/io/common.py", line 647, in get_handle
samplers/distribute.scale/execute: newline="",
samplers/distribute.scale/execute: FileNotFoundError: [Errno 2] No such file or directory: 'distribute_1.csv'
samplers/distribute.scale/execute:
samplers/distribute.scale/execute:
samplers/distribute.scale/execute: Exit Code: 1
samplers/distribute.scale/execute: ################################################################################
samplers/distribute.scale/execute: Tester failed, reason: CRASH
samplers/distribute.scale/execute:
samplers/distribute.scale/execute ............................................................. FAILED (CRASH)
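This traceback looks like a downstream symptom of the same MPI startup failure: the `mpiexec` run exits before writing `distribute_1.csv`, so `execute.py`'s `pandas.read_csv` raises `FileNotFoundError`. A sketch of a guard that would surface the root cause (a hypothetical helper, not part of MOOSE's `execute.py`):

```python
from pathlib import Path

def read_results(file_base: str):
    """Hypothetical guard: fail with a pointed message when the solver
    run produced no CSV, instead of a bare pandas FileNotFoundError."""
    csv_path = Path(f"{file_base}.csv")
    if not csv_path.exists():
        raise FileNotFoundError(
            f"{csv_path} is missing; did the mpiexec run crash before "
            f"writing output?")
    import pandas  # imported lazily so the guard runs even without pandas
    return pandas.read_csv(csv_path)
```

With a check like this, the test would still fail, but it would point at the crashed solver run rather than at pandas.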

Issue #3

fvbcs/fv_neumannbc.fvbcs_internal ......................................................................... OK
performance.multiprocess/mpi: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/performance
performance.multiprocess/mpi: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
performance.multiprocess/mpi: Fatal error in MPI_Init_thread: Invalid group, error stack:
performance.multiprocess/mpi: MPIR_Init_thread(586)..............:
performance.multiprocess/mpi: MPID_Init(224).....................: channel initialization failed
performance.multiprocess/mpi: MPIDI_CH3_Init(105)................:
performance.multiprocess/mpi: MPID_nem_init(324).................:
performance.multiprocess/mpi: MPID_nem_tcp_init(175).............:
performance.multiprocess/mpi: MPID_nem_tcp_get_business_card(401):
performance.multiprocess/mpi: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
performance.multiprocess/mpi: (unknown)(): Invalid group
performance.multiprocess/mpi: Fatal error in MPI_Init_thread: Invalid group, error stack:
performance.multiprocess/mpi: MPIR_Init_thread(586)..............:
performance.multiprocess/mpi: MPID_Init(224).....................: channel initialization failed
performance.multiprocess/mpi: MPIDI_CH3_Init(105)................:
performance.multiprocess/mpi: MPID_nem_init(324).................:
performance.multiprocess/mpi: MPID_nem_tcp_init(175).............:
performance.multiprocess/mpi: MPID_nem_tcp_get_business_card(401):
performance.multiprocess/mpi: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
performance.multiprocess/mpi: (unknown)(): Invalid group
performance.multiprocess/mpi: ################################################################################
performance.multiprocess/mpi:
performance.multiprocess/mpi: Unable to match the following pattern against the program's output:
performance.multiprocess/mpi:
performance.multiprocess/mpi: Num Processors:\s+2\s+Num Threads:\s+1
performance.multiprocess/mpi:
performance.multiprocess/mpi: ################################################################################
performance.multiprocess/mpi: Tester failed, reason: EXPECTED OUTPUT MISSING
performance.multiprocess/mpi:
performance.multiprocess/mpi ................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
restrictable/block_api_test.block_undefined_var_block ..................................................... OK
reporters/base.base: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/reporters/base
reporters/base.base: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i base.i --allow-test-objects --error --error-unused --error-override --no-gdb-backtrace
reporters/base.base: Fatal error in MPI_Init_thread: Invalid group, error stack:
reporters/base.base: MPIR_Init_thread(586)..............:
reporters/base.base: MPID_Init(224).....................: channel initialization failed
reporters/base.base: MPIDI_CH3_Init(105)................:
reporters/base.base: MPID_nem_init(324).................:
reporters/base.base: MPID_nem_tcp_init(175).............:
reporters/base.base: MPID_nem_tcp_get_business_card(401):
reporters/base.base: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
reporters/base.base: (unknown)(): Invalid group
reporters/base.base:
reporters/base.base:
reporters/base.base: Exit Code: 8
reporters/base.base: ################################################################################
reporters/base.base: Tester failed, reason: CRASH
reporters/base.base:
reporters/base.base .............................................................. [min_cpus=2] FAILED (CRASH)
reporters/constant_reporter.errors/no_values .............................................................. OK

Issue #4

vectorpostprocessors/element_value_sampler.element_value_sampler/lagrange ................................. OK
vectorpostprocessors/parallel_consistency.test: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/vectorpostprocessors/parallel_consistency
vectorpostprocessors/parallel_consistency.test: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i parallel_consistency.i --error --error-unused --error-override --no-gdb-backtrace
vectorpostprocessors/parallel_consistency.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.test: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.test: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.test: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.test: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.test: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.test: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.test: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.test: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.test: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.test: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.test: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.test: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.test: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.test: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.test: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.test: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.test:
vectorpostprocessors/parallel_consistency.test:
vectorpostprocessors/parallel_consistency.test: Exit Code: 8
vectorpostprocessors/parallel_consistency.test: ################################################################################
vectorpostprocessors/parallel_consistency.test: Tester failed, reason: CRASH
vectorpostprocessors/parallel_consistency.test:
vectorpostprocessors/parallel_consistency.test ................................... [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/intersection_points_along_line.intersecting_elems/3d ................................. OK
vectorpostprocessors/csv_reader.parallel: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/vectorpostprocessors/csv_reader
vectorpostprocessors/csv_reader.parallel: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i read.i UserObjects/tester/rank=1 Outputs/csv=false --error --error-unused --error-override --no-gdb-backtrace
vectorpostprocessors/csv_reader.parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/csv_reader.parallel: MPIR_Init_thread(586)..............:
vectorpostprocessors/csv_reader.parallel: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/csv_reader.parallel: MPIDI_CH3_Init(105)................:
vectorpostprocessors/csv_reader.parallel: MPID_nem_init(324).................:
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_init(175).............:
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/csv_reader.parallel: (unknown)(): Invalid group
vectorpostprocessors/csv_reader.parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/csv_reader.parallel: MPIR_Init_thread(586)..............:
vectorpostprocessors/csv_reader.parallel: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/csv_reader.parallel: MPIDI_CH3_Init(105)................:
vectorpostprocessors/csv_reader.parallel: MPID_nem_init(324).................:
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_init(175).............:
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/csv_reader.parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/csv_reader.parallel: (unknown)(): Invalid group
vectorpostprocessors/csv_reader.parallel:
vectorpostprocessors/csv_reader.parallel:
vectorpostprocessors/csv_reader.parallel: Exit Code: 8
vectorpostprocessors/csv_reader.parallel: ################################################################################
vectorpostprocessors/csv_reader.parallel: Tester failed, reason: CRASH
vectorpostprocessors/csv_reader.parallel:
vectorpostprocessors/csv_reader.parallel ......................................... [min_cpus=2] FAILED (CRASH)
time_integrators/implicit-euler.monomials ................................................................. OK


Issue #5

restrictable/block_api_test.ids/blocks .................................................................... OK
system_interfaces.mpi: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/system_interfaces
system_interfaces.mpi: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
system_interfaces.mpi: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.mpi: MPIR_Init_thread(586)..............:
system_interfaces.mpi: MPID_Init(224).....................: channel initialization failed
system_interfaces.mpi: MPIDI_CH3_Init(105)................:
system_interfaces.mpi: MPID_nem_init(324).................:
system_interfaces.mpi: MPID_nem_tcp_init(175).............:
system_interfaces.mpi: MPID_nem_tcp_get_business_card(401):
system_interfaces.mpi: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.mpi: (unknown)(): Invalid group
system_interfaces.mpi: ################################################################################
system_interfaces.mpi:
system_interfaces.mpi: Unable to match the following pattern against the program's output:
system_interfaces.mpi:
system_interfaces.mpi: Num Processors:\s+2\s+Num Threads:\s+1
system_interfaces.mpi:
system_interfaces.mpi: ################################################################################
system_interfaces.mpi: Tester failed, reason: EXPECTED OUTPUT MISSING
system_interfaces.mpi:
system_interfaces.mpi .......................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
preconditioners/fdp.jacobian_fdp_coloring_diagonal_test_fail .............................................. OK

Issue #6

time_integrators/rk-2.jacobian/2d-quadratic_num-of-jacobian-calls ......................................... OK
postprocessors/num_residual_eval.test: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/postprocessors/num_residual_eval
postprocessors/num_residual_eval.test: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i num_residual_eval.i --error --error-unused --error-override --no-gdb-backtrace
postprocessors/num_residual_eval.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
postprocessors/num_residual_eval.test: MPIR_Init_thread(586)..............:
postprocessors/num_residual_eval.test: MPID_Init(224).....................: channel initialization failed
postprocessors/num_residual_eval.test: MPIDI_CH3_Init(105)................:
postprocessors/num_residual_eval.test: MPID_nem_init(324).................:
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(175).............:
postprocessors/num_residual_eval.test: MPID_nem_tcp_get_business_card(401):
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
postprocessors/num_residual_eval.test: (unknown)(): Invalid group
postprocessors/num_residual_eval.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
postprocessors/num_residual_eval.test: MPIR_Init_thread(586)..............:
postprocessors/num_residual_eval.test: MPID_Init(224).....................: channel initialization failed
postprocessors/num_residual_eval.test: MPIDI_CH3_Init(105)................:
postprocessors/num_residual_eval.test: MPID_nem_init(324).................:
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(175).............:
postprocessors/num_residual_eval.test: MPID_nem_tcp_get_business_card(401):
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
postprocessors/num_residual_eval.test: (unknown)(): Invalid group
postprocessors/num_residual_eval.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
postprocessors/num_residual_eval.test: MPIR_Init_thread(586)..............:
postprocessors/num_residual_eval.test: MPID_Init(224).....................: channel initialization failed
postprocessors/num_residual_eval.test: MPIDI_CH3_Init(105)................:
postprocessors/num_residual_eval.test: MPID_nem_init(324).................:
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(175).............:
postprocessors/num_residual_eval.test: MPID_nem_tcp_get_business_card(401):
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
postprocessors/num_residual_eval.test: (unknown)(): Invalid group
postprocessors/num_residual_eval.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
postprocessors/num_residual_eval.test: MPIR_Init_thread(586)..............:
postprocessors/num_residual_eval.test: MPID_Init(224).....................: channel initialization failed
postprocessors/num_residual_eval.test: MPIDI_CH3_Init(105)................:
postprocessors/num_residual_eval.test: MPID_nem_init(324).................:
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(175).............:
postprocessors/num_residual_eval.test: MPID_nem_tcp_get_business_card(401):
postprocessors/num_residual_eval.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
postprocessors/num_residual_eval.test: (unknown)(): Invalid group
postprocessors/num_residual_eval.test:
postprocessors/num_residual_eval.test:
postprocessors/num_residual_eval.test: Exit Code: 8
postprocessors/num_residual_eval.test: ################################################################################
postprocessors/num_residual_eval.test: Tester failed, reason: CRASH
postprocessors/num_residual_eval.test:
postprocessors/num_residual_eval.test ............................................ [min_cpus=2] FAILED (CRASH)
postprocessors/change_over_fixed_point.change_over_fixed_point_error/change_with_respect_to_initial_error_this OK

Issue #7

outputs/postprocessor.show_hide ........................................................................... OK
outputs/json/distributed.info/default: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/outputs/json/distributed
outputs/json/distributed.info/default: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i distributed.i --error --error-unused --error-override --no-gdb-backtrace
outputs/json/distributed.info/default: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/json/distributed.info/default: MPIR_Init_thread(586)..............:
outputs/json/distributed.info/default: MPID_Init(224).....................: channel initialization failed
outputs/json/distributed.info/default: MPIDI_CH3_Init(105)................:
outputs/json/distributed.info/default: MPID_nem_init(324).................:
outputs/json/distributed.info/default: MPID_nem_tcp_init(175).............:
outputs/json/distributed.info/default: MPID_nem_tcp_get_business_card(401):
outputs/json/distributed.info/default: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/json/distributed.info/default: (unknown)(): Invalid group
outputs/json/distributed.info/default: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/json/distributed.info/default: MPIR_Init_thread(586)..............:
outputs/json/distributed.info/default: MPID_Init(224).....................: channel initialization failed
outputs/json/distributed.info/default: MPIDI_CH3_Init(105)................:
outputs/json/distributed.info/default: MPID_nem_init(324).................:
outputs/json/distributed.info/default: MPID_nem_tcp_init(175).............:
outputs/json/distributed.info/default: MPID_nem_tcp_get_business_card(401):
outputs/json/distributed.info/default: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/json/distributed.info/default: (unknown)(): Invalid group
outputs/json/distributed.info/default:
outputs/json/distributed.info/default:
outputs/json/distributed.info/default: Exit Code: 8
outputs/json/distributed.info/default: ################################################################################
outputs/json/distributed.info/default: Tester failed, reason: CRASH
outputs/json/distributed.info/default:
outputs/json/distributed.info/default ............................................ [min_cpus=2] FAILED (CRASH)
outputs/intervals.sync_times .............................................................................. OK
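Every crash in these logs shows the same root failure: MPICH's `MPID_nem_tcp_init` cannot resolve the machine's hostname (`gethostbyname failed, FN601235`), so `MPI_Init_thread` aborts with "Invalid group" before the test itself ever runs. As a minimal sketch (the helper name is mine, not from MOOSE or MPICH), the same lookup can be reproduced outside MPI to confirm whether the hostname resolves at all:

```python
# Reproduce the lookup that MPICH's nemesis TCP channel performs at startup:
# gethostbyname() on the machine's hostname. If this fails here, it will
# also fail inside MPI_Init_thread, producing the "Invalid group" crash above.
import socket

def hostname_resolves(name: str) -> bool:
    """Return True if `name` resolves to an address (mirrors gethostbyname)."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    host = socket.gethostname()  # e.g. FN601235 in the logs above
    print(f"{host}: resolves={hostname_resolves(host)}")
```

If the lookup fails for the machine's hostname, a common workaround for MPICH's TCP channel is mapping that hostname to the loopback address in `/etc/hosts` (e.g. a line `127.0.0.1  <hostname>`). This is an assumption based on the error text, not something the logs themselves confirm.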

Issue #8

bcs/periodic.orthogonal_pbc_on_square_test ................................................................ OK
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/userobjects/setup_interface_count
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i general.i --error --error-unused --error-override --no-gdb-backtrace
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Fatal error in MPI_Init_thread: Invalid group, error stack:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPIR_Init_thread(586)..............:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_Init(224).....................: channel initialization failed
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPIDI_CH3_Init(105)................:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_init(324).................:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_init(175).............:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_get_business_card(401):
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: (unknown)(): Invalid group
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Fatal error in MPI_Init_thread: Invalid group, error stack:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPIR_Init_thread(586)..............:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_Init(224).....................: channel initialization failed
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPIDI_CH3_Init(105)................:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_init(324).................:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_init(175).............:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_get_business_card(401):
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: (unknown)(): Invalid group
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Exit Code: 8
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: ################################################################################
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject: Tester failed, reason: CRASH
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject:
userobjects/setup_interface_count.setup_interface_count/GeneralUserObject ........ [min_cpus=2] FAILED (CRASH)
bcs/nodal_normals.small_sqaure ............................................................................ OK

Issue #9

outputs/misc.default_names ................................................................................ OK
system_interfaces.partitioner/parmetis: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/system_interfaces
system_interfaces.partitioner/parmetis: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
system_interfaces.partitioner/parmetis: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.partitioner/parmetis: MPIR_Init_thread(586)..............:
system_interfaces.partitioner/parmetis: MPID_Init(224).....................: channel initialization failed
system_interfaces.partitioner/parmetis: MPIDI_CH3_Init(105)................:
system_interfaces.partitioner/parmetis: MPID_nem_init(324).................:
system_interfaces.partitioner/parmetis: MPID_nem_tcp_init(175).............:
system_interfaces.partitioner/parmetis: MPID_nem_tcp_get_business_card(401):
system_interfaces.partitioner/parmetis: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.partitioner/parmetis: (unknown)(): Invalid group
system_interfaces.partitioner/parmetis: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.partitioner/parmetis: MPIR_Init_thread(586)..............:
system_interfaces.partitioner/parmetis: MPID_Init(224).....................: channel initialization failed
system_interfaces.partitioner/parmetis: MPIDI_CH3_Init(105)................:
system_interfaces.partitioner/parmetis: MPID_nem_init(324).................:
system_interfaces.partitioner/parmetis: MPID_nem_tcp_init(175).............:
system_interfaces.partitioner/parmetis: MPID_nem_tcp_get_business_card(401):
system_interfaces.partitioner/parmetis: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.partitioner/parmetis: (unknown)(): Invalid group
system_interfaces.partitioner/parmetis:
system_interfaces.partitioner/parmetis:
system_interfaces.partitioner/parmetis: Exit Code: 8
system_interfaces.partitioner/parmetis: ################################################################################
system_interfaces.partitioner/parmetis: Tester failed, reason: CRASH
system_interfaces.partitioner/parmetis:
system_interfaces.partitioner/parmetis ........................................... [min_cpus=2] FAILED (CRASH)
outputs/exodus.hide_output ................................................................................ OK

Issue #10

constraints/equal_value_embedded_constraint.penalty/1D_3D ................................................. OK
vectorpostprocessors/work_balance.work_balance/replicated: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/vectorpostprocessors/work_balance
vectorpostprocessors/work_balance.work_balance/replicated: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i work_balance.i --error --error-unused --error-override --no-gdb-backtrace
vectorpostprocessors/work_balance.work_balance/replicated: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/work_balance.work_balance/replicated: MPIR_Init_thread(586)..............:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/work_balance.work_balance/replicated: MPIDI_CH3_Init(105)................:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_init(324).................:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_init(175).............:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/work_balance.work_balance/replicated: (unknown)(): Invalid group
vectorpostprocessors/work_balance.work_balance/replicated: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/work_balance.work_balance/replicated: MPIR_Init_thread(586)..............:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/work_balance.work_balance/replicated: MPIDI_CH3_Init(105)................:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_init(324).................:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_init(175).............:
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/work_balance.work_balance/replicated: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/work_balance.work_balance/replicated: (unknown)(): Invalid group
vectorpostprocessors/work_balance.work_balance/replicated:
vectorpostprocessors/work_balance.work_balance/replicated:
vectorpostprocessors/work_balance.work_balance/replicated: Exit Code: 8
vectorpostprocessors/work_balance.work_balance/replicated: ################################################################################
vectorpostprocessors/work_balance.work_balance/replicated: Tester failed, reason: CRASH
vectorpostprocessors/work_balance.work_balance/replicated:
vectorpostprocessors/work_balance.work_balance/replicated ........................ [min_cpus=2] FAILED (CRASH)
mesh/checkpoint.test_2: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/checkpoint
mesh/checkpoint.test_2: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i checkpoint_split.i Outputs/file_base=test_2 --use-split --split-file checkpoint_split_in --error --error-unused --error-override --no-gdb-backtrace
mesh/checkpoint.test_2: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/checkpoint.test_2: MPIR_Init_thread(586)..............:
mesh/checkpoint.test_2: MPID_Init(224).....................: channel initialization failed
mesh/checkpoint.test_2: MPIDI_CH3_Init(105)................:
mesh/checkpoint.test_2: MPID_nem_init(324).................:
mesh/checkpoint.test_2: MPID_nem_tcp_init(175).............:
mesh/checkpoint.test_2: MPID_nem_tcp_get_business_card(401):
mesh/checkpoint.test_2: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/checkpoint.test_2: (unknown)(): Invalid group
mesh/checkpoint.test_2: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/checkpoint.test_2: MPIR_Init_thread(586)..............:
mesh/checkpoint.test_2: MPID_Init(224).....................: channel initialization failed
mesh/checkpoint.test_2: MPIDI_CH3_Init(105)................:
mesh/checkpoint.test_2: MPID_nem_init(324).................:
mesh/checkpoint.test_2: MPID_nem_tcp_init(175).............:
mesh/checkpoint.test_2: MPID_nem_tcp_get_business_card(401):
mesh/checkpoint.test_2: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/checkpoint.test_2: (unknown)(): Invalid group
mesh/checkpoint.test_2:
mesh/checkpoint.test_2:
mesh/checkpoint.test_2: Exit Code: 8
mesh/checkpoint.test_2: ################################################################################
mesh/checkpoint.test_2: Tester failed, reason: CRASH
mesh/checkpoint.test_2:
mesh/checkpoint.test_2 ........................................................... [min_cpus=2] FAILED (CRASH)
postprocessors/num_iterations.methods/explicit_euler ...................................................... OK

Issue #11

transfers/multiapp_mesh_function_transfer.errors/mismatch_exec_on ......................................... OK
preconditioners/hmg.hmg: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/preconditioners/hmg
preconditioners/hmg.hmg: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i diffusion_hmg.i --error --error-unused --error-override --no-gdb-backtrace
preconditioners/hmg.hmg: Fatal error in MPI_Init_thread: Invalid group, error stack:
preconditioners/hmg.hmg: MPIR_Init_thread(586)..............:
preconditioners/hmg.hmg: MPID_Init(224).....................: channel initialization failed
preconditioners/hmg.hmg: MPIDI_CH3_Init(105)................:
preconditioners/hmg.hmg: MPID_nem_init(324).................:
preconditioners/hmg.hmg: MPID_nem_tcp_init(175).............:
preconditioners/hmg.hmg: MPID_nem_tcp_get_business_card(401):
preconditioners/hmg.hmg: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
preconditioners/hmg.hmg: (unknown)(): Invalid group
preconditioners/hmg.hmg: Fatal error in MPI_Init_thread: Invalid group, error stack:
preconditioners/hmg.hmg: MPIR_Init_thread(586)..............:
preconditioners/hmg.hmg: MPID_Init(224).....................: channel initialization failed
preconditioners/hmg.hmg: MPIDI_CH3_Init(105)................:
preconditioners/hmg.hmg: MPID_nem_init(324).................:
preconditioners/hmg.hmg: MPID_nem_tcp_init(175).............:
preconditioners/hmg.hmg: MPID_nem_tcp_get_business_card(401):
preconditioners/hmg.hmg: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
preconditioners/hmg.hmg: (unknown)(): Invalid group
preconditioners/hmg.hmg: ################################################################################
preconditioners/hmg.hmg:
preconditioners/hmg.hmg: Unable to match the following pattern against the program's output:
preconditioners/hmg.hmg:
preconditioners/hmg.hmg: using\s+allatonce\s+MatPtAP()\s+implementation
preconditioners/hmg.hmg:
preconditioners/hmg.hmg: ################################################################################
preconditioners/hmg.hmg: Tester failed, reason: EXPECTED OUTPUT MISSING
preconditioners/hmg.hmg:
preconditioners/hmg.hmg ........................................ [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
samplers/base.parallel/mpi: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/samplers/base
samplers/base.parallel/mpi: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i mpi.i --allow-test-objects --error --error-unused --error-override --no-gdb-backtrace
samplers/base.parallel/mpi: Fatal error in MPI_Init_thread: Invalid group, error stack:
samplers/base.parallel/mpi: MPIR_Init_thread(586)..............:
samplers/base.parallel/mpi: MPID_Init(224).....................: channel initialization failed
samplers/base.parallel/mpi: MPIDI_CH3_Init(105)................:
samplers/base.parallel/mpi: MPID_nem_init(324).................:
samplers/base.parallel/mpi: MPID_nem_tcp_init(175).............:
samplers/base.parallel/mpi: MPID_nem_tcp_get_business_card(401):
samplers/base.parallel/mpi: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
samplers/base.parallel/mpi: (unknown)(): Invalid group
samplers/base.parallel/mpi:
samplers/base.parallel/mpi:
samplers/base.parallel/mpi: Exit Code: 8
samplers/base.parallel/mpi: ################################################################################
samplers/base.parallel/mpi: Tester failed, reason: CRASH
samplers/base.parallel/mpi:
samplers/base.parallel/mpi ....................................................... [min_cpus=2] FAILED (CRASH)
transfers/multiapp_userobject_transfer.two_pipes .......................................................... OK

Issue #12

restrictable/block_api_test.has/isBlockSubset ............................................................. OK
outputs/xml.parallel/replicated: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/outputs/xml
outputs/xml.parallel/replicated: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i xml.i --error --error-unused --error-override --no-gdb-backtrace
outputs/xml.parallel/replicated: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/xml.parallel/replicated: MPIR_Init_thread(586)..............:
outputs/xml.parallel/replicated: MPID_Init(224).....................: channel initialization failed
outputs/xml.parallel/replicated: MPIDI_CH3_Init(105)................:
outputs/xml.parallel/replicated: MPID_nem_init(324).................:
outputs/xml.parallel/replicated: MPID_nem_tcp_init(175).............:
outputs/xml.parallel/replicated: MPID_nem_tcp_get_business_card(401):
outputs/xml.parallel/replicated: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/xml.parallel/replicated: (unknown)(): Invalid group
outputs/xml.parallel/replicated:
outputs/xml.parallel/replicated: ################################################################################
outputs/xml.parallel/replicated: Tester failed, reason: CRASH
outputs/xml.parallel/replicated:
outputs/xml.parallel/replicated .................................................. [min_cpus=3] FAILED (CRASH)
interfacekernels/1d_interface.ik_save_in .................................................................. OK

Issue #13

reporters/base.errors/requested_different_type ............................................................ OK
system_interfaces.partitioner/ptscotch: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/system_interfaces
system_interfaces.partitioner/ptscotch: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
system_interfaces.partitioner/ptscotch: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.partitioner/ptscotch: MPIR_Init_thread(586)..............:
system_interfaces.partitioner/ptscotch: MPID_Init(224).....................: channel initialization failed
system_interfaces.partitioner/ptscotch: MPIDI_CH3_Init(105)................:
system_interfaces.partitioner/ptscotch: MPID_nem_init(324).................:
system_interfaces.partitioner/ptscotch: MPID_nem_tcp_init(175).............:
system_interfaces.partitioner/ptscotch: MPID_nem_tcp_get_business_card(401):
system_interfaces.partitioner/ptscotch: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.partitioner/ptscotch: (unknown)(): Invalid group
system_interfaces.partitioner/ptscotch:
system_interfaces.partitioner/ptscotch:
system_interfaces.partitioner/ptscotch: Exit Code: 8
system_interfaces.partitioner/ptscotch: ################################################################################
system_interfaces.partitioner/ptscotch: Tester failed, reason: CRASH
system_interfaces.partitioner/ptscotch:
system_interfaces.partitioner/ptscotch ........................................... [min_cpus=2] FAILED (CRASH)
interfacekernels/1d_interface.ik_save_in_other_side ....................................................... OK
bcs/periodic.testperiodic ................................................................................. OK
outputs/debug.show_material_properties_consumed ........................................................... OK
ics/random_ic_test.test_threaded .......................................................... [min_threads=2] OK
misc/exception.parallel_exception_residual_transient_non_zero_rank: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/misc/exception
misc/exception.parallel_exception_residual_transient_non_zero_rank: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i parallel_exception_residual_transient.i Kernels/exception/rank=1 --error --error-unused --error-override --no-gdb-backtrace
misc/exception.parallel_exception_residual_transient_non_zero_rank: Fatal error in MPI_Init_thread: Invalid group, error stack:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPIR_Init_thread(586)..............:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_Init(224).....................: channel initialization failed
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPIDI_CH3_Init(105)................:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_init(324).................:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_init(175).............:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_get_business_card(401):
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
misc/exception.parallel_exception_residual_transient_non_zero_rank: (unknown)(): Invalid group
misc/exception.parallel_exception_residual_transient_non_zero_rank: Fatal error in MPI_Init_thread: Invalid group, error stack:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPIR_Init_thread(586)..............:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_Init(224).....................: channel initialization failed
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPIDI_CH3_Init(105)................:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_init(324).................:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_init(175).............:
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_get_business_card(401):
misc/exception.parallel_exception_residual_transient_non_zero_rank: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
misc/exception.parallel_exception_residual_transient_non_zero_rank: (unknown)(): Invalid group
misc/exception.parallel_exception_residual_transient_non_zero_rank:
misc/exception.parallel_exception_residual_transient_non_zero_rank:
misc/exception.parallel_exception_residual_transient_non_zero_rank: Exit Code: 8
misc/exception.parallel_exception_residual_transient_non_zero_rank: ################################################################################
misc/exception.parallel_exception_residual_transient_non_zero_rank: Tester failed, reason: CRASH
misc/exception.parallel_exception_residual_transient_non_zero_rank:
misc/exception.parallel_exception_residual_transient_non_zero_rank ............... [min_cpus=2] FAILED (CRASH)
nodalkernels/constraint_enforcement.vi/rsls ............................................................... OK

Issue #14

meshgenerators/meta_data_store.test_meta_data_with_use_split: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/meshgenerators/meta_data_store
meshgenerators/meta_data_store.test_meta_data_with_use_split: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i mesh_meta_data_store.i --use-split --split-file split2 --error --error-unused --error-override --no-gdb-backtrace
meshgenerators/meta_data_store.test_meta_data_with_use_split: Fatal error in MPI_Init_thread: Invalid group, error stack:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPIR_Init_thread(586)..............:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_Init(224).....................: channel initialization failed
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPIDI_CH3_Init(105)................:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_init(324).................:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_init(175).............:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_get_business_card(401):
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
meshgenerators/meta_data_store.test_meta_data_with_use_split: (unknown)(): Invalid group
meshgenerators/meta_data_store.test_meta_data_with_use_split: Fatal error in MPI_Init_thread: Invalid group, error stack:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPIR_Init_thread(586)..............:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_Init(224).....................: channel initialization failed
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPIDI_CH3_Init(105)................:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_init(324).................:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_init(175).............:
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_get_business_card(401):
meshgenerators/meta_data_store.test_meta_data_with_use_split: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
meshgenerators/meta_data_store.test_meta_data_with_use_split: (unknown)(): Invalid group
meshgenerators/meta_data_store.test_meta_data_with_use_split:
meshgenerators/meta_data_store.test_meta_data_with_use_split:
meshgenerators/meta_data_store.test_meta_data_with_use_split: Exit Code: 8
meshgenerators/meta_data_store.test_meta_data_with_use_split: ################################################################################
meshgenerators/meta_data_store.test_meta_data_with_use_split: Tester failed, reason: CRASH
meshgenerators/meta_data_store.test_meta_data_with_use_split:
meshgenerators/meta_data_store.test_meta_data_with_use_split ..................... [min_cpus=2] FAILED (CRASH)
time_steppers/timesequence_stepper.restart_failure/timesequence_restart_failure_1 ......................... OK
multiapps/steffensen.variables_transient/app_end_transfers_end ............................................ OK
materials/output.invalid_outputs .......................................................................... OK
userobjects/threaded_general_user_object.thread_copies_guo/th2 ............................ [min_threads=2] OK
mesh/high_order_elems.test_prism6_refine .................................................................. OK
vectorpostprocessors/parallel_consistency.broadcast: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/vectorpostprocessors/parallel_consistency
vectorpostprocessors/parallel_consistency.broadcast: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i parallel_consistency.i AuxKernels/viewit/use_broadcast=true Outputs/file_base=broadcast_out --error --error-unused --error-override --no-gdb-backtrace
vectorpostprocessors/parallel_consistency.broadcast: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.broadcast: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.broadcast: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.broadcast: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.broadcast: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.broadcast: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.broadcast: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.broadcast: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.broadcast: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.broadcast: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.broadcast: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.broadcast: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.broadcast: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/parallel_consistency.broadcast: MPIR_Init_thread(586)..............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/parallel_consistency.broadcast: MPIDI_CH3_Init(105)................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_init(324).................:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(175).............:
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/parallel_consistency.broadcast: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/parallel_consistency.broadcast: (unknown)(): Invalid group
vectorpostprocessors/parallel_consistency.broadcast:
vectorpostprocessors/parallel_consistency.broadcast:
vectorpostprocessors/parallel_consistency.broadcast: Exit Code: 8
vectorpostprocessors/parallel_consistency.broadcast: ################################################################################
vectorpostprocessors/parallel_consistency.broadcast: Tester failed, reason: CRASH
vectorpostprocessors/parallel_consistency.broadcast:
vectorpostprocessors/parallel_consistency.broadcast .............................. [min_cpus=2] FAILED (CRASH)
fvkernels/fv_simple_diffusion.unstructured-rz ............................................................. OK
mesh/checkpoint.test_2a: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/checkpoint
mesh/checkpoint.test_2a: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i checkpoint_split.i Outputs/file_base=test_2a --use-split --split-file checkpoint_split_in.cpa --error --error-unused --error-override --no-gdb-backtrace
mesh/checkpoint.test_2a: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/checkpoint.test_2a: MPIR_Init_thread(586)..............:
mesh/checkpoint.test_2a: MPID_Init(224).....................: channel initialization failed
mesh/checkpoint.test_2a: MPIDI_CH3_Init(105)................:
mesh/checkpoint.test_2a: MPID_nem_init(324).................:
mesh/checkpoint.test_2a: MPID_nem_tcp_init(175).............:
mesh/checkpoint.test_2a: MPID_nem_tcp_get_business_card(401):
mesh/checkpoint.test_2a: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/checkpoint.test_2a: (unknown)(): Invalid group
mesh/checkpoint.test_2a: [identical "Invalid group" error stack repeated by the remaining ranks]
mesh/checkpoint.test_2a:
mesh/checkpoint.test_2a:
mesh/checkpoint.test_2a: Exit Code: 8
mesh/checkpoint.test_2a: ################################################################################
mesh/checkpoint.test_2a: Tester failed, reason: CRASH
mesh/checkpoint.test_2a:
mesh/checkpoint.test_2a .......................................................... [min_cpus=2] FAILED (CRASH)
userobjects/layered_average.layered_average/block_restricted .............................................. OK
postprocessors/find_value_on_line.depth_exceeded .......................................................... OK
constraints/equal_value_embedded_constraint.penalty/3D_3D ................................................. OK
postprocessors/num_iterations.methods/heun ................................................................ OK
transfers/reporter_transfer.clone_type/type_specified ................... [insufficient slots,min_cpus=6] SKIP
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/dgkernels/2d_diffusion_dg
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i no_mallocs_with_action.i Outputs/file_base=no_mallocs_with_action_parallel --error --error-unused --error-override --no-gdb-backtrace
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPIR_Init_thread(586)..............:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPID_Init(224).....................: channel initialization failed
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPIDI_CH3_Init(105)................:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPID_nem_init(324).................:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPID_nem_tcp_init(175).............:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPID_nem_tcp_get_business_card(401):
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: (unknown)(): Invalid group
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: [identical "Invalid group" error stack repeated by the second rank]
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: Exit Code: 8
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: ################################################################################
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel: Tester failed, reason: CRASH
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel:
dgkernels/2d_diffusion_dg.proper_ghosting_with_action_parallel ................... [min_cpus=2] FAILED (CRASH)
time_integrators/convergence.explicit_midpoint/level2 ..................................................... OK
problems/eigen_problem/eigensolvers.coupled_system ........................................................ OK
interfacekernels/2d_interface.parallel_fdp_test: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/interfacekernels/2d_interface
interfacekernels/2d_interface.parallel_fdp_test: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i coupled_value_coupled_flux.i Preconditioning/smp/type=FDP --error-unused --error-override --no-gdb-backtrace
interfacekernels/2d_interface.parallel_fdp_test: Fatal error in MPI_Init_thread: Invalid group, error stack:
interfacekernels/2d_interface.parallel_fdp_test: MPIR_Init_thread(586)..............:
interfacekernels/2d_interface.parallel_fdp_test: MPID_Init(224).....................: channel initialization failed
interfacekernels/2d_interface.parallel_fdp_test: MPIDI_CH3_Init(105)................:
interfacekernels/2d_interface.parallel_fdp_test: MPID_nem_init(324).................:
interfacekernels/2d_interface.parallel_fdp_test: MPID_nem_tcp_init(175).............:
interfacekernels/2d_interface.parallel_fdp_test: MPID_nem_tcp_get_business_card(401):
interfacekernels/2d_interface.parallel_fdp_test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
interfacekernels/2d_interface.parallel_fdp_test: (unknown)(): Invalid group
interfacekernels/2d_interface.parallel_fdp_test: [identical "Invalid group" error stack repeated by the second rank]
interfacekernels/2d_interface.parallel_fdp_test:
interfacekernels/2d_interface.parallel_fdp_test:
interfacekernels/2d_interface.parallel_fdp_test: Exit Code: 8
interfacekernels/2d_interface.parallel_fdp_test: ################################################################################
interfacekernels/2d_interface.parallel_fdp_test: Tester failed, reason: CRASH
interfacekernels/2d_interface.parallel_fdp_test:
interfacekernels/2d_interface.parallel_fdp_test .................................. [min_cpus=2] FAILED (CRASH)
materials/derivative_material_interface.warn .............................................................. OK
variables/optionally_coupled.catch_out_of_bound_default_access/coupled .................................... OK
samplers/base.global_vs_local/base_1rank .................................................................. OK
functions/solution_function.nonexistent_var_err ........................................................... OK
functions/piecewise_multilinear.twoDa ..................................................................... OK
restrictable/block_api_test.mat/hasBlockMaterialProperty_false ............................................ OK
outputs/checkpoint.block/recover_with_checkpoint_block .................................................... OK
reporters/base.special_types .............................................................................. OK
outputs/exodus.nodal_output ............................................................................... OK
system_interfaces.solver/superlu: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/system_interfaces
system_interfaces.solver/superlu: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
system_interfaces.solver/superlu: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.solver/superlu: MPIR_Init_thread(586)..............:
system_interfaces.solver/superlu: MPID_Init(224).....................: channel initialization failed
system_interfaces.solver/superlu: MPIDI_CH3_Init(105)................:
system_interfaces.solver/superlu: MPID_nem_init(324).................:
system_interfaces.solver/superlu: MPID_nem_tcp_init(175).............:
system_interfaces.solver/superlu: MPID_nem_tcp_get_business_card(401):
system_interfaces.solver/superlu: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.solver/superlu: (unknown)(): Invalid group
system_interfaces.solver/superlu: [identical "Invalid group" error stack repeated by the second rank]
system_interfaces.solver/superlu:
system_interfaces.solver/superlu:
system_interfaces.solver/superlu: Exit Code: 8
system_interfaces.solver/superlu: ################################################################################
system_interfaces.solver/superlu: Tester failed, reason: CRASH
system_interfaces.solver/superlu:
system_interfaces.solver/superlu ................................................. [min_cpus=2] FAILED (CRASH)
multiapps/picard.steady_with_custom_convergence_check ..................................................... OK
outputs/console.norms ..................................................................................... OK
interfacekernels/1d_interface.reaction_1D_steady_CSVDiff .................................................. OK
bcs/periodic.testperiodic_vector: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/bcs/periodic
bcs/periodic.testperiodic_vector: Running command: /Users/almeag-mac/projects1/moose/test/moose_test-opt -i periodic_vector_bc_test.i --error --error-unused --error-override --no-gdb-backtrace
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector: ERROR: Bad FEType.family == LAGRANGE_VEC
bcs/periodic.testperiodic_vector: Stack frames: 17
bcs/periodic.testperiodic_vector: 0: 0 libmesh_opt.0.dylib 0x000000010ffd53bb libMesh::print_trace(std::__1::basic_ostream<char, std::__1::char_traits<char>>&) + 1067
bcs/periodic.testperiodic_vector: 1: 1 libmesh_opt.0.dylib 0x000000010ffd1e2d libMesh::MacroFunctions::report_error(char const*, int, char const*, char const*) + 269
bcs/periodic.testperiodic_vector: 2: 2 libmesh_opt.0.dylib 0x00000001100a6147 libMesh::FEGenericBase::build(unsigned int, libMesh::FEType const&) + 4471
bcs/periodic.testperiodic_vector: 3: 3 libmesh_opt.0.dylib 0x00000001100ae548 libMesh::FEGenericBase::compute_periodic_constraints(libMesh::DofConstraints&, libMesh::DofMap&, libMesh::PeriodicBoundaries const&, libMesh::MeshBase const&, libMesh::PointLocatorBase const*, unsigned int, libMesh::Elem const*) + 168
bcs/periodic.testperiodic_vector: 4: 4 libmesh_opt.0.dylib 0x000000010ff9d218 (anonymous namespace)::ComputeConstraints::operator()(libMesh::StoredRange<libMesh::MeshBase::const_element_iterator, libMesh::Elem const*> const&) const + 280
bcs/periodic.testperiodic_vector: 5: 5 libmesh_opt.0.dylib 0x000000010ff9c68c libMesh::DofMap::create_dof_constraints(libMesh::MeshBase const&, double) + 844
bcs/periodic.testperiodic_vector: 6: 6 libmesh_opt.0.dylib 0x0000000110686721 libMesh::System::reinit_constraints() + 33
bcs/periodic.testperiodic_vector: 7: 7 libmesh_opt.0.dylib 0x0000000110685c6a libMesh::System::init_data() + 202
bcs/periodic.testperiodic_vector: 8: 8 libmesh_opt.0.dylib 0x000000011068e7f8 libMesh::System::init() + 40
bcs/periodic.testperiodic_vector: 9: 9 libmesh_opt.0.dylib 0x0000000110657312 libMesh::EquationSystems::init() + 1234
bcs/periodic.testperiodic_vector: 10: 10 libmoose-opt.0.dylib 0x000000010eeff433 FEProblemBase::init() + 1043
bcs/periodic.testperiodic_vector: 11: 11 libmoose-opt.0.dylib 0x000000010f1f183c ActionWarehouse::executeActionsWithAction(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) + 940
bcs/periodic.testperiodic_vector: 12: 12 libmoose-opt.0.dylib 0x000000010f228cf8 ActionWarehouse::executeAllActions() + 232
bcs/periodic.testperiodic_vector: 13: 13 libmoose-opt.0.dylib 0x000000010f69c2e0 MooseApp::runInputFile() + 80
bcs/periodic.testperiodic_vector: 14: 14 libmoose-opt.0.dylib 0x000000010f697a4c MooseApp::run() + 2684
bcs/periodic.testperiodic_vector: 15: 15 moose_test-opt 0x000000010e52ec34 main + 132
bcs/periodic.testperiodic_vector: 16: 16 libdyld.dylib 0x00007fff67d0bcc9 start + 1
bcs/periodic.testperiodic_vector: [0] /Users/almeag-mac/projects/raccoon/moose/scripts/../libmesh/src/fe/fe_base.C, line 335, compiled Jun 13 2021 at 10:45:15
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector: *** ERROR ***
bcs/periodic.testperiodic_vector: ERROR: Bad FEType.family == LAGRANGE_VEC
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector: Stack frames: 6
bcs/periodic.testperiodic_vector: 0: 0 libmesh_opt.0.dylib 0x000000010ffd53bb libMesh::print_trace(std::__1::basic_ostream<char, std::__1::char_traits<char>>&) + 1067
bcs/periodic.testperiodic_vector: 1: 1 libmoose-opt.0.dylib 0x000000010f6c9854 moose::internal::mooseErrorRaw(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>) + 852
bcs/periodic.testperiodic_vector: 2: 2 libmoose-opt.0.dylib 0x000000010f044a6e void mooseError<char const*>(char const*&&) + 270
bcs/periodic.testperiodic_vector: 3: 3 libmoose-opt.0.dylib 0x000000010f698404 MooseApp::run() + 5172
bcs/periodic.testperiodic_vector: 4: 4 moose_test-opt 0x000000010e52ec34 main + 132
bcs/periodic.testperiodic_vector: 5: 5 libdyld.dylib 0x00007fff67d0bcc9 start + 1
bcs/periodic.testperiodic_vector: application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
bcs/periodic.testperiodic_vector: [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=1
bcs/periodic.testperiodic_vector: :
bcs/periodic.testperiodic_vector: system msg for write_line failure : Bad file descriptor
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector: [identical "Bad FEType.family == LAGRANGE_VEC" error and stack trace repeated verbatim]
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector: Exit Code: 1
bcs/periodic.testperiodic_vector: ################################################################################
bcs/periodic.testperiodic_vector: Tester failed, reason: ERRMSG
bcs/periodic.testperiodic_vector:
bcs/periodic.testperiodic_vector ............................................................. FAILED (ERRMSG)
outputs/csv.sort .......................................................................................... OK


interfaces/postprocessorinterface.missing_errors/by_name .................................................. OK
interfaces/vectorpostprocessorinterface.missing_errors/by_name ............................................ OK
interfaces/userobjectinterface.has_uo/name_T .............................................................. OK
misc/exception.parallel_exception_jacobian_transient ...................................................... OK
nodalkernels/constraint_enforcement.vi/rsls_amg ........................................................... OK
auxkernels/solution_aux.exodus_direct ..................................................................... OK
tag.systems/test_tag_nodal_kernels ........................................................................ OK
geomsearch/2d_moving_penetration.pl_test3q ................................................................ OK
mesh_modifiers/block_deleter.delete/BlockDeleterTest12 .................................................... OK
materials/material.exception/serial ....................................................................... OK
kernels/hfem.variable_dirichlet ........................................................................... OK
mesh/mesh_generation.annular/disc ......................................................................... OK
geomsearch/3d_moving_penetration.pl_test3q ................................................................ OK
multiapps/steffensen_postprocessor.pp_transient/app_begin_transfers_begin_steffensen_sub .................. OK
kernels/vector_fe.jacobian ................................................................................ OK
multiapps/secant_postprocessor.pp_transient/app_begin_transfers_begin_secant_sub .......................... OK
transfers/multiapp_conservative_transfer.userobject_transfer_csv .......................................... OK
mesh/custom_partitioner.group/custom_linear_partitioner: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/custom_partitioner
mesh/custom_partitioner.group/custom_linear_partitioner: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i custom_linear_partitioner_test.i --error --error-unused --error-override --no-gdb-backtrace
mesh/custom_partitioner.group/custom_linear_partitioner: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/custom_partitioner.group/custom_linear_partitioner: MPIR_Init_thread(586)..............:
mesh/custom_partitioner.group/custom_linear_partitioner: MPID_Init(224).....................: channel initialization failed
mesh/custom_partitioner.group/custom_linear_partitioner: MPIDI_CH3_Init(105)................:
mesh/custom_partitioner.group/custom_linear_partitioner: MPID_nem_init(324).................:
mesh/custom_partitioner.group/custom_linear_partitioner: MPID_nem_tcp_init(175).............:
mesh/custom_partitioner.group/custom_linear_partitioner: MPID_nem_tcp_get_business_card(401):
mesh/custom_partitioner.group/custom_linear_partitioner: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/custom_partitioner.group/custom_linear_partitioner: (unknown)(): Invalid group
mesh/custom_partitioner.group/custom_linear_partitioner: [identical "Invalid group" error stack repeated by the remaining ranks]
mesh/custom_partitioner.group/custom_linear_partitioner:
mesh/custom_partitioner.group/custom_linear_partitioner:
mesh/custom_partitioner.group/custom_linear_partitioner: Exit Code: 8
mesh/custom_partitioner.group/custom_linear_partitioner: ################################################################################
mesh/custom_partitioner.group/custom_linear_partitioner: Tester failed, reason: CRASH
mesh/custom_partitioner.group/custom_linear_partitioner:
mesh/custom_partitioner.group/custom_linear_partitioner .......................... [min_cpus=2] FAILED (CRASH)
meshgenerators/break_mesh_by_block_generator.surrounding_block_restricted/split_all ....................... OK
time_steppers/iteration_adaptive.pps_lim .................................................................. OK
userobjects/setup_interface_count.setup_interface_count/NodalSideUserObject ............................... OK
mesh/high_order_elems.test_prism15_refine ................................................................. OK
multiapps/secant.variables_transient/app_end_transfers_begin .............................................. OK
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/transfers/multiapp_vector_pp_transfer
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i master.i --error --error-unused --error-override --no-gdb-backtrace
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: Fatal error in MPI_Init_thread: Invalid group, error stack:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPIR_Init_thread(586)..............:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPID_Init(224).....................: channel initialization failed
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPIDI_CH3_Init(105)................:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPID_nem_init(324).................:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPID_nem_tcp_init(175).............:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPID_nem_tcp_get_business_card(401):
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: (unknown)(): Invalid group
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: [identical "Invalid group" error stack repeated by the remaining ranks]
transfers/multiapp_vector_pp_transfer.vector_pp_transfer:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: Exit Code: 8
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: ################################################################################
transfers/multiapp_vector_pp_transfer.vector_pp_transfer: Tester failed, reason: CRASH
transfers/multiapp_vector_pp_transfer.vector_pp_transfer:
transfers/multiapp_vector_pp_transfer.vector_pp_transfer ......................... [min_cpus=2] FAILED (CRASH)
postprocessors/num_iterations.methods/implicit_euler ...................................................... OK
interfacekernels/2d_interface.vector_2d ................................................................... OK
problems/eigen_problem/eigensolvers.eigen_scalar_kernel ................................................... OK
materials/derivative_material_interface.bad_evaluation/nan ................................................ OK
geomsearch/3d_moving_penetration_smoothing.overlapping/pl_test4ns ......................................... OK
reporters/mesh_info.info/default: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/reporters/mesh_info
reporters/mesh_info.info/default: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i mesh_info.i --error --error-unused --error-override --no-gdb-backtrace
reporters/mesh_info.info/default: Fatal error in MPI_Init_thread: Invalid group, error stack:
reporters/mesh_info.info/default: MPIR_Init_thread(586)..............:
reporters/mesh_info.info/default: MPID_Init(224).....................: channel initialization failed
reporters/mesh_info.info/default: MPIDI_CH3_Init(105)................:
reporters/mesh_info.info/default: MPID_nem_init(324).................:
reporters/mesh_info.info/default: MPID_nem_tcp_init(175).............:
reporters/mesh_info.info/default: MPID_nem_tcp_get_business_card(401):
reporters/mesh_info.info/default: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
reporters/mesh_info.info/default: (unknown)(): Invalid group
reporters/mesh_info.info/default: Fatal error in MPI_Init_thread: Invalid group, error stack:
reporters/mesh_info.info/default: MPIR_Init_thread(586)..............:
reporters/mesh_info.info/default: MPID_Init(224).....................: channel initialization failed
reporters/mesh_info.info/default: MPIDI_CH3_Init(105)................:
reporters/mesh_info.info/default: MPID_nem_init(324).................:
reporters/mesh_info.info/default: MPID_nem_tcp_init(175).............:
reporters/mesh_info.info/default: MPID_nem_tcp_get_business_card(401):
reporters/mesh_info.info/default: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
reporters/mesh_info.info/default: (unknown)(): Invalid group
reporters/mesh_info.info/default:
reporters/mesh_info.info/default: ################################################################################
reporters/mesh_info.info/default: Tester failed, reason: CRASH
reporters/mesh_info.info/default:
reporters/mesh_info.info/default ................................................. [min_cpus=2] FAILED (CRASH)
functions/solution_function.solution_function/grad_p1 ..................................................... OK
problems/eigen_problem/eigensolvers.dg_krylovschur ........................................................ OK
reporters/mesh_info.info/limit ............................................................................ OK
multiapps/secant.variables_transient/app_end_transfers_end ................................................ OK
system_interfaces.solver/mumps: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/system_interfaces
system_interfaces.solver/mumps: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i input.i --error --error-unused --error-override --no-gdb-backtrace
system_interfaces.solver/mumps: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.solver/mumps: MPIR_Init_thread(586)..............:
system_interfaces.solver/mumps: MPID_Init(224).....................: channel initialization failed
system_interfaces.solver/mumps: MPIDI_CH3_Init(105)................:
system_interfaces.solver/mumps: MPID_nem_init(324).................:
system_interfaces.solver/mumps: MPID_nem_tcp_init(175).............:
system_interfaces.solver/mumps: MPID_nem_tcp_get_business_card(401):
system_interfaces.solver/mumps: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.solver/mumps: (unknown)(): Invalid group
system_interfaces.solver/mumps: Fatal error in MPI_Init_thread: Invalid group, error stack:
system_interfaces.solver/mumps: MPIR_Init_thread(586)..............:
system_interfaces.solver/mumps: MPID_Init(224).....................: channel initialization failed
system_interfaces.solver/mumps: MPIDI_CH3_Init(105)................:
system_interfaces.solver/mumps: MPID_nem_init(324).................:
system_interfaces.solver/mumps: MPID_nem_tcp_init(175).............:
system_interfaces.solver/mumps: MPID_nem_tcp_get_business_card(401):
system_interfaces.solver/mumps: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
system_interfaces.solver/mumps: (unknown)(): Invalid group
system_interfaces.solver/mumps:
system_interfaces.solver/mumps:
system_interfaces.solver/mumps: Exit Code: 8
system_interfaces.solver/mumps: ################################################################################
system_interfaces.solver/mumps: Tester failed, reason: CRASH
system_interfaces.solver/mumps:
system_interfaces.solver/mumps ................................................... [min_cpus=2] FAILED (CRASH)
geomsearch/3d_moving_penetration_smoothing.overlapping/pl_test4nstt ....................................... OK
functions/solution_function.solution_function/grad_p2 ..................................................... OK
outputs/console.transient ................................................................................. OK
interfacekernels/1d_interface.reaction_1D_transient ....................................................... OK
auxkernels/vector_postprocessor_visualization.test: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/auxkernels/vector_postprocessor_visualization
auxkernels/vector_postprocessor_visualization.test: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i vector_postprocessor_visualization.i --error --error-unused --error-override --no-gdb-backtrace
auxkernels/vector_postprocessor_visualization.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
auxkernels/vector_postprocessor_visualization.test: MPIR_Init_thread(586)..............:
auxkernels/vector_postprocessor_visualization.test: MPID_Init(224).....................: channel initialization failed
auxkernels/vector_postprocessor_visualization.test: MPIDI_CH3_Init(105)................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_init(324).................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(175).............:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_get_business_card(401):
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
auxkernels/vector_postprocessor_visualization.test: (unknown)(): Invalid group
auxkernels/vector_postprocessor_visualization.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
auxkernels/vector_postprocessor_visualization.test: MPIR_Init_thread(586)..............:
auxkernels/vector_postprocessor_visualization.test: MPID_Init(224).....................: channel initialization failed
auxkernels/vector_postprocessor_visualization.test: MPIDI_CH3_Init(105)................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_init(324).................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(175).............:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_get_business_card(401):
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
auxkernels/vector_postprocessor_visualization.test: (unknown)(): Invalid group
auxkernels/vector_postprocessor_visualization.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
auxkernels/vector_postprocessor_visualization.test: MPIR_Init_thread(586)..............:
auxkernels/vector_postprocessor_visualization.test: MPID_Init(224).....................: channel initialization failed
auxkernels/vector_postprocessor_visualization.test: MPIDI_CH3_Init(105)................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_init(324).................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(175).............:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_get_business_card(401):
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
auxkernels/vector_postprocessor_visualization.test: (unknown)(): Invalid group
auxkernels/vector_postprocessor_visualization.test: Fatal error in MPI_Init_thread: Invalid group, error stack:
auxkernels/vector_postprocessor_visualization.test: MPIR_Init_thread(586)..............:
auxkernels/vector_postprocessor_visualization.test: MPID_Init(224).....................: channel initialization failed
auxkernels/vector_postprocessor_visualization.test: MPIDI_CH3_Init(105)................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_init(324).................:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(175).............:
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_get_business_card(401):
auxkernels/vector_postprocessor_visualization.test: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
auxkernels/vector_postprocessor_visualization.test: (unknown)(): Invalid group
auxkernels/vector_postprocessor_visualization.test:
auxkernels/vector_postprocessor_visualization.test:
auxkernels/vector_postprocessor_visualization.test: Exit Code: 8
auxkernels/vector_postprocessor_visualization.test: ################################################################################
auxkernels/vector_postprocessor_visualization.test: Tester failed, reason: CRASH
auxkernels/vector_postprocessor_visualization.test:
auxkernels/vector_postprocessor_visualization.test ............................... [min_cpus=3] FAILED (CRASH)
bcs/dmg_periodic.check_one_step: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/bcs/dmg_periodic
bcs/dmg_periodic.check_one_step: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i dmg_periodic_bc.i /UserObjects/uo/type=CheckGhostedBoundaries /UserObjects/uo/total_num_bdry_sides=160 Outputs/hide="pid" Outputs/exodus=false Executioner/num_steps=1 --error --error-unused --error-override --no-gdb-backtrace
bcs/dmg_periodic.check_one_step: Fatal error in MPI_Init_thread: Invalid group, error stack:
bcs/dmg_periodic.check_one_step: MPIR_Init_thread(586)..............:
bcs/dmg_periodic.check_one_step: MPID_Init(224).....................: channel initialization failed
bcs/dmg_periodic.check_one_step: MPIDI_CH3_Init(105)................:
bcs/dmg_periodic.check_one_step: MPID_nem_init(324).................:
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_init(175).............:
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_get_business_card(401):
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
bcs/dmg_periodic.check_one_step: (unknown)(): Invalid group
bcs/dmg_periodic.check_one_step: Fatal error in MPI_Init_thread: Invalid group, error stack:
bcs/dmg_periodic.check_one_step: MPIR_Init_thread(586)..............:
bcs/dmg_periodic.check_one_step: MPID_Init(224).....................: channel initialization failed
bcs/dmg_periodic.check_one_step: MPIDI_CH3_Init(105)................:
bcs/dmg_periodic.check_one_step: MPID_nem_init(324).................:
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_init(175).............:
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_get_business_card(401):
bcs/dmg_periodic.check_one_step: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
bcs/dmg_periodic.check_one_step: (unknown)(): Invalid group
bcs/dmg_periodic.check_one_step:
bcs/dmg_periodic.check_one_step:
bcs/dmg_periodic.check_one_step: Exit Code: 8
bcs/dmg_periodic.check_one_step: ################################################################################
bcs/dmg_periodic.check_one_step: Tester failed, reason: CRASH
bcs/dmg_periodic.check_one_step:
bcs/dmg_periodic.check_one_step .................................................. [min_cpus=2] FAILED (CRASH)
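Every crash above shares the same line: `MPID_nem_tcp_init(373): gethostbyname failed, FN601235 (errno 0)`, i.e. MPICH's TCP channel cannot resolve this machine's hostname, so `MPI_Init_thread` aborts before any MOOSE code runs. As a quick sanity check (a sketch, assuming the hostname lookup is the culprit; `FN601235` is the hostname taken from the log), the same lookup can be reproduced without `mpiexec`:

```python
# Reproduce MPICH's gethostbyname lookup outside of mpiexec to confirm
# whether the local hostname resolves (assumption: this is the root cause).
import socket

host = socket.gethostname()
try:
    socket.gethostbyname(host)
    status = "resolves"
except socket.gaierror:
    status = "does not resolve"

print(f"{host} {status}")
# If it does not resolve, a common workaround (an assumption, not verified
# here) is adding an /etc/hosts entry such as:  127.0.0.1  FN601235
```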

multiapps/secant_postprocessor.pp_transient/app_end_transfers_begin_secant_sub ............................ OK
outputs/vtk.files/parallel: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/outputs/vtk
outputs/vtk.files/parallel: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i vtk_parallel.i --error --error-unused --error-override --no-gdb-backtrace
outputs/vtk.files/parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/vtk.files/parallel: MPIR_Init_thread(586)..............:
outputs/vtk.files/parallel: MPID_Init(224).....................: channel initialization failed
outputs/vtk.files/parallel: MPIDI_CH3_Init(105)................:
outputs/vtk.files/parallel: MPID_nem_init(324).................:
outputs/vtk.files/parallel: MPID_nem_tcp_init(175).............:
outputs/vtk.files/parallel: MPID_nem_tcp_get_business_card(401):
outputs/vtk.files/parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/vtk.files/parallel: (unknown)(): Invalid group
outputs/vtk.files/parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/vtk.files/parallel: MPIR_Init_thread(586)..............:
outputs/vtk.files/parallel: MPID_Init(224).....................: channel initialization failed
outputs/vtk.files/parallel: MPIDI_CH3_Init(105)................:
outputs/vtk.files/parallel: MPID_nem_init(324).................:
outputs/vtk.files/parallel: MPID_nem_tcp_init(175).............:
outputs/vtk.files/parallel: MPID_nem_tcp_get_business_card(401):
outputs/vtk.files/parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/vtk.files/parallel: (unknown)(): Invalid group
outputs/vtk.files/parallel:
outputs/vtk.files/parallel:
outputs/vtk.files/parallel: Exit Code: 8
outputs/vtk.files/parallel: ################################################################################
outputs/vtk.files/parallel: Tester failed, reason: CRASH
outputs/vtk.files/parallel:
outputs/vtk.files/parallel ....................................................... [min_cpus=2] FAILED (CRASH)
problems/eigen_problem/eigensolvers.eigen_as_master ....................................................... OK
ics/depend_on_uo.ic_depend_on_uo: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/ics/depend_on_uo
ics/depend_on_uo.ic_depend_on_uo: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i geometric_neighbors_ic.i --error --error-unused --error-override --no-gdb-backtrace
ics/depend_on_uo.ic_depend_on_uo: Fatal error in MPI_Init_thread: Invalid group, error stack:
ics/depend_on_uo.ic_depend_on_uo: MPIR_Init_thread(586)..............:
ics/depend_on_uo.ic_depend_on_uo: MPID_Init(224).....................: channel initialization failed
ics/depend_on_uo.ic_depend_on_uo: MPIDI_CH3_Init(105)................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_init(324).................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(175).............:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_get_business_card(401):
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
ics/depend_on_uo.ic_depend_on_uo: (unknown)(): Invalid group
ics/depend_on_uo.ic_depend_on_uo: Fatal error in MPI_Init_thread: Invalid group, error stack:
ics/depend_on_uo.ic_depend_on_uo: MPIR_Init_thread(586)..............:
ics/depend_on_uo.ic_depend_on_uo: MPID_Init(224).....................: channel initialization failed
ics/depend_on_uo.ic_depend_on_uo: MPIDI_CH3_Init(105)................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_init(324).................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(175).............:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_get_business_card(401):
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
ics/depend_on_uo.ic_depend_on_uo: (unknown)(): Invalid group
ics/depend_on_uo.ic_depend_on_uo: Fatal error in MPI_Init_thread: Invalid group, error stack:
ics/depend_on_uo.ic_depend_on_uo: MPIR_Init_thread(586)..............:
ics/depend_on_uo.ic_depend_on_uo: MPID_Init(224).....................: channel initialization failed
ics/depend_on_uo.ic_depend_on_uo: MPIDI_CH3_Init(105)................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_init(324).................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(175).............:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_get_business_card(401):
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
ics/depend_on_uo.ic_depend_on_uo: (unknown)(): Invalid group
ics/depend_on_uo.ic_depend_on_uo: Fatal error in MPI_Init_thread: Invalid group, error stack:
ics/depend_on_uo.ic_depend_on_uo: MPIR_Init_thread(586)..............:
ics/depend_on_uo.ic_depend_on_uo: MPID_Init(224).....................: channel initialization failed
ics/depend_on_uo.ic_depend_on_uo: MPIDI_CH3_Init(105)................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_init(324).................:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(175).............:
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_get_business_card(401):
ics/depend_on_uo.ic_depend_on_uo: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
ics/depend_on_uo.ic_depend_on_uo: (unknown)(): Invalid group
ics/depend_on_uo.ic_depend_on_uo:
ics/depend_on_uo.ic_depend_on_uo:
ics/depend_on_uo.ic_depend_on_uo: Exit Code: 8
ics/depend_on_uo.ic_depend_on_uo: ################################################################################
ics/depend_on_uo.ic_depend_on_uo: Tester failed, reason: CRASH
ics/depend_on_uo.ic_depend_on_uo:
ics/depend_on_uo.ic_depend_on_uo ................................................. [min_cpus=2] FAILED (CRASH)
bcs/periodic.testperiodic_dp: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/bcs/periodic
bcs/periodic.testperiodic_dp: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i periodic_bc_displaced_problem.i --error --error-unused --error-override --no-gdb-backtrace
bcs/periodic.testperiodic_dp: Fatal error in MPI_Init_thread: Invalid group, error stack:
bcs/periodic.testperiodic_dp: MPIR_Init_thread(586)..............:
bcs/periodic.testperiodic_dp: MPID_Init(224).....................: channel initialization failed
bcs/periodic.testperiodic_dp: MPIDI_CH3_Init(105)................:
bcs/periodic.testperiodic_dp: MPID_nem_init(324).................:
bcs/periodic.testperiodic_dp: MPID_nem_tcp_init(175).............:
bcs/periodic.testperiodic_dp: MPID_nem_tcp_get_business_card(401):
bcs/periodic.testperiodic_dp: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
bcs/periodic.testperiodic_dp: (unknown)(): Invalid group
bcs/periodic.testperiodic_dp: Fatal error in MPI_Init_thread: Invalid group, error stack:
bcs/periodic.testperiodic_dp: MPIR_Init_thread(586)..............:
bcs/periodic.testperiodic_dp: MPID_Init(224).....................: channel initialization failed
bcs/periodic.testperiodic_dp: MPIDI_CH3_Init(105)................:
bcs/periodic.testperiodic_dp: MPID_nem_init(324).................:
bcs/periodic.testperiodic_dp: MPID_nem_tcp_init(175).............:
bcs/periodic.testperiodic_dp: MPID_nem_tcp_get_business_card(401):
bcs/periodic.testperiodic_dp: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
bcs/periodic.testperiodic_dp: (unknown)(): Invalid group
bcs/periodic.testperiodic_dp:
bcs/periodic.testperiodic_dp:
bcs/periodic.testperiodic_dp: Exit Code: 8
bcs/periodic.testperiodic_dp: ################################################################################
bcs/periodic.testperiodic_dp: Tester failed, reason: CRASH
bcs/periodic.testperiodic_dp:
bcs/periodic.testperiodic_dp ..................................................... [min_cpus=2] FAILED (CRASH)
geomsearch/3d_moving_penetration_smoothing.overlapping/pl_test4qns ........................................ OK
multiapps/picard.steady_with_postprocessor_convergence .................................................... OK
multiapps/steffensen.variables_transient/app_begin_transfers_end_steffensen_sub ........................... OK
interfacekernels/1d_interface.reaction_1D_transient_Jac ................................................... OK
tag.controls-tagging ...................................................................................... OK
auxkernels/solution_aux.exodus_interp_restart/part2 ....................................................... OK
multiapps/picard_postprocessor.pp_transient/app_begin_transfers_begin_picard_sub .......................... OK
fvkernels/mms/non-orthogonal.compact: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/fvkernels/mms/non-orthogonal
fvkernels/mms/non-orthogonal.compact: Running command: python -m unittest -v test.TestCompactADR
fvkernels/mms/non-orthogonal.compact: The 'mms' package requires sympy for symbolic function evaluation, it can be installed by running pip install sympy --user.
fvkernels/mms/non-orthogonal.compact: Running: /Users/almeag-mac/projects1/moose/test/moose_test-opt -i advection-diffusion-reaction.i Mesh/uniform_refine=0
fvkernels/mms/non-orthogonal.compact: test (test.TestCompactADR) ... Fatal error in MPI_Init_thread: Invalid group, error stack:
fvkernels/mms/non-orthogonal.compact: MPIR_Init_thread(586)..............:
fvkernels/mms/non-orthogonal.compact: MPID_Init(224).....................: channel initialization failed
fvkernels/mms/non-orthogonal.compact: MPIDI_CH3_Init(105)................:
fvkernels/mms/non-orthogonal.compact: MPID_nem_init(324).................:
fvkernels/mms/non-orthogonal.compact: MPID_nem_tcp_init(175).............:
fvkernels/mms/non-orthogonal.compact: MPID_nem_tcp_get_business_card(401):
fvkernels/mms/non-orthogonal.compact: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
fvkernels/mms/non-orthogonal.compact: (unknown)(): Invalid group
fvkernels/mms/non-orthogonal.compact: ERROR
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact: ======================================================================
fvkernels/mms/non-orthogonal.compact: ERROR: test (test.TestCompactADR)
fvkernels/mms/non-orthogonal.compact: ----------------------------------------------------------------------
fvkernels/mms/non-orthogonal.compact: Traceback (most recent call last):
fvkernels/mms/non-orthogonal.compact: File "/Users/almeag-mac/projects1/moose/test/tests/fvkernels/mms/non-orthogonal/test.py", line 7, in test
fvkernels/mms/non-orthogonal.compact: df1 = mms.run_spatial('advection-diffusion-reaction.i', 7, mpi=2)
fvkernels/mms/non-orthogonal.compact: File "/Users/almeag-mac/projects1/moose/python/mms/runner.py", line 129, in run_spatial
fvkernels/mms/non-orthogonal.compact: return _runner(*args, rtype=SPATIAL, **kwargs)
fvkernels/mms/non-orthogonal.compact: File "/Users/almeag-mac/projects1/moose/python/mms/runner.py", line 102, in _runner
fvkernels/mms/non-orthogonal.compact: raise IOError("The CSV output does not exist: {}".format(csv))
fvkernels/mms/non-orthogonal.compact: OSError: The CSV output does not exist: None
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact: ----------------------------------------------------------------------
fvkernels/mms/non-orthogonal.compact: Ran 1 test in 0.311s
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact: FAILED (errors=1)
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact: Exit Code: 1
fvkernels/mms/non-orthogonal.compact: ################################################################################
fvkernels/mms/non-orthogonal.compact: Tester failed, reason: CRASH
fvkernels/mms/non-orthogonal.compact:
fvkernels/mms/non-orthogonal.compact ............................................. [min_cpus=2] FAILED (CRASH)
time_steppers/iteration_adaptive.multi_piecewise_linear_function_point .................................... OK
restart/restartable_types.parallel/first: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/restart/restartable_types
restart/restartable_types.parallel/first: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i restartable_types.i --error --error-unused --error-override --no-gdb-backtrace
restart/restartable_types.parallel/first: Fatal error in MPI_Init_thread: Invalid group, error stack:
restart/restartable_types.parallel/first: MPIR_Init_thread(586)..............:
restart/restartable_types.parallel/first: MPID_Init(224).....................: channel initialization failed
restart/restartable_types.parallel/first: MPIDI_CH3_Init(105)................:
restart/restartable_types.parallel/first: MPID_nem_init(324).................:
restart/restartable_types.parallel/first: MPID_nem_tcp_init(175).............:
restart/restartable_types.parallel/first: MPID_nem_tcp_get_business_card(401):
restart/restartable_types.parallel/first: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
restart/restartable_types.parallel/first: (unknown)(): Invalid group
restart/restartable_types.parallel/first:
restart/restartable_types.parallel/first:
restart/restartable_types.parallel/first: Exit Code: 8
restart/restartable_types.parallel/first: ################################################################################
restart/restartable_types.parallel/first: Tester failed, reason: CRASH
restart/restartable_types.parallel/first:
restart/restartable_types.parallel/first ......................................... [min_cpus=2] FAILED (CRASH)
restart/restartable_types.parallel/second .......................................... [skipped dependency] SKIP
geomsearch/3d_moving_penetration.pl_test4q ................................................................ OK
preconditioners/hmg.hmg_3D: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/preconditioners/hmg
preconditioners/hmg.hmg_3D: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i diffusion_hmg.i Mesh/dmg/dim=3 Mesh/dmg/nz=10 Outputs/file_base=diffusion_hmg_3d_out -log_view --error --error-unused --error-override --no-gdb-backtrace
preconditioners/hmg.hmg_3D: Fatal error in MPI_Init_thread: Invalid group, error stack:
preconditioners/hmg.hmg_3D: MPIR_Init_thread(586)..............:
preconditioners/hmg.hmg_3D: MPID_Init(224).....................: channel initialization failed
preconditioners/hmg.hmg_3D: MPIDI_CH3_Init(105)................:
preconditioners/hmg.hmg_3D: MPID_nem_init(324).................:
preconditioners/hmg.hmg_3D: MPID_nem_tcp_init(175).............:
preconditioners/hmg.hmg_3D: MPID_nem_tcp_get_business_card(401):
preconditioners/hmg.hmg_3D: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
preconditioners/hmg.hmg_3D: (unknown)(): Invalid group
preconditioners/hmg.hmg_3D: Fatal error in MPI_Init_thread: Invalid group, error stack:
preconditioners/hmg.hmg_3D: MPIR_Init_thread(586)..............:
preconditioners/hmg.hmg_3D: MPID_Init(224).....................: channel initialization failed
preconditioners/hmg.hmg_3D: MPIDI_CH3_Init(105)................:
preconditioners/hmg.hmg_3D: MPID_nem_init(324).................:
preconditioners/hmg.hmg_3D: MPID_nem_tcp_init(175).............:
preconditioners/hmg.hmg_3D: MPID_nem_tcp_get_business_card(401):
preconditioners/hmg.hmg_3D: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
preconditioners/hmg.hmg_3D: (unknown)(): Invalid group
preconditioners/hmg.hmg_3D: ################################################################################
preconditioners/hmg.hmg_3D:
preconditioners/hmg.hmg_3D: Unable to match the following pattern against the program's output:
preconditioners/hmg.hmg_3D:
preconditioners/hmg.hmg_3D: PETSc\s+Preconditioner:\s+hmg\s+strong_threshold:\s+0.7
preconditioners/hmg.hmg_3D:
preconditioners/hmg.hmg_3D: ################################################################################
preconditioners/hmg.hmg_3D: Tester failed, reason: EXPECTED OUTPUT MISSING
preconditioners/hmg.hmg_3D:
preconditioners/hmg.hmg_3D ..................................... [min_cpus=2] FAILED (EXPECTED OUTPUT MISSING)
outputs/iterative.start_stop/output_end_step .............................................................. OK
multiapps/steffensen_postprocessor.pp_transient/app_end_transfers_end_steffensen_sub ...................... OK
ics/depend_on_uo.scalar_ic_from_uo: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/ics/depend_on_uo
ics/depend_on_uo.scalar_ic_from_uo: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i scalar_ic_from_uo.i --error --error-unused --error-override --no-gdb-backtrace
ics/depend_on_uo.scalar_ic_from_uo: Fatal error in MPI_Init_thread: Invalid group, error stack:
ics/depend_on_uo.scalar_ic_from_uo: MPIR_Init_thread(586)..............:
ics/depend_on_uo.scalar_ic_from_uo: MPID_Init(224).....................: channel initialization failed
ics/depend_on_uo.scalar_ic_from_uo: MPIDI_CH3_Init(105)................:
ics/depend_on_uo.scalar_ic_from_uo: MPID_nem_init(324).................:
ics/depend_on_uo.scalar_ic_from_uo: MPID_nem_tcp_init(175).............:
ics/depend_on_uo.scalar_ic_from_uo: MPID_nem_tcp_get_business_card(401):
ics/depend_on_uo.scalar_ic_from_uo: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
ics/depend_on_uo.scalar_ic_from_uo: (unknown)(): Invalid group
ics/depend_on_uo.scalar_ic_from_uo: [identical "Invalid group" error stack repeated by the second MPI rank]
ics/depend_on_uo.scalar_ic_from_uo:
ics/depend_on_uo.scalar_ic_from_uo:
ics/depend_on_uo.scalar_ic_from_uo: Exit Code: 8
ics/depend_on_uo.scalar_ic_from_uo: ################################################################################
ics/depend_on_uo.scalar_ic_from_uo: Tester failed, reason: CRASH
ics/depend_on_uo.scalar_ic_from_uo:
ics/depend_on_uo.scalar_ic_from_uo ............................................... [min_cpus=2] FAILED (CRASH)
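Every crash above shares the same root line: `gethostbyname failed, FN601235 (errno 0)` inside `MPI_Init_thread`. That suggests the machine's host name (`FN601235` in these logs) does not resolve to an address, so MPICH's TCP channel setup fails before MOOSE ever runs. A minimal check, assuming a POSIX shell with Python 3 on the PATH (the `/etc/hosts` remedy is a common workaround for this symptom, not a confirmed fix for this issue):

```shell
# Print the local host name and test whether it resolves to an address.
# If resolution fails, adding a line such as "127.0.0.1  <hostname>" to
# /etc/hosts usually lets MPI_Init_thread's TCP setup complete.
hostname
python3 -c 'import socket, sys
name = sys.argv[1]
try:
    print(name, "->", socket.gethostbyname(name))
except OSError:
    print(name, "does NOT resolve; consider adding it to /etc/hosts")' "$(hostname)"
```

A quick smoke test afterwards is `mpiexec -n 2 hostname`: if that prints the host name twice without the "Invalid group" stack, the MOOSE tests should get past `MPI_Init_thread`.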
outputs/console.transient_perf_int ........................................................................ OK
nodalkernels/constraint_enforcement.vi/ssls_amg ........................................................... OK
mesh/mesh_only.mesh_only_checkpoint: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/mesh_only
mesh/mesh_only.mesh_only_checkpoint: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i mesh_only.i Mesh/parallel_type=distributed --mesh-only 3d_chimney.cpr --error --error-unused --error-override --no-gdb-backtrace
mesh/mesh_only.mesh_only_checkpoint: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/mesh_only.mesh_only_checkpoint: MPIR_Init_thread(586)..............:
mesh/mesh_only.mesh_only_checkpoint: MPID_Init(224).....................: channel initialization failed
mesh/mesh_only.mesh_only_checkpoint: MPIDI_CH3_Init(105)................:
mesh/mesh_only.mesh_only_checkpoint: MPID_nem_init(324).................:
mesh/mesh_only.mesh_only_checkpoint: MPID_nem_tcp_init(175).............:
mesh/mesh_only.mesh_only_checkpoint: MPID_nem_tcp_get_business_card(401):
mesh/mesh_only.mesh_only_checkpoint: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/mesh_only.mesh_only_checkpoint: (unknown)(): Invalid group
mesh/mesh_only.mesh_only_checkpoint: [identical "Invalid group" error stack repeated by another MPI rank]
mesh/mesh_only.mesh_only_checkpoint:
mesh/mesh_only.mesh_only_checkpoint:
mesh/mesh_only.mesh_only_checkpoint: Exit Code: 8
mesh/mesh_only.mesh_only_checkpoint: ################################################################################
mesh/mesh_only.mesh_only_checkpoint: Tester failed, reason: CRASH
mesh/mesh_only.mesh_only_checkpoint:
mesh/mesh_only.mesh_only_checkpoint .............................................. [min_cpus=3] FAILED (CRASH)
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/custom_partitioner
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i custom_linear_partitioner_test_displacement.i --error --error-unused --error-override --no-gdb-backtrace
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPIR_Init_thread(586)..............:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPID_Init(224).....................: channel initialization failed
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPIDI_CH3_Init(105)................:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPID_nem_init(324).................:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPID_nem_tcp_init(175).............:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPID_nem_tcp_get_business_card(401):
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: (unknown)(): Invalid group
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: [identical "Invalid group" error stack repeated by the second MPI rank]
mesh/custom_partitioner.group/custom_linear_partitioner_displacement:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: Exit Code: 8
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: ################################################################################
mesh/custom_partitioner.group/custom_linear_partitioner_displacement: Tester failed, reason: CRASH
mesh/custom_partitioner.group/custom_linear_partitioner_displacement:
mesh/custom_partitioner.group/custom_linear_partitioner_displacement ............. [min_cpus=2] FAILED (CRASH)
vectorpostprocessors/csv_reader.tester_fail: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/vectorpostprocessors/csv_reader
vectorpostprocessors/csv_reader.tester_fail: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i read.i UserObjects/tester/rank=1 UserObjects/tester/gold='1 2 3' Outputs/csv=false --error --error-unused --error-override --no-gdb-backtrace --keep-cout --redirect-output tester_fail
vectorpostprocessors/csv_reader.tester_fail: Fatal error in MPI_Init_thread: Invalid group, error stack:
vectorpostprocessors/csv_reader.tester_fail: MPIR_Init_thread(586)..............:
vectorpostprocessors/csv_reader.tester_fail: MPID_Init(224).....................: channel initialization failed
vectorpostprocessors/csv_reader.tester_fail: MPIDI_CH3_Init(105)................:
vectorpostprocessors/csv_reader.tester_fail: MPID_nem_init(324).................:
vectorpostprocessors/csv_reader.tester_fail: MPID_nem_tcp_init(175).............:
vectorpostprocessors/csv_reader.tester_fail: MPID_nem_tcp_get_business_card(401):
vectorpostprocessors/csv_reader.tester_fail: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
vectorpostprocessors/csv_reader.tester_fail: (unknown)(): Invalid group
vectorpostprocessors/csv_reader.tester_fail: [identical "Invalid group" error stack repeated by the second MPI rank]
vectorpostprocessors/csv_reader.tester_fail: ################################################################################
vectorpostprocessors/csv_reader.tester_fail:
vectorpostprocessors/csv_reader.tester_fail: Unable to match the following pattern against the program's output:
vectorpostprocessors/csv_reader.tester_fail:
vectorpostprocessors/csv_reader.tester_fail: The supplied gold data does not match the VPP data on the given rank.
vectorpostprocessors/csv_reader.tester_fail:
vectorpostprocessors/csv_reader.tester_fail: ################################################################################
vectorpostprocessors/csv_reader.tester_fail: Tester failed, reason: EXPECTED ERROR MISSING
vectorpostprocessors/csv_reader.tester_fail:
vectorpostprocessors/csv_reader.tester_fail ..................... [min_cpus=2] FAILED (EXPECTED ERROR MISSING)
geomsearch/2d_moving_penetration.pl_test3ns ............................................................... OK
geomsearch/3d_moving_penetration_smoothing.overlapping/pl_test4qnstt ...................................... OK
transfers/multiapp_nearest_node_transfer.parallel: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/transfers/multiapp_nearest_node_transfer
transfers/multiapp_nearest_node_transfer.parallel: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i parallel_master.i --error --error-unused --error-override --no-gdb-backtrace
transfers/multiapp_nearest_node_transfer.parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
transfers/multiapp_nearest_node_transfer.parallel: MPIR_Init_thread(586)..............:
transfers/multiapp_nearest_node_transfer.parallel: MPID_Init(224).....................: channel initialization failed
transfers/multiapp_nearest_node_transfer.parallel: MPIDI_CH3_Init(105)................:
transfers/multiapp_nearest_node_transfer.parallel: MPID_nem_init(324).................:
transfers/multiapp_nearest_node_transfer.parallel: MPID_nem_tcp_init(175).............:
transfers/multiapp_nearest_node_transfer.parallel: MPID_nem_tcp_get_business_card(401):
transfers/multiapp_nearest_node_transfer.parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
transfers/multiapp_nearest_node_transfer.parallel: (unknown)(): Invalid group
transfers/multiapp_nearest_node_transfer.parallel: [identical "Invalid group" error stack repeated three more times by the other MPI processes]
transfers/multiapp_nearest_node_transfer.parallel:
transfers/multiapp_nearest_node_transfer.parallel:
transfers/multiapp_nearest_node_transfer.parallel: Exit Code: 8
transfers/multiapp_nearest_node_transfer.parallel: ################################################################################
transfers/multiapp_nearest_node_transfer.parallel: Tester failed, reason: CRASH
transfers/multiapp_nearest_node_transfer.parallel:
transfers/multiapp_nearest_node_transfer.parallel ................................ [min_cpus=2] FAILED (CRASH)
materials/stateful_prop.ad/reg ............................................................................ OK
materials/derivative_material_interface.postprocessor_coupling/parsed_material ............................ OK
fvkernels/mms/non-orthogonal.extended: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/fvkernels/mms/non-orthogonal
fvkernels/mms/non-orthogonal.extended: Running command: python -m unittest -v test.TestExtendedADR
fvkernels/mms/non-orthogonal.extended: The 'mms' package requires sympy for symbolic function evaluation, it can be installed by running pip install sympy --user.
fvkernels/mms/non-orthogonal.extended: Running: /Users/almeag-mac/projects1/moose/test/moose_test-opt -i extended-adr.i Mesh/uniform_refine=0
fvkernels/mms/non-orthogonal.extended: test (test.TestExtendedADR) ... Fatal error in MPI_Init_thread: Invalid group, error stack:
fvkernels/mms/non-orthogonal.extended: MPIR_Init_thread(586)..............:
fvkernels/mms/non-orthogonal.extended: MPID_Init(224).....................: channel initialization failed
fvkernels/mms/non-orthogonal.extended: MPIDI_CH3_Init(105)................:
fvkernels/mms/non-orthogonal.extended: MPID_nem_init(324).................:
fvkernels/mms/non-orthogonal.extended: MPID_nem_tcp_init(175).............:
fvkernels/mms/non-orthogonal.extended: MPID_nem_tcp_get_business_card(401):
fvkernels/mms/non-orthogonal.extended: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
fvkernels/mms/non-orthogonal.extended: (unknown)(): Invalid group
fvkernels/mms/non-orthogonal.extended: ERROR
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended: ======================================================================
fvkernels/mms/non-orthogonal.extended: ERROR: test (test.TestExtendedADR)
fvkernels/mms/non-orthogonal.extended: ----------------------------------------------------------------------
fvkernels/mms/non-orthogonal.extended: Traceback (most recent call last):
fvkernels/mms/non-orthogonal.extended: File "/Users/almeag-mac/projects1/moose/test/tests/fvkernels/mms/non-orthogonal/test.py", line 23, in test
fvkernels/mms/non-orthogonal.extended: df1 = mms.run_spatial('extended-adr.i', 7, mpi=2)
fvkernels/mms/non-orthogonal.extended: File "/Users/almeag-mac/projects1/moose/python/mms/runner.py", line 129, in run_spatial
fvkernels/mms/non-orthogonal.extended: return _runner(*args, rtype=SPATIAL, **kwargs)
fvkernels/mms/non-orthogonal.extended: File "/Users/almeag-mac/projects1/moose/python/mms/runner.py", line 102, in _runner
fvkernels/mms/non-orthogonal.extended: raise IOError("The CSV output does not exist: {}".format(csv))
fvkernels/mms/non-orthogonal.extended: OSError: The CSV output does not exist: None
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended: ----------------------------------------------------------------------
fvkernels/mms/non-orthogonal.extended: Ran 1 test in 0.261s
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended: FAILED (errors=1)
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended: Exit Code: 1
fvkernels/mms/non-orthogonal.extended: ################################################################################
fvkernels/mms/non-orthogonal.extended: Tester failed, reason: CRASH
fvkernels/mms/non-orthogonal.extended:
fvkernels/mms/non-orthogonal.extended ............................................ [min_cpus=2] FAILED (CRASH)
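Two separate things show up in the `fvkernels` log above: the `mms` package warns that sympy is missing, and the `OSError: The CSV output does not exist: None` is most likely just the downstream symptom of the `MPI_Init_thread` crash (no solve ran, so no CSV was written). A quick way to confirm whether sympy is importable before re-running (plain Python, nothing MOOSE-specific):

```python
# Check whether sympy is installed without importing all of it.
import importlib.util

if importlib.util.find_spec("sympy") is None:
    print("sympy missing: run 'pip install sympy --user' as the log suggests")
else:
    print("sympy is available")
```

Installing sympy would silence the warning, but the test will keep failing until the MPI startup problem is resolved.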
mesh/custom_partitioner.group/custom_linear_partitioner_restart: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/mesh/custom_partitioner
mesh/custom_partitioner.group/custom_linear_partitioner_restart: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i custom_linear_partitioner_restart_test.i --error --error-unused --error-override --no-gdb-backtrace
mesh/custom_partitioner.group/custom_linear_partitioner_restart: Fatal error in MPI_Init_thread: Invalid group, error stack:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPIR_Init_thread(586)..............:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPID_Init(224).....................: channel initialization failed
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPIDI_CH3_Init(105)................:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPID_nem_init(324).................:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPID_nem_tcp_init(175).............:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPID_nem_tcp_get_business_card(401):
mesh/custom_partitioner.group/custom_linear_partitioner_restart: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
mesh/custom_partitioner.group/custom_linear_partitioner_restart: (unknown)(): Invalid group
mesh/custom_partitioner.group/custom_linear_partitioner_restart: [identical "Invalid group" error stack repeated by the second MPI rank]
mesh/custom_partitioner.group/custom_linear_partitioner_restart:
mesh/custom_partitioner.group/custom_linear_partitioner_restart:
mesh/custom_partitioner.group/custom_linear_partitioner_restart: Exit Code: 8
mesh/custom_partitioner.group/custom_linear_partitioner_restart: ################################################################################
mesh/custom_partitioner.group/custom_linear_partitioner_restart: Tester failed, reason: CRASH
mesh/custom_partitioner.group/custom_linear_partitioner_restart:
mesh/custom_partitioner.group/custom_linear_partitioner_restart .................. [min_cpus=2] FAILED (CRASH)
kernels/vector_fe.coupled_vector_gradient ................................................................. OK
vectorpostprocessors/csv_reader.read_preic ................................................................ OK
mesh/high_order_elems.test_pyramid13 ...................................................................... OK
time_steppers/iteration_adaptive.multi_piecewise_linear_function_change ................................... OK
geomsearch/2d_moving_penetration.pl_test3qns .............................................................. OK
postprocessors/num_iterations.methods/l_stable_dirk3 ...................................................... OK
multiapps/secant.variables_transient/app_begin_transfers_end_secant_sub ................................... OK
outputs/vtk.solution/diff_serial_mesh_parallel: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/outputs/vtk
outputs/vtk.solution/diff_serial_mesh_parallel: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i vtk_diff_serial_mesh_parallel.i --error --error-unused --error-override --no-gdb-backtrace
outputs/vtk.solution/diff_serial_mesh_parallel: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/vtk.solution/diff_serial_mesh_parallel: MPIR_Init_thread(586)..............:
outputs/vtk.solution/diff_serial_mesh_parallel: MPID_Init(224).....................: channel initialization failed
outputs/vtk.solution/diff_serial_mesh_parallel: MPIDI_CH3_Init(105)................:
outputs/vtk.solution/diff_serial_mesh_parallel: MPID_nem_init(324).................:
outputs/vtk.solution/diff_serial_mesh_parallel: MPID_nem_tcp_init(175).............:
outputs/vtk.solution/diff_serial_mesh_parallel: MPID_nem_tcp_get_business_card(401):
outputs/vtk.solution/diff_serial_mesh_parallel: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/vtk.solution/diff_serial_mesh_parallel: (unknown)(): Invalid group
outputs/vtk.solution/diff_serial_mesh_parallel:
outputs/vtk.solution/diff_serial_mesh_parallel: ################################################################################
outputs/vtk.solution/diff_serial_mesh_parallel: Tester failed, reason: CRASH
outputs/vtk.solution/diff_serial_mesh_parallel:
outputs/vtk.solution/diff_serial_mesh_parallel ................................... [min_cpus=2] FAILED (CRASH)
materials/stateful_prop.ad/ad ............................................................................. OK
geomsearch/3d_moving_penetration.pl_test4tt ............................................................... OK
interfaces/random.parallel_verification: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/interfaces/random
interfaces/random.parallel_verification: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i random.i --error --error-unused --error-override --no-gdb-backtrace
interfaces/random.parallel_verification: Fatal error in MPI_Init_thread: Invalid group, error stack:
interfaces/random.parallel_verification: MPIR_Init_thread(586)..............:
interfaces/random.parallel_verification: MPID_Init(224).....................: channel initialization failed
interfaces/random.parallel_verification: MPIDI_CH3_Init(105)................:
interfaces/random.parallel_verification: MPID_nem_init(324).................:
interfaces/random.parallel_verification: MPID_nem_tcp_init(175).............:
interfaces/random.parallel_verification: MPID_nem_tcp_get_business_card(401):
interfaces/random.parallel_verification: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
interfaces/random.parallel_verification: (unknown)(): Invalid group
interfaces/random.parallel_verification: [identical "Invalid group" error stack repeated by the second MPI rank]
interfaces/random.parallel_verification:
interfaces/random.parallel_verification:
interfaces/random.parallel_verification: Exit Code: 8
interfaces/random.parallel_verification: ################################################################################
interfaces/random.parallel_verification: Tester failed, reason: CRASH
interfaces/random.parallel_verification:
interfaces/random.parallel_verification .......................................... [min_cpus=2] FAILED (CRASH)
interfaces/random.test_par_mesh .................................................... [skipped dependency] SKIP
interfaces/random.threads_verification ............................................. [skipped dependency] SKIP
materials/derivative_material_interface.postprocessor_coupling/derivative_parsed_material ................. OK
outputs/console._console_const ............................................................................ OK
mesh/mesh_generation.annular_except1_deprecated ........................................................... OK
bcs/periodic.testwedge .................................................................................... OK
problems/eigen_problem/eigensolvers.eigen_as_sub .......................................................... OK
nodalkernels/constraint_enforcement.unbounded ............................................................. OK
meshgenerators/distributed_rectilinear/partition.2D_3: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/meshgenerators/distributed_rectilinear/partition
meshgenerators/distributed_rectilinear/partition.2D_3: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i squarish_partition.i --error --error-unused --error-override --no-gdb-backtrace
meshgenerators/distributed_rectilinear/partition.2D_3: Fatal error in MPI_Init_thread: Invalid group, error stack:
meshgenerators/distributed_rectilinear/partition.2D_3: MPIR_Init_thread(586)..............:
meshgenerators/distributed_rectilinear/partition.2D_3: MPID_Init(224).....................: channel initialization failed
meshgenerators/distributed_rectilinear/partition.2D_3: MPIDI_CH3_Init(105)................:
meshgenerators/distributed_rectilinear/partition.2D_3: MPID_nem_init(324).................:
meshgenerators/distributed_rectilinear/partition.2D_3: MPID_nem_tcp_init(175).............:
meshgenerators/distributed_rectilinear/partition.2D_3: MPID_nem_tcp_get_business_card(401):
meshgenerators/distributed_rectilinear/partition.2D_3: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
meshgenerators/distributed_rectilinear/partition.2D_3: (unknown)(): Invalid group
meshgenerators/distributed_rectilinear/partition.2D_3: [identical "Invalid group" error stack repeated five more times by the other MPI processes]
meshgenerators/distributed_rectilinear/partition.2D_3:
meshgenerators/distributed_rectilinear/partition.2D_3:
meshgenerators/distributed_rectilinear/partition.2D_3: Exit Code: 8
meshgenerators/distributed_rectilinear/partition.2D_3: ################################################################################
meshgenerators/distributed_rectilinear/partition.2D_3: Tester failed, reason: CRASH
meshgenerators/distributed_rectilinear/partition.2D_3:
meshgenerators/distributed_rectilinear/partition.2D_3 ............................ [min_cpus=3] FAILED (CRASH)
kernels/vector_fe.comp_error .............................................................................. OK
restart/kernel_restartable.parallel_error/error2 ................................... [skipped dependency] SKIP
restart/kernel_restartable.parallel_error/error1: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/restart/kernel_restartable
restart/kernel_restartable.parallel_error/error1: Running command: mpiexec -n 2 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i kernel_restartable.i --error --error-unused --error-override --no-gdb-backtrace
restart/kernel_restartable.parallel_error/error1: Fatal error in MPI_Init_thread: Invalid group, error stack:
restart/kernel_restartable.parallel_error/error1: MPIR_Init_thread(586)..............:
restart/kernel_restartable.parallel_error/error1: MPID_Init(224).....................: channel initialization failed
restart/kernel_restartable.parallel_error/error1: MPIDI_CH3_Init(105)................:
restart/kernel_restartable.parallel_error/error1: MPID_nem_init(324).................:
restart/kernel_restartable.parallel_error/error1: MPID_nem_tcp_init(175).............:
restart/kernel_restartable.parallel_error/error1: MPID_nem_tcp_get_business_card(401):
restart/kernel_restartable.parallel_error/error1: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
restart/kernel_restartable.parallel_error/error1: (unknown)(): Invalid group
restart/kernel_restartable.parallel_error/error1: [same MPI error stack repeated once per rank]
restart/kernel_restartable.parallel_error/error1:
restart/kernel_restartable.parallel_error/error1:
restart/kernel_restartable.parallel_error/error1: Exit Code: 8
restart/kernel_restartable.parallel_error/error1: ################################################################################
restart/kernel_restartable.parallel_error/error1: Tester failed, reason: CRASH
restart/kernel_restartable.parallel_error/error1:
restart/kernel_restartable.parallel_error/error1 ................................. [min_cpus=2] FAILED (CRASH)
restart/kernel_restartable.thread_error/threads_error .............................. [skipped dependency] SKIP
restart/kernel_restartable.thread_error/with_threads ............................... [skipped dependency] SKIP
mesh/high_order_elems.test_pyramid14 ...................................................................... OK
relationship_managers/geometric_neighbors.geometric_edge_neighbor: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/relationship_managers/geometric_neighbors
relationship_managers/geometric_neighbors.geometric_edge_neighbor: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i geometric_edge_neighbors.i Mesh/Partitioner/type=GridPartitioner Mesh/Partitioner/nx=1 Mesh/Partitioner/ny=3 --error --error-unused --error-override --no-gdb-backtrace
relationship_managers/geometric_neighbors.geometric_edge_neighbor: Fatal error in MPI_Init_thread: Invalid group, error stack:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPIR_Init_thread(586)..............:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPID_Init(224).....................: channel initialization failed
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPIDI_CH3_Init(105)................:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPID_nem_init(324).................:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPID_nem_tcp_init(175).............:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPID_nem_tcp_get_business_card(401):
relationship_managers/geometric_neighbors.geometric_edge_neighbor: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
relationship_managers/geometric_neighbors.geometric_edge_neighbor: (unknown)(): Invalid group
relationship_managers/geometric_neighbors.geometric_edge_neighbor: [same MPI error stack repeated once per rank]
relationship_managers/geometric_neighbors.geometric_edge_neighbor:
relationship_managers/geometric_neighbors.geometric_edge_neighbor:
relationship_managers/geometric_neighbors.geometric_edge_neighbor: Exit Code: 8
relationship_managers/geometric_neighbors.geometric_edge_neighbor: ################################################################################
relationship_managers/geometric_neighbors.geometric_edge_neighbor: Tester failed, reason: CRASH
relationship_managers/geometric_neighbors.geometric_edge_neighbor:
relationship_managers/geometric_neighbors.geometric_edge_neighbor ................ [min_cpus=3] FAILED (CRASH)
outputs/xml.parallel/distributed: Working Directory: /Users/almeag-mac/projects1/moose/test/tests/outputs/xml
outputs/xml.parallel/distributed: Running command: mpiexec -n 3 /Users/almeag-mac/projects1/moose/test/moose_test-opt -i xml.i VectorPostprocessors/distributed/parallel_type=DISTRIBUTED Outputs/file_base=xml_distributed_out --error --error-unused --error-override --no-gdb-backtrace
outputs/xml.parallel/distributed: Fatal error in MPI_Init_thread: Invalid group, error stack:
outputs/xml.parallel/distributed: MPIR_Init_thread(586)..............:
outputs/xml.parallel/distributed: MPID_Init(224).....................: channel initialization failed
outputs/xml.parallel/distributed: MPIDI_CH3_Init(105)................:
outputs/xml.parallel/distributed: MPID_nem_init(324).................:
outputs/xml.parallel/distributed: MPID_nem_tcp_init(175).............:
outputs/xml.parallel/distributed: MPID_nem_tcp_get_business_card(401):
outputs/xml.parallel/distributed: MPID_nem_tcp_init(373).............: gethostbyname failed, FN601235 (errno 0)
outputs/xml.parallel/distributed: (unknown)(): Invalid group
outputs/xml.parallel/distributed: [same MPI error stack repeated once per rank]
outputs/xml.parallel/distributed:
outputs/xml.parallel/distributed: ################################################################################
outputs/xml.parallel/distributed: Tester failed, reason: CRASH
outputs/xml.parallel/distributed:
outputs/xml.parallel/distributed ................................................. [min_cpus=3] FAILED (CRASH)
runWorker Exception: Traceback (most recent call last):
File "/Users/almeag-mac/projects1/moose/python/TestHarness/schedulers/Scheduler.py", line 445, in runJob
self.queueJobs(Jobs, j_lock)
File "/Users/almeag-mac/projects1/moose/python/TestHarness/schedulers/Scheduler.py", line 266, in queueJobs
self.__runner_pool_jobs.add(self.run_pool.apply_async(self.runJob, (job, Jobs, j_lock)))
File "/Users/almeag-mac/miniconda3/envs/moose/lib/python3.7/multiprocessing/pool.py", line 362, in apply_async
raise ValueError("Pool not running")
ValueError: Pool not running

[the same runWorker traceback was raised by a second worker]



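Not part of the original log, but for context: the trailing `ValueError: Pool not running` looks like a secondary symptom rather than the root cause. Once enough jobs crash, the TestHarness scheduler appears to tear down its worker pool, and any further `apply_async` on a closed or terminated `multiprocessing` pool raises exactly this error. A minimal reproduction:

```python
from multiprocessing.pool import ThreadPool

# Simulate the scheduler's run_pool being shut down after the MPI crashes.
pool = ThreadPool(2)
pool.terminate()

try:
    pool.apply_async(print, ("hello",))
    error = None
except ValueError as exc:
    error = str(exc)

print(error)  # -> Pool not running
```

So the scheduler traceback is most likely just the shutdown racing with still-queued jobs, not an independent bug.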
Steps to Reproduce

Run the MOOSE test suite (e.g. `./run_tests` from the `test` directory).

Impact

All MPI-parallel tests crash, so I could not build a working MOOSE MultiApp.
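Not part of the original report, but a possible diagnosis: every failing stack ends in `MPID_nem_tcp_init(373): gethostbyname failed, FN601235 (errno 0)`, which suggests MPICH cannot resolve the local hostname (`FN601235` here) to an IP address, so `MPI_Init_thread` aborts before any MOOSE code runs. A quick check of hostname resolution, assuming a standard Python environment:

```python
import socket

def resolves(name: str) -> bool:
    """Return True if `name` can be resolved to an IP address."""
    try:
        socket.gethostbyname(name)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    host = socket.gethostname()
    if resolves(host):
        print(f"{host} resolves; hostname lookup is probably not the problem")
    else:
        print(f"{host} does NOT resolve; MPICH's gethostbyname will fail the same way")
```

If the hostname does not resolve, a common workaround is mapping it to the loopback address in `/etc/hosts` (e.g. `127.0.0.1 <your-hostname>`), then re-running the tests.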

Labels: P: normal (a defect affecting operation with a low possibility of significant effects) · T: defect (an anomaly, which is anything that deviates from expectations)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests

    Issue actions