
docs: document FE index creation flow for storage-compute separation (存算分离) architecture #1

Merged

ybtsdst merged 3 commits into branch-4.0-dev from copilot/fe on Mar 19, 2026

Conversation


Copilot AI commented Mar 18, 2026

  • Explore codebase to understand FE index creation flow in cloud (存算分离) architecture
  • Create README.md documentation in the cloud/alter directory explaining the complete FE processing flow for CREATE INDEX in storage-compute separation mode
  • Add JavaDoc comments to key methods in CloudSchemaChangeHandler and CloudSchemaChangeJobV2 to document the flow
  • Revert logger change in CloudSchemaChangeJobV2 (restore SchemaChangeJobV2.class per review feedback)
Original prompt

Using CREATE INDEX as an example, document the related FE-side processing flow for the storage-compute separation (存算分离) architecture.



@github-actions

sh-checker report

To get the full details, please check the job output.

shellcheck errors

'shellcheck' returned error 1, finding the following syntactical issues:

----------

In .devcontainer/ci-transwarp/build.sh line 26:
    ${build_args} \
    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    "${build_args}" \
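
The SC2086 finding above can be demonstrated with a short sketch. The value containing a space is a hypothetical stand-in for what `${build_args}` or a path variable could hold at runtime:

```shell
# SC2086 demo: an unquoted expansion undergoes word splitting (and glob
# expansion), so one value can become several arguments.
value="my dir"
set -- ${value}        # unquoted: splits on whitespace into 2 args
unquoted_argc=$#
set -- "${value}"      # quoted: stays a single argument
quoted_argc=$#
echo "unquoted=${unquoted_argc} quoted=${quoted_argc}"
```

This is why the suggested fixes simply wrap each expansion in double quotes; the command's argument list then no longer depends on the variable's contents.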


In .devcontainer/ci-transwarp/build.sh line 27:
    -t doris_runtime:${tag} \
                     ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    -t doris_runtime:"${tag}" \


In .devcontainer/devtools/local_cluster.sh line 22:
    len=$(jq '.cluster | length' ${cluster_config_file})
                                 ^--------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    len=$(jq '.cluster | length' "${cluster_config_file}")


In .devcontainer/devtools/local_cluster.sh line 24:
    if [ "${cluster_id}" -lt 0 ] || [ "${cluster_id}" -ge "${len}" ]; then
       ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                    ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "${cluster_id}" -lt 0 ]] || [[ "${cluster_id}" -ge "${len}" ]]; then
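
A minimal sketch of why SC2292 prefers `[[ ]]`: `[[` is a bash keyword that parses its expression before expansion, so an unquoted empty variable cannot leave the test malformed the way it can with the `[` builtin:

```shell
# With [ ], the empty unquoted variable vanishes and "[ -n ]" degenerates
# to a one-argument test that is always true; [[ ]] evaluates correctly.
empty=""
[ -n $empty ]  && echo "single-bracket: true (wrong)"
[[ -n $empty ]] || echo "double-bracket: false (correct)"
```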


In .devcontainer/devtools/local_cluster.sh line 29:
    fe_count=$(jq ".cluster[${cluster_id}].fe" ${cluster_config_file})
                                               ^--------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    fe_count=$(jq ".cluster[${cluster_id}].fe" "${cluster_config_file}")


In .devcontainer/devtools/local_cluster.sh line 30:
    be_count=$(jq ".cluster[${cluster_id}].be" ${cluster_config_file})
                                               ^--------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    be_count=$(jq ".cluster[${cluster_id}].be" "${cluster_config_file}")


In .devcontainer/devtools/local_cluster.sh line 43:
    mkdir -p ${be_path}
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${be_path}"


In .devcontainer/devtools/local_cluster.sh line 45:
    if [ ! -e "${be_path}/bin" ]; then
       ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ ! -e "${be_path}/bin" ]]; then


In .devcontainer/devtools/local_cluster.sh line 46:
        cp -r -p ${be_output_dir}/bin ${be_path}/bin
                 ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                      ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${be_output_dir}"/bin "${be_path}"/bin


In .devcontainer/devtools/local_cluster.sh line 49:
    if [ ! -e "${be_path}/conf" ]; then
       ^------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ ! -e "${be_path}/conf" ]]; then


In .devcontainer/devtools/local_cluster.sh line 50:
        mkdir -p ${be_path}/conf
                 ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        mkdir -p "${be_path}"/conf


In .devcontainer/devtools/local_cluster.sh line 51:
        cp -r -p ${project_dir}/conf/be.conf ${be_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/be.conf "${be_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 52:
        cp -r -p ${project_dir}/conf/lsan_suppr.conf ${be_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                                     ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/lsan_suppr.conf "${be_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 53:
        cp -r -p ${project_dir}/conf/asan_suppr.conf ${be_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                                     ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/asan_suppr.conf "${be_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 54:
        cp -r -p ${project_dir}/conf/odbcinst.ini ${be_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/odbcinst.ini "${be_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 57:
        sed -i '/# priority_networks/a priority_networks = 127.0.0.1' ${be_path}/conf/be.conf
                                                                      ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i '/# priority_networks/a priority_networks = 127.0.0.1' "${be_path}"/conf/be.conf


In .devcontainer/devtools/local_cluster.sh line 58:
        sed -i 's/^be_port = .*/be_port = '"${target_be_port}"'/' ${be_path}/conf/be.conf
                                                                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^be_port = .*/be_port = '"${target_be_port}"'/' "${be_path}"/conf/be.conf


In .devcontainer/devtools/local_cluster.sh line 59:
        sed -i 's/^webserver_port = .*/webserver_port = '"${target_webserver_port}"'/' ${be_path}/conf/be.conf
                                                                                       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^webserver_port = .*/webserver_port = '"${target_webserver_port}"'/' "${be_path}"/conf/be.conf


In .devcontainer/devtools/local_cluster.sh line 60:
        sed -i 's/^heartbeat_service_port = .*/heartbeat_service_port = '"${target_heartbeat_service_port}"'/' ${be_path}/conf/be.conf
                                                                                                               ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^heartbeat_service_port = .*/heartbeat_service_port = '"${target_heartbeat_service_port}"'/' "${be_path}"/conf/be.conf


In .devcontainer/devtools/local_cluster.sh line 61:
        sed -i 's/^brpc_port = .*/brpc_port = '"${target_brpc_port}"'/' ${be_path}/conf/be.conf
                                                                        ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^brpc_port = .*/brpc_port = '"${target_brpc_port}"'/' "${be_path}"/conf/be.conf


In .devcontainer/devtools/local_cluster.sh line 64:
    mkdir -p ${be_path}/connectors
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${be_path}"/connectors


In .devcontainer/devtools/local_cluster.sh line 65:
    mkdir -p ${be_path}/jdbc_drivers
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${be_path}"/jdbc_drivers


In .devcontainer/devtools/local_cluster.sh line 66:
    mkdir -p ${be_path}/log
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${be_path}"/log


In .devcontainer/devtools/local_cluster.sh line 67:
    mkdir -p ${be_path}/storage
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${be_path}"/storage


In .devcontainer/devtools/local_cluster.sh line 69:
    if [ -d "${be_output_dir}/dict" ] && [ ! -e "${be_path}/dict" ]; then
       ^----------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                         ^------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${be_output_dir}/dict" ]] && [[ ! -e "${be_path}/dict" ]]; then


In .devcontainer/devtools/local_cluster.sh line 73:
    if [ -d "${be_output_dir}/lib" ] && [ ! -e "${be_path}/lib" ]; then
       ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                        ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${be_output_dir}/lib" ]] && [[ ! -e "${be_path}/lib" ]]; then


In .devcontainer/devtools/local_cluster.sh line 77:
    if [ -d "${be_output_dir}/tools" ] && [ ! -e "${be_path}/tools" ]; then
       ^-----------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                          ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${be_output_dir}/tools" ]] && [[ ! -e "${be_path}/tools" ]]; then


In .devcontainer/devtools/local_cluster.sh line 81:
    if [ -d "${be_output_dir}/www" ] && [ ! -e "${be_path}/www" ]; then
       ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                        ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${be_output_dir}/www" ]] && [[ ! -e "${be_path}/www" ]]; then


In .devcontainer/devtools/local_cluster.sh line 95:
    mkdir -p ${fe_path}
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"


In .devcontainer/devtools/local_cluster.sh line 97:
    if [ ! -e "${fe_path}/bin" ]; then
       ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ ! -e "${fe_path}/bin" ]]; then


In .devcontainer/devtools/local_cluster.sh line 98:
        cp -r -p ${fe_output_dir}/bin ${fe_path}/bin
                 ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                      ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${fe_output_dir}"/bin "${fe_path}"/bin


In .devcontainer/devtools/local_cluster.sh line 101:
    if [ ! -e "${fe_path}/conf" ]; then
       ^------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ ! -e "${fe_path}/conf" ]]; then


In .devcontainer/devtools/local_cluster.sh line 102:
        mkdir -p ${fe_path}/conf
                 ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        mkdir -p "${fe_path}"/conf


In .devcontainer/devtools/local_cluster.sh line 103:
        mkdir -p ${fe_path}/ssl
                 ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        mkdir -p "${fe_path}"/ssl


In .devcontainer/devtools/local_cluster.sh line 104:
        cp -r -p ${project_dir}/conf/fe.conf ${fe_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/fe.conf "${fe_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 105:
        cp -r -p ${project_dir}/conf/ldap.conf ${fe_path}/conf/
                 ^------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                               ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cp -r -p "${project_dir}"/conf/ldap.conf "${fe_path}"/conf/


In .devcontainer/devtools/local_cluster.sh line 108:
        sed -i '/# priority_networks/a priority_networks = 127.0.0.1' ${fe_path}/conf/fe.conf
                                                                      ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i '/# priority_networks/a priority_networks = 127.0.0.1' "${fe_path}"/conf/fe.conf


In .devcontainer/devtools/local_cluster.sh line 109:
        sed -i 's/^http_port = .*/http_port = '"${target_fe_http_port}"'/' ${fe_path}/conf/fe.conf
                                                                           ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^http_port = .*/http_port = '"${target_fe_http_port}"'/' "${fe_path}"/conf/fe.conf


In .devcontainer/devtools/local_cluster.sh line 110:
        sed -i 's/^rpc_port = .*/rpc_port = '"${target_fe_rpc_port}"'/' ${fe_path}/conf/fe.conf
                                                                        ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^rpc_port = .*/rpc_port = '"${target_fe_rpc_port}"'/' "${fe_path}"/conf/fe.conf


In .devcontainer/devtools/local_cluster.sh line 111:
        sed -i 's/^query_port = .*/query_port = '"${target_fe_query_port}"'/' ${fe_path}/conf/fe.conf
                                                                              ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^query_port = .*/query_port = '"${target_fe_query_port}"'/' "${fe_path}"/conf/fe.conf


In .devcontainer/devtools/local_cluster.sh line 112:
        sed -i 's/^edit_log_port = .*/edit_log_port = '"${target_fe_edit_log_port}"'/' ${fe_path}/conf/fe.conf
                                                                                       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        sed -i 's/^edit_log_port = .*/edit_log_port = '"${target_fe_edit_log_port}"'/' "${fe_path}"/conf/fe.conf


In .devcontainer/devtools/local_cluster.sh line 115:
    mkdir -p ${fe_path}/connectors
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/connectors


In .devcontainer/devtools/local_cluster.sh line 116:
    mkdir -p ${fe_path}/jdbc_drivers
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/jdbc_drivers


In .devcontainer/devtools/local_cluster.sh line 117:
    mkdir -p ${fe_path}/doris-meta
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/doris-meta


In .devcontainer/devtools/local_cluster.sh line 118:
    mkdir -p ${fe_path}/log
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/log


In .devcontainer/devtools/local_cluster.sh line 119:
    mkdir -p ${fe_path}/minidump
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/minidump


In .devcontainer/devtools/local_cluster.sh line 120:
    mkdir -p ${fe_path}/temp_dir
             ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    mkdir -p "${fe_path}"/temp_dir


In .devcontainer/devtools/local_cluster.sh line 122:
    if [ -d "${fe_output_dir}/lib" ] && [ ! -e "${fe_path}/lib" ]; then
       ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                        ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${fe_output_dir}/lib" ]] && [[ ! -e "${fe_path}/lib" ]]; then


In .devcontainer/devtools/local_cluster.sh line 126:
    if [ -d "${fe_output_dir}/mysql_ssl_default_certificate" ] && [ ! -e "${fe_path}/mysql_ssl_default_certificate" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                                                  ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${fe_output_dir}/mysql_ssl_default_certificate" ]] && [[ ! -e "${fe_path}/mysql_ssl_default_certificate" ]]; then


In .devcontainer/devtools/local_cluster.sh line 130:
    if [ -d "${fe_output_dir}/plugins" ] && [ ! -e "${fe_path}/plugins" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                            ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${fe_output_dir}/plugins" ]] && [[ ! -e "${fe_path}/plugins" ]]; then


In .devcontainer/devtools/local_cluster.sh line 134:
    if [ -d "${fe_output_dir}/spark-dpp" ] && [ ! -e "${fe_path}/spark-dpp" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                              ^-----------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${fe_output_dir}/spark-dpp" ]] && [[ ! -e "${fe_path}/spark-dpp" ]]; then


In .devcontainer/devtools/local_cluster.sh line 138:
    if [ -d "${fe_output_dir}/webroot" ] && [ ! -e "${fe_path}/webroot" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                            ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ -d "${fe_output_dir}/webroot" ]] && [[ ! -e "${fe_path}/webroot" ]]; then


In .devcontainer/devtools/local_cluster.sh line 147:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")
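
The SC2162 part of this finding ("read without -r will mangle backslashes") can be seen with a contrived input line containing a literal backslash:

```shell
# SC2162 demo: without -r, read treats a backslash as an escape character
# and strips it; with -r the input is taken literally.
printf 'a\\tb\n' | { read line;    echo "without -r: ${line}"; }
printf 'a\\tb\n' | { read -r line; echo "with -r: ${line}"; }
```

For the cluster config values here the inputs are numeric, so the risk is theoretical, but `read -r` is the safe default.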


In .devcontainer/devtools/local_cluster.sh line 150:
    for((i=0; i<$be_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${be_count}; i++)); do
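
SC2004 is purely stylistic: inside `(( ))` arithmetic contexts, variables are already evaluated by name, so the `$` adds nothing. A small sketch with a hypothetical loop bound:

```shell
# SC2004 demo: inside (( )), variables are referenced without $.
count=3
total=0
for ((i = 0; i < count; i++)); do
  total=$((total + i))
done
echo "total=${total}"
```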


In .devcontainer/devtools/local_cluster.sh line 151:
      prepare_single_be ${cluster_id} ${i}
                        ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                      ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      prepare_single_be "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 159:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")


In .devcontainer/devtools/local_cluster.sh line 162:
    for((i=0; i<$fe_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${fe_count}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 163:
      prepare_single_fe ${cluster_id} ${i}
                        ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                      ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      prepare_single_fe "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 168:
    len=$(jq '.cluster | length' ${cluster_config_file})
                                 ^--------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    len=$(jq '.cluster | length' "${cluster_config_file}")


In .devcontainer/devtools/local_cluster.sh line 171:
    for((i=0; i<$len; i++)); do
                ^--^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${len}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 172:
        prepare_fe ${i}
                   ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        prepare_fe "${i}"


In .devcontainer/devtools/local_cluster.sh line 173:
        prepare_be ${i}
                   ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        prepare_be "${i}"


In .devcontainer/devtools/local_cluster.sh line 182:
    bash ${DORIS_HOME}/bin/start_be.sh --daemon
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    bash "${DORIS_HOME}"/bin/start_be.sh --daemon


In .devcontainer/devtools/local_cluster.sh line 190:
    bash ${DORIS_HOME}/bin/stop_be.sh
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    bash "${DORIS_HOME}"/bin/stop_be.sh


In .devcontainer/devtools/local_cluster.sh line 198:
    bash ${DORIS_HOME}/bin/start_fe.sh --daemon
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    bash "${DORIS_HOME}"/bin/start_fe.sh --daemon


In .devcontainer/devtools/local_cluster.sh line 206:
    bash ${DORIS_HOME}/bin/stop_fe.sh
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    bash "${DORIS_HOME}"/bin/stop_fe.sh


In .devcontainer/devtools/local_cluster.sh line 213:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")


In .devcontainer/devtools/local_cluster.sh line 216:
    for((i=0; i<$fe_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${fe_count}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 217:
      start_single_fe ${cluster_id} ${i}
                      ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      start_single_fe "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 225:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")


In .devcontainer/devtools/local_cluster.sh line 228:
    for((i=0; i<$fe_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${fe_count}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 229:
      stop_single_fe ${cluster_id} ${i}
                     ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                   ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      stop_single_fe "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 236:
    stop_fe ${cluster_id}
            ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    stop_fe "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 237:
    start_fe ${cluster_id}
             ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    start_fe "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 244:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")


In .devcontainer/devtools/local_cluster.sh line 247:
    for((i=0; i<$be_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${be_count}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 248:
      start_single_be ${cluster_id} ${i}
                      ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      start_single_be "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 256:
    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
    ^--^ SC2162 (info): read without -r will mangle backslashes.
                               ^-- SC2046 (warning): Quote this to prevent word splitting.
                                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    read fe_count be_count <<< $(get_cluster_config "${cluster_id}")


In .devcontainer/devtools/local_cluster.sh line 259:
    for((i=0; i<$be_count; i++)); do
                ^-------^ SC2004 (style): $/${} is unnecessary on arithmetic variables.
                ^-------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for((i=0; i<${be_count}; i++)); do


In .devcontainer/devtools/local_cluster.sh line 260:
      stop_single_be ${cluster_id} ${i}
                     ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                   ^--^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
      stop_single_be "${cluster_id}" "${i}"


In .devcontainer/devtools/local_cluster.sh line 267:
    stop_be ${cluster_id}
            ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    stop_be "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 268:
    start_be ${cluster_id}
             ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    start_be "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 274:
    start_fe ${cluster_id}
             ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    start_fe "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 275:
    start_be ${cluster_id}
             ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    start_be "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 281:
    stop_fe ${cluster_id}
            ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    stop_fe "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 282:
    stop_be ${cluster_id}
            ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    stop_be "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 288:
    stop_cluster ${cluster_id}
                 ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    stop_cluster "${cluster_id}"


In .devcontainer/devtools/local_cluster.sh line 289:
    start_cluster ${cluster_id}
                  ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    start_cluster "${cluster_id}"


In .devcontainer/postCreateCommand.sh line 9:
  mkdir -p $HOME/.config/ccache
           ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
           ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  mkdir -p "${HOME}"/.config/ccache


In .devcontainer/postCreateCommand.sh line 10:
  echo "cache_dir = /opt/ccache" >> $HOME/.config/ccache/ccache.conf
                                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "cache_dir = /opt/ccache" >> "${HOME}"/.config/ccache/ccache.conf


In .devcontainer/postCreateCommand.sh line 11:
  echo "max_size = 20.0G" >> $HOME/.config/ccache/ccache.conf
                             ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "max_size = 20.0G" >> "${HOME}"/.config/ccache/ccache.conf


In .devcontainer/postCreateCommand.sh line 13:
  echo "unset http_proxy" >> $HOME/.bashrc
                             ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "unset http_proxy" >> "${HOME}"/.bashrc


In .devcontainer/postCreateCommand.sh line 14:
  echo "unset https_proxy" >> $HOME/.bashrc
                              ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "unset https_proxy" >> "${HOME}"/.bashrc


In .devcontainer/postCreateCommand.sh line 17:
  pushd $HOME/.vscode-server/data/Machine
  ^-- SC2164 (warning): Use 'pushd ... || exit' or 'pushd ... || return' in case pushd fails.
        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  pushd "${HOME}"/.vscode-server/data/Machine || exit


In .devcontainer/postCreateCommand.sh line 20:
  popd
  ^--^ SC2164 (warning): Use 'popd ... || exit' or 'popd ... || return' in case popd fails.

Did you mean: 
  popd || exit
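
A minimal sketch (not part of the shellcheck output) of the guard SC2164 asks for: without `|| exit` / `|| return`, a failed `pushd` leaves the rest of the script running in whatever directory it was already in. The `safe_enter` helper and the directory name are made up for the demo.

```shell
#!/usr/bin/env bash
# Demonstrates SC2164: guard pushd/popd so a failure stops the caller.
safe_enter() {
    pushd "$1" >/dev/null 2>&1 || return 1    # stop instead of continuing in $PWD
    # ... work inside "$1" would go here ...
    popd >/dev/null || return 1
}

safe_enter /no/such/dir && echo "entered" || echo "skipped"    # prints "skipped"
```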


In .devcontainer/postCreateCommand.sh line 23:
main $@
     ^-- SC2068 (error): Double quote array expansions to avoid re-splitting elements.
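
An illustrative sketch (not part of the shellcheck output) of why SC2068 is an error rather than a style note: unquoted `$@` re-applies word splitting, so an argument containing a space arrives as two arguments. The `count_args`/`demo` helpers are hypothetical.

```shell
#!/usr/bin/env bash
# Demonstrates SC2068: unquoted $@ re-splits arguments on whitespace.
count_args() { echo "$#"; }

demo() {
    local unquoted quoted
    unquoted=$(count_args $@)     # "a b" is re-split into two words
    quoted=$(count_args "$@")     # "a b" stays one argument
    echo "${unquoted} ${quoted}"
}

demo "a b" c    # prints "3 2"
```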


In bin/flight_record_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In bin/profile_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In build-support/clang-format.sh line 43:
    export PATH=$(brew --prefix llvm@16)/bin:$PATH
           ^--^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    export PATH=$(brew --prefix llvm@16)/bin:${PATH}
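
A sketch (not part of the report) of what SC2155 protects against: `export VAR=$(cmd)` returns the status of `export`, so a failing `cmd` goes unnoticed, while a bare assignment preserves the command's status. Here `false` stands in for a failing `brew --prefix llvm@16`.

```shell
#!/usr/bin/env bash
# Demonstrates SC2155: combined export+assignment masks the command's status.
combined() {
    export P=$(false)    # $? is export's status, always 0
    echo "$?"
}
separate() {
    local out
    out=$(false)         # $? is the command substitution's status
    echo "$?"
}
echo "$(combined) $(separate)"    # prints "0 1"
```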


In build.sh line 244:
            BUILD_SPARK_DPP=1
            ^-------------^ SC2034 (warning): BUILD_SPARK_DPP appears unused. Verify use (or export if used externally).


In build.sh line 542:
FEAT+=($([[ -n "${WITH_TDE_DIR}" ]] && echo "+TDE" || echo "-TDE"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 543:
FEAT+=($([[ "${ENABLE_HDFS_STORAGE_VAULT:-OFF}" == "ON" ]] && echo "+HDFS_STORAGE_VAULT" || echo "-HDFS_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 544:
FEAT+=($([[ ${BUILD_UI} -eq 1 ]] && echo "+UI" || echo "-UI"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 545:
FEAT+=($([[ "${BUILD_AZURE}" == "ON" ]] && echo "+AZURE_BLOB,+AZURE_STORAGE_VAULT" || echo "-AZURE_BLOB,-AZURE_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 546:
FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 547:
FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).
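
A sketch (not part of the report) of the `read -a` pattern SC2207 recommends for populating arrays like build.sh's `FEAT`: a single computed word can be appended as a quoted expansion, and comma-joined output split explicitly. `WITH_TDE_DIR` is reused from the flagged line; the values are illustrative.

```shell
#!/usr/bin/env bash
# Demonstrates the SC2207-safe alternatives to FEAT+=($(cmd)).
FEAT=()

# One word: quote the command substitution directly.
FEAT+=("$([[ -n "${WITH_TDE_DIR:-}" ]] && echo "+TDE" || echo "-TDE")")

# Several words: split explicitly with read -a instead of relying on IFS.
IFS=',' read -r -a azure <<< "+AZURE_BLOB,+AZURE_STORAGE_VAULT"
FEAT+=("${azure[@]}")

echo "${#FEAT[@]}"    # prints "3"
```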


In build.sh line 549:
export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
       ^----------------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In build.sh line 702:
        -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                    ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In build.sh line 768:
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
                                                                                                                                            ^----------------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} "${DEPENDENCIES_MVN_OPTS}"  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C


In build.sh line 770:
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -T 1C
                                                                                                                                            ^----------------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} "${DEPENDENCIES_MVN_OPTS}"  -T 1C


In build.sh line 774:
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
                                                                                                                     ^----------------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} "${DEPENDENCIES_MVN_OPTS}"  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C


In build.sh line 776:
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -T 1C
                                                                                                                     ^----------------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} "${DEPENDENCIES_MVN_OPTS}"  -T 1C


In build.sh line 830:
    if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                            ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" = "Darwin" ]] || [[ "${TARGET_SYSTEM}" = "Linux" ]]; then


In build.sh line 983:
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'x86_64' ]]; then
                                                  ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'x86_64' ]]; then


In build.sh line 987:
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'aarch64' ]]; then
                                                    ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'aarch64' ]]; then


In cloud/script/run_all_tests.sh line 175:
exit ${ret}
     ^----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
exit "${ret}"


In cloud/script/start.sh line 59:
  source "${custom_start}" 
         ^---------------^ SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.
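
A sketch (not part of the report) of why this finding is an error: `//` is not comment syntax in shell, so the line is executed as a command named `//` (a directory) and fails at runtime; `#` introduces a real comment.

```shell
#!/usr/bin/env bash
# Demonstrates SC1127: `//` runs a command, `#` is the comment.
status=unset
// clear output file 2>/dev/null && status=ran || status=failed
# clear output file    <- parsed as a comment, nothing executed
echo "${status}"    # prints "failed": `//` is a directory, not a runnable command
```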


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 172:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 173:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 179:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 215:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 216:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 24:
    [ -e "$file" ] || continue
    ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    [[ -e "${file}" ]] || continue


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 25:
    tar -xzvf "$file" -C "$AUX_LIB"
               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    tar -xzvf "${file}" -C "${AUX_LIB}"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 38:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
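
A sketch (not part of the report) of what SC2091 is about: `$( )` substitutes the command's *output* back into the line and tries to execute it, while the `while` condition only needs the exit status. `port_open` is a hypothetical stand-in for `nc -z localhost 9083`.

```shell
#!/usr/bin/env bash
# Demonstrates SC2091: test the exit status directly, without $( ).
port_open() { return 1; }    # stand-in for `nc -z` while the port is closed

tries=0
while ! port_open && [[ ${tries} -lt 3 ]]; do
    tries=$((tries + 1))     # in the real script: sleep and retry
done
echo "${tries}"    # prints "3"
```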


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
if [[ ${NEED_LOAD_DATA} = "0" ]]; then
      ^---------------^ SC2154 (warning): NEED_LOAD_DATA is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 49:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 54:
    echo "Script: create_paimon_table.hql executed in $EXECUTION_TIME seconds"
                                                      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Script: create_paimon_table.hql executed in ${EXECUTION_TIME} seconds"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 64:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
                                                                      ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                                                                                       ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 119:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                                    ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 22:
find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
     ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
find "${CUR_DIR}"/data -type f -name "*.tar.gz" -print0 | \


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 23:
xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
                ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                          ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 33:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 34:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tpch1.db.tar.gz
                    ^-------------^ SC2154 (warning): s3BucketName is referenced but not assigned.
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2154 (warning): s3Endpoint is referenced but not assigned.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tpch1.db.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 45:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 46:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tvf_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tvf_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 58:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 70:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 82:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 94:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 106:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_tvf/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_tvf/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 118:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 144:
cd ${CUR_DIR}/auxlib
   ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
cd "${CUR_DIR}"/auxlib


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 85:
METASTORE_HOST=$(echo "${HIVE_METASTORE_URIS}" | sed 's|thrift://||' | cut -d: -f1)
                       ^--------------------^ SC2154 (warning): HIVE_METASTORE_URIS is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 90:
while [ $RETRY_COUNT -lt $MAX_RETRIES ]; do
      ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
        ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
        ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                         ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                         ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
while [[ "${RETRY_COUNT}" -lt "${MAX_RETRIES}" ]]; do


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 103:
    if [ $RETRY_COUNT -eq 0 ]; then
       ^--------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${RETRY_COUNT}" -eq 0 ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 111:
  if [ $((RETRY_COUNT % 10)) -eq 0 ]; then
     ^-----------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
  if [[ $((RETRY_COUNT % 10)) -eq 0 ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 117:
if [ $RETRY_COUNT -ge $MAX_RETRIES ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
     ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                      ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${RETRY_COUNT}" -ge "${MAX_RETRIES}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 127:
    <value>${S3_ENDPOINT}</value>
           ^------------^ SC2154 (warning): S3_ENDPOINT is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 131:
    <value>${MINIO_ROOT_USER}</value>
           ^----------------^ SC2154 (warning): MINIO_ROOT_USER is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 135:
    <value>${MINIO_ROOT_PASSWORD}</value>
           ^--------------------^ SC2154 (warning): MINIO_ROOT_PASSWORD is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 165:
    <value>s3a://${HUDI_BUCKET}/warehouse</value>
                 ^------------^ SC2154 (warning): HUDI_BUCKET is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 171:
HUDI_BUNDLE_JAR_FILE=$(download_jar "hudi-spark3.5-bundle_2.12" "${HUDI_BUNDLE_VERSION}" "${HUDI_BUNDLE_URL}")
                                                                 ^--------------------^ SC2154 (warning): HUDI_BUNDLE_VERSION is referenced but not assigned.
                                                                                          ^----------------^ SC2154 (warning): HUDI_BUNDLE_URL is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 176:
HADOOP_AWS_JAR=$(download_jar "hadoop-aws" "${HADOOP_AWS_VERSION}" "${HADOOP_AWS_URL}")
                                            ^-------------------^ SC2154 (warning): HADOOP_AWS_VERSION is referenced but not assigned.
                                                                    ^---------------^ SC2154 (warning): HADOOP_AWS_URL is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 181:
AWS_SDK_BUNDLE_JAR=$(download_jar "aws-java-sdk-bundle" "${AWS_SDK_BUNDLE_VERSION}" "${AWS_SDK_BUNDLE_URL}")
                                                         ^-----------------------^ SC2154 (warning): AWS_SDK_BUNDLE_VERSION is referenced but not assigned.
                                                                                     ^-------------------^ SC2154 (warning): AWS_SDK_BUNDLE_URL is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 185:
POSTGRESQL_JDBC_JAR=$(download_jar "postgresql" "${POSTGRESQL_JDBC_VERSION}" "${POSTGRESQL_JDBC_URL}")
                                                 ^------------------------^ SC2154 (warning): POSTGRESQL_JDBC_VERSION is referenced but not assigned.
                                                                              ^--------------------^ SC2154 (warning): POSTGRESQL_JDBC_URL is referenced but not assigned.


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 209:
  ${SPARK_HOME}/bin/spark-sql \
  ^-----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
  "${SPARK_HOME}"/bin/spark-sql \


In docker/thirdparties/docker-compose/hudi/scripts/init.sh line 229:
touch ${SUCCESS_FILE}
      ^-------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
touch "${SUCCESS_FILE}"


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 74:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 81:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 83:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 90:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 27:
echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
                                                                ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                      ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Waiting for Polaris health check at http://${HOST}:${PORT}/q/health ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 28:
for i in $(seq 1 120); do
^-^ SC2034 (warning): i appears unused. Verify use (or export if used externally).


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 29:
  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  if curl -sSf "http://${HOST}:8182/q/health" >/dev/null; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 38:
  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/catalog/v1/oauth/tokens" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 40:
  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
                                              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "grant_type=client_credentials&client_id=${USER}&client_secret=${PASS}&scope=PRINCIPAL_ROLE:ALL")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 43:
TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
                     ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
TOKEN=$(printf "%s" "${TOKEN_JSON}" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 45:
if [ -z "$TOKEN" ]; then
         ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ -z "${TOKEN}" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 46:
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
                                                                      ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: ${TOKEN_JSON}" >&2


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 50:
echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
                                       ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                            ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Creating catalog '${CATALOG}' with base '${BASE_LOCATION}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 53:
  "name": "$CATALOG",
           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  "name": "${CATALOG}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 56:
    "default-base-location": "$BASE_LOCATION",
                              ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "default-base-location": "${BASE_LOCATION}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 66:
    "allowedLocations": ["$BASE_LOCATION"]
                          ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "allowedLocations": ["${BASE_LOCATION}"]


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 74:
  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/catalogs" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 75:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 77:
  -d "$CREATE_PAYLOAD")
      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "${CREATE_PAYLOAD}")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 79:
if [ "$HTTP_CODE" = "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" = "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 81:
elif [ "$HTTP_CODE" = "409" ]; then
        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
elif [ "${HTTP_CODE}" = "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 84:
  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Create catalog failed (HTTP ${HTTP_CODE}):"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 89:
echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
                                                         ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Setting up permissions for catalog '${CATALOG}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 94:
  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/catalogs/${CATALOG}/catalog-roles/catalog_admin/grants" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 95:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 99:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 100:
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 107:
  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 108:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 112:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ] && [ "${HTTP_CODE}" != "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 113:
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
                                                                          ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 120:
  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principal-roles/data_engineer/catalog-roles/${CATALOG}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 121:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 125:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 126:
  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
                                                              ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to connect roles (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 133:
  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principals/root/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 134:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 138:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 139:
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
                                                                                  ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 55:
export IP_HOST=$(ip -4 addr show scope global | awk '/inet / {print $2}' | cut -d/ -f1 | head -n 1)
       ^-----^ SC2155 (warning): Declare and assign separately to avoid masking return values.
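SC2155 is about exit-status masking, not quoting: in `export VAR=$(cmd)` the status of `cmd` is discarded because `export` itself returns 0. A sketch using `false` as a stand-in for a command that can fail:

```shell
#!/bin/bash
# Combined declare-and-assign: the failure of the substitution is invisible.
export COMBINED=$(false)
combined_status=$?            # 0 -- 'export' masked the failure

# Assigning first preserves the substitution's exit status.
if SEPARATE=$(false); then
  separate_status=0
else
  separate_status=1           # the failure is now visible
fi
export SEPARATE

echo "${combined_status} ${separate_status}"
```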


In docker/thirdparties/run-thirdparties-docker.sh line 132:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 164:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 353:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 358:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 380:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 382:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 394:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 396:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 444:
    . "${HUDI_DIR}"/hudi.env
      ^--------------------^ SC1091 (info): Not following: ./hudi.env: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/run-thirdparties-docker.sh line 488:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.
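SC2035 guards against a filename that begins with `-` being parsed as an option by `mv`. A sketch in a throwaway temp directory (the `-n.tbl` filename is invented to demonstrate the trap):

```shell
#!/bin/bash
# A bare glob like *.tbl can expand to '-n.tbl', which 'mv' would read
# as its -n flag. Prefixing './' (or passing '--') makes every match an
# unambiguous path.
cd "$(mktemp -d)"
touch -- "-n.tbl" "a.tbl"     # hypothetical files; '-n.tbl' is the trap

files=( ./*.tbl )             # each element starts with './'
echo "${#files[@]}"
```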


In docker/thirdparties/run-thirdparties-docker.sh line 490:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 506:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 508:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 510:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 546:
    . "${POLARIS_DIR}/polaris_settings.env"
      ^-- SC1091 (info): Not following: ./polaris_settings.env: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/run-thirdparties-docker.sh line 595:
if [[ "$NEED_LOAD_DATA" -eq 1 ]]; then
       ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${NEED_LOAD_DATA}" -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 601:
if [[ $need_prepare_hive_data -eq 1 ]]; then
      ^---------------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ ${need_prepare_hive_data} -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 712:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 713:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 714:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 716:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true


In regression-test/pipeline/cloud_p0/run.sh line 55:
    set -e
    ^----^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 56:
    shopt -s inherit_errexit
    ^----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 58:
    cd "${teamcity_build_checkoutDir}" || return 1
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                          ^------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 59:
    {
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 60:
        echo # add a new line to prevent two config items from being combined, which will cause the error "No signature of method"
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 61:
        echo "ak='${s3SourceAk}'"
        ^-----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 62:
        echo "sk='${s3SourceSk}'"
        ^-----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 63:
        echo "hwYunAk='${hwYunAk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 64:
        echo "hwYunSk='${hwYunSk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 65:
        echo "txYunAk='${txYunAk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 66:
        echo "txYunSk='${txYunSk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 68:
    cp -f "${teamcity_build_checkoutDir}"/regression-test/pipeline/cloud_p0/conf/regression-conf-custom.groovy \
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 71:
    sed -i "s/^CONTAINER_UID=\"doris--\"/CONTAINER_UID=\"doris-external--\"/" "${teamcity_build_checkoutDir}"/docker/thirdparties/custom_settings.env
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 72:
    sed -i "s/oss-cn-hongkong.aliyuncs.com/oss-cn-hongkong-internal.aliyuncs.com/" "${teamcity_build_checkoutDir}"/docker/thirdparties/custom_settings.env
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 73:
    if bash "${teamcity_build_checkoutDir}"/docker/thirdparties/run-thirdparties-docker.sh --stop; then echo; fi
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                                                        ^--^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 74:
    if bash "${teamcity_build_checkoutDir}"/docker/thirdparties/run-thirdparties-docker.sh -c kafka; then echo; else echo "ERROR: start kafka docker failed"; fi
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                                                          ^--^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                                                                     ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 75:
    JAVA_HOME="$(find /usr/lib/jvm -maxdepth 1 -type d -name 'java-8-*' | sed -n '1p')"
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                 ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 76:
    export JAVA_HOME
    ^--------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 77:
    if "${teamcity_build_checkoutDir}"/run-regression-test.sh \
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 85:
        echo
        ^--^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 87:
        bash "${teamcity_build_checkoutDir}"/regression-test/pipeline/common/get-or-set-tmp-env.sh 'set' "export need_collect_log=true"
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 91:
        summary=$(
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 92:
            grep -aoE 'Test ([0-9]+) suites, failed ([0-9]+) suites, fatal ([0-9]+) scripts, skipped ([0-9]+) scripts' \
            ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 95:
        set -x
        ^----^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 96:
        test_suites=$(echo "${summary}" | cut -d ' ' -f 2)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                      ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 97:
        failed_suites=$(echo "${summary}" | cut -d ' ' -f 5)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 98:
        fatal_scripts=$(echo "${summary}" | cut -d ' ' -f 8)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 99:
        if [[ ${test_suites} -gt 0 && ${failed_suites} -le ${failed_suites_threshold:=100} && ${fatal_scripts} -eq 0 ]]; then
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
           ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 100:
            echo "INFO: regression test result meet (test_suites>0 && failed_suites<=${failed_suites_threshold} && fatal_scripts=0)"
            ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/cloud_p0/run.sh line 102:
            return 1
            ^------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 55:
    set -e
    ^----^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 56:
    shopt -s inherit_errexit
    ^----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 58:
    cd "${teamcity_build_checkoutDir}" || return 1
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                          ^------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 59:
    {
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 60:
        echo # add a new line to prevent two config items from being combined, which will cause the error "No signature of method"
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 61:
        echo "ak='${s3SourceAk}'"
        ^-----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 62:
        echo "sk='${s3SourceSk}'"
        ^-----------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 63:
        echo "hwYunAk='${hwYunAk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 64:
        echo "hwYunSk='${hwYunSk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 65:
        echo "txYunAk='${txYunAk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 66:
        echo "txYunSk='${txYunSk:-}'"
        ^---------------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 68:
    cp -f "${teamcity_build_checkoutDir}"/regression-test/pipeline/vault_p0/conf/regression-conf-custom.groovy \
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 72:
    sed -i "s/^CONTAINER_UID=\"doris--\"/CONTAINER_UID=\"doris-external--\"/" "${teamcity_build_checkoutDir}"/docker/thirdparties/custom_settings.env
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 73:
    sed -i "s/oss-cn-hongkong.aliyuncs.com/oss-cn-hongkong-internal.aliyuncs.com/" "${teamcity_build_checkoutDir}"/docker/thirdparties/custom_settings.env
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 74:
    if bash "${teamcity_build_checkoutDir}"/docker/thirdparties/run-thirdparties-docker.sh -c minio ||
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 75:
        bash "${teamcity_build_checkoutDir}"/docker/thirdparties/run-thirdparties-docker.sh -c minio; then
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 76:
        echo "INFO: start minio docker success"
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 78:
        echo "ERROR: start minio docker twice failed" && return 1
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                         ^------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 82:
    docker_compose_hdfs_yaml='
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 116:
    if echo "${docker_compose_hdfs_yaml}" >docker-compose.yaml && docker-compose up -d; then echo; else echo "ERROR: start hdfs docker failed"; fi
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                  ^------------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                                             ^--^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                                                                                                        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 117:
    JAVA_HOME="$(find /usr/lib/jvm -maxdepth 1 -type d -name 'java-8-*' | sed -n '1p')"
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                 ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 118:
    export JAVA_HOME
    ^--------------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 119:
    if "${teamcity_build_checkoutDir}"/run-regression-test.sh \
    ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
       ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 127:
        echo
        ^--^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 129:
        bash "${teamcity_build_checkoutDir}"/regression-test/pipeline/common/get-or-set-tmp-env.sh 'set' "export need_collect_log=true"
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 133:
        summary=$(
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 134:
            grep -aoE 'Test ([0-9]+) suites, failed ([0-9]+) suites, fatal ([0-9]+) scripts, skipped ([0-9]+) scripts' \
            ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 137:
        set -x
        ^----^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 138:
        test_suites=$(echo "${summary}" | cut -d ' ' -f 2)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                      ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 139:
        failed_suites=$(echo "${summary}" | cut -d ' ' -f 5)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 140:
        fatal_scripts=$(echo "${summary}" | cut -d ' ' -f 8)
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
                        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 141:
        if [[ ${test_suites} -gt 0 && ${failed_suites} -le ${failed_suites_threshold:=100} && ${fatal_scripts} -eq 0 ]]; then
        ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).
           ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 142:
            echo "INFO: regression test result meet (test_suites>0 && failed_suites<=${failed_suites_threshold} && fatal_scripts=0)"
            ^-- SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In regression-test/pipeline/vault_p0/run.sh line 144:
            return 1
            ^------^ SC2317 (info): Command appears to be unreachable. Check usage (or ignore if invoked indirectly).


In run-be-ut.sh line 150:
    WITH_TDE_DIR        -- ${WITH_TDE_DIR}
                           ^-------------^ SC2154 (warning): WITH_TDE_DIR is referenced but not assigned.


In run-cloud-ut.sh line 199:
    -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In thirdparty/build-thirdparty.sh line 1364:
    -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
                       ^--------------^ SC2154 (warning): CMAKE_CXX_FLAGS is referenced but not assigned.
                       ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    -DCMAKE_CXX_FLAGS="${CMAKE_CXX_FLAGS} -Wno-elaborated-enum-base" \


In thirdparty/build-thirdparty.sh line 1953:
    cp -r ${TP_SOURCE_DIR}/${JINDOFS_SOURCE}/* "${TP_INSTALL_DIR}/jindofs_libs/"
          ^--------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                           ^---------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    cp -r "${TP_SOURCE_DIR}"/"${JINDOFS_SOURCE}"/* "${TP_INSTALL_DIR}/jindofs_libs/"


In thirdparty/download-thirdparty.sh line 605:
    cd $TP_SOURCE_DIR/$CCTZ_SOURCE
       ^------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
       ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                      ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    cd "${TP_SOURCE_DIR}"/"${CCTZ_SOURCE}"


In thirdparty/download-thirdparty.sh line 606:
    if [[ ! -f "$PATCHED_MARK" ]] ; then
                ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ ! -f "${PATCHED_MARK}" ]] ; then


In thirdparty/download-thirdparty.sh line 611:
        touch "$PATCHED_MARK"
               ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        touch "${PATCHED_MARK}"


In tools/lzo/build.sh line 1:
# Licensed to the Apache Software Foundation (ASF) under one
^-- SC2148 (error): Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.


In tools/lzo/build.sh line 20:
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I${DORIS_THIRDPARTY}/installed/include -L${DORIS_THIRDPARTY}/installed/lib -llzo2 -std=c++17
                                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                     ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I"${DORIS_THIRDPARTY}"/installed/include -L"${DORIS_THIRDPARTY}"/installed/lib -llzo2 -std=c++17

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2068 -- Double quote array expansions to ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
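
As a quick illustration of options 1 and 2 for the most common finding above (SC2086, unquoted expansions), here is a minimal sketch; the variable name and value are hypothetical and only demonstrate why quoting matters:

```shell
#!/usr/bin/env bash
# Option 1 (preferred): quote expansions so values containing spaces
# are passed as a single argument instead of being word-split.
tag="2026 beta"        # hypothetical value containing a space
set -- ${tag}          # unquoted: word splitting yields two arguments
unquoted_count=$#
set -- "${tag}"        # quoted: the value stays one argument
quoted_count=$#
echo "unquoted=${unquoted_count} quoted=${quoted_count}"

# Option 2: suppress one specific finding where the unquoted
# behavior is actually intended, directly above the offending line.
# shellcheck disable=SC2086
echo ${tag}
```

Option 3 applies the same suppression globally via `SHELLCHECK_OPTS` (e.g. `-e SC2086`) in the workflow file, which silences the code everywhere rather than at a single call site.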



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- .devcontainer/ci-transwarp/build.sh.orig
+++ .devcontainer/ci-transwarp/build.sh
@@ -4,30 +4,30 @@
   --network=host
   --build-arg HTTP_PROXY=http://192.168.2.1:28080
 "
-function image_suffix {    
-  commit=$(git rev-parse --short HEAD)    
-  current_date=$(date +%Y%m%d)    
-    
-  echo "${current_date}_${commit}"    
-    
-  unset commit    
-  unset current_date    
-}    
+function image_suffix {
+    commit=$(git rev-parse --short HEAD)
+    current_date=$(date +%Y%m%d)
 
+    echo "${current_date}_${commit}"
+
+    unset commit
+    unset current_date
+}
+
 function build_runtime {
-  tag=$(image_suffix)    
-    
-  build_args="    
+    tag=$(image_suffix)
+
+    build_args="    
     ${common_build_args}    
     --build-arg OUTPUT_DIR=output
-  "    
-    
-  docker build \
-    ${build_args} \
-    -t doris_runtime:${tag} \
-    -f Dockerfile \
-    .
+  "
 
-  unset tag
-  unset build_args
+    docker build \
+        ${build_args} \
+        -t doris_runtime:${tag} \
+        -f Dockerfile \
+        .
+
+    unset tag
+    unset build_args
 }
--- .devcontainer/devtools/local_cluster.sh.orig
+++ .devcontainer/devtools/local_cluster.sh
@@ -144,11 +144,11 @@
     local cluster_id=$1
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i
-    for((i=0; i<$be_count; i++)); do
-      prepare_single_be ${cluster_id} ${i}
+    for ((i = 0; i < $be_count; i++)); do
+        prepare_single_be ${cluster_id} ${i}
     done
 }
 
@@ -156,11 +156,11 @@
     local cluster_id=$1
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i=0
-    for((i=0; i<$fe_count; i++)); do
-      prepare_single_fe ${cluster_id} ${i}
+    for ((i = 0; i < $fe_count; i++)); do
+        prepare_single_fe ${cluster_id} ${i}
     done
 }
 
@@ -168,7 +168,7 @@
     len=$(jq '.cluster | length' ${cluster_config_file})
 
     local i=0
-    for((i=0; i<$len; i++)); do
+    for ((i = 0; i < $len; i++)); do
         prepare_fe ${i}
         prepare_be ${i}
     done
@@ -210,11 +210,11 @@
     local cluster_id=${1:-0}
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i=0
-    for((i=0; i<$fe_count; i++)); do
-      start_single_fe ${cluster_id} ${i}
+    for ((i = 0; i < $fe_count; i++)); do
+        start_single_fe ${cluster_id} ${i}
     done
 }
 
@@ -222,11 +222,11 @@
     local cluster_id=${1:-0}
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i
-    for((i=0; i<$fe_count; i++)); do
-      stop_single_fe ${cluster_id} ${i}
+    for ((i = 0; i < $fe_count; i++)); do
+        stop_single_fe ${cluster_id} ${i}
     done
 }
 
@@ -241,11 +241,11 @@
     local cluster_id=${1:-0}
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i
-    for((i=0; i<$be_count; i++)); do
-      start_single_be ${cluster_id} ${i}
+    for ((i = 0; i < $be_count; i++)); do
+        start_single_be ${cluster_id} ${i}
     done
 }
 
@@ -253,11 +253,11 @@
     local cluster_id=${1:-0}
     local fe_count=0
     local be_count=0
-    read fe_count be_count <<< $(get_cluster_config ${cluster_id})
+    read fe_count be_count <<<$(get_cluster_config ${cluster_id})
 
     local i
-    for((i=0; i<$be_count; i++)); do
-      stop_single_be ${cluster_id} ${i}
+    for ((i = 0; i < $be_count; i++)); do
+        stop_single_be ${cluster_id} ${i}
     done
 }
 
--- .devcontainer/postCreateCommand.sh.orig
+++ .devcontainer/postCreateCommand.sh
@@ -1,23 +1,23 @@
 #!/bin/bash
 
 function main {
-  # need to reload vscode to enable cmake language server
+    # need to reload vscode to enable cmake language server
 
-  # enable coredump
-  sudo sysctl -w kernel.core_pattern="/coredumps/core-%e-%s-%u-%g-%p-%t"
+    # enable coredump
+    sudo sysctl -w kernel.core_pattern="/coredumps/core-%e-%s-%u-%g-%p-%t"
 
-  mkdir -p $HOME/.config/ccache
-  echo "cache_dir = /opt/ccache" >> $HOME/.config/ccache/ccache.conf
-  echo "max_size = 20.0G" >> $HOME/.config/ccache/ccache.conf
+    mkdir -p $HOME/.config/ccache
+    echo "cache_dir = /opt/ccache" >>$HOME/.config/ccache/ccache.conf
+    echo "max_size = 20.0G" >>$HOME/.config/ccache/ccache.conf
 
-  echo "unset http_proxy" >> $HOME/.bashrc
-  echo "unset https_proxy" >> $HOME/.bashrc
+    echo "unset http_proxy" >>$HOME/.bashrc
+    echo "unset https_proxy" >>$HOME/.bashrc
 
-  # replace container settings.json with our project settings.json
-  pushd $HOME/.vscode-server/data/Machine
-  rm -rf settings.json
-  ln -s /opt/transwarp/doris/.devcontainer/settings.json settings.json
-  popd
+    # replace container settings.json with our project settings.json
+    pushd $HOME/.vscode-server/data/Machine
+    rm -rf settings.json
+    ln -s /opt/transwarp/doris/.devcontainer/settings.json settings.json
+    popd
 }
 
 main $@
--- bin/start_be.sh.orig
+++ bin/start_be.sh
@@ -438,45 +438,45 @@
     local param="$1"
 
     case "${param}" in
-        "--add-opens="* | "--add-exports="* | "--add-reads="* | "--add-modules="*)
-            # --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
-            # Extract module/package path as key: --add-opens=java.base/sun.util.calendar
-            echo "${param%=*}"
-            ;;
-        -XX:+* | -XX:-*)
-            # -XX:+HeapDumpOnOutOfMemoryError or -XX:-OmitStackTraceInFastThrow
-            # Extract flag name for pattern matching: -XX:[+-]FlagName
-            local flag_name="${param#-XX:?}"
-            echo "-XX:[+-]${flag_name}"
-            ;;
-        -XX:*=*)
-            # -XX:HeapDumpPath=/path or -XX:OnOutOfMemoryError="cmd"
-            # Extract key before '=': -XX:HeapDumpPath
-            echo "${param%%=*}"
-            ;;
-        -D*=*)
-            # -Dfile.encoding=UTF-8
-            # Extract property name: -Dfile.encoding
-            echo "${param%%=*}"
-            ;;
-        -D*)
-            # -Dfoo (boolean property without value)
-            echo "${param}"
-            ;;
-        -Xms* | -Xmx* | -Xmn* | -Xss*)
-            # -Xmx8192m, -Xms8192m, -Xmn2g, -Xss512k
-            # Extract the prefix: -Xmx, -Xms, -Xmn, -Xss
-            echo "${param}" | sed -E 's/^(-Xm[sxn]|-Xss).*/\1/'
-            ;;
-        -Xlog:*)
-            # -Xlog:gc*:file:decorators
-            # Use prefix as key
-            echo "-Xlog:"
-            ;;
-        *)
-            # For other options, use the full parameter as key
-            echo "${param}"
-            ;;
+    "--add-opens="* | "--add-exports="* | "--add-reads="* | "--add-modules="*)
+        # --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
+        # Extract module/package path as key: --add-opens=java.base/sun.util.calendar
+        echo "${param%=*}"
+        ;;
+    -XX:+* | -XX:-*)
+        # -XX:+HeapDumpOnOutOfMemoryError or -XX:-OmitStackTraceInFastThrow
+        # Extract flag name for pattern matching: -XX:[+-]FlagName
+        local flag_name="${param#-XX:?}"
+        echo "-XX:[+-]${flag_name}"
+        ;;
+    -XX:*=*)
+        # -XX:HeapDumpPath=/path or -XX:OnOutOfMemoryError="cmd"
+        # Extract key before '=': -XX:HeapDumpPath
+        echo "${param%%=*}"
+        ;;
+    -D*=*)
+        # -Dfile.encoding=UTF-8
+        # Extract property name: -Dfile.encoding
+        echo "${param%%=*}"
+        ;;
+    -D*)
+        # -Dfoo (boolean property without value)
+        echo "${param}"
+        ;;
+    -Xms* | -Xmx* | -Xmn* | -Xss*)
+        # -Xmx8192m, -Xms8192m, -Xmn2g, -Xss512k
+        # Extract the prefix: -Xmx, -Xms, -Xmn, -Xss
+        echo "${param}" | sed -E 's/^(-Xm[sxn]|-Xss).*/\1/'
+        ;;
+    -Xlog:*)
+        # -Xlog:gc*:file:decorators
+        # Use prefix as key
+        echo "-Xlog:"
+        ;;
+    *)
+        # For other options, use the full parameter as key
+        echo "${param}"
+        ;;
     esac
 }
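The case statement above deduplicates JVM options by reducing each parameter to a lookup key. As a sketch of that behavior (the enclosing function's real name is not visible in this excerpt, so `jvm_param_key` below is illustrative):

```shell
#!/usr/bin/env bash
# Illustrative reproduction of the key-extraction logic from the diff above.
# The function name jvm_param_key is hypothetical; the case arms mirror the
# patterns shown in start_fe.sh.
jvm_param_key() {
    local param="$1"
    case "${param}" in
    "--add-opens="* | "--add-exports="* | "--add-reads="* | "--add-modules="*)
        echo "${param%=*}"    # strip the value after the last '='
        ;;
    -XX:+* | -XX:-*)
        local flag_name="${param#-XX:?}"
        echo "-XX:[+-]${flag_name}"    # normalize +/- into one key
        ;;
    -XX:*=*)
        echo "${param%%=*}"    # key is everything before the first '='
        ;;
    -D*=*)
        echo "${param%%=*}"    # property name without its value
        ;;
    -Xms* | -Xmx* | -Xmn* | -Xss*)
        echo "${param}" | sed -E 's/^(-Xm[sxn]|-Xss).*/\1/'    # size prefix only
        ;;
    *)
        echo "${param}"    # fall back to the full parameter
        ;;
    esac
}

jvm_param_key "--add-opens=java.base/sun.util.calendar=ALL-UNNAMED"
# -> --add-opens=java.base/sun.util.calendar
jvm_param_key "-XX:+HeapDumpOnOutOfMemoryError"
# -> -XX:[+-]HeapDumpOnOutOfMemoryError
jvm_param_key "-Xmx8192m"
# -> -Xmx
```

Note the arm order matters: `-XX:+*`/`-XX:-*` must precede `-XX:*=*` so boolean flags are not mistaken for key/value options.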
 
--- bin/start_fe.sh.orig
+++ bin/start_fe.sh
@@ -204,45 +204,45 @@
     local param="$1"
 
     case "${param}" in
-        "--add-opens="* | "--add-exports="* | "--add-reads="* | "--add-modules="*)
-            # --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
-            # Extract module/package path as key: --add-opens=java.base/sun.util.calendar
-            echo "${param%=*}"
-            ;;
-        -XX:+* | -XX:-*)
-            # -XX:+HeapDumpOnOutOfMemoryError or -XX:-OmitStackTraceInFastThrow
-            # Extract flag name for pattern matching: -XX:[+-]FlagName
-            local flag_name="${param#-XX:?}"
-            echo "-XX:[+-]${flag_name}"
-            ;;
-        -XX:*=*)
-            # -XX:HeapDumpPath=/path or -XX:OnOutOfMemoryError="cmd"
-            # Extract key before '=': -XX:HeapDumpPath
-            echo "${param%%=*}"
-            ;;
-        -D*=*)
-            # -Dfile.encoding=UTF-8
-            # Extract property name: -Dfile.encoding
-            echo "${param%%=*}"
-            ;;
-        -D*)
-            # -Dfoo (boolean property without value)
-            echo "${param}"
-            ;;
-        -Xms* | -Xmx* | -Xmn* | -Xss*)
-            # -Xmx8192m, -Xms8192m, -Xmn2g, -Xss512k
-            # Extract the prefix: -Xmx, -Xms, -Xmn, -Xss
-            echo "${param}" | sed -E 's/^(-Xm[sxn]|-Xss).*/\1/'
-            ;;
-        -Xlog:*)
-            # -Xlog:gc*:file:decorators
-            # Use prefix as key
-            echo "-Xlog:"
-            ;;
-        *)
-            # For other options, use the full parameter as key
-            echo "${param}"
-            ;;
+    "--add-opens="* | "--add-exports="* | "--add-reads="* | "--add-modules="*)
+        # --add-opens=java.base/sun.util.calendar=ALL-UNNAMED
+        # Extract module/package path as key: --add-opens=java.base/sun.util.calendar
+        echo "${param%=*}"
+        ;;
+    -XX:+* | -XX:-*)
+        # -XX:+HeapDumpOnOutOfMemoryError or -XX:-OmitStackTraceInFastThrow
+        # Extract flag name for pattern matching: -XX:[+-]FlagName
+        local flag_name="${param#-XX:?}"
+        echo "-XX:[+-]${flag_name}"
+        ;;
+    -XX:*=*)
+        # -XX:HeapDumpPath=/path or -XX:OnOutOfMemoryError="cmd"
+        # Extract key before '=': -XX:HeapDumpPath
+        echo "${param%%=*}"
+        ;;
+    -D*=*)
+        # -Dfile.encoding=UTF-8
+        # Extract property name: -Dfile.encoding
+        echo "${param%%=*}"
+        ;;
+    -D*)
+        # -Dfoo (boolean property without value)
+        echo "${param}"
+        ;;
+    -Xms* | -Xmx* | -Xmn* | -Xss*)
+        # -Xmx8192m, -Xms8192m, -Xmn2g, -Xss512k
+        # Extract the prefix: -Xmx, -Xms, -Xmn, -Xss
+        echo "${param}" | sed -E 's/^(-Xm[sxn]|-Xss).*/\1/'
+        ;;
+    -Xlog:*)
+        # -Xlog:gc*:file:decorators
+        # Use prefix as key
+        echo "-Xlog:"
+        ;;
+    *)
+        # For other options, use the full parameter as key
+        echo "${param}"
+        ;;
     esac
 }
 
--- bin/start_file_cache_microbench.sh.orig
+++ bin/start_file_cache_microbench.sh
@@ -115,7 +115,6 @@
     exit 1
 fi
 
-
 JEMALLOC_CONF="percpu_arena:percpu,background_thread:true,metadata_thp:auto,muzzy_decay_ms:5000,dirty_decay_ms:5000,oversize_threshold:0,prof:true,lg_prof_interval:30,lg_prof_sample:19,prof_final:false,prof_active:true"
 JEMALLOC_PROF_PRFIX="jeprofile_doris_cloud"
 
--- build-support/clang-format.sh.orig
+++ build-support/clang-format.sh
@@ -36,7 +36,7 @@
         echo "Error: Homebrew is missing. Please install it first due to we use Homebrew to manage the tools which are needed to build the project."
         exit 1
     fi
-    if ! brew list llvm@16 > /dev/null 2>&1; then
+    if ! brew list llvm@16 >/dev/null 2>&1; then
         echo "Error: Please install llvm@16 firt due to we use it to format code."
         exit 1
     fi
--- build.sh.orig
+++ build.sh
@@ -255,15 +255,15 @@
         --be-cdc-client)
             BUILD_BE_CDC_CLIENT=1
             shift
-            ;;    
+            ;;
         --exclude-obs-dependencies)
             BUILD_OBS_DEPENDENCIES=0
             shift
-            ;; 
+            ;;
         --exclude-cos-dependencies)
             BUILD_COS_DEPENDENCIES=0
             shift
-            ;;           
+            ;;
         --clean)
             CLEAN=1
             shift
@@ -312,7 +312,7 @@
         BUILD_META_TOOL='ON'
         BUILD_FILE_CACHE_MICROBENCH_TOOL='OFF'
         BUILD_INDEX_TOOL='ON'
-	BUILD_TASK_EXECUTOR_SIMULATOR='OFF'
+        BUILD_TASK_EXECUTOR_SIMULATOR='OFF'
         BUILD_HIVE_UDF=1
         BUILD_BE_JAVA_EXTENSIONS=1
         BUILD_BE_CDC_CLIENT=1
@@ -546,7 +546,10 @@
 FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
 FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
 
-export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
+export DORIS_FEATURE_LIST=$(
+    IFS=','
+    echo "${FEAT[*]}"
+)
 echo "Feature List: ${DORIS_FEATURE_LIST}"
 
 # Clean and build generated code
@@ -761,19 +764,19 @@
     if [[ "${BUILD_COS_DEPENDENCIES}" -eq 0 ]]; then
         DEPENDENCIES_MVN_OPTS+=" -Dcos.dependency.scope=provided "
     fi
-    
+
     if [[ "${DISABLE_JAVA_CHECK_STYLE}" = "ON" ]]; then
         # Allowed user customer set env param USER_SETTINGS_MVN_REPO means settings.xml file path
         if [[ -n ${USER_SETTINGS_MVN_REPO} && -f ${USER_SETTINGS_MVN_REPO} ]]; then
-            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
+            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS} -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
         else
-            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -T 1C
+            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests -Dcheckstyle.skip=true ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS} -T 1C
         fi
     else
         if [[ -n ${USER_SETTINGS_MVN_REPO} && -f ${USER_SETTINGS_MVN_REPO} ]]; then
-            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
+            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS} -gs "${USER_SETTINGS_MVN_REPO}" -T 1C
         else
-            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS}  -T 1C
+            "${MVN_CMD}" package -pl ${FE_MODULES:+${FE_MODULES}} -Dskip.doc=true -DskipTests ${MVN_OPT:+${MVN_OPT}} ${DEPENDENCIES_MVN_OPTS} -T 1C
         fi
     fi
     cd "${DORIS_HOME}"
@@ -828,12 +831,12 @@
     mkdir -p "${DORIS_OUTPUT}/fe/plugins/java_extensions/"
 
     if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
-      mkdir -p "${DORIS_OUTPUT}/fe/arthas"
-      rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
-      unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
-      unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
-      rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
-      rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
+        mkdir -p "${DORIS_OUTPUT}/fe/arthas"
+        rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
+        unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
+        unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
+        rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
+        rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
     fi
 fi
 
@@ -950,8 +953,8 @@
         module_proj_jar="${DORIS_HOME}/fe/be-java-extensions/${extensions_module}/target/${extensions_module}-project.jar"
         mkdir "${BE_JAVA_EXTENSIONS_DIR}"/"${extensions_module}"
         echo "Copy Be Extensions ${extensions_module} jar to ${BE_JAVA_EXTENSIONS_DIR}/${extensions_module}"
-     if [[ "${extensions_module}" == "${HADOOP_DEPS_NAME}" ]]; then
-          
+        if [[ "${extensions_module}" == "${HADOOP_DEPS_NAME}" ]]; then
+
             BE_HADOOP_HDFS_DIR="${DORIS_OUTPUT}/be/lib/hadoop_hdfs/"
             echo "Copy Be Extensions hadoop deps jars to ${BE_HADOOP_HDFS_DIR}"
             rm -rf "${BE_HADOOP_HDFS_DIR}"
@@ -976,7 +979,7 @@
                 cp -r "${DORIS_HOME}/fe/be-java-extensions/${extensions_module}/target/lib" "${BE_JAVA_EXTENSIONS_DIR}/${extensions_module}/"
             fi
         fi
-    done        
+    done
 
     # copy jindofs jars, only support for Linux x64 or arm
     install -d "${DORIS_OUTPUT}/be/lib/java_extensions/jindofs"/
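The `DORIS_FEATURE_LIST` change in the build.sh diff above only reformats an existing idiom: `IFS` is reassigned inside the command substitution's subshell, so `"${FEAT[*]}"` joins with commas without leaking the modified `IFS` into the parent shell. A minimal sketch of that idiom (array contents here are made up):

```shell
#!/usr/bin/env bash
# Join an array with commas the way build.sh builds DORIS_FEATURE_LIST.
# IFS=',' takes effect only inside the $( ... ) subshell.
FEAT=("+HIVE_UDF" "-BE_JAVA_EXTENSIONS")

joined=$(
    IFS=','
    echo "${FEAT[*]}"    # "${arr[*]}" joins with the first char of IFS
)
echo "${joined}"
# -> +HIVE_UDF,-BE_JAVA_EXTENSIONS
```

Because the assignment happens in a subshell, no save/restore of `IFS` is needed afterwards.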
--- cloud/script/start.sh.orig
+++ cloud/script/start.sh
@@ -54,9 +54,9 @@
 fi
 # echo "$@" "daemonized=${daemonized}"}
 
-custom_start="${DORIS_HOME}/bin/custom_start.sh" 
+custom_start="${DORIS_HOME}/bin/custom_start.sh"
 if [[ -f "${custom_start}" ]]; then
-  source "${custom_start}" 
+    source "${custom_start}"
 fi
 enable_hdfs=${enable_hdfs:-1}
 process_name="${process_name:-doris_cloud}"
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
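The `configure()` loop above maps prefixed environment variable names to Hadoop property names with a perl one-liner (`___` becomes `-`, `__` becomes `_`, a single `_` becomes `.`). The same mangling can be sketched with pure bash parameter expansion; `to_property_name` is an illustrative name, not one used by hive-configure.sh:

```shell
#!/usr/bin/env bash
# Sketch of the name mangling done by configure() in hive-configure.sh,
# using bash substitution instead of the perl one-liner. Order matters:
# longest separator first, so '___' is consumed before '__' and '_'.
to_property_name() {
    local n="$1"
    n="${n//___/-}"    # triple underscore -> dash
    n="${n//__/_}"     # double underscore -> literal underscore
    n="${n//_/.}"      # remaining underscores -> dots
    echo "${n}"
}

to_property_name "fs_defaultFS"
# -> fs.defaultFS
to_property_name "yarn_log___aggregation"
# -> yarn.log-aggregation
```

This is why an env var like `CORE_CONF_fs_defaultFS` ends up as the `fs.defaultFS` property in core-site.xml.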
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
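The es_init.sh loop above emits Elasticsearch `_bulk` NDJSON: each document line from the data file is preceded by an action line carrying the index name and a generated id. A minimal sketch of that pairing, assuming a two-document input (the index name and id prefix below are made up):

```shell
#!/usr/bin/env bash
# Build an Elasticsearch _bulk payload the way the es_init.sh loop does:
# one action line, then the document's source line, for each input line.
index_name="composite_type_array"
id=1
out=""
while IFS= read -r line; do
    out+="{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"item_${id}\"}}"$'\n'
    out+="${line}"$'\n'
    id=$((id + 1))
done <<'EOF'
{"name": "a"}
{"name": "b"}
EOF

printf '%s' "${out}"
# Four lines total: action, doc, action, doc.
```

Reading with `IFS= read -r` keeps each JSON line byte-for-byte intact, which matters because `_bulk` requires exactly one JSON object per line.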
--- docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_bloom_filter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_bloom_filter/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_special_orc_formats/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_special_orc_formats/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -3,11 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/regression/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/regression/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/test/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/test/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -18,7 +18,6 @@
 
 set -e -x
 
-
 AUX_LIB="/mnt/scripts/auxlib"
 for file in "${AUX_LIB}"/*.tar.gz; do
     [ -e "$file" ] || continue
@@ -33,7 +32,6 @@
 # start metastore
 nohup /opt/hive/bin/hive --service metastore &
 
-
 # wait metastore start
 while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
     sleep 5s
@@ -73,7 +71,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -86,7 +83,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/paimon1 /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 ## put tvf_data
 if [[ -z "$(ls /mnt/scripts/tvf_data)" ]]; then
     echo "tvf_data does not exist"
@@ -99,7 +95,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 wait "${hadoop_put_pids[@]}"
 if [[ -z "$(hadoop fs -ls /user/doris/paimon1)" ]]; then
--- docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh
@@ -19,8 +19,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 # Extract all tar.gz files under the repo
-find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
-xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
+find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 |
+    xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
   f="$0"
   echo "Extracting hive data $f"
   dir=$(dirname "$f")
@@ -145,4 +145,3 @@
 for jar in "${jars[@]}"; do
     curl -O "https://${s3BucketName}.${s3Endpoint}/regression/docker/hive3/${jar}"
 done
-
--- docker/thirdparties/docker-compose/hudi/scripts/init.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/init.sh
@@ -21,8 +21,8 @@
 # Remove SUCCESS file from previous run to ensure fresh initialization
 SUCCESS_FILE="/opt/hudi-scripts/SUCCESS"
 if [[ -f "${SUCCESS_FILE}" ]]; then
-  echo "Removing previous SUCCESS file to ensure fresh initialization..."
-  rm -f "${SUCCESS_FILE}"
+    echo "Removing previous SUCCESS file to ensure fresh initialization..."
+    rm -f "${SUCCESS_FILE}"
 fi
 
 SPARK_HOME=/opt/spark
@@ -34,50 +34,50 @@
 
 # Function to download a JAR file if it doesn't exist
 download_jar() {
-  local jar_name="$1"
-  local version="$2"
-  local url="$3"
-  local jar_file="${CACHE_DIR}/${jar_name}-${version}.jar"
-  
-  if [[ ! -f "${jar_file}" ]]; then
-    echo "Downloading ${jar_name} JAR ${version} from ${url} ..." >&2
-    local download_success=false
-    if command -v curl >/dev/null 2>&1; then
-      if curl -sSfL "${url}" -o "${jar_file}"; then
-        download_success=true
-      else
-        echo "Error: Failed to download ${jar_name} from ${url}" >&2
-      fi
-    elif command -v wget >/dev/null 2>&1; then
-      if wget -qO "${jar_file}" "${url}"; then
-        download_success=true
-      else
-        echo "Error: Failed to download ${jar_name} from ${url}" >&2
-      fi
-    else
-      echo "Error: Neither curl nor wget is available in hudi-spark container." >&2
-      exit 1
-    fi
-    
-    if [[ "${download_success}" == "false" ]]; then
-      echo "Error: Failed to download ${jar_name} JAR. Please check the URL: ${url}" >&2
-      exit 1
-    fi
-    
+    local jar_name="$1"
+    local version="$2"
+    local url="$3"
+    local jar_file="${CACHE_DIR}/${jar_name}-${version}.jar"
+
     if [[ ! -f "${jar_file}" ]]; then
-      echo "Error: Downloaded file ${jar_file} does not exist" >&2
-      exit 1
+        echo "Downloading ${jar_name} JAR ${version} from ${url} ..." >&2
+        local download_success=false
+        if command -v curl >/dev/null 2>&1; then
+            if curl -sSfL "${url}" -o "${jar_file}"; then
+                download_success=true
+            else
+                echo "Error: Failed to download ${jar_name} from ${url}" >&2
+            fi
+        elif command -v wget >/dev/null 2>&1; then
+            if wget -qO "${jar_file}" "${url}"; then
+                download_success=true
+            else
+                echo "Error: Failed to download ${jar_name} from ${url}" >&2
+            fi
+        else
+            echo "Error: Neither curl nor wget is available in hudi-spark container." >&2
+            exit 1
+        fi
+
+        if [[ "${download_success}" == "false" ]]; then
+            echo "Error: Failed to download ${jar_name} JAR. Please check the URL: ${url}" >&2
+            exit 1
+        fi
+
+        if [[ ! -f "${jar_file}" ]]; then
+            echo "Error: Downloaded file ${jar_file} does not exist" >&2
+            exit 1
+        fi
     fi
-  fi
-  echo "${jar_file}"
+    echo "${jar_file}"
 }
 
 # Function to link a JAR file to Spark jars directory
 link_jar() {
-  local jar_file="$1"
-  local jar_name="$2"
-  local version="$3"
-  ln -sf "${jar_file}" "${JARS_DIR}/${jar_name}-${version}.jar"
+    local jar_file="$1"
+    local jar_name="$2"
+    local version="$3"
+    ln -sf "${jar_file}" "${JARS_DIR}/${jar_name}-${version}.jar"
 }
 
 # Wait for Hive Metastore to be ready
@@ -88,35 +88,35 @@
 RETRY_COUNT=0
 
 while [ $RETRY_COUNT -lt $MAX_RETRIES ]; do
-  if command -v nc >/dev/null 2>&1; then
-    if nc -z "${METASTORE_HOST}" "${METASTORE_PORT}" 2>/dev/null; then
-      echo "Hive Metastore is ready at ${METASTORE_HOST}:${METASTORE_PORT}"
-      break
+    if command -v nc >/dev/null 2>&1; then
+        if nc -z "${METASTORE_HOST}" "${METASTORE_PORT}" 2>/dev/null; then
+            echo "Hive Metastore is ready at ${METASTORE_HOST}:${METASTORE_PORT}"
+            break
+        fi
+    elif command -v timeout >/dev/null 2>&1; then
+        if timeout 1 bash -c "cat < /dev/null > /dev/tcp/${METASTORE_HOST}/${METASTORE_PORT}" 2>/dev/null; then
+            echo "Hive Metastore is ready at ${METASTORE_HOST}:${METASTORE_PORT}"
+            break
+        fi
+    else
+        # Fallback: just wait a bit and assume it's ready
+        if [ $RETRY_COUNT -eq 0 ]; then
+            echo "Warning: nc or timeout command not available, skipping metastore readiness check"
+            sleep 10
+            break
+        fi
     fi
-  elif command -v timeout >/dev/null 2>&1; then
-    if timeout 1 bash -c "cat < /dev/null > /dev/tcp/${METASTORE_HOST}/${METASTORE_PORT}" 2>/dev/null; then
-      echo "Hive Metastore is ready at ${METASTORE_HOST}:${METASTORE_PORT}"
-      break
+
+    RETRY_COUNT=$((RETRY_COUNT + 1))
+    if [ $((RETRY_COUNT % 10)) -eq 0 ]; then
+        echo "Waiting for Hive Metastore... (${RETRY_COUNT}/${MAX_RETRIES})"
     fi
-  else
-    # Fallback: just wait a bit and assume it's ready
-    if [ $RETRY_COUNT -eq 0 ]; then
-      echo "Warning: nc or timeout command not available, skipping metastore readiness check"
-      sleep 10
-      break
-    fi
-  fi
-  
-  RETRY_COUNT=$((RETRY_COUNT + 1))
-  if [ $((RETRY_COUNT % 10)) -eq 0 ]; then
-    echo "Waiting for Hive Metastore... (${RETRY_COUNT}/${MAX_RETRIES})"
-  fi
-  sleep 2
+    sleep 2
 done
 
 if [ $RETRY_COUNT -ge $MAX_RETRIES ]; then
-  echo "Error: Hive Metastore did not become ready within $((MAX_RETRIES * 2)) seconds"
-  exit 1
+    echo "Error: Hive Metastore did not become ready within $((MAX_RETRIES * 2)) seconds"
+    exit 1
 fi
 
 # Write core-site for MinIO (S3A)
@@ -191,37 +191,37 @@
 TEMP_SQL_DIR="/tmp/hudi_sql"
 
 if [[ -d "${SCRIPTS_DIR}" ]]; then
-  mkdir -p "${TEMP_SQL_DIR}"
-  
-  # Process each SQL file: substitute environment variables and combine them
-  echo "Processing Hudi SQL scripts..."
-  for sql_file in $(find "${SCRIPTS_DIR}" -name '*.sql' | sort); do
-    echo "Processing ${sql_file}..."
-    # Use sed to replace environment variables in SQL files
-    # Replace ${HIVE_METASTORE_URIS} and ${HUDI_BUCKET} with actual values
-    sed "s|\${HIVE_METASTORE_URIS}|${HIVE_METASTORE_URIS}|g; s|\${HUDI_BUCKET}|${HUDI_BUCKET}|g" "${sql_file}" >> "${TEMP_SQL_DIR}/hudi_total.sql"
-    echo "" >> "${TEMP_SQL_DIR}/hudi_total.sql"
-  done
-  
-  # Run Spark SQL to execute all SQL scripts
-  echo "Executing Hudi SQL scripts..."
-  START_TIME=$(date +%s)
-  ${SPARK_HOME}/bin/spark-sql \
-    --master local[*] \
-    --name hudi-init \
-    --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
-    --conf spark.sql.catalogImplementation=hive \
-    --conf spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension \
-    --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog \
-    -f "${TEMP_SQL_DIR}/hudi_total.sql"
-  END_TIME=$(date +%s)
-  EXECUTION_TIME=$((END_TIME - START_TIME))
-  echo "Hudi SQL scripts executed in ${EXECUTION_TIME} seconds"
-  
-  # Clean up temporary SQL file
-  rm -f "${TEMP_SQL_DIR}/hudi_total.sql"
+    mkdir -p "${TEMP_SQL_DIR}"
+
+    # Process each SQL file: substitute environment variables and combine them
+    echo "Processing Hudi SQL scripts..."
+    for sql_file in $(find "${SCRIPTS_DIR}" -name '*.sql' | sort); do
+        echo "Processing ${sql_file}..."
+        # Use sed to replace environment variables in SQL files
+        # Replace ${HIVE_METASTORE_URIS} and ${HUDI_BUCKET} with actual values
+        sed "s|\${HIVE_METASTORE_URIS}|${HIVE_METASTORE_URIS}|g; s|\${HUDI_BUCKET}|${HUDI_BUCKET}|g" "${sql_file}" >>"${TEMP_SQL_DIR}/hudi_total.sql"
+        echo "" >>"${TEMP_SQL_DIR}/hudi_total.sql"
+    done
+
+    # Run Spark SQL to execute all SQL scripts
+    echo "Executing Hudi SQL scripts..."
+    START_TIME=$(date +%s)
+    ${SPARK_HOME}/bin/spark-sql \
+        --master local[*] \
+        --name hudi-init \
+        --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
+        --conf spark.sql.catalogImplementation=hive \
+        --conf spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension \
+        --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog \
+        -f "${TEMP_SQL_DIR}/hudi_total.sql"
+    END_TIME=$(date +%s)
+    EXECUTION_TIME=$((END_TIME - START_TIME))
+    echo "Hudi SQL scripts executed in ${EXECUTION_TIME} seconds"
+
+    # Clean up temporary SQL file
+    rm -f "${TEMP_SQL_DIR}/hudi_total.sql"
 else
-  echo "Warning: SQL scripts directory ${SCRIPTS_DIR} not found, skipping table initialization."
+    echo "Warning: SQL scripts directory ${SCRIPTS_DIR} not found, skipping table initialization."
 fi
 
 # Create success marker file to indicate initialization is complete
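The placeholder-substitution step in the hunk above can be exercised standalone. This is a minimal sketch, not the actual init script: the template string and values are invented, and only the two variables named in the hunk are substituted.

```shell
#!/usr/bin/env bash
# Render ${VAR}-style placeholders in a SQL template with sed, as the
# hudi init script does. Inside the double-quoted sed script, \$ becomes
# a literal $, so the pattern matches the placeholder text itself while
# the unescaped ${...} in the replacement expands to the real value.
HIVE_METASTORE_URIS="thrift://hms:9083"
HUDI_BUCKET="demo-bucket"

# Single quotes keep the placeholders from expanding here.
template='CREATE TABLE t LOCATION "s3a://${HUDI_BUCKET}/t" TBLPROPERTIES ("uris" = "${HIVE_METASTORE_URIS}");'

# The | delimiter avoids having to escape the // in the thrift URI.
rendered=$(printf '%s' "$template" |
    sed "s|\${HIVE_METASTORE_URIS}|${HIVE_METASTORE_URIS}|g; s|\${HUDI_BUCKET}|${HUDI_BUCKET}|g")

echo "$rendered"
```

Note this simple form breaks if a value contains the delimiter `|`, `&`, or a backslash; for untrusted values, `envsubst` (already used elsewhere in these scripts) is the sturdier choice.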
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -56,7 +56,6 @@
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/paimon-hive-connector-3.1-1.3-SNAPSHOT.jar
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/gcs-connector-hadoop3-2.2.24-shaded.jar
 
-
 /usr/local/hadoop-run.sh &
 
 # check healthy hear
@@ -86,7 +85,7 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 if [[ ${enablePaimonHms} == "true" ]]; then
     echo "Creating Paimon HMS catalog and table"
     hadoop fs -put /tmp/paimon_data/* /user/hive/warehouse/
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
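The `grep -v RUNNING || true` line this hunk reindents is worth a note: grep exits 1 when nothing matches, which is exactly the healthy all-RUNNING case, and under `set -e` that would abort the check. A reduced sketch with a fabricated status listing:

```shell
#!/usr/bin/env bash
set -e

# Stand-in for `supervisorctl status` output; all services healthy.
status_output='svc1  RUNNING  pid 10
svc2  RUNNING  pid 11'

# Without `|| true`, grep's exit status 1 (no non-RUNNING lines found)
# would kill the script under set -e precisely when everything is fine.
FAILED=$(printf '%s\n' "$status_output" | grep -v RUNNING || true)

if [ -z "$FAILED" ]; then
    echo "All services are running"
else
    echo "Some of the services are failing: ${FAILED}"
fi
```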
--- docker/thirdparties/docker-compose/polaris/init-catalog.sh.orig
+++ docker/thirdparties/docker-compose/polaris/init-catalog.sh
@@ -26,29 +26,30 @@
 
 echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
 for i in $(seq 1 120); do
-  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
-    break
-  fi
-  sleep 2
+    if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
+        break
+    fi
+    sleep 2
 done
 
 echo "[polaris-init] Fetching OAuth token via client_credentials ..."
 # Try to obtain token using correct OAuth endpoint
 TOKEN_JSON=$(curl -sS \
-  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
-  -H 'Content-Type: application/x-www-form-urlencoded' \
-  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
+    -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
+    -H 'Content-Type: application/x-www-form-urlencoded' \
+    -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
 
 # Extract access_token field
 TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
 
 if [ -z "$TOKEN" ]; then
-  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
-  exit 1
+    echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
+    exit 1
 fi
 
 echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
-CREATE_PAYLOAD=$(cat <<JSON
+CREATE_PAYLOAD=$(
+    cat <<JSON
 {
   "name": "$CATALOG",
   "type": "INTERNAL",
@@ -71,19 +72,19 @@
 
 # Try create; on 409 Conflict, treat as success
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d "$CREATE_PAYLOAD")
+    -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d "$CREATE_PAYLOAD")
 
 if [ "$HTTP_CODE" = "201" ]; then
-  echo "[polaris-init] Catalog created."
+    echo "[polaris-init] Catalog created."
 elif [ "$HTTP_CODE" = "409" ]; then
-  echo "[polaris-init] Catalog already exists. Skipping."
+    echo "[polaris-init] Catalog already exists. Skipping."
 else
-  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
-  cat /tmp/resp.json || true
-  exit 1
+    echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
+    cat /tmp/resp.json || true
+    exit 1
 fi
 
 echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
@@ -91,55 +92,54 @@
 # Create a catalog admin role grants
 echo "[polaris-init] Creating catalog admin role grants ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Create a data engineer role
 echo "[polaris-init] Creating data engineer role ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole":{"name":"data_engineer"}}')
+    -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole":{"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
-  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Connect the roles
 echo "[polaris-init] Connecting roles ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"catalogRole":{"name":"catalog_admin"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"catalogRole":{"name":"catalog_admin"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Give root the data engineer role
 echo "[polaris-init] Assigning data engineer role to root ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole": {"name":"data_engineer"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole": {"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 echo "[polaris-init] Permissions setup completed."
 echo "[polaris-init] Done."
-
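The token extraction in this file leans on a sed one-liner; a standalone sketch with an invented response body looks like the following. One portability note: the `\s` in the original pattern is a GNU sed extension, while `[[:space:]]` is the POSIX spelling.

```shell
#!/usr/bin/env bash
# Pull the access_token field out of an OAuth JSON response with sed,
# mirroring the pattern above. The response body here is made up.
TOKEN_JSON='{"access_token":"abc123","token_type":"bearer","expires_in":3600}'

TOKEN=$(printf '%s' "$TOKEN_JSON" |
    sed -n 's/.*"access_token"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p')

if [ -z "$TOKEN" ]; then
    echo "ERROR: failed to extract token. Response: $TOKEN_JSON" >&2
    exit 1
fi
echo "$TOKEN"
```

If `jq` is available in the image, `jq -r .access_token` would handle escaped quotes and key reordering more robustly than the regex.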
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -51,7 +51,7 @@
 STOP=0
 NEED_RESERVE_PORTS=0
 export NEED_LOAD_DATA=1
-export LOAD_PARALLEL=$(( $(getconf _NPROCESSORS_ONLN) / 2 ))
+export LOAD_PARALLEL=$(($(getconf _NPROCESSORS_ONLN) / 2))
 export IP_HOST=$(ip -4 addr show scope global | awk '/inet / {print $2}' | cut -d/ -f1 | head -n 1)
 
 if ! OPTS="$(getopt \
@@ -201,7 +201,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -376,7 +376,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -390,7 +390,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -409,12 +409,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data*.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip \
-            && sudo unzip iceberg_data_paimon_101.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data_paimon_101.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data*.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip &&
+                sudo unzip iceberg_data_paimon_101.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data_paimon_101.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -422,13 +422,12 @@
 
         if [[ ! -f "${ICEBERG_DIR}/data/input/jars/iceberg-aws-bundle-1.10.0.jar" ]]; then
             echo "iceberg 1.10.0 jars does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_1_10_0*.jars.tar.gz\
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_1_10_0.jars.tar.gz \
-            && sudo tar xzvf iceberg_1_10_0.jars.tar.gz -C "data/input/jars" \
-            && sudo rm -rf iceberg_1_10_0.jars.tar.gz
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_1_10_0*.jars.tar.gz && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_1_10_0.jars.tar.gz &&
+                sudo tar xzvf iceberg_1_10_0.jars.tar.gz -C "data/input/jars" &&
+                sudo rm -rf iceberg_1_10_0.jars.tar.gz
             cd -
-        else 
+        else
             echo "iceberg 1.10.0 jars exist, continue !"
         fi
 
@@ -505,9 +504,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -573,12 +572,12 @@
     echo "RUN_ICEBERG_REST"
     # iceberg-rest with multiple cloud storage backends
     ICEBERG_REST_DIR=${ROOT}/docker-compose/iceberg-rest
-    
+
     # generate iceberg-rest.yaml
     export CONTAINER_UID=${CONTAINER_UID}
     . "${ROOT}"/docker-compose/iceberg-rest/iceberg-rest_settings.env
     envsubst <"${ICEBERG_REST_DIR}/docker-compose.yaml.tpl" >"${ICEBERG_REST_DIR}/docker-compose.yaml"
-    
+
     sudo docker compose -f "${ICEBERG_REST_DIR}/docker-compose.yaml" down
     if [[ "${STOP}" -ne 1 ]]; then
         # Start all three REST catalogs (S3, OSS, COS)
@@ -606,102 +605,102 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_iceberg.log 2>&1 &
+    start_iceberg >start_iceberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_ICEBERG_REST}" -eq 1 ]]; then
-    start_iceberg_rest > start_iceberg_rest.log 2>&1 &
+    start_iceberg_rest >start_iceberg_rest.log 2>&1 &
     pids["iceberg-rest"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_POLARIS}" -eq 1 ]]; then
-    start_polaris > start_polaris.log 2>&1 &
+    start_polaris >start_polaris.log 2>&1 &
     pids["polaris"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
 echo "waiting all dockers starting done"
--- run-be-ut.sh.orig
+++ run-be-ut.sh
@@ -479,7 +479,6 @@
 profraw=${DORIS_TEST_BINARY_DIR}/doris_be_test.profraw
 profdata=${DORIS_TEST_BINARY_DIR}/doris_be_test.profdata
 
-
 if [[ ${GDB} -ge 1 ]]; then
     gdb --args "${test}" "${FILTER}"
     exit
--- run-regression-test.sh.orig
+++ run-regression-test.sh
--- thirdparty/build-thirdparty.sh.orig
+++ thirdparty/build-thirdparty.sh
@@ -520,7 +520,7 @@
 
     rm -rf CMakeCache.txt CMakeFiles/
     "${CMAKE_CMD}" ../ -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-      -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_POSITION_INDEPENDENT_CODE=On
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_POSITION_INDEPENDENT_CODE=On
     # -DCMAKE_CXX_FLAGS="$warning_uninitialized"
 
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
@@ -630,7 +630,7 @@
 build_crc32c() {
     check_if_source_exist "${CRC32C_SOURCE}"
     cd "${TP_SOURCE_DIR}/${CRC32C_SOURCE}"
-    
+
     mkdir -p "${BUILD_DIR}"
     cd "${BUILD_DIR}"
 
@@ -1293,7 +1293,7 @@
     rm -rf CMakeCache.txt CMakeFiles/
 
     "${CMAKE_CMD}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-     -G "${GENERATOR}" -DBUILD_SHARED_LIBS=FALSE -DFMT_TEST=OFF -DFMT_DOC=OFF -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" ..
+        -G "${GENERATOR}" -DBUILD_SHARED_LIBS=FALSE -DFMT_TEST=OFF -DFMT_DOC=OFF -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" ..
     "${BUILD_SYSTEM}" -j"${PARALLEL}"
     "${BUILD_SYSTEM}" install
 }
@@ -1361,8 +1361,8 @@
 
     # -Wno-elaborated-enum-base to make C++20 on MacOS happy
     "${CMAKE_CMD}" -G "${GENERATOR}" \
-    -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
-    -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DBUILD_TESTING=OFF ..
+        -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
+        -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DBUILD_TESTING=OFF ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}" install
 }
 
@@ -1796,7 +1796,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
 }
@@ -1868,7 +1868,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     MACHINE_TYPE="$(uname -m)"
     if [[ "${MACHINE_TYPE}" == "aarch64" || "${MACHINE_TYPE}" == 'arm64' ]]; then
         CFLAGS="--target=aarch64-linux-gnu -march=armv8-a+crc" NEON64_CFLAGS=" "
@@ -1898,10 +1898,10 @@
 
         # Add -ldl for clang compatibility (libcrypto.a requires dlopen/dlsym/dlclose/dlerror)
         "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-        -DCMAKE_CXX_FLAGS="-Wno-maybe-uninitialized" \
-        -DCMAKE_EXE_LINKER_FLAGS="-ldl" \
-        -DCMAKE_SHARED_LINKER_FLAGS="-ldl" \
-        -DDISABLE_RUST_IN_BUILD=ON -DVCPKG_MANIFEST_MODE=ON -DVCPKG_OVERLAY_PORTS="${azure_dir}/${AZURE_PORTS}" -DVCPKG_MANIFEST_DIR="${azure_dir}/${AZURE_MANIFEST_DIR}" -DWARNINGS_AS_ERRORS=FALSE -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+            -DCMAKE_CXX_FLAGS="-Wno-maybe-uninitialized" \
+            -DCMAKE_EXE_LINKER_FLAGS="-ldl" \
+            -DCMAKE_SHARED_LINKER_FLAGS="-ldl" \
+            -DDISABLE_RUST_IN_BUILD=ON -DVCPKG_MANIFEST_MODE=ON -DVCPKG_OVERLAY_PORTS="${azure_dir}/${AZURE_PORTS}" -DVCPKG_MANIFEST_DIR="${azure_dir}/${AZURE_MANIFEST_DIR}" -DWARNINGS_AS_ERRORS=FALSE -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
         "${BUILD_SYSTEM}" -j "${PARALLEL}"
         "${BUILD_SYSTEM}" install
     fi
@@ -1917,7 +1917,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DDRAGONBOX_INSTALL_TO_CHARS=ON ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DDRAGONBOX_INSTALL_TO_CHARS=ON ..
 
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
@@ -1963,7 +1963,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
 
--- thirdparty/download-thirdparty.sh.orig
+++ thirdparty/download-thirdparty.sh
@@ -323,7 +323,7 @@
             patch -p1 <"${TP_PATCH_DIR}/rocksdb-5.14.2.patch"
             if [[ "$(uname -s)" == "Darwin" ]]; then
                 patch -p1 <"${TP_PATCH_DIR}/rocksdb-mac-compile-fix.patch"
-            fi 
+            fi
             touch "${PATCHED_MARK}"
         fi
         cd -
@@ -601,9 +601,9 @@
     echo "Finished patching ${AZURE_SOURCE}"
 fi
 
-if [[ " ${TP_ARCHIVES[*]} " =~ " CCTZ " ]] ; then
+if [[ " ${TP_ARCHIVES[*]} " =~ " CCTZ " ]]; then
     cd $TP_SOURCE_DIR/$CCTZ_SOURCE
-    if [[ ! -f "$PATCHED_MARK" ]] ; then
+    if [[ ! -f "$PATCHED_MARK" ]]; then
         for patch_file in "${TP_PATCH_DIR}"/cctz-*; do
             echo "patch ${patch_file}"
             patch -p1 --ignore-whitespace <"${patch_file}"
--- tools/coffeebench-tools/bin/run-queries.sh.orig
+++ tools/coffeebench-tools/bin/run-queries.sh
@@ -73,7 +73,6 @@
     usage
 fi
 
-
 check_prerequest() {
     local CMD=$1
     local NAME=$2
--- tools/tpcds-tools/bin/run-tpcds-queries.sh.orig
+++ tools/tpcds-tools/bin/run-tpcds-queries.sh
@@ -142,15 +142,15 @@
 run_query() {
     local query_file=$1
     local query_name=$2
-    
+
     if [[ ! -f "${query_file}" ]]; then
         return
     fi
-    
+
     local cold=0
     local hot1=0
     local hot2=0
-    
+
     echo -ne "${query_name}\t" | tee -a result.csv
     start=$(date +%s%3N)
     if ! output=$(mysql -h"${FE_HOST}" -u"${USER}" -P"${FE_QUERY_PORT}" -D"${DB}" --comments \
@@ -201,7 +201,7 @@
 for i in ${query_array[@]}; do
     # Run main query file
     run_query "${TPCDS_QUERIES_DIR}/query${i}.sql" "query${i}"
-    
+
     # Run variant query file if exists
     run_query "${TPCDS_QUERIES_DIR}/query${i}_1.sql" "query${i}_1"
 done
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


…tecture

- Add README.md in cloud/alter explaining the complete FE processing flow for
  CREATE INDEX in storage-compute separation mode, including:
  - Key class descriptions
  - Two execution paths (light index change vs full schema change job)
  - Core differences between cloud and local modes
  - Complete Mermaid sequence diagram
- Add class-level Javadoc to CloudSchemaChangeJobV2 describing the MetaService
  interaction lifecycle for index creation
- Add Javadoc to key override methods in CloudSchemaChangeJobV2
- Add class-level Javadoc to CloudSchemaChangeHandler
- Fix logger in CloudSchemaChangeJobV2 to use CloudSchemaChangeJobV2.class

Co-authored-by: ybtsdst <3139535+ybtsdst@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Update front-end processes for storage-compute separation" to "docs: document FE index creation flow for storage-compute separation (存算分离) architecture" on Mar 18, 2026
Copilot AI requested a review from ybtsdst March 18, 2026 16:44
@ybtsdst ybtsdst left a comment

Do not change code logic (for example, the logger change); keeping the comment additions is fine.

…oc only)

Co-authored-by: ybtsdst <3139535+ybtsdst@users.noreply.github.com>

Copilot AI commented Mar 19, 2026

> Do not change code logic (for example, the logger change); keeping the comment additions is fine.

The logger change has been reverted (restored to the original SchemaChangeJobV2.class), and all Javadoc comments are kept. Commit: bf00c98

Copilot AI requested a review from ybtsdst March 19, 2026 01:07
@ybtsdst ybtsdst marked this pull request as ready for review March 19, 2026 02:55
@ybtsdst ybtsdst merged commit dee122f into branch-4.0-dev Mar 19, 2026
