
[fix](paimon) Set the target size of the split for 3.0 #50405


Open
wants to merge 2 commits into base: branch-3.0

Conversation

wuwenchi
Contributor

What problem does this PR solve?

bp: #50083

@Thearas
Contributor

Thearas commented Apr 25, 2025

Thank you for your contribution to Apache Doris.
Don't know what should be done next? See How to process your PR.

Please clearly describe your PR:

  1. What problem was fixed (ideally including the specific error message) and how it was fixed.
  2. Which behaviors were modified: what the previous behavior was, what it is now, why it was changed, and any possible impacts.
  3. What features were added and why.
  4. Which code was refactored and why.
  5. Which functions were optimized, and what the difference is before and after the optimization.

@wuwenchi
Contributor Author

run buildall

@wuwenchi wuwenchi marked this pull request as ready for review April 25, 2025 02:57
@wuwenchi wuwenchi requested a review from dataroaring as a code owner April 25, 2025 02:57
Contributor

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In build.sh line 895:
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'x86_64' ]]; then
                                                  ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'x86_64' ]]; then


In build.sh line 899:
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'aarch64' ]]; then
                                                    ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'aarch64' ]]; then


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
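The SC2155 warning above is more than style: combining `local` with a command substitution discards the command's exit status, because `$?` reflects `local` itself. A minimal sketch (function names are illustrative, not from the script):

```shell
#!/usr/bin/env bash
# SC2155: 'local var=$(cmd)' reports the exit status of 'local', not 'cmd',
# so a failure inside the substitution is silently masked.
combined() {
    local out=$(false)    # status of 'false' is lost
    echo "combined: $?"
}
separate() {
    local out
    out=$(false)          # status of 'false' survives
    echo "separate: $?"
}
combined    # prints: combined: 0
separate    # prints: separate: 1
```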


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
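The many SC2086 findings in this report all stem from the same mechanic: an unquoted expansion is split on whitespace and glob-expanded before the command sees it, so one variable can silently become several arguments. A small demonstration (names are illustrative):

```shell
#!/usr/bin/env bash
# SC2086: unquoted expansions undergo word splitting and globbing.
count_args() { echo "$#"; }
val="two words"
count_args $val      # splits into 2 arguments, prints: 2
count_args "$val"    # stays a single argument, prints: 1
```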


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${es_5_host}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${es_5_host}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${es_6_host}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${es_6_host}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${es_7_host}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${es_7_host}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${es_7_host}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${es_8_host}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${es_8_host}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${es_8_host}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
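SC2091 flags a real hazard here, not style: `$(...)` substitutes the command's output, and the shell then tries to execute that output as a new command. With `nc`, which prints nothing, the pattern happens to work, but it breaks the moment the command produces output:

```shell
#!/usr/bin/env bash
# SC2091: '$(cmd)' runs cmd, then executes its OUTPUT as another command.
# Here $(echo false) expands to the word 'false', which is then executed.
if $(echo false); then echo "yes"; else echo "no"; fi             # prints: no
# Testing the command directly does what was intended:
if echo false >/dev/null; then echo "yes"; else echo "no"; fi     # prints: yes
```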


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
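Context for SC2125: a glob in a plain assignment is stored literally rather than expanded. That is arguably what a Java `-cp` wildcard classpath wants here, but it surprises anyone who later expects matching filenames. A sketch using a temporary directory:

```shell
#!/usr/bin/env bash
# SC2125: a glob on the right side of an assignment is kept as a literal
# string; only an array assignment (or later unquoted use) matches files.
dir=$(mktemp -d)
touch "${dir}/a.txt" "${dir}/b.txt"
scalar="${dir}/*.txt"        # literal pattern, no expansion
arr=("${dir}"/*.txt)         # expands to the two files
echo "scalar: ${scalar}"     # prints the pattern itself
echo "matched: ${#arr[@]}"   # prints: matched: 2
```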


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
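The SC2145 error deserves a note: inside a double-quoted string, `"$@"` splices the argument array into the surrounding text, attaching the string fragments to the first and last arguments. For logging, `$*`, which joins all arguments into one string, is the appropriate form (the function below is illustrative):

```shell
#!/usr/bin/env bash
# SC2145: "$@" inside a larger string mixes an array into a string;
# "$*" joins all positional parameters into one clean string.
show() {
    printf '%s\n' "args: $*"
}
show --sync mode jdbc   # prints: args: --sync mode jdbc
```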


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 330:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 339:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 344:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 363:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 365:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 367:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 375:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 376:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 380:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 391:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 393:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 403:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 404:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 406:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 414:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 415:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 419:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 430:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 432:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 499:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 500:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 501:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 510:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 512:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 514:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 518:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 533:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 534:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 535:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 546:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 547:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 552:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 557:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 563:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 601:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 603:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 615:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 622:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 623:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 624:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 778:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 779:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 780:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 782:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true


In thirdparty/build-thirdparty.sh line 1863:
    cp -r ${TP_SOURCE_DIR}/${JINDOFS_SOURCE}/* "${TP_INSTALL_DIR}/jindofs_libs/"
          ^--------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                           ^---------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    cp -r "${TP_SOURCE_DIR}"/"${JINDOFS_SOURCE}"/* "${TP_INSTALL_DIR}/jindofs_libs/"


In tools/lzo/build.sh line 1:
# Licensed to the Apache Software Foundation (ASF) under one
^-- SC2148 (error): Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.


In tools/lzo/build.sh line 20:
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I${DORIS_THIRDPARTY}/installed/include -L${DORIS_THIRDPARTY}/installed/lib -llzo2 -std=c++17
                                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                     ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I"${DORIS_THIRDPARTY}"/installed/include -L"${DORIS_THIRDPARTY}"/installed/lib -llzo2 -std=c++17

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
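For option 1, the recurring findings above (SC2292, SC2250, SC2086) all reduce to the same three habits. A minimal sketch of the corrected style, using hypothetical values rather than lines from the flagged scripts:

```shell
#!/usr/bin/env bash
# Sketch of the fixes ShellCheck suggests in this report:
#   SC2292 - prefer [[ ]] over [ ] in Bash
#   SC2250 - put braces around variable references
#   SC2086 - double quote expansions to prevent globbing/word splitting

i=60
# Before (flagged): if [ $i -eq 60 ]; then
if [[ "${i}" -eq 60 ]]; then
    echo "retry limit reached"
fi

# Quoting matters once a value contains spaces: unquoted ${file}
# would split into multiple words inside the test.
file="name with spaces.txt"
touch "${file}"
if [[ -f "${file}" ]]; then
    echo "Processing ${file}"
fi
rm -f "${file}"
```

Fixing the underlying style (option 1) is usually preferable to suppressing the codes (options 2 and 3), since the quoting findings in particular (SC2086) can hide real bugs with paths containing spaces.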



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -187,7 +187,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -387,7 +387,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -426,7 +426,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -452,12 +452,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -619,9 +619,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -672,101 +672,101 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


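For context, a minimal sketch (not code from this PR) of the two style rules the report above enforces: `$(...)` command substitution instead of legacy backticks, and braces around variable references (SC2250):

```shell
#!/usr/bin/env bash
# Hedged illustration of the sh-checker style fixes; paths here are made up.
set -euo pipefail

workdir=$(mktemp -d) # $(...) instead of `mktemp -d` backticks: it nests cleanly
printf 'a\n' >"${workdir}/a.txt"

# SC2250: prefer "${workdir}" over "$workdir" even when not strictly required
first_entry=$(ls "${workdir}" | head -1)
echo "first entry: ${first_entry}"

rm -rf "${workdir}"
```

Running `shfmt -w` on a script applies the formatting changes shown in the diffs above (indentation, redirection spacing, line-continuation style) without altering behavior.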
@doris-robot

TPC-H: Total hot run time: 40067 ms
machine: 'aliyun_ecs.c7a.8xlarge_32C64G'
scripts: https://github.com/apache/doris/tree/master/tools/tpch-tools
Tpch sf100 test result on commit c03ce67a8731229aa9361d0d979a15bd195efaf3, data reload: false

------ Round 1 ----------------------------------
q1	17572	6704	6608	6608
q2	2060	171	162	162
q3	10622	1098	1237	1098
q4	10558	775	751	751
q5	7748	2893	2851	2851
q6	224	135	135	135
q7	978	611	612	611
q8	9365	1930	2075	1930
q9	6580	6368	6378	6368
q10	6969	2285	2326	2285
q11	454	256	267	256
q12	400	212	206	206
q13	17792	2964	2985	2964
q14	241	202	213	202
q15	490	468	475	468
q16	662	587	573	573
q17	970	560	561	560
q18	7180	6603	6727	6603
q19	1399	1117	1066	1066
q20	493	203	213	203
q21	4021	3202	3270	3202
q22	1078	965	977	965
Total cold run time: 107856 ms
Total hot run time: 40067 ms

----- Round 2, with runtime_filter_mode=off -----
q1	6883	6619	6556	6556
q2	320	233	238	233
q3	2911	2717	2959	2717
q4	2031	1788	1756	1756
q5	5730	5758	5744	5744
q6	215	128	128	128
q7	2252	1813	1859	1813
q8	3341	3577	3512	3512
q9	9048	8800	8859	8800
q10	3541	3542	3520	3520
q11	594	496	503	496
q12	817	613	629	613
q13	9222	3138	3195	3138
q14	287	288	261	261
q15	512	474	460	460
q16	687	636	648	636
q17	1850	1637	1621	1621
q18	8239	7755	7719	7719
q19	1694	1572	1528	1528
q20	2032	1826	1794	1794
q21	5643	5265	5218	5218
q22	1091	1045	1038	1038
Total cold run time: 68940 ms
Total hot run time: 59301 ms

@doris-robot

TPC-DS: Total hot run time: 197194 ms
machine: 'aliyun_ecs.c7a.8xlarge_32C64G'
scripts: https://github.com/apache/doris/tree/master/tools/tpcds-tools
TPC-DS sf100 test result on commit c03ce67a8731229aa9361d0d979a15bd195efaf3, data reload: false

query1	1291	885	884	884
query2	6328	2003	2005	2003
query3	10832	4276	4284	4276
query4	61238	29115	23806	23806
query5	5257	480	456	456
query6	392	167	173	167
query7	5463	313	307	307
query8	304	228	224	224
query9	8722	2626	2616	2616
query10	484	267	252	252
query11	17717	15263	15765	15263
query12	164	104	113	104
query13	1459	462	438	438
query14	10621	6868	7157	6868
query15	208	189	188	188
query16	7112	463	467	463
query17	1185	588	591	588
query18	1889	335	325	325
query19	210	169	169	169
query20	121	107	112	107
query21	219	107	108	107
query22	4620	4391	4788	4391
query23	34578	34089	34363	34089
query24	6075	2910	2918	2910
query25	530	397	408	397
query26	650	167	172	167
query27	1823	355	359	355
query28	4354	2457	2448	2448
query29	705	448	429	429
query30	251	176	174	174
query31	1001	805	823	805
query32	72	59	57	57
query33	425	301	277	277
query34	913	519	521	519
query35	875	741	724	724
query36	1047	924	935	924
query37	117	62	68	62
query38	4060	3998	3930	3930
query39	1513	1468	1532	1468
query40	198	105	101	101
query41	49	53	47	47
query42	125	104	100	100
query43	542	499	508	499
query44	1186	823	839	823
query45	186	169	172	169
query46	1153	718	724	718
query47	1996	1924	1876	1876
query48	480	380	391	380
query49	745	406	418	406
query50	850	437	421	421
query51	7334	7287	7199	7199
query52	103	92	95	92
query53	262	197	182	182
query54	582	484	471	471
query55	77	83	83	83
query56	270	255	236	236
query57	1295	1153	1122	1122
query58	219	228	217	217
query59	3105	3010	2896	2896
query60	274	255	250	250
query61	117	108	104	104
query62	779	685	683	683
query63	216	186	183	183
query64	1362	686	661	661
query65	3261	3252	3163	3163
query66	683	298	308	298
query67	15851	15713	15506	15506
query68	4243	587	572	572
query69	418	274	279	274
query70	1168	1052	1073	1052
query71	360	263	259	259
query72	6391	4066	4001	4001
query73	749	341	356	341
query74	10058	8992	9005	8992
query75	3362	2630	2680	2630
query76	2170	1218	1281	1218
query77	502	272	262	262
query78	10454	9531	9582	9531
query79	1121	595	589	589
query80	801	440	423	423
query81	515	241	236	236
query82	1335	96	88	88
query83	235	142	140	140
query84	287	81	78	78
query85	858	316	295	295
query86	317	309	311	309
query87	4433	4253	4216	4216
query88	3522	2386	2401	2386
query89	414	296	293	293
query90	2033	188	186	186
query91	183	150	147	147
query92	65	51	50	50
query93	1136	548	574	548
query94	741	301	308	301
query95	351	265	259	259
query96	607	272	280	272
query97	3285	3168	3156	3156
query98	217	194	200	194
query99	1550	1296	1273	1273
Total cold run time: 312006 ms
Total hot run time: 197194 ms

@doris-robot

ClickBench: Total hot run time: 31.5 s
machine: 'aliyun_ecs.c7a.8xlarge_32C64G'
scripts: https://github.com/apache/doris/tree/master/tools/clickbench-tools
ClickBench test result on commit c03ce67a8731229aa9361d0d979a15bd195efaf3, data reload: false

query1	0.03	0.02	0.04
query2	0.06	0.03	0.03
query3	0.24	0.06	0.06
query4	1.62	0.10	0.10
query5	0.53	0.50	0.52
query6	1.14	0.74	0.73
query7	0.02	0.02	0.01
query8	0.04	0.03	0.03
query9	0.56	0.50	0.51
query10	0.55	0.54	0.58
query11	0.13	0.10	0.10
query12	0.14	0.11	0.10
query13	0.61	0.60	0.60
query14	2.86	2.82	2.86
query15	0.88	0.82	0.84
query16	0.38	0.37	0.39
query17	1.10	1.05	1.05
query18	0.24	0.23	0.23
query19	1.96	1.79	2.00
query20	0.01	0.01	0.02
query21	15.36	0.60	0.59
query22	2.41	2.34	1.92
query23	17.11	0.91	0.76
query24	3.28	1.52	0.28
query25	0.25	0.19	0.07
query26	0.50	0.13	0.14
query27	0.05	0.05	0.04
query28	10.51	0.48	0.46
query29	12.56	3.24	3.24
query30	0.24	0.06	0.06
query31	2.85	0.39	0.38
query32	3.26	0.45	0.46
query33	2.98	3.02	2.99
query34	16.93	4.49	4.46
query35	4.50	4.48	4.48
query36	0.66	0.47	0.47
query37	0.09	0.06	0.06
query38	0.05	0.04	0.03
query39	0.03	0.02	0.02
query40	0.18	0.13	0.13
query41	0.08	0.03	0.02
query42	0.04	0.02	0.02
query43	0.03	0.03	0.03
Total cold run time: 107.05 s
Total hot run time: 31.5 s
