
[HUDI-9737] Add checksum checks while recovering from backup file for… #43982
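
The change under test adds checksum validation when recovering from a backup file. For orientation only, a hypothetical sketch of that general pattern in Java; the class, the CRC32 choice, and the side-car checksum file are illustrative assumptions, not the PR's actual code:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.zip.CRC32;

    class BackupRecoverySketch {
      // Hypothetical: verify a stored checksum before trusting a recovered
      // backup file. CRC32 and the side-car checksum file are assumptions;
      // the PR's actual on-disk format may differ.
      static byte[] recoverVerified(Path backup, Path checksumFile) throws IOException {
        byte[] data = Files.readAllBytes(backup);
        long expected = Long.parseLong(Files.readString(checksumFile).trim());
        CRC32 crc = new CRC32();
        crc.update(data);
        if (crc.getValue() != expected) {
          throw new IOException("Checksum mismatch while recovering " + backup);
        }
        return data;
      }
    }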

Triggered via push August 22, 2025 13:31
Status: Failure
Total duration: 1h 44m 58s

bot.yml

on: push
validate-source (34s)
test-hudi-trino-plugin (4m 0s)
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-tests-part1
Matrix: test-spark-java-tests-part2
Matrix: test-spark-java-tests-part3
Matrix: test-spark-java11-17-java-tests-part1
Matrix: test-spark-java11-17-java-tests-part2
Matrix: test-spark-java11-17-java-tests-part3
Matrix: test-spark-java11-17-scala-dml-tests
Matrix: test-spark-java11-17-scala-other-tests
Matrix: test-spark-java17-java-tests-part1
Matrix: test-spark-java17-java-tests-part2
Matrix: test-spark-java17-java-tests-part3
Matrix: test-spark-java17-scala-dml-tests
Matrix: test-spark-java17-scala-other-tests
Matrix: test-spark-scala-dml-tests
Matrix: test-spark-scala-other-tests
Matrix: validate-bundles-java11
Matrix: validate-bundles

Annotations

20 errors and 158 warnings
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
  Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
integration-tests (spark3.5, flink1.20, spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz)
  Process completed with exit code 1.
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
  Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
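
The classloader annotations above quote the relevant Flink switch themselves. A minimal sketch of suppressing the check, assuming the Flink Configuration API; whether to disable it rather than fix the leak is exactly the trade-off the message describes:

    import org.apache.flink.configuration.Configuration;

    class DisableClassloaderCheckSketch {
      // Hypothetical: build a Flink Configuration with the leaked-classloader
      // check disabled. The key string is taken verbatim from the warning
      // above; the rest is an assumption, not what Hudi's CI configures.
      static Configuration relaxedConf() {
        Configuration conf = new Configuration();
        conf.setString("classloader.check-leaked-classloader", "false");
        return conf;
      }
      // Equivalent flink-conf.yaml entry:
      //   classloader.check-leaked-classloader: false
    }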
test-flink-2 (flink1.20, 1.11.4, 1.13.1)
  <false>
test-spark-java17-java-tests-part2 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Failed to rollback /tmp/junit-5816784283762730549/dataset commits 20250822143229620
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143225783, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143225766, actionState=COMPLETED'}
test-spark-java11-17-java-tests-part2 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Failed to rollback /tmp/junit-8214831928515055848/dataset commits 20250822143345676
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143341585, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143341560, actionState=COMPLETED'}
test-spark-java-tests-part2 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Failed to rollback /tmp/junit-8528497693670148750/dataset commits 20250822143248382
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143244850, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143244825, actionState=COMPLETED'}
test-spark-java11-17-java-tests-part2 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Failed to rollback /tmp/junit-8733324224962470194/dataset commits 20250822143409532
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143405243, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143405260, actionState=COMPLETED'}
test-spark-java-tests-part2 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Failed to rollback /tmp/junit-3525821838511561418/dataset commits 20250822143339701
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143336016, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143335991, actionState=COMPLETED'}
test-spark-java-tests-part2 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Failed to rollback /tmp/junit-4158318801392709624/dataset commits 20250822143259735
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143256209, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143256191, actionState=COMPLETED'}
test-spark-java17-java-tests-part2 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Failed to rollback /tmp/junit-17947392962715841213/dataset commits 20250822143343378
  Cannot resolve conflicts for overlapping writes between first operation = {actionType=commit, instantTime=20250822143339165, actionState=INFLIGHT'}, second operation = {actionType=commit, instantTime=20250822143339138, actionState=COMPLETED'}
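
The rollback failures and conflict errors above come from Hudi's optimistic concurrency control: two commits with overlapping instant times collided, one still INFLIGHT while the other COMPLETED. For context, a minimal sketch of the writer-side options that drive this check, assuming a Spark DataSource write and an in-process lock provider; this is illustrative, not what the failing tests configure:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;

    class OccWriteSketch {
      // Illustrative sketch: the standard Hudi options behind the
      // "overlapping writes" conflict check. The table name, path, and
      // in-process lock provider are assumptions for a single-JVM setup.
      static void writeWithOcc(Dataset<Row> df, String basePath) {
        df.write().format("hudi")
            .option("hoodie.table.name", "dataset")
            .option("hoodie.write.concurrency.mode", "optimistic_concurrency_control")
            // With multiple writers, lazy cleaning keeps one writer's failure
            // from rolling back another writer's inflight commit.
            .option("hoodie.cleaner.policy.failed.writes", "LAZY")
            .option("hoodie.write.lock.provider",
                "org.apache.hudi.client.transaction.lock.InProcessLockProvider")
            .mode(SaveMode.Append)
            .save(basePath);
      }
    }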
validate-bundles-java11 (scala-2.12, flink2.0, 1.11.4, 1.14.4, spark3.5, spark3.5.1)
  validate.sh done validating flink 2.0 bundle
  validate.sh done validating Flink bundle validation was successful.
  Use default java runtime under /opt/java/openjdk
  validate.sh validating flink 2.0 bundle
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh Building Hudi with Java 8
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh Building Hudi with Java 8
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh Building Hudi with Java 8
validate-bundles (scala-2.13, flink1.19, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
  validate.sh validating utilities bundle
  validate.sh done validating cli bundle
  validate.sh setting up CLI bundle validation
  validate.sh validating cli bundle
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh spark & hadoop-mr bundles validation was successful.
  Use default java runtime under /opt/java/openjdk
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, 1.11.4, 1.13.1, spark3.5, spark3.5.1)
  validate.sh validating utilities bundle
  validate.sh done validating cli bundle
  validate.sh setting up CLI bundle validation
  validate.sh validating cli bundle
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh spark & hadoop-mr bundles validation was successful.
  Use default java runtime under /opt/java/openjdk
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, 1.11.4, 1.12.3, spark3.3, spark3.3.4)
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh skip validating cli bundle for non-spark3.5 build
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh spark & hadoop-mr bundles validation was successful.
  Use default java runtime under /opt/java/openjdk
  validate.sh Query and validate the results using HiveQL
  validate.sh Query and validate the results using Spark SQL
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh validating spark & hadoop-mr bundle
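
Several of these warnings narrate the same flow: write sample data through the Spark DataSource, run Hive Sync, then query the results back with Spark SQL and HiveQL. A rough sketch of the write-and-sync step, assuming HMS-mode sync against a local metastore; the table name, path, and URI are placeholders, not what validate.sh uses:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;

    class HiveSyncSketch {
      // Illustrative sketch: write Hudi data and let Hive Sync register the
      // table so it is queryable from both Spark SQL and HiveQL. The
      // metastore URI, table name, and path are assumptions.
      static void writeAndSync(Dataset<Row> df) {
        df.write().format("hudi")
            .option("hoodie.table.name", "sample_table")
            .option("hoodie.datasource.hive_sync.enable", "true")
            .option("hoodie.datasource.hive_sync.mode", "hms")
            .option("hoodie.datasource.hive_sync.metastore.uris",
                "thrift://localhost:9083")
            .mode(SaveMode.Overwrite)
            .save("/tmp/hudi/sample_table");
      }
    }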
validate-bundles (scala-2.12, flink1.18, 1.11.4, 1.13.1, spark3.4, spark3.4.3)
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh skip validating cli bundle for non-spark3.5 build
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh spark & hadoop-mr bundles validation was successful.
  Use default java runtime under /opt/java/openjdk
  validate.sh Query and validate the results using HiveQL
  validate.sh Query and validate the results using Spark SQL
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh validating spark & hadoop-mr bundle