### _Why are the changes needed?_

Following https://github.com/apache/spark/pull/32096 and https://github.com/apache/spark/pull/32102, this PR fixes the `java.net.BindException` raised when testing with GitHub Actions:

```
SparkOperationSuite:
*** RUN ABORTED ***
  java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:461)
  at sun.nio.ch.Net.bind(Net.java:453)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:222)
  at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:134)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:550)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1334)
  at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
  at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
  at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:973)
```

It also passes `FRONTEND_BIND_HOST` through the connection string to fix a similar issue:

```
Cause: java.lang.RuntimeException: org.apache.kyuubi.KyuubiSQLException:org.apache.kyuubi.KyuubiException: Failed to initialize frontend service on fv-az207-19/10.1.1.0:0.
  at org.apache.kyuubi.service.FrontendService.initialize(FrontendService.scala:102)
  at org.apache.kyuubi.service.CompositeService.$anonfun$initialize$1(CompositeService.scala:40)
  at org.apache.kyuubi.service.CompositeService.$anonfun$initialize$1$adapted(CompositeService.scala:40)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
  at org.apache.kyuubi.service.CompositeService.initialize(CompositeService.scala:40)
  at org.apache.kyuubi.service.Serverable.initialize(Serverable.scala:44)
  at org.apache.kyuubi.engine.spark.SparkSQLEngine.initialize(SparkSQLEngine.scala:49)
  at org.apache.kyuubi.engine.spark.SparkSQLEngine$.startEngine(SparkSQLEngine.scala:105)
  at org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:118)
  at org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
```

### _How was this patch tested?_

- [ ] Add some test cases that check the changes thoroughly, including negative and positive cases if possible
- [ ] Add screenshots for manual tests if appropriate
- [ ] [Run test](https://kyuubi.readthedocs.io/en/latest/tools/testing.html#running-tests) locally before making a pull request

Closes #503 from turboFei/KYUUBI-502.

Closes #502

1b10253a [fwang12] use localhost instead of 127.0.0.1
c104ce3f [fwang12] address comments
1e549c16 [fwang12] revert install shade
457ce2f5 [fwang12] try set frontend bind host in connection string
8bcd5a0d [fwang12] revert env KYUUBI_FRONTEND_BIND_HOST and set kyuubi.frontend.bind.host to 127.0.0.1 in scalatest-maven-plugin
717a992f [fwang12] update doc
d5ba05a4 [fwang12] add install shaded jars in release.yml
e8b23725 [fwang12] involve KYUUBI_FRONTEND_BIND_HOST
5eb7cdb2 [fwang12] also set KYUUBI_FRONTEND_BIND_HOST env to 127.0.0.1
7d708198 [fwang12] [KYUUBI #502][SPARK-35002][INFRA] Fix the java.net.BindException when testing with Github Action

Authored-by: fwang12 <fwang12@ebay.com>
Signed-off-by: Cheng Pan <379377944@qq.com>
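The `BindException` above happens because Spark's driver tries to bind a socket to the address the CI host's hostname resolves to, which may not be assigned to any local interface. A minimal plain-JDK sketch (not part of this patch; the `BindCheck` class name is made up for illustration) of why binding to `localhost` always succeeds:

```java
import java.net.InetAddress;
import java.net.ServerSocket;

// Sketch, not part of the patch: CI hosts such as fv-az207-19 may map their
// hostname to an address that no local interface owns, so binding to it fails
// with BindException. The loopback address ("localhost") is always bindable,
// which is what SPARK_LOCAL_IP=localhost relies on.
public class BindCheck {
    public static void main(String[] args) throws Exception {
        InetAddress loopback = InetAddress.getByName("localhost");
        // Port 0 asks the OS for any free port, similar to Spark's random-port retries.
        try (ServerSocket socket = new ServerSocket(0, 50, loopback)) {
            System.out.println("bound to "
                    + socket.getInetAddress().getHostAddress() + ":" + socket.getLocalPort());
        }
    }
}
```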
86 lines · 2.7 KiB · YAML
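Commit 8bcd5a0d above sets `kyuubi.frontend.bind.host` to `127.0.0.1` via the `scalatest-maven-plugin`. A hedged sketch of what that plugin configuration could look like in `pom.xml` (only the property name and value come from the commit log; the surrounding plugin block is an assumption based on the plugin's standard `systemProperties` support):

```xml
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <configuration>
    <systemProperties>
      <!-- Bind the Kyuubi frontend to loopback so CI hostname resolution cannot break tests. -->
      <kyuubi.frontend.bind.host>127.0.0.1</kyuubi.frontend.bind.host>
    </systemProperties>
  </configuration>
</plugin>
```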
```yaml
name: Kyuubi

on:
  push:
    branches:
      - master
      - branch-*
  pull_request:
    branches:
      - master
      - branch-*

jobs:
  rat:
    name: Check License
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v1
        with:
          java-version: '1.8'
      - run: build/mvn org.apache.rat:apache-rat-plugin:check
      - name: Upload rat report
        if: failure()
        uses: actions/upload-artifact@v2
        with:
          name: rat
          path: "**/target/rat*.txt"

  build:
    name: Build
    runs-on: ubuntu-latest
    strategy:
      matrix:
        profiles:
          - '-Pspark-3.0'
          - '-Pspark-3.0 -Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-3.1.1 -Dspark.archive.name=spark-3.1.1-bin-hadoop2.7.tgz -Dmaven.plugin.scalatest.exclude.tags=org.apache.kyuubi.tags.DataLakeTest'
          - '-Pspark-3.1'
          - '-Pspark-3.1 -Dhadoop.binary.version=3.2'
          - '-Pspark-3.2-snapshot -pl :kyuubi-spark-sql-engine,:kyuubi-common,:kyuubi-ha,:kyuubi-zookeeper -Dmaven.plugin.scalatest.exclude.tags=org.apache.kyuubi.tags.DataLakeTest'
    env:
      SPARK_LOCAL_IP: localhost
    steps:
      - uses: actions/checkout@v2
      - name: Setup JDK 1.8
        uses: actions/setup-java@v1
        with:
          java-version: '1.8'
      - uses: actions/cache@v1
        with:
          path: ~/.m2/repository/com
          key: ${{ runner.os }}-maven-com-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-com-
      - uses: actions/cache@v1
        with:
          path: ~/.m2/repository/org
          key: ${{ runner.os }}-maven-org-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-org-
      - uses: actions/cache@v1
        with:
          path: ~/.m2/repository/net
          key: ${{ runner.os }}-maven-net-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-net-
      - uses: actions/cache@v1
        with:
          path: ~/.m2/repository/io
          key: ${{ runner.os }}-maven-io-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-io-
      - name: Build with Maven
        run: |
          mvn clean install ${{ matrix.profiles }} -Dmaven.javadoc.skip=true -V
          bash <(curl -s https://codecov.io/bash)
      - name: Detected Dependency List Change
        if: ${{ ! contains(matrix.profiles, 'spark-3.2-snapshot') }}
        run: build/dependency.sh
      - name: Upload unit tests log
        if: failure()
        uses: actions/upload-artifact@v2
        with:
          name: unit-tests-log
          path: "**/target/unit-tests.log"
```
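The build job's `env` sets `SPARK_LOCAL_IP: localhost` so Spark binds to loopback regardless of the runner's hostname mapping. A sketch of checking that resolution locally before running a single matrix profile (the commented `mvn` line assumes execution from the repository root):

```shell
# Confirm what SPARK_LOCAL_IP=localhost resolves to, as the workflow relies on it.
export SPARK_LOCAL_IP=localhost
resolved=$(getent hosts "$SPARK_LOCAL_IP" | awk '{print $1; exit}')
echo "SPARK_LOCAL_IP resolves to ${resolved}"

# Then run one matrix entry, e.g.:
# mvn clean install -Pspark-3.1 -Dmaven.javadoc.skip=true -V
```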