Set lowerbound for pyspark<3.2 #3630
master.yml
on: push
Matrix: Conda (Python, Spark, pandas, PyArrow)
Matrix: PIP (Python, Spark, pandas, PyArrow)
Annotations
13 errors and 9 warnings
Errors (13):

- PIP (Python, Spark, pandas, PyArrow) (3.5, 2.3.4, 0.23.4, 0.16.0, 1.18.5):
  Version 3.5 with arch x64 not found. The list of all available versions can be found here: https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json
- PIP (Python, Spark, pandas, PyArrow) (3.6, 2.3.4, 0.24.2, 0.10.0, 1.19.5, distributed-sequence):
  Version 3.6 with arch x64 not found (same versions manifest as above).
- PIP (Python, Spark, pandas, PyArrow) (3.9, 3.1.2, 1.2.5, 3.0.0, 1.20.3):
  Process completed with exit code 1.
- PIP (Python, Spark, pandas, PyArrow) (3.9, 3.2.0, 1.2.5, 4.0.1, 1.21.2, distributed-sequence):
  Process completed with exit code 1.
- Conda (Python, Spark, pandas, PyArrow) (3.6, 2.4.7, 0.24.2, 0.14.1, 1.19.5, databricks.koalas.usa...) [2 annotations]:
  The runner has received a shutdown signal (this can happen when the runner service is stopped, or a manually started runner is canceled); the operation was canceled.
- Conda (Python, Spark, pandas, PyArrow) (3.7, 2.4.7, 0.25.3, 0.15.1, 1.19.5, distributed-sequence) [2 annotations]:
  The runner has received a shutdown signal; the operation was canceled.
- Conda (Python, Spark, pandas, PyArrow) (3.8, 3.1.1, 1.2.5, 3.0.0, 1.20.3, distributed-sequence):
  Process completed with exit code 1.
- Conda (Python, Spark, pandas, PyArrow) (3.7, 3.0.2, 1.0.5, 1.0.1, 1.19.5) [2 annotations]:
  The runner has received a shutdown signal; the operation was canceled.
- Conda (Python, Spark, pandas, PyArrow) (3.8, 3.0.2, 1.1.5, 2.0.0, 1.19.5):
  Process completed with exit code 1.
- Conda (Python, Spark, pandas, PyArrow) (3.7, 3.1.1, 1.1.5, 2.0.0, 1.19.5, distributed-sequence):
  The hosted runner "GitHub Actions 41" lost communication with the server. Anything in the workflow that terminates the runner process, starves it of CPU/memory, or blocks its network access can cause this error.
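The two "Version ... not found" failures mean actions/setup-python could not resolve the requested interpreter against the versions-manifest.json linked above: Python 3.5 and 3.6 are end-of-life and are no longer published for current hosted runner images. A minimal sketch of how the matrix might pin versions that are still available (the exact version list and field names here are an assumption, not the repository's actual master.yml):

```yaml
# Hypothetical excerpt from master.yml. Only versions still present in
# actions/python-versions' versions-manifest.json can be resolved by setup-python,
# so the matrix drops 3.5/3.6 in favor of still-published interpreters.
strategy:
  matrix:
    python-version: ["3.7", "3.8", "3.9"]
steps:
  - uses: actions/setup-python@v4
    with:
      python-version: ${{ matrix.python-version }}
```

Jobs that must keep testing against end-of-life interpreters typically have to move to a container image or a Conda-provisioned environment instead of the hosted tool cache.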
Warnings (9):

- All four PIP jobs (3.5/2.3.4, 3.6/2.3.4, 3.9/3.1.2, 3.9/3.2.0):
  The following actions use node12, which is deprecated, and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v1, actions/cache@v1, actions/setup-python@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
- Five Conda jobs (3.6/2.4.7, 3.7/2.4.7, 3.8/3.1.1, 3.7/3.0.2, 3.8/3.0.2):
  The same node12 deprecation warning, for actions/checkout@v2 and actions/setup-java@v1.
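The node12 deprecation warnings come from the major versions the workflow pins (checkout@v2, setup-java@v1, cache@v1, setup-python@v2), whose runtimes are node12-based; bumping each pin to a node16-based major clears them. A sketch of the bumped step list, assuming the workflow targets Java 8 and caches pip against a requirements-dev.txt (both of those details, and the cache key, are assumptions for illustration):

```yaml
# Hypothetical step list with the deprecated pins bumped to node16-based majors.
steps:
  - uses: actions/checkout@v3        # was @v2 (node12)
  - uses: actions/setup-java@v3      # was @v1 (node12); v2+ requires a 'distribution' input
    with:
      distribution: temurin
      java-version: "8"
  - uses: actions/cache@v3           # was @v1 (node12)
    with:
      path: ~/.cache/pip
      key: pip-${{ hashFiles('requirements-dev.txt') }}
  - uses: actions/setup-python@v4    # was @v2 (node12)
    with:
      python-version: "3.9"
```

Note that setup-java's v1-to-v3 jump is the only behavior-changing one: v1 defaulted to a Zulu-based distribution, while v2+ makes the distribution explicit and mandatory.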