
Spark Fails to Run Because No Node Is Available

Publication Date: 2019-04-12

Issue Description

An error occurs when Spark runs a task. The information printed on the executor indicates that no node is available.

Handling Process

1. After the Spark task fails, check the executor log.

2. The alarms in the following figure are reported on FusionInsight Manager.

3. The error information in the following figure is displayed on the HDFS native page.

4. The preceding information indicates that the disk space is insufficient.
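The checks in the steps above can also be run from the command line; a minimal sketch, assuming shell access to a node with the HDFS client configured (the exact Manager alarms and native-page figures are not reproduced here):

```shell
# Cluster-side view: report per-DataNode capacity and usage.
# Guarded because the HDFS client may not be installed on this host.
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfsadmin -report | grep -E 'Name:|DFS Remaining|DFS Used%'
fi

# Local view on a suspect DataNode: a volume at or near 100% use
# matches the "no available node" symptom described above.
df -h
```

A DataNode whose data volumes show little or no remaining space will be excluded from block placement, which surfaces in Spark as no node being available.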


Solution

Clean up the disk to free space. If no data can be deleted, add disks or expand the cluster capacity.
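Before deleting anything, it helps to find what is consuming the space; a sketch, assuming shell access to the affected node (`DATA_DIR` below defaults to an illustrative path, not one taken from the article):

```shell
# Rank the largest subdirectories of a suspect volume so cleanup
# targets the right data. DATA_DIR is an illustrative default only;
# point it at the actual DataNode data directory in your cluster.
DATA_DIR="${DATA_DIR:-/tmp}"
du -xh --max-depth=1 "$DATA_DIR" 2>/dev/null | sort -rh | head -10

# On the HDFS side, survey usage first, then remove expendable data.
# -skipTrash frees the space immediately instead of moving it to trash.
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -du -h /
    # hdfs dfs -rm -r -skipTrash <path-to-expendable-data>
fi
```

If cleanup cannot recover enough space, adding disks to the DataNodes or adding nodes to the cluster is the remaining option, as stated above.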