Spark On Windows 10. 'files\spark\bin\..\jars""\' Is Not Recognized As An Internal Or External Command
Solution 1:
I just found the fix among the answers to this question. The following worked for me and is totally counter-intuitive:
"On Windows, I found that if it is installed in a directory that has a space in the path (C:\Program Files\Spark) the installation will fail. Move it to the root or another directory with no spaces."
Solution 2:
This problem is caused by your environment variable settings. You probably set the SPARK_HOME value to 'Program Files\Spark\bin', which has two issues:
- remove the \bin; SPARK_HOME is just 'Program Files\Spark\'
- the path to SPARK_HOME contains a space, which breaks the launch scripts, so set it using the 8.3 short name instead: 'Progra~1\Spark\' (see the sketch below)
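If you cannot move the installation, one way to apply the short-name workaround when launching PySpark from Python is to set the variable for the current process before creating a session. A minimal sketch, assuming Spark really lives under Program Files and that 'Progra~1' is the 8.3 short name on your machine (verify with `dir /x C:\`):

```python
import os

# Set SPARK_HOME for this process only, using the 8.3 short name so the
# value contains no spaces. 'Progra~1' usually aliases 'Program Files',
# but the short name can differ per machine -- verify with `dir /x C:\`.
os.environ["SPARK_HOME"] = r"C:\Progra~1\Spark"

# Anything launched from this Python process (e.g. pyspark) now sees the
# space-free path; a spark-shell started from a separate cmd window does
# not, so set the variable system-wide for that case.
```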
Solution 3:
I faced the same issue too. The root cause is the space in the folder path: C:\Program Files\spark-2.4.5-bin-hadoop2.7 as SPARK_HOME. Just move the spark-2.4.5-bin-hadoop2.7 folder to the root of the C: drive, i.e. C:\spark-2.4.5-bin-hadoop2.7, and point SPARK_HOME to the same location. That solves the issue.
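Once Spark sits at a space-free path, a quick smoke test from Python, assuming PySpark and the findspark package are installed (the path matches the example above):

```python
import findspark

# Point findspark at the relocated, space-free installation.
findspark.init(r"C:\spark-2.4.5-bin-hadoop2.7")

from pyspark.sql import SparkSession

# If this starts without the 'is not recognized' error, the fix worked.
spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()
print(spark.version)
spark.stop()
```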