Download the winutils.exe file from winutils and copy it to the %SPARK_HOME%\bin folder. Winutils is different for each Hadoop version, so download the right version for your Hadoop distribution. Now set the following environment variable.

PATH=%PATH%;C:\apps\spark-3.0.0-bin-hadoop2.7\bin

Now open Command Prompt and type the pyspark command to run the PySpark shell.
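If everything is set up correctly, the shell starts with a SparkSession already available as the variable spark. As a quick sanity check inside the shell (a minimal sketch, assuming the Spark 3.0.0 build used above), you can try something like:

# Inside the PySpark shell, `spark` is created for you automatically.
spark.version          # should print the Spark version, e.g. '3.0.0'
df = spark.range(5)    # small DataFrame with a single `id` column
df.show()              # displays rows 0 through 4

If these commands run without errors, PySpark and winutils are configured correctly.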