I'm new to Hadoop and am trying to use the streaming option to develop some jobs in Python on Windows 10, locally.
After double-checking the paths I passed, and even my program itself, I encounter an exception that is not discussed on any page I could find. The exception is:

I will be grateful for any help.
The error comes from either:
- `core-site.xml`'s `fs.defaultFS` value. That needs to be `hdfs://127.0.0.1:9000`, for example, not your Windows filesystem (minimal sketch below). Perhaps you confused it with the `hdfs-site.xml` values for the namenode/datanode data directories.
- `file://c:/path`, not `C:/`, for Hadoop-compatible file paths, especially values passed as `-mapper` or `-reducer` (example command below).

Also, no one really writes MapReduce code anymore. You can run similar code in PySpark, and you don't need Hadoop to run it (see the sketch at the end).
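
For reference, a minimal `core-site.xml` sketch with `fs.defaultFS` pointing at a local namenode. Port `9000` is just a common choice; use whatever port your namenode actually listens on:

```xml
<!-- Minimal sketch of core-site.xml; adjust host/port to your namenode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://127.0.0.1:9000</value>
  </property>
</configuration>
```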
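
And a hypothetical streaming invocation from a Windows command prompt illustrating the `file://` form for the local script paths. The jar version, script names, and HDFS input/output paths below are placeholders, so adjust them to your installation:

```bat
:: Hypothetical hadoop-streaming command; replace versions and paths with your own
hadoop jar %HADOOP_HOME%\share\hadoop\tools\lib\hadoop-streaming-3.3.6.jar ^
  -files file://c:/jobs/mapper.py,file://c:/jobs/reducer.py ^
  -mapper "python mapper.py" ^
  -reducer "python reducer.py" ^
  -input /user/me/input ^
  -output /user/me/output
```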
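
If you go the PySpark route instead, a minimal local word-count sketch looks like this (assumes `pip install pyspark`; the input and output paths are hypothetical):

```python
from pyspark.sql import SparkSession

# Run Spark locally on all cores; no Hadoop cluster required
spark = SparkSession.builder.master("local[*]").appName("wordcount").getOrCreate()

lines = spark.sparkContext.textFile("file:///C:/data/input.txt")   # hypothetical input path
counts = (lines.flatMap(lambda line: line.split())   # split each line into words
               .map(lambda word: (word, 1))          # pair each word with a count of 1
               .reduceByKey(lambda a, b: a + b))     # sum counts per word
counts.saveAsTextFile("file:///C:/data/output")      # hypothetical output directory
spark.stop()
```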