We can start Pig in two modes:
1. Local mode :
Pig runs in standalone local mode, and all files are loaded from and created on the local file system.
To start, move to your Pig installation folder and type:
bin/pig -x local //for local mode
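For example, once the Grunt shell opens in local mode, you can load a file straight from the local disk. The file name and schema below are only a sample, assuming a comma-separated file exists at that path:
grunt> A = LOAD '/tmp/students.txt' USING PigStorage(',') AS (name:chararray, marks:int); //load a local file, splitting fields on commas
grunt> DUMP A; //print the loaded records on the console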
2. Distributed mode :
Pig runs in distributed (MapReduce) mode, and all files are loaded from and created on HDFS.
To start, move to your Pig installation folder and type:
bin/pig -x mapreduce //for distributed mode
So if you start Pig in this mode, all the files you are dealing with or giving a reference to should already be present on HDFS, which you can create or copy there using the Hadoop shell commands.
If the files are not present, Pig will throw a file not found error.
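For example, you can first push the input file to HDFS with the Hadoop shell and then refer to its HDFS path inside Pig. The paths and file names here are only illustrative:
hadoop fs -mkdir /user/hadoop/pigdata //create a folder on hdfs (example path)
hadoop fs -put /tmp/students.txt /user/hadoop/pigdata/ //copy the local file to hdfs
bin/pig -x mapreduce //start pig in distributed mode
grunt> A = LOAD '/user/hadoop/pigdata/students.txt' USING PigStorage(',') AS (name:chararray, marks:int); //load the file from its hdfs path
grunt> DUMP A; //this runs as a mapreduce job on the cluster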