After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes common command-line operations used to connect to, explore, and manage a Hadoop file system.
WSO2 Storage Server ships with a script that supports Hadoop CLI commands. Users must also install the Kerberos tools so they can cache a Kerberos ticket from the KDC server hosted with the Storage Server.
- Cache the user's Kerberos ticket on the system:
$kinit user1
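To confirm that the ticket was cached, you can list the tickets in the credential cache. klist ships with the same MIT Kerberos client tools as kinit:
$klist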
- List HDFS directories:
Change directory to <SS_HOME>. The HDFS directory /user/user1 is created by the admin and made accessible to user1. List its contents recursively:
$bin/hadoop dfs -lsr /user/user1
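If you only need the top-level listing rather than the full recursive output, the standard -ls option of the Hadoop FS shell can be used instead:
$bin/hadoop dfs -ls /user/user1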
- Create an HDFS directory:
From <SS_HOME>, create a directory under /user/user1:
$bin/hadoop dfs -mkdir /user/user1/wso2
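As a quick check on the new directory, you can copy a local file into it and read it back. -put and -cat are standard Hadoop FS shell commands; /tmp/sample.txt below is only an example local path:
$bin/hadoop dfs -put /tmp/sample.txt /user/user1/wso2/
$bin/hadoop dfs -cat /user/user1/wso2/sample.txt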