After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes the common command-line operations used to connect to, explore, and manage a Hadoop file system.
WSO2 Storage Server ships a script (bin/hadoop) that supports the Hadoop CLI commands; run the commands below from <SS_HOME>, the Storage Server installation directory. You must also install the Kerberos client tools, because HDFS operations are authenticated with a Kerberos ticket that you cache from the KDC server hosted with the Storage Server.
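For example, on Debian-based systems the Kerberos client tools (which provide kinit and klist) can be installed as follows; the package name differs on other distributions (for example, krb5-workstation on RHEL-based systems):
$sudo apt-get install krb5-user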
- Cache the user's Kerberos ticket on the system. For example:
$kinit user1
For the super tenant: $kinit <username>/<domain name>
For a user, tenant user, or tenant admin: $kinit <username>_<domain name>
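To confirm that the ticket was cached successfully, you can view the credential cache with klist, which ships with the standard Kerberos client tools:
$klist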
- List HDFS directories:
$bin/hadoop dfs -lsr /user/user1
The home directory for each user is created by the admin under the /user directory and is visible to that user; for example, the home directory of user1 is /user/user1. For a tenant user, the tenant domain is part of the directory name; for example, the home directory of tenant user tuser in the test.test domain is /user/test.test_tuser. To list the contents of your own home directory, use your home directory as the path in the command above.
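The -lsr option lists directory contents recursively. To list only the immediate contents of a directory, you can use the non-recursive -ls option instead:
$bin/hadoop dfs -ls /user/user1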
- Create an HDFS directory:
$bin/hadoop dfs -mkdir /user/user1/wso2
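Once the directory exists, you can copy a file from the local file system into it with the -put option; sample.txt here is only an illustrative local file name:
$bin/hadoop dfs -put sample.txt /user/user1/wso2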
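The Hadoop CLI also supports the other standard file system operations. A few commonly used ones are shown below; the paths and file names are illustrative:
- Display the contents of a file:
$bin/hadoop dfs -cat /user/user1/wso2/sample.txt
- Copy a file from HDFS to the local file system:
$bin/hadoop dfs -get /user/user1/wso2/sample.txt /tmp/sample.txt
- Copy a file within HDFS:
$bin/hadoop dfs -cp /user/user1/wso2/sample.txt /user/user1/sample-copy.txt
- Remove a file:
$bin/hadoop dfs -rm /user/user1/sample-copy.txt
- Remove a directory and its contents recursively:
$bin/hadoop dfs -rmr /user/user1/wso2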