...
After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes some of the common command-line operations used to connect to, explore, and manage a Hadoop file system.
WSO2 Storage Server ships a script that supports Hadoop CLI commands. You must also install Kerberos tools to cache a Kerberos ticket from the KDC server hosted with the Storage Server.

1. Cache the user's Kerberos ticket on the system:
...
```bash
$kinit user1
```
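To verify that the ticket was cached, you can run klist (a standard Kerberos tool; the exact output depends on your Kerberos distribution):

```bash
# List cached Kerberos tickets; the principal for user1 should appear
$klist
```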
2. Change the directory to $CARBON_HOME. (The user's home directory, for example /user/user1, should already be created by the admin and chowned to user1.)
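For example, assuming $CARBON_HOME points to the Storage Server installation directory:

```bash
# Move into the product home so the bundled Hadoop script can be run as bin/hadoop
$cd $CARBON_HOME
```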
3. List HDFS directories:
...
...
- The kinit principal format depends on the user type:
    - If super tenant: $kinit <username>/<domain Name>
    - If the user is a tenant user or tenant admin: $kinit <username>_<domain Name>
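For instance, a hypothetical tenant user tuser in the test.test tenant domain (the same illustrative names used below) would cache a ticket as:

```bash
# Tenant users authenticate with the <username>_<domain Name> principal format
$kinit tuser_test.test
```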
- List HDFS directories: to list the contents of a user's home directory, run the command from <SS_HOME> against the home directory created by the admin. A user's home directory is created under the /user directory; for example, for tenant user tuser it is /user/test.test_tuser.
- List directories:

```bash
$bin/hadoop dfs -lsr /user/user1
```
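The same command works against a tenant user's home directory, and -ls gives a non-recursive listing; a short sketch using the illustrative paths above:

```bash
# Recursively list the tenant user's home directory
$bin/hadoop dfs -lsr /user/test.test_tuser

# Non-recursive listing of a single directory
$bin/hadoop dfs -ls /user/user1
```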
- Create an HDFS directory:

```bash
$bin/hadoop dfs -mkdir /user/user1/wso2
```
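Once the directory exists, data can be moved in and out of it with the standard Hadoop shell commands; a minimal sketch (the local file name is illustrative):

```bash
# Copy a local file into the new HDFS directory
$bin/hadoop dfs -put /tmp/sample.txt /user/user1/wso2/

# Print the file contents from HDFS
$bin/hadoop dfs -cat /user/user1/wso2/sample.txt

# Delete the file when it is no longer needed
$bin/hadoop dfs -rm /user/user1/wso2/sample.txt
```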