HDFS CLI Commands
After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes common command-line operations used to connect to, explore, and manage a Hadoop file system.
WSO2 Storage Server ships with a script that supports Hadoop CLI commands. You must also install the Kerberos tools so that you can cache a Kerberos ticket from the KDC server hosted with Storage Server.
- Cache the user's Kerberos ticket on the system:
   As the super tenant: $kinit <username>/<domain name>
   As a tenant user or tenant admin: $kinit <username>_<domain name>
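For example, caching and then verifying a ticket for a tenant user might look like the following session. The principal user1_example.com is a placeholder, and klist is the standard Kerberos utility for inspecting the credential cache.

```shell
# Cache a Kerberos ticket for a tenant user
# (user1 and example.com are placeholder values for <username> and <domain name>)
kinit user1_example.com

# Verify that the ticket was cached; klist prints the principal
# and the validity period of the cached ticket
klist
```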
- List HDFS directories:
- To list directories in a user's home directory, first navigate to the user's home directory created by the admin under /user. For example, for tenant user tuser, the home directory is /user/test.test_tuser.
- List directories recursively:
$bin/hadoop dfs -lsr /user/user1
- Create an HDFS directory:
$bin/hadoop dfs -mkdir /user/user1/wso2
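Putting the commands above together, a typical session might look like the following sketch. It assumes /user/user1 as the home directory from the examples above; sample.txt is a hypothetical local file used only for illustration.

```shell
# Create a directory under the user's HDFS home directory
bin/hadoop dfs -mkdir /user/user1/wso2

# Copy a local file into the new directory
# (sample.txt is a hypothetical local file)
bin/hadoop dfs -put sample.txt /user/user1/wso2/

# Recursively list the home directory to confirm the file was copied
bin/hadoop dfs -lsr /user/user1
```

Note that the hadoop dfs subcommands require a valid cached Kerberos ticket, so run kinit first if the commands fail with an authentication error.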