HDFS CLI Commands
After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. Some of the common command-line operations used to connect to, explore, and manage a Hadoop file system are explained here.
WSO2 Storage Server ships a script that supports Hadoop CLI commands. You must also install the Kerberos client tools in order to cache a Kerberos ticket from the KDC server hosted within Storage Server.
- Cache the user's Kerberos ticket on the system:
   For a super tenant: $kinit <username>/<domain name>
   For a tenant user or tenant admin: $kinit <username>_<domain name>
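As a sketch of this step, assuming the Kerberos client tools (`kinit`, `klist`) are installed and using a hypothetical super-tenant principal, you can cache a ticket and then confirm it was cached:

```shell
# Cache a Kerberos ticket for a hypothetical super-tenant user "admin"
# in the domain "carbon.super" (you will be prompted for the password).
kinit admin/carbon.super

# Verify the cached ticket: klist prints the default principal and the
# ticket's validity period from the local credential cache.
klist
```

If `klist` shows no credentials or an expired ticket, rerun `kinit` before issuing any Hadoop CLI commands.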
- List HDFS directories:
   The home directory for a user is created under the /user directory; for example, the home directory of tenant user tuser is /user/test.test_tuser. To list the contents of a user's home directory, change directory to <SS_HOME> (created by the admin) and run:
   $bin/hadoop dfs -lsr /user/user1
- Create an HDFS directory:
   $bin/hadoop dfs -mkdir /user/user1/wso2
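Putting the steps above together, a typical session might look like the following sketch. It assumes a hypothetical tenant user "user1", a local file /tmp/sample.txt, and that the commands are run from <SS_HOME>:

```shell
# Cache the Kerberos ticket for the hypothetical tenant user first.
kinit user1_example.com

# Create a directory inside the user's HDFS home directory.
bin/hadoop dfs -mkdir /user/user1/wso2

# Upload a local file into the new directory.
bin/hadoop dfs -put /tmp/sample.txt /user/user1/wso2

# Recursively list the home directory to confirm the directory and file exist.
bin/hadoop dfs -lsr /user/user1
```

The exact principal format and home-directory path depend on your tenant configuration, as described above.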