After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes common command-line operations for connecting to, exploring, and managing a Hadoop file system.

WSO2 Storage Server ships with a script that supports the Hadoop CLI commands. In addition, users must install the Kerberos client tools so they can cache a Kerberos ticket from the KDC server hosted with the Storage Server.
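The Kerberos client tools are provided by the operating system's Kerberos packages, not by Hadoop or the Storage Server. A minimal sketch, assuming a Debian/Ubuntu host whose /etc/krb5.conf already points at the Storage Server's KDC (package names differ on other distributions):

```shell
# Install the MIT Kerberos client utilities (kinit, klist, kdestroy).
# Package name assumes Debian/Ubuntu; on RHEL/CentOS it is typically
# krb5-workstation.
sudo apt-get install krb5-user

# Verify the tools are available and inspect the ticket cache.
klist
```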

  1. Cache the user's Kerberos ticket on the system: $kinit user1
  2. List HDFS directories:
    • Change the directory to <SS_HOME>. The administrator creates an HDFS home directory for each user (here, /user/user1), which is visible to that user.
    • List directories: $bin/hadoop dfs -lsr /user/user1
  3. Create HDFS directory: $bin/hadoop dfs -mkdir /user/user1/wso2
  4. Copy a local file into HDFS: $bin/hadoop dfs -put <local_file> /user/user1/wso2
  5. View the contents of a file in HDFS: $bin/hadoop dfs -cat /user/user1/wso2/<file>
  6. Remove an HDFS directory and its contents: $bin/hadoop dfs -rmr /user/user1/wso2
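Taken together, the steps above form a typical session. The file names and paths below are illustrative assumptions; the `hadoop dfs` subcommands themselves (-lsr, -mkdir, -put, -cat, -rmr) are standard in Hadoop CLI versions of this vintage. This sketch requires a running cluster and KDC, so it cannot be run standalone:

```shell
# Assumed: user1 has a principal in the KDC, and the admin has created
# the HDFS home directory /user/user1. Run from <SS_HOME>.

kinit user1                                      # cache a Kerberos ticket
klist                                            # confirm the ticket is cached

bin/hadoop dfs -lsr /user/user1                  # list the home directory recursively
bin/hadoop dfs -mkdir /user/user1/wso2           # create a directory
bin/hadoop dfs -put data.txt /user/user1/wso2/   # data.txt is an example local file
bin/hadoop dfs -cat /user/user1/wso2/data.txt    # print the file's contents
bin/hadoop dfs -rmr /user/user1/wso2             # remove the directory recursively

kdestroy                                         # discard the ticket when finished
```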