After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes the common command-line operations used to connect to, explore, and manage a Hadoop file system.

WSO2 Storage Server ships with a script that supports Hadoop CLI commands. Users must also install Kerberos tools to cache a Kerberos ticket from the KDC server hosted with Storage Server.

  1. Cache the user's Kerberos ticket to the system: $kinit user1
  2. List HDFS directories:
    • Change to the <SS_HOME> directory. The HDFS home directory /user/user1 is created by the admin and is visible to user1.
    • List the directories recursively: $bin/hadoop dfs -lsr /user/user1
  3. Create an HDFS directory: $bin/hadoop dfs -mkdir /user/user1/wso2
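The steps above can be sketched as a single shell session. This is a minimal example, assuming the user principal user1 exists in the KDC, the commands are run from the Storage Server home directory (<SS_HOME> in the steps above), and a file /tmp/sample.txt exists locally (a hypothetical file used only for illustration). The -put and -cat subcommands are standard Hadoop DFS commands shown here as typical follow-up data management operations; they are not part of the steps above.

```shell
# Cache the Kerberos ticket for user1 from the KDC (prompts for the password)
kinit user1

# Recursively list the contents of user1's HDFS home directory
bin/hadoop dfs -lsr /user/user1

# Create a new HDFS directory under the user's home directory
bin/hadoop dfs -mkdir /user/user1/wso2

# Copy a local file into the new directory, then print its contents
# (/tmp/sample.txt is a hypothetical local file)
bin/hadoop dfs -put /tmp/sample.txt /user/user1/wso2/
bin/hadoop dfs -cat /user/user1/wso2/sample.txt
```

If the Kerberos ticket has not been cached (or has expired), the hadoop dfs commands will fail with an authentication error; rerun kinit to refresh the ticket.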