
After creating a Hadoop file system, you can connect to it, explore it, and perform data management operations. This guide describes common command-line operations used to connect to, explore, and manage a Hadoop file system.

WSO2 Storage Server ships a script that supports Hadoop CLI commands. You also need to install the Kerberos tools so that you can cache a Kerberos ticket from the KDC server hosted with the Storage Server.
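If the Kerberos client tools are not already installed, they are usually available from the operating system's package manager. The package names below are assumptions for Debian/Ubuntu and RHEL/CentOS; your platform's package name may differ.

# Debian/Ubuntu: MIT Kerberos client utilities (kinit, klist, kdestroy)
sudo apt-get install krb5-user

# RHEL/CentOS equivalent
sudo yum install krb5-workstation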

  1. Cache the user's Kerberos ticket on the system:

$kinit user1

     If you are the super tenant, run $kinit <username>/<domain Name>. If you are a tenant user or tenant admin, run $kinit <username>_<domain Name>.
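To confirm that the ticket was cached, you can inspect the credentials cache with klist, which ships with the MIT Kerberos tools. This is a minimal sketch, assuming the user1 principal from the step above.

# Cache the ticket for user1, then verify it appears in the credentials cache
kinit user1
klist

# Discard the cached ticket when you are done
kdestroy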

  2. Change the directory to <SS_HOME> (the Storage Server installation directory, also referred to as $CARBON_HOME). Note that a user's home directory (for example, /user/user1) should be created by the admin and its ownership assigned to that user with chown.



  3. List HDFS directories:

     A user's home directory is created by the admin under the /user directory; for example, the home directory of tenant user tuser is /user/test.test_tuser. To list the contents of a home directory recursively:

$bin/hadoop dfs -lsr /user/user1
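The dfs -ls subcommand lists a single level, while -lsr lists recursively. A brief usage sketch, using the same illustrative paths as this guide:

# List only the top level of the home directory
bin/hadoop dfs -ls /user/user1

# List the home directory tree recursively
bin/hadoop dfs -lsr /user/user1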
  4. Create an HDFS directory:

$bin/hadoop dfs -mkdir /user/user1/wso2
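With a directory in place, a few other dfs subcommands cover common data management tasks. This is a minimal sketch; the local file sample.txt is a hypothetical example.

# Copy a local file into the new HDFS directory
bin/hadoop dfs -put sample.txt /user/user1/wso2

# Print the file's contents to verify the copy
bin/hadoop dfs -cat /user/user1/wso2/sample.txt

# Remove the directory and its contents recursively
bin/hadoop dfs -rmr /user/user1/wso2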