Common Hadoop Shell Commands


dfs is the implementation class of fs: `hadoop fs` works with any file system Hadoop supports, while `hadoop dfs` operates specifically on HDFS. The full option list is:

hadoop dfs
    [-appendToFile <localsrc> ... <dst>]
    [-cat [-ignoreCrc] <src> ...]
    [-checksum <src> ...]
    [-chgrp [-R] GROUP PATH...]
    [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
    [-chown [-R] [OWNER][:[GROUP]] PATH...]
    [-copyFromLocal [-f] [-p] <localsrc> ... <dst>]
    [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-count [-q] <path> ...]
    [-cp [-f] [-p] <src> ... <dst>]
    [-createSnapshot <snapshotDir> [<snapshotName>]]
    [-deleteSnapshot <snapshotDir> <snapshotName>]
    [-df [-h] [<path> ...]]
    [-du [-s] [-h] <path> ...]
    [-expunge]
    [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-getfacl [-R] <path>]
    [-getmerge [-nl] <src> <localdst>]
    [-help [cmd ...]]
    [-ls [-d] [-h] [-R] [<path> ...]]
    [-mkdir [-p] <path> ...]
    [-moveFromLocal <localsrc> ... <dst>]
    [-moveToLocal <src> <localdst>]
    [-mv <src> ... <dst>]
    [-put [-f] [-p] <localsrc> ... <dst>]
    [-renameSnapshot <snapshotDir> <oldName> <newName>]
    [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
    [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
    [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
    [-setrep [-R] [-w] <rep> <path> ...]
    [-stat [format] <path> ...]
    [-tail [-f] <file>]
    [-test -[defsz] <path>]
    [-text [-ignoreCrc] <src> ...]
    [-touchz <path> ...]
    [-usage [cmd ...]]

Common operations:

List a directory: hadoop dfs -ls [-R] /
Create a directory: hadoop dfs -mkdir [-p] /user/files
Move (cut and paste) a local file to HDFS: hadoop dfs -moveFromLocal /home/hello.txt /user/files/
Append a local file to the end of an HDFS file: hadoop dfs -appendToFile /home/word.txt /user/files/hello.txt
View file contents: hadoop dfs -cat /user/files/hello.txt
Change a file's group: hadoop dfs -chgrp hadoop /user/files/hello.txt
Change a file's owner: hadoop dfs -chown hive:hive /user/files/hello.txt
Change file permissions: hadoop dfs -chmod 755 /user/files/hello.txt
Copy a local file to HDFS: hadoop dfs -copyFromLocal /home/hi.txt /user/files/  or  hadoop dfs -put /home/hi.txt /user/files/
Download from HDFS to local: hadoop dfs -copyToLocal /user/files/hello.txt /home/text/  or  hadoop dfs -get /user/files/hi.txt /home/text/
Copy a file: hadoop dfs -cp /user/files/hello.txt /user/
Move a file: hadoop dfs -mv /user/files/hello.txt /user/
Merge multiple files and download them to local: hadoop dfs -getmerge /user/files/* /home/text/merge.txt
Show the end of a file: hadoop dfs -tail /user/files/hello.txt
Delete a file or directory: hadoop dfs -rm [-r] /user/files/hi.txt
Delete an empty directory: hadoop dfs -rmdir /user/files
Show per-entry sizes under a directory: hadoop dfs -du -h /user
Show the total size of a directory: hadoop dfs -du -h -s /user
Set the replication factor of an HDFS file to 2: hadoop dfs -setrep 2 /user/files/hello.txt
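The commands above can be chained into a script. Below is a minimal sketch of a typical workflow (create a directory, upload a file, inspect it, adjust permissions and replication, then report sizes). The local file /home/demo/hello.txt and the HDFS directory /user/files are hypothetical example paths, not paths from any real cluster; `hadoop fs` is used here, which behaves the same as `hadoop dfs` when the target is HDFS.

#!/usr/bin/env bash
# Minimal HDFS workflow sketch using the shell commands listed above.
# Paths are hypothetical; adjust them to your own cluster layout.
set -e

LOCAL_FILE=/home/demo/hello.txt   # hypothetical local file
HDFS_DIR=/user/files              # hypothetical HDFS target directory

# Create the target directory (with parents) if it does not exist yet.
hadoop fs -mkdir -p "$HDFS_DIR"

# Upload the local file; -f overwrites an existing copy.
hadoop fs -put -f "$LOCAL_FILE" "$HDFS_DIR/"

# -test -e returns 0 only if the path exists, so the block below runs
# only after a successful upload.
if hadoop fs -test -e "$HDFS_DIR/hello.txt"; then
    # Print the whole file, then the last kilobyte.
    hadoop fs -cat  "$HDFS_DIR/hello.txt"
    hadoop fs -tail "$HDFS_DIR/hello.txt"

    # Adjust permissions and raise the replication factor to 2.
    hadoop fs -chmod 644 "$HDFS_DIR/hello.txt"
    hadoop fs -setrep 2  "$HDFS_DIR/hello.txt"
fi

# Report the total size of the directory in human-readable units.
hadoop fs -du -h -s "$HDFS_DIR"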

     
