1. Background
I've recently been brushing up on big data, so I decided to set up a big data stack (Hadoop, Hive, Spark) on my MacBook Pro. Installing with brew is straightforward and went smoothly, but when I started the services I hit the following error:
2023-06-14 23:24:03,271 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /Users/liyang/Documents/hadoop/tmp/dfs/name/in_use.lock acquired by nodename 26147@liyangdeMacBook-Pro.local
2023-06-14 23:24:03,275 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Encountered exception loading fsimage
java.io.FileNotFoundException: /Users/liyang/Documents/hadoop/tmp/dfs/name/current/VERSION (Operation not permitted)
at java.base/java.io.RandomAccessFile.open0(Native Method)
at java.base/java.io.RandomAccessFile.open(RandomAccessFile.java:346)
at java.base/java.io.RandomAccessFile.<init>(RandomAccessFile.java:260)
at java.base/java.io.RandomAccessFile.<init>(RandomAccessFile.java:215)
at org.apache.hadoop.hdfs.server.common.StorageInfo.readPropertiesFile(StorageInfo.java:250)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.readProperties(NNStorage.java:672)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:404)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:243)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1201)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:779)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:681)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:768)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1020)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:995)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1769)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1834)
2023-06-14 23:24:03,283 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.w.WebAppContext@5d10455d{hdfs,/,null,STOPPED}{file:/usr/local/Cellar/hadoop/3.3.4/libexec/share/hadoop/hdfs/webapps/hdfs}
The error log points to a macOS permission problem: the NameNode cannot open its VERSION file ("Operation not permitted"). macOS ships with SIP (System Integrity Protection) enabled by default, and to run Hadoop from this directory without interference, SIP needs to be turned off.
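Before changing anything, it may be worth confirming the diagnosis. A quick check using standard macOS tools (the path below is the one from the log above):

```shell
# Show whether SIP is currently enabled (macOS built-in)
csrutil status

# Inspect permissions on the file the NameNode failed to open
ls -l /Users/liyang/Documents/hadoop/tmp/dfs/name/current/VERSION
```

If `csrutil status` reports SIP enabled and the file permissions otherwise look normal, SIP is the likely culprit.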
2. Disabling SIP
Reboot the Mac and press and hold Command + R during startup to enter Recovery Mode, then open Terminal and run the following command to disable SIP:
csrutil disable
Then reboot the machine again; the HDFS NameNode should now start normally.
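After the reboot, you can confirm SIP is off and bring HDFS back up. A sketch, assuming the brew-installed Hadoop 3.3.4 layout shown in the log above (adjust the path to your install):

```shell
# Confirm SIP is now disabled
csrutil status
# Should print: System Integrity Protection status: disabled.

# Start HDFS; start-dfs.sh launches the NameNode and DataNodes
/usr/local/Cellar/hadoop/3.3.4/libexec/sbin/start-dfs.sh

# Verify the NameNode JVM is running
jps | grep NameNode
```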
3. Re-enabling SIP
To restore the SIP protection mechanism, follow the same steps as above (boot into Recovery Mode and open Terminal), but run this command instead:
csrutil enable
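After rebooting, a final check confirms protection is back on:

```shell
# Verify SIP has been restored
csrutil status
# Should print: System Integrity Protection status: enabled.
```

Note that with SIP re-enabled, the original permission error may return if Hadoop's data directories are still in a location SIP restricts.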