I downloaded the Hadoop source code from GitHub and compiled it with the native option:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
I then copied the .dylib files to $HADOOP_HOME/lib:
cp -p hadoop-common-project/hadoop-common/target/hadoop-common-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
LD_LIBRARY_PATH was updated and HDFS was restarted:
echo $LD_LIBRARY_PATH
/usr/local/Cellar/hadoop/2.7.2/libexec/lib:
/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/lib:/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home//jre/lib
(Note: this also means that the answer to Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark? does not work for me.)
But checknative still returns false across the board:
$stop-dfs.sh && start-dfs.sh && hadoop checknative
16/06/13 16:12:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkbook]
sparkbook: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkbook]
sparkbook: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-namenode-sparkbook.out
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-datanode-sparkbook.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-secondarynamenode-sparkbook.out
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
4 Answers
To get this working on a fresh install of macOS 10.12, I had to do the following:

There are a couple of missing steps in @andrewdotn's reply:

1) For step (3), create the patch by pasting the posted text into a text file, e.g. "patch.txt", then execute "git apply patch.txt".

2) In addition to copying the files as directed by javadba, some applications also require you to set: JAVA_LIBRARY_PATH
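A minimal sketch of those two steps, assuming the posted patch text has been saved as patch.txt in the Hadoop source tree, and reusing the brew library path from the question (your version directory will differ):

```shell
# 1) apply the posted patch from inside the hadoop source tree
git apply patch.txt

# 2) point Hadoop at the native libraries, e.g. in
#    $HADOOP_HOME/libexec/etc/hadoop/hadoop-env.sh
#    (path below is the one used in the question; adjust for your install)
export JAVA_LIBRARY_PATH="/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib"
```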
The required step is to copy the *.dylib files from the git sources build dir into your platform's $HADOOP_HOME/<common dir>/lib directory. For OS/X installed via brew it is:

We can now see the required libraries there:
Now the hadoop checknative command works:

As an update to @andrewdotn's answer, here is the patch.txt file that works with Hadoop 2.8.1:
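Once that patch file is saved, the rebuild cycle is the same as in the question (a sketch only; the patch contents themselves are not reproduced here):

```shell
# apply the 2.8.1 patch, rebuild the native libraries, then re-check
git apply patch.txt
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
hadoop checknative
```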