
Sqoop imports the data but the replication issue persists even after changing hdfs-site.xml properties


Sqoop imports the data, but the replication issue remains even after changing the hdfs-site.xml properties.

Can anyone let me know which property file changes are needed?

Thanks,

Command:

C:\hadoop\hdp\sqoop-1.4.6.2.4.0.0-169>SQOOP import --connect jdbc:oracle:thin:@nile:1527:huiprd --username hud_reader --password hud_reader_n1le --table PWRLINE_COPY.DATAAGGRUN --m 1

Error message:

Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
16/04/22 08:50:40 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.0.0-169
16/04/22 08:50:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/04/22 08:50:40 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
16/04/22 08:50:40 INFO manager.SqlManager: Using default fetchSize of 1000
16/04/22 08:50:40 INFO tool.CodeGenTool: Beginning code generation
16/04/22 08:50:41 INFO manager.OracleManager: Time zone has been set to GMT
16/04/22 08:50:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM PWRLINE_COPY.DATAAGGRUN t WHERE 1=0
16/04/22 08:50:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is c:\hadoop\hdp\hadoop-2.7.1.2.4.0.0-169
Note: \tmp\sqoop-sahus\compile\f1f5245c3a8fbf8c7782e696f3662575\PWRLINE_COPY_DATAAGGRUN.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/04/22 08:50:45 INFO orm.CompilationManager: Writing jar file: \tmp\sqoop-sahus\compile\f1f5245c3a8fbf8c7782e696f3662575\PWRLINE_COPY.DATAAGGRUN.jar
16/04/22 08:50:45 INFO manager.OracleManager: Time zone has been set to GMT
16/04/22 08:50:46 INFO manager.OracleManager: Time zone has been set to GMT
16/04/22 08:50:46 INFO mapreduce.ImportJobBase: Beginning import of PWRLINE_COPY.DATAAGGRUN
16/04/22 08:50:46 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/04/22 08:50:46 INFO manager.OracleManager: Time zone has been set to GMT
16/04/22 08:50:47 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/04/22 08:50:48 INFO impl.TimelineClientImpl: Timeline service address: http://cc-wvd-ap161.pepcoholdings.biz:8188/ws/v1/timeline/
16/04/22 08:50:49 INFO client.RMProxy: Connecting to ResourceManager at cc-wvd-ap161.pepcoholdings.biz/161.186.159.156:8032
16/04/22 08:50:50 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/sahus/.staging/job_1461298205218_0003
16/04/22 08:50:50 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.ipc.RemoteException(java.io.IOException): file /user/sahus/.staging/job_1461298205218_0003/libjars/xz-1.0.jar. Requested replication 10 exceeds maximum 3
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.verifyReplication(BlockManager.java:988)
    at org.apache.hadoop.hdfs.server.namenode.FSDirAttrOp.setReplication(FSDirAttrOp.java:138)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setReplication(FSNamesystem.java:1968)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setReplication(NameNodeRpcServer.java:740)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setReplication(ClientNamenodeProtocolServerSideTranslatorPB.java:440)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)

    at org.apache.hadoop.ipc.Client.call(Client.java:1427)
    at org.apache.hadoop.ipc.Client.call(Client.java:1358)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy14.setReplication(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setReplication(ClientNamenodeProtocolTranslatorPB.java:349)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy15.setReplication(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.setReplication(DFSClient.java:1902)
    at org.apache.hadoop.hdfs.DistributedFileSystem$9.doCall(DistributedFileSystem.java:517)
    at org.apache.hadoop.hdfs.DistributedFileSystem$9.doCall(DistributedFileSystem.java:513)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setReplication(DistributedFileSystem.java:513)
    at org.apache.hadoop.mapreduce.JobResourceUploader.copyRemoteFiles(JobResourceUploader.java:204)
    at org.apache.hadoop.mapreduce.JobResourceUploader.uploadFiles(JobResourceUploader.java:128)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:95)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:190)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:444)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)

1 Answer

  • 0

    The statement in the error log

    Requested replication 10 exceeds maximum 3

    refers to the property mapreduce.client.submit.file.replication, which defaults to 10 and can be changed in mapred-site.xml; see the sketch below.
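
    For example, a minimal mapred-site.xml sketch (assuming the cluster's maximum allowed replication is 3, as the error message indicates; adjust the value to whatever your cluster's dfs.replication.max permits) on the client that submits the Sqoop job:

        <!-- keep the replication requested for submitted job files (libjars etc.) within the cluster limit -->
        <property>
          <name>mapreduce.client.submit.file.replication</name>
          <value>3</value>
        </property>

    Alternatively, the same property should also be settable per job through Sqoop's generic Hadoop arguments, e.g. sqoop import -D mapreduce.client.submit.file.replication=3 --connect ... (the -D option has to come before the tool-specific arguments).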
