When I run a Spark job, I can see the SSL key password and keystore password in plain text in the event logs. Can you help me figure out how to hide these passwords in the logs?

I did find https://issues.apache.org/jira/browse/SPARK-16796, where this seems to have been fixed so that the values are hidden in the Web UI, but I am not sure how to fix it in the logs.
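The closest lead I have found so far is the spark.redaction.regex property, which as far as I can tell was only added in Spark 2.2.0, so it may not exist in my 2.1.1. My understanding is that any Spark property whose key matches that regex gets its value replaced with "*********(redacted)" in the environment listing. A minimal sketch of what I would try, assuming an upgrade to 2.2.0+ and assuming the redaction also applies to the event log:

    spark-submit \
      --conf "spark.redaction.regex=(?i)secret|password" \
      ...rest of the job arguments...

The default pattern is already (?i)secret|password, and spark.ssl.keyPassword, spark.ssl.keyStorePassword and spark.ssl.trustStorePassword all contain "Password", so they should match. Can anyone confirm whether this covers the event log as well, or whether there is an equivalent for 2.1.1?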

Thanks a lot for any help!!

Here is the relevant excerpt from the event log (I have masked the actual passwords as xxxxxx):

    {"Event":"SparkListenerLogStart","Spark Version":"2.1.1"}
    {"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"driver","Host":"xx.xxx.xx.xxx","Port":43556},"Maximum Memory":434031820,"Timestamp":1512750709305}
    {"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java Home":"/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.141-1.b16.32.amzn1.x86_64/jre","Java Version":"1.8.0_141 (Oracle Corporation)","Scala Version":"version 2.11.8"},"Spark Properties":{"spark.sql.warehouse.dir":"hdfs:///user/spark/warehouse","spark.yarn.dist.files":"file:/etc/spark/conf/hive-site.xml","spark.executor.extraJavaOptions":"-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:OnOutOfMemoryError='kill -9 %p'","spark.driver.host":"xx.xxx.xx.xxx","spark.serializer.objectStreamReset":"100","spark.history.fs.logDirectory":"hdfs:///var/log/spark/apps","spark.eventLog.enabled":"true","spark.driver.port":"44832","spark.shuffle.service.enabled":"true","spark.rdd.compress":"true","spark.driver.extraLibraryPath":"/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native","spark.ssl.keyStore":"/usr/share/aws/emr/security/conf/keystore.jks","spark.executorEnv.PYTHONPATH":"{}/pyspark.zip{}/py4j-0.10.4-src.zip","spark.ssl.enabled":"true","spark.yarn.historyServer.address":"ip-xx-xxx-xx-xxx.xxx.com:18080","spark.ssl.trustStore":"/usr/share/aws/emr/security/conf/truststore.jks","spark.app.name":"claim_line_fact_main","spark.scheduler.mode":"FIFO","spark.network.sasl.serverAlwaysEncrypt":"true","spark.ssl.keyPassword":"xxxxxx","spark.ssl.keyStorePassword":"xxxxxx","spark.executor.id":"driver","spark.driver.extraJavaOptions":"-XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:OnOutOfMemoryError='kill -9 %p'","spark.submit.deployMode":"client","spark.master":"yarn","spark.authenticate.enableSaslEncryption":"true","spark.authenticate":"true","spark.ui.filters":"org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter","spark.executor.extraLibraryPath":"/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native","spark.sql.hive.metastore.sharedPrefixes":"com.amazonaws.services.dynamodbv2","spark.executor.memory":"5120M","spark.driver.extraClassPath":"/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*","spark.eventLog.dir":"hdfs:///var/log/spark/apps","spark.ssl.protocol":"TLSv1.2","spark.dynamicAllocation.enabled":"true","spark.executor.extraClassPath":"/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*","spark.executor.cores":"4","spark.history.ui.port":"18080","spark.driver.appUIAddress":"http://","spark.yarn.isPython":"true","spark.ssl.trustStorePassword":"xxxxxx","spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS":"ip-xx-xxx-xx-xxx.xxx.com","spark.ssl.enabledAlgorithms":"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA","spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES":"
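For what it's worth, this is roughly how I am checking the event log for leaked values (spark.eventLog.dir is hdfs:///var/log/spark/apps as shown above; the application ID is just an example, and I am assuming the log file is not compressed):

    hdfs dfs -cat /var/log/spark/apps/application_1512750000000_0001 \
      | grep -io 'spark\.ssl\.[a-z]*password[^,]*'

Both spark.ssl.keyPassword and spark.ssl.keyStorePassword come back in clear text from the SparkListenerEnvironmentUpdate event.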