When I start spark-shell with the following command: bin/spark-shell --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0 --repositories http://central.maven.org/maven2/org/apache/bahir/spark- ...
This looks like an issue in your environment. There are many possible causes of a javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException. One common cause is a mismatch between the host named in the request URL (for example, an IP address) and the names in the server's certificate (usually DNS host names): the request fails because the certificate lacks a Subject Alternative Name entry for the alias used to reach the server, i.e. the server was accessed by a name other than the one the certificate was issued for.
javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException
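The mismatch described above can be illustrated with a minimal sketch of how TLS hostname verification compares the requested host against the certificate's DNS-type Subject Alternative Names. This is not the JDK's actual implementation (which follows RFC 6125 and also handles IP-type SAN entries); the host names below are hypothetical.

```java
import java.util.List;

public class SanCheck {
    // Returns true if `host` matches one of the DNS names in `sans`.
    // Supports a single left-most wildcard label, e.g. "*.example.com".
    static boolean matches(String host, List<String> sans) {
        for (String san : sans) {
            if (san.startsWith("*.")) {
                int dot = host.indexOf('.');
                // A wildcard covers exactly one label: it matches
                // a.example.com but not a.b.example.com.
                if (dot > 0 && host.substring(dot + 1).equalsIgnoreCase(san.substring(2))) {
                    return true;
                }
            } else if (san.equalsIgnoreCase(host)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<String> sans = List.of("repo.example.com", "*.cdn.example.com");
        System.out.println(matches("repo.example.com", sans));   // true
        System.out.println(matches("a.cdn.example.com", sans));  // true
        // Reaching the server by raw IP fails: only DNS-type SAN entries are
        // present, so the handshake ends in a CertificateException.
        System.out.println(matches("10.0.0.5", sans));           // false
    }
}
```

This is why the same server can work when addressed by its certificate's DNS name yet fail when addressed by IP or by an internal alias.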
The problem can be solved in several ways. The following links describe some alternatives:
https://support.mulesoft.com/s/article/CertificateException-No-Subject-Alternative-Names-Present
https://support.cloudbees.com/hc/en-us/articles/360017693231-Why-am-I-getting-No-subject-alternative-DNS-name-matching-XXX-when-connecting-through-ldaps-
https://confluence.atlassian.com/confkb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-452100730.html
https://confluence.atlassian.com/jirakb/java-security-cert-certificateexception-no-subject-alternative-dns-name-matching-hostname-found-297669411.html
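A recurring theme in the fixes above is making the JVM trust the server's certificate via a custom truststore. As a hedged sketch, the standard `javax.net.ssl.trustStore` system properties can be set either programmatically or on the spark-shell command line; the path and password below are hypothetical placeholders, and the truststore itself would be built with `keytool -importcert` as the linked articles describe.

```java
public class TrustStoreSetup {
    public static void main(String[] args) {
        // Equivalent to passing the properties as JVM options, e.g.:
        //   spark-shell --driver-java-options \
        //     "-Djavax.net.ssl.trustStore=/path/to/truststore.jks"
        // Placeholder path and the keytool default password "changeit".
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");
        System.out.println(System.getProperty("javax.net.ssl.trustStore"));
    }
}
```

Note that these properties must be set before the first TLS connection is made, which is why passing them as JVM options to the driver is usually the safer route.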
I was able to add the module to spark-shell. Please find the transcript below.
C:\Users\XYzUser>spark-shell --repositories http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.3.0
http://central.maven.org/maven2/org/apache/bahir/spark-streaming-mqtt_2.11/2.3.0/ added as a remote repository with the name: repo-1
Ivy Default Cache set to: C:\Users\..\.ivy2\cache
The jars for the packages stored in: C:\Users\..\.ivy2\jars
:: loading settings :: url = jar:file:/C:/Tools/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.bahir#spark-streaming-mqtt_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb;1.0
        confs: [default]
        found org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 in central
        found org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 in central
        found org.spark-project.spark#unused;1.0.0 in user-list
:: resolution report :: resolve 7200ms :: artifacts dl 16ms
        :: modules in use:
        org.apache.bahir#spark-streaming-mqtt_2.11;2.3.0 from central in [default]
        org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.1.0 from central in [default]
        org.spark-project.spark#unused;1.0.0 from user-list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   3   |   1   |   1   |   0   ||   3   |   0   |
        ---------------------------------------------------------------------
:: problems summary ::
:::: ERRORS
        unknown resolver null
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent-73c724b4-c15c-45a8-89df-f492b2eb6feb
        confs: [default]
        0 artifacts copied, 3 already retrieved (0kB/31ms)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://...:4040
Spark context available as 'sc' (master = local[*], app id = local-1552454258705).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/