HiveContext throws OutOfMemoryError: PermGen space

Symptom:
Here is a simple piece of code that uses HiveContext to run a query:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LocalEnvHiveTest {
  def main(args: Array[String]): Unit = {
    // Quiet Spark's chatty INFO logging
    Logger.getLogger("org").setLevel(Level.WARN)
    val conf = new SparkConf().setMaster("local[*]").setAppName("RunOnVMHive")
    val sc = new SparkContext(conf)
    val hiveContext = new HiveContext(sc)
    // sql() returns a DataFrame in Spark 1.x, not an RDD
    val df = hiveContext.sql("show databases")
    df.foreach(println)
    sc.stop()
  }
}

Yet this seemingly flawless code fails, with the following error:

18/04/24 16:07:08 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
    at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
    at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
    at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
    at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:226)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:229)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at com.env.LocalEnvHiveTest$.main(LocalEnvHiveTest.scala:23)
    at com.env.LocalEnvHiveTest.main(LocalEnvHiveTest.scala)
Caused by: java.lang.OutOfMemoryError: PermGen space
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
    ... 13 more

Process finished with exit code 1

The key piece of information here is:

Caused by: java.lang.OutOfMemoryError: PermGen space
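On JDK 7 and earlier, the permanent generation is a fixed-size region that holds class metadata, and its default maximum is typically well under 100 MB. As the stack trace shows (IsolatedClientLoader.createClient), HiveContext loads a large number of Hive and DataNucleus classes through an isolated classloader, which can exhaust that region by itself. If you want to watch PermGen fill up, here is a minimal sketch (an illustrative helper, not part of the original post) that prints the JVM's memory pools via the standard java.lang.management API:

import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

object PermGenUsage {
  def main(args: Array[String]): Unit = {
    // On JDK 7 one pool is named "PS Perm Gen" (or "Perm Gen", depending
    // on the collector); on JDK 8+ it is replaced by "Metaspace".
    for (pool <- ManagementFactory.getMemoryPoolMXBeans.asScala) {
      val u = pool.getUsage
      println(s"${pool.getName}: used=${u.getUsed / 1024 / 1024} MB, max=${u.getMax / 1024 / 1024} MB")
    }
  }
}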
Solution 1:
Switch the runtime to JDK 8 and the error disappears. JDK 8 removed the permanent generation entirely and stores class metadata in native-memory Metaspace, so a PermGen OutOfMemoryError can no longer occur.
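If you are unsure which JDK the IDE actually launches the program with, a quick check (a hypothetical helper, not from the original post) is to print the JVM's own system properties:

object JvmVersionCheck {
  def main(args: Array[String]): Unit = {
    // "1.7.0_xx" still has a permanent generation; "1.8.0_xx" uses Metaspace
    println("java.version = " + System.getProperty("java.version"))
    println("java.vm.name = " + System.getProperty("java.vm.name"))
  }
}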
 
Solution 2:
In IDEA, open Run => Edit Configurations…
In the dialog that opens, enter the following under VM options:
-server -XX:PermSize=256m -XX:MaxPermSize=512m
Then save the configuration.
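To confirm the flags actually reached the JVM, a small sketch (illustrative, not part of the original post) can list the startup arguments via RuntimeMXBean:

import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

object VmOptionsCheck {
  def main(args: Array[String]): Unit = {
    // Prints the flags the JVM was started with; -XX:PermSize=256m and
    // -XX:MaxPermSize=512m should appear if the run configuration took effect.
    ManagementFactory.getRuntimeMXBean.getInputArguments.asScala.foreach(println)
  }
}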
Run the program again. It now logs the error below, but the job itself completes successfully, so the message can be ignored (on Windows, Spark's shutdown hook often fails to delete its temp directory because some files in it are still locked):
18/04/24 16:03:52 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
18/04/24 16:03:52 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\superman\AppData\Local\Temp\spark-eb35b3d4-a7f5-4268-bee6-fcc6c98a88a4
java.io.IOException: Failed to delete: C:\Users\superman\AppData\Local\Temp\spark-eb35b3d4-a7f5-4268-bee6-fcc6c98a88a4
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:928)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1741)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
Reference: https://blog.csdn.net/xiao_jun_0820/article/details/45038205