Spark job fails when run in local mode from Eclipse. The run aborts with java.lang.UnsatisfiedLinkError in NativeIO$Windows while the HiveContext initializes; full log below, and a possible workaround is sketched after the log:
2019-01-07 17:36:26 [DEBUG]jobKeys: [MJ000000016]
2019-01-07 17:36:26 [INFO]unparsed key: name
2019-01-07 17:36:26 [INFO]unparsed key: id
19/01/07 17:36:27 INFO spark.SparkContext: Running Spark version 1.6.0
19/01/07 17:36:27 INFO spark.SecurityManager: Changing view acls to: Administrator
19/01/07 17:36:27 INFO spark.SecurityManager: Changing modify acls to: Administrator
19/01/07 17:36:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator)
19/01/07 17:36:27 INFO util.Utils: Successfully started service 'sparkDriver' on port 52820.
19/01/07 17:36:27 INFO slf4j.Slf4jLogger: Slf4jLogger started
19/01/07 17:36:27 INFO Remoting: Starting remoting
19/01/07 17:36:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@20.0.0.13:52833]
19/01/07 17:36:27 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@20.0.0.13:52833]
19/01/07 17:36:27 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 52833.
19/01/07 17:36:28 INFO spark.SparkEnv: Registering MapOutputTracker
19/01/07 17:36:28 INFO spark.SparkEnv: Registering BlockManagerMaster
19/01/07 17:36:28 INFO storage.DiskBlockManager: Created local directory at C:\Users\Administrator\AppData\Local\Temp\blockmgr-65503f0d-5a3b-43e9-aac1-1fa35e847744
19/01/07 17:36:28 INFO storage.MemoryStore: MemoryStore started with capacity 969.8 MB
19/01/07 17:36:28 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/01/07 17:36:28 INFO server.Server: jetty-8.y.z-SNAPSHOT
19/01/07 17:36:28 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
19/01/07 17:36:28 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/01/07 17:36:28 INFO ui.SparkUI: Started SparkUI at http://20.0.0.13:4040
19/01/07 17:36:28 INFO executor.Executor: Starting executor ID driver on host localhost
19/01/07 17:36:28 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52842.
19/01/07 17:36:28 INFO netty.NettyBlockTransferService: Server created on 52842
19/01/07 17:36:28 INFO storage.BlockManagerMaster: Trying to register BlockManager
19/01/07 17:36:28 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:52842 with 969.8 MB RAM, BlockManagerId(driver, localhost, 52842)
19/01/07 17:36:28 INFO storage.BlockManagerMaster: Registered BlockManager
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 273.8 KB, free 969.6 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 21.8 KB, free 969.6 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.8 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 0 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 273.8 KB, free 969.3 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 21.8 KB, free 969.3 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.8 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 1 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 273.8 KB, free 969.0 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 21.8 KB, free 969.0 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.8 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 2 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 273.8 KB, free 968.7 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 21.8 KB, free 968.7 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.8 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 3 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 273.8 KB, free 968.4 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 21.8 KB, free 968.4 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.7 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 4 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 273.8 KB, free 968.1 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 21.8 KB, free 968.1 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.7 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 5 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_6 stored as values in memory (estimated size 273.8 KB, free 967.8 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 21.8 KB, free 967.8 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.7 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 6 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_7 stored as values in memory (estimated size 273.8 KB, free 967.6 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 21.8 KB, free 967.5 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.7 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 7 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_8 stored as values in memory (estimated size 273.8 KB, free 967.3 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 21.8 KB, free 967.2 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.6 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 8 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_9 stored as values in memory (estimated size 273.8 KB, free 967.0 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 21.8 KB, free 967.0 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.6 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 9 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_10 stored as values in memory (estimated size 273.8 KB, free 966.7 MB)
19/01/07 17:36:29 INFO storage.MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 21.8 KB, free 966.7 MB)
19/01/07 17:36:29 INFO storage.BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:52842 (size: 21.8 KB, free: 969.6 MB)
19/01/07 17:36:29 INFO spark.SparkContext: Created broadcast 10 from newAPIHadoopRDD at RowkeyUtils.scala:82
19/01/07 17:36:30 INFO hive.HiveContext: Initializing execution hive, version 1.1.0
19/01/07 17:36:30 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0-cdh5.13.0
19/01/07 17:36:30 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.13.0
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:316)
    at org.apache.hadoop.hive.ql.exec.Utilities.createDirsWithPermission(Utilities.java:3979)
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:659)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:606)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:547)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:220)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:210)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:465)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:464)
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:342)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:103)
    at com.bdqf.itep.df.mj.gd.intensity.weight.task.FullLoadSectionRate.execute(FullLoadSectionRate.java:163)
    at com.bdqf.itep.df.mj.base.BaseJob.process(BaseJob.java:252)
    at com.bdqf.itep.df.mj.gd.intensity.weight.task.FullLoadSectionRate.main(FullLoadSectionRate.java:353)
19/01/07 17:36:30 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
19/01/07 17:36:30 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
19/01/07 17:36:30 INFO ui.SparkUI: Stopped Spark web UI at http://20.0.0.13:4040
19/01/07 17:36:30 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/01/07 17:36:30 INFO storage.MemoryStore: MemoryStore cleared
19/01/07 17:36:30 INFO storage.BlockManager: BlockManager stopped
19/01/07 17:36:30 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/01/07 17:36:30 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/01/07 17:36:30 INFO spark.SparkContext: Successfully stopped SparkContext
19/01/07 17:36:30 INFO util.ShutdownHookManager: Shutdown hook called
19/01/07 17:36:30 INFO util.ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-a6d60985-d815-4f77-8fe7-edf2ad198c88
19/01/07 17:36:30 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
19/01/07 17:36:30 INFO util.ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-5487e705-42ed-4ef1-8950-2cc8babce1b1
19/01/07 17:36:30 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
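The job dies at the first HiveContext: Hive creates its local scratch directories through Hadoop's native Windows I/O layer (NativeIO$Windows.createDirectoryWithMode0), and an UnsatisfiedLinkError there typically indicates that hadoop.dll / winutils.exe are not visible to the JVM, or were built for a different Hadoop version than the 2.6.0-cdh5.13.0 reported in the log. Below is a minimal sketch of the usual local-mode workaround, assuming matching winutils binaries are unpacked under the hypothetical path C:\hadoop (the class name and paths are illustrative, not from the log):

// Sketch only: set hadoop.home.dir before any Spark/Hive class touches NativeIO.
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class LocalHiveBootstrap {
    public static void main(String[] args) {
        // Hadoop's Windows shims resolve HADOOP_HOME (or the hadoop.home.dir
        // system property) and load %HADOOP_HOME%\bin\hadoop.dll.
        // C:\hadoop is an assumed location for binaries matching 2.6.0-cdh5.13.0.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        SparkConf conf = new SparkConf()
                .setAppName("local-hive-smoke-test")
                .setMaster("local[*]");
        SparkContext sc = new SparkContext(conf);

        // This is the constructor that fails in the log (HiveContext.scala:103,
        // called from FullLoadSectionRate.java:163); with matching native
        // binaries on hadoop.home.dir it should initialize cleanly.
        HiveContext hive = new HiveContext(sc);
        hive.sql("SHOW DATABASES").show();

        sc.stop();
    }
}

If the link error persists with matching binaries, two common follow-ups (both paths again assumptions for illustration): add the directory containing hadoop.dll to the JVM's native library path via -Djava.library.path=C:\hadoop\bin in the Eclipse run configuration, and, since the trace fails inside SessionState.createRootHDFSDir, make the Hive scratch directory writable with winutils.exe chmod -R 777 \tmp\hive.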