Spark + Hive: metastore not found when running code directly from Eclipse

2022-05-07 14:04:45

When developing Spark in Eclipse, especially against Hive, you often hit the problem that the Hive metastore cannot be found. When that happens, Hive silently creates a temporary local metastore (backed by Derby), printing INFO messages like the following:

15/12/24 20:46:02 INFO HiveContext: Initializing execution hive, version 1.2.1
15/12/24 20:46:02 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
15/12/24 20:46:02 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
15/12/24 20:46:03 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/12/24 20:46:03 INFO ObjectStore: ObjectStore, initialize called
15/12/24 20:46:03 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/12/24 20:46:03 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/12/24 20:46:17 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/12/24 20:46:19 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/12/24 20:46:19 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/12/24 20:46:28 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/12/24 20:46:28 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/12/24 20:46:30 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
15/12/24 20:46:30 INFO ObjectStore: Initialized ObjectStore
15/12/24 20:46:31 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/12/24 20:46:31 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
15/12/24 20:46:32 INFO HiveMetaStore: Added admin role in metastore
15/12/24 20:46:32 INFO HiveMetaStore: Added public role in metastore
15/12/24 20:46:33 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/12/24 20:46:33 INFO HiveMetaStore: 0: get_all_databases
15/12/24 20:46:33 INFO audit: ugi=ndscbigdata ip=unknown-ip-addr cmd=get_all_databases
15/12/24 20:46:33 INFO HiveMetaStore: 0: get_functions: db=default pat=*
15/12/24 20:46:33 INFO audit: ugi=ndscbigdata ip=unknown-ip-addr cmd=get_functions: db=default pat=*
15/12/24 20:46:33 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
15/12/24 20:46:36 INFO SessionState: Created local directory: /tmp/1f162c4e-9707-467f-8adc-f8124c3e929a_resources
15/12/24 20:46:36 INFO SessionState: Created HDFS directory: /tmp/hive/ndscbigdata/1f162c4e-9707-467f-8adc-f8124c3e929a
15/12/24 20:46:36 INFO SessionState: Created local directory: /tmp/ndscbigdata/1f162c4e-9707-467f-8adc-f8124c3e929a
15/12/24 20:46:36 INFO SessionState: Created HDFS directory: /tmp/hive/ndscbigdata/1f162c4e-9707-467f-8adc-f8124c3e929a/_tmp_space.db
15/12/24 20:46:36 INFO HiveContext: default warehouse location is /user/hive/warehouse
15/12/24 20:46:36 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
15/12/24 20:46:36 INFO ClientWrapper: Inspected Hadoop version: 2.6.0
15/12/24 20:46:36 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
15/12/24 20:46:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/12/24 20:46:37 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/12/24 20:46:37 INFO ObjectStore: ObjectStore, initialize called
15/12/24 20:46:37 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/12/24 20:46:37 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored

Tracking this down is tedious. The same code works fine in spark-shell, so why does it fail when run from Eclipse? It has to be a hive-site.xml configuration problem: spark-shell picks the file up from Spark's conf directory, while the Eclipse project does not have it on its classpath. Copying the configuration file into the project solves it.
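A minimal sketch of the fix, assuming a Maven-style project layout where src/main/resources lands on the Eclipse classpath (adjust the path to your project). In practice you would simply copy the cluster's hive-site.xml; the file generated here, with a placeholder thrift URI, only shows the one property that matters — hive.metastore.uris:

```shell
# Put a hive-site.xml on the project classpath so HiveContext connects
# to the real metastore instead of creating a local Derby one.
# thrift://metastore-host:9083 is a placeholder; use your metastore URI,
# or just copy the cluster's own hive-site.xml here instead.
mkdir -p src/main/resources
cat > src/main/resources/hive-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
EOF
```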

As a quick sanity check, I created a table from one of my own existing tables — and sure enough, the problem was gone.
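The smoke test above might look roughly like this in Spark 1.x Scala code. The table names are made up for illustration, and the program needs a reachable metastore plus the hive-site.xml on the classpath, so this is a sketch rather than something runnable standalone:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveSmokeTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("HiveSmokeTest").setMaster("local[2]"))
    val hiveContext = new HiveContext(sc)

    // With hive-site.xml on the classpath this lists the real databases;
    // without it, you only see the empty local Derby metastore.
    hiveContext.sql("SHOW DATABASES").collect().foreach(println)

    // "my_table" stands in for an existing table in your metastore.
    hiveContext.sql(
      "CREATE TABLE IF NOT EXISTS my_table_copy AS SELECT * FROM my_table")
    sc.stop()
  }
}
```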
