2016-03-23

Hadoop Hive ACID query error

I'm working on a cluster running the Hortonworks 2.4 distribution, and I want to perform ACID operations on a Hive table. Here is my create statement:

CREATE TABLE myAcidTable (..) 
CLUSTERED BY(myKey) INTO 1 BUCKETS 
STORED AS ORC TBLPROPERTIES ('transactional'='true','orc.compress'='SNAPPY'); 

I populate this table from an external Hive table with the same structure:

INSERT INTO myAcidTable 
SELECT * FROM MyTmpTable; 

This works fine:

Loading data to table MyAcidTable
Table myAcidTable stats: [numFiles=1, numRows=4450, totalSize=42001, rawDataSize=0]
OK

Then I try to query the table through the Hive shell:

set hive.support.concurrency=true; 
set hive.enforce.bucketing=true; 
set hive.exec.dynamic.partition.mode=nonstrict; 
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager; 
set hive.compactor.initiator.on=true; 
set hive.compactor.worker.threads=3; 

SELECT * FROM myAcidTable 
WHERE myKey = 12; 

But even though nothing looks unusual, instead of the normal result I get this error:

OK
Failed with exception java.io.IOException:java.lang.RuntimeException: serious problem

whereas the expected output would be something like:

OK
12 ...

When I look in the Ambari Hive view, the error is:

org.apache.ambari.view.hive.client.HiveErrorStatusException: H170 Unable to fetch results. java.io.IOException: java.lang.RuntimeException: serious problem

...
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
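For background: transactional inserts write their rows into delta_* directories, while a reader that is not running under the ACID transaction manager expects plain base_* bucket files, which appears to be why AcidUtils rejects delta_0000000_0000000 here. As an aside (not the fix proposed in the answer below), on a correctly configured transactional table Hive can merge the deltas into a base directory via a major compaction:

```sql
-- Hedged aside: on a properly configured ACID table, a major compaction
-- rewrites all delta_* directories into a single base_* directory.
ALTER TABLE myAcidTable COMPACT 'major';

-- Compactions run asynchronously; their progress can be checked with:
SHOW COMPACTIONS;
```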

It's strange, because when I declare my table without the transactional property:

CREATE TABLE myAcidTable (..) 
CLUSTERED BY(myKey) INTO 1 BUCKETS 
STORED AS ORC TBLPROPERTIES ('orc.compress'='SNAPPY'); 

SELECT * FROM myAcidTable 
WHERE myKey = 12; 

the select statement works and returns results. Any idea where I should look in the logs to track this down? Thanks for your help.

Here is the full error:

org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.lang.RuntimeException: serious problem
    at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:352)
    at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
    at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
    at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
    at com.sun.proxy.$Proxy22.fetchResults(Unknown Source)
    at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.lang.RuntimeException: serious problem
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:512)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1737)
    at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
    ... 24 more
Caused by: java.lang.RuntimeException: serious problem
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1115)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:1142)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:367)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:299)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
    ... 28 more
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:192)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1092)
    ... 32 more
Caused by: java.lang.IllegalArgumentException: delta_0000000_0000000 does not start with base_
    at org.apache.hadoop.hive.ql.io.AcidUtils.parseBase(AcidUtils.java:154)
    at org.apache.hadoop.hive.ql.io.AcidUtils.parseBaseBucketFilename(AcidUtils.java:182)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:725)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.call(OrcInputFormat.java:690)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    ... 3 more


Have you tried setting these values before creating the table, rather than only before querying it? –


Yes, I also set these values in the configuration file, no change. – Mouette

Answer:

Probably because you had the wrong transaction manager set at the time you created the table or loaded data into it. In my case I had org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager instead of org.apache.hadoop.hive.ql.lockmgr.DbTxnManager.

To get rid of the error, you need to drop the table, set the correct transaction manager:

hive> set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

and then recreate the table.
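The full recovery sequence can be sketched as a single Hive session, end to end. This is a sketch only: the table and column names are placeholders taken from the question (the column list stays elided), and the session-level properties mirror those the question already sets.

```sql
-- Set the ACID-related properties *before* creating or loading the table;
-- the crucial one is DbTxnManager (DummyTxnManager silently breaks ACID tables).
set hive.support.concurrency=true;
set hive.enforce.bucketing=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
set hive.compactor.initiator.on=true;
set hive.compactor.worker.threads=3;

-- Drop the table that was created under the wrong transaction manager.
DROP TABLE IF EXISTS myAcidTable;

-- Recreate it as a transactional ORC table (columns elided as in the question).
CREATE TABLE myAcidTable (..)
CLUSTERED BY (myKey) INTO 1 BUCKETS
STORED AS ORC TBLPROPERTIES ('transactional'='true', 'orc.compress'='SNAPPY');

-- Reload from the staging table, now under DbTxnManager.
INSERT INTO myAcidTable
SELECT * FROM MyTmpTable;

SELECT * FROM myAcidTable WHERE myKey = 12;
```

Setting these properties in hive-site.xml instead of per-session (as the comment above mentions) avoids having to repeat them in every shell, but the table must still be dropped and recreated after the manager is corrected.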