apache spark - Error in Caching a Table in SparkSQL


I am trying to cache a table available in Hive (using spark-shell). The code is given below:

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> hiveContext.cacheTable("sparkdb.firsttable")

and I am getting the exception below:

org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    at org.apache.spark.sql.hive.client.ClientInterface$$anonfun$getTable$1.apply(ClientInterface.scala:112)

The table firsttable is available in the database sparkdb (in Hive). The issue seems to be with how the database name is provided. How can I achieve this?

PS: a HiveQL query like the one shown below works without any issues:

scala> hiveContext.sql("SELECT * FROM sparkdb.firsttable")

Find below the results of a few other method calls:

scala> hiveContext.tables("sparkdb")
res14: org.apache.spark.sql.DataFrame = [tableName: string, isTemporary: boolean]

scala> hiveContext.tables("sparkdb.firsttable")
res15: org.apache.spark.sql.DataFrame = [tableName: string, isTemporary: boolean]

Aha! You're right; this seems to be SPARK-8105. So, for now, the best bet is to do the SELECT * and cache that.
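
For reference, a minimal sketch of that workaround, assuming the hiveContext from above (the unqualified temporary-table name firsttable is an arbitrary choice):

scala> // Pull the table through HiveQL, which does accept database-qualified names
scala> val firstTable = hiveContext.sql("SELECT * FROM sparkdb.firsttable")
scala> // Register the result under an unqualified name that cacheTable can resolve
scala> firstTable.registerTempTable("firsttable")
scala> hiveContext.cacheTable("firsttable")
scala> // Alternatively, cache the DataFrame directly: firstTable.cache()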

