Spark SQL "Table not found" error

Under normal circumstances, the first thing to check is whether the table name is simply wrong. But I ran into a different situation:
I used a hiveContext to read one table and register it as a temporary table (dataFrame.registerTempTable()), then used a sqlContext to read another table and register it as a temporary table as well. When I then ran sqlContext.sql("<SQL joining the two tables>"), the job failed with "Table not found".
After careful study, a key point suddenly came back to me: temporary tables registered through sqlContext and through hiveContext have different lifecycles. A table registered through sqlContext can only be queried with sqlContext.sql(), and a table registered through hiveContext can only be queried with hiveContext.sql(); mix the two and you get this error!
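A minimal sketch of the failing pattern (Spark 1.x API; the table names, JDBC URL, and credentials are placeholders of mine, not from the original job):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("TempTableScope"))
val sqlContext  = new SQLContext(sc)
val hiveContext = new HiveContext(sc)

// Read a table through hiveContext; the temp table lands in hiveContext's catalog
hiveContext.table("some_hive_table").registerTempTable("t1")

// Read another table through sqlContext (e.g. over JDBC); this temp table
// lands in sqlContext's catalog instead
sqlContext.read.format("jdbc")
  .option("url", "jdbc:mysql://localhost:3306/testdb") // placeholder URL
  .option("dbtable", "some_mysql_table")
  .option("user", "root")
  .option("password", "secret")
  .load()
  .registerTempTable("t2")

// Fails with "Table not found: t1" -- t1 only exists in hiveContext's catalog
sqlContext.sql("SELECT * FROM t1 JOIN t2 ON t1.id = t2.id").show()
```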
So what can you do? Is there a solution? Of course there is, and more than one!
1: Don't register the DataFrame as a temporary table; use the saveAsTable() method instead. This persists the table to the warehouse directory on disk (a local path by default), so it can still be read even after the Spark job restarts.
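A minimal sketch of this option, reusing the setup above (the name t1_persisted is hypothetical):

```scala
// Instead of registerTempTable, persist the DataFrame as a managed table.
// With a HiveContext the data is written under the warehouse directory and
// the table is recorded in the metastore, so it survives a job restart.
hiveContext.table("some_hive_table").write.saveAsTable("t1_persisted")

// Later, even from a fresh HiveContext in a restarted job:
hiveContext.sql("SELECT COUNT(*) FROM t1_persisted").show()
```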
2: The recommended approach is to do everything through hiveContext. In my testing, hiveContext can read and operate on MySQL tables just fine, so that is what I used here; sqlContext is generally only used for testing against a local MySQL table. To be honest, this split of responsibilities never made much sense, and Spark changed it in 2.0: there is no longer a separate HiveContext, and the entry points are unified in SparkSession.
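A sketch of the unified approach, again with placeholder names: register both tables through the same hiveContext so they live in one catalog.

```scala
// Both temp tables are registered on the SAME context...
hiveContext.table("some_hive_table").registerTempTable("t1")

hiveContext.read.format("jdbc")                        // a HiveContext can read JDBC sources too
  .option("url", "jdbc:mysql://localhost:3306/testdb") // placeholder URL
  .option("dbtable", "some_mysql_table")
  .option("user", "root")
  .option("password", "secret")
  .load()
  .registerTempTable("t2")

// ...so the join now resolves both names
hiveContext.sql("SELECT * FROM t1 JOIN t2 ON t1.id = t2.id").show()
```

In Spark 2.0+ the same idea is expressed through SparkSession, where createOrReplaceTempView replaces registerTempTable:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("Unified")
  .enableHiveSupport() // takes over HiveContext's role
  .getOrCreate()

spark.table("some_hive_table").createOrReplaceTempView("t1")
spark.sql("SELECT * FROM t1").show()
```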

Origin blog.csdn.net/qq_39719415/article/details/96426717