SparkSQL inserting into tables with static columns
Static columns are mapped to different columns in SparkSQL and Hive and require special handling. When you run an insert query, you must pass data to those columns.
To work around the different column mappings, set cql3.output.query in the Hive table properties to limit the columns that are inserted. In SparkSQL or Hive,
alter the external table to configure the prepared statement as the value of the Hive CQL
output query. For example, this prepared statement takes values that are inserted into columns
a and b in mytable and maps these values to columns b and a, respectively, for insertion into
the new row.
spark-sql> ALTER TABLE mytable SET TBLPROPERTIES ('cql3.output.query' = 'update mykeyspace.mytable set b = ? where a = ?');
spark-sql> ALTER TABLE mytable SET SERDEPROPERTIES ('cql3.update.columns' = 'b,a');
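After the table properties are set, an ordinary insert exercises the mapping: the value supplied for column b is bound to the first placeholder in the output query, and the value supplied for column a is bound to the WHERE clause. The statement below is a hypothetical example that assumes a and b are int columns and that mytable in mykeyspace has a as its partition key.

spark-sql> INSERT INTO TABLE mytable SELECT 1 AS a, 2 AS b;

This insert is rewritten into the prepared statement, so Cassandra executes update mykeyspace.mytable set b = 2 where a = 1.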