Currently, caching of the DataFrame created by the sparksql magic uses
df.cache()
This API does not let us name the cached data. Using the SQL version of this API, it is possible to give the cached data a name that shows up in the Spark UI.
When the user creates a cached view
%%sparksql --output skip --view user_view_name --cache --eager
we could cache it using the SQL API as follows:
df.createOrReplaceTempView("tmp_df_view")
spark.sql("CACHE TABLE user_view_name AS SELECT * FROM tmp_df_view")
spark.sql("DROP VIEW tmp_df_view")
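A minimal sketch of how the magic could build these statements; the helper name `named_cache_statements` and the default temp-view name are assumptions for illustration, not part of the existing code. Note that CACHE TABLE in Spark SQL is eager by default, and Spark also supports CACHE LAZY TABLE, which could map to the absence of the --eager flag.

```python
def named_cache_statements(user_view: str, tmp_view: str = "tmp_df_view") -> list:
    """Build the SQL statements that cache a DataFrame under a user-visible
    name in the Spark UI, instead of the anonymous entry left by df.cache().

    Hypothetical helper: the magic would register the DataFrame as tmp_view
    first, then run these statements in order.
    """
    return [
        # CACHE TABLE names the cached data after the user's chosen view name
        f"CACHE TABLE {user_view} AS SELECT * FROM {tmp_view}",
        # The intermediate temp view is no longer needed once cached
        f"DROP VIEW {tmp_view}",
    ]

# Usage sketch inside the magic:
# df.createOrReplaceTempView("tmp_df_view")
# for stmt in named_cache_statements("user_view_name"):
#     spark.sql(stmt)
```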