Add TPCDS(sqlContext) constructor for backwards-compatibility

This patch adds additional constructors to `TPCDS` to maintain backwards compatibility with code that calls `new TPCDS(anExistingSqlContext)`. That constructor was removed in #47.

The motivation for backwards compatibility here is to simplify the gradual roll-out of an updated spark-sql-perf library to existing jobs that share the same notebook.

Author: Josh Rosen <rosenville@gmail.com>

Closes #52 from JoshRosen/backwards-compatible-tpcds-constructor.
commit 685ed9e488 (parent 7a3d9ce5b9)
Josh Rosen 2016-02-19 13:01:23 -08:00, committed by Michael Armbrust


@@ -19,19 +19,23 @@ package com.databricks.spark.sql.perf.tpcds
 import scala.collection.mutable
 import com.databricks.spark.sql.perf._
+import org.apache.spark.SparkContext
 import org.apache.spark.sql.SQLContext
 /**
  * TPC-DS benchmark's dataset.
+ *
+ * @param sqlContext An existing SQLContext.
  */
-class TPCDS
-  extends Benchmark
+class TPCDS(@transient sqlContext: SQLContext)
+  extends Benchmark(sqlContext)
   with ImpalaKitQueries
   with SimpleQueries
   with Tpcds_1_4_Queries
   with Serializable {
+
+  def this() = this(SQLContext.getOrCreate(SparkContext.getOrCreate()))
   /*
   def setupBroadcast(skipTables: Seq[String] = Seq("store_sales", "customer")) = {
     val skipExpr = skipTables.map(t => !('tableName === t)).reduceLeft[Column](_ && _)
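The mechanism in the diff is Scala's auxiliary-constructor syntax: the primary constructor keeps the pre-#47 signature taking an existing context, and a zero-arg `def this()` delegates to it with a looked-up default. A minimal self-contained sketch of that pattern, where `Ctx`, `Bench`, and `Tpcds` are hypothetical stand-ins for `SQLContext`, `Benchmark`, and `TPCDS` so it runs without Spark:

```scala
// Hypothetical stand-ins for Spark's SQLContext and Benchmark,
// used only to show the constructor pattern self-contained.
class Ctx
object Ctx { def getOrCreate(): Ctx = new Ctx }

class Bench(val ctx: Ctx)

// Primary constructor takes an existing context (the legacy call style);
// the zero-arg auxiliary constructor delegates to it with a default,
// mirroring `def this() = this(SQLContext.getOrCreate(SparkContext.getOrCreate()))`.
class Tpcds(ctx: Ctx) extends Bench(ctx) {
  def this() = this(Ctx.getOrCreate())
}

val legacy = new Tpcds(new Ctx) // old call sites keep compiling
val modern = new Tpcds()        // zero-arg style from #47 also works
```

Both call sites compile against the patched class, which is exactly the backwards-compatibility goal stated in the commit message.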