Run Spark program locally with IntelliJ

I am trying to run a simple test program in IntelliJ IDEA. Here is my code:

import org.apache.spark.sql.functions._
import org.apache.spark.SparkConf
import org.apache.spark.sql.{DataFrame, SparkSession}

object hbasetest {
  val spconf = new SparkConf()
  val spark = SparkSession.builder().master("local").config(spconf).getOrCreate()
  import spark.implicits._

  def main(args: Array[String]) {
    val df = spark.read.parquet("file:///Users/cy/Documents/temp")
    df.show()
    spark.close()
  }
}

My dependencies:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.1.0</version>
  <!--<scope>provided</scope>-->
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
  <!--<scope>provided</scope>-->
</dependency>

When I click the Run button, it throws an exception:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.TaskID.<init>(Lorg/apache/hadoop/mapreduce/JobID;Lorg/apache/hadoop/mapreduce/TaskType;I)V
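
If I read the signature right, the TaskID(JobID, TaskType, int) constructor only exists in Hadoop 2.x (in Hadoop 1.x the constructor took a boolean instead of a TaskType), so my guess is that an older Hadoop jar is shadowing the one Spark 2.1.0 expects. Listing the Hadoop artifacts on the compile classpath should show whether two versions are mixed; something like:

mvn dependency:tree -Dincludes=org.apache.hadoop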


I checked this post, but the situation didn't change after making the suggested modifications. Can I get some help with running a local Spark application in IDEA? Thanks.

Update: I can run this code with spark-submit, but I would like to run it directly with the Run button in IDEA.
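
My current suspicion is that spark-submit works because the jars from the installed Spark distribution come first on the classpath and mask the conflict, while IDEA builds the classpath straight from the pom. If a stray Hadoop 1.x jar turns out to be the culprit, explicitly pinning a matching hadoop-client might fix the IDEA run; the version here is an assumption and should match whatever Hadoop your Spark 2.1.0 build targets:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <!-- 2.7.3 is an assumption: use the Hadoop version your Spark build was compiled against -->
  <version>2.7.3</version>
</dependency>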