Like the SQL "case when" statement and the "switch" or "if then else" statements from well-known programming languages, Spark SQL DataFrames support similar syntax through the "when otherwise" expression, or alternatively a "case when" expression inside a SQL string. Let's see an example of how to check multiple conditions and replicate a SQL CASE statement.
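A minimal sketch of the idea in Scala, using Spark's `when`/`otherwise` column functions; the sample data and column names here are hypothetical, invented only for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, when}

object WhenOtherwiseExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WhenOtherwiseExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data
    val df = Seq(("James", "M"), ("Maria", "F"), ("Robert", null))
      .toDF("name", "gender")

    // Equivalent of SQL:
    //   CASE WHEN gender = 'M' THEN 'Male'
    //        WHEN gender = 'F' THEN 'Female'
    //        ELSE 'Unknown' END
    val df2 = df.withColumn("gender_full",
      when(col("gender") === "M", "Male")
        .when(col("gender") === "F", "Female")
        .otherwise("Unknown"))

    df2.show()
    spark.stop()
  }
}
```

Chained `when` calls evaluate in order, and `otherwise` supplies the default branch (here catching the `null` value); the same result could be achieved with `expr("case when gender = 'M' then 'Male' ... end")` if you prefer literal SQL syntax.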