Saturday, July 25, 2020

Convert TEXT to PARQUET using Scala

In this article we will see how to convert a text file to a Parquet file using a Spark DataFrame in Scala.

The input text file is shown below.



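The original screenshot of the input is not reproduced here, so as a stand-in, a hypothetical space-delimited file with a header row (matching the read options used in the program below) might look like this:

```
id name city
1 John London
2 Maria Paris
```

Any space-delimited file with a header line will work the same way.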
The SBT library dependencies are shown below for reference.

scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"

The Scala program is provided below.

import org.apache.spark.sql.{SaveMode, SparkSession}

object TXTToParquetConverter extends App {
  val spark = SparkSession.builder()
    .master("local")
    .appName("TextToParquetConverter")
    .getOrCreate()

  // Local Windows paths used in this example; adjust to your environment
  val inputFile = "C:\\data\\text.txt"
  val outputFile = "C:\\data\\out_data_text2parquet"

  // Read the space-delimited text file as a DataFrame,
  // treating the first line as the header
  val df = spark.read
    .format("csv")
    .option("delimiter", " ")
    .option("header", "true")
    .load(inputFile)

  // Write the DataFrame out in Parquet format, overwriting any
  // existing output; Parquet stores the schema itself, so no
  // header option is needed on the write side
  df.write
    .mode(SaveMode.Overwrite)
    .parquet(outputFile)

  spark.stop()
}
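One thing to note: with the options above, every column is loaded as a string, and the Parquet file will contain string columns only. If typed columns are wanted, Spark's CSV reader can infer them; a sketch of the same read with schema inference enabled (reusing the `spark` session and `inputFile` from the program above):

```scala
  // Same read, but asking Spark to infer column types
  // (this requires an extra pass over the data)
  val typedDf = spark.read
    .format("csv")
    .option("delimiter", " ")
    .option("header", "true")
    .option("inferSchema", "true")
    .load(inputFile)
```

The inferred types (for example integers or doubles) then carry through to the Parquet output.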

The converted Parquet file is shown below.
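Since Parquet is a binary format, the easiest way to inspect the result is to read the output directory back with the same session; a quick sketch (assuming the `spark` session and `outputFile` from the program above):

```scala
  // Read the Parquet directory back and inspect schema and rows
  val parquetDf = spark.read.parquet(outputFile)
  parquetDf.printSchema()
  parquetDf.show()
```

Note that Spark writes the output as a directory containing one or more `part-*` files rather than a single file.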



That's all!
