May 14, 2016

Apache Spark Job with Maven

Today, I'm going to show you how to write a sample word count application using Apache Spark. For dependency resolution and build tasks, I'm using Apache Maven. However, you could also use SBT (Simple Build Tool). Most Java developers are familiar with Maven, so I decided to show an example using Maven.


This application is pretty much similar to Hadoop's WordCount example; this job does exactly the same thing. The content of Driver.scala is given below.
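The original listing is not reproduced here, so the following is a minimal sketch of what Driver.scala could look like, assuming Spark's RDD API. The package and object names come from the spark-submit command further down; everything else is an assumption.

package org.dedunu.datascience.sample

import org.apache.spark.{SparkConf, SparkContext}

object Driver {
  def main(args: Array[String]): Unit = {
    // args(0) = input folder, args(1) = output folder
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    sc.textFile(args(0))                  // read every file in the input folder
      .flatMap(line => line.split(" "))   // tokenize each line on spaces
      .map(word => (word, 1))             // pair each word with a count of 1
      .reduceByKey(_ + _)                 // sum the counts for each word
      .saveAsTextFile(args(1))            // dump the result to the output folder

    sc.stop()
  }
}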

This job basically reads all the files in the input folder. It then tokenizes every line on spaces (" ") and counts each word individually. Moreover, you can see that the application reads its arguments from the args variable: the first argument is the input folder, and the second argument is where the output will be dumped.

Maven projects need a pom.xml. The content of the pom.xml is given below.
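The exact pom.xml is not included here either; the sketch below shows one way it might look, assuming Spark 1.6.1 built against Scala 2.10, the scala-maven-plugin for compiling the Scala source, and the maven-assembly-plugin to produce the jar-with-dependencies artifact that the spark-submit command below refers to. Versions and plugin choices are assumptions.

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>org.dedunu.datascience</groupId>
  <artifactId>sample-Spark-Job</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Spark core; spark-submit already provides it at runtime -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.1</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>

  <build>
    <finalName>sample-Spark-Job</finalName>
    <plugins>
      <!-- Compile the Scala sources under src/main/scala -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <!-- Bundle the job into a single jar-with-dependencies -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>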

Run the command below to build the Maven project.
mvn clean package
Maven will download all the dependencies and handle everything else on your behalf. Then all you need to do is run the job. To run it, use the command below in your terminal window.
/home/dedunu/bin/spark-1.6.1/bin/spark-submit          \
     --class org.dedunu.datascience.sample.Driver      \
     target/sample-Spark-Job-jar-with-dependencies.jar \
     /home/dedunu/input                                \
     /home/dedunu/output


The output of the job will look like the listing below.
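The original screenshot is not preserved here. With saveAsTextFile, though, the output folder typically ends up with a _SUCCESS marker and one part file per partition, each containing (word, count) pairs; the words and counts shown below are purely illustrative.

$ ls /home/dedunu/output
_SUCCESS  part-00000  part-00001
$ head -n 3 /home/dedunu/output/part-00000
(spark,4)
(hadoop,2)
(maven,7)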


You can find the project on GitHub: https://github.com/dedunu/spark-example
Enjoy Spark!

5 comments:

  1. What is the difference between Spark (a clustering framework) and a parallel computing framework?

    Replies
    1. If you use a Spark cluster, you don't have to manage the distributed environment yourself. Let's say you have a cluster with ten nodes: your job will be scheduled automatically by the Spark scheduler, and Spark will parallelize the job as well. You don't have to write code for each and every detail.

      If you do some research on the MapReduce paradigm, you will be able to find out how this is possible.

  2. It should be noted that Maven is a completely plug-in-based system. In other words, it does not know how to do anything except run plug-ins, but those plug-ins already know how to do amazing things. So when we want to teach Maven some feature of the project build, we need to add to pom.xml an instruction to run the required plug-in in the right phase and with the necessary parameters. On the Active Wizards page I read that the main policy in Maven is that every action has default parameters, and additional settings are required only if those defaults are missing or grossly violated.
