I suggest two ways to get started developing Spark in Scala, both based on Eclipse: download the full pre-configured Eclipse bundle from scala-ide.org, which already includes the Scala IDE, or add the Scala plugin to your existing Eclipse installation (detailed instructions below). Either way, you will be able to create Scala projects and run them locally. In both cases, once the setup is done, start developing in Spark by importing into Eclipse, as an "existing Maven project", the project template linked at the bottom of this article.
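Once the template is imported, a first Spark program looks roughly like the sketch below. This uses the classic RDD API with `SparkContext`, which matches Scala 2.10-era clusters like the one mentioned later in this article; the object name and the input path are illustrative placeholders, not part of the template.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal word-count sketch using the classic RDD API.
// SparkWordCount and input.txt are illustrative names.
object SparkWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SparkWordCount")
      .setMaster("local[*]") // run locally inside Eclipse, using all cores

    val sc = new SparkContext(conf)

    val counts = sc.textFile("input.txt")   // one RDD element per line
      .flatMap(_.split("\\s+"))             // split lines into words
      .map(word => (word, 1))               // pair each word with a count of 1
      .reduceByKey(_ + _)                   // sum counts per word

    counts.take(10).foreach(println)
    sc.stop()
  }
}
```

Setting the master to `local[*]` in code is convenient for running from Eclipse; when submitting to a cluster you would normally omit it and let `spark-submit` supply the master instead.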
Now I’ll illustrate how to add the Scala plugin to your existing Eclipse installation. In this example I used an Eclipse Kepler EE. From the page http://scala-ide.org/download/current.html copy the link for the latest version for Kepler, or, if none is listed, follow the “Older versions” link on the page and choose the right Scala version for you. I copied the link for an older stable release for Scala 2.10.4 (which is the version available on the cluster I’m using at the moment), namely: http://download.scala-ide.org/sdk/lithium/e38/scala211/stable/site.
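The Maven project must target that same Scala version, or the IDE and the cluster will disagree. As a sketch, the relevant `pom.xml` dependencies would look like this (the version numbers are illustrative assumptions; use your cluster's exact Scala and Spark versions, here Scala 2.10.4 and a Spark 1.x release):

```xml
<dependencies>
  <!-- Scala standard library: must match the cluster's Scala version -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.4</version>
  </dependency>
  <!-- Spark core, built for Scala 2.10 (note the _2.10 suffix) -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.3</version> <!-- illustrative; pick your cluster's release -->
  </dependency>
</dependencies>
```

The `_2.10` suffix in the artifact id matters: Spark artifacts are published per Scala binary version, and mixing suffixes leads to hard-to-diagnose runtime errors.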
Make sure you have a Java JDK 1.7 installed and that Eclipse is pointing at it: click [Window] -> [Preferences] -> (in the left menu) [Java] -> [Installed JREs] and check that a JDK 1.7 installation is selected. If not, add and select one.
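You can also check which JDK is on your PATH before touching Eclipse. `java -version` prints a banner line; a sketch of extracting the major.minor version from it with `sed` (the banner string below is a sample, substitute the real output of `java -version 2>&1 | head -n 1`):

```shell
# Sample banner line as printed by a JDK 1.7 (illustrative value).
banner='java version "1.7.0_80"'

# Pull out the major.minor version from the quoted version string.
echo "$banner" | sed -E 's/.*"([0-9]+\.[0-9]+).*/\1/'
# prints: 1.7
```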