Can someone please let me know how to connect to a Spark application from a web application? The Spark code we have is written in Scala, and we want to invoke it from the web application. A small code snippet would be helpful.
Connecting Spark code from web application
1.3k views · Asked by Koushik Chandra
As mentioned in the comment section, try one of the following approaches:
spark-jobserver
spark-jobserver provides a RESTful interface for submitting and managing Apache Spark jobs, jars, and job contexts.
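As a sketch of how that REST interface is used (assuming a jobserver running on localhost:8090; the app name, jar path, and job class are placeholders, not from the original post):

```shell
# Upload the application jar under the app name "my-app"
curl -X POST localhost:8090/jars/my-app --data-binary @target/scala-2.11/my-app.jar

# Create a long-lived named context so jobs can share cached RDDs
curl -X POST "localhost:8090/contexts/my-context?num-cpu-cores=2&memory-per-node=512m"

# Run a job class (one that implements the jobserver's SparkJob trait) in that context;
# sync=true blocks until the result is returned in the HTTP response
curl -X POST "localhost:8090/jobs?appName=my-app&classPath=com.example.MyJob&context=my-context&sync=true" \
     -d 'input.string = a b c'
```

The web application then only needs an HTTP client; the Spark driver lives inside the jobserver process rather than inside the web app.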
Approach mentioned by Jamal in spark-user-list
1) Add the Spark distribution jar to Tomcat's classpath so the Spark classes are available at runtime.
2) Add the Spark distribution jar to the web application project's classpath (this could be a plain Java project or a web project).
3) Enable the FAIR scheduler mode, which lets the web application send parallel requests to the Spark cluster without one job starving the others.
4) At application startup, initialize the connection to the Spark cluster: create a JavaSparkContext and make it available throughout the web application, so it serves as the single driver program the web application needs.
5) Using the JavaSparkContext, create RDDs and make them available globally to the web application code.
6) Invoke transformations/actions on those RDDs as required.
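A minimal Scala sketch of steps 3–6 (the object, master URL, and file path are illustrative assumptions; a real deployment would load these from configuration and wire the initialization into a servlet context listener):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

// Application-wide holder for the single driver-side SparkContext (step 4).
// Initialized lazily on first use; in a servlet container you would touch it
// once at startup, e.g. from a ServletContextListener.
object SparkContextHolder {
  lazy val sc: SparkContext = {
    val conf = new SparkConf()
      .setAppName("webapp-driver")
      .setMaster("spark://master-host:7077") // placeholder cluster URL
      .set("spark.scheduler.mode", "FAIR")   // step 3: fair sharing between requests
    new SparkContext(conf)
  }

  // Step 5: an RDD created once and shared across web requests.
  lazy val events: RDD[String] =
    sc.textFile("hdfs:///data/events.log").cache() // placeholder input path
}

// Step 6: request handlers invoke transformations/actions on the shared RDD.
object RequestHandlers {
  def countMatching(keyword: String): Long =
    SparkContextHolder.events.filter(_.contains(keyword)).count()
}
```

Note that with this approach the web application process *is* the Spark driver, so it must stay up for the lifetime of the context and be network-reachable from the cluster's executors.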