Can I do without spark-submit in Java?


I am told there is a Spark cluster running with a master at "remote-host-num1:7077" and additional nodes at "remote-host-num2:7077" and "remote-host-num3:7077".

If I write a program like the following:

    SparkConf conf = new SparkConf()
            .setAppName("org.sparkexample.TestCount")
            .setMaster("spark://remote-host-num1:7077");
    JavaSparkContext sc = new JavaSparkContext(conf);

and then create a JavaRDD "myRDD" with sc.textFile and perform an action such as "myRDD.count()", will that operation take advantage of the machines in the remote cluster?
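For context, here is a minimal sketch of the scenario described above; the HDFS input path is a hypothetical placeholder, not something from the original question:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class TestCount {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("org.sparkexample.TestCount")
                    .setMaster("spark://remote-host-num1:7077");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // count() is an action: Spark distributes the partitions of the file
            // across the cluster's executors, each counts its partitions, and the
            // partial counts are summed on the driver.
            JavaRDD<String> myRDD = sc.textFile("hdfs://remote-host-num1:9000/data/input.txt");
            long total = myRDD.count();
            System.out.println("Line count: " + total);

            sc.stop();
        }
    }

Note that when you run the driver this way (outside spark-submit), the application jar and its dependencies still need to be visible to the executors, which is one of the things spark-submit normally handles for you.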

To be clear, I want to avoid using spark-submit with "myjarfile" if I can. If I do have to use it, what should I be doing? And if spark-submit is required to take advantage of the distributed nature of Spark across multiple machines, is there a way to invoke it programmatically from Java?

Yes, support for this was added in Spark 1.4.x for submitting Scala/Java Spark applications as a child process. You can check the Javadocs for the org.apache.spark.launcher package for more details. The link below is referenced in the Spark documentation, and a rough sketch of the launcher API follows it.

https://spark.apache.org/docs/latest/programming-guide.html#launching-spark-jobs-from-java--scala
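As a rough illustration of the launcher API (not copied from the linked docs), a programmatic submission might look like the following; the jar path, main class, and driver-memory setting are placeholder assumptions you would adapt to your build:

    import org.apache.spark.launcher.SparkLauncher;

    public class LaunchTestCount {
        public static void main(String[] args) throws Exception {
            // Placeholder jar path and main class; point these at your packaged application.
            Process spark = new SparkLauncher()
                    .setAppResource("/path/to/myjarfile.jar")
                    .setMainClass("org.sparkexample.TestCount")
                    .setMaster("spark://remote-host-num1:7077")
                    .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                    .launch();
            // launch() starts the application as a child process; wait for it to finish.
            int exitCode = spark.waitFor();
            System.out.println("Spark application finished with exit code " + exitCode);
        }
    }

Keep in mind that SparkLauncher still drives a Spark installation under the hood, so SPARK_HOME must be available in the environment (or supplied via setSparkHome); it spares you from shelling out to spark-submit yourself rather than removing the need for it.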

