Calling a SparkGraphComputer from within Spark


robk...@...

Hi,

I have a Spark-based program that writes vertices and edges to a JanusGraph cluster (Cassandra backend).  Once the write is complete, I would like to execute an OLAP traversal over all vertices using a SparkGraphComputer.
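For concreteness, the traversal I have in mind is roughly the following (a minimal Java sketch; the read-cql.properties path and the count() traversal are just placeholders modelled on the example read configs shipped with the JanusGraph distribution):

    import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;
    import org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer;
    import org.apache.tinkerpop.gremlin.structure.Graph;
    import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;

    public class OlapTraversal {
        public static void main(String[] args) throws Exception {
            // Open a HadoopGraph view over the JanusGraph data; the
            // properties file is assumed to look like the example
            // read-*.properties configs in the JanusGraph distribution.
            Graph graph = GraphFactory.open("conf/hadoop-graph/read-cql.properties");

            // Route the traversal through Spark as an OLAP job
            // instead of executing it OLTP-style.
            GraphTraversalSource g = graph.traversal().withComputer(SparkGraphComputer.class);

            // Placeholder traversal over all vertices.
            System.out.println("vertices: " + g.V().count().next());

            graph.close();
        }
    }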

However, in all the examples I can find, the SparkGraphComputer is invoked externally from the Groovy console, using a config file that points at the Spark cluster's address.  I would like to keep all of the process coordination inside my Spark program.  Is there a way to achieve this, or should my first Spark program trigger a second Spark job on the same cluster?
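In other words, I am wondering whether something like the sketch below could run inside my existing driver, building the graph configuration in code rather than loading a properties file.  All the property values here (hostname, keyspace, Spark master, the CqlInputFormat class) are assumptions copied from the example configs and will vary by JanusGraph version and storage backend:

    import org.apache.commons.configuration.BaseConfiguration;
    import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;
    import org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer;
    import org.apache.tinkerpop.gremlin.structure.Graph;
    import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;

    public class InDriverOlap {
        public static void main(String[] args) throws Exception {
            // The equivalent of a read-*.properties file, built in code.
            // (Newer TinkerPop versions use commons-configuration2 instead.)
            BaseConfiguration conf = new BaseConfiguration();
            conf.setProperty("gremlin.graph",
                    "org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph");
            conf.setProperty("gremlin.hadoop.graphReader",
                    "org.janusgraph.hadoop.formats.cql.CqlInputFormat");
            conf.setProperty("gremlin.hadoop.graphWriter",
                    "org.apache.hadoop.mapreduce.lib.output.NullOutputFormat");
            conf.setProperty("gremlin.hadoop.jarsInDistributedCache", true);
            conf.setProperty("gremlin.hadoop.inputLocation", "none");
            conf.setProperty("gremlin.hadoop.outputLocation", "output");

            // Where the JanusGraph/Cassandra data lives (assumed values).
            conf.setProperty("janusgraphmr.ioformat.conf.storage.backend", "cql");
            conf.setProperty("janusgraphmr.ioformat.conf.storage.hostname", "cassandra-host");
            conf.setProperty("janusgraphmr.ioformat.conf.storage.cql.keyspace", "janusgraph");

            // Spark settings that would normally sit in the properties file.
            conf.setProperty("spark.master", "spark://spark-master:7077");
            conf.setProperty("spark.serializer",
                    "org.apache.spark.serializer.KryoSerializer");

            Graph graph = GraphFactory.open(conf);
            GraphTraversalSource g = graph.traversal().withComputer(SparkGraphComputer.class);
            System.out.println("vertices: " + g.V().count().next());
            graph.close();
        }
    }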

Thanks for any help,
Rob
