Spark operations with dynamic graphs
Real Life Adventure <srinu....@...>
Hi,

How do I run a Spark traversal on graphs that were created dynamically with ConfiguredGraphFactory? I can't find any documentation on that; I only found the GraphFactory variant:

graph = GraphFactory.open('conf/hadoop-graph/read-cql-standalone-cluster.properties')
g = graph.traversal().withComputer(SparkGraphComputer)
g.V().count()

Any help appreciated.

Thanks,
RLA.
HadoopMarc <bi...@...>
This describes how to obtain a GraphTraversalSource instance, like g1, for a configured graph, like graph1:

Best wishes, Marc
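A minimal sketch of the steps Marc refers to, as they would look in the Gremlin Console. The graph name "graph1" and the remote config file "conf/remote.yaml" are assumptions for illustration; Gremlin Server with the JanusGraphManager binds a traversal source named "<graphname>_traversal" for each opened configured graph:

```
:remote connect tinkerpop.server conf/remote.yaml session
:remote console

// opening the configured graph creates the server-side
// bindings graph1 and graph1_traversal
ConfiguredGraphFactory.open("graph1")
g1 = graph1_traversal
g1.V().count()
```

Note that g1 here is a plain OLTP traversal source; it is not yet backed by SparkGraphComputer.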
Real Life Adventure <srinu....@...>
Thanks for the reply. I am able to get the instance, but how do I set the Spark config in the ConfiguredGraphFactory template?

On Saturday, June 20, 2020 at 2:22:33 PM UTC+5:30, HadoopMarc wrote:
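For context on the question: a template configuration accepts arbitrary property keys, so Spark settings can in principle be stored in it. A sketch in the Gremlin Console (the property values are illustrative assumptions; MapConfiguration comes from Apache Commons Configuration):

```
map = new HashMap<String, Object>()
map.put("storage.backend", "cql")
map.put("storage.hostname", "127.0.0.1")
// Spark keys can be stored, but see the next reply on whether
// they take effect for OLAP queries
map.put("spark.master", "local[*]")
ConfiguredGraphFactory.createTemplateConfiguration(new MapConfiguration(map))
```

Whether stored Spark keys actually enable OLAP depends on the type of graph the server serves, which is what the next reply addresses.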
HadoopMarc <bi...@...>
I see what you mean. The problem is that Gremlin Server has to serve a HadoopGraph to be able to do an OLAP query with SparkGraphComputer. However, as the name suggests, JanusGraphFactory.open() always returns a StandardJanusGraph instance. So, for doing a remote OLAP query on JanusGraph you are left with the following choices:
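For the HadoopGraph route, a properties file like the one referenced in the original question (conf/hadoop-graph/read-cql-standalone-cluster.properties) typically looks roughly like the following sketch. Hostname, keyspace, and Spark master values are illustrative assumptions; the property names follow the JanusGraph Hadoop/Spark examples:

```
gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphReader=org.janusgraph.hadoop.formats.cql.CqlInputFormat
gremlin.hadoop.graphWriter=org.apache.hadoop.mapreduce.lib.output.NullOutputFormat

# connection details for the JanusGraph storage backend being read
janusgraphmr.ioformat.conf.storage.backend=cql
janusgraphmr.ioformat.conf.storage.hostname=127.0.0.1
janusgraphmr.ioformat.conf.storage.cql.keyspace=janusgraph

# Spark settings live in this same file, not in a CGF template
spark.master=local[*]
spark.serializer=org.apache.spark.serializer.KryoSerializer
```

This is why the Spark configuration ends up in the HadoopGraph properties file served by Gremlin Server rather than in a ConfiguredGraphFactory template.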
HTH, Marc