My Experience with JanusGraph Bulk Loading Using SparkGraphComputer
Abhay Pandit <abha...@...>
Hi, I am sharing my experience of JanusGraph bulk loading:
Data store: Cassandra
Cassandra + Elastic IOPS: 30k per node
Issue seen: mismatch of counts on the index data
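For anyone setting up a similar stack, the graph connection is typically opened roughly like this; this is only a minimal sketch, and the hostnames, keyspace, and tuning values below are placeholders, not our real configuration:

import org.janusgraph.core.JanusGraph;
import org.janusgraph.core.JanusGraphFactory;

public class GraphConnection {
    public static JanusGraph open() {
        // Placeholder hosts/keyspace. storage.batch-loading relaxes some
        // consistency checks so bulk writes go faster.
        return JanusGraphFactory.build()
                .set("storage.backend", "cql")
                .set("storage.hostname", "cassandra-host")
                .set("storage.cql.keyspace", "janusgraph")
                .set("index.search.backend", "elasticsearch")
                .set("index.search.hostname", "es-host")
                .set("storage.batch-loading", true)
                .set("ids.block-size", 1000000)
                .open();
    }
}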
Custom VertexPrograms built (a submission sketch follows this list):
1> Node creation VertexProgram
2> Edge creation VertexProgram
3> Bulk drop VertexProgram
4> Reprocess VertexProgram for any missing data
5> Count VertexProgram based upon a business rule
6> Multi-vertex creation from a single input record, with linking before committing (still to be tested on millions of records)
7> A few more based upon requirements
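The custom programs themselves are not included here, but the way any VertexProgram is submitted to SparkGraphComputer is the same; below is a minimal sketch using TinkerPop's stock BulkLoaderVertexProgram as a stand-in (the properties file paths and worker count are placeholders, not values from this post):

import org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph;
import org.apache.tinkerpop.gremlin.process.computer.ComputerResult;
import org.apache.tinkerpop.gremlin.process.computer.bulkloading.BulkLoaderVertexProgram;
import org.apache.tinkerpop.gremlin.process.computer.bulkloading.OneTimeBulkLoader;
import org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer;
import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;

public class BulkLoadJob {
    public static void main(String[] args) throws Exception {
        // hadoop-load.properties describes the source data (input format, location);
        // janusgraph-target.properties describes the JanusGraph instance to write into.
        // Both file names are placeholders.
        HadoopGraph readGraph = (HadoopGraph) GraphFactory.open("conf/hadoop-load.properties");

        BulkLoaderVertexProgram blvp = BulkLoaderVertexProgram.build()
                .bulkLoader(OneTimeBulkLoader.class)
                .writeGraph("conf/janusgraph-target.properties")
                .create(readGraph);

        // A custom VertexProgram (node creation, edge creation, etc.) is submitted
        // the same way: readGraph.compute(SparkGraphComputer.class).program(myProgram).submit()
        ComputerResult result = readGraph.compute(SparkGraphComputer.class)
                .workers(4)
                .program(blvp)
                .submit()
                .get();

        result.close();
        readGraph.close();
    }
}

OneTimeBulkLoader assumes the target graph starts empty; IncrementalBulkLoader can be used instead when the load may need to be re-run over existing data.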
Thanks,
natali2...@...
Can you provide a basic code example of applying SparkGraphComputer for loading data, please? I cannot find any code examples to understand how Spark is used for bulk loading (creating nodes and edges). Thanks.
On Thursday, February 20, 2020, at 22:03:21 UTC+3, Abhay Pandit wrote: