JanusGraph in Gremlin


Joseph Obernberger <joseph.o...@...>
 

Hello,
I've just started to use JanusGraph and I have code that creates a graph (using HBase as the storage backend) and exports it to GraphML. The graph is only about 10k vertices, and I can view it in Gephi OK. When I load it with Gremlin, however, I get 0 vertices from the g.V().count() command.
I open the graph OK with graph=JanusGraphFactory.open('conf/J.properties'), where J.properties has the necessary params and table name. I then do g=graph.traversal(), which also succeeds. Any ideas on what I can debug from here? When I do a count on the table from the HBase shell, data is present.
Thank you for any ideas!

-Joe Obernberger


HadoopMarc <m.c.d...@...>
 

Hi Joseph,

My first idea: did you commit the transactions that inserted the vertices and edges into JanusGraph?
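
For example, a commit-before-count run in the Gremlin console might look like this (just a sketch; I'm assuming the JanusGraphFactory entry point and your property file name):

graph = JanusGraphFactory.open('conf/J.properties')
g = graph.traversal()
graph.addVertex('person')
graph.tx().commit()    // without this, other sessions will see 0 vertices
g.V().count()          // should now include the new vertex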

Cheers,    Marc



Joe Obernberger <joseph.o...@...>
 

Thank you Marc.  Yes, I've even tried very simple graphs of just a few vertices: commit, write out GraphML, and then close.  I can also load the graph from HBase and export GraphML OK.  It is only in the Gremlin shell that I'm not able to access the data.  I'll double-check the transactions.

I'm using Cloudera's CDH 5.10.0, so I had to modify the pom.xml to use their hadoop2 and hbase versions.  Could that be the issue?  Given that the exported GraphML is valid, I'm leaning toward it being something with Gremlin.

Thanks again!

-Joe




HadoopMarc <m.c.d...@...>
 

Hi Joseph,

I am still on Titan/HBase, but things should work the same way.

Code to connect:

config = new PropertiesConfiguration("somedir/xyz.properties")
config.setProperty('storage.hbase.table','some_table')
graph = GraphFactory.open(config)
g=graph.traversal()
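
A quick sanity check right after connecting (console sketch):

g.V().count()           // non-zero if the stored data is reachable
graph.tx().rollback()   // discard the read-only transaction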

with xyz.properties looking like:
gremlin.graph=com.thinkaurelius.titan.core.TitanFactory

# HBase config
storage.backend=hbase
storage.hostname=fqdn1,fqdn2,fqdn3

# Titan caching engine
cache.db-cache = true
cache.db-cache-clean-wait = 20
cache.db-cache-time = 180000
cache.db-cache-size = 0.5
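
For JanusGraph instead of Titan, the factory line would point at the JanusGraph class; the rest of the file stays the same, and the table can also be set in the file rather than in code (table name illustrative):

gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=hbase
storage.hostname=fqdn1,fqdn2,fqdn3
storage.hbase.table=some_table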

If this is OK, you probably still have to add your cluster's conf and lib dirs to the Gremlin shell's classpath. This will look something like:

#!/bin/bash

export SCRIPT_PWD=$PWD

HDP_VERSION=2.5.3.0-37

if [[ `ls -l /usr/hdp/current` != *"$HDP_VERSION"* ]]
then
  echo "HDP_VERSION config in this script does not match active HDP stack"
  exit 1
fi
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export HADOOP_CONF_DIR=$HADOOP_HOME/conf
export YARN_HOME=/usr/hdp/current/hadoop-yarn-client
export YARN_CONF_DIR=$HADOOP_CONF_DIR
export SPARK_HOME=/usr/hdp/current/spark-client
export SPARK_CONF_DIR=$SPARK_HOME/conf
export HBASE_HOME=/usr/hdp/current/hbase-client
export HBASE_CONF_DIR=$HBASE_HOME/conf

# HADOOP_JARS is HADOOP_HOME without servlet-api
# Assembled manually with:
# for filename in /usr/hdp/current/hadoop-client/lib/*.jar; do
#    HADOOPJARS=$HADOOPJARS:"$filename"
# done
# echo $HADOOPJARS
HADOOP_JARS=/usr/hdp/current/hadoop-client/*.jar:/usr/hdp/current/hadoop-client/lib/activation-1.1.jar:/usr/hdp/current/hadoop-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/asm-3.2.jar:/usr/hdp/current/hadoop-client/lib/avro-1.7.4.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-core-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-kms-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-s3-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/azure-keyvault-core-0.8.0.jar:/usr/hdp/current/hadoop-client/lib/azure-storage-4.2.0.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-client/lib/commons-codec-1.4.jar:/usr/hdp/current/hadoop-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hadoop-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hadoop-client/lib/commons-io-2.4.jar:/usr/hdp/current/hadoop-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hadoop-client/lib/commons-lang3-3.4.jar:/usr/hdp/current/hadoop-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-client/lib/commons-net-3.1.jar:/usr/hdp/current/hadoop-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/gson-2.2.4.jar:/usr/hdp/current/hadoop-client/lib/guava-11.0.2.jar:/usr/hdp/current/hadoop-client/lib/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-client/lib/htrace-core-3
.1.0-incubating.jar:/usr/hdp/current/hadoop-client/lib/httpclient-4.5.2.jar:/usr/hdp/current/hadoop-client/lib/httpcore-4.4.4.jar:/usr/hdp/current/hadoop-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-client/lib/jcip-annotations-1.0.jar:/usr/hdp/current/hadoop-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hadoop-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hadoop-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-client/lib/jettison-1.1.jar:/usr/hdp/current/hadoop-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/joda-time-2.8.1.jar:/usr/hdp/current/hadoop-client/lib/jsch-0.1.42.jar:/usr/hdp/current/hadoop-client/lib/json-smart-1.1.1.jar:/usr/hdp/current/hadoop-client/lib/jsp-api-2.1.jar:/usr/hdp/current/hadoop-client/lib/jsr305-3.0.0.jar:/usr/hdp/current/hadoop-client/lib/junit-4.11.jar:/usr/hdp/current/hadoop-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hadoop-client/lib/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-client/lib/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-client/lib/nimbus-jose-jwt-3.9.jar:/usr/hdp/current/hadoop-client/lib/ojdbc6.jar:/usr/hdp/current/hadoop-client/lib/paranamer-2.3.jar:/usr/hdp/current/hadoop-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-client/lib/ranger-hdfs-plugin-shim-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/ranger-pl
ugin-classloader-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/ranger-yarn-plugin-shim-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/slf4j-api-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hadoop-client/lib/xz-1.0.jar:/usr/hdp/current/hadoop-client/lib/zookeeper-3.4.6.2.5.3.0-37.jar

HBASE_JARS=/usr/hdp/current/hbase-client/lib/activation-1.1.jar:/usr/hdp/current/hbase-client/lib/aopalliance-1.0.jar:/usr/hdp/current/hbase-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hbase-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hbase-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hbase-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hbase-client/lib/asm-3.1.jar:/usr/hdp/current/hbase-client/lib/avro-1.7.4.jar:/usr/hdp/current/hbase-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hbase-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hbase-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hbase-client/lib/commons-codec-1.9.jar:/usr/hdp/current/hbase-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hbase-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hbase-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hbase-client/lib/commons-daemon-1.0.13.jar:/usr/hdp/current/hbase-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hbase-client/lib/commons-el-1.0.jar:/usr/hdp/current/hbase-client/lib/commons-httpclient-3.1.jar:/usr/hdp/current/hbase-client/lib/commons-io-2.4.jar:/usr/hdp/current/hbase-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hbase-client/lib/commons-logging-1.2.jar:/usr/hdp/current/hbase-client/lib/commons-math-2.2.jar:/usr/hdp/current/hbase-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hbase-client/lib/commons-net-3.1.jar:/usr/hdp/current/hbase-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hbase-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hbase-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hbase-client/lib/disruptor-3.3.0.jar:/usr/hdp/current/hbase-client/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/current/hbase-client/lib/gson-2.2.4.jar:/usr/hdp/current/hbase-client/lib/guava-12.0.1.jar:/usr/hdp/current/hbase-client/lib/guice-3.0.jar:/usr/hdp/current/hbase-client/lib/guice-servlet-3.0.jar:/usr/hdp/current/hbase-client/lib/h
base-annotations-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-annotations-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-annotations.jar:/usr/hdp/current/hbase-client/lib/hbase-client-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-client.jar:/usr/hdp/current/hbase-client/lib/hbase-common-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-common-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-common.jar:/usr/hdp/current/hbase-client/lib/hbase-examples-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-examples.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop2-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop2-compat.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop-compat.jar:/usr/hdp/current/hbase-client/lib/hbase-it-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-it-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-it.jar:/usr/hdp/current/hbase-client/lib/hbase-prefix-tree-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-prefix-tree.jar:/usr/hdp/current/hbase-client/lib/hbase-procedure-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-procedure.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol.jar:/usr/hdp/current/hbase-client/lib/hbase-resource-bundle-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-resource-bundle.jar:/usr/hdp/current/hbase-client/lib/hbase-rest-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-rest.jar:/usr/hdp/current/hbase-client/lib/hbase-server-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-server-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-server.jar:/usr/hdp/current/hbase-client/lib/hbase-shell-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hb
ase-shell.jar:/usr/hdp/current/hbase-client/lib/hbase-thrift-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-thrift.jar:/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/hbase-client/lib/httpclient-4.2.5.jar:/usr/hdp/current/hbase-client/lib/httpcore-4.2.5.jar:/usr/hdp/current/hbase-client/lib/jamon-runtime-2.3.1.jar:/usr/hdp/current/hbase-client/lib/jasper-compiler-5.5.23.jar:/usr/hdp/current/hbase-client/lib/jasper-runtime-5.5.23.jar:/usr/hdp/current/hbase-client/lib/javax.inject-1.jar:/usr/hdp/current/hbase-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hbase-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hbase-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hbase-client/lib/jcodings-1.0.8.jar:/usr/hdp/current/hbase-client/lib/jersey-client-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-guice-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hbase-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/hbase-client/lib/jettison-1.3.3.jar:/usr/hdp/current/hbase-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/joni-2.1.2.jar:/usr/hdp/current/hbase-client/lib/jruby-complete-1.6.8.jar:/usr/hdp/current/hbase-client/lib/jsch-0.1.42.jar:/usr/hdp/current/hbase-client/lib/jsp-2.1-6.1.14.jar:/usr/hdp/current/hbase-client/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/current/hbase-client/lib/jsr305-1.3.9.jar:/usr/hdp/current/hbase-client/lib/junit-4.11.jar:/usr/hdp/current/hbase-client/lib/leveldbjni-all-1.8.jar:/usr/hdp/current/hbase-client/lib/libthrift-0.9.0.jar:/usr/hdp/current/hbase-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hbase-client/lib/metrics-core-2.2.0.jar:/usr/hdp/current/hbase-client/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/curre
nt/hbase-client/lib/okhttp-2.4.0.jar:/usr/hdp/current/hbase-client/lib/okio-1.4.0.jar:/usr/hdp/current/hbase-client/lib/paranamer-2.3.jar:/usr/hdp/current/hbase-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hbase-client/lib/ranger-hbase-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/slf4j-api-1.7.7.jar:/usr/hdp/current/hbase-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hbase-client/lib/spymemcached-2.11.6.jar:/usr/hdp/current/hbase-client/lib/xercesImpl-2.9.1.jar:/usr/hdp/current/hbase-client/lib/xml-apis-1.3.04.jar:/usr/hdp/current/hbase-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hbase-client/lib/xz-1.0.jar:/usr/hdp/current/hbase-client/lib/zookeeper.jar


source "$HADOOP_CONF_DIR"/hadoop-env.sh
source "$YARN_CONF_DIR"/yarn-env.sh
source "$SPARK_HOME"/bin/load-spark-env.sh
source "$HBASE_CONF_DIR"/hbase-env.sh

export GREMLIN_LOG_LEVEL=WARN
export JAVA_OPTIONS="$JAVA_OPTIONS -Djava.library.path=/usr/hdp/current/hadoop-client/lib/native -Dtinkerpop.ext=ext -Dlog4j.configuration=conf/log4j-console.properties -Dhdp.version=$HDP_VERSION -Dgremlin.log4j.level=$GREMLIN_LOG_LEVEL"

# for gremlin to use spark plugin
GREMLINHOME=/your/libpath/titan-1.1-graben1437-hadoop2
export HADOOP_GREMLIN_LIBS=$GREMLINHOME/lib:$HBASE_HOME/lib

# for gremlin to connect to cluster hdfs
export CLASSPATH=$HADOOP_JARS:$HADOOP_HOME/etc/hadoop

# for gremlin to connect to cluster hbase
export CLASSPATH=$CLASSPATH:$HBASE_JARS:$HBASE_HOME/conf

# for gremlin to connect to cluster yarn with spark
export CLASSPATH=$GREMLINHOME/lib/*:$YARN_HOME/*:$YARN_CONF_DIR:$SPARK_HOME/lib/*:$SPARK_CONF_DIR:$CLASSPATH

cd "$GREMLINHOME"

exec "$GREMLINHOME"/bin/gremlin.sh "$@"



As you can see, I had to solve some version conflicts with my cluster. The Spark OLAP part has never worked for me, but it still lingers in the gremlin start script.
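A quick way to spot such conflicts is to compare jar basenames, ignoring version suffixes, across the lib directories that end up on the classpath. This is a minimal sketch; the helper name and the sample paths are illustrative, not from the thread:

```shell
# List jar basenames that occur more than once after stripping version
# suffixes; duplicates are candidates for classpath version conflicts.
find_dupes() {
  for f in "$@"; do
    basename "$f" | sed -E 's/-[0-9][0-9.]*.*\.jar$/.jar/'
  done | sort | uniq -d
}

# Demo with made-up paths: guava appears twice under different versions.
find_dupes a/guava-11.0.2.jar b/guava-12.0.1.jar c/asm-3.2.jar
# prints: guava.jar
```

Run over something like `$GREMLINHOME/lib/*.jar` and `$HBASE_HOME/lib/*.jar`, it would flag jars such as guava that ship in both trees under different versions.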

Cheers,    Marc



Joe Obernberger <joseph.o...@...>
 

Thank you, Marc. You raise a good point: I can access the graph fine from Java code using queries like:

JanusGraphQuery query = graph.query().has("name", "test");
Iterator<Vertex> it = query.vertices().iterator();
if (it.hasNext()) {
    Vertex v = it.next();
    System.out.println("Found: " + v.label());
}

This works fine, which really makes me think the issue is some library/jar incompatibility in gremlin.sh's classpath.

-Joe




Joe Obernberger <joseph.o...@...>
 

Thanks for the help, Marc. Turns out I had set storage.hbase.short-cf-names to false in my code and didn't set it in the gremlin properties file. You have no idea how many days I've been screwing around, only to find out it was just a parameter!

-Joe
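For anyone hitting the same wall: any non-default option set programmatically when loading the graph has to be mirrored in the properties file that the gremlin shell opens, otherwise the shell reads the table under different column-family names and g.V().count() returns 0. A sketch of such a J.properties follows (hostnames and table name are placeholders, not Joe's actual configuration):

```properties
# conf/J.properties (sketch with placeholder values)
gremlin.graph=org.janusgraph.core.JanusGraphFactory

# HBase backend; hostnames and table name are illustrative
storage.backend=hbase
storage.hostname=fqdn1,fqdn2,fqdn3
storage.hbase.table=J

# Must match the value used by the loading code, or the shell
# looks up the wrong column-family names and sees no data.
storage.hbase.short-cf-names=false
```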


On 4/18/2017 10:27 AM, HadoopMarc wrote:
Hi Joseph,

I am still on Titan/HBase, but the things should work the same.

Code to connect:

config = new PropertiesConfiguration("somedir/xyz.properties")
config.setProperty('storage.hbase.table','some_table')
graph = GraphFactory.open(config)
g=graph.traversal()

 with xyz.properties looking like:
gremlin.graph=com.thinkaurelius.titan.core.TitanFactory

# HBase config
storage.backend=hbase
storage.hostname=fqdn1,fqdn2,fqdn3

# Titan caching engine
cache.db-cache = true
cache.db-cache-clean-wait = 20
cache.db-cache-time = 180000
cache.db-cache-size = 0.5

If this is ok, you probably have to add your cluster conf and lib dirs to gremlin shell's classpath. This will look something like:

#!/bin/bash

export SCRIPT_PWD=$PWD

HDP_VERSION=2.5.3.0-37

if [[ `ls -l /usr/hdp/current` != *"$HDP_VERSION"* ]]
then
  echo "HDP_VERSION config in this script does not match active HDP stack"
  exit 1
fi
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export HADOOP_CONF_DIR=$HADOOP_HOME/conf
export YARN_HOME=/usr/hdp/current/hadoop-yarn-client
export YARN_CONF_DIR=$HADOOP_CONF_DIR
export SPARK_HOME=/usr/hdp/current/spark-client
export SPARK_CONF_DIR=$SPARK_HOME/conf
export HBASE_HOME=/usr/hdp/current/hbase-client
export HBASE_CONF_DIR=$HBASE_HOME/conf

# HADOOP_JARS is HADOOP_HOME without servlet-api
# Assembled manually with:
# for filename in /usr/hdp/current/hadoop-client/lib/*.jar; do
#    HADOOPJARS=$HADOOPJARS:"$filename"
# done
# echo $HADOOPJARS
HADOOP_JARS=/usr/hdp/current/hadoop-clientlib/*.jar:/usr/hdp/current/hadoop-client/lib/activation-1.1.jar:/usr/hdp/current/hadoop-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hadoop-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hadoop-client/lib/asm-3.2.jar:/usr/hdp/current/hadoop-client/lib/avro-1.7.4.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-core-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-kms-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-s3-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/azure-keyvault-core-0.8.0.jar:/usr/hdp/current/hadoop-client/lib/azure-storage-4.2.0.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hadoop-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hadoop-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-client/lib/commons-codec-1.4.jar:/usr/hdp/current/hadoop-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hadoop-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hadoop-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hadoop-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hadoop-client/lib/commons-io-2.4.jar:/usr/hdp/current/hadoop-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hadoop-client/lib/commons-lang3-3.4.jar:/usr/hdp/current/hadoop-client/lib/commons-logging-1.1.3.jar:/usr/hdp/current/hadoop-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hadoop-client/lib/commons-net-3.1.jar:/usr/hdp/current/hadoop-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hadoop-client/lib/gson-2.2.4.jar:/usr/hdp/current/hadoop-client/lib/guava-11.0.2.jar:/usr/hdp/current/hadoop-client/lib/hamcrest-core-1.3.jar:/usr/hdp/current/hadoop-client/lib/htrace-core-3
.1.0-incubating.jar:/usr/hdp/current/hadoop-client/lib/httpclient-4.5.2.jar:/usr/hdp/current/hadoop-client/lib/httpcore-4.4.4.jar:/usr/hdp/current/hadoop-client/lib/jackson-annotations-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-databind-2.2.3.jar:/usr/hdp/current/hadoop-client/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/jackson-xc-1.9.13.jar:/usr/hdp/current/hadoop-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hadoop-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hadoop-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hadoop-client/lib/jcip-annotations-1.0.jar:/usr/hdp/current/hadoop-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hadoop-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hadoop-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/hadoop-client/lib/jettison-1.1.jar:/usr/hdp/current/hadoop-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hadoop-client/lib/joda-time-2.8.1.jar:/usr/hdp/current/hadoop-client/lib/jsch-0.1.42.jar:/usr/hdp/current/hadoop-client/lib/json-smart-1.1.1.jar:/usr/hdp/current/hadoop-client/lib/jsp-api-2.1.jar:/usr/hdp/current/hadoop-client/lib/jsr305-3.0.0.jar:/usr/hdp/current/hadoop-client/lib/junit-4.11.jar:/usr/hdp/current/hadoop-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hadoop-client/lib/mockito-all-1.8.5.jar:/usr/hdp/current/hadoop-client/lib/netty-3.6.2.Final.jar:/usr/hdp/current/hadoop-client/lib/nimbus-jose-jwt-3.9.jar:/usr/hdp/current/hadoop-client/lib/ojdbc6.jar:/usr/hdp/current/hadoop-client/lib/paranamer-2.3.jar:/usr/hdp/current/hadoop-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hadoop-client/lib/ranger-hdfs-plugin-shim-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/ranger-pl
ugin-classloader-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/ranger-yarn-plugin-shim-0.6.0.2.5.3.0-37.jar:/usr/hdp/current/hadoop-client/lib/slf4j-api-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/current/hadoop-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hadoop-client/lib/stax-api-1.0-2.jar:/usr/hdp/current/hadoop-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hadoop-client/lib/xz-1.0.jar:/usr/hdp/current/hadoop-client/lib/zookeeper-3.4.6.2.5.3.0-37.jar

HBASE_JARS=/usr/hdp/current/hbase-client/lib/activation-1.1.jar:/usr/hdp/current/hbase-client/lib/aopalliance-1.0.jar:/usr/hdp/current/hbase-client/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/current/hbase-client/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/current/hbase-client/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/current/hbase-client/lib/api-util-1.0.0-M20.jar:/usr/hdp/current/hbase-client/lib/asm-3.1.jar:/usr/hdp/current/hbase-client/lib/avro-1.7.4.jar:/usr/hdp/current/hbase-client/lib/commons-beanutils-1.7.0.jar:/usr/hdp/current/hbase-client/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/current/hbase-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hbase-client/lib/commons-codec-1.9.jar:/usr/hdp/current/hbase-client/lib/commons-collections-3.2.2.jar:/usr/hdp/current/hbase-client/lib/commons-compress-1.4.1.jar:/usr/hdp/current/hbase-client/lib/commons-configuration-1.6.jar:/usr/hdp/current/hbase-client/lib/commons-daemon-1.0.13.jar:/usr/hdp/current/hbase-client/lib/commons-digester-1.8.jar:/usr/hdp/current/hbase-client/lib/commons-el-1.0.jar:/usr/hdp/current/hbase-client/lib/commons-httpclient-3.1.jar:/usr/hdp/current/hbase-client/lib/commons-io-2.4.jar:/usr/hdp/current/hbase-client/lib/commons-lang-2.6.jar:/usr/hdp/current/hbase-client/lib/commons-logging-1.2.jar:/usr/hdp/current/hbase-client/lib/commons-math-2.2.jar:/usr/hdp/current/hbase-client/lib/commons-math3-3.1.1.jar:/usr/hdp/current/hbase-client/lib/commons-net-3.1.jar:/usr/hdp/current/hbase-client/lib/curator-client-2.7.1.jar:/usr/hdp/current/hbase-client/lib/curator-framework-2.7.1.jar:/usr/hdp/current/hbase-client/lib/curator-recipes-2.7.1.jar:/usr/hdp/current/hbase-client/lib/disruptor-3.3.0.jar:/usr/hdp/current/hbase-client/lib/findbugs-annotations-1.3.9-1.jar:/usr/hdp/current/hbase-client/lib/gson-2.2.4.jar:/usr/hdp/current/hbase-client/lib/guava-12.0.1.jar:/usr/hdp/current/hbase-client/lib/guice-3.0.jar:/usr/hdp/current/hbase-client/lib/guice-servlet-3.0.jar:/usr/hdp/current/hbase-client/lib/h
base-annotations-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-annotations-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-annotations.jar:/usr/hdp/current/hbase-client/lib/hbase-client-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-client.jar:/usr/hdp/current/hbase-client/lib/hbase-common-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-common-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-common.jar:/usr/hdp/current/hbase-client/lib/hbase-examples-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-examples.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop2-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop2-compat.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop-compat-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-hadoop-compat.jar:/usr/hdp/current/hbase-client/lib/hbase-it-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-it-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-it.jar:/usr/hdp/current/hbase-client/lib/hbase-prefix-tree-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-prefix-tree.jar:/usr/hdp/current/hbase-client/lib/hbase-procedure-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-procedure.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol.jar:/usr/hdp/current/hbase-client/lib/hbase-resource-bundle-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-resource-bundle.jar:/usr/hdp/current/hbase-client/lib/hbase-rest-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-rest.jar:/usr/hdp/current/hbase-client/lib/hbase-server-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-server-1.1.2.2.4.2.0-258-tests.jar:/usr/hdp/current/hbase-client/lib/hbase-server.jar:/usr/hdp/current/hbase-client/lib/hbase-shell-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hb
ase-shell.jar:/usr/hdp/current/hbase-client/lib/hbase-thrift-1.1.2.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/hbase-thrift.jar:/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/current/hbase-client/lib/httpclient-4.2.5.jar:/usr/hdp/current/hbase-client/lib/httpcore-4.2.5.jar:/usr/hdp/current/hbase-client/lib/jamon-runtime-2.3.1.jar:/usr/hdp/current/hbase-client/lib/jasper-compiler-5.5.23.jar:/usr/hdp/current/hbase-client/lib/jasper-runtime-5.5.23.jar:/usr/hdp/current/hbase-client/lib/javax.inject-1.jar:/usr/hdp/current/hbase-client/lib/java-xmlbuilder-0.4.jar:/usr/hdp/current/hbase-client/lib/jaxb-api-2.2.2.jar:/usr/hdp/current/hbase-client/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/current/hbase-client/lib/jcodings-1.0.8.jar:/usr/hdp/current/hbase-client/lib/jersey-client-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-core-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-guice-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-json-1.9.jar:/usr/hdp/current/hbase-client/lib/jersey-server-1.9.jar:/usr/hdp/current/hbase-client/lib/jets3t-0.9.0.jar:/usr/hdp/current/hbase-client/lib/jettison-1.3.3.jar:/usr/hdp/current/hbase-client/lib/jetty-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/jetty-sslengine-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/current/hbase-client/lib/joni-2.1.2.jar:/usr/hdp/current/hbase-client/lib/jruby-complete-1.6.8.jar:/usr/hdp/current/hbase-client/lib/jsch-0.1.42.jar:/usr/hdp/current/hbase-client/lib/jsp-2.1-6.1.14.jar:/usr/hdp/current/hbase-client/lib/jsp-api-2.1-6.1.14.jar:/usr/hdp/current/hbase-client/lib/jsr305-1.3.9.jar:/usr/hdp/current/hbase-client/lib/junit-4.11.jar:/usr/hdp/current/hbase-client/lib/leveldbjni-all-1.8.jar:/usr/hdp/current/hbase-client/lib/libthrift-0.9.0.jar:/usr/hdp/current/hbase-client/lib/log4j-1.2.17.jar:/usr/hdp/current/hbase-client/lib/metrics-core-2.2.0.jar:/usr/hdp/current/hbase-client/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/curre
nt/hbase-client/lib/okhttp-2.4.0.jar:/usr/hdp/current/hbase-client/lib/okio-1.4.0.jar:/usr/hdp/current/hbase-client/lib/paranamer-2.3.jar:/usr/hdp/current/hbase-client/lib/protobuf-java-2.5.0.jar:/usr/hdp/current/hbase-client/lib/ranger-hbase-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/current/hbase-client/lib/slf4j-api-1.7.7.jar:/usr/hdp/current/hbase-client/lib/snappy-java-1.0.4.1.jar:/usr/hdp/current/hbase-client/lib/spymemcached-2.11.6.jar:/usr/hdp/current/hbase-client/lib/xercesImpl-2.9.1.jar:/usr/hdp/current/hbase-client/lib/xml-apis-1.3.04.jar:/usr/hdp/current/hbase-client/lib/xmlenc-0.52.jar:/usr/hdp/current/hbase-client/lib/xz-1.0.jar:/usr/hdp/current/hbase-client/lib/zookeeper.jar


source "$HADOOP_CONF_DIR"/hadoop-env.sh
source "$YARN_CONF_DIR"/yarn-env.sh
source "$SPARK_HOME"/bin/load-spark-env.sh
source "$HBASE_CONF_DIR"/hbase-env.sh

export GREMLIN_LOG_LEVEL=WARN
export JAVA_OPTIONS="$JAVA_OPTIONS -Djava.library.path=/usr/hdp/current/hadoop-client/lib/native -Dtinkerpop.ext=ext -Dlog4j.configuration=conf/log4j-console.properties -Dhdp.version=$HDP_VERSION -Dgremlin.log4j.level=$GREMLIN_LOG_LEVEL"

# for gremlin to use spark plugin
GREMLINHOME=/your/libpath/titan-1.1-graben1437-hadoop2
export HADOOP_GREMLIN_LIBS=$GREMLINHOME/lib:$HBASE_HOME/lib

# for gremlin to connect to cluster hdfs
export CLASSPATH=$HADOOP_JARS:$HADOOP_HOME/etc/hadoop

# for gremlin to connect to cluster hbase
export CLASSPATH=$CLASSPATH:$HBASE_JARS:$HBASE_HOME/conf

# for gremlin to connect to cluster yarn with spark
export CLASSPATH=$GREMLINHOME/lib/*:$YARN_HOME/*:$YARN_CONF_DIR:$SPARK_HOME/lib/*:$SPARK_CONF_DIR:$CLASSPATH

cd "$GREMLINHOME"

exec "$GREMLINHOME"/bin/gremlin.sh "$@"



As you can see, I had to solve some version conflicts with my cluster. The Spark OLAP part has never worked for me, but it still lingers in the gremlin start script.

Cheers,    Marc

--
You received this message because you are subscribed to the Google Groups "JanusGraph users list" group.
To unsubscribe from this group and stop receiving emails from it, send an email to janusgraph-use...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



Jason Plurad <plu...@...>
 

Joe, thanks for reporting back with your solution. This sounds like a bug. I've opened up an issue to track it.
https://github.com/JanusGraph/janusgraph/issues/220

-- Jason


On Tuesday, April 18, 2017 at 2:31:40 PM UTC-4, Joseph Obernberger wrote:

Thanks for the help, Marc.  Turns out I had set storage.hbase.short-cf-names to false in my code and didn't set it in the gremlin properties file.  You have no idea how many days I've been screwing around to finally find out it was just a parameter! 

-Joe
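
For anyone hitting the same symptom: the HBase storage options used by the code that writes the graph must match the properties file the gremlin console opens, or the console looks up column families under different names and sees an empty graph. A minimal sketch, assuming the JanusGraphFactory builder API; the host list and table name below are placeholders, not values from this thread:

```groovy
// Ingest side: short-cf-names is set to a non-default value here,
// so every other client of this table must carry the same setting.
graph = JanusGraphFactory.build().
    set('storage.backend', 'hbase').
    set('storage.hostname', 'fqdn1,fqdn2,fqdn3').
    set('storage.hbase.table', 'J').
    set('storage.hbase.short-cf-names', false).
    open()
```

The properties file loaded in the gremlin console then needs the matching line storage.hbase.short-cf-names=false next to the usual storage.backend, storage.hostname, and storage.hbase.table entries.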

