
Issue Setting up ConfiguredGraphFactory

Paul Heinzlreiter
 

Hi all,

 

I am trying to set up the Janusgraph ConfiguredGraphFactory for using multiple graphs.

I am using janusgraph-full-0.6.2

 

I am trying to configure it for the inmemory graph to exclude other possible issues.

 

I am using a janusgraph-inmemory-configurationgraph.properties:

gremlin.graph=org.janusgraph.core.ConfiguredGraphFactory

graph.graphname=ConfigurationManagementGraph

storage.backend=inmemory

 

And in gremlin-server.yaml:

 

graphs: {

  ConfigurationManagementGraph: conf/janusgraph-inmemory-configurationgraph.properties

}

 

After restarting Gremlin Server, I want to set up a template configuration:

 

gremlin> :remote connect tinkerpop.server conf/remote.yaml

==>Configured localhost/127.0.0.1:8182

gremlin> :remote console

==>All scripts will now be sent to Gremlin Server - [localhost/127.0.0.1:8182] - type ':remote console' to return to local mode

gremlin> map = new HashMap();

gremlin> map.put("storage.hostname", "inmemory");

 

This leads to the following exception (from the JanusGraph Server log):

 

[gremlin-server-exec-3] WARN  org.apache.tinkerpop.gremlin.server.op.AbstractEvalOpProcessor  - Exception processing a script on request [RequestMessage{, requestId=2022abcc-d5fb-491e-bcc6-6952e7200339, op='eval', processor='', args={gremlin=map.put("storage.hostname", "inmemory");, bindings={}, batchSize=64}}].

groovy.lang.MissingPropertyException: No such property: map for class: Script4

 

So the map type cannot be (de)serialized?
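
As far as I understand, a sessionless remote connection evaluates each request independently, so a variable like map defined in one request is simply gone in the next one, which would explain the MissingPropertyException. A minimal sketch of what I would try instead (assuming the ConfigurationManagementGraph is set up on the server, and using storage.backend rather than storage.hostname for the inmemory backend):

gremlin> :remote connect tinkerpop.server conf/remote.yaml session
gremlin> :remote console
gremlin> map = new HashMap<String, Object>()
gremlin> map.put("storage.backend", "inmemory")
gremlin> ConfiguredGraphFactory.createTemplateConfiguration(new MapConfiguration(map))
gremlin> ConfiguredGraphFactory.create("graph1")

or, staying sessionless, sending everything as a single request:

gremlin> ConfiguredGraphFactory.createTemplateConfiguration(new MapConfiguration(["storage.backend": "inmemory"]))

The documented ConfiguredGraphFactory setup also expects gremlin-server.yaml to use the JanusGraphManager (graphManager: org.janusgraph.graphdb.management.JanusGraphManager) next to the ConfigurationManagementGraph entry under graphs.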

 

Any help appreciated.

 

Regards,

Paul Heinzlreiter

 

--

DI Paul Heinzlreiter
Senior Data Engineer · Unit Logistics Informatics
paul.heinzlreiter@... · https://www.risc-software.at
+43 664 219 79 54

RISC Software GmbH
Softwarepark 32a · 4232 Hagenberg · Austria · +43 7236 93028
Commercial register no.: 89831f · Commercial register court: Landesgericht Linz, Austria

A company of Johannes Kepler University Linz

 

 


What Unit of Measurement Do the Time-Related JMX Metrics Use?

sammy.jia@...
 

Hello. I have a basic question about JanusGraph's metrics.

Currently, I have a JMX exporter that is sending JanusGraph's metrics to Prometheus. Some of these metrics, such as "metrics_org_janusgraph_query_graph_execute_time_Mean" or "metrics_org_janusgraph_query_graph_isDeleted_time_Count", are time-related but don't explicitly say what unit of measurement is being used. I'm not sure if they are using milliseconds, nanoseconds, or some other unit.

Does anyone happen to know what unit of measurement is being used in JanusGraph?
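
In case it helps: these metrics come from the Dropwizard Metrics library, whose Timer records durations in nanoseconds internally; what a reporter shows depends on how it converts them. A rough sketch of how one could inspect a raw timer from the Gremlin Console on the server (the registry accessor and the metric name below are assumptions on my side and may differ per JanusGraph version):

gremlin> registry = org.janusgraph.util.stats.MetricManager.INSTANCE.getRegistry()
gremlin> timer = registry.getTimers().get("org.janusgraph.query.graph.execute.time")
gremlin> timer.getSnapshot().getMean()   // raw Dropwizard snapshot values are nanoseconds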


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Yingjie Li
 

Hello Marc,

Yes, after applying the changes you suggested, it works now. I can load data and use gremlin successfully!

Thanks to all of you, Marc, Jan and Boxuan, for your help in fixing the security issue!

Best,
Yingjie
 


Thanks,
Yingjie


On Fri, Sep 30, 2022 at 3:46 AM <hadoopmarc@...> wrote:
Hi Yingjie,

OK, I tried for myself. From the initial log lines in the Gremlin Console:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tera/lib/janusgraph-full-1.0.0-SNAPSHOT/lib/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tera/lib/janusgraph-full-1.0.0-SNAPSHOT/lib/logback-classic-1.2.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

you can see that you also have to remove lib/log4j-slf4j-impl-2.18.0.jar

After having done that, you will notice that the hadoop and spark plugins also depend on log4j. You can disable these by
removing the corresponding lines from the ext/plugins.txt file.

It seems the distribution now meets your requirements!

Best wishes,   Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

hadoopmarc@...
 

Hi Yingjie,

OK, I tried for myself. From the initial log lines in the Gremlin Console:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tera/lib/janusgraph-full-1.0.0-SNAPSHOT/lib/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/tera/lib/janusgraph-full-1.0.0-SNAPSHOT/lib/logback-classic-1.2.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

you can see that you also have to remove lib/log4j-slf4j-impl-2.18.0.jar

After having done that, you will notice that the hadoop and spark plugins also depend on log4j. You can disable these by
removing the corresponding lines from the ext/plugins.txt file.
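
For illustration, the entries to remove would look roughly like the following (the exact lines in ext/plugins.txt may differ per distribution):

org.apache.tinkerpop.gremlin.hadoop.jsr223.HadoopGremlinPlugin
org.apache.tinkerpop.gremlin.spark.jsr223.SparkGremlinPlugin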

It seems the distribution now meets your requirements!

Best wishes,   Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Yingjie Li
 

Hello Marc,

For this build, in directory janusgraph-full-1.0.0-SNAPSHOT/lib, there are log4j-1.2.17.jar, log4j-api-2.18.0.jar, log4j-core-2.18.0.jar, and log4j-slf4j-impl-2.18.0.jar, but no slf4j-log4j12-1.7.30.jar.

In directory janusgraph-full-1.0.0-SNAPSHOT/elasticsearch/lib, there are elasticsearch-log4j-7.17.5.jar and log4j-api-2.17.1.jar.

Yingjie


On Fri, Sep 30, 2022 at 1:51 AM <hadoopmarc@...> wrote:
Hi Yingjie,

See my earlier comment, with respect to janusgraph-0.6.2:

>>My suggestion was incomplete. In addition to removing the log4j-1.2.17.jar file from the lib folder, you have to remove the slf4j-log4j12-1.7.30.jar file as well. Otherwise, JanusGraph server starts looking for the log4j jar and crashes, as you found out.

Can you confirm that log4j-2x in the elasticsearch/lib folder now has the required version?

Best wishes,   Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

hadoopmarc@...
 

Hi Yingjie,

See my earlier comment, with respect to janusgraph-0.6.2:

>>My suggestion was incomplete. In addition to removing the log4j-1.2.17.jar file from the lib folder, you have to remove the slf4j-log4j12-1.7.30.jar file as well. Otherwise, JanusGraph server starts looking for the log4j jar and crashes, as you found out.

Can you confirm that log4j-2x in the elasticsearch/lib folder now has the required version?

Best wishes,   Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Yingjie Li
 

Hello Jan,
Thanks for the pointers. I downloaded the last build based on the link you provided. I unzipped janusgraph-full-1.0.0-SNAPSHOT.zip and tried starting JanusGraph and gremlin.sh; both worked. I then removed log4j-1.2.17.jar, restarted JanusGraph, and ran gremlin.sh, but it failed.
Does it mean log4j-1.2.17 is still used somehow, and if so, how do I disable it?

Thanks.

Yingjie



On Fri, Sep 16, 2022 at 10:08 AM Jansen, Jan via lists.lfaidata.foundation <Jan.jansen=gdata.de@...> wrote:
Hi Yingjie,

You can also download our latest artifacts from github action. https://github.com/JanusGraph/janusgraph/actions/workflows/ci-release.yml?query=branch%3Amaster+is%3Acompleted

Just go to the last build and download distribution-builds.

Greetings,
Jan

From: janusgraph-users@... <janusgraph-users@...> on behalf of hadoopmarc via lists.lfaidata.foundation <hadoopmarc=xs4all.nl@...>
Sent: Friday, September 16, 2022 2:30 PM
To: janusgraph-users@... <janusgraph-users@...>
Subject: Re: [janusgraph-users] Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"
 
Hi Yingjie,

As edited in my previous reaction, the Cassandra jars in the JanusGraph distribution do not include the log4j jar. As to elasticsearch, your best choices are:
  1. not use mixed indices (check whether your application needs them)
  2. build JanusGraph for the current master branch, as already suggested by Boxuan above. The master branch has a patched Elasticsearch version 7.17

Best wishes,    Marc


Re: Operate with JMX metrics and measurements units. What metrics to observe for read queries?

Ronnie
 

Hi,
Can someone please clarify the default unit for the time-related metrics? I couldn't find this in any of the docs or on the internet.

Thanks!
Ronnie


Re: Composite Indexing not working as expected for property on vertex in janusgraph 0.6.1

hadoopmarc@...
 

Hi Nikita,

Can you please provide me with the complete steps to reproduce your issue? Preferably code lines that work in the Gremlin Console for the JanusGraph inmemory graph, including the schema and indices and the vertex add steps? From your description up till now it is just not possible to make sense of what is happening.

Best wishes,    Marc


Re: Composite Indexing not working as expected for property on vertex in janusgraph 0.6.1

Nikita Pande
 

Hi,

There is one observation:

1. I faced the above issue when the data was loaded with the graph.addVertex() method.

2. I added sample data using the graph traversal g.addV().property().property() and it worked fine.

I want to understand why step 1 does not give any response for the query g.V().has("newid","xyz"), whereas it works with step 2.
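
For reference, a minimal sketch of the two loading paths (the property key and values are illustrative, and each path commits its own transaction):

gremlin> // path 1: core API
gremlin> v = graph.addVertex()
gremlin> v.property("newid", "xyz")
gremlin> graph.tx().commit()
gremlin> // path 2: traversal API
gremlin> g.addV().property("newid", "xyz2").iterate()
gremlin> g.tx().commit()
gremlin> g.V().has("newid", "xyz").hasNext()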




Re: Composite Indexing not working as expected for property on vertex in janusgraph 0.6.1

hadoopmarc@...
 

> Can the indexing work with property of single cardinality?

Yes.


Re: Composite Indexing not working as expected for property on vertex in janusgraph 0.6.1

Nikita Pande
 

Hi hadoopmarc,

Thanks for the elaborate explanation.
The cardinality is SINGLE for the property "newid".

newid                         | SINGLE      | class java.lang.String  


Can the indexing work with property of single cardinality?

Thanks and Regards,
Nikita


Re: Required Capacity Error - JanusGraph on Cassandra

hadoopmarc@...
 

Hi Joe,

With "an index on type and range" you really mean:
mgmt.buildIndex('byTypeAndRange', Vertex.class).addKey(type).addKey(range).buildCompositeIndex()

Indeed, supernodes have little value in traversing graphs. Maybe you can remove the worst ones (they probably have little meaning) or make them into a property on the attached vertices.
If the supernodes are not the ones you want to traverse in your query, maybe a label constraint in the index can help.
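
For illustration, such a label-constrained composite index could be built like this (the vertex label 'item' is hypothetical):

mgmt = graph.openManagement()
type = mgmt.getPropertyKey('type')
range = mgmt.getPropertyKey('range')
item = mgmt.getVertexLabel('item')
mgmt.buildIndex('byTypeAndRangeOnItem', Vertex.class).addKey(type).addKey(range).indexOnly(item).buildCompositeIndex()
mgmt.commit()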

Best wishes,    Marc


Re: Composite Indexing not working as expected for property on vertex in janusgraph 0.6.1

hadoopmarc@...
 

Hi Nikita,

Indeed, the JanusGraph query optimizer cannot handle the has("newid", unfold().is("hash data")) construct in the right way, so the index does not trigger. But then, this construct is not necessary. Even if the "newid" property has a LIST cardinality, you can still do:
g.V().has("newid", "hash data").valueMap(true).tryNext().isPresent() and have the index on newid triggered.

Indeed, your example gremlin> g.V().has("newid","xyz").valueMap(true).tryNext().isPresent() should result in true. I checked it on a fresh database and it does. So, please check the steps you took to get your result.

gremlin> testval = m.makePropertyKey('testval').dataType(String.class).cardinality(Cardinality.LIST).make()
==>testval
gremlin> m.buildIndex('byTestVal', Vertex.class).addKey(testval).buildCompositeIndex()
==>byTestVal
gremlin> m.commit()

gremlin> g.addV().property('testval', 'xyz1')
==>v[8192]
gremlin> g.V(8192).property('testval', 'xyz2')
==>v[8192]
gremlin> g.V(8192).values('testval')
==>xyz1
==>xyz2
gremlin> g.V().has("testval","xyz1").valueMap(true).tryNext().isPresent()
==>true

Best wishes,   Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Jansen, Jan
 

Hi Yingjie,

You can also download our latest artifacts from github action. https://github.com/JanusGraph/janusgraph/actions/workflows/ci-release.yml?query=branch%3Amaster+is%3Acompleted

Just go to the last build and download distribution-builds.

Greetings,
Jan

From: janusgraph-users@... <janusgraph-users@...> on behalf of hadoopmarc via lists.lfaidata.foundation <hadoopmarc=xs4all.nl@...>
Sent: Friday, September 16, 2022 2:30 PM
To: janusgraph-users@... <janusgraph-users@...>
Subject: Re: [janusgraph-users] Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"
 
Hi Yingjie,

As edited in my previous reaction, the Cassandra jars in the JanusGraph distribution do not include the log4j jar. As to elasticsearch, your best choices are:
  1. not use mixed indices (check whether your application needs them)
  2. build JanusGraph for the current master branch, as already suggested by Boxuan above. The master branch has a patched Elasticsearch version 7.17

Best wishes,    Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

hadoopmarc@...
 

Hi Yingjie,

As edited in my previous reaction, the Cassandra jars in the JanusGraph distribution do not include the log4j jar. As to elasticsearch, your best choices are:
  1. not use mixed indices (check whether your application needs them)
  2. build JanusGraph for the current master branch, as already suggested by Boxuan above. The master branch has a patched Elasticsearch version 7.17

Best wishes,    Marc


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Yingjie Li
 

Marc,
Yes, we use Cassandra and Elasticsearch as backends from janusgraph-full-0.6.2. What are the steps to disable log4j?

Thanks
Yingjie


On Fri, Sep 16, 2022 at 2:30 AM <hadoopmarc@...> wrote:


Hi Yingjie,

As to the short-term workaround of removing the log4j jars from the lib folder: you can still use the Gremlin Console if you edit the ext/plugins.txt file and remove the lines with the hadoop and spark plugins.

However, you state that you use elasticsearch, which also ships with log4j in the janusgraph-full-0.6.2.zip distribution.

Best wishes,

Marc

Edited: cassandra does not ship with log4j


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

hadoopmarc@...
 

Hi Yingjie,

As to the short-term workaround of removing the log4j jars from the lib folder: you can still use the Gremlin Console if you edit the ext/plugins.txt file and remove the lines with the hadoop and spark plugins.

However, you state that you use elasticsearch, which also ships with log4j in the janusgraph-full-0.6.2.zip distribution.

Best wishes,

Marc

Edited: cassandra does not ship with log4j


Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Yingjie Li
 

Hello Jan,
Yes, the only issue is the log4j2 that comes with the janusgraph-full-0.6.2.zip installation. If there are configuration changes that can fix this issue, that's all we need. Please let me know the steps.

Thanks,
Yingjie


On Thu, Sep 15, 2022 at 2:58 PM Jansen, Jan via lists.lfaidata.foundation <Jan.jansen=gdata.de@...> wrote:
Hi
It won't be possible to backport this without breaking changes. We could work on a release of JanusGraph 1 after we finish upgrading to TinkerPop 3.6. We have multiple libs with CVEs in 0.6.

If your main issue is log4j2, janusgraph 0.6 should work without log4j2.

Greetings, Jan

From: Li Boxuan <liboxuan@...>
Sent: Thursday, September 15, 2022 8:38 PM
To: janusgraph-users@... <janusgraph-users@...>
Cc: Jansen, Jan <Jan.Jansen@...>
Subject: Re: [janusgraph-users] Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"
 
Sorry I am not sure about that… Maybe Jan (CCed) could answer that?

On Sep 15, 2022, at 2:03 PM, Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:

Hello Boxuan,

Does it mean there is no existing release version for the full install with this issue fixed? Would it be possible to have a release with the fix, as the code seems to be available?

The way we've been using janusgraph is to install  the latest release version  in our docker, after customizing/initializing it with our own schema, together with our models of data fusion and query. 
It worked very well till we hit this security issue.

Thanks,
Yingjie

On Thu, Sep 15, 2022 at 11:40 AM Boxuan Li <liboxuan@...> wrote:
Hi Yingjie,

You might need to backport https://github.com/JanusGraph/janusgraph/pull/2890 to 0.6 branch and build your own JanusGraph. You are also welcome to use the master branch to build JanusGraph.

Best,
Boxuan

On Sep 15, 2022, at 11:38 AM, Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:

Just wondering if anybody has a fix for this? The security issue is a roadblock for us to continue using Janusgraph in our project. 

Thanks

On Mon, Sep 12, 2022 at 9:01 AM Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:
Hello Marc,

Actually my previous testing was incomplete. After removing those two log4j-related jar files from the lib directory, I can start Elasticsearch, Cassandra, and JanusGraph Server successfully. But I got an exception when running ./bin/gremlin.sh, as shown below. What other changes do I need to make?

Thanks,
Yingjie

./bin/gremlin.sh

         \,,,/
         (o o)
-----oOOo-(3)-oOOo-----
plugin activated: tinkerpop.server
plugin activated: tinkerpop.tinkergraph
08:55:29 WARN  org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Level
at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2306)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:94)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.security.Groups.<init>(Groups.java:106)
at org.apache.hadoop.security.Groups.<init>(Groups.java:102)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:450)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:314)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:281)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:837)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:807)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:680)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2978)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2968)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181)
at org.apache.tinkerpop.gremlin.hadoop.jsr223.HadoopGremlinPlugin.lambda$static$0(HadoopGremlinPlugin.java:121)
at org.apache.tinkerpop.gremlin.jsr223.LazyBindingsCustomizer.getBindings(LazyBindingsCustomizer.java:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.MethodMetaProperty$GetBeanMethodMetaProperty.getProperty(MethodMetaProperty.java:76)
at org.codehaus.groovy.runtime.callsite.GetEffectivePojoPropertySite.getProperty(GetEffectivePojoPropertySite.java:63)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGetProperty(AbstractCallSite.java:298)
at org.apache.tinkerpop.gremlin.console.PluggedIn$_activate_closure1.doCall(PluggedIn.groovy:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:263)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
at groovy.lang.Closure.call(Closure.java:405)
at groovy.lang.Closure.call(Closure.java:421)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2136)
at org.codehaus.groovy.runtime.dgm$181.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:244)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
at org.apache.tinkerpop.gremlin.console.PluggedIn.activate(PluggedIn.groovy:59)
at org.apache.tinkerpop.gremlin.console.PluggedIn$activate.call(Unknown Source)
at org.apache.tinkerpop.gremlin.console.Console$_closure18.doCall(Console.groovy:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:263)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
at groovy.lang.Closure.call(Closure.java:405)
at groovy.lang.Closure.call(Closure.java:421)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2330)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2315)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2356)
at org.codehaus.groovy.runtime.dgm$186.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:244)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:115)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
at org.apache.tinkerpop.gremlin.console.Console.<init>(Console.groovy:147)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:237)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:265)
at org.apache.tinkerpop.gremlin.console.Console.main(Console.groovy:524)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 80 more

On Fri, Sep 9, 2022 at 4:08 PM <hadoopmarc@...> wrote:
Hi Yingjie,

My suggestion was incomplete. In addition to removing the log4j-1.2.17.jar file from the lib folder, you have to remove the slf4j-log4j12-1.7.30.jar file as well. Otherwise, JanusGraph server starts looking for the log4j jar and crashes, as you found out.

Best wishes,   Marc









Re: Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"

Jansen, Jan
 

Hi
It won't be possible to backport this without breaking changes. We could work on a release of JanusGraph 1 after we finish upgrading to TinkerPop 3.6. We have multiple libs with CVEs in 0.6.

If your main issue is log4j2, janusgraph 0.6 should work without log4j2.

Greetings, Jan

From: Li Boxuan <liboxuan@...>
Sent: Thursday, September 15, 2022 8:38 PM
To: janusgraph-users@... <janusgraph-users@...>
Cc: Jansen, Jan <Jan.Jansen@...>
Subject: Re: [janusgraph-users] Janusgraph-full-0.6.1: how to fix "WARNING: Critical severity vulnerabilities were found with Log4j!"
 
Sorry I am not sure about that… Maybe Jan (CCed) could answer that?

On Sep 15, 2022, at 2:03 PM, Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:

Hello Boxuan,

Does it mean there is no existing release version for the full install with this issue fixed? Would it be possible to have a release with the fix, as the code seems to be available?

The way we've been using janusgraph is to install  the latest release version  in our docker, after customizing/initializing it with our own schema, together with our models of data fusion and query. 
It worked very well till we hit this security issue.

Thanks,
Yingjie

On Thu, Sep 15, 2022 at 11:40 AM Boxuan Li <liboxuan@...> wrote:
Hi Yingjie,

You might need to backport https://github.com/JanusGraph/janusgraph/pull/2890 to 0.6 branch and build your own JanusGraph. You are also welcome to use the master branch to build JanusGraph.

Best,
Boxuan

On Sep 15, 2022, at 11:38 AM, Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:

Just wondering if anybody has a fix for this? The security issue is a roadblock for us to continue using Janusgraph in our project. 

Thanks

On Mon, Sep 12, 2022 at 9:01 AM Yingjie Li via lists.lfaidata.foundation <yingjie.li=gmail.com@...> wrote:
Hello Marc,

Actually my previous testing was incomplete. After removing those two log4j-related jar files from the lib directory, I can start Elasticsearch, Cassandra, and JanusGraph Server successfully. But I got an exception when running ./bin/gremlin.sh, as shown below. What other changes do I need to make?

Thanks,
Yingjie

./bin/gremlin.sh

         \,,,/
         (o o)
-----oOOo-(3)-oOOo-----
plugin activated: tinkerpop.server
plugin activated: tinkerpop.tinkergraph
08:55:29 WARN  org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Level
at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2306)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:94)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.security.Groups.<init>(Groups.java:106)
at org.apache.hadoop.security.Groups.<init>(Groups.java:102)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:450)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:314)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:281)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:837)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:807)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:680)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2978)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2968)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181)
at org.apache.tinkerpop.gremlin.hadoop.jsr223.HadoopGremlinPlugin.lambda$static$0(HadoopGremlinPlugin.java:121)
at org.apache.tinkerpop.gremlin.jsr223.LazyBindingsCustomizer.getBindings(LazyBindingsCustomizer.java:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.MethodMetaProperty$GetBeanMethodMetaProperty.getProperty(MethodMetaProperty.java:76)
at org.codehaus.groovy.runtime.callsite.GetEffectivePojoPropertySite.getProperty(GetEffectivePojoPropertySite.java:63)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGetProperty(AbstractCallSite.java:298)
at org.apache.tinkerpop.gremlin.console.PluggedIn$_activate_closure1.doCall(PluggedIn.groovy:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:263)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
at groovy.lang.Closure.call(Closure.java:405)
at groovy.lang.Closure.call(Closure.java:421)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2136)
at org.codehaus.groovy.runtime.dgm$181.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:244)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
at org.apache.tinkerpop.gremlin.console.PluggedIn.activate(PluggedIn.groovy:59)
at org.apache.tinkerpop.gremlin.console.PluggedIn$activate.call(Unknown Source)
at org.apache.tinkerpop.gremlin.console.Console$_closure18.doCall(Console.groovy:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:101)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:323)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:263)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
at groovy.lang.Closure.call(Closure.java:405)
at groovy.lang.Closure.call(Closure.java:421)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2330)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2315)
at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2356)
at org.codehaus.groovy.runtime.dgm$186.invoke(Unknown Source)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:244)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:115)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
at org.apache.tinkerpop.gremlin.console.Console.<init>(Console.groovy:147)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:237)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:265)
at org.apache.tinkerpop.gremlin.console.Console.main(Console.groovy:524)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 80 more

On Fri, Sep 9, 2022 at 4:08 PM <hadoopmarc@...> wrote:
Hi Yingjie,

My suggestion was incomplete. In addition to removing the log4j-1.2.17.jar file from the lib folder, you have to remove the slf4j-log4j12-1.7.30.jar file as well. Otherwise, JanusGraph server starts looking for the log4j jar and crashes, as you found out.

Best wishes,   Marc







