[Solved] Jar conflict with Couchbase inside Spark Streaming job

Hi,

Couchbase Java SDK 1.4.4 depends on Netty 3.5.5, but Spark uses Akka Actor, which pulls in a different Netty version.
Does anyone have an idea how to resolve this jar conflict?

Here is the stack trace from inside Spark:

akka.actor.ActorSystemImpl - Uncaught fatal error from thread [spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory.<init>(Ljava/util/concurrent/Executor;ILorg/jboss/netty/channel/socket/nio/WorkerPool;Lorg/jboss/netty/util/Timer;)V

Thanks

1 Answer


To fix this conflict with Spark, you'll need to exclude Netty from the Couchbase client dependency:

        <dependency>
            <groupId>com.couchbase.client</groupId>
            <artifactId>couchbase-client</artifactId>
            <version>1.4.4</version>
            <exclusions>
                <exclusion>
                    <groupId>io.netty</groupId>
                    <artifactId>netty</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
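After adding the exclusion, you can check which Netty version ends up on the classpath by inspecting the resolved dependency tree. A minimal sketch, assuming your project builds with Maven and Netty 3.x is published under either of the usual coordinates (io.netty:netty or the older org.jboss.netty:netty):

        # show only Netty artifacts in the resolved dependency tree;
        # the coordinate patterns below are assumptions about which Netty coordinates are in play
        mvn dependency:tree -Dincludes=io.netty:netty,org.jboss.netty:netty

If the exclusion worked, the Couchbase client should no longer contribute its own Netty, and only the version brought in by Spark/Akka remains.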

Hi,
I am having the same issue with a MapReduce job. I am new to Java and Eclipse. How can I exclude that? Should I just remove it from the jar references?

Thanks.