Kafka with Spark Streaming integration error
I'm not able to get Kafka working with Spark Streaming. These are the steps I've taken so far:

  1. Downloaded the jar file "spark-streaming-kafka-0-8-assembly_2.10-2.2.0.jar" and moved it to /home/ec2-user/spark-2.0.0-bin-hadoop2.7/jars
  2. Added this line to /home/ec2-user/spark-2.0.0-bin-hadoop2.7/conf/spark-defaults.conf.template -> spark.jars.packages org.apache.spark:spark-streaming-kafka-0-8-assembly_2.10:2.2.0

Kafka version: kafka_2.10-0.10.2.2

Jar file version: spark-streaming-kafka-0-8-assembly_2.10-2.2.0.jar

Python code:

    os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-streaming-kafka-0-8-assembly_2.10-2.2.0 pyspark-shell'
    kvs = KafkaUtils.createDirectStream(ssc, ["divolte-data"], {"metadata.broker.list": "localhost:9092"})


But I'm still getting the following error:

    Py4JJavaError: An error occurred while calling o39.createDirectStreamWithoutMessageHandler.
    : java.lang.NoClassDefFoundError: Could not initialize class kafka.consumer.FetchRequestAndResponseStatsRegistry$
        at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
        at org.apache.spark.streaming.kafka.KafkaCluster.connect(KafkaCluster.scala:59)

What am I doing wrong?
  • How did you configure your pom? Are you using metrics-core-2.2.0.jar, i.e. `spark-shell --jars metrics-core-2.2.0.jar`?
    – karma4917
    Nov 8 at 18:39

  • You're using spark-2.0.0, but your jars are for 2.2.0... Those versions should be the same.
    – cricket_007
    Nov 9 at 15:18
Tags: java apache-spark pyspark apache-kafka spark-streaming
asked Nov 8 at 6:47 – Jaskaran Singh Puri

1 Answer
spark-defaults.conf.template is only a template and is not read by Spark, so your JARs will not be loaded. You must copy/rename this file to remove the .template suffix.
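A minimal sketch of that rename, demonstrated on a throwaway directory (in practice SPARK_HOME would be the real install, e.g. /home/ec2-user/spark-2.0.0-bin-hadoop2.7 from the question):

```shell
# Demo on a temporary directory; substitute your actual Spark install for SPARK_HOME.
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/conf"
echo "spark.jars.packages org.apache.spark:spark-streaming-kafka-0-8-assembly_2.10:2.2.0" \
  > "$SPARK_HOME/conf/spark-defaults.conf.template"

# Spark only reads spark-defaults.conf; the .template file is ignored.
cp "$SPARK_HOME/conf/spark-defaults.conf.template" "$SPARK_HOME/conf/spark-defaults.conf"
```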



You'll also need to download Spark 2.2 if you want to use those specific JAR files, since the assembly version (2.2.0) must match the Spark version.



And make sure that your Spark build uses Scala 2.10 if that's the Kafka package you want to use; otherwise, use the _2.11 version of the package.
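Putting those version points together: `--packages` expects a Maven coordinate of the form groupId:artifactId:version with colons (the question's PYSPARK_SUBMIT_ARGS string has a hyphen before the version, which Spark's resolver would also reject), the artifact's _2.10/_2.11 suffix must match your Spark build's Scala version, and the coordinate's version must match Spark itself. A small, hypothetical sanity check along those lines:

```python
def coordinate_matches(coordinate, spark_version, scala_version):
    """Check a spark-streaming-kafka coordinate like
    'org.apache.spark:spark-streaming-kafka-0-8-assembly_2.10:2.2.0'
    against the local Spark and Scala versions."""
    group, artifact, version = coordinate.split(":")
    # The artifact name embeds the Scala version after the final underscore.
    artifact_scala = artifact.rsplit("_", 1)[1]
    return version == spark_version and artifact_scala == scala_version

coord = "org.apache.spark:spark-streaming-kafka-0-8-assembly_2.10:2.2.0"
print(coordinate_matches(coord, "2.2.0", "2.10"))  # True: versions line up
print(coordinate_matches(coord, "2.0.0", "2.10"))  # False: Spark 2.0.0 vs jar 2.2.0
```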
answered Nov 9 at 15:20 – cricket_007