ES: Index template does not convert from UNIX timestamp to date
I am inserting documents coming from Apache Spark via Structured Streaming into ES.

Unfortunately, there is an unresolved bug in the Spark-ES connector (https://github.com/elastic/elasticsearch-hadoop/issues/1173) that causes date fields on the source side (Spark) to arrive in the sink (ES) as Unix timestamps (long values).

I thought that an index template converting them on the ES side might be a good workaround for getting them into the correct format (date) in ES.

My index template is:



{
  "index_patterns": "my_index_*",
  "mappings": {
    "peerType_count": {
      "dynamic_templates": [
        {
          "timestamps": {
            "path_match": "*.window.*",
            "match_mapping_type": "long",
            "mapping": {
              "type": "date",
              "format": "epoch_millis"
            }
          }
        }
      ]
    }
  }
}


But the document in ES still contains the Unix timestamp :-/



{
  "_index": "my_index",
  "_type": "peerType_count",
  "_id": "kUGWNmcBtkL7EG0gS280",
  "_version": 1,
  "_score": 1,
  "_source": {
    "window": {
      "start": 1535958000000,
      "end": 1535958300000
    },
    "input__source_peerType": "peer2",
    "count": 1
  }
}


Does somebody have an idea what might be wrong?



PS: Is there any good ES mapping debugger available?
Comments:

  • A dynamic template will never change the source document you're sending; it only takes action on the underlying mapping. However, you might want to look into an ingest pipeline, and more specifically the date processor, which might be what you need. – Val, Nov 21 '18 at 16:54

  • Thanks Val, will have a look and update this post. – Aydin K., Nov 21 '18 at 19:51
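For intuition, the conversion that the suggested date processor performs is just epoch-milliseconds to an ISO-8601 date. Here is a minimal Python sketch of that arithmetic (this is not the ES implementation, and the exact string ES renders may differ, e.g. with a trailing `Z`):

```python
from datetime import datetime, timezone

def epoch_millis_to_iso(ms: int) -> str:
    """Convert a Unix timestamp in milliseconds to an ISO-8601 UTC string,
    mirroring the conversion an ES date processor with format UNIX_MS does."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat()

# The window.start value from the stored document above:
print(epoch_millis_to_iso(1535958000000))  # 2018-09-03T07:00:00+00:00
```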
Tags: apache-spark, elasticsearch
asked Nov 21 '18 at 15:25 by Aydin K.
1 Answer
Want to share my workaround: just create an ingest pipeline in ES with the following HTTP request:



PUT _ingest/pipeline/fix_date_1173
{
  "description": "converts from unix ms to date, workaround for https://github.com/elastic/elasticsearch-hadoop/issues/1173",
  "processors": [
    {
      "date": {
        "field": "window.start",
        "formats": ["UNIX_MS"],
        "target_field": "window.start"
      }
    },
    {
      "date": {
        "field": "window.end",
        "formats": ["UNIX_MS"],
        "target_field": "window.end"
      }
    }
  ]
}


and enable it in your Spark code with



.option("es.ingest.pipeline", "fix_date_1173")


Thanks @val for the hint!
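To sanity-check the pipeline's effect on a real cluster, Elasticsearch's simulate endpoint (`POST _ingest/pipeline/fix_date_1173/_simulate`) can be used. Without a cluster, the intended transformation can also be sketched locally; the following Python snippet is an assumption about the pipeline's behavior (not ES code), applied to the sample document from the question:

```python
from datetime import datetime, timezone

def apply_fix_date_1173(source: dict) -> dict:
    """Mimic the two date processors: replace window.start and window.end
    (epoch millis) with ISO-8601 UTC strings, working on a copy."""
    doc = dict(source, window=dict(source["window"]))
    for key in ("start", "end"):
        ms = doc["window"][key]
        doc["window"][key] = datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat()
    return doc

sample = {
    "window": {"start": 1535958000000, "end": 1535958300000},
    "input__source_peerType": "peer2",
    "count": 1,
}
print(apply_fix_date_1173(sample)["window"])
# {'start': '2018-09-03T07:00:00+00:00', 'end': '2018-09-03T07:05:00+00:00'}
```

Note that ES itself may render the resulting date string slightly differently (e.g. `2018-09-03T07:00:00.000Z`); the point is that the long values become proper dates.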
answered Nov 22 '18 at 16:09 by Aydin K.