Dimensionality of LSTM predictions, inputs and targets

I am writing a thesis on stock price prediction using deep learning, focusing mainly on LSTM variations. I have a time series of the returns of 100 stocks, and I am predicting a binary classification target: whether each stock will go up or down the following day.



My training data thus has the following shapes:



X_train 3D:
(1000 sequences/observations, 200 returns per sequence, 100 stocks) = (1000, 200, 100)



y_train 2D:
(1000 observations, 100 stocks) <- a binary matrix of 1/0 (whether the stock went up or down)



Test data:



X_test 3D:
(500 observations, 200 returns per sequence, 100 stocks) = (500, 200, 100)
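For concreteness, placeholder arrays with exactly these shapes can be built as follows (a toy sketch; the random values simply stand in for the real returns and labels):

    import numpy as np

    # Toy stand-ins matching the shapes described above
    X_train = np.random.randn(1000, 200, 100)            # 1000 sequences, 200 time steps, 100 stocks
    y_train = np.random.randint(0, 2, size=(1000, 100))  # 1 if the stock goes up the next day, else 0
    X_test = np.random.randn(500, 200, 100)              # 500 test sequences with the same layout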



LSTM Code:



    from keras.models import Sequential
    from keras.layers import LSTM, Dense

    classifier = Sequential()
    # `units` LSTM cells; each input sequence has 200 time steps with
    # 100 features (one return per stock) per step
    classifier.add(LSTM(units, return_sequences=False, input_shape=(200, 100)))
    # one sigmoid output per stock, so each output is an independent up/down probability
    classifier.add(Dense(units=100, activation='sigmoid'))
    # binary cross-entropy matches the independent 0/1 targets per stock
    classifier.compile(optimizer='adam', loss='binary_crossentropy')

    classifier.fit(X_train, y_train)
    probability = classifier.predict(X_test)
    classes = classifier.predict_classes(X_test)


When I run this, the predicted probabilities have shape (500, 100), with each value in the range [0, 1]. This is exactly what I want: one probability per stock for each day in the test period, i.e. how likely each stock is to be of class 1 (to go up the next day).



However, the predicted classes have shape (500,) and take integer values between 0 and 99 (a single class index per observation). I want the predicted classes to also have shape (500, 100) and to be binary (0/1).



It seems to me that predict_classes is treating the 100 outputs as mutually exclusive classes and, for each observation, returning the index of the single most likely one (an argmax across the stocks). This is not what I want: I want it to take each sequence and predict the binary response for each stock at that time. Therefore, both the probabilities and the classes predicted here would be invalid for their intended purpose. Would it be possible to add another dimension to get my desired predictions?
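For reference, the per-stock binary labels I am after could presumably be obtained by thresholding the predicted probabilities directly instead of calling predict_classes (a minimal sketch, assuming the (500, 100) probability matrix from classifier.predict above; the 0.5 cutoff is just the conventional choice):

    # probability has shape (500, 100): one up-probability per stock per test day
    probability = classifier.predict(X_test)

    # Threshold each stock's probability independently, rather than taking an
    # argmax across the 100 stocks as predict_classes appears to do
    up_down = (probability > 0.5).astype(int)   # shape (500, 100), values 0/1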

keras deep-learning lstm prediction recurrent-neural-network

asked Nov 17 '18 at 13:34
Jonas Snefjellå Løvås
11

  • Did you check this and this? You can adjust your problem accordingly.

    – ARAT
    Nov 18 '18 at 16:31