What are the best ways to deal with huge datasets in deep learning?
I have to do a deep learning project in which the datasets are too big for my computer.
My question is very simple: what are the best ways to handle this kind of project?
To me, the best option would be a service where I can:
1) Store the datasets
2) Write the models that use these datasets
3) Run them
For the moment, I've heard about two options:
1) AWS (but can I store my datasets on AWS? See the rough sketch after this list for the kind of setup I have in mind)
2) Google Colab (but I'm not sure about its computing performance)
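For option 1, this is roughly the workflow I'm imagining: the data stays in S3 and gets read in chunks, so it never has to fit in my machine's memory or on its disk. The bucket name, object key, and chunk size below are just placeholders, and I haven't tested this:

```python
# Rough, untested sketch: keep the dataset in S3 and pull it down in
# fixed-size chunks with ranged GETs, instead of loading the whole file.
# "my-project-datasets" and "train/data.bin" are placeholder names.
import boto3

BUCKET = "my-project-datasets"   # placeholder bucket name
KEY = "train/data.bin"           # placeholder object key
CHUNK_BYTES = 64 * 1024 * 1024   # read ~64 MB per request

s3 = boto3.client("s3")

def stream_chunks(bucket, key, chunk_bytes=CHUNK_BYTES):
    """Yield the S3 object as a sequence of byte chunks."""
    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
    for start in range(0, size, chunk_bytes):
        stop = min(start + chunk_bytes, size) - 1
        resp = s3.get_object(Bucket=bucket, Key=key,
                             Range=f"bytes={start}-{stop}")
        yield resp["Body"].read()

# Each chunk would then be decoded into examples and fed to the model as
# mini-batches, rather than reading the entire dataset at once.
for chunk in stream_chunks(BUCKET, KEY):
    print(f"got {len(chunk)} bytes")
```

Is something along these lines reasonable, or is there a better standard approach?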
Thanks for your answers and suggestions.
deep-learning bigdata aws
asked 16 mins ago
nolw38
114