Unable to download website for offline view in Ubuntu?
Following the reference from this, I tried to download the entire tutorial website https://www.guru99.com/ by running the following commands, but without any success:
wget -r --mirror -p --convert-links -P . https://www.guru99.com
wget -r https://www.guru99.com
wget -r -l 0 https://www.guru99.com
The output in the terminal console is as below:
--2019-04-17 08:33:48-- https://www.guru99.com/
Resolving www.guru99.com (www.guru99.com)... 72.52.251.71
Connecting to www.guru99.com (www.guru99.com)|72.52.251.71|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘www.guru99.com/index.html’
www.guru99.com/index.html [ <=> ] 13.31K 43.4KB/s in 0.3s
2019-04-17 08:33:50 (43.4 KB/s) - ‘www.guru99.com/index.html’ saved [13633]
FINISHED --2019-04-17 08:33:50--
Total wall clock time: 1.7s
Downloaded: 1 files, 13K in 0.3s (43.4 KB/s)
The download contains only index.html. What is the problem, and how can I download this website for offline viewing? Thanks.
software-recommendation
asked 1 hour ago by Houy Narun
2 Answers
The program httrack will do exactly what you're looking for. For more info, see Ubuntu httrack.
Install it with:
sudo apt install httrack
and start it by entering httrack in your terminal.
For that particular site, it will take a very long time and doesn't show any indication of progress. Be patient ;)
answered 1 hour ago by bashBedlam
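For a non-interactive run, httrack can also be driven entirely from the command line. A minimal sketch, assuming an output directory of ./guru99-mirror and a filter restricting the crawl to the guru99.com domain (neither is part of the answer above):

httrack "https://www.guru99.com/" -O ./guru99-mirror "+*.guru99.com/*" -v

Here -O sets the output directory, the +*.guru99.com/* pattern keeps the crawl on that domain, and -v enables verbose output.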
You can try doing it the way shown below:
wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --domains guru99.com \
    --no-parent \
    www.guru99.com/index.html
Reference: https://www.linuxjournal.com/content/downloading-entire-web-site-wget
answered 1 hour ago by unsuitable001
thanks, it still does not work – Houy Narun, 1 hour ago
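One common reason a recursive wget run stops after index.html is that wget honors robots.txt during recursive retrieval, and some sites also serve less content to non-browser clients. Whether that applies to guru99.com is an assumption, not something confirmed in this thread, but a variant along the lines of the sketch below may help (the User-Agent string is only a placeholder; use this responsibly and mind the site's terms of use):

wget \
    --recursive \
    --page-requisites \
    --adjust-extension \
    --convert-links \
    --no-parent \
    --domains guru99.com \
    -e robots=off \
    --wait=1 --random-wait \
    --user-agent="Mozilla/5.0" \
    https://www.guru99.com/

If the mirror completes, it can be browsed offline by opening www.guru99.com/index.html from the download directory in a web browser.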