Unable to download website for offline view in Ubuntu?




















With reference to this, I tried to download the entire tutorial website https://www.guru99.com/ by executing the following commands, without success:



wget -r --mirror -p --convert-links -P . https://www.guru99.com

wget -r https://www.guru99.com

wget -r -l 0 https://www.guru99.com


The terminal output is as follows:



--2019-04-17 08:33:48--  https://www.guru99.com/
Resolving www.guru99.com (www.guru99.com)... 72.52.251.71
Connecting to www.guru99.com (www.guru99.com)|72.52.251.71|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘www.guru99.com/index.html’

www.guru99.com/index.html [ <=> ] 13.31K 43.4KB/s in 0.3s

2019-04-17 08:33:50 (43.4 KB/s) - ‘www.guru99.com/index.html’ saved [13633]

FINISHED --2019-04-17 08:33:50--
Total wall clock time: 1.7s
Downloaded: 1 files, 13K in 0.3s (43.4 KB/s)


The downloaded result contains only index.html. What is the problem, and how can I download this website for offline viewing? Thanks.










Tags: software-recommendation

asked 1 hour ago – Houy Narun
2 Answers
The program httrack will do exactly what you're looking for. For more info, go to Ubuntu httrack.

Install it with sudo apt install httrack and start it by entering httrack in your terminal.

For that particular site it will take a very long time and doesn't show any indication of progress. Be patient ;)
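If you prefer to skip httrack's interactive prompts, a non-interactive invocation looks roughly like this (a sketch, not from the answer; the output directory and the "+" filter pattern are assumptions, and the command is only printed here rather than executed, to keep the sketch side-effect free):

```shell
# Hypothetical httrack command line (sketch): -O sets the output directory,
# and the "+" pattern keeps the crawl inside guru99.com.
CMD='httrack https://www.guru99.com/ -O ./guru99-mirror "+*.guru99.com/*" -v'
echo "Would run: $CMD"   # echoed instead of run; drop the echo to actually mirror
```
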






answered 1 hour ago – bashBedlam
You can try doing it this way:



wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --domains guru99.com \
    --no-parent \
    www.guru99.com/index.html



Reference: https://www.linuxjournal.com/content/downloading-entire-web-site-wget
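If the recursive crawl still stops after index.html, a likely cause is that wget honors the site's robots.txt by default, or that the server rejects wget's default User-Agent string. A hedged variant to test that theory (all flags are standard GNU wget options; whether guru99.com actually blocks wget is an assumption, and the command is only printed here rather than executed):

```shell
# Sketch: the same recursive mirror, but ignoring robots.txt (-e robots=off)
# and presenting a browser-like User-Agent, with polite rate limiting (--wait).
CMD="wget --mirror --page-requisites --convert-links --adjust-extension \
--no-parent -e robots=off --wait=1 \
--user-agent='Mozilla/5.0 (X11; Linux x86_64)' \
https://www.guru99.com/"
echo "$CMD"   # printed, not executed; run the command itself to mirror the site
```

Note that --mirror already implies --recursive, and --adjust-extension is the current name for the deprecated --html-extension.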






answered 1 hour ago – unsuitable001 (new contributor)
• thanks, it still does not work – Houy Narun, 1 hour ago











