Operation Make the Wiki Pretty!
-
@detron said in Operation Make the Wiki Pretty!:
but is there an "off-line" version of the manual?
what about downloading the current version to your computer right now?
Here is the command to do the trick:
EDIT: removed the trick because it can stress the server.
-
@meleu said in Operation Make the Wiki Pretty!:
@detron said in Operation Make the Wiki Pretty!:
but is there an "off-line" version of the manual?
what about downloading the current version to your computer right now?
Here is the command to do the trick:
```
wget \
  --recursive \
  --no-clobber \
  --page-requisites \
  --html-extension \
  --convert-links \
  --restrict-file-names=windows \
  --no-parent \
  --domains retropie.org.uk \
  https://retropie.org.uk/docs/
```
It may take a couple of minutes. After it finishes you'll see a directory named `retropie.org.uk/`. You can access the docs main page locally at `retropie.org.uk/docs/index.html`.
But hey, we're here for gaming, but it's for learning too, right? So here is an explanation of each option used in that command:
- `--recursive`: follow links to create a local version of the remote web site.
- `--no-clobber`: do not overwrite existing files (useful if you cancel the download and then run the command again).
- `--page-requisites`: download all the files necessary to properly display a given HTML page (images, CSS, etc.).
- `--html-extension`: save files with the `.html` extension.
- `--convert-links`: after the download is complete, convert the links in the documents to make them suitable for local viewing.
- `--restrict-file-names=windows`: modify filenames so they work on Windows as well (useful if you plan to move the files to a Windows computer).
- `--no-parent`: don't follow links outside the `/docs` directory.
- `--domains retropie.org.uk`: don't follow links outside retropie.org.uk.
- `https://retropie.org.uk/docs/`: the URL of the website (or just a directory of that website) you want to download.
Thank you very much. I thought about doing this, but without permission it seemed like theft.
I appreciate the explanation of each of the options, well done! Others may find that useful too. I have used wget just for files here and there; for sites I usually use HTTrack, usually just for reconnaissance for penetration tests. (I am a network security guy, but a white hat; I always get permission.)
-
@detron if your concern is about stressing the server, you can also use the `--wait` option. For example, `--wait=2` waits 2 seconds between each retrieval. ;-)
-
@meleu said in Operation Make the Wiki Pretty!:
@detron if your concern is about stressing the server, you can also use the `--wait` option. For example, `--wait=2` waits 2 seconds between each retrieval. ;-)

That sounds even better. I will do this on my next reboot (into Windows for school; CHFI uses .pdf files that REQUIRE Adobe Acrobat due to protections). Really funny, since most of the work done with CHFI is in Linux. Same problem when I did the Certified Ethical Hacker certification.
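Putting the two tips together, a sketch of the full mirror command with `--wait` added might look like this (the 2-second value is just an example; and do check with the site admin before running anything like it, as noted later in the thread):

```shell
#!/bin/sh
# Sketch: the mirror command from earlier with --wait added. It is written
# to a helper script here so it can be reviewed first, and only run with
# the site admin's permission.
cat > mirror-docs.sh <<'EOF'
#!/bin/sh
wget \
  --recursive \
  --no-clobber \
  --page-requisites \
  --html-extension \
  --convert-links \
  --restrict-file-names=windows \
  --no-parent \
  --wait=2 \
  --domains retropie.org.uk \
  https://retropie.org.uk/docs/
EOF
chmod +x mirror-docs.sh
```

wget also has a `--limit-rate` option (e.g. `--limit-rate=200k`) that caps bandwidth, which is another way to go easy on the server.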
-
@detron I had considered using readthedocs, as it has options for HTML/PDF/epub export along with versioning, but I didn't want to go through the hassle of converting Markdown to reStructuredText. It would have been prohibitive, especially if we want the community to continue to contribute to the wiki. Markdown is much simpler.
The process for creating the static pages is relatively simple through some parsing. Basically we clone the GitHub wiki repo, parse the Markdown to HTML, and generate the static pages through mkdocs with minor configs/CSS etc.
@BuZz actually created a useful module that can be run manually to generate the docs locally.
Source here:
https://github.com/RetroPie/RetroPie-Setup/blob/master/scriptmodules/admin/wikidocs.sh
```
sudo ~/RetroPie-Setup/retropie_packages.sh wikidocs depends
sudo ~/RetroPie-Setup/retropie_packages.sh wikidocs sources
sudo ~/RetroPie-Setup/retropie_packages.sh wikidocs build
sudo ~/RetroPie-Setup/retropie_packages.sh wikidocs install
```
That should generate a folder of the static pages in `~/RetroPie-Setup/tmp/build/wikidocs`, I believe (or something like that; I don't have access to my Pi to check).
The upload function is what we use to push updates to our server, but it requires an SSH key. A server-side script runs the builds automagically, though, so we don't have to push updates manually.
-
Thank you for the explanation, and for the wonderful work done on RetroPie, in all of its parts.
The fact that all replies to my inquiry were detailed really shows how wonderful this community is.
Thank you, everyone.
-
@herb_fargus hey herb, I'm using what you did here as inspiration for another project's documentation, and I have a question: did you edit/create the mkdocs.yml from scratch by hand? (notably the `pages:` section)
-
@meleu yes. But parts can be batch generated from the GitHub wiki list
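For anyone curious, that batch generation could be sketched roughly like this (the wiki page names below are made up for the demo; a real run would list a clone of the GitHub wiki repo instead):

```shell
#!/bin/sh
# Demo setup: a fake wiki checkout with two pages (filenames made up).
mkdir -p wiki
touch wiki/First-Installation.md wiki/FAQ.md

# Turn each .md filename into an mkdocs "pages:" entry, using the
# dash-separated filename as the page title.
for f in wiki/*.md; do
  name=$(basename "$f" .md)
  title=$(echo "$name" | tr '-' ' ')
  printf -- "- '%s': '%s.md'\n" "$title" "$name"
done > pages-snippet.yml

cat pages-snippet.yml
```

The output can then be pasted under `pages:` in mkdocs.yml and reordered by hand.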
-
@herb_fargus thanks, I noticed that, and your work has been very helpful.
Are you OK if I use what you wrote for Editing the Wiki on my repo wiki too?
The documentation project I'm starting is RetroAchievements related.
-
@meleu by all means. If you're really ambitious, readthedocs has greater functionality to export to epub, PDF, etc., but it's based on rst instead of md, so it's a little less intuitive.
-
@herb_fargus by the way, we added some valuable info to the emulationstation wiki a few weeks ago and it's not in the official docs. Please let me know if I can help update the docs in some way. ;)
-
@meleu It would be polite to check with the web host/admin (me) before telling people to spider the site. The server is busy enough as it is, serving up users etc.
Please use @herb_fargus's method for generating the docs locally.
-