RetroPie forum home
    Test a new data source for sselph/scraper

    Ideas and Development
    Tags: scraper, testing, screenscraper
    43 Posts 16 Posters 15.7k Views
      enderandrew
      last edited by

      Your path doesn't look correct:

      P:.emulationstation\roms\megadrive\images

      Shouldn't that be something more like:

      P:\emulationstation\roms\megadrive\images

        Nismo @enderandrew
        last edited by Nismo

        @enderandrew No, the path is correct; even if I change the path to C:/images it's the same... missing images for all games.

        Thanks for the help.

        Edit: I edited my last post to avoid confusion.

          Nismo
          last edited by

          I solved it myself. For anyone interested: I fixed it by downloading the fastscraper script and editing some parameters to my liking.

          You can find it here: https://forum.recalbox.com/topic/2594/batch-scrape-your-roms-on-your-pc-fastscraper

          Regards.

            relative
            last edited by

            Hello,

            How do I turn off gdb? It's becoming a nuisance, with the server going offline whenever I want something done.

            How do I enable the use of SS?

            I used this and it continues to default to gdb [ console srcs : gdb ]:
            scraper -use_gdb=false -use_ss=true

              relative @relative
              last edited by

              @relative nvm, I changed it through the front-end... adding that string in the command line has no effect whatsoever.

                sselph @relative
                last edited by

                @relative The -use_* flags were not great, so a while back I changed these to -console_src="ss,gdb", where you pass an ordered list of sources. There is also -mame_src, and you can see the list of accepted flags and documentation with -h.

                Auto-scraper: https://github.com/sselph/scraper
                Donate to Extra-Life 2018 and help save lives: https://goo.gl/diu5oU
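
                For reference, a minimal sketch of an invocation using the ordered source lists described above (the flag values are just an example; the sketch prints the command instead of running it, so it works even without the scraper binary installed):

                ```shell
                # Assemble a scraper invocation that tries ScreenScraper ("ss")
                # first and falls back to thegamesdb ("gdb") for both console
                # and arcade (MAME) ROMs.
                CMD="scraper -console_src=ss,gdb -mame_src=ss,gdb"
                # Print the command rather than executing it.
                echo "$CMD"
                ```

                Run `scraper -h` to see the full list of accepted flags, as noted above.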

                  superjoe
                  last edited by

                  How do you specify the user and password for screenscraper.fr in the arguments of the sselph scraper? So far I've been using the Universal Xml Scrapper V2, but it requires registration.

                    relative @superjoe
                    last edited by

                    @superjoe
                    -ss_username -ss_password

                    They are not mandatory as far as I know. I was able to scrape without them.

                      relative @sselph
                      last edited by

                      @sselph okay thanks.

                        sselph @relative
                        last edited by

                        @relative Yeah they haven't typically been mandatory but if there is any congestion on the server, you will be told that you can't access it. Also they limit the number of threads to 1 unless you are registered, but that isn't too bad.


                          craig0r @sselph
                          last edited by

                          @sselph can you tell me how I might add my credentials to the scraper.sh script? (I'm indeed running into that congestion.)

                            sselph @craig0r
                            last edited by

                            @craig0r -ss_user="myssuser" -ss_password="mypassword". If you are adding it to scraper.sh, you'd add it here:
                            https://github.com/sselph/RetroPie-Setup/blob/master/scriptmodules/supplementary/scraper.sh#L112
                            as something like params+=(-ss_user="myssuser" -ss_password="mypassword")

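
                            A small sketch of that edit ("myssuser" and "mypassword" are placeholders for your own ScreenScraper account; the line it prints is what you would append to the params array in the linked scraper.sh):

                            ```shell
                            # Placeholder ScreenScraper credentials -- substitute your own.
                            SS_USER="myssuser"
                            SS_PASS="mypassword"
                            # The line to add to scraper.sh, appending the credential
                            # flags to its existing params array.
                            echo "params+=(-ss_user=\"$SS_USER\" -ss_password=\"$SS_PASS\")"
                            ```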

                              soulfunkdjx
                              last edited by

                              @sselph Hey man, I already have a gamelist with boxarts and metadata for every game on every system, and now I want to use your scraper to add video snaps to every game. The problem is that I want to append to my gamelists, and when I do that it doesn't work. It downloads video snaps only if I choose to overwrite the gamelists, but then I would lose all the metadata and boxarts that I've been fighting to build for over a year now.
                              Is there a fix for the append option for gamelists?


                                Contributions to the project are always appreciated, so if you would like to support us with a donation you can do so here.

                                Hosting provided by Mythic-Beasts. See the Hosting Information page for more information.