PowerGod
EDIT: tl;dr the script is in the second post.
I'm trying to get DAT files from the NO-INTRO site, DAT-o-MATIC (datomatic.no-intro.org), using the command line...
The site needs both GET and POST requests: a GET to reach the specific "Standard DAT" page, and a POST to select the system for the list.
This is the "Standard DAT" page:
Code:
https://datomatic.no-intro.org/index.php?page=download&op=dat
A command like this returns the "Standard DAT" page, but not for the system I specified:
Code:
wget --keep-session-cookies --no-check-certificate --post-data="sel_s=Sega+-+Game+Gear" "https://datomatic.no-intro.org/index.php?page=download&op=dat" -O result.html
The post-data string should be correct; I checked both the browser logs and Fiddler to see what the browser sends after selecting another system...
After that I should also send another POST request, because "Prepare" has to be pressed to get the file, but I'll look into that once the first issue is solved.
Do you know where the problem could be?
EDIT:
Somehow I solved the first issue like this:
Code:
wget \
--load-cookies cookies.txt \
--save-cookies cookies.txt \
--keep-session-cookies \
--no-check-certificate \
--user-agent="Mozilla/5.0 (Windows NT 6.3; Win6>4; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.132 Safari/537.36" \
--post-data="sel_s=Sega+-+Game+Gear" \
"https://datomatic.no-intro.org/index.php?page=download&op=dat" \
-O result.html
Now "result.html" contains the form to select the data to list, in this case for the Game Gear.
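To check that it really is the Game Gear page, and not the generic selection page again, I can just grep the saved HTML for the system name (the exact text to search for is only a guess about what the page contains):
Code:
grep -i -c "game gear" result.html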
The second request, to fill in the form and reach the download page, works too:
Code:
wget \
--load-cookies cookies.txt \
--save-cookies cookies.txt \
--keep-session-cookies \
--no-check-certificate \
--user-agent="Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.132 Safari/537.36" \
--post-data="inc_complete=0&inc_unl=1&inc_pirate=1&inc_physical=0&special1_filter=all_specials1&language_filter=all_languages®ion_filter=all_regions&prepare_2=Prepare" \
"https://datomatic.no-intro.org/index.php?page=download&op=dat" \
-O result.html
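Since that post-data string is long and full of "&" separators, it is easy to mangle; building it in a shell variable first makes it easier to check (just a readability tweak, the field names are the same ones as above):
Code:
POST_DATA="inc_complete=0&inc_unl=1&inc_pirate=1&inc_physical=0"
POST_DATA="${POST_DATA}&special1_filter=all_specials1&language_filter=all_languages"
POST_DATA="${POST_DATA}&region_filter=all_regions&prepare_2=Prepare"
# then pass it to wget as --post-data="$POST_DATA"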
Now, this is the point where I am stuck...
In the browser I can see that, after confirming the form on the previous page, the address changes and no longer has the same GET parameters as before:
Code:
https://datomatic.no-intro.org/index.php?page=manager&download=9113
To download the file correctly, I need to instruct wget to catch those "page" and "download" parameters, but I don't know how to do it...
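Maybe curl could do this more directly, since it can print the final URL after following redirects; something like this, untested, reusing the same cookies, user-agent and form fields as above:
Code:
curl -s -k -L \
-b cookies.txt -c cookies.txt \
-A "Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.132 Safari/537.36" \
-d "inc_complete=0&inc_unl=1&inc_pirate=1&inc_physical=0&special1_filter=all_specials1&language_filter=all_languages&region_filter=all_regions&prepare_2=Prepare" \
-o /dev/null -w '%{url_effective}\n' \
"https://datomatic.no-intro.org/index.php?page=download&op=dat"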
EDIT:
I found a way to extract the address from the log file. I don't like it very much, but anyway it works...
I added the parameter "--append-output=LOG_FILE", and then:
Code:
# grab the last URL that wget requested according to the log (the lines starting with "--")
DOWNLOAD_URL=$(grep "^--" LOG_FILE | tail -n1 | sed 's/.*http/http/')
So I can use this URL in the last wget request.
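The last request would then look something like this (an untested sketch: I'm assuming the same cookies and user-agent are still needed, and the output file name is arbitrary):
Code:
# the output file name here is arbitrary
wget \
--load-cookies cookies.txt \
--keep-session-cookies \
--no-check-certificate \
--user-agent="Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.132 Safari/537.36" \
"$DOWNLOAD_URL" \
-O gamegear.dat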