I have two pages: xyz.example/a and xyz.example/b. I can only access xyz.example/b if I log in to xyz.example/a first. If I access xyz.example/b without going through the other, I simply get "access denied" in the browser (no redirect to a login page). Once I log in at xyz.example/a, I can access the other page.
My problem is doing this with the curl command. I can log in successfully to xyz.example/a using curl, but when I then try xyz.example/b I get access denied.
I use the following:
curl --user user:pass https://xyz.example/a #works ok
curl https://xyz.example/b #doesn't work
I've tried the second command both with and without the user/password part, and it still doesn't work. Both pages use the same CA, so that's not the problem.
The web site likely uses cookies to store your session information. When you run
curl --user user:pass https://xyz.example/a #works ok
curl https://xyz.example/b #doesn't work
curl is run twice, in two separate sessions. Thus when the second command runs, the cookies set by the first command are not available; it's just as if you logged in to page a in one browser session and tried to access page b in a different one.
What you need to do is save the cookies created by the first command:
curl --user user:pass --cookie-jar ./somefile https://xyz.example/a
and then read them back in when running the second:
curl --cookie ./somefile https://xyz.example/b
Alternatively, you can try downloading both pages in the same curl command, which I believe will reuse the same cookies, as sketched below.
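A minimal sketch of that single-invocation approach (untested; it assumes that enabling the cookie engine with -c makes curl send cookies received from the first URL along with the second request in the same run):
# -c enables the cookie engine, so the session cookie set by /a is sent when /b is fetched
curl --user user:pass -c ./somefile https://xyz.example/a https://xyz.example/b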
Also you might want to log in via browser and get the command with all headers including cookies:
Open the Network tab of Developer Tools, log in, navigate to the needed page, use "Copy as cURL".
(screenshot: https://i.stack.imgur.com/DePbs.png)
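For illustration, a command copied that way typically looks something like the following (the header values and the SESSIONID cookie name are made-up examples; yours will differ):
# illustrative only: the copied command carries the browser's headers and its session cookie
curl 'https://xyz.example/b' -H 'User-Agent: Mozilla/5.0' -H 'Cookie: SESSIONID=abc123'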
After some googling I found this:
curl -c cookie.txt -d "LoginName=someuser" -d "password=somepass" https://oursite/a
curl -b cookie.txt https://oursite/b
No idea if it works, but it might lead you in the right direction.
My answer is a mod of some prior answers from @JoeMills and @user.
Get a cURL command to log in to the server:
1. Load the login page for the website and open the Network pane of Developer Tools. In Firefox, right-click the page, choose 'Inspect Element (Q)' and click on the Network tab.
2. Go to the login form, enter the username and password, and log in.
3. After you have logged in, go back to the Network pane and scroll to the top to find the POST entry. Right-click it and choose Copy -> Copy as cURL.
4. Paste this into a text editor and try it in a command prompt to see if it works. It's possible that some sites have hardening that blocks this type of login replay and would require more steps to bypass.
Modify the cURL command so it can save the session cookie after login:
5. Remove the -H 'Cookie: ...' entry from the copied command and add --cookie-jar (as in the earlier answers) so the session cookie returned by the login is saved to a file, then reuse that file with --cookie when requesting the protected page, as sketched below.
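A rough sketch of what the modified command might end up looking like (the LoginName and password form field names are borrowed from the answer above; your site's field names will differ):
# log in with the copied form fields, saving the session cookie instead of replaying the browser's
curl -c ./cookie_file.txt -d 'LoginName=someuser' -d 'password=somepass' https://xyz.example/a
# reuse the saved cookie for the protected page
curl -b ./cookie_file.txt https://xyz.example/b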
I have tried this on Ubuntu 20.04 and it works like a charm.
somefile contains the path parameter (/a in this case) and it is not forwarded to the second call. If I edit the cookie in the file and put just a slash, it works (the cookie is forwarded to the second call). Do you know if it's possible to prevent the storage of the path in the cookie file?
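For context, a sketch of what such a Netscape-format cookie jar line might look like (the SESSIONID cookie name and its value are made up); the third, tab-separated column is the path being discussed:
# columns: domain  include-subdomains  path  secure  expiry  name  value
xyz.example	FALSE	/a	TRUE	0	SESSIONID	abc123
# changing the path column from /a to / makes curl send the cookie for /b as well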