
Python Requests - 403 Forbidden - Despite Setting `User-Agent` Headers

import requests
import webbrowser
from bs4 import BeautifulSoup

url = 'https://www.gamefaqs.com'
#headers = {'User-Agent': 'Mozilla/5.0'}
headers = {'User-Agent': 'Mozilla/5.0 (W

Solution 1:

This works if you make the request through a Session object.

import requests

session = requests.Session()
response = session.get('https://www.gamefaqs.com', headers={'User-Agent': 'Mozilla/5.0'})

print(response.status_code)

Output:

200
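A Session also persists cookies and default headers across requests, which is part of why it can get past such blocks. As a minimal offline sketch (no network call is made), you can set the User-Agent once on the Session and inspect the headers that would actually be sent:

```python
import requests

# Set a default User-Agent on the Session; it is merged into
# every request made through this Session.
session = requests.Session()
session.headers.update({'User-Agent': 'Mozilla/5.0'})

# Prepare (but do not send) a request to see the final headers.
prepared = session.prepare_request(
    requests.Request('GET', 'https://www.gamefaqs.com')
)
print(prepared.headers['User-Agent'])  # Mozilla/5.0
```

With the default set on the Session, later `session.get(...)` calls no longer need a `headers=` argument.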

Solution 2:

Using the `headers` keyword argument works for me:

import requests
headers={'User-Agent': 'Mozilla/5.0'}
response = requests.get('https://www.gamefaqs.com', headers=headers)
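For context, when you pass no headers, requests sends its own default User-Agent (`python-requests/<version>`), and that default is what many sites reject. A small offline sketch using `requests.Request(...).prepare()` (no network call) shows the keyword argument overriding it:

```python
import requests

# The library's built-in default, e.g. "python-requests/2.31.0".
default_ua = requests.utils.default_user_agent()
print(default_ua)

# Passing headers= replaces that default in the outgoing request.
prepared = requests.Request(
    'GET', 'https://www.gamefaqs.com',
    headers={'User-Agent': 'Mozilla/5.0'},
).prepare()
print(prepared.headers['User-Agent'])  # Mozilla/5.0
```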

Solution 3:

Try using a Session.

import requests
session = requests.Session()
url = 'https://www.gamefaqs.com'
response = session.get(url, headers={'user-agent': 'Mozilla/5.0'})
print(response.status_code)

If the request still returns 403 Forbidden (even with a Session object and a user-agent header), you may need to add more headers:

headers = {
    'user-agent': 'Mozilla/5.0 ...',
    'accept': 'text/html,application...',
    'referer': 'https://...',
}
r = session.get(url, headers=headers)

In Chrome, the request headers can be found under Network > Headers > Request Headers in the Developer Tools (press F12 to open them).

The reason is that some websites check the user-agent, or for the presence of other specific headers, before accepting the request.
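The headers copied from DevTools can be assembled the same way (the values below are placeholders, not real browser strings). An offline sketch using `Session.prepare_request` shows them merged into the outgoing request, and that header lookup in requests is case-insensitive:

```python
import requests

# Placeholder browser-like headers; in practice, copy the real
# values from your browser's DevTools Network panel.
extra_headers = {
    'user-agent': 'Mozilla/5.0',
    'accept': 'text/html,application/xhtml+xml',
    'referer': 'https://www.google.com/',
}

session = requests.Session()
prepared = session.prepare_request(
    requests.Request('GET', 'https://www.gamefaqs.com', headers=extra_headers)
)

# prepared.headers is a CaseInsensitiveDict, so either
# 'accept' or 'Accept' retrieves the same value.
print(prepared.headers['Accept'])  # text/html,application/xhtml+xml
```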
