
kksctf: Lynx

 ·  ☕ 1 min read


We are given a URL. However, when we make a request, it seems that there is some kind of protection:

[jusepe@nix:~]$ curl && echo ""
You are not lynx, access denied

The message is a clear indicator that we need to use lynx to view the content; the command would simply be lynx


There is a reference to robots.txt on the home page, so let’s check it using G inside lynx to rewrite the URL:

This step could be accomplished without lynx, since only the root directory is protected.
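For reference, a robots.txt that leaks a hidden directory typically looks like the following (hypothetical contents with a placeholder path; the real directory name is in the screenshot above):

```
User-agent: *
Disallow: /some_hidden_directory/
```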


It seems that there is a hidden directory which finally contains the flag:


Alternative path

In order to whitelist specific web browsers, the website may be blocking based on the User-Agent header, so we used Burp Suite to check which header lynx was sending, with the following commands:

[jusepe@nix:~]$ export http_proxy=
[jusepe@nix:~]$ lynx

Here is the intercepted request:


Now we can use curl with the same User-Agent and confirm that it bypasses the protection:

[jusepe@nix:~]$ curl -A "Lynx/2.9.0dev.6"

        <!DOCTYPE html>
        <title>Code panel</title>
        <script type="text/javascript" src="code.js"></script>
        <center> WELCOME </strong>
        <p>Let's defend our friend - Lynx - from robots!</p>
        <center>(C) BluePeace, 2053</center>
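Putting it together, the protection presumably reduces to a simple string match on the User-Agent header. A minimal local sketch of that logic (an assumption about the server's behaviour, not its actual code; function name and UA strings are illustrative):

```shell
#!/bin/sh
# Hypothetical sketch of the challenge's User-Agent gate:
# a Lynx-style User-Agent gets the page, anything else is denied.
serve_for_ua() {
  case "$1" in
    Lynx/*) printf '%s\n' "WELCOME" ;;
    *)      printf '%s\n' "You are not lynx, access denied" ;;
  esac
}

serve_for_ua "curl/8.0.1"       # → You are not lynx, access denied
serve_for_ua "Lynx/2.9.0dev.6"  # → WELCOME
```

This also explains why spoofing with `curl -A` works: the server never verifies the client beyond that one header.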

InTernet lover