+++
author = "Maik de Kruif"
title = "Challenge 8"
subtitle = "Challenge 8 - AdventOfCTF"
date = 2020-12-08T09:34:24+01:00
description = "A writeup for challenge 8 of AdventOfCTF."
cover = "img/writeups/adventofctf/2020/da781419d6bf02d0a580e48414b9cbde.png"
tags = [
  "AdventOfCTF",
  "challenge",
  "ctf",
  "hacking",
  "writeup",
  "web",
]
categories = [
  "ctf",
  "writeups",
  "hacking",
]
+++

- Points: 800

## Description

If only you could figure out where to go.

Visit <https://08.adventofctf.com> to start the challenge.

## Finding the vulnerability

When opening the website, we're greeted with the following message:

> Did you know that the fastest robot can solve a rubiks cube in 0.887 seconds?

This is talking about robots, which may be a hint to look at the site's [`robots.txt`](https://08.adventofctf.com/robots.txt).

### What is a robots.txt file?

A `robots.txt` file lives at the root of a website, so for the site www.example.com it would be found at www.example.com/robots.txt. It is a plain text file that follows the [Robots Exclusion Standard](http://en.wikipedia.org/wiki/Robots_exclusion_standard#About_the_standard) and consists of one or more rules, each of which blocks (or allows) a given crawler's access to a specified file path on that website.
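
Because it always sits at that fixed location, the file is easy to pull up directly. One quick way to fetch it from the terminal (`curl` here is just one option; opening the URL in a browser works just as well):

```bash
# robots.txt always lives at the web root of a site
curl -s https://08.adventofctf.com/robots.txt
```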

### Opening the file

The file shows the following:

```text
# robots.txt generated by *************.com
User-agent: *
Disallow: /
Disallow: /cgi-bin/

Disallow: /encryption/is/a/right
Disallow: /fnagn/unf/znal/cynprf/gb/tb
```

This probably means there is some sensitive information at one of the `Disallow`ed locations. Let's look at them one by one.
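
If you'd rather check them from the terminal first, a small loop over the disallowed paths (taken from the `robots.txt` above) prints the HTTP status code of each one; this is just a convenience sketch, clicking through them in a browser works equally well:

```bash
# print the HTTP status code for each disallowed path
for path in /cgi-bin/ /encryption/is/a/right /fnagn/unf/znal/cynprf/gb/tb; do
  printf '%s -> ' "$path"
  curl -s -o /dev/null -w '%{http_code}\n' "https://08.adventofctf.com$path"
done
```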

**`/cgi-bin/`**

When opening [`/cgi-bin/`](https://08.adventofctf.com/cgi-bin/), we get a `404` error, so let's skip this one for now.

**`/encryption/is/a/right`**

Upon opening [`/encryption/is/a/right`](https://08.adventofctf.com/encryption/is/a/right/), we get an encoded string back. It looks like `base64`, so let's try to decode it using `base64 -d` in the terminal:

```bash
echo "RW5jb2RpbmcgYW5kIGVuY3J5cHRpb24gYXJlIDIgZGlmZmVyZW50IHRoaW5ncy4=" | base64 -d
> Encoding and encryption are 2 different things.
```

This doesn't tell us much, so let's have a look at the next one.

**`/fnagn/unf/znal/cynprf/gb/tb`**

After opening [`/fnagn/unf/znal/cynprf/gb/tb`](https://08.adventofctf.com/fnagn/unf/znal/cynprf/gb/tb/), we're greeted with the following text:

> "Oh, the places you'll go", my favorite poem... but this is the wrong place. Maybe you read that wrong?

Hmm, it says "Maybe you read that wrong?", and the URL also looks kind of weird; it might be `rot13` encoded. So let's try to decode it using `rot13`:

```bash
echo "/fnagn/unf/znal/cynprf/gb/tb" | rot13
> /santa/has/many/places/to/go
```

_Note: `rot13` is not a standard program on Linux; I just set it up as an alias for `tr 'A-Za-z' 'N-ZA-Mn-za-m'`._
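
For reference, the same shortcut can be recreated in any shell; the alias below is just that `tr` command from the note, and the one-off pipe works without defining anything:

```bash
# ROT13: map each letter to the one 13 positions further along the alphabet
alias rot13="tr 'A-Za-z' 'N-ZA-Mn-za-m'"

# or as a one-off, without defining the alias:
echo "/fnagn/unf/znal/cynprf/gb/tb" | tr 'A-Za-z' 'N-ZA-Mn-za-m'
```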

We've got a new URL (I hope 😀). Let's try to [access it](https://08.adventofctf.com/santa/has/many/places/to/go/). We got the flag! It is `NOVI{you_have_br@1ns_in_your_head}`.
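
The same page can also be fetched from the terminal; assuming the flag is printed somewhere in the response body, a quick `grep` pulls it out:

```bash
curl -s https://08.adventofctf.com/santa/has/many/places/to/go/ | grep -o 'NOVI{[^}]*}'
> NOVI{you_have_br@1ns_in_your_head}
```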

This flag can then be submitted for the [challenge](https://ctfd.adventofctf.com/challenges#8-9).