The Honeynet Project's Reverse Challenge officially began
on 06 May, 2002, ended on 31 May, and the results will be released on 08 July.
This page links to all the information we've assembled about the
Challenge. This index will help you quickly get to what you want.
Introduction
Every day, incident handlers across the globe are faced with
compromised systems running some set of unknown programs, providing
some kind of unintended service to an intruder who has taken control
of someone else's -- YOUR, or your client's, or your customer's --
computers. For most, the response is a matter of "get it back online
ASAP and be done with it." This usually leads to an inadequate and ineffective
response, to not even knowing what hit you, and to a high probability of
repeated compromise.
On the law enforcement side, investigators are hampered by a flood of incidents
and a lack of good data. Victims trying to keep a system running or
doing a "quickie" cleanup job mean that incidents go underreported,
and inadequate handling of the evidence leaves
no evidence, or tainted evidence. There has to be a better way to meet
the needs of incident handlers and system administrators, as well as law
enforcement, if Internet crime is going to be managed rather than allowed to run amok.
One possible answer is effective analysis skills -- widespread knowledge
of tools and techniques -- to preserve data, analyze it, and produce
meaningful reports for your organization's management, for other incident
response teams and system administrators, and for law enforcement.
Enter the Honeynet Project. One of the primary goals of the Honeynet
Project is to find order in chaos by letting the attackers do their
thing, and allowing the defenders to learn from the experience and
improve. The latest challenge is the Reverse Challenge. Just like
the Forensic Challenge,
we're opening it up to anyone who wants to join in.
The Challenge
The Reverse Challenge is an effort to allow incident handlers around
the world to all look at the same binary -- a unique tool
captured in the wild -- and to see who can dig the most out of the
tool and communicate what they've found in a concise manner.
This is a nonscientific study of the tools, techniques, and procedures
applied to post-compromise incident handling. The challenge is to
have fun, to solve a common real-world problem, and for everyone to
learn from the process. If what I've said already isn't enough to get
you interested, the Honeynet Project is offering signed copies of its
popular Know Your Enemy
book for the 20 best submissions.
All we are going to tell you about the binary is this: sometime in 2002
a Honeynet system was compromised, and the binary in question was downloaded,
installed, and then run on the compromised honeypot. It's now your mission --
should you choose to accept it! -- to identify how the tool works and what its
purpose is, and to show your methods of analysis. We don't expect that everyone
undertaking the challenge can or will address all of the following items, but
the list of questions and deliverables below is provided as a guideline for
what to produce and what to focus on. The following points should be addressed
in your answers.html document.
- Identify and explain the purpose of the binary.
- Identify and explain the different features of the binary. What
are its capabilities?
- The binary uses a network data encoding process. Identify the encoding
process and develop a decoder for it (see the sketch after this list).
- Identify one method of detecting this network traffic that is not
specific to this situation alone, but applies to other situations as well.
- Identify and explain any techniques in the binary that protect it from being
analyzed or reverse engineered.
- Identify two tools from the past that have demonstrated similar functionality.
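To be clear about what we mean by "develop a decoder", here is a minimal
sketch in Python of the kind of tool we have in mind, assuming a purely
hypothetical single-byte XOR encoding of captured payloads. The key value,
the capture file name, and the encoding scheme itself are assumptions for
illustration only; the real scheme is yours to discover.

    #!/usr/bin/env python
    # Minimal decoder sketch, assuming a HYPOTHETICAL single-byte XOR
    # encoding of the network payload.  The real binary's encoding is
    # for you to discover and may look nothing like this.

    import sys

    ASSUMED_KEY = 0x23  # purely illustrative key value


    def decode(payload, key=ASSUMED_KEY):
        """XOR every byte of the captured payload with the key."""
        return bytes(b ^ key for b in payload)


    if __name__ == "__main__":
        # Usage: python decoder.py captured_payload.bin
        # (the capture file name is hypothetical)
        with open(sys.argv[1], "rb") as f:
            sys.stdout.write(decode(f.read()).decode("ascii", "replace"))

A real submission would replace the assumed key and encoding with whatever
you actually find in the binary, and explain how you derived them.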
Bonus Questions: The bonus questions are open-ended. They are used when
submissions are too close together to tell apart: the bonus questions then
identify the winner among entries tied for a position.
- What kind of information can be derived about the person who developed
this tool? For example, what is their skill level?
- What advancements in tools with similar purposes can we expect in the future?
Separate Documents:
- Provide a summary for use within an organization
(a fictitious university, "honeyp.edu", in this case, where we hold
an honorary Doctorate, by the way) to explain the key aspects of the
binary, how it works, the threats it poses, and how to detect and defend
against this binary. The summary is for a non-technical audience, such
as management or the media.
- Provide an advisory for use within the same organization
to explain the key aspects of the binary, how it works, the threats it poses, and
how to detect and defend against this binary. The advisory is for technical personnel,
such as the IT department.
- Produce a cost estimate for this incident using the following
guidelines and method:
http://staff.washington.edu/dittrich/misc/faqs/incidentcosts.faq
To simplify and normalize the results, assume that your annual
salary is $70,000 and that there are no user-related costs.
(If you work as a team, break out hours by person, but all members
should use the same annual salary. Please also include a brief
description of each investigator's years of experience in
system administration, programming, and security, to help
us compare the hours spent with those of other entrants.)
A worked example of the arithmetic follows this list.
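Here is a minimal sketch of the cost arithmetic under the standardized
assumptions above. The 2,080 hours-per-year figure and the per-person
hour counts are illustrative assumptions, not part of the challenge rules;
substitute your own hours and follow the guidelines at the URL above.

    # Sketch of the incident cost arithmetic under the standardized
    # assumptions: $70,000 annual salary, no user-related costs.
    # The 2,080 hours/year figure and the sample hour counts below
    # are illustrative assumptions only.

    ANNUAL_SALARY = 70000.0
    HOURS_PER_YEAR = 2080.0                        # 52 weeks * 40 hours (assumed)
    HOURLY_RATE = ANNUAL_SALARY / HOURS_PER_YEAR   # roughly $33.65/hour

    # Hours spent per investigator (hypothetical team of two).
    hours = {"investigator_1": 25.0, "investigator_2": 18.5}

    total_hours = sum(hours.values())
    total_cost = total_hours * HOURLY_RATE

    for name, h in hours.items():
        print("%-16s %5.1f h  $%8.2f" % (name, h, h * HOURLY_RATE))
    print("%-16s %5.1f h  $%8.2f" % ("total", total_hours, total_cost))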
To summarize (and standardize) the deliverables, please produce the following
in .html format. Please be sure to use only LOCAL links in your documentation,
so other people can download it and use it on their own systems.
File             Contents
---------------------------------------------------------------------
index.html       Index of files/directories submitted
                 (including any not listed below)
timestamp.html   Timestamp of MD5 checksums of all files listed
                 and submitted (dating when produced -- see
                 deadline information below)
summary.html     The summary for a non-technical audience, such
                 as management or the media.
advisory.html    Advisory for a technical audience, such as
                 administrators and incident handlers within
                 your organization.
analysis.html    Details of how you performed your analysis,
                 showing the tools and methods used.
answers.html     Answers to the questions listed above.
costs.html       Incident cost estimate.
files.tar        Any other files produced during analysis and/or
                 excerpts (e.g., strings output or disassembly
                 listings) from the analysis.
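As a rough idea of how the checksum list behind timestamp.html might be
generated, here is a minimal sketch using Python's standard hashlib module.
The file list simply mirrors the table above; obtaining the actual trusted
timestamp (e.g., via a time-stamping service or a PGP signature) is a
separate step and is not shown.

    # Minimal sketch for generating the checksum list that goes into
    # timestamp.html.  Getting the result digitally time stamped is a
    # separate step and is not shown here.

    import hashlib

    # The deliverables from the table above; add anything else you submit.
    FILES = ["index.html", "summary.html", "advisory.html",
             "analysis.html", "answers.html", "costs.html", "files.tar"]


    def md5sum(path):
        """Return the hex MD5 digest of a file, read in chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()


    if __name__ == "__main__":
        for name in FILES:
            print("%s  %s" % (md5sum(name), name))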
The Rules
- You are free to use any tools or techniques that you choose, provided
that the judges are able to readily interpret your results and duplicate or
verify their accuracy. Good publicly available starting points
on forensics and analysis include:
http://www.zeltser.com/sans/gcih-practical/revmalw.html
No matter what tools/methods you choose, please make sure you explain
them in your analysis and cite references to resources (e.g., RFCs, CERT
or SANS "how to" documents) to help others learn by example.
Don't forget: this is a Honeynet Project brainchild, so learning
is what it's all about. And fun. It's all about learning and fun.
Oh yeah, and security. Learning, fun, AND security. ;)
- You may work as a team, but if your entry is selected as one of the
Top 20, you'll have to fight over one copy of the book.
- Deliver the results of the analysis in such a way that the judges
can quickly and easily consume the information, and such that its
authenticity, time of production, and integrity can be verified
independently (e.g., an ISO 9660 CD-ROM or .tar archive,
with digital time stamps, and PGP signatures and/or MD5 checksums).
- All submissions MUST be time stamped prior to 24:00
GMT on Friday, 31 May, 2002. The digital time stamps and
postmarks will be used to determine the 20
Know Your Enemy book winners.
One free digital time-stamping service you can use is Stamper.
- All submissions should be sent (or shipping address arranged, if
CD-ROMs are being produced) to challenge@honeynet.org.
- The person who hacked the box is NOT eligible, nor are members
of the Honeynet Project. Members of the Honeynet Research Alliance or
companies employing Honeynet Project members are eligible (and encouraged!)
to enter, but their entries (even if Top 20) will not receive copies of
Know Your Enemy.
The books go to other entrants.
- Entries must be written in English (UK and Aussie English accepted,
but go light on the regional slang, please!).
Judging and Prizes
Submissions will be judged by a panel of experts, and the winners will be
selected and announced on Monday, 08 July, 2002. All decisions of the judges
are final (no recounts or legal challenges by teams of grossly
overpaid lawyers will be tolerated!). The judges include (but are not
limited to):
- David Dittrich
- Job de Haas
- K2
- Halvar
- Gera
- Niels Provos
The judges will use the following scoring method to determine the winners,
for a total of 50 points.
- 0-30 points for the questions, broken down as 0-5 points for
each of the six questions.
- 0-5 points for documentation. Did you include all of the documentation,
and was it easy to read, follow, and understand?
- 0-5 points for analysis. Did you show the tools, methods, and process
you used to analyze the data and reach your conclusions?
- 0-5 points for correct conclusions. You may have excellent analysis and
documentation, but did you come to the correct conclusions?
- 0-5 points for extra credit. Extra credit means going above and beyond
the requirements here, such as developing your own tools.
- There are no points for the bonus questions; their only purpose
is to break ties.
The top 20 entries will each be awarded a copy of the
Know Your Enemy book. Additional top prizes include two licensed versions of the reverse
engineering tool IDA Pro, one free
pass to the Black Hat Briefings, and a $200 Amazon gift certificate from the fine folks
at DataRescue. These will be awarded based on the algorithm
of our choosing. After the winners are announced, the top 20 entries will be posted for the
security community to review. We hope that the community can learn from, and build on,
the different techniques that different people and organizations use.
Good luck, and have fun!
--- The Honeynet Project