Research Spotlight On Honeypots

honeypots.jpg (original author: LollyKnit, CC licensed)

Description

PhD student Christian Seifert and his supervisors, Dr Peter Komisarczuk and A/Prof Ian Welch, develop client honeypots: systems that track down and classify malicious web sites.

Across the web, hundreds of thousands of malicious web sites perform drive-by downloads, pushing malware onto a user's computer without their explicit knowledge. This malware causes many of the problems users experience every day: pop-up adverts, slow internet connections, or even crashed computers. Finding and flagging these suspicious web sites helps keep web surfers safe online.

Christian, Peter and Ian run a cluster of fifteen Windows XP computers here at ECS that crawl the web and deliberately expose themselves to compromise by malicious sites. When any anomalous behaviour occurs, the web site that caused it is marked as suspicious. The process is fully automatic and far faster than any human classifying sites by hand. Even so, with the machines running 24 hours a day, 365 days a year, they can still only scratch the surface of the World Wide Web, so MSc student David Stirling is investigating how a grid of machines, scaling up to hundreds of computers, could speed the process up further. The work has serious implications for all web users, and could be used to help Google improve their search engine warnings.
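At its core, the detection step can be pictured as a simple state-comparison loop: record the state of the guest machine, visit a page in a vulnerable browser, record the state again, and mark the site as suspicious if anything changed that the visit should not have changed. The Python sketch below illustrates that idea only; the watched directories, the iexplore.exe call and all function names are assumptions made for illustration, not the group's actual HoneyC or honeypot code.

# A minimal sketch of the state-comparison idea behind a high-interaction
# client honeypot. All names, paths and the browser command are illustrative
# assumptions, not the team's actual implementation.
import hashlib
import pathlib
import subprocess

# Directories whose contents are watched for unexpected changes (assumed).
WATCHED_DIRS = [pathlib.Path(r"C:\Windows\System32"), pathlib.Path(r"C:\Users")]


def snapshot(dirs):
    """Hash every readable file under the watched directories."""
    state = {}
    for root in dirs:
        for path in root.rglob("*"):
            if path.is_file():
                try:
                    state[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
                except OSError:
                    pass  # locked or vanished file; ignore in this sketch
    return state


def visit(url, dwell_seconds=30):
    """Open the URL in a vulnerable browser inside the disposable machine."""
    try:
        subprocess.run(["iexplore.exe", url], timeout=dwell_seconds, check=False)
    except subprocess.TimeoutExpired:
        pass  # the browser is killed after the dwell time; that is expected


def classify(url):
    """Visit one URL and flag it if the guest system changed unexpectedly."""
    before = snapshot(WATCHED_DIRS)
    visit(url)
    after = snapshot(WATCHED_DIRS)
    changed = {p for p in set(before) | set(after) if before.get(p) != after.get(p)}
    return ("suspicious" if changed else "benign"), changed


if __name__ == "__main__":
    verdict, evidence = classify("http://example.com/")
    print(verdict, "-", len(evidence), "files changed")

A production system would also watch processes and registry state, and would roll the machine back to a clean snapshot after each suspicious visit so that one compromise cannot taint the next measurement.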

Collaboration

InternetNZ-logo.png (InternetNZ sponsors us to survey all .nz sites)

The client honeypot work has drawn international interest, and here in New Zealand InternetNZ has sponsored Christian and Ian to survey all sites on the .nz domain.


Download

This work is available under an open source license (GPL) and can be downloaded from the HoneyC site.

Further Information

Members

Christian Seifert, Dr Peter Komisarczuk, A/Prof Ian Welch, Ramon Steenson
