Archive for the 'Testing' Category

Jan 24 2014

Can’t get there from here

I’ve had an interesting problem for the last few days.  I can’t get to the Hack in the Box site, HITB.org, or the HITB NL site from my home near London.  Turns out I can’t get to the THC.org site or rokabear.com either.  That makes four hacking conferences whose sites I can’t reach.  And I’m not the only one: apparently a number of people using Virgin Media as their ISP in the UK can’t get to these sites, while people on other ISPs in Britain can reach all four.  I can even get to them if I log into my corporate VPN, just not while the traffic is flowing out through my home network.  I’m not going to accuse Virgin Media of blocking these sites, but I’m also not ruling out chicanery on their part as a cause.  I also make no claim that I possess the network kung-fu to verify that my testing is more than scratching the surface of this problem.

So here’s how this all started:  Yesterday morning I saw a tweet that the early bird sign-up for Hack in the Box Amsterdam was going to end soon.  I know some of the organizers of the event and I’ve wanted to go for a long time, so I decided to get my ticket early and save the company a few bucks.  I opened up a new tab in Chrome, typed in haxpo.nl and … nothing, the request timed out.  Hmm.  Ping gave me an IP, so the DNS records were resolving, but the site itself was timing out.  I switched to the work computer, only to find the same thing happening.  Then I logged into the corporate VPN and tried again, and suddenly everything worked.  Curious.

At first I thought this might be a stupid DNS trick played at the ISP, so I changed my DNS resolvers to a pair of servers I’m relatively certain aren’t going to play tricks: Google’s 8.8.8.8 and the DNS server from my old ISP back in the US, Sonic.net (who I highly recommend, BTW).  This didn’t change anything; I still couldn’t get to HITB.  I had to get working, so I did what any smart security professional does: I threw up a couple of tweets to see if anyone else was experiencing similar issues.  It turns out a number of people, all using Virgin Media, had the identical problem.  This is how I found out that THC and Rokabear are also not accessible for us.
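If you want to repeat that check yourself, here’s roughly what it boils down to.  This is a minimal sketch that assumes the dnspython package is installed (resolve() is the dnspython 2.x call; older versions spell it query()); the hostnames are the ones from this post.

  # Compare A-record answers from Google's resolver and the system resolver,
  # to rule out DNS tricks at the ISP.  Assumes: pip install dnspython
  import dns.resolver

  SITES = ["www.hitb.org", "haxpo.nl", "thc.org", "rokabear.com"]

  def lookup(site, nameserver=None):
      resolver = dns.resolver.Resolver(configure=(nameserver is None))
      if nameserver:
          resolver.nameservers = [nameserver]
      try:
          return ", ".join(rr.address for rr in resolver.resolve(site, "A"))
      except Exception as exc:          # NXDOMAIN, timeout, etc.
          return f"lookup failed ({exc})"

  for site in SITES:
      print(f"{site:15} system: {lookup(site):20} 8.8.8.8: {lookup(site, '8.8.8.8')}")

If both resolvers hand back the same addresses, as they did for me, DNS isn’t where the problem lives.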

As yesterday went by, I got more and more confirmations that none of these hacking sites were available for those of us on Virgin Media.  At first I thought it might simply be VM blackholing the sites, but VM’s social media person sent me a link to the list of sites Virgin Media blocks by court order.  None of the hacking sites appear on that list, and besides, Virgin Media actually throws up a warning banner page when they block a site; they don’t simply blackhole the traffic.  They will limit your internet access if they feel you’re downloading too many big files during peak usage hours, but that’s a discussion for another day.

The next step was tracert.  I’m a little chagrined to admit I didn’t think of tracert earlier in the process, but to be honest, I haven’t really needed to use it in a while.  What I found was a bit interesting (and no, you don’t get the first two hops in my network chain; you have no need to know what my router’s IP is).

 C:\Users\Martin>tracert www.hitb.org

 Tracing route to www.hitb.org [199.58.210.36]

  3     9 ms     7 ms     7 ms  glfd-core-2b-ae3-2352.network.virginmedia.net [8.4.31.225]
  4    11 ms     7 ms     7 ms  popl-bb-1b-ae3-0.network.virginmedia.net [213.10.159.245]
  5    10 ms    11 ms    10 ms  nrth-bb-1b-et-700-0.network.virginmedia.net [62.53.175.53]
  6    11 ms    15 ms    14 ms  tele-ic-4-ae0-0.network.virginmedia.net [62.253.74.18]
  7    13 ms    16 ms    14 ms  be3000.ccr21.lon02.atlas.cogentco.com [130.117.1.141]
  8    16 ms    14 ms    16 ms  be2328.ccr21.lon01.atlas.cogentco.com [130.117.4.85]
  9    17 ms    15 ms    16 ms  be2317.mpd22.lon13.atlas.cogentco.com [154.54.73.177]
 10    88 ms   102 ms   103 ms  be2350.mpd22.jfk02.atlas.cogentco.com [154.54.30.185]
 11    99 ms   100 ms    91 ms  be2150.mpd21.dca01.atlas.cogentco.com [154.54.31.129]
 12    97 ms    94 ms    96 ms  be2177.ccr41.iad02.atlas.cogentco.com [154.54.41.205]
 13   102 ms   100 ms   105 ms  te2-1.ccr01.iad01.atlas.cogentco.com [154.54.31.62]
 14   101 ms   210 ms   211 ms  te4-1.ccr01.iad06.atlas.cogentco.com [154.54.85.8]
 15    90 ms    91 ms    99 ms  edge03-iad-ge0.lionlink.net [38.122.66.186]
 16    90 ms    94 ms    98 ms  23.29.62.12
 17  nlayer.lionlink.net [67.208.163.153]  reports: Destination net unreachable.

Rather than doing what I thought would be the logical thing and simply hopping across the Channel to hit Amsterdam fairly directly, my traffic leaves the VM network through Cogent, hits a few systems in the US owned by a company called Lionlink Networks LLC and dies.  So my traffic leaves the UK, travels to Switzerland, then to the US, over to Washington DC and then dies.  And this happens with four separate hacker conference sites, but doesn’t appear to happen anywhere else.  Oh, and all four hacking sites take the same basic route and all die shortly after hitting LionLink.  Hmmmm.
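If you want to reproduce the comparison, the sketch below shells out to the system tracert/traceroute for each of the four sites and prints the last hop that actually answered.  The hostnames are the ones from this post; everything else is illustrative.

  # For each site, run the system traceroute and report the last hop that replied.
  # Works on Windows (tracert) and Linux/macOS (traceroute); purely illustrative.
  import platform, re, subprocess

  SITES = ["www.hitb.org", "haxpo.nl", "thc.org", "rokabear.com"]
  windows = platform.system() == "Windows"

  for site in SITES:
      cmd = ["tracert", "-d", site] if windows else ["traceroute", "-n", "-m", "30", site]
      out = subprocess.run(cmd, capture_output=True, text=True).stdout
      # Keep only lines containing an IPv4 address, i.e. hops that actually replied.
      hops = [l.strip() for l in out.splitlines() if re.search(r"\d+\.\d+\.\d+\.\d+", l)]
      print(f"{site}: {hops[-1] if len(hops) > 1 else 'no hops replied'}")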

I know I’m a professional paranoid.  I know how BGP works and that it’s not unusual for traffic to bounce around the internet and go way, way, way out of what a human would consider a direct route, but the fact that all four EU hacking sites route back to the US and that they all die when they hit Lionlink is more than a little suspicious to me.  It’s almost like someone is routing the traffic through Switzerland and the US so it can be monitored for hacker activity, since both countries have laws that allow for the capture of traffic that crosses their borders.  But of course, that would just be paranoid.  Or it would have been in a pre-Snowden world.  In a post-Snowden world, I have to assume most of my traffic is being monitored for anomalous behavior and that the only reason I noticed is because someone at Lionlink screwed up a routing table, exposing the subterfuge.  But that would just be my paranoia speaking, wouldn’t it?

I’m hoping someone with a deeper understanding of the dark magiks of the Internets can dig into this and share their findings with me.  It’s interesting that this routing problem is only happening to people on Virgin Media, and it’s interesting that the traffic appears to be routed through Switzerland and the US.  What I have isn’t conclusive proof of anything; it’s just an interesting traffic pattern at this point in time.  I’m hoping there’s a less sinister explanation for what’s going on than the one I’m positing.  If you look into this, please share your findings with me.  I might just be looking at things all wrong, but I want to learn from this experience whether I’m right or not.

Thanks to @gsuberland, @clappymonkey, @sawaba, @tomaszmiklas, @module0x90 and others who helped verify some of my testing on Twitter last night.  And special thanks to @l33tdawg for snooping and making sure I got signed up for HITB.

Update – And here it is, a much more believable explanation than spying: a route leak.  So much for my pre-dawn ramblings.

From Hacker News on Ycombinator:

This is a route leak, plain and simple. Don’t forget to apply Occam’s Razor. All of those sites which are “coincidentally” misbehaving are located in the same /24.

This is what is actually happening. Virgin Media peers with Cogent. Virgin prefers routes from peers over transit. Cogent is turrible at provisioning and filtering, and is a large international transit provider.

Let’s look at the route from Cogent’s perspective:

 

  BGP routing table entry for 199.58.210.0/24, version 2031309347
  Paths: (1 available, best #1, table Default-IP-Routing-Table)
    54098 11557 4436 40015 54876
      38.122.66.186 (metric 10105011) from 154.54.66.76 (154.54.66.76)
        Origin incomplete, metric 0, localpref 130, valid, internal, best
        Community: 174:3092 174:10031 174:20999 174:21001 174:22013

If Cogent was competent at filtering, they’d never learn a route transiting 4436 via a customer port in the first place, but most likely someone at Lionlink (54098) is leaking from one of their transit providers (Sidera, 11557) to another (Cogent, 174).

Also, traffic passing through Switzerland is a red herring — the poster is using a geoip database to look up where a Cogent router is. GeoIP databases are typically populated by user activity, e.g., mobile devices phoning home to get wifi-based location, credit card txns, etc. None of this traffic comes from a ptp interface address on a core router. GeoIP databases tend to have a resolution of about a /24, whereas infrastructure netblocks tend to be chopped up into /30s or /31s for ptp links and /32s for loopbacks, so two adjacent /32s could physically be located in wildly different parts of the world. More than likely, that IP address was previously assigned to a customer. The more accurate source of information would be the router’s hostname, which clearly indicates that it is in London. The handoff between Virgin and Cogent almost certainly happens at Telehouse in the Docklands.

If someone were, in fact, trying to intercept your traffic, they could almost certainly do so without you noticing (at least at layer 3.)
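One footnote to the explanation above: the “/24” point is easy to check for yourself with nothing but the standard library.  A quick sketch, using the 199.58.210.0/24 prefix from the BGP output and the hostnames from earlier in the post:

  # Resolve each site and check whether it lands in the leaked 199.58.210.0/24.
  # Standard library only; the prefix comes from the BGP table entry above.
  import ipaddress, socket

  leaked = ipaddress.ip_network("199.58.210.0/24")
  for site in ["www.hitb.org", "haxpo.nl", "thc.org", "rokabear.com"]:
      try:
          addr = ipaddress.ip_address(socket.gethostbyname(site))
      except OSError as exc:
          print(f"{site}: lookup failed ({exc})")
          continue
      print(f"{site} -> {addr} ({'inside' if addr in leaked else 'outside'} {leaked})")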


Oct 02 2013

UK wants a cyber defense force, just like the US

The UK has been following the US government’s lead on a number of things.  Earlier this year they launched a plan called the Cyber Information Sharing Partnership (CISP) to promote information sharing between the UK government and critical infrastructure providers within the UK.  This somewhat mirrors the long-running efforts in the US under the umbrella of the Information Sharing and Analysis Centers (ISACs).  In both cases the goal seems to be enabling a communication channel that allows government to share information with industry insiders so they can protect themselves better.  If this follows the US pattern, the CISP program will spend much of its first few years building up trust with the participating companies.  However, the relationship between business and government is slightly different in the UK, something I’m finding out up close and personal, which might change the equation in favor of building that trust much faster.

There are two additional efforts that mirror things happening in the US.  The first is a plan to create a cyber defence force in the UK called the Joint Cyber Reserve Unit (I wonder if they’ll call it ‘J Crew’).  The JCRU will have the ability to protect UK computer systems and, if needed, perform “cyber strikes” against ‘enemies’, though both of those terms are poorly defined at this point.  The US has been working on a similar capability in the military for a number of years, and there have been stories about a non-military version of this effort, but very little news of what is really being done in the US has leaked out.  I strongly suspect that the UK version of this effort will be similarly quiet, working almost entirely behind the scenes.

The second effort is an accreditation program run by the UK’s GCHQ (the equivalent of the US NSA) to test and accredit security professionals as CESG Certified Professionals.  There are six types of certification, ranging from Practitioner to IT Security Officer.  It’s unclear exactly what will be tested without a lot of digging, but it looks like an interesting effort.  It’s got to be better than the US efforts that basically state security professionals need to have their CISSP.  I plan on taking a much longer look at this to see if any of the accreditations are appropriate for me to apply for personally.

Our governments are obviously sharing a lot of experience on the spying front, but it’s nice to see them sharing information on the security front as well.  Maybe the US can learn a little from the UK’s efforts at accreditation.  I’m not going to hold my breath though.


Jun 04 2012

Using Facebook without being tracked

I’ve always hated the way Facebook has endeavored to track every single action its users take.  Which is funny, considering how much of my life I put on Twitter.  But the main difference between the two social media platforms is about choice, at least for me.  With Twitter, I decide what to put online 140 characters at a time.  I might reveal a little more information if I’m not careful with the GPS settings on my phone or camera, but for the most part it’s simply the statements I choose to make that go online and are published for everyone to read.  Facebook, on the other hand, has from the early days been far more intrusive and has done everything it can to track each and every digital step its users take.  Between the constantly shifting privacy policies, the way privacy settings change and reset every few months, and Timeline, a tracking monstrosity that became mandatory, Facebook is a privacy advocate’s worst nightmare.  The list of ways Facebook tracks and collates data on every user is both awe-inspiring and terrifying in its magnitude, and Timeline is a privacy violation of the first order, at least in my mind.

But, to put it quite simply, they’re the biggest kid in the social media playground.  When your grandmother, who can barely answer an email, starts following you on Facebook, you know it’s achieved deep penetration in the marketplace.  And since it’s so big, just by the nature of its gravity, more users and more businesses are drawn to it.  If you don’t have an account, people look at you like you’re a little strange and behind the times, whether it’s true or not.  Quite frankly, in many people’s lives it’s become a necessary tool for communicating with friends, family and/or customers, to the point that not having an account is nearly unthinkable.

Even I’ve had a Facebook account for years, as much as I’ve hated the idea.  The main reason I created it was simply to grab my own name; I had already seen several people in the security community impersonated by someone who grabbed their name first and created a page for them, usually with malicious aims.  I didn’t want that to happen to me, so I grabbed my account.  I used it a little at first, mostly by integrating my Twitter stream into Facebook, but as the privacy concerns got bigger and bigger, I stopped using it altogether.  I kept the account and logged in every six months or so, immediately clearing my cookies and rebooting my system afterward to clean the stain it left behind.  I know millions of people use Facebook daily without serious harm, but the thought of having my activities tracked to the degree Facebook does it is not something I’m comfortable with.

But, as I stated earlier, if you’re not on Facebook, you’re significantly handicapping yourself in interacting with friends, family and the people you do business with.  As much as I hate being tracked, I came to the conclusion that it’s time to find a way to use Facebook while also maintaining control of what data is being pulled into my social media network(*).  So I did what any social media security geek would do: I tweeted about the problem and waited for the replies to come in.  And did they ever.  I’ve collected some of the best links and software suggestions below.

When all was said and done, I decided the best way for me to use Facebook was to use the one major browser I hadn’t been using on my main system: Chrome.  Rockmelt sounded cool, but I didn’t want to spend the time to research it and learn a different interface.  Adding privacy filters or other extensions that would let me use Facebook privately in Firefox had some appeal, but relying on extensions to keep up with Facebook’s changing policies and technologies didn’t inspire confidence over the long haul.  I already had Chrome installed and wasn’t using it, so it was actually a pretty easy choice, and because I’m only using it for Facebook, most of the concerns about having my browsing practices tracked are assuaged.  At least until Facebook learns to track across multiple browsers, that is.

Since I’m using Chrome as a dedicated Facebook browser, I decided to simply rely on the default install and change a number of the privacy settings, which is not something I would suggest if you also use Chrome for other web browsing.  If you click on the wrench in the upper right-hand corner of Chrome and select ‘Settings’, it will open a new tab for the settings page.  At the bottom of the page is a link, “Show advanced settings…”, which reveals advanced settings such as Privacy.  The ‘Content Settings’ button under Privacy opens a new window, where the meat of the controls I wanted live.  I selected the following controls (there’s also a small launcher sketch after the list if you want to keep Facebook in a completely separate Chrome profile):

  • Cookies: Allow local data to be set for the current session only.
  • Cookies: Clear cookies and other site and plug-in data when I close my browser.
  • JavaScript: Do not allow any site to run JavaScript (you have to add exceptions for Facebook itself: https://[*.]facebook.com:443 and http://[*.]facebook.com).
  • Handlers: Do not allow any site to handle protocols.
  • Plug-ins: Click to play.
  • Notifications: Do not allow any site to show desktop notifications.
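And if you want to go one step further than tweaking the default profile, Chrome’s --user-data-dir switch lets you point it at a throwaway profile used only for Facebook, so none of its state ever touches your normal profile.  A minimal launcher sketch; the Chrome path and profile folder are assumptions you’d adjust for your own machine.

  # Launch Chrome against a dedicated, disposable profile used only for Facebook,
  # so its cookies and history stay away from the main profile.  The binary path
  # and profile directory below are assumptions -- adjust them for your system.
  import subprocess
  from pathlib import Path

  CHROME = r"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"
  PROFILE = Path.home() / "facebook-only-profile"
  PROFILE.mkdir(exist_ok=True)

  subprocess.Popen([
      CHROME,
      f"--user-data-dir={PROFILE}",   # keep all browser state in this one folder
      "--no-first-run",               # skip the first-run import/setup prompts
      "https://www.facebook.com/",
  ])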

There’s probably more I can do to protect myself from tracking, especially if I wanted to install some of the Chrome plug-ins specifically aimed at Facebook.  I’ve been using Facebook again for about a week or so.  I plan on using it more in the future for putting up some of the pictures I take during my world travels, to promote the podcast and to promote the work I do at Akamai.  I’m not really happy about getting sucked back into Facebook, but it isn’t really as evil as I sometimes make it out to be.  It is, however, a huge, faceless organization that is determined to make a profit off of me no matter what else happens.

BTW, I do my banking on a completely separate computer that I do almost no other browsing on.  Or email or social media for that matter.

Additional links:

(*The new version of ‘privacy’ is controlling the information about you that flows onto the interwebz.  The pre-2000 view of privacy is dead, and even the new version is on life support with the data mining capabilities of many of our modern tools.)


Mar 15 2010

Mykonos: WAF, IPS or honeypot?

Published under Firewall, Hacking, Testing

I’m not an expert on web application firewalls, which is why I’m asking for feedback on the Mykonos Security Appliance.  I was given a demo of the product at the RSA Conference this year and it’s one of the few products I’ve seen lately that’s doing something new and innovative.  Or more accurately, it appears to be doing something new and innovative; it’s still in beta and this is a technology that’s outside my comfort zone.  If you’re someone with expertise in WAFs, it should be worth at least a short look.

In a lot of ways, Mykonos appears to be a standard WAF; it can be used to protect your site from many of the standard coding errors that a WAF is designed to deal with.  It addresses the OWASP Top 10 and has all the reporting capabilities to tell you something’s wrong; in this area it doesn’t appear to have a lot of extra punch you can’t get elsewhere.  Where it does start to distinguish itself is in tracking, categorizing and responding to malicious attacks on your web site.

Want to know more about who’s probing your web site?  Mykonos will dynamically modify the code your site is serving to gather more information on who’s attacking.  It’ll tell you about the attacker’s level of sophistication: whether they’re just trying to manipulate a price in the shopping cart, trying a SQL injection attack or working on something at the higher end of the attack scale.  And it gives you a lot of choices about how to respond: simply block the user, send custom code telling them they’ve been identified and logged, or act as a honeypot to gather even more information about the attacker and how he’s planning to attack your site.  The tracking and information-gathering abilities seem pretty impressive, and it may be worth looking at for that alone.
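I have no insight into how Mykonos actually implements any of this, but the general idea of dynamically modifying the pages you serve in order to spot and profile attackers is easy to sketch.  The toy WSGI middleware below is purely illustrative and in no way Mykonos’s code: it plants a hidden honeytoken field in outgoing HTML and flags any client that sends it back with a modified value.

  # Toy illustration of the concept only -- NOT how Mykonos works: inject a hidden
  # honeytoken form field into served HTML and flag anyone who tampers with it.
  HONEYTOKEN = b'<input type="hidden" name="account_role" value="user">'

  class HoneytokenMiddleware:
      def __init__(self, app):
          self.app = app                      # any WSGI application

      def __call__(self, environ, start_response):
          # A client that returns the bait field with a changed value is probing us.
          query = environ.get("QUERY_STRING", "")
          if "account_role=" in query and "account_role=user" not in query:
              print(f"possible tampering from {environ.get('REMOTE_ADDR')}: {query}")

          def patched_start(status, headers, exc_info=None):
              # Drop Content-Length: injecting the bait changes the body size.
              headers = [(k, v) for k, v in headers if k.lower() != "content-length"]
              return start_response(status, headers, exc_info)

          body = b"".join(self.app(environ, patched_start))
          return [body.replace(b"</form>", HONEYTOKEN + b"</form>")]

Wrap any WSGI app in it (app = HoneytokenMiddleware(app)) and you get a crude version of the “tell me who’s poking at my site” behavior; the real product obviously does far more, and does it without you writing a line of code.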

Mykonos looks like more than a plain vanilla web application firewall, and the downside is that it requires more work from the administrator and more work from your developers to make full use of its capabilities.  That also means its potential for becoming shelfware is much greater.  But if you’re looking for more than what a standard WAF offers, it might be worth looking at this product.  And once you do, I’d appreciate feedback on your impressions.  Is Mykonos a potential new product market, a single product with greater capabilities, or just a flash in the pan that won’t amount to much?


Mar 03 2010

RSAC2010: ICSA Labs

Published under Malware, Risk, Testing

One of the things I don’t believe we see enough of in the security field is independent testing.  Vendors of all stripes make claims about what their products do, and without independent testing it’s hard to tell whether they’re the cream of the crop or a bad apple.  ICSA Labs is one of the few companies doing the sort of testing needed to tell the two extremes apart.  I took a few minutes to sit down with Andy Hayter of ICSA Labs to talk about anti-virus testing, consumer education and a new initiative to use the testing ICSA does in the real world.  For the sake of transparency: ICSA is part of Verizon, the company I work for as well.

NSP-RSAC2010-ICSALabs.mp3


Aug 16 2009

Firefox and IE8 tied, Safari 4 loses big

I finally had the time to sit down and read the NSS Labs Web Browser Security Phishing Protection paper this morning. The paper tests how well the reputation-based systems built into today’s more popular browsers protect users against phishing attempts by malicious sites.  The big winners in the test were Firefox 3 (not 3.5) and IE8, which almost tied at 80% and 83% accuracy for blocking phishing sites.  Given that the study quotes a margin of error of 3.6%, the two browsers are equal for most intents and purposes.  The big loser of the test was Safari 4, which only blocked 2% of malicious sites.  I hope Safari on my iPhone is better than it is on my MacBook, or at least that there are fewer phishing sites targeting the iPhone.

It’s very interesting that Firefox 3, Chrome 2 and Safari 4 all use Google’s Safe Browsing data feed but get very different results from the same data.  Chrome 2 only had a 16% success rate in blocking, compared with Firefox 3 at 80% and Safari 4 at 2%.  So why the big difference between three browsers running off of the same information?  NSS Labs doesn’t offer an explanation and apparently none of the developers did either, so either Firefox is pulling in a lot of additional information from somewhere or the Chrome and Safari developers have some learning to do.

What I personally found most interesting about the paper, though, was that the Anti-Phishing Working Group is quoted as saying the average phishing site has a lifespan of only about 52 hours.  None of the browsers really reach full effectiveness at blocking a phishing site until roughly 48 hours after the site becomes active, so you’re only getting about 4 hours of maximum benefit.  The long-term trends look good, but it’s a little disturbing that many phishing sites go relatively undetected for at least the first 24 to 48 hours they’re live.

I’d be curious to see how Firefox 3.5 changes this mix.  Apparently it wasn’t stable enough to be used in this test, but maybe we’ll see a new set of tests next quarter.  I’m also wondering what effect the FF plugin NoScript would have on the results.  Since NoScript isn’t, strictly speaking, an anti-phishing tool, I doubt NSS Labs will be testing it any time soon, but I’d like to know how much more secure it makes my web surfing experience.

Now to go back and read the Socially Engineered Malware report. 


Sep 02 2008

Got Chrome?

Published under Testing

Unless you’ve been hiding under a rock today, you’ve probably heard that Google released their own browser, Chrome. The comic book that they’ve posted with it is cool, if for no other reason than it’s illustrated by Scott McCloud. But my first reaction to Chrome is “So what?”

Yes, it has a lot of security features built in. But so do IE 7/8 and Firefox 3. I was a little disturbed when I realized that Chrome not only copied all of my bookmarks and history from FF, it also imported my user names and passwords. I’m less concerned that Chrome was able to do this than by the fact that the passwords can be exported from Firefox at all. Knowing this, I need to rethink saving any of my account information in the browser at all.

There are a lot of other people writing reviews of Chrome, so I’m going to keep it simple. After a short test run, it seems to render everything at least as well as Firefox. It’s a bit faster to load and it gives me just a touch more screen space by using the top bar for tabs rather than as just a placeholder. One interesting thing is that it appears to use quite a bit of memory, but it’s using it to run individual tabs as separate processes rather than one process the way Firefox and IE do. This is obviously part of the virtualization and sandboxing Google promised.
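If you’re curious, it’s easy to watch that process model yourself.  A small sketch, assuming the psutil package is installed (psutil isn’t part of Chrome or the standard library, just a convenient way to count processes):

  # Count Chrome's processes and total their resident memory -- a quick way to see
  # the one-process-per-tab model in action.  Assumes: pip install psutil
  import psutil

  chrome = [p for p in psutil.process_iter(["name", "memory_info"])
            if p.info["name"] and "chrome" in p.info["name"].lower()]

  total_mb = sum(p.info["memory_info"].rss for p in chrome) / (1024 * 1024)
  print(f"{len(chrome)} Chrome processes, roughly {total_mb:.0f} MB resident in total")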

I’ll be interested in reading what people have to say about Chrome over the next couple of weeks, but I think I’ll be doing the majority of my surfing in Firefox 3 for the foreseeable future. I rely too heavily on many of the add-ons in Firefox to switch easily. How are your experiences with Chrome turning out and do you see yourself moving to Chrome from your current browser?


Jul 07 2008

Finally upgraded to FF3

Published under General, Testing

I upgraded my secondary computers, the MacBook Pro and the wife’s desktop, to Firefox 3 the day it came out last month, but I put off upgrading my primary system until this weekend. Why? Because I dislike a number of the tab behaviors Firefox displays by default; they’re fine for light browsing, but for my more serious browsing they got to be annoying. Trying to organize show notes and articles for blog posts is just easier when tabs behave the way I want them to, not the way Mozilla wants them to. So I waited for Tab Mix Plus to catch up with Firefox 3. Which they’ve done, even though it’s still a ‘development’ version.

There are a few features TMP offers that I really need. The first is opening URLs I type in a new tab rather than in the current window by default. There’s probably a way to get FF3 to exhibit this behavior without TMP, but I’ve never been able to get it to work right. Another feature is the ability to automatically reload a particular tab on a regular basis. I have a couple of stats pages I keep open that I want to reload every 15 minutes, like my blog stats and podcast stats pages. Neither of these features is absolutely necessary, but they make my browsing experience more enjoyable.

Now to upgrade the kids’ computer and the other household laptop. It’s a bit scary that we’ve got more computers than people in our household. But I guess that’s part of what happens when you’re a computer geek.


Apr 09 2008

RSA 2008: Rick Moy, NSS Labs

Published under Testing

NSS Labs is an independent testing lab that certifies firewalls, UTMs and a host of other products for compliance with programs such as PCI. I had a chance to sit down with Rick Moy for a few minutes and talk about the proper use of these reports.

nsp-RSA2008-RickMoy.mp3


Jul 16 2007

You’ve got to appreciate truth in advertising

I use Gmail as my central email repository and usually the spam filters they use are pretty good.  But lately they’ve been a little overly aggressive, so I have to comb through to make sure no legitimate email is being caught accidentally.  There’s not a lot that’s misidentified, but there’s enough to make it worth the few minutes a day it takes to double-check the spam folder.

I’ve been amazed at some of the subject lines I see, as well as what I see in the preview of the email.  There’s no way I’m going to click on any of them to find out what else is in the spam, because it’s just not worth the risk.  But I do have to say that my favorite subject line so far is “Thanks for contributing to our financial success”.  It’s honest and straightforward, even if it is just an attempt to rip off people around the globe.

On a side note, I used to clean out my spam folder every couple of days, but in March I started letting messages accumulate and get deleted automatically once they’ve aged 30 days.  It’s been interesting watching the number of spams spike and drop.  At one point I had gathered nearly 9000 spams in a 30-day period, which works out to an average of 300 spams a day.  For me, that means about 60% of my email is spam, a far lower percentage than most people see.  I guess being subscribed to ten or so mailing lists had to have some benefit.

Mine is just a single data point, compared to the millions some anti-spam vendors get to see.  But I like having a personal high-water mark to compare to what the vendors are reporting.  I’m not a spam expert, so it’s interesting to see the new spam subjects that companies like F-Secure report.  Anyone else out there keep track of the spam they receive for fun?

