Recent Topics

Remove entry from Black list

Started by on Oct 02, 2005 – Contents updated: Oct 02, 2005

Oct 02, 2005 14:44    

Today I asked for an update (of the antispam blacklist) from b2evo's servers. To my shock, I found download.com in the list. I want to remove it. How do I do so?

Thanks

Oct 02, 2005 20:44

Thanks
Another doubtful entry is cjb.net. cjb.net is a free URL redirection service. I don't think it spams itself; maybe someone spams for xxx.cjb.net.

Oct 02, 2005 21:49

Indyan wrote:

Another doubtful entry is cjb.net. cjb.net is a free URL redirection service. I don't think it spams itself; maybe someone spams for xxx.cjb.net.

Personally, I don't hesitate to add any redirection or "dynamic DNS" service to the blacklist when spammers use it for their spamming addresses.

It appears it's far too easy to create dozens, if not hundreds, of new redirections each day with that kind of service. Adding each separate URL to the (static) antispam blacklist is pointless, since those URLs don't live more than a couple of days; new ones are created through the same services, so fighting spam by exact URL becomes a full-time job. Blacklisting the whole service, however, is very efficient in terms of spam reduction.
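To make the point concrete, here is a minimal sketch of why listing a service's base domain catches every throwaway subdomain at once. This is illustrative only, not b2evolution's actual antispam code; the function name and the example blacklist entry are assumptions:

```python
# Minimal sketch of domain-level blacklisting: listing a service's base
# domain (e.g. cjb.net) also catches every throwaway subdomain of it.
# Illustrative only -- not b2evolution's actual antispam code.
from urllib.parse import urlparse

BLACKLIST = {"cjb.net"}  # banned base domains (example entry)

def is_blacklisted(url):
    host = urlparse(url).hostname or ""
    # Match the listed domain itself and any subdomain of it.
    return any(host == d or host.endswith("." + d) for d in BLACKLIST)

print(is_blacklisted("http://xxx.cjb.net/"))   # True: subdomain is caught
print(is_blacklisted("http://example.com/"))   # False: unrelated domain
```

With a base-domain match like this, every new xxx.cjb.net redirection a spammer creates is rejected without any further blacklist updates.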

Oct 02, 2005 22:22

Thanks kwa! I couldn't have said it better, because all I know is what I see on the inside of the blacklist 8| 28 different reports for various flavors of cjb.net = bye-bye cjb.net!

There are exceptions, of course. The "have a free blog" services are normally NOT vectored down to their base domain, because it's only a few bad apples.

Oct 03, 2005 00:02

EdB wrote:

Thanks kwa! I couldn't have said it better, because all I know is what I see on the inside of the blacklist 8| 28 different reports for various flavors of cjb.net = bye-bye cjb.net!

There are exceptions, of course. The "have a free blog" services are normally NOT vectored down to their base domain, because it's only a few bad apples.

There is a huge difference between a free blogging service and a redirection service:

  • most (b2evolution) bloggers have legit visitors, comments, trackbacks and links from and to free blogging services;
  • most (b2evolution) bloggers have spamming visitors, comments, trackbacks and links from redirection services.

Obviously, there might be legit visitors using redirection, dynamic DNS, or open proxy services. (Okay, I've never seen them, but... I still believe ;)) However, is it worth the time spent to handle them?

However, it should be possible to handle redirections in a partially automated way. One can read the following in the SURBL FAQ, under "How are URIs which include redirection sites handled?" (http://www.surbl.org/faq.html#redirect):
    Redirection sites like drs.yahoo.com, tinyurl.com, etc. take external URIs (unfortunately including spam ones) and redirect a browser to them. Therefore spammers can use the redirectors to trick a simple URI check that only looks at the initial URI, making the whole URI appear legitimate. For example, the Yahoo redirection below might be (incorrectly) parsed as a legitimate yahoo domain:

        http://drs.yahoo.com/covey/parr/*http://spammer.address/

    SpamCop itself seems to disambiguate (most of) the redirection. If someone is using a redirector to send traffic to spamdomain.com, SpamCop seems to detect and resolve it correctly to spamdomain.com most of the time. So the data that's used as input to sc.surbl.org already has redirectors correctly handled to some extent. In other words, we're protected on the data input side by the processing that happens at SpamCop to take out the redirection in reported URIs.

    SpamAssassin programs such as SpamCopURI and urirhsbl that use SURBLs are capable of handling redirections to differing degrees. SpamCopURI 0.14 uses LWP to get Location information to untangle up to four levels of redirection sites without actually visiting the sites. URIDNSBL's urirhsbl includes patterns to extract the final domains from some redirection URIs. Further development will probably improve the handling of redirection sites.

    The big picture solution is for the redirection sites to block spam domains on their own. In other words, they should not let spammers redirect through their sites. Until they do so, their services can be abused by spammers. Some, such as tinyurl.com, reportedly actively block and report spammers who abuse their site. Others, such as Metamark and SnipURL, are using SURBLs to deny spammers access to their redirection services. Here is an Open Letter to Redirection Sites (http://www.surbl.org/redirect.html) that may be used or modified to contact them.

In addition to a redirection checker, it would be interesting to manage a fully dynamic antispam blacklist like SURBL (http://www.surbl.org/) does, as explained in "Can the expiration times and inclusion thresholds be tuned further?" (http://www.surbl.org/faq.html#tune):
    Can the expiration times and inclusion thresholds be tuned further? We can make the expiration of records, and therefore the number of days, any arbitrary length. Four days was chosen because we felt it was a good match for the freshness of the SpamCop (SC) spamvertised site data. It was also chosen to keep the amount of data reasonably small. If more of a historical record would be useful, we can keep data for a week or a month. The shortness was partially meant to ensure that the RBL data tracked current SC data fairly tightly and also did not result in too large an RBL. Presently the RBL only has about 500 records; perhaps that's on the small side. We're not too worried about Joe Jobs and other problems in the data, due to some of the averaging effects explained further on.

    sc.surbl.org is meant to be a record of the most frequently reported domains in spam message bodies that SpamCop users choose to report. In this sense it's like a broadly-based, hand-tuned blacklist of domains commonly found in spam. Because quite a few reports need to be received for a domain to get added to sc.surbl.org, it effectively represents a consensus voting system about which URI domains are spammy. One improvement might be to encode the frequency data in the RBL so that more frequently reported domains could be given higher scores.

    Future versions of the data engine behind sc.surbl.org will probably have a longer default expiration time of 10 days, and will probably also set a lower threshold and longer expiration for professional spam operations and for domains hosted at spam-friendly ISPs. We may also adjust the expirations to be longer for domains that receive very many spam reports. In essence, each spam report would describe a "crime" punishable by a longer-expiration "prison sentence", with the sentences served consecutively. With the current data, one additional day per ten reports looks about right.

    Some sort of hysteresis mechanism could also be useful to prevent spam domains which get a low level of reporting from coming off the list and getting back on it repeatedly, as sometimes happens with the original engine. This apparent "recidivism" is caused by the reports expiring and the count dropping back below the inclusion threshold; some fresh reports then come in and raise the count back above the threshold. Longer expiration times automatically help with this to some extent, as do lower thresholds. As message-body spam domain blocking becomes more prevalent, reports will tend to decrease rapidly after the initial flurry, and a longer memory of those will become important.

It is also worth reading "Isn't the number of domains in sc.surbl.org somewhat small at 500 when using a report inclusion threshold of 10?" (http://www.surbl.org/faq.html#size):
    Isn't the number of domains in sc.surbl.org somewhat small at 500 when using a report inclusion threshold of 10? An interesting thing is that the data seems pretty well behaved in a statistical sense. Halving the threshold approximately doubles the size of the resulting list in the range of thresholds I looked at (approx. 5 to 25 "report counts"). Lengthening the expiration period should also increase the size of the list for a given threshold, and the additional data gained from doing so could be pretty valid.

    One thing I did notice from top-sites.html is that there is a persistent pharma spammer hosted in China or Brazil that almost always seems to be near the top of the list. They had used domain names like medz4cheap.com, and some other names. Currently they're using medicalfhtjk.com. What's interesting is that their domains only last a week or so before they switch to a new one, with very similar-style spams referencing all of them. In their case at least, that kind of argues for a one-week or so expiration, but that's only one anecdotal example and not really a basis for a policy. Perhaps it's not a coincidence that 7 days is also a typical minimum zone file expire time, i.e. a length of time the spam domain zone file might be cached on name servers.

    Update: we have lowered the inclusion threshold from 20 to 10. See note 2 (http://www.surbl.org/data.html#notes) for one semi-rational rationale why. The size of the SURBL domain list is now about 500 records, up from 250 when the threshold was 20.

To sum up my position developed here:
  • checking redirections would be interesting, so as to avoid adding the redirection services themselves to the blacklist and instead add the redirection destination domains;

  • a dynamic antispam blacklist (a list that handles centralized additions, but that shortens itself automatically by removing "old" spamming sites) would be welcome.

There is another thread talking about the spam fight here:
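The two points above can be sketched together in a few lines. This is a hypothetical illustration, not SURBL's or SpamCop's actual code: `unwrap_redirect` does the crude pattern-based unwrapping the SURBL FAQ describes (looking for a second embedded `http://`), and `DynamicBlacklist` implements the report-threshold-plus-expiration idea, including the "one additional day per ten reports" rule. All names, thresholds and time spans are assumptions:

```python
# Hypothetical sketch of the two proposals above; not SURBL/SpamCop code.
# All names, thresholds and time spans are illustrative assumptions.
import time
from collections import defaultdict
from urllib.parse import urlparse

def unwrap_redirect(uri):
    """Crude pattern-based unwrapping: if a second 'http://' is embedded
    in the URI (as in redirector URLs), treat it as the real destination."""
    inner = uri.find("http://", len("http://"))
    return uri[inner:] if inner != -1 else uri

THRESHOLD = 10        # reports needed before a domain is listed
BASE_TTL = 4 * 86400  # base report lifetime: four days, in seconds
EXTRA_PER_10 = 86400  # one extra day of "sentence" per ten reports

class DynamicBlacklist:
    """A self-shortening blacklist: a domain is listed while enough
    recent reports exist; heavily reported domains stay listed longer."""
    def __init__(self):
        self.reports = defaultdict(list)  # domain -> report timestamps

    def report(self, uri, now=None):
        domain = urlparse(unwrap_redirect(uri)).hostname
        self.reports[domain].append(time.time() if now is None else now)

    def is_listed(self, domain, now=None):
        now = time.time() if now is None else now
        stamps = self.reports[domain]
        # Reports expire after BASE_TTL plus one day per ten reports.
        ttl = BASE_TTL + (len(stamps) // 10) * EXTRA_PER_10
        self.reports[domain] = [t for t in stamps if now - t < ttl]
        return len(self.reports[domain]) >= THRESHOLD

bl = DynamicBlacklist()
for _ in range(12):  # twelve reports of a redirector URL, all at t=0
    bl.report("http://drs.yahoo.com/covey/parr/*http://spammer.address/",
              now=0)

print(bl.is_listed("spammer.address", now=86400))       # listed the next day
print(bl.is_listed("spammer.address", now=30 * 86400))  # expired a month on
```

Note that the reports are filed against spammer.address, the unwrapped destination, rather than drs.yahoo.com, which is exactly the behavior the first summary point asks for; the expiration pass in `is_listed` is what makes the list shorten itself, per the second point.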


