Rocksolid Light




Subject  (Author)
* How I deal with the enormous amount of spam  (Tom Roberts)
+- Re: How I deal with the enormous amount of spam  (Dlzc)
+* Re: How I deal with the enormous amount of spam  (palsing)
|+- Re: How I deal with the enormous amount of spam  (Dlzc)
|`* Re: How I deal with the enormous amount of spam  (Ross Finlayson)
| `* Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|  +* Re: How I deal with the enormous amount of spam  (Dlzc)
|  |`- Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|  `* Re: How I deal with the enormous amount of spam  (The Starmaker)
|   `* Re: How I deal with the enormous amount of spam  (The Starmaker)
|    `* Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|     `* Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|      `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|       `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|        `* Re: Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|         +* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         |`* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         | `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         |  `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         |   `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         |    `* Re: Meta: Re: How I deal with the enormous amount of spam  (Physfitfreak)
|         |     `* Re: Meta: Re: How I deal with the enormous amount of spam  (The Starmaker)
|         |      +* Re: Meta: Re: How I deal with the enormous amount of spam  (Dlzc)
|         |      |`- Re: Meta: Re: How I deal with the enormous amount of spam  (Hooker Tzaran Balanowsky)
|         |      `* Re: Meta: Re: How I deal with the enormous amount of spam  (Volney)
|         |       `- Re: Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|         `* Re: Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|          `* Re: Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|           `* Re: Meta: Re: How I deal with the enormous amount of spam  (Dlzc)
|            `* Re: Meta: Re: How I deal with the enormous amount of spam  (Ross Finlayson)
|             `- Re: Meta: Re: How I deal with the enormous amount of spam  (Dlzc)
+* Re: How I deal with the enormous amount of spam  (Nico Kozák Pásztori)
|`* Re: How I deal with the enormous amount of spam  (Physfitfreak)
| `- Re: How I deal with the enormous amount of spam  (Timmie Császár Barabás)
+* Re: How I deal with the enormous amount of spam  (Richard Hertz)
|`* Re: How I deal with the enormous amount of spam  (ProkaryoticCaspaseHomolog)
| `* Re: How I deal with the enormous amount of spam  (J. J. Lodder)
|  +- Re: How I deal with the enormous amount of spam  (ProkaryoticCaspaseHomolog)
|  `- Re: How I deal with the enormous amount of spam  (gharnagel)
+- Re: How I deal with the enormous amount of spam  (J. J. Lodder)
+* Re: How I deal with the enormous amount of spam  (ProkaryoticCaspaseHomolog)
|`* Re: How I deal with the enormous amount of spam  (J. J. Lodder)
| `* Re: How I deal with the enormous amount of spam  (Volney)
|  +* Re: How I deal with the enormous amount of spam  (Roscoe Hatukaev)
|  |`- Re: How I deal with the enormous amount of spam  (Physfitfreak)
|  +- Re: How I deal with the enormous amount of spam  (Ramses Fenstermacher)
|  +- Re: How I deal with the enormous amount of spam  (J. J. Lodder)
|  `* Re: How I deal with the enormous amount of spam  (Dlzc)
|   +- Re: How I deal with the enormous amount of spam  (RichD)
|   +* Re: How I deal with the enormous amount of spam  (RichD)
|   |`* Re: How I deal with the enormous amount of spam  (Tom Roberts)
|   | `- Re: How I deal with the enormous amount of spam  (Dlzc)
|   `* Re: How I deal with the enormous amount of spam  (ProkaryoticCaspaseHomolog)
|    `- Re: How I deal with the enormous amount of spam  (Dlzc)
+- Re: How I deal with the enormous amount of spam  (Athel Cornish-Bowden)
+- Re: How I deal with the enormous amount of spam  (wugi)
+- Re: How I deal with the enormous amount of spam  (xip14)
`- Re: How I deal with the enormous amount of spam  (xip14)

Re: How I deal with the enormous amount of spam

<5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>


https://news.novabbs.org/tech/article-flat.php?id=130515&group=sci.physics.relativity#130515

Newsgroups: sci.physics.relativity
X-Received: by 2002:a05:622a:44e:b0:42c:779:165f with SMTP id o14-20020a05622a044e00b0042c0779165fmr201121qtx.4.1707001186775;
Sat, 03 Feb 2024 14:59:46 -0800 (PST)
X-Received: by 2002:a05:622a:1448:b0:42c:1695:697c with SMTP id
v8-20020a05622a144800b0042c1695697cmr163011qtx.7.1707001186576; Sat, 03 Feb
2024 14:59:46 -0800 (PST)
Path: i2pn2.org!i2pn.org!newsfeed.endofthelinebbs.com!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!peer03.iad!feed-me.highwinds-media.com!news.highwinds-media.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Sat, 3 Feb 2024 14:59:46 -0800 (PST)
In-Reply-To: <uplu8i$37h8j$1@dont-email.me>
Injection-Info: google-groups.googlegroups.com; posting-host=2600:1700:15df:c8df:f836:52c9:5689:cb00;
posting-account=MVjzhQoAAAC9p_5zLm3q76BQ_cMWZzZC
NNTP-Posting-Host: 2600:1700:15df:c8df:f836:52c9:5689:cb00
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: turkeyheadedmutha@gmail.com (Dlzc)
Injection-Date: Sat, 03 Feb 2024 22:59:46 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
X-Received-Bytes: 1703
 by: Dlzc - Sat, 3 Feb 2024 22:59 UTC

On Saturday, February 3, 2024 at 11:48:05 AM UTC-6, Volney wrote:
> Definitely, but what is the motivation or goal of the spammers? It
> doesn't make sense.

Cheap, brainless search engine optimization. The more places a site is advertised, the more likely it will be to get indexed for search, and the higher its ranking.

David A. Smith

Re: How I deal with the enormous amount of spam

<4e9b387d-9853-4bf1-9732-f83bd446cb93n@googlegroups.com>


https://news.novabbs.org/tech/article-flat.php?id=130516&group=sci.physics.relativity#130516

Newsgroups: sci.physics.relativity
X-Received: by 2002:ad4:414e:0:b0:68c:8ba5:4bf with SMTP id z14-20020ad4414e000000b0068c8ba504bfmr32890qvp.10.1707009878094;
Sat, 03 Feb 2024 17:24:38 -0800 (PST)
X-Received: by 2002:a05:620a:3183:b0:785:3d51:5dce with SMTP id
bi3-20020a05620a318300b007853d515dcemr157001qkb.0.1707009877774; Sat, 03 Feb
2024 17:24:37 -0800 (PST)
Path: i2pn2.org!i2pn.org!eternal-september.org!feeder3.eternal-september.org!2.eu.feeder.erje.net!feeder.erje.net!proxad.net!feeder1-2.proxad.net!209.85.160.216.MISMATCH!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Sat, 3 Feb 2024 17:24:37 -0800 (PST)
In-Reply-To: <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
Injection-Info: google-groups.googlegroups.com; posting-host=205.154.192.197; posting-account=x2WXVAkAAACheXC-5ndnEdz_vL9CA75q
NNTP-Posting-Host: 205.154.192.197
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me> <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <4e9b387d-9853-4bf1-9732-f83bd446cb93n@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: r_delaney2001@yahoo.com (RichD)
Injection-Date: Sun, 04 Feb 2024 01:24:38 +0000
Content-Type: text/plain; charset="UTF-8"
 by: RichD - Sun, 4 Feb 2024 01:24 UTC

On February 3, Dlzc wrote:
>> Definitely, but what is the motivation or goal of the spammers?
>
> Cheap, brainless search engine optimization. The more places a site
> is advertised, the more likely it will be to get indexed for search, and the higher its ranking.

Yes. But can't Google thwart this tactic, if they're notified,
by removing all the indexes? Punishment!

Of course, this means someone with access to the search
code, the crown jewels - that won't happen without good reason -

--
Rich

Re: How I deal with the enormous amount of spam

<20bc510a-2b81-454a-9543-0eadec338dcan@googlegroups.com>


https://news.novabbs.org/tech/article-flat.php?id=130518&group=sci.physics.relativity#130518

Newsgroups: sci.physics.relativity
X-Received: by 2002:a05:6214:20e7:b0:68c:949a:9fd with SMTP id 7-20020a05621420e700b0068c949a09fdmr35335qvk.0.1707010851427;
Sat, 03 Feb 2024 17:40:51 -0800 (PST)
X-Received: by 2002:ad4:4a71:0:b0:68c:7dd3:7588 with SMTP id
cn17-20020ad44a71000000b0068c7dd37588mr34264qvb.12.1707010851098; Sat, 03 Feb
2024 17:40:51 -0800 (PST)
Path: i2pn2.org!i2pn.org!paganini.bofh.team!2.eu.feeder.erje.net!feeder.erje.net!proxad.net!feeder1-2.proxad.net!209.85.160.216.MISMATCH!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Sat, 3 Feb 2024 17:40:50 -0800 (PST)
In-Reply-To: <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
Injection-Info: google-groups.googlegroups.com; posting-host=205.154.192.197; posting-account=x2WXVAkAAACheXC-5ndnEdz_vL9CA75q
NNTP-Posting-Host: 205.154.192.197
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me> <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <20bc510a-2b81-454a-9543-0eadec338dcan@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: r_delaney2001@yahoo.com (RichD)
Injection-Date: Sun, 04 Feb 2024 01:40:51 +0000
Content-Type: text/plain; charset="UTF-8"
 by: RichD - Sun, 4 Feb 2024 01:40 UTC

On February 3, Dlzc wrote:
> Cheap, brainless search engine optimization. The more places a site is advertised,
> the more likely it will be to get indexed for search, and the higher its ranking.

If I were a hacker, a real TCP guru, I'd write a script to harvest all the spam
URL links. Then another script to spawn processes which hit those Web
sites, manufacturing continuous download requests. A massive denial of
service attack.

Alas, I lack such programming skills - but for sure, there are folks with such
acumen. Maybe one of them is a Usenet customer...

--
Rich
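The harvesting half of what RichD imagines is straightforward. A minimal sketch in Python (the regex, the helper name `harvest_urls`, and the sample body are all assumptions for illustration, not anything posted in the thread) that pulls the advertised URLs out of one article body - suitable for feeding an abuse report, deliberately stopping short of the denial-of-service half:

```python
import re

# Hypothetical helper: extract the URLs a spam post advertises, so they
# can be forwarded in an abuse report. The pattern is a rough match for
# http/https links up to the next whitespace or closing delimiter.
URL_RE = re.compile(r"https?://[^\s>\")]+")

def harvest_urls(article_body: str) -> list[str]:
    """Return de-duplicated URLs found in one article body, in order."""
    seen: dict[str, None] = {}
    for url in URL_RE.findall(article_body):
        # Trailing sentence punctuation is usually not part of the link.
        seen.setdefault(url.rstrip(".,;!"), None)
    return list(seen)

body = """Buy now at https://example.com/promo!
Also see https://example.com/promo and http://example.org/x."""
print(harvest_urls(body))  # -> ['https://example.com/promo', 'http://example.org/x']
```

The flooding half is omitted on purpose: as Tom Roberts notes downthread, without a botnet the traffic would be negligible, and with one it would be illegal.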

Re: How I deal with the enormous amount of spam

<3d359fab29f662aab6dcc5cce26ff3bc@www.novabbs.com>


https://news.novabbs.org/tech/article-flat.php?id=130523&group=sci.physics.relativity#130523

Newsgroups: sci.physics.relativity
Path: i2pn2.org!.POSTED!not-for-mail
From: tomyee3@gmail.com (ProkaryoticCaspaseHomolog)
Newsgroups: sci.physics.relativity
Subject: Re: How I deal with the enormous amount of spam
Date: Sun, 4 Feb 2024 14:06:00 +0000
Organization: novaBBS
Message-ID: <3d359fab29f662aab6dcc5cce26ff3bc@www.novabbs.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl> <uplu8i$37h8j$1@dont-email.me> <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: i2pn2.org;
logging-data="1660272"; mail-complaints-to="usenet@i2pn2.org";
posting-account="t+lO0yBNO1zGxasPvGSZV1BRu71QKx+JE37DnW+83jQ";
User-Agent: Rocksolid Light
X-Rslight-Site: $2y$10$DbpmvTNINJoaT..km839YOrHdkWkqKE/RepX1yvked.Ue6DpX3Wc6
X-Spam-Checker-Version: SpamAssassin 4.0.0
X-Rslight-Posting-User: c1a997029c70f718720f72156b7d7f56416caf7c
 by: ProkaryoticCaspaseHomolog - Sun, 4 Feb 2024 14:06 UTC

Dlzc wrote:

> On Saturday, February 3, 2024 at 11:48:05 AM UTC-6, Volney wrote:
>> Definitely, but what is the motivation or goal of the spammers? It
>> doesn't make sense.

> Cheap, brainless search engine optimization. The more places a site is advertised, the more likely it will be to get indexed for search, and the higher its ranking.

You don't need a flood of posts to do that.
This is simple vandalism for fun.

Re: How I deal with the enormous amount of spam

<b24d44ef-7e94-4525-9029-957f4ebdecedn@googlegroups.com>


https://news.novabbs.org/tech/article-flat.php?id=130526&group=sci.physics.relativity#130526

Newsgroups: sci.physics.relativity
X-Received: by 2002:ac8:5e52:0:b0:42a:b456:8672 with SMTP id i18-20020ac85e52000000b0042ab4568672mr288371qtx.5.1707062083863;
Sun, 04 Feb 2024 07:54:43 -0800 (PST)
X-Received: by 2002:a05:622a:188a:b0:42b:fac1:840a with SMTP id
v10-20020a05622a188a00b0042bfac1840amr294901qtc.13.1707062083544; Sun, 04 Feb
2024 07:54:43 -0800 (PST)
Path: i2pn2.org!i2pn.org!weretis.net!feeder8.news.weretis.net!proxad.net!feeder1-2.proxad.net!209.85.160.216.MISMATCH!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Sun, 4 Feb 2024 07:54:43 -0800 (PST)
In-Reply-To: <3d359fab29f662aab6dcc5cce26ff3bc@www.novabbs.com>
Injection-Info: google-groups.googlegroups.com; posting-host=2600:1700:15df:c8df:6c55:2602:3001:756d;
posting-account=MVjzhQoAAAC9p_5zLm3q76BQ_cMWZzZC
NNTP-Posting-Host: 2600:1700:15df:c8df:6c55:2602:3001:756d
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me> <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
<3d359fab29f662aab6dcc5cce26ff3bc@www.novabbs.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <b24d44ef-7e94-4525-9029-957f4ebdecedn@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: turkeyheadedmutha@gmail.com (Dlzc)
Injection-Date: Sun, 04 Feb 2024 15:54:43 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
 by: Dlzc - Sun, 4 Feb 2024 15:54 UTC

On Sunday, February 4, 2024 at 8:06:08 AM UTC-6, ProkaryoticCaspaseHomolog wrote:
> Dlzc wrote:
>
> > On Saturday, February 3, 2024 at 11:48:05 AM UTC-6, Volney wrote:
> >> Definitely, but what is the motivation or goal of the spammers? It
> >> doesn't make sense.
>
> > Cheap, brainless search engine optimization. The more places a site
> > is advertised, the more likely it will be to get indexed for search, and
> > the higher its ranking.

> You don't need a flood of posts to do that.
> This is simple vandalism for fun.

Disagree, and I've said why the flood is necessary. Someone is paying for it; someone is getting paid to do it. The flood itself is the point. This is not the only newsgroup targeted: I monitor nine different groups, and have had to report spam in seven of them (one of them gets 3x the daily spam of this newsgroup)... the others get little traffic, so are unlikely to be crawled for cross-indexing.

David A. Smith

PS. I wish they would have made the cutoff Feb 2nd, 2024. Or even Dec 31st, 2023.

Re: How I deal with the enormous amount of spam

<bLGdnWOumZjlJyL4nZ2dnZfqlJxj4p2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130527&group=sci.physics.relativity#130527

Newsgroups: sci.physics.relativity
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!border-2.nntp.ord.giganews.com!nntp.giganews.com!Xl.tags.giganews.com!local-1.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Sun, 04 Feb 2024 16:25:27 +0000
Date: Sun, 4 Feb 2024 10:25:27 -0600
MIME-Version: 1.0
User-Agent: Mozilla Thunderbird
From: tjoberts137@sbcglobal.net (Tom Roberts)
Subject: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com>
<1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me>
<5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
<20bc510a-2b81-454a-9543-0eadec338dcan@googlegroups.com>
Content-Language: en-US
In-Reply-To: <20bc510a-2b81-454a-9543-0eadec338dcan@googlegroups.com>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Message-ID: <bLGdnWOumZjlJyL4nZ2dnZfqlJxj4p2d@giganews.com>
Lines: 14
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-fRWbcMjuq9p1OEYwggZTbaRwnjzdcBfQiokxN8wQbsA70hMIuAs3m0SJciwOyET0YfSWHq6RVNVhXzW!JAPJLXLtBo8A5MuvUlkriXjd/0zKDg53W8RYqloelgtyUIe2xo6n/9sPGueDpkcMafRtn+bImQ==
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Tom Roberts - Sun, 4 Feb 2024 16:25 UTC

On 2/3/24 7:40 PM, RichD wrote:
> If i were a hacker, a real TCP guru, I'd write a script to harvest
> all the spam URL links. Then another script to spawn processes which
> hit those Web sites, manufacturing continuous download requests. A
> massive denial of service attack.

I could easily write such a script. But without control of a
1,000-member botnet (or preferably larger), the traffic load it could
generate would be useless.

Of course, that would be illegal, and against the terms of service of my
ISP.

Tom Roberts

Re: How I deal with the enormous amount of spam

<1f41af85-31d4-476a-8836-d07dde97ea7an@googlegroups.com>


https://news.novabbs.org/tech/article-flat.php?id=130528&group=sci.physics.relativity#130528

Newsgroups: sci.physics.relativity
X-Received: by 2002:a05:622a:48b:b0:429:d531:5f4e with SMTP id p11-20020a05622a048b00b00429d5315f4emr450471qtx.13.1707064844973;
Sun, 04 Feb 2024 08:40:44 -0800 (PST)
X-Received: by 2002:a05:622a:1887:b0:42a:b590:3a34 with SMTP id
v7-20020a05622a188700b0042ab5903a34mr422458qtc.0.1707064844759; Sun, 04 Feb
2024 08:40:44 -0800 (PST)
Path: i2pn2.org!i2pn.org!weretis.net!feeder8.news.weretis.net!proxad.net!feeder1-2.proxad.net!209.85.160.216.MISMATCH!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Sun, 4 Feb 2024 08:40:44 -0800 (PST)
In-Reply-To: <bLGdnWOumZjlJyL4nZ2dnZfqlJxj4p2d@giganews.com>
Injection-Info: google-groups.googlegroups.com; posting-host=2600:1700:15df:c8df:6c55:2602:3001:756d;
posting-account=MVjzhQoAAAC9p_5zLm3q76BQ_cMWZzZC
NNTP-Posting-Host: 2600:1700:15df:c8df:6c55:2602:3001:756d
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<6352a9543d3544f2415638cca7f822ad@www.novabbs.com> <1qocqv8.5g97781by3vw6N%nospam@de-ster.demon.nl>
<uplu8i$37h8j$1@dont-email.me> <5c870716-4789-4d7c-a322-13df51e6bac1n@googlegroups.com>
<20bc510a-2b81-454a-9543-0eadec338dcan@googlegroups.com> <bLGdnWOumZjlJyL4nZ2dnZfqlJxj4p2d@giganews.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <1f41af85-31d4-476a-8836-d07dde97ea7an@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: turkeyheadedmutha@gmail.com (Dlzc)
Injection-Date: Sun, 04 Feb 2024 16:40:44 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
 by: Dlzc - Sun, 4 Feb 2024 16:40 UTC

On Sunday, February 4, 2024 at 10:25:40 AM UTC-6, Tom Roberts wrote:
> On 2/3/24 7:40 PM, RichD wrote:
> > If i were a hacker, a real TCP guru, I'd write a script to harvest
> > all the spam URL links. Then another script to spawn processes which
> > hit those Web sites, manufacturing continuous download requests. A
> > massive denial of service attack.

> I could easily write such a script. But without control of a
> 1,000-member botnet (or preferably larger), the traffic load it could
> generate would be useless.

They could use fewer computers, with a random number generator as a timer to trigger each post event. The script could set up new accounts as required; that would provide some randomness. But it is cheaper to pay third-world personnel to spend a few hours each day, and that provides easier ways of getting past the various forms of "bot detection". They are on the staff for other tasks as well, no doubt.

I just wish Google actually did what they claim they do, and spam reports DID something real. I no longer get to see headers on Google Groups: is some of this spam being injected via Google Groups, or is it all via Usenet servers?

A half dozen different commercial sites are paying "the same outfit" (or at least cheap clones using identical methods) to promote their websites and draw in traffic.

David A. Smith

Re: How I deal with the enormous amount of spam

<18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130529&group=sci.physics.relativity#130529

Newsgroups: sci.physics.relativity
Path: i2pn2.org!i2pn.org!news.hispagatos.org!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!panix!weretis.net!feeder6.news.weretis.net!border-2.nntp.ord.giganews.com!nntp.giganews.com!Xl.tags.giganews.com!local-1.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Sun, 04 Feb 2024 17:55:28 +0000
Subject: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<afe109ee689f7051d5909fe8eca54113@www.novabbs.com>
<b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com>
<Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com>
<65BEC238.5CF1@ix.netcom.com>
From: ross.a.finlayson@gmail.com (Ross Finlayson)
Date: Sun, 4 Feb 2024 09:55:29 -0800
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101
Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <65BEC238.5CF1@ix.netcom.com>
Content-Type: text/plain; charset=windows-1252; format=flowed
Content-Transfer-Encoding: 8bit
Message-ID: <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>
Lines: 165
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-Emj3iHQZEGjdfQpqIaJPBVaL/NhkkUUqhqJlqtmeg1HDz3pPHrakMLHdBeXz8N63MZ4IP6qtH66w414!fe0JNBDSH2r70tDZkhKDJo8AbBPA8o0v33Lomh4GooIKEt8RoEF3Rea2gRMjCnpgxczhG/7wffLo
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Ross Finlayson - Sun, 4 Feb 2024 17:55 UTC

On 02/03/2024 02:46 PM, The Starmaker wrote:
> The Starmaker wrote:
>>
>> Ross Finlayson wrote:
>>>
>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
>>>>> Tom Roberts wrote:
>>>>>
>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity has
>>>>>> been getting hundreds of spam posts each day, completely overwhelming
>>>>>> legitimate content. These spam posts share the property that they are
>>>>>> written in a non-latin script.
>>>>>
>>>>>> Thunderbird implements message filters that can mark a message Read. So
>>>>>> I created a filter to run on sci.physics.relativity that marks messages
>>>>>> Read. Then when reading the newsgroups, I simply display only unread
>>>>>> messages. The key to making this work is to craft the filter so it marks
>>>>>> messages in which the Subject matches any of a dozen characters picked
>>>>>> from some spam messages.
>>>>>
>>>>>> This doesn't completely eliminate the spam, but it is now only a few
>>>>>> messages per day.
>>>>>
>>>>>> Tom Roberts
>>>>> I would like to do the same thing, so I installed Thunderbird... but setting it up to read newsgroups is beyond my paltry computer skills and is not at all intuitive. If anyone can point to an idiot-proof tutorial for doing this It would be much appreciated.
>>>>>
>>>>> \Paul Alsing
>>>>
>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
>>>>
>>>> I as well sort of mow the lawn a bit or mark the spam.
>>>>
>>>> It seems alright if it'll be a sort of clean break: on Feb 22 according to Google,
>>>> Google will break its compeerage to Usenet, and furthermore make read-only
>>>> the archives, what it has, what until then, will be as it was.
>>>>
>>>> Over on sci.math I've had the idea for a while of making some brief and
>>>> special purpose Usenet compeers, for only some few groups, or, you
>>>> know, the _belles lettres_ of the text hierarchy.
>>>>
>>>> "Meta: a usenet server just for sci.math"
>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>
>>>> So, there you can read the outlook of this kind of thing, then while sort
>>>> of simple as the protocol is simple and its implementations widespread,
>>>> how to deal with the "signal and noise" of "exposed messaging destinations
>>>> on the Internet", well on that thread I'm theorizing a sort of, "NOOBNB protocol",
>>>> figuring to make an otherwise just standard Usenet compeer, and also for
>>>> email or messaging destinations, sort of designed with the expectation that
>>>> there will be spam, and spam and ham are hand in hand, to exclude it in simple terms.
>>>>
>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
>>>>
>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the transport layer.)
>>>>
>>>> Also it would be real great if at least there was surfaced to the Internet a
>>>> read-only view of any message by its message ID, a "URL", or as for a "URI",
>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
>>>>
>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>
>>>> I wonder that there's a reliable sort of long-term project that surfaces
>>>> "news" protocol message-IDs, .... It's a stable, standards-based protocol.
>>>>
>>>>
>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
>>>>
>>>>
>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
>>>>
>>>
>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
>>> outrageous, can we do anything about it? Would you write a script to
>>> funnel all their message-ID's into the abuse reporting? And I was like,
>>> you know, about 2008 I did just that, there was a big spam flood,
>>> and I wrote a little script to find them and extract their posting-account,
>>> and the message-ID, and a little script to post to the posting-host,
>>> each one of the wicked spams.
>>>
>>> At the time that seemed to help, they sort of dried up, here there's
>>> that basically they're not following the charter, but, it's the
>>> posting-account
>>> in the message headers that indicate the origin of the post, not the
>>> email address. So, I wonder, given that I can extract the posting-accounts
>>> of all the spams, how to match the posting-account to then determine
>>> whether it's a sockpuppet-farm or what, and basically about sending them up.
>>
>> Let me see your little script. Post it here.
>
> Here is a list I currently have:
>
> salz.txt
> usenet.death.penalty.gz
> purify.txt
> NewsAgent110-MS.exe
> HipCrime's NewsAgent (v1_11).htm
> NewsAgent111-BE.zip
> SuperCede.exe
> NewsAgent023.exe
> NewsAgent025.exe
> ActiveAgent.java
> HipCrime's NewsAgent (v1_02)_files
> NewsCancel.java (source code)
>
> (plus updated python versions)
>
>
>
> (Maybe your script is inthere somewhere?)
>
>
>
> Show me what you got. walk the walk.
>

I try to avoid sketchy things like hiring a criminal botnet,
there's the impression that that's looking at 1000's of counts
of computer intrusion.

With those being something about $50K and 10-25 apiece,
there's a pretty significant deterrence to such activities.

I've never much cared for "OAuth", giving away the
keys-to-the-kingdom and all, here it looks like either
a) a bunch of duped browsers clicked away their identities,
or b) it's really that Google and Facebook are more than
half full of fake identities for the sole purpose of being fake.

(How's your new deal going?
Great, we got a million users.
Why are my conversions around zero?
Your ad must not speak to them.
Would it help if I spiced it up?
Don't backtalk me, I'll put you on a list!)

So, it seems mostly a sort of "spam-walling the Internet",
where it was like "we're going to reinvent the Internet",
"no, you aren't", "all right then we'll ruin this one".

As far as search goes, there's something to be said
for a new sort of approach to search, given that
Google, Bing, Duck, ..., _all make the same results_. It's
just so highly unlikely that they'd _all make the same
results_, you figure they're just one.

So, the idea, for somebody like me who's mostly interested
in writing on the Internet, is that lots of that is of the sort
of "works" vis-a-vis, the "feuilleton" or what you might
call it, ephemeral junk, that I just learned about in
Herman Hesse's "The Glass Bead Game".

Then, there's an idea, that basically to surface high-quality
works to a search, is that there's what's called metadata,
for content like HTML, with regards to Dublin Core and
RDF and so on, about a sort of making for fungible collections
of works, what results searchable fragments of various
larger bodies of works, according to their robots.txt and
their summaries and with regards to crawling the content
and so on, then to make federated common search corpi,
these kinds of things.
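The "little script" Ross describes upthread, one that finds the spams and extracts their posting-account and Message-ID for abuse reporting, can be sketched in a few lines of Python. This is a guess at what such a script might look like, not his actual code: the header layout is taken from the Google Groups articles quoted on this page (posting-account appears as a parameter of Injection-Info), and `extract_abuse_fields` is an invented name.

```python
def extract_abuse_fields(raw_headers: str) -> dict[str, str]:
    """From one article's raw headers, pull the two fields an abuse
    report needs: the Message-ID and the Google posting-account token."""
    fields: dict[str, str] = {}
    # Unfold continuation lines (RFC 5322 folding: a continuation line
    # begins with whitespace and belongs to the previous header).
    unfolded: list[str] = []
    for line in raw_headers.splitlines():
        if line[:1] in (" ", "\t") and unfolded:
            unfolded[-1] += " " + line.strip()
        else:
            unfolded.append(line)
    for line in unfolded:
        if line.lower().startswith("message-id:"):
            fields["message-id"] = line.split(":", 1)[1].strip()
        elif "posting-account=" in line:
            # e.g. Injection-Info: ...; posting-account=TOKEN; ...
            token = line.split("posting-account=", 1)[1].split(";")[0]
            fields["posting-account"] = token.strip()
    return fields
```

Matching posting-accounts across spam articles, as Ross suggests, would then be a matter of grouping the extracted tokens and seeing whether one account (or a small sockpuppet farm) is behind the flood.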

Meta: Re: How I deal with the enormous amount of spam

<0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130530&group=sci.physics.relativity#130530

Newsgroups: sci.physics.relativity
Path: i2pn2.org!i2pn.org!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!feeder.usenetexpress.com!tr2.iad1.usenetexpress.com!69.80.99.23.MISMATCH!Xl.tags.giganews.com!local-2.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Sun, 04 Feb 2024 18:17:25 +0000
Subject: Meta: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>
From: ross.a.finlayson@gmail.com (Ross Finlayson)
Date: Sun, 4 Feb 2024 10:17:39 -0800
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>
Content-Type: text/plain; charset=windows-1252; format=flowed
Content-Transfer-Encoding: 8bit
Message-ID: <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com>
Lines: 213
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-d105/S8JLGm8h+GTcWh5pTqpV2d6WiSFBm9KnPQwYYdx4oRla/zBFpXAraVFiNsfx03dcYMSln9UK7S!7D8NrBpoKGUbVbP2Yr9+Tj990LAL+pYl1NXiA6F4qet0rDwNatFEEns1cUcmq88s/iQ7R2N8vWK7
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Ross Finlayson - Sun, 4 Feb 2024 18:17 UTC

On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> On 02/03/2024 02:46 PM, The Starmaker wrote:
>> The Starmaker wrote:
>>>
>>> Ross Finlayson wrote:
>>>>
>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
>>>>>> Tom Roberts wrote:
>>>>>>
>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
>>>>>>> has
>>>>>>> been getting hundreds of spam posts each day, completely
>>>>>>> overwhelming
>>>>>>> legitimate content. These spam posts share the property that they
>>>>>>> are
>>>>>> written in a non-Latin script.
>>>>>>
>>>>>>> Thunderbird implements message filters that can mark a message
>>>>>>> Read. So
>>>>>>> I created a filter to run on sci.physics.relativity that marks
>>>>>>> messages
>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
>>>>>>> messages. The key to making this work is to craft the filter so
>>>>>>> it marks
>>>>>>> messages in which the Subject matches any of a dozen characters
>>>>>>> picked
>>>>>>> from some spam messages.
>>>>>>
>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
>>>>>>> messages per day.
>>>>>>
>>>>>>> Tom Roberts
>>>>>> I would like to do the same thing, so I installed Thunderbird...
>>>>>> but setting it up to read newsgroups is beyond my paltry computer
>>>>>> skills and is not at all intuitive. If anyone can point to an
>>>>>> idiot-proof tutorial for doing this it would be much appreciated.
>>>>>>
>>>>>> \Paul Alsing
>>>>>
>>>>> Yeah, it's pretty bad, or, worse than anybody's ever seen it.
>>>>>
>>>>> I as well sort of mow the lawn a bit or mark the spam.
>>>>>
>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
>>>>> according to Google,
>>>>> Google will break its compeerage to Usenet, and furthermore make
>>>>> read-only
>>>>> the archives, what it has, what until then, will be as it was.
>>>>>
>>>>> Over on sci.math I've had the idea for a while of making some brief
>>>>> and
>>>>> special purpose Usenet compeers, for only some few groups, or, you
>>>>> know, the _belles lettres_ of the text hierarchy.
>>>>>
>>>>> "Meta: a usenet server just for sci.math"
>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>
>>>>> So, there you can read the outlook of this kind of thing, then
>>>>> while sort
>>>>> of simple as the protocol is simple and its implementations
>>>>> widespread,
>>>>> how to deal with the "signal and noise" of "exposed messaging
>>>>> destinations
>>>>> on the Internet", well on that thread I'm theorizing a sort of,
>>>>> "NOOBNB protocol",
>>>>> figuring to make an otherwise just standard Usenet compeer, and
>>>>> also for
>>>>> email or messaging destinations, sort of designed with the
>>>>> expectation that
>>>>> there will be spam, and spam and ham are hand in hand, to exclude
>>>>> it in simple terms.
>>>>>
>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
>>>>>
>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
>>>>> transport layer.)
>>>>>
>>>>> Also it would be real great if at least there was surfaced to the
>>>>> Internet a
>>>>> read-only view of any message by its message ID, a "URL", or as for
>>>>> a "URI",
>>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
>>>>>
>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>
>>>>> I wonder that there's a reliable sort of long-term project that
>>>>> surfaces
>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
>>>>> protocol.
>>>>>
>>>>>
>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
>>>>>
>>>>>
>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
>>>>>
>>>>
>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
>>>> outrageous, can we do anything about it? Would you write a script to
>>>> funnel all their message-ID's into the abuse reporting? And I was
>>>> like,
>>>> you know, about 2008 I did just that, there was a big spam flood,
>>>> and I wrote a little script to find them and extract their
>>>> posting-account,
>>>> and the message-ID, and a little script to post to the posting-host,
>>>> each one of the wicked spams.
>>>>
>>>> At the time that seemed to help, they sort of dried up, here there's
>>>> that basically they're not following the charter, but, it's the
>>>> posting-account
>>>> in the message headers that indicate the origin of the post, not the
>>>> email address. So, I wonder, given that I can extract the
>>>> posting-accounts
>>>> of all the spams, how to match the posting-account to then determine
>>>> whether it's a sockpuppet-farm or what, and basically about sending
>>>> them up.
>>>
>>> Let me see your little script. Post it here.
>>
>> Here is a list I currently have:
>>
>> salz.txt
>> usenet.death.penalty.gz
>> purify.txt
>> NewsAgent110-MS.exe
>> HipCrime's NewsAgent (v1_11).htm
>> NewsAgent111-BE.zip
>> SuperCede.exe
>> NewsAgent023.exe
>> NewsAgent025.exe
>> ActiveAgent.java
>> HipCrime's NewsAgent (v1_02)_files
>> NewsCancel.java (source code)
>>
>> (plus updated python versions)
>>
>>
>>
>> (Maybe your script is in there somewhere?)
>>
>>
>>
>> Show me what you got. walk the walk.
>>
>
>
> I try to avoid sketchy things like hiring a criminal botnet,
> there's the impression that that's looking at 1000's of counts
> of computer intrusion.
>
> With those being something about $50K and 10-25 apiece,
> there's a pretty significant deterrence to such activities.
>
> I've never much cared for "OAuth", giving away the
> keys-to-the-kingdom and all, here it looks like either
> a) a bunch of duped browsers clicked away their identities,
> or b) it's really that Google and Facebook are more than
> half full of fake identities for the sole purpose of being fake.
>
> (How's your new deal going?
> Great, we got a million users.
> Why are my conversions around zero?
> Your ad must not speak to them.
> Would it help if I spiced it up?
> Don't backtalk me, I'll put you on a list!)
>
> So, it seems mostly a sort of "spam-walling the Internet",
> where it was like "we're going to reinvent the Internet",
> "no, you aren't", "all right then we'll ruin this one".
>
> As far as search goes, there's something to be said
> for a new sort of approach to search, given that
> Google, Bing, Duck, ..., _all make the same results_. It's
> just so highly unlikely that they'd _all make the same
> results_, you figure they're just one.
>
> So, the idea, for somebody like me who's mostly interested
> in writing on the Internet, is that lots of that is of the sort
> of "works" vis-a-vis, the "feuilleton" or what you might
> call it, ephemeral junk, that I just learned about in
> Herman Hesse's "The Glass Bead Game".
>
> Then, there's an idea, that basically to surface high-quality
> works to a search, is that there's what's called metadata,
> for content like HTML, with regards to Dublin Core and
> RDF and so on, about a sort of making for fungible collections
> of works, what results searchable fragments of various
> larger bodies of works, according to their robots.txt and
> their summaries and with regards to crawling the content
> and so on, then to make federated common search corpi,
> these kinds of things.
>
>
>


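Tom Roberts's filter, quoted above, marks messages Read when the Subject matches characters picked from non-Latin-script spam. A minimal sketch of that heuristic in Python (the function name and the 0.5 threshold are illustrative assumptions, not part of his filter):

```python
import unicodedata

def is_nonlatin_subject(subject, threshold=0.5):
    """Flag a Subject whose alphabetic characters are mostly
    non-Latin script, mirroring the mark-as-Read filter idea."""
    letters = [c for c in subject if c.isalpha()]
    if not letters:
        return False  # digits/punctuation only: leave it alone
    nonlatin = sum(1 for c in letters
                   if not unicodedata.name(c, "").startswith("LATIN"))
    return nonlatin / len(letters) >= threshold
```

A newsreader script would apply this to each article's Subject header and mark matches Read, giving the same effect as the Thunderbird filter without listing individual characters.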
Re: Meta: Re: How I deal with the enormous amount of spam

<65BFF51D.3F9B@ix.netcom.com>


https://news.novabbs.org/tech/article-flat.php?id=130532&group=sci.physics.relativity#130532

Path: i2pn2.org!i2pn.org!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Sun, 04 Feb 2024 12:35:41 -0800
Organization: To protect and to server
Message-ID: <65BFF51D.3F9B@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="1721526"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:0/4GE7yj5hU3H7oq3K0lAxBpr7/pjj0xzbSTQBDCeI0=
X-Antivirus: Avast (VPS 240204-6, 02/04/2024), Outbound message
X-Mailer: Mozilla 3.04Gold (WinNT; U)
X-Notice: Filtered by postfilter v. 0.9.3
X-Antivirus-Status: Clean
 by: The Starmaker - Sun, 4 Feb 2024 20:35 UTC

Ross Finlayson wrote:
>
> [quoted exchange trimmed; quoted in full in the message above]
>
> It's like "why are they building that new data center",
> and it's like "well it's like Artificial Intelligence, inside
> that data center is a million virts and each one has a
> browser emulator and a phone app sandbox and a
> little notecard that prompts its name, basically it's
> a million-headed hydra called a sims-bot-farm,
> that for pennies on the dollar is an instant audience."
>
> "Wow, great, do they get a cut?" "Don't be talking about my cut."
>
> Usenet traffic had been up recently, ....
>
> I think they used to call it "astro-turfing".
> "Artificial Intelligence?" "No, 'Fake eyeballs'."


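On the "reliable perma-link in the IETF news protocol" wish quoted above: RFC 5538 does define a "news" URI scheme keyed by Message-ID, so a stable link can be derived from the header alone. A tiny helper (the function name is illustrative):

```python
def news_url(message_id):
    """Turn a Message-ID header value into an RFC 5538 'news' URI,
    e.g. '<abc@example.net>' -> 'news:abc@example.net'."""
    return "news:" + message_id.strip().strip("<>")
```

Whether any given client or gateway resolves such URIs is another matter; the scheme itself is standardized and independent of any one archive's URLs.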
Re: Meta: Re: How I deal with the enormous amount of spam

<65BFF947.2233@ix.netcom.com>


https://news.novabbs.org/tech/article-flat.php?id=130533&group=sci.physics.relativity#130533

Path: i2pn2.org!i2pn.org!news.niel.me!news.nntp4.net!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Sun, 04 Feb 2024 12:53:27 -0800
Organization: To protect and to server
Message-ID: <65BFF947.2233@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="1723140"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:+wTbQb9hwLArgm+VqBf3XIk4cx0eJ302MgnIeVSJO3Q=
X-Antivirus: Avast (VPS 240204-6, 02/04/2024), Outbound message
X-Notice: Filtered by postfilter v. 0.9.3
X-Antivirus-Status: Clean
X-Mailer: Mozilla 3.04Gold (WinNT; U)
 by: The Starmaker - Sun, 4 Feb 2024 20:53 UTC

The Starmaker wrote:
>
> Ross Finlayson wrote:
> > [quoted text trimmed; quoted in full in the messages above]
>
> I have NewsAgent111-MS.exe
>
> I seem to be missing version 2.0
>
> Do you have the 2.0 version?
>
> I'll trade you.
>
> I'll give you my python version with (GUI)!!!! (Tkinter)
>
> let's trade!
>
> don't bogart


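The posting-account idea discussed earlier in the thread (it is the header's posting-account, not the From address, that identifies the origin) suggests a simple tally to see whether flagged spam traces back to one account or many. The sample header strings below are hypothetical stand-ins for real article headers:

```python
import re
from collections import Counter

# Matches the posting-account parameter carried in Injection-Info
# (or X-Google-style) header lines, like those visible in this thread.
ACCOUNT_RE = re.compile(r'posting-account="([^"]+)"')

def tally_accounts(header_blocks):
    """Count flagged articles per posting-account, as a first cut
    at spotting a sockpuppet farm behind a spam flood."""
    counts = Counter()
    for headers in header_blocks:
        m = ACCOUNT_RE.search(headers)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical examples, not real accounts:
samples = [
    'Injection-Info: example.net; posting-account="abc123";',
    'Injection-Info: example.net; posting-account="abc123";',
    'Injection-Info: example.net; posting-account="zzz999";',
]
```

A count heavily skewed toward a few accounts is the "sockpuppet-farm" signature worth forwarding, with full headers, to the injection site's abuse address.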
Re: Meta: Re: How I deal with the enormous amount of spam

<Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130534&group=sci.physics.relativity#130534

Path: i2pn2.org!i2pn.org!news.neodome.net!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!feeder.usenetexpress.com!tr2.iad1.usenetexpress.com!69.80.99.23.MISMATCH!Xl.tags.giganews.com!local-2.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Mon, 05 Feb 2024 02:28:00 +0000
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com>
From: ross.a.finlayson@gmail.com (Ross Finlayson)
Date: Sun, 4 Feb 2024 18:28:25 -0800
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <65BFF947.2233@ix.netcom.com>
Content-Type: text/plain; charset=windows-1252; format=flowed
Content-Transfer-Encoding: 8bit
Message-ID: <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
Lines: 289
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-BDz1PczPPqyem7XHs3Tb2D88uO0FaZ76+pTlnyydMXWCcCQMU83Gusp9tlHEOtMkrCG12wr+iejfWJf!UyplMMjRHZY998B0AJDPfy1RSieE6iZTKdFOtGl2Gz/hTVuxzEL7mYaKZ/cTlcjBSn7rWMFmPCEA
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Ross Finlayson - Mon, 5 Feb 2024 02:28 UTC

On 02/04/2024 12:53 PM, The Starmaker wrote:
> The Starmaker wrote:
>>
>> Ross Finlayson wrote:
>>>
>>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
>>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
>>>>> The Starmaker wrote:
>>>>>>
>>>>>> Ross Finlayson wrote:
>>>>>>>
>>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
>>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
>>>>>>>>> Tom Roberts wrote:
>>>>>>>>>
>>>>>>> [quoted exchange trimmed; quoted in full in the messages above]
>>>>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
>>>>>>> outrageous, can we do anything about it? Would you write a script to
>>>>>>> funnel all their message-ID's into the abuse reporting? And I was
>>>>>>> like,
>>>>>>> you know, about 2008 I did just that, there was a big spam flood,
>>>>>>> and I wrote a little script to find them and extract their
>>>>>>> posting-account,
>>>>>>> and the message-ID, and a little script to post to the posting-host,
>>>>>>> each one of the wicked spams.
>>>>>>>
>>>>>>> At the time that seemed to help, they sort of dried up, here there's
>>>>>>> that basically they're not following the charter, but, it's the
>>>>>>> posting-account
>>>>>>> in the message headers that indicate the origin of the post, not the
>>>>>>> email address. So, I wonder, given that I can extract the
>>>>>>> posting-accounts
>>>>>>> of all the spams, how to match the posting-account to then determine
>>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
>>>>>>> them up.
>>>>>>
>>>>>> Let me see your little script. Post it here.
>>>>>
>>>>> Here is a list I currently have:
>>>>>
>>>>> salz.txt
>>>>> usenet.death.penalty.gz
>>>>> purify.txt
>>>>> NewsAgent110-MS.exe
>>>>> HipCrime's NewsAgent (v1_11).htm
>>>>> NewsAgent111-BE.zip
>>>>> SuperCede.exe
>>>>> NewsAgent023.exe
>>>>> NewsAgent025.exe
>>>>> ActiveAgent.java
>>>>> HipCrime's NewsAgent (v1_02)_files
>>>>> NewsCancel.java (source code)
>>>>>
>>>>> (plus updated python versions)
>>>>>
>>>>>
>>>>>
>>>>> (Maybe your script is in there somewhere?)
>>>>>
>>>>>
>>>>>
>>>>> Show me what you got. Walk the walk.
>>>>>
>>>>
>>>>
>>>> I try to avoid sketchy things like hiring a criminal botnet,
>>>> there's the impression that that's looking at 1000's of counts
>>>> of computer intrusion.
>>>>
>>>> With those being something about $50K and 10-25 apiece,
>>>> there's a pretty significant deterrence to such activities.
>>>>
>>>> I've never much cared for "OAuth", giving away the
>>>> keys-to-the-kingdom and all, here it looks like either
>>>> a) a bunch of duped browsers clicked away their identities,
>>>> or b) it's really that Google and Facebook are more than
>>>> half full of fake identities for the sole purpose of being fake.
>>>>
>>>> (How's your new deal going?
>>>> Great, we got a million users.
>>>> Why are my conversions around zero?
>>>> Your ad must not speak to them.
>>>> Would it help if I spiced it up?
>>>> Don't backtalk me, I'll put you on a list!)
>>>>
>>>> So, it seems mostly a sort of "spam-walling the Internet",
>>>> where it was like "we're going to reinvent the Internet",
>>>> "no, you aren't", "all right then we'll ruin this one".
>>>>
>>>> As far as search goes, there's something to be said
>>>> for a new sort of approach to search, given that
>>>> Google, Bing, Duck, ..., _all make the same results_. It's
>>>> just so highly unlikely that they'd _all make the same
>>>> results_, you figure they're just one.
>>>>
>>>> So, the idea, for somebody like me who's mostly interested
>>>> in writing on the Internet, is that lots of that is of the sort
>>>> of "works" vis-a-vis, the "feuilleton" or what you might
>>>> call it, ephemeral junk, that I just learned about in
>>>> Herman Hesse's "The Glass Bead Game".
>>>>
>>>> Then, there's an idea, that basically to surface high-quality
>>>> works to a search, is that there's what's called metadata,
>>>> for content like HTML, with regards to Dublin Core and
>>>> RDF and so on, about a sort of making for fungible collections
>>>> of works, what results searchable fragments of various
>>>> larger bodies of works, according to their robots.txt and
>>>> their summaries and with regards to crawling the content
>>>> and so on, then to make federated common search corpi,
>>>> these kinds of things.
>>>>
>>>>
>>>>
>>>
>>> It's like "why are they building that new data center",
>>> and it's like "well it's like Artificial Intelligence, inside
>>> that data center is a million virts and each one has a
>>> browser emulator and a phone app sandbox and a
>>> little notecard that prompts its name, basically it's
>>> a million-headed hydra called a sims-bot-farm,
>>> that for pennies on the dollar is an instant audience."
>>>
>>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
>>>
>>> Usenet traffic had been up recently, ....
>>>
>>> I think they used to call it "astro-turfing".
>>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
>>
>> I have NewsAgent111-MS.exe
>>
>> I seem to be missing version 2.0
>>
>> Do you have the 2.0 version?
>>
>> I'll trade you.
>>
>> I'll give you my python version with a GUI!!!! (Tkinter)
>>
>> let's trade!
>>
>> don't bogart
>
> I seem to be missing this version:
>
> https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
>
> Do you have it? you must have!
>
>
>
>
>
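The perma-link idea quoted above (a stable, read-only reference to any article by its message-ID in the IETF "news" namespace) amounts to a one-line transformation. A minimal sketch, assuming RFC 5538's news: URI form, which takes the bare message-ID without the angle brackets that frame it in article headers:

```python
def news_uri(message_id: str) -> str:
    """Turn an article's Message-ID into an RFC 5538 "news:" URI.

    The scheme uses the bare message-ID, so the surrounding
    angle brackets from the header are stripped off.
    """
    return "news:" + message_id.strip().strip("<>")

# A reader could hand the result to any configured news server:
print(news_uri("<65C079E5.4E84@ix.netcom.com>"))
# news:65C079E5.4E84@ix.netcom.com
```

Pointing such URIs at a particular server (or an archive front-end) is then a matter of the client's configuration, not of the link itself, which is what makes the scheme a candidate for the long-term perma-links discussed here.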


Re: Meta: Re: How I deal with the enormous amount of spam

<65C079E5.4E84@ix.netcom.com>

https://news.novabbs.org/tech/article-flat.php?id=130535&group=sci.physics.relativity#130535
Path: i2pn2.org!i2pn.org!nntp.comgw.net!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Sun, 04 Feb 2024 22:02:13 -0800
Organization: To protect and to server
Message-ID: <65C079E5.4E84@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="1923239"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:8stTiWQlHSycvLz1qxtgzPUtHHXdUK2jbMvMd0+RlOw=
X-Antivirus: Avast (VPS 240205-0, 02/04/2024), Outbound message
X-Antivirus-Status: Clean
X-Notice: Filtered by postfilter v. 0.9.3
X-Mailer: Mozilla 3.04Gold (WinNT; U)
 by: The Starmaker - Mon, 5 Feb 2024 06:02 UTC

Ross Finlayson wrote:
>
> On 02/04/2024 12:53 PM, The Starmaker wrote:
> > The Starmaker wrote:
> >>
> >> Ross Finlayson wrote:
> >>>
> >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> [...]
> >
> > I seem to be missing this version:
> >
> > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> >
> > Do you have it? you must have!
> >
> >
> >
> >
> >
>
> Nope, I just wrote a little script to connect to NNTP
> with a Yes/No button on the subject, tapped through
> those, and a little script to send an HTTP request to
> the publicly-facing return-to-sender in-box, for each.
>
> Here are all the sources you need: the IETF RFC Editor.
> Look for "NNTP". As for advising Google of this,
> each domain on the Internet is supposed to have an
> "abuse@domain" email inbox, and there's probably
> also a web request interface, since publicly facing
> services are expected to be good actors on the network.
>
> Anyways if you read through "Meta: a usenet server
> just for sci.math", what I have in mind is a sort
> of author's and writer's oriented installation,
> basically making for vanity printouts and generating
> hypertext collections of contents and authors and
> subjects and these kinds of things, basically for
> on the order of "find all the postings of Archimedes
> Plutonium, and, the threads they are in, and,
> make a hypertext page of all that, a linear timeline,
> and also thread it out as a linear sequence".
>
> I.e. people who actually post to Usenet have sometimes
> written interesting things, and so making it simple to
> generate message-ID listings and their corresponding
> standard URLs in the IETF "news" URL scheme, pointed
> at a given news server or linked as with XLink, treats
> Usenet and its archives like a living museum of all these
> different authors' posts and their interactions together.
>
> I.e., here it's "belles lettres" and "fair use",
> not just "belles" and "use".
>
> It seemed nice of Google Groups to front this for a long time,
> now they're quitting.
>
> I imagine Internet Relay Chat's still insane, though.
>
> Anyways I stay away from any warez and am proud that
> since about Y2K at least I've never bootlegged anything,
> and never uploaded a bootleg. Don't want to give old Shylock
> excuses, and besides, I wrote software for a living.
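The workflow Ross describes (walk a group's articles, pick out the spams, and report each by message-ID) hinges on reading the posting-account out of the Injection-Info header, since, as noted above, that token and not the From address identifies the origin. A minimal sketch; the sample header strings are taken from this thread, and the tally helper is a hypothetical addition for spotting sockpuppet farms:

```python
import re

# The posting-account token appears both quoted (paganini.bofh.team)
# and unquoted (google-groups.googlegroups.com) in this thread's headers.
_ACCOUNT = re.compile(r'posting-account="?([^";\s]+)"?')

def posting_account(injection_info: str):
    """Extract the posting-account from an Injection-Info header value,
    or None if the server did not record one."""
    m = _ACCOUNT.search(injection_info)
    return m.group(1) if m else None

def tally_accounts(injection_infos):
    """Count articles per posting-account; many spams funneling into a
    few accounts would suggest a small farm rather than many users."""
    counts = {}
    for info in injection_infos:
        acct = posting_account(info)
        if acct:
            counts[acct] = counts.get(acct, 0) + 1
    return counts
```

Feeding this from a live NNTP session (e.g. the OVER/HDR commands of RFC 3977) and posting one abuse report per message-ID is the remaining plumbing; the extraction above is the part that decides who gets reported.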


Re: How I deal with the enormous amount of spam

<upqh5t$8t0g$1@dont-email.me>

https://news.novabbs.org/tech/article-flat.php?id=130536&group=sci.physics.relativity#130536
Path: i2pn2.org!i2pn.org!news.bbs.nz!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: wugi@brol.invalid (wugi)
Newsgroups: sci.physics.relativity
Subject: Re: How I deal with the enormous amount of spam
Date: Mon, 5 Feb 2024 12:35:25 +0100
Organization: A noiseless patient Spider
Lines: 28
Message-ID: <upqh5t$8t0g$1@dont-email.me>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Mon, 5 Feb 2024 11:35:25 -0000 (UTC)
Injection-Info: dont-email.me; posting-host="15bb213d60f704bab0ba0aefc299b3bb";
logging-data="291856"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/xQtO7OZa1kWq4YjARZacQ344dsPZVSvs="
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:znkKK/VO/39wxZiRHk0YB/8a5Ik=
In-Reply-To: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
Content-Language: nl
 by: wugi - Mon, 5 Feb 2024 11:35 UTC

Op 30/01/2024 om 1:03 schreef Tom Roberts:
> I use Thunderbird to read Usenet. Recently sci.physics.relativity has
> been getting hundreds of spam posts each day, completely overwhelming
> legitimate content. These spam posts share the property that they are
> written in a non-latin script.
>
> Thunderbird implements message filters that can mark a message Read. So

Better than that: why didn't you choose "delete"?

> I created a filter to run on sci.physics.relativity that marks messages
> Read. Then when reading the newsgroups, I simply display only unread

Not needed if you choose "delete".

> messages. The key to making this work is to craft the filter so it marks
> messages in which the Subject matches any of a dozen characters picked
> from some spam messages.
>
> This doesn't completely eliminate the spam, but it is now only a few
> messages per day.

I've done that too, but then there came a tsunami of Indonesian
Latin-script spam...

--
guido wugi
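Tom's filter above keys on the Subject matching any of a dozen characters sampled from spam, and wugi's follow-up points at its limit: Latin-script spam slips through. Both tests can be sketched stand-alone in Python; the character set below is illustrative, not the actual filter list:

```python
import unicodedata

# Illustrative stand-ins for "a dozen characters picked from some
# spam messages"; not the characters any real filter uses.
SPAM_CHARS = frozenset("\u4e00\u0938\u0e2a")

def is_spam_subject(subject: str, spam_chars=SPAM_CHARS) -> bool:
    """True if the Subject contains any flagged character, i.e. the
    condition under which the Thunderbird filter marks a message Read
    (or deletes it, per the suggestion above)."""
    return any(ch in spam_chars for ch in subject)

def is_non_latin(subject: str) -> bool:
    """Coarser test: any letter whose Unicode name is not LATIN.
    Note this still passes the Indonesian Latin-script spam wugi
    mentions, which is the limit of character-based filtering."""
    for ch in subject:
        # Unnamed characters default to "LATIN" here, so only letters
        # positively identified as another script trigger the filter.
        if ch.isalpha() and not unicodedata.name(ch, "LATIN").startswith("LATIN"):
            return True
    return False
```

Whether the filter marks Read or deletes outright is then a client-side choice; the matching logic is the same either way.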

Re: How I deal with the enormous amount of spam

<bf1a179c-e728-4ff7-993e-f6e4284cfcc8n@googlegroups.com>

https://news.novabbs.org/tech/article-flat.php?id=130537&group=sci.physics.relativity#130537
X-Received: by 2002:a0c:f884:0:b0:68c:8d51:1cbf with SMTP id u4-20020a0cf884000000b0068c8d511cbfmr110092qvn.7.1707142606233;
Mon, 05 Feb 2024 06:16:46 -0800 (PST)
X-Received: by 2002:ac8:5dd0:0:b0:42c:b6b:a9d1 with SMTP id
e16-20020ac85dd0000000b0042c0b6ba9d1mr659048qtx.1.1707142606053; Mon, 05 Feb
2024 06:16:46 -0800 (PST)
Path: i2pn2.org!i2pn.org!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!peer03.iad!feed-me.highwinds-media.com!news.highwinds-media.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Mon, 5 Feb 2024 06:16:45 -0800 (PST)
In-Reply-To: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
Injection-Info: google-groups.googlegroups.com; posting-host=2603:7000:af3f:8068:d087:ad8b:d6ec:aa7;
posting-account=snuulgoAAABlygjDf5Sy-IsQ7XWowIAM
NNTP-Posting-Host: 2603:7000:af3f:8068:d087:ad8b:d6ec:aa7
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <bf1a179c-e728-4ff7-993e-f6e4284cfcc8n@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: xip1415926@gmail.com (xip14)
Injection-Date: Mon, 05 Feb 2024 14:16:46 +0000
Content-Type: text/plain; charset="UTF-8"
X-Received-Bytes: 1340
 by: xip14 - Mon, 5 Feb 2024 14:16 UTC

I was galvanized into action
undertook wild gyrations
parkoured my person
postured like an incompetent photographer
writhed in fecklessness
and spiraled out of control

Re: How I deal with the enormous amount of spam

<334215ed-dad9-4a7a-9d49-b40a1de67796n@googlegroups.com>

https://news.novabbs.org/tech/article-flat.php?id=130539&group=sci.physics.relativity#130539
X-Received: by 2002:a05:622a:20f:b0:42c:845:5ff0 with SMTP id b15-20020a05622a020f00b0042c08455ff0mr6074qtx.5.1707157623619;
Mon, 05 Feb 2024 10:27:03 -0800 (PST)
X-Received: by 2002:ac8:57c8:0:b0:42c:28cd:cea6 with SMTP id
w8-20020ac857c8000000b0042c28cdcea6mr6100qta.5.1707157623445; Mon, 05 Feb
2024 10:27:03 -0800 (PST)
Path: i2pn2.org!i2pn.org!newsfeed.endofthelinebbs.com!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!peer01.iad!feed-me.highwinds-media.com!news.highwinds-media.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.physics.relativity
Date: Mon, 5 Feb 2024 10:27:03 -0800 (PST)
In-Reply-To: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
Injection-Info: google-groups.googlegroups.com; posting-host=2603:7000:af3f:8068:cdec:e575:3df3:c668;
posting-account=snuulgoAAABlygjDf5Sy-IsQ7XWowIAM
NNTP-Posting-Host: 2603:7000:af3f:8068:cdec:e575:3df3:c668
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <334215ed-dad9-4a7a-9d49-b40a1de67796n@googlegroups.com>
Subject: Re: How I deal with the enormous amount of spam
From: xip1415926@gmail.com (xip14)
Injection-Date: Mon, 05 Feb 2024 18:27:03 +0000
Content-Type: text/plain; charset="UTF-8"
X-Received-Bytes: 1358
 by: xip14 - Mon, 5 Feb 2024 18:27 UTC

I was galvanized into action
undertook wild gyrations
parkoured my person
postured like an incompetent photographer
writhed in fecklessness
and spiraled out of control


Re: Meta: Re: How I deal with the enormous amount of spam

<65C27AC4.15B7@ix.netcom.com>

https://news.novabbs.org/tech/article-flat.php?id=130546&group=sci.physics.relativity#130546
Path: i2pn2.org!i2pn.org!news.neodome.net!weretis.net!feeder8.news.weretis.net!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity,sci.physics
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Tue, 06 Feb 2024 10:30:28 -0800
Organization: To protect and to server
Message-ID: <65C27AC4.15B7@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com> <65C079E5.4E84@ix.netcom.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="2274419"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:AoB3Ep+6Xu/yzwOu0yncsEZwi8u5QZmn05lxXhE8l18=
X-Notice: Filtered by postfilter v. 0.9.3
X-Mailer: Mozilla 3.04Gold (WinNT; U)
X-Antivirus: Avast (VPS 240206-2, 02/06/2024), Outbound message
X-Antivirus-Status: Clean
 by: The Starmaker - Tue, 6 Feb 2024 18:30 UTC

The Starmaker wrote:
>
> Ross Finlayson wrote:
> >
> > On 02/04/2024 12:53 PM, The Starmaker wrote:
> > > The Starmaker wrote:
> > >>
> > >> Ross Finlayson wrote:
> > >>>
> > >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> > >>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
> > >>>>> The Starmaker wrote:
> > >>>>>>
> > >>>>>> Ross Finlayson wrote:
> > >>>>>>>
> > >>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
> > >>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
> > >>>>>>>>> Tom Roberts wrote:
> > >>>>>>>>>
> > >>>>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
> > >>>>>>>>>> has
> > >>>>>>>>>> been getting hundreds of spam posts each day, completely
> > >>>>>>>>>> overwhelming
> > >>>>>>>>>> legitimate content. These spam posts share the property that they
> > >>>>>>>>>> are
> > >>>>>>>>>> written in a non-latin script.
> > >>>>>>>>>
> > >>>>>>>>>> Thunderbird implements message filters that can mark a message
> > >>>>>>>>>> Read. So
> > >>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
> > >>>>>>>>>> messages
> > >>>>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
> > >>>>>>>>>> messages. The key to making this work is to craft the filter so
> > >>>>>>>>>> it marks
> > >>>>>>>>>> messages in which the Subject matches any of a dozen characters
> > >>>>>>>>>> picked
> > >>>>>>>>>> from some spam messages.
> > >>>>>>>>>
> > >>>>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
> > >>>>>>>>>> messages per day.
> > >>>>>>>>>
> > >>>>>>>>>> Tom Roberts
> > >>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
> > >>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
> > >>>>>>>>> skills and is not at all intuitive. If anyone can point to an
> > >>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.
> > >>>>>>>>>
> > >>>>>>>>> \Paul Alsing
> > >>>>>>>>
> > >>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
> > >>>>>>>>
> > >>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
> > >>>>>>>>
> > >>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
> > >>>>>>>> according to Google,
> > >>>>>>>> Google will break its compeerage to Usenet, and furthermore make
> > >>>>>>>> read-only
> > >>>>>>>> the archives, what it has, what until then, will be as it was.
> > >>>>>>>>
> > >>>>>>>> Over on sci.math I've had the idea for a while of making some brief
> > >>>>>>>> and
> > >>>>>>>> special purpose Usenet compeers, for only some few groups, or, you
> > >>>>>>>> know, the _belles lettres_ of the text hierarchy.
> > >>>>>>>>
> > >>>>>>>> "Meta: a usenet server just for sci.math"
> > >>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
> > >>>>>>>>
> > >>>>>>>> So, there you can read the outlook of this kind of thing, then
> > >>>>>>>> while sort
> > >>>>>>>> of simple as the protocol is simple and its implementations
> > >>>>>>>> widespread,
> > >>>>>>>> how to deal with the "signal and noise" of "exposed messaging
> > >>>>>>>> destinations
> > >>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
> > >>>>>>>> "NOOBNB protocol",
> > >>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
> > >>>>>>>> also for
> > >>>>>>>> email or messaging destinations, sort of designed with the
> > >>>>>>>> expectation that
> > >>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
> > >>>>>>>> it in simple terms.
> > >>>>>>>>
> > >>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
> > >>>>>>>>
> > >>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
> > >>>>>>>> transport layer.)
> > >>>>>>>>
> > >>>>>>>> Also it would be real great if at least there was surfaced to the
> > >>>>>>>> Internet a
> > >>>>>>>> read-only view of any message by its message ID, a "URL", or as for
> > >>>>>>>> a "URI",
> > >>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
> > >>>>>>>>
> > >>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
> > >>>>>>>>
> > >>>>>>>> I wonder that there's a reliable sort of long-term project that
> > >>>>>>>> surfaces
> > >>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
> > >>>>>>>> protocol.
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
> > >>>>>>>>
> > >>>>>>>
> > >>>>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
> > >>>>>>> outrageous, can we do anything about it? Would you write a script to
> > >>>>>>> funnel all their message-ID's into the abuse reporting? And I was
> > >>>>>>> like,
> > >>>>>>> you know, about 2008 I did just that, there was a big spam flood,
> > >>>>>>> and I wrote a little script to find them and extract their
> > >>>>>>> posting-account,
> > >>>>>>> and the message-ID, and a little script to post to the posting-host,
> > >>>>>>> each one of the wicked spams.
> > >>>>>>>
> > >>>>>>> At the time that seemed to help, they sort of dried up, here there's
> > >>>>>>> that basically they're not following the charter, but, it's the
> > >>>>>>> posting-account
> > >>>>>>> in the message headers that indicates the origin of the post, not the
> > >>>>>>> email address. So, I wonder, given that I can extract the
> > >>>>>>> posting-accounts
> > >>>>>>> of all the spams, how to match the posting-account to then determine
> > >>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
> > >>>>>>> them up.
> > >>>>>>
> > >>>>>> Let me see your little script. Post it here.
> > >>>>>
> > >>>>> Here is a list I currently have:
> > >>>>>
> > >>>>> salz.txt
> > >>>>> usenet.death.penalty.gz
> > >>>>> purify.txt
> > >>>>> NewsAgent110-MS.exe
> > >>>>> HipCrime's NewsAgent (v1_11).htm
> > >>>>> NewsAgent111-BE.zip
> > >>>>> SuperCede.exe
> > >>>>> NewsAgent023.exe
> > >>>>> NewsAgent025.exe
> > >>>>> ActiveAgent.java
> > >>>>> HipCrime's NewsAgent (v1_02)_files
> > >>>>> NewsCancel.java (source code)
> > >>>>>
> > >>>>> (plus updated python versions)
> > >>>>>
> > >>>>>
> > >>>>>
> > >>>>> (Maybe your script is in there somewhere?)
> > >>>>>
> > >>>>>
> > >>>>>
> > >>>>> Show me what you got. walk the walk.
> > >>>>>
> > >>>>
> > >>>>
> > >>>> I try to avoid sketchy things like hiring a criminal botnet,
> > >>>> there's the impression that that's looking at 1000's of counts
> > >>>> of computer intrusion.
> > >>>>
> > >>>> With those being something about $50K and 10-25 apiece,
> > >>>> there's a pretty significant deterrence to such activities.
> > >>>>
> > >>>> I've never much cared for "OAuth", giving away the
> > >>>> keys-to-the-kingdom and all, here it looks like either
> > >>>> a) a bunch of duped browsers clicked away their identities,
> > >>>> or b) it's really that Google and Facebook are more than
> > >>>> half full of fake identities for the sole purpose of being fake.
> > >>>>
> > >>>> (How's your new deal going?
> > >>>> Great, we got a million users.
> > >>>> Why are my conversions around zero?
> > >>>> Your ad must not speak to them.
> > >>>> Would it help if I spiced it up?
> > >>>> Don't backtalk me, I'll put you on a list!)
> > >>>>
> > >>>> So, it seems mostly a sort of "spam-walling the Internet",
> > >>>> where it was like "we're going to reinvent the Internet",
> > >>>> "no, you aren't", "all right then we'll ruin this one".
> > >>>>
> > >>>> As far as search goes, there's something to be said
> > >>>> for a new sort of approach to search, given that
> > >>>> Google, Bing, Duck, ..., _all make the same results_. It's
> > >>>> just so highly unlikely that they'd _all make the same
> > >>>> results_, you figure they're just one.
> > >>>>
> > >>>> So, the idea, for somebody like me who's mostly interested
> > >>>> in writing on the Internet, is that lots of that is of the sort
> > >>>> of "works" vis-a-vis the "feuilleton" or what you might
> > >>>> call it, ephemeral junk, that I just learned about in
> > >>>> Hermann Hesse's "The Glass Bead Game".
> > >>>>
> > >>>> Then, there's an idea, that basically to surface high-quality
> > >>>> works to a search, is that there's what's called metadata,
> > >>>> for content like HTML, with regards to Dublin Core and
> > >>>> RDF and so on, about a sort of making for fungible collections
> > >>>> of works, what results searchable fragments of various
> > >>>> larger bodies of works, according to their robots.txt and
> > >>>> their summaries and with regards to crawling the content
> > >>>> and so on, then to make federated common search corpora,
> > >>>> these kinds of things.
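The metadata idea above reduces to embedding Dublin Core fields in a page head so crawlers can build those federated corpora. A minimal sketch, assuming hypothetical field values (the title, author, and identifier below are taken from elsewhere in this thread, the function name is made up):

```python
# Sketch: surface a work to search crawlers via Dublin Core <meta> tags,
# as the passage above suggests. Field values here are example data.
def dublin_core_head(title, creator, date, identifier):
    fields = {
        "DC.title": title,
        "DC.creator": creator,
        "DC.date": date,
        "DC.identifier": identifier,
    }
    # One <meta> element per Dublin Core field, for the HTML <head>.
    return "\n".join(
        f'<meta name="{name}" content="{value}">' for name, value in fields.items()
    )

print(dublin_core_head(
    "Meta: a usenet server just for sci.math",
    "Ross Finlayson",
    "2024-02-06",
    "news:65C2822A.16A7@ix.netcom.com",
))
```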
> > >>>>
> > >>>>
> > >>>>
> > >>>
> > >>> It's like "why are they building that new data center",
> > >>> and it's like "well it's like Artificial Intelligence, inside
> > >>> that data center is a million virts and each one has a
> > >>> browser emulator and a phone app sandbox and a
> > >>> little notecard that prompts its name, basically it's
> > >>> a million-headed hydra called a sims-bot-farm,
> > >>> that for pennies on the dollar is an instant audience."
> > >>>
> > >>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
> > >>>
> > >>> Usenet traffic had been up recently, ....
> > >>>
> > >>> I think they used to call it "astro-turfing".
> > >>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
> > >>
> > >> I have NewsAgent111-MS.exe
> > >>
> > >> I seem to be missing version 2.0
> > >>
> > >> Do you have the 2.0 version?
> > >>
> > >> I'll trade you.
> > >>
> > >> I'll give you my python version with (GUI)!!!! (Tkinter)
> > >>
> > >> let's trade!
> > >>
> > >> don't bogart
> > >
> > > I seem to be missing this version:
> > >
> > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > >
> > > Do you have it? you must have!
> > >
> > >
> > >
> > >
> > >
> >
> > Nope, I just wrote a little script to connect to NNTP
> > with a Yes/No button on the subject, tapped through
> > those, and a little script to send an HTTP request to
> > the publicly-facing return-to-sender in-box, for each.
> >
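The scan-and-report loop described above (connect over NNTP, tap through subjects, extract the message-ID and the posting-account from the injection headers, file a report for each) can be sketched with the standard library. This is not the original script; the server name, window size, and spam heuristic are assumptions:

```python
# Sketch of the scan-and-report loop described above. The server, group,
# and is_spam() heuristic are placeholder assumptions, not the original
# script. (nntplib is in the stdlib through Python 3.12.)

def is_spam(subject):
    # Placeholder heuristic per the thread: subjects mostly in a
    # non-Latin script.
    non_latin = sum(1 for ch in subject if ord(ch) > 0x024F)
    return non_latin > len(subject) // 2

def scan(server="news.example.net", group="sci.physics.relativity"):
    import nntplib  # imported lazily; the module was removed in 3.13
    reports = []
    with nntplib.NNTP(server) as nn:
        _resp, _count, first, last, _name = nn.group(group)
        _resp, overviews = nn.over((max(first, last - 200), last))
        for _num, over in overviews:
            subject = nntplib.decode_header(over.get("subject", ""))
            if not is_spam(subject):
                continue
            # The origin is the posting-account in the injection headers,
            # not the From: address, so fetch the full header block.
            _resp, head = nn.head(over["message-id"])
            for raw in head.lines:
                line = raw.decode("utf-8", "replace")
                if "posting-account=" in line:
                    reports.append((over["message-id"], line.strip()))
    return reports  # (message-ID, posting-account header) pairs to report
```

Each pair would then go to the posting host's abuse@domain inbox or web form, as described below.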
> > Here's all the sources you need: IETF RFC editor.
> > Look for "NNTP". How to advise Google of this is
> > that each domain on the Internet is supposed to
> > have an "abuse@domain" email inbox, though there's
> > probably also a web request interface, as with regards
> > to publicly facing services, and expected to be
> > good actors on the network.
> >
> > Anyways if you read through "Meta: a usenet server
> > just for sci.math", what I have in mind is a sort
> > of author's and writer's oriented installation,
> > basically making for vanity printouts and generating
> > hypertext collections of contents and authors and
> > subjects and these kinds of things, basically for
> > on the order of "find all the postings of Archimedes
> > Plutonium, and, the threads they are in, and,
> > make a hypertext page of all that, a linear timeline,
> > and also thread it out as a linear sequence".
> >
> > I.e. people who actually post to Usenet have sometimes
> > written interesting things, and thus making it
> > simple to generate message-ID listings and their
> > corresponding standard URL's, in the standard IETF
> > "news" URL protocol, and pointing those at a given
> > news server, or linking out as with XLink, is a way of
> > treating Usenet and its archives like a living museum of
> > all these different authors' posts and their interactions together.
> >
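The message-ID listing idea above is mostly string work: an IETF "news" URL is the scheme plus the message-ID without its angle brackets, and the server-qualified "nntp" form adds a host and group. A minimal sketch, using a message-ID that appears in this thread (the server and index layout are assumptions):

```python
# Sketch: turn archived message-IDs into IETF "news"/"nntp" perma-links
# and a small hypertext index, per the idea above. The server name and
# page layout are hypothetical.
from html import escape

def news_url(message_id):
    # RFC 5538 form: news:<message-id without angle brackets>
    return "news:" + message_id.strip("<>")

def nntp_url(server, group, message_id):
    # Server-qualified form pointing at one compeer's copy.
    return f"nntp://{server}/{group}/{message_id.strip('<>')}"

def index_page(author, posts):
    # posts: iterable of (message_id, subject) in timeline order.
    items = "\n".join(
        f'<li><a href="{escape(news_url(mid))}">{escape(subj)}</a></li>'
        for mid, subj in posts
    )
    return f"<h1>{escape(author)}</h1>\n<ol>\n{items}\n</ol>"

print(news_url("<65C2822A.16A7@ix.netcom.com>"))
# news:65C2822A.16A7@ix.netcom.com
```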
> > I.e., here it's "belles lettres" and "fair use",
> > not just "belles" and "use".
> >
> > It seemed nice of Google Groups to front this for a long time,
> > now they're quitting.
> >
> > I imagine Internet Relay Chat's still insane, though.
> >
> > Anyways I stay away from any warez and am proud that
> > since about Y2K at least I've never bootlegged anything,
> > and never uploaded a bootleg. Don't want to give old Shylock
> > excuses, and besides, I wrote software for a living.
>
> Anyways, I don't know who was talking about "any warez" or "bootlegs",
> since I was referring to programs and scripts that read:
>
> "FREE, which means you can copy it and redistribute"
> "Similarly, the source is provided as reference and can be redistributed
> freely as well. "
>
> HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and
> redistribute it at will, as long as you give credit to the original
> author. Similarly, the source is provided as reference and can be
> redistributed freely as well.
>
> https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
>
> You seem to be too much 'in your head', on a high horse...
>
> "FREE, which means you can copy it and redistribute"
> "Similarly, the source is provided as reference and can be redistributed
> freely as well. "
>
> So, show me that wicked script you wrote : "funnel all their
> message-ID's"
> by people you call spammers who 'funnel' their products and services
> through Usenet newsgroups.
>
> You are sooooo wicked.
>
> and a nanofossils


Re: Meta: Re: How I deal with the enormous amount of spam


https://news.novabbs.org/tech/article-flat.php?id=130549&group=sci.physics.relativity#130549

Path: i2pn2.org!i2pn.org!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity,sci.physics
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Tue, 06 Feb 2024 11:02:02 -0800
Organization: To protect and to server
Message-ID: <65C2822A.16A7@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com> <65C079E5.4E84@ix.netcom.com> <65C27AC4.15B7@ix.netcom.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="2274419"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:4FrEmxvODYXVDynMlSmNNvZ8VdnuGSlZqXFWO96zULA=
X-Antivirus-Status: Clean
X-Antivirus: Avast (VPS 240206-2, 02/06/2024), Outbound message
X-Mailer: Mozilla 3.04Gold (WinNT; U)
X-Notice: Filtered by postfilter v. 0.9.3
 by: The Starmaker - Tue, 6 Feb 2024 19:02 UTC

The Starmaker wrote:
>
> The Starmaker wrote:
> >
> > Ross Finlayson wrote:
> > >
> > > On 02/04/2024 12:53 PM, The Starmaker wrote:
> > > > The Starmaker wrote:
> > > >>
> > > >> Ross Finlayson wrote:
> > > >>>
> > > >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> > > >>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
> > > >>>>> The Starmaker wrote:
> > > >>>>>>
> > > >>>>>> Ross Finlayson wrote:
> > > >>>>>>>
> > > >>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
> > > >>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
> > > >>>>>>>>> Tom Roberts wrote:
> > > >>>>>>>>>
> > > >>>>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
> > > >>>>>>>>>> has
> > > >>>>>>>>>> been getting hundreds of spam posts each day, completely
> > > >>>>>>>>>> overwhelming
> > > >>>>>>>>>> legitimate content. These spam posts share the property that they
> > > >>>>>>>>>> are
> > > >>>>>>>>>> written in a non-Latin script.
> > > >>>>>>>>>
> > > >>>>>>>>>> Thunderbird implements message filters that can mark a message
> > > >>>>>>>>>> Read. So
> > > >>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
> > > >>>>>>>>>> messages
> > > >>>>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
> > > >>>>>>>>>> messages. The key to making this work is to craft the filter so
> > > >>>>>>>>>> it marks
> > > >>>>>>>>>> messages in which the Subject matches any of a dozen characters
> > > >>>>>>>>>> picked
> > > >>>>>>>>>> from some spam messages.
> > > >>>>>>>>>
> > > >>>>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
> > > >>>>>>>>>> messages per day.
> > > >>>>>>>>>
> > > >>>>>>>>>> Tom Roberts
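The filter Tom Roberts describes, mark a message Read when its Subject contains any of a dozen characters sampled from seen spam, can be sketched outside Thunderbird too. The sample character set below is an assumption, not his actual list:

```python
# Sketch of the mark-as-read filter described above: hide a message when
# its Subject contains any of a dozen characters sampled from observed
# spam. SAMPLED_CHARS is an assumed example set, not Tom Roberts' list.
SAMPLED_CHARS = set("免费赚钱推广代发货源优惠")

def mark_read(subject):
    # True = the filter marks the message Read, so it disappears from
    # an unread-only view.
    return any(ch in SAMPLED_CHARS for ch in subject)
```

As he notes, this doesn't catch everything, only subjects containing the sampled characters, but it cuts hundreds of posts a day down to a few.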
> > > >>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
> > > >>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
> > > >>>>>>>>> skills and is not at all intuitive. If anyone can point to an
> > > >>>>>>>>> idiot-proof tutorial for doing this, it would be much appreciated.
> > > >>>>>>>>>
> > > >>>>>>>>> \Paul Alsing
> > > >>>>>>>>
> > > >>>>>>>> Yeah, it's pretty bad, or, worse than anybody's ever seen it.
> > > >>>>>>>>
> > > >>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
> > > >>>>>>>>
> > > >>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
> > > >>>>>>>> according to Google,
> > > >>>>>>>> Google will break its compeerage to Usenet, and furthermore make
> > > >>>>>>>> read-only
> > > >>>>>>>> the archives, what it has, what until then, will be as it was.
> > > >>>>>>>>
> > [...]
>
> Anyways, there is only one person that knows what 'nanofossils' means,
> and that is Ross Finlayson.
>
> I just realized that Ross Finlayson doesn't know of NEWSAGENT.
>
> Anyways, ...
>
> "Anyways"???? Who talks like that?
>
> Anyways..
>
> the problem of the 'flooding' is not the spammers, it's the 'scientific
> community'. They caused the problem.
> They removed the feature that NewsAgent used to get rid of ALL flooding
> and spammers. But, but, the
> members of the scientific community could not trust their own members to
> use it against them.
>
> If one member of the 'scientific community' disagreed with another
> member of the 'scientific community'...they were removed!
>
> Too much power.
>
> I called it...God Mode.


Re: Meta: Re: How I deal with the enormous amount of spam


https://news.novabbs.org/tech/article-flat.php?id=130551&group=sci.physics.relativity#130551

Path: i2pn2.org!i2pn.org!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity,sci.physics
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Tue, 06 Feb 2024 11:23:03 -0800
Organization: To protect and to server
Message-ID: <65C28717.3F60@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com> <65C079E5.4E84@ix.netcom.com> <65C27AC4.15B7@ix.netcom.com> <65C2822A.16A7@ix.netcom.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="2279218"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:XTwOBFmRezGomTs5vwgXMQ4HRPBrGcPbtCrhoBn5a8w=
X-Mailer: Mozilla 3.04Gold (WinNT; U)
X-Notice: Filtered by postfilter v. 0.9.3
X-Antivirus: Avast (VPS 240206-2, 02/06/2024), Outbound message
X-Antivirus-Status: Clean
 by: The Starmaker - Tue, 6 Feb 2024 19:23 UTC

The Starmaker wrote:
>
> The Starmaker wrote:
> >
> > The Starmaker wrote:
> > >
> > > Ross Finlayson wrote:
> > > >
> > > > On 02/04/2024 12:53 PM, The Starmaker wrote:
> > > > > The Starmaker wrote:
> > > > >>
> > > > >> Ross Finlayson wrote:
> > > > >>>
> > > > >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> > > > >>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
> > > > >>>>> The Starmaker wrote:
> > > > >>>>>>
> > > > >>>>>> Ross Finlayson wrote:
> > > > >>>>>>>
> > > > >>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
> > > > >>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
> > > > >>>>>>>>> Tom Roberts wrote:
> > > > >>>>>>>>>
> > > > >>>>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
> > > > >>>>>>>>>> has
> > > > >>>>>>>>>> been getting hundreds of spam posts each day, completely
> > > > >>>>>>>>>> overwhelming
> > > > >>>>>>>>>> legitimate content. These spam posts share the property that they
> > > > >>>>>>>>>> are
> > > > >>>>>>>>>> written in a non-latin script.
> > > > >>>>>>>>>
> > > > >>>>>>>>>> Thunderbird implements message filters that can mark a message
> > > > >>>>>>>>>> Read. So
> > > > >>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
> > > > >>>>>>>>>> messages
> > > > >>>>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
> > > > >>>>>>>>>> messages. The key to making this work is to craft the filter so
> > > > >>>>>>>>>> it marks
> > > > >>>>>>>>>> messages in which the Subject matches any of a dozen characters
> > > > >>>>>>>>>> picked
> > > > >>>>>>>>>> from some spam messages.
> > > > >>>>>>>>>
> > > > >>>>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
> > > > >>>>>>>>>> messages per day.
> > > > >>>>>>>>>
> > > > >>>>>>>>>> Tom Roberts
> > > > >>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
> > > > >>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
> > > > >>>>>>>>> skills and is not at all intuitive. If anyone can point to an
> > > > >>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.
> > > > >>>>>>>>>
> > > > >>>>>>>>> \Paul Alsing
> > > > >>>>>>>>
> > > > >>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
> > > > >>>>>>>>
> > > > >>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
> > > > >>>>>>>>
> > > > >>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
> > > > >>>>>>>> according to Google,
> > > > >>>>>>>> Google will break its compeerage to Usenet, and furthermore make
> > > > >>>>>>>> read-only
> > > > >>>>>>>> the archives, what it has, what until then, will be as it was.
> > > > >>>>>>>>
> > > > >>>>>>>> Over on sci.math I've had the idea for a while of making some brief
> > > > >>>>>>>> and
> > > > >>>>>>>> special purpose Usenet compeers, for only some few groups, or, you
> > > > >>>>>>>> know, the _belles lettres_ of the text hierarchy.
> > > > >>>>>>>>
> > > > >>>>>>>> "Meta: a usenet server just for sci.math"
> > > > >>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
> > > > >>>>>>>>
> > > > >>>>>>>> So, there you can read the outlook of this kind of thing, then
> > > > >>>>>>>> while sort
> > > > >>>>>>>> of simple as the protocol is simple and its implementations
> > > > >>>>>>>> widespread,
> > > > >>>>>>>> how to deal with the "signal and noise" of "exposed messaging
> > > > >>>>>>>> destinations
> > > > >>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
> > > > >>>>>>>> "NOOBNB protocol",
> > > > >>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
> > > > >>>>>>>> also for
> > > > >>>>>>>> email or messaging destinations, sort of designed with the
> > > > >>>>>>>> expectation that
> > > > >>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
> > > > >>>>>>>> it in simple terms.
> > > > >>>>>>>>
> > > > >>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
> > > > >>>>>>>>
> > > > >>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
> > > > >>>>>>>> transport layer.)
> > > > >>>>>>>>
> > > > >>>>>>>> Also it would be real great if at least there was surfaced to the
> > > > >>>>>>>> Internet a
> > > > >>>>>>>> read-only view of any message by its message ID, a "URL", or as for
> > > > >>>>>>>> a "URI",
> > > > >>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
> > > > >>>>>>>>
> > > > >>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
> > > > >>>>>>>>
> > > > >>>>>>>> I wonder that there's a reliable sort of long-term project that
> > > > >>>>>>>> surfaces
> > > > >>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
> > > > >>>>>>>> protocol.
> > > > >>>>>>>>
> > > > >>>>>>>>
> > > > >>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
> > > > >>>>>>>>
> > > > >>>>>>>>
> > > > >>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
> > > > >>>>>>>>
> > > > >>>>>>>
> > > > >>>>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
> > > > >>>>>>> outrageous, can we do anything about it? Would you write a script to
> > > > >>>>>>> funnel all their message-ID's into the abuse reporting? And I was
> > > > >>>>>>> like,
> > > > >>>>>>> you know, about 2008 I did just that, there was a big spam flood,
> > > > >>>>>>> and I wrote a little script to find them and extract their
> > > > >>>>>>> posting-account,
> > > > >>>>>>> and the message-ID, and a little script to post to the posting-host,
> > > > >>>>>>> each one of the wicked spams.
> > > > >>>>>>>
> > > > >>>>>>> At the time that seemed to help, they sort of dried up, here there's
> > > > >>>>>>> that basically they're not following the charter, but, it's the
> > > > >>>>>>> posting-account
> > > > >>>>>>> in the message headers that indicate the origin of the post, not the
> > > > >>>>>>> email address. So, I wonder, given that I can extract the
> > > > >>>>>>> posting-accounts
> > > > >>>>>>> of all the spams, how to match the posting-account to then determine
> > > > >>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
> > > > >>>>>>> them up.
> > > > >>>>>>
> > > > >>>>>> Let me see your little script. Post it here.
> > > > >>>>>
> > > > >>>>> Here is a list I currently have:
> > > > >>>>>
> > > > >>>>> salz.txt
> > > > >>>>> usenet.death.penalty.gz
> > > > >>>>> purify.txt
> > > > >>>>> NewsAgent110-MS.exe
> > > > >>>>> HipCrime's NewsAgent (v1_11).htm
> > > > >>>>> NewsAgent111-BE.zip
> > > > >>>>> SuperCede.exe
> > > > >>>>> NewsAgent023.exe
> > > > >>>>> NewsAgent025.exe
> > > > >>>>> ActiveAgent.java
> > > > >>>>> HipCrime's NewsAgent (v1_02)_files
> > > > >>>>> NewsCancel.java (source code)
> > > > >>>>>
> > > > >>>>> (plus updated python versions)
> > > > >>>>>
> > > > >>>>>
> > > > >>>>>
> > > > >>>>> (Maybe your script is in there somewhere?)
> > > > >>>>>
> > > > >>>>>
> > > > >>>>>
> > > > >>>>> Show me what you got. walk the walk.
> > > > >>>>>
> > > > >>>>
> > > > >>>>
> > > > >>>> I try to avoid sketchy things like hiring a criminal botnet,
> > > > >>>> there's the impression that that's looking at 1000's of counts
> > > > >>>> of computer intrusion.
> > > > >>>>
> > > > >>>> With those being something about $50K and 10-25 apiece,
> > > > >>>> there's a pretty significant deterrence to such activities.
> > > > >>>>
> > > > >>>> I've never much cared for "OAuth", giving away the
> > > > >>>> keys-to-the-kingdom and all, here it looks like either
> > > > >>>> a) a bunch of duped browsers clicked away their identities,
> > > > >>>> or b) it's really that Google and Facebook are more than
> > > > >>>> half full of fake identities for the sole purpose of being fake.
> > > > >>>>
> > > > >>>> (How's your new deal going?
> > > > >>>> Great, we got a million users.
> > > > >>>> Why are my conversions around zero?
> > > > >>>> Your ad must not speak to them.
> > > > >>>> Would it help if I spiced it up?
> > > > >>>> Don't backtalk me, I'll put you on a list!)
> > > > >>>>
> > > > >>>> So, it seems mostly a sort of "spam-walling the Internet",
> > > > >>>> where it was like "we're going to reinvent the Internet",
> > > > >>>> "no, you aren't", "all right then we'll ruin this one".
> > > > >>>>
> > > > >>>> As far as search goes, there's something to be said
> > > > >>>> for a new sort of approach to search, given that
> > > > >>>> Google, Bing, Duck, ..., _all make the same results_. It's
> > > > >>>> just so highly unlikely that they'd _all make the same
> > > > >>>> results_, you figure they're just one.
> > > > >>>>
> > > > >>>> So, the idea, for somebody like me who's mostly interested
> > > > >>>> in writing on the Internet, is that lots of that is of the sort
> > > > >>>> of "works" vis-a-vis, the "feuilleton" or what you might
> > > > >>>> call it, ephemeral junk, that I just learned about in
> > > > >>>> Hermann Hesse's "The Glass Bead Game".
> > > > >>>>
> > > > >>>> Then, there's an idea, that basically to surface high-quality
> > > > >>>> works to a search, is that there's what's called metadata,
> > > > >>>> for content like HTML, with regards to Dublin Core and
> > > > >>>> RDF and so on, about a sort of making for fungible collections
> > > > >>>> of works, what results searchable fragments of various
> > > > >>>> larger bodies of works, according to their robots.txt and
> > > > >>>> their summaries and with regards to crawling the content
> > > > >>>> and so on, then to make federated common search corpora,
> > > > >>>> these kinds of things.
> > > > >>>>
> > > > >>>>
> > > > >>>>
> > > > >>>
> > > > >>> It's like "why are they building that new data center",
> > > > >>> and it's like "well it's like Artificial Intelligence, inside
> > > > >>> that data center is a million virts and each one has a
> > > > >>> browser emulator and a phone app sandbox and a
> > > > >>> little notecard that prompts its name, basically it's
> > > > >>> a million-headed hydra called a sims-bot-farm,
> > > > >>> that for pennies on the dollar is an instant audience."
> > > > >>>
> > > > >>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
> > > > >>>
> > > > >>> Usenet traffic had been up recently, ....
> > > > >>>
> > > > >>> I think they used to call it "astro-turfing".
> > > > >>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
> > > > >>
> > > > >> I have NewsAgent111-MS.exe
> > > > >>
> > > > >> I seem to be missing version 2.0
> > > > >>
> > > > >> Do you have the 2.0 version?
> > > > >>
> > > > >> I'll trade you.
> > > > >>
> > > > >> I'll give you my python version with (GUI)!!!! (Tkinter)
> > > > >>
> > > > >> let's trade!
> > > > >>
> > > > >> don't bogart
> > > > >
> > > > > I seem to be missing this version:
> > > > >
> > > > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > > > >
> > > > > Do you have it? you must have!
> > > > >
> > > > >
> > > > >
> > > > >
> > > > >
> > > >
> > > > Nope, I just wrote a little script to connect to NNTP
> > > > with a Yes/No button on the subject, tapped through
> > > > those, and a little script to send an HTTP request to
> > > > the publicly-facing return-to-sender in-box, for each.
> > > >
> > > > Here's all the sources you need: IETF RFC editor.
> > > > Look for "NNTP". How to advise Google of this is
> > > > that each domain on the Internet is supposed to
> > > > have an "abuse@domain" email inbox, though there's
> > > > probably also a web request interface, as with regards
> > > > to publicly facing services, and expected to be
> > > > good actors on the network.
> > > >
> > > > Anyways if you read through "Meta: a usenet server
> > > > just for sci.math", what I have in mind is a sort
> > > > of author's and writer's oriented installation,
> > > > basically making for vanity printouts and generating
> > > > hypertext collections of contents and authors and
> > > > subjects and these kinds of things, basically for
> > > > on the order of "find all the postings of Archimedes
> > > > Plutonium, and, the threads they are in, and,
> > > > make a hypertext page of all that, a linear timeline,
> > > > and also thread it out as a linear sequence".
> > > >
> > > > I.e. people who actually post to Usenet are sometimes
> > > > having written interesting things, and, thus having
> > > > it so that it would be simplified to generate message-ID
> > > > listings and their corresponding standard URL's in the
> > > > standard IETF "news" URL protocol, and to point that
> > > > at a given news server or like XLink, is for treating
> > > > Usenet and its archives like a living museum of all these
> > > > different authors' posts and their interactions together.
> > > >
> > > > I.e., here it's "belles lettres" and "fair use",
> > > > not just "belles" and "use".
> > > >
> > > > It seemed nice of Google Groups to front this for a long time,
> > > > now they're quitting.
> > > >
> > > > I imagine Internet Relay Chat's still insane, though.
> > > >
> > > > Anyways I stay away from any warez and am proud that
> > > > since about Y2K at least I've never bootlegged anything,
> > > > and never uploaded a bootleg. Don't want to give old Shylock
> > > > excuses, and besides, I wrote software for a living.
> > >
> > > Anyways, I don't know who was talking about "any warez" or "bootlegs",
> > > since I was referring to programs and scripts that read:
> > >
> > > "FREE, which means you can copy it and redistribute"
> > > "Similarly, the source is provided as reference and can be redistributed
> > > freely as well. "
> > >
> > > HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and
> > > redistribute it at will, as long as you give credit to the original
> > > author. Similarly, the source is provided as reference and can be
> > > redistributed freely as well.
> > >
> > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > >
> > > You seem to be too much 'in your head', on a high horse...
> > >
> > > "FREE, which means you can copy it and redistribute"
> > > "Similarly, the source is provided as reference and can be redistributed
> > > freely as well. "
> > >
> > > So, show me that wicked script you wrote : "funnel all their
> > > message-ID's"
> > > by people you call spammers who 'funnel' their products and services
> > > through Usenet newsgroups.
> > >
> > > You are sooooo wicked.
> > >
> > > and a nanofossils
> >
> > Anyways, there is only one person that know what 'nanofossils' means,
> > and that is Ross Finlayson.
> >
> > I just realized that Ross Finlayson doesn't know of NEWSAGENT.
> >
> > Anyways, ...
> >
> > "Anyways"???? Who talks like that?
> >
> > Anyways..
> >
> > the problem of the 'flooding' is not the spammers, it's the 'scientific
> > community'. They caused the problem.
> > They removed the feature that NewsAgent used to get rid of ALL flooding
> > and spammers. But, but, the
> > members of the scientific community could not trust their own members to
> > use it against them.
> >
> > If one member of the 'scientific community' disagreed with another
> > member of the 'scientific community'...they were removed!
> >
> > Too much power.
> >
> > I called it...God Mode.
>
> Using NewsAgent in God Mode was great! Except...if you didn't know how to use it
> properly you could make a mistake and remove *EVERYONE'S* posts by accident.
>
> Everyone just completely disappeared!
>
> Oops. I made a booboo.
>
> Like that Twilight Zone episode where everyone disappears by a click of a watch.
>
> Where is everybody? MAJOR KILLFILE!
>
> So, which is worse?
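None of the posters actually share their scripts, but the extraction step Ross Finlayson describes above (pull the message-ID and the posting-account out of each spam's headers, then report them) comes down to a few lines. Here is a rough sketch in Python, not anyone's real script: it assumes raw headers in the shape visible later in this thread (a `Message-ID:` header line and an `Injection-Info` trace field carrying `posting-account="..."`), and the function name is illustrative.

```python
# Hedged sketch of the extraction step discussed above: given a raw
# header block, pull out the Message-ID and the posting-account, and
# form the server-independent "news:" URI (RFC 5538) for the message.
import re

def extract_spam_identifiers(raw_headers: str):
    """Return (message_id, posting_account, news_url) from raw headers."""
    msgid = None
    m = re.search(r'^Message-ID:\s*<([^>]+)>', raw_headers,
                  re.MULTILINE | re.IGNORECASE)
    if m:
        msgid = m.group(1)
    # posting-account appears in the Injection-Info trace field (RFC 5536)
    a = re.search(r'posting-account="([^"]+)"', raw_headers)
    account = a.group(1) if a else None
    # the "news:" URI drops the angle brackets around the message-id
    news_url = 'news:' + msgid if msgid else None
    return msgid, account, news_url
```

The `news:` URI at the end is exactly the sort of standards-based perma-link the thread wishes were surfaced more widely: it names the message without naming any particular server.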


Re: How I deal with the enormous amount of spam

<c85b21ae5061413e756676cede9f7c78@www.novabbs.com>


https://news.novabbs.org/tech/article-flat.php?id=130552&group=sci.physics.relativity#130552

Path: i2pn2.org!.POSTED!not-for-mail
From: tomyee3@gmail.com (ProkaryoticCaspaseHomolog)
Newsgroups: sci.physics.relativity
Subject: Re: How I deal with the enormous amount of spam
Date: Tue, 6 Feb 2024 20:03:36 +0000
Organization: novaBBS
Message-ID: <c85b21ae5061413e756676cede9f7c78@www.novabbs.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <c9001c52-cd6d-41fc-a857-a30d9cfc845en@googlegroups.com> <0e5aabd6fba0141a7a0e7b82761055fa@www.novabbs.com> <1qocs71.1duewnn9ts4swN%nospam@de-ster.demon.nl>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: i2pn2.org;
logging-data="1923601"; mail-complaints-to="usenet@i2pn2.org";
posting-account="t+lO0yBNO1zGxasPvGSZV1BRu71QKx+JE37DnW+83jQ";
User-Agent: Rocksolid Light
X-Rslight-Posting-User: c1a997029c70f718720f72156b7d7f56416caf7c
X-Spam-Checker-Version: SpamAssassin 4.0.0
X-Rslight-Site: $2y$10$tiVpNBHhbmO6FM9TU63.NOQ3eE99m1rof3qn/KUe8KRiSqrzvtwli
 by: ProkaryoticCaspaseHo - Tue, 6 Feb 2024 20:03 UTC

J. J. Lodder wrote:

> ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:

>> Richard Hertz wrote:
>>
>> > Use this website. It's spam free, and you can get a free account there:
>>
>> > https://www.novabbs.com/tech/thread.php?group=sci.physics.relativity
>>
>> Thanks!

> Much better to get a real newsserver instead,

What newsserver would you recommend for an iPhone?

Google Groups worked pretty much the same on desktop
as on iPhone or iPad, likewise novabbs. I can dictate
extended posts with very few errors. Everybody jokes
about how stupid voice recognition is, but Apple does
a pretty good job once you learn its idiosyncrasies.

Re: Meta: Re: How I deal with the enormous amount of spam

<rPecnb34-r-14lv4nZ2dnZfqn_GdnZ2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130570&group=sci.physics.relativity#130570

Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!news.quux.org!1.us.feeder.erje.net!feeder.erje.net!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!feeder.usenetexpress.com!tr3.iad1.usenetexpress.com!69.80.99.22.MISMATCH!Xl.tags.giganews.com!local-2.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Fri, 09 Feb 2024 19:38:15 +0000
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
From: ross.a.finlayson@gmail.com (Ross Finlayson)
Date: Fri, 9 Feb 2024 11:38:34 -0800
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
Content-Type: text/plain; charset=windows-1252; format=flowed
Content-Transfer-Encoding: 8bit
Message-ID: <rPecnb34-r-14lv4nZ2dnZfqn_GdnZ2d@giganews.com>
Lines: 413
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-YLP6z6gjZVdyXA2IXTQ4LihJhUwhIh33m1Bx7HyYzyBj6h+GC9w8qwxPjy744t29GWvdoZdM0DGqzVc!GWVOhutALheSPI7xoDLsk9ZFKpkavZtcfEmTV0NDx9nctaTDZlwXBmYLJcQbNgrFgxRQvA/3UDmB!iA==
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Ross Finlayson - Fri, 9 Feb 2024 19:38 UTC

On 02/04/2024 06:28 PM, Ross Finlayson wrote:
> On 02/04/2024 12:53 PM, The Starmaker wrote:
>> The Starmaker wrote:
>>>
>>> Ross Finlayson wrote:
>>>>
>>>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
>>>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
>>>>>> The Starmaker wrote:
>>>>>>>
>>>>>>> Ross Finlayson wrote:
>>>>>>>>
>>>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
>>>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
>>>>>>>>>> Tom Roberts wrote:
>>>>>>>>>>
>>>>>>>>>>> I use Thunderbird to read Usenet. Recently
>>>>>>>>>>> sci.physics.relativity
>>>>>>>>>>> has
>>>>>>>>>>> been getting hundreds of spam posts each day, completely
>>>>>>>>>>> overwhelming
>>>>>>>>>>> legitimate content. These spam posts share the property that
>>>>>>>>>>> they
>>>>>>>>>>> are
>>>>>>>>>>> written in a non-latin script.
>>>>>>>>>>
>>>>>>>>>>> Thunderbird implements message filters that can mark a message
>>>>>>>>>>> Read. So
>>>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
>>>>>>>>>>> messages
>>>>>>>>>>> Read. Then when reading the newsgroups, I simply display only
>>>>>>>>>>> unread
>>>>>>>>>>> messages. The key to making this work is to craft the filter so
>>>>>>>>>>> it marks
>>>>>>>>>>> messages in which the Subject matches any of a dozen characters
>>>>>>>>>>> picked
>>>>>>>>>>> from some spam messages.
>>>>>>>>>>
>>>>>>>>>>> This doesn't completely eliminate the spam, but it is now
>>>>>>>>>>> only a few
>>>>>>>>>>> messages per day.
>>>>>>>>>>
>>>>>>>>>>> Tom Roberts
>>>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
>>>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
>>>>>>>>>> skills and is not at all intuitive. If anyone can point to an
>>>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.
>>>>>>>>>>
>>>>>>>>>> \Paul Alsing
>>>>>>>>>
>>>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
>>>>>>>>>
>>>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
>>>>>>>>>
>>>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
>>>>>>>>> according to Google,
>>>>>>>>> Google will break its compeerage to Usenet, and furthermore make
>>>>>>>>> read-only
>>>>>>>>> the archives, what it has, what until then, will be as it was.
>>>>>>>>>
>>>>>>>>> Over on sci.math I've had the idea for a while of making some
>>>>>>>>> brief
>>>>>>>>> and
>>>>>>>>> special purpose Usenet compeers, for only some few groups, or, you
>>>>>>>>> know, the _belles lettres_ of the text hierarchy.
>>>>>>>>>
>>>>>>>>> "Meta: a usenet server just for sci.math"
>>>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>>>>>
>>>>>>>>> So, there you can read the outlook of this kind of thing, then
>>>>>>>>> while sort
>>>>>>>>> of simple as the protocol is simple and its implementations
>>>>>>>>> widespread,
>>>>>>>>> how to deal with the "signal and noise" of "exposed messaging
>>>>>>>>> destinations
>>>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
>>>>>>>>> "NOOBNB protocol",
>>>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
>>>>>>>>> also for
>>>>>>>>> email or messaging destinations, sort of designed with the
>>>>>>>>> expectation that
>>>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
>>>>>>>>> it in simple terms.
>>>>>>>>>
>>>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw
>>>>>>>>> triple-feed
>>>>>>>>>
>>>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
>>>>>>>>> transport layer.)
>>>>>>>>>
>>>>>>>>> Also it would be real great if at least there was surfaced to the
>>>>>>>>> Internet a
>>>>>>>>> read-only view of any message by its message ID, a "URL", or as
>>>>>>>>> for
>>>>>>>>> a "URI",
>>>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol,
>>>>>>>>> namespace.
>>>>>>>>>
>>>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>>>>>
>>>>>>>>> I wonder that there's a reliable sort of long-term project that
>>>>>>>>> surfaces
>>>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
>>>>>>>>> protocol.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
>>>>>>>>>
>>>>>>>>
>>>>>>>> One fellow reached me via e-mail and he said, hey, the Googler
>>>>>>>> spam is
>>>>>>>> outrageous, can we do anything about it? Would you write a
>>>>>>>> script to
>>>>>>>> funnel all their message-ID's into the abuse reporting? And I was
>>>>>>>> like,
>>>>>>>> you know, about 2008 I did just that, there was a big spam flood,
>>>>>>>> and I wrote a little script to find them and extract their
>>>>>>>> posting-account,
>>>>>>>> and the message-ID, and a little script to post to the
>>>>>>>> posting-host,
>>>>>>>> each one of the wicked spams.
>>>>>>>>
>>>>>>>> At the time that seemed to help, they sort of dried up, here
>>>>>>>> there's
>>>>>>>> that basically they're not following the charter, but, it's the
>>>>>>>> posting-account
>>>>>>>> in the message headers that indicate the origin of the post, not
>>>>>>>> the
>>>>>>>> email address. So, I wonder, given that I can extract the
>>>>>>>> posting-accounts
>>>>>>>> of all the spams, how to match the posting-account to then
>>>>>>>> determine
>>>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
>>>>>>>> them up.
>>>>>>>
>>>>>>> Let me see your little script. Post it here.
>>>>>>
>>>>>> Here is a list I currently have:
>>>>>>
>>>>>> salz.txt
>>>>>> usenet.death.penalty.gz
>>>>>> purify.txt
>>>>>> NewsAgent110-MS.exe
>>>>>> HipCrime's NewsAgent (v1_11).htm
>>>>>> NewsAgent111-BE.zip
>>>>>> SuperCede.exe
>>>>>> NewsAgent023.exe
>>>>>> NewsAgent025.exe
>>>>>> ActiveAgent.java
>>>>>> HipCrime's NewsAgent (v1_02)_files
>>>>>> NewsCancel.java (source code)
>>>>>>
>>>>>> (plus updated python versions)
>>>>>>
>>>>>>
>>>>>>
>>>>>> (Maybe your script is in there somewhere?)
>>>>>>
>>>>>>
>>>>>>
>>>>>> Show me what you got. walk the walk.
>>>>>>
>>>>>
>>>>>
>>>>> I try to avoid sketchy things like hiring a criminal botnet,
>>>>> there's the impression that that's looking at 1000's of counts
>>>>> of computer intrusion.
>>>>>
>>>>> With those being something about $50K and 10-25 apiece,
>>>>> there's a pretty significant deterrence to such activities.
>>>>>
>>>>> I've never much cared for "OAuth", giving away the
>>>>> keys-to-the-kingdom and all, here it looks like either
>>>>> a) a bunch of duped browsers clicked away their identities,
>>>>> or b) it's really that Google and Facebook are more than
>>>>> half full of fake identities for the sole purpose of being fake.
>>>>>
>>>>> (How's your new deal going?
>>>>> Great, we got a million users.
>>>>> Why are my conversions around zero?
>>>>> Your ad must not speak to them.
>>>>> Would it help if I spiced it up?
>>>>> Don't backtalk me, I'll put you on a list!)
>>>>>
>>>>> So, it seems mostly a sort of "spam-walling the Internet",
>>>>> where it was like "we're going to reinvent the Internet",
>>>>> "no, you aren't", "all right then we'll ruin this one".
>>>>>
>>>>> As far as search goes, there's something to be said
>>>>> for a new sort of approach to search, given that
>>>>> Google, Bing, Duck, ..., _all make the same results_. It's
>>>>> just so highly unlikely that they'd _all make the same
>>>>> results_, you figure they're just one.
>>>>>
>>>>> So, the idea, for somebody like me who's mostly interested
>>>>> in writing on the Internet, is that lots of that is of the sort
>>>>> of "works" vis-a-vis, the "feuilleton" or what you might
>>>>> call it, ephemeral junk, that I just learned about in
>>>>> Hermann Hesse's "The Glass Bead Game".
>>>>>
>>>>> Then, there's an idea, that basically to surface high-quality
>>>>> works to a search, is that there's what's called metadata,
>>>>> for content like HTML, with regards to Dublin Core and
>>>>> RDF and so on, about a sort of making for fungible collections
>>>>> of works, what results searchable fragments of various
>>>>> larger bodies of works, according to their robots.txt and
>>>>> their summaries and with regards to crawling the content
>>>>> and so on, then to make federated common search corpora,
>>>>> these kinds of things.
>>>>>
>>>>>
>>>>>
>>>>
>>>> It's like "why are they building that new data center",
>>>> and it's like "well it's like Artificial Intelligence, inside
>>>> that data center is a million virts and each one has a
>>>> browser emulator and a phone app sandbox and a
>>>> little notecard that prompts its name, basically it's
>>>> a million-headed hydra called a sims-bot-farm,
>>>> that for pennies on the dollar is an instant audience."
>>>>
>>>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
>>>>
>>>> Usenet traffic had been up recently, ....
>>>>
>>>> I think they used to call it "astro-turfing".
>>>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
>>>
>>> I have NewsAgent111-MS.exe
>>>
>>> I seem to be missing version 2.0
>>>
>>> Do you have the 2.0 version?
>>>
>>> I'll trade you.
>>>
>>> I'll give you my python version with (GUI)!!!! (Tkinter)
>>>
>>> let's trade!
>>>
>>> don't bogart
>>
>> I seem to be missing this version:
>>
>> https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
>>
>> Do you have it? you must have!
>>
>>
>>
>>
>>
>
> Nope, I just wrote a little script to connect to NNTP
> with a Yes/No button on the subject, tapped through
> those, and a little script to send an HTTP request to
> the publicly-facing return-to-sender in-box, for each.
>
> Here's all the sources you need: IETF RFC editor.
> Look for "NNTP". How to advise Google of this is
> that each domain on the Internet is supposed to
> have an "abuse@domain" email inbox, though there's
> probably also a web request interface, as with regards
> to publicly facing services, and expected to be
> good actors on the network.
>
> Anyways if you read through "Meta: a usenet server
> just for sci.math", what I have in mind is a sort
> of author's and writer's oriented installation,
> basically making for vanity printouts and generating
> hypertext collections of contents and authors and
> subjects and these kinds of things, basically for
> on the order of "find all the postings of Archimedes
> Plutonium, and, the threads they are in, and,
> make a hypertext page of all that, a linear timeline,
> and also thread it out as a linear sequence".
>
> I.e. people who actually post to Usenet are sometimes
> having written interesting things, and, thus having
> it so that it would be simplified to generate message-ID
> listings and their corresponding standard URL's in the
> standard IETF "news" URL protocol, and to point that
> at a given news server or like XLink, is for treating
> Usenet and its archives like a living museum of all these
> different authors' posts and their interactions together.
>
> I.e., here it's "belles lettres" and "fair use",
> not just "belles" and "use".
>
>
> It seemed nice of Google Groups to front this for a long time,
> now they're quitting.
>
> I imagine Internet Relay Chat's still insane, though.
>
> Anyways I stay away from any warez and am proud that
> since about Y2K at least I've never bootlegged anything,
> and never uploaded a bootleg. Don't want to give old Shylock
> excuses, and besides, I wrote software for a living.
>
>
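The Thunderbird filter Tom Roberts describes in the quotes above keys on a dozen characters hand-picked from spam subjects. The same idea can be stated generically: flag a subject as probable spam when most of its letters fall outside the Latin script. A hedged sketch of that heuristic in Python (not his actual filter, which lives in Thunderbird's filter UI; the threshold is an assumption):

```python
# Approximation of the "non-latin script" spam filter described in the
# thread: count alphabetic characters whose Unicode name is not LATIN,
# and flag the subject when they dominate.
import unicodedata

def looks_like_nonlatin_spam(subject: str, threshold: float = 0.5) -> bool:
    """True if more than `threshold` of the letters are non-Latin."""
    letters = [c for c in subject if c.isalpha()]
    if not letters:
        return False
    nonlatin = sum(1 for c in letters
                   if not unicodedata.name(c, '').startswith('LATIN'))
    return nonlatin / len(letters) > threshold
```

Using a ratio rather than an exact character list means the filter does not need retuning each time the spammers rotate to a different script, though like the filter it imitates it only cuts the flood down rather than eliminating it.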


Re: Meta: Re: How I deal with the enormous amount of spam

<65C92FFE.44EA@ix.netcom.com>


https://news.novabbs.org/tech/article-flat.php?id=130608&group=sci.physics.relativity#130608

Path: i2pn2.org!i2pn.org!newsfeed.endofthelinebbs.com!paganini.bofh.team!not-for-mail
From: starmaker@ix.netcom.com (The Starmaker)
Newsgroups: sci.physics.relativity,sci.physics
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Sun, 11 Feb 2024 12:37:18 -0800
Organization: To protect and to server
Message-ID: <65C92FFE.44EA@ix.netcom.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <afe109ee689f7051d5909fe8eca54113@www.novabbs.com> <b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com> <Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com> <65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com> <0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com> <65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com> <65C079E5.4E84@ix.netcom.com> <65C27AC4.15B7@ix.netcom.com> <65C2822A.16A7@ix.netcom.com> <65C28717.3F60@ix.netcom.com>
Reply-To: starmaker@ix.netcom.com
Mime-Version: 1.0
Content-Type: text/plain; charset=iso-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: paganini.bofh.team; logging-data="3684321"; posting-host="nLYg9UBeoMWa070gP9wQcw.user.paganini.bofh.team"; mail-complaints-to="usenet@bofh.team"; posting-account="9dIQLXBM7WM9KzA+yjdR4A";
Cancel-Lock: sha256:sODMkGu18m547JxGTDHExPGxWtH1rllMobzTDxQtVjk=
X-Antivirus-Status: Clean
X-Notice: Filtered by postfilter v. 0.9.3
X-Antivirus: Avast (VPS 240211-4, 02/11/2024), Outbound message
X-Mailer: Mozilla 3.04Gold (WinNT; U)
 by: The Starmaker - Sun, 11 Feb 2024 20:37 UTC

The Starmaker wrote:
>
> The Starmaker wrote:
> >
> > The Starmaker wrote:
> > >
> > > The Starmaker wrote:
> > > >
> > > > Ross Finlayson wrote:
> > > > >
> > > > > On 02/04/2024 12:53 PM, The Starmaker wrote:
> > > > > > The Starmaker wrote:
> > > > > >>
> > > > > >> Ross Finlayson wrote:
> > > > > >>>
> > > > > >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> > > > > >>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
> > > > > >>>>> The Starmaker wrote:
> > > > > >>>>>>
> > > > > >>>>>> Ross Finlayson wrote:
> > > > > >>>>>>>
> > > > > >>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
> > > > > >>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
> > > > > >>>>>>>>> Tom Roberts wrote:
> > > > > >>>>>>>>>
> > > > > >>>>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
> > > > > >>>>>>>>>> has
> > > > > >>>>>>>>>> been getting hundreds of spam posts each day, completely
> > > > > >>>>>>>>>> overwhelming
> > > > > >>>>>>>>>> legitimate content. These spam posts share the property that they
> > > > > >>>>>>>>>> are
> > > > > >>>>>>>>>> written in a non-latin script.
> > > > > >>>>>>>>>
> > > > > >>>>>>>>>> Thunderbird implements message filters that can mark a message
> > > > > >>>>>>>>>> Read. So
> > > > > >>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
> > > > > >>>>>>>>>> messages
> > > > > >>>>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
> > > > > >>>>>>>>>> messages. The key to making this work is to craft the filter so
> > > > > >>>>>>>>>> it marks
> > > > > >>>>>>>>>> messages in which the Subject matches any of a dozen characters
> > > > > >>>>>>>>>> picked
> > > > > >>>>>>>>>> from some spam messages.
> > > > > >>>>>>>>>
> > > > > >>>>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
> > > > > >>>>>>>>>> messages per day.
> > > > > >>>>>>>>>
> > > > > >>>>>>>>>> Tom Roberts
> > > > > >>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
> > > > > >>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
> > > > > >>>>>>>>> skills and is not at all intuitive. If anyone can point to an
> > > > > >>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.
> > > > > >>>>>>>>>
> > > > > >>>>>>>>> \Paul Alsing
> > > > > >>>>>>>>
> > > > > >>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
> > > > > >>>>>>>>
> > > > > >>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
> > > > > >>>>>>>>
> > > > > >>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
> > > > > >>>>>>>> according to Google,
> > > > > >>>>>>>> Google will break its compeerage to Usenet, and furthermore make
> > > > > >>>>>>>> read-only
> > > > > >>>>>>>> the archives, what it has, what until then, will be as it was.
> > > > > >>>>>>>>
> > > > > >>>>>>>> Over on sci.math I've had the idea for a while of making some brief
> > > > > >>>>>>>> and
> > > > > >>>>>>>> special purpose Usenet compeers, for only some few groups, or, you
> > > > > >>>>>>>> know, the _belles lettres_ of the text hierarchy.
> > > > > >>>>>>>>
> > > > > >>>>>>>> "Meta: a usenet server just for sci.math"
> > > > > >>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
> > > > > >>>>>>>>
> > > > > >>>>>>>> So, there you can read the outlook of this kind of thing, then
> > > > > >>>>>>>> while sort
> > > > > >>>>>>>> of simple as the protocol is simple and its implementations
> > > > > >>>>>>>> widespread,
> > > > > >>>>>>>> how to deal with the "signal and noise" of "exposed messaging
> > > > > >>>>>>>> destinations
> > > > > >>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
> > > > > >>>>>>>> "NOOBNB protocol",
> > > > > >>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
> > > > > >>>>>>>> also for
> > > > > >>>>>>>> email or messaging destinations, sort of designed with the
> > > > > >>>>>>>> expectation that
> > > > > >>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
> > > > > >>>>>>>> it in simple terms.
> > > > > >>>>>>>>
> > > > > >>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
> > > > > >>>>>>>>
> > > > > >>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
> > > > > >>>>>>>> transport layer.)
> > > > > >>>>>>>>
> > > > > >>>>>>>> Also it would be real great if at least there was surfaced to the
> > > > > >>>>>>>> Internet a
> > > > > >>>>>>>> read-only view of any message by its message ID, a "URL", or as for
> > > > > >>>>>>>> a "URI",
> > > > > >>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
> > > > > >>>>>>>>
> > > > > >>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
> > > > > >>>>>>>>
> > > > > >>>>>>>> I wonder that there's a reliable sort of long-term project that
> > > > > >>>>>>>> surfaces
> > > > > >>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
> > > > > >>>>>>>> protocol.
> > > > > >>>>>>>>
> > > > > >>>>>>>>
> > > > > >>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
> > > > > >>>>>>>>
> > > > > >>>>>>>>
> > > > > >>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
> > > > > >>>>>>>>
> > > > > >>>>>>>
> > > > > >>>>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
> > > > > >>>>>>> outrageous, can we do anything about it? Would you write a script to
> > > > > >>>>>>> funnel all their message-ID's into the abuse reporting? And I was
> > > > > >>>>>>> like,
> > > > > >>>>>>> you know, about 2008 I did just that, there was a big spam flood,
> > > > > >>>>>>> and I wrote a little script to find them and extract their
> > > > > >>>>>>> posting-account,
> > > > > >>>>>>> and the message-ID, and a little script to post to the posting-host,
> > > > > >>>>>>> each one of the wicked spams.
> > > > > >>>>>>>
> > > > > >>>>>>> At the time that seemed to help, they sort of dried up, here there's
> > > > > >>>>>>> that basically they're not following the charter, but, it's the
> > > > > >>>>>>> posting-account
> > > > > >>>>>>> in the message headers that indicate the origin of the post, not the
> > > > > >>>>>>> email address. So, I wonder, given that I can extract the
> > > > > >>>>>>> posting-accounts
> > > > > >>>>>>> of all the spams, how to match the posting-account to then determine
> > > > > >>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
> > > > > >>>>>>> them up.
> > > > > >>>>>>
> > > > > >>>>>> Let me see your little script. Post it here.
> > > > > >>>>>
> > > > > >>>>> Here is a list I currently have:
> > > > > >>>>>
> > > > > >>>>> salz.txt
> > > > > >>>>> usenet.death.penalty.gz
> > > > > >>>>> purify.txt
> > > > > >>>>> NewsAgent110-MS.exe
> > > > > >>>>> HipCrime's NewsAgent (v1_11).htm
> > > > > >>>>> NewsAgent111-BE.zip
> > > > > >>>>> SuperCede.exe
> > > > > >>>>> NewsAgent023.exe
> > > > > >>>>> NewsAgent025.exe
> > > > > >>>>> ActiveAgent.java
> > > > > >>>>> HipCrime's NewsAgent (v1_02)_files
> > > > > >>>>> NewsCancel.java (source code)
> > > > > >>>>>
> > > > > >>>>> (plus updated python versions)
> > > > > >>>>>
> > > > > >>>>>
> > > > > >>>>>
> > > > > >>>>> (Maybe your script is in there somewhere?)
> > > > > >>>>>
> > > > > >>>>>
> > > > > >>>>>
> > > > > >>>>> Show me what you got. walk the walk.
> > > > > >>>>>
> > > > > >>>>
> > > > > >>>>
> > > > > >>>> I try to avoid sketchy things like hiring a criminal botnet,
> > > > > >>>> there's the impression that that's looking at 1000's of counts
> > > > > >>>> of computer intrusion.
> > > > > >>>>
> > > > > >>>> With those being something about $50K and 10-25 apiece,
> > > > > >>>> there's a pretty significant deterrence to such activities.
> > > > > >>>>
> > > > > >>>> I've never much cared for "OAuth", giving away the
> > > > > >>>> keys-to-the-kingdom and all, here it looks like either
> > > > > >>>> a) a bunch of duped browsers clicked away their identities,
> > > > > >>>> or b) it's really that Google and Facebook are more than
> > > > > >>>> half full of fake identities for the sole purpose of being fake.
> > > > > >>>>
> > > > > >>>> (How's your new deal going?
> > > > > >>>> Great, we got a million users.
> > > > > >>>> Why are my conversions around zero?
> > > > > >>>> Your ad must not speak to them.
> > > > > >>>> Would it help if I spiced it up?
> > > > > >>>> Don't backtalk me, I'll put you on a list!)
> > > > > >>>>
> > > > > >>>> So, it seems mostly a sort of "spam-walling the Internet",
> > > > > >>>> where it was like "we're going to reinvent the Internet",
> > > > > >>>> "no, you aren't", "all right then we'll ruin this one".
> > > > > >>>>
> > > > > >>>> As far as search goes, there's something to be said
> > > > > >>>> for a new sort of approach to search, given that
> > > > > >>>> Google, Bing, Duck, ..., _all make the same results_. It's
> > > > > >>>> just so highly unlikely that they'd _all make the same
> > > > > >>>> results_, you figure they're just one.
> > > > > >>>>
> > > > > >>>> So, the idea, for somebody like me who's mostly interested
> > > > > >>>> in writing on the Internet, is that lots of that is of the sort
> > > > > >>>> of "works" vis-a-vis, the "feuilleton" or what you might
> > > > > >>>> call it, ephemeral junk, that I just learned about in
> > > > > >>>> Herman Hesse's "The Glass Bead Game".
> > > > > >>>>
> > > > > >>>> Then, there's an idea, that basically to surface high-quality
> > > > > >>>> works to a search, is that there's what's called metadata,
> > > > > >>>> for content like HTML, with regards to Dublin Core and
> > > > > >>>> RDF and so on, about a sort of making for fungible collections
> > > > > >>>> of works, what results searchable fragments of various
> > > > > >>>> larger bodies of works, according to their robots.txt and
> > > > > >>>> their summaries and with regards to crawling the content
> > > > > >>>> and so on, then to make federated common search corpi,
> > > > > >>>> these kinds of things.
> > > > > >>>>
> > > > > >>>>
> > > > > >>>>
> > > > > >>>
> > > > > >>> It's like "why are they building that new data center",
> > > > > >>> and it's like "well it's like Artificial Intelligence, inside
> > > > > >>> that data center is a million virts and each one has a
> > > > > >>> browser emulator and a phone app sandbox and a
> > > > > >>> little notecard that prompts its name, basically it's
> > > > > >>> a million-headed hydra called a sims-bot-farm,
> > > > > >>> that for pennies on the dollar is an instant audience."
> > > > > >>>
> > > > > >>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
> > > > > >>>
> > > > > >>> Usenet traffic had been up recently, ....
> > > > > >>>
> > > > > >>> I think they used to call it "astro-turfing".
> > > > > >>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
> > > > > >>
> > > > > >> I have NewsAgent111-MS.exe
> > > > > >>
> > > > > >> I seem to be missing version 2.0
> > > > > >>
> > > > > >> Do you have the 2.0 version?
> > > > > >>
> > > > > >> I'll trade you.
> > > > > >>
> > > > > >> I'll give you my python version with (GUI)!!!! (Tkinter)
> > > > > >>
> > > > > >> let's trade!
> > > > > >>
> > > > > >> don't bogart
> > > > > >
> > > > > > I seem to be missing this version:
> > > > > >
> > > > > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > > > > >
> > > > > > Do you have it? you must have!
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > >
> > > > > Nope, I just wrote a little script to connect to NNTP
> > > > > with a Yes/No button on the subject, tapped through
> > > > > those, and a little script to send an HTTP request to
> > > > > the publicly-facing return-to-sender in-box, for each.
> > > > >
> > > > > Here's all the sources you need: IETF RFC editor.
> > > > > Look for "NNTP". How to advise Google of this is
> > > > > that each domain on the Internet is supposed to
> > > > > have an "abuse@domain" email inbox, though there's
> > > > > probably also a web request interface, as with regards
> > > > > to publicly facing services, and expected to be
> > > > > good actors on the network.
> > > > >
> > > > > Anyways if you read through "Meta: a usenet server
> > > > > just for sci.math", what I have in mind is a sort
> > > > > of author's and writer's oriented installation,
> > > > > basically making for vanity printouts and generating
> > > > > hypertext collections of contents and authors and
> > > > > subjects and these kinds of things, basically for
> > > > > on the order of "find all the postings of Archimedes
> > > > > Plutonium, and, the threads they are in, and,
> > > > > make a hypertext page of all that, a linear timeline,
> > > > > and also thread it out as a linear sequence".
> > > > >
> > > > > I.e. people who actually post to Usenet are sometimes
> > > > > having written interesting things, and, thus having
> > > > > it so that it would be simplified to generate message-ID
> > > > > listings and their corresponding standard URL's in the
> > > > > standard IETF "news" URL protocol, and to point that
> > > > > at a given news server or like XLink, is for treating
> > > > > Usenet and its archives like a living museum of all these
> > > > > different authors posts and their interactions together.
> > > > >
> > > > > I.e., here it's "belles lettres" and "fair use",
> > > > > not just "belles" and "use".
> > > > >
> > > > > It seemed nice of Google Groups to front this for a long time,
> > > > > now they're quitting.
> > > > >
> > > > > I imagine Internet Relay Chat's still insane, though.
> > > > >
> > > > > Anyways I stay away from any warez and am proud that
> > > > > since about Y2K at least I've never bootlegged anything,
> > > > > and never uploaded a bootleg. Don't want to give old Shylock
> > > > > excuses, and besides, I wrote software for a living.
> > > >
> > > > Anyways, I don't know who was talking about "any warez" or "bootlegs",
> > > > since I was referring to programs and scripts that read:
> > > >
> > > > "FREE, which means you can copy it and redistribute"
> > > > "Similarly, the source is provided as reference and can be redistributed
> > > > freely as well. "
> > > >
> > > > HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and
> > > > redistribute it at will, as long as you give credit to the original
> > > > author. Similarly, the source is provided as reference and can be
> > > > redistributed freely as well.
> > > >
> > > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > > >
> > > > You seem to be too much 'in your head', on a high horse...
> > > >
> > > > "FREE, which means you can copy it and redistribute"
> > > > "Similarly, the source is provided as reference and can be redistributed
> > > > freely as well. "
> > > >
> > > > So, show me that wicked script you wrote : "funnel all their
> > > > message-ID's"
> > > > by people you call spammers who 'funnel' their products and services
> > > > through Usenet newsgroups.
> > > >
> > > > You are sooooo wicked.
> > > >
> > > > and a nanofossils
> > >
> > > Anyways, there is only one person that knows what 'nanofossils' means,
> > > and that is Ross Finlayson.
> > >
> > > I just realized that Ross Finlayson doesn't know of NEWSAGENT.
> > >
> > > Anyways, ...
> > >
> > > "Anyways"???? Who talks like that?
> > >
> > > Anyways..
> > >
> > > the problem of the 'flooding' is not the spammers, it's the 'scientific
> > > community'. They caused the problem.
> > > They removed the feature that NewsAgent used to get rid of ALL flooding
> > > and spammers. But, but, the
> > > members of the scientific community could not trust their own members to
> > > use it against them.
> > >
> > > If one member of the 'scientific community' disagreed with another
> > > member of the 'scientific community'...they were removed!
> > >
> > > Too much power.
> > >
> > > I called it...God Mode.
> >
> > Using NewsAgent in God Mode was great! Except... if you didn't know how to use it
> > properly you could make a mistake and remove *EVERYONE'S* posts by accident.
> >
> > Everyone just completely disappeared!
> >
> > Oops. I made a booboo.
> >
> > Like that Twilight Zone episode where everyone disappears by a click of a watch.
> >
> > Where is everybody? MAJOR KILLFILE!
> >
> > So, which is worse?
>
> You know what GOD MODE Killfile is? That means you not only kill-filed everyone, but you also sort of
> turned everyone else's killfile on. Nobody sees nobody.
>
> It's like the Atomic Bomb of Usenet!
>
> (only yous guys make bombs like that)
> (then yous get angry when everyone has the atomic bomb)
>
> typical.
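Earlier in the thread, a read-only permalink for any message by its message-ID in the IETF "news" scheme is proposed. As a minimal sketch (following the RFC 5538 "news" URI form; percent-encoding of reserved characters is omitted here, and the helper name is mine, not from the thread):

```python
def news_url(message_id: str) -> str:
    """Build an IETF "news" URI (RFC 5538) from a Usenet Message-ID.

    A Message-ID header value is wrapped in angle brackets;
    the news: URI carries the bare identifier.
    """
    mid = message_id.strip()
    if mid.startswith("<") and mid.endswith(">"):
        mid = mid[1:-1]
    return "news:" + mid

# Using a message-ID that appears later in this thread:
print(news_url("<uqe61d$26n3$2@solani.org>"))  # news:uqe61d$26n3$2@solani.org
```

Any news server carrying the group could then resolve such a link, which is the "reliable perma-link" idea above.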


Re: Meta: Re: How I deal with the enormous amount of spam

<uqe61d$26n3$2@solani.org>


https://news.novabbs.org/tech/article-flat.php?id=130623&group=sci.physics.relativity#130623

Newsgroups: sci.physics.relativity sci.physics
Path: i2pn2.org!i2pn.org!eternal-september.org!feeder3.eternal-september.org!news.mixmin.net!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: Physfitfreak@gmail.com (Physfitfreak)
Newsgroups: sci.physics.relativity,sci.physics
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Date: Mon, 12 Feb 2024 16:27:57 -0600
Message-ID: <uqe61d$26n3$2@solani.org>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<afe109ee689f7051d5909fe8eca54113@www.novabbs.com>
<b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com>
<Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com>
<65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>
<0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com>
<65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
<65C079E5.4E84@ix.netcom.com> <65C27AC4.15B7@ix.netcom.com>
<65C2822A.16A7@ix.netcom.com> <65C28717.3F60@ix.netcom.com>
<65C92FFE.44EA@ix.netcom.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Mon, 12 Feb 2024 22:27:57 -0000 (UTC)
Injection-Info: solani.org;
logging-data="72419"; mail-complaints-to="abuse@news.solani.org"
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:5fJh7FiZdbpj6jeJz1uBDw5G6Tg=
Content-Language: en-US
In-Reply-To: <65C92FFE.44EA@ix.netcom.com>
X-User-ID: eJwNycEBwCAIA8CVsJiA4wDC/iPY+x6Ui2Wb4MZgKgyJLCJD249G5B6of7YO0dPlPj1iqRTmFdYfN1brgcgDbnEVyA==
X-Antivirus: Avast (VPS 240212-4, 2/12/2024), Outbound message
X-Antivirus-Status: Clean
 by: Physfitfreak - Mon, 12 Feb 2024 22:27 UTC

On 2/11/2024 2:37 PM, The Starmaker wrote:
> As I mentioned above,
> the problem of the 'flooding' is not the spammers,
> it's the 'scientific community'. They caused the problem.
> They removed the feature that was used to get rid of ALL flooding...
>
> Why? Because of the War of the Gods.

So a comic book is responsible for that?

--
This email has been checked for viruses by Avast antivirus software.
www.avast.com

Re: How I deal with the enormous amount of spam

<81466abc34fdac616962825dd1a1c78b@www.novabbs.com>


https://news.novabbs.org/tech/article-flat.php?id=130669&group=sci.physics.relativity#130669

Newsgroups: sci.physics.relativity
Path: i2pn2.org!.POSTED!not-for-mail
From: hitlong@yahoo.com (gharnagel)
Newsgroups: sci.physics.relativity
Subject: Re: How I deal with the enormous amount of spam
Date: Wed, 14 Feb 2024 21:18:14 +0000
Organization: novaBBS
Message-ID: <81466abc34fdac616962825dd1a1c78b@www.novabbs.com>
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com> <c9001c52-cd6d-41fc-a857-a30d9cfc845en@googlegroups.com> <0e5aabd6fba0141a7a0e7b82761055fa@www.novabbs.com> <1qocs71.1duewnn9ts4swN%nospam@de-ster.demon.nl>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Info: i2pn2.org;
logging-data="2825119"; mail-complaints-to="usenet@i2pn2.org";
posting-account="t+lO0yBNO1zGxasPvGSZV1BRu71QKx+JE37DnW+83jQ";
User-Agent: Rocksolid Light
X-Rslight-Site: $2y$10$Nb3SHrN13FE12vyL5UcGx.Ui6muxtrGCDJ27PGy7CQS4qz5ryKaSa
X-Rslight-Posting-User: 47dad9ee83da8658a9a980eb24d2d25075d9b155
X-Spam-Checker-Version: SpamAssassin 4.0.0
 by: gharnagel - Wed, 14 Feb 2024 21:18 UTC

> ProkaryoticCaspaseHomolog <tomyee3@gmail.com> wrote:
> >
> > Richard Hertz wrote:
> > >
> > > Use this website. It's spam free, and you can get a free account there:
> > >
> > > https://www.novabbs.com/tech/thread.php?group=sci.physics.relativity
> >
> > Thanks!

I'm using it, too, since Google increased their "security" which locks me out
but still allows all the spam!

> Much better to get a real newsserver instead,
>
> Jan

My question is, will one still be able to post to novabbs when google pulls
the plug?

Gary

Re: Meta: Re: How I deal with the enormous amount of spam

<e-ednSIXwP6QClL4nZ2dnZfqnPadnZ2d@giganews.com>


https://news.novabbs.org/tech/article-flat.php?id=130705&group=sci.physics.relativity#130705

Newsgroups: sci.physics.relativity
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!border-2.nntp.ord.giganews.com!nntp.giganews.com!Xl.tags.giganews.com!local-1.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Fri, 16 Feb 2024 17:10:37 +0000
Subject: Re: Meta: Re: How I deal with the enormous amount of spam
Newsgroups: sci.physics.relativity
References: <V_CcnSsLkaRCoSX4nZ2dnZfqlJ9j4p2d@giganews.com>
<afe109ee689f7051d5909fe8eca54113@www.novabbs.com>
<b67e3cc9-a1ac-428f-b225-389abf45d416n@googlegroups.com>
<Z_icnWyNR7xW5CP4nZ2dnZfqnPSdnZ2d@giganews.com> <65BEB38A.B7E@ix.netcom.com>
<65BEC238.5CF1@ix.netcom.com> <18mdnb4LQ8kNUiL4nZ2dnZfqn_SdnZ2d@giganews.com>
<0decnTlW944oSSL4nZ2dnZfqn_udnZ2d@giganews.com> <65BFF51D.3F9B@ix.netcom.com>
<65BFF947.2233@ix.netcom.com> <Oq6cneYGTPUt2l34nZ2dnZfqnPqdnZ2d@giganews.com>
<rPecnb34-r-14lv4nZ2dnZfqn_GdnZ2d@giganews.com>
From: ross.a.finlayson@gmail.com (Ross Finlayson)
Date: Fri, 16 Feb 2024 09:10:37 -0800
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101
Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <rPecnb34-r-14lv4nZ2dnZfqn_GdnZ2d@giganews.com>
Content-Type: text/plain; charset=windows-1252; format=flowed
Content-Transfer-Encoding: 8bit
Message-ID: <e-ednSIXwP6QClL4nZ2dnZfqnPadnZ2d@giganews.com>
Lines: 440
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-dbw5h/F0cBG1Cv0SjU3h7E4XSQ5WTfblDvFdgDSs5oGIo/zhO/7J85RCxDrA06cARUX5vSR0OSuityy!2Cf9Z0xZs7xkQA4QBh/kS6xF68SjDYp+ykjv2SOrjJipkfjWdz37ty9IdXtetTtHhD3yJnv170Do
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
 by: Ross Finlayson - Fri, 16 Feb 2024 17:10 UTC

On 02/09/2024 11:38 AM, Ross Finlayson wrote:
> On 02/04/2024 06:28 PM, Ross Finlayson wrote:
>> On 02/04/2024 12:53 PM, The Starmaker wrote:
>>> The Starmaker wrote:
>>>>
>>>> Ross Finlayson wrote:
>>>>>
>>>>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
>>>>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
>>>>>>> The Starmaker wrote:
>>>>>>>>
>>>>>>>> Ross Finlayson wrote:
>>>>>>>>>
>>>>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
>>>>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
>>>>>>>>>>> Tom Roberts wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I use Thunderbird to read Usenet. Recently
>>>>>>>>>>>> sci.physics.relativity
>>>>>>>>>>>> has
>>>>>>>>>>>> been getting hundreds of spam posts each day, completely
>>>>>>>>>>>> overwhelming
>>>>>>>>>>>> legitimate content. These spam posts share the property that
>>>>>>>>>>>> they
>>>>>>>>>>>> are
>>>>>>>>>>>> written in a non-latin script.
>>>>>>>>>>>
>>>>>>>>>>>> Thunderbird implements message filters that can mark a message
>>>>>>>>>>>> Read. So
>>>>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
>>>>>>>>>>>> messages
>>>>>>>>>>>> Read. Then when reading the newsgroups, I simply display only
>>>>>>>>>>>> unread
>>>>>>>>>>>> messages. The key to making this work is to craft the filter so
>>>>>>>>>>>> it marks
>>>>>>>>>>>> messages in which the Subject matches any of a dozen characters
>>>>>>>>>>>> picked
>>>>>>>>>>>> from some spam messages.
>>>>>>>>>>>
>>>>>>>>>>>> This doesn't completely eliminate the spam, but it is now
>>>>>>>>>>>> only a few
>>>>>>>>>>>> messages per day.
>>>>>>>>>>>
>>>>>>>>>>>> Tom Roberts
>>>>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
>>>>>>>>>>> but setting it up to read newsgroups is beyond my paltry
>>>>>>>>>>> computer
>>>>>>>>>>> skills and is not at all intuitive. If anyone can point to an
>>>>>>>>>>> idiot-proof tutorial for doing this It would be much
>>>>>>>>>>> appreciated.
>>>>>>>>>>>
>>>>>>>>>>> \Paul Alsing
>>>>>>>>>>
>>>>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
>>>>>>>>>>
>>>>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
>>>>>>>>>>
>>>>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
>>>>>>>>>> according to Google,
>>>>>>>>>> Google will break its compeerage to Usenet, and furthermore make
>>>>>>>>>> read-only
>>>>>>>>>> the archives, what it has, what until then, will be as it was.
>>>>>>>>>>
>>>>>>>>>> Over on sci.math I've had the idea for a while of making some
>>>>>>>>>> brief
>>>>>>>>>> and
>>>>>>>>>> special purpose Usenet compeers, for only some few groups, or,
>>>>>>>>>> you
>>>>>>>>>> know, the _belles lettres_ of the text hierarchy.
>>>>>>>>>>
>>>>>>>>>> "Meta: a usenet server just for sci.math"
>>>>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>>>>>>
>>>>>>>>>> So, there you can read the outlook of this kind of thing, then
>>>>>>>>>> while sort
>>>>>>>>>> of simple as the protocol is simple and its implementations
>>>>>>>>>> widespread,
>>>>>>>>>> how to deal with the "signal and noise" of "exposed messaging
>>>>>>>>>> destinations
>>>>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
>>>>>>>>>> "NOOBNB protocol",
>>>>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
>>>>>>>>>> also for
>>>>>>>>>> email or messaging destinations, sort of designed with the
>>>>>>>>>> expectation that
>>>>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
>>>>>>>>>> it in simple terms.
>>>>>>>>>>
>>>>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw
>>>>>>>>>> triple-feed
>>>>>>>>>>
>>>>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
>>>>>>>>>> transport layer.)
>>>>>>>>>>
>>>>>>>>>> Also it would be real great if at least there was surfaced to the
>>>>>>>>>> Internet a
>>>>>>>>>> read-only view of any message by its message ID, a "URL", or as
>>>>>>>>>> for
>>>>>>>>>> a "URI",
>>>>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol,
>>>>>>>>>> namespace.
>>>>>>>>>>
>>>>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
>>>>>>>>>>
>>>>>>>>>> I wonder that there's a reliable sort of long-term project that
>>>>>>>>>> surfaces
>>>>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
>>>>>>>>>> protocol.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> One fellow reached me via e-mail and he said, hey, the Googler
>>>>>>>>> spam is
>>>>>>>>> outrageous, can we do anything about it? Would you write a
>>>>>>>>> script to
>>>>>>>>> funnel all their message-ID's into the abuse reporting? And I was
>>>>>>>>> like,
>>>>>>>>> you know, about 2008 I did just that, there was a big spam flood,
>>>>>>>>> and I wrote a little script to find them and extract their
>>>>>>>>> posting-account,
>>>>>>>>> and the message-ID, and a little script to post to the
>>>>>>>>> posting-host,
>>>>>>>>> each one of the wicked spams.
>>>>>>>>>
>>>>>>>>> At the time that seemed to help, they sort of dried up, here
>>>>>>>>> there's
>>>>>>>>> that basically they're not following the charter, but, it's the
>>>>>>>>> posting-account
>>>>>>>>> in the message headers that indicate the origin of the post, not
>>>>>>>>> the
>>>>>>>>> email address. So, I wonder, given that I can extract the
>>>>>>>>> posting-accounts
>>>>>>>>> of all the spams, how to match the posting-account to then
>>>>>>>>> determine
>>>>>>>>> whether it's a sockpuppet-farm or what, and basically about
>>>>>>>>> sending
>>>>>>>>> them up.
>>>>>>>>
>>>>>>>> Let me see your little script. Post it here.
>>>>>>>
>>>>>>> Here is a list I currently have:
>>>>>>>
>>>>>>> salz.txt
>>>>>>> usenet.death.penalty.gz
>>>>>>> purify.txt
>>>>>>> NewsAgent110-MS.exe
>>>>>>> HipCrime's NewsAgent (v1_11).htm
>>>>>>> NewsAgent111-BE.zip
>>>>>>> SuperCede.exe
>>>>>>> NewsAgent023.exe
>>>>>>> NewsAgent025.exe
>>>>>>> ActiveAgent.java
>>>>>>> HipCrime's NewsAgent (v1_02)_files
>>>>>>> NewsCancel.java (source code)
>>>>>>>
>>>>>>> (plus updated python versions)
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> (Maybe your script is in there somewhere?)
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Show me what you got. walk the walk.
>>>>>>>
>>>>>>
>>>>>>
>>>>>> I try to avoid sketchy things like hiring a criminal botnet,
>>>>>> there's the impression that that's looking at 1000's of counts
>>>>>> of computer intrusion.
>>>>>>
>>>>>> With those being something about $50K and 10-25 apiece,
>>>>>> there's a pretty significant deterrence to such activities.
>>>>>>
>>>>>> I've never much cared for "OAuth", giving away the
>>>>>> keys-to-the-kingdom and all, here it looks like either
>>>>>> a) a bunch of duped browsers clicked away their identities,
>>>>>> or b) it's really that Google and Facebook are more than
>>>>>> half full of fake identities for the sole purpose of being fake.
>>>>>>
>>>>>> (How's your new deal going?
>>>>>> Great, we got a million users.
>>>>>> Why are my conversions around zero?
>>>>>> Your ad must not speak to them.
>>>>>> Would it help if I spiced it up?
>>>>>> Don't backtalk me, I'll put you on a list!)
>>>>>>
>>>>>> So, it seems mostly a sort of "spam-walling the Internet",
>>>>>> where it was like "we're going to reinvent the Internet",
>>>>>> "no, you aren't", "all right then we'll ruin this one".
>>>>>>
>>>>>> As far as search goes, there's something to be said
>>>>>> for a new sort of approach to search, given that
>>>>>> Google, Bing, Duck, ..., _all make the same results_. It's
>>>>>> just so highly unlikely that they'd _all make the same
>>>>>> results_, you figure they're just one.
>>>>>>
>>>>>> So, the idea, for somebody like me who's mostly interested
>>>>>> in writing on the Internet, is that lots of that is of the sort
>>>>>> of "works" vis-a-vis, the "feuilleton" or what you might
>>>>>> call it, ephemeral junk, that I just learned about in
>>>>>> Herman Hesse's "The Glass Bead Game".
>>>>>>
>>>>>> Then, there's an idea, that basically to surface high-quality
>>>>>> works to a search, is that there's what's called metadata,
>>>>>> for content like HTML, with regards to Dublin Core and
>>>>>> RDF and so on, about a sort of making for fungible collections
>>>>>> of works, what results searchable fragments of various
>>>>>> larger bodies of works, according to their robots.txt and
>>>>>> their summaries and with regards to crawling the content
>>>>>> and so on, then to make federated common search corpi,
>>>>>> these kinds of things.
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>> It's like "why are they building that new data center",
>>>>> and it's like "well it's like Artificial Intelligence, inside
>>>>> that data center is a million virts and each one has a
>>>>> browser emulator and a phone app sandbox and a
>>>>> little notecard that prompts its name, basically it's
>>>>> a million-headed hydra called a sims-bot-farm,
>>>>> that for pennies on the dollar is an instant audience."
>>>>>
>>>>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
>>>>>
>>>>> Usenet traffic had been up recently, ....
>>>>>
>>>>> I think they used to call it "astro-turfing".
>>>>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
>>>>
>>>> I have NewsAgent111-MS.exe
>>>>
>>>> I seem to be missing version 2.0
>>>>
>>>> Do you have the 2.0 version?
>>>>
>>>> I'll trade you.
>>>>
>>>> I'll give you my python version with (GUI)!!!! (Tkinter)
>>>>
>>>> let's trade!
>>>>
>>>> don't bogart
>>>
>>> I seem to be missing this version:
>>>
>>> https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
>>>
>>> Do you have it? you must have!
>>>
>>>
>>>
>>>
>>>
>>
>> Nope, I just wrote a little script to connect to NNTP
>> with a Yes/No button on the subject, tapped through
>> those, and a little script to send an HTTP request to
>> the publicly-facing return-to-sender in-box, for each.
>>
>> Here's all the sources you need: IETF RFC editor.
>> Look for "NNTP". How to advise Google of this is
>> that each domain on the Internet is supposed to
>> have an "abuse@domain" email inbox, though there's
>> probably also a web request interface, as with regards
>> to publicly facing services, and expected to be
>> good actors on the network.
>>
>> Anyways if you read through "Meta: a usenet server
>> just for sci.math", what I have in mind is a sort
>> of author's and writer's oriented installation,
>> basically making for vanity printouts and generating
>> hypertext collections of contents and authors and
>> subjects and these kinds of things, basically for
>> on the order of "find all the postings of Archimedes
>> Plutonium, and, the threads they are in, and,
>> make a hypertext page of all that, a linear timeline,
>> and also thread it out as a linear sequence".
>>
>> I.e. people who actually post to Usenet are sometimes
>> having written interesting things, and, thus having
>> it so that it would be simplified to generate message-ID
>> listings and their corresponding standard URL's in the
>> standard IETF "news" URL protocol, and to point that
>> at a given news server or like XLink, is for treating
>> Usenet and its archives like a living museum of all these
>> different authors posts and their interactions together.
>>
>> I.e., here it's "belles lettres" and "fair use",
>> not just "belles" and "use".
>>
>>
>> It seemed nice of Google Groups to front this for a long time,
>> now they're quitting.
>>
>> I imagine Internet Relay Chat's still insane, though.
>>
>> Anyways I stay away from any warez and am proud that
>> since about Y2K at least I've never bootlegged anything,
>> and never uploaded a bootleg. Don't want to give old Shylock
>> excuses, and besides, I wrote software for a living.
>>
>>
>
>
> How to get a grip on the Google spams since six months ago or so
> seems to be along the lines of trawling the newsgroups, pulling
> down their posts since about that date, then extracting from
> each one whether it has "KINDLE EPUB EBOOK" or Thai code,
> has that mostly they can all be identified because they make
> an X-Content-Transfer-Encoding base64 header, then that the content
> is a sort of inscrutable block unless it's decoded.
>
> So it looks like it's possible to identify only off the subject
> and other headers, then pretty definitively off the format,
> which ones are these spams.
>
> Then each one of these has a Google posting account in
> one of the Google headers.
>
> Injection-Info: google-groups.googlegroups.com;
> posting-host=146.70.11.7; posting-account=hJ31DwoAAADGk9KnJ0tR36KM3U7DAsJC
>
> It's that posting-account that basically is the abuser's,
> whether or not it's an innocent dupe's after something like OAuth,
> is undetermined.
>
> Another indicating bit is that they start with
>
> X-Forwarded-Encrypted / X-Received / X-Forwarded-Encrypted / X-Received
>
> pointing at some SMTP id's, which looks like an SMTP gateway,
> suggesting alternate forms of message injection, while the Path
> of each of the posts indicates it as coming from
> postnews.google.com.
> Not all do, though.
>
> Those though look like usual Google posters' posts, so it seems like
> an automation of some Groups API on the Google side.
>
> So anyways the idea is to
>
> get the list of groups on a usenet server, GROUPS
> get the count of headers since a few weeks ago
> get the overview of headers
> find likely spams
> make a list of spammed groups
> get the spammed count of headers since October
> get the spammed overview of headers since October
> find likely spams
> pull down the headers
> extract the posting-account
>
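The steps above can be sketched against the standard library's nntplib (stdlib through Python 3.12, removed in 3.13; a PyPI backport exists). The server name and the spam test are placeholder assumptions, and the pure heuristic is split out so it can be checked without a live server:

```python
SPAM_MARKERS = ("KINDLE", "EPUB", "EBOOK")  # placeholder heuristics

def is_likely_spam(over):
    """over: overview dict from NNTP.over(), lowercase header keys."""
    return any(m in over.get("subject", "").upper() for m in SPAM_MARKERS)

def trawl(server):
    """GROUPS -> header overviews -> likely spams, per the steps above.
    Returns {group_name: [(article_number, message_id), ...]}."""
    import nntplib  # stdlib through Python 3.12
    spammed = {}
    with nntplib.NNTP(server) as nn:
        _, groups = nn.list()
        for g in groups:
            _, count, first, last, _ = nn.group(g.group)
            if count == 0:
                continue
            _, overviews = nn.over((first, last))
            hits = [(n, o.get("message-id")) for n, o in overviews
                    if is_likely_spam(o)]
            if hits:
                spammed[g.group] = hits
    return spammed
```

From each hit, a HEAD fetch then yields the Injection-Info line for the posting-account extraction.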
>
> then part of the challenge is not including any threaded replies,
> in the sense that some people replied to these posts to reject them,
> to make sure that an algorithm to mark spams avoids
> Type I/II errors, i.e. false positives/negatives. Such posts'
> replies, in their own content, don't have the same
> characteristics.
>
> (Or anything that contains bit ly links or "common exact-links in the
> spam".
> Also all those "Case Analysis and Case Study Solution" spams,
> look kind of similar. In fact when the spam started up I thought to
> myself "hey I wonder if that's those 'Case Analysis and Case Study
> Solution' spammers". )
>
> For example, on 10/2/23 I replied to a spam, so looking at it,
> b0796889-551b-4637-be38-b590b5de2efcn@googlegroups.com
> I would want to disambiguate spam reply rejections from spams.
> I'm not sure yet if the spams with same subject headers are
> actually threaded replies or just have same subject, I imagine
> that they just have the same subject header and aren't threaded replies.
>
> So, for cross-checking, there would basically be "likely spam"
> (not replies, no references) and "likely not-spam" (replies, references).
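That partition keys off the References header: the spams are fresh posts, the rejections are threaded replies. A sketch, with overview dicts using nntplib's lowercase keys:

```python
def split_spam_candidates(overviews):
    """Partition overview dicts into (likely_spam, likely_not_spam).
    A non-empty References header means a threaded reply, which is
    treated as not-spam to avoid false positives on rejections."""
    likely_spam, likely_not = [], []
    for over in overviews:
        if over.get("references", "").strip():
            likely_not.append(over)
        else:
            likely_spam.append(over)
    return likely_spam, likely_not

posts = [
    {"message-id": "<spam@googlegroups.com>", "references": ""},
    {"message-id": "<reject@host>", "references": "<spam@googlegroups.com>"},
]
spam, not_spam = split_spam_candidates(posts)
```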
>
>
> Otherwise it does look just like one of the spams, with a
> Content-Transfer-Encoding and that, but not the "encrypted SMTP"
> bit.
>
> "Note: Meta title tags should typically be around ...", is
> one of the blast-fax mail-merge prompts that slips out,
> with the idea that finding that quote in the source
> code will probably indicate the origin of the software.
>
> So, the idea is to key off of posting-account, then compute counts
> for these sorts of relations
>
> posting-account -> email-address
> posting-account -> targeted-group
>
> then to compute how many spams were sent, by whom, to where,
> and whether posting-account <-> email-address is 1-1 or fraudulent,
> then produce a neat list of posting-accounts to batch up in a sort of
> report and send it up to Google as a curated sort of spam report.
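Counting those relations is a couple of dictionaries. A sketch, with the input field names ('posting-account', 'from', 'newsgroups') assumed from the headers quoted above; an account mapping to more than one From address is flagged as suspect:

```python
from collections import Counter, defaultdict

def aggregate(spams):
    """spams: dicts with 'posting-account', 'from', 'newsgroups' keys
    (field names assumed for this sketch). Returns per-(account, group)
    spam counts and the set of accounts whose posting-account <->
    email-address relation is not 1-1."""
    acct_to_addrs = defaultdict(set)
    counts = Counter()
    for s in spams:
        acct = s["posting-account"]
        acct_to_addrs[acct].add(s["from"])
        for group in s["newsgroups"].split(","):
            counts[(acct, group.strip())] += 1
    suspect = {a for a, addrs in acct_to_addrs.items() if len(addrs) > 1}
    return counts, suspect

spams = [
    {"posting-account": "hJ31Dw", "from": "a@x", "newsgroups": "sci.physics.relativity"},
    {"posting-account": "hJ31Dw", "from": "b@y", "newsgroups": "sci.physics.relativity"},
]
counts, suspect = aggregate(spams)
```

Sorting `counts` then gives the who/where totals for a batched report.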
>
> Then there's an idea that this results in a sort of spam rule,
> toward making a "federated spam rules" type of thing,
> with regards to things like "spam blacklists" and these
> kinds of things, and heuristics or rules, vis-a-vis that
> neural net classifiers are inscrutable and instead there
> is to be a sort of "open quality rules" for relating messages
> to groups.
>
> So, we can identify the spams and, sort of, their origins,
> across all the Usenet groups.
>
>
>

