I'm not sure how I missed this, but there's a petition to implement Proportional Representation for Canadian federal elections: ourcommons.ca/petitions/en/Pet…
@Bob Jonkman did you hear about it?
As a follow-on to my last post, I've decided to shut down the pod next weekend. Thanks for all the fish.
I don't really know who else uses this pod anymore. Registrations have been closed for a long time, and the built-in user pruning feature even deleted the podmin account (!) a while back -- but I'm winding down my use of social media and I haven't really checked in here in weeks. When this domain comes up for renewal at the end of January, I'm going to let it lapse (i.e. the pod will cease to exist).
Sorry if that inconveniences anyone. This is your notice to find a new pod.
The premier made a series of false claims in his announcement on Thursday.
Well, that's always a good sign.
I've caught some AI crawlers aggressively crawling some of my sites, disregarding the robots.txt. Some of the sites are of little or no interest to any real person. So I've deployed iocaine (iocaine.madhouse-project.org/) on them, in an "always spew nonsense" mode, rather than the suggested "generate nonsense if it looks like a bot" mode. But I am not unfair: I've included a robots.txt there so that any bot that respects it will be spared from ingesting the nonsense.
Unfortunately, I'm abandoning my self-hosted git repositories, but I didn't have much of interest on there anymore. Most of what was there was old, and was also in my GitLab account.
As fun as it is to taunt the bot with random garbage, the idiots trying to crawl my git repo have pulled 2GB in less than 4 hours. Which should fit within my monthly limit for my VPS, but is just a stupid amount of data. So for now, I've set the hostname for my git repo to point to 127.0.0.1 instead. With any luck, the bot will try to DoS itself. But at the very least it will stop bothering me for a while. I might turn it back to iocaine at some point. I've left iocaine on the other sites.
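For anyone curious, pointing a hostname at loopback is a one-line change in the zone file (the hostname here is made up for illustration; mine is different):

```
; in the zone for example.com -- send the bot back to itself
git   IN   A   127.0.0.1
```

Any crawler that resolves the name now hammers its own loopback interface instead of my VPS.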
By the way, from the access logs, it hadn't even started crawling the garbage links that iocaine generated. It was only crawling links that it had before.
In other news, you'd think that Google would understand about respecting robots.txt. Googlebot seems to do so, but GoogleOther does not.
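For reference, treating the two crawlers differently takes only a couple of robots.txt stanzas (a sketch; whether GoogleOther actually honours it is, as noted, another matter):

```
# The regular search crawler may index everything
User-agent: Googlebot
Disallow:

# GoogleOther (data collection for other Google products) is told to stay out
User-agent: GoogleOther
Disallow: /
```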
"Kitchener non-profit to head 'deeply affordable' housing project in Guelph" Rent is expected to be $500/mo. It's only 13 units, but 13 is better than 0. We need more of this type of thing.
I've been meaning to put up a bee hotel for a while now, so I printed one. We'll see.
I found #pedantle #1077 in 34 guesses!
🟩🟩🟩🟩🟧🟧🟧🟧🟧🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥
https://pedantle.certitudes.org/
I would have expected to get this one a lot sooner since it aligns with my interests.
I found #pedantle #1075 in 93 guesses!
🟩🟩🟩🟩🟧🟧🟧🟧🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥🟥
https://pedantle.certitudes.org/
Jonkman Microblog is a social network, courtesy of SOBAC Microcomputer Services. It runs on GNU social, version 1.2.0-beta5, available under the GNU Affero General Public License.
All Jonkman Microblog content and data are available under the Creative Commons Attribution 3.0 license.