>What's Inside
>
>This guide focuses on securing the Docker platform on Linux. Follow along with the techniques demonstrated in this guide. All you need is a Linux server with Docker installed and running, as well as a:
>
>- fundamental knowledge of Docker and Docker CLI commands;
>- functional knowledge of Linux terminal commands; and a
>- fundamental knowledge of systemd and Linux init systems.
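As a quick sanity check before following along, a short script like the one below (my own sketch, not something from the guide) confirms the daemon is active under systemd and prints the security options it was started with. Exact output depends on your Docker version and distro.

```python
import subprocess

def run(cmd):
    # Run a command and return stripped stdout; raises CalledProcessError
    # on failure (e.g. if the docker.service unit is not active).
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

# Is the Docker daemon running under systemd?
print("docker.service:", run(["systemctl", "is-active", "docker"]))

# Which security options is the daemon using (seccomp profile,
# AppArmor/SELinux, user namespaces, rootless mode, and so on)?
print("security options:", run(["docker", "info", "--format", "{{.SecurityOptions}}"]))
```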
I remember when I was maybe 30 or so, a friend plucked a hair from my head, handed it to me, told me it was my first grey hair, and suggested I give it to my mother.
When I gave it to my mother, she replied, "That is not the first grey hair you have given me."
>Dennis “Dee Tee” Thomas, a founding member of the long-running soul-funk band Kool & the Gang, known for such hits as “Celebration” and “Get Down On It”, has died.
>
>He was 70 years old.
>
>He died peacefully in his sleep on Saturday in New Jersey, where he was a resident of Montclair, according to a statement from his representative.
>...
SeaMonkey is the only one that comes to mind. Add to that the 50 million monthly users they have lost in the past three years, and I am not sure it will last more than a few years. I find myself asking what browser/email client I should use next. Maybe it is time to do some research and writing.
>“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” says Met digital forensics head Mark Stokes. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color.”
snip
Have heard the same thing about AIs and some fruits.
>Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
>
>"Apple's method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple's announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."
>
>Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold "set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
>
>The changes will roll out "later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey," Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will "warn children and their parents when receiving or sending sexually explicit photos."
>...
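The threshold mechanism that summary describes is easy to illustrate. The toy sketch below is not Apple's system (which uses a perceptual NeuralHash and a private set intersection protocol rather than plain hash lookups); it only shows the basic idea of counting matches against a blocklist and flagging nothing until a threshold is crossed:

```python
import hashlib

# Hypothetical blocklist of known-bad hashes; in Apple's design this is
# supplied (in blinded form) by NCMEC and other child-safety organizations.
KNOWN_HASHES: set[str] = set()

# Nothing is reported until an account accumulates this many matches.
THRESHOLD = 30  # illustrative value, not Apple's actual threshold

def image_hash(data: bytes) -> str:
    # Toy stand-in: SHA-256 only matches byte-identical files, whereas
    # Apple's NeuralHash is perceptual and survives resizing/re-encoding.
    return hashlib.sha256(data).hexdigest()

def should_flag(images: list[bytes]) -> bool:
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= THRESHOLD
```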
Related: Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life (Electronic Frontier Foundation) https://nu.federati.net/url/282292