

At lower levels, underwater.
Agreed. For a new user who wants to minimise system maintenance, I’d recommend the atomic version, Fedora Kinoite. Flatpak plus rpm-ostree makes it like a phone, where you can just do system updates and install/remove apps.
“It is difficult to get a man to understand something, when his salary depends upon his not understanding it!” - Upton Sinclair
Another option is to use your router to send data via VPN so you don’t have to set up tunnels on each host.
Try notbob.i2p
Looks like the price hike is US only.
I have an M4 Mini with 16GB and I don’t think the additional cost for the extra 8GB is worth it. Save your money for another release. The 16GB version will last a long time, though of course it depends on what you’re doing with it.
I think they want to give away computers without user accounts already created, that’s all.
The same Alina Habba who, when asked whether she’d rather be smart or pretty, chose pretty because you can fake being smart.
No more wary than, say, CriticalBadger or SuccessfulCrab45. Some of the more obvious bots have very normal-looking names.
It’s school property with a camera and microphone in their homes lol
That’s like a VPN with extra steps. Though if someone you trust in that country already has an exit node you can connect through, then that does sound good.
Does anyone have a good reason to go with PIA when there are others that offer a comparable service without these problems?
You can try I2P. The selection is smaller and it’s slower, but it’s free and privacy-first.
How does Tailscale help here?
We’re talking about choosing who to financially support. Catloaf doesn’t want to knowingly send money to support tankies. I’m not sure what your nazi baker comment was supposed to communicate, but I wouldn’t patronise a nazi bakery. Are you saying you would? It’s obvious that nobody knows the provenance of everything they buy, but only you are bringing that up and I don’t see the relevance.
If you knowingly patronise a nazi bakery, that’s fine?
When I was a kid, I thought it must be dust mites marching across my pillow.
Yep, clueless. I stopped reading at that point. For the audience, large language models come in all sizes and you can run some small but useful ones fairly quickly even without a GPU. They keep getting more capable for the size as well. Remember the uproar about Deepseek R1? Well, progress hasn’t stopped.
As the circle enlarges, the system approaches four T-intersections. What I want to know is: at what size circle do people lose their minds and become unable to comprehend how T-intersections work?
Fair play to people confused about multi-lane roundabouts though.