

The “dumb” solution is to just import both into one feed reader then export a new OPML. I assume most readers will deduplicate (at least to a basic degree) on import.
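If your reader doesn’t deduplicate, the merge itself is simple enough to script. A rough stdlib sketch (an assumption on my part, not a standard tool: it deduplicates by the xmlUrl attribute and flattens any folder structure):

```python
import xml.etree.ElementTree as ET

def merge_opml(opml_a: str, opml_b: str) -> str:
    """Merge two OPML documents, deduplicating feeds by their xmlUrl."""
    seen = set()
    merged = ET.Element("opml", version="2.0")
    body = ET.SubElement(merged, "body")
    for doc in (opml_a, opml_b):
        root = ET.fromstring(doc)
        # iter() walks nested folder outlines too, so the output is flat.
        for outline in root.iter("outline"):
            url = outline.get("xmlUrl")
            if url and url not in seen:
                seen.add(url)
                body.append(outline)
    return ET.tostring(merged, encoding="unicode")
```

First feed wins on conflicts, so titles from the first file are kept for feeds present in both.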


How is this faulty? The degree of damage is incredibly relevant. We don’t make everything that could ever cause damage illegal, because then we would have nothing left. Laws are a balancing act of pros and cons to society.
A driver has far less visibility (they are inside a box with a few windows) and will do far more damage if they hit someone. A cyclist has dramatically better visibility (basically an unobstructed 180° view) and, especially when going slowly, is very unlikely to cause significant damage (posing a risk of serious harm only to the most frail and elderly).
If not requiring complete stops for cyclists leads to even 1% more cyclists on the road (because their travel is easier), it almost certainly causes less harm overall, given how dangerous cars are and their indirect health effects (both the inactivity of driving and the pollution).
So no, the logic isn’t faulty at all; it is probably one of the most important arguments.


I use https://difftastic.wilfred.me.uk/ which is, well, fantastic. I have it set up as the default diff for Git and it is really nice.
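For anyone wanting the same setup, the difftastic manual suggests wiring it into Git with a single config entry (difft is the binary name):

```shell
# Tell Git to use difftastic for `git diff` output
git config --global diff.external difft
```

Note that commands like `git log -p` only use the external diff when passed `--ext-diff`.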


It’s always sad when a well-loved project goes unmaintained. However, it is nice that it had a clear exit. I would have appreciated it if the code could be kept online a bit longer for archival efforts, but a month isn’t the worst.
I hope other maintainers step up with forks. It will take a while to see which forks are stable and well maintained, but it seems like the project was fairly complete and stable, so it shouldn’t be too difficult to keep going.
I did take an archive of all of the Git repos on https://git.tt-rss.org/. I encourage anyone who can to grab a copy for preservation. I’ll seed it myself for a long while, probably at least a year, but probably not forever.
magnet:?xt=urn:btih:0d70be6a837096aff29e13c29e7ec25961ce3d09&xt=urn:btmh:1220054379c509e2c722d5109a10ff594d5f8916baf9d6152a278e05768fd8d76f65&dn=tt-rss&xl=246415360&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Fopen.demonii.com%3A1337%2Fannounce&tr=udp%3A%2F%2Fopen.stealth.si%3A80%2Fannounce


I’m also not familiar. But my understanding is that the package maintainers should prevent this situation. Otherwise, even if there are package version dependencies (I don’t actually know if pacman enforces these), they would just block the update, resulting in a partial update, which isn’t supported. For example, if your theoretical unmaintained Firefox blocked the update of libssl but Python required new functionality, you would be stuck in dependency hell. Leaving this problem to the users just makes it worse, so the package maintainers need to sort something out.
It is a huge pain when it happens, but it tends to be pretty rare in practice. Typically maintainers can just wait for the software to update or ship a small patch to fix it. But in the worst case you need to maintain two versions of the common dependency. In lots of distros, very common dependencies get separate packages for each major version for this reason, for example libfoo1 and libfoo2. Then there can be a period where both are supported while packages slowly move from one to the other.


If no dependency tries to update too. Of course in that case I would stop. Without pacman -Sy I never do that anyway, only -Syu.
That’s all you need to know. As long as you always use pacman -Syu you will be fine. pacman -Sy is the real problem. The wiki page is pretty clear about which sequences of commands are problematic: https://wiki.archlinux.org/title/System_maintenance#Partial_upgrades_are_unsupported.
Right? What I don’t understand is: when I uninstall with pacman -Rs firefox and delete the cached Firefox package (only that file), then the system is in the same state as before I installed it. Then -S firefox should be okay, right? And it even looks up the new version.
This isn’t correct. It won’t look up the new version. Assuming the system was in a consistent state, it will download the exact same package that you deleted. The system only ever “updates” when you run pacman -Sy. Until you use -y, all packages are effectively pinned at a specific version. If the version that gets installed is different from the one you removed, it probably means you were breaking the partial update rule previously.


But that is my point. Just running pacman -S firefox is fine as long as you didn’t run pacman -Sy at some point earlier. It won’t update anything, even dependencies. It will just install the version that matches your current package list and system, including the right version of any dependencies if they aren’t already installed.
But that means if you already have Firefox installed, it will do nothing.


I think you are a little confused about the problem here. The issue is that partial updates are not supported. The reason for this is very simple: Arch ensures that any given package list works on its own, but not that packages from different versions of the package list work together. So if Firefox depends on libssl, a new Firefox package may depend on a new libssl function. If you install that version of Firefox without updating libssl, it will cause problems.
There is no way around this limitation. If you install that new Firefox without the new libssl you will have problems, no matter how you try to rules-lawyer it. Now, 99% of the time this works; typically packages don’t depend on new library functions right away. But sometimes they do, and that is why, as a rule, this is unsupported. You are welcome to try it, but if it breaks, don’t complain to the devs; they never promised it would work. This isn’t some policy where you can find a loophole. It is a technical limitation. If you manage to find a loophole, people aren’t going to say “oh, that should work, let’s fix it”. It will break and you will be on your own to fix it.
Focusing on your commands: pacman -S firefox is always fine on its own. If Firefox is already installed it will do nothing; if it isn’t, it will install the version from the current package list. Both of those operations are supported. Also, pacman -Rs firefox && pacman -S firefox is really no different from just pacman -S firefox (other than potentially causing problems if the package can’t be removed due to dependencies). So your command isn’t accomplishing anything, even if it did somehow magically work around the rules.
The real problem is pacman -Sy. This command updates the package list without actually updating any packages. It puts your system into a precarious state where any newly installed or updated package (for example our pacman -S firefox command from earlier) will be a version that is mismatched with the rest of your system. This is unsupported and will occasionally cause problems. Generally speaking you shouldn’t run pacman -Sy; any time you use -Sy you should also be passing -u. This ensures that the package list and your installed packages are updated together.


You are misunderstanding. They are forcing people to use LLMs at the cost of productivity so that they can hopefully find places where LLMs can improve productivity.
This isn’t really unreasonable. They are basically trying to get ahead of the adoption curve by forcing premature adoption to find use cases. Bezos loves firing workers, so this is basically a win-win: if it works they offload more work to LLMs and fire workers, and if it doesn’t work other workers get a bad performance review and get fired.


/favicon.ico is the only “default” URL. These days /favicon.ico is usually not an actual ICO file but a PNG or JPEG served at the same URL. Other than that you need to load the HTML and check for Link headers or <link rel=icon> elements. While URLs like /favicon.png may be popular, they aren’t part of any actual protocol.
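A minimal sketch of the HTML side of that discovery logic, using only the stdlib (the HTTP Link header case isn’t covered, and the function name and simplified rel handling are my own assumptions):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class IconFinder(HTMLParser):
    """Collect href values of <link> tags whose rel list includes "icon"."""
    def __init__(self):
        super().__init__()
        self.icons = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        # rel is a space-separated list, e.g. "shortcut icon".
        rel = (a.get("rel") or "").lower().split()
        if "icon" in rel and a.get("href"):
            self.icons.append(a["href"])

def find_favicons(base_url: str, html: str) -> list[str]:
    """Return icon URLs declared in the page, else the /favicon.ico default."""
    parser = IconFinder()
    parser.feed(html)
    if parser.icons:
        return [urljoin(base_url, href) for href in parser.icons]
    return [urljoin(base_url, "/favicon.ico")]
```

For example, find_favicons("https://example.com/page", '<link rel="icon" href="/fav.png">') resolves the relative href against the page URL.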


IMHO if we want to get rid of tips, the way to go about it is to pick a date (for example January 1st 2026) and then agree to stop tipping on that date. Hard and fast: stop doing it. Stores can raise their prices to compensate.
The problem is that it is very hard to make this change incrementally, because individuals are considered assholes if they don’t tip enough. So we all sort of have to get together and agree to it. Of course it will be hard to publicize this, because big media companies are all owned by the rich, who benefit by paying minimum-wage workers less with the excuse that they can get tips.


While I agree, I think that getting more games on Linux is far more useful. When Linux is at almost 3%, very few studios will care much. If they can do a small bit of testing on Proton and maybe work around a bug or two, they are far more likely to do that than to make and test a native build. If this then gets Linux usage to 5, 10 or 20%, that will drive more native builds.
So I agree that it somewhat reduces the incentive to release a native build. But I think that is outweighed by the benefits of making the Linux gaming experience better today which will have a greater impact on availability of native builds in the future.


It’s been fine. But I’m a decently well off young white dude who has never had trouble with borders anywhere. But I will still avoid it as much as I can.


poweroff or shutdown will work on almost every distro. Even systemd ones (they are usually symlinks, but it doesn’t really matter because they work).


But your case is wrong anyway, because i <= INT_MAX will always be true, by definition. By your argument, < is actually better because it is consistent: from < 0 to iterate 0 times, to < INT_MAX to iterate the maximum number of times. INT_MAX + 1 is the problem, not <, which is the standard way to write for loops, and the standard for a reason.


Actually I would pick GIMP.
Really the only thing I would like to see is some screenshots and examples of using the tool, rather than just info on what it does. But the Photoshop page barely has this either, just a few examples of the AI tools.


Maybe, but some of my favourite channels do YouTube as a full-time job. Maybe they would still post part-time if they couldn’t profit off of it, but the videos would almost certainly be less frequent and made on tighter budgets.
But even then I find it hard to believe. I subscribe to a bunch of seemingly for-fun channels, but most of my favourites have by this point become full-time video creators: CGP Grey, Captain Disillusion, Technology Connections, Tom Scott, Veritasium…
It is true that money can corrupt, but in this world you also need an income, and if you need to devote a lot of time to getting income from a different source, then that only distracts from the time and energy that you can put towards making videos.


If you just want the video call part, you can use https://call.element.io/ and get E2EE calls by sharing a link. It has worked pretty well for me.
There was one bug a few weeks ago where new participants wouldn’t show up, but that seems to have been fixed.


But that’s my point. If these creators on different sites charged between $0.26 and $1.30, I would have subscribed to a bunch of them. But when they are charging $5/month, that is quite a different amount to pay. That is something I would only really consider for my absolute favourites.


This is also likely interesting because console SDKs are usually highly restricted. So not only is the Minecraft code leaked (which is probably moderately interesting), it is likely that the console APIs are quite interesting to emulator developers and for reverse engineering of other PS3 games.