[-] Onihikage@beehaw.org 1 points 4 weeks ago
  • Crash Course Pods: The Universe, with John Green & Dr. Katie Mack. Talks about how the universe came into existence.
  • Volts, with David Roberts. Talks about electrification and the energy transition.

I don't listen to many podcasts, but those two are pretty great.

[-] Onihikage@beehaw.org 2 points 1 month ago

https://mediabiasfactcheck.com/orinoco-tribune-bias-and-credibility/

Overall, we rate Orinoco Tribune as extreme left biased and questionable due to its consistent promotion of anti-imperialist, socialist, and Chavista viewpoints. We rate them low factually due to their strong ideological stance, selective sourcing, the promotion of propaganda, and conspiracy theories related to the West.

The Orinoco Tribune has a clear left-leaning bias. It consistently supports anti-imperialist and Chavista perspectives (those who supported Hugo Chavez). The publication critiques U.S. policies and mainstream media narratives about countries opposing U.S. influence. Articles frequently defend the Venezuelan government and criticize opposition movements and foreign intervention.

Articles and headlines often contain emotionally charged language opposed to the so-called far-right of Venezuela, like this: "Far Right Plots to Sabotage Venezuela’s Electrical System in Attempt to Disrupt the Electoral Process." The story is translated from another source and lacks hyperlinked sourcing to support its claims.

Maybe don't consider a pro-Maduro propaganda rag as a legitimate source for a conflict he's directly involved in.

Maduro is a man who ordered his country to block Signal, ordered it to block social media, and arrests, imprisons, and bans his political opposition. He has also expressed strong support for Russia's invasion of Ukraine, while the citizens of his country have been starving for years under what is literally known as the "Maduro Diet" and the middle class has vanished. He long ago forfeited his right to the benefit of the doubt. He is a despot who has repeatedly falsified election results after mismanaging the country for years, and who calls his opposition fascists while being one himself. That the people overwhelmingly want him gone is not some hegemonic plot by the evil West; it's the natural consequence of his actions.

[-] Onihikage@beehaw.org 2 points 1 month ago

Unfortunately I can't even test Llama 3.1 in Alpaca because it refuses to download, showing some error message with the important bits cut off.

That said, the Alpaca download interface seems much more robust, allowing me to select a model and then select any version of it for download, not just apparently picking whatever version it thinks I should use. That's an improvement for sure. On GPT4All I basically have to download the model manually if I want one that's not the default, and when I do that there's a decent chance it doesn't run on GPU.

However, GPT4All allows me to plainly see how I can edit the system prompt and many other parameters the model is run with, and even configure multiple sets of parameters for the same model. That allows me to effectively pre-configure a model in much more creative ways, such as programming it to be a specific character with a specific background and mindset. I can get the Mistral model from earlier to act like anything from a very curt and emotionally neutral virtual intelligence named Jarvis to a grumpy fantasy monster whose behavior is transcribed by a narrator. GPT4All can even present an API endpoint to localhost for other programs to use.
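That localhost endpoint means other programs can treat GPT4All like a hosted service. Here's a minimal sketch of what a call might look like, assuming the OpenAI-compatible server is enabled in GPT4All's settings and listening on its default port 4891 (check your own install; the port, model name, and "Jarvis" system prompt are just illustrative, taken from the example above):

```python
import json
import urllib.request

# GPT4All's local API server is OpenAI-compatible; 4891 is the default port
# in its settings, but verify yours before relying on it.
API_URL = "http://localhost:4891/v1/chat/completions"

def build_request(prompt,
                  model="Nous Hermes 2 Mistral 7B DPO",
                  system="You are Jarvis, a curt, emotionally neutral virtual intelligence."):
    """Assemble an OpenAI-style chat payload; the system message carries the
    character pre-configuration described above."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 128,
    }
    return json.dumps(payload).encode("utf-8")

def ask(prompt):
    """Send the request to the local server (requires GPT4All running with
    its API server enabled)."""
    req = urllib.request.Request(
        API_URL,
        data=build_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Since the payload format is just OpenAI's, most tools that speak that API can be pointed at it by swapping the base URL.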

Alpaca seems to have some degree of model customization, but I can't tell how well it compares, probably because I'm not familiar with using ollama and I don't feel like tinkering with it since it doesn't want to use my GPU. The one thing I can see that's clearly better is running multiple models at the same time; right now GPT4All will unload one model before it loads another.
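Alpaca sits on top of ollama, and the multi-model behavior lives in ollama's REST API: its `keep_alive` parameter asks the server to keep a model resident after a request instead of unloading it. A rough sketch, assuming the default port 11434 and that the named model has already been pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default port

def build_payload(model, prompt, keep_alive="10m"):
    """keep_alive tells ollama to keep this model loaded for that long after
    the request, so alternating between two models doesn't force a reload
    each time (provided you have the VRAM for both)."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
        "keep_alive": keep_alive,
    }).encode("utf-8")

def generate(model, prompt):
    """Requires a running ollama server; the model name is whatever you
    pulled, e.g. "llama3.1"."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```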

[-] Onihikage@beehaw.org 3 points 1 month ago

I have a fairly substantial 16 GB AMD GPU, and when I load in Llama 3.1 8B Instruct 128k (Q4_0), it gives me about 12 tokens per second. That's reasonably fast for me, but only 50% faster than CPU (which I test by loading mlabonne's abliterated Q4_K_M version, which runs on CPU in GPT4All, though I have no idea if that's actually meant to be comparable in performance).

Then I load in Nous Hermes 2 Mistral 7B DPO (also Q4_0) and it blazes through at 50+ tokens per second. So I don't really know what's going on there. Performance seems to vary a lot from model to model, but I don't know enough to speculate why. I can't even try Gemma 2 models; GPT4All just crashes with them. I should probably test Alpaca to see if these perform any differently there...
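For anyone who wants to compare their own numbers, here's roughly how I'd time it. This is only a sketch: `generate_fn` stands in for whatever calls your local model, and splitting on whitespace just approximates token counts, since real tokenizers count subwords rather than words.

```python
import time

def tokens_per_second(generate_fn, prompt):
    """Time a single generation and estimate throughput.
    generate_fn takes a prompt string and returns the generated text."""
    start = time.perf_counter()
    text = generate_fn(prompt)
    elapsed = time.perf_counter() - start
    # Crude proxy for token count; use the model's tokenizer for real numbers.
    tokens = len(text.split())
    return tokens / elapsed if elapsed > 0 else 0.0
```

Running the same prompt through both models with this gives a like-for-like comparison, at least for a single run; averaging several runs would smooth out load-time jitter.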

[-] Onihikage@beehaw.org 11 points 1 month ago

I actually found GPT4All while looking into Kompute (Vulkan Compute), and it led me to question why anyone would bother with ROCm or OpenCL at all.

[-] Onihikage@beehaw.org 6 points 1 month ago* (last edited 1 month ago)

I mainly recommend Universal Blue distros to newbies, like Bazzite or Aurora. The immutable nature more or less means users don't have to worry about performing maintenance on system apps like they might on some distros, mostly don't have to worry about dependencies, and are less likely to irreversibly break the system themselves or in an update.

That said, these distros are Fedora-based, and I think that's fine. No idea who out there is recommending Arch of all things.

[-] Onihikage@beehaw.org 6 points 1 month ago

He did at the beginning, but he helped them get what they wanted in the end, and I think that counts for something.

“We’re thankful that the Biden administration played the long game on sick days and stuck with us for months after Congress imposed our updated national agreement,” Russo said. “Without making a big show of it, Joe Biden and members of his administration in the Transportation and Labor departments have been working continuously to get guaranteed paid sick days for all railroad workers.

“We know that many of our members weren’t happy with our original agreement,” Russo said, “but through it all, we had faith that our friends in the White House and Congress would keep up the pressure on our railroad employers to get us the sick day benefits we deserve. Until we negotiated these new individual agreements with these carriers, an IBEW member who called out sick was not compensated.”

[-] Onihikage@beehaw.org 17 points 1 month ago* (last edited 1 month ago)

Counterpoint: Scumbag companies ninja-editing their timestamped warranty page such that the only way you know they edited it after you bought the product is because it was archived previously.

Archives are ideal for identifying sneaky behavior like that. You never know when an admin might have the ability to delete or edit something without anyone noticing.

[-] Onihikage@beehaw.org 1 points 2 months ago

Technically, any model trained on LAION-5B before December 2023 was trained on CSAM.

But yeah, I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data. AI image generation is basically the digital equivalent of a chainsaw - a tool for a particular messy job that can really hurt people if used incorrectly. You wouldn't let a typical kid run around unattended with one, that's for sure.

[-] Onihikage@beehaw.org 5 points 2 months ago

The ELI5 for Fedora's atomic desktops is that if Windows had an Atomic Desktop version, Program Files and most of the Windows folder would be read-only, and each program you installed yourself would go into its own folder in your user directory. That's the basic idea. It's harder to screw up an Atomic system as long as you stick to containerized app formats like flatpak/appimage whenever possible. It makes it easier for everyone to diagnose problems, and easier for users to roll back if an update has problems. Even if you were to install it right now, you could use one simple command to "roll back" to any image from the last three months.
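The "one simple command" is rpm-ostree's rollback; Bazzite and the other Fedora Atomic spins all use it. A sketch from memory, so double-check against your distro's docs:

```shell
# On a Fedora Atomic / Universal Blue system:
rpm-ostree status     # list current and previous deployments (images)
rpm-ostree rollback   # make the previous deployment the default on next boot
# Pin a known-good deployment (0 = current) so it's never garbage-collected:
sudo ostree admin pin 0
```

Because the old deployment is kept intact on disk, rolling back is just picking a different boot entry rather than undoing package operations one by one.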

The benefit of Bazzite is you have all of the above, plus a lot of gaming-related stuff preinstalled which, if you were to install it yourself in a normal Fedora environment, you'd likely have to spend a lot of time just learning how it's supposed to be configured, how the pieces interact, which versions have problems, and how to troubleshoot when an update to one app breaks a prerequisite for something else; eventually you end up in config hell instead of actually using your computer. With Bazzite, the image maintainers are the ones in config hell - they work out the kinks, handle app versioning, communicate with upstream to fix issues, all that - so your system arrives in the most functional state a Linux system can be in, and you only have to think about using your apps.

tl;dr

  • Atomic Desktops are more resilient to randomly breaking from updates or user error, and are easier to revert to a prior state if problems do arise
  • Bazzite is a custom Atomic image with lots of gaming stuff preinstalled and preconfigured to work properly out of the box
  • If you're a gamer wanting to try out Linux, Bazzite is going to be the least painful way to get your feet wet.
  • Immutable distros are excellent for daily driving. I daily drive one myself!
