[-] jlh@lemmy.jlh.name 2 points 14 hours ago

Intel A310 is the best $/perf transcoding card, but if the P40 supports NVENC, it might work for both transcoding and Stable Diffusion.

[-] jlh@lemmy.jlh.name 24 points 2 days ago

A tale of selfish betrayal as old as time

[-] jlh@lemmy.jlh.name 19 points 2 days ago

How else does Macron expect to get a majority behind Barnier? He already rejected a technocrat coalition with the NFP. Macron is obviously expecting to get support from Le Pen in return for a cordon sanitaire against the socialists.

[-] jlh@lemmy.jlh.name 41 points 2 days ago

Ah yes, an Ensemble-Républicains coalition with 38% of seats is totally a "majority" and will totally survive. This was totally better than the 65% majority coalition proposed by the NFP around Lucie Castets, before Ensemble decided that the center-left was too extreme for them.

Macron is clearly running to Le Pen to save himself from getting impeached.

[-] jlh@lemmy.jlh.name 1 points 2 days ago

Just a historical comparison I'm making based on the Wikipedia articles for the Kosovo War.

https://en.wikipedia.org/wiki/Kosovo_War

https://en.wikipedia.org/wiki/NATO_bombing_of_Yugoslavia

A NATO-facilitated ceasefire between the KLA and Yugoslav forces was signed on 15 October 1998, but both sides broke it two months later and fighting resumed. When the killing of 45 Kosovar Albanians in the Račak massacre was reported in January 1999, NATO decided that the conflict could only be settled by introducing a military peacekeeping force to forcibly restrain the two sides.[50] Yugoslavia refused to sign the Rambouillet Accords, which among other things called for 30,000 NATO peacekeeping troops in Kosovo; an unhindered right of passage for NATO troops on Yugoslav territory; immunity for NATO and its agents to Yugoslav law; and the right to use local roads, ports, railways, and airports without payment and requisition public facilities for its use free of cost.[51][35] NATO then prepared to install the peacekeepers by force, using this refusal to justify the bombings.

It took years of fighting, but eventually the broken ceasefire and Yugoslavia's refusal to sign the Rambouillet Accords were used as justification for NATO to neutralize the military forces in the region.

[-] jlh@lemmy.jlh.name 3 points 3 days ago* (last edited 3 days ago)

Probably a traffic_bytes_counter got reset. You can see a lot of graphs went negative at the same time, so something probably restarted.

Metrics software like Prometheus will handle counter resets correctly for graphs like this.
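
To illustrate the idea, here's a rough Python sketch of how a rate calculation can compensate for counter resets, similar in spirit to what Prometheus's rate() does. The sample values are made up, and real Prometheus also extrapolates over the range, which this ignores:

```python
def rate(samples, window_seconds):
    """Per-second rate from (timestamp, value) counter samples,
    compensating for counter resets like a Prometheus-style rate()."""
    total_increase = 0.0
    for (_, prev), (_, curr) in zip(samples, samples[1:]):
        if curr < prev:
            # Counter reset (e.g. the exporter restarted): the counter started
            # over near zero, so count the new value instead of a negative delta.
            total_increase += curr
        else:
            total_increase += curr - prev
    return total_increase / window_seconds

# The counter resets between the 3rd and 4th samples:
samples = [(0, 100), (15, 160), (30, 220), (45, 40), (60, 100)]
print(rate(samples, 60))  # ~3.67/s, instead of a bogus negative spike
```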

[-] jlh@lemmy.jlh.name 16 points 4 days ago

Is Firefox getting paid for advertising ChatGPT, Google Gemini, HuggingChat, and Le Chat Mistral in their browser?

[-] jlh@lemmy.jlh.name 8 points 4 days ago* (last edited 4 days ago)

Counterpoint: RoR1 was a 2D side-scroller

[-] jlh@lemmy.jlh.name 7 points 5 days ago* (last edited 5 days ago)

I'm honestly seeing parallels to how Milosevic had a limit on how long he could delay a ceasefire

[-] jlh@lemmy.jlh.name 5 points 5 days ago

lol indeed 🥲 Very much a war of Russian stupidity.

[-] jlh@lemmy.jlh.name 0 points 5 days ago

Adventures in bad sociology, fascism, and gender/neurological chauvinism with our favorite billionaire.


https://web.archive.org/web/20240719155854/https://www.wired.com/story/crowdstrike-outage-update-windows/

"CrowdStrike is far from the only security firm to trigger Windows crashes with a driver update. Updates to Kaspersky and even Windows’ own built-in antivirus software Windows Defender have caused similar Blue Screen of Death crashes in years past."

"'People may now demand changes in this operating model,' says Jake Williams, vice president of research and development at the cybersecurity consultancy Hunter Strategy. 'For better or worse, CrowdStrike has just shown why pushing updates without IT intervention is unsustainable.'"

submitted 8 months ago* (last edited 8 months ago) by jlh@lemmy.jlh.name to c/programming@programming.dev

I wanted to share an observation about the way the latest computer systems work. I swear this isn't an AI hype train post 😅

I'm seeing more and more computer systems these days use usage data or internal metrics to automatically adapt how they run, and I get the feeling this is a new computing paradigm enabled by the increased modularity of modern computer systems.

First off, I would say we're in a sort of "second generation" of computing. Computers in the '80s and '90s were fairly basic: user programs were often written in C or assembly, and often ran directly in ring 0 of the CPU. Leading up to the year 2000, there was a lot of advancement and adoption of technology for building more modular computers: microkernels, MMUs, higher-level languages with memory-managed runtimes, and the rise of modular programming in languages like Java and Python. These new abstractions let programs reuse code and be a lot more ambitious, so computer systems became much more advanced. We are well into this era now, with VMs and Docker containers taking over infrastructure, and modern programming depending on software packages, like you see with NPM and Cargo.

So we're still in this "modularity" era of computing, where you can reuse code and even have microservices sharing data with each other, but often the amount of data individual computer systems have access to is relatively limited.

More recently, I think we're seeing the beginning of "data-driven" computing, which uses observability and control loops to run better and self-manage.

I see a lot of recent examples of this:

  • Service orchestrators like systemd and Kubernetes, which monitor the status and performance of the services they own and use that data for self-healing and to optimize how and where those services run (a toy sketch of this kind of metrics-driven control loop follows this list).
  • Centralized data collection systems for microservices, which often include automated alerts and control loops. You see a lot of new systems like this, including Splunk, OpenTelemetry, and Pyroscope, as well as the internal data collection systems at all of the big cloud vendors. These systems try to centralize as much data as possible about how services run, not just logs and metrics, but also lower-level data like execution traces and CPU/RAM profiling data.
  • Hardware metrics in a lot of modern hardware. Before 2010, you were lucky if your hardware reported clock speeds and temperatures. Nowadays, hardware components are overflowing with data: every CPU core reports not only temperature but also power usage, you see similar things on GPUs, and tools like nvitop are critical for modern GPGPU operations. Even individual RAM DIMMs report temperature data now. The most impressive part is that CPUs now use their own internal metrics, like temperature, silicon quality, and power usage, to run more efficiently, as you see with AMD's CPPC system.
  • Of course, I said this wasn't an AI hype post, but I think the use of neural networks to enhance user interfaces is definitely part of this. The way social media uses neural networks to change what is shown to the user, the upcoming "AI search" in Windows, and the way all this usage data is fed back into neural networks make me think that even user-facing computer systems will start adapting to changing conditions using data science.
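
To make the "control loop" idea concrete, here's a toy Python sketch. The thresholds and worker model are made up for illustration: it observes a live system metric via psutil and scales a hypothetical worker pool in response, which is roughly what orchestrators do with real workloads, just with far more sophistication.

```python
import psutil  # exposes OS/hardware telemetry: CPU %, temperatures, etc.

# Made-up tuning knobs, purely for illustration
TARGET_CPU = 60.0            # aim for ~60% CPU utilization
MIN_WORKERS, MAX_WORKERS = 1, 16

def scale(workers, cpu_percent):
    """One step of a naive control loop: nudge capacity toward the target."""
    if cpu_percent > TARGET_CPU + 10:
        workers += 1         # overloaded: add capacity
    elif cpu_percent < TARGET_CPU - 10:
        workers -= 1         # underloaded: shed capacity
    return max(MIN_WORKERS, min(MAX_WORKERS, workers))

workers = 4
while True:
    cpu = psutil.cpu_percent(interval=5)   # observe (blocks ~5s while sampling)
    workers = scale(workers, cpu)          # decide and act
    print(f"cpu={cpu:.1f}% -> workers={workers}")
```

On Linux, psutil.sensors_temperatures() exposes the same kind of per-sensor hardware data mentioned above, so the same loop could just as easily react to thermals instead of load.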

I have been kind of thinking about this "trend" for a while, but this announcement that ACPI is now adding hardware health telemetry inspired me to finally write up a description of the idea.

What do people think? Have other people noticed this trend toward self-adapting systems? Is this an oversimplification of computer engineering?

