[-] merari42@lemmy.world 9 points 9 hours ago

Typical BMW driver: Forget flashing their headlights to move people out of the way on the highway—now they’ve got a 122mm rocket launcher for that

[-] merari42@lemmy.world 7 points 9 hours ago

Least impractical tuned BMW

Delicious (lemmy.world)
[-] merari42@lemmy.world 0 points 1 week ago

You need to get your head checked (by a jumbo jet)

[-] merari42@lemmy.world 6 points 1 week ago

You're my wonderwall

DuckDuckGoose (lemmy.world)
[-] merari42@lemmy.world 0 points 1 week ago* (last edited 1 week ago)

I heard there was a secret cord.
You plug it in to meet the lord.
But you don't really care for safety, do ya?
It goes like this: you plug it in,
And in a flash, the lights go dim,
The power's gone,
and now it's running through ya.

[-] merari42@lemmy.world 9 points 2 weeks ago

At least you only have to sink one Bayesian Superyacht to update your priors. To get at population parameters, you would need to sink many Frequentist Superyachts.

[-] merari42@lemmy.world 0 points 2 weeks ago

When do we Germans beat them 7:1 again? Or is Germany just not as good at murder? /s

[-] merari42@lemmy.world 5 points 3 weeks ago

If you make it from coal it is vegan because coal is just plants. If it's made from petroleum it is not vegan because it is made from dinosaurs.

[-] merari42@lemmy.world 4 points 4 weeks ago

I was on holiday in the Cinque Terre in Italy with my wife a few years ago. Because of a rainy day, we decided to take a train to Genoa and visit some museums. At the maritime museum I randomly met an Italian coworker/coauthor from my research institute in Germany, who was visiting his family in his hometown with his wife.

[-] merari42@lemmy.world 12 points 1 month ago

For a user without much technical experience, a ready-made GUI like Jan.ai is probably a good start: it offers automatic model download and can run models with the ggml library on consumer-grade hardware such as Apple M-series chips or cheap Nvidia or AMD GPUs.

For slightly more technically proficient users, Ollama is probably a great choice for hosting your own OpenAI-like API for local models. I mostly run gemma2 or small Llama 3.1 models with it.
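A minimal sketch of what "OpenAI-like API" means in practice: Ollama serves an OpenAI-compatible chat-completions endpoint on its default port (11434). The snippet below just builds the request with the standard library; the model name `llama3.1` is only an example and assumes you have already pulled that model.

```python
import json
from urllib import request

# Ollama's OpenAI-compatible endpoint on its default local port.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(prompt, model="llama3.1"):
    """Build an OpenAI-style chat-completion request for a local Ollama model."""
    payload = {
        "model": model,  # example model name; use whatever you have pulled
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Why is the sky blue?")
# With Ollama running, you would send it like this:
#   with request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint mimics OpenAI's, most existing OpenAI client libraries work against it by just pointing the base URL at localhost.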

[-] merari42@lemmy.world 0 points 1 month ago

Better than being the pole vaulter whose medal ambitions were foiled by his long wang.

O Yea (lemmy.world)
Heathens! (lemmy.world)

merari42

joined 5 months ago