v_pp

joined 3 years ago
[–] v_pp@lemmygrad.ml 15 points 1 month ago (2 children)

Isn't Crimea part of Russia?

[–] v_pp@lemmygrad.ml 3 points 2 months ago (1 children)

I think RCP is the Avakian cult; isn't that just a US thing? But they're different from RCA.

[–] v_pp@lemmygrad.ml 9 points 2 months ago

Per RNN on Telegram, Hezbollah has claimed to have sent at least two drones toward the north of Tel Aviv, but has not yet announced the targets or whether they were successful. There will possibly be an announcement later today.

[–] v_pp@lemmygrad.ml 24 points 3 months ago (1 children)

They're hindering Americans' return to Earth, too, actually.

[–] v_pp@lemmygrad.ml 10 points 4 months ago

Well, yeah, but perhaps that implies they don't know who the actual communists are, so they will still target the "commie" demonrats.

[–] v_pp@lemmygrad.ml 5 points 4 months ago* (last edited 4 months ago) (1 children)

AI/ML research has long been notorious for choosing bullshit benchmarks that make your approach look good, after which nobody ever actually uses it because it isn't that good in practice.

It's totally possible that there will be legitimate NLP use-cases where this approach makes sense, but that is almost entirely separate from the current LLM craze. Also, transformer-based LLMs pretty much entirely supplanted recurrent networks as early as 2018 in basically every NLP task. So even if the semiconductor industry massively reoriented itself toward producing chips that support "MatMul-free" models like this one just to capture the energy reduction, the trade-off would be model outputs that are even more garbage than they already are.
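
For what it's worth, my rough understanding of the "MatMul-free" trick (this is my own toy illustration, not the paper's code) is that the weights get constrained to {-1, 0, +1}, so a dense matrix multiply collapses into additions and subtractions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)            # input activations
W = rng.integers(-1, 2, size=(4, 8))  # ternary weights in {-1, 0, +1}

# Ordinary dense layer: a full matrix multiply.
y_matmul = W @ x

# "MatMul-free" view: with ternary weights, each output is just a sum of
# some inputs minus a sum of others -- no multiplications required.
y_add_only = np.array([x[row == 1].sum() - x[row == -1].sum() for row in W])

assert np.allclose(y_matmul, y_add_only)
```

That only pays off if the hardware can actually exploit it, which is why the whole pitch leans on hypothetical custom chips.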

[–] v_pp@lemmygrad.ml 4 points 4 months ago* (last edited 4 months ago) (3 children)

I'm highly skeptical of this at first glance. Replacing self-attention with gated recurrent units seems like a clear step backwards in natural language processing capability. The advance that gave rise to LLMs in the first place was the realization that building networks out of stacks of self-attention blocks, instead of recurrent units like GRUs or LSTMs, was extremely effective.
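
To make the contrast concrete, here's a minimal PyTorch sketch (purely illustrative, nothing to do with the paper's actual code) of the two kinds of building blocks: a GRU that has to push information through its hidden state one token at a time, versus a self-attention block where every token can attend to every other token in parallel:

```python
import torch
import torch.nn as nn

batch, seq_len, d_model = 2, 16, 64
x = torch.randn(batch, seq_len, d_model)

# Recurrent building block: tokens are processed sequentially, so
# long-range dependencies have to survive many hidden-state updates.
gru = nn.GRU(input_size=d_model, hidden_size=d_model, batch_first=True)
gru_out, _ = gru(x)

# Self-attention building block (what transformers stack): every token
# attends directly to every other token in one parallelizable step.
attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=4, batch_first=True)
attn_out, _ = attn(x, x, x)

print(gru_out.shape, attn_out.shape)  # both: torch.Size([2, 16, 64])
```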

In short, they are proposing an older type of model, one that is generally outclassed by the attention-based transformers that power all the LLMs we see today. I doubt it will achieve results anywhere near as good as existing LLMs. I foresee this type of research being used to deflect criticism of the ungodly amounts of energy used by LLMs: "See, people are working on making them way more efficient! Any day now..." Meanwhile, these models will never come to fruition.

[–] v_pp@lemmygrad.ml 4 points 6 months ago (1 children)

The strike was aimed at a target in occupied Palestine, seems appropriate.

[–] v_pp@lemmygrad.ml 8 points 9 months ago

Isn't the FLG headquarters compound in like upstate NY or something?

[–] v_pp@lemmygrad.ml 14 points 9 months ago (1 children)

> I personally reject the notion that a just and equitable future can only be built on a pile of corpses.

Ok, then how?

[–] v_pp@lemmygrad.ml 8 points 9 months ago

Hey dumbass, headlines follow different style practices. The same conventions of grammar do not apply.

[–] v_pp@lemmygrad.ml 4 points 11 months ago

This is some absolutely depraved shit. You're sitting here justifying levels of death and destruction and human misery that are beyond your comprehension just because of some made-up conspiracy theories about Russia "meddling" in other countries. In what possible universe does that make you anything other than pure fucking evil?
