nanoobot

joined 1 year ago
[–] nanoobot@kbin.social 1 points 1 year ago

I completely agree on The Lost World. If you see it as a love letter to the original, it works a lot better. So many of the iconic scenes from the first film are repeated with a different spin in The Lost World. It does take itself less seriously, but it has a lot of heart and a good attention to detail. I think the two pair together perfectly, like Alien and Aliens.

[–] nanoobot@kbin.social 1 points 1 year ago (1 children)

Well, I won't be, and just because one thing might be higher probability than another doesn't mean it's the only thing worth worrying about.

[–] nanoobot@kbin.social 2 points 1 year ago

That's ridiculous, of course it counts as AI. It's not conscious, and it's not very intelligent, but it has some intelligence by any reasonable definition.

[–] nanoobot@kbin.social 1 points 1 year ago

Sure, maybe; I'm not certain at all. But are you certain enough to bet your life on it?

[–] nanoobot@kbin.social 1 points 1 year ago* (last edited 1 year ago) (3 children)

I never said how long I expected it to take, so how do you know we even disagree there? But is 50 years a long time to you? Personally, anything less than 100 would be insanely quick. The key point is that I don't have high certainty in my estimates. Sure, it might well take more than 50 years, but what chance is there that it's even quicker? 0.01%? To me that's a scarily high number! Would you be happy having someone roll a die with a 1 in 10,000 chance of killing everyone? How low is enough?

[–] nanoobot@kbin.social 4 points 1 year ago* (last edited 1 year ago) (8 children)

It makes my blood boil when people dismiss the risks of ASI without any notable counterargument. Do you honestly think something a billion times smarter than a human would struggle to kill us all if it decided it wanted to? Why would it need a terminator to do it? A virus would be far easier. And who's to say how quickly AI will advance now that AI is directly assisting progress? How can you possibly have any certainty about timelines or risks at all?

[–] nanoobot@kbin.social 7 points 1 year ago

Something you may not have considered is that the majority of our brains are used for things like sensory input and motor control, not for thinking. This is why brain size relative to body size is so important. A whale has a far larger brain than you or I do, but is significantly less intelligent.

[–] nanoobot@kbin.social 2 points 1 year ago

The change in the conversation about the importance of alignment this year has been remarkable. Last year had me feeling pretty cynical, but I am starting to feel legitimate hope again.

[–] nanoobot@kbin.social 4 points 1 year ago* (last edited 1 year ago)

Sure, but should legality be based on artistic effort? (Not asking you directly, just open to anyone who thinks what Stable Diffusion, etc. do should be illegal.)

[–] nanoobot@kbin.social 1 points 1 year ago

digiKam for image and video collection management and viewing (it also does duplicate detection)

[–] nanoobot@kbin.social 1 points 1 year ago

I agree with you, but this is a really bad counterargument to what they said. Even widely agreed politeness conventions 'compel' speech to a degree, so the debate is really about what speech is acceptable for society to encourage or suppress, rather than whether cultural changes are changing what people are compelled to say. Also, I don't think they said anything suggesting they are more concerned about that than about hateful violence?
