kogasa

joined 2 years ago
[–] kogasa@programming.dev -2 points 4 days ago

Sounds like a skill issue. If that ruined the game for you, I dunno what to say. Might be a replicant?

[–] kogasa@programming.dev 3 points 4 days ago (3 children)

I agree with them, that game is a masterpiece. Didn't you love it?

[–] kogasa@programming.dev 12 points 1 week ago (1 children)

It doesn't top out below 144Hz. There are benefits, with diminishing returns, up to at least 1000Hz, especially for sample-and-hold displays (like all modern LCD/OLED monitors). 240Hz looks noticeably smoother than 144Hz, and 360Hz looks noticeably smoother than 240Hz. Past that it's probably pretty hard to tell unless you know what to look for, but there are a few specific effects that continue to be reduced. https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
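The linked article's rule of thumb ("Blur Busters Law": 1ms of persistence ≈ 1px of motion blur per 1000 px/s of eye-tracked motion) makes the diminishing returns easy to see. A quick sketch, assuming a full-persistence sample-and-hold display:

```python
def persistence_ms(refresh_hz):
    # Full-persistence sample-and-hold: each frame stays on screen
    # for the entire refresh interval.
    return 1000.0 / refresh_hz

def motion_blur_px(refresh_hz, speed_px_per_s):
    # Blur Busters rule of thumb: blur (px) ~ persistence (s) * speed (px/s)
    return (persistence_ms(refresh_hz) / 1000.0) * speed_px_per_s

# Blur at 1000 px/s of tracked motion: doubling Hz halves the blur,
# but each step up buys fewer absolute pixels of improvement.
for hz in (144, 240, 360, 1000):
    print(f"{hz}Hz: {motion_blur_px(hz, 1000):.2f}px")
```

Going 144→240Hz removes about 2.8px of blur at that speed; going 360→1000Hz only removes about 1.8px more, which is why the gains get subtle.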

[–] kogasa@programming.dev 2 points 1 week ago

That example recording is awesome

[–] kogasa@programming.dev 19 points 1 week ago

Yippee I missed these

[–] kogasa@programming.dev 1 points 1 week ago (2 children)

I know, I'm just saying it's not theoretically impossible to have a phone number as a token. It's just probably not what happened here.

> the choice of the next token is really random

It's not random in the sense of a uniform distribution, which is what "generate a random [phone] number" implies.

[–] kogasa@programming.dev 2 points 2 weeks ago (4 children)

A full phone number could be in the tokenizer vocabulary, but any given one probably isn't in there

[–] kogasa@programming.dev 9 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

I mean the latter statement is not true at all. I'm not sure why you think this. A basic GPT model reads a sequence of tokens and predicts the next one. Any sequence of tokens is possible, and each digit 0-9 is likely its own token, as is the case in the GPT-2 tokenizer.

An LLM can't generate random numbers in the sense of a proper PRNG simulating draws from a uniform distribution; the output will probably have some kind of statistical bias. But it doesn't have to produce sequences contained in the training data.
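A toy sketch of that point, with completely made-up token probabilities (not a real model): digits are individual tokens, the sampling is biased rather than uniform, yet any digit string is reachable:

```python
import random

# Hypothetical next-token distribution over digit tokens 0-9.
# The weights are invented for illustration; a real model's probabilities
# would come from the network, but the sampling step looks the same.
digit_tokens = [str(d) for d in range(10)]
weights = [5, 8, 10, 9, 7, 6, 8, 12, 9, 6]  # deliberately non-uniform

def sample_digit_string(length, rng):
    # Every length-n digit string has nonzero probability, whether or
    # not it ever appeared in training data -- it's just not uniform.
    return "".join(rng.choices(digit_tokens, weights=weights, k=length))

rng = random.Random(42)
print(sample_digit_string(10, rng))  # a phone-number-shaped output
```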

[–] kogasa@programming.dev 6 points 2 weeks ago

It's a number, and complexity refers to functions. The natural inclusion of numbers into functions maps pi to the constant function x -> pi, which is O(1).

If you want the time complexity of an algorithm that produces the nth digit of pi, the best known ones are something like O(n log n), with O(1) being impossible.
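For concreteness, the digit extractors in that class are in the family of the Bailey-Borwein-Plouffe (BBP) formula, which produces the nth hexadecimal digit of pi without computing the earlier ones. A rough sketch (float precision only holds up for smallish n):

```python
def pi_hex_digit(n):
    # nth hex digit of pi after the point (n = 0 gives the first), via BBP:
    # pi = sum_k 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
    def series(j):
        # Fractional part of sum_k 16^(n-k) / (8k+j). Three-argument pow
        # does the modular exponentiation, so huge integer parts are
        # discarded instead of computed -- this is where the speed comes from.
        s = 0.0
        for k in range(n + 1):
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        k, term = n + 1, 1.0
        while term > 1e-17:  # a few tail terms with k > n
            term = 16.0 ** (n - k) / (8 * k + j)
            s += term
            k += 1
        return s % 1.0

    x = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(x * 16)

print("".join(format(pi_hex_digit(i), "x") for i in range(8)))  # 243f6a88
```

Each digit costs about n modular exponentiations, each O(log n) with fast exponentiation, which is where the O(n log n)-ish bound comes from.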

[–] kogasa@programming.dev 2 points 2 weeks ago (1 children)

The direct connection is cool, I just wonder if a P2P connection is actually any better than going through a data center. There are gonna be intermediate servers either way, right?

Do you need to have Tailscale set up on any network you want to use this on? Because I'm a fan of being able to just throw my domain or IP into any TV and log in.

[–] kogasa@programming.dev 3 points 2 weeks ago (3 children)

I just use nginx on a tiny Hetzner VPS acting as a reverse proxy for my home server. I dunno what the point of Tailscale is here; maybe better latency and fewer network hops in some cases, if a P2P connection is possible? But I've never had any bandwidth or latency issues doing this.
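For reference, that setup is roughly the following nginx config. The hostname, certificate paths, and upstream address (here assumed to be a home server reachable over some VPN/tunnel) are all placeholder values:

```nginx
# Sketch of a VPS reverse proxy in front of a home server.
# media.example.com and 10.0.0.2:8096 are hypothetical.
server {
    listen 443 ssl;
    server_name media.example.com;

    ssl_certificate     /etc/letsencrypt/live/media.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/media.example.com/privkey.pem;

    location / {
        proxy_pass http://10.0.0.2:8096;  # home server, e.g. over a WireGuard tunnel
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```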

[–] kogasa@programming.dev 12 points 2 weeks ago (2 children)

It gets around port forwarding/firewall issues that most people don't know how to deal with. But putting it behind a paywall kinda kills any chance of it being a benevolent feature.
