What if the universe is simulated and gravitational time dilation is caused by a lower FPS/TPS in regions with a lot of mass/energy (perhaps to save on computation)?
You know how time passes more slowly near a black hole? What if that's because the universe is updating/processing stuff more slowly in such regions compared to the emptier areas?
Let's imagine the universe has a framerate. What if that framerate drops dramatically near the event horizon? Say, for every update/tick/frame there, thousands or millions of frames happen in the rest of the universe. If you were near a black hole, your own framerate would still feel normal, but the rest of the universe would seem to be running at a much, much higher framerate, with everything out there happening absurdly fast from your perspective.
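If anyone wants to play with the numbers, here's a minimal Python sketch. It assumes the standard Schwarzschild time-dilation factor for a hovering observer, sqrt(1 - r_s/r), is what plays the role of the "framerate ratio" (that formula is just the textbook GR result, not something my speculation predicts):

```python
# Minimal sketch, assuming the Schwarzschild time-dilation factor stands in
# for the "framerate ratio": d(tau)/dt = sqrt(1 - r_s/r), i.e. how many local
# ticks elapse per tick of a far-away observer, for someone hovering at radius r.
import math

def framerate_ratio(r_over_rs: float) -> float:
    """Local ticks per distant tick for an observer hovering at r = r_over_rs * r_s."""
    return math.sqrt(1.0 - 1.0 / r_over_rs)

for r in [1000.0, 10.0, 2.0, 1.1, 1.001, 1.000001]:
    ratio = framerate_ratio(r)
    print(f"r = {r:>9} r_s  ->  {ratio:.6f} local ticks per distant tick "
          f"(~{1 / ratio:,.1f} distant frames per local frame)")
```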
Maybe the framerate drops so much near the event horizon that stuff falling in essentially freezes from the perspective of the rest of the universe: the framerate there asymptotically approaches zero and the whole thing grinds to a halt. In other words, the infalling stuff never seems to actually cross the horizon and reach the singularity, because it's barely getting updated/processed anymore (I mean, it is, but so rarely that it would take basically an infinite amount of time to get there).
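And pushing that same assumed factor right up against the horizon (again, just the textbook formula, not part of the speculation itself), you can see the "asymptotically approaches zero" bit numerically:

```python
# Same assumed Schwarzschild factor as above, evaluated ever closer to the
# horizon: the local tick rate (as seen from far away) heads to zero, which is
# the "updates become so rare it never seems to cross" picture.
import math

r_s = 1.0  # work in units of the Schwarzschild radius (toy numbers)
for eps in [1e-1, 1e-3, 1e-6, 1e-9, 1e-12]:
    r = r_s * (1.0 + eps)  # hovering just outside the horizon
    ratio = math.sqrt(1.0 - r_s / r)
    print(f"r = r_s * (1 + {eps:.0e}): {ratio:.3e} local ticks per distant tick")
```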
This is obviously just my fun lil speculation that's probably wrong, but what do you guys think? Does it make sense, and if it doesn't, why not?