How much computing power is necessary for fully immersive, photorealistic simulations?

A fully immersive, photorealistic simulation would have to meet these criteria:

-The resolution of the image matches the pixel density of the eye. The eye has an effective resolution of about 576 megapixels per eye, but it only resolves tens of megapixels at any one moment, because the overwhelming majority of that density is concentrated in one spot, called the fovea; as you move away from the fovea, there is less detail. The image must also cover the entire field of view, not just a small percentage of a person's vision. A phone is held at a distance, so it covers a very small percentage of the FOV; that's why retina displays top out around 1440p. To match the 576 MP density across the whole FOV, each eye needs roughly a 24K square display (24,000 x 24,000 pixels is 576 MP). See the sketch after this list.

-The frame rate must be about 1000 fps. The eye takes in information at roughly 30 fps, but can perceive differences up to around 1000 fps (the same reason a 120 fps game looks much better than a 30 fps game: we can tell the difference because it is smoother).

-Must be photorealistic. Many techniques have to work together to produce photorealistic graphics: ray tracing, bump mapping, pixel/vertex shaders, texture mapping, and more. Ray tracing is the most computationally demanding, since it requires tracing light rays bouncing off surfaces with different optical properties for every single pixel. Every 3D game is built from objects made of polygons (triangles assembled into all kinds of 3D shapes). To make a simulation indistinguishable from reality, the polygons must be as small as or smaller than a pixel; if they were any bigger, a player looking carefully would notice that objects are made of triangles, which would look unrealistic.

-Must be able to simulate kilometers of distance from the player.
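To make the first two criteria concrete, here is a minimal back-of-envelope sketch in Python (the 576 MP per eye and 1000 fps figures are the assumptions above, not measured values):

```python
import math

# Back-of-envelope check of the first two criteria. Both input figures are
# this post's assumptions, not measured values.
EYE_PIXELS = 576e6   # claimed effective pixel count per eye (576 MP)
TARGET_FPS = 1000    # claimed upper limit of perceivable frame rate

# Side length of a square per-eye display matching that pixel count.
side = math.sqrt(EYE_PIXELS)
print(f"per-eye display: {side:.0f} x {side:.0f}")     # 24000 x 24000, i.e. '24K'

# Raw pixel throughput for two eyes at the target frame rate.
throughput = 2 * EYE_PIXELS * TARGET_FPS
print(f"pixel throughput: {throughput:.2e} pixels/s")  # ~1.15e+12 pixels/s
```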

How much computational power is needed?

Most graphics cards support 4K displays, but that is not enough for fully immersive displays. A 4K display is about 2K pixels vertically, and a single eye needs 12 times that vertical detail (24K), with roughly the same factor horizontally. So two 24K eye displays have on the order of 12 x 12 = 144 times the pixel count of a single 4K display. Graphics cards therefore need to be at least 144 times more powerful than today's.
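A quick sanity check of that factor (3840 x 2160 is the standard 4K resolution; the 24K figure comes from the criteria above):

```python
# Ratio of two 24K-square eye displays to one standard 4K display.
uhd_pixels = 3840 * 2160     # one 4K display, ~8.3 MP
eye_pixels = 24_000 ** 2     # one 24K square eye display, 576 MP

ratio = 2 * eye_pixels / uhd_pixels
print(round(ratio))          # ~139; the post rounds this to 12^2 = 144
```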

The top graphics cards output around 50 fps when playing the most graphically intensive games. To reach 1000 fps, graphics cards need to be 20x faster.

Photorealism is hard to quantify, but we can estimate. I don't know much about ray tracing, so I'll just arbitrarily say that you need 10x the computing power for realistic lighting (lighting is key to realism).

The average 3D game scene has roughly 100k polygons (a number I found on a Google search). At 4K (about 8.3 million pixels), that works out to roughly 1/80 of a polygon per pixel. A display with 144 times the pixels would need 80 x 144, or about 11,500, times as many polygons to reach one polygon per pixel. But since a graphics card 144 times more powerful already pushes 144 times as many polygons (see above), the additional factor for geometry is only 80x.
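A sketch of that geometry arithmetic (the 100k polygon count is the figure above; the target is one polygon per pixel):

```python
# Extra geometry needed to reach one polygon per pixel, on top of the 144x.
scene_polygons = 100_000     # typical scene size, per the figure above
uhd_pixels = 3840 * 2160     # ~8.3 million pixels at 4K

polys_per_pixel = scene_polygons / uhd_pixels
print(f"1/{1 / polys_per_pixel:.0f} polygon per pixel")      # ~1/83, rounded to 1/80
print(f"~{uhd_pixels / scene_polygons:.0f}x more geometry")  # ~83x, rounded to 80x
```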

Today's games have a render distance of a couple hundred meters, but the eye can see tens of kilometers away. To create a world that feels as real and as big, this must not be ignored. You don't need to render every polygon at extremely far distances; simplified models of mountains, far-away trees, buildings, etc. are enough, because distant objects show less detail anyway, so not every small detail needs to be rendered. So there won't be a crazy increase in demand for computing power because of this, but it will still be significant. I'm not sure exactly how much more, so I'll say about 10x more computing power is needed for this (conservative estimate).
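This distance-based simplification is what games call level of detail (LOD). A toy sketch of the idea, with tiers and distance thresholds invented purely for illustration:

```python
# Toy LOD selector: farther objects get cheaper representations, so a long
# draw distance does not multiply polygon cost linearly.
def pick_lod(distance_m: float) -> str:
    if distance_m < 100:
        return "full mesh"       # sub-pixel polygons, full shading
    if distance_m < 1_000:
        return "decimated mesh"  # reduced polygon count
    if distance_m < 10_000:
        return "impostor"        # flat billboard sprite
    return "skybox"              # painted into the far background

for d in (50, 500, 5_000, 50_000):
    print(f"{d:>6} m -> {pick_lod(d)}")
```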

So, the rough calculation:

144 (pixels) x 20 (frame rate) x 10 (lighting) x 80 (geometry) x 10 (render distance) = 23,040,000 times as powerful as today's computers.

Today's top GPUs can do about 10 TFLOPS. A computer needed for this would do 2.304 x 10^20 FLOPS.
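The same arithmetic in a few lines of Python (all five factors and the 10 TFLOPS baseline are the estimates from this post):

```python
# Multiply out the five estimated factors and scale today's GPU throughput.
factors = {
    "pixels vs 4K":  144,
    "frame rate":     20,   # 1000 fps vs ~50 fps today
    "lighting":       10,   # guessed ray-tracing overhead
    "geometry":       80,   # sub-pixel polygons
    "draw distance":  10,   # guessed long-range overhead
}

total = 1
for f in factors.values():
    total *= f
print(f"{total:,}x today's GPUs")           # 23,040,000x

TODAY_FLOPS = 10e12                         # ~10 TFLOPS for a top GPU
print(f"{total * TODAY_FLOPS:.3e} FLOPS")   # 2.304e+20
```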

This is a rough estimate, and it could turn out to be less demanding; it also only accounts for visual output. Graphene and carbon nanotube chips could make tomorrow's computers millions of times faster than today's. I think we will indeed reach those demands within three decades (going by the exponential growth of computing).

What are your thoughts on this? Are there things I didn’t consider? Are we going to be satisfied with this much computing power?
