/g/ - Technology


File: 6.mp4 (1.39 MB, 1920x1076)
>you lived long enough to witness the death of rasterized rendering techniques
where were you when poly was kill? also Nvidia is sweating bullets and trying to create a research monopoly for gaussian splats that is built upon raytracing. otherwise gaussian splats will make nvidia gpu technology like raytracing and cuda obsolete as well.
https://youtu.be/zLIOZ7g4kfA?t=909
https://www.youtube.com/watch?v=Pjx8hHtci6Q
https://www.youtube.com/watch?v=_e3VWhWkwF8

can't wait for all the rasterized tears ITT
>>
>>106705224
>you lived long enough to witness the death of rasterized rendering techniques
Did I? I don't know what any of that means. Should I hedge against NVDA?
>>
>>106705224
I watched a video on that a while back. But if it's as good as they claim, why is it only appearing now? Was it limited by compute before? Or is it for AR/XR/VR, 3D scanning stuff?
>>
>>106705224
You're posting this again
I can see this being useful for something like google maps, where you just want to fly through an irl scene.
But it seems useless for anything else like games. Like, how are you supposed to program a robot arm in a game to move in a certain way? Gaussian splats won't have a concept of which part belongs to which object or how to animate them
>>
>>106705262
AI
>>
>>106705238
>>106705247
>>106705262
It's relatively new and nvidia is not hyping it (yet) for reasons stated in my original post. AI made it possible to give physical properties to gaussian splats. Also splats can be driven by basic 3d mesh counterparts.
>why would u do that
because rendering gaussians is much cheaper than 3d polys.
Check this out: https://blog.playcanvas.com/playcanvas-open-sources-sog-format-for-gaussian-splatting/
Not to mention how much easier and faster it is to create a gaussian splat vs a 3d mesh.
btw I wasn't on the hype train before the animation/physics possibilities were made public
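To make the "cheaper" claim concrete, here's a rough numpy sketch of the per-splat projection step a gaussian renderer performs, following the standard 3DGS/EWA math. The function and parameter names are made up for illustration and are not from any of the linked projects:

```python
# Minimal sketch (illustrative names, not a real API): project one 3D gaussian
# into a 2D screen-space splat, as standard 3DGS renderers do.
import numpy as np

def project_splat(mean_world, cov_world, view, focal_x, focal_y):
    """Return the 2D mean and 2x2 covariance of a splat in screen space."""
    # Transform the splat center into camera space.
    t = view[:3, :3] @ mean_world + view[:3, 3]
    x, y, z = t

    # Jacobian of the perspective projection at the splat center.
    J = np.array([
        [focal_x / z, 0.0,         -focal_x * x / z**2],
        [0.0,         focal_y / z, -focal_y * y / z**2],
    ])

    # Rotate the 3D covariance into camera space, then project it to 2D.
    W = view[:3, :3]
    cov_cam = W @ cov_world @ W.T
    cov_2d = J @ cov_cam @ J.T          # 2x2 screen-space covariance

    mean_2d = np.array([focal_x * x / z, focal_y * y / z])
    return mean_2d, cov_2d
```

Per splat that's one small matrix sandwich, with no vertex buffers, triangle setup or lighting pass, which is where the cost argument against a dense mesh comes from.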
>>
>>106705456
>why would u do that
Chill man you don't have to explain yourself to me
>>
>>106705328
>AI will just fix it!
Cool, but all the examples I've seen are static scenes. Proof of animated or real-time with GSplats?
>>
tl;dw is this just prerendered graphics?
>>
I was researching the topic and found this dynamic GS demo website, feels like Cyberpunk braindances irl
https://www.4dv.ai/viewer/salmon_10s?showdemo=4dv
>>
>>106705957
yes, but not rendered using rasterization like the last 30 years of tech
>>
File: file.png (761 KB, 1184x689)
>>106706004
pic related, the dude talks and moves while you pan around
>>
>>106706153
>>106706004
yes, that's a 4D splat and less interesting in my opinion. it requires a very expensive camera setup and computational power/bandwidth. they also use this in basketball and football/soccer games to do these real life 3D replays.
In my opinion static gaussians splats with physics are much more interesting:
https://www.youtube.com/watch?v=_e3VWhWkwF8
which would mean you could make video games and simulations fully with gaussian splats instead of 3d polymeshes
>>
>>106705899
literally shown in all videos of the OP
>>
OPs on this board will really show videos of Nvidia research and tell me Nvidia should be scared of research they are funding
>>
splats are cool but there are lots of limitations still around authoring, animating, relighting, rigging etc
i haven't messed around too much, but i don't think you can even cast shadows onto a splat without creating a mesh representation right now
>>
>>106706589
how mentally handicapped are you exactly? I literally wrote in the OP why they are funding the research. Are you that daft? Let's try this again:
>Grog has monopoly on flint stones to make fire
>Grog notices a few smart people are creating small gas box and are having success creating fire
>Grog realizes small gas box is superior to stone in 99% of usecases
>Grog contacts smart small gas box people and offers them unlimited bananas for their work. The only condition is that small gas box always needs at least one rock to work
>smart small box people agree because never having to find and pick bananas is very nice
>now all small gas boxes have rock inside and only work with physical presence of rock
>slightly above average people can't figure out how to make small gas box work without rock
>normal people and idiots like you are busy burning their fingers with small gas box
>>
>>106706784
>i haven't messed around too much, but i don't think you can even cast shadows onto a splat without creating a mesh representation right now
shadowing works now without a fake mesh. relighting has worked for a long time already. the other things are still finicky, indeed. but people have shown it's actually possible, so it's only a matter of time. and the performance cost will be much less than a realtime video/3D world gen like Google showcased (including other benefits)
>>
>>106706826
can you link to some of the relighting work?
the stuff i've seen just looks like tints on the colour information. there's no delighting so you get stuck with shadows baked into the colour.
>>
>>106705224
Call me when a playable game with good aesthetics is made with it.
>>
>>106706786
I'm sure nvidia are terrified of technology they are funding or research that needs nvidia tech to work. Yep, I'm sure that bubble will burst any day now. Just like Deepseek, right? Or FSR. Or all the others.

Go ahead and pull me the papers on this not using nvidia. Give people a reason to dump $NVDA
>>
>>106706844
https://www.youtube.com/watch?v=3I-nOIwYK9M
>there's no delighting so you get stuck with shadows baked into the colour.
soon there will be an automated solution for this. simply tune your video footage with this before creating the gaussian splat:
https://www.youtube.com/watch?v=Qjzc_uORnuE

>>106706878
I don't know what your endgame is, but I'm not giving you (You)'s until you address my argument. Obviously they aren't scared shitless. Unless someone manages to create gaussian splatting tools just as good as the ones from Nvidia without the raytracing requirement, which, if you know anything about gaussian splatting, is very viable and realistic.
>>
https://arxiv.org/pdf/2409.10161
>All experiments were conducted using a UR5 robot equipped with a Robotiq 2F-85 gripper and 2 Intel Realsense D455 cameras [54] with deployment on an NVIDIA RTX 3080Ti GPU for the Diffusion Policy

Just show me graphics/AI tech that will not give anyone a reason to buy Nvidia GPUs and I will believe you
>>
>>106706947
that diffusionrender thing looks neato.
will probably grab it for extracting pbr maps from photos and see if it's better than existing solutions.
>>
>>106706947
>which if you know anything about gaussian splatting, is very viable and realistic.
I see
and who has made it
show me
>>
Nvidia has not increased rasterization performance significantly; they've only increased raytracing cores and AI compute.
It's kind of the opposite: AMD is better at rasterization than Nvidia. AMD also tries to be cheaper by going with previous-generation memory chips (GDDR6) that don't affect gaming much.
Gaussian splatting isn't going to replace rasterization / polygon raytracing in gaming because it's slow and hard to animate. But you can convert point clouds into geometry, and I wouldn't be surprised if ultra nightmare graphics settings started using point clouds (mainly for effects), maybe even gaussians.
>>
>>106706947
Not that anon, but the problem with lighting gaussians is that there are no shadows.
So if you had 2 rooms side by side, you couldn't have only 1 room lit up (it actually would be possible if you put a hidden polygon wall between the rooms, but that's inconvenient in my opinion, and I think people expect everything to cast a shadow in 2025 if it runs at 60fps 1080p DLSS).
I don't think we will raytrace gaussians any time soon because RTX rendering is done with triangles, and all high-performance / realtime gaussian rendering is done by converting the gaussians into triangles.
By the time you convert the gaussians into triangles, the cost of raytracing them would be absolutely awful; the realtime rendering performance comes from doing simple lighting. I could imagine RTX cores maybe adding support for point-based raytracing, but I doubt it.
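For reference, a toy sketch of the "simple lighting" being described here: after sorting by depth, each pixel just alpha-composites the splats' baked colors front to back. Data layout and names are assumptions, purely illustrative:

```python
# Rough sketch of front-to-back splat compositing for one pixel.
# colors/alphas/depths would come from the splats overlapping that pixel.
import numpy as np

def composite_pixel(colors, alphas, depths):
    """colors: (N,3) baked splat colors, alphas: (N,), depths: (N,) camera-space."""
    order = np.argsort(depths)               # front to back
    out = np.zeros(3)
    transmittance = 1.0
    for i in order:
        out += transmittance * alphas[i] * colors[i]
        transmittance *= 1.0 - alphas[i]
        if transmittance < 1e-4:              # early exit once the pixel is opaque
            break
    return out
```

There is no per-light shading anywhere in that loop; the view-dependent color is already baked into each splat, which is exactly why cast shadows and moving lights don't come for free.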
>>
>>106707235
soonTM (self-shadowing and shadow projection, as far as I understood)
https://lumigauss.github.io/
hybrid mesh/gaussian approach:
https://arxiv.org/html/2502.07754v1
>>
>>106706552
You're a fucking retard and no one gives a single fuck what an anonymous user "finds interesting."
>>
Retards like you don't realize how society changes. You think that people are just waiting for a breakthrough, but companies are camping on IP for billions of years and waiting for a reason to implement it. We're not "pushing the limits" like you think we are, we're couching every release with incremental teasing and pumping more money into marketing.
Until you learn what is required to actually compete (a global-level entity deciding to eat someone else's lunch), you will continue to die in this fantasy world every time reality decides to "surprise" your completely autistic, self-flagellating world.
Get off the hype train. There's work to do.
>>
not my problem
>>
>>106705224
Gaussian splats aren't a geometry primitive; they are a way to estimate an arbitrary view from a bunch of pictures. No relighting, no animation, no nothing.
>>
>>106707757
NO THESE ARE IN HEAVY COMPETITION WITH THE FUNDAMENTAL STUFF THIS BOARD ALWAYS TALKS ABOUT! EVERYTHING IS A BATTLE BETWEEN THE POPULAR THINGS IN DISCUSSION!
>>
>>106707757
>Gaussian splats aren't a geometry primitive
They literally are... unless you are referring to some specific graphics API spec or something
>No relighting
You can relight the scene using AI and then at render time interpolate between the different lighting conditions as desired
>no animation
There is though, it's called 4D gaussian splatting
>>
>>106707840
how many GB does a 4D splat take? You can get maybe one animation of one model in vram working even with compression tricks. This isn't suitable for realtime rendering.

>relight using AI
garbage, AI is not going to give you deterministic or good results

>they aren't a geometry primitive
in practice it's a quad (two triangles) and most implementations use discard + math in the fragment shader to determine the color and opacity of the quad at a certain offset from the center of it.

We had this thread already; they're cool, but the reason every engine isn't going all-in on them is that they're not better than traditional rasterization, not because it's some ultra-cool technology that everyone is too stupid to use.
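A quick sketch of the quad + discard trick described above, i.e. the per-fragment gaussian evaluation. This is a numpy stand-in for what would really be fragment shader code; the names are illustrative:

```python
# Evaluate one splat's contribution at a fragment's offset from the quad center
# and bail out ("discard") when the contribution is negligible.
import numpy as np

def fragment_alpha(offset, cov_2d, opacity, cutoff=1.0 / 255.0):
    """offset: (2,) pixel offset from splat center, cov_2d: 2x2 screen covariance."""
    cov_inv = np.linalg.inv(cov_2d)
    power = -0.5 * offset @ cov_inv @ offset      # gaussian falloff exponent
    alpha = min(0.99, opacity * np.exp(power))    # clamp as common implementations do
    if alpha < cutoff:
        return None                               # a shader would `discard` here
    return alpha
```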
>>
>>106707840
>at render time interpolate between the different lighting conditions
Does this mean all lighting has to be pre-baked? At what granularity can you interpolate this - can you interpolate every part of an object separately between shaded and bright, or is it scene-wide?
>>
>>106707873
>how many GB does a 4D splat take?
Not as much as you think; they're done in a clever way, not just as a "flipbook"-style animation system
>in practice it's a quad (two triangles) and most implementations use discard + math in the fragment shader to determine the color and opacity of the quad at a certain offset from the center of it.
I'm not sure why you think this is relevant; using triangle rasterization to approximate 3DGS is fine. You can compare the CUDA renders with the triangle rendering approximation: they are nearly identical, but the triangles render a bit quicker. It's a non-issue.
>>106707878
>Does this mean all lighting has to be pre-baked?
You have to prebake the lighting "keyframes" for each splat, but at runtime you could easily interpolate between whatever number of states you've saved to vram. For instance, in an indoor scene you could have a splat with a state for when the lights are on and a state for when they are off, then interpolate. Note that within these "keyframes" there is still a vast number of lighting variations with respect to the viewing angle of the camera; it isn't just one color
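A toy sketch of the interpolation idea described above: bake a couple of per-splat color states and blend them with a scalar at runtime. This is an entirely assumed layout for illustration, not a published method:

```python
# Blend two baked per-splat lighting states ("lights off" / "lights on").
import numpy as np

def blend_lighting(colors_off, colors_on, t):
    """colors_off/on: (N,3) baked per-splat colors, t in [0,1] = how 'on' the lights are."""
    t = float(np.clip(t, 0.0, 1.0))
    return (1.0 - t) * colors_off + t * colors_on

# e.g. dimming the room lights to 30% (hypothetical arrays):
# blended = blend_lighting(splat_colors_dark, splat_colors_lit, 0.3)
```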
>>
>>106707958
>For instance in an indoor scene you could have a splat with a state for when the lights are on and a state for when they are off, then interpolate.
My question is more about variable light sources, which has been a thing in the most basic raster renderers for decades. If I turn on a torch and walk around the room, is it something you can interpolate for by presumably having a "fully lit by torchlight" state everywhere but then efficiently only simulating the actual beam of the torch? Or if I turn on a lantern with a flickering flame?
>>
>>106705247
take the antisemitism to /pol/
>>
>>106708005
It's a different set of pros and cons. A moving light source would be difficult to account for with 3DGS
>>
>>106705224
>3D Gaussian Splatting for Realistic Physical AI Simulations
>Physical AI Simulations
was the AI really necessary here?
>>
>>106707026
Nigger, do your own research or stop investing in tech.
>>
>>106708587
So you can't.
$NVDA will remain
>>
>>106708314
What are the pros then, if it can't even do basic things that raster can?

Mind you, we're getting close to literal physically based rendering, with raytracing performance improving every gen, and raytracing is directly based on how real-life rays work. Going to a completely different methodology which gives up very basic capabilities of rendering a dynamic scene seems incredibly pointless.
>>
>>106707757
Wonder if you could wrap the splat elements in a low-poly mesh and use it for various weights on the gaussians.
>>
>>106709042
The use case for splats is scanning real-world data using a camera and automatically turning it into a 3D model people can examine. Like you can use it for house walkthroughs or 3D objects that you would buy. For synthetic data (video game characters etc.) 3D models are still better.
>>
>>106709049
You wouldn't need to wonder if you checked out the links in the OP xD
>>
>>106705328
>aimemeslop
kys
>>
isn't this just the INFINITE DETAIL VOXEL POINT CLOUD shit from 2011?
>>
>>106705224
>the death of rasterized rendering techniques
Isn't Gaussian splatting just creating a scene as a cloud of 3D Gaussian functions instead of building a mesh and projecting a texture onto it? I don't see the problem; that's still rasterisation.

>>106705262
I guess you could group them

>>106705456
>rendering gaussians is much cheaper than 3d polys.
I can see how that would be true for additive rendering, but no one wants that in 3D rendering, so how do you make Gaussians obscure each other properly?

Also how do you get the normals and calculate reflections and specular highlights with Gaussians?

>>106710820
It's voxels but fuzzy and sparse and surface-only. What a time to be alive!
>>
>>106706784
>animating
i don't care, just give me a realistic static environment and that's already a gigantic win, especially for VR
>>
>>106705224
>rasterization is dead
>relies on rasterization
>literally unironically relies on triangle-based geometry for gaussians
lol https://trianglesplatting.github.io/
>>
>>106709042
The pros are:
>Novel view synthesis, images -> 3D scene with no modeling required at all, fully automated
>By FAR the most realistic realtime renders, effortlessly BTFOs traditional rendering pipelines for realism
>Far far simpler than other realtime GI solutions and all of the ass-backwards optimizations they have to use to mimic a fraction of 3DGS's realism
>>106710820
>>106710956
No, it's a completely different technology
>>
>>106711224
We've already had that for years with standard mesh-based photogrammetry.
>>
>>106705224
you know how raytracing was the tax that made developers lazy and release shitty-performing games? one can only imagine how this tech will make it exponentially worse.

also, fuck realism. i'll go away from a computer to experience that. where is the fucking art, surrealism, impressionism etc. in games?
>>
>>106712441
except this stuff works in the browser at a decent fps on a laptop with no dedicated graphics card
all heavy mesh techniques are slow as hell on normal hardware
>just don't be poor
it's about efficiency: if it already works on underpowered machines, you can do even bigger stuff on stronger machines
>>
>literally useless for games both for dynamic scenes and physics
>>
>>106705247
>Was it limited by compute before?
Yes. This is not new at all and OP is a retard
>>
The videos you posted look like utter shit; it doesn't matter if you're using a new technique if it runs at 5fps.
The 3DS had better AR than that


