


<?xml version="1.0" encoding="utf-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>
<title>DBO Forums - 30fps vs 60fps, GPU vs CPU</title>
<link>https://destiny.bungie.org/forum/</link>
<description>Bungie.Org talks Destiny</description>
<language>en</language>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><p>Having the framerate drop mid-game feels bad, yes.</p>
</blockquote></blockquote><blockquote><p><br />
I don't mean mid-game. I mean within the same general product: visual design, control scheme.</p>
<p>Play a Halo game on MCC for a while, and then immediately start playing the original version on oXbox or 360. The switch from 60-&gt;30 is jarring.</p>
</blockquote><p>Yeah, I should give that a try sometime.  I purchased the MCC but never got around to playing it.  It seems whenever I'd have a spare 30 minutes (that I didn't want to use in Destiny) I'd fire up my xb1 &amp; have to update the system software, then I'd launch the MCC &amp; it'd have a bunch of updates &amp; I'd never actually get to play before my time was up.  I assume the MCC has calmed down by now.  :)</p>
<p>It's a shame really, my xb1 has mostly been just a paperweight.  :(</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132577</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132577</guid>
<pubDate>Sat, 27 May 2017 03:21:57 +0000</pubDate>
<category>Gaming</category><dc:creator>dogcow</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Having the framerate drop mid-game feels bad, yes.</p>
</blockquote><p>I don't mean mid-game. I mean within the same general product: visual design, control scheme.</p>
<p>Play a Halo game on MCC for a while, and then immediately start playing the original version on oXbox or 360. The switch from 60-&gt;30 is jarring.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132576</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132576</guid>
<pubDate>Sat, 27 May 2017 02:46:18 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<p>60fps for multiplayer and 30fps for single-player worked for Uncharted 4.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132575</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132575</guid>
<pubDate>Sat, 27 May 2017 01:59:36 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><p>If D2's frame-rate is CPU bound, perhaps they could still pull off 60fps in crucible?</p>
</blockquote></blockquote><blockquote><p><br />
Even if technically feasible, I'd strongly dislike that. Regularly switching from 60fps to 30fps in the same game feels bad. The acclimation gets faster and easier the more you do it, but still.</p>
</blockquote><p>Having the framerate drop mid-game feels bad, yes.  You can really feel that, but I don't think having a solid 60 in one game mode and a solid 30 in another would be as horrible as you suggest.  It wouldn't be so terrible; Crucible would just have a different feel to it than PvE.  In fact, Crucible already has a different feel than PvE (at least it did for me; I got destroyed for many many games before I could get used to how different it was from Destiny PvE).</p>
<p>Edit: maybe it would be terrible, but I wouldn't write it off until I tried it.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132574</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132574</guid>
<pubDate>Sat, 27 May 2017 01:44:17 +0000</pubDate>
<category>Gaming</category><dc:creator>dogcow</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><p>If D2's frame-rate is CPU bound, perhaps they could still pull off 60fps in crucible?</p>
</blockquote><p>Even if technically feasible, I'd strongly dislike that. Regularly switching from 60fps to 30fps in the same game feels bad. The acclimation gets faster and easier the more you do it, but still.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132568</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132568</guid>
<pubDate>Fri, 26 May 2017 22:30:25 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>30fps vs 60fps (reply)</title>
<content:encoded><![CDATA[<p>I will say, from my limited time on Titanfall 2 and far more extensive time on Overwatch, I haven't really noticed the difference between 30 FPS and 60 FPS.  </p>
<p>The performance benefits are already so far beyond my skill that I just don't derive much observable benefit. I'd much rather Bungie add more interactive terrain, NPCs, and visual effects than strip all that out to chase a higher framerate.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132566</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132566</guid>
<pubDate>Fri, 26 May 2017 21:59:17 +0000</pubDate>
<category>Gaming</category><dc:creator>Durandal</dc:creator>
</item>
<item>
<title>Hardware Sprites (reply)</title>
<content:encoded><![CDATA[<blockquote><p>It is not hard to hit 60fps. The NES was doing that with a CPU running at 3 MHz or something like that.</p>
<p>You simply need to scale back the complexity of your simulation.</p>
</blockquote><p>Yep.</p>
<p>The hardware sprite era had very different types of tradeoffs for the devs, though.</p>
<p>Those systems don't work by having a GPU render a frame that then gets sent to be output. Instead, the graphics hardware is just set up with data describing what sprites are at what on-screen locations, and it produces the image pixel-by-pixel as it gets sent to the TV; you give it an on-screen coordinate, and it tells you the color of that pixel.</p>
<p>So the graphics hardware was always running in sync with the video signal. Since the video signal was 60Hz, the graphics hardware always produced frames at 60fps.</p>
<p>Rendering itself was never a bottleneck, so tradeoffs that would result in a 30fps game didn't have as much bang-for-the-buck to give, and were effectively leaving half the system idle half the time. Those systems also had very little memory, so you didn't necessarily have enough <em>stuff</em> to work on to make it worth blowing more than 16ms on anyway.</p>
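<p>To make that per-pixel model concrete, here's a toy sketch in Python (the sprite-table format and priority rule are invented, not any real console's registers): the image is never stored anywhere; each pixel's color is looked up on demand as the beam reaches it, so output always keeps pace with the 60Hz signal.</p>

```python
# Toy model of hardware-sprite scanout: no framebuffer is rendered ahead of
# time; the color of each pixel is computed on demand as the "beam" reaches it.

BACKGROUND = 0  # palette index for the backdrop

# Each sprite: (x, y, width, height, palette_index)
sprites = [
    (10, 5, 8, 8, 3),
    (12, 5, 8, 8, 7),  # earlier entries win, like fixed sprite priority
]

def pixel_color(px, py):
    """Given an on-screen coordinate, return its color, sprite-priority style."""
    for (sx, sy, w, h, color) in sprites:
        if sx <= px < sx + w and sy <= py < sy + h:
            return color
    return BACKGROUND

def scanline(py, width=32):
    # The video chip effectively evaluates this once per pixel, in lockstep
    # with the video signal -- "rendering" can never fall behind scanout.
    return [pixel_color(px, py) for px in range(width)]
```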
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132534</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132534</guid>
<pubDate>Fri, 26 May 2017 03:00:59 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Programmers can correct me if I'm wrong, but the CPU has to generate the display lists to send to the GPU for rendering. This is why even though 4 player split screen is the same number of pixels, you get slowdown. The CPU is assembling 4 display lists instead of one. The GPU can't draw what it doesn't get, so there is inescapable overhead in improving the frame rate.</p>
</blockquote><p>That's part of it.</p>
<p>Even as far as rendering goes, there's lots of extra work on the GPU side. Doing two small tasks takes longer than one big one, since context switches and stuff happen. Small polygons are more expensive to draw (relative to their size) than big ones due to how GPU rasterizers work, and LOD systems are never 100% perfect.</p>
<p>I don't have any exact data to back this up, but I wouldn't be surprised if Halo 3's geometric simplicity is a big part of why it's so relatively uncompromised in split-screen. Fewer things in the scene means less duplicated work.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132533</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132533</guid>
<pubDate>Fri, 26 May 2017 02:44:03 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>30fps vs 60fps (reply)</title>
<content:encoded><![CDATA[<blockquote><p>It is not hard to hit 60fps. The NES was doing that with a CPU running at 3 MHz or something like that.</p>
<p>You simply need to scale back the complexity of your simulation.</p>
</blockquote><p>I'll rephrase. It is hard to hit 60fps while maintaining a graphical fidelity that the audience will accept.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132532</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132532</guid>
<pubDate>Fri, 26 May 2017 02:38:09 +0000</pubDate>
<category>Gaming</category><dc:creator>CruelLEGACEY</dc:creator>
</item>
<item>
<title>And/or visual complexity too (reply)</title>
<content:encoded><![CDATA[<p>I can hit astronomical frame rates on my entirely text-based simulation programs, and I can assure you they're doing far more complex physics, with more particles and larger arrays, than most games out there.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132531</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132531</guid>
<pubDate>Fri, 26 May 2017 02:25:21 +0000</pubDate>
<category>Gaming</category><dc:creator>ZackDark</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<p>Programmers can correct me if I'm wrong, but the CPU has to generate the display lists to send to the GPU for rendering. This is why even though 4 player split screen is the same number of pixels, you get slowdown. The CPU is assembling 4 display lists instead of one. The GPU can't draw what it doesn't get, so there is inescapable overhead in improving the frame rate.</p>
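<p>A toy cost model of that overhead (all the names here are made up for illustration): the CPU walks the scene once per viewport to emit draw commands, so four viewports mean four times the CPU-side assembly work even though the total pixel count on screen is unchanged.</p>

```python
# Toy cost model: the CPU builds one command list per viewport, so 4-way
# split screen quadruples that CPU-side work even though the screen still
# has the same number of pixels.

scene = [f"object_{i}" for i in range(1000)]  # hypothetical scene objects

def build_display_list(scene, camera):
    # One draw command per visible object; culling/sorting would go here too.
    return [("draw", obj, camera) for obj in scene]

def frame(cameras):
    # The GPU can't draw what it isn't sent, so this loop is unavoidable.
    return [build_display_list(scene, cam) for cam in cameras]

single = frame(["p1"])
split4 = frame(["p1", "p2", "p3", "p4"])
commands_single = sum(len(lst) for lst in single)  # CPU work, 1 viewport
commands_split4 = sum(len(lst) for lst in split4)  # CPU work, 4 viewports
```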
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132530</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132530</guid>
<pubDate>Fri, 26 May 2017 02:10:40 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>30fps vs 60fps (reply)</title>
<content:encoded><![CDATA[<p>It is not hard to hit 60fps. The NES was doing that with a CPU running at 3 MHz or something like that.</p>
<p>You simply need to scale back the complexity of your simulation.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132529</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132529</guid>
<pubDate>Fri, 26 May 2017 02:05:58 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>Offloading stuff to GPU (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Do console GPUs have that integrated now?</p>
</blockquote><p>It's not really a hard barrier where one moment you can't do stuff and the next you can. The question is what you can offload to where and how efficiently, and with how much development effort.</p>
<p>The GPU in the Xbox 360 nominally supported only rendering tasks. That is (roughly speaking, and I'm leaving all kinds of stuff out), you'd submit a piece of geometry, the GPU would sample textures, calculate per-pixel results based on a user-defined shader program, then output those per-pixel results to an image.<br />
But there's nothing stopping you from architecting, within that framework, a task that isn't actually a rendering task, and then having the GPU run it.</p>
<p>So, perhaps your &quot;geometry&quot; is actually a proxy for an array of particle data (position, velocity, particle type, etc). The &quot;textures&quot; that you have the GPU sample are actually just the particle data from the array. Each &quot;pixel&quot; you output is, once again, the particle data, but now updated with new trajectory. The &quot;image&quot; you're creating is just a new array of particle data.<br />
To detect collisions, you can have the GPU also sample the depth/normal buffers of the main scene at the on-screen location of each particle. If the particle's trajectory &quot;intersects the depth buffer&quot;, you can calculate a bounce, or do some other thing based on particle type (like a raindrop particle might turn into a water splash particle).</p>
<p>That's exactly what Bungie was doing back in Halo Reach, to be able to process huge numbers of particles for cheap. It's got some limitations, like particles can't interact with any geometry other than what's on-screen and forward-facing (since it's literally just using the on-screen content as the definition for collision geometry), but with things like raindrops and sparks, people usually don't notice the quirks in the system. (Reach still handles <em>some</em> particles CPU-side.)</p>
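<p>Here's a rough CPU-side sketch of that scheme in Python/NumPy (the array names and the bounce rule are invented; this is nowhere near Reach's actual pipeline): the "output image" is just next frame's particle array, and the scene's depth buffer doubles as the collision geometry.</p>

```python
import numpy as np

# Fake inputs: N particles as rows of (x, y, vx, vy), plus a screen-space
# "depth buffer" from the main render pass (here, 1.0 = surface at a pixel).
particles = np.array([[5.0, 2.0, 1.0, 0.0],
                      [1.0, 1.0, 0.0, 1.0]])
depth_buffer = np.zeros((8, 8))
depth_buffer[:, 6] = 1.0  # a "wall" occupying the screen column x == 6

def step(p):
    """One simulation step: the same operation applied to every row at once,
    which is exactly the shape of work a pixel shader does per 'pixel'."""
    out = p.copy()
    out[:, 0:2] += out[:, 2:4]  # integrate position
    # Sample the depth buffer at each particle's on-screen location; if the
    # trajectory "intersects the depth buffer", reflect velocity (crude bounce).
    xi = np.clip(out[:, 0].astype(int), 0, 7)
    yi = np.clip(out[:, 1].astype(int), 0, 7)
    hit = depth_buffer[yi, xi] > 0.0
    out[hit, 2:4] *= -1.0
    return out  # the "rendered image" is just next frame's particle array

particles = step(particles)
```

<p>Note the limitation the post mentions falls right out of the sketch: a particle can only collide with whatever the depth buffer can see.</p>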
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132527</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132527</guid>
<pubDate>Fri, 26 May 2017 01:47:10 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><blockquote><p>can the physics for those be run on the gpu now?</p>
</blockquote></blockquote></blockquote><blockquote><blockquote><p><br />
Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens to hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.</p>
</blockquote></blockquote><blockquote><p><br />
Right, highly parallel manipulation of large arrays of data.  What I was specifically wondering about was the integration of physics processing units with GPUs (like PhysX).  My understanding of that integration is that it can handle the physics for particle systems (but not necessarily game simulation physics).  Do console GPUs have that integrated now?  Honestly I haven't followed GPU developments &amp; what capabilities consoles have lately.</p>
</blockquote><p>Someone correct me if I'm wrong, but from my understanding not much has been done with PPUs and GPUs together since DX10. From what I've heard, most companies have just switched to using the CPU, since it's cheaper, easier, and compatible with a wider range of systems; with advancements in CPU technology, they can use the cores your computer isn't using anyway to process things that are better run on fast single cores.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132522</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132522</guid>
<pubDate>Fri, 26 May 2017 00:59:50 +0000</pubDate>
<category>Gaming</category><dc:creator>Xenos</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<p>Well, by definition you can't, unless it's highly parallelizable physics (like you mentioned, particle physics fits the bill, as do some implementations of fluid dynamics), or unless you make a GPU that also has a CPU onboard, which might happen on discrete graphics cards, but not on embedded GPUs.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132521</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132521</guid>
<pubDate>Fri, 26 May 2017 00:59:08 +0000</pubDate>
<category>Gaming</category><dc:creator>ZackDark</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><p>can the physics for those be run on the gpu now?</p>
</blockquote></blockquote><blockquote><p><br />
Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens to hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.</p>
</blockquote><p>Right, highly parallel manipulation of large arrays of data.  What I was specifically wondering about was the integration of physics processing units with GPUs (like PhysX).  My understanding of that integration is that it can handle the physics for particle systems (but not necessarily game simulation physics).  Do console GPUs have that integrated now?  Honestly I haven't followed GPU developments &amp; what capabilities consoles have lately.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132519</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132519</guid>
<pubDate>Fri, 26 May 2017 00:49:38 +0000</pubDate>
<category>Gaming</category><dc:creator>dogcow</dc:creator>
</item>
<item>
<title>30fps vs 60fps, GPU vs CPU (reply)</title>
<content:encoded><![CDATA[<blockquote><p>can the physics for those be run on the gpu now?</p>
</blockquote><p>Yes, but most physics would be inefficient to run on the GPU, and possibly inaccurate too. The thing about GPUs is that they can do the same mathematical operation on dozens to hundreds of inputs at the same time, but if you need each input to be handled differently, or need the result of one operation to be the input of the next, the GPU will take a full cycle to get to it. The same is true of the CPU, but at least it isn't built with thousands of parallel cells, so you're not wasting processing power that could be better spent elsewhere.</p>
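<p>A tiny illustration of that distinction, with NumPy standing in for the GPU's parallel lanes: work that is the same operation on every input runs in one shot, while a chain where each result feeds the next step is forced to run sequentially no matter how many lanes you have.</p>

```python
import numpy as np

v = np.arange(8.0)

# Data-parallel: the same operation applied to every input at once -- one
# "pass" of GPU-style work, regardless of how many elements there are.
parallel = v * 0.5 + 1.0

# Dependent: each result is the input to the next operation, so the work
# collapses into a sequential chain that parallel hardware can't speed up.
acc = 0.0
chain = []
for x in v:
    acc = acc * 0.5 + x  # the result of one step feeds the next
    chain.append(acc)
```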
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132517</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132517</guid>
<pubDate>Fri, 26 May 2017 00:26:28 +0000</pubDate>
<category>Gaming</category><dc:creator>ZackDark</dc:creator>
</item>
<item>
<title>30fps vs 60fps (reply)</title>
<content:encoded><![CDATA[<blockquote><p>The debate between hitting 30 fps vs 60 fps is a contentious one, from the developers' side. Hitting 60fps, and keeping it there, is extremely hard. </p>
</blockquote><p>Yeah, this is the thing that really gets me. The two examples people usually bring up are Battlefield 1 and Titanfall 2. Battlefield 1, however, does NOT hit a consistent 60fps, dipping below 45fps on a regular basis, and Titanfall 2 only hits 60fps consistently by using dynamic resolution. For a company that obviously prides itself on making gorgeous games, neither of these solutions is good. I can completely understand why they would prefer 4K/1080p30 over a spotty 60fps or a changing resolution.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132516</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132516</guid>
<pubDate>Fri, 26 May 2017 00:25:30 +0000</pubDate>
<category>Gaming</category><dc:creator>Xenos</dc:creator>
</item>
<item>
<title>Relic is back! (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Ahh, my favorite map from TF1 returns.  Sweeeeeeeet!</p>
</blockquote><p>One of my favorites too! I also didn't expect how happy I would be to hear some old Titanfall 1 music again. Goosebumps :D</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132515</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132515</guid>
<pubDate>Fri, 26 May 2017 00:23:31 +0000</pubDate>
<category>Gaming</category><dc:creator>CruelLEGACEY</dc:creator>
</item>
<item>
<title>Relic is back! (reply)</title>
<content:encoded><![CDATA[<p>Ahh, my favorite map from TF1 returns.  Sweeeeeeeet!</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=132514</link>
<guid>https://destiny.bungie.org/forum/index.php?id=132514</guid>
<pubDate>Fri, 26 May 2017 00:15:35 +0000</pubDate>
<category>Gaming</category><dc:creator>Kahzgul</dc:creator>
</item>
</channel>
</rss>
