
Thread: AMD's Cinema 2.0 Initiative

  1. #1

    AMD's Cinema 2.0 Initiative

    Seems pretty cool, some of the stuff coming out of it is pretty mind blowing.

    Cinema 2.0: The Next Chapter in the Ultimate Visual Experience?

    Excerpts from the new Ruby 2.0 demo and the scorpion one:

    Stills from Ruby 2.0:
    PCGH - Ruby 2.0: Screenshots und Video der neuen Radeon-Technologiedemo - 2008/08/Ruby_new_demo_000.jpg

    Keep in mind this is all running in real time on a quad-core Phenom X4 9850 / ATI 4870 X2 platform.

    Apparently this technology also ties into a new "cloud computing" rendering platform from a company called OTOY, and is being used in a new online world created by one of the MySpace founders called LivePlace/CitySpace. There's more info on them here, and yes, it's very very pretty!
    It's interesting to note that the same technology behind OTOY was used to render some of the Transformers TV commercials in real time, as demonstrated in the videos on the sites I linked to. So it is absolutely cinema quality.

    Another technology apparently developed in partnership with OTOY for the LivePlaces platform avatars is a new 3D human model capture and real-time rendering system. The results speak for themselves; they look real.

    Now, if they can deliver this level of photorealism into the gaming world, I will be a very happy person. And considering the recent strategic partnership between AMD and Blizzard, which will include the bundling of ATI GPUs with WoW, and noting that Activision Blizzard is now the largest gaming entity on Earth (market cap of $18 billion vs. EA's $16 billion), this may actually be a possibility.


    Apparently, a company called JulesWorld is behind OTOY, and OTOY is just a technology product they developed. My bad. Interestingly, they have been using it to do real time raytracing and global illumination (photon mapping in particular) using voxel data sets since the 2900XT came out. That technology is the basis of the new Ruby demos as well. It's only with the 4870 that they've been able to do 60fps with AA though.
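
    To give a rough idea of what "ray tracing a voxel data set" actually means, here's a minimal sketch of marching a ray through a uniform density grid until it hits something solid. The data layout, fixed step size and threshold are my own assumptions for illustration; JulesWorld hasn't published how their traversal works, and a real renderer would use a DDA traversal plus a spatial hierarchy rather than fixed steps.

    Code:
    #include <cstddef>
    #include <vector>

    // Minimal sketch: march a ray through a uniform voxel grid and return the
    // first cell whose density exceeds a threshold. Fixed-step marching for
    // clarity only; production tracers use DDA traversal and a hierarchy.
    struct Vec3 { float x, y, z; };

    struct VoxelGrid {
        int nx, ny, nz;
        std::vector<float> density;               // nx*ny*nz samples in [0,1]
        float at(int x, int y, int z) const {
            return density[(std::size_t(z) * ny + y) * nx + x];
        }
    };

    // Returns true and writes the hit position if the ray meets a dense voxel.
    bool raymarch(const VoxelGrid& g, Vec3 origin, Vec3 dir,
                  float maxDist, float threshold, Vec3* hit) {
        const float step = 0.5f;                  // in voxel units; assumption
        for (float t = 0.0f; t < maxDist; t += step) {
            Vec3 p = { origin.x + dir.x * t,
                       origin.y + dir.y * t,
                       origin.z + dir.z * t };
            int x = int(p.x), y = int(p.y), z = int(p.z);
            if (x < 0 || y < 0 || z < 0 || x >= g.nx || y >= g.ny || z >= g.nz)
                continue;                         // outside the grid, keep going
            if (g.at(x, y, z) > threshold) { *hit = p; return true; }
        }
        return false;
    }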

    The trouble with mapping this over to games is that voxel data sets are not always the easiest thing to animate, and they take up a LOT of storage space and memory. On the size front, the work of Ben Houston has helped a bit (see here), and in a demonstration video the people from JulesWorld say they have a novel compression method developed in partnership with AMD (see here); there's a rough sketch of the usual sparse-storage idea below. And then, if you watch the video, there is clearly a great deal of animation and destruction going on. The question is, is that baked animation, or generated dynamically? For games it must be the latter, but I remain hopeful.
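
    On the storage point, the usual trick is some kind of sparse hierarchy, so that empty space costs essentially nothing and memory scales with the occupied surface rather than the volume. Whether OTOY's compression works anything like this I have no idea; the following is just a generic sketch of the idea, not their format.

    Code:
    #include <array>
    #include <memory>

    // Sketch of a sparse voxel octree node: an empty octant is a null child,
    // so a mostly-empty 1024^3 volume (4 GB as a dense float grid) collapses
    // to roughly the set of surface voxels plus the interior branch nodes.
    struct OctreeNode {
        bool  isLeaf  = false;
        float density = 0.0f;                              // used by leaves
        std::array<std::unique_ptr<OctreeNode>, 8> child;  // nullptr = empty octant
    };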

    It's not as though voxel-based approaches to physics, animation and destruction don't exist. Just look at Digital Molecular Matter (ok, more voxel-like than true voxels, since it splits objects into tetrahedral volumes), as used in the upcoming Star Wars: The Force Unleashed game. So it's entirely possible that this will work out. If this is really how the next generation of gaming is going to be, I can tell you I never would have predicted it. Recent presentations on DirectX 11 clearly lean towards subdivision surfaces and Bezier patches with displacement mapping. I thought it was either that, or further advancement of relief mapping (which ray traces locally, per surface, into the displacement map to find the correct location/depth of the current pixel; see the sketch below). But ray-traced voxels? I never seriously considered it until now.
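
    For anyone who hasn't run into relief mapping, the "local ray trace into the displacement map" boils down to the search below: step a tangent-space view ray through the height field until it dips below the stored depth, then shade using the offset texture coordinate. Step count, depth scale and sign conventions here are my own assumptions (real shaders also refine the hit with a binary search); this is a CPU-side sketch of the idea, not anyone's shipping shader.

    Code:
    // Sketch of the relief-mapping linear search over a height field stored
    // as depth into the surface (0 = top, 1 = deepest).
    struct Vec2 { float u, v; };

    static float sampleDepth(const float* depthMap, int w, int h, Vec2 uv) {
        int x = int(uv.u * (w - 1)), y = int(uv.v * (h - 1));
        if (x < 0) x = 0; if (x > w - 1) x = w - 1;
        if (y < 0) y = 0; if (y > h - 1) y = h - 1;
        return depthMap[y * w + x];
    }

    // viewX/viewY/viewZ: tangent-space direction from the eye into the surface,
    // with viewZ < 0 pointing down into it (exact signs depend on your basis).
    Vec2 reliefSearch(const float* depthMap, int w, int h, Vec2 uv,
                      float viewX, float viewY, float viewZ) {
        const int   steps      = 32;       // assumption; tuned per asset
        const float depthScale = 0.05f;    // assumption; how deep the relief is
        Vec2  duv   = { viewX / -viewZ * depthScale / steps,
                        viewY / -viewZ * depthScale / steps };
        float layer = 0.0f, dLayer = 1.0f / steps;
        while (layer < 1.0f && layer < sampleDepth(depthMap, w, h, uv)) {
            uv.u += duv.u;                 // advance along the ray in uv...
            uv.v += duv.v;
            layer += dLayer;               // ...and in depth, until we pass the surface
        }
        return uv;                         // offset coordinate to sample color/normal at
    }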

    I mean, yeah, I could see voxels for the physical simulation of destructible objects, which is what DMM does, and definitely for fluid/smoke simulation (it's really the only practical way). However, in all those cases, either the voxel object is substituted with a polygonal one for rendering (DMM) or a polygonal iso-surface is generated (fluid simulation), and only in rare cases is the voxel field directly ray traced/cast (smoke, certain variants of relief mapping). But I guess technology has improved faster than my personal imagination.

    There are still more questions. Like how do they deal with the aliasing inherent to voxel representations? Are they using level sets? How does their custom AA system work? Is it simply the Monte Carlo method of casting random rays into the pixel (sketched below)? When will we see production-quality fluid simulations, like RealFlow, since clearly the renderer can handle it? Is it possible to blend voxel data with polygonal data and maintain performance? What kind of global illumination algorithms are they using exactly? Are they programming in pure Direct3D 9.0/10.0, or are they also using the CTM/Stream SDK to program the GPUs directly? Are voxels stuck onto a regular grid, or are they more free-form "volumes" as in DMM? Mainly, can they stretch and deform so that dynamically animated bodies remain contiguous and visually pleasing?
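
    If the custom AA really is just Monte Carlo supersampling, the idea is only this: average several primary rays jittered at random positions inside each pixel's footprint. That's purely my guess at what they might be doing; traceRay below stands in for whatever primary-ray function the renderer exposes.

    Code:
    #include <random>

    struct Color { float r, g, b; };

    // Monte Carlo anti-aliasing sketch: jitter sample positions uniformly
    // inside the pixel and average the results. 'traceRay' is a stand-in for
    // the renderer's primary-ray function (assumed, not a real OTOY API).
    Color shadePixel(int x, int y, int samples, std::mt19937& rng,
                     Color (*traceRay)(float px, float py)) {
        std::uniform_real_distribution<float> jitter(0.0f, 1.0f);
        Color sum = { 0.0f, 0.0f, 0.0f };
        for (int s = 0; s < samples; ++s) {
            float px = x + jitter(rng);           // random point in the pixel
            float py = y + jitter(rng);
            Color c = traceRay(px, py);
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
        }
        float inv = 1.0f / samples;
        return { sum.r * inv, sum.g * inv, sum.b * inv };
    }
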
    Last edited by IncompleteDude; 14-Aug-2008 at 20:48.

  2. #2


    ATI GPUs with WoW
    *large sigh*


    This whole "cloud computing" crap is really starting to bug me. I really don't see any productivity that can come out of a project like this.

  3. #3


    Quote Originally Posted by mm3 View Post
    This whole "cloud computing" crap is really starting to bug me. I really don't see any productivity that can come out of a project like this.
    Your brand loyalties aside, perhaps you can explain to me why the same cloud computing principles that allow supercomputers to exist, predict your weather and advance science, that allow protein analysis on a scale never before seen (Folding@home), create worldwide chemical databases, and enable collaboration between scientists all over the globe, do not enable productivity? Cloud computing has its uses, notwithstanding recent availability problems with Amazon's services. I mean, I don't think Microsoft would make it a cornerstone of their operating system after Windows if it wasn't useful.

    Of course, I can agree that using cloud computing to create games, such as WoW, SecondLife and others with their cloud of servers, does not promote productivity, lol. Of course, with games that is hardly the point.

    However, in this instance gaming is but one application. More interesting is the use of cloud rendering farms by CG companies. It will allow real-time direction of the scene, animation of characters, configuration of lighting for the final render, and so on. All of these are excellent time-saving applications, without the cost of building your own rendering supercomputer. Clearly, the people behind Transformers saw the merits of it, and I expect they understand the issues far better than any of us.

    That being said, I would love to have the horsepower to render OTOY-level graphics on my desktop, and by the time it's out I probably will, for $3,000 or so. However, it's not so easy when you want that level of quality on your iPhone, which is one of their target platforms. There, it's streaming cloud-computed rendering or comparatively ugly visuals.
    Last edited by IncompleteDude; 14-Aug-2008 at 19:44.

  4. #4


    Beyond Programmable Shading

    An interesting presentation there from id Software, discussing some of the theory behind their next-generation engine (id Tech 6; note that id Tech 5 is what's being used in Rage). It also agrees with, and expounds upon, the voxel-based approach we see in Cinema 2.0 in great detail. Very interesting. It does not fully resolve the issues of animating voxel structures, though, suggesting that dynamic geometry be rendered polygonally. On the other hand, the procedural generation of voxel surfaces for infinite detail is an interesting prospect I had not considered; a toy illustration of the idea is below.
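
    To make the "procedural generation for infinite detail" idea concrete: the voxel density becomes a function you evaluate on demand instead of data you store, so zooming in just means evaluating it at more points. The toy below (hash-based value noise perturbing a ground plane) is my own illustration, not anything from the id presentation.

    Code:
    #include <cmath>
    #include <cstdint>

    // Hash a lattice point to a pseudo-random value in [0,1].
    static float hash3(int x, int y, int z) {
        std::uint32_t h = std::uint32_t(x) * 374761393u
                        + std::uint32_t(y) * 668265263u
                        + std::uint32_t(z) * 974634533u;
        h = (h ^ (h >> 13)) * 1274126177u;
        return float(h & 0xFFFFFFu) / float(0xFFFFFFu);
    }

    // Trilinearly interpolated value noise at an arbitrary point.
    static float valueNoise(float x, float y, float z) {
        int   xi = int(std::floor(x)), yi = int(std::floor(y)), zi = int(std::floor(z));
        float fx = x - xi, fy = y - yi, fz = z - zi;
        float v = 0.0f;
        for (int dz = 0; dz <= 1; ++dz)
            for (int dy = 0; dy <= 1; ++dy)
                for (int dx = 0; dx <= 1; ++dx) {
                    float w = (dx ? fx : 1.0f - fx) *
                              (dy ? fy : 1.0f - fy) *
                              (dz ? fz : 1.0f - fz);
                    v += w * hash3(xi + dx, yi + dy, zi + dz);
                }
        return v;                                   // in [0,1]
    }

    // Toy terrain density: positive inside solid matter, negative in air.
    // Nothing is stored; any voxel, at any resolution, is computed on demand.
    float terrainDensity(float x, float y, float z) {
        float ground = 8.0f - y;                    // flat ground plane at y = 8
        return ground + 4.0f * valueNoise(x * 0.1f, y * 0.1f, z * 0.1f);
    }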

    It's hard to imagine that we've gone from fully fixed-function GPU architectures to fully programmable ones in less than a decade.

  5. #5


    Anyone heard about ATI/AMD doing that 3 core proc?

    I also heard they were going to try putting a GPU and CPU on the same silicon... Think how hot that would run.

  6. #6


    Quote Originally Posted by Jonathan View Post
    Anyone heard about ATI/AMD doing that 3 core proc?

    I also heard they were going to try putting a GPU and CPU on the same silicon... Think how hot that would run.
    The 3-core CPUs have been out for a while now; they're marketed as Phenom X3s. As for the hybrid CPU/GPU, that's not due out until the middle of next year, and it's not going to run hot either. It's not going to be a very fast GPU, no better than the integrated chipset stuff.
