Server rendered, browser displayed

Welcome to our forums. These forums were active from 2003-2014. We have now decided to close them down, but will leave them here as an archive.


This topic contains 5 replies, has 4 voices, and was last updated by chris_keogh 8 years, 11 months ago.

  • Author
    Posts
  • #6886

    Pete
    Participant

    Check this out! Works on shite PCs, in a browser, with no plugins! Allegedly!

    http://www.rockpapershotgun.com/2008/08/16/liveplace-online-photorealism-on-rubbish-pcs/#more-2318

  • #42047

    david4482
    Participant

    That's ridiculous. Perfectly sharp shadows etc. No way is this real, and it ain't gonna be without major technology advances.

    As the article says, you would need a super graphics card for each player, or be able to batch the scene, and with all that light interaction they don't say how that could be done.

    The only way I can see to achieve something like this currently is a pre-rendered database, QuickTime VR style, on a Google-scale server farm, like the way Meez render their avatars. But clearly that's not what's going on here.

    The city bits sure are purdy though. Good modellers. Which is what I think they want to hear from people watching the video.
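The pre-rendered-database idea mentioned above can be sketched concretely. This is a purely hypothetical illustration in Python (not anything Meez or OTOY actually describe): render panoramas offline at fixed grid positions and headings, then at runtime snap a free viewpoint to the nearest stored image, QuickTime VR style.

```python
# Hypothetical sketch of a "pre-rendered database" viewer: the world is
# rendered offline as panoramas at fixed grid points and headings, and at
# runtime we just look up the nearest one. GRID_SPACING and the heading
# count are assumed values, not from the thread.
GRID_SPACING = 2.0  # metres between pre-rendered viewpoints (assumed)

def panorama_key(x, y, heading_deg, headings=8):
    """Map a free viewpoint to the key of the nearest stored panorama."""
    gx = round(x / GRID_SPACING)          # snap position to the grid
    gy = round(y / GRID_SPACING)
    h = round(heading_deg / (360 / headings)) % headings  # snap heading
    return (gx, gy, h)

# A viewer at (6.1, 1.9) facing ~95 degrees snaps to one stored image:
print(panorama_key(6.1, 1.9, 95))  # -> (3, 1, 2)
```

The storage cost is the obvious catch: grid points times headings times image size, which is why the post reaches for a "Google farm".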

  • #42061

    feral
    Participant

    I was thinking recently about whether it was possible to make something broadly like this (streaming video of a 3D world).
    I haven't studied the video in detail, but I wouldn't be so quick to call BS on it (as a concept, if not the actual quality of the graphics you see there).

    david4482 wrote: "As the article says, you would need a super graphics card for each player, or be able to batch the scene, and with all that light interaction they don't say how that could be done."

    The article seems to assume they are going for a rasterisation-based approach… There doesn't seem to be any reason to think this, and it wouldn't seem very smart off the bat…

    What if you consider some sort of global-illumination-based approach instead?
    Could you potentially store a big, complex, globally illuminated world on your servers, and stream viewpoints to players as they moved through it? With a good few player viewpoints moving around your complex scene, and maybe specialist hardware in your servers, this might work out cheaper than rasterising a viewpoint for each player? Particularly if you don't have to render avatars for the viewpoints themselves, or if you fudge that?

    High bandwidth requirements, of course, to make it work, but that might be acceptable.
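To put a rough number on that bandwidth requirement, here is some back-of-envelope arithmetic. The resolution, frame rate, and compression ratio are my own assumed figures, not from the thread:

```python
# Rough bandwidth arithmetic for streaming one rendered viewpoint as video.
# Assumed 2008-era figures: 1024x768, 30 fps, 24-bit colour.
width, height, fps, bits_per_pixel = 1024, 768, 30, 24

raw_mbps = width * height * fps * bits_per_pixel / 1e6
print(f"raw: {raw_mbps:.0f} Mbit/s")  # -> raw: 566 Mbit/s

# A video codec of the day (e.g. H.264) might plausibly compress typical
# content by a factor of ~100:
compressed_mbps = raw_mbps / 100
print(f"compressed: {compressed_mbps:.1f} Mbit/s")  # -> compressed: 5.7 Mbit/s
```

A few Mbit/s per player was near the upper end of 2008 consumer broadband, which is roughly why the post calls the requirement high but possibly acceptable.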

  • #42062

    Pete
    Participant

    The bit at the start with moving cars etc. is purportedly pre-rendered video, but the ‘real time’ stuff is not too far away from what ATI are demoing here for their new card:

    http://www.pcgameshardware.com/aid,655742/News/Ruby_20_Screenshots_and_video_of_the_new_Radeon_tech_demo/

    and Nvidia are showing here (albeit on 4 futuristic Quadros):
    http://www.hothardware.com/News/NVIDIA-Shows-Interactive-Ray-Tracing-on-GPUs/

  • #42063

    david4482
    Participant

    Well, I get the terminal concept (and I think it's cool). I have thought about it for mobile phones, where the rendering of 3D graphics is constrained but the device can still draw streamed images and pass inputs back to the server.

    My problem was with the actual quality of the graphics shown; it didn't look like they were fudging on quality at all, they had dynamic lights, shadows etc., so I am calling BS that these guys have achieved what they are saying.

    I have no doubt we will have dumb terminals receiving fancy graphics soon, just not unconstrained flight through massive cityscapes for a few years at least. :wink:
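The dumb-terminal loop described in this post can be sketched as a toy protocol. Everything here is hypothetical illustration: the 12-byte message format, `encode_input`, and `server_step` are my own invention, and the rendered frame is stubbed out as a byte string rather than a real compressed image.

```python
import struct

# Toy thin-client protocol: the terminal sends input events upstream,
# the server applies them to world state and returns a frame to draw.

def encode_input(dx, dy, buttons):
    """Pack one input event into 12 network-order bytes (assumed format)."""
    return struct.pack("!iiI", dx, dy, buttons)

def decode_input(msg):
    """Unpack a 12-byte input event back into (dx, dy, buttons)."""
    return struct.unpack("!iiI", msg)

def server_step(state, msg):
    """Apply one client input to the world, then 'render' a frame.

    The frame is stubbed as text; a real server would return compressed
    video for the client's viewpoint.
    """
    dx, dy, _buttons = decode_input(msg)
    state["x"] += dx
    state["y"] += dy
    return f"frame at ({state['x']},{state['y']})".encode()

state = {"x": 0, "y": 0}
frame = server_step(state, encode_input(3, -1, 0))
print(frame)  # -> b'frame at (3,-1)'
```

The point of the design is that the terminal never touches 3D data at all: its only jobs are packing inputs and blitting whatever frame bytes come back, which is why even a constrained phone could act as the client.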

  • #42141

    chris_keogh
    Participant

    OTOY (the original links) are sponsored by AMD/ATI, and the opposition are NV of course, who own Mental Images, who are now showing RealityServer:

    http://www.mentalimages.com/products/realityserver.html

    A little bit more realistic perf (it downgrades as the viewpoint moves), as they are doing the whole scene, not just the ATI 'Cinema 2.0' style compositing that OTOY were originally demoing.

    EDIT: for the original OTOY demos, not composite based, see:
    http://www.techcrunch.com/2008/07/09/otoy-developing-server-side-3d-rendering-technology/

The forum ‘General Discussion’ is closed to new topics and replies.