DirectX to become redundant?

Viewing 9 reply threads
  • Author
    Posts
    • #7341
      Anonymous
      Inactive

      Read an interview with Tim Sweeney.

      Apparently, software renderers will be back for good in 2012.

      Anyone care to comment?

      Is this based on the performance gains from parallelism?

      The article is dated March 2008, has anyone seen any indication of things moving in this direction?

      I think that DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that will use both the CPU and the GPU – using graphics processor programming language rather than DirectX. I think we’re going to get there pretty quickly.

      I expect that by the time of the release of the next generation of consoles, around 2012 when Microsoft comes out with the successor of the Xbox 360 and Sony comes out with the successor of the PlayStation 3, games will be running 100% on software-based pipelines.

      Link: http://www.tgdaily.com/content/view/36410/118/

      -B.

    • #44325
      Anonymous
      Inactive

      Nah, I personally don't see DX going anywhere any time soon. As for software renderers: until people can use multi-core proficiently in normal tasks, such as offloading CPU work to the extra cores, I don't see anyone wanting to write their own multi-core rendering engine from the ground up.

      Now if software renderers do catch on, I've no doubt MS will switch to a more software-focused DirectX.

    • #44327
      Anonymous
      Inactive

      Yup, I’m with Peter on that one, can’t see DX going anywhere anytime soon.

    • #44395
      Anonymous
      Inactive

      Until CPUs catch up with graphics cards in terms of raw floating-point performance, I seriously doubt it. It will happen some day, and the lines are blurring between CPU and GPU, but 2012 is far too early, I think. That's only 3 years from now!

      Consider this high-end GPU and CPU, for instance:

      Radeon 4870 X2:
      1600 shader units / cores, 2.4 teraflops (2400 gigaflops)

      Core i7 @ ~3.0 GHz:
      4 cores, ~80-100 gigaflops

      As you can see, there is absolutely no comparison in terms of raw number-crunching performance. That's also not counting the fact that GPUs are highly specialized towards graphics and thus achieve more performance for the same number of clock cycles than a CPU would.
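      The rough arithmetic behind those figures can be sketched. The per-cycle throughputs below are my assumptions, not from the post: ~8 single-precision flops per core per cycle for the i7 via 4-wide SSE (add + multiply), and a fused multiply-add (2 flops) per shader unit per cycle at a ~750 MHz core clock for the 4870 X2.

```python
# Back-of-envelope peak-FLOPS estimates for the two parts above.
# Assumed (not from the post): i7 does 8 single-precision flops per
# core per cycle via SSE; each 4870 X2 shader unit does one fused
# multiply-add (2 flops) per cycle at a ~0.75 GHz core clock.

def peak_gflops(units, clock_ghz, flops_per_unit_per_cycle):
    """Theoretical peak in gigaflops: units x clock x flops/cycle."""
    return units * clock_ghz * flops_per_unit_per_cycle

cpu = peak_gflops(units=4,    clock_ghz=3.0,  flops_per_unit_per_cycle=8)
gpu = peak_gflops(units=1600, clock_ghz=0.75, flops_per_unit_per_cycle=2)

print(cpu)  # 96.0   -> matches the ~80-100 gigaflop figure
print(gpu)  # 2400.0 -> matches the 2.4 teraflop figure
```

      These are theoretical peaks; real workloads hit only a fraction of either, but the ~25x gap is the point.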

      Try running a DirectX game on even a really cheap (or even integrated) graphics chipset, then run the same game using the DirectX reference (software) rasterizer on a high-end CPU. The crap graphics card will still hammer the fast CPU every time.

      Software rendering isn’t coming back any time soon!
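      To see why the reference rasterizer crawls, look at what a software renderer has to do per pixel. Here's a toy half-space triangle fill (a sketch for illustration only, nothing like the real refrast): every covered pixel costs three edge tests, and a real rasterizer adds interpolation, texturing, depth testing and blending on top of that, for every triangle, every frame.

```python
# Toy software rasterizer: fills one triangle using edge-function tests.
# Illustrative only -- a real rasterizer also interpolates attributes,
# samples textures, depth-tests and blends for every covered pixel.

def edge(ax, ay, bx, by, px, py):
    # Signed area of (a, b, p); >= 0 means p is left of edge a->b (CCW).
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    (x0, y0), (x1, y1), (x2, y2) = tri
    covered = []
    for y in range(height):
        for x in range(width):   # brute force: every pixel, every triangle
            if (edge(x0, y0, x1, y1, x, y) >= 0 and
                edge(x1, y1, x2, y2, x, y) >= 0 and
                edge(x2, y2, x0, y0, x, y) >= 0):
                covered.append((x, y))
    return covered

pixels = rasterize([(0, 0), (7, 0), (0, 7)], 8, 8)
print(len(pixels))  # 36 covered pixels for this 8x8 test triangle
```

      A GPU does this with hundreds of dedicated units in parallel; on a CPU it is all general-purpose instructions.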

    • #44590
      Anonymous
      Inactive

      I reckon we’ll have DX around for quite a while yet.

      When we get close to 100 cores on our CPUs, with dedicated graphics cores and functions like Intel are talking about, then I think we'll start seeing a move from the GPU to raytracing on the CPU. Even then, I wouldn't be surprised to see DirectX encompass CPU- and/or GPU-accelerated raytracing in future versions.
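      The appeal of raytracing for many-core CPUs is that it is one small kernel repeated per pixel. The core of any raytracer is an intersection test like the one below (a minimal sketch I'm adding for illustration, not Intel's actual demo code):

```python
import math

# Minimal ray-sphere intersection: the innermost operation of a raytracer.
# A renderer runs this (plus shading, shadows, reflections) once or more
# per pixel per frame -- exactly the embarrassingly parallel load people
# want to spread across many CPU cores.

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest intersection distance
    return t if t > 0 else None

# Ray from the origin straight down +z at a unit sphere 5 units away:
print(hit_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

      Each ray is independent of every other, which is why the technique scales across cores far more naturally than a traditional rasterization pipeline does.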

      There’ll always be a need for some sort of middleware between game code and graphical functions.

    • #44672
      Anonymous
      Inactive

      Someone should develop a realtime software renderer based on V-Ray, the renderer used in Max and other 3D suites. We have all seen the awesome renders from V-Ray; some I usually think are real-life photos because they look that good. Imagine how awesome lighting and shading would be with a system like that.

      It takes ages for my PC to render even a still with V-Ray, but now with multicore processors, extreme GPUs and even physics cards, a system might be developed that mimics V-Ray in a realtime scenario.

      But not gonna happen probably
      lol my imagination got away with me again
      :(

    • #44703
      Anonymous
      Inactive

      They are trying that at the minute: –

      It's not quite realtime, but it's not far off.

      All the best,
      Ash

      PS. Sorry I can't put the link in as this is my first post, I'll put it in the next.

    • #44704
      Anonymous
      Inactive
    • #44726
      Anonymous
      Inactive

      All graininess of the quality aside, that is a huge leap in technology.

      Glad to see it being made.
      I can't see DirectX being bullied out yet, because very few are even using 10, and now 11 is out, and ATI were on the mark this time, beating Nvidia with their HD 5870 series, which is so new it's still warm from production.

      I see Windows 7 uses 11, but from a junior programmer's and artist's perspective, all it seems to say to me is this:
      It's like Adobe Photoshop. Every year they release a new version of the software, and that's all well and good, but all they really change is the form: they make it a little more pleasing to the eye with shiny buttons, shiny tabs and other shiny things. I wish they would give us some new features worth remembering and using instead of a glossy version of the previous release.

      DirectX is the same in a sense, so this must be why a lot of games still use DX9.

      Other than serving as the backend for XNA, DirectX doesn't look like it's going anywhere too fast, but it still kicks ass.
      :)

    • #44727
      Anonymous
      Inactive

      They are trying that at the minute: –

      It's not quite realtime, but it's not far off.

      The mental ray guys are at it too:

      http://www.mentalimages.com/products/iray

  • The forum ‘Programming’ is closed to new topics and replies.