CGI & 3DS Max

Well, Nvidia recently bought Mental Images and now Ageia. Some people are already wondering whether it would make sense for NVIDIA to buy AMD/ATI, too.

 Joining processing markets

For many years I have favored NVIDIA graphics cards. ATI was said to have good hardware too, but for a long time their drivers simply didn't work well enough for me: acceleration for 3ds max was unreliable, and dual-screen support for accelerated fullscreen 3D content wasn't available for several years either.

I was a bit disappointed with the usability when Nvidia switched their control panel GUI. It improved over time, but recently I was surprised how bad dual-screen management has become. I had a hard time making the driver do what I wanted, and sometimes it failed completely until I reinstalled it. The second output very frequently snapped back to HD TV at the wrong resolution; then it worked for a while, and sometimes it stopped working again. Not very transparent…

In addition, we sometimes hit random blue screens in a cluster of over 50 nodes. I don't know if it's a Virtools issue; I do know that in past years Virtools was occasionally able to trigger blue screens. Recently Virtools told me that more people are reporting unstable Nvidia hardware/software to them. I also found quite a large number of Google hits for the bluescreen keyword referring to the Nvidia driver, many of them from 2007. This makes me think the following:

Vista and DX10 were important for Microsoft and Nvidia. These were two big new things: DX10 drivers/hardware plus a new OS with significant changes. This combo probably consumed a lot of manpower. I wonder if all this had an effect on quality/stability? If so, it may mean that as Nvidia moves more and more into other markets (partly forced by Intel's efforts) and grows and grows… quality may suffer (even more)?

I hope not! We will see; there are certainly interesting times ahead … 🙂

If you are using Nvidia cards, do you also feel that they have become less solid/stable – or is everything still working for you as usual?

If you read Beyond3D or Tomf's blog, then you have probably already heard of Intel's project 'Larrabee'. If not: the basic idea is to mix CPU and GPU functionality into one 'multi mini-core' system; one could call that a "CGPU". The target for 2009 is up to 16 cores (maybe fewer), each capable of 4 hardware threads.

The source article ends with: 

Now do you understand why AMD had to buy ATI to survive? 

In the context of this mixed architecture, which might be more flexible to program, raytracing (as an alternative to rasterization) for real-time rendering is discussed. Another quote from the above article is:

Now do you see why Nvidia is dead?

Maybe you know about Mental Images, the creator of the famous MentalRay offline renderer that ships with 3ds Max, Maya and XSI … they have been bought by NVIDIA! With Gelato, Nvidia already provides another rendering product for the DCC market, but maybe the acquisition was also interesting for them with regard to Intel's Larrabee project, where real-time raytracing (RTRT) is being researched.

So what will the future bring? Maybe instead of two pairs of separate competitors, 'AMD<->Intel' and 'NVIDIA<->ATI', we will have a triangle: 'Intel<->NVIDIA<->AMD/ATI'?

(P.S. Actually, as I am just now discovering, Larrabee details have already been public since April 2006.)

Last week I saw a presentation from ATI about their new FireGL cards. The presenter said that the new cards will come with new drivers that have been completely rewritten from scratch. We all know (and they know it too) that their drivers have been a weak point. That's also why I always voted for Nvidia cards. My experience with ATI cards under 3ds max (e.g. in 2004) was very bad. Also, when I needed 3D acceleration on multi-monitor setups, ATI didn't offer that feature at all. The last time I checked (I think Dec 2005 / Jan 2007), they had just released a very new driver with this feature, and it didn't work very well for me, which only confirmed my preference for NVIDIA cards.

An interesting aspect was that he mentioned a collaboration with Dassault Systemes: as they provide one of the most widely used CAD applications in the industry (CATIA), ATI worked with them to improve performance. He said that DS once reported being able to display and handle 90 million faces (or was it vertices? Sorry, I don't remember exactly) inside CATIA. This is done via Vertex Buffer Objects.
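For context, this is roughly what a Vertex Buffer Object looks like in OpenGL – a minimal sketch, assuming the GL 1.5 buffer API (loaded here via GLEW) and a hypothetical position-only vertex format; the wrapper function names are my own, not from any particular SDK:

```cpp
#include <GL/glew.h>   // exposes glGenBuffers & co. beyond GL 1.1 on Windows
#include <vector>

// Upload the vertex data once into GPU memory, so huge meshes
// don't have to be re-sent over the bus every frame.
GLuint createVertexBuffer(const std::vector<float>& positions)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);                        // reserve a buffer object name
    glBindBuffer(GL_ARRAY_BUFFER, vbo);           // make it the active array buffer
    glBufferData(GL_ARRAY_BUFFER,
                 positions.size() * sizeof(float),
                 positions.data(),
                 GL_STATIC_DRAW);                 // static: driver may keep it on the card
    return vbo;
}

// Per frame: draw straight out of GPU memory, no per-vertex CPU traffic.
void drawVertexBuffer(GLuint vbo, GLsizei vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, nullptr);     // offset 0 into the bound VBO
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

Because the geometry stays resident on the card, the per-frame cost is essentially just the draw call – which is what makes figures like 90 million faces plausible at all.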

The cards will have up to 320 unified shader pipelines and up to 2 GB of RAM – these numbers are, of course, for the high-end version, the "V8650".

The short answer is: no!

The long answer is: 3ds max hides complexity. Before per-pixel handling via programmable pixel shaders became available on consumer graphics cards, shading components across a triangle were mainly calculated by interpolating the values between the three vertices. A triangle is flat, so you can interpolate in a linear/straight way – for example, the vertex colors and normals.
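As a quick illustration (the `float3` struct and the function are just my own sketch, not any particular API), this is all that "interpolating across a flat triangle" means:

```cpp
#include <cstdio>

// Tiny stand-in for any per-vertex attribute (color, normal, UV, ...).
struct float3 { float x, y, z; };

// Linear (barycentric) interpolation across a flat triangle: every
// point inside is a weighted mix of the three corner values, with
// w0 + w1 + w2 == 1.
float3 interpolate(const float3& a, const float3& b, const float3& c,
                   float w0, float w1, float w2)
{
    return { w0 * a.x + w1 * b.x + w2 * c.x,
             w0 * a.y + w1 * b.y + w2 * c.y,
             w0 * a.z + w1 * b.z + w2 * c.z };
}

int main()
{
    float3 red   = { 1, 0, 0 };
    float3 green = { 0, 1, 0 };
    float3 blue  = { 0, 0, 1 };
    // The triangle's center mixes all three vertex colors equally.
    float3 center = interpolate(red, green, blue, 1/3.f, 1/3.f, 1/3.f);
    std::printf("%.2f %.2f %.2f\n", center.x, center.y, center.z);
}
```

The important consequence: whatever should vary across the surface has to be stored per vertex – which is exactly why vertices sometimes need to be duplicated.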

Now, take a mesh inside 3ds max – let's say a cube in its default state. It has 6 smoothing groups, because there are hard edges between the faces. How many normals do you need for each corner vertex? Each corner touches 3 faces, each pointing in a different direction and having "hard" seams at its borders.

We need 3 normals for each of the 8 corners … that's 24 vertices. OK, now convert it to an 'Editable Poly', go into the face sub-level and hit CTRL+A to select all faces. Hit "Clear All" and then smoothing group "1" to assign a single smoothing group to all faces. Export this and load it into Virtools. And?

Oh, it isn't 24 anymore, but it isn't 8 either? Yes! That's because vertices store not only normals but also UV coordinates and color/material information. So in 3ds max, do the following:

  • add the "UV Unwrap" modifier to the cube
  • go into face sub-level mode
  • select all
  • open the "Edit…" dialog
  • select Mapping -> Unfold Mapping ('walk to closest face') -> Ok
  • You now have a cross-like UV layout
  • go into vertex sub-level mode
  • click through each vertex and notice that sometimes another one is highlighted in blue
  • right-click and select "Target Weld" from the context menu
  • now weld each vertex to its "blue partner" in case it has one

Import the smoothed cube, which now has no UV seams, into Virtools … what does the mesh setup say? 8 vertices, finally!!!
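Under the hood, an exporter has to split a 3ds max vertex whenever two adjacent faces disagree about any per-vertex attribute. Here is a minimal sketch of that idea – the struct layout and key are my own illustration, not the actual Virtools exporter:

```cpp
#include <cstddef>
#include <map>
#include <tuple>
#include <vector>

// One exported vertex: position plus everything else stored per vertex.
struct Vertex {
    float px, py, pz;   // position
    float nx, ny, nz;   // normal
    float u, v;         // texture coordinate
};

// Key covering all attributes: two face corners can only share an
// exported vertex if they agree on position AND normal AND UV.
using Key = std::tuple<float, float, float,   // position
                       float, float, float,   // normal
                       float, float>;         // UV

// For every face corner the exporter asks: "did we already emit a
// vertex with exactly these attributes?" Hard edges (differing
// normals) or UV seams force duplicates – that is why the default
// cube needs 24 vertices (3 normals x 8 corners), and only the
// smoothed, seam-welded cube collapses to 8.
std::vector<Vertex> deduplicate(const std::vector<Vertex>& corners)
{
    std::map<Key, std::size_t> seen;
    std::vector<Vertex> out;
    for (const Vertex& c : corners) {
        Key k{ c.px, c.py, c.pz, c.nx, c.ny, c.nz, c.u, c.v };
        if (seen.find(k) == seen.end()) {
            seen[k] = out.size();
            out.push_back(c);   // first occurrence: emit a new vertex
        }
        // else: reuse the index stored in seen[k] for the index buffer
    }
    return out;
}
```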

Cool

I hope this little tutorial has shed some light on the topic, so that you can now better understand why there might be differences in vertex count between Virtools and 3ds max.

3ds max 2008 has been announced. Something interesting is the GPU shadowing system for the viewport. Check out the video about the "review rendering" feature; it supports up to 64 lights.

3ds max 2008 preview – GPU-based review rendering for shadows

I also like the new working pivot and the new LOD options. The ability to manually UV-unwrap multiple objects at the same time also sounds very useful.

And … finally … a better maxscript editor!! Yay 🙂 

I just read about the announced Quest 4.0 features. They include

  • COLLADA support (including ability to import at run-time)
  • GUI Builder and GUI Widgets
  • Weather System
  • Landscape System
  • Shadow System
  • Wii Controller support
  • Newton physics engine
  • Object Orientation via Interfaces And Classes on top of channels. (Exposed members editable via a Grid GUI. Sounds a bit like "components".)

Sources: Newsletters (February and June) and a Forum Thread

Why is Cg 2.0 of special interest?

Let's have a look at this Nvidia PDF from GDC 2007 that talks about GeForce 8 OpenGL extensions. You will see it's Shader 4.0 related… Shader 4.0 is DirectX 10 stuff! Now, this PDF also says the following:

Cg 2.0 -> Not a GL extension -> Will support the capabilities 

So when will Cg 2.0 be available? Well … it's already released! The Cg 2.0 Toolkit is part of the Nvidia OpenGL SDK:

Please obtain the Cg 2.0 Toolkit from the full OpenGL SDK 10 installer.

I think this means DX10-level features on a GeForce 8 are already available under Windows XP! Now … what if Virtools 4.x came with a Cg 2.0 rasterizer?
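For illustration, here is a minimal sketch of driving Cg from C++ (the cgGL calls are from the public Cg runtime; the shader file name and entry point are hypothetical, and a current OpenGL context plus error handling are assumed):

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>
#include <cstdio>

int main()
{
    CGcontext ctx = cgCreateContext();

    // Ask the runtime for the best fragment profile the card/driver
    // supports – on a GeForce 8 with Cg 2.0 this should resolve to a
    // Shader 4.0 level profile (gp4fp), even under Windows XP.
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);
    std::printf("best fragment profile: %s\n", cgGetProfileString(profile));

    // "shade.cg" and "mainFP" are made-up names for this sketch.
    CGprogram prog = cgCreateProgramFromFile(
        ctx, CG_SOURCE, "shade.cg", profile, "mainFP", nullptr);
    if (prog) {
        cgGLLoadProgram(prog);       // compile and hand it to the driver
        cgGLBindProgram(prog);
        cgGLEnableProfile(profile);  // subsequent draw calls use this shader
    }

    cgDestroyContext(ctx);
    return 0;
}
```

The point being: because Cg compiles to whatever profile the driver offers, an engine rasterizer built on it would pick up Shader 4.0 capabilities without needing DX10 or Vista.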

As Patrick said:

cool shit

😄

Here's a video interview with Nicholas Francis from Unity3D. Now, why do I mention this?

  • it's stated that they are interested in becoming the 3D equivalent of Flash
  • you see transformation gizmos
  • you see asset hot-loading
  • you see gfx card emulation
  • you see network emulation (in the menu)

I know another tool that would benefit from having such features, too. 😉

Edit: It's actually not with "Tom Higgins" but with "Nicholas Francis"