Impossible to use NVIDIA graphics card with Prepar3D v2.2

Saul
Posts: 3510
Joined: Mon Mar 04, 2013 1:02 pm
Location: Manchester, UK

Post by Saul »

Hi Beau,



Is this a new development for v2.2, enabling the use of DX10 cards for the main rendering window? In v2.1, DX10 was available for secondary windows only, I think for instrument panels, and v2.0 did not implement DX10 at all?



minime
Posts: 1198
Joined: Mon Jun 10, 2013 4:33 pm

Post by minime »

On systems that dynamically decide for themselves which graphics card is used, Prepar3D only sees the lesser graphics card. It is the same on my laptop.
Nullack
Posts: 285
Joined: Thu Jul 11, 2013 3:33 am

Post by Nullack »

Beau, if it's a DirectX API call bug, I wonder who else out on the interwebs might have already solved this with working solution code? Surely someone else has run into this dev issue before.
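
The closest thing I've found so far is the exported-variable hint from NVIDIA's "Optimus Rendering Policies" document. I haven't verified it against Prepar3D myself, so treat this as an untested sketch, but games reportedly just export a couple of globals from the .exe so that the driver picks the discrete GPU:

Code:
// Sketch: exported globals that hint hybrid-graphics drivers to run the
// process on the high-performance GPU. NvOptimusEnablement is documented in
// NVIDIA's "Optimus Rendering Policies" guide (driver 302+), and
// AmdPowerXpressRequestHighPerformance is the AMD counterpart. Both must be
// exported from the executable itself, not from a DLL.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

As far as I can tell, the driver simply looks for those symbols in the process image at launch, so nothing else in the code would need to change.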
Beau Hollis
Lockheed Martin
Posts: 2452
Joined: Wed Oct 06, 2010 3:25 pm

Post by Beau Hollis »

There is a help article which should explain how D3D10 support works; it's linked from the warning you get. It should work for the main scene, but you won't see terrain unless tessellation is disabled. I think shadows only work on 10.1 and up. So if you have both a 10 and an 11 card, you might want tessellation on and keep the D3D10 card doing just panels. If 10 is all you have, turn tessellation off and hope for the best. While it should work, we don't test on 10 cards, so there is no guarantee it will work for everyone; it's more of an as-is, unofficial support. The main goal was to remove the harsh restrictions that prevented the sim from starting if any non-11 cards were detected.
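
Roughly speaking, the relaxed startup check just asks Direct3D for the best feature level it can get instead of demanding 11_0. This is only an illustrative sketch of the idea, not our actual initialization code:

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for the best available feature level, from 11.0 down to 10.0.
// 11_0 gives terrain tessellation, 10_1 still supports shadows, and
// 10_0 is the panels-only / "hope for the best" tier described above.
HRESULT CreateBestDevice(ID3D11Device** device, ID3D11DeviceContext** context,
                         D3D_FEATURE_LEVEL* achieved)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    return D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                             wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                             device, achieved, context);
}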
Beau Hollis
Prepar3D Software Architect
psycoma27
Posts: 12
Joined: Fri Jan 31, 2014 4:21 pm

Post by psycoma27 »

Beau, I think it's nearly impossible and would take a lot of time, but I think many people here would find this helpful. Can you make an overview of settings for four different systems (lower-, middle-, and upper-class PC, plus a notebook with an NVIDIA 600/700M-series GPU or an AMD card on the same level)? Three categories: in-game settings, Prepar3D.cfg, and NVIDIA Inspector settings. I know it's difficult, because there will be some people saying "I have nearly the same system but it runs awful..." Maybe as a "no-guarantee guide"?
Saul
Posts: 3510
Joined: Mon Mar 04, 2013 1:02 pm
Location: Manchester, UK

Post by Saul »

Thanks Beau, the explanation is much appreciated.
BobbyHipperson
Posts: 134
Joined: Wed Aug 22, 2012 12:53 am

Post by BobbyHipperson »

For the OP... try doing a clean install of NVIDIA driver 331.82 or earlier. In DxDiag you will still see only the integrated GPU, but Prepar3D v2.2 will "see" your NVIDIA card. It's an NVIDIA Optimus thing. I don't know why, but the later drivers are causing the issue you described.



If I load the 335.23 driver, I get the shimmering menus, and the display setting in Prepar3D shows only the Intel HD 4000 GPU. If I go back to the 331.82 driver, I don't have that problem and that driver works fine.



I don't know why. Try it... as long as you do a "custom" install and check the box for a clean install between drivers, it won't screw up anything. Just remember to change the NVIDIA Control Panel setting from Adaptive to High Performance before firing up Prepar3D.



Bobby
tra757
Posts: 53
Joined: Tue Jul 31, 2012 3:36 am

Post by tra757 »

I have a pretty hefty system with a GTX 770, and even with the settings dialed down the GTX 770 can't keep up with the Xeon. I can't imagine using the 710 with anything but minimal settings.
ptrg
Posts: 4
Joined: Tue Jan 10, 2012 6:09 am

Post by ptrg »

Beau:



I, along with many others, still have this exact issue: while the sim does seem to utilise the card (without it being selected as the primary display in the app), this seems to lead to weird/slow graphics anomalies.



As you are probably aware, more "modern" laptops do not allow the HD XXXX to be disabled, as the dedicated GPU is routed through its display controller; there is no way around this. Nor can the card to be used be chosen manually; the selection is purely automatic, done in software.



And while most games do allow the NVIDIA card to be selected, there are still random ones (including Prepar3D) that do not. That includes trying to force the high-performance card in the NVIDIA Control Panel, both at the Prepar3D.exe level and at the global level. No joy.



Interestingly, as mentioned above, this is also the case for DxDiag. Going back, FSX under DX9 allowed the NVIDIA card to be selected, whereas the DX10 preview, along with DX11 in Prepar3D, does not give this option. So obviously something odd is going on there. I am hoping you, Beau, can pinpoint it.
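
For anyone who wants to see exactly what the sim sees, here is a small diagnostic I put together (my own sketch, nothing official) that prints every adapter DXGI enumerates. On an affected Optimus machine this should list only the Intel GPU, matching what DxDiag shows:

Code:
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Print every adapter DXGI reports along with its dedicated video memory.
// On Optimus laptops this often shows only the integrated Intel GPU.
int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}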



To prove that this is the heart of the issue: a Ubisoft game was similarly affected, and the team over there was able to make changes to their game to allow the NVIDIA card to be selected.

Here are their notes:



Quote:
We are sorry that we didn't notice this problem during the game development.

Some laptops with combined discrete Nvidia GPU and Intel integrated GPU use a technology called Nvidia Optimus. This technology will automatically select the integrated Intel GPU for light applications to save battery life of your laptop. Unfortunately Nvidia Optimus technology doesn't yet have an application profile for Trials Fusion because the game has not yet been released (we are in closed beta).

Because of the missing profile, Optimus always seems to select the integrated Intel GPU instead of the Nvidia Geforce, and unfortunately the Intel GPU doesn't meet the minimum required specifications to run Trials Fusion properly.

We have been in contact with Nvidia about this issue and we will update you when the Optimus profile is available.



Users reported that this didn't fix the issue, so this was followed by:



Quote:
We have found a code fix for the Nvidia Optimus issue (Geforce + Intel integrated). Now the game always selects the Geforce GPU. This fix will be integrated to Friday's patch or the next one. I will keep you informed.



And then:



Quote:
The bug has been identified. The game queries the video memory amount from the wrong GPU on systems with multiple GPUs. As Intel's integrated GPU has no dedicated memory, the game incorrectly assumes there's no memory to store the textures, and this breaks texture streaming. A fix for this bug is already in testing. We will include it in the next patch.



Which was then followed by:



Quote:
This information from sebastianaalton was posted over at the No 1080p thread and should clear up some info here:



"There were THREE bugs that caused these issues. We only fixed one of them in the Friday's patch. Fixes for the remaining two will be in the forthcoming patch.



1. Game detects GPU memory size incorrectly if multiple GPUs are present. Integrated GPU has no dedicated memory, so our texture streaming system gets broken.

2. Intel integrated GPUs have additional texture corruption bug caused by GPU floating point rounding issue.

3. On NVIDIA Optimus laptops (integrated GPU + GeForce) the wrong GPU was selected.



We fixed number 3 in the patch. The frame rate is now 60 fps in the game, and you have helped me with NVIDIA Inspector to prove that the GeForce GPU is actually used. Bug 2 doesn't affect Optimus laptops, but bug 1 certainly does. This causes the textures to be corrupted on the GeForce as well (and limits the resolution to 720p). We will fix bugs 1 & 2 in the forthcoming patch."



With the result:



Quote:
The patch did fix the graphics problem and the game looks great! Thanks!
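
Reading between the lines of those notes, my guess (and it is only a guess at their approach, not their actual code) is that the fix for bug 3 amounts to enumerating the DXGI adapters yourself and handing the discrete one to D3D11CreateDevice explicitly, while the fix for bug 1 means not treating an integrated GPU's zero dedicated memory as "no memory". Something along these lines:

Code:
#include <dxgi.h>
#include <d3d11.h>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d11.lib")

// Pick the adapter with the most dedicated VRAM instead of the default one,
// then create the device on it explicitly. When an adapter is supplied,
// the driver type must be D3D_DRIVER_TYPE_UNKNOWN.
HRESULT CreateDeviceOnDiscreteGpu(ID3D11Device** device, ID3D11DeviceContext** context)
{
    IDXGIFactory1* factory = nullptr;
    HRESULT hr = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);
    if (FAILED(hr)) return hr;

    IDXGIAdapter1* best = nullptr;
    SIZE_T bestVram = 0;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Bug 1 above: integrated GPUs report 0 here, so a texture budget
        // should fall back to desc.SharedSystemMemory rather than assume 0.
        if (!best || desc.DedicatedVideoMemory > bestVram) {
            if (best) best->Release();
            best = adapter;
            bestVram = desc.DedicatedVideoMemory;
        } else {
            adapter->Release();
        }
    }
    factory->Release();
    if (!best) return E_FAIL;

    hr = D3D11CreateDevice(best, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                           nullptr, 0, D3D11_SDK_VERSION,
                           device, nullptr, context);
    best->Release();
    return hr;
}

The catch is that on some Optimus drivers the GeForce never shows up in that enumeration at all, in which case only a driver profile or an exported hint from NVIDIA's Optimus documentation (mentioned earlier in the thread) can help.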

ptrg
Posts: 4
Joined: Tue Jan 10, 2012 6:09 am

Post by ptrg »

For the sake of all of us who bought a "high-end gaming" laptop, could you, LM, please implement this same fix?



Thank you very much in advance!



Tom
CoolGunS
Posts: 132
Joined: Fri Jul 19, 2013 12:31 pm

Post by CoolGunS »

When LM first released v2.0, I too wondered whether my GPU was being correctly selected, so much so that, in my efforts to prove to myself that P3D was indeed using the GTX 740M, I used NVIDIA Inspector to overclock the card so far that it would crash if it were actually being used.



And it did :)



Besides, you can monitor the temperature using NVIDIA Inspector, and you'll soon see it start rising.



ptrg
Posts: 4
Joined: Tue Jan 10, 2012 6:09 am

Post by ptrg »

Beau, are you able to shed any light on this? It seems to come down to the sim's interaction with DirectX. And if other games that had the same issue could fix it via a patch, I'm hoping you guys are able to as well?