Impossible to use NVIDIA graphics card with Prepar3D v2.2
- Beau Hollis
- Lockheed Martin
- Posts: 2452
- Joined: Wed Oct 06, 2010 3:25 pm
There is a help article that explains how D3D10 support works; it's linked from the warning you get. It should work for the main scene, but you won't see terrain unless tessellation is disabled. I think shadows only work on 10.1 and up. So if you have both a 10 and an 11 card, you might want tessellation on and keep the D3D10 card doing just panels. If a 10 card is all you have, turn tessellation off and hope for the best. While it should work, we don't test on 10 cards, so there is no guarantee it will work for everyone; it's more of an as-is, unofficial support. The main goal was to remove the harsh restrictions that prevented the sim from starting if any non-11 cards were detected.
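The capability gating described above can be sketched roughly as follows. This is a minimal illustration with invented names (`GpuCaps`, `supportsTessellation`, `supportsShadows`); Prepar3D's real checks happen internally against Direct3D feature levels:

```cpp
// Hypothetical sketch of the capability gating described above: terrain
// tessellation needs a Direct3D 11 feature level (hull/domain shaders
// are D3D11-only), while shadows need feature level 10.1 or higher.
// All names here are invented for illustration.
struct GpuCaps {
    int featureMajor;  // e.g. 10 or 11
    int featureMinor;  // e.g. 0 or 1
};

bool supportsTessellation(const GpuCaps& c) {
    return c.featureMajor >= 11;  // tessellation stages are D3D11-only
}

bool supportsShadows(const GpuCaps& c) {
    return c.featureMajor > 10 ||
           (c.featureMajor == 10 && c.featureMinor >= 1);
}
```

On a plain feature-level-10.0 card both checks fail, which matches the advice above: turn tessellation off and expect no shadows.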
Beau Hollis
Prepar3D Software Architect
Beau, I think it's nearly impossible and would take a lot of time, but I think many people here would find this helpful: can you make an overview of settings for four different systems (lower-, middle-, and upper-class PC, plus a notebook with an NVIDIA 600/700M-series card or an AMD card on the same level)? Three categories: in-game settings, Prepar3D.cfg, and NVIDIA Inspector settings. I know it's difficult, because some people will say "I have nearly the same system but it runs awfully..." Maybe as a "no-guarantee guide"?
-
- Posts: 134
- Joined: Wed Aug 22, 2012 12:53 am
For the OP: try a clean install of NVIDIA driver 331.82 or earlier. In DXDIAG you will still see only the integrated GPU, but Prepar3D v2.2 will "see" your NVIDIA card. It's an NVIDIA Optimus thing. I don't know why, but the later drivers are causing the issue you described.
If I load the 335.23 driver, I get the shimmering menus, and the display setting in Prepar3D shows only the Intel HD 4000 GPU. If I go back to the 331.82 driver, I don't have that problem and that driver works fine.
I don't know why. Try it; as long as you do a "custom" install and check the box for a clean install between drivers, it won't break anything. Just remember to change the NVIDIA Control Panel power setting from Adaptive to High Performance before firing up Prepar3D.
Bobby
Beau:
I, along with many others, still have this exact issue; while Prepar3D does seem to utilise the card (without it being selectable as the primary display in the app), it also seems to lead to weird/slow graphics anomalies.
As you are probably aware, more "modern" laptops do not allow the Intel HD XXXX to be disabled, as the dedicated GPU's output is routed through the integrated GPU's display controller; there is no way around this. Nor can the card to be used be chosen manually; the selection is purely automatic in software.
And while most games do allow the NVIDIA card to be selected, there are still random ones (including Prepar3D) that do not. This includes trying to force the high-performance card in the NVIDIA Control Panel, both at the Prepar3D.exe level and globally. No joy.
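As an aside, NVIDIA's Optimus rendering-policies documentation describes an application-side way to request the discrete GPU: exporting a known global symbol from the executable. Only the application's developers can apply it, and this is a Windows-only fragment rather than a runnable sample:

```cpp
// Windows-only fragment: exporting this symbol from the .exe tells the
// NVIDIA Optimus driver to prefer the high-performance GPU, regardless
// of control-panel profile settings. AMD switchable graphics honours a
// similar export (AmdPowerXpressRequestHighPerformance).
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```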
Interestingly, as mentioned above, the same is true in DXDIAG. Going back, FSX under DX9 allowed the NVIDIA card to be selected, whereas the DX10 preview, along with DX11 in Prepar3D, does not give this option. So obviously something odd is going on there; I am hoping you can pinpoint it, Beau.
To show that this is the heart of the issue: a Ubisoft game was similarly affected, and the team there was able to change their game to allow the NVIDIA card to be selected.
Here are their notes:
Quote:
We are sorry that we didn't notice this problem during the game development.
Some laptops with combined discrete Nvidia GPU and Intel integrated GPU use a technology called Nvidia Optimus. This technology will automatically select the integrated Intel GPU for light applications to save battery life of your laptop. Unfortunately Nvidia Optimus technology doesn't yet have an application profile for Trials Fusion because the game has not yet been released (we are in closed beta).
Because of the missing profile, Optimus always seems to select the integrated Intel GPU instead of the Nvidia Geforce, and unfortunately the Intel GPU doesn't meet the minimum required specifications to run Trials Fusion properly.
We have been in contact with Nvidia about this issue and we will update you when the Optimus profile is available.
Users reported that this didn't fix the issue, so this was followed by:
Quote:
We have found a code fix for the Nvidia Optimus issue (Geforce + Intel integrated). Now the game always selects the Geforce GPU. This fix will be integrated to Friday's patch or the next one. I will keep you informed.
And then:
Quote:
The bug has been identified. The game queries video memory from the wrong GPU on systems with multiple GPUs. As Intel's integrated GPU has no dedicated memory, the game incorrectly assumes there's no memory to store the textures, and this breaks texture streaming. A fix for this bug is already in testing. We will include it in the next patch.
Which was then followed by:
Quote:
This information from sebastianaalton was over at the No 1080p thread and should clear up some info here:
"There were THREE bugs that caused these issues. We only fixed one of them in the Friday's patch. Fixes for the remaining two will be in the forthcoming patch.
1. Game detects GPU memory size incorrectly if multiple GPUs are present. Integrated GPU has no dedicated memory, so our texture streaming system gets broken.
2. Intel integrated GPUs have additional texture corruption bug caused by GPU floating point rounding issue.
3. On NVIDIA Optimus laptops (integrated GPU + GeForce) the wrong GPU was selected.
We fixed number 3 in the patch. The frame rate is now 60 fps in the game, and you have helped me with NVIDIA Inspector to prove that the GeForce GPU is actually used. Bug 2 doesn't affect Optimus laptops, but bug 1 certainly does. This causes the textures to be corrupted also on the GeForce (and limit the resolution to 720p). We will fix bugs 1 & 2 in the forthcoming patch."
With the result:
Quote:
The patch did fix the graphics problem and the game looks great! Thanks!
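The first and third bugs quoted above can be illustrated with a small sketch. The struct below mirrors the two relevant fields that DXGI actually reports per adapter (`DXGI_ADAPTER_DESC`'s `DedicatedVideoMemory` and `SharedSystemMemory`); the function names and fallback logic are my own invention for illustration, not Ubisoft's code:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Mirrors the relevant fields of DXGI_ADAPTER_DESC.
struct AdapterDesc {
    uint64_t dedicatedVideoMemory;  // often 0 on integrated GPUs
    uint64_t sharedSystemMemory;    // system RAM the iGPU can borrow
};

// Bug 1 as quoted: counting only dedicated VRAM gives an integrated
// GPU a zero texture budget, which breaks texture streaming.
uint64_t textureBudgetBuggy(const AdapterDesc& a) {
    return a.dedicatedVideoMemory;
}

// Fix: fall back to shared system memory when there is no dedicated
// VRAM, so streaming still has a sane budget.
uint64_t textureBudgetFixed(const AdapterDesc& a) {
    return a.dedicatedVideoMemory > 0 ? a.dedicatedVideoMemory
                                      : a.sharedSystemMemory;
}

// Bug 3 analogue: instead of taking whichever adapter is enumerated
// first (often the integrated one on Optimus laptops), prefer the
// adapter with the most dedicated VRAM.
std::size_t pickAdapter(const std::vector<AdapterDesc>& adapters) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < adapters.size(); ++i) {
        if (adapters[i].dedicatedVideoMemory >
            adapters[best].dedicatedVideoMemory) {
            best = i;
        }
    }
    return best;
}
```

An adapter with 0 bytes of dedicated memory but 4 GB of shared memory gets a zero budget from the buggy version and a usable one from the fixed version, which is exactly the failure mode the quote describes.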
When LM first released v2.0, I too wondered whether my GPU was being correctly selected; so much so that, in my efforts to prove to myself that P3D was indeed using the GTX 740M, I used NVIDIA Inspector to overclock the card enough that it would crash if actually used.
And it did.
Besides, you can monitor the temperature using NI and you'll soon see it start rising.