Question for Beau re: Multi-GPU, Multi-Monitor Setup

karrilon
Posts: 11
Joined: Fri Jun 01, 2012 3:01 pm

Post by karrilon »

Hi Beau



I read a post you made recently explaining that you use one GPU per display because V2 doesn't support multi-channel. I'm planning on running 3x GTX 680s in my main PC for 3 or 5 undocked views, and then running Panel Builder on a second, networked PC for the avionics. Are you running your cards in SLI/Crossfire? If not, could you explain the setup for me please?



Thanks

Richie
jimcooper1
Posts: 715
Joined: Fri Jan 21, 2011 3:37 pm

Post by jimcooper1 »



Quote from karrilon on April 30, 2014, 16:27:

I read a post you made recently explaining you use one GPU per display because V2 doesn't support multi-channel.



Can you find that post and provide a link to it?



I'm running a single PC with 2 GPUs, and each GPU is driving 4 monitors, so I doubt Beau posted that you should only use one monitor per GPU.



Jim
go_noah
Posts: 84
Joined: Tue Sep 25, 2012 8:06 pm

Post by go_noah »

Is this it?



http://www.prepar3d.com/forum-5/?mingle ... pic&t=6352



Quote from Beau on April 8, 2014, 08:31:

A quick note on SLI:

- If you have multiple monitors and multiple views, then you'll generally be better off driving a display from each video card than using SLI. If you have 2 cards and 1 monitor, then it's worth giving SLI a shot.

- We're still waiting on NVidia to create an SLI driver profile for Prepar3D v2, so the drivers may not be picking the best SLI mode for our app by default.

- We still have more work to do to improve SLI performance, but we chose to focus on autogen vegetation performance for this release as it was a big contributor to OOMs and CPU boundedness.

- If you want to check out what the best-case improvement of SLI might be in the future, you can try this:

1. Make a copy of your Prepar3D.exe and rename it AFR-FriendlyD3D.exe. Now make a new shortcut to the AFR-FriendlyD3D.exe and launch that. This is a developer trick for telling the SLI drivers not to copy any resources between GPUs each frame. (See NVidia's SLI Best Practices Doc for more info: http://developer.download.nvidia.com/wh ... 11_Feb.pdf) In a perfect world, we wouldn't need any resources to be copied, and this would result in a nice perf boost with no side effects. In the real world, we do have some features that rely on resources that persist frame-to-frame. HDR is one of them, so you'll probably want to turn that off to avoid getting an odd pulsing effect. You may also see some garbage geometry streak across the screen here and there. (At least that's what I was seeing on a dual Titan rig with the drivers I was using at the time.)



Also, any time you enable/disable SLI in the driver settings, you should clear out your shader cache or you may get driver crashes. This really shouldn't happen, since cached shaders are compiled in a non-device-specific way, but it does seem to happen nonetheless. The same is sometimes true after driver updates.
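
The two housekeeping steps Beau describes (copying the exe for AFR-friendly SLI behavior, and clearing the shader cache after toggling SLI or updating drivers) are easy to script. Here's a minimal Python sketch; the install path and cache path are assumptions, so adjust them for your machine:

Code:

import os
import shutil

# Assumed locations -- check your own install and user folders.
P3D_INSTALL = r"C:\Program Files (x86)\Lockheed Martin\Prepar3D v2"
SHADER_CACHE = os.path.expandvars(
    r"%LOCALAPPDATA%\Lockheed Martin\Prepar3D v2\Shaders")

def make_afr_friendly_copy():
    # Copy Prepar3D.exe to AFR-FriendlyD3D.exe; launching the copy tells
    # the SLI drivers to skip per-frame resource copies, per Beau's note.
    src = os.path.join(P3D_INSTALL, "Prepar3D.exe")
    dst = os.path.join(P3D_INSTALL, "AFR-FriendlyD3D.exe")
    shutil.copy2(src, dst)
    return dst

def clear_shader_cache():
    # Delete the compiled shader cache; the sim rebuilds it on next launch.
    if os.path.isdir(SHADER_CACHE):
        shutil.rmtree(SHADER_CACHE)

if __name__ == "__main__":
    print("Created", make_afr_friendly_copy())
    clear_shader_cache()
    print("Shader cache cleared.")

Remember to launch the renamed exe via a new shortcut, and to turn HDR off as Beau suggests.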

CPU - 3570K oc 4.2 16GB memory GPU - Gigabyte GTX 780 3GB memory - Win 10 64bit
Bronco
Posts: 22
Joined: Tue Apr 10, 2012 10:49 am

Post by Bronco »



Quote from jimcooper1 on April 30, 2014, 19:14:

I'm running a single PC with 2 GPUs, and each GPU is driving 4 monitors.



Jim, running eight monitors is impressive.

What GPUs are you using?



gb.
jimcooper1
Posts: 715
Joined: Fri Jan 21, 2011 3:37 pm

Post by jimcooper1 »



Quote from Bronco on May 1, 2014, 03:01:

Jim, running eight monitors is impressive.

What GPUs are you using?



2 x HD7850. They're in a development prototype, so we haven't used the best available, but with everything set at medium, simple clouds, water effects low, and most shadows off, we're getting in excess of 20 fps without stutters. When we're ready to deploy the final solution, we'll use whatever GPUs are best at the date of purchase!



Here it is configured with a 3x2 grid for the outside views, an instructor station, and an instrument panel.





Jim
karrilon
Posts: 11
Joined: Fri Jun 01, 2012 3:01 pm

Post by karrilon »



Quote from go_noah on April 30, 2014, 19:43:

Is this it?

http://www.prepar3d.com/forum-5/?mingle ... pic&t=6352

[Beau's SLI notes snipped; quoted in full above]





Yeah, that's the one.
jimcooper1
Posts: 715
Joined: Fri Jan 21, 2011 3:37 pm

Post by jimcooper1 »



Quote from karrilon on May 1, 2014, 15:26:

Yeah, that's the one.



What Beau is saying there is that SLI is worth considering if you are running a single monitor. If you want to run multiple monitors, then SLI is not appropriate.



Jim
Beau Hollis
Lockheed Martin
Posts: 2452
Joined: Wed Oct 06, 2010 3:25 pm

Post by Beau Hollis »

If your settings and your GPU have you raster- or pixel-bound, then resolution scaling will degrade performance, and more GPUs can be used to help alleviate the situation. As Jim commented, I was saying that in such a case multiple GPUs in a standard configuration will work as well as or better than SLI, assuming you have multiple views and multiple monitors. How many views and monitors you drive per GPU is totally up to the user, of course. We have demo rigs that run quite well driving 12 displays from a single desktop, with appropriately high-end hardware and settings tuned for a given training device and scenario.
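
To see why being pixel-bound makes resolution scaling so expensive, and why spreading views across GPUs helps, here's a back-of-the-envelope sketch (all resolutions and counts are illustrative assumptions, not measurements):

Code:

# When a view is pixel/raster bound, GPU cost grows roughly with pixel count.
base = 1920 * 1080  # one 1080p view

# Resolution scaling applies in both dimensions, so cost grows quadratically.
for scale in (1.0, 1.5, 2.0):
    print(f"scale {scale:.1f}: {scale ** 2:.2f}x the pixel work")

# Three 1080p views: one GPU shoulders all of it, or three GPUs split it.
print("one GPU, three views:", 3 * base, "pixels per frame")
print("one view per GPU:    ", base, "pixels per frame per GPU")

Under that rough model, three views on one card triples the per-frame pixel work, while one view per card keeps each GPU at the single-view load.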
Beau Hollis
Prepar3D Software Architect