Setting processor affinity on Windows #412
birdman3131 started this discussion in Tutorials
How to set processor affinity on Windows
Go to C:\GameserverApp\containers\serverfiles and create a file named "a.ps1".
Edit a.ps1 and put the following in it. The processor affinity number needs to be different per game server.
ASE
ASA without the API
ASA with the API. Note the significantly increased delay time.
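The scripts themselves did not survive above, but the general shape of a.ps1 is a sketch like the following. The process name ("ShooterGameServer" for ASE; ASA uses a different executable) and the delay values are assumptions here, not the original author's exact numbers; adjust both for your own server, and use a much longer delay when ASA is launched through the API loader, since the loader starts the real server process late.

```powershell
# Sketch of a.ps1 -- process name and delay are assumptions; adjust per server.
# Wait for the game server process to exist before touching its affinity.
# When using the ASA API loader, this delay needs to be significantly longer.
Start-Sleep -Seconds 60

# ASE example; for ASA, substitute the ASA server process name.
$proc = Get-Process -Name "ShooterGameServer" -ErrorAction Stop

# The affinity is a decimal bitmask of logical processors.
# 4092 = CPUs 2-11 (example value only; see the bitmask section below).
$proc.ProcessorAffinity = 4092
```

Remember that, as noted above, the affinity number needs to be different for each game server so they don't fight over the same cores.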
Figuring out what to set for the processor affinity.
The affinity value is a 0-indexed binary bitmask expressed in decimal: bit n corresponds to logical processor n. The quickest way to figure it out is to use this calculator https://arthrift.com/cpu.html and grab the decimal number at the bottom.
I would recommend leaving at least CPUs 0 and 1 free for Windows to use; it often prefers them.
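As a sanity check on the calculator's output, you can compute the mask by hand: each selected CPU n contributes 2^n to the decimal value. A short PowerShell sketch (the CPU range here is only an example):

```powershell
# Example: pin to logical processors 2 through 11.
$cpus = 2..11
$mask = 0
foreach ($cpu in $cpus) {
    $mask = $mask -bor (1 -shl $cpu)   # set bit number $cpu
}
$mask   # 4092 in decimal, i.e. binary 111111111100
```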
Further info on setting the affinity
I recommend grabbing a program called hwloc so that you can see the processors on your system.
Once downloaded, run \hwloc-win64-build-2.9.3\bin\lstopo.exe
Hit F to toggle collapsing.
The P#0 is the one we are interested in for the bitmask.
For my system I have 2 processor sockets. Each socket has 12 cores, and each core has 2 logical processors, for a total of 48 "CPUs" for the bitmask to use. (You will often hear these called threads.)
I don't particularly recommend crossing the boundary between processors (often called the NUMA boundary), so try to keep that in mind when setting processor affinity. (In the screenshot that happens between P#23 and P#24.)
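For a layout like the one described above (48 logical processors with the NUMA boundary between P#23 and P#24), a mask that stays on the first socket while still leaving CPUs 0 and 1 for Windows could be built like this sketch (the socket size is my system's, adjust for yours):

```powershell
# All 24 logical processors on the first socket...
$socket0 = (1 -shl 24) - 1          # bits 0-23 set
# ...minus CPUs 0 and 1, which Windows often prefers for itself.
$mask = $socket0 -band (-bnot 3)    # clear bits 0 and 1
$mask                               # 16777212 decimal
```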