Hi All,
I have a client who has two servers that are identical in terms of hardware and operating system (BIOS and firmware versions uncertain). Let's call them Server A and Server B.
They are using both of these servers to parse data which I provide to them from my own server. We'll call this Data C.
On Server A, parsing the data takes roughly 3 hours, while on Server B it takes roughly 6 hours. Neither server has anything running in the background except the standard Windows services.
On Server A, CPU utilization is around 70%, while on Server B it is about 90%.
Looking at logical core utilization, all 16 cores are in use on Server A, but on Server B only 8 of the 16 cores are being used.
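One thing that might be worth ruling out is processor affinity: if the parsing process on Server B is pinned to 8 cores (or Windows is only exposing one NUMA node or processor group to it), that could produce exactly this half-the-cores pattern. A minimal sketch for checking this from Python, assuming Python is available on the servers (the `sched_getaffinity` call is POSIX-only; the Windows equivalents are noted in the comments):

```python
import os

# Total logical processors visible to the OS
total = os.cpu_count()
print(f"Logical processors: {total}")

# On Linux/POSIX, report the set of cores this process is allowed
# to run on (its affinity mask). On Windows, the equivalent check is
# Task Manager -> Details -> right-click the process -> Set affinity,
# or in PowerShell: (Get-Process -Id $PID).ProcessorAffinity
if hasattr(os, "sched_getaffinity"):
    allowed = len(os.sched_getaffinity(0))
    print(f"Cores this process may use: {allowed}")
    if allowed < total:
        print("Process is restricted to a subset of cores")
```

If the affinity mask on Server B only covers 8 cores while Server A's covers all 16, that would point to a configuration difference rather than a hardware fault.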
What could be the issue here?
(If this is the wrong forum section, kindly move it to the right group.)