
Read this article at https://www.pugetsystems.com/guides/639
William George (Puget Labs Technician)

Why Choose a Xeon

Written on April 7, 2015 by William George

Why Use a Xeon?

For over 15 years, Intel has branded their x86 server and workstation processors under the name “Xeon”. They have ranged from simple variants of Intel’s mainstream processor line with some extra features enabled - like ECC memory support - to much more advanced designs with added processor capabilities and cache memory.


Despite Intel directing the Xeons toward specific types of users and applications, there has been some confusion over when they are the appropriate choice versus a more mainstream processor (like a modern Core i5 or i7). I’ve had gamers ask about a Xeon because they thought they were more powerful, and that it would help performance in their games. Likewise, I have had businesses ask about running servers on Core i7 processors. So when does a Xeon make sense, and what do they really bring to the table?


ECC Memory - The most prominent feature of the Xeon processors is support for ECC memory. As far as I am aware, all Xeons have supported it, and the protection ECC provides against single-bit memory errors (which it can detect and correct on the fly) is critical for systems where uptime and reliability are key. A few mainstream Intel processors have also supported ECC over the years, when paired with the right chipset / motherboard, but to ensure proper support for this technology it is best to go with a Xeon.
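To illustrate how that on-the-fly correction works, here is a toy Hamming-code sketch in Python. This is only a conceptual model - real ECC DIMMs apply a wider SECDED code across 64 data bits in the memory controller hardware, not in software:

```python
# Toy illustration of ECC's single-bit error correction using a
# Hamming(12,8) code over one byte. Real ECC memory uses a wider
# SECDED code over 64 data bits, implemented in hardware.

DATA_POSITIONS = [3, 5, 6, 7, 9, 10, 11, 12]   # non-power-of-two slots
PARITY_POSITIONS = [1, 2, 4, 8]                # power-of-two slots

def encode(data: int) -> int:
    """Encode 8 data bits into a 12-bit Hamming codeword."""
    assert 0 <= data < 256
    word = 0
    for i, pos in enumerate(DATA_POSITIONS):
        if (data >> i) & 1:
            word |= 1 << (pos - 1)
    # Each parity bit covers every position whose index includes it.
    for p in PARITY_POSITIONS:
        parity = 0
        for pos in range(1, 13):
            if (pos & p) and (word >> (pos - 1)) & 1:
                parity ^= 1
        word |= parity << (p - 1)
    return word

def decode(word: int) -> tuple:
    """Return (data, syndrome); a nonzero syndrome names the bit
    position that was flipped, and that bit is corrected in place."""
    syndrome = 0
    for p in PARITY_POSITIONS:
        parity = 0
        for pos in range(1, 13):
            if (pos & p) and (word >> (pos - 1)) & 1:
                parity ^= 1
        syndrome |= parity * p
    if syndrome:                     # single-bit error: flip it back
        word ^= 1 << (syndrome - 1)
    data = 0
    for i, pos in enumerate(DATA_POSITIONS):
        if (word >> (pos - 1)) & 1:
            data |= 1 << i
    return data, syndrome

# Any single flipped bit is detected and corrected:
codeword = encode(0b10110010)
corrupted = codeword ^ (1 << 6)      # flip the bit at position 7
data, syndrome = decode(corrupted)
assert data == 0b10110010 and syndrome == 7
```

The key property is that any one flipped bit produces a unique nonzero syndrome, which is exactly why a single-bit memory error can be fixed without interrupting the system.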


Multiple CPUs - Some applications can benefit tremendously from having a lot of processor cores, a large amount of memory, or high memory bandwidth (or all three). In such situations, a system with more than one CPU can be a great way to go… but mainstream processors like the Core series don’t support that. Many Xeons do, however, thanks to added on-chip logic that lets the CPUs share access to memory and coordinate workloads. Each CPU in such a configuration brings its own processing cores, memory controller, and set of memory modules, so you gain in raw computing capacity as well as in the amount of memory that can be installed (capacity) and moved around at the same time (bandwidth). The multi-CPU capability of current generation Xeons can be found in the first digit of their model name. For example:


Xeon E3-1246 v3 (1 = single CPU only)

Xeon E5-2460 v3 (2 = dual CPU support, but can also be used alone)

Xeon E5-4627 v2 (4 = quad CPU support)


Each processor also has a PCI-Express controller that supports a certain number of lanes (depending on the model), so having multiple CPUs can enable motherboards with more PCI-Express slots. That can be useful for stacking lots of accelerator or co-processor cards like the Intel Xeon Phi or NVIDIA Tesla.
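As a quick sketch of the naming rule above, this hypothetical Python helper (the function name and regex are mine, not anything Intel provides) pulls the socket count out of an E3/E5/E7 model name:

```python
import re

def max_sockets(model: str) -> int:
    """Infer the supported socket count from a Xeon E3/E5/E7 model
    name: under the naming scheme described above, the first digit of
    the four-digit number is the maximum CPUs usable together."""
    m = re.search(r"E[357][-\s]?(\d)\d{3}", model)
    if not m:
        raise ValueError("unrecognized Xeon model: " + model)
    return int(m.group(1))

# The examples from the list above:
assert max_sockets("Xeon E3 1246 V3") == 1
assert max_sockets("Xeon E5 2460 V3") == 2
assert max_sockets("Xeon E5 4627 V2") == 4
```

Note this only covers the E3/E5/E7 "v" generations current as of this writing; Intel could change the scheme in a future branding refresh.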

Higher Core Counts - In addition to the potential for using more than one CPU in a system to increase threaded workload performance, some Xeons simply feature more cores than anything on the more mainstream side of Intel’s processor lines. Currently the Core i3 processors top out at two cores, Core i5 at four cores, and Core i7 at eight cores, but the Xeon E5 series goes up to eighteen cores! As you climb in core count the base clock speed drops somewhat, and cost rises due to complexity, but heavily threaded applications can see big boosts from those added cores.
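The cores-versus-clock tradeoff can be reasoned about roughly with Amdahl's law. The sketch below uses illustrative, assumed clock speeds and parallel fractions - not benchmark data:

```python
def speedup(cores: int, parallel_fraction: float) -> float:
    """Amdahl's law: the upper bound on speedup when only
    `parallel_fraction` of the work scales across `cores`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

def relative_throughput(cores: int, ghz: float, parallel_fraction: float) -> float:
    """Crude throughput model: clock speed times Amdahl speedup."""
    return ghz * speedup(cores, parallel_fraction)

# A hypothetical 18-core 2.3GHz Xeon vs a 4-core 4.0GHz chip:
# heavily threaded work (95% parallel) favors the many-core part...
assert relative_throughput(18, 2.3, 0.95) > relative_throughput(4, 4.0, 0.95)
# ...while a half-serial workload favors the higher clock speed.
assert relative_throughput(4, 4.0, 0.50) > relative_throughput(18, 2.3, 0.50)
```

The same simple model also explains the gaming advice later in this article: a workload that is mostly serial gains more from clock speed than from extra cores.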


Virtualization Support - A lot of modern server workloads are being virtualized, which means the software and even the operating system are running inside isolated ‘bubbles’ of virtualized hardware. This allows a single host OS to manage several of these virtual environments, and isolates what happens within those environments to some degree - but it requires special extensions to be supported by the underlying hardware. Xeon CPUs generally have good support for that, as do most server and workstation class motherboards. You can sometimes find those features on more mainstream hardware as well, but it is a lot less likely that the whole chain (CPU, chipset, BIOS, etc) will support them… so the safest path is to get a Xeon based setup if your plans involve virtualization.
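On Linux, the CPU end of that chain can be checked by looking for the relevant flags in /proc/cpuinfo (`vmx` for Intel VT-x, `svm` for AMD-V, `ept` for Intel's nested paging). A small sketch, run here against a sample flags line so it works on any machine:

```python
def virtualization_features(flags_line: str) -> dict:
    """Report common hardware-virtualization flags from a
    /proc/cpuinfo 'flags' line (Linux)."""
    flags = set(flags_line.split())
    return {
        "VT-x (Intel)": "vmx" in flags,
        "AMD-V": "svm" in flags,
        "EPT (nested paging)": "ept" in flags,
    }

# Sample flags line; on a real Linux box you would read the 'flags'
# line from /proc/cpuinfo instead.
sample = "fpu vme msr sse sse2 vmx ept aes avx"
features = virtualization_features(sample)
assert features["VT-x (Intel)"] and features["EPT (nested paging)"]
assert not features["AMD-V"]
```

Remember this only tells you about the CPU; the BIOS and chipset must also enable the feature, which is where mainstream boards often fall short.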


Those are a few good reasons to go with a Xeon based system, though not all of them. There are other factors like remote management, multiple and/or high-speed network ports, and more which are found on the server and workstation class motherboards populated by Xeons… but those are not specific to the Xeon processors themselves.


What about the bad ideas, though? Or to put it another way, the misinformed reasons to get a Xeon? Here are some of the top reasons I’ve seen people use for wanting a Xeon, when they would in reality be better served by something else:

“A more ‘powerful’ Xeon will help my gaming, right?” -

Some people assume that since Xeon processors are found in high-end workstations and servers they must also be better for gaming. That is definitely *not* true, because computer games don’t need a lot of cores or the other various advantages that Xeons bring to the table. As of the time I am writing this, no games use more than four cores - and many use fewer. What games are sensitive to, though, is clock speed… and as I mentioned earlier, when you increase the number of cores you usually reduce the clock speed. So getting an 18-core Xeon for gaming would actually result in much *lower* performance than a cheaper, higher clock speed 4-core processor. Some argument can be made for a 6-core CPU for gaming (future-proofing, lots of background applications running at the same time, etc) but that can be fulfilled within the Core i7 line, without needing to go for a Xeon.


This principle extends beyond just games too. Some professional, workstation-grade applications don’t use a lot of cores, and so may be better off with mainstream processors. There are also low core count / high clock speed Xeons, but the more mainstream CPUs from Intel tend to reach slightly higher clock speeds. Which leads to...


“I want to have the fastest clock speeds possible for my calculations, maybe even overclock the CPU…” - If you want to overclock then you don’t want a Xeon. They are not multiplier unlocked like the ‘enthusiast’ (upper mainstream) processors that Intel aims at the overclocking crowd, and the motherboards that are built for use with Xeons are not likely to feature the sort of fine-tuning options which are needed either. Beyond that, Xeon-based platforms are all about reliability - and overclocking is by its very nature a more risky proposition. You are pushing the CPU beyond its rated specifications, and while it can be done conservatively (we make sure every system we overclock here passes the same stability tests that normal systems do) it is inherently a less reliable option, especially over a long period of time. There are applications where that is worthwhile, both for fun (like extreme gaming) and actual productive workloads… but you will be much better served in such situations by processors and motherboards which are designed to give you the options needed to get good results.


Conclusion


Xeons are great for workstations and servers, or any time you need more multi-threaded performance or reliability than a mainstream, single-CPU system can offer. We have seen a major trend in the last couple of years toward Xeon based systems, as we have seen our customer base shift in the direction of professional users - but they aren't always the right answer. Hopefully the information above can aid you in making a smart choice about what processor platform is best for your needs, but remember that we are here to help design and build systems for you! Please contact us via phone or email if you need help configuring a system.

Tags: Xeon, Processor, CPU, Intel, Server, Workstation
Stephens_Chris

Great post, William! There are so many iterations of Intel chips out there, it can be tough to navigate them all. You mention that you are seeing a trend towards Xeon systems. Do you find yourself recommending Xeons more often than you used to? If so, why do you think that is the case?

Posted on 2015-04-07 20:47:17

Well, I'd say it is a combination of a couple things:

- Customers coming to us wanting higher performance systems, with workloads that are capable of using many cores / multiple CPUs. It used to be hard to find software that effectively threaded beyond 2-4 cores, but it is getting more common now.

- Customers who value stability and are willing to pay a little more for above-average levels of it. That makes ECC memory valuable and appealing. Since most mainstream Intel CPUs don't support ECC, and AMD processors (which often do support ECC) are less powerful than Intel models, this results in the Xeon being a good choice for many users.

Posted on 2015-04-07 20:51:38
Ramos

William, not to bash at all, but I found the article a little simple, and it doesn't compare them to anything.

Like the pros/cons vs Opterons, the Abu Dhabi version e.g. Both have lots of cores; Xeons are more modern and run on more modern mainboards etc, but the dirt-low price of Opterons is alluring for 20+ core setups, and unlike Hyperthreading (which Mr. Kinghorn proved isn't all roses) Opterons run all "real" cores, rather than threading each core into 2 threads.

I am a Xeon fan myself now because of DDR4 memory and the PCI-e 3.0 lane capabilities (for HBAs for SSDs in JBOD for software RAIDs) over the ever-older Opterons, but on pricing vs FLOPS delivered, aren't the Opterons still a fairly good low-end bid for up to 64 cores? (A 4x 16-core Abu Dhabi server, e.g., is still cheap compared to even 8c x 3GHz E5 v3's.)

Could you make this article 1 of 2 or 3?

Like the simple introduction to Xeons, and then 1-2 advanced ones, with the 2nd comparing the pros/cons vs Opterons and a 3rd comparing low-end Xeon setups to the new 14nm all-in-one Xeon from Intel, the D-1540 on (e.g.) a Supermicro X10SDV-TLN4F board for sub-$1000, with 8c/16t for a very low price (for cheap and potent nodes for a Hadoop cluster, e.g.)

Some scaling and pricing vs performance would be really nice, especially since you guys are so competent at this stuff. Of course there needs to be some entry-level stuff too, but this could be a good opportunity for getting into scaling and comparisons for different scenarios, apart from HPC and Linpack testing, which Kinghorn has a monopoly on. I was thinking distributed DB performance, tested, or Hadoop node scaling and price vs boost in performance etc.
-----
I know it is slightly outside scope, but any news on when the Xeon Phi Knights Landing (for CPU sockets, not the PCI-e version) will arrive for sale?
Any news on what a single 72c/288t 1.25/1.33GHz (whatever it ends up being) Xeon Phi CPU will cost? (and with what RAM options?)

Thanks for the great work so far. Waiting patiently for the hoped-for Kinghorn article on the Knights Landing (CPU socket version) when they get released for sale and he can get to test them. Juiciest CPU release/news this year in my world :) Should completely change the HPC/MPP scene if the rumors about RAM at 5x DDR4 speeds hold true.

Posted on 2015-04-18 23:15:25

Thank you for your comment and ideas, Ramos. I'm sorry this blog entry didn't end up being quite what you were looking for. My goal here was to give an overall view of Xeons vs Intel's consumer-oriented processors, one that would hold up and be accurate not only today but in several years' time (until Intel changes their branding or approach). I didn't want to get into specifics of the current generation of Xeons, or bring in other CPU manufacturers; either of those would widen the focus and make the info less applicable as time goes on. The same thing goes for bringing pricing into the mix, as that can fluctuate.

For performance info on Opteron vs Xeon, or various current Xeons vs each other, I think Don or Matt might be better qualified to write. Those topics also might be more appropriate for our articles, rather than our blog. It is good to know what folks are looking for, though, so again I appreciate your response :)

Posted on 2015-04-20 15:54:37

X Plane 10 (http://www.x-plane.com) advertises that it will use as many CPUs and as many cores as you can provide. So would a rig optimized for X Plane 10 call for a Xeon?

Posted on 2015-07-30 02:12:17

I don't personally play X-Plane, but from what I have read and seen it seems like going above a quad-core is a bit wasteful. More cores may be usable, but it seems like the 'sweet spot' is a Core i5:

https://www.pugetsystems.co...

What I would worry about, scaling up to more cores, is that you lose some clock speed when doing so. I don't think I could really recommend above a six-core CPU, personally, since those are available at decently high clock speeds (up to 3.5GHz). A bigger focus for X-Plane should probably be the video card, if you want to spend more and increase frame-rates or the ability to crank up graphics quality.

Posted on 2015-07-30 06:08:53

Thanks for the reply. That's very helpful to know. The reason I ask is I like Macs and I WANT a Mac Pro (which comes with a Xeon processor available with upwards of twelve cores) but don't really NEED one. Was curious if X-Plane would give me something to put in the pro column.

Of course the reality will probably be that I could get an iMac AND a great gaming rig for less than a Mac Pro.

Posted on 2016-04-02 04:36:43

Well, what about DX12?
It uses multithreading for gaming, so in the new single-CPU mode Knights Landing is capable of, won't it be great for gaming, being recognized as a second CPU?

Posted on 2015-09-06 19:04:12

DirectX 12 is supposed to help make more use of the CPU and improve game performance, but I doubt that a massively parallel processor like Knights Landing will really be helpful in games even then.

Posted on 2015-09-06 21:26:34
Adam Wykes

At the time of this writing there were a great many games - both big titles and an increasing number of indie titles - that use more than four cores. Crysis 3, Far Cry 4, Battlefield 4, Planetside 2...

Posted on 2015-09-11 18:34:27

Hmm, not really. Crysis 3 sees maybe 5% increase going from 4->6 cores at similar clock speeds, and that is only visible at low quality where the game is not GPU limited (which Crysis 3 usually is): http://www.techspot.com/rev...

Even less difference in FarCry 4: http://www.techspot.com/rev...

Battlefield 4 too: http://www.techspot.com/rev...

Posted on 2015-09-11 18:45:05
Adam Wykes

Forget the differential in performance, because it's easy to see they didn't do the testing right; if you're evaluating CPU performance, why on earth would you not have graphics settings down at the minimum, on a low resolution? Look at the differential in price. If you want to pick up the i5-3470, chances are it's still going to end up costing you more than the FX-8350, and the 8350 matches it at stock speeds in Crysis 3. That's certainly not because of single-thread performance, anyone could tell you that. So it's down to the additional threads. Another thing these tests don't consider is frame-time variance; frame rate alone does not tell the story of CPU performance in gaming. As a final note, all of this is kind of after the fact; the point I was making is that the game uses the cores; not necessarily what effect that usage has on performance. Ipso facto, the game uses all the threads you can throw at it. More games since then have also done this; GTA V, Witcher 3, Satellite Reign... my Xeon W3680 is loving it.

Posted on 2015-09-11 19:39:21
goblin072 .

I think you drank the Kool-Aid. The games don't scale with more than 4 cores. You might be able to play the game without it slowing down as much if you have 6 cores and are doing other things like web surfing, watching Netflix, etc.

Posted on 2016-02-19 21:24:09
Adam Wykes

You're out of the loop. Compare recent gaming benches between the FX-4350 and FX-8350. Play a round of BF4 MP on an i7 vs a similar i5. Quad core dominance is coming to an end right now. Every single major gaming engine developed in the past few years scales beyond 4 threads.

Posted on 2016-02-19 21:45:55
goblin072 .

You sound deluded. Point me to the benchmarks showing gaming on an i7 quad vs an i7 6-core with a huge improvement.

Posted on 2016-02-19 22:09:26
Adam Wykes

And you sound ignorant and lazy. I'll post those benches shortly, but first I gotta ask, judging by the way you worded your response - do you even know what the threading difference between a quad core i7 and a quad core i5 is?

Posted on 2016-02-19 22:13:32
goblin072 .

No need to get upset. The i5 is not hyper-threaded; i7s are. Same number of physical cores. Are there any other areas you need help on?

Posted on 2016-02-19 22:29:17
Adam Wykes

You call me deluded and expect me not to get upset. That's delusional.

Anyway, here are the benches and other information about what I'm talking about. Educate yourself:

https://www.youtube.com/wat... (FX-8350 octocore better than FX-4350 quad core for gaming)

http://www.tomshardware.com... (hexacore i7 and octocore FX chips blowing i5 and other quad cores out of the water in Crysis 3)

https://youtu.be/5Cj8RP4kEG... (highly-threaded i7 and FX chips doing massively better in BF4 while i5 chokes as explained by veteran BF4 player)

http://wccftech.com/planets... of PS4 optimization for Planetside 2, which used to have major bottlenecking issues. It now runs on all 12 of my computer's threads and provides great framerates overall, and proof can be provided upon request.)

https://youtu.be/epIlB49SNT... explains why it is that you can play modern games using a 2.33ghz octocore Xeon setup - games are developed with the PS4's eight CPU cores @ 2ghz each now)

Pay attention to frametimes and minimum frames as well; they tell an important story alongside FPS averages, and should not be ignored.

Posted on 2016-02-19 23:07:49
goblin072 .

Most of that information is kiddie stuff. AMD cpus are not really relevant for gaming anymore. They eat 2x the power.

As far as cores. I saw the I5 and I7 were about the same in planet side 2.

It would be better to show an i5 vs an i7 6-core (with HT off), which would show 4 physical cores vs 6 physical cores.

I agree with William M George on this. Really not much reason to go beyond 4 cores for games.

And I did not say you were delusional just that you sounded like a delusional person. Maybe you are not.

Some day it might make a worthwhile difference to have 6 or more physical cores for games, but right now a high-clocking quad is the best bang for the buck.

Posted on 2016-02-19 23:30:04
Adam Wykes

Ad hominem attack, moving the goalposts, pointless abstraction, moving the goalposts again (without even a basis for doing so this time), an appeal to authority, semantic evasion, and a conclusion based on assertion without undisproven evidence, all in order one right after the other. Breathtaking elevation of the discussion, sir.

Posted on 2016-02-19 23:57:09
goblin072 .

Try to stay on topic.

Show 4 intel cores vs 6 to 8 intel cores in a game with a large improvement. You have failed to do so.

Posted on 2016-02-20 00:04:39
Adam Wykes

This is how I finally figured out you aren't being earnest - not even reading the benches I supplied you, where that exact situation does in fact play out in my favor. You're a troll, not exactly a goblin, and have been reported as such.

Posted on 2016-02-20 00:08:24

Alright guys, let's keep this professional! Disagreeing is fine, debating is fine. Attacking will get you a ban -- that's not how we roll here. I'm not accusing either of you in particular... let's just stop it here.

Posted on 2016-02-20 00:13:10
goblin072 .

Adam, read this on games and cores. I agree with this.
https://www.rockpapershotgu...

For me, I want more cores for Hyper-V and VMs. Having the extra cores allows me to use 4 to play a game while the others are doing VM work. ECC is important too, that's why I am in the Xeon thread.

I wish games scaled well beyond 4 cores; the game consoles have 8 cores, but they are very weak cores. I don't see it translating over for some time. At least not enough to make an 8-core Intel worth it for just games.

Posted on 2016-02-20 05:28:38
Adam Wykes

I do not think that is a good analysis. He basically skips over the console point and doesn't really address why the consoles chose to go that route in the first place (cost + increasing demands on the CPU side). Nor does he consider why game engines are universally moving to multicore, or how the kinds of games we play determine the CPU loads we see. Massively multiplayer and/or massively physics- or voxel-based games can and do use large numbers of threads to achieve their technical feats, and Star Swarm demos do not begin to address the space.

In the future, it may be possible to match an eight core with a really fast quad core that juggles threads well, but that quad core will probably cost more. That's the present situation between the i5 and FX-8000 series, and that's before even considering whether gamers will continue to tack on peripheral software like steam, mumble, OBS, twitch browser windows, and so on to their gaming experiences - something that multi threaded systems can probably handle better (I say probably because you don't usually see attempts at benchmarking this kind of load).

Posted on 2016-02-20 06:08:51
goblin072 .

Adam, I hope you are correct. It would be nice to see those extra cores helping in a game. We have hit a wall in GHz, so going sideways with more cores has been the trend. It's up to the developers and the demand. Not many people have more than 4 cores, so the demand right now is pretty low. And it might take them more work and time to make a game that uses more threads and cores effectively. Right now 4 cores + a high clock is the sweet spot. Most games don't use more than 2 cores; progress has been slow, and not many even use 4 atm. 4+ would be nice but I will believe it when I see it.

Posted on 2016-02-20 07:18:41
Adam Wykes

Here's a list off the top of my head of all the games I know of that use more than 4 threads:

Planetside 2
Witcher 3
GTA V
Crysis 3
Satellite Reign
Battlefield 4
Star Wars: Battlegrounds 2
H1Z1
Star Citizen
No Man's Sky
Project: Cars
Beam.NG
Boss Constructor

From small to large projects, extra threading is being incorporated on a large scale right now. Most of these titles only came out in the past few years, and they represent a trend that will continue for all the reasons I gave plus the reason you gave - we have hit a bump in Moore's Law and need to compensate somehow. Multi threading on phones, consoles, and finally PCs is now the rule of the day.

Posted on 2016-02-20 17:30:49
goblin072 .

Agree on all, but I have yet to find a game that scales well past 4 cores. Crysis 3 says it does, but I have not found that to be the case. Play it on a 4-core i5 then move to a 6-core and there is not much of a WOW effect in the scaling. I think that has more to do with the programming.

Do you have a xeon system?

I think some of those games might scale better on a console multi cpu system than a PC right now. The weaker the cpu the more it can help. But I had hoped they would someday get around to writing better code, there just was not a huge demand. Maybe phones and consoles will force them to make good use of 8 PC cores. I don't think the PC will be the reason they do it, it will be an afterthought, the PC game market is puny vs phones and consoles.

Posted on 2016-02-21 08:32:15
Adam Wykes

No, it's not going to be a wow effect, because of the fractional effect - adding 2 cores to a quad is a 50% core increase, while adding 2 more on top of that is only a 33% increase. But from my perspective as a gamer, it is worth the small improvement and/or cost savings (if you go FX).

I do have a Xeon; as I said in my first post, I run a Xeon six core, twelve thread W3680.

You're right to think PC will not change the market. It was always going to be phones and consoles that brought the change

Posted on 2016-02-21 17:11:43
goblin072 .

Oh yeah, that's a nice old Xeon. I have twin 5870s at home, 128GB. Were you able to find a way around the 24GB limit on the W3680? It's very similar otherwise.

Posted on 2016-02-21 19:27:54
Adam Wykes

No need to try to overcome the RAM limit for me; I don't do serious work at home. I wasn't even going to get a Xeon like this except I found it when I was dumpster diving and decided I would build a system around it. What's even cooler about it is that it's an ES version, which is I think the thing that allows me to overclock it. Currently it is sitting at 4ghz on all cores under a Cooler Master Seidon 120V AIO cooler in an AntecGX500 case. For reference, here's my complete system:

W3680
EVGA FTW3 mobo
16gb DDR3 1600mhz (4 modules, dual channel)
2x GTX 760 2gb in SLI
120gb Mushkin Blaze SSD
250gb SATA III 7200 spinner on SATA III adapter (the SATA III on my mobo is... bad)
500gb SATA II 7200 spinner
Windows 10 Pro 64-bit (all my fun linux boxen are downstairs or in the living room)

The biggest mistake of the whole build was probably the GTX 760 SLI combo; yes it is wonderful when it works, like in the Witcher 3... but half a decade after my first SLI setup (dual 8600 GTS cards), they haven't succeeded in making SLI any more universal than it used to be. Looking to upgrade it early 2017, after we know what the result of AMD Zen and Polaris is.

Posted on 2016-02-21 22:26:18
Adam Wykes

btw I can't find a reference to the Xeon 5870 in Intel's ARK database?

Posted on 2016-02-21 22:27:59
goblin072 .

LOL I must have been thinking of my old AMD 5870 vid card.

I meant a pair of Xeon x5670s. They are in a Dell R610 with a Perc H700 raid and Perc H800 controlling a Lenovo thinkserver SA120 (These are a steal right now, $3,400 box for $200.00.)

http://www.anandtech.com/sh...
http://www.trustedreviews.c...

http://www.newegg.com/Produ...

Posted on 2016-02-21 22:43:47
Adam Wykes

Oh there you go. Not very different from me at all. Lacking in single core perf for gaming, but then that's obviously not what you do with a system that has 24 threads

Posted on 2016-02-22 02:13:07
goblin072 .

Lol, the R610 is far from a gaming system. It's more a tractor; it does not slow down. They are pretty cool feature-wise. With the DRAC 6 card I can see the boot-up and enter the BIOS remotely, just as if I was sitting in front of it. No monitor attached to it, no video card but a 2D Matrox. Redundant power supplies: one goes down, the other kicks in.

Here is a pic of the inside.
http://www.alphr.com/dell/3...

I have like 10 computers; that one is not for gaming.

Posted on 2016-02-23 02:38:37
goblin072 .

Time to grow up; I am not trolling here, please do not start off-topic rants.
I watched 3 of the many links, and I have no interest in seeing a guy hack an 8-core AMD with a GTX 980 - a waste of time.

Please provide numbers, not videos, of real-life benchmarks using Intel CPUs. If you can't, fine, but do not barrage the site with links that do not provide this. It only wastes people's time.

Posted on 2016-02-20 01:24:51
Guest

I went with dual Xeons myself, but I'm disappointed with the OpenGL performance number in Cinebench R15: just 58FPS, and this with a Titan X GPU. Others on the i7 platform report scores ranging from 166 to 202FPS for a Titan X.
In real-world apps, the experience is choppy playback of 24P 4K material in Premiere and 40% of frames dropped on 60P 4K, despite fast SSD drives for each media stream and a separate SSD for the OS and another for temp files.
I thought maybe my Titan X was defective, so I swapped in my GTX 680, which scores 78FPS when plugged into my 7-year-old Core2Quad machine. It got the identical 58FPS when plugged into the Supermicro X10DRi with two Xeon E5-2630 v3 CPUs.
This system has 128GB of ECC RAM, the recommended brand and part number specified by Supermicro.
I'm at a loss as to what to do. I'm thinking I should have gone with an i7 at 1/5 the cost and spent the difference on a second Sony FS7.
I have spent 3 hours minimum every night since June 30th, when I built the system, troubleshooting this problem and optimizing Windows 7 64-bit. I've reached the conclusion that 60P 4K is not possible with today's computer technology, and that dual Xeons are not the way to go for an editing system.

Posted on 2015-10-08 04:22:13

Hmm, that sounds odd. The 2630 V3 processors have a fairly low clock speed (2.4GHz base, 3.2GHz max turbo), so they won't perform as well as higher-speed Core i7 processors in situations where only a few cores / threads are active, but the numbers you are seeing sound lower than they should be.

I just looked up a similar build we did for a customer, with those same CPUs and the Titan X video card. It was on an Asus Z10PE-D8 WS motherboard with 64GB of RAM (8 x 8GB modules)... and running Windows 7 Pro. Here are the scores it got in CineBench R15:

Rendering - Single CPU - 104.26 pts
Rendering - Multiple CPU - 2065.05 pts
Shading - OpenGL - 124.49 FPS

I would expect the OpenGL score to be even higher with faster (higher GHz) processors, but that is still more than twice the speed you are seeing. Not sure how that helps, other than to encourage you that something should still be fixable to improve your performance :/

Posted on 2015-10-08 05:16:43
Guest

It sounds like it's RMA time for the motherboard, as something is seriously wrong with it to be performing this poorly, compared to a similar system as you have related.

The tech at Supermicro, who's trying to replicate my issue on an identical hardware setup is getting 102FPS to 111FPS. Same motherboard and GPU.

One thought I had is that I have three displays connected to mine. A 2.5K main UI display, a 4K display for full screen output of DCI 4K, and a 1080P projector. I know the Supermicro tech said he doesn't have a second monitor hooked up to his test unit. I wonder if the second monitor is slowing down the Titan?

Posted on 2015-10-08 07:23:12

I wouldn't think so, but it would be easy to unplug all but one (lower res) output and re-test.

Posted on 2015-10-08 07:57:43
Guest

When I did that, my desktop disappeared.. and my mouse cursor went off-screen, along with all my desktop icons and start menu!

Posted on 2015-10-08 15:52:42

Before unplugging the other screens, make sure you set the one you will keep plugged in as the primary monitor:

http://answers.microsoft.co...

Posted on 2015-10-08 16:12:03
Guest

Oddly, my primary monitor is my primary monitor, until I unplug my secondary - then it becomes my secondary! Some of the confusion could be because Windows designates my primary as monitor 3, and when I unplug the others, it becomes monitor 1 but retains monitor 3's place in the queue, acting like a secondary. Very annoying behavior. This often burns me when I boot the system, because my desktop frequently winds up on the 4K second monitor, running at 640x480! That screws up my icon arrangement and I have to keep power cycling monitors until it comes up the right way, then I have to rearrange icons back to their proper locations. Very irritating. Now that cold weather is here, I just leave the system on 24/7 so I don't have to deal with this every time I start it up.

Posted on 2015-10-08 16:45:23

Hmm, that sounds odd - and not very fun. What happens if you turn it off, unplug all but the one you want, then turn it back on?

Posted on 2015-10-08 16:53:28
Guest

It only works if I plug one monitor in and REBOOT the system. Then it comes up the way you'd expect. But when more than one monitor is connected, annoyances abound. I spent much of the summer trying to find answers, but found many other people with the same question on various forums, and no answers.

Posted on 2015-10-08 21:42:50

When you have a chance to try CineBench again, with just a single monitor connected, let us know. I am curious if that impacts performance or not - I don't expect it to, but something must be going on to cause the low performance you are seeing. Good luck!

Posted on 2015-10-08 21:45:39
Bret Z

My only complaint with the article is that it doesn't address the cheap (but modern) Xeons. For those of us who build PCs for gaming and video editing, the quad-core Xeons are significantly cheaper than their Core i7 counterparts, work with all mainstream LGA 1150 motherboards, and match the performance of those i7s.

They are i7s without the ability to overclock and without iGPUs, at $100 cheaper. For gamers, this is fabulous (gaming is not CPU-dependent anymore; it's primarily a question of graphics).

Additionally, Xeons are quite efficient and can fully power modern games without the need for fancy cooling gimmicks - the supplied heat sink is enough.

Posted on 2015-10-17 11:08:50

Xeons in the same generation, and with the same core count / clock speed, perform identically to a Core i7 or i5 processor (depending on whether they have Hyperthreading or not). You are right that some are available without graphics capability, and those models could save a little money... but in my experience they don't perform any differently than the models with graphics. In fact, I have one of those in my home gaming computer right now - the Xeon E3-1240 v3.

Posted on 2015-10-18 06:16:01
Bret Z

I use the E3-1231 v3. The thing is, while their performance is similar to the i5 or i7, they are significantly cheaper. Mine (also used for gaming) is fully equivalent in benchmarks to an i7 4770, despite costing much less. I just looked at Newegg, and today it's still $60 cheaper even with a $20 sale on the i7. I feel that Xeons are way overlooked, considering they deliver the full performance of an i7 at i5 prices. They aren't unlocked, but when you can buy a higher clock speed and more cores for the money, you don't need to unlock it.

I just strapped an R9 390 8GB GDDR5 graphics card in, and there's no bottleneck at all. I can't imagine there's a benefit to the more expensive i7s.

Posted on 2015-10-18 06:25:18
Atanas Ctonlob

What's your take on the 5.1 Xeon CPU coming out? Will that be any good for gaming?

Posted on 2016-01-27 22:54:25

Assuming the rumors about that are true, yes - I expect it would be great for gaming. It is supposed to have 4 cores, which is ideal for today's games, and the top clock speed of any Intel processor yet made. We'll have to wait and see, though, as this seems almost too good to be true for applications like games, where clock speed trumps the number of cores. Also, price will be a big question: it is supposed to be a dual-CPU-capable Xeon, and those tend to cost more. If the price is too high compared to something like the i7 6700K, then it may not be worthwhile (at least for gaming).

Posted on 2016-01-27 22:57:44
Clearanceman2 .

OK, but it's a great upgrade path on old computers for almost free. For example, I just bought a Xeon E5450 off eBay for $18. I am replacing a Q6600 in an OEM board. I did this last month on another computer. The E5450 benches higher than the Q6600, has 50% more cache, and its TDP is 25 watts lower. It will also run in socket 775 with a $1.50 adapter sticker and a couple of careful snips to the motherboard socket. Less heat, more CPU power, fewer watts. And you can sell the Q6600 for more than $18 anyway, so the upgrade is more or less free. The E5450 is slightly faster than the Q9650, uses fewer watts, and is half the price. If you have an old non-OEM board that overclocks, it's even better. Most of those E5450s will run 25% overclocked.

Posted on 2016-04-07 19:06:36
Richard Monette

Being a computer novice, and due to circumstance, a T5400 fell in my lap one day... with only one X5450 processor and 4GB of FB ECC RAM. I have found it almost unstoppable for the simple task of "obsessive surfing". My fixed income due to disability prevents me from chasing the latest Core i7. Also, when my $60 MSI GT620 quit halfway through the warranty, the folks at MSI sent me an MSI GTX 660 Twin Frozr. I have had a couple of friends drop by and try my machine against their 16GB i7 laptops, and the verdict is always the same: "that thing is better than mine." I don't game and I have few of life's luxuries, so I am quite happy with this machine that cost less than $200. Does the author of this article have any thoughts on what I/we are experiencing? Is it real, or just some subjective nonsense? The only benchmarks I can offer are the WEI scores from Win 7 - from top to bottom they are 7.3, 7.3, 7.9, 7.9, 7.4. Yes, there is a 60GB SSD in there. Thoughts? Thanks.

Posted on 2016-09-12 10:49:54

That is a 3GHz quad-core processor, which is plenty for using the internet and general applications. Combined with the fact that you have an SSD, and assuming a decent internet connection too, I'm not at all surprised that performance for your uses feels great. It probably is more fluid / smooth than a system with a newer CPU but a traditional hard drive.

Posted on 2016-09-12 18:33:55
WAS

As of this post, in 2016, a Xeon X5450 can run most games, with the exception of newer-generation games now demanding more write speed than the motherboard can allow. This 2007 Xeon still gets a 7.4 (max 7.6) on current benchmarks via Microsoft.

Posted on 2016-11-22 17:13:46
WolfLordTK

Hello, Minecraft uses any number of cores you throw at it.

Posted on 2017-02-26 20:02:00

Do you have a citation for this? In my experience, Minecraft - both the server and the client - only use a single core. Mods may affect that, but the base game at least seems to be single-threaded.

Also, this is somewhat off topic as Minecraft was never mentioned in the article :)

Posted on 2017-02-27 07:46:09
fourfires d.

Xeons do have more PCI-E lanes - many more than other mainstream Intel CPUs. Doesn't that help gaming, especially in CF or SLI, and also for M.2 interfaces, etc.?
Can anyone help answer that?

Posted on 2017-03-03 00:39:07

The Intel Xeon E5 series processors, along with the Core i7 6850K, 6900K, and 6950X, have 40 PCI-E lanes, while the Xeon E3 and other Core i3 / i5 / i7 processors have fewer. That doesn't really impact gaming, though, as even with dual video cards (CF / SLI) it's fine to have video cards running at x8 speed instead of x16. PCI-E lanes only become an issue if you want more than two video cards, or if you have a ton of other PCI-E devices - and neither of those situations is common in gaming computers. Instead, games benefit most from high CPU clock speed, making processors like the i5 7600K and i7 7700K the best options in most cases. Future games might start to use more cores, though, and that could change things down the road.
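As an aside, if you're on Linux and curious what link width a card has actually negotiated, `lspci -vv` reports it on each device's "LnkSta:" line. A minimal sketch of pulling the width out of such a line - the sample line below is illustrative, not from any specific system:

```shell
# A "LnkSta" line as printed by `lspci -vv` on Linux; the Width
# field is the link width the device actually negotiated.
# (Sample line for illustration only.)
sample='LnkSta: Speed 8GT/s, Width x8, TrErr- Train- SlotClk+'

# Extract just the width with sed (prints "x8" for the sample above):
echo "$sample" | sed -E 's/.*Width (x[0-9]+).*/\1/'

# On a real system, list every device's negotiated link state with:
#   sudo lspci -vv | grep LnkSta
```

A card capable of x16 may legitimately show a narrower width (e.g. x8) when lanes are shared between slots, which is exactly the situation described above.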

Posted on 2017-03-03 00:46:27
Hard PC Gamer

Xeon gaming capability:
https://www.youtube.com/wat...

Posted on 2017-03-05 05:46:17

Games just don't need - or effectively use - that many cores yet. A quad core with a higher clock speed, like the i7 7700K, will outperform that Xeon at a much lower price. Can Xeons play games? Sure, especially the ones with decent (>3GHz) clock speeds... but they aren't the most cost-effective choice or the fastest option.

But there are many other applications where more cores can be extremely beneficial - along with the option for multiple physical CPUs, ECC memory, etc.

Posted on 2017-03-05 08:41:09
Charlie

Good info to get started with sysadmin work.

Posted on 2017-04-11 18:13:59