Why not buy something like this to flush mount in the bay above your DVD drive?
You're in luck: you chose a laptop with a Thunderbolt 3 port. Otherwise the display wouldn't work, as the Apple TB displays do not work over DisplayPort, despite that being theoretically possible. There are six pages of discussion about this very issue in an Apple support thread here.
This is what you need on Amazon, then you should be good. Good luck!
Really getting shorted on the CPU, RAM, and HDD. Those are all cheap. I think the seller is just hoping people see a 6GB 1060 in a pre-built for around $1K, and not look at the other components. I mean, a $45 hard drive?
> (should not be above 60$) which is capable of running Battlefield 3 and more upcoming games.
The cheapest card that would run BF3 (maybe in medium settings if you are not in a very high resolution) is the Radeon 5670 which can be found for about $60 according to Google Products.
If you want something more powerful yet staying within that price range, you should consider buying used: check r/hardwareswap
It is rainmeter: https://www.rainmeter.net/
You can make it look however you want. There are plenty of skins people have created. This is one I threw together using some bits and pieces from a few different ones I like.
/u/Emi-chan Nice quick thinking there but instead of having to mod the case you could have picked up a regular 3 prong on amazon / your local computer store. They're really common and incredibly cheap - 3 Prong AC Power Cord Cable Plug (12 Feet) https://www.amazon.com/dp/B002WQL9KA/ref=cm_sw_r_awd_OKGEub1ZW4FX7
I reproduced the build here: $910, or $870 w/MIR. So you're paying roughly a $100 premium to have it built by them.
That 100 could be used to turn your 6850 into a 6950. Which is a nice upgrade.
Here's a comparison between the two.
Depends on resolution, I'd say you can run it on Custom setting bordering High/Ultra.
These links highlight the HD 6850 and an X4 at 3GHz; they perform at "playable" frame rates at 1920x1080 on Ultra.
http://www.tomshardware.com/reviews/skyrim-performance-benchmark,3074-8.html
http://www.tomshardware.com/reviews/skyrim-performance-benchmark,3074-9.html
I'd say if you customize the settings and use updated drivers, you can get good playable settings bordering High/Ultra.
>Please don't turn this into a fight between ATI fans and nVidia fans
I don't think I've seen any fanboyism in this subreddit. We're all pretty mature :)
With that out of the way, my main concern was whether either Nvidia or AMD has released an SLI/CrossFire profile for the game. I tried searching, and the results are concerning: I'm seeing a lot of posts reporting no improvement, or even a negative impact, from running multi-GPU.
Here is one such post from May 2011. This person advises to not use multi-gpu. I also checked the AMD CAP release notes from the past year but didn't see anything about Tera. I didn't check nvidia though.
So, I don't recommend multi-gpu unless you have a source you trust saying it works fine.
As for which GPU? Honestly, in most cases either brand is interchangeable. And from what I see, Tera seems CPU-intensive, so if you're doing mass PvP, I'd be more concerned about that.
Decide upon a budget and look at Tom's Hardware "best Video card for the money" article. They update it every month and it's done well.
Good Luck
The HD 5670 is a low-end card. You will likely get more performance from upgrading the GPU than the CPU.
See here:
The HD5670 is literally almost bottom of the barrel.
Any reason you decided the i3 isn't good enough for high? Based on the first benchmark here and the i7 core scaling, I'd say it should do just fine.
I would either go Sandy Bridge if you're impatient, or wait a bit for Bulldozer, which is allegedly right around the corner. Right now, Intel's Sandy Bridge has enough performance that you'll be happy for years to come, but since Bulldozer is supposedly so close, it doesn't hurt to wait. If I were buying a PC at this point, I would be very hesitant to get an AMD processor.
I checked LL's site; looks like they also come in.....GOLD :O click
LGA 1155 can only manage 8 PCIe 3.0 lanes per GPU, which is roughly equivalent in bandwidth to PCIe 2.0 x16 per GPU.
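That equivalence is just arithmetic on per-lane throughput; here's a quick sketch (the encoding-overhead figures are the usual textbook approximations, not something from the comment above):

```python
# Rough per-lane throughput after encoding overhead (approximate figures):
# PCIe 2.0: 5 GT/s with 8b/10b encoding   -> 5 * 8/10 / 8 = 0.5 GB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> 8 * 128/130 / 8 ~ 0.985 GB/s per lane

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    per_lane = {2: 5.0 * 8 / 10 / 8, 3: 8.0 * 128 / 130 / 8}
    return per_lane[gen] * lanes

print(f"PCIe 3.0 x8:  {pcie_bandwidth_gbps(3, 8):.2f} GB/s")
print(f"PCIe 2.0 x16: {pcie_bandwidth_gbps(2, 16):.2f} GB/s")
```

Both come out right around 8 GB/s, which is why the x8/x8 split on 1155 isn't much of a handicap with 3.0 cards.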
From the sidebar:
The Asus Strix 470 is $135 at Jet.com if you use the Triple15 code.
The card's details say 500 watt minimum. And I agree with them.
Even if 430 is enough, running near 100% load on a power supply isn't a good idea in the long run. Increased heat and lowered efficiency.
A 500 watt unit will cut it, but you might want to get a beefier one for extra headroom and future-proofing. I suggest 600.
Here is a list of recommended PSUs listed by wattage. Keep a close eye on the email newsletters of Newegg et al. in the days to come and you'll probably pick up a good deal on one of those listed.
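The headroom point is easy to sanity-check with a little arithmetic; here's a sketch using a made-up ~400 W system draw (actual draw depends on your parts, so treat the numbers as illustrative):

```python
def psu_load_pct(system_draw_w: float, psu_rating_w: float) -> float:
    """Percentage of the PSU's rated capacity the system draws."""
    return system_draw_w / psu_rating_w * 100

# Hypothetical system drawing ~400 W under full load:
draw = 400
for rating in (430, 500, 600):
    print(f"{rating} W PSU -> {psu_load_pct(draw, rating):.0f}% load")
```

A 430 W unit would sit in the 90%+ range (the heat/efficiency problem mentioned above), while a 600 W unit stays comfortably in the middle of its curve.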
As to the performance increase from 550 -> 560. Take a look at anandtech's review of the 550ti. They include the 560ti in the benchmark charts. Looks like roughly a 60% increase in performance over the 550ti.
Whether that's worth the cost of the card is your call.
Toms Hardware does a Best Video Card per price point article every month. The November edition was posted a week ago. Link Here.
The closest one to $300 is the $265 Radeon 6950 2GB.
If you want to spend more or less, take a look at that article. It covers all the price points.
AMD scales better than Nvidia cards in multi-gpu setups 80% of the time. You sound like a fanboy for not knowing that.
In the words of Toms Hardware:
>CrossFire came out with a huge overall scaling lead over SLI, and removing the one title that didn’t reflect that average would have made the lead even bigger. Superior scaling allowed two mid-priced Radeon HD 6950s to approximate the performance of two higher-cost GeForce GTX 570s, while three HD 6950s took the performance win over three GTX 570s.
Here is another study, this time a direct comparison of some high to mid end gpu's from both companies. Scaling is heavily favored to AMD by an average of 20% on dual card setups, with their 6950s at 190+% at all resolutions (up to 197% at 2560x1600). The comparable Nvidia card GTX570 (which is supposed to be more powerful than the 6950) only manages 180%@2560x1600, 171%@1920x1080, and 167%@1680x1050. Keep in mind that the 570 scaled the best out of all Nvidia cards percentage-wise, with the worst (excluding the GTX295), the GTX590.
This is not anecdotal evidence. Wake up and start reading.
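For anyone who wants to reproduce those scaling percentages from raw benchmark numbers, scaling is just the ratio of multi-card to single-card frame rate; here's a sketch with made-up fps figures:

```python
def scaling_pct(single_fps: float, multi_fps: float) -> float:
    """Multi-GPU performance as a percentage of a single card's."""
    return multi_fps / single_fps * 100

# Hypothetical numbers: a card averaging 50 fps alone and 98.5 fps in a pair
print(f"{scaling_pct(50, 98.5):.0f}%")  # the ~197% kind of figure quoted above
```

Anything approaching 200% for a dual-card setup means near-perfect scaling; the Nvidia figures quoted (167-180%) leave noticeably more performance on the table.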
OK, SLI 465s seem to be around a single 570 performance-wise; higher for sure if you OC them. Tom's Hardware says a 570 will run Ultra, so I think a second 465 is the way to go for you. Maybe even toss in an SSD with the money saved.
Only issue is the cards only have 1GB of VRAM, so some of the larger maps will cause an fps drop. Also make sure your PSU and motherboard support SLI.
EDIT: OK, here you go. These are benchmarks that include SLI 470s and various 5xx boards. You can see that SLI 465s should outpace the 570 and probably the 580, even more so if you OC them.
Paradise desk http://gpdesks.com/ https://www.kickstarter.com/projects/155163930/paradise-desk-everything-youve-ever-wanted-in-a-de
It's not quite available yet (you can preorder), but it's pretty freaking awesome. I can't wait to receive mine, which should arrive within the next month or two.
It does have a bit of a price tag but steel construction with all of the built-in extensions is convenient.
Something to consider for the future if you ever want the ultimate desk.
Yes, physically impossible. Do I really need to back it up? I can direct you to any thermodynamics book if you like; it'll be in the first chapter. Otherwise I'll need to give you a lesson in introductory physics.
If you have a system, such as a CPU, that is hot, the only way to get its temperature lower than its surroundings is to actively cool it. An air cooler simply dissipates heat; it does not "cool" like a refrigerator. The coolest you can get your CPU is ambient temperature, where it will be in thermal equilibrium with its surroundings. If you want to get it colder, you have to supply some sort of energy and actively cool it (or cool the surroundings and let them come to equilibrium).
In your picture, they achieved 24 C, which frankly is incorrect, unless they were in a room around ~20 C. I don't know the error bars on utilities like RealTemp, but even on their website they admit that the temperatures are only estimates and will never be fully accurate unless the software is rigorously calibrated.
This is a problem that I have with the benchmarking community: reported temperatures are often wildly inaccurate, or they leave out information such as ambient temperature in their analysis. They also never report error bars, and rarely conduct multiple tests.
I suggest you read up on the laws of thermodynamics.
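To illustrate the point numerically: Newton's law of cooling is a simple model of purely passive (dissipative) cooling, and simulating it shows the temperature only ever approaches ambient from above, never crosses it. The constants here are made up for illustration:

```python
def cool(temp_c: float, ambient_c: float, k: float, steps: int, dt: float = 1.0) -> float:
    """Newton's law of cooling, forward-Euler: dT/dt = -k * (T - ambient)."""
    for _ in range(steps):
        temp_c += -k * (temp_c - ambient_c) * dt
    return temp_c

# A 70 C CPU in a 22 C room approaches, but never drops below, ambient:
t = cool(70.0, 22.0, k=0.05, steps=500)
print(round(t, 4))  # effectively 22.0; passive cooling can't go below ambient
```

Getting below 22 C in that room would require a heat pump doing work, i.e. active cooling, which is exactly the thermodynamic point above.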
Also, I never said I downvoted the guy massively, and I don't know why you are accusing me of doing such a thing.
It makes a click sound when the time flips; personally, I like the sound. I've had the clock for a few months, and so far it has kept accurate time.
https://www.amazon.com/dp/B01F9G43WU/ref=cm_sw_r_cp_apa_i_qb0iFbQAQQ1MT
240gb ssd $39.99
Wasn't hard to find. If you can use reddit, you can use google.
Not trying to be a dick, but come on man. Be a little resourceful.
I mostly use a Steelseries Xai. I love the shape and it does everything I need, has lots of configurability, and looks great.
I needed a wireless mouse to use with my media PC though, so recently got a Logitech G700 - bought a gaming mouse since it's the same price as the non-gaming wireless Logitech I was looking at, and nicer. It's actually pretty nice and the wireless hasn't lagged at all so far in the few games I've played on the media PC - I might try playing some games with it on the main PC, especially since you can plug it in via USB and use it in wired mode too, apparently.

The ergonomics aren't nearly as nice as the Xai though. I find myself using an uncomfortable tilted claw-grip with it fairly often, which makes my hand really tired by the end of the day. With the Xai I can comfortably rest my palm on the mouse and still comfortably hit all the buttons, which isn't the case with the G700.

It has the same visual style as the Xai though, much better than Logitech's other mice IMO - just plain matte black all over with a faint white logo; no blazing lights, bright colours or flimsy-looking buttons.
> if you delete data from it, it can slow down immensely
Have any evidence to back that up? As long as TRIM is enabled, your SSD should maintain a good speed no matter what you do. Filling drives to capacity also slows them down, but I've never seen a claim that deleting/freeing space slows down an SSD.
http://www.howtogeek.com/165542/why-solid-state-drives-slow-down-as-you-fill-them-up/
Seeing as you changed the graphics settings from high to low and it did not improve, you just described a perfect example of a CPU bottleneck.
Since the operations that run on the GPU are not affecting performance, that means the culprit is your CPU. BF3 very much prefers a quad core to a dual. Check out this benchmark. Slowdown from a dual core doesn't really show up on the 2600K, since it's a fast chip, but you can see that the FX-8150 loses a good deal of performance on one module (which should be a fair bit faster than a Core 2 Duo), since it has much slower cores than the i7.
Also, part of the issue with larger maps that have to load in could be coming from the slower RAM that you're using (presumably, since it's 775, you have DDR2 modules).
It sounds like the next thing for you to look at is a new set of motherboard, CPU, and RAM. About $350 (or less with access to a Microcenter) budget will get you a solid board, 2500K, and 8GB. Right now there isn't much that competes well with that (it's hard to recommend the FX line because of high power consumption and inconsistent performance).
I just finished this build a few days ago for my showroom, then promptly sold it today to a guy looking to replace his aging Core 2 Quad rig. I'm pretty proud of it. It was my first time working in the Corsair Carbide 300R, and I really enjoyed it. Obviously, being as it is a crosspost from r/cablemanagement, I put a strong emphasis on tidiness. Far too often it seems that cable "management" stops behind the motherboard tray. I think that the mark of a truly beautiful build means cable management is practiced even in areas that aren't often seen.
The Core i7 is modestly overclocked to 4.0GHz, but could go well beyond that given the cooling setup. The graphics card isn't sexy enough to make panties drop, but it should serve his single-monitor gaming needs for a while to come.
Specs are as follows:
Intel Core i7-3770k CPU
Gigabyte Z77X-D3H Motherboard
16GB Corsair XMS3 DDR3 RAM
EVGA GTX 560ti Classified Edition Graphics Card
Crucial M4 256GB SSD
2TB Western Digital RE4 Enterprise-Class Hard Drive
Corsair H100 Watercooling Loop
Corsair HX750 Modular PSU
Corsair Carbide 300R Case
Windows 7 Professional 64 Bit
Samsung DVD Burner
Mirror in case Imgur goes down again:
There are plenty of good points already here, but I want to harp on about mods. There's absolutely squillions of really awesome user-created content and completely original indie games (Nitronic Rush and Overgrowth come to mind) that might lack polish but excel in innovative gameplay. PC gaming isn't just about consuming games, it's about making them too, and it can be as simple as changing an item's texture or messing with the value for gravity (both are extremely easy to do in the Source Engine). Once you get a taste of it, it's very addictive and can double or triple the 'lifetime' of a game.
Anyway, there's my rant.
http://camelcamelcamel.com/Blue-Microphones-Yeti-USB-Microphone/product/B002VA464S?context=browse
When it first came out it retailed at $150, but for the last couple of years it has always been about $100. I picked mine up for $79 on Black Friday; totally worth it.
Right now on Amazon it's $39.99, $2 more than its cheapest price on record at CamelCamelCamel.com. I've been following this mouse for a month, so I naturally decided to share it with y'all who are interested in buying a solid gaming mouse on a $50-or-under budget.
The newish Corsair 4000d Airflow is very popular, great for airflow, looks good and is easy to build in. I’d recommend it
So unfortunately, it doesn't seem like your motherboard is compatible with fans that run RGB through onboard software. The only way you'll be able to have RGB fans in your case is to buy some that come with a separate remote that changes them to whichever color you want; they're called addressable RGB fans.
As far as it not lighting up when it plugs into the JLED header, I'm not sure how you should address that. I'm sorry I can't be more help there.
Here are some ARGB fans that I thought worked fine when I first built my PC-
upHere Wireless RGB LED 120mm Case Fan,Quiet Edition High Airflow Adjustable Color LED Case Fan for PC Cases, CPU Coolers,Radiators System,5-Pack / C8123 https://www.amazon.com/dp/B07DHM6SW9/ref=cm_sw_r_cp_api_fabt1_d5iVFbMG812BT
I'm sorry it didn't work out for ya, I would send those back and get a refund if you can, or find a new motherboard that has an rgb header if you want to shell out another $80-$100. Best of luck to you! God Bless
Generally I check slickdeals.net. They have a special section specifically geared towards Black Friday. In previous years Gizmodo has compiled a list of awesome Black Friday deals.
Thanks, but that is fairly restricting in terms of the selection. It doesn't show the two monitors I am using at the moment or any on this list I was looking at.
From the tests I've seen they can idle silently (i.e. fan not moving) and are about even with a reference 6970 at full load. Also, as with any card, non-reference models will improve cooling. Or, as you say, use water.
I did also look at the Tom's Hardware review, which showed noise closer to a 6990; it seems odd that the results are so different. Probably just a difference in their methods.
Check this article: Toms Hardware on BF3 performance of various graphics cards
They get around 46.6 fps with a 460, on high settings. And they basically conclude that the CPU should not matter much. So I would say you are good to go, specwise at least.
N/m, I found my answer. I looked up the equivalent of a 5750 and it's a GTS 250, so two GTS 250s in SLI are not as fast as a single GTX 560 Ti, according to this: http://www.tomshardware.com/forum/311999-33-single
I think this is what OP wants, and at the right price too.
You can verify it is the correct temp using HWMonitor; just check that they both match. My guess is it pulls that temp from the same place your system does. So probably, yes.
41 C is a common AIO idle temp, depending on your processor and ambient temp. Maybe a tad high, but it depends on your case and use. I doubt it's wrong, though.
Interestingly, one of the best tools for this is Process ~~Monitor~~ Explorer from Microsoft. You can get the latest version here: ~~http://technet.microsoft.com/en-us/sysinternals/bb896645~~
Correction, Process Monitor and it's here: http://technet.microsoft.com/en-us/sysinternals/bb896653
With dual cards in SLI it records the memory usage per card (for one card but both will use almost exactly the same) whereas on AMD/Crossfire it records the total of 2 cards combined.
It's also a good replacement for task manager, making it possible to CTRL+ALT+DELETE and get to this data ;-)
It has the advantage over Afterburner of causing fewer stability problems. We used it to benchmark Battlefield 3 video memory usage and compared the data to Afterburner. It records the same usage.
Edit: wrong program, link is correct now!!
I pretty much agree with everything /u/kiwiandapple is saying.
I got some Gunnars relatively cheap off of Woot, and actually liked them at first.
Then I started getting headaches, and they just made my eyes feel...weird.
If you feel like your eyes are straining while gaming or whatever, adjust the lighting (never stare at a bright screen in a dark room for long periods of time) or adjust your monitor brightness (f.lux works pretty well).
Unfortunately, it's in Alpha, BUT you can get early access and back it on their website Kingdom Come: Deliverance.
I'm honestly only saying this because I backed it when it was on Kickstarter, and they still have a stretch goal :) I'm pretty hesitant to back Kickstarters, but they already had good demos when they were asking for funding so I have high hopes for it.
EDIT: Hmm, maybe they don't have a stretch goal anymore? It's not really clear from the site. But in any case you can buy early access to the game if that's your thing.
Example from the step up page:
"You have purchased a graphics card for $299.99, tax was $26.25 and $10.25 in shipping, you also took advantage of a $30.00 rebate from EVGA. The product you wish to Step-Up to is listed as MSRP of $399.99.
Add $299.99
Ignore $26.25 tax and $10.25 shipping
Subtract $30.00 for rebate to get $269.99
Subtract $269.99 from $399.99
$130 is the cost of this example, plus return shipping and applicable taxes"
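The quoted example boils down to a small calculation; here's a sketch of it (this just mirrors the example's arithmetic, not any official EVGA calculator):

```python
def step_up_cost(paid: float, rebate: float, new_msrp: float) -> float:
    """Step-Up style cost: MSRP of the new card minus what you effectively
    paid for the old one (purchase price less rebates; tax/shipping ignored)."""
    credit = paid - rebate
    return round(new_msrp - credit, 2)

# The numbers from the quoted example:
print(step_up_cost(paid=299.99, rebate=30.00, new_msrp=399.99))  # 130.0
```

Note that tax and shipping on the original purchase don't count toward your credit, and return shipping plus tax on the new card gets added on top.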
Keep the case and the motherboard. Replace your graphics card with an EVGA GTX 580. Power supply should be upgraded if you want to overclock, and you have room left over in the budget to do it. Have fun.
Well, I would use a real speed test that isn't bought out by ISPs, such as testmy.net. If your internet is really this fast, you should be fine, though. I'd recommend 720p 60fps; your PC should be able to handle it.
Yeah Can You Run It? isn't the best thing out there. YouGamers has a thing called GameO'Meter which is much better.
CCC doesn't have anything to do with your CPU fan, it only controls your graphics card. If you want, you can usually set speed parameters for your CPU fan in your BIOS or in some other software like SpeedFan.
Look on FFsplit's forums, you'll see they have programs that work with full-screen. I forget the name and stuff, but seriously look on their forums.
Okay fine, I wasn't lazy and I got it for you. http://www.ffsplit.com/forums/viewtopic.php?f=8&t=235&sid=95920c1f4d3432d1dedda49eca7f9ca0
Look at 2.1.2, Dxtory
Also, doesn't HDMI max out at 1080p? (1920x1080)
I can't use my 1920x1200 monitor over HDMI (GTX 570); I end up scaling. So I use DVI, which actually supports even higher resolutions AFAIK.
Edit: never mind, ignore my statement, I guess it's my monitor. http://superuser.com/questions/119755/hdmi-with-resolution-2560-x-1440-possible
FYI, regular price on those cans is ~100 bucks. They rarely, if ever, are above that price.
Reading most of these comments, I feel like you just added more fuel to the fire. People are still asking "does my X bottleneck my Y," except now it's "does my CPU bottleneck my GPU!" If people actually paid attention to what you wrote, they'd realize all they need to do is check the CPU and GPU usage while gaming. To do that, all you need is Process Explorer and GPU-Z.
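That rule of thumb can be written down as a crude heuristic. The thresholds below are my own guesses, and the utilization figures would come from Process Explorer and GPU-Z while gaming:

```python
def likely_bottleneck(cpu_pct: float, gpu_pct: float) -> str:
    """Crude heuristic: the pegged component while the other has headroom."""
    if cpu_pct > 90 and gpu_pct < 70:
        return "CPU"
    if gpu_pct > 90 and cpu_pct < 70:
        return "GPU"
    return "unclear"

# Hypothetical readings sampled mid-game:
print(likely_bottleneck(cpu_pct=97, gpu_pct=55))  # CPU
print(likely_bottleneck(cpu_pct=45, gpu_pct=99))  # GPU
```

If both sit near 100%, or neither does, the answer is genuinely "unclear" and you'd look elsewhere (RAM, storage, game engine limits).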
8K DisplayPort Cable 1.4, JSAUX 10ft/3M DP Cable (8K@60Hz 7680x4320, 4K@240Hz, 2K@144Hz) Gold-Plated Braided Ultra High Speed DisplayPort Cord Support Resolution for Laptop PC TV Gaming Monitor(Red) https://www.amazon.com/dp/B08PFBSL59/ref=cm_sw_r_cp_api_glt_fabc_MJRY288840M8J929QFCH
I run two 1440p/240Hz monitors. These are the cables I use, and yes, it does 1440p/240Hz flawlessly. I even left a review on Amazon for it.
Ah I forgot to add those to the list! They are just a CableMod basic extension kit I got off amazon. It doesn’t look like they have the white ones I ordered anymore https://www.amazon.com/CableMod-Basic-Cable-Extension-Kit/dp/B01MXWH6ZW?th=1&psc=1b
I went with a similar setup, but decided not to stress the tall stand so much: only my top monitor is on it, with the dual monitor stand at eye level for the bottom two. I've had this since Jan; no issues, still straight as an arrow with a 27".
https://www.amazon.com/gp/product/B0155LJATK/ref=ppx_yo_dt_b_asin_title_o01_s00?ie=UTF8&psc=1
Update: Have a little teaser for you guys. So far, 147 people have taken the survey. I did a partial tabulation of 121 of those for your benefit as well as to cut down on data inaccuracy.
Let me know what you guys think! Warning: It is a bit unorganized still - https://docs.google.com/spreadsheet/ccc?key=0AiKfVKrkWM29dFM4MlNWSWJCSnNKLWhUWmh0Qy1lN1E#gid=1
All #'s are Averages/5. If multiple options are listed it is because an average isn't available yet.
4 erroneous results were taken out (125 - 4 = 121): 2 people with trackball mice, 1 guy who thought it was important to brag about how good he got at League of Legends with a $5 mouse, and 1 unfortunate Microsoft SideWinder X8 user. (There was a screw-up in the data page that caused the build quality and sensor quality ratings to not show up for it, sorry; retake the survey if you're reading this :)
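For the curious, the tabulation itself is just per-mouse averages with the flagged rows excluded; here's a sketch with made-up responses (the real survey columns and ratings differ):

```python
def tabulate(ratings, exclude_ids=()):
    """Mean rating per mouse, skipping respondents flagged as erroneous."""
    totals = {}
    for respondent_id, mouse, score in ratings:
        if respondent_id in exclude_ids:
            continue
        totals.setdefault(mouse, []).append(score)
    return {mouse: sum(s) / len(s) for mouse, s in totals.items()}

# Made-up rows: (respondent id, mouse, rating out of 5)
rows = [(1, "G500", 4), (2, "G500", 5), (3, "Xai", 4), (4, "Xai", 2)]
print(tabulate(rows, exclude_ids={4}))  # respondent 4 excluded as erroneous
```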
I just happened to check best buy to see what they had locally here, and saw this: http://www.bestbuy.com/site/MM750H+Multimedia+Headset/1307189078.p?id=mp1307189078&skuId=1307189078&st=headset&cp=1&lp=1
What. The. Actual. Fuck. Is. This.
Edit: Here's a screencap in case they 'fix' this error: http://imgur.com/eQpvycQ
The 560ti wins... my heart
But in the benches... the 6950 2GB does really well. There are several cases (certain games or resolutions) where the 560 Ti does better. I really like the non-448-core version because of GF114, which draws fairly little power and can reach very high clocks.
Have you considered using Intel's SSD caching tech (Intel Smart Response Technology)? Instead of loading files onto an SSD manually, you plug it into an existing computer and the driver intelligently places files on it, seamlessly using the cached version instead of fetching it from the rotational HD.
You could get two SSDs: one dedicated to the OS and critical apps, and another with SRT enabled for a traditional HDD filled with games and media.
http://www.anandtech.com/show/4329/intel-z68-chipset-smart-response-technology-ssd-caching-review
Currently it's only on the Z68 chipset, and it can only work with 64GB max. I have a 120GB OS SSD, and I'm going to get another 60GB SSD for SRT on my traditional HD RAID array.
Thanks for the reply. According to anandtech benchmarks, http://www.anandtech.com/bench/Product/305?vs=314 , two GTX 460s in SLI perform about as well as a GTX 580 and are substantially cheaper than a 580. What do you think about this?
Yup, I would tend to lean toward P67. SSD caching is software based and the software disables it for other chipsets. It seems that there is nothing for it in the hardware, much like SLI on "non-SLI mainboards".
Source here:
> "Make no mistake, this isn't a hardware feature but it's something that Intel is only enabling on Z68" - Anandtech
And also, Virtu is a feature on some Z68 mainboards but not a Z68 feature.... :-)
This is over a year old, so take that under consideration, but here is evidence that the C300 can have MAJOR problems recovering from a heavy workload.
edit: fixed link
At 2560x1600, I would recommend the HP ZR30w. It's supposed to have good pixel response time. Here is a good review for it. I don't know anything about 2560x1440 screens.
Most of the improvement you will see is in load times. It won't overcome a lack of RAM or an underpowered video card, though. It also depends on the game you are playing, given the texture resolutions and polygon counts on the models.
You will never want to go back to an HDD, for sure. Tom's Hardware has a decent write-up on the intricacies:
http://www.tomshardware.com/reviews/battlefield-rift-ssd,3062.html
Great article. I would suggest a table of contents at the top and/or bottom with the headings of each section/page so it's easier to find and jump to things. That way, if anyone wants to look at "RAM explained" and doesn't remember it's on the 3rd page of the PC architecture article, they can jump straight to it with the table of contents, e.g. Best Gaming CPUs For The Money: January 2012.
The other articles also do a great job of explaining things in layman's terms to semi-computer geeks who are just developing an interest in PCs and aren't yet ready to dive straight into the more intricate inner workings.
The 560 Ti is a good-value card. That should move your bottleneck to your CPU, but I don't think it would be so bad that you're wasting money on the card. What is your RAM clock speed, though? You might want to upgrade that.
Tom also recommends checking out the 6950 at that price point too. Personally though I would go Nvidia.
Absolutely, but not until after I got it all set up.
This is what made me realize my mistake: http://www.tomshardware.com/forum/300804-28-raid-drives-what-pros-cons
There wouldn't really be any performance changes if I took it out of raid, so I just left it. I don't really deal with huge files, and aside from gaming, I don't access my disks much. I likely can do some tweaking here, but I'm generally happy with the current setup.
I did mean it as a replacement for the Asus. To back up what Dascandy said Tom's hardware ran an article on 16x/4x crossfire here. I'm not sure where you are seeing it'd be 4x/4x though.
Most ASRock mobos have a "Load Optimized CPU" setting in the BIOS that will do some auto-overclocking for you. It actually works nearly as well as doing all the tinkering yourself. I have mine set to 4.4GHz and it's completely stable after several hours of stress testing. I know they have this on all the Z68 boards; it's worth checking whether they have it on the P67 boards too. Clicky here for more info.
I would recommend trying some of the modest auto-OC settings and making sure you get reasonable temps and stability. That way you can be sure your CPU HSF is seated well and your CPU isn't finicky. Then, once everything is running well, start learning more and try to tinker with the settings yourself with the help of some guides online. I'm also assuming you have an aftermarket HSF, because the stock Intel HSF is trash for anything other than stock CPU speeds.
Edit: I should mention that the 4.6ghz and 4.8ghz presets were completely stable as well, I just couldn't get a low enough idle temperature for a 24/7 OC without having my rig sound like a harrier jet taking off.
Yeah, Tom's Hardware did some CPU benchmarks, and a 1GHz overclock from 3GHz to 4GHz made over a 10fps difference for an i5-2500K.
I would drop some RAM, go with 8 GB for now, more can be added easily.
Instead of 2x580 I'd choose 1x590, it's cheaper that way, and you can add another 590 in a few months/years which is probably going to be able to handle anything you can throw at it. I've heard SLI isn't exactly hassle-free in regards to compatibility with games etc., and having only 1 card reduces said hassle of SLIing.
With the money saved by that, get a better SSD. I would pick a 240 GB one for now, and get a HDD when prices are coming down again. (except if you desperately need 2 TB of space NOW, then go right ahead)
From what reviews I've read, the Corsair Force GT 240 GB is one of the best there is and you'll be happy with it. (just in case you wonder, I wouldn't personally spend ~400 dollars on an SSD, but if you do have the money, go for it.)
Your budget and whether or not the gain is worth the cost is your call of course.
https://www.newegg.com/Product/Product.aspx?Item=N82E16883102371
I wouldn't go less than that, but I understand if you have a budget you need to keep. I'd suggest signing up for the Newegg mailing list and checking SlickDeals daily (search for 1050 or 1060 in the search box and look at the current deals.)
As an example, this was posted about 2 weeks ago and is a great deal well within the budget you mentioned (sadly I think it is out of stock now, but it's a good example!) https://slickdeals.net/f/11518003-dell-inspiron-5680-desktop-intel-core-i5-8400-8gb-ddr4-1tb-hdd-gtx-1060-3gb-win-10-679-99-ac-free-shipping-rakuten
Thermal pads? Ebay has lots. But what requires them? You might be able to substitute ICD7 instead. I used it for the northbridge on my laptop, and it worked a lot better than the thermal pad that it originally had. It's a very thick and viscous TIM, so it can actually be used to fill gaps, unlike AS5 or a lot of the ceramic based TIMs.
Anandtech Benchmarks on SLI 1gb 460s vs 2gb CF 6950s.
Obviously, the 6950s destroy the 460s, but if you look at all the tests you'll notice none are higher res than 1080p. On a 2560x1440 or higher res setup, the differences will be even more dramatic.
Yeah, Tom's did an article on it. Tri-Fire does seem to smooth it out nicely. But like somebody else said, one 6950 runs most things on Ultra at 1080p; I have CrossFire at 1080p, which is probably overkill. Tri-Fire I don't think I could justify.
http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-6.html
Best Graphics Cards For The Money: November 2011 is one guide among many.
It seems that 6850/460 to 6870/560 is within your range - go bargain hunting.
Personally, I think that if you can swing it, the 560 Ti or a 6950 1GB are right now absolutely the best deals on the market, and will last you a fairly long time.
The newer chip will be GF110-based as opposed to GF114: technically a different GPU altogether. It is closer to a GTX 570 with disabled cores than a GTX 560 Ti, so it will not work in SLI with the older cards.
Aye aye, but that comparison is a bit rough. Here are some other Tom's charts:
Just saying that it isn't that simple. Sorry if I sounded rough, it was late and I wanted to hit the sack :)
The 960T CPUs are based on the Thuban processor core (the AMD six-core die), while the older ones were quad-core dies restricted to triple or dual cores. Since it's a newer die, results will differ from what was written in 2010. No word yet on the success rate of unlocking the six-core dies. Personally, I don't have the need for unlocking cores; I just bought the X6 1090T for gaming and it hums along without a problem at 3.7GHz. If I were on a restricted budget, though, I would certainly try.
The standard memory config of a 480 is 1.5GB, meaning it doesn't have to stream lots of data every frame, just like the 580 mentioned in the original buildapc video.
If you take a look at this article here you can see clearly that when memory is tight the bandwidth starts to matter significantly more.
The same situation is going to arise with the newer games.
I guess you have a point, but remember both Nvidia and ATI have screwed up with drivers at some point.
I had a GTX 285, and the drivers that came out around the release of Bad Company 2 had a "bug" that made the fans get stuck at 40% regardless of GPU load. I stopped and rolled back when I started seeing artifacts in BC2, but other people were not so lucky: http://www.engadget.com/2010/03/05/nvidia-pulls-196-75-driver-amid-reports-its-frying-graphics-car/
Note I put "bug" in quotes because, surprisingly, this happened only to the GTX 200 cards, right after Nvidia launched the GTX 400 series.
The worst thing that has happened with ATI drivers in my case was a game just not working.
pull a stunt, pull a prank, pull off a feat, etc.. would be in the same vein as the article's intention. See http://www.thefreedictionary.com/pull for some of the many uses of pull. Just because a word can mean canceled, doesn't mean it can't be used for its other valid meanings
Could this be it? -- http://www.mobygames.com/game/nfl-pro-league-football_
x's and o's screenshot: http://www.mobygames.com/game/dos/nfl-pro-league-football_/screenshots/gameShotId,305825/
Edit: You can find it as abandonware. Just Google 'abandonware NFL Pro League Football.'
The 680 is in stock here - Best Buy Galaxy GTX 680
I have the G500, and it's the shit. My buddy has the G400 [Best Buy, $40], which is basically an MX518 with a better sensor and G500 skates.
Download the ISO, then either burn it to a DVD or use the Windows 7 USB/DVD Download Tool to create a bootable Win7 USB key, and do a clean install using whichever method you chose.
Sounds about right. The difference gets more pronounced the further the total size of the textures exceeds the video memory. So with 580s (if I remember correctly what you have) you won't notice much difference; with other cards you'll notice more. Here's another test they did a while ago. Since BF3 maps, for example, exceed 1GB of textures, dual 1GB cards at x16 will probably see a bigger impact: when the map doesn't fit in the 1GB, data has to stream from main memory to video memory during the rendering of a frame, and that's obviously faster over an x16 link. Source for the BF3 design here. As open-world games get larger, the difference will grow. This also holds for cards with more than 1GB; you're likely to benefit more. How much more remains to be seen, of course, for either improvement.
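To put rough numbers on why the link width matters when textures overflow VRAM, here's a quick sketch. The bandwidth figures are assumptions on my part (theoretical PCIe 2.0 rates of roughly 8 GB/s at x16 and 4 GB/s at x8; real-world throughput is lower):

```python
# Rough estimate: time to stream texture overflow from system RAM to VRAM.
# Assumes theoretical PCIe 2.0 bandwidth (~8 GB/s at x16, ~4 GB/s at x8);
# real throughput is lower, so these are illustrative, not measured.
overflow_mb = 512  # e.g. textures that don't fit in a 1GB card
for lanes, gb_per_s in (("x16", 8.0), ("x8", 4.0)):
    ms = overflow_mb / 1024 / gb_per_s * 1000
    print(f"PCIe 2.0 {lanes}: ~{ms:.1f} ms to move {overflow_mb} MB")
```

Since a 60fps game has ~16.7 ms per frame, streaming even a fraction of that overflow mid-frame is what causes the stutter, and halving the link bandwidth makes it twice as bad.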
Edited for clarity
Here's an album of a few shots. I'm not sure if they provide the detail you need, but the lower ones have much more light.
Nothing is done on the back side. There are 4 pegs that you align the bracket on, push it in, then screw it down with the 4 removable caps that you might be able to see in those photos. The brackets do not fit very well. I'm honestly not sure if I ended up "bending" it a little when I put it on, and if so, I'm concerned it may not have been able to go all the way down.
well hell! Guess that would help! Here it is: http://www.bestbuy.com/site/ASUS+-+Laptop+/+Intel%26%23174%3B+Core%26%23153%3B+i7+Processor+/+17.3%22+Display+/+8GB+Memory+/+1TB+Hard+Drive+-+Black/2712579.p?skuId=2712579&id=1218346639131 Thanks and sorry!
RAMDisk would be the closest thing you can get to creating a virtual drive from your RAM. But it can't be set up until you get into Windows, because the software has to manage it, so you would not be able to make bootable partitions. So sorry, no, it's not possible at this time.
>2560 resolution where the extra memory would likely shine.
Dude, don't you get it? BF3 already uses 2GB of memory at 1920x1080, and at 2560x1440/1600 it will use much more. There's no difference to be seen there.
>I'm assuming VRAM is like RAM in that when you hit the limit, you'll suddenly notice a massive performance drop, is that correct? In that case, considering BF3 already uses 2GB of VRAM, wouldn't it be terribly short-sighted to get a new card (top of the line, no less) with just barely enough VRAM to play the latest games at the moment at full quality?
Yes, that's why 680s throttle and look choppy when playing bf3 at 5760x1080 for me.
>Seeing no difference in performance doesn't mean the extra VRAM is useless or can't be used, just that it's unused by current games.
Again, at 1920x1200 BF3 already uses up 2GB of VRAM, and I assure you that at 5760x1200 it uses much more than 2GB.
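Even the raw render-target math alone hints at the scaling. This is just a sketch (32-bit surfaces at 4 bytes/pixel; it deliberately ignores textures, AA buffers, and multiple render targets, which dominate real VRAM use):

```python
# Sketch: raw size of a single 32-bit (4 bytes/pixel) render target.
# Real VRAM use is dominated by textures and AA buffers; this only
# shows how the per-frame surfaces scale with resolution.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for w, h in ((1920, 1200), (5760, 1200)):
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB per render target")
```

Triple that per-surface cost across a deferred renderer's G-buffers and the 3x jump in pixels eats VRAM fast.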
These sorts of programs are usually created and maintained by the motherboard manufacturer. I guess this is because there is no universal standard for communicating and manipulating that type of info.
EVGA has software called E-LEET that appears to work with your X58 board. But use at your own risk, etc.
It looks pretty good actually, despite its stupid name. You can set up different profiles for speeds and voltages and use hotkeys to switch between them.
The FTW variants are stated by an EVGA product manager to come out next week. Look for Jacob's posts. I saw rumors that GK110 would be announced soon, but I really doubt it will be released before August.
An outstanding build! I always underestimate the price of watercooling. Any reason you didn't opt for the high-end GTX 580s with 3GB of RAM here? I just figured the extra video RAM would benefit triple 30" monitor gameplay.
Yeah the Sandy Bridge Extreme processors are priced to the point of being ridiculous. If you want that much power you would probably be better off with a dual Xeon X58 board like the EVGA Classified...
I could have understood a price point of $500-$700, and as we all know the price/performance ratio is low at the high end, but this is to the point of being ridiculous.
That said, I'd love to see one in action and we've also updated the banner with an X79 board :-)
Edit: link to EVGA Classified SR-2
No. After some research, KVM is not what you're looking for, at least according to my understanding of how it works.
You may want to look at this: http://www.makeuseof.com/tag/teamplayer-use-multiple-keyboard-and-mice-on-one-system/
Perhaps you could also use a gamepad plus keyboard/mouse, or different keyboard layouts, for the game?