I'm not an electrician, but to roughly estimate your setup's power draw, go to pcpartpicker.com. Enter your build and it will estimate it for you.
Another way is to use a wattage meter like this one.
https://www.amazon.com/dp/B00E945SJG/ref=cm_sw_r_cp_apa_fab_03VBFbSVFPFCG
Kilowatt-hour measurement plugs are cheap: plug the light into one and plug that into the wall. Most have other settings too, like how much it is costing per hour, etc.
It could help reduce electric costs if you have peak/off-peak pricing.
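To put rough numbers on the peak/off-peak point, here is a small sketch; the wattage and both rates are made-up placeholders, not real tariffs:

```python
# Rough per-hour cost of a load under peak vs. off-peak pricing.
# The wattage and both rates are assumed example values; use your own.
load_watts = 300        # e.g. a PC or light rig under load (assumed)
peak_rate = 0.25        # $/kWh during peak hours (assumed)
off_peak_rate = 0.10    # $/kWh during off-peak hours (assumed)

kw = load_watts / 1000
print(f"Peak:     ${kw * peak_rate:.3f} per hour")
print(f"Off-peak: ${kw * off_peak_rate:.3f} per hour")
```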
https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
Some battery backups provide this built in; otherwise you can get a plug that sits behind the battery backup in the wall socket and shows how much you are using, like the one linked above.
When you say "safety fuse", is this something in your PC or something in your home?
If it's something in your home, you should get one of these to test what your computer is drawing from the wall: https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
It's possible that you're drawing too many amps from the wall.
Simple things first. Change the bulb. If it doesn't solve the problem, the next step is to verify fluctuating voltage. Get something like this:
https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
and plug it into a socket in the room. Can I assume that the light and the outlets are on the same circuit?
This device should answer the question about your room's voltage.
Is a $16 Kill A Watt clone too expensive, or is the issue something else? If your laptop comes with battery software, you may be able to check for free there.
Everyone else commented on the obvious toxic situation you are in. I am just gonna try to help you with a way to quantify how much electricity you are using.
https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
Plug this in, then plug in whatever you want to measure. With some basic math you can figure out how much electricity anything uses, and calculate how much that costs if you have an electric bill lying around.
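As an example of that basic math, you just scale the meter's accumulated kWh to a month and price it with the rate from your bill; every number here is a placeholder assumption:

```python
# Scale the meter's accumulated kWh to a month, then price it using the
# per-kWh rate from your electric bill. All numbers are example assumptions.
kwh_measured = 2.4      # kWh shown on the meter after the measurement window
hours_measured = 24     # how long the device sat behind the meter
rate_per_kwh = 0.13     # $/kWh, read off the bill

kwh_per_month = kwh_measured / hours_measured * 24 * 30
print(f"~{kwh_per_month:.0f} kWh/month, ~${kwh_per_month * rate_per_kwh:.2f}/month")
```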
This is the best solution; you just need to adjust for your PSU's efficiency to get your total power draw (rough sketch of that adjustment below).
HWMonitor has a field for CPU package power draw, and GPU-Z will show GPU power draw. I'm not totally certain how accurate those measurements are, though.
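For reference, a rough sketch of the PSU-efficiency adjustment mentioned above; the 90% figure is an assumed efficiency, not a measured one, so check your PSU's 80 Plus rating and its efficiency curve at your load level:

```python
# A wall meter sees AC input power; software tools (HWMonitor, GPU-Z) report
# DC power on the component side. The 0.90 efficiency is an assumption.
psu_efficiency = 0.90

def dc_from_wall(wall_watts):
    """Approximate component-side (DC) draw from a wall-meter reading."""
    return wall_watts * psu_efficiency

def wall_from_dc(dc_watts):
    """Approximate wall draw from summed software (DC) readings."""
    return dc_watts / psu_efficiency

print(dc_from_wall(300))   # ~270 W reaching the components
print(wall_from_dc(270))   # ~300 W pulled from the wall
```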
Yeah, I really want to use my Kill A Watt on a window unit in my house for a 48-hour span, just to see what the startup surge would be.
These little babies are really useful for measuring power draw: http://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG/ref=sr_1_3?s=industrial&ie=UTF8&qid=1461221932&sr=1-3&keywords=kill+a+watt+electricity+usage
OP, don't just blindly buy these things to save electricity. Most modern devices draw very little (< 1W) power when turned off.
Do yourself a favor and buy a simple power meter; this one is $15 and is the one I own. It's super simple to use: just plug it into an outlet and plug something into it. It'll tell you how many watts the device draws, and it will even keep track of how much energy (kWh) it's used, which is super useful for things like computers or refrigerators that have non-constant power draw.
Then, after you've done this, worry about dealing with things that are drawing power. Some things might be, but I know when I went around my apartment, almost nothing was drawing power that surprised me. My TV, computer, monitors, speakers, lights, all at 0W when off. Even my printer, which I had previously tried to keep "off" when I wasn't using it, was drawing a negligible amount of power while idle (< 1W).
If you do have something that draws a lot of "ghost power", it's probably only one or two things. Then maybe get those wireless outlet switches for those devices.
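For a sense of scale, here is a sketch of what even a full watt of standby draw costs over a year; the rate is an assumed example:

```python
# What ~1 W of standby draw actually costs over a year.
# The $0.13/kWh rate is an assumed example value.
standby_watts = 1.0
rate_per_kwh = 0.13

kwh_per_year = standby_watts / 1000 * 24 * 365
print(f"{kwh_per_year:.1f} kWh/year, ~${kwh_per_year * rate_per_kwh:.2f}/year")
# ~8.8 kWh and roughly a dollar a year; not worth chasing.
```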
In the sub-100W total usage range, I'm comfortable with just the exhaust fan blowing air at/over it in its current configuration. But I'll monitor it closely. I had the temp up to 149°F open-air in passive mode while doing my burn-in / stability testing. Currently, in its bucket design, the heatsink temp is in the low-to-mid-80s range, which is pretty cool.
Power Meter $14.94 shipped.
Dell R710 with Xeon 5570s, 36GB, 3 drives
Dell OptiPlex 390 with an i5-2400, 8GB, 2 drives
Lenovo ix4-300d with 4 drives
Dell PowerConnect 2724
Total draw is between 260W and 290W. Runs about $30/mo. I recommend getting something like this; it tells you a lot about what is cost efficient: http://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG/ref=sr_1_2?ie=UTF8&qid=1436157985&sr=8-2&keywords=power+meter&pebp=1436157993528&perid=1QHRGGV60XARJBBCE1PH
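For what it's worth, the math behind the ~$30/mo works out like this; the $0.15/kWh rate is an assumption that happens to line up with the quoted bill, so substitute your own:

```python
# How a 260-290 W 24/7 draw turns into roughly $30/month.
# The $0.15/kWh rate is an assumed example value.
rate_per_kwh = 0.15
for watts in (260, 290):
    kwh_month = watts / 1000 * 24 * 30
    print(f"{watts} W -> {kwh_month:.0f} kWh/month -> ${kwh_month * rate_per_kwh:.2f}/month")
```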
My brain skipped a beat. :) Amp clamps are good for inside the fusebox.
Something like this might be better. https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
I guess you could attach a current meter at the power cable and measure the results with both firmware, but that might be too much work.
Something like this: https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
The voltage in Norway is 240V. I used a plug-in electricity meter (or plug load meter) to read the wattage and amperage.
I'll update when the server is up and running.
Most hardware stores should have them, or even Amazon. The Kill A Watt is a popular brand.
You could get something like this to measure it: TS-836A Plug Power Meter Energy... https://www.amazon.com/dp/B00E945SJG?ref=ppx_pop_mob_ap_share
You can use a device like this:
https://www.amazon.ca/Display-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
Test each device plugged in and see if anything is using a high amount of electricity.
I use an outlet meter like this: https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
That is the most precise way to measure power; software solutions are not reliable.
An outlet measuring tool like this: TS-836A Plug Power Meter Energy Voltage Amps Electricity Usage Monitor, Reduce Your Energy Costs https://www.amazon.com/dp/B00E945SJG/ref=cm_sw_r_cp_apa_LNZiAb3N0ZPDM
> It's 40 more watts on an overclocked R9 390. Most overclocked GTX 970s will exceed 210 watts when overclocked. Your "30-40%" number is a gross hyperbole, as 40w is actually 19% out of 210w.
OK, so now you are using overclocked cards; efficiency goes down the drain if you overclock past a certain point.
Let's compare the numbers for factory overclocks on average gaming power draw:
https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/25.html
MSI GTX 970: 168W vs R9 290 in the same test: 239W -> 42.3%
https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/21.html Here is a Sapphire R9 390 Nitro: 261W vs 168W for the MSI 970 -> 55.4%
We could compare maximum power draw instead of average but that doesn't really help your case: MSI 970 213W vs Sapphire 390 365W -> 71.4%
Let's look at a slower clocked R9 390: https://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/28.html Average gaming power draw 231W vs 168W for the MSI 970 -> 37.5%
So I would not call that hyperbole, I would call that me being generous with the numbers ;).
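For transparency, those percentages are just the extra draw of each AMD card relative to the MSI 970 figure from the same TechPowerUp test; a quick sketch of the arithmetic using the numbers cited above:

```python
# Extra draw relative to the MSI GTX 970 in the same test (numbers as cited above).
comparisons = [
    ("R9 290, average gaming", 239, 168),
    ("Sapphire R9 390 Nitro, average gaming", 261, 168),
    ("Sapphire R9 390 Nitro, maximum", 365, 213),
    ("R9 390 PCS+, average gaming", 231, 168),
]
for label, amd, msi_970 in comparisons:
    print(f"{label}: +{(amd - msi_970) / msi_970 * 100:.1f}%")
```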
> One, it can vary based on the displays used (of which they do not disclose), resolution, refresh rate, ect. There is not nearly enough data to call this an issue and not even the review you linked itself goes as far as to call it that because there simply isn't enough data.
How about this data: I personally have 3 different monitors with 3 different resolutions.
I usually don't throw away all my monitors and replace all of them; I usually upgrade one at a time.
I bet a lot of people use their old monitor as a secondary when upgrading, so reviewers measure this power draw scenario for a reason.
Anyways, in my case the Sapphire R9 390 drew almost 90W when "idle"; my MSI 970 draws about 16W.
It's a night-and-day difference in noise and heat output. That R9 pulls more at idle than my i7 at max torture load.
> This doesn't really prove anything. I've had loud GTX 970s and loud R9 390s. The only thing it proves is that some designs are better than others.
While I generally agree that this does not universally prove anything, keep in mind that all three cards I tested have exceptional cooling solutions.
I run a silent PC that is almost inaudible when idle, and because I work 8 hours a day on this PC, that's important to me personally. I chose those cards for a reason.
My case is not an ideal scenario for a high-power card because the ventilation is limited (only one intake fan at low RPM at idle, plus the case is noise-isolated and does not have a lot of open air vents).
Still, my scenario especially shows how much heat a card dumps into a case, because I like to keep the ventilation (and noise) to a minimum.
https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/22.html
The Sapphire Nitro is a fantastic R9 variant.
0 dB when "idle", like the MSI 970 (which is why the Sapphire 390 was my first choice and the first card I tested as an upgrade in my current system).
Unfortunately in my case the card never really is "idle" because of the 3 monitors.
Like I said before, the gaming noise wasn't bad at all. The extra noise this card produces in my scenario is mostly due to the excess heat my case fans have to get out of my case.
Again I wouldn't really mind the noise level during gaming, or the extra power draw in that situation.
What killed this particular card for me was the idle power draw with my 3 monitors.
> Where are you even getting these numbers from? Every review I have seen has R9 390s at under 10w during idle.
I have something like this:
https://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG
I measured my system's power draw with my iGPU vs the power draw with the R9 390. The difference was almost 90W with all monitors attached.
This test has the R9 390 at 71 W idle with multiple monitors attached:
https://www.techpowerup.com/reviews/Sapphire/R9_390_Nitro/21.html
I gladly admit that my measurement might not be super accurate, and something like ±5W of measurement error is possible.
I only had two monitors attached to the iGPU because it only has two connectors ;).
If you want we can call it 70W at idle with multiple monitors.
> Just going to point out that there are R9 390s with fans that don't turn on until 76c as well....
Yes, and the Sapphire I tested is one of them. Sadly the card does not stay under 76°C due to the power draw ;).
If I only had one monitor, or if they fixed the multi-monitor issue,
the Sapphire 390 would be the better card compared to, e.g., my 970.
That is, if you don't mind the extra heat and electricity bill (which is reasonable if you don't game that much or live somewhere electricity is cheap).
> No one here was arguing otherwise but there is only so many shits that can be given about power consumption. If I wanted to worry about power usage I would buy a Nintendo switch.
Well this whole thread started when I addressed the claim of "AMD sucks because they run hot!!!!!!!1!".
Whether you care about power draw depends on where you live. Where I live, with my usage scenario (at least 8 hours a day drawing ~70 extra watts),
the extra consumption of running the Sapphire 390 is 146 kWh per year, which is roughly 40€ a year.
Let's assume you use your card for at least 2-3 years, so that's 80-120€ in extra cost.
That was the extra cost of a 980 back when I was looking for a new card. The 390 would lose that duel.
I realize that if you live in the US or somewhere with low energy costs, the difference is much lower (the national average was 12.99 cents per kWh).
That's about $20 per year, or $40-80 over a 3-year period. Now, if you remove the extra monitors, or if you only use the PC for gaming, this will be even lower.
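Here is the arithmetic behind those yearly figures; the 260 working days per year and both rates are assumptions chosen to roughly reproduce the 146 kWh, ~40€, and ~$20 numbers above:

```python
# Yearly cost of ~70 W extra draw for 8 h/day.
# 260 working days/year and both rates are assumptions.
extra_watts = 70
hours_per_day = 8
days_per_year = 260

kwh_year = extra_watts / 1000 * hours_per_day * days_per_year
print(f"{kwh_year:.0f} kWh/year")
print(f"EU-ish rate (0.27 EUR/kWh):  ~{kwh_year * 0.27:.0f} EUR/year")
print(f"US average (0.1299 USD/kWh): ~{kwh_year * 0.1299:.0f} USD/year")
```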
I'm fine with you not caring about it. I do.
> Low power consumption is nice, to a point. It stops becoming important when companies like Nvidia simply continue to cut down their die sizes to stave off performance improvements.
Whut? You cut down die size because you can, thanks to the 16nm process, and to increase yield, because that's how you make money. It's also more efficient :D.
Which is why Nvidia could improve perf/Watt immensely.
50% more efficient going from a 980 Ti to a 1080 Ti,
something like 70% more efficient going from a 970 to a 1070,
and up to 100% if you look at the most favorable gains on the lower end.
On the other hand, Vega 64 gives you barely more perf/Watt than a Fury X...
I don't agree that they stave off performance improvements. At least not unreasonably so.
Again, die size means profit. The smaller you can go while still improving performance the better.
Now if only AMD could bring a competitive card to the high end... that would force Nvidia to give up more of their profits by cutting prices or increasing die sizes again to stay competitive.
Again I came from an AMD card and I wanted to buy one again but I had to go with the 970.
Looking at Vega right now I have to say I'm not impressed yet.
Ryzen is a different story though; let's hope they can counter Intel's next hexacore line. Maybe I can go full AMD again on my next machine.
My i7 3770 is starting to show its age, but only in some of my work tasks and the 970 is holding up surprisingly well at [email protected] during gaming.
I can probably wait for Zen 2, which should roughly come out around the same time as Volta and hopefully Navi is not too far away either.
Who knows, maybe by then AMD gets some support from the major deep learning frameworks, it looks like Vega would be a good card for that.
You can't. You can rig up something and use an amp probe or buy a plug-in power meter but what you're asking for doesn't exist, at least not cheaply.
Honestly, I just use a Kill A Watt. Very similar to the one in the link.
At $0.15/kWh I'm looking at about $29.00/month.
I use this to watch my wattage and how much I'm being charged.
Edit: added link
60-watt bulbs hardly touch the electric bill. Anything with motors is what kills your power bill.
https://www.amazon.com/dp/B00E945SJG/ref=cm_sw_r_cp_apa_1WTKxbEMKP24B
You can use something like this.
If you have budget left: I know I can find benchmarks of this online, but please buy a Kill A Watt to check the wattage your PC pulls from the wall, preferably while stress testing both the CPU and GPU with FurMark, Intel Burn Test, or OCCT. You might be surprised at how little this system pulls. I estimate between 450-500W depending on how high your overclocks are.
I suggest Windforce cards a lot, and I am slightly afraid to suggest a 650W PSU despite the fact that the benchmarks show two GTX 970s pull 439 watts under full stress. Add 111-140W for the CPU and around 5W for the HDD/SSD and you get to roughly 550W.
The Kill A Watt can also be used for other stuff, so it's not a terrible buy; it's a handy tool to have.
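For completeness, a quick sketch of the sizing math above; the component figures are the rough estimates from the comment, not measured values:

```python
# Summing the rough component estimates quoted above.
gpus = 439                 # 2x GTX 970, full stress (benchmark figure cited above)
cpu_range = (111, 140)     # depends on how high the overclock goes
drives = 5                 # HDD/SSD

low = gpus + cpu_range[0] + drives
high = gpus + cpu_range[1] + drives
print(f"Estimated total draw: {low}-{high} W")   # ~555-584 W
```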
Yeah, you'd also need the current for any given machine to get a true power value. If they have some sort of value for maximum current draw, you could get a max value for the machine, and then with the time the machine is on you could calculate the energy drawn for a given time period. You could also use a plug-in energy meter like this one. Just note that they most likely have a max power they can safely measure, so consider that when choosing one (since I assume the machines draw a fair amount of power).
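A rough sketch of that calculation; the 1800W limit is a typical rating for a 120V/15A plug-in meter, so check your specific model, and the draw and hours are example assumptions:

```python
# Energy over a period from a measured draw, plus a sanity check against
# the meter's rated limit (1800 W is typical for a 120 V / 15 A plug meter).
meter_limit_watts = 1800
machine_watts = 1200        # example measured draw (assumed)
hours_on = 10 * 30          # e.g. 10 h/day for a month (assumed)

assert machine_watts <= meter_limit_watts, "load exceeds the meter's rating"
kwh = machine_watts / 1000 * hours_on
print(f"{kwh:.0f} kWh over the period")
```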
You can use a device such as this to monitor the power your computer is drawing from the wall: http://www.amazon.com/TS-836A-Energy-Voltage-Electricity-Monitor/dp/B00E945SJG/ref=sr_1_2?ie=UTF8&qid=1455823928&sr=8-2&keywords=watt+meter. It's an easy way to determine if your power supply is large/efficient enough.
Can you post more detailed specs of your rig? You can use this program if you are not sure how to do that: https://www.piriform.com/speccy