Yes. That's how free software works.
A good example from my own home computer is the Amarok music player. I used to like it, but they stuffed so many features it became unusable for me. Then someone else took an older version of Amarok, from the time it worked well without so much clutter, modernized it and released it under the name Clementine.
Today I use Clementine alone, and love it. Sorry, Amarok guys, you fucked up.
This is basically what those little Square readers do in cafes:
https://i.imgur.com/DSvJ79S.jpg
Here's a SO thread where someone runs the numbers and gets to a data throughput rate of ~8kbps:
https://stackoverflow.com/questions/2181476/bandwidth-from-headphone-microphone-jack
Having interviewed at Cypress, Facebook, and Microsoft:
Interviewers are trying to gauge whether you have a grasp on fundamentals and whether you can see the big picture, while gauging your thought process.
Highly recommend this book for review: Practical Electronics for Inventors 2/E https://www.amazon.com/dp/0071452818/ref=cm_sw_r_cp_api_glt_fabc_X2XYNK63P1E3BV7YG9W9?_encoding=UTF8&psc=1
I believe LaTeX is the standard for publishing research in many fields – many journals provide their own templates (like IEEE) which you must use if you want them to publish your work.
Personally, I used it for my Master's dissertation and my CV / resume. I've used it to write up a few longer bits of research at work too; however, it makes it a bit of a pain for other people to edit, so I've stopped doing this and just use Word. The guys over at ShareLaTeX (which is a great product btw) have some great docs you can follow to get started: https://www.sharelatex.com/learn.
LaTeX can produce some really nice looking stuff, but it takes a while to get used to.
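If it helps to see what "getting used to it" looks like, here's a rough sketch of a minimal LaTeX document (a generic example, not any particular journal's template):

    \documentclass[11pt]{article}
    \usepackage{amsmath}   % nicer equation support
    \usepackage{graphicx}  % \includegraphics for figures

    \title{A Minimal Example}
    \author{Your Name}

    \begin{document}
    \maketitle

    \section{Introduction}
    Body text is just typed in. Inline math like $V = IR$ works anywhere,
    and displayed equations are numbered automatically:
    \begin{equation}
      H(s) = \frac{1}{1 + sRC}
    \end{equation}

    \end{document}

Run it through pdflatex (or upload it to ShareLaTeX) and you get a PDF; most of the formatting decisions are made for you.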
The Scientist & Engineer's Guide to Digital Signal Processing is a pretty decent book as a crash course. It covers the high level concepts in the first half and the hard core math at the very end.
In the middle there’s a chunk of stuff that’s very practical if you don’t have the time to learn all the math behind it. This is the stuff that I found most useful. It covered the various filters, why you would use one over the other, and basic implementations.
If you really want to learn DSP, a course might be useful but it all depends on what you want from it.
Buy an FE exam prep book (or find one online). This one has a diverse collection of questions from 1st/2nd year level university courses. Found it very helpful when I finally took my FE last year.
A Technique For Producing Ideas, by James Webb Young
The Soul Of A New Machine, by Tracy Kidder
The Mythical Man-Month, by Fred Brooks
The Existential Pleasures Of Engineering, by Samuel Florman
Skunk Works: A Personal Memoir of My Years at Lockheed, by Ben Rich
Spinoff, by Charlie Sporck
Big Secrets, by William Poundstone
Audio Power Amplifier Design, by Bob Cordell
The Mathematics of Money Management, by Ralph Vince
The Art of Insight in Science and Engineering, by Sanjoy Mahajan
Meditations, by Marcus Aurelius
How To Get Ideas, by Jack Foster
Well it's a good effort. Why do you need the op-amps? You could just use a voltage divider down to the ADC input directly. Also your op-amps won't work without a negative power supply connection. Finally you need bypass caps on all your ICs (though I assume you left them out).
For circuit simulation, there's always LTSpice. It's "free". No really it's free.
There are also a few schematic capture programs available that are better than MSPaint.
Note that every engineering solution sucks in some way. You usually try to get the one that's the least suckiest. However, sometimes a solution is so slick it's like a work of art. That's nirvana.
The state of the art these days is to use SystemVerilog and perform what is known as "Constrained Random Verification". The SystemVerilog language has a constraint solver that lets you create random stimulus (as defined by constraints), as well as language features that help you determine which features have been tested. The basic idea is that you roll the dice and try random sequences on each test run until you get coverage you are happy with. This approach is often combined with directed testing to help close out coverage gaps.
If you are looking for a good reference, I'd recommend SystemVerilog for Verification
Because Intel.
Really, that's all. Because Intel settled on an eight bit architecture and proceeded to dominate the market.
It is by no means universal. Older systems did not necessarily have eight-bit buses. Different bus sizes were common enough that The Art of Computer Programming is written for a six-bit system. Even on modern systems, we have 80-bit floating point numbers.
The next obvious question is, "Why does Intel do this?" The answer is backwards compatibility. The 8080, ancestor of the 8086 and the modern x86-64 architecture, had eight-bit registers (arbitrarily eight, remember). When eight bits were no longer sufficient, Intel designed the 8086 so that its four main registers (A, B, C, and D) could be referred to as either eight-bit or sixteen-bit registers. So, instead of referring to just register A, you refer to AH or AL (A-high/A-low) for the eight-bit halves, or AX for the full sixteen bits. Thus, eight-bit code could still be carried forward with minimal changes.
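To make the aliasing concrete, here's a rough C sketch (my own illustration, not anything from Intel's documentation) of how the 16-bit AX value relates to its AH/AL halves:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t  al = 0x34;            /* low byte,  like the AL register */
        uint8_t  ah = 0x12;            /* high byte, like the AH register */
        uint16_t ax = (uint16_t)((ah << 8) | al);  /* the full 16-bit "AX" */

        printf("AX = 0x%04X\n", ax);   /* prints AX = 0x1234 */

        /* Old 8-bit code that only ever touches the low byte keeps working,
           because AL is simply the bottom half of AX. */
        uint8_t al_again = ax & 0xFF;  /* 0x34 */
        printf("AL = 0x%02X\n", al_again);
        return 0;
    }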
The problem with wireless is that the power of wireless signals falls off with the square of distance: in free space it's proportional to 1/r^2.
edit: Looks like a company is working on Cota, works up to 30ft.
http://techcrunch.com/2013/09/09/cota-by-ossia-wireless-power/
30 ft for 1 W, I wonder how much power it takes to run the transmitter though.
Biggest pitfall I see is not knowing how to write good, clean, maintainable code. Poorly written code is going to bite you in the ass at some point down the line. So more important than any embedded-systems-specific programming book is a book on good coding practices.
So I suggest Code Complete, first and foremost. Wish I had read it and taken it to heart way sooner. Especially considering you probably won't end up just doing embedded work, but will probably have to write other code to support it (eg GUIs). Even though it is not C specific, it meets your second two requirements (best practices and strategies) very strongly and I wouldn't overlook it.
Ok, so, if you don't understand a little bit of transmission line theory, and you're trying to do really fast serial communications, it's going to bite you in the ass.
What you need to do is use cabling and connectors that maintain a constant characteristic impedance. You need to terminate them properly; this is part of what impedance matching means. For slower stuff (<50 MHz digital), you may be able to get away with just making all of your wires really short (a small fraction of 1/4 of a wavelength), but watch out for skew.
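To put a rough number on "really short" (my own back-of-envelope figures, assuming a typical cable velocity factor around 0.66):

    #include <stdio.h>

    int main(void) {
        const double c  = 3.0e8;    /* speed of light, m/s */
        const double vf = 0.66;     /* assumed velocity factor of the cable */
        const double f  = 50.0e6;   /* signal frequency, Hz */

        double wavelength = vf * c / f;              /* ~4 m in this cable */
        printf("wavelength     = %.2f m\n", wavelength);
        printf("quarter wave   = %.2f m\n", wavelength / 4.0);   /* ~1 m */
        printf("'really short' = %.2f m\n", wavelength / 20.0);  /* a small fraction of that */
        return 0;
    }

So at 50 MHz, "a small fraction of a quarter wavelength" is on the order of tens of centimeters, and it shrinks linearly as the frequency goes up.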
I did notice that you've got 75 Ω resistors. Depending on what sort of cabling you're using, this may be fine, or completely wrong. Different cables have different characteristic impedances. For example, CAT5 is 100 Ω, test equipment coax is usually 50 Ω, and TV/pro-AV stuff is usually 75 Ω. As I mentioned above, your connectors need to maintain the characteristic impedance of the cable, or you'll get all sorts of nastiness.
If you are a visual person, this may help you understand what's going on.
Computer Architecture: A Quantitative Approach by Hennessy and Patterson is the definitive book used in universities all over the world. Hennessy led the MIPS project and Patterson led the Berkeley RISC project that evolved into SPARC.
https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055/
We used the 2nd edition in school way back in the 90's but it's up to the 6th edition now. We covered branch predictors and Tomasulo's algorithm for out of order execution back then. I'm sure the latest edition has significant updates.
We modeled everything in C but these days our chip architects model everything in even higher level languages like Python. It is easier to simulate the performance effects of different cache sizes, number of pipeline stages, etc. in a high level language and use traces of program execution.
After the architecture is decided they write specs and then the implementation team would write Verilog and go through the whole process of synthesis, place and route, timing analysis, LVS/DRC, etc
FTDI chip based solutions shouldn't be too hard to find. eBay seems to list a couple of different FT232-based adapters from several vendors, e.g. this one seems okay-ish.
You can use USBPcap to capture the raw data from the USB bus then view it with WireShark. The learning curve can be rough since there's a bunch of messaging that will have nothing to do with the actual data you're looking for. I would also recommend unplugging as many USB devices as possible so you have less junk to sort through.
Ok so the Arduino has already been said. If you have no experience with microcontrollers, I agree you should start there, but use it as a learning tool. The Arduino is a platform that allows you to quickly learn how to use microcontrollers, with tons of resources online. Now before it starts to sound like I'm bad mouthing them, I'm not (I almost always prototype with them), but the Arduino is basically microcontrollers with training wheels.
Yes, the underlying microcontroller (ATmega328) is plenty powerful, but using the Arduino IDE hides a lot of the details you should know how to use. This includes things like special register settings for oscillators, ADCs, and such. So after you have gotten an understanding of how things work with the Arduino, I would move on to something like a PIC, TI, or standalone AVR part.
Anytime I am learning a new ucon, there are a few things I try to learn first, which make up about 90% of what you'll end up doing with it anyway.
1) Digital I/O - Can I read a button press, and turn on an LED
2) Analog read - Can I read a pot?
3) PWM - Can I create a variable duty cycle square wave output?
4) Timers - Similar to 3, but can you properly use built in timers?
5) Peripheral communication - Either I2C, conventional serial, just some way to talk to other devices.
So those are my thoughts. Now I realize my whole comment was about ucons, but you may not end up using one. If you do, Hackaday has a breakdown of several popular development boards, and there's a register-level sketch of item 1 below. Once you become a master of these things, you may consider moving to something like an FPGA. Have fun, good luck, and here are two things that keep me moving through hard projects: 1 - No matter what the problem may be, 50 people on the internet have had the same problem, found a solution, and created a forum thread. 2 - If you can control an LED, you can control the world.
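To give a taste of item 1 without the Arduino training wheels, here's a bare-metal AVR sketch (assuming an ATmega328 with a button on PD2 and an LED on PB5; adjust the pins to your own wiring):

    #include <avr/io.h>

    int main(void) {
        DDRB  |=  (1 << DDB5);     /* PB5 as output (LED) */
        DDRD  &= ~(1 << DDD2);     /* PD2 as input (button) */
        PORTD |=  (1 << PORTD2);   /* enable the internal pull-up on PD2 */

        for (;;) {
            if (PIND & (1 << PIND2))
                PORTB &= ~(1 << PORTB5);   /* pin high = button released: LED off */
            else
                PORTB |=  (1 << PORTB5);   /* pin low  = button pressed:  LED on  */
        }
    }

Compare that to digitalRead()/digitalWrite() and you can see exactly what the Arduino libraries are hiding from you.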
Actually, there is at least Rust, which can replace C. It has all the advantages of C with tons of modern language features, and it is a language designed with safety as a requirement from day 0. https://www.rust-lang.org/en-US/ https://medium.com/mozilla-tech/rust-and-the-future-of-systems-programming-b75fba746910
Eagle is rubbish (and I've used it extensively). Its popularity comes more from Linux zealotry than anything else.
If you're using Windows, try DipTrace. It actually works like a modern Windows application, not a 30 year old Motif GUI. ;)
My professor wrote this book I think you would find helpful. 10 Essential Skills for Electrical Engineers
Since I was an engineering student: here is the pdf
Honestly my go-to is probably odd, but it's very complete (it doesn't cover layout, but that's very tool-specific and changes over time): Printed Circuits Handbook. Much of what you need to know isn't directly tied to your particular project.
Well, sometimes technicians will cut a power cord if the device is damaged, I have done this many times. But that by no means is a universal standard.
Fix the cable and see what happens, just be VERY careful. Remember that even less than 0.5 A can be fatal, and these old scopes most likely pull more current than that.
Though before you do anything check the continuity on the existing severed power cord. It's done like in the link except that you use the severed leads.
Also, a few tips to avoid electrocution:
1) When doing repairs make absolutely sure that the thing is not energized. Be careful here, if it has any capacitors they might still hold a charge after you unplug the device.
2) Make sure you plug it into an outlet with a GFCI. This way, if you touch the chassis and there is a fault to ground, you won't be electrocuted; this is very important.
3) Don't wear any metal jewelry around your hands. No rings or bracelets.
I used to do repairs like this on a daily basis and these are just a few things that I learned which help to keep you safe.
Edit: When checking continuity, what you want to see is infinite resistance between the ground lead and the other leads (i.e. no continuity beep when you connect them). If there isn't, that means something in the scope is shorted to ground.
The author claims to have eliminated race conditions, but he really hasn't. He hasn't introduced any protection on his flag variable other than wrapping one reference to it in an interrupt subroutine. This subroutine doesn't disable interrupts, so even here the variable is not well-protected. An air-tight solution would be to wrap all references with some sort of locking mechanism.
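For what it's worth, the usual bare-metal pattern looks roughly like this (a generic AVR-flavored sketch, not the author's actual code):

    #include <avr/io.h>
    #include <avr/interrupt.h>
    #include <util/atomic.h>
    #include <stdint.h>

    /* volatile: forces the main loop to re-read these, since the ISR changes them */
    static volatile uint8_t  event_flag  = 0;
    static volatile uint16_t event_count = 0;   /* 16-bit: NOT atomic on an 8-bit AVR */

    ISR(INT0_vect) {
        event_flag = 1;
        event_count++;
    }

    int main(void) {
        EIMSK |= (1 << INT0);        /* enable the external interrupt */
        sei();                       /* global interrupt enable */

        for (;;) {
            if (event_flag) {
                event_flag = 0;      /* single-byte write: atomic on AVR */

                uint16_t count_copy;
                /* multi-byte accesses must be guarded, or the ISR can fire mid-read */
                ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {
                    count_copy = event_count;
                }
                /* ... act on count_copy ... */
                (void)count_copy;
            }
        }
    }

The point is that volatile only fixes the compiler-caching problem; anything wider than the CPU's native atomic access still needs interrupts disabled (or an equivalent lock) around it.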
As a side note, the "8 years experience in aerospace/defense" claimed by the author is a red flag for me... having worked in these industries I've seen the most horrendous programming passed off as production code, simply because these industries live in a vacuum of innovation. Due to the incredibly costly and time-consuming validation processes required on everything that touches production code, new tools and technologies are shunned. These industries are veritable time capsules of technology, often employing programmers who'd prefer COBOL or Pascal over anything else...
What's more interesting than this blog post are the comments on computed gotos, a feature I've never seen used before: http://eli.thegreenplace.net/2012/07/12/computed-goto-for-efficient-dispatch-tables/
Good question. As a new grad myself, I try to follow these two blogs and do projects that seem interesting to me from them. Beware though, I am mostly digital so the content of the blogs might be biased more to that side.
An introduction to spaceflight engineering class would be good. It's good to understand what space is like from a physical and electronic perspective - vacuum means that "stable" materials start to outgas, outside of the ionosphere and magnetosphere means strange things happen to RAM.
Other than that, an understanding of writing safety critical systems, "Safety Integrity Levels" and some of the automotive coding standards (MISRA, etc).
Not specific for space, but read "Pragmatic Programmer" and "Code Complete"; and, for C++, everything that Scott Meyers has written (Effective C++ series and Modern C++ series)
And have excellent (and I mean excellent) oral and written communication skills. I'll hire the middle of the road programmer who can communicate, over the rockstar programmer who can only grunt any day of the week.
Is there a particular reason you want to do it this way instead of buying an off the shelf power timer?
Bell and Newell (published 1971) was one of the textbooks I learned from as an undergraduate.
There's also a downloadable free .pdf of the full and complete book by Jim Thornton on one of the early supercomputers. (amazon 2)
Here's an article about cell tower basics from hackaday that's surprisingly interesting. It mentions these antennas.
Tinkering with electronics will teach you how, math will teach you why. EE is based on math and physics. If you want to design electronics, not just tinker around, you want to learn math. I think you'll start to enjoy math when you find the connection between math and electronics. I regret that I didn't spend more time on math as an undergrad.
If school math doesn't interest you, have you checked out the math section on http://www.khanacademy.org ?
The unit step function doesn't really have much of an impact on the time invariance of a system because of how well-behaved a function it is. It basically acts like a switch that turns on at t=0 and stays on from then on.
Time invariance is concerned with getting the same output signal from the input no matter when the input is applied. With the unit step function, we know it is 1 for all t ≥ 0, so anything multiplied by it is unchanged from that point on.
Additionally, the unit step function can be shifted very easily in the time domain, allowing the following to be true: y(t-T) = x(t-T), where x(t) = cos(t)u(t) for example.
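For reference, the generic test (my wording, not tied to any particular system in this thread) is: a system S with y(t) = S{x(t)} is time-invariant if, for every shift T,

    S\{x(t-T)\} = y(t-T)

and the shifted version of the example input is simply

    x(t-T) = \cos(t-T)\, u(t-T)

i.e. the step delays right along with the cosine, which is why it doesn't get in the way of the check.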
Here is a link to a basic example of testing for time invariance:
https://dsp.stackexchange.com/questions/23194/proof-of-time-invariance-of-continuous-time-system
Here is a link to an article about the properties of LTI systems:
https://brilliant.org/wiki/linear-time-invariant-systems/
Hope this helps!
Your university is 100% dead wrong. You should only do a 2 page resume if you have > 15-20 years experience in the exact field that you're applying for. And then only if it's for something EXTREMELY competitive ie officer/division head/engineering coordinator or something to that effect.
Also, I would change "Project Experience" to something along the lines of "Projects", or "Design Projects", or something to that effect. Unless you're applying for a research position, I think if you're going to put that section first you should make the title more important sounding.
Also, I would only list your master's GPA. It really is unimportant, especially your music. And I would move all of that to the bottom and use at most 1-2 lines.
This is a version of my resume. The formatting is completely screwed up since I'm opening it in openoffice instead of word.
This is also actually an old version. I don't list my languages/skills there. But if I remember correctly, I tried to keep it to a single line. What technologies you know is rather unimportant these days, especially languages. I could list like 30 languages that I know/have used, but after a dozen or so, who cares?
When using the exponential form I believe you have to use radians or bake in the appropriate conversion, even when in degree mode. Punch in 277*e^(pi/6*i) and see if it handles it correctly.
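If you want to sanity-check whatever the calculator gives you, here's a quick C snippet (my own check, using <complex.h>):

    #include <stdio.h>
    #include <complex.h>
    #include <math.h>

    int main(void) {
        const double pi = acos(-1.0);
        /* 277 at 30 degrees = 277 * e^(j*pi/6); the exponent must be in RADIANS */
        double complex v = 277.0 * cexp(I * pi / 6.0);

        printf("rectangular: %.1f + j%.1f\n", creal(v), cimag(v)); /* ~239.9 + j138.5 */
        printf("magnitude:   %.1f\n", cabs(v));                    /* 277.0 */
        printf("angle:       %.1f deg\n", carg(v) * 180.0 / pi);   /* 30.0 */
        return 0;
    }

If the calculator returns something with magnitude 277 and angle 30 degrees, it handled the conversion; if it treated pi/6 as degrees, the result will be way off.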
In the "Terms & Privacy" section at the bottom of our home page, you should find our contact information.
https://labxnow.org/labxweb/terms.html
you can also find a link here regarding why we are doing this:
Yes! I made a set myself for a total of about $10 to $15 and they are /awesome/. All you need to do is buy a couple of sets of Loc-Line or Flo-Snap (flexible segmented pipe used on machine tools for coolant) and tapped connectors for them, find a suitable piece of material for a base, and some alligator clips and appropriately sized screws that fit into the ends of the hoses. I know this sounds really confusing, because I haven't linked anything or fully explained it all.
Just look here for decent directions:
http://www.instructables.com/id/Third-Hand-A-multi-use-helping-hand-for-electro/
I would definitely use something fairly heavy for the base. I just used a chunk of mdf and it is meh. The threads are going to wear out and it's not super heavy. I was able to just screw the adapters in without tapping the holes though, which was a plus.
I also had a PC fan that I screwed onto one of the hand-ends and use as a solder fume extractor.
All in all, the thing is awesome! Totally worth it!
Glad to hear you are taking it so positively! Rough interviews happen to everyone (even experienced folks fail interviews miserably now and again), and how you react and take it going forward is huge.
Especially since you're just coming out of school I figure it is worth mentioning - don't stress too much about cramming to know everything. Interviews are about gauging where you're at. It's expected that someone coming out of school will have areas they don't fully understand and will need to be taught. Whenever I'm doing interviews/hiring I always give more preference to people who I'd enjoy working with and who can learn, over someone who knows everything when they come in the door.
Also, if you are going to be going into more in-person interviews I'd recommend practicing with this format. Grab a friend and trade off asking each other questions, have ready responses for the "draw me a project you're working on" and expect to answer questions about the details. Be okay with saying, "I'm not sure, but here's my best guess. Tell me more!".
For paper coding problems, leetcode or project euler can be a good way to practice, just keep in mind the skill ceiling is crazy high for these types of things (for your level I'd focus on ones marked "easy" or maybe "medium"). Focus especially on saying your thought process out loud - these problems are more about understanding how you problem solve rather than if you can get the "correct" answer.
Good luck! You've got this!
Voltage divider, with a buffer if you think it's necessary.
Consider also the useful range of measurement: a car battery at 9 V is dead, so do you really care if it's lower than that? Probably not. So use a differential amp (voltage subtractor) to remove that 9 V before you divide it down; otherwise you are throwing away most of your range.
Example in Falstad (you'd probably want a better reference than a power-sucking zener, but anyway): with the battery at 14.5 V your output to the ADC is 3.3 V; with the battery at 9 V your output to the ADC is 0 V.
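The arithmetic behind those numbers, as a tiny sketch (my own illustration, assuming the 9 to 14.5 V window above and a 3.3 V ADC):

    #include <stdio.h>

    /* Subtract the 9 V "dead battery" floor, then scale the remaining
       9..14.5 V window into the ADC's 0..3.3 V range. */
    static double adc_input(double v_batt) {
        const double v_floor   = 9.0;
        const double v_full    = 14.5;
        const double v_adc_max = 3.3;
        return (v_batt - v_floor) * v_adc_max / (v_full - v_floor);
    }

    int main(void) {
        printf("%.1f V battery -> %.2f V at the ADC\n", 14.5, adc_input(14.5)); /* 3.30 */
        printf("%.1f V battery -> %.2f V at the ADC\n", 12.6, adc_input(12.6)); /* 2.16 */
        printf("%.1f V battery -> %.2f V at the ADC\n",  9.0, adc_input(9.0));  /* 0.00 */
        return 0;
    }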
Looking at the DRV8412's datasheet, a UART interface isn't the solution to your problem. Connect the stepper to the driver using the schematic provided in the datasheet (figure 10), then connect the inputs directly to any port on your msp430.
Here's some code I recently used to test a stepper using a DRV8833 driver with an MSP430F5329: http://slexy.org/view/s20Zq4fWOc
What your dad is probably looking for is something like Hantek's line of automotive diagnostic oscilloscopes.
Cool thing about them is they tend to be relatively cheap, relatively rugged, and frequently come pre-set for common diagnostic tasks. You can find them on Amazon for about $100 USD.
If you haven't bought the "Black Magic" books written by Howard Johnson (real name!), consider doing so.
I have noticed this in my neck of the woods as well.
From my understanding I think it has a lot to do with people realizing that for many newer embedded applications an RTOS isn't really necessary, and if you can afford the resources, developing on Linux is much easier compared to traditional bare-metal programming/RTOS work.
That being said, the only experience I have with this is reading a book. Maybe it could be a good starting point for you too: Here is the amazon link
Grab a Beagleboard, hang something off the expansion connector, and write a kernel driver for it. If you can get it to work, you'll learn a lot. The folks in #beagle on freenode are pretty helpful if you get stuck.
[author here:]
I use MathJax. It's pretty widespread and supported by most platforms. I now write my articles in IPython notebooks with TeX equations which get rendered by MathJax. There are a few minor differences between the IPython renderings and MathJax for the web, but I fix those up with a script (primarily just that $abc$ in IPython has to be converted to \(abc\) for MathJax).
Julia (http://julialang.org) is MIT's open source attempt at control/DSP and all linear algebra numerical computation. As a language it has many advantages over MATLAB, since it isn't tied down by backward compatibility. OTOH, being a relatively new project, its ecosystem and libraries are still not as comprehensive as MATLAB's. It's moving at a steady pace though.
Just go to your hardware store and buy outdoor low voltage wire. Should be cheaper. Or, use Amazon 16/2 Low Voltage Landscape Wire - 100ft Outdoor Low-Voltage Cable for Landscape Lighting, Black https://www.amazon.com/dp/B07FYWH2SL
I think I know only a few of those.
Can we get the list completed (and my guesses confirmed)?
For a more concrete suggestion, design your own Arduino clone with some added features, such as Bluetooth, accelerometers/gyros/compass, motor driver, LED drivers, etc. on a PCB.
Then program said microcontroller using C.
Also, go through *The C Programming Language* by Kernighan and Ritchie (popularly known as K&R) to really know C.
I have a crapload of these: http://www.ebay.com/itm/components-box-storage-box-Electronic-50-pcs-SMT-SMD-/150564155492?pt=LH_DefaultDomain_0&hash=item230e52b064
They are small since I find myself using SMD more these days. The lids are spring loaded. Since you can snap them together in different configurations, you can create little parts kits from your component stock. Handy to just have a chunk of these stuck together to float around on your workbench, with only the parts you need.
The same seller has larger versions that would hold resistors, ICs, etc. Everything locks together no matter what the size, so you could for example have two large boxes with big parts and then surround it with the SMD parts you need. All without trying to dump parts from one bin to another.
http://www.khanacademy.org/ will help get your math back up to snuff. Really, without a background in at least the basic maths (Algebra I/II/Linear, Trig, Calc I/II, etc) it's very difficult to even be a successful technician.
Luckily these days there are free resources like Khan Academy, and soon edX (which is extremely exciting). 2-3 months of Khan Academy a few hours a week will certainly get you to a point of understanding in those disciplines.
One can't do electronics without mathematics. Luckily mathematics is beautiful and extremely engaging and interesting. Embrace it and you will go far.
http://www.cburch.com/logisim/
Here is a piece of software that helped me a bunch through EE. If you download it and go to "circuit analysis" in the toolbar you will be able to construct circuits from boolean expressions or truth tables. There are plenty of handy features in there, like an auto-generated Karnaugh map for every circuit you simulate.
Logisim is for win users, but if you're on iOS you can try this instead.
Don't use Logisim, use this: https://github.com/hneemann/Digital
I've made a Mandelbrot renderer in it, unlike Logisim, which just started oscillating on itself.
QUCS is a free program which can do a variety of RF circuit simulations, but it has no layout capabilities. It seems good but I've only played around with it a few times.
I don't know of any single software solution that meets all of your criteria and budget, but you can always do the layout in another program.
I would trust AWS WorkSpaces over this website. Maybe after it's been live for a while and has a decent user base I would consider it.
Apple most certainly designs the analog I/O and embedded blocks of their CPUs. So in your modern CPU you've got amplifiers, drivers, PLLs, DLLs, temperature sensors, DACs/ADCs, power delivery/filtering, bandgap references, phase interpolators, slew-rate/impedance/current calibration circuits, clock topology, clock deskew... and that's just off the top of my head.
Google. Boom: https://www.linkedin.com/jobs2/view/6669939
I actually interviewed with them a few years ago when I was still at Intel. They were hiring me to do exactly what I was already doing for intel (high-speed analog I/O design).
I passed the EE exam about a month ago. I used these to study:
https://www.amazon.com/Study-Fundamentals-Engineering-Electrical-Computer/dp/1985699710
https://www.amazon.com/Fundamentals-Engineering-Electrical-Computer-Specification/dp/1534759492
and they really helped a lot to prepare for the test. hint: you can buy and return on amazon when you're done :)
Let me know if you have any questions
I love the TI Launchpad series! My school used the EK-TM4C123GXL all through my undergrad EE labs. Note: my microcontrollers class used this book: https://www.amazon.com/dp/1477508996/ref=cm_sw_r_em_apa_i_lxBhFb7DBEAGT
My team built this: http://www.engadget.com/2010/04/25/nc-states-computer-vision-software-promises-improved-self-drivi/
I wanted to do something in robotics and this was all they had for me to choose from.
My team was fantastic. I am a hardware guy. I was a mechanic in high school. I know software but I have much better intuition with all things hardware. I found two fantastic software guys and a real hard working Marine knucklehead. He wasn't a brainiac but he brought beer to the lab and would NEVER leave until I did. We are still best of friends and he was a vital part of us "winning".
A third option might be to get this one used. It provides a good amount of design examples, which are always helpful. Plus it's cheap.
Also Coursera offers a power electronics course by CU Boulder (Erickson) which you may want to look into. I know the other ones were free, but I don't know how these new specializations on Coursera work: https://www.coursera.org/learn/power-electronics
Just convert the dBm to W, and add the Watts
For more info look at this: http://www.ehow.com/how_5142308_calculate-dbm-watts.html
If you sum the dB's you would be multiplying the power...
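A quick sketch of the conversion (standard formulas, with my own example numbers):

    #include <stdio.h>
    #include <math.h>

    /* dBm is power relative to 1 mW: P(mW) = 10^(dBm/10) */
    static double dbm_to_watts(double dbm)   { return pow(10.0, dbm / 10.0) / 1000.0; }
    static double watts_to_dbm(double watts) { return 10.0 * log10(watts * 1000.0); }

    int main(void) {
        double p1 = dbm_to_watts(30.0);   /*  1.0 W */
        double p2 = dbm_to_watts(27.0);   /* ~0.5 W */

        double total = p1 + p2;           /* add in watts */
        printf("total = %.3f W = %.2f dBm\n", total, watts_to_dbm(total)); /* ~1.5 W ~ 31.8 dBm */

        /* "Adding the dBm" (30 + 27 = 57 dBm, about 500 W) would be multiplying the powers. */
        return 0;
    }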
I'm in a similar position as you (2nd year summer, still taking classes over the summer though), I found that I was excited about the idea of projects but couldn't get hyped up about it until I found something specific. I'm currently working on some X-Y controlled laser stuff.
You might want to just browse around some Instructables for basic projects or ideas to get you going if you're feeling shaky on developing your own projects. Coming out of structured labs, I'm finding it a lot easier and still extremely beneficial to practice with pre-developed projects for now.
This was posted on Engadget a couple hours ago, so there are people still working on it.
edit: Techcrunch's article has more information, so I recommend reading that instead.
tldr: A company called Ossia created a way to charge phones wirelessly from up to 30 ft away.
The Georgia Tech guy with the cool pen does a good job of the basics, and hits some branch prediction stuff.
edit: Sorry for the vague description :)
https://www.udacity.com/course/high-performance-computer-architecture--ud007
> Doesn't a PSU first downstep to 12 volts, and then farther down the line split those into the 5 and the 3.3?
No, the PSU first rectifies the incoming AC, doubling the voltage if you're in a 120-volt region. The voltage is doubled so they can have the same circuit in 120-volt as in 220-volt regions. This results in 300+ volts DC, which is very deadly.
A high-frequency inverter feeds the current to one or more transformers, with a different winding for each voltage level. At the currents used, above 10 A, it's not practical to convert 12 V to 5 V; you need a separate transformer winding for each voltage.
If you wanted to get five seconds of storage for 850 watts, that would be about 2.4 A @ 350 volts, you'd need something like 250,000 microfarads of storage. This means 160 of these.
You'd need a separate cabinet to hold them, and I can assure you that the risk of something going wrong would be too big to make it worth it.
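Rough math behind that capacitance figure (my own back-of-envelope, assuming you only let the bus sag by about 50 V before the inverter drops out):

    C \approx \frac{I \, \Delta t}{\Delta V}
      = \frac{2.4\ \mathrm{A} \times 5\ \mathrm{s}}{50\ \mathrm{V}}
      = 0.24\ \mathrm{F} = 240{,}000\ \mu\mathrm{F}

Allow a deeper sag and the number shrinks somewhat, but it stays enormous either way.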
My advice would be: stay away from the USB port.
You can not draw a lot of power and if something goes wrong there is a very real threat of frying your USB port and/or motherboard.
Get a 12V wall wart and build something like this.
It is rather inefficient, and generates a lot of heat, but it costs little and you can adjust it from 10V to 1.25V.
I've seen jobs and internships offered by HP in that field. They make ASICs for their server products so there is a lot of VHDL verification and FPGA testing involved.
The jobs you're looking for will not be found under CS. I'm a CE and don't consider anything I do CS related. You'll need to find jobs filed under Electrical Engineering. A quick search for "fpga jobs" returns companies like HP, Xtera Communications, Virgin Galactic, Tektronix, and way more.
YouTube is a really handy learning resource for others, but if you're making this to benefit yourself over a period of time, you may want to look into a website. It's much easier to organize your information well, simpler to edit, and generally much more professional. If you plan on keeping up with this for 8 or so years, a YouTube channel would be a lot of work and it would be difficult to find information quickly. Plus there's no copy-paste of written info.
The suggestion of hackaday documentation is a great, easy way to keep a record. If you want to go a little more challenging and get a little bit of beginner coding in, you may want to look at creating a Github Pages site. It can be quite fun to get everything looking perfect!
What can't you do? solve the ODE? What answer are you getting?
The answer is correct according to wolframalpha
Maybe your answer is written in a way that could be simplified to this one.
This is an excellent resource on the 5940. And the author is a frequent redditor, I think he goes by artcfox.
The nice thing about the 5940, and other PWM drivers, is that you can pretty easily control them and achieve brightness control over each individual LED.
5V output is standard from the fast charger. The phone has to make an explicit request for higher voltages explained here.
Although I haven't played with it myself, I've heard a lot of great reviews about the Beagle Board. So I'd throw that into the list of good ARM project boards.
> CODEX_LVL5
Oh hey, I found another HoNster in the wild.
I found the best approach for recall was to take notes with pen and paper at the time in a separate notebook, using a ruler if needed for straight lines.
Then, outside class, type the notes up. Add PDFs of the lecture notes if useful. If putting in graphs, try and generate them rather than draw them - I used gnuplot to graph data and functions at the time; I'd now use matplotlib and Python, but that's personal preference. Either way round, keep the source that generates the graph as well as the graph itself, so that you can regenerate it with tweaks later.
I found that it was the second step, of typing up my notes, that really helped me learn the material; I had to read my notes, understand them, and write them up as English, and added cross references to other useful material (professor's web pages, text books etc). Because I wasn't under huge time pressure while writing them up, I could take the time to go "huh. This doesn't make sense any more", and find more material or go and ask the TA for help later.
Eagle: free, limited max board size, high irritation factor.
KiCad: seems to be well respected, a bit buggy.
Recent Altium might be very good.
One thing to note is that you can use Arduino boards just as any dev board (in some cases they're even cheaper than the bare chips); just use your own programmer instead of the bootloader or the Arduino IDE. Atmel Studio supports Arduino boards now.
It surprises me that nobody has mentioned Julia: A fresh approach to technical computing.
Julia has a syntax that is similar to MATLAB, although not exactly a copy.
Learning Julia: http://julialang.org/learning/
Try http://www.fpga4fun.com/. Hackaday usually have some interesting things, here is an old post http://hackaday.com/2011/12/30/so-you-wanna-learn-fpgas/, you can probably go from there.
Designing an ASIC? You can probably get as far as a netlist (the final map of gate connections of your design, to be "printed" on silicon) but getting an actual physical chip from it is a very expensive process.
Maybe related IR project: http://hackaday.com/2011/04/22/simple-ir-bounce-tachometer/
hobbyking has some really cheap RC tachometers (designed for multi-blade propellers), but sadly are on backorder: http://www.hobbyking.com/hobbyking/store/uh_listCategoriesAndProducts.asp?idCategory=158
For sure..
I'm a fan of the harder styles of electronic dance..
So I listen to uk hardcore, freeform, hard house, d&b etc..
I also dj uk hardcore.. So this is the kinda stuff I like to crank either at home or behind the decks: https://hearthis.at/annon201/dcfurs-loung-set-qc16-pool-party/
In case you -do- have to use your computer to read datasheets, I can wholeheartedly recommend using F.lux to help with eye-strain, and perhaps a program such as Focus Booster to keep track of when you should take a short break. (Both are free. Check them out.)
I use both and I find that both help me read enormous documents without too much strain, and with enough pauses to let my eyes rest.
There is no LaTeX "software" per se. First, you need to download and install the compiler - for Windows systems, use MiKTeX.
Technically, as LaTeX is just the core (compiler), you can write the documents even in Notepad and it will do just fine. To make things easier, however, there are packages like TeXstudio, which I'd strongly recommend using for non-automated documents and template preparation.
It's not a WYSIWYG editor per se, but once you type something in, it doesn't break up the formatting for no good reason later (looking at you, MS Word), which is also a reason it is heavily preferred in the academic world for typesetting theses and reports. It compiles to PDF in a matter of seconds, so you can iterate very quickly if needed.
Think HTML-like, but simpler. Plenty of free books and examples around the internet.
Even though it might look a bit hostile to a newcomer, it will pay off very well in the long term.
Did I mention you can easily make it crunch *.csv files to produce a plot and a nicely formatted table as well (yes, even in batches, without manual changes to the document)? Also, no struggle with writing down just about any math equation you can think of.
A lot of topical groups on LinkedIn can be pretty good. Otherwise, sadly, I think that /r/ece isn't far off from one of the better communities as general discussion goes.
http://www.physicsforums.com/forumdisplay.php?f=102
This is the only other place I go, typically.
For the students entering my grad program without any EM background (save for a physics class), I recommend Engineering Electromagnetics by Nathan Ida. It's good for an undergraduate text, mostly touching on static fields with a lot of worked examples.
For more advanced EM, I'd recommend Advanced Engineering Electromagnetics by Constantine Balanis. Fairly well written, with a number of examples. The Jin book recommended by u/arosh25 is good, but half of it covers computational EM. It's a great resource, especially if you're going to eventually do some kind of computation, but if you're coming from the ground level, it's probably not what you are looking for just yet.
Not me personally, but a friend got an Onyx M92 for this same reason (well, he was going to grad school, so lots of reading). I'll try to get ahold of him and see how he likes it.
I love my eReader (Nook Simple Touch) but the rendering is way too small (though still readable) for pdf's and scrolling left to right is way too much of a hassle if I zoom in; I can only really use it for books that have adjustable fonts. I did an unscientific survey on Twitter about this a few months back and the consensus seemed to be you need a full size tablet to properly render datasheets because of the detail needed and default text sizes. But with an iPad (popular choice for reading datasheets) there was still eye strain.
Unfortunately the Kindle DX is no longer being made, but it might be possible to pick one up on eBay or used on Amazon. That screen size seems to fit the bill as well, but again, have not tried.
Personally, I have a great printer at work and still prefer paper. Especially for having datasheets open when I'm doing PCB layout or programming. Flipping pages still has too much appeal for me (though I understand the allure of searchable docs and use it when I'm in trouble).
If you figure out something novel, please report back!
27 too old? Ha, you're trying to be funny, right? Grab a copy of K&R's "The C Programming Language". The awesome thing people miss about C: the language is compact and easy to learn. There is no need for 600-page tomes. Like a great game, C is easy to learn, hard to master. I am writing some blog posts on a simple embedded system. If you can follow the posts, you will be fine. embeddedapprentice
From what I remember, Code Complete and Writing Solid Code (similar to code complete but shorter) are both excellent books.
Believe I was reading Expert C Programming around the same time, which covers some of the gotchas when C coding.
http://www.amazon.com/Expert-Programming-Peter-van-Linden/dp/0131774298
http://www.e-reading.club/bookreader.php/138815/Linden_-_Expert_C_Programming:_Deep_C_Secrets.pdf
You'll need a few things to get started:
It sounds like you need a better understanding of straight C. The first, and IMO still best, book on C is The C Programming Language by Brian Kernighan and Dennis Ritchie, commonly known as the K&R White Book or simply the "White Book".
Play the smartphone game "Flow". Really! It's the essence of PCB layout, turned into an amusing pastime. And it's free.
16 years old... very impressive. Definitely keep up with the MCU development and PCB design.
Unfortunately, if you really want to understand things like signal processing and electromagnetic theory, you are going to need university level calculus. Get cracking. I hate to be a downer, but there are people who study hard for four years and still have a weak grasp on some of the concepts you listed...
As /u/CoffeeMakesMeMath suggested, making your own components in Eagle is not too hard. The learning curve can be a little steep, but Sparkfun and Adafruit have some great tutorials for Eagle.
Odd shaped footprints can be pretty painful to make in Eagle. But I took a quick swing at making the one from the datasheet you linked: http://www.filedropper.com/eaglecad-tcr5am
A couple of notes:
Some improvements since then, not much... also the book has a newer edition out. https://books.google.com/books?id=MXzwCfmoihYC
[Update: My coworker confirmed, it's Linden & Reddy, so I did remember! :-) ]
If you have internet access you can make a web request using the Webhooks channel.
I don't think IFTTT supports receiving SMS, so you would need some kind of bridge if that's what you want. One hack could be to send the SMS to a Twitter account, and then read those tweets from IFTTT, but then it's all public.
Speaking from what I observe from students around me and my own experience, I say start now trying to make mobile games. If you do not have a MacBook, iMac, or anything running OS X then you won't be able to make iOS apps (if you do have one of those Apple products, get Xcode, or if you plan on getting a new computer decide whether you want to make iOS apps, since you will need a Mac for that). So download Android Studio and just start programming. The internet is a very resourceful place. I think there might even be a course on mobile development with Android Studio on Coursera, edX, or even iTunes U. All you need to know is Java for Android.
But to answer your question: yes. I have a friend who is CE and is required to take a software engineering course (a CS class), and this semester they are making an Android app. Like I said, it depends on the school you go to, but many employers give you an upper hand if you do personal projects and release them to an app store, or even better in a repository like GitHub where they can see your source code.
Here are some links to get you started in Android development:
https://developer.android.com/training/index.html
and this starts next week if you are interested https://www.coursera.org/course/androidpart1
Amount of data you can store.
So, for integers, you have bytes, words, dwords, ... A byte is 0 - 255, a word is 0 - 65535, ...
note: a word can actually be 4 bytes, or 1 byte, or ... it's platform-dependent. Also assuming unsigned here.
With floating point, you have floats and doubles, and maybe bigger too; I don't really do too much of this. If you assume a float is 32 bits, you can store at most 2^32 (about 4.3 billion) distinct values. Unfortunately there are infinitely many real numbers between 0 and 1, so you can't encode all of them. Floating point therefore gives you a fixed set of numbers that you can represent. See: https://stackoverflow.com/questions/249467/what-is-a-simple-example-of-floating-point-rounding-error for more info.
Doubles are twice the size, i.e. in this example 64 bits, so you can store a shit tonne more values. However, it's still not all of them.
So a half precision float takes up half the space, 16 bits in this example.
The advantage to this is when you don't need accuracy, but want speed / parallelism.
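A small C sketch of the size/precision trade-off (no half floats here, since plain C doesn't have a standard 16-bit float type):

    #include <stdio.h>

    int main(void) {
        printf("sizeof(float)  = %zu bytes\n", sizeof(float));   /* typically 4 */
        printf("sizeof(double) = %zu bytes\n", sizeof(double));  /* typically 8 */

        /* 0.1 has no exact binary representation, so rounding error piles up
           much faster in the 32-bit type than in the 64-bit one. */
        float  fsum = 0.0f;
        double dsum = 0.0;
        for (int i = 0; i < 1000000; i++) {
            fsum += 0.1f;
            dsum += 0.1;
        }
        printf("float  sum: %f\n", fsum);   /* noticeably off from 100000 */
        printf("double sum: %f\n", dsum);   /* very close to 100000       */
        return 0;
    }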
> I can't imagine the cost for the populated boards being much more than 10 bucks in parts (assuming you are used the ADCs in the LPC chip).
Please see the full bill of materials in our github repository, located at https://github.com/nonolith/CEE/blob/master/BOM.pdf. The parts cost is approximately $40 in the quantities in which we'll be carrying out our first run.
The specifications you propose would be fantastic and likely required for a device designed for electrical analysis. However, the CEE is designed to be an affordable and versatile tool for the exploration of the physical world, not a comprehensive tool for advanced circuit analysis. Ringing is not a concept covered in high school (or even 101) science classes...
Intel? LOL no!
Intel recently dropped their entire x86 embedded lines because they are Epic Fail as a company in the embedded space.
http://hackaday.com/2017/06/19/intel-discontinues-joule-galileo-and-edison-product-lines/
Basically Intel doesn't know WTF they are doing if it doesn't look like a Desktop computer.
They've also pissed off an über-ton of start-ups, developers and R&D managers in the process (we've switched 100% hardcore to ARM after the announcement - Intel will NEVER get another chance).
That's because of the high capital investment in the acquisition, calibration, operation and maintenance of the test equipment & facilities required to verify legal operation.
However, you make a good point - the law was written at a time when doing any RF design had a high risk of failure.
As RF ICs become more integrated and foolproof, laws could change to compensate for the lower risk level.
In fact, this is already possible in Europe, where self-certification is an option.
Zensi[1] (a startup acquired by Belkin) looked at the high-frequency EMI coming out of a wall socket when appliances were turned on/off, applied machine learning, and could classify when things (such as a TV or refrigerator) were cycling, and ultimately monitor power usage.
Seems like this could be an interesting project for you, you just need to get the data.
[1] I can't find their actual website, but here's a cnet article. I worked with one of the founders, and currently they have another startup with a similar idea named "SNUPI"