I would recommend Nonlinear Systems by Khalil
You mentioned Franklin's book, and for me personally that's a go-to. It is available on libgen. Ogata's Discrete-Time Control Systems is also pretty nice, and it is also available on libgen.
To be fair, the old classics are still enough IMO for courses in discrete control; you shouldn't bother going for the most recent books.
Just find a cheap book on the subject(s), ideally an older edition or something like that, and work your way through the material. I've been taking a similar approach to Optimal Control and Estimation.
Don't just read the text like a novel though; do at least some of the problems, and ideally find a project you can tinker with and apply what you've learned. Standard test problems like double pendulums and such are an okay place to start, but anything else that interests you is fair game.
Ha! I did a very similar thing, doubled up on aerospace engineering and applied math in undergrad. It is very surprising you haven't had to deal with stochastic disturbances - pretty much all sensors are stochastic in nature and you can't just ignore that out of convenience.
I'd say you have a good foundation to pursue stochastic geometric control, but I don't think it will align with your interest in differential games (I don't know anything about that subject).
If I were to go back to school, I'd focus on robust nonlinear control.
There is a good paper by Martinelli which I used to teach myself stochastic dynamics -
My research is on an estimator model for a dual solenoid - using the voltage and current measurements to estimate the position of the actuator. That removes the need for a costly LVDT.
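For context, the generic idea (this is the standard sensorless-solenoid model, not necessarily the exact one in my estimator) is that the coil inductance depends on plunger position, so the measured voltage and current carry position information through the electrical equation:

```latex
% Coil equation with position-dependent inductance L(x); R is the winding resistance.
v = R\,i + \frac{d}{dt}\bigl(L(x)\,i\bigr)
  = R\,i + L(x)\,\frac{di}{dt} + i\,\frac{\partial L}{\partial x}\,\dot{x}
```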
As a side project, I'm currently working on a control systems library for Julia. Just started, and would love some people to work with. Never really done open source before.
You could help design the control loop for a non-governmental stable-value currency called Dai? :-D
I'm just a lowly PLC programmer, this stuff is too theoretical for me :-/ I just worry about active manipulation of the control-loop causing an oscillation in the price-peg (Dai:SDR)
Cheers!
If you want something really mathematically precise, you could look at Sontag's Mathematical Control Theory. It is quite advanced, but does a nice job from the mathematical perspective. For more examples, a classic reference is Khalil's Nonlinear Control. This also is very technical, but worth the effort.
Estimation with Applications to Navigation and Tracking, although not strictly a Kalman filter book, covers Kalman and other Bayesian filtering techniques quite thoroughly. There are many practical examples too, though I don't really care for the accompanying DynaEst software package.
I never had a chance to do a final project because I got my Bachelor's in a different major, but I really liked the idea of a scale model of those omni-directional vehicles that keep the driver upright, like in Jurassic Park.
My Master's professor turned this down for me once I had actually learned controls, but it's a project I've been meaning to look back into.
https://www.amazon.com/Jurassic-World-Gyrosphere-RC-Vehicle/dp/B076MN51T1
Actually, I just came across this book, but all I've read is the table of contents, and it's RF anyway, which is not really what I've studied.
>Do you encounter a problem that modern control seems a better option?
I think that until I understand modern control, I will not be able to answer this question.
However, I am really just studying modern controls because I want to have as wide a breadth of engineering knowledge as I can.
But anyway, I guess the questions still stand for me: Are LMI methods ubiquitous throughout controls? If not, why not, and if so, why study the topics in Kirk's book? How is LMI control related to the calculus of variations?
There is this, https://www.amazon.com/Control-Handbook-Electrical-Engineering/dp/0849385709 which came out in the '90s, but I don't think it has been updated since. It's a collection of independently written pieces. There's lots of good content, but it's a different approach than e.g. Marks' Handbook for ME.
Check out Ljung's System Identification: Theory for the User. You're probably most interested in Chapters 7 and 9, but there's plenty of useful content throughout the book for the fundamentals of learning in control.
Speaking of which, there's a recently established conference on Learning for Dynamics and Control. Skimming through the recent submissions' titles, you may find methods for problems similar to that of your thesis.
No, mine doesn't have answers, unfortunately. I have this version https://www.amazon.com/Nonlinear-Systems-Author-published-November/dp/B00Y2SUZ92
But a quick google search does show some promise :)
Thanks for the comment.
Somebody said here it is called a PIV, but I didn't know that.
There are several books out there only about PID and its variations, the most popular one by Astrom & Hagglund (I have a copy of the old edition https://www.amazon.com/Automatic-Tuning-Controllers-Karl-Astrom/dp/1556170815/ but I think there are newer editions -- at least one). And many of the popular textbooks have a chapter, or a few sections, about PID. Search Amazon, or Book Depository, for "PID Control" and you'll find several dozen books on the theme, and some quite inexpensive author editions, which I don't know much about. There are also quite a few very expensive books...
A simpler variation of the controller we are discussing is just a P controller (no Ki, or Ki=0) with added tachometer/derivative feedback, which is sometimes called a "rate controller". This is a two-parameter controller, like a PD controller, but without the dangers of differentiating input steps.
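If it helps to see it in code, here's a minimal Python sketch of that structure (gains and signal names are just illustrative, not from any particular reference), showing why only the measurement gets differentiated:

```python
# Minimal sketch of a P controller with rate (derivative-of-measurement) feedback,
# i.e. the "rate controller" described above. Gains and names are illustrative.

def rate_controller(setpoint, measurement, rate, kp=2.0, kd=0.5):
    """P on the error, D on the measured rate only.

    Because the derivative acts on the measurement (e.g. a tachometer signal),
    a step change in the setpoint produces no derivative kick.
    """
    error = setpoint - measurement
    return kp * error - kd * rate


def pd_on_error(setpoint, measurement, prev_error, dt, kp=2.0, kd=0.5):
    """Textbook PD on the error, for contrast: differentiating the error means
    a setpoint step passes straight through the derivative term (the 'kick')."""
    error = setpoint - measurement
    d_error = (error - prev_error) / dt
    return kp * error + kd * d_error, error
```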
Kinda torn between Ogata, Dorf & Bishop, and the massive 3-volume "The Control Handbook". What would you recommend to an undergrad EE?
Is there any noticeable difference between them, or should I just flip a coin and read one randomly?
Here's some information on the X-29; while it's not the F-22, there is a lot of good information in the paper.
Without knowing your circuit, it sounds like you're just looking for an integrator with DC gain control. Say your setpoint was 6V: the integrator output would slowly ramp until it reached -6V; then, if your setpoint was 3V, it would ramp until it hit -3V. The voltage would be opposite the setpoint, but there are a plethora of ways to correct that.
Here's one in a circuit simulator; the setpoint (Voltage) can be adjusted on the right.
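If a quick numerical picture helps, here's a rough discrete-time Python sketch of the same behaviour (the time constant and step size are made up, not taken from your circuit):

```python
# Rough sketch of an inverting "integrator with DC gain control" (a leaky
# integrator / first-order lag with gain -1): the output slowly ramps toward
# minus the setpoint. Time constant and step size are illustrative only.

tau = 1.0      # effective time constant [s] (set by the RC / gain in the circuit)
dt = 0.001     # simulation step [s]
v_out = 0.0    # integrator output [V]

def step(v_set, v_out):
    # Output drifts toward -v_set; the leak (finite DC gain) stops it ramping forever.
    return v_out + (dt / tau) * (-v_set - v_out)

for _ in range(5000):          # setpoint = 6 V -> output heads toward -6 V
    v_out = step(6.0, v_out)
print(round(v_out, 2))

for _ in range(5000):          # setpoint = 3 V -> output rises toward -3 V
    v_out = step(3.0, v_out)
print(round(v_out, 2))
```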
Don't think I can say exactly, but you can try a couple of easy problems from here https://leetcode.com/problemset/all/ which will give you a sample of CS-style interview questions.
This was for my current team. I also interviewed @ Waymo and Cruise for motion planning, and they asked harder questions, mostly graph- and geometry-based. I'd say mediums on leetcode.
There's 1 copy available on Amazon for 24.95 USD.
You can measure this directly if you can read the motor current and voltage.
If you have motor speed, you can back-calculate motor voltage by assuming a linear relationship between the two (the motor kv).
Estimating the current is extremely challenging, as you must know the motor torque, which can only be resolved as an average of all 4 motors in straight flight. This again assumes a linear relationship between motor torque and motor current.
You can get a very rough idea using BEM theory, but that requires knowledge of the blade geometry. You can get an even rougher estimate using actuator disk theory.
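To make those shortcuts concrete, here's a rough Python sketch of both (back-calculating voltage from speed via kv, and an ideal actuator-disk hover-power estimate); every number in it is a placeholder, and forward-flight drag is not captured at all:

```python
import math

# --- Electrical-side shortcut described above --------------------------------
# Assumes an ideal linear motor: voltage ~ speed / kv (back-EMF only, ignoring
# resistive drop), which is only a first approximation.
kv_rpm_per_volt = 900.0          # placeholder motor kv [RPM/V]

def back_emf_voltage(rpm):
    """Back-calculate motor voltage from measured speed via the kv constant."""
    return rpm / kv_rpm_per_volt

# --- Aerodynamic-side shortcut (actuator disk / momentum theory) -------------
# Ideal induced power in hover per rotor: P = T**1.5 / sqrt(2 * rho * A).
# Real rotors need more than this (figure of merit < 1), so treat it as a
# lower bound on hover power.
rho = 1.225                      # air density at sea level [kg/m^3]
prop_radius = 0.127              # placeholder prop radius [m] (10-inch prop)
mass = 1.5                       # placeholder vehicle mass [kg]
n_rotors = 4

disk_area = math.pi * prop_radius**2
thrust_per_rotor = mass * 9.81 / n_rotors
p_ideal = thrust_per_rotor**1.5 / math.sqrt(2.0 * rho * disk_area)

print(f"Voltage at 9000 RPM: ~{back_emf_voltage(9000):.1f} V")
print(f"Ideal hover power:   ~{n_rotors * p_ideal:.0f} W total")
```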
The book I recommend is Rotorcraft Aeromechanics
https://www.amazon.com/Rotorcraft-Aeromechanics-Cambridge-Aerospace-Johnson/dp/1107028078
Hint: without some kind of idea of the drag force your energy estimates will be too inaccurate to be useful except at very slow speeds.
Feedback Systems: An Introduction for Scientists and Engineers by Karl Johan Åström and Richard M. Murray, there is also an online version.
Yes, I'm using a sort of knock-off Arduino Uno. Wikipedia says the ATmega328 (the MCU on the board) can operate at 20 MHz, which I think should be plenty.
I found an interesting thing when playing around with the code. At first I had the code run every 3-5 milliseconds. I then took that constraint away so it ran every loop, and it didn't change a whole bunch. I then had a line print to the serial monitor because I was timing how fast the loop was, and it seemed that the serial printing added just enough delay to make the oscillation very close to stable: the oscillation was there, but it didn't grow unstable, it was just constant. So, somehow adjusting how fast the MCU computes the input based on the state helps with the oscillations. That leads me to believe I'm definitely having a noise issue in how I'm calculating the derivatives. I think the Kalman filter suggested above will be a good thing to implement and hopefully settle that oscillation.
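In the meantime, before the Kalman filter goes in, I might try low-pass filtering the raw derivative so the derivative gain doesn't amplify the quantization noise. Something like this rough Python sketch of the idea (constants made up, and I'd port it into the Arduino loop):

```python
# Sketch of a low-pass-filtered derivative, a common stopgap before a full
# Kalman filter. The constants here are illustrative only.

alpha = 0.2            # 0 < alpha <= 1; smaller = more smoothing, more lag
prev_angle = 0.0
rate_filtered = 0.0

def filtered_rate(angle, dt):
    """Return a smoothed estimate of d(angle)/dt from noisy angle samples."""
    global prev_angle, rate_filtered
    raw_rate = (angle - prev_angle) / dt            # noisy finite difference
    prev_angle = angle
    # Exponential moving average tames the noise in raw_rate.
    rate_filtered = alpha * raw_rate + (1.0 - alpha) * rate_filtered
    return rate_filtered
```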
There's a good introductory textbook on nonlinear dynamical systems and applications to biological systems called Modeling Life that I help teach a class for. It's aimed to be easily digestible for college freshmen, so it has some introductory calculus in there, but there are also some really nice connections between dynamical systems and real-life systems that are outside of what's normally taught in college differential equations courses.
Possibly https://www.amazon.com/Living-Control-Systems-III-Fact/dp/0964712180 and Behavior: The Control of Perception from the same author. I'm coming from a psychology background, and these were quite useful.
I'd suggest getting a book on the subject. I've got this one in hand atm. It's relatively cheap, but your library may have a copy of it on hand if you're still in school. Chapter 3 (out of 6 chapters) is basically all about the subject you're trying to tackle.
I can fill in gaps here and there, but I doubt I'd do the subject much justice trying to paint the whole picture.
In broad strokes: a functional (J[x(t), u(t), t_f] in your case) is a function of a function. That is, it takes a function defined on an interval as input, and outputs a number. You are trying to minimize this number by varying the function u(t), i.e. computing a control history. The calculus of variations is about learning to take derivatives of functionals with respect to functions (as opposed to functions with respect to coordinates or parameters), and then developing an equivalent of the first and second derivative tests from calculus so as to find extrema.
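If a concrete formula helps, the standard first step is the first variation. For the simplest textbook functional (fixed endpoints, not your free-final-time problem), requiring the first variation to vanish gives the Euler-Lagrange equation, the analogue of setting the first derivative to zero:

```latex
% Textbook fixed-endpoint functional; boundary terms vanish because \delta x(t_0) = \delta x(t_f) = 0.
J[x] = \int_{t_0}^{t_f} L\bigl(x(t), \dot{x}(t), t\bigr)\, dt,
\qquad
\delta J = \int_{t_0}^{t_f} \left( \frac{\partial L}{\partial x}
      - \frac{d}{dt}\frac{\partial L}{\partial \dot{x}} \right) \delta x(t)\, dt = 0
\;\;\Longrightarrow\;\;
\frac{\partial L}{\partial x} - \frac{d}{dt}\frac{\partial L}{\partial \dot{x}} = 0
```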
I found an answer to a similar problem to yours, but it's not quite the same. I really suggest you dig deeper into the subject before anyone like me throws an equation at you.
These are for an undergraduate course based on the textbook by Seborg, Edgar, Mellichamp and Doyle. So since they don't really do much multivariable stuff, I haven't invested much into those topics. If you're interested in MIMO control, you may be interested in my repository for my postgraduate module which is based on Skogestad.
Inkbird has released a new product, the IPB-16 PID Temperature Controller.
https://www.amazon.com/Inkbird-IPB-16-Temperature-Controller-Thermostat/dp/B06WD6X17V/ref=sr_1_1?ie=UTF8&qid=1498896977&sr=8-1&keywords=B06WD6X17V A promotion is going on; if you have any interest, PM me and a 20% coupon code can be shared with you.
Control System Design Guide by Ellis is a must have. He is very skilled at explaining in an intuitive way.
I am currently taking an Optimization & Control course, using this textbook: https://www.amazon.com/Numerical-Optimization-Operations-Financial-Engineering/dp/0387303030
I haven't spent much time with it yet, but I have had several people tell me it's a highly recommended book.
Thank you for all the mechanical book suggestions. I work in electrical/controls (primarily controls) for a power design company, so those topics are further off my radar. But I am happy to say I know what control panels, relays, wiring, analog signals, and discrete signals are.
For reference conversions, I was going to bring Michael R. Lindeburg’s Engineering Unit Conversions, 4th Edition.
Great, I found a copy of Fundamentals of Momentum, Heat, and Mass Transfer for $7.20. So that seems like a great route to go for those types of questions.
I have never seen those pocket guides before. And the free download makes it all that much more appealing. Thanks!
I have the Bela Liptak books, but they are very dense. When I started studying a couple of months ago, I began with those books. It was very slow going trying to read and comprehend them. I am hoping that after the first couple of review books, I will be able to digest the Liptak books better. Most PEs I have spoken with say that they mainly referenced the review books, and not so much the Liptak books. So I appreciate your differing opinion.
I would not have thought of bringing the calculator manual, since we were not allowed to bring it to the FE. Great idea. I will definitely do that. As annoying as it has been, I kept using my FE calculator (Casio fx-115EX PLUS) at work in preparation for the PE.
Thank you! Thank you! Thank you!
This book has been a great help with Kalman filters.
If you like Python, this other book on Kalman and Bayesian filters in Python is a must.
Robust and Adaptive Control: with Aerospace Applications by Lavretsky and Wise helped me get my foot in the door for advanced control of aerospace systems.
http://www.amazon.com/Robust-Adaptive-Control-Applications-Processing/dp/1447143957
I used Ogata for my control subjects at uni. I thought it was a decent introductory text, although I never pursued the subject beyond what I did at school. There's a pdf floating around on the internet somewhere.