First, decide how much time you have to spend on the problem and whether it should be tractable or not.
If it's very urgent and you can hack around it, hack around it. If it's very urgent and important, hack around it now to unblock yourself, then dig into it deeper later (more on that in a bit). If it's not urgent, you can treat it as a learning experience and dig in deeper if you want to. (Basically, make sure you don't fall down a rabbit hole while trying to solve one specific error, and make sure you're achieving what you actually need to.)
For digging in deeper (because it's just that important, or maybe you just want to learn): make sure you RTFM, then figure out where your mental model of the system and the actual system differ – what did you expect to happen, and what is actually happening? Tweak one thing at a time, checking that things behave as you expect, until you find the cause. There's a lot more I could write about this, and I'm fairly certain much better engineers and authors than me already have – an example error would have made this easier.
For Android in particular, we're luckier than iOS engineers because we have access to the underlying code (for the most part), so you can dig into the Android source and even debug the parts of the framework that run within your application.
Of course, sometimes it's just frustration and exhaustion, in which case you should follow the advice already posted and take a fresh look at it the next day.
PS: I haven't finished reading it yet, but https://www.amazon.com/dp/B00PDDKQV2/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1 was recommended to me and has been good so far.
There might be a little bit of misrepresentation here. I learned to program under the curriculum for this book: https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond-ebook/dp/B07VWKMJBX/
Both authors of this book are excellent teachers. The course gives you a grounded, hardware-first understanding of computers: it starts with Boolean logic and gates, then has you write a program in machine code, then in assembly, and finally in C. Some universities split the C portion off into a separate class, but if this is what OP means by four languages, the course is done very well.
Also, a 300-line program over a two-semester course is completely reasonable for an intro class.
I actually learned from assembly up. The key is to not start with a real assembly language – the book below uses LC-3, a simplified instruction set designed for teaching.
Here's the book they used in my school: https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond-ebook/dp/B07VWKMJBX/
The advantage of going from the bottom up is that it demystifies computers for you.
Agans's book on debugging: Debugging: The 9 Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems (I like it because it changed how I think about troubleshooting)
Bentley's Programming Pearls (2nd ed.) (I like it because it is short and full of problem-framing perspective - reminds me of Alan Kay's quote, "Point of view is worth 80 IQ points.")
I really, really recommend "The RISC-V Reader" by Patterson and Waterman. The book is very well written and gives a good understanding of how RISC-V works, as well as plenty of examples of RISC-V assembly code (alongside ARM, x86, and MIPS).
If you want an introduction to low-level computing in general, I'd recommend Computer Organization and Design: RISC-V Edition; it uses RISC-V for all explanations and examples.
I assume that you meant the All-in-One guide by Mike; it's been on Amazon and similar sites all year.
It's my book that someone else referenced above. You can see it here: https://www.amazon.com/SQL-Server-Query-Performance-Tuning-ebook/dp/B01JC6P8MC/ref=sr_1_12?ie=UTF8&qid=1481383754&sr=8-12&keywords=fritchey
This one is quite good if you are getting started: https://www.amazon.com/Digital-Design-Verilog-HDL-Fundamentals-ebook/dp/B008CGQIEQ/#nav-subnav
Another one by the same author: https://www.amazon.com/Computer-Arithmetic-Verilog-HDL-Fundamentals-ebook/dp/B005H7L0XW/#nav-subnav
That said, you can search online for tutorials and start practicing. If you are a software guy, you first need to unlearn "procedural" code flow. Verilog is like circuit instantiation: it does not matter what is instantiated first (or placed on a circuit board first); what matters is how everything is connected, as the sketch below shows.
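Here's a minimal sketch of that idea (the module and signal names are my own invention): three continuous assignments written "out of order" on purpose. All three are active simultaneously, like gates soldered onto a board, so reordering them changes nothing about the circuit.

    module order_demo (
        input  wire a,
        input  wire b,
        input  wire c,
        output wire y
    );
        wire ab, bc;

        // Written "backwards" on purpose: y is driven before the wires
        // feeding it are assigned. In a procedural language this would
        // read an uninitialized value; here all three assignments run
        // concurrently, so the textual order is irrelevant.
        assign y  = ab | bc;
        assign ab = a & b;
        assign bc = b & c;
    endmodule

A simulator or synthesizer treats all three assigns as one concurrent netlist, which is exactly the mental shift away from procedural flow described above.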