I recommend this book: Artificial Intelligence: A Modern Approach. I believe Peter Norvig, one of the authors, also makes his Stanford class notes available online. This is the book I used for my intro to AI class. It was one of my favorite classes, and the book was interesting and easy to read. It'll introduce you to a lot of the different parts of the AI field, but as I remember it ends up focusing more on machine learning than on robotics, vision, etc. You'll want some background in probability as well. I had already taken probability; if you haven't, it might be good to pick up a book on that too.
I've heard good things about "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold (though I haven't read it yet)
Also, if you're traveling and bringing lots of books, you should pick up an e-reader. You'll be really happy you did. Think of it as the book-weight equivalent of a constant-time algorithm.
Long answer is here
Short answer is "stop whining and learn while you still can"
Regards,
Someone who didn't
>Even if your company needs it I will do it for free
Don't do that. It sounds like you are excited to join industry, which is great, but people are going to take advantage of you if you think like this - even the slightest bit. If people cannot pay you at all there's actually a good chance their idea is bad or they don't have the skills to lead a project.
>Id also love for it to serve the community in some way
Help out an open source project. Not only does this "serve the community", but it's a very good resume builder to say your changes are in a live release of a popular open source project. Learn how to submit merge/pull requests and follow community standards.
>What taught you to be a better programmer and engineer?
Learning how to work in a team instead of doing everything solo.
Personally, I've always wanted to play a certain AI game on mobile (I do not like other mobile games). It did not exist, so I made this. That was an enormous exercise in TDD, AI, computer science, project management, and managing my own motivation as a resource.
Linux at least supports both x86 and MIPS as well as several other architectures...
The purpose of an operating system is to usefully expose the features of the hardware. If you want, say, hardware virtualization, you'll need to be using a CPU that supports it, and write a bunch of assembly code to trigger it. Linux has a vm86() system call that allows using the 386's support for "virtual 8086" mode, heavily used by Linux DOS emulators; this system call doesn't exist on non-i386 architectures.
The OS even tends not to use much of the instruction set if it can get away with it. You'll hear the dictum that you can't use floating-point, SSE, etc. instructions in kernel mode: this isn't because they inherently don't work, but because the cost of saving/restoring floating-point registers in addition to the basic set of registers is prohibitive. Also, writing in C instead of assembly just tends to be much easier, so only hardware-specific support, such as booting up the computer, tends to be written in assembly.
The only CPUs that really can't support modern operating systems are those without memory protection (which keeps applications from interfering with each other, whether intentionally or not) and device protection (which keeps applications from accessing raw hardware resources like disk and network without the operating system's permission).
You might also find VMware's "A Comparison of Software and Hardware Techniques for x86 Virtualization" enlightening. The overarching question is how useful hardware virtualization (of that era) was, compared to software techniques. The subtler question is what else software techniques can implement: they generally manage to implement protection, too.
I assume you're on linux or OSX and know how to program a little bit. I also assume you want to do this lots of times, not just once.
1) Use imagemagick to crop out the slice of the image you need. It's a command-line tool so you can write a script to invoke it. http://www.imagemagick.org
2) Pass the slice into the OCR API here using the curl or wget command. http://www.datasciencetoolkit.org/
3) Parse the text using your favorite language (Perl, AWK, Java) to extract the timestamp
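If it helps, step 3 might look something like this in Python. The crop geometry, file names, and the OCR form field in the comments are placeholders I made up; check the datasciencetoolkit docs for the real endpoint before relying on them.

```python
import re

# Steps 1 and 2 shell out to ImageMagick and the OCR service. The geometry
# (WxH+X+Y) and form-field name below are placeholders, not the real API:
#
#   convert frame.png -crop 300x40+10+20 slice.png
#   curl -s -F "file=@slice.png" <OCR endpoint> > ocr.txt
#
# Step 3: pull an HH:MM:SS timestamp out of whatever text the OCR returned.
def extract_timestamp(ocr_text):
    m = re.search(r"\d{2}:\d{2}:\d{2}", ocr_text)
    return m.group() if m else None

print(extract_timestamp("REC 2013-05-01 12:34:56"))  # 12:34:56
```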
--PhD
Hack Nights! Have people come into your communal lounge and just build things. NO HOMEWORK ALLOWED.
Java Robots: http://robocode.sourceforge.net/ - make robots in java and have them fight each other.
Have upperclassmen give presentations on things they've learned.
Movie Night - watch something cheesy like Hackers.
Were you given a syllabus or curriculum? Are you teaching for the AP CS test? What do you want the kids to get out of this class?
Personally, I think a CS class would do really well to have logic puzzles, brain teasers, and critical thinking, especially early on when no one knows how to program. It's a good way to start thinking logically and algorithmically.
But it'd be a silly intro to CS course if the students never learned programming (after all, programming languages are how we convey our ideas and incidentally happen to also run on computers :P). You'll want an easy high-level language that the students can jump right into. I recommend Python - it's very easy to understand and there are lots of terrific resources for it.
I found this coursera course to be one of the best MOOCs I've seen. It's introductory programming in Python. In fact, if I were you (and if it's offered again in the fall), I'd consider having the students enroll in this. The video lectures are great, it has quizzes to test understanding, and the assignments / mini-projects are amazing. Plus, their CodeSkulptor site is a great thing - you can get a Python program up and running in no time (no installation hassle required)!
The mini-projects build interactive games (such as Pong and Asteroids) with appropriate starter code. It uses its own simplified GUI library designed for first-year CS students who are new to programming. The fun interactive games are a great way to keep students engaged (rather than writing a program to compute a Knight's Tour or test whether a number is prime, etc.).
https://www.coursera.org/course/interactivepython1
I, myself, learned Python from Google's Python class taught by Nick Parlante. Very good lecturer, and goofy too (the good thing). The assignments were fun little exercises for me.
https://developers.google.com/edu/python/
Good luck! This sounds very exciting! I hope you and your students have a great time :)
SICP(Structure and Interpretation of Computer Programs)
Introduced me to Scheme. This book has never failed to blow my mind. The problems may seem complex, but figuring out the solution is an enlightening process indeed. It also teaches you lots of interesting concepts.
Introduction to Algorithms, by CLRS was mentioned here last time something like this was posted. It's not language specific, and it is very comprehensive.
Code Complete / Clean Code / Pragmatic Programmer - all three cover the same topics and will improve you as a software developer. Code Complete is probably the weakest of the three, since it's the longest while covering more or less the same material.
Going through the Stanford List, here are the books listed there that I have also used and thought positively of:
Of the list, the only one I have read and didn't like was Operating System Concepts. It's not that it was terrible, per se; I just liked Modern Operating Systems a lot more. I used the third edition, but I can't imagine the 4th is any worse.
Computer Science students often come to college with wildly different levels of experience. I really didn't begin programming until college.
First question: are you doing OK in your classes? If the answer is "yes", then DON'T PANIC.
Second question: are you doing anything above and beyond your coursework? Unlike some other college majors, you are expected to invest some serious effort into self-study. This is a critical skill that will define whether or not you're a good career programmer, let alone CS student. If you only do what your professors put in front of you, you will be mediocre, period.
Still, don't panic - it's your first year, and this is where you figure out what it will take to be successful. You absolutely do want to be picking up books and reading them in your free time. I used to say that Borders should become your second home - but these days, for many people, it's all ebooks now. Either way, books are about to become your new addiction. When I interview a programmer for a job at our company, I sometimes ask them to describe one of their favorite programming books. The answer often reveals more than any other part of the interview.
There's a big huge discussion to be had as to what books you should start reading. You mention language books, and certainly, getting to know the languages you use better (or learning more languages) is very good. However, there is also a lot to learn about the practice of programming. I won't rattle off a long list, but I'll link to a nice StackOverflow page that lists a lot of them. And I will recommend reading the one at the top of the list: Code Complete. Every novice programmer should read this.
(edit: fixed link markup)
You don't do lectures in kindergarten, but some of the activities in CS Unplugged could work for kindergarteners. If you can count to 15, you can do the binary numbers exercise. http://csunplugged.org/binary-numbers/
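For reference, the whole binary-cards exercise (the kids use cards with 1, 2, 4, and 8 dots) boils down to a couple of lines of Python:

```python
# CS Unplugged binary cards: 8, 4, 2, 1 dots. Any number up to 15 is a
# unique pattern of face-up (1) and face-down (0) cards.
for n in range(16):
    print(n, format(n, "04b"))  # e.g. 13 -> 1101 (8 + 4 + 1)
```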
Hour of Code, Alice, and Jeroo are all things that could work for a 5th grade curriculum. Alice lets you code functions, loops, conditions, and math expressions in a drag-and-drop environment. http://www.alice.org/index.php
Nowadays most universities start students off with Java or Python. Those languages are both easy to learn while being versatile and they both see actual use in industry. However, it's more important that you understand the fundamentals of programming logic, control flow, data storage, etc. than any particular language.
If you want other things to study, look into discrete math and theory. Even something simple like knowing what a DFA is and their limitations would put you way ahead of the curve.
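To make that concrete, here's a toy DFA in Python (the state names and alphabet are invented for illustration). It accepts binary strings with an even number of 1s:

```python
# Transition table for a 2-state DFA over the alphabet {0, 1}.
# Reading a '1' flips between the states; a '0' changes nothing.
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}

def accepts(s):
    state = "even"                      # start state (also the accept state)
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == "even"

print(accepts("1010"))  # True  (two 1s)
print(accepts("0111"))  # False (three 1s)
```

The limitation mentioned above falls right out of this picture: with only finitely many states, a DFA can't count unboundedly, which is why no regular expression matches balanced parentheses.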
As an aside, this question isn't really suitable for this forum. /r/learnprogramming would probably be a better place to post this.
I recently wrote an article to answer this very question, taking the reader from a brief history of the problem, through Turing Machines, time complexity, P & NP, some of the consequences of a proof and a brief overview of attempts at solving the problem.
I hope someone finds it informative (edit: ... and would appreciate feedback if anyone has any).
I will be downvoted to oblivion... but it has to be said. You really have no business teaching computer science... yet. Many of the posters here do not seem to grasp the difference between computer science and computer programming either.
Yes, to be good at comp science you need a solid mathematical background, but you have several years of study and work ahead before you're honestly qualified to teach comp science. You have a head start, but you cannot complete it in 4 months.
If you want a good basis (and some regard as the best) in computer science that is heavy on math then read all of Knuth's books "The Art of Computer Programming". With your math background you should be able to get through it much easier than others.
http://en.wikipedia.org/wiki/The_Art_of_Computer_Programming
You could even use these books as the basis for a computer science class. They have great problems of varying difficulty on each subject. The difficulty ranges from "you should be able to verbally give the answer in a few seconds" to "if you solve this... you will be famous (at least in math / comp sci circles)".
A great metaphor comparing bad code with broken windows.
They used the metaphor of broken windows. A building with broken windows looks like nobody cares about it. So other people stop caring. They allow more windows to become broken. Eventually they actively break them. They despoil the facade with graffiti and allow garbage to collect. One broken window starts the process toward decay.
Chapter 1: Clean Code
As best I can tell, the research he is referring to in the video is this paper: http://www.sciencedirect.com/science/article/pii/0167278990900762 (unfortunately I do not have access to sciencedirect articles at the moment)
From the articles that cite that paper, it seems that he was not working on a general sorting algorithm, but rather generating minimal sorting networks. These can be used to sort arbitrary numeric inputs up to a given size, but are not the same as a sorting algorithm.
Wikipedia touches on what is going on in its article on sorting networks: http://en.wikipedia.org/wiki/Sorting_network which "more_exercise" also points out in his/her post. The sorting networks that are known with theoretical best size and number of inputs have large constant factors. The genetic algorithm was able to find networks with better empirical runtime than the best known networks.
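For a concrete picture, here's the well-known 5-comparator network on 4 inputs, sketched in Python. The network is just a fixed list of compare-and-swap pairs, which is exactly the kind of object a genetic algorithm can mutate and evaluate:

```python
# Each pair (i, j) is a comparator: swap positions i and j if out of order.
# This fixed 5-comparator sequence sorts any 4 inputs (the known optimal
# size for n = 4).
NETWORK = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def network_sort(values):
    v = list(values)
    for i, j in NETWORK:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

print(network_sort([3, 1, 4, 2]))  # [1, 2, 3, 4]
```

Because the comparison sequence never depends on the data, a network like this can be laid out in hardware or vectorized, which is why minimizing its size matters.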
You can write directly to VGA memory at segment 0xA000 (linear address 0xA0000). This can be done under DOS, but there's nothing stopping you from making it bootable directly.
http://www.brackeen.com/vga/basics.html
http://www.codeproject.com/Articles/36907/How-to-develop-your-own-Boot-Loader
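As a sketch of how the mode 13h framebuffer is laid out (320x200, one byte per pixel, linear starting at 0xA0000), here's the addressing in Python with a bytearray standing in for video memory; on real hardware or under DOS you'd write to the actual address instead:

```python
WIDTH, HEIGHT = 320, 200
vga = bytearray(WIDTH * HEIGHT)   # stand-in for the memory at 0xA0000

def put_pixel(x, y, color):
    vga[y * WIDTH + x] = color    # pixel (x, y) lives at offset y*320 + x

put_pixel(160, 100, 15)           # a white (palette index 15) dot at center
print(vga[100 * WIDTH + 160])     # 15
```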
Introduction to Algorithms is an absolute classic. It covers the vast majority of the algorithms that a good programmer "should" know (and goes over much of the math in the appendix in the back). Every school I've worked with has at least 1 course using this text, and typically each company doing anything interesting has at least 1 copy floating around somewhere.
I have a bunch more books that I could personally recommend if you have a specific thing you're trying to learn, but in terms of books that are 100% canon, that's the only one that comes to mind for me.
For "C language", the following are mandatory reference-bible material, though there are better books for newbies.
The C Programming Language, Kernighan & Ritchie.
The Standard C Library, Plauger.
Surprised nobody has brought it up yet: Structure and Interpretation of Computer Programs.
Also The Art of Computer Programming books by Donald Knuth are pretty classic, but I haven't read any of them yet.
Code: The Hidden Language of Computer Hardware and Software by Charles Petzold
Structure and Interpretation of Computer Programs by Harold Abelson and Gerald Jay Sussman.
When it comes to programming, this is the book to read. The knowledge contained herein will be relevant to you no matter what programming language or concept you use. Every chapter is full of information, well-written, with exercises that will ingrain in you the skill taught in that chapter.
This is not a book about programming tips and tricks nor is it a cookbook.
This book is about fundamental programming principles that you will use whether you write object-oriented or functional code, and it will teach you how to build the rich abstractions that make your code as elegant as possible. This book teaches software engineering as an art.
It is considered quite advanced even though it is a freshman's book, so I suggest you just skim it at first and study whatever background you are missing in order to understand it.
You can read the book for free over here. You can even easily find a .pdf copy of the book online as the book is under a Creative Commons License.
You can also buy the book over here. I suggest reading the reviews in this link as they have a better write-up on why this book must be read by every computer science student.
Find a reason to use it. I learned most of my higher level math stuff through programming, never passed college algebra in my actual courses and I pretty much hated pure math. After jumping into project euler and a coursera machine learning course I fuckin love math. I've learned more math in the last six months than I had the previous six years. Having an application is a huge help.
Check out the book A Mind for Numbers. Really insightful for CS majors IMHO.
Pick a language, go to project euler and go to town. I've found it a great way to familiarize myself with a language enough to be comfortable starting a bigger project. (again: avoid C#, Java, etc. go functional or interpreted)
Please don't go into a CS major with the mindset of "I need to learn things that will get me a job." I see people do that, and they just ignore fundamental, albeit dry, CS subjects (such as data structures) because they think they won't apply in the real world (they're holding out for trade-specific classes such as game or web programming). In the end, they feel like the major didn't fulfill their expectations, but they really didn't understand what CS was offering in the first place.
CS isn't focused on programming languages: languages just allow Computer Scientists to verbalize ideas. Some languages allow the programmer to express ideas more easily, so they may be preferred in certain situations (C is very useful for bit pushing, but if you want an expressive, dense language then use something from the functional paradigm.)
Anyways, people say that CS is just applied discrete mathematics, and I'd say they're correct. So if you want to prepare, I'd say study philosophy/logic, maybe get ahead and study linear algebra, and start solving programming problems with style. Don't worry about what language to learn; just learn a language that will help you do what you need, and understand what the language offers (don't try to force a language into working like another - if you understand how languages try to solve problems, it'll open your mind and make it easier to pick up others).
Amazon web services offers many datasets and you can spawn an instance with the dataset as a mounted volume. You'll still need to figure out how to work with it, but quite a decent selection to mess with.
Effective Java by Joshua Bloch is a really good book and definitely worth reading if you're using an object-oriented language, not just Java. It helped me immensely when I was starting out with how to think about my code and my designs.
I know exactly the book for you.
Code: The Hidden Language of Computer Hardware and Software
It's a well written book that will guide you through the bits and bytes, the logic gates, the assembly instructions, and the hardware that implements it. It's a nice casual reading style, not like many more dry textbooks. Get it now.
If you want something more in-depth and textbooky, with exercises etc., check out Patterson and Hennessy's Computer Organization and Design, which is a good book, but perhaps a bit more than what you want to dive into right now.
I would recommend reading Design Patterns: Elements of Reusable Object-Oriented Software. That book will give you the majority of design knowledge you would gain at this point in your career from college.
I have way too many bookmarks and resources when it comes to programming. I just sort each language or type in each board to check for reference.
> read all of Knuth's books "The Art of Computer Programming". With your math background you should be able to get through it much easier than others.
Not the most practical of advice. TAOCP is almost more of a reference than a learning manual, especially if you consider all of the volumes together! :-) It's like reading an encyclopedia -- the first three volumes are something like 6 inches wide worth of paper. It's also fairly dense and I'd argue not the most accessible text, especially for first-time students. Even though it does introduce basic concepts, it's written for people with a lot of background in CS.
As a learning manual (either for one's self, or as a basis for what to teach students) I'd recommend Structure and Interpretation of Computer Programs.
"Computer science is no more about computers than astronomy is about telescopes."
-- attributed to Edsger Dijkstra
Enjoying programming is fine, but based solely on that, no one can tell you if CS is the right thing for you. Actually, if programming your calculator and knowing a little Perl is the first thing that comes to your mind for why you should do CS, you probably shouldn't.
Head to the nearest university library and bury your head in introductory books on CS fields. For example, read a little of Cormen's "Introduction to Algorithms" and maybe do a few of the exercises and proofs. Enjoying it? Theoretical CS may be your thing.
Same goes for Kurose's "Computer Networking" and the computer communication field, or Tanenbaum's "Modern Operating Systems" for ... you guessed it.
CS is very diverse. You don't have to like all of it.
I'm not convinced that non-CS majors are in general better programmers. I don't mean the opposite either, but I find that people often think they are already great programmers after coding a simple Android application in Java.
There are a lot of things to be learned from a CS major; those who are already good programmers will become even better. Even an introductory course like Introduction to Algorithms will open one's eyes. The thing is, nowadays there are already libraries available for these algorithms.
The traditional answer, especially if your school uses some flavor of *nix to host the autograder, is to use Cygwin to compile on the command line with GCC, but I could never for the life of me get GDB to work correctly for debugging.
If you want to stick with a non-Microsoft compiler and still use an IDE, try Code::Blocks.
But honestly, I'd say stick to what you know. If you can work faster in Visual Studio, download the latest version of VS-Express and go to town. As long as you're not getting too clever with the preprocessor or C++11, (or 14, or... 17?) extensions, your code should basically do the same thing in either environment.
Short version: OpenSSL uses a random number generator (RAND_egd (entropy gathering daemon)) to give large numbers to a primality tester. OpenSSL uses the Miller-Rabin Test.
Long version: http://www.openssl.org/docs/crypto/BN_generate_prime.html
1) Generate a "random" integer of the required size (number of bits). The randomness must be quite good, for the result to be cryptographically secure. This is what RAND_egd does. This is still a tough problem. -- This number should be odd.
2) Try to divide the number by a list of known small primes (3, 13, etc.) to stop early.
3) If the previous step is passed, run Miller-Rabin. There are no composite numbers that will always pass this test.
4) If the odd number fails the Miller-Rabin test, increase the number by 2 and go back to step 2. Usually many iterations are necessary (often more than a hundred).
The resulting number is only probably prime, called a "pseudoprime". The number has a low but nonzero probability of being composite.
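A sketch of that loop in Python (this uses the standard library's random module, which is not cryptographically strong; the whole point of RAND_egd is that the real thing needs a proper entropy source):

```python
import random

SMALL_PRIMES = [3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

def miller_rabin(n, rounds=40):
    """Return True if n is probably prime; a composite survives a round w.p. <= 1/4."""
    d, s = n - 1, 0
    while d % 2 == 0:           # write n - 1 as d * 2^s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False        # a is a witness: n is definitely composite
    return True

def probable_prime(bits):
    # Step 1: random odd candidate of the right size (top and bottom bits set)
    n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
    while True:
        # Step 2: cheap trial division to stop early
        if not any(n % p == 0 for p in SMALL_PRIMES):
            # Step 3: Miller-Rabin
            if miller_rabin(n):
                return n
        n += 2                  # Step 4: move on to the next odd candidate

print(probable_prime(64).bit_length())  # 64
```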
It's worth noting that just about every honest person I've heard recommend The Art of Computer Programming has added
> but I haven't read any of them yet.
They're great books, but they're not intended for the vast majority of people.
I'm currently going through Head First Java, and it's excellent. Having said that, it's the only book on Java I've read, so I can't say it's the best.
It has a virtual machine. It's not even limited to running the Actionscript byte code. Adobe released Alchemy, a C compiler that emits flash. Someone ported Doom to it: http://www.newgrounds.com/portal/view/470460
> Problem 1 is that most interpreters use RegEx to interpret a language
Half-assed parsers of all flavours use regular expressions.
Real language translators and interpreters are built on top of a lexical analyser ("lexer"), that breaks a sequence of characters into tokens.
Regular expressions have a bunch of equivalences to finite state machines (DFAs and the like) that I knew about 25 years ago but that have slowly faded from my memory in the intervening period. You will most probably need to read up on those.
> Problem 2 is ... the interpreter needs to be versatile too.
Golly.
So you need to learn how to write a compiler. And you also need to translate your regular expressions into a state machine.
Broadly, the way I see this going down is that you write a library for representing your lexer, and an API to implement regular expressions as finite state machines that consume tokens from the lexer, and all that needs an API to make it extensible. Then you load it up with some default rules for tokens and language syntax to define a regexp, and then the language bootstraps itself from that, by parsing a configuration file containing more definitions for escape sequences and other regexp syntax.
You might want to look at a pre-existing parser-generator library like ANTLR to see how much of this problem is already solved for you. It may be that you can simply embed that and bootstrap it with some definitions and some custom code to be able to feed the results of parsing back into the ANTLR library.
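As a toy illustration of the lexer layer (the token names and patterns here are invented, and a real implementation would build its own state machine rather than lean on re), Python's named groups make a serviceable tokenizer:

```python
import re

# Token spec: the first pattern to match wins, so order them carefully.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[-+*/=()]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def lex(text):
    """Break a sequence of characters into (kind, lexeme) tokens."""
    return [(m.lastgroup, m.group())
            for m in MASTER.finditer(text)
            if m.lastgroup != "SKIP"]

print(lex("x = 42 + y"))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```

The parser then consumes this token stream instead of raw characters, which is the separation the "half-assed regex interpreters" skip.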
I found the WEKA toolkit to be a nice centralised resource when it came to learning about the multitude of techniques and parameters used out there. There's a book too, which is a very informative read, if a little dry in places.
This was used in my Language Identification project for speech signals and it worked quite nicely.
As far as Programming Language Theory goes, this is a fantastic collection of books, papers and videos: http://steshaw.org/plt/
I'm particularly partial to How to Design Programs as a freshman-level guide to PLT. In some ways it's a spiritual successor to SICP.
The Art of Computer Programming is undoubtedly part of the computer science canon, and it's well worth the investment (which is significant), but you probably won't get real value out of it until your junior year and thereafter.
Like in philosophy, there is no one definition of intelligence in comp. sci., any given AI system will essentially have its own definition.
Artificial Intelligence: A Modern Approach (Stuart Russell & Peter Norvig) gives a few definitions, and its introductory chapter gives a fairly good overview of what you're looking for, as does chapter 26.
Some examples include the splitting of AI into strong AI and weak AI. Weak AI being an AI that solves a single problem, e.g. computer vision, speech recognition, or navigation. Strong AI, on the other hand, is the (somewhat less well-defined) all-encompassing human-like intelligence you'll see in films, for example. The possibility of ever achieving strong AI, and what exactly that means, is hotly debated.
Another view of different types of AI separates it into four types:
"Thinking" refers to, well, thinking, reaching a conclusion given some facts.
"Acting" refers to making and executing decisions.
"Humanly" as you can guess refers to human-like, with all the imperfections and advantages that come with it (e.g. computer vision).
"Rationally" means the optimal solution, e.g. navigation.
(Grain of salt warning: this is all from memory, check the book for better information)
The Algorithm Design Manual is a fairly popular one you can try (more approachable than CLRS) -- and I believe it even contains some references to interview questions.
Scratch is a programming environment made for kids https://scratch.mit.edu/
Later your cousin can go on to slightly harder topics like Haskell Foldables. Just kidding. There's a book called Realm of Racket that I've heard someone used for non-programmers with great results.
Oh, is this the one that uses Node.js and V8 as an install requirement?
Edit: survey says yes:
```
dghost@Voyager ~> brew info npm
node: stable 10.8.0 (bottled), HEAD
  Platform built on V8 to build network applications
  https://nodejs.org/
  /usr/local/Cellar/node/10.8.0 (4,022 files, 48.7MB) *
  ...
icu4c: stable 62.1 (bottled) [keg-only]
  C/C++ and Java libraries for Unicode and globalization
  http://site.icu-project.org/
  /usr/local/Cellar/icu4c/62.1 (250 files, 67.3MB)
  ...
Version             0.1.1
Total size          1.02 MiB
Total dependencies  121
Tarball size        10.2 KiB
Direct dependencies 4
Dependencies
  signale @ 1.2.1         -- 717.2 KiB (23 deps)
  update-notifier @ 2.5.0 -- 271.8 KiB (75 deps)
  meow @ 5.0.0            -- 150.1 KiB (46 deps)
  chalk @ 2.4.1           -- 32.2 KiB (6 deps)
```
Soooo, it's a 10KiB terminal todo app with 1MiB of direct dependencies that requires a 100MiB runtime.
No, this will not come naturally after college, trust me. It takes a lot of dedication and practice outside of your normal classwork.
I did competitive programming all 4 years in high school in ACSL and TCEA. Contests were usually 2-5 hours on a team of 3 people with 1 computer. In college I did the same thing a couple of times with ACM.
I dedicated many hours a week during my peak competitive times to practicing problem solving, especially rewriting algorithms constantly and learning how to make them generic to apply to many different types of problems.
The practice pays off. I can approach any code at work with ease, and formulate solutions and design ridiculously fast. Many people often think I'm a genius, but I'm not, I just have extensive experience and A LOT of programming hours under my belt. I highly recommend you keep on this path, it will pay off and definitely help your career as a software developer.
Now in my free time I just like to solve problems on sites like http://projecteuler.net/ and a few others when I see them pop up, and they look interesting and fun. I have never seen Google Code Jam before, will definitely check it out, though I don't have the time anymore for any actual competitions.
For learning the basics of a language, I like to use Project Euler. Yes, it's math oriented, but I still find it very helpful. It's not as daunting as a full-fledged project because the problems are pretty much self-contained.
You don't even need a supercomputer for that. The butterfly effect is a property of chaotic systems, and there are a number of very simple non-linear dynamical systems, such as the Mackey-Glass system or the Lorenz attractor.
They might not look as impressive as you think they should, but they are exactly what the butterfly effect is about. The proof of the butterfly effect is in the definition of chaotic systems. And the fact that there are real-world systems that exhibit this behaviour is in the first paragraph of the Wikipedia article you linked.
But regarding your phrasing of the effect: if you want to simulate an arbitrarily small disturbance, you will run into serious problems with your orders of magnitude. Imagine your simulation contains a variable that is somewhere around 10^-3 and you perturb it by a value of 10^-(10^10^10): your floating-point variables will round off the perturbation (because they can only hold a certain precision) and you will see no effect at all. But that is only a technical limitation and has nothing to do with chaotic systems.
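To see the effect without fancy tooling, integrate two copies of the Lorenz system (naive Euler steps, the standard sigma/rho/beta parameters) whose initial conditions differ by one part in a billion; the trajectories still end up macroscopically apart:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One naive Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)       # perturbed in x by one part in a billion
max_sep = 0.0
for _ in range(5000):            # 50 time units
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    max_sep = max(max_sep, abs(a[0] - b[0]))
print(max_sep)                   # grows from 1e-9 to the size of the attractor
```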
Google hasn't been helpful because no such algorithm exists. Check out Rice's Theorem for the impossibility.
edit:
Let S be the set of languages that you can reduce SAT to in polynomial time.
SAT is clearly in S, and we know some machine recognizes it.
The empty language is not in S (even if P=NP, so that SAT is P-complete), and we know some machine recognizes it.
By Rice's Theorem, no machine decides, when given a machine as input, whether that machine recognizes a language in S.
(we assume that the "any custom problem" input is as a machine encoding)
edit2:
I see that you ask a lot of questions about computational complexity, but do not have a good foundation yet. Many things you propose or ask about already have known impossibility results. May I suggest you have a look at Sipser (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X)? That will give you a better understanding of computability and complexity, and help you understand the feedback you're getting.
Code by Charles Petzold sounds like exactly what you're looking for. It reads more along the lines of a novel, starting at the very beginning (electricity, binary, etc.) and working up through all the parts of a basic computer (including digital logic, assembly, etc.). It's as good as (or, in my opinion, better than) any undergraduate book on OS / computer architecture you'll find. And a much more enjoyable read.
The book that comes to my mind is closer to Software Engineering than it is really to Computer Science.
The Design of Everyday Things. Really a must if you plan on building anything people are going to interact with; not so much for algorithm design.
Cormen et al's "Introduction to Algorithms" is a lot of fun to read despite its heft, and generally stays abstract.
Knuth's "The Art of Computer Programming" series stays sufficiently far from talking about hardware that he designs his own instruction set for the sake of pedagogy, and talks about implementations on that ISA.
But more seriously, most of the reasons algorithms are what they are is because the hardware we have drives them that way. If CPUs weren't designed to execute a single instruction and were instead designed more like neural nets and could do much less complicated tasks in parallel, programming would be very different. If all storage were equally fast, and we didn't have caches faster than memory faster than disk, our constraints on space usage of reasonable algorithms would be very different. If computers didn't really like fixed-size integers, lots of algorithms would look different and van Emde Boas trees would be way less interesting. If quantum computers existed in any practical scale, we wouldn't have a separate subfield of CS for finding quantum algorithms, distinct from "normal" algorithms. (And in thirty to three hundred years, that's probably exactly what will happen.)
The algorithms text I mentioned spends an entire chapter on vEB trees, incidentally, as well as a few chapters on alternative models of computation like sorting networks and stuff.
Do you already know a programming language or two? It couldn't hurt to start learning now. Project Euler is a great place to get some practice in if you don't know what to program.
There are some actually useful things you can do.
Boinc is a project where you can donate CPU time to various computationally intensive research projects; they distribute the computing to volunteers. Running this would create just as much heat as an infinite loop, but for science instead. https://boinc.berkeley.edu/index.php
Another alternative would be to mine cryptocurrencies. It is pretty much futile to attempt to mine Bitcoin without special hardware at this point, but it can be fun/educational and you might make a couple cents! There are also alternative currencies besides Bitcoin that can be mined more effectively on desktop hardware.
This post on Quora offers a reasonable explanation.
>The fact that an asterisk is used when declaring a pointer and an asterisk is used when dereferencing a pointer is not some sort of usability glitch in C. It is intentional.
>There are two other notations that work like this. You use square brackets when declaring an array, and you use square brackets when accessing an array. You use parentheses when declaring a function pointer, and you use parentheses when calling a function pointer.
This is not an analogy about a specific computer science problem, but about the life of a developer in general. I stole it from a guy's signature on Code Project years ago. I always keep a copy of it printed and post it prominently regardless of where I work.
> Imagine that you are hired to build a bridge over a river which gets slightly wider every day; sometimes it shrinks but nobody can predict when. Your client provides no concrete or steel, only timber and cut stone (but they won't tell you what kind). The coefficient of gravity changes randomly from hour to hour, as does the viscosity of air. Your only tools are a hacksaw, a chainsaw, a rubber mallet, and a length of rope.
> Welcome to my world.
> -Me explaining my job to an engineer
Purely functional languages. OCaml with refs is not pure in the strict sense, just as Lisp/Scheme/Racket are not pure. In contrast, Haskell is pure, which makes direct representations of cyclic structures difficult but, thanks to laziness, possible; see, e.g., Tying the Knot and Representing Graph Data Structures in Haskell.
Pro Git is the best I have ever found, and reading it has started to make me the "go to git guy" at my company, which is kinda neat. You can get a hard copy on Amazon, but it's free here for download (pdf + ebook) or web-based. I also have "Git in the Trenches" on my Kindle, and while I've only gotten through some of it so far, it's pretty good: it attempts to build a narrative and teach by example.
Proving algorithms is much harder than coding algorithms. I think you should get good at doing the analysis yourself. It is a very valuable skill.
There are entire books dedicated to string matching. Maybe something similar to your idea is in one of them.
Here are a couple: https://www.amazon.com/Algorithms-Strings-Trees-Sequences-Computational/dp/0521585198/
https://www.amazon.com/Algorithms-Strings-Maxime-Crochemore/dp/1107670993/
Also, in the C code, why are you starting the clock before the “start of program” print statement (instead of after)? Printing text to the screen is a slow process and will add a lot of time to the clock.
You're going to get a lot of seeming know-it-alls telling you what CS isn't:
Generic Person: Oh you like computers? Do you computer engineering.
Generic Person: Oh you like software? Do software engineering.
Generic Person: You should only do CS if you're interested in algorithms and the how-to knowledge of solving problems.
But like... how would you know you're interested in that if you've never been exposed to it? I spent my freshman year as a Mech Eng because I liked Physics and Calculus. They made me take a programming class in Matlab, and it turns out I liked that even more. So sure, you might like computers or software, but that doesn't mean CS isn't for you. It might just mean you're a talented or curious person.
Also, like I said, I started with CS because I loved programming in Matlab. Is CS about Matlab? No. Is it even about programming? Not necessarily. But it was my introduction to the field. Once I got here, I realized how much I also like Algorithms, Structure and Interpretation of Computer Programs, and Machine Learning.
My advice is give it a shot. Heck, spend some time over the summer learning Python and working on a project to see if you like it. Nothing is better than a self study and self project, and to get you started, check out MIT OCW, Coursera, Khan Academy, Google's Python Class, etc. This curiosity and motivation will be the single most important thing for discovering if CS is for you, and if it is this will likely put you far above your intro-level peers.
And not that this should be the reason to join, but the jobs are great.
But I should also note that CS isn't for everyone. It's the perfect fit for me. Could be for you. But maybe the Generic People are right and you should look into one of the engineerings. The moral of the story is that you won't know until you try and CS is a field where there are so many free online resources to help!
The books I keep hearing people refer to over and over again, despite their age are:
You will hear these books referenced as if they're books every programmer is supposed to at least know exist, which is the closest thing I know of to a canon. Now, should you actually read them? Eh, there are better, more entertaining books to learn from. But if you want something to keep on your bookshelf, perhaps as reference, these would be it.
Read these a while ago, but they stand out as causing the biggest leaps for me:
Code Complete, Effective C++, More Effective C++, Design Patterns, The Art of Computer Programming, Compiler Design in C
Edit: While not strictly a coding book, the Mythical Man Month made me a much better overall developer / manager.
This is a good book for undergrads interested in AI: Artificial Intelligence: A Modern Approach
"Omit needless words." -- The Elements of Style. http://www.amazon.com/The-Elements-Style-Fourth-Edition/dp/020530902X
An example rewrite, right from the get-go:
Yours: Let me start off by saying this: I’m terrible at learning new technologies.
Mine: I’m terrible at learning new technologies.
In my version I get to the point right away, instead of forcing the reader to slog over "Let me say this about that before I say anything else and, uh, you still with me here?"
My version has punch. Yours is soft. "Let me say, at the outset, before commencing the rest of the story, that my name isn't really Ishmael, but I'd like it if you call me Ishmael." Versus the classic: "Call me Ishmael." Punch. Precision. Get to the point. Omit needless words.
My suggestion would be for you to buy The Elements of Style (six bucks), read it, then take a sharp red pencil to your prose. The length of your piece will be halved and the reader will appreciate your sharp, to-the-point exposition.
> The Art of Computer Programming by Donald Knuth
I would love to meet the person who can get through all 3.5 volumes of AoCP without a math or computer science degree. That's some seriously heady stuff, and it's not like Knuth takes it slow.
> -- ~~Edsger Wybe Dijkstra~~
-- Ian Parberry and Mike Fellows
Often misattributed to Dijkstra, the original quote is this:
> "Computer Science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers and test tubes."
>
> Michael R. Fellows and Ian Parberry, Computing Research News, 5(1):7, 1993
Overleaf is a great online LaTeX editor. I tend to use it because it makes giving the document to someone else to edit so much easier.
It can also sync with git if you'd prefer to version control it that way!
One way is to bundle the runtime with the program. So there's a subfolder (or it's directly bundled into an executable) in the application directory that gets used.
For instance, you could use launch4j to bundle a Java program and the runtime into an exe.
Another way is to cross-compile to a different language or use an alternative compiler that compiles directly to machine code, which exists for some languages, e.g. Excelsior JET for Java.
Hey! I highly highly highly recommend using https://www.sharelatex.com - it doesn't require you to download anything, it's lightweight, you can save stuff as you go, it compiles as you go (just press compile) so you can see what you're doing AS YOU DO IT. Best way to learn to use LaTeX, IMO.
This is what I've been using for my discrete math HW, I haven't been writing notes in it because I still don't have all the symbols memorized though! High five and good luck in the class, I'm hoping to do decently myself haha
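As a starting point, a minimal LaTeX document with a typical discrete-math formula might look like this (the package choices are just one common setup):

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
For all sets $A, B$:
\[
  A \subseteq B \iff \forall x\,(x \in A \implies x \in B).
\]
\end{document}
```

Most of the symbols you'll need for a discrete math course (`\forall`, `\exists`, `\in`, `\subseteq`, `\implies`) come from amsmath/amssymb.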
I found Andrew Tanenbaum's Structured Computer Organization really good for getting a thorough introduction to computer hardware, from logic gates through instruction set architecture to assembly language.
Product lifecycle, development process, testing and quality. Books such as "Code Complete" and "The Pragmatic Programmer" plus classics like "Mythical Man Month" and "Peopleware".
That plus an education in what not to do (don't recreate CORBA, "Eight Fallacies of Distributed Computing", etc.) will serve you well. This leads to suggesting books like "Effective Java" or "Effective C++" and "Design Patterns".
I often feel that an emphasis on algorithms is too abstract and not tied to good judgment. Having the perfect data structure is often very weakly correlated with delivering a success. There are a multitude of factors in building a software system that must be balanced and prioritized, of which algorithm choice is just one. I always keep this story in mind.
The Art of Computer Programming (TAOCP) by Donald Knuth.
Otherwise, you're right. You'll need to find entire books that focus on single topics that you're interested in.
This course on Coursera is really good too. It takes a lot of work, though, and started about 3 months ago, but I think you still have access to the lectures and exercises: https://www.coursera.org/course/matrix
Creator here! I am excited to share this AR application I made on snapchat's AR platform, and I wanted to see what people think, any bugs they find, and any doodles you all create!
Link to try the snapchat experience here: https://www.snapchat.com/unlock/?type=SNAPCODE&uuid=ecc379949d1341719936b365291b16d3&metadata=01
Try some Project Euler problems. You can use it as a way to try out new languages too. Some of the problems, for example, might be a lot easier in a language with list comprehensions (e.g. Scheme, Haskell, Python, etc.) and convenient higher-order functions.
This is purely a guess based on the fact that you're posting in /r/CS about an intro to Java...please forgive/correct any erroneous assumptions I've made.
If you're entering a computer science curriculum where the first language that gets taught is Java, you have bigger things to worry about than learning Java. Most likely the first week or two of classes will be "This is the IDE you'll be using. This is how to open it. This is how to save files. This is how to write Hello, World. Ignore all the complicated words like 'class' and 'void', we'll explain them later. For now, just copy it down exactly as it is on the slide. This is how to compile and run it. If you get a compilation error, raise your hand and a TA will show you where you forgot to put the semicolon."
You'll be better served in the long run by teaching yourself Python, or C, or almost any language other than what you're currently learning in school. If you're bored with your homework assignments (and I can almost guarantee you will be), pick another language and solve your homework again in that language. Or, go on Project Euler and solve problems that look interesting.
The key is, don't focus on learning Java, focus on learning programming. Java is one of the tools you have available to do that. Then, you can focus on learning computer science, with programming as one of the tools you have available.
Exhaustively checking for correctness seems awfully close to proving correctness of your implementation. There are provably correct implementations of AVL trees (implementations with proofs that the implementations adhere to some specification). For example, AVL trees have been formalized in the Coq proof assistant: https://coq.inria.fr/library/Coq.FSets.FMapAVL.html
Spend a lot of time testing yourself, instead of just reading.
To take it to another level, use software that tests each item at increasingly long intervals. This article is a great introduction. There's also open source software for it, such as mnemosyne. To use this effectively you have to do it every day.
This might seem out of left field but I've actually found the python library matplotlib (along with numpy and scipy) really easy to use for quick graphing of datasets; much easier than learning a new language (R) or fighting with Excel (I believe my datasets are too large)
I think The Art of Computer Programming mentions that feature in its preface. At least, Volume 1. I never got to the MIX parts of the book, but, considering the author is Knuth, you probably don't actually have to run very many—if any—programs.
Well as you can kind of see, Comp. Sci is made up of a lot of fields. There aren't "canon" books because the science isn't necessarily divided by the fields, so there are a lot of overlaps and unsurprisingly, a lot of great books. Personally I think there are 5 major fields and I'll give some books I've heard of or have read:
Computability and Formal Methods: Ullman's Automata Theory is hands down the best introduction to the field. The dragon book is a great complement to it as well, as automata theory and compilers go hand in hand. Eventually you should read Turing's Entscheidungsproblem paper.
Complexity Theory: Not sure an undergrad knows enough to go down this path just yet, and I don't think many CS undergrad programs focus on this much, but maybe a good intro would be just reading Shannon's A Mathematical Theory of Communication.
Data and Algorithm Analysis: CLRS. Also, the Sedgewick and Flajolet Algorithm Analysis book is a good introduction to analytic combinatorics, which is great if you have the calc chops.
Programming Languages: I know nothing about this area, but I know people research it rather than just write programs. I only read through TTFP by Thompson and thought it was OK. Nipkow and Klein's Concrete Semantics was on my list, but I haven't read it. Not sure if it's undergrad-ready, though.
Software Engineering: Mythical Man Month was great. I tend to consider this a field because maintenance of code is why industry cares, so it should be considered a field. Dependency management, team organization, version control, etc. are all important. I think people like Code Complete as well.
Great book. Fantastic book. Most underrated book in computer science. Read it from Chapter 1. If you're not sure you understand the algorithm, then try implementing it in a different language. I'd recommend a language you know well and that's in the same paradigm as the language of the book (like C, Java, or Python). I'd avoid languages you don't know well and languages that are too different from the language in the book.
I wouldn't stress the war stories too much - I think the author is only trying to emphasize that algorithmic problems often come in somewhat disguised form (only once has someone come up to me and said "I need to know the largest subset of these integers whose sum is 0").
Finally, with respect to CLRS, it could be a gentle introduction to read about an algorithm in The Algorithm Design Manual, then implement it in a different language, then read about it in CLRS. You'll already know what you're going to learn from the section, so you'll be able to focus on how they formalize it.
I'm getting excited to do this process again myself (I did it a few years ago to prepare for interviews). So fun!
Math scared the crap out of me too, but I loved CS and decided to give it a shot. Made it out OK. A BS comp sci degree isn't quite as math heavy as you might think.
As for the rest... recently I found out about Khan Academy, and I'm trying to use it to recover from my terrible high school math education.
If you want some really fun programming problems I suggest doing some challenges from http://projecteuler.net/. The programs are all mathematically oriented similar to your prime number finder. You can also complete those problems in any language, which really helps to understand syntactical differences.
For those who don't want to sit through Ng's course material, these notes are (seemingly) pretty good: http://www.holehouse.org/mlclass/
I'd spend time going over some finite math and algebra MOOCs or tutorials or videos instead TBH.
Also, this book is a great intro that doesn't really assume much of anything, and introduces the required math and stats as necessary. After cramming through it and understanding what's going on in a black-box manner, you can revisit the concepts more deeply after the fact: http://www.cs.waikato.ac.nz/ml/weka/book.html
Structure and Interpretation of Computer Programs, lectured by Sussman and Abelson themselves. These lectures were recorded in 1986 for an audience of professional computer programmers (staff of HP, iirc), not freshmen, so it has a little bit of a different approach to it.
First lecture: https://www.youtube.com/watch?v=2Op3QLzMgSY
Really? Structure and Interpretation of Computer Programs is a "fun to read" book to help someone "think in a logical computer kind of way"? Give me a break. I'm sure someone will come along and suggest you read through the Knuth series, too.
I'd suggest something like this:
The Tinkertoy Computer and Other Machinations, by A.K. Dewdney.
Dewdney used to write a column in Scientific American in the '90s called "Computer Recreations," which explored interesting computer science ideas for a non-C.S. audience. This book is a collection of some of those columns (there are a few other collections as well). I used to read the column, and I've read much of this book and others, and they are fun and interesting. They are not about programming directly, but explore algorithms and ideas in an accessible way. Each article also stands alone, so you can thumb through and pick out an interesting chapter to read without having to wade through the ones that don't appeal to you.
If you want to learn about programming, computer architecture, etc., there are some excellent textbooks, but if you want to explore the CS way of thinking, I'd go for something written with that in mind.
Don't go for a 'For Dummies' book. I'd recommend starting with an old MIT course, 6.001. It uses a book called Structure and Interpretation of Computer Programs, commonly called the Wizard Book. Along the way it teaches you a language called Scheme, which is (in my opinion) a great programming language with a number of innovative features. However, the focus is on teaching you how to think like a programmer. The book and lectures are both available online.
Well, a big part of it is caused by the speed of light. Wolfie to the rescue. Conveniently, they include the time it takes light to travel that distance in fiber: 75.9ms. And that's only one way. Ping measures both ways.
Additionally, fiber doesn't go straight. And there are some routers in between too. Each of them adds up a few milliseconds.
In other words and to answer your question: Physics.
I hate to be a hater but you probably won't be hired for a job you aren't qualified for, and if you are it won't last long. Also, try r/programming. r/compsci is for in depth scientific discussion on computer topics.
As a redeemer, if you want to learn algorithms, do Project Euler.
This isn't really a computer science question. You may want to consider /r/CompTIA or /r/techsupport
That said, your best bet might be eBay or Craigslist. You may also be able to convince someone at Best Buy or any other IT shop to give you an old PC, but that's unlikely.
Additionally, when I got my A+ cert a few years back, the videos on this site were a huge help: http://www.professormesser.com/free-a-plus-training/free-a-plus/
Idk about books, but look up "8 bit breadboard computer" on YouTube.
Wait... I'll do it.
Building an 8-bit breadboard computer!: http://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU
Oh and I guess the algorithms bible:
Introduction to Algorithms, 3rd Edition (MIT Press). ISBN-13: 978-0262033848, ISBN-10: 0262033844
For "algorithms lite" check out: Algorithms to Live By: The Computer Science of Human Decisions
ISBN-13: 978-1627790369, ISBN-10: 1627790365
One book that I don't believe has been mentioned is the one recommended by Steve Yegge, The Algorithm Design Manual. Take from that what you will.