The Computer Science Handbook: First Draft [pdf] (thecshandbook.com)
144 points by StylifyYourBlog on Feb 7, 2015 | 43 comments


This is algorithms, which while useful is not even the majority part of a computer science education, as I understand it.

My CS degree involved image processing, graphics, operating systems, systems programming (low level programming), programming language theory, discrete math, linear algebra and statistics, just off the top of my head.

Interestingly programming is actually not a big part of a degree (again, as I understand it.) It takes many years to become a good programmer, and it would be a waste to dedicate an entire 4 year degree to just that.


This was also my experience. Years 1 and 2 covered general programming concepts like data structures and algorithms. Years 3 and 4 transitioned into lower-level topics like compiler design, some embedded work, and lots of math. However, as most CS depts at the time were relatively young, they had gaps to fill, which usually meant adding more math courses to the curriculum to meet credit requirements.

I am sure these days CS depts are far more comprehensive than 15, 20 years ago and can offer many more courses in advanced programming topics (for example big data, enterprise, security, distributed, project management, mobile, etc).


interesting! I had not really given much thought on how distributed computing or enterprise architectures would fit in a modern curriculum...

I don't think that math should be considered filler though - it was very fundamental to my degree! In fact the first year of my comp. sci. degree was /exactly/ the same as the first year engineering curriculum. It was almost entirely math, with basic programming (just pascal) which the engineers also needed for their degree.

As such I'd say that math is still very prominent in most comp sci degrees

edit: in fact, as chance has it, I was just today going over some old lecture notes from my third year, and even the programming courses were extremely heavy on inductive reasoning to demonstrate how, for example, Big-O notation worked for different algorithms (i.e., actual formal proofs)


The title is much broader than what's listed in the table of contents, which is primarily what you'd find in one course (algorithms) from a CS undergrad course of study.

For an online text that covers similar stuff, see http://interactivepython.org/runestone/static/pythonds/index... .

The last "interview" chapter is about getting a job, not about CS itself.

A good starting spot for the topics in "computer science", at least at the undergrad level, is the ACM curriculum ( http://www.acm.org/education/CS2013-final-report.pdf ).


Which language is better for studying algorithms: C or Python?


Python

You don't have to worry about memory management, pointers and the like, and can just focus on the algorithms.

All that stuff is super important to know too, but it's probably easier to learn about it separately.


I looked at the implementation of a linked list in the Python book above; there it's done with objects instead of pointers. To be honest, the idea of pointers seems more straightforward in this case: one node "points" to the next, while the idea of an object seems a little abstract to me. It took me a moment to understand that an object is just a memory location (a pointer) to "something", and it's less clear what that "something" is. It probably took me just as long to see the relationships between a pointer and an array or a struct when I was learning C, but I'm afraid that at this point, to understand Python data structures, I have to mentally translate them into C data structures.
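(Not the book's exact code, but a minimal Python sketch makes the analogy concrete: each node's `next` attribute is a reference to another node, which plays exactly the role of a C pointer.)

```python
class Node:
    """A singly linked list node. `next` is a reference ("pointer")
    to the next Node, or None at the end of the list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(node):
    """Walk the chain of references, just like following pointers in C."""
    out = []
    while node is not None:
        out.append(node.value)
        node = node.next
    return out

# Build 1 -> 2 -> 3 by chaining references.
head = Node(1, Node(2, Node(3)))
```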


I would suggest C only if you are a reasonably proficient C programmer. Actually, even then, I would suggest using Python.

Choosing any language means that you are pigeonholed into using the DS that the language supports and have to think about things from the language's perspective. But just the ease of writing and experimenting with Python code makes it a great language for writing algorithms.

e.g. https://github.com/pratikmallya/interview_questions


But can't this be viewed as an advantage of C? The lack of suitable data structures would force me to implement them first myself, and then use them to build algorithms.

Am I missing your point?


Just to add a third voice. Python. For all the reasons already stated.

Personal story. I started programming with C. The day I picked up Python (many years later) I knew that this was the language that any university should have used to get students started with programming/algorithms.


I work in machine learning. I have to know both Python and C (C++). I feel like whatever language I will choose for learning algorithms, I will end up knowing better.

If you could choose, looking back, which language would you prefer to know better: C or Python?


That's a really tough question. If I had to pick one, I'd take C, simply for the reason of employment. I can't think of any other reason to pick one over the other. My choice of Python as my go-to language has bitten me in the sense that I see very few job postings listing Python as the required skill. Language isn't really important, but as long as the world spins in a particular direction one has got to live with it.

On the same note of employment however, C opens up a completely different line of work in robotics and electronics related stuff which is something I really want to get into.


Interesting. Thanks. In ML field, from the point of view of employment opportunities, I'd say the real "language to know" is CUDA. Which unfortunately still leaves the choice of C vs Python: https://devtalk.nvidia.com/default/topic/540772/preferred-la...

I wish there was a book on algorithms that would display both C and Python code side by side!


c) English


I'm afraid my computer is not smart enough (yet) to understand English! :-)


People used to test programs on paper ... your imagination is at least as mighty as a PC, albeit often slower. English is badly defined; I was thinking more of logic. The languages don't really matter; they all share the same fundamentals. For algorithms, FP seems just as well loved: http://mitpress.mit.edu/sicp


Sure, but to know the algorithms is not enough. I remember I was asked to implement a certain prime numbers search algorithm in C, and I spent many hours trying to wrap my head around how to do it, even though the algorithm itself was perfectly clear to me. Not only that, I then spent several more hours trying to make it faster, which is also language specific. So the language does matter in my opinion.
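(The exact algorithm isn't named above; as one illustration of the kind of prime search in question, here's a Sieve of Eratosthenes sketch in Python.)

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: return all primes <= n."""
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Mark multiples of p, starting at p*p; smaller multiples
            # were already marked by smaller primes.
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]
```

In C the same sieve needs you to decide how the flag array is allocated and freed, which is exactly the kind of language-specific work the parent comment describes.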


> easy to read without any math or computer science background

> you are already familiar with Java or C++ syntax

not sure you will have too much success hitting your target demographic of "people who are ignorant of computer science, yet are experienced programmers"


After conducting many interviews of candidates I'm pretty sure that description fits at least 60-70% of experienced programmers.


Can someone without knowledge of basic data structures and algorithms be called an experienced programmer?


The book covers more advanced data structures too. You'd be amazed by the experienced developers who use basic arraylists, and occasionally hash tables, without any idea of how they actually work. They get systems to work without knowing the underlying details, which for some industries is seemingly considered ok.


Not sure if OP is author, but this seems like a decent start at a useful compilation. It's obviously highly focused on data structures and algorithms, so the title is a bit misleading.

Strange - not a single citation/reference?


Pretty average content, and not academically strong enough: it lacks the depth of proofs and detailed technical explanations. I wouldn't call it a computer science book - it seems like a concise data structures and algorithms guide. There are tons of books if you're serious about algorithms or data structures, and those alone won't make you a computer scientist.

Computer Science is a big field that spans many areas of programming, theory and research.


It's enough CS for most practical purposes as a programmer (judging only from the table of contents so far), which is probably the target audience.


The treatment of Big O notation is not only misleading (and poorly conveyed) but wrong in several ways. Big O is an upper bound that does not need to be tight. The table displaying the "limit of N for 1 second (standard processor)" and the accompanying note that the chart will eventually become outdated is manifestly wrong and misleading. Big O ignores constant factors and so no such comparisons can be made. For any particular duration of time (or for any particular number of elements), an algorithm with complexity `O(f(N))` may be faster than one with complexity `O(g(N))` regardless of `f` and `g`. Big O is not something that can obsolesce.
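To make the point concrete, here is a small counting sketch (illustrative, not from the book): on a favourable input, insertion sort (O(N^2) worst case) performs fewer comparisons than merge sort (O(N log N)), so no "N per second" table can follow from Big O alone.

```python
def insertion_sort_comparisons(xs):
    """Count comparisons made by insertion sort (O(n^2) worst case)."""
    xs, comps = list(xs), 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            comps += 1
            if xs[j - 1] <= xs[j]:
                break  # already in place; only one comparison needed
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return comps

def merge_sort_comparisons(xs):
    """Count comparisons made by top-down merge sort (O(n log n))."""
    comps = 0
    def sort(a):
        nonlocal comps
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = sort(a[:mid]), sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comps += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:]); merged.extend(right[j:])
        return merged
    sort(list(xs))
    return comps

# On already-sorted input, insertion sort does n-1 comparisons while
# merge sort still does its full ~n/2 * log n work.
sorted_input = list(range(16))
```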

Also, it is not necessary that there be a base case for recursion (only for well-founded recursion). For instance, the Haskell definition

  repeat :: a -> [a]
  repeat x = x : repeat x
is a recursive definition but it has no base case. Of course, there can also be multiple base cases or other more complicated structures.

Saying unconditionally that all operations for a hash set or hash map are O(1) is wrong.
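A small illustrative sketch (relying on CPython's dict behaviour) shows how a pathological hash function degrades lookup from the expected O(1) to a linear scan of colliding keys:

```python
class BadHash:
    """Keys whose hash always collides; counts equality comparisons."""
    eq_calls = 0  # class-level counter of __eq__ invocations

    def __init__(self, n):
        self.n = n

    def __hash__(self):
        return 0  # pathological: every key lands in the same bucket chain

    def __eq__(self, other):
        BadHash.eq_calls += 1
        return self.n == other.n

# Build a dict of 100 colliding keys.
table = {BadHash(i): i for i in range(100)}

BadHash.eq_calls = 0
result = table[BadHash(99)]        # lookup degenerates to a linear scan
lookup_comparisons = BadHash.eq_calls
```

With a well-distributed hash the same lookup would do O(1) comparisons on average; "O(1) for all operations" only holds in that average, well-behaved case.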

Opening quotation in LaTeX is accomplished by "``".

I also think that the comparison between the human brain and CPU is completely unjustified. Given that most people could not remember the sequence of results of 32 coin tosses, why shouldn't I say they have no greater than 4 bytes of memory? (For myself, I think the most appropriate unit of memory is "10 seconds of commonly spoken English language").

There are already so many terrific sources for learning algorithms that I don't understand why the author created this book. It is not only inaccurate, but more difficult to understand than other resources I have come across (e.g., Coursera).



I could use a resource along these lines, especially with all the programming tests I'm facing during my job hunt. Thanks.


I like it. It's written by students so it's pretty easy to read. This seems like it could evolve into a students' version of 'Foundations of Computer Science' by Aho, Ullman - http://infolab.stanford.edu/~ullman/focs.html


This theory + a tutorial about testing best practices + 20 weekend projects would be a really good way to learn how to code.

Thinking about stacks, trees, and graphs can go a long way to build up learners' ability to simulate what the computer will do, e.g., getting the steps right for breadth first search in a graph is a rite of passage.
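As an illustrative sketch (not from the book), that rite of passage looks like this in Python; the classic subtlety is marking vertices as visited when they are enqueued, not when they are dequeued:

```python
from collections import deque

def bfs_order(graph, start):
    """Return vertices in breadth-first order from `start`.
    `graph` maps each vertex to a list of its neighbours."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)   # mark on enqueue to avoid duplicates
                queue.append(w)
    return order
```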


page 9: I think there is a mistake "However, the human brain can contain much more memory than humans."


Indeed this document could use a copy editor.


I appreciate the effort, but I think Foundations of Computer Science covers the same material, only better and more rigorously: http://infolab.stanford.edu/~ullman/focs.html


Research says that interactive feedback is more effective than static content. I recommend the OpenDSA project.

http://algoviz.org/OpenDSA/

I do appreciate accessible text though - worth looking into.


this is cool


Looks good so far. I would have loved this as an intro. before starting my CS degree; it might've even been useful for my UK A-level course (age 17-18).

I think there's an error here:

  string[1..3] = 'abc'
  string[1..1] = ''


I think you also have one too few '+1's in the penultimate step on p172.


I'm sorry. If I am going to teach this, it must be a comic book.


I would buy The Manga Guide to Computer Science in a heartbeat.



No, but thanks for the suggestion. I do own the Physics, Electronics, and Calculus ones, if memory serves.


Is it supposed to be missing practice problems at the end?


Have you read "Head First Java" by Kathy Sierra? A Data Structures & Algorithms book written in the "Head First" format might be pretty cool.

I wonder if you could re-format your book in that manner?


You should really consider something other than LaTeX as a publishing platform, or make it look better, when taking it from draft format to publishing format.

This doesn't communicate that algorithms are fun. An algorithm book should be like a magician's show, really, with fun problems to apply the algorithms to.

I also note that there aren't any backreference links to topics, and that at least one topic, heaps, is missing.

I am actually very fond of Robert M. Sedgewick's books (second edition), and Donald E. Knuth's monumental accomplishment. Those books are fun; most books concerning algorithms are not as fun as they should be.

I am picky, I guess: I want fun exercises or presentations, but also accurate details and meticulous explanations.



