This is only really half the story. Modern computer science is an intermingling of two traditions, one with an engineering heritage and one with a mathematics heritage, and the people quoted here are all firmly on the math side. The intermingling has only partially taken place, too: you can often still tell from a particular CS department's focus whether it broke off from an engineering department or from a math department.
Which makes sense to a certain extent, because if you want to reduce all of computation to something, "machines" and "mathematics" are both things it reduces to, depending on how you look at it. Everything at its base is lambda calculus, but everything at its base is also machine code and bits in registers.
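The "everything at its base is lambda calculus" half can be made concrete even inside an ordinary language. A minimal sketch (my own illustration, not from any particular text): Church numerals encoded as plain Python closures, so arithmetic reduces to nothing but function application.

```python
# Church numerals: the number n is "apply f, n times".  Everything
# here is just functions -- the lambda-calculus view of computation.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(church):
    """Observe a Church numeral by counting how often f is applied."""
    return church(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(add(three)(three)))  # 6
```

Of course, the interpreter running this is itself machine code and bits in registers, which is rather the point.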
"... but everything at its base is also machine code and bits in registers."
So far.
I don't have any examples, but my gut tells me that it would be much easier to replace the "machine code and bits in registers" part with something else than to replace the lambda calculus part.
That, I believe, is where the art of computer programming kicks in. You somehow have to bridge the two things you said CS reduces to in order to satisfy the "smart + gets things done" equation. You have to think both in terms of computational abstractions and in terms of how to fill your memory chips with the right bits. And this bridging is definitely a human thing: an art, a skill, a craft, and a process with few definitions, none of them conclusive.
I think iii is the most interesting of those. How to succinctly describe formal systems is a very hard topic. The kind of mathematical notation we use in classical mathematics (algebra through calculus) is an amazing achievement developed over centuries (most of it since the 14th century!). But it only describes the existence of a calculation to be performed, not the actual method of calculation (the algorithm).
Our ability to represent the methods of that calculation is extremely primitive. I suppose lambda calculus comes pretty close, as do most modern programming languages. Upper- and lower-bound notation (like big-O) isn't bad as a kind of descriptor, but it's very restrictive in what you can do with it. I've seen certain kinds of state notation that are interesting-ish. But nothing approaches the elegance, flexibility (it's easy to abuse in good ways), extensibility, simplicity, and efficiency of math notation (though even that system falls down when doing lots of matrix calculations by hand). It beats the pants off any other system I've seen for general mathematics.
Most of the things that we deal with in CS are so big and complex that even diagramming them by hand would take longer than we'd want and require more paper than we have. In math we have some nice symbols to bundle all that complexity up, like pi, or summation, or limit notation. In CS we lack that. Most of the time we try to get by with toy examples, like graphs with half a dozen nodes, or small trees only a couple of levels deep with different colored lines or some such. Most of it seems to be diagrams and tables... lots and lots and lots of diagrams and tables. Regular expressions seemed about as close as we ever got to describing something succinctly. Learning how to manipulate automata just from the regexes was a very cool experience.
I remember distinctly how frustrated my CS professors were, compared to my mathematics professors, when trying to explain some concept on the board. I don't ever recall seeing a nice symbol that represented, say, a tree, with various super- or subscripts that succinctly described all of the properties of that tree and the algorithm we were going to use on it. Imagine how powerful it would be to write down, with just a few strokes, a short series of symbols that unambiguously defined a red-black tree and a search algorithm with such-and-such min, max, and average time complexity, as just a simple operator! Maybe something as simple as Y(rb) + s(log n, log n, "search term") = [s|earch term] or some such! But alas, I'm just falling back on a slightly bruised mathematical formal notation, not really anything new.
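Short of inventing that notation, code plus stated invariants is about the closest thing we currently have. A hypothetical sketch of just the search half (the red-black rebalancing that actually guarantees the O(log n) bounds is omitted):

```python
# Search in a binary search tree.  Nodes are (key, left, right)
# tuples or None.  In a balanced (e.g. red-black) tree of n keys this
# runs in O(log n) worst case; the rebalancing is not shown here.

def search(node, key):
    while node is not None:
        node_key, left, right = node
        if key == node_key:
            return True
        node = left if key < node_key else right
    return False

# A small hand-built balanced tree holding the keys 1..7.
tree = (4, (2, (1, None, None), (3, None, None)),
           (6, (5, None, None), (7, None, None)))
print(search(tree, 5))  # True
print(search(tree, 8))  # False
```

Fifteen lines instead of a few strokes of an operator, which is rather the commenter's point.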
Given how rudimentary the tools are we have to work with, it's amazing how far we've come in so short a time.
Treating Computer Science as a homogeneous, monolithic concept is probably a mistake. Asking whether lambda calculus or relational databases or compiler design are truer computer science is somewhat akin to asking whether algebra or topology or calculus is the truer math. The answer, in both cases, is "yes."
The main difference between math and computer science in this respect, if there is any, is that math historically has a divergent evolutionary path, starting as one discipline and spinning off subdisciplines, while computer science has a convergent evolutionary path, starting from the very different folds of math and electrical engineering and incrementally bringing the two sides closer together. Computational theory and formal verification belong on the math side, compilers on the engineering side (Haskell being used in production code may be the final merging).
The roots of Computer Science lie in Electrical/Electronic Engineering and Maths. Both groups contributed tremendously (as did a great number of other disciplines that saw its potential as a tool). My first contact with computers was with an experimental CNC machine (as a Mechanical Engineer); the Electrical Dept. assigned someone from the Department to give the lectures (early 80's in the UK), and that hooked me on computers :) At the same time the Electronics Engineers had quite a few options to take Computer Science courses.
However, just reading the disclaimers of liability that routinely accompany software bears witness to how far software engineering lies from the "established branches of engineering." No Engineer worth his salt in another Engineering discipline would ever dream of getting away with such a disclaimer! As such, the field has been hijacked away from Engineering somewhere along the line :)
About the difference between computer science and Informatik (UK/US vs. Europe):
As others have noted: in Europe "computer science" is not taught at universities; Informatik is. (One could argue whether the courses that "Fachhochschulen" offer teach "computer science".)
Reading what lectures in computer science (in the US) are about, it seems "computer science" really is a lot about computers and writing software. Informatiker, by contrast, do not necessarily work with computers. Often they do, because computers are handy for working with information, but the computer itself (its software and hardware, how it works) is not a part of all Informatik studies (it is of some).
I'm no native English speaker, but from what Wikipedia tells me, "informatics" (http://en.wikipedia.org/wiki/Informatics) is actually this very broad field we mean by "Informatik". For example, at the Technical University of Vienna there are eight different, specific sub-disciplines you can do your master's degree in ("media informatics", "medical informatics", "software informatics", "technical informatics", "computer graphics"; my own translations, sorry).
In the cases where there are no physical-world measurements, as with the inputs and outputs of many machine learning problems, it is math. "Experiments" that run randomized algorithms and measure the results certainly don't, by themselves, qualify as science.
In the twentieth century we learned that math is bigger than we can possibly handle with anything like traditional mathematical rigor. There is no contradiction in saying one is running an experiment on math when it is not possible even in theory to obtain the result of the experiment through any means more efficient than simply running the experiment. And given Computer Science's focus on things that are, generally, Turing Complete, proving that that is frequently the case is a sophomore-level homework problem in any decent curriculum. (See "Rice's Theorem".)
The idea of "math" you get in school is not wrong, but very incomplete. "Experimental math" is a perfectly valid field of study; mathematics itself proved it, about a hundred years ago. Theories are made, predictions are given, and there's no way to prove or disprove them until the experiment is run, at which point you still only have evidence, not proof (in general); sounds pretty scientific to me.
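A concrete instance of "running the experiment": the Collatz conjecture, where (as far as anyone knows) there is no shortcut to simply iterating the map. A minimal Python sketch:

```python
# Collatz: repeatedly apply n -> n/2 (if even) or 3n+1 (if odd).
# The conjecture says every positive start reaches 1; no proof is
# known, so checking a given n means literally running the process.

def collatz_steps(n):
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111
```

The answer for 27 (111 steps, peaking at 9232) is evidence about the conjecture, not proof, which is exactly the scientific flavor the parent describes.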
Things that are "computer science" but I contend are not mathematics: programming language design and implementation, operating system design and implementation, computer architecture and networking.
All of these things depend on math. But they go beyond the scope of math because they deal with the realities of designing and building systems for computation. Physics also relies on math, but I've never seen anyone make the reductionist statement "Physics is a branch of mathematics."
> programming language design [...] All of these things depend on math.
Really? I'm currently designing and implementing a programming language, and I haven't (consciously, at least) used any maths concepts more complex than arithmetic.
Some aspects of programming languages are very math heavy, things like type theory and lambda calculus. It's accurate to say some of the field of programming language design and implementation depends on math. Hence, I'm willing to say the field in aggregate has that dependence, even if some other parts of the field do not have an explicit one.
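The type-theory dependence shows up directly in implementation. A toy checker for the simply typed lambda calculus (a hypothetical sketch; term and type encodings are my own):

```python
# Type checking the simply typed lambda calculus.  Terms:
# ("var", x), ("lam", x, type, body), ("app", fn, arg).
# Types: the string "int", or ("fun", arg_type, ret_type).

def type_of(term, env=None):
    env = env or {}
    kind = term[0]
    if kind == "var":
        return env[term[1]]          # look up the variable's type
    if kind == "lam":
        _, x, ty, body = term
        ret = type_of(body, {**env, x: ty})
        return ("fun", ty, ret)      # abstraction builds an arrow type
    if kind == "app":
        fn_ty = type_of(term[1], env)
        arg_ty = type_of(term[2], env)
        if fn_ty[0] != "fun" or fn_ty[1] != arg_ty:
            raise TypeError("argument type mismatch")
        return fn_ty[2]              # application eliminates the arrow
    raise ValueError("unknown term")

# The identity on ints, λx:int. x, has type int -> int.
identity = ("lam", "x", "int", ("var", "x"))
print(type_of(identity))  # ('fun', 'int', 'int')
```

Twenty-odd lines of the math-heavy side of the field, sitting right next to the parts (parsers, code generators) that barely need it.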
In English, "information science" (http://en.wikipedia.org/wiki/Information_science) is usually more closely associated with classification and resource collection management, such as archival work and library cataloging.
I like "informatics" though. It is used in "bioinformatics", at least.
Even in England it used to be commonly called Informatics. I like Informatics/Informatik/Informatique etc. much better than "Computer Science", because most applications are not really about computing, but about information.
"A different way of (mathematical) thinking" is too vague a definition for me. In my mind CS is the study of how to precisely define a problem and how to precisely define a process to solve it.
That's mathematics, not science. There's more to CS than algorithms. Even heuristics wouldn't fit your definition, never mind any kind of research like machine learning or human-computer interaction.
I love this quote. My interpretation is that he's saying computers are merely tools for exploration, just as telescopes are merely tools to the astronomer.