
Not sure I agree. Usually, mathematics is developed backwards - you have a concrete question that you want to answer, and you reason that it can be answered so long as such and such is true. The formal system is developed post hoc to give yourself a language to reason in, but you're really trying to take the result you already "knew" to be true and find the least restrictive system description to which it still applies. Then you look for parallels and generalizations. The abstractions are in a large sense the most important and difficult thing to create, because you're creating a schematic and saying "if you can rephrase your problem into these terms, then I already proved this intuitively correct thing for you, so you don't have to worry about edge cases". Algebra, geometry, calculus, etc. all follow this model. A lot of the conclusions in, e.g., analysis are obvious once you impose continuity.

The object of mathematics is to produce abstractions that make proofs possible or trivial. Sets, fields, groups, categories, functions, integers, reals, complex numbers, quaternions - these aren't notational conveniences introduced to make things easier to talk about. They're important because if you have something with the properties of a group, you know some powerful truths about it for free. If you change anything about the definition of a group, the set of truths you know changes. People have been grinding on what the definition of a set should be for a century, trying to build the best possible abstraction, and they have all of the same problems coders have. Assume too much, and it's not very general. Don't assume enough, and there's nothing interesting to say that's universally true.
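The "truths for free" point can be sketched concretely. In the toy code below (my own illustration, not anything from the thread - the `Group` container and `solve` helper are hypothetical names), any structure satisfying the group axioms automatically lets you solve a * x = b uniquely as x = inv(a) * b, with no per-structure edge cases:

```python
# A minimal sketch of "group axioms give you theorems for free".
# The Group record and solve() are illustrative, not a real library API.

from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

T = TypeVar("T")

@dataclass
class Group(Generic[T]):
    """Anything with an associative op, an identity, and inverses."""
    op: Callable[[T, T], T]
    identity: T
    inv: Callable[[T], T]

def solve(g: Group[T], a: T, b: T) -> T:
    """Solve a * x = b in *any* group: x = inv(a) * b.

    This works, and the solution is unique, purely because of the
    group axioms - no knowledge of the specific structure needed.
    """
    return g.op(g.inv(a), b)

# Integers mod 5 under addition form a group.
z5 = Group(
    op=lambda x, y: (x + y) % 5,
    identity=0,
    inv=lambda x: (-x) % 5,
)

x = solve(z5, 3, 1)        # solve 3 + x ≡ 1 (mod 5)
assert z5.op(3, x) == 1    # the "free theorem" holds: 3 + 3 ≡ 1 (mod 5)
```

Swap in any other group - permutations, invertible matrices, symmetries - and `solve` keeps working unchanged, which is exactly what the abstraction buys you.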

So I would argue that characterizing math as the study of abstraction is largely fair. Some of the time you "see" that a result is probably true, and then you try to find out just how general you can make that statement.



Thanks for the response.

> People have been grinding on what the definition of a set should be for a century, trying to build the best possible abstraction, and they have all of the same problems coders have. Assume too much, and it's not very general. Don't assume enough, and there's nothing interesting to say that's universally true.

This is a great point.



