I'd argue that mathematics is somewhat falsifiable. For example, in our system, 1 and 1 makes 2. However, if we observed that adding 1 and 1 made some other result I suspect our math would be entirely different.
Sorry, but your reasoning is flawed. In mathematics, 1 and 1 isn't simply 2. Under certain rules of mathematics it is; under other rules it isn't. One such example is arithmetic modulo 2, where 1 + 1 = 0 (and "2" isn't even an element of the system).
As far as mathematics is concerned, whether you use the "normal" rules, the rules of some modular arithmetic, or something else entirely is completely irrelevant. As a mathematician, what you are concerned with is whether your theorems are consistent with your axioms, rules, and definitions.
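To make the point concrete, here is a quick sketch (in Python, my choice of notation, not anything from the parent post) showing how the "same" expression evaluates differently depending on which rules you adopt:

```python
# Ordinary integer arithmetic: under the usual axioms, 1 + 1 is 2.
assert 1 + 1 == 2

# Arithmetic modulo 2: the only residues are 0 and 1, and addition
# wraps around, so 1 + 1 reduces to 0. Neither result is "wrong";
# each is consistent with its own set of rules.
assert (1 + 1) % 2 == 0
```

Both assertions pass; the symbols "1" and "+" simply mean different things in the two systems.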
What I can agree with, however, is that mathematics is a popularity contest. If certain axioms, rules, and definitions lead to a form of mathematics that is useful for many real-world tasks, then that kind of mathematics will become more popular than mathematics that has few (or no) obvious useful applications in the real world.
However, if we observed that adding 1 and 1 made some other result I suspect our math would be entirely different.
We can do no such "observing". "1", "1", "+", and "=" are abstract entities that are only given meaning, within the abstract universe they belong to, by axioms, rules, and definitions laid down by mathematicians. That they somehow resemble concepts applicable to the real world is coincidental (well, fairly predictable given human psychology, but that is the main reason: there are no inherent properties in math that give it this applicability to the real world).
On the other hand, yes: our "normal" math would look different, due to the psychological factors and the "popularity contest" alluded to above.