It's often been questioned whether mathematics is a discovery or an invention. As in, did we stumble across the structure of the universe, or does our identification of discrete parts of the universe exist only in our minds? It's certainly tempting to understand mathematics as a discovery, since its rules are so widely applicable. We can find numbers, functions, and figures for just about everything we encounter. But ultimately that perspective is flawed. To see mathematics as a discovery is to see the English language as a discovery. The confusion lies in our distinction, or lack thereof, between the symbols we devise and the concepts they represent. Having learned (evolved the ability), both as individuals (via trained recognition) and as a society, to count the things we see according to mathematical models, we also assign symbols to the numbers we count with. And these symbols are what many trip on in questioning the derivation of mathematics; in giving a name to our ideas, we come to pretend that they actually exist. In The Way of the Explorer, astronaut and founder of the Institute of Noetic Sciences Edgar Mitchell states:
"Mathematical descriptions are not embedded in nature, but created by mind. The differences between existence and knowing must be observed carefully when considering the larger viewpoint. We can unequivocally state that mathematics is a linguistic creation of the mind, not an intrinsic characteristic of nature, because it depends on how we assign labels to nature, and then quantify those labels."
That is, we create the definitions of mathematics (by selecting things we recognize as significantly individual in appearance and counting them). This realization comes from the holistic understanding that there is a difference between existence and knowledge, between map and territory.
Mathematics is our set of concepts derived from the idea of count, which relies on the assumption of discretion. It is an excellent set of models, ranging from basic ideas to extremely complex applications. It's also very old, which in itself isn't so much wrong as it is a warning. An accurate warning, as it turns out, for mathematics proves archaically unintuitive at higher levels. Beyond that more specific issue, though, it has a few philosophical holes, too. Three, to be exact: the three symbols of indiscretion, which naturally go against the assumption of discretion that mathematics relies on.
In noticing difference/discretion, we frequently define boundaries in order to construct models of understanding (and only later end up finding the inevitable holes in our models). Seeing as mathematics is such a fundamental expression of our understanding of measurable things in the universe, these holes are serious red flags in the foundation of any philosophy assuming count to be real.
For starters, division by zero is infamously called “undefined”. This is the convention by which mathematics ostracizes infinity; really, with the utmost precision, dividing anything discrete by zero would have to be considered infinite, since you can fit infinitely many zeroes into any other value. But infinity isn't considered a number by the conventions of mathematics, because its value isn't discrete. So, having no definite answer, the operation has been called “undefined”. Discretion is both the defining rule of mathematics and the culprit of its contrivance…
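This convention shows up concretely in computing. As a small illustration of my own (Python is my choice here, not anything from the text): the language flatly refuses to evaluate division by zero, while division by infinity is perfectly well defined:

```python
import math

# Division by zero is "undefined": Python refuses to evaluate it at all.
try:
    1 / 0
except ZeroDivisionError as error:
    print("1/0 raises:", error)

# Yet infinity itself is representable, and dividing by it is defined:
print(1 / math.inf)   # 0.0
```

(The IEEE 754 floating-point standard patches the same hole the other way, defining a nonzero number divided by zero as infinity – two conventions for one indiscretion.)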
One of the expressions considered “undefined” is the fraction 0⁄0. This one is even more troublesome, though, more precisely receiving the title “indeterminate”. This is because you could also say any of the following:
0, because any fraction with a numerator of zero equals zero
1, because when any number is divided by itself, the totality is a whole
∞, because division of anything into pieces of size zero results in infinite pieces
And so, the answer is considered “indeterminate” – there's no reason to choose any one of these answers as more correct than the others. Although, philosophically, a slightly more precise understanding would be that each is correct, just depending on the perspective you consider it from. The perspectives are really equivalent.
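The three candidate answers can even be watched numerically. A sketch of my own (Python), not from the text above: send both numerator and denominator toward zero along three different routes, and the quotient settles on a different value along each one:

```python
# Three routes to "0/0": numerator and denominator both shrink toward zero,
# yet each route gives a different quotient.
for k in [2, 4, 8]:
    x = 10.0 ** -k              # a small value tending to 0
    print(0.0 / x,              # route 1: 0/x  -> 0
          x / x,                # route 2: x/x  -> 1
          x / (x * x))          # route 3: x/x^2 -> grows without bound
```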
In defense of the philosophical precision of mathematics, someone will argue that these are not shortcomings. But is it not apparent that when your system of logic, especially one focused on finding solutions, must be patched with such indecisions as undefinability and indeterminacy, it is flawed on a universal level? Should not a system of logic be universally applicable, or else be imperfect by being occasionally false? Indeed, in realizing its flaws and giving them titles you can pretend that the system is fixed. However, a perfect understanding would not need fixing. It's really that simple.
A related way in which extremes expose holes in mathematics is through an equation with the indiscrete numbers:
First, take the three forms of this equation:
1⁄0 = ∞
∞ · 0 = 1
1⁄∞ = 0
Then, compare those to the three forms of a similar equation, where you replace 1 with another number, say 7:
7⁄0 = ∞
∞ · 0 = 7
7⁄∞ = 0
What is demonstrated is that, holistically, there is an indeterminate solution. For example, what is the simplification of ∞ · 0? Is it 1 or is it 7? Now, by the conventional rules, these kinds of equations aren't even allowed in the first place. Over time, this inconsistency has been prevented simply by disallowing infinity from being used – that's why it's not considered a number. But what this shows is that there is a hole in mathematics that had to be patched; a universally applicable system of logic would not have any exceptions! One would not be equivalent to every number, nor any number equivalent to any other, if mathematics were totally philosophically sound.
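The 1-versus-7 ambiguity can be sketched numerically (Python again, my own illustration): pair a quantity growing toward infinity with quantities shrinking toward zero, and the "product of infinity and zero" comes out as whatever you arranged, while the IEEE 754 floating-point convention refuses to choose and answers nan (not a number):

```python
import math

for n in [10.0, 1000.0, 100000.0]:   # n plays the role of "infinity"
    print(n * (1 / n),               # -> 1: infinity times this zero is 1
          n * (7 / n))               # -> 7: infinity times that zero is 7

# At the actual extreme, the convention refuses to choose:
print(math.inf * 0)                  # nan
```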
There are other counterexamples to mathematics too, with the only explanation always being ~"this isn't allowed".
Here's another demonstration that is even simpler..
∞ + 1 = ∞
∞ + 1 − ∞ = ∞ − ∞ (subtracting ∞ from both sides)
1 = 0
That 1 could be any other number, and you would still be equating your number with zero.
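Floating-point arithmetic encodes exactly this patch. A quick check (Python, my own illustration): infinity absorbs any finite addend, and the subtraction that would cancel ∞ from both sides is blocked by answering nan:

```python
import math

print(math.inf + 1 == math.inf)   # True: infinity absorbs the 1
print(math.inf + 7 == math.inf)   # True: or any other number
print(math.inf - math.inf)        # nan: the cancelling step is disallowed
```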
Most mathematicians brush these counterexamples aside, with the only argument being that such manipulations are disallowed/invalid/wrong because infinity is not a number. But that is just an exclusive label slapped on as an excuse to keep up the gamelike model of mathematics. And the ironic truth is that infinity, along with the other indiscrete numbers, is one of the best possible symbols for describing the universe.
In tandem with the realization that unpatched mathematics implies equivalency of all numbers, we find that for what they're worth, the indiscrete numbers are more valid philosophically because they can more fairly and equivalently describe the universe and its aspects. This is surely due to their lacking assumption of discretion. Their equivalency reveals truth through the facility of reasoning at the extremes.
Having trouble accepting that zero and infinity are equivalent? Zero is more similar to infinity than you might realize, because it's the same thing as infinitesimality, which is the inverse of infinity! Zero is the inverse of infinity in the simplest sense just because 1/∞ (or anything over infinity) equals zero. From there, it's just a matter of perspective. You can say zero is infinite smallness. You can say infinity is zero smallness! Each way, all you're saying is that each one is the other in the opposite direction. And direction is ambiguous until perspective is applied! Zero and infinity are philosophically equivalent because only perspective differentiates them.
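That mirror relationship can be sketched numerically too (Python, my own illustration): each step toward zero is, under the reciprocal, a step toward infinity, and at the limit the reciprocal of infinity is exactly zero:

```python
import math

for k in [1, 4, 16]:
    small = 10.0 ** -k
    print(small, "<->", 1 / small)   # a step toward 0 and its mirror toward infinity

print(1 / math.inf)                  # 0.0: infinity's mirror image is zero
```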
One, too, is equivalent to the other indiscrete numbers, by the following reasoning:
0 - nothingness, if you assume it can be perceived to exist (which, philosophically precisely, it cannot), is able to be seen as a whole; it is not bounded by the identified lines of existence, and as such it is continuous
∞ - infinity necessarily goes beyond any defined boundaries infinitely, breaking all bounds and thus encompassing the whole universe
The indiscrete numbers are the most relevant to philosophical understanding. There is a contrived reliance on discrete quantities alone, which leaves infinity ostracized behind layers of compounded reasoning – layers archaic enough that we widely accept them in culture as philosophical truth. That's what I'm focusing on unlearning. As for zero, it took a long time for its concept to be invented. The number one is very popular, though, simply because it's critical to mathematics as the base unit of count relative to which all other values are measured.
I noticed these holes thanks to the insightful power of reasoning at the extremes. What you realize is that if mathematics' logic dealing with values doesn't apply to the fullest extent of value, the infinite extreme, then the logical system is ultimately flawed. And certainly, if any numbers were totally real and not just concepts in our heads, the most real numbers would be the extremes – the universe is best described in terms of value as either derivative of nothing / valueless (0), singular (1), or infinite (∞), not as a sum of individual parts.
Admittedly, without discrete values, there's no use for mathematics. That's why, interestingly, we can use these systems of fundamental falsehoods – discretion, value, and mathematics – to solve problems with powerful applications. Say, getting a spaceship to the moon! What I'm saying is that the power of approximation is considerable. It just depends on how closely you approximate reality. With rocket science, precision is extremely important. There are other realms in which mathematics bears less fruit, though – for example, in predicting weather. Predictions cannot be made too far out, or else the inaccuracies compound and lead to significant error. This is because, as with any system, there are infinite variables to deal with in predicting weather in the reality of the universe... and with weather, the system is much larger than just a few identified objects like planets and rockets – it's more on the level of each molecule. Only in a virtual realm can predictions be certain, perfectly precisely accurate. Discretion is inherently an approximation because the universe is inherently indiscrete – to describe what is indiscrete by mathematics, you would need infinitely many (inherently discrete) equations, which isn't discretely possible!
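The compounding of prediction error can be sketched with a toy chaotic system. Assuming the logistic map x → 4x(1−x) as a stand-in for weather (my choice of example, not a real weather model): two initial measurements differing by one part in ten billion agree for a while, then diverge completely:

```python
# Two measurements of the "same" initial state, differing by 1e-10.
x, y = 0.2, 0.2 + 1e-10
for step in range(1, 61):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)   # one step of the logistic map
    if step in (5, 25, 55):
        print(step, abs(x - y))   # the gap roughly doubles each step until it saturates
```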
A simple way to reason that ultimately, everything is equivalent is to consider the math done in physics. What is the form of progress? Equations. Just look at how celebrated Einstein was for connecting energy to matter with his famous equivalency. By this pattern, taken to the extreme, total progress should be total equivalency. With the extremity of this infinity of understanding, the barriers of discretion dissolve and even the idea of equivalency fades away, for there are no longer two different sides but simply an understanding of universal unity.
The closest things you've got in mathematics are its symbols of indiscrete numbers, which are all equivalent to one another. Those are basically equations of philosophical understanding all to themselves!