My Math Forum  


Math General Math Forum - For general math related discussion and news


October 16th, 2014, 09:17 PM   #11
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by topsquark View Post
As x gets larger and larger the limit I suggested gets closer and closer to 1. But it never quite gets there. So how do you know it's really 1?

-Dan
The process is no different whether the infinity symbol is used or not. The point is that you don't really use infinity: you find the limit by examining the expression as x increases. Using the word "infinity" is misleading, as it is not what is really happening.
Karma Peny is offline  
 
October 16th, 2014, 09:22 PM   #12
Global Moderator
 
greg1313's Avatar
 
Joined: Oct 2008
From: London, Ontario, Canada - The Forest City

Posts: 7,963
Thanks: 1148

Math Focus: Elementary mathematics and beyond
That is what's happening! The statement of the limit says "we can make f(x) as close to
1 as we like by choosing x sufficiently large". There is no restriction on the size of x. If there
were, we'd have a constant, not a limit.
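The phrase "as close to 1 as we like by choosing x sufficiently large" has a precise formulation; for reference, the standard ε–N definition of this limit reads:

$$\lim_{x \to \infty} f(x) = 1 \iff \forall \varepsilon > 0 \;\, \exists N \;\, \text{such that} \;\, x > N \implies |f(x) - 1| < \varepsilon$$

Note that no infinite value of x appears anywhere in the definition; only arbitrarily large finite values of x are used.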
Thanks from topsquark

Last edited by greg1313; October 16th, 2014 at 09:40 PM.
greg1313 is offline  
October 16th, 2014, 09:39 PM   #13
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by v8archie View Post
This is true whether "infinity" is admitted or not.
It gives a solution for the natural numbers up to n. It does not give a solution for the completed set of all natural numbers. In this respect it is not a complete solution if infinity is admitted.

Quote:
Originally Posted by v8archie View Post
Without the infinite, we can't adequately deal with the very, very small or the very, very large. To discard the infinite leaves a hole in mathematics, and to me it's a hole where much of the most interesting stuff lies. Of course, it's interesting partly because it's a tricky concept, but that doesn't mean it should be thrown out.

Discarding things because they seem too difficult or because they don't fit with your preconceived ideas is religion, not mathematics. That's not to say that one can't do valid and even interesting mathematics without admitting the infinite. Just that you have to be clear about what you are doing.
I am throwing it out on the basis that it has no sound logic behind it (as explained on my website), not on the basis that it is tricky or difficult.
Karma Peny is offline  
October 16th, 2014, 09:48 PM   #14
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by CRGreathouse View Post
No, that's a misunderstanding of real numbers. Perhaps you can work this one out on your own since you've studied calculus.
I'd be very grateful for some further clarification here, many thanks.
Karma Peny is offline  
October 16th, 2014, 10:02 PM   #15
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by greg1313 View Post
That is what's happening! The statement of the limit says "we can make f(x) as close to
1 as we like by choosing x sufficiently large". There is no restriction on the size of x. If there
were, we'd have a constant, not a limit.
I am not claiming there is a restriction of any kind. We can make x as large as we like, but we cannot make it non-finite.

The problem is at a much more basic level. If something is endless or limitless then, by definition, it cannot have an end point.
The concept of a completed infinity implies an end; if something is truly endless then it can never be completed. So saying that something has no limit does not mean it is infinite.

Furthermore, consider the expression 'as x approaches infinity'. Assuming we can use infinity in this context, then it does not matter how much we increase x because we will still be an infinite distance away from infinity. The same applies if we decrease x or leave x unchanged.

Since increasing or decreasing or not changing x all result in being the same distance from infinity, it follows that we cannot 'approach' it.
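As a concrete illustration of the limit under discussion, consider a sample function such as f(x) = x/(x + 1), which tends to 1 as x grows (this function is an editorial example for illustration, not one from the thread):

```python
def f(x):
    """A sample function whose limit is 1 as x grows without bound."""
    return x / (x + 1)

# For any tolerance eps > 0, choosing x > 1/eps guarantees |f(x) - 1| < eps,
# using only finite values of x.
for exponent in range(1, 7):
    x = 10 ** exponent
    print(x, abs(f(x) - 1))  # the gap shrinks but never reaches 0
```

The gap |f(x) - 1| = 1/(x + 1) is nonzero for every finite x, yet can be made smaller than any given tolerance, which is exactly what the limit statement asserts.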

Last edited by Karma Peny; October 16th, 2014 at 10:43 PM.
Karma Peny is offline  
October 16th, 2014, 10:40 PM   #16
Senior Member
 
Joined: Aug 2012

Posts: 2,395
Thanks: 749

One can employ infinity for its usefulness, even while denying it any ontological status. That is, one can if one likes adopt the following position: "I know that infinite numbers do not exist; and if they did exist I'd stamp them out; for I dislike them. Yet, they are curiously useful and make many computations simpler in physics, mathematics, biology, and economics. So I will freely use them, even though they do not exist."

In fact, that's a familiar-sounding position. Some people felt that way about the idea that the earth revolves around the sun. Of course it doesn't; everybody knows that God put the earth at the center of the universe. But the model -- false though it is -- with the sun at the center of the solar system sure does simplify the calculations. So let's just use that system in astronomy class, even when we know it's not literally true.

And that "imaginary" number i, with the property that if you square it you get -1. That doesn't exist either. Of course it does allow us to solve polynomial equations, and it makes the calculations of electromagnetic theory far simpler ... so let's just take it as a useful fiction.

OP, would that satisfy you? From a philosophical point of view, what does "true" mean, anyway? Isn't truth just what we all find convenient to believe? You could in fact work out all the math with the earth as the center of the universe. The universe wouldn't change, only the math would.

I don't care if the engineers building the bridge I'm driving on believe in infinity. I just expect that they've studied calculus.
Thanks from topsquark and Sauvage
Maschke is offline  
October 16th, 2014, 11:20 PM   #17
Senior Member
 
Joined: Nov 2013

Posts: 160
Thanks: 7

Quote:
Originally Posted by Karma Peny View Post
I studied mathematics at school and as part of my computing degree, but I never felt at ease with the concept of infinity. Recently I gave the matter more thought and I reached the conclusion that infinity is not a valid concept.

Of all the areas of mathematics that I have encountered (school-level set theory, calculus, geometry, probability and so on) I could find no valid reason for using infinity. For me, its intangible nature creates mysticism where I would prefer precision and clarity. To address this, I wrote an article suggesting how we could start to remove infinity from these areas of
mathematics.

I am very interested to hear other people's ideas on how to remove infinity from mathematics, or possibly why it should not be removed at all. The more opinions I can get about this subject the better. Many thanks in advance.

I think you are struggling with exactly the same problems with infinity as I am.
I looked at your articles, and I think the problem lies in what you have written under the topic "Removing infinity from repeating decimals", where you write:

0.999… = 1 this is incorrect (yet many mathematicians accept it)
lim(0.999…) = 1 this is correct

You also write that "we can say that the limit of 0.333… as the number of decimal places increases, equates to one third. This is completely different to saying that 0.333… equals one third, which would be wrong, as it never does."

If you look at my last thread on this forum, "What lies beyond infinity", I presented similar ideas there, but they were not accepted as valid. I did not suggest abandoning infinity. Instead, I was even more radical: I suggested abandoning the decimal number system altogether if we can't accept infinitesimals.
In other words, I can deny that $\displaystyle \frac{1}{3}$ has a decimal representation if infinitesimals don't exist.
$\displaystyle \frac{1}{3} = 0.333\ldots$ only if infinitesimals exist, infinitesimals which are $\displaystyle \neq 0$.

Otherwise, we are forced to admit that
$\displaystyle \frac{1}{3} \approx 0.333\ldots$

Somehow I get the feeling that abandoning the decimal number system corresponds to your idea of abandoning infinity. We should abandon the infinite number of decimals of $\displaystyle \frac{1}{3}$ if we cannot count them correctly. I see no other way out of this problem. Either we can count them all or we can't.
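The distinction being drawn here, between the finite truncations of 0.333… and their limit, can be made exact with Python's `fractions` module (an editorial sketch, not part of the original post):

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of 0.333...3 with n threes, i.e. the sum of 3/10**k for k = 1..n."""
    return sum(Fraction(3, 10 ** k) for k in range(1, n + 1))

third = Fraction(1, 3)
for n in (1, 5, 10):
    gap = third - partial_sum(n)
    print(n, partial_sum(n), gap)  # every finite truncation falls short of 1/3

# The gap after n digits is exactly 1/(3 * 10**n): nonzero for every finite n,
# but smaller than any given positive tolerance once n is large enough.
```

This is precisely the situation the quoted article describes: no finite truncation equals one third, while the limit of the truncations is exactly one third.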
TwoTwo is offline  
October 16th, 2014, 11:40 PM   #18
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by Maschke View Post
One can employ infinity for its usefulness, even while denying it any ontological status. That is, one can if one likes adopt the following position: "I know that infinite numbers do not exist; and if they did exist I'd stamp them out; for I dislike them. Yet, they are curiously useful and make many computations simpler in physics, mathematics, biology, and economics. So I will freely use them, even though they do not exist."
For the mathematics I have encountered, infinity appears to add mysticism and it creates problems that do not exist without it. I have always found it less than useful.

There are many processes that appear to use infinity, such as calculus, but the word is often used because people think of infinity as an unimaginably large number. Calculus was not devised using infinity: the proof of the fundamental theorem does not require infinity, and neither does the use of differential or integral calculus. Yet such processes are still claimed to justify the concept of infinity.

Gauss’s views on the subject can be paraphrased as: 'Infinity is nothing more than a figure of speech which helps us to talk about limits. The notion of a completed infinity doesn’t belong in mathematics'.


Quote:
Originally Posted by Maschke View Post
And that "imaginary" number i, with the property that if you square it you get -1. That doesn't exist either. Of course it does allow us to solve polynomial equations, and it makes the calculations of electromagnetic theory far simpler ... so let's just take it as a useful fiction.
Imaginary numbers make perfect sense and provide a good example for comparison.

In order to solve previously unsolvable quadratic equations, we assume that the square root of -1 can exist. This appears wrong, as it breaks the rule that the square of a number cannot be negative (but maybe that rule was incomplete all along). It soon becomes apparent that we can form a logically consistent set of mathematical rules that includes the square root of -1.

In a similar fashion, it is perfectly acceptable to assume that a completed collection of an endless sequence can exist. But in this case it is not possible to form a logically consistent set of rules: many paradoxes arise, and it appears to be possible to prove that such an object cannot exist.
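As a side note on how workable the rule set containing √-1 turns out to be, Python's built-in complex type gives a quick practical check (an editorial aside, not part of the original post):

```python
# Python writes the imaginary unit i as 1j.
i = 1j
print(i * i)                 # squaring i gives -1, as assumed
print((2 + 3j) * (2 - 3j))   # a product of conjugates is real: 13

# Solving x**2 + 1 = 0, previously "unsolvable" over the reals:
roots = (i, -i)
print(all(x * x + 1 == 0 for x in roots))
```

The arithmetic never produces a contradiction, which is exactly the consistency property the post says the completed infinite lacks.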

Last edited by Karma Peny; October 16th, 2014 at 11:53 PM.
Karma Peny is offline  
October 17th, 2014, 12:16 AM   #19
Member
 
Joined: Oct 2014
From: UK

Posts: 62
Thanks: 2

Quote:
Originally Posted by TwoTwo View Post
Somehow I get the feeling that abandoning the decimal number system corresponds to your idea of abandoning infinity. We should abandon the infinite number of decimals of $\displaystyle \frac{1}{3}$ if we cannot count them correctly. I see no other way out of this problem. Either we can count them all or we can't.
Many thanks for taking the time to read my article, it is very much appreciated.

I've just looked through that thread you mentioned, it's a long one!

The real number line supposedly stretches from -infinity to +infinity and any section of it, however small, contains an infinite number of numbers. As you might have guessed, I reject the whole basis of a continuum as it is based on the concept of infinity, which is nonsensical. I prefer to address the problem that the continuum is trying to provide a solution for.

The problem, as I see it, is how can we fully express irrational numbers in a framework that we can work with? The solution is to use symbols, such as π and √2 rather than their decimal expansions. We can then work with irrational numbers in an abstract framework. If we want to see a numeric result rather than a set of expressions containing symbols then we have to expand the expressions using the constraints of the real or abstract world to which we are applying it.

This problem has already been solved to some extent for the encoding of vector graphics in computer software. Vector images are made up from multiple objects. Each object consists of mathematical instructions that define shapes. And each shape is defined in terms of points and paths. This makes the image fully scalable without loss of quality. It only gets converted to discrete pixels when the image is rendered onto a real world object like a small screen or the large side of a building.
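The "work symbolically, render numerically only at the end" idea can be sketched with exact arithmetic on numbers of the form a + b√2 (a small illustrative class written for this sketch, not an established library):

```python
from fractions import Fraction

class QSqrt2:
    """Exact numbers of the form a + b*sqrt(2), with rational a and b."""
    def __init__(self, a, b=0):
        self.a, self.b = Fraction(a), Fraction(b)

    def __mul__(self, other):
        # (a + b√2)(c + d√2) = (ac + 2bd) + (ad + bc)√2, computed exactly
        return QSqrt2(self.a * other.a + 2 * self.b * other.b,
                      self.a * other.b + self.b * other.a)

    def render(self):
        # Only at this final step do we approximate, analogous to
        # rasterising a vector image onto a fixed grid of pixels.
        return float(self.a) + float(self.b) * 2 ** 0.5

root2 = QSqrt2(0, 1)
square = root2 * root2
print(square.a, square.b)  # (√2)**2 is exactly 2, with no rounding error
print(square.render())
```

All intermediate computation stays symbolic and exact; the lossy conversion to a decimal happens only when a numeric result is finally demanded, mirroring the vector-graphics analogy above.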

Last edited by Karma Peny; October 17th, 2014 at 12:21 AM.
Karma Peny is offline  
October 17th, 2014, 01:34 AM   #20
Math Team
 
topsquark's Avatar
 
Joined: May 2013
From: The Astral plane

Posts: 2,274
Thanks: 943

Math Focus: Wibbly wobbly timey-wimey stuff.
Quote:
Originally Posted by Karma Peny View Post
...Calculus was not devised using infinity...
Well... Newton was one of the first to develop calculus, and he applied it to central-force problems. In those problems he did use the concept of infinity. I don't know whether Leibniz did or not.

-Dan
topsquark is offline  





Copyright © 2019 My Math Forum. All rights reserved.