My Math Forum  

January 5th, 2017, 05:44 AM   #1
Newbie
 
Joined: Jan 2017
From: Tehran, Iran

Posts: 5
Thanks: 0

Rigorous definition of "Differential"

First of all, I want to clarify that I have posted this question on many forums and Q&A websites to increase the chances of getting an answer, so don't be surprised if you see my post somewhere else.
Now let's get started:
--------------------------------------------------------------------------------

When it comes to definitions, I am very strict. Most textbooks tend to define the differential of a function/variable along these lines:

--------------------------------------------------------------------------------
Let $\displaystyle f(x)$ be a differentiable function. Assuming that the changes in $\displaystyle x$ are small, we can say, to a good approximation:
$\displaystyle \Delta f(x)\approx {f}'(x)\Delta x$
where $\displaystyle \Delta f(x)$ is the change in the value of the function. Now, if we take the changes in $\displaystyle x$ to be small enough, we define the differential of $\displaystyle f(x)$ as follows:
$\displaystyle \mathrm{d}f(x):= {f}'(x)\mathrm{d} x$
where $\displaystyle \mathrm{d} f(x)$ is the differential of $\displaystyle f(x)$ and $\displaystyle \mathrm{d} x$ is the differential of $\displaystyle x$.

--------------------------------------------------------------------------------
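
To be clear, I have no problem with the approximation itself; it checks out numerically. For example, here is a quick sketch in Python (the choice $\displaystyle f(x)=x^2$, the point $\displaystyle x=1$, and the step sizes are mine, purely for illustration):

Code:
# Numeric check of  Delta f(x) ~ f'(x) * Delta x.
# Illustrative assumptions (mine): f(x) = x**2, so f'(x) = 2x, at x = 1.
f = lambda x: x**2
fprime = lambda x: 2*x

x = 1.0
for dx in (0.1, 0.01, 0.001):
    delta_f = f(x + dx) - f(x)   # actual change in f
    approx = fprime(x) * dx      # the textbook approximation
    print(dx, delta_f, approx, abs(delta_f - approx))

For this $\displaystyle f$ the error is exactly $\displaystyle (\Delta x)^2$, i.e., it shrinks faster than $\displaystyle \Delta x$ itself. My complaint is only with promoting this approximation to a definition.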

What bothers me is that this definition is completely circular: we are defining the differential in terms of the differential itself. Some say that $\displaystyle \mathrm{d} x$ here is another object, independent of the meaning of "differential", but as we proceed it seems that's not the case:

First we define the differential as $\displaystyle \mathrm{d} f(x)=f'(x)\mathrm{d} x$, then we deceive ourselves into thinking that $\displaystyle \mathrm{d} x$ is nothing but another representation of $\displaystyle \Delta x$, and then, without clarifying why, we nevertheless treat $\displaystyle \mathrm{d} x$ as the differential of the variable $\displaystyle x$ and write the derivative of $\displaystyle f(x)$ as the ratio of $\displaystyle \mathrm{d} f(x)$ to $\displaystyle \mathrm{d} x$. So we have literally defined "differential" in terms of another differential, which is circular.

Secondly, I think it should be possible to define the differential without any reference to the notion of derivative. We could then define "derivative" and "differential" independently and deduce that the relation $\displaystyle f'{(x)}=\frac{\mathrm{d} f(x)}{\mathrm{d} x}$ is simply a natural consequence of the two definitions (possibly via the notion of limits), not part of the definition itself.

I know many people reject the concept of a differential quotient ($\displaystyle \frac{\mathrm{d} f(x)}{\mathrm{d} x}$) and treat the notation merely as a derivative operator ($\displaystyle \frac{\mathrm{d} }{\mathrm{d} x}$) acting on the function ($\displaystyle f(x)$). Still, I think a derivative genuinely can be represented as a quotient of differentials, for several reasons. Think of how we represent derivatives as ratios of differentials to show how the chain rule works, by cancelling identical differentials; or how we trade one differential for another in the $\displaystyle u$-substitution method for solving integrals. It is especially obvious when we solve differential equations, where we freely move $\displaystyle \mathrm{d} x$ and $\displaystyle \mathrm{d} y$ from one side of the equation to the other to form a term $\displaystyle \frac{\mathrm{d} y}{\mathrm{d} x}$, and then call that term the "derivative of $\displaystyle y$". It seems we really are treating differentials as algebraic objects.
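
For instance, the manipulations I mean look like this (both are standard textbook identities):

$\displaystyle \frac{\mathrm{d} y}{\mathrm{d} x}=\frac{\mathrm{d} y}{\mathrm{d} u}\cdot \frac{\mathrm{d} u}{\mathrm{d} x}$ (the chain rule, by "cancelling" $\displaystyle \mathrm{d} u$)

$\displaystyle u=g(x),\ \mathrm{d} u=g'(x)\,\mathrm{d} x \;\Longrightarrow\; \int f(g(x))\,g'(x)\,\mathrm{d} x=\int f(u)\,\mathrm{d} u$ ($\displaystyle u$-substitution)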

I know the relation $\displaystyle \mathrm{d} f(x)=f'(x)\mathrm{d} x$ always works and always gives us a way to calculate differentials. But I (as a strict axiomatist) can't accept it as the definition of the differential.

So my question is:

Can we define "Differential" more precisely and rigorously?


Thank you in advance.

P.S. I prefer an answer in the context of "calculus" or "analysis" rather than the theory of differential forms. And again, I don't want a circular definition. I think it is possible to define "differential" in some way using "limits" (though that's just a feeling).
HamedBegloo is offline  
 
January 5th, 2017, 05:54 AM   #2
Math Team
 
Joined: Jan 2015
From: Alabama

Posts: 2,354
Thanks: 591

Yes, "differential" can be defined rigorously but such a definition really has to wait until "differential geometry".

See http://people.math.gatech.edu/~ghomi...ureNotes8U.pdf for "differential map".

Until then, it is better to stay with the rough concept, which is, yes, circular.

Essentially, the concept of "differential" in basic calculus is an attempt to make use of a general observation: although the derivative, dy/dx, is NOT a fraction, it can be used AS if it WERE a fraction, because it is a limit of fractions. The "differentials" dy and dx are introduced, in a fairly hand-waving way, in order to be able to treat dy/dx as if it were a fraction.
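
To give just a taste of the rigorous version (a sketch of the standard "differential as a linear map" idea, specialized to $\displaystyle f:\mathbb{R}\to\mathbb{R}$; the linked notes treat the general case):

$\displaystyle \mathrm{d}f_x:\mathbb{R}\to\mathbb{R},\qquad \mathrm{d}f_x(h)=f'(x)\,h$

Here $\displaystyle \mathrm{d}f_x$ is an honest linear function, not an "infinitely small number", and it satisfies $\displaystyle f(x+h)=f(x)+\mathrm{d}f_x(h)+o(h)$ as $\displaystyle h\to 0$. Applying the same definition to the identity function gives $\displaystyle \mathrm{d}x_x(h)=h$, so $\displaystyle \mathrm{d}f_x=f'(x)\,\mathrm{d}x_x$ becomes an equation between linear maps; that is what $\displaystyle \mathrm{d}f=f'(x)\,\mathrm{d}x$ is really saying, with no circularity.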
Country Boy is offline  
January 5th, 2017, 07:21 AM   #3
Senior Member
 
Joined: May 2016
From: USA

Posts: 578
Thanks: 248

I may be wrong, but I am missing the circularity in what is, admittedly, very obscure writing.

Let's try to clean it up.

Given $y = f(x)$.

$\Delta y \equiv f(x + \Delta x) - f(x) \equiv \text{change in } y \text{ if } x \text{ changes by } \Delta x.$

No mention of a "differential." What is defined is quite clearly a difference.

$\displaystyle f'(x) \equiv \lim_{\Delta x \rightarrow 0} \dfrac{f(x + \Delta x) - f(x)}{\Delta x}.$

$dy \equiv f'(x) \cdot \Delta x \equiv \text{differential of } y.$

The differential is a product, not a difference.

Once the definitions are clarified, I fail to see any circularity.

Finally, we have a proposition to be proved:

Given arbitrary $\epsilon > 0$, $\exists\ \lambda > 0$ such that

$0 < | \Delta x | < \lambda \implies | dy - \Delta y | < \epsilon.$

In English, that proposition says that if the change in $x$ is sufficiently small, the differential of $y$ approximates the change in $y$. That as-yet-unproved proposition may or may not be true, and it may or may not be provable, but it is not circular.
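
A numeric illustration of that proposition, for what it's worth (a sketch; the choices $f = \sin$, $x = 1$, and the step sizes are mine, purely for illustration):

Code:
import math

# Illustration: dy = f'(x) * dx tracks delta_y = f(x + dx) - f(x)
# as dx shrinks. Illustrative choices (mine): f = sin, f' = cos, x = 1.
x = 1.0
for dx in (0.1, 0.01, 0.001, 0.0001):
    delta_y = math.sin(x + dx) - math.sin(x)  # actual change in y
    dy = math.cos(x) * dx                     # differential of y
    print(f"dx={dx:g}  |dy - delta_y| = {abs(dy - delta_y):.2e}"
          f"  ratio to dx = {abs(dy - delta_y)/dx:.2e}")

Note that not only does $|dy - \Delta y|$ shrink; so does $|dy - \Delta y|/\Delta x$, which is the sharper fact that makes $dy$ the "right" linear approximation to $\Delta y$.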

What am I missing?

Last edited by JeffM1; January 5th, 2017 at 07:25 AM.
JeffM1 is online now  
January 5th, 2017, 07:30 AM   #4
Newbie
 
Joined: Jan 2017
From: Tehran, Iran

Posts: 5
Thanks: 0

Quote:
Originally Posted by Country Boy View Post
Yes, "differential" can be defined rigorously but such a definition really has to wait until "differential geometry".

See http://people.math.gatech.edu/~ghomi...ureNotes8U.pdf for "differential map".

until then, it is better to stay with the rough concept which is, yes, circular.

Essentially, the concept of "differential" in basic Calculus is an attempt to make use of a general observation- although the derivative, dy/dx, is NOT a fraction, it can be used AS if it WERE a fraction because it is a limit of a fraction. The "differentials", dy and dx are introduced, in a fairly hand-waving way, in order to be able to treat dy/dx as if it were a fraction.
In this case, I think it's really bad to use Leibniz's notation in basic calculus. We should reformulate calculus using Lagrange's notation for both derivative and antiderivative operations, so we can get rid of this nonsensical "differential" altogether.
HamedBegloo is offline  
January 5th, 2017, 07:55 AM   #5
Newbie
 
Joined: Jan 2017
From: Tehran, Iran

Posts: 5
Thanks: 0

Quote:
Originally Posted by JeffM1 View Post
$dy \equiv f'(x) \cdot \Delta x \equiv \text{differential of } y.$
Once again, the main problem is that when we reach this point, we casually take $\Delta x$ to be $\mathrm{d} x$ without any justification, and then treat the derivative of $y$ as the ratio of two differentials: $\frac{\mathrm{d} y}{\mathrm{d} x}$.

If $\mathrm{d} x$ were the same as $\Delta x$, then we wouldn't need to introduce a new concept such as the differential. Or maybe I'm missing something?
HamedBegloo is offline  
January 5th, 2017, 08:56 AM   #6
Senior Member
 
Joined: May 2016
From: USA

Posts: 578
Thanks: 248

Quote:
Originally Posted by HamedBegloo View Post
Once again, the main problem is that when we reach this point, we casually take $\Delta x$ to be $\mathrm{d} x$ without any justification, and then treat the derivative of $y$ as the ratio of two differentials: $\frac{\mathrm{d} y}{\mathrm{d} x}$.

If $\mathrm{d} x$ were the same as $\Delta x$, then we wouldn't need to introduce a new concept such as the differential. Or maybe I'm missing something?
I did not try to grasp whatever subtext may have been in your original post. I was merely responding to your statement that the differential must necessarily involve circularity. I don't believe Weierstrass et al. were guilty of circularity.

I simply did not address the issues of what is the most intuitive notation for introducing calculus, or the best way to help students learn to apply calculus. I am dubious that any analysis should be included in introductory calculus, and if it must be, I might lean toward some form of simplified non-standard analysis. To put it differently, I think understanding should precede rigor. Understanding the utility of calculus justifies rigor; without that understanding, rigor is without motivation.
JeffM1 is online now  
January 5th, 2017, 09:25 AM   #7
Newbie
 
Joined: Jan 2017
From: Tehran, Iran

Posts: 5
Thanks: 0

Quote:
Originally Posted by JeffM1 View Post
I did not try to grasp whatever subtext may have been in your original post. I was merely responding to your statement that the differential must necessarily involve circularity. I don't believe Weierstrass et al. were guilty of circularity.
For me, circularity is a real headache; I can't stand it.
I think that, at least in mathematics, circularity is heavily frowned upon.
Anyway, now I get what you meant.

Quote:
Originally Posted by JeffM1 View Post
I simply did not address the issues of what is the most intuitive notation for introducing calculus, or the best way to help students learn to apply calculus. I am dubious that any analysis should be included in introductory calculus, and if it must be, I might lean toward some form of simplified non-standard analysis.
Well, that seems like a really good idea. Wouldn't it be better if we taught "hyperreal analysis" instead of "real analysis" in introductory calculus?

Quote:
Originally Posted by JeffM1 View Post
I think understanding should precede rigor. Understanding the utility of calculus justifies rigor; without that understanding, rigor is without motivation.
This seems important too. I wish there were a way to keep rigor and intuition at the same time.
HamedBegloo is offline  
January 5th, 2017, 01:01 PM   #8
Senior Member
 
Joined: Aug 2012

Posts: 1,165
Thanks: 258

Quote:
Originally Posted by HamedBegloo View Post
In this case, I think it's really bad to use Leibniz's notation in basic calculus. We should reformulate Calculus using Lagrange's notation for both derivative and antiderivative operations so we get rid of this nonsense differential.
I agree with you that it would be better if calculus texts said, "For now, just manipulate differentials the way we tell you to. It's not possible to give a rigorous definition till much later in your math education." Instead, the books tell confusing half-truths.



Quote:
Originally Posted by HamedBegloo View Post
Well that seems a really good idea. Shouldn't it be better if we taught "Hyperreal analysis" instead of "Real analysis" in introductory calculus?
Since Keisler wrote his infinitesimal-based calculus text in 1976, two things have NOT happened.

* Nobody else has bothered to write a similar text; and

* Hyperreal-based teaching of calculus has not caught on.

Studies have been done showing that students are equally confused no matter which approach is taken. And if you teach calculus using nonstandard terminology and concepts ("nonstandard" having two meanings here: nonstandard analysis, and also not the standard way of teaching calculus), you would leave students totally unprepared to study standard physics, math, biology, statistics, or any other subject.

The only thing I can imagine that would change this state of affairs is if tomorrow morning professor so-and-so from Helsinki proves P $\neq$ NP using hyperreals. If that happened, everyone would suddenly develop tremendous interest in nonstandard analysis.

Absent a dramatic development along those lines, nobody is going to change the way calculus is taught. And as I say, I completely agree with you that the treatment of differentials in calculus texts is very confusing to students; and the more thoughtful and mathematically inclined the student, the more confused they are on this point.

Last edited by Maschke; January 5th, 2017 at 01:15 PM.
Maschke is offline  
January 6th, 2017, 07:59 AM   #9
Newbie
 
Joined: Jan 2017
From: Tehran, Iran

Posts: 5
Thanks: 0

Quote:
Originally Posted by Maschke View Post
I agree with you that it would be better if calculus texts said, "For now, just manipulate differentials the way we tell you to. It's not possible to give a rigorous definition till much later in your math education." Instead, the books tell confusing half-truths. [...]
Now that I think about it, it seems there may be a way to resolve this issue while staying within the scope of "real analysis".

I remember someone pointing out that the main problem with defining the differential in "real calculus" is that the mathematical objects $\displaystyle df(x)$ and $\displaystyle dx$ aren't "real numbers", or in some sense aren't "real-valued functions".

But I also remember there was a section in our calculus course discussing "infinite limits" and "infinite derivatives": limits and derivatives whose "value" is infinity. Since infinity isn't a real number, these were of course treated as a kind of "non-existent limit/derivative", but a special kind, because this type of non-existence has significant properties that make it important.
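
To be concrete, the definition we saw was along these lines (the standard one):

$\displaystyle \lim_{x\to a}f(x)=\infty \quad\text{means}\quad \forall M>0\ \exists \delta>0:\ 0<|x-a|<\delta \implies f(x)>M$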

Now, why not introduce "infinitesimals" as some kind of non-existent limit and then relate that idea to the concept of "differentials"? That way we would neither enter non-standard analysis nor dismiss differentials, and we would even make them somewhat more rigorous.

Do you think it's possible?
HamedBegloo is offline  
January 6th, 2017, 08:25 AM   #10
Senior Member
 
Joined: Dec 2012

Posts: 925
Thanks: 23

Quote:
Originally Posted by Maschke View Post
..., I completely agree with you that the treatment of differentials in calculus texts is very confusing to students; and the more thoughtful and mathematically inclined the student, the more confused they are on this point.
You gain a lot of points as the most honest mathematician EVER.

But I hope we are on the way to thinking more and more clearly.

A new, simpler, and clearer math is knocking on the door; it's just a question of how many years it will take to replace the old generations of math...
complicatemodulus is offline  