My Math Forum  

December 1st, 2016, 07:42 AM   #1
Senior Member
 
Joined: Mar 2015
From: New Jersey

Posts: 1,603
Thanks: 115

Power Series and Taylor Expansion

f(x)=$\displaystyle \frac{1-e^{-x}}{x} =1-\frac{x}{2!}+\frac{x^{2}}{3!}-\frac{x^{3}}{4!}+\dots$
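The series follows from the usual expansion of $e^{-x}$:

$\displaystyle e^{-x}=1-x+\frac{x^{2}}{2!}-\frac{x^{3}}{3!}+\dots\quad\Rightarrow\quad 1-e^{-x}=x-\frac{x^{2}}{2!}+\frac{x^{3}}{3!}-\dots$

and dividing by x gives the series above.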

Define f(0) = 1

f'(x)=$\displaystyle \frac{xe^{-x}-(1-e^{-x})}{x^{2}}$

f'(0)=?

f(x) has a valid power series expansion about 0.
Yet there is the theorem:
If a function can be represented by a convergent series of powers of x, then that series is the Taylor series expansion of the function about the point x = 0.

That's my question: the expression for f'(x) is indeterminate (0/0) at x = 0, so what is f'(0)?
zylo is offline  
 
December 1st, 2016, 08:07 AM   #2
Global Moderator
 
Joined: Dec 2006

Posts: 19,981
Thanks: 1853

f$^{\,'}\!$(0) = -1/2.
skipjack is offline  
December 1st, 2016, 08:57 AM   #3
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,512
Thanks: 2514

Math Focus: Mainly analysis and algebra
Rather than simply differentiating, you ought to be using the difference quotient
$$f'(0) = \lim_{h \to 0} \frac{f(h) - f(0)}{h} = \lim_{h \to 0} \frac{\frac{1 - e^{-h}}{h} - 1}{h} = \lim_{h \to 0} \frac{1 - h - e^{-h}}{h^2} = -\frac12$$
Although it makes no practical difference, it does have the benefit that it only uses values that you have defined.
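The value of the limit can also be read straight off the series for $e^{-h}$:
$$\frac{1 - h - e^{-h}}{h^2} = \frac{-\frac{h^2}{2!} + \frac{h^3}{3!} - \dots}{h^2} = -\frac12 + \frac{h}{3!} - \dots \to -\frac12 \quad\text{as } h \to 0$$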
Thanks from zylo
v8archie is offline  
December 1st, 2016, 09:57 AM   #4
Senior Member
 
Joined: Sep 2015
From: USA

Posts: 2,202
Thanks: 1157

Given the power series about 0,

$f^\prime(0)$ is simply the coefficient of the series term of order 1.

By inspection, it's as the others have said: $f^\prime(0)=-\dfrac 1 2$.
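If you want a quick symbolic check of that coefficient, something like the following sketch (assuming sympy is available) reproduces it:

import sympy as sp

x = sp.symbols('x')
f = (1 - sp.exp(-x)) / x

# The singularity at 0 is removable, so the expansion about 0 is an ordinary power series.
series = sp.series(f, x, 0, 4).removeO()
print(series)              # polynomial part: 1 - x/2 + x**2/6 - x**3/24
print(series.coeff(x, 1))  # -1/2, i.e. f'(0)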
Thanks from zylo

Last edited by skipjack; December 1st, 2016 at 12:13 PM.
romsek is offline  
December 1st, 2016, 11:19 AM   #5
Senior Member
 
Joined: Mar 2015
From: New Jersey

Posts: 1,603
Thanks: 115

You don't differentiate a series to get its Taylor expansion. You differentiate a function to get its Taylor expansion; you don't differentiate the power series of $\sin x$ to get the power series for $\sin x$.*

L'Hôpital's rule works in the OP, as in v8archie's post, for f'(0). I'd rather use it on the actual derivative, to be consistent with Taylor's theorem. I assume it also works for the higher-order derivatives f$\displaystyle ^{(n)}$(0).

Thanks for comments.

EDIT*
But if you just have a convergent power series for an unknown function, then differentiating the series itself does verify that it's a Taylor expansion.
Everybody is right, including Taylor.

Last edited by skipjack; December 1st, 2016 at 12:16 PM.
zylo is offline  
December 1st, 2016, 12:01 PM   #6
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,512
Thanks: 2514

Math Focus: Mainly analysis and algebra
Quote:
Originally Posted by zylo View Post
I'd rather use it in the actual derivative to be consistent with Taylor's theorem. I assume it works in the higher order derivatives f$\displaystyle ^{(n)}$
Note that your original function is not defined at $x=0$. It's a removable singularity, which is how we happen to get a Maclaurin series that is not undefined at zero. But strictly speaking, the domain of the function, and thus of its power series, does not include zero. This means that the derivative is also not defined at zero. Again, it's a removable singularity, so the derivative's power series does return a value at zero, but again the domain of the derivative, and thus of its power series, does not include zero, both being undefined at this point.
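Concretely,
$$\lim_{x\to 0}\frac{1-e^{-x}}{x}=1,$$
so the value $f(0)=1$ chosen in the OP is exactly the value returned by the power series at zero.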

It is only when you explicitly define the function at $x=0$ that anything changes, and only then when the value defined for $f(0)$ is that returned by the power series. Whatever that value may be, the correct way to proceed is then to evaluate the derivative via the definition of the derivative: the difference quotient. There are, after all, functions that do not have a power series expansion, but do have a derivative. For example:
$$g(x)=\begin{cases}x^2\sin\left(\frac1x\right) & (x \ne 0) \\ 0 & (x=0) \end{cases}$$
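Here the difference quotient gives
$$g'(0) = \lim_{h \to 0}\frac{g(h) - g(0)}{h} = \lim_{h \to 0} h\sin\frac1h = 0,$$
even though $g$ is not even continuously differentiable at $0$, so it certainly has no power series expansion there.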
I suspect that you have omitted some continuity requirement on the function when talking about Taylor's theorem, unless it is hidden in the statement that the function has a Taylor expansion.
v8archie is offline  
December 1st, 2016, 12:54 PM   #7
Global Moderator
 
Joined: Dec 2006

Posts: 19,981
Thanks: 1853

Quote:
Originally Posted by zylo View Post
You don't differentiate a series to get its Taylor expansion.
Nobody suggested differentiating a series to get its own Taylor expansion, as that wouldn't make sense.
However, the original definitions imply f($x$) ≡ $\displaystyle 1-\frac{x}{2!}+\frac{x^{2}}{3!}-\frac{x^{3}}{4!}+\,...$.
It's legitimate to differentiate that to obtain f$\, '\!$($x$) ≡ $\displaystyle -\frac{1}{2!}+\frac{x}{3}-\frac{x^{2}}{8}+\,...$,
and then put $x$ = 0 to obtain f$\,'\!$(0) = -1/2.
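In closed form, the same computation reads
$$f(x) = \sum_{n=0}^{\infty}\frac{(-1)^n x^n}{(n+1)!} \quad\Rightarrow\quad f\,'(x) = \sum_{n=1}^{\infty}\frac{(-1)^n n\,x^{n-1}}{(n+1)!},$$
and at $x$ = 0 only the n = 1 term survives, giving -1/2! = -1/2 again.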
skipjack is offline  
December 1st, 2016, 02:03 PM   #8
Senior Member
 
Joined: Mar 2015
From: New Jersey

Posts: 1,603
Thanks: 115

Define $\displaystyle f^{(n)}(0) =\lim_{x\rightarrow 0}f^{(n)}(x)$; these limits exist for every n by L'Hôpital's rule, and the derivatives are then continuous at x = 0.
Then x = 0 belongs to the domain of f(x), and obviously the power series is a valid expansion about x = 0.
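For the first derivative, for example, L'Hôpital's rule applied to the quotient in the OP gives
$\displaystyle f'(0)=\lim_{x\rightarrow 0}\frac{xe^{-x}-(1-e^{-x})}{x^{2}}=\lim_{x\rightarrow 0}\frac{-xe^{-x}}{2x}=\lim_{x\rightarrow 0}\frac{-e^{-x}}{2}=-\frac{1}{2}$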

romsek had a good point. A convergent power series may define an otherwise unknown function (a power series solution of an ODE, for example). In that case, it is easily shown that the series is a Taylor series.

Last edited by skipjack; December 1st, 2016 at 02:11 PM.
zylo is offline  
December 1st, 2016, 02:14 PM   #9
Global Moderator
 
Joined: Dec 2006

Posts: 19,981
Thanks: 1853

Quote:
Originally Posted by zylo View Post
In that case, it is easily shown it is a Taylor series.
How?
skipjack is offline  
December 2nd, 2016, 09:01 AM   #10
Senior Member
 
Joined: Mar 2015
From: New Jersey

Posts: 1,603
Thanks: 115

Quote:
Originally Posted by skipjack View Post
How?
$\displaystyle f(x)=a_{0}+a_{1}x+a_{2}x^{2}+a_{3}x^3+...\\
f'(x)=a_{1}+2a_{2}x+3a_{3}x^2+...\\
f''(x)=2a_{2}+3\cdot 2a_{3}x+...\\
f'''(x)=3\cdot 2a_{3}+...\\
...\\
f(0)=a_{0}\\
f'(0)=a_{1}\\
f''(0)=2!a_{2}\\
f'''(0)=3!a_{3}\\
...\\$
Taylor's (really Maclaurin's) expansion of f(x):

$\displaystyle f(x)=f(0)+f'(0)x+\frac{f''(0)}{2!}x^{2}+\frac{f''' (0)}{3!}x^{3}+....$
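Matching coefficients shows $a_{n}=\dfrac{f^{(n)}(0)}{n!}$, i.e. the given series is the Maclaurin series. (Differentiating term by term is justified inside the radius of convergence.) For the function in the OP, $a_{n}=\dfrac{(-1)^{n}}{(n+1)!}$, so $\displaystyle f^{(n)}(0)=\frac{(-1)^{n}\,n!}{(n+1)!}=\frac{(-1)^{n}}{n+1}$, and in particular f'(0) = -1/2 once more.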
zylo is offline  