March 4th, 2019, 11:36 PM   #1
Senior Member
 
Joined: Oct 2015
From: Greece

Posts: 107
Thanks: 6

How was Log proven?

Hello!

In all the resources I've read and from all the teachers I've watched explain logarithms, none of them actually prove logarithms; they just explain them with examples. So my question is:

1) Were logarithms actually invented from observations that we can't dispute?

For example, suppose we had not invented logs yet. Could we somehow solve the following equation for $n$ in order to find its inverse?

$\displaystyle
f(n) = 2^n
$

If yes, then we could find the formula for $\log_2$ with algebra, am I right? Or is it impossible to solve this problem with algebra? Well, probably one of the reasons we invented logs was to solve problems like this, right? But how did we invent and prove them in the first place?
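For illustration (this sketch and the helper name solve_exponent are not part of the original post): even before logarithms are "invented", the equation $2^n = x$ can be solved for $n$ numerically, for example by bisection, since $2^n$ is increasing. What a function like the logarithm adds is a name and a theory for that inverse.

```python
# Sketch (not from the original post): solve 2**n = x for n by bisection.
# solve_exponent and the bracketing interval [-64, 64] are my own choices.
def solve_exponent(x, lo=-64.0, hi=64.0, tol=1e-12):
    """Return n with 2**n close to x (for 2**lo < x < 2**hi), by halving the interval."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if 2.0 ** mid < x:
            lo = mid   # the solution lies in the upper half
        else:
            hi = mid   # the solution lies in the lower half
    return (lo + hi) / 2.0

print(solve_exponent(8.0))   # ~3.0, i.e. what we now call log2(8)
print(solve_exponent(10.0))  # ~3.3219, i.e. log2(10)
```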

Last edited by babaliaris; March 4th, 2019 at 11:41 PM.
babaliaris is offline  
 
March 5th, 2019, 06:29 AM   #2
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,618
Thanks: 2608

Math Focus: Mainly analysis and algebra
Here's a great history of Logarithms. As you will see, they were not so much proved as created from existing knowledge to fill a need.

However, if we decided that we want a (non-trivial) function $f(x)$ that satisfies the basic logarithmic identity, simplifying multiplication: $$f(xy)=f(x)+f(y)$$
we can determine that the function must be what we know as the logarithm.

\begin{align}
f(xy) &= f(x) + f(y) \\[8pt] \text{set $y=0$} \implies f(0) &= f(x) + f(0) \\ \implies f(x) &= 0
\end{align}
Thus, if $f(0)$ exists, we must have $f(x) = 0$ for all $x$. This is clearly not useful, so $f(0)$ is not defined.
\begin{align}
f(xy) &= f(x) + f(y) &(x\ne 0,\, y\ne 0) \\[8pt] \text{set $y=1$} \implies f(x) &= f(x) + f(1) \\ \implies f(1) &= 0
\end{align}
This can be taken at face value, but it's kind of interesting - all functions that do what we want have $f(1)=0$.
\begin{align}
f(xy) &= f(x) + f(y) &(x\ne 0,\, y\ne 0) \\[8pt] \text{differentiating with respect to $x$} \implies y f'(xy) &= f'(x) \\ f'(xy) &= \frac{f'(x)}{y} \\ \text{differentiating the original equation with respect to $y$} \implies x f'(xy) &= f'(y) \\ f'(xy) &= \frac{f'(y)}{x} \\
\text{equating the two expressions for $f'(xy)$} \implies \frac{f'(x)}{y} &= \frac{f'(y)}{x} \\ xf'(x) &= yf'(y)
\end{align}
Now, the right hand side is independent of $x$, so no matter the value of $x$ it has the same value - a constant we'll call $c$, and so
\begin{align}xf'(x) &= c \\ f'(x) &= \frac{c}{x} \\ \int_1^t f'(x)\,\mathrm dx &= \int_1^t \frac{c}{x} \,\mathrm dx \\ f(t)-f(1) &= c \int_1^t \frac{1}{x} \,\mathrm dx \\ f(t) &= c \int_1^t \frac{1}{x} \,\mathrm dx &\text{(because $f(1)=0$)}\end{align}

This equation is one of the definitions of the logarithmic functions, with the selection of the constant $c$ determining the base. $c=1$ corresponds to the natural logarithm.
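A quick numerical check (my own sketch, not part of the original post): approximating $c\int_1^t \frac{\mathrm dx}{x}$ with a simple midpoint rule reproduces the familiar logarithms, and the choice $c = \frac{1}{\ln 2}$ gives base 2.

```python
# Sketch (not from the original post): approximate f(t) = c * integral_1^t dx/x
# with the midpoint rule and compare against the standard library logarithms.
import math

def f(t, c=1.0, steps=100_000):
    """Approximate c times the integral from 1 to t of dx/x (for t >= 1)."""
    h = (t - 1.0) / steps
    return c * h * sum(1.0 / (1.0 + (k + 0.5) * h) for k in range(steps))

print(f(5.0), math.log(5.0))                 # c = 1: the natural logarithm, ~1.6094
c2 = 1.0 / math.log(2.0)
print(f(8.0, c2), math.log2(8.0))            # c = 1/ln 2: the base-2 logarithm, ~3.0
print(f(6.0, c2), f(2.0, c2) + f(3.0, c2))   # f(xy) = f(x) + f(y), up to rounding
```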
v8archie is offline  
March 5th, 2019, 11:09 AM   #3
Senior Member
 
Joined: Oct 2015
From: Greece

Posts: 107
Thanks: 6

Quote:
Originally Posted by v8archie View Post
Here's a great history of Logarithms. As you will see, they were not so much proved as created from existing knowledge to fill a need. […]

This made me cry! I didn't understand it completely, but I get the idea!!!

Can we do the same but starting with
$\displaystyle
f\left(\frac{x}{y}\right) = f(x) - f(y)?
$

Last edited by babaliaris; March 5th, 2019 at 11:13 AM.
babaliaris is offline  
March 5th, 2019, 12:08 PM   #4
Senior Member
 
Joined: Oct 2015
From: Greece

Posts: 107
Thanks: 6

I did it!

But I can't understand the equation in red. Why is
$\displaystyle
\frac{df\left(\frac{x}{y}\right)}{dx} = \frac{df\left(\frac{x}{y}\right)}{dy}?
$

Last edited by babaliaris; March 5th, 2019 at 12:10 PM.
babaliaris is offline  
March 6th, 2019, 05:13 AM   #5
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,618
Thanks: 2608

Math Focus: Mainly analysis and algebra
Nicely done.

To answer your question: the function $f$ is a single-variable function. Thus $f(xy) = f\big(t(x,y)\big)$ where $t(x,y) = xy$. It is $f(t)$, not $f(x,y)$. This means it does not have partial derivatives, just an ordinary derivative.

Thus under differentiation with respect to $x$, for example, we get $$\frac{\partial}{\partial x} f\big(t(x,y)\big) = \frac{\mathrm df}{\mathrm dt} \cdot \frac{\partial t}{\partial x} = \frac{\partial t}{\partial x} f'(t) \quad \text{or, equivalently} \\
\frac{\partial}{\partial x}f(xy) = \frac{\mathrm d}{\mathrm d(xy)}f(xy) \cdot \frac{\partial}{\partial x}(xy) = y f'(xy)$$

Something similar will happen in your case of the derivative of $f\left(\frac{x}{y}\right)$ with respect to $x$ or $y$.
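For concreteness (this worked example is not in the original post), writing $u(x,y) = \frac{x}{y}$, the chain rule gives $$\frac{\partial}{\partial x} f\!\left(\frac{x}{y}\right) = \frac{\partial u}{\partial x}\,f'(u) = \frac{1}{y} f'\!\left(\frac{x}{y}\right) \qquad \text{and} \qquad \frac{\partial}{\partial y} f\!\left(\frac{x}{y}\right) = \frac{\partial u}{\partial y}\,f'(u) = -\frac{x}{y^2} f'\!\left(\frac{x}{y}\right).$$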

Thanks for asking the question - it's one that was mildly bugging me as I wrote the thing out.

Last edited by v8archie; March 6th, 2019 at 05:27 AM.
v8archie is offline  
March 6th, 2019, 09:05 AM   #6
Senior Member
 
Joined: Oct 2015
From: Greece

Posts: 107
Thanks: 6

Oh, I understand now. But still, I must reread about partial derivatives, because I forgot them a little (don't encounter them very often).

Thanks for your answers!

Last edited by skipjack; March 6th, 2019 at 09:45 AM.
babaliaris is offline  
March 7th, 2019, 05:55 AM   #7
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,618
Thanks: 2608

Math Focus: Mainly analysis and algebra
Quote:
Originally Posted by v8archie View Post
\begin{align}
f(xy) &= f(x) + f(y) &(x\ne 0,\, y\ne 0) \\[8pt] \text{differentiating with respect to $x$} \implies y f'(xy) &= f'(x) \\ f'(xy) &= \frac{f'(x)}{y} \\ \text{differentiating the original equation with respect to $y$} \implies x f'(xy) &= f'(y) \\ f'(xy) &= \frac{f'(y)}{x} \\
\text{equating the two expressions for $f'(xy)$} \implies \frac{f'(x)}{y} &= \frac{f'(y)}{x} \\ xf'(x) &= yf'(y)
\end{align}
Now, the right hand side is independent of $x$, so no matter the value of $x$ it has the same value - a constant we'll call $c$, and so
\begin{align}xf'(x) &= c \\ f'(x) &= \frac{c}{x} \\ \int_1^t f'(x)\,\mathrm dx &= \int_1^t \frac{c}{x} \,\mathrm dx \\ f(t)-f(1) &= c \int_1^t \frac{1}{x} \,\mathrm dx \\ f(t) &= c \int_1^t \frac{1}{x} \,\mathrm dx &\text{(because $f(1)=0$)}\end{align}
Now that I've remembered it, this part can be done in a slightly more straightforward fashion.
\begin{align}f(xy) &= f(x) + f(y) &(x \ne 0,\, y \ne 0) \\[8pt] \text{differentiating with respect to $y$} \implies xf'(xy) &= f'(y) \\ \text{and now selecting $y=1$} \implies xf'(x) &= f'(1) \\ f'(x) &= \frac{f'(1)}{x} \\ \text{and integrating as before} \implies \int_1^t f'(x)\,\mathrm dx &= f'(1)\int_1^t \frac1x\,\mathrm dx \\ f(t) - f(1) &= f'(1)\int_1^t \frac1x\,\mathrm dx \\ f(t) &= f'(1)\int_1^t \frac1x\,\mathrm dx \end{align}
This also makes clear the significance of the constant $c$ in the previous derivation.

Note that, strictly speaking, these derivations show only that if such a function $f$ exists (satisfying the functional equation, defined at 1 and differentiable), it must have the form given. But it's straightforward to show that each of these conditions is satisfied by all functions of the form $$f(t) = c \int_1^t \frac1x\,\mathrm dx$$
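To sketch that verification for the functional equation itself (this step is not spelled out in the original post), split the integral at $x$ and substitute $u = xv$ in the second piece: $$f(xy) = c\int_1^{xy}\frac{\mathrm du}{u} = c\int_1^{x}\frac{\mathrm du}{u} + c\int_x^{xy}\frac{\mathrm du}{u} = f(x) + c\int_1^{y}\frac{\mathrm dv}{v} = f(x) + f(y),$$ since $u = xv$ gives $\frac{\mathrm du}{u} = \frac{\mathrm dv}{v}$. The conditions $f(1) = 0$ and differentiability follow immediately from the integral.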
v8archie is offline  
March 7th, 2019, 11:09 AM   #8
Senior Member
 
Joined: Oct 2015
From: Greece

Posts: 107
Thanks: 6

Quote:
Originally Posted by v8archie View Post
Now that I've remembered it, this part can be done in a slightly more straightforward fashion. […] But it's straightforward to show that each of these conditions is satisfied by all functions of the form $$f(t) = c \int_1^t \frac1x\,\mathrm dx$$

Nice! And now what happens if you try to evaluate that integral? I will try it:

$\displaystyle
f(t) = c\int_1^t\frac{1}{x}\,dx = c\cdot \Big[\ln|x|\Big]_1^t =
c \cdot (\ln(t) - \ln(1)) \Leftrightarrow f(t) = c \cdot \ln(t) \Leftrightarrow
e^{f(t)} = t^c
$

I can't really see how this can give $\displaystyle \log_c(t) = f(t)$.
babaliaris is offline  
March 7th, 2019, 11:18 AM   #9
Math Team
 
Joined: Dec 2013
From: Colombia

Posts: 7,618
Thanks: 2608

Math Focus: Mainly analysis and algebra
Nobody said that $c$ is the actual base, but different bases result from different choices of $c$.

In particular, we know that $c=f'(1)$ when $f(t) = c \ln t$ (here $\ln t$ is the natural logarithm).

So, given base $b$ we have $\log_b t = \frac{\ln t}{\ln b} = \frac{1}{\ln b}\,\ln t$ and so we expect $f'(1)$ to be equal to $\frac{1}{\ln b}$. You can verify this.
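A quick numerical check (my own sketch, not part of the post): estimate $f'(1)$ for $f(t) = \log_b t$ with a central difference and compare it with $\frac{1}{\ln b}$.

```python
# Sketch (not from the original post): check f'(1) = 1/ln(b) for f(t) = log_b(t).
import math

b = 10.0
h = 1e-6
fprime_at_1 = (math.log(1.0 + h, b) - math.log(1.0 - h, b)) / (2.0 * h)  # central difference at t = 1
print(fprime_at_1, 1.0 / math.log(b))   # both are ~0.4342944819
```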
v8archie is offline  