My Math Forum: Function of two variables and differentiability

Calculus Math Forum

June 18th, 2019, 01:53 AM #1 Member   Joined: Apr 2017 From: India Posts: 73 Thanks: 0

Function of two variables and differentiability

I read the following three statements about a real-valued function $f$ of two variables, all of which are true:

1. If the function is differentiable, then it is continuous and its partial derivatives exist.
2. If one partial derivative of the function is continuous at a point $(a,b)$ and the other partial derivative merely exists at $(a,b)$, then the function is differentiable at $(a,b)$.
3. A continuously differentiable function is always differentiable.

After reading all three, a question arose: suppose it is given that the function is continuous and that its partial derivatives exist at $(a,b)$. Is that enough for me to deduce that the function is differentiable, or do I also need the partial derivatives to be continuous at $(a,b)$? Please help me.

Last edited by shashank dwivedi; June 18th, 2019 at 01:55 AM.

June 18th, 2019, 04:59 AM #2 Senior Member   Joined: Sep 2016 From: USA Posts: 647 Thanks: 412 Math Focus: Dynamical systems, analytic function theory, numerics

This topic is a bit subtle the first time through, so I'll include some extra details. Assume $f: D \subset \mathbb{R}^n \to \mathbb{R}$ is a scalar function of $n$ variables where $n > 1$. Let $\partial_j f$ denote the partial derivative with respect to the $j^{\rm th}$ input. Now, we define a few sets:

1. Let $A$ denote the set of functions whose partial derivatives exist for every $x \in D$.
2. Let $B$ denote the set of differentiable functions defined on $D$.
3. Let $C$ denote the set of functions whose partial derivatives are continuous for every $x \in D$.

Then we have the inclusions $C \subset B \subset A$, and both inclusions are strict.
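The strictness of the inclusion of the differentiable functions inside the functions whose partials exist can be checked numerically. Here is a minimal Python sketch using the standard counterexample (my own illustration, not a function from the thread): $f(x,y) = xy/(x^2+y^2)$ with $f(0,0) = 0$.

```python
# Standard counterexample (assumed for illustration, not from the thread):
# f(x, y) = x*y / (x^2 + y^2), with f(0, 0) = 0.
# Its partial derivatives exist at the origin, yet f is not even continuous
# there, so it cannot be differentiable at the origin.

def f(x, y):
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y / (x * x + y * y)

# Both partial derivatives exist at the origin: f vanishes on both axes,
# so each difference quotient is exactly 0.
for t in [1e-1, 1e-4, 1e-8]:
    assert f(t, 0.0) / t == 0.0  # quotient for the x-partial
    assert f(0.0, t) / t == 0.0  # quotient for the y-partial

# Yet along the diagonal y = x the values sit at 1/2 for every t != 0,
# so f is not continuous at the origin, let alone differentiable there.
print([f(t, t) for t in [1e-1, 1e-4, 1e-8]])  # [0.5, 0.5, 0.5]
```

The same function shows that statement 1 of the first post cannot be reversed: existence of the partials alone implies neither continuity nor differentiability.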
The reason this can be confusing is typically the fact that the derivative of $f$ (when it exists) is a linear functional, which is perfectly fine to talk about abstractly. However, when one needs to compute it in practice, you must choose a basis, and the partial derivatives are a highly convenient choice.

To be more specific, the derivative of $f$ should be a linear approximation of $f$. This means, for a fixed $x \in D$, we want a linear function $A: \mathbb{R}^n \to \mathbb{R}$ such that $f(x+h) \approx f(x) + Ah$ when $\left| \left| h \right| \right| \approx 0$. First notice that $A$ is a linear functional. Now, what do we mean by approximation? We require the following: $A$ is the derivative of $f$ at $x$, denoted $A = Df(x)$, if $\lim\limits_{h \to 0} \frac{f(x + h) - f(x) - Ah}{\left| \left| h \right| \right|} = 0.$ Note that $h$ here ranges over all of $\mathbb{R}^n$ at once; requiring the limit only along each fixed direction separately gives a strictly weaker notion. There are of course many ways to write the same thing, so don't be thrown off if my definition differs slightly from another one. What is important is that a derivative is a linear functional.

Now, if we have a specific $f$, how can we get our hands on $Df(x)$? Even simpler, how can we determine whether such a linear functional exists at all (i.e., how can we decide whether $f \in B$)? This is much harder in the abstract setting. However, we have the following theorem, which is used very often in practice. Roughly, it says that if we ask for something a little stronger than differentiability, then we get an easy-to-check condition which is sufficient for differentiability. Moreover, we get an easy way to compute these derivatives in the basis $\{dx_1, \dotsc, dx_n \}$.

Theorem: Fix $x \in D$ and suppose $\partial_1 f, \dotsc, \partial_n f$ exist. Even more, assume that in some open neighborhood of $x$, $\partial_1 f, \dotsc, \partial_n f$ are all continuous. Then $Df(x)$ exists, and moreover, the Jacobian matrix (the matrix of partial derivatives) is a representation for it.
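The limit definition above also answers the original question directly: continuity of $f$ plus existence of the partials is not enough. A hedged Python sketch with another standard counterexample (my choice, not from the thread), $f(x,y) = xy/\sqrt{x^2+y^2}$ with $f(0,0)=0$, which is continuous everywhere and has both partials at the origin, yet fails the limit test there:

```python
import math

# Standard counterexample (assumed for illustration, not from the thread):
# f(x, y) = x*y / sqrt(x^2 + y^2), with f(0, 0) = 0.
# f is continuous everywhere since |f(x, y)| <= ||(x, y)|| / 2.

def f(x, y):
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y / math.hypot(x, y)

# Both partial difference quotients at the origin vanish identically,
# so the only candidate derivative is the zero functional, A = 0.
for t in [1e-1, 1e-4, 1e-8]:
    assert f(t, 0.0) / t == 0.0
    assert f(0.0, t) / t == 0.0

# But the error ratio |f(h, k) - f(0, 0) - 0| / ||(h, k)|| along the
# diagonal h = k stays at 1/2 instead of going to 0, so f is NOT
# differentiable at the origin despite being continuous with
# existing partials.
ratios = [f(t, t) / math.hypot(t, t) for t in [1e-1, 1e-4, 1e-8]]
print(ratios)  # each value is 1/2 (up to rounding)
```

So the answer to the question in post #1 is no: continuity of $f$ together with mere existence of the partials at $(a,b)$ does not force differentiability there.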
I have to run but I might add more to this later. Thanks from Maschke and shashank dwivedi

June 18th, 2019, 08:17 PM #3 Member   Joined: Apr 2017 From: India Posts: 73 Thanks: 0

Thanks for the elaborate explanation of this topic. The conclusion I reached after your explanation is this: if the function is given to be continuous and its partial derivatives are said to exist at $(a,b)$ (with nothing said about the continuity of the partial derivatives), then, since the inclusion of the differentiable functions inside the functions with existing partials is strict, the function may or may not be differentiable; those hypotheses alone do not decide it. (Did I get that right?) However, the existence and continuity of the partial derivatives in a neighborhood of $(a,b)$ does ensure the differentiability of the function at that point, by the theorem above.

Last edited by shashank dwivedi; June 18th, 2019 at 08:21 PM.
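The positive direction of that conclusion (continuous partials imply differentiability) can also be sanity-checked numerically. A minimal Python sketch with an assumed smooth example of my own, $f(x,y) = x^2 + y^2$ at the point $(1,2)$, whose continuous partials give the Jacobian $[2, 4]$:

```python
import math

# Hypothetical smooth example (not from the thread): f(x, y) = x^2 + y^2.
# Its partials 2x and 2y are continuous everywhere, so by the theorem f is
# differentiable, and Df(1, 2) is represented by the Jacobian [2, 4].

def f(x, y):
    return x * x + y * y

def error_ratio(h, k):
    # |f(1+h, 2+k) - f(1, 2) - (2*h + 4*k)| / ||(h, k)||
    num = abs(f(1 + h, 2 + k) - f(1, 2) - (2 * h + 4 * k))
    return num / math.hypot(h, k)

# The ratio shrinks toward 0 as (h, k) -> 0, exactly as the limit
# definition of Df requires.
for t in [1e-1, 1e-3, 1e-5]:
    print(error_ratio(t, t))
```

Here the ratio works out to $t\sqrt{2}$ along the diagonal, which visibly tends to $0$, consistent with the theorem.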
