Question
Asked Dec 2, 2019
18 views

Assume f(g(x)) = x and f'(x) = 1 + f^2(x) for all x. Show that g'(x) = 1/(1 + x^2).


Expert Answer

Step 1

Differentiate both sides of f(g(x)) = x with respect to x, using the chain rule on the left side.

f(g(x)) = x
f'(g(x)) g'(x) = 1
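
Rearranging the second line isolates g'(x); this is the standard inverse-function derivative relation. Written out as a worked step:

\[
f'(g(x))\, g'(x) = 1 \quad\Longrightarrow\quad g'(x) = \frac{1}{f'(g(x))}
\]

Step 2 now computes f'(g(x)).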
Step 2

Given f'(x) = 1 + f^2(x).

Substitute g(x) in place of x:

f'(g(x)) = 1 + [f(g(x))]^2
f'(g(x)) = 1 + x^2

since f(g(x)) = x.
Step 3

Combine Step 1 and Step 2: from f'(g(x)) g'(x) = 1,

g'(x) = 1/f'(g(x)) = 1/(1 + x^2)

which is the required result.
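
As a quick sanity check (not part of the original solution), one concrete pair satisfying both hypotheses is f(x) = tan x and g(x) = arctan x, since d/dx tan x = 1 + tan^2 x and arctan is the inverse of tan on (-pi/2, pi/2). A minimal SymPy sketch, assuming SymPy is available:

import sympy as sp

x = sp.symbols('x')
f = sp.tan(x)       # candidate f with f'(x) = 1 + f(x)^2
g = sp.atan(x)      # its inverse, so f(g(x)) = x

# Hypothesis 1: f'(x) = 1 + f(x)^2
assert sp.simplify(sp.diff(f, x) - (1 + f**2)) == 0

# Hypothesis 2: f(g(x)) = x (on the principal branch)
assert sp.simplify(f.subs(x, g) - x) == 0

# Conclusion: g'(x) = 1/(1 + x^2)
assert sp.simplify(sp.diff(g, x) - 1/(1 + x**2)) == 0

print("tan/arctan example verifies g'(x) = 1/(1 + x^2)")

This concrete pair only illustrates the result; the argument in Steps 1-3 does not depend on choosing any particular f.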

Tagged in

Math

Calculus

Other