wy
Lv 7
wy asked in Science & Mathematics Mathematics · 1 decade ago

Maths

For a certain function f(x), f(a) + f(d) = f(c) + f(b) and a > b > c > d > 0.

Also f"(x) < 0 and f'(x) > 0 for all x > 0.

Prove that f[(a+d)/2] > f[(c+b)/2]. Please help.

1 Answer

  • cipker
    Lv 5
    1 decade ago
    Favourite answer

    Since f'(x) > 0 for all x > 0, the graph of f(x) is always sloping upwards.

    As f"(x) < 0 for all x > 0, the graph of f'(x) is sloping downwards, i.e. f'(x) is decreasing.

    So, the slope of f(x) is positive but decreasing as x increases.

    So, the graph of f(x) is similar to the graph of y = √x + C,

    where C is an arbitrary constant.

    The graph is shown on the link http://hk.geocities.com/cipker/graphfx.bmp
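    (The link above is dead. As a quick illustrative check, not part of the original answer, the following sketch confirms that f(x) = √x does satisfy both conditions, using its exact derivatives f'(x) = 1/(2√x) and f"(x) = −1/(4x^1.5).)

    ```python
    import math

    # Illustrative check: f(x) = sqrt(x) has f'(x) > 0 and f"(x) < 0
    # for every x > 0, so it is a valid model curve for this problem.
    for x in [0.5, 1.0, 4.0, 100.0]:
        fp = 1.0 / (2.0 * math.sqrt(x))   # first derivative: positive
        fpp = -1.0 / (4.0 * x ** 1.5)     # second derivative: negative
        assert fp > 0 and fpp < 0
    print("sqrt(x) satisfies f' > 0 and f'' < 0")
    ```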

    f(a) + f(d) = f(c) + f(b)

    f(a) - f(b) = f(c) - f(d)

    Let k = f(a) - f(b) = f(c) - f(d).

    From the graph, the same rise k over the flatter part of the curve needs a longer run, so a - b > c - d. (Formally, by the Mean Value Theorem, k = f'(p)(a - b) and k = f'(q)(c - d) for some p in (b, a) and q in (d, c). Since b > c, we have p > q, and f'(x) is decreasing, so f'(p) < f'(q) and hence a - b > c - d.)

    So, a + d > c + b

    (a+d)/2 > (c+b)/2

    As the graph of f(x) is sloping upwards for all x > 0,

    f(n) > f(m) whenever n > m for any positive numbers n and m.

    So,

    f[(a+d)/2] > f[(c+b)/2]
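
    (The whole chain of inequalities can be verified numerically. The sketch below uses f(x) = √x and an illustrative choice of b, c, d — not values from the original question — then solves the constraint f(a) + f(d) = f(c) + f(b) for a.)

    ```python
    import math

    # Numerical check of the argument with f(x) = sqrt(x), which has
    # f'(x) > 0 and f''(x) < 0 on x > 0.
    f = math.sqrt

    # Illustrative values with b > c > d > 0 (an assumption, not from
    # the question); a is chosen so that f(a) + f(d) = f(c) + f(b).
    b, c, d = 5.0, 3.0, 1.0
    a = (f(c) + f(b) - f(d)) ** 2

    assert a > b > c > d > 0
    assert abs((f(a) + f(d)) - (f(c) + f(b))) < 1e-12

    # Equal rises k = f(a) - f(b) = f(c) - f(d) over the flatter part
    # of the curve force a longer run, so a - b > c - d.
    assert a - b > c - d
    assert a + d > c + b

    # Since f is increasing, the inequality of midpoints follows.
    assert f((a + d) / 2) > f((c + b) / 2)
    print("all checks passed")
    ```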
