
Putnam Competition: Aftermath A?

For those of you who are not aware, the Putnam Mathematical Competition was held today, from 10:00 AM to 6:00 PM US Eastern Time (so the problems are now fair game :) )

Here is one question that piqued my interest: I was unable to solve it in the time I had, as it proved stubbornly resistant to my usual approaches. Anyone else?

No cheating by looking at the website.

A6) Let f : R → R be a strictly decreasing function with f(x) → 0 as x → ∞. Prove that

∫_0^∞ (f(x) − f(x+1)) / f(x) dx

diverges.
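
For intuition (not needed for the proof), here is a quick sanity check, written in LaTeX, with two explicit choices of f:

\[
f(x) = e^{-x}: \quad \frac{f(x) - f(x+1)}{f(x)} = 1 - e^{-1},
\qquad\qquad
f(x) = \frac{1}{x+1}: \quad \frac{f(x) - f(x+1)}{f(x)} = \frac{1}{x+2}.
\]

In both cases the integral over [0, ∞) is clearly infinite; the point of the problem is that this happens for every admissible f.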

1 Answer

  • 1 decade ago
    Favorite Answer

    You can compare this to the series Σ (1 − a_(n+1)/a_n) for a positive, strictly decreasing sequence (a_n) tending to 0.

    Note that Σ ln[a_n / a_(n+1)] telescopes: the partial sum up to N equals ln(a_1) − ln(a_(N+1)), which → ∞ because a_n → 0.

    The bound ln x <= x − 1 (valid for all x in R+) gives 1 − a_(n+1)/a_n <= ln[a_n / a_(n+1)], but on its own that runs the wrong way for a comparison test. Instead, suppose Σ (1 − a_(n+1)/a_n) converged. Its terms would tend to 0, so eventually a_(n+1)/a_n >= 1/2, and on [1/2, 1] we have the reverse-type bound ln(1/x) <= 2(1 − x). Taking x = a_(n+1)/a_n, comparison would force Σ ln[a_n / a_(n+1)] to converge as well, contradicting the telescoping computation. So the series diverges, and you are done in the series case.

    You can adapt this to the integral case.
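
    For completeness, a short LaTeX sketch (my own write-up, under the same assumptions: (a_n) positive, strictly decreasing, a_n → 0) of the two facts the series argument uses:

    \[
    \sum_{n=1}^{N} \ln\frac{a_n}{a_{n+1}} \;=\; \ln a_1 - \ln a_{N+1} \;\longrightarrow\; +\infty
    \qquad \text{as } N \to \infty, \text{ since } a_{N+1} \to 0^{+},
    \]
    \[
    g(x) := 2(1-x) + \ln x, \quad g(1) = 0, \quad g'(x) = \tfrac{1}{x} - 2 \le 0 \text{ on } [\tfrac12, 1]
    \;\Longrightarrow\; g \text{ is nonincreasing, so } g \ge 0 \text{ there, i.e. } \ln\tfrac{1}{x} \le 2(1-x) \text{ for } \tfrac12 \le x \le 1.
    \]

    Applying the second bound with x = a_(n+1)/a_n (eventually >= 1/2 if the series converged) gives ln[a_n / a_(n+1)] <= 2(1 − a_(n+1)/a_n), which is what makes the comparison legitimate.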
