Here’s a combinatorial problem I found recently: given a 100 story building and two identical plates, what is the worst case number of drops required to determine the lowest floor from which the plates break when dropped (a plate that survives a drop is undamaged and can be reused)? The answer is 14, which I’ll state without proof.
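If you want to check the 14 yourself, the standard recurrence does it in a few lines (a sketch; `floors` and `drops` are my own names): with one more drop at some test floor, $n$ drops and $k$ plates cover the floors a broken plate leaves below plus the floors a surviving plate leaves above.

```python
# floors(n, k): maximum stories distinguishable with n drops and k plates.
# Dropping at the right floor splits the problem: if the plate breaks we
# have (n-1, k-1) for the floors below; if it survives, (n-1, k) above.
from functools import lru_cache

@lru_cache(maxsize=None)
def floors(n, k):
    if n == 0 or k == 0:
        return 0
    return floors(n - 1, k - 1) + 1 + floors(n - 1, k)

def drops(s, k):
    """Smallest n such that n drops and k plates handle s stories."""
    n = 0
    while floors(n, k) < s:
        n += 1
    return n

print(drops(100, 2))  # 14
```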

In the interest of completely useless generalization, we can ask what happens if we add more plates. Specifically, what is the cost $C(s)$ to analyze an $s$ story building if we can buy as many plates as we like? Say each plate costs as much as one drop.

First we need to figure out how many drops it takes for a given number of plates, or (roughly) equivalently the number of floors we can analyze for a given number of plates and drops. If we count the zeroth story as a floor to make the formulas slightly prettier, the answer is that $n$ drops and $k$ plates can analyze a building with $S(n,k)$ floors, where
$$S(n,k)=\binom{n}{0}+\binom{n}{1}+\cdots+\binom{n}{k}=\sum_{p\le k}\binom{n}{p}$$
(This can be proved by induction on the number of plates.) Thus, $S(n,k)$ is the number of subsets of $[1..n]$ with at most $k$ elements, which turns out to have no closed form expression (Graham et al.). Therefore, we’ll have to settle for asymptotics. The best estimates I could find are given in Worsch 1994. Since the details vary depending on the relative growth of $n$ and $k$, we first need a rough idea of the growth rate of $n/k$.
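Plugging the two-plate case into the binomial sum as a sanity check (code mine): a 100-story building has 101 possible answers counting the ground floor, and only 14 drops reach that many.

```python
# S(n, k): number of floors (zeroth story included) that n drops and
# k plates can analyze, straight from the binomial sum above.
from math import comb

def S(n, k):
    return sum(comb(n, p) for p in range(k + 1))

print(S(14, 2), S(13, 2))  # 106 92 -- only 14 drops cover the 101 outcomes
```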

For a building of $s$ floors, brute force binary search requires $n=k=\lg s+1$, so $n,k\le n+k=C(s)<2+2\lg s$. Since at least $\lg s$ tests are required regardless of the number of plates, we have $\lg s\le n<2+2\lg s$. Lacking an obvious lower bound for $k$, I’ll assume for the moment that $k=\Theta(\lg s)=\Theta(n)$. Numerical evidence indicates that $n/k$ is in fact between 2 and 3.
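Here is the kind of numerical evidence I mean (my own sketch; `drops_needed` and the search over $k$ are names I made up): for each $s$, minimize $n+k$ over plate counts and look at the resulting ratio, ignoring the off-by-one for the ground floor.

```python
# For each s, find the plate count k minimizing n + k, where n is the
# least number of drops with S(n, k) >= s.
from math import comb

def S(n, k):
    return sum(comb(n, p) for p in range(k + 1))

def drops_needed(s, k):
    """Smallest n with S(n, k) >= s, via exponential then binary search."""
    hi = 1
    while S(hi, k) < s:
        hi *= 2
    lo = 0
    while lo < hi:
        mid = (lo + hi) // 2
        if S(mid, k) < s:
            lo = mid + 1
        else:
            hi = mid
    return lo

results = {}
for s in (10**3, 10**6, 10**9):
    results[s] = min((drops_needed(s, j) + j, drops_needed(s, j), j)
                     for j in range(1, 64))
    cost, n, k = results[s]
    print(s, cost, n, k, round(n / k, 2))  # e.g. 1000 16 11 5 2.2
```

The ratios land between 2 and 3 (ties, if any, are broken toward the fewest drops, so other optimal $(n,k)$ pairs can exist for the larger $s$).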

In the case where $n/k$ is a constant of at least 2, Worsch 1994 gives
$$S(n,k)=(C(x)+O(1))^{n}\approx C(x)^{n}$$
where
$$C(n/k)=C(x)=x^{1/x}\left(\frac{x}{x-1}\right)^{(x-1)/x}$$
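Before leaning on this approximation, a quick check of it (my own code; `S` is the exact binomial sum from above): for fixed $x=n/k\ge 2$, the ratio $\log S(n,k)/(n\log C(x))$ should tend to 1 as $n$ grows.

```python
# Compare the exact log S(n, k) against n * log C(x) for fixed x = n/k.
from math import comb, log

def S(n, k):
    return sum(comb(n, p) for p in range(k + 1))

def C(x):
    return x ** (1 / x) * (x / (x - 1)) ** ((x - 1) / x)

x = 2.5
approx = [log(S(n, int(n / x))) / (n * log(C(x))) for n in (10, 50, 250)]
print(approx)  # ratios approach 1 from below
```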
We can now use this approximation to find the optimal ratio $x$ by minimizing $n+k$ subject to $S=s$. For convenience, let $D(x)=C'(x)/C(x)$ be the logarithmic derivative of $C$. Applying Lagrange multipliers, we have
$$\begin{aligned}E&=n+k-\lambda\left(C\left(\frac{n}{k}\right)^{n}-s\right)\\
\frac{\partial E}{\partial n}&=1-\lambda\left(nC^{n-1}CD\frac{1}{k}+C^{n}\log C\right)\\
&=1-\lambda C^{n}(Dx+\log C)\\
\frac{\partial E}{\partial k}&=1+\lambda nC^{n-1}CD\frac{n}{k^{2}}\\
&=1+\lambda C^{n}Dx^{2}\end{aligned}$$
Equating derivatives to zero and solving gives $\log C+Dx+Dx^{2}=0$. Filling in the details,
$$\begin{aligned}\log C&=\frac{1}{x}\log x+\frac{x-1}{x}\log\frac{x}{x-1}\\
&=\log x+\frac{1}{x}\log(x-1)-\log(x-1)\\
D=\frac{C'}{C}&=\frac{1}{x}-\frac{1}{x^{2}}\log(x-1)+\frac{1}{x(x-1)}-\frac{1}{x-1}\\
&=-\frac{1}{x^{2}}\log(x-1)\\
\log C+Dx+Dx^{2}&=\log x+\frac{1}{x}\log(x-1)-\log(x-1)-\frac{1}{x}\log(x-1)-\log(x-1)\\
&=\log x-2\log(x-1)=0\\
\log x&=2\log(x-1)\\
x&=(x-1)^{2}\\
0&=x^{2}-3x+1\\
x&=\frac{3+\sqrt{5}}{2}\end{aligned}$$
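Algebra like this is easy to botch, so here is a short numerical check (my own) that $x=(3+\sqrt{5})/2$ really kills the optimality condition, using the simplified form of $D$ derived above:

```python
# At x = (3 + sqrt(5))/2 the condition log C + D x + D x^2 should vanish.
from math import log, sqrt

def logC(x):
    return log(x) / x + (x - 1) / x * log(x / (x - 1))

def D(x):
    return -log(x - 1) / x**2

x = (3 + sqrt(5)) / 2
print(logC(x) + D(x) * x + D(x) * x**2)  # ~0
```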
Thus, $n\approx 2.62k$, $S(n,n/2.62)\approx {1.94}^{n}$, and $C(s)\approx (1+1/x)\log s/\log C\approx 1.44\lg s$.
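The closing constants can be double-checked numerically (my own verification; `per_lg` is just a name for the coefficient of $\lg s$). Since $n=\log s/\log C$ and $k=n/x$, the total cost is $(1+1/x)\log s/\log C$:

```python
# Evaluate the optimal ratio x, the base C(x), and the cost per lg s.
from math import log, sqrt

x = (3 + sqrt(5)) / 2                       # optimal n/k
C = x ** (1 / x) * (x / (x - 1)) ** ((x - 1) / x)
per_lg = (1 + 1 / x) / (log(C) / log(2))    # drops plus plates per lg s
print(x, C, per_lg)  # ~2.618, ~1.944, ~1.440
```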