The following problem is investigated. Given integers r ≥ l ≥ 0 with r > 0, and an integer c for which l − r < c ≤ l + r, what is the minimum ratio of white balls to black balls over the family of all finite rings of white and black balls that satisfy: (i) the ring has at least one white ball, and (ii) for every white ball, there are at least c more white balls than black balls in the list of the l balls counterclockwise from the white ball conjoined with the list of the r balls clockwise from the white ball?
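The membership test in condition (ii) is purely mechanical. The following sketch (illustrative code, not from the paper) checks it for a ring given as a string of 'W' and 'B', treating the two neighbour lists as a conjoined multiset, so positions may repeat when the ring is shorter than the window:

```python
def satisfies(ring, l, r, c):
    """True iff every white ball sees at least c more whites than blacks
    among its l counterclockwise and r clockwise neighbours (condition (ii)),
    and the ring contains at least one white ball (condition (i))."""
    n = len(ring)
    if 'W' not in ring:
        return False  # condition (i) fails
    for i, ball in enumerate(ring):
        if ball != 'W':
            continue
        # Conjoined lists: l balls one way, r balls the other, with wraparound.
        window = [ring[(i - j) % n] for j in range(1, l + 1)] \
               + [ring[(i + j) % n] for j in range(1, r + 1)]
        if window.count('W') - window.count('B') < c:
            return False
    return True

def ratio(ring):
    """White-to-black ratio of a ring (None if it has no black balls)."""
    blacks = ring.count('B')
    return ring.count('W') / blacks if blacks else None
```

For example, with l = r = 2 and c = 2, the ring WWWWB (ratio 4) satisfies both conditions, while WWWB does not: its second white ball sees two whites and two blacks, a surplus of 0 < c.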
Let R(l, r, c) be this minimum ratio, and assume that c has the same parity as l + r. For the symmetric case l = r = k, it is proved that R(k, k, c) = (2k + c)/(2k − c) when k and c are congruent (mod 2), and that R(k, k, c) may be slightly larger than (2k + c)/(2k − c) when they are not. The upper bounds on R given for the latter case are conjectured to be tight.
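As a concrete instance of the congruent case (an illustration, not taken from the paper): take k = 4 and c = 2, so that k and c are both even and c has the same parity as l + r = 8. The formula then gives

```latex
\[
R(4,4,2) \;=\; \frac{2\cdot 4 + 2}{2\cdot 4 - 2} \;=\; \frac{10}{6} \;=\; \frac{5}{3},
\]
```

so in any qualifying ring the white balls must outnumber the black balls by at least 5 to 3.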
We also conjecture that R(l, r, c) > 1 whenever c > 0. This is known to be true when l = r. It is shown in general that R(l, r, c) ≥ 1 whenever c > 0 and l ≤ 9. Apparently tight upper bounds on R are given for the asymmetric cases with l < r.