Posts

Showing posts with the label puzzle

Math Puzzle: Rich Get Richer?

Today I'm going to talk about a math problem whose answer is simultaneously simple and unintuitive.  Here is the problem statement: You have an urn that contains $1$ red ball and $1$ blue ball.  You repeatedly remove one ball from the urn at random, then put two balls of the same color back into the urn.  How does this system behave over time?  Specifically, what is the probability that at some point the urn will contain exactly $r$ red balls and $b$ blue balls? If you want to see my solution, go ahead and expand the solution tab.  Otherwise, you can try to figure it out for yourself.  Also, if you have other interesting questions you want to ask about this system, feel free to post them in the comments below. Intuitively, this seems like a "rich get richer" problem, in that the long-term behavior of the system depends on how "lucky" we are in the early stages (i.e., how many red and blue balls we pick).  For example, if we pick m...
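
If you want to experiment before expanding the solution, here is a minimal Python sketch (the function name `prob_state` and the trial count are illustrative choices) that estimates the probability of ever seeing a given $(r, b)$ state. Since the urn starts with one ball of each color, the state with $r$ red and $b$ blue balls can only occur after exactly $r + b - 2$ draws, so the simulation runs exactly that many steps.

```python
import random

def prob_state(r, b, trials=100_000):
    """Estimate the probability that the urn ever holds exactly r red and b blue balls."""
    hits = 0
    for _ in range(trials):
        red, blue = 1, 1
        for _ in range(r + b - 2):
            # draw a ball at random and put two of that color back
            if random.random() < red / (red + blue):
                red += 1
            else:
                blue += 1
        if red == r and blue == b:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    for r, b in [(2, 2), (3, 2), (5, 1), (4, 4)]:
        print(f"P(urn reaches {r} red, {b} blue) ≈ {prob_state(r, b):.3f}")
```

Comparing the estimates for different splits of the same total is a good way to guess the closed form before reading the solution.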

Challenge Problem: Taking a Quiz

I thought of this problem about two years ago and was going to send it out as one of the weekly challenge problems for the ACM student chapter at UD; however, I was unable to solve it when I first thought of it, so I put it on the back burner for a while.  Recently, I revisited it and made some progress, but I was still unable to find a strategy that can solve it at all, let alone efficiently.  I am posting the problem to this blog in the hopes that you can help me figure it out by posting your ideas and code in the comments below.  Anyway, here is the problem: You are taking a quiz with $n$ true/false questions that you didn't study for, but you are allowed to retake the quiz as many times as you want. Every time you take the quiz, you are told how many questions you got correct, but not which questions were right and which were wrong.  Your final score is the number of answers you got correct on your last submission minus the number of times you retook the quiz. ...
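
In the spirit of crowdsourcing ideas, here is a minimal Python harness (the function names and the scaffolding below are my own, meant only for experimentation) that scores an arbitrary strategy against a random answer key; `guess_once` is just a trivial baseline to plug into it.

```python
import random

def play(strategy, n):
    """Run one quiz session.  `strategy(n, history)` receives the list of
    (answers, score) pairs submitted so far and returns the next answer sheet,
    or None to stop.  Returns (# correct on last submission) - (# of retakes)."""
    truth = [random.random() < 0.5 for _ in range(n)]
    history = []
    while True:
        answers = strategy(n, history)
        if answers is None:
            break
        score = sum(a == t for a, t in zip(answers, truth))
        history.append((list(answers), score))
    retakes = len(history) - 1
    return history[-1][1] - retakes

def guess_once(n, history):
    """Trivial baseline: answer randomly once and never retake."""
    return [random.choice([True, False]) for _ in range(n)] if not history else None

if __name__ == "__main__":
    n = 20
    avg = sum(play(guess_once, n) for _ in range(5000)) / 5000
    print(f"n = {n}: a single random submission scores about {avg:.2f} on average")
```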

Mathematics and Darts!

In this blog post I am going to talk about one of my favorite math problems. When I first solved this problem, I developed a method that can be generalized to a lot of other similar problems. The problem is as follows: Alice and Bob take turns throwing darts at a dartboard (with Alice going first). At each turn, Alice tries to get her dart closer to the center than Bob's last throw, and vice versa. The game ends when one of the players doesn't get their dart closer than the other player's previous throw. Alice and Bob are equally skilled at throwing darts, and they are guaranteed to hit the dartboard. What is the probability that Alice wins? To help you visualize this problem, I created a simple program in Racket (then compiled it to JavaScript using whalesong). You can watch the game being played by pressing the Right Arrow Key (make sure the frame is in focus). You can speed up the simulation by pressing the Space Bar (and pause it by pressing it again). ...
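
If you just want a numerical answer to check your work against, a Monte Carlo sketch is straightforward: since the two players are equally skilled, their throws are i.i.d. and only the relative order matters, so a uniform $[0,1]$ "distance from the center" stands in for skill. The function name below is illustrative.

```python
import random

def alice_wins():
    """Simulate one game; returns True if Alice (who throws first) wins."""
    best = random.random()            # Alice's opening throw sets the target
    alices_turn = False               # Bob throws next
    while True:
        throw = random.random()
        if throw >= best:             # current player failed to beat the target...
            return not alices_turn    # ...so the other player wins
        best = throw
        alices_turn = not alices_turn

if __name__ == "__main__":
    trials = 200_000
    p = sum(alice_wins() for _ in range(trials)) / trials
    print(f"Estimated P(Alice wins) ≈ {p:.3f}")
```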

Pattern in Pascal's Triangle

Pascal's Triangle is a triangular array of binomial coefficients, and it often arises in counting problems.  In this blog post, I will present a problem that is related to Pascal's Triangle, and in solving it I will derive an interesting relationship in Pascal's Triangle that you probably didn't know about. If you randomly walk down \( n \) rows of Pascal's Triangle by going down-left or down-right at each step, what is the expected value of the number that you finish on? For example, if \( n = 4 \), then here are a few possible walks (that all happen to end at 4): Solution The first thing to notice is that the number of ways to walk to any cell in the triangle is the value of the cell itself!  That's because you can get to any cell by going through either its left parent or its right parent, and by the way this triangle is constructed, we know that the value of each cell is the sum of the values of its two par...
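
For a quick numerical check of the expected final value: the walk's ending cell is \( \binom{n}{k} \), where \( k \) is the number of down-right steps and is Binomial\((n, 1/2)\), so the expectation can be computed exactly and compared against a simulation. Here is a small sketch with illustrative function names.

```python
import random
from math import comb

def expected_end_value(n):
    """Exact expectation: the walk ends on entry C(n, k) with k ~ Binomial(n, 1/2)."""
    return sum(comb(n, k) ** 2 for k in range(n + 1)) / 2 ** n

def simulate(n, trials=100_000):
    """Monte Carlo estimate of the same expectation."""
    total = 0
    for _ in range(trials):
        k = sum(random.randint(0, 1) for _ in range(n))   # number of down-right steps
        total += comb(n, k)
    return total / trials

if __name__ == "__main__":
    for n in [4, 8, 12]:
        print(f"n = {n:2d}: exact = {expected_end_value(n):9.3f}, simulated ≈ {simulate(n):9.1f}")
```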

Random Walk on a Complete Graph

This is the 4th math problem in a series of math problem blog posts that I am writing.  I thought of it as a challenging weekly problem for math club, and I will present two solutions.  I came up with the first solution, and a friend of mine came up with the second -- a completely different approach that yielded the same answer. You are randomly walking on a complete graph with \( 2 n \) vertices. You start at a random vertex, and at each time step you transition to a different vertex chosen at random. Let \( f(n) \) be the expected number of steps until you've visited exactly \( n \) distinct vertices. Find $$ \lim_{n \to \infty} \frac{f(n)}{n} $$ For those of you who are not familiar with graph theory, here is a complete graph with 6 vertices: In the context of this problem, you start on one of the circles (it doesn't matter which, due to symmetry) and at each time step, you cross one of the lines to a new circle.  Henceforth, I will refer to the ...
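
A simulation makes the limit easy to guess before working through either solution. Here is a minimal sketch (function name and trial counts are illustrative) that walks on the complete graph with \( 2n \) vertices until \( n \) distinct vertices have been visited, counting the starting vertex as visited.

```python
import random

def steps_to_visit(n, trials=20_000):
    """Average number of steps of a uniform random walk on the complete graph
    with 2n vertices until exactly n distinct vertices have been visited."""
    vertices = 2 * n
    total_steps = 0
    for _ in range(trials):
        current, visited, steps = 0, {0}, 0
        while len(visited) < n:
            # jump to one of the other 2n - 1 vertices, uniformly at random
            nxt = random.randrange(vertices - 1)
            current = nxt if nxt < current else nxt + 1
            visited.add(current)
            steps += 1
        total_steps += steps
    return total_steps / trials

if __name__ == "__main__":
    for n in [10, 50, 200]:
        f = steps_to_visit(n)
        print(f"n = {n:4d}: f(n) ≈ {f:8.1f},  f(n)/n ≈ {f / n:.3f}")
```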

Analyzing Randomly Descending Sequences

Finals week just ended for me, so I'm hoping to post some new cool math problems in the next few days.  In this blog post, I present a math problem that is very easy to state but not so easy to solve.  Regardless, I will present my solution, which yielded a very interesting result. You construct a sequence \( k_1, k_2, ..., k_m \) by randomly picking an integer \( k_1 \) with \( 0 \leq k_1 < n \).  After picking \( k_1 \), you pick \( k_2 \) randomly with \( 0 \leq k_2 < k_1 \), and you repeat this process until you can no longer pick any numbers (i.e., when \( k_m = 0 \)).  Find a formula for \( f(n) \), the expected length of this sequence. If \( n = 3 \), then the possible sequences are  \( \{ [0], [1,0], [2,0], [2,1,0] \} \), but these don't all occur with equal probability: \( P([0]) = \frac{1}{3}, P([1,0]) = \frac{1}{3}, P([2,0]) = \frac{1}{6}, P([2,1,0]) ...
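
Two quick ways to check any candidate formula for \( f(n) \): conditioning on the first pick gives the recursion \( f(n) = 1 + \frac{1}{n}\sum_{k=1}^{n-1} f(k) \) with \( f(1) = 1 \), and a direct simulation should agree with it. Here is a small sketch of both (function names are illustrative); for \( n = 3 \) the exact value works out to \( \frac{11}{6} \), consistent with the example above.

```python
import random

def f_exact(n):
    """Exact expected length via f(n) = 1 + (1/n) * sum_{k=1}^{n-1} f(k), f(1) = 1."""
    f = [0.0] * (n + 1)
    running = 0.0
    for m in range(1, n + 1):
        f[m] = 1 + running / m
        running += f[m]
    return f[n]

def f_sim(n, trials=100_000):
    """Monte Carlo estimate of the expected sequence length."""
    total = 0
    for _ in range(trials):
        k, length = random.randrange(n), 1
        while k > 0:
            k = random.randrange(k)   # next term is uniform on {0, ..., k-1}
            length += 1
        total += length
    return total / trials

if __name__ == "__main__":
    for n in [3, 10, 100]:
        print(f"n = {n:3d}: exact = {f_exact(n):.4f}, simulated ≈ {f_sim(n):.4f}")
```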

Counting Special Non-Descending Sequences

In this blog post, I will discuss the solution to a very fascinating math problem that I came up with while pondering challenging math problems. The solution happens to be a really interesting combinatorics identity, and it can be used to describe some very interesting counting problems (including the number of ways to match parentheses). How many non-descending sequences of integers $ a_1, a_2, \dots, a_n $ are there such that $ 0 \leq a_i < i \: \forall \: i \leq n $? If \( n = 1 \), the only possible sequence is \( a_1 = 0 \).  Similarly, for \( n = 2 \), the possible sequences are \( \{ [0,0], [0,1] \} \).  For \( n = 3 \), the possible sequences are \( \{ [0,0,0], [0,0,1], [0,0,2], [0,1,1], [0,1,2] \} \).  Let \( f(n) \) denote the number of valid sequences of length \( n \).  From these examples, it is clear that \( f(1) = 1, f(2) = 2, \) and \( f(3) = 5 \).  We would ideally like to get a nice combinat...
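
For small \( n \) the sequences can simply be enumerated by brute force, which is a handy sanity check for any closed form you conjecture. A minimal sketch (the function name is illustrative):

```python
from itertools import product

def count_sequences(n):
    """Brute-force count of non-descending sequences a_1..a_n with 0 <= a_i < i."""
    count = 0
    for seq in product(*(range(i) for i in range(1, n + 1))):
        if all(seq[i] <= seq[i + 1] for i in range(n - 1)):
            count += 1
    return count

if __name__ == "__main__":
    # should begin 1, 2, 5, ... in agreement with f(1), f(2), f(3) above
    print([count_sequences(n) for n in range(1, 9)])
```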

Analyzing an Unfair Game

In this blog post, I will present an interesting two-player game, and I will discuss the mathematics that I used to determine the expected outcome for each player. This game is a generalization of the Price is Right spin wheel, in which players take turns spinning the Big Wheel in an attempt to get a sum as close to 100 as possible without going over. I generalize this by picking random numbers from the range $[0,1]$ instead of numbers from the discrete set $\{5, 10, \dots, 100\}$. Here is the formal problem statement: Alice and Bob play a game where they try to get as close to $1$ as possible without going over. Whoever gets closer without going over wins, and Alice goes first. Alice repeatedly draws a random number from $[0,1]$ and adds it to her sum until she is satisfied with her current score or exceeds $1$ (if she exceeds $1$, she loses). Then Bob repeatedly draws random numbers until he either beats Alice's score or exceeds $1$. Assuming both players play optimally, what is the probability that Alic...
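
Before diving into the analysis, it can help to simulate the game under a simple family of strategies for Alice: stop as soon as her sum reaches some threshold $t$ (the threshold here is an experimental knob of mine, not the derived optimum). Bob's play is forced: he draws until he beats Alice or busts.

```python
import random

def alice_wins(threshold):
    """Play one game where Alice stops once her sum reaches `threshold`.
    Returns True if Alice wins."""
    alice = 0.0
    while alice < threshold:
        alice += random.random()
    if alice > 1.0:
        return False                 # Alice busts and loses immediately
    bob = 0.0
    while bob <= alice:              # Bob must strictly beat Alice's score
        bob += random.random()
        if bob > 1.0:
            return True              # Bob busts, so Alice wins
    return False                     # Bob beat Alice without busting

if __name__ == "__main__":
    trials = 100_000
    for t in [0.5, 0.6, 0.7, 0.8]:
        p = sum(alice_wins(t) for _ in range(trials)) / trials
        print(f"threshold {t:.2f}: P(Alice wins) ≈ {p:.3f}")
```

Sweeping the threshold gives a feel for how the game behaves and which stopping rules serve Alice best.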

A Challenge for my Readers!

In this blog post, I present a very interesting math problem that I came up with while pondering random walk problems. You are walking along the x-axis, starting at $x=0$. At each time step you move to $x+1$ with probability $ \frac{1}{x+1} $ and to $ x-1 $ with probability $ \frac{x}{x+1} $.  What is the expected number of steps until you return to $ 0 $ for the first time? What makes this problem harder than a typical random walk problem is that the probability of moving to $x+1$ versus $x-1$ is not fixed -- it changes based on your position. Because of this, the typical counting methods for solving problems of this type do not apply. There is a clever and fascinating solution that doesn't require too much advanced knowledge, but I will leave it to you to think about.
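
If you want to check a conjectured answer numerically, the walk is easy to simulate (the function name and trial count below are illustrative). Note that from $x = 0$ the walk moves right with probability $1$, so the first return time is always at least $2$.

```python
import random

def return_time():
    """One excursion: start at x = 0 and count steps until the walk returns to 0."""
    x, steps = 0, 0
    while True:
        # move right with probability 1/(x+1), left with probability x/(x+1)
        x = x + 1 if random.random() < 1 / (x + 1) else x - 1
        steps += 1
        if x == 0:
            return steps

if __name__ == "__main__":
    trials = 200_000
    avg = sum(return_time() for _ in range(trials)) / trials
    print(f"Average first-return time to 0 ≈ {avg:.3f}")
```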

A Fun Puzzle from Probability

Here is a problem that came up a few days ago and that I'm still thinking about: There is an amoeba in a box. At each time step, it either multiplies into 2, multiplies into 3, stays the same, or dies, all with equal probability. What is the probability that eventually all the amoebae die out? Solution One can quickly see that the probability is at least $\frac{1}{4}$ because there is a $\frac{1}{4}$ chance of dying on the first time step. More generally, the amoeba could die on the first time step ($\frac{1}{4}$), stay the same and then die ($\frac{1}{4^2}$), stay the same twice and then die ($\frac{1}{4^3}$), and so on. Thus, the probability of eventual extinction is at least $\sum_{n=1}^{\infty} \frac{1}{4^n}$. Unfortunately, this strategy leads to a dead end. Instead, I use a clever interpretation of the problem that leads to a very simple and elegant solution. My key insight came from seeing an infinite continued fraction problem, which uses a simple technique to define the value of the continued fraction in terms of itself. T...
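
Reading the self-referential idea as conditioning on what happens in the first time step (my interpretation of the hint above), the extinction probability $q$ should satisfy $q = \frac{1 + q + q^2 + q^3}{4}$: the amoeba dies, stays the same, or splits into 2 or 3 independent copies that must each die out. A few fixed-point iterations then pin down the numerical value.

```python
def extinction_probability(iterations=200):
    """Iterate q -> (1 + q + q^2 + q^3) / 4 starting from 0, which converges
    to the smallest fixed point in [0, 1]."""
    q = 0.0
    for _ in range(iterations):
        q = (1 + q + q ** 2 + q ** 3) / 4
    return q

if __name__ == "__main__":
    print(f"Estimated extinction probability ≈ {extinction_probability():.6f}")
```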