I have a bleg for you all... or at least all of you who know a lot of probability.
I'm going nuts with the following problem. I'm sure that it can be solved, but it's not in the probability books that I have.
Suppose you're in Vegas, and they have a perfectly shuffled, infinite deck of cards (i.e., the probability of getting any one specific card is ALWAYS 1/52, no matter how many cards you've drawn).
How many cards would I have to draw so that I could have an X% chance of collecting an entire deck? Obviously, no number of cards will guarantee an entire deck - for example, it's possible you could keep drawing the two of clubs over and over again, never collecting all of the different cards. But drawing a certain number of cards should give me a certain probability of having an entire deck in my hand.
It seems at first glance to be similar to Bernoulli trials - independent trials, each with a fixed chance of success or failure - but here the probability of success (i.e., getting a card you don't already have) depends on how many successes you've had in the past, which means it's not Bernoulli...
Brute forcing it (i.e., what's the probability that it happens in 52 draws? That's 52!/(52^52). Okay, 53 draws? 54 draws? Add them up until you get the X% chance you want.) seems to get complicated VERY quickly, because there are a lot of different ways you could draw 52 different cards in 55 (or whatever) draws.
Any help would be much appreciated.
EDITED TO ADD:
So for some reason Excel didn't want to recognize the little function I wrote in VBA to solve this. I still don't know why, but I was able to build it up from 1_P_1 using Sal's definition in the spreadsheet itself, without any actual programming.
I even created a neat 3D graph to display the answer.
The Z axis is the probability that you will have X unique cards (the X axis runs from 0 to 52) after drawing Y cards (the Y axis runs from 0 to 350) from an infinite, perfectly shuffled deck.
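I don't have Sal's exact definition in front of me, but the kind of build-up the spreadsheet does can be sketched with the standard recurrence (names are mine): if P(y, x) is the probability of holding x unique cards after y draws, then P(y, x) = P(y-1, x-1)·(52-x+1)/52 + P(y-1, x)·x/52, starting from P(0, 0) = 1.

```python
def unique_card_dist(draws, k=52):
    """Return a list p where p[x] is the probability of holding
    exactly x distinct cards after `draws` draws with replacement
    from a k-card deck."""
    p = [1.0] + [0.0] * k  # 0 draws: certainly 0 unique cards
    for _ in range(draws):
        new = [0.0] * (k + 1)
        for x in range(k + 1):
            # stay at x unique cards: the new draw is a repeat, prob x/k
            new[x] += p[x] * x / k
            # step up from x-1 to x: the new draw is a new card, prob (k-x+1)/k
            if x > 0:
                new[x] += p[x - 1] * (k - x + 1) / k
        p = new
    return p
```

The row for a given number of draws is exactly one slice of the 3D graph, and the last entry, `unique_card_dist(y)[52]`, is the chance of having the whole deck after y draws.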
Thanks everyone!