Will we get f/n=60/400=0.15 again?
I'm guessing it would have to be done before we can write something like f/n=45/400, whatever the result would be. If the experiment is identical then it must have a stable frequency.
If I rolled two dice again I probably would not get 60/400, and therefore the probability of obtaining a sum of 7 from my dice roll probably is not 0.15.
This is true regardless of how exactly ideal the conditions are during a repeated experiment.
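To put a rough number on that, here is a quick simulated sketch (my own illustration in Python, not part of the original thread): several independent blocks of 400 two-die rolls, each counting how often the sum is 7. Since the true probability of a sum of 7 is 6/36 = 1/6, the counts cluster around 67 rather than sitting at 60.

```python
import random

# Hypothetical illustration (not the original experiment): repeat blocks of
# 400 rolls of two fair dice and count how many rolls sum to 7 in each block.
random.seed(1)  # fixed seed so the sketch is reproducible

for trial in range(5):
    sevens = sum(
        1
        for _ in range(400)
        if random.randint(1, 6) + random.randint(1, 6) == 7
    )
    print(f"trial {trial + 1}: f/n = {sevens}/400 = {sevens / 400:.3f}")

# The counts land near 400/6 ≈ 67 (f/n ≈ 0.167) and vary from block to block,
# so getting exactly 60/400 = 0.15 twice in a row would be a coincidence.
```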
Think in terms of a coin toss. If I flipped a coin 1 time and it came up heads, then f/n = 1/1 = 1, and 1 implies that the coin should come up heads every single time, but we know that is silly. If the coin is fair then we know the probability of heads is 0.5. The greater the number N, the closer the observed frequency of heads will be to 0.5.

Is that with identical conditions as well?
It depends on what is meant by 'identical'.
We could create an idealized version of the coin flip in which all the physics are exactly the same. Consider flipping a coin in a vacuum: there is no air or any other particles to interact with the coin, you flip it with the same amount of force and at the same angle, and so on. Under those conditions you would be able to predict exact results; it would no longer be a question of statistics and probability but instead of classical mechanics.
In statistics and the probability of games, we take identical conditions to mean that the same level of uncertainty is present in the experiment. Below I simulate coin flips using a computer; in it I assume the same level of uncertainty for each coin flip by using a random number generator under the assumption that the coin flips are normally distributed.
Look how the relative frequency (probability) of heads changes as we increase the number N.
[Plots of the simulated relative frequency of heads for N = 10, N = 50, and N = 10000]
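The original plots aren't reproduced here, but a minimal sketch of that kind of simulation might look like the following Python, assuming "normally distributed" means each flip is decided by the sign of a standard normal draw; the function name and seed are mine, not from the original code.

```python
import random

random.seed(7)  # fixed seed so the numbers are reproducible

def flip():
    """One simulated flip: call it heads when a standard normal draw is positive."""
    return random.gauss(0.0, 1.0) > 0.0

for n in (10, 50, 10_000):
    heads = sum(flip() for _ in range(n))
    print(f"N = {n:>6}: relative frequency of heads = {heads / n:.4f}")

# The relative frequency wanders for small N and settles near 0.5 as N grows.
```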
I wouldn't go with a pair of seven because I'm not sure how that works with 6-sided dice. Maybe with 4 dice, but that would at least decrease the odds of rolling a pair of 7 in a single roll. But that shouldn't matter, it's the result that determines the relative frequency.
Maybe I don't understand that part, so I'll settle for a pair of 6. Probably scaled down from the OP model. I should probably use one die for starters.
This is good for short-term memory, but that's the best I can do for now.
P₆ = {(5,1),(4,2),(3,3),(1,5),(2,4)}, so there are actually only 5 ways to get a sum of 6 while there are 6 ways to get a sum of 7, hence betting on seven is technically better.
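For anyone who wants to double-check the counting, here is a quick brute-force enumeration (my own illustration) over all 36 equally likely outcomes of two dice:

```python
from itertools import product

# All 36 equally likely outcomes of rolling two six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

ways_6 = [pair for pair in outcomes if sum(pair) == 6]
ways_7 = [pair for pair in outcomes if sum(pair) == 7]

print(len(ways_6), ways_6)  # 5 ways: (1,5) (2,4) (3,3) (4,2) (5,1)
print(len(ways_7), ways_7)  # 6 ways: (1,6) (2,5) (3,4) (4,3) (5,2) (6,1)
print(f"P(sum 6) = {len(ways_6)}/36, P(sum 7) = {len(ways_7)}/36")
```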
Alice, I don't get it. (Those graphs make life easier.) But the part I don't get...
I am not Alice.
Wait, I see, I had to count them. For a probability of 6, P₆={(5,1),(4,2),(3,3),(1,5),(2,4)}; each pair adds up to 6, and each is a possible outcome. I was thinking of 6-sided dice, but this isn't the same.
Oh wait, the B was assigned, but that isn't the case this time. (I'm writing my thoughts to show you how amateur I am at getting this. I'm not quick at it.)
Probability is not easy by any means; it is, however, incredibly rewarding.
There is a trade-off in adding more dice, because despite there now being a greater number of ways to obtain any given sum, there is also a much larger set of possible outcomes as a whole. I won't do the calculations now, but I wonder how the probabilities of different sums scale with increasing dice (a quick enumeration is sketched just below).

Got it.
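Here is that quick look at the trade-off (exhaustive counting rather than simulation, and my own sketch): as dice are added, the number of ways to reach the middle sums grows, but the whole outcome space grows faster, so even the most likely sum becomes less probable.

```python
from collections import Counter
from itertools import product

# For k dice, tally every possible sum and report the most likely one.
for k in range(2, 6):
    sums = Counter(sum(roll) for roll in product(range(1, 7), repeat=k))
    total = 6 ** k  # size of the whole outcome space
    best_sum, ways = sums.most_common(1)[0]
    print(f"{k} dice: most likely sum = {best_sum:2d}, "
          f"P = {ways}/{total} = {ways / total:.3f}")

# The modal probability shrinks: 6/36 ≈ 0.167 for two dice,
# 27/216 = 0.125 for three, and it keeps falling as dice are added.
```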
In the previous result of f/n=60/400=0.15 (let's say it is stable), would we be able to simply use addition and make it
f/n=240/1600=0.15? Or is that cheating? (I have my doubts about this, maybe there's something I don't see.)
On its own that is cheating, and cheating in probability will leave you ignorant, because the outcome is incorrect inferences.
You can, however, do a similar process via computational simulation, much like the one I created above. A simulation is cheating in the sense that it is just made up, built completely on our own assumptions about how we believe something should behave given uncertainty.
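As a toy illustration of that difference (entirely simulated, with an arbitrary seed of my choosing): naively multiplying one observed 400-roll block by four just repeats the same ratio, whereas actually simulating 1600 rolls produces a genuinely new estimate.

```python
import random

random.seed(3)  # arbitrary seed, for reproducibility only

def sevens_in(n_rolls):
    """Count how many of n_rolls two-die rolls sum to 7."""
    return sum(
        1
        for _ in range(n_rolls)
        if random.randint(1, 6) + random.randint(1, 6) == 7
    )

f_400 = sevens_in(400)
print(f"one block of 400 rolls: f/n = {f_400}/400 = {f_400 / 400:.3f}")
print(f"naively scaled by four: f/n = {4 * f_400}/1600 = {f_400 / 400:.3f} (same ratio, no new information)")

f_1600 = sevens_in(1600)
print(f"actually rolling 1600:  f/n = {f_1600}/1600 = {f_1600 / 1600:.3f}")
```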
For instance, in the simulation of coin flips I wrote in the assumption that coin flips are normally distributed, that is, they follow the bell curve. How valid is that assumption? It depends on my goal.
Typically the goal is to fit a simulation to a real-world system's behavior. We observe something uncertain in real life and then create a computer simulation of that thing's behavior, up to a point, so that we can understand its uncertainty with great enough fidelity to predict future outcomes or other properties of that thing's behavior.
This is limited by how complex the thing we are simulating is, and along the way we must always consider how much trust we can appropriately allocate to our simulation.
"It's the result that determines the relative frequency." If we consider f/N, it is f, the number of occurrences of an event, and N, the number of samples, that determine the relative frequency.
Infinite samples (N → ∞) is what gives us the absolute probability of an event; that is, a fair coin flip is 0.5 heads and 0.5 tails.
Is there a machine that can, say, flip a coin under exactly identical conditions (down to the coin's exact starting point) and achieve the same result at various frequencies?
This took long to write, so much thinking. It's easier to get it done faster, eh?
I believe for coin flips that is very possible, and I know that there are magicians who are capable of flipping heads almost every time. Coin flipping is simple, and as such manipulating it is simple. Blackjack is a great deal more complicated but still relatively simple, and though you cannot predict every hand with absolute certainty, you can predict a set of hands x% of the time with absolute certainty, which is why counting cards is actually profitable. A casino can increase the complexity of a blackjack game by adding additional card decks and rules for betting. With enough complexity, counting cards becomes impossible and the odds are back in the house's favor.