I think your difficulty with that analogy lies in a slight misinterpretation of its meaning.
Now, don't get me wrong, I'm not trying to insult you; I've never even met you.
But the way you describe it... that's not quite how it's meant to be interpreted.
I'll start at the beginning and summarize... I apologize if you already know some or all of this, but at least I won't be backtracking later.
Scientists developed a model of a deterministic universe, widely pushed by Newton, who developed laws which, in the span of a few sentences, accurately described thousands of complicated physical processes. The idea is that the universe is rather like a game of billiards: if you hit the ball the same way each time, it'll roll in the same direction. Hence, if your initial conditions are precisely the same, the outcome will always be the same.
Makes logical sense, right? If the golf ball is on the same blade of grass, all the weather is identical, and you hit it with identical speed and force, it should land in precisely the same place every time, 400 yards down the course.
Now here's the issue: you can never measure your initial conditions to infinite accuracy. To do that, you would need, say, a thermometer with an infinite number of decimal places; any real instrument leaves a tiny bit of room for error, no matter how small, maybe a millionth of a degree.
For a while, people believed that if your measurement of the initial conditions was more accurate, your prediction of the outcome would be proportionally more accurate. If you could measure the force with which you hit the golf ball to 50,000 decimal places, you could more precisely determine where it would land.
Scientists went on believing this until that arse Poincaré came along and tried to analyze three planetary bodies, all interacting with one another. He realized that even if he took insanely precise measurements of the planets and all the initial conditions, he could not accurately predict how they would move in relation to each other. The reason was that if his initial conditions were off in some far-distant decimal place, the resulting outcome was ENORMOUSLY different from what he predicted. If he made his measurements of the initial conditions 100,000x more accurate, the end result would STILL be way, way different if you went a billion decimal places down the line and changed a 2 to a 3.
He figured out that in some systems, minute changes in initial conditions grow to have an enormous effect in a tiny amount of time. The uncertainties (and there will always be uncertainties; no instrument can measure to infinite decimal places) will always overwhelm any calculation and defeat the accuracy of your prediction. No matter how tiny these uncertainties are!
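If you want to see that for yourself, here's a minimal Python sketch. It's not Poincaré's three-body problem (which is messy to simulate), just the logistic map, a standard textbook chaotic system, but the behavior is the same: two runs that start a billionth apart end up in completely different places within a few dozen steps.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map (a textbook chaotic system, NOT Poincare's actual
# three-body problem -- this is just a toy stand-in).
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a = 0.4            # the "true" initial condition
b = 0.400000001    # the same measurement, off in the 9th decimal place

for step in range(1, 51):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```

Run it and watch the gap column: it roughly doubles every step, and by step 30 or so the two runs have nothing to do with each other, even though they started out identical to eight decimal places.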
This is where the butterfly effect comes in. It's just what I said it was, an analogy; the image comes from the meteorologist Edward Lorenz: "Whether a butterfly does or does not flap its wings can determine whether a storm will arise, a year later, on the other side of the world."
The butterfly is the billionth decimal place of uncertainty. The weather system is the chaotic system.
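And that's not just a figure of speech. Lorenz ran into this with an actual toy weather model, three coupled equations. Here's a rough sketch of it in Python (crude Euler integration, good enough to show the divergence but not for serious numerics; sigma, rho, and beta are Lorenz's classic parameter values, and the tiny nudge to y is hypothetical, just playing the role of the butterfly):

```python
# Lorenz's 1963 toy weather model. Two runs, identical except that
# one variable (y) is nudged by one part in a billion -- the "butterfly".
# Crude Euler integration; fine for a demo, not for serious numerics.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

run1 = (1.0, 1.0, 1.0)
run2 = (1.0, 1.000000001, 1.0)   # one variable, off in the 9th decimal

for step in range(1, 4001):
    run1 = lorenz_step(*run1)
    run2 = lorenz_step(*run2)
    if step % 1000 == 0:
        gap = abs(run1[0] - run2[0])
        print(f"t={step * 0.01:5.1f}: x1={run1[0]:8.3f}  x2={run2[0]:8.3f}  gap={gap:.2e}")
```

Same story: within a few tens of time units, the nudged run is doing something completely different from the original.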
So I see what you're saying, but if you have a MILLION tiny variables, each SINGLE one of them has the power to change your prediction into something completely different. The point of the butterfly effect is that even if you hold EVERYTHING else the same, ONE butterfly flapping its wings is enough uncertainty to throw your calculation off by orders of magnitude.
Goddamn this is a long post.
Does it make more sense now?