Of course 0.9 repeating doesn't equal 1. That's because 0.9 repeating equals 0.9 repeating. :)-

0.999... ≠ 1

Don't get me wrong, it's pretty darn close to being 1, but not quite. Think about a doctor telling you that you have some of the most poisonous venom in your body. So toxic is this venom that just a smidgen, an infinitesimally small amount, will certainly kill you. The doctor was able to get 0.999... of it out of your body. Unfortunately, because of this you're now dead. I'd like to know that my doctor got out 100% of the venom instead of 99.999...% of it. Wouldn't you? I'd hate to die knowing that the doctor got out just about all of the venom except for some incalculably small amount that he wasn't able to extract from my body.
But what is 1-0.999...? It's 0.000..., no?
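One way to make the "0.000..." answer precise (my gloss, in standard notation, not part of the original comment): cut the string of nines off at n digits and watch what's left over.

```latex
1 - \underbrace{0.99\ldots9}_{n \text{ nines}} = 10^{-n},
\qquad
\lim_{n\to\infty} 10^{-n} = 0,
\quad\text{so}\quad
1 - 0.999\ldots = 0 .
```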
I'm sure that's what mathematicians would like for me to think. lol

It seems to me that there has to be a 1 (not a whole unit, but a 1 that would make the 0.9 repeating complete, if that makes any sense) that has to go on top of the 0.9 repeating. Methinks that 0.999... is missing a certain, though extremely small, amount of something to make it a complete 1.

So, if you gave me 1-0.999... as a question on a math test, I could not put 0 down for the answer. I'm thinking that there has to be something there other than literally zero.
Then, all your intuition aside, you'd be at least one right answer short of a perfect test.

To say that 0.9 repeating forever (forever, not for any finite number of digits but for infinitely many) does not equal one would make many mathematically sound proofs unsound. I think in math that's what one would call a "bad thing."

A real-world example like poison doesn't apply here, because calling something infinitesimally small in real life just means it's really, really small. In math, as I understand it, there's no difference between infinitesimally small and zero. For any amount of anything that's larger than zero, saying a 'negligible' amount would be more accurate and helpful. I'm not really sure the term infinitesimal should ever be applied as if it could describe an extant amount of anything; I'm pretty sure even in my proof courses in college we always had to say that some number became infinitesimally or arbitrarily small, not that the amount was infinitesimal.

There are a lot of smart people in my life who didn't get it at first, and one or two who still don't. It's very counterintuitive. But that doesn't mean it's not true.

The paper looks awesome; I'm going to read it this weekend.
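For what it's worth, one of the sound proofs alluded to above is the standard algebraic one (a textbook sketch, added here for reference):

```latex
\text{Let } x = 0.999\ldots \text{ Then } 10x = 9.999\ldots,
\text{ so } 10x - x = 9.999\ldots - 0.999\ldots = 9,
\text{ hence } 9x = 9 \text{ and } x = 1 .
```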
The most obvious issue is that if 0.999... is not 1, then by the same token 0.333... is not 1/3.
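Spelled out, the parallel is the standard one-line argument (a common textbook step, not anything new):

```latex
\tfrac{1}{3} = 0.333\ldots
\;\Longrightarrow\;
1 = 3 \cdot \tfrac{1}{3} = 3 \cdot 0.333\ldots = 0.999\ldots
```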
Fine by me, we can use any word mathematicians deem appropriate. We'll use diminutive!

Dan Lower / KKairos notes, "if 0.9 repeating forever does not equal one would make many mathematically sound proofs unsound. I think in Math that's what one would call a 'bad thing.'"

No matter how true the above may be, it does not prove that 0.9 repeating equals 1. The above would be a fallacious argument, argumentum ad consequentiam (0.9 repeating must equal 1, or else several mathematical proofs would now be unsound). However, I'm sure mathematicians have proofs that 0.9 repeating = 1, though I'd dispute those, too :)-. I'm just playing mathematician's devil.

Professor Pruss, I don't find 0.333... = 1/3 hard to accept, unlike 1 = 0.999... Come Thanksgiving Day, I'm going to ask for a third of the pie. I'd hate to be the one who has to cut the pie. lol

The question I have is: what is the smallest number closest to zero? Whatever answer you give me is what I would put down for the question 1-0.999... = ?.
NOTE: I think a teacher should ask their students whether they agree that 0.333... = 1/3. After that has been established, then ask whether they believe 0.999... = 1. If they say no to this, you [the teacher] can simply point out that they [the students] are being inconsistent.

I'm curious: what if one stated that any infinitely repeating decimal makes no sense, and is therefore void of any meaning? Couldn't one say that 0.999... or 0.333... etc. isn't actually a number? It's just a mathematical device/tool/technique/trick, with no actual content behind it? I'm no philosopher or mathematician, but could one say that such numbers have no ontological status?
Well, if 0.999... = 1 and 0.333... = 1/3, then the question whether they exist comes down to the question whether 1 and 1/3 exist.
Jarrett:

"Come Thanksgiving Day, I'm going to ask for a third of the pie. I'd hate to be the one who has to cut the pie."

I hate dealing with thirds of circles. In my DIY astronomy stuff, I from time to time have to do things like drill three holes equally spaced around the perimeter of a cylinder. My best method is to wrap a long wide strip of paper around the edge, mark where it joins up with itself, measure that length, divide it by three, mark off the thirds, wrap the strip around the cylinder again, and then mark on the cylinder where the 1/3 markings on the strip were. This works pretty well. You can try it with the pie. You'll also need to mark the center point of the pie, so you can cut from the 1/3 marks around the edge to the center. (For a more regularly shaped circle, there is a clever way of finding the center by using any object with a large right angle.)

Another option, if you have a compass, is to first find the center of the pie somehow. Then set the compass to the radius of the pie, hold that distance, and use the compass to mark off six lengths equal to the radius around the edge of the pie. Basically, you're inscribing a hexagon in the pie. Now you have six equally sized pie slices marked off, and you can just cut a double-sized one.

Perhaps a simpler solution to your problem is to use Inkscape (or Adobe Illustrator) to draw a circular template the diameter of the pie, with the three slices marked out every 120 degrees. Print it out, cut it out, and then mark the centers of the pie slices and the points on the edges of the pie. If the pie is bigger than the piece of paper, the drawing will be cut off, but it still might be good enough for marking the 1/3 slices.

I think the Inkscape template method is best suited to an object as irregular as a pie--you can figure out how to place the template over the pie by eye. But it may not work so well if the pie has no top crust, as the template will stick.
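The paper-strip arithmetic above can be sketched in a few lines (the function name and the sample length are my own inventions, purely for illustration):

```python
# Sketch of the strip method: given the measured length of the strip
# (i.e., the circumference), compute where along the strip to put the
# marks for n equal slices. The start of the strip counts as mark 0.

def slice_marks(strip_length: float, n: int = 3) -> list:
    """Distances from the strip's start at which to mark n equal slices."""
    return [strip_length * k / n for k in range(1, n)]

# A 36 cm strip gets marks at 12 cm and 24 cm; wrapping the strip back
# around and cutting from those edge marks to the center gives thirds.
print(slice_marks(36.0))
```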
Maybe doing the marking when the pie is well-refrigerated will work then. Or maybe you can put some parchment paper between the pie and the template.

It also wouldn't be so hard to make a little web service that auto-generates PDF files with pie-cutting templates. :-)
I deleted the above comment; I wanted to add one piece:

Fantastic! Mr. Pruss, you seem to have a good grasp on this, so I'll let you cut the pie. lol I like pecan pie. :)

Is it even possible to precisely cut a pie, or anything else for that matter, down to a decimal that repeats infinitely? I guess my problem is that I'm thinking of these repeating decimals as being dynamic, and not static. Take, for example, the number 2 (or any number without infinitely repeating decimals). You tell me the number 2 and I'm able to picture several different objects there being 2 of: 2 muffins, pies, cookies, etc. You tell me 0.999... or any other repeating decimal, and I can't make sense of it. (Thinking about it, negative numbers don't make sense to me either. How can one have a negative something? We'll worry about that at another time.)

Consider this: I'm baking a cake, and I need 0.222... cup of flour. How in the world do I measure that? Is it even possible not to go over or under the 0.222... limit? I'm thinking no. Because of this, I'm thinking such repeating decimals are void of any real meaning. It seems to me that they are just abstract ideas lacking any true content.

Oh, if you haven't noticed, I like food. lol I guess I have problems with anything infinite (quantitatively speaking; I'd hate to have problems with God :). Either I'm possibly on to something, or I'm a lost cause. It might be the latter. :(
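As it happens, 0.222... names an exact, static quantity: the same standard trick used for 0.999... identifies it (a routine computation, added here for reference):

```latex
x = 0.222\ldots \;\Rightarrow\; 10x = 2.222\ldots
\;\Rightarrow\; 10x - x = 2 \;\Rightarrow\; x = \tfrac{2}{9} .
```

So "0.222... cup of flour" is just 2/9 of a cup.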
The problem is that 0.999... is not a number of any kind. But the confusion begins with the idiot mathematician Euler. The formula he uses (which was known by Euclid) to support his ill-formed definition of "infinite sum" is the root cause of the problem.

Add to this the fact that most academics do not understand limits or even know the difference between a magnitude and a number, and you have chaos.

http://thenewcalculus.weebly.com