Aswad -> RE: Indoctrination (12/2/2012 5:39:37 PM)
quote:
ORIGINAL: GotSteel

You should perhaps contemplate the possibility that you're having an emotional reaction because you don't want to hear what I'm saying.


In an effort to bridge the gap, I did precisely that before posting, but thanks for reminding me anyway. It is important to consider bias and affect when one either disagrees with a point of view or fails to see the sense behind it, I know. That's why I usually do just that.

I think I've narrowed it down, in this case, to an error on your part: most likely either a failure to engage (e.g. for the same reason you suggested I contemplate) or a failure to comprehend (i.e. not seeing my point at all). My models of the trains of thought that could lead to what you posted are consistent with both possibilities.

Quite simply put, we're talking about two entirely different things. If you're unable, rather than unwilling, to recover my point from what I've already said, it would probably take an unjustifiable amount of time and effort to lead you to it. If, on the other hand, it comes down to being unwilling, then there's nothing I can do about that, nor anything I'd want to do. If you're both willing and able, I have spoken with other posters who had no problem seeing what was meant, so you should already have what you need. I don't mind continuing this part of the conversation, but you're going to have to reconnect for it to be worth my time.

By the way, I don't expect you to trust things I say that you can't verify yourself. But I do expect you not to expect me to invest a lot of time in proving something just because you can't verify it for yourself, when it seems to me that you're not even trying (which may or may not be the actual case, but that's immaterial as regards expectations). Make whatever investment you expect me to make, and I shall try to return the courtesy.

quote:
You're talking about axioms (self-evident truths); it doesn't get much more self-evident than being able to find your own ass. You shouldn't even have to use both hands.


Under about a million and one assumptions, the location of the ass is hard-coded into your neural fabric, and in terms of self-evident truth it is meaningless, as there are a number of things hard-coded into your neural fabric that have no meaningful correlates in the real world. Dropping a few assumptions, you could be stuck in the frickin' Matrix, firing those same neurons in interaction with the machine to give you the illusion that you've found your ass, while in fact you've never found, seen or felt your ass, as your mind is entirely without any route to accessing it. Sure enough, it would seem meaningful that your mind and the simulation agreed, in the context of your simulated life, but that's as far as it goes.

At the bus stop today, some kid had written: LOL + LOL = LLOOLL. To understand how funny that is, it's worth realizing that our kids will emphasize a LOL by pronouncing it more slowly, which ends up being written with the letters doubled, since they geminate the sounds in doing so. And so, yes, under a certain set of whatchamacallits (you didn't like calling them axioms, after all), adding the two LOLs results in LLOOLL. Under a different set of whatchamacallits, the result would be LOLLOL. And under many, the notion of adding two LOLs is nonsensical altogether.

Now, Wikipedia has, among other things, this to say about axioms: «As used in modern logic, an axiom is simply a premise or starting point for reasoning. Axioms define and delimit the realm of analysis. In other words, an axiom is a logical statement that is assumed to be true. Therefore, its truth is taken for granted within the particular domain of analysis, and serves as a starting point for deducing and inferring other (theory and domain dependent) truths.»

For instance, simplified, if we postulate (set forth the axiom) that lives have equal value, and that conserving life is the foremost moral imperative, then we might, in the domain of ethics, deduce that the most ethical measure is the one that conserves the most lives in total, though there are obviously a ton of other axioms behind even this absurdly simple example that I didn't list. But if we postulated, instead, that ending lives is the foremost moral imperative, then we would similarly deduce, with equal correctness, that the most ethical measure is the one that kills the most people, with the same caveats. Without a choice of axioms, postulates, or whatever else you prefer to call them, rationality itself cannot determine the correctness of either conclusion.

And before you start submitting things like survival as being self-evident, I would like to point out that I have explored many configurations that permit us to thrive at a species level, and to build functioning civilizations, but which you would no doubt consider entirely abhorrent. Hell, even history has a few examples of this. The notion that there is an objective basis on which to build is one that's been thoroughly refuted time and time again, even when one isn't committing the naturalistic fallacy in the process.
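If it helps to see that mechanically, here is a toy sketch in Python, entirely my own illustration and nothing resembling a proof; the names and numbers in it are made up. The same selection procedure, handed two different sets of givens, returns opposite answers, and the kid's LOL arithmetic comes out either way depending on which definition of "addition" you adopt.

# A toy sketch, entirely my own: the same "deduction" machinery run
# under two different sets of givens. Illustrative names and numbers only.

def most_ethical(measures, value):
    """Return the measure the chosen axioms rank highest."""
    return max(measures, key=value)

# Candidate measures and the number of lives each one conserves.
measures = {"measure A": 10, "measure B": 100, "measure C": 1}

# Axiom set 1: conserving life is the foremost moral imperative.
conserve_life = lambda m: measures[m]
# Axiom set 2: ending life is the foremost moral imperative.
end_life = lambda m: -measures[m]

print(most_ethical(measures, conserve_life))  # -> measure B
print(most_ethical(measures, end_life))       # -> measure C

# The kid's arithmetic, under two different whatchamacallits for "+".
def add_by_gemination(a, b):
    # Adding an identical LOL just emphasizes it: each letter is doubled.
    return "".join(ch * 2 for ch in a)

def add_by_concatenation(a, b):
    return a + b

print(add_by_gemination("LOL", "LOL"))      # -> LLOOLL
print(add_by_concatenation("LOL", "LOL"))   # -> LOLLOL

The point, of course, isn't the code; it's that nothing inside the procedure tells you which value function to feed it. That part has to come from outside.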
Yes, what you've been reared with is so familiar that it provides an illusion of self-evidentiality, but in the end, whatever you want to do is going to come down to arbitrary givens, hardwired instincts and nothing else, with rationality providing only a means to effectively pursue what derives from a source other than rationality.

quote:
There's no good reason to reject all the evidence we do have about reality and buy into the conspiracy theory of solipsism; there's no evidence for solipsism whatsoever. If you want to call that an axiom, fine, whatever floats your boat, but that word doesn't validate claiming any old thing as a self-evident truth. Unsubstantiated supernatural speculations aren't even in the same ballpark.


We have precisely zero evidence about reality. We have evidence about our own experiences, and it's practical to assume those hold water in some way, even though they really don't. We also have precisely zero evidence about solipsism, for or against, and I have no gripe with anyone choosing to believe in it, or not believe in it. In fact, our reality is about as regular as a simulation, and there really isn't any a priori reason to assume reality has to be regular or consistent, so some of the basic predictions of any theory of solipsism do at least pan out, though that still offers no evidence either way (because there's also no reason to assume reality wouldn't be regular or consistent).

Insofar as we have any evidence, it says our perceptions are limited by our neural architecture and our mental state. We also know we're working to find ways to interact with the brain directly, bypassing what connections it has to the outside world, to deal with people that have been "locked in" by strokes, accidents and the like. Which means we know that in a few decades, at most, our own technology will permit someone to be raised in an entirely virtual environment, should we choose to do so, further highlighting the limits of our perception and our concept of reality.

One of the funny things I learned during my one drug-induced episode of psychosis is that my ability to distrust my own assumptions was crucial to identifying the psychotic state, dealing with it while it was in progress, and working my mind out of it without the aid of neuroleptics, ending most of it in fifteen minutes and the rest in half an hour, when it should have lasted at least eight hours, probably two days or more. That's precisely because I don't implicitly trust that anything I believe myself to know has any actual truth to it. When my mind fabricated a false reality, I tested it, found it to be inconsistent, dissected it and dispelled it. The docs confirmed it, and have no idea how I did it. To me, it was obvious. That's how versed I am in rational, systematic thinking.

As a child, I heard we can't consciously control our ear lobes, so I had a look at a muscle chart and tracked down the area of my sensorimotor self-image that covers those muscles in front of a mirror, in order to learn how to do it. I did the same thing for pulse rate, emotional states, etc., again because others told me I couldn't. I've never been fond of having "truths" handed to me. I like to challenge them. To ponder and think outside the box. And I do so in an exceedingly systematic, rational manner. That's what people hire me for: what seems like magic to them is common sense to me, and I can sort it out for them.

Unfortunately, I can't always explain everything to everyone. Given enough time, I can break down arbitrarily complex ideas into something that the target person can follow, of course, but just like writing up a set of guitar tabs won't give you an ear for music, or teach you improvisation, having me break things down doesn't help people understand something if they don't have what it takes.
I like to credit people with having what it takes, and so I assumed you hadn't been applying yourself. If you have, then my breaking it down for you isn't going to do shit for your comprehension.

I don't much like to brag, probably a consequence of Jante Law (a Scandinavian secular cultural phenomenon), but the simple fact of the matter is that I overestimate people: what they can do, what should be obvious to them, when in truth about one in a million humans could do what to me is as natural and intuitive as breathing. Ponder that for a few moments, and something "obvious" might reveal itself: what is "self-evident" to me might not be "self-evident" to everyone else, and vice versa. In particular, the absence of a connection or commonality or other sort of pattern is going to seem "self-evident" to anyone lacking the capacity to see a pattern (whether or not one is actually there; we must allow for that).

Similarly, I figure if you give me a bunch of random noise, I'll find a pattern in it, which is of course just a closest fit to a bounded sequence. I'm aware of the limitation, and aware that the pattern I'm seeing is a local one with zero validity in predicting future output from the generator, that the randomness will average out over time, and that it will keep producing local patterns (in essence, random noise with a flat frequency distribution will have cycles at all scales that are locally indistinguishable from a pattern).

So, no, I don't trust anything, but I make several assumptions, with varying degrees of confidence, constantly refined. I take precisely nothing as self-evident in any absolute sense, and anything as self-evident in the loosest sense. Since it's arbitrary (or at least depends on the person), where you prefer to draw the line isn't really any of my business, except insofar as you care to make it my business... which you consistently do on these threads.

Simplifying reality to the point of incorrectness is an extremely human flaw. Believing in nothing makes it neither less of a flaw nor more of one. When you assert that believing in nothing makes you immune to a flaw you demonstrably have, I am somewhat compelled to point out the same thing I point out when religious folks say their faith makes them immune to such flaws, and in your case the assertion becomes somewhat hypocritical, to boot.

Solipsism is a perfectly workable example, to my mind. To yours, maybe not.

IWYW,
— Aswad.