Apr/May 2004 Miscellaneous

In the Precincts of Duality

by Rajgopal Nidamboor


The classic doctrine of duality, expounded by Madhvacárya, the propounder supreme of the Dvaita [Dualistic] School of Indian philosophy, is based not only on the utility of analysis but also on the reliability of reason. It is a concept focused across its broad expanse, and systematically so, in defending its tenets. It is both logical and dialectical; realistic, and even idealistic. It is more than a doctrine that believes in the multiple character of ultimate reality, where matter, time, and space are all recognized as interdependent entities. Its principal thrust, apparently, lies in its palpable conception of atoms, where all atoms have taste, color, smell, and touch: one that alludes, most notably, to the fact that atoms may differ in their qualitative structure. To draw an example: air has just one attribute, touch; fire has two, touch and color; water has three, touch, color, and taste; and earth has all four, including smell.

A certain disposition for reflection in repeated branching, or dualities, may likewise rest deep within human nature itself. As the philosopher Protagoras once said: "There are two sides to every question, exactly opposite to each other." Add to that a supplementary dimension of human predilection, and you have a sublime prospect in hand: of corporeal adaptability and its resultant possibility for surmounting such fundamental shortcomings by way of knowledge.

The human proclivity to compress complicated characteristics into a combine of "we-contra-others" should, perforce, be resolved as erroneous in our macrocosm of shades and the perpetual. It is a facet that is more than dangerous in the context of yet another anthropological penchant for assessment: the "we-contra-others" belief that could quite nonchalantly become "righteous-versus-evil." Such a scenario could even smack of eccentricity: of a set oddity, triggered by our own sense of resentment, including ethnic explosion, juxtaposed with a fulsome sheath of oblation. Call it martyrdom, or what you may!

Yes, the accidental and fundamentally optional character of developmental confines has sadly been propelled, and also made "natural," by our inclination to build dualities for the sake of dualities. To draw one example: science versus art, the most distinct among such dichotomies. What's more, thanks to our own fancy for insularity and narrow-minded structures, such a deceptive demarcation becomes amplified into two largely non-communicating sides that develop distinct cultural traditions, which in turn arouse mutual stereotypes and even derision.

Let's now look at the context with a tint of divergence, or a fluid construct. Most altruistic individuals, with a few exceptions, do not have much respect, unlike scientists or technologists, for using pictorial representation in their oration. It is not that they don't appreciate the fact that the written and the spoken language, be it English or a local language, are different. That a select few can read well in public also makes them different: they conveniently mock the darkness in the auditorium even before they begin their "salvo" of words without pictures and/or illustrations.

This only explains why archetypes spin an exceptionally cogent cusp between art and science, with a common ground of methods for tangible artistic innovation and the scholastic probity of combined nourishment for all types of human creativity. They generally motivate accordance for a collective strut. It needs to be understood, therefore, that both dominions meet opposition in education. If only art and science could join hands in common methods of thinking, innovation, and historical achievement, rather than emphasizing our disparate substrates and trying to profit from the differences by playing a zero-sum game at the other's expense, then we might truly hang together rather than hang separately.

Innovations, of course, cannot be neatly slotted into either camp; they can only be understood as a reinforcing unification of goals usually divided between the two realms of Rudyard Kipling's motif, "never the twain shall meet." Aside from that, the standard examples of Leonardo da Vinci and other "Renaissance" personages have been correctly and fairly cited. But our best cases should not be sought in an earlier age that did not recognize our modern disciplinary boundaries, or even possess a word for the enterprise now called "science." If we look instead at 20th-century figures who endured the invectives of skepticism and delusion for working in both domains simultaneously, we can make our case in more immediate terms.

There are a number of examples of innovators, referred to as "artists," who used the tools of their trade to make discoveries that had eluded official "scientists" within their own parochial world. In the 18th century, the Dutch artist and anatomist Petrus Camper, for example, established rules for depicting characteristic differences in the physiognomies of human groups, after he noticed that many Renaissance paintings of the Three Kings had depicted Balthazar, the black magus, as a European painted dark rather than as a native of sub-Saharan Africa. The reason: European artists could find few African models at the time.

At the beginning of the last century, the celebrated American artist and amateur ornithologist A.H. Thayer discovered the adaptive value of counter-shading for making a three-dimensional object fade into invisibility, because counter-shaded organisms often seem totally flat, or two-dimensional, against their background. It was a solution that had eluded scientists! Interestingly, however, it seemed starkly clear to an artist who had spent his life promoting the opposite illusion: making flat paintings look three-dimensional. Additionally, it was another great artist's work that led to important advances in naval camouflage: an idea whose time had come, one that saved innumerable lives in modern armed conflicts. But what could be more precious, or more difficult, than such a conceptual field of study? Indeed! It is, therefore, more than imperative that we access all the tools at our command, even when linguistic and sociological convention parcels out these widespread mental tendencies among non-communicating disciplinary camps, to triumph in this adamantine, yet most rewarding, of all cerebral pursuits.

It is also interesting to note that we are always in awe of the grandeur and immensity of our universe, in spite of the fact that we sometimes tend to forget that it is built bit by bit, through apparently not-so-significant interactions. Call it leela, the never-ending divine play, in which all of us (stars, microbes, leaves, mountains, space, Homo sapiens) are both dancers and the dance, or what you may. One inescapable fact endures: the interconnectedness of things, a unity of vast multiplicity.

The concept of interconnectedness also extends to the human body: the mind has consciousness; the body is simply matter in motion, even though the two may not be distinct. Because it is the mind that moves? To illustrate the idea with an Eastern aphorism:

"Is it the flag that moves, or is it the wind that moves?"
"No," answers the Zen Master,
"It is the mind that moves."

Consciousness, as the noted scholar-physicist Evan Harris Walker avers, is also something that exists in its own right. It may, therefore, be construed as quite distinct from all other objects, processes, energies, and even realities that the physics of science would reveal. More so because, by their terms, physicists do not mean anything that constitutes the substance of what is meant by the word consciousness. It is a complex credo, yes: one that is best grappled with through the comprehension of our Zen mind, especially with our analytical brains, before the endeavor to achieve such abstraction goes beyond our mental capabilities.

All the same, it may just as well be observed that all physical things are measurable, or built on measurable things. For instance, space is quantifiable; time is measurable. However, we cannot hold consciousness to a similar yardstick. The characteristics of consciousness, such as pain, cannot be measured directly by any device known to science. If this isn't a dualistic postulate, what is? More so because consciousness, though non-physical, is real: it exists. It also unifies and constrains us all as individual beings. What's more, it orders space and time out of chaos and random events.

What about knowledge, or understanding? Madhvacárya, for one, to use a classical premise, thought of knowledge as being relative, not absolute. In so doing, he spurned the Universal as a natural consequence of a principal sense of belief, or of the uniqueness of a particular person or thing. To know a thing, said Madhvacárya, is to know it as distinct from all others in a general sense, and from some in a specific way. Mere appearance, Madhvacárya also held, wasn't reality, while objective experience was: a theme that Immanuel Kant espoused much later. Not only that, Madhvacárya also maintained that if things are transient and ever-changing, it does not mean they are not real. And, so he opined, every new relation changes or modifies the substance to some extent: greater in some, less in others.

As Fritjof Capra, one of the world's foremost theoretical physicists, puts it in his landmark book The Web of Life: "[This] new paradigm implies that epistemology—understanding of the process of knowing—has to be included explicitly in the description of natural phenomena..." Reason? Systems, according to Capra, are all interdependent. They also encompass a web of relationships, including nature, with a corresponding network of concepts and models, none of which is any more fundamental than the others. This novel mode of thinking, Capra also contends, recognizes that all scientific concepts are limited and approximate, and that science can never provide any complete, definitive, or total understanding. According to Capra, the process of living brings forth not the world, but a world: one that is always dependent on interdependent structures, including the genetic information encoded in the DNA.

Do I dare
Disturb the universe?
I have known them all already,
known the evenings, mornings,
afternoons,
I have measured out my life
with coffee spoons.
 
      —T.S. Eliot

To take a dekko at yet another construct, language, or language as metaphor: Language, it need not be over-emphasized, arises from our frame of reference, from how we position what we want to communicate. The deduction is obvious. If our frame of reference is to become more efficient, our language needs to reflect that, like a mirror. In the past, the matrix of reference was that of a presumed, objective, pre-existent reality; language reflected and imitated such a reality, and even duality. Today, language is increasingly characterized by a major emphasis on rethinking its nature, role, and function.

Language, according to the German philosopher Martin Heidegger, conveys not only information but also commitment. Heidegger always reckoned that language was not about describing a separate world that existed "out there." His contention, in today's context, is clear: if we were to eliminate language, the ubiquitous computer a worker assembles would be reduced to a nonsensical object. To quote Fernando Flores, a Heidegger protégé: "A human society operates through the expression of requests and promises."

That's not all. Complexity thinkers Howard Sherman and Ron Schultz, for instance, contend that many organizations and individuals use language as simile in the form of narrative, or storytelling. "These language-based ways of communicating and feeding back information," they further emphasize, "are in line with pragmatic philosophy." They argue that while business and management, for example, may not have truly loved stories, science's novel attempt to reduce language to its simplest form, the equation, has now helped it rediscover the power of story to convey ideas.

Flashback: the early works of philosophy and knowledge were all written in a language that was almost metaphorical, mythological, and even evocative. Plato, for instance, didn't write treatises; he wrote dialogues. And Madhvacárya, in a totally different epoch, wrote treatises, not dialogues. Because language as a tool of communication had matured, and there was a deliberate effort to liberate it from mythology and metaphorical allegory? Yes. Not only that: language had also begun to construct a more profound, cogent grammar, not to speak of the processes of analysis and the conditions of intelligibility for all discourse.

It goes without saying that language is a response to needs that arise from contact with objects. It is subjective, too. Suppose it were the other way around: a language modeled first, for the world to fit into? Just think. Fortunately, in practice, no such paradox arises, for one simple reason: there is no means of representing thoughts outside language, except, of course, with the help of some other means of expression, like art, or calculus, which is also a form of language, no less. So, there it is: one primal reason why everyone is familiar with that timeless idea, that it's your language that determines the way you think! Language, as one wise soul put it, is an instinct, an innate power of the mind. And silence? It is more than a part of speech. It is a void of mystical experience, not just formed by language but also induced by it.

There is more to language than meets the eye, ear, or mind. If Aristotle, to cull a classical model, reduced language to its essences, opening a whole new possibility for the power of language, Galileo, for one, made the distinction between primary and secondary qualities: the one possessing geometric properties, the other not. The rest is history, what with the likes of Descartes, Newton, et al. "masterminding" some incredible things in comparison to their predecessors. Which also brings us to a notable allegory: today's thought returning to early sources.

Language today seems to have struck a profound balance, a wonderful analogy between Descartes and Einstein. What's more, our language today does not, in any way, reduce anything. Instead, it now includes everything, scientific or not. Maybe we have to accept that there is a very peculiar dualism at work here; yet its profound analogy is immanent. We now talk of concepts. We talk of ideas. We talk of experiences, behaviors, sensations, intentions, feelings, et cetera. Most importantly, we also communicate ideas through our experiences and stories. And the idea, language as similitude, is doubtless critical to us all. It is, in other words, language's very own monumental feat of virtuosity, or vitality, and a magical carpet of the narrative.

This brings us to a vital chapter from one of the most persuasive books of the last century, The Structure of Scientific Revolutions, by Thomas S. Kuhn, who bridged the developmental cleft between visual illustration and hypothetical representation when he used the famous gestalt illusion of the duck-rabbit as a fundamental emblem for the definition and character of scientific transformation. In his own words: "It is as elementary prototypes for these transformations of the scientist's world that the familiar demonstrations of a switch in visual gestalt prove so suggestive. What were ducks in the scientist's world before the revolution are rabbits afterwards."

It is, in sum, a perfect illustration of duality as a wide postulate or cosmological scheme for the emergence of the world and the word—as we know them now—as reality.

 
