An artistic idea can only be expressed in a work of art; the idea doesn’t precede the actual work. In other words, no matter what excellent plan comes into an artist’s head before he sets out to work, it’s only in the execution of the creative process that greatness and originality are shown. This means that whatever subject is chosen and whatever style is employed, those factors are independent of the creative moment and consequently they don’t even have to originate from the artist himself. It is perfectly fine to borrow a general concept from someone else, and that does not at all lessen the status of the artist as an original creator.
In fact, no artist has ever been the sole originator of his product. He always finds himself at the endpoint of a tradition leading up to the present time and place, and whether he is deeply indebted to his predecessors or relatively independent of them is of no consequence as to the artistic value of his work.
Suppose you suggested a motif and a painter followed your suggestion. Who would be the artist, you or the painter? The painter, of course, and he alone. The idea you seemingly gave him would not be an artistic idea but merely a source of inspiration, since an artistic idea can only be expressed in an artistic language – in this case by means of paint.
But an inspiration can also be transmitted in the same artistic language: one painting can inspire another, and as long as the essential artistic idea is not copied, the result will be a genuine piece of art.
A copy is the opposite of a work of art, but an inspiration is not a copy, not even as a matter of degree. Art is real when the work itself is original.
Great art is always renewal. It liberates itself from the shackles of its predecessors, rises from the routine of incessant copying and creates true originality.
But this doesn’t mean what you might think. An artist who aspires to greatness doesn’t have to invent new techniques, come up with revolutionary subjects or employ an entirely different style. In fact, great masters of the past have followed in the footsteps of their teachers, sticking to the familiar subject matter and by no means launching any new art movement. But still, they were creators of great originality. How?
Originality doesn’t require the invention of a new genre of art. In classical Athens annual theater competitions were held where playwrights submitted works that had to conform to strict rules. No radical deviations from the standard could be accepted, but within these constraints masterpieces of world literature were created.
Religious art in Medieval and Renaissance Europe saw an endless repetition of the same themes. The “Madonna and Child” kept recurring, and surely there were many uninteresting copies among them, but a few, like the Sistine Madonna, stand out among the most sublime pieces in art history. It is certainly original, but not in an obvious external way.
When there is great originality in a piece of art, it is found in the moment of artistic creation. Anyone can come up with a new subject, paint something that has never been painted before or use a material no one else has used, but that doesn’t make him an original artist – it doesn’t even make him an artist.
The greatness and originality of an artist is asserted when such external matters as subject and material are already given; his genius is shown in the way he makes use of what is handed to him and in his unique way of responding to a particular challenge. He does the same thing differently, and that’s how he is a liberator.
Art is the breaking of rules, but it’s also submission to rules; there can’t be the one without the other. If there were no rules, there would be no rules to break and the product would be irrelevant.
Art is saying something in a different way than has been said before, but the words spoken, the colors displayed or the forms expressed must be recognizable. It’s no use to utter the most profound sentences if it’s done in a language no one understands.
Much of modern art makes that mistake. How are we supposed to understand something if we have never seen anything remotely like it before? But audiences today are easily fooled; they confuse understanding with acceptance, and they are oh so tolerant, which means they think they understand everything.
Art is feeling, people say, and of course, every time we see something strange that we have never seen before, we will feel something – strangeness at least – and the purpose of art seems to have been fulfilled. But what we feel depends on what we understand and if we are unfamiliar with the means of expression, our feeling and understanding will be limited.
Artists often don’t give us the time to learn the language of new forms of art; we don’t get the chance to grasp the new rules before they are broken, and we are left clueless.
Great art breaks the rules in a most subtle way. There’s no need for dramatic shattering of glass and violent attention-grabbing, but sensing the small and important nuances requires practice, and practice is something we are hardly granted in the ever-changing scene of contemporary art.
Classical art is not necessarily more artistic than what is produced in the modern anarchy, but it follows a recognizable standard, and the recurring question “Is this art?” can’t be answered until we have such a standard.
What makes sense in this mad world? Well, there is certainly a logic to this place, or else it wouldn’t revolve so smoothly around the sun and keep itself in position. Nature depends on logic to exist, and everything in it must recognize its proper state and follow its expected course. Animal instincts make perfect sense, and in spite of their lack of reason, animals’ actions are always reasonable. In spite of? No, because of. Only humans, the rational animals, can act contrary to reason; irrationality presupposes rationality.
Human society, that artificial organism, also generally makes sense within its own structure, and when it doesn’t, it will soon disintegrate or fumble around for a new equilibrium. But measured against an external reason, most societies fail miserably: they systematically exclude their own members and leave them to redundancy and distress. If the rationality of the ant hill is the ideal, they are all infinitely far off.
That is madness, isn’t it? But depending on perspective, it’s still within reason.
Down to the level of the atoms of society, the individuals, their rationality or lack of such depends on conformity with the system they belong to. But what system is that? On the one hand they are included in a state that is likely to exclude them, and on the other they belong to the ultimate logic of nature. Which one is more significant?
Perhaps we would expect the ultimate logos of nature to trump the petty expectations of human society, but neglecting all social conventions doesn’t make sense either. A man who chose to live like his Stone Age ancestors in the midst of modern society would rightly be considered mad.
If there is universal reason, there is also universal madness. It is then not a relative quantity, since we are dealing with something absolute. But from a frog’s-eye perspective the view becomes muddier: sometimes it makes sense to conform to the conventions of the pond and sometimes it doesn’t. How do we know? No wonder we go crazy.
Insanity is a misconception of reality. Insisting that there is a pink elephant in the room when no such creature can be observed by anyone else present is probably an indication of insanity. A skeptic may of course object that for all you know it may be the other way around: those who don’t see the pink elephant may be the insane ones. But we don’t really believe that, do we? Still, it is a valid question: what perception of reality is entitled to be called mentally healthy?
It can’t be the majority view since collective madness is sometimes observed. Masses of people may in a burst of changing fashion accept today what was deemed crazy yesterday and sometimes even follow their leader into self-destruction.
Then who is to judge what’s insane? Well, this question misses the point. Who? You, anyone. It’s less important to figure out who is right than to acknowledge that someone is right and then look for what this right may be. There is a reality, and outside of this reality there is insanity.
We all lapse into temporary delusions, and we are constantly guilty of minor misconceptions, but it is to be hoped that we are generally correct about our main sense perceptions; there is no pink elephant in the room.
We generally have to rely on our senses, or else we are sure to go insane, since we would lose all connection with reality. And in spite of all uncertainty and varying states of mind, there is probably a stable core that protects us against outright madness. Even the medically affirmed conditions of insanity probably have a stable basis that contrasts with their delusion: a previous state of realism, or an unaffected part of the mind that sufferers have in common with the rest of us.
It’s a mad world, but something makes sense.
Public discourse notoriously suffers from a lack of clarity about the words and terms that are tossed around. A term with an inherently bad connotation is turned into a word of abuse and used to discredit opponents without defining exactly what the accusation refers to, making it impossible for the targets to defend themselves.
“Racist” is such a word. Who are the racists? Does it only refer to extremists who actually think that one race is biologically superior to another or is it alluding to subtle attitudes that subconsciously penetrate into people so that we are all basically racists? Or is it something in between?
When attempting to answer that question, I can’t just pick my favorite definition and claim that that is what it should mean. A word only has the meaning that the consensus among the speakers of the language gives to it. But in this case, is there such a consensus?
Dictionaries are supposed to reflect this consensus in their definitions, but a quick look in one of them suggests that such is not the case for this term. Dictionary.com says that “racism” is:
1. “a belief or doctrine that inherent differences among the various human racial groups determine cultural or individual achievement”
2. “a policy, system of government, etc., based upon or fostering such a doctrine; discrimination”
3. “hatred or intolerance of another race or other races”
However, if those were really the definitions accepted by most people, the significant race-related problems that exist in Western societies would not be called racism. After all, rather few people believe in racial doctrines, and few governments apart from Nazi Germany and a handful of others have fostered one. Moreover, blatant hatred of other races is probably not so common, since “hatred” is a very strong word.
Still, some phenomenon exists that has to do with negative attitudes toward other races and causes social problems, and we need a word for it. We may very well call it racism, but then we must not at the same time confuse it with other latent definitions, including the ones in the dictionary just cited, that only cover the most extreme forms of racism. If someone is guilty of the kind of subconscious attitudes that probably all of us slip into occasionally, it is very unfair to make it sound as if he were a hardcore Nazi.
Name calling doesn’t solve problems and unclear meanings don’t create awareness.
What are the chances that you were born in the best place on earth? Sure, you are likely to love your birthplace, as it is connected with nostalgic memories of your childhood, but you should be able to lean back for a moment and realize that that dirty old town may not be equally appreciated by a casual observer. When pressed, you will defend its virtues, of course. You will draw attention to the hidden treasures that escape the stranger’s notice and correctly remark that he lacks the necessary experience to acknowledge them. After all, beauty is in the eye of the beholder.
But when it is not a matter of aesthetic appreciation, you don’t really have much of an advantage over outsiders when assessing the virtues of your land of birth. Rather, you are in possession of that disturbing fog called bias. A certain objective distance is required to evaluate the right or wrong of any issue, and that’s exactly what you lack where you are sitting deep inside the trenches of home.
What are the chances that of all countries on earth you were born in the one that has the most reasonable government and whose interests are most worthy of defense? The probability is about 1 in 195 (there being some 195 countries), so the likelihood that you live in the wrong country, objectively speaking, is simply overwhelming. Add to that our psychological inclination to see everything from our own narrow perspective, and the chance that you are right in supporting your country is quite slim.
It would be better if we could counteract our faulty instinct by stressing the possibility of being in the wrong place. You know you are biased in favor of your country, so instead you turn around and tell yourself: My country is probably wrong.
Know thyself, but why? To know your chosen identity? To know that you are good enough or bad enough? To know that you are guilty or not guilty, that your feelings are for real and that you deserve respect? Should you know yourself to pamper yourself, obtain the best comfort on the market and indulge in self-gratifying self-help for modern self-centered selfishness?
Know thyself! It was inscribed on the ancient Temple of Apollo at Delphi and carries the weight of ancient wisdom. The philosophers of old taught us the maxim long before the self-indulging ego became a popular commodity. It is a recognition of the individual but not a mindless celebration of it. It is a tribute to freedom but not the kind of boundless license that is akin to slavery.
Know thyself! You are truly unique; that is your asset, but it is also your limitation, and it couldn’t be the one if it weren’t the other. You can’t be just anything because you are already you. Most things you can’t do well, but in one thing you could potentially excel: in being you.
But how sad; most of us can’t do even that one thing well. We are not good at being ourselves because we don’t know who that is. How are we to know? In our so-called free society, we may be allowed to choose anything and are sometimes even praised for divergent whims, but chances are you are just as far from yourself as any dictatorship could take you.
Since no one can tell you who you are, they say you can do anything, but that is just not true. Of all the hundreds of thousands of potential paths you could take, only a few are right for you. Others don’t know which ones they are, but neither do you, for you don’t know yourself.
Certain things are generally considered interesting, worthwhile, attractive and pleasing by a large group of people at the same time. It is popular and trendy from a broad perspective and for the individual it simply appears to be likable. That is probably what we mean when we use that informal but indispensable word “cool”.
“Cool” is a curious merger of personal and popular taste. You wouldn’t normally call something cool if you knew you were the only person in the world who liked it, and neither would you if everyone else approved of it but not you.
However, even if this is the definition of “cool”, it remains forever impossible to explain what makes something cool. Why is it that at a certain period and in a certain place something becomes so attractive that even critically minded individuals embrace it along with the general masses? Why is it that we follow the trends not just out of convenient submission but because we positively approve of it?
One may try to look for a rational answer to this question, but it would be in vain. One may search for material and socio-economic reasons to explain why it would be rational to adopt a trend, and even employ quasi-Darwinian hypotheses to argue that a certain mode of behavior is necessary for survival. But such attempts are generally useless, for normally no particular advantage can be found in something becoming a fashion.
That being said, everything in the world has a cause, and nothing comes from nothing. Our perception of what is cool is subject to waves of influence that always reach us when we live in a society and notice what is going on around us. As long as we refuse to live in isolation, we must accept the tyranny of coolness, but hopefully we don’t have to submit to it totally. Critical thinking is difficult, but give it a try, even though it’s not so cool.
The bad guys are always terrorists, and that makes it more convenient to fight them. One doesn’t have to get into complicated explanations as to why they are bad; call them terrorists and everyone understands that they must be hunted and killed mercilessly.
This is an old rhetorical trick with lethal consequences: Avoid defining your words clearly and you can tag anything you want to defame. That way the skeptics are disarmed, and you can freely fight your enemies without being troubled by bothersome counterarguments.
The great power of the West has consistently followed this rhetorical strategy, especially since 9/11, and that has allowed it to pursue its various economic and political interests while facing less criticism than would otherwise have been the case. After all, everyone is against terrorism.
It’s just a pity that all those other powers are also picking up this effective rhetoric. China can oppress the Uighurs citing the necessity of fighting terrorism and Turkey is allowed to bully the Kurds, just to mention two examples.
It would have been so much more honest if we could restrict the use of the word “terrorist” to a simple definition we could all agree on, but of course no one voluntarily gives up power, and the power of vagueness is no exception.
Nevertheless, I think we already have an underlying minimum definition that everyone, even the terrorists themselves, would agree on: whoever kills people indiscriminately in order to create general fear is a terrorist.
Extending the definition beyond that minimum is usually done to discredit rather than to describe accurately. Anyone fighting a war for a bad cause will also kill for a bad cause, and so we throw the label “terrorist” at them. The problem is that the other side easily throws it back, since they have a different opinion about whose cause is bad. The result is a war of words that leads to war.
