Sunday, August 29, 2010
Megalopsychia
Here's an interesting read on how the pagan tradition of megalopsychia interacted with early Christianity (link).
Friday, August 27, 2010
In Defense of the Sialkot Mob
PAKISTAN'S PATHETIC SENSE OF MORAL OUTRAGE
Recently, a video went viral in Pakistan. It showed the brutal lynching of two brothers in a village near Sialkot. The video was shown on almost all news channels, and the Chief Justice was forced into taking action. Political parties jumped in to gain brownie points by riding the wave of anger and horror at the incident.
Dozens of people get killed in Pakistan every day. Some are kidnapped and have their throats slit because the victim's family could not collect the requisite ransom in time. Others are simply shot dead over petty crimes. Anyone hailing from Karachi knows exactly what it feels like to have a gun pointed at you and to be asked for your wallet and mobile phone. On many occasions these gung-ho young mobile thieves shoot their innocent victims if any resistance whatsoever is shown. The sad part is that many young criminals belong to educated and well-to-do families. They might well be hafiz-e-Quran, so there is no guarantee that the next guy on a motorbike stealing your phone isn't one. Kidnapping is also rampant in Pukhtunkhwa, and if you notice a car slowing down near you, you might find yourself in the middle of such a scene. The ironic part is that almost every official in Pukhtunkhwa knows who the kidnappers are, usually holed up in some lawless area, and it is these same officials who guarantee your safe delivery once the ransom is paid.
The justice system in Pakistan is far from ideal. Most robberies in our country are rumored to be the handiwork of police officers, and criminals and our security forces are thought to work together on many occasions. On top of that, most areas in the country hardly have any security presence. People are left to fend for themselves by keeping guns in their homes; Pakistan happens to be a very well armed country where almost everyone has some form of firearm in his possession, most likely to be used in self-defense. The Taliban's initial claim to fame was that they eliminated crime altogether from their areas, which made them immensely popular. Hardened criminals and mafias were hung from posts in the centers of cities. Most people ridicule the Taliban (rightly so) but also forget that these people had a very effective justice system which produced really brilliant results. If somebody kills your family member, pursuing the killer in court is even worse than the suffering you just endured. Lawyers would fleece you out of your money and the judge would delay the hearing for years to come. Meanwhile the killers would threaten you with dire consequences, and if they belonged to some respectable family they would have the decency to offer some sort of blood money in return for withdrawal of the case. And did I mention that probably no one would be willing to testify? Witnesses are just not willing to go to a police station or record a statement in court. Since your anger and your grief are no longer as intense after being fleeced by lawyers, you would very likely accept that blood money and watch the killer of your relative go completely free, probably with a big smile on his face.
Most people accuse the mob in Sialkot of being barbaric, which it probably was, but forget that there are equally barbaric punishments in our own justice system, and even in Islam. There is great controversy surrounding the actual events. Some people say that the two brothers were involved in a robbery and shot three people when they were surrounded by a crowd, and were lynched only when news spread of the death of one of the victims, a man named Bilal. A second victim is still in a coma. A high-level inquiry is going on right now, but if the facts show that these brothers actually were killers, then I would have to defend the mob. Our elites who accuse them of barbarism have never provided these people with a justice system. As long as there isn't a justice system in our country, no one has the right to point fingers and accuse these people of barbarism. In the absence of a justice system, this was justice in its purest form. Criminals in Pakistan have little fear of a system that works in their favor. The people of Pakistan will break these chains of slavery once in a while to prove a point: if justice is not delivered, they are willing to take the law into their own hands. There is a huge deficit of trust in the system.
I have also seen people mixing this incident with the attacks on Ahmedis. There is no comparison between the two except that both were violent incidents. The Ahmedi issue is the result of years of bigotry preached from the pulpit, whereas the Sialkot mob was a direct result of a lack of trust in the police and the judiciary. If an efficient justice system is not devised, especially for crimes that are becoming everyday occurrences, this trust will erode further. There is an atmosphere of fear in Pakistan because of the current wave of crime, and I wish that the honorable Chief Justice, instead of taking suo moto notices, would devise a much broader strategy involving the police to tackle this wave of crime. There are many violent crimes where no videos are ever made, so they never come into the limelight. We need to come out of this shock and start thinking clearly. We should stop accusing Pakistanis of being barbaric. Gross generalizations and suo moto notices do not cure the ill. Justice needs to be served for all those crimes where there is no camera present.
Postscript: I would like to thank the DIG Gujranwala, Mr Zulfiqar Cheema, for his honest work in the area and for the elimination of crime and of many famous mafia and crime bosses in encounters. Controversial, but very, very effective.
I would also like to point out how our moral brigade (the elites) is pathetically immune to the religious slaughter that takes place every day. Just today (2/9/2010), several Shiite processions were bombed and fired upon, killing thirty people in Lahore, Multan and Karachi. Religious shrines are bombed and desecrated and Ahmedis are brutally murdered, but this elitist moral brigade points fingers at Blackwater, RAW and the CIA. It is capable of coming up with the most outlandish conspiracy theories to justify these incidents. They don't go raving mad about such atrocities. It is clear that our madressa system is feeding these terrorists, but not a single person has the audacity to blame them. This elitist moral brigade consists of cowards of the highest order, who listen to people like Dr Israr, who openly calls for the slaughter of Ahmedis. They are fond of Zakir Naik, who pays the highest respect to Yazid. When a sectarian madressa (Laal Masjid) openly propagated violence in the capital, Islamabad, these moral elites came to its defense. Instead of going raving mad about closing all such institutions, they chose to defend them. Frankly, the outrage over the video has nothing to do with brutality or violence; it has everything to do with how naive the people of Pakistan are. The attacks on the Shiite processions in Lahore and Karachi will quickly be forgotten. The dead body parts shown by Geo will fail to produce the least sense of moral outrage. There will be no countrywide campaigns calling for the closure of sectarian madressas and outfits. But if two thieves are slaughtered, then all hell breaks loose. Hypocrisy at its best.
"The Athiestic Delusion" by John N. Gray
An atmosphere of moral panic surrounds religion. Viewed not so long ago as a relic of superstition whose role in society was steadily declining, it is now demonised as the cause of many of the world's worst evils. As a result, there has been a sudden explosion in the literature of proselytising atheism. A few years ago, it was difficult to persuade commercial publishers even to think of bringing out books on religion. Today, tracts against religion can be enormous money-spinners, with Richard Dawkins's The God Delusion and Christopher Hitchens's God Is Not Great selling in the hundreds of thousands. For the first time in generations, scientists and philosophers, high-profile novelists and journalists are debating whether religion has a future. The intellectual traffic is not all one-way. There have been counterblasts for believers, such as The Dawkins Delusion? by the British theologian Alister McGrath and A Secular Age by the Canadian Catholic philosopher Charles Taylor. On the whole, however, the anti-God squad has dominated the sales charts, and it is worth asking why.
The abrupt shift in the perception of religion is only partly explained by terrorism. The 9/11 hijackers saw themselves as martyrs in a religious tradition, and western opinion has accepted their self-image. And there are some who view the rise of Islamic fundamentalism as a danger comparable with the worst that were faced by liberal societies in the 20th century.
For Dawkins and Hitchens, Daniel Dennett and Martin Amis, Michel Onfray, Philip Pullman and others, religion in general is a poison that has fuelled violence and oppression throughout history, right up to the present day. The urgency with which they produce their anti-religious polemics suggests that a change has occurred as significant as the rise of terrorism: the tide of secularisation has turned. These writers come from a generation schooled to think of religion as a throwback to an earlier stage of human development, which is bound to dwindle away as knowledge continues to increase. In the 19th century, when the scientific and industrial revolutions were changing society very quickly, this may not have been an unreasonable assumption. Dawkins, Hitchens and the rest may still believe that, over the long run, the advance of science will drive religion to the margins of human life, but this is now an article of faith rather than a theory based on evidence.
It is true that religion has declined sharply in a number of countries (Ireland is a recent example) and has not shaped everyday life for most people in Britain for many years. Much of Europe is clearly post-Christian. However, there is nothing that suggests the move away from religion is irreversible, or that it is potentially universal. The US is no more secular today than it was 150 years ago, when De Tocqueville was amazed and baffled by its all-pervading religiosity. The secular era was in any case partly illusory. The mass political movements of the 20th century were vehicles for myths inherited from religion, and it is no accident that religion is reviving now that these movements have collapsed. The current hostility to religion is a reaction against this turnabout. Secularisation is in retreat, and the result is the appearance of an evangelical type of atheism not seen since Victorian times.
As in the past, this is a type of atheism that mirrors the faith it rejects. Philip Pullman's Northern Lights - a subtly allusive, multilayered allegory, recently adapted into a Hollywood blockbuster, The Golden Compass - is a good example. Pullman's parable concerns far more than the dangers of authoritarianism. The issues it raises are essentially religious, and it is deeply indebted to the faith it attacks. Pullman has stated that his atheism was formed in the Anglican tradition, and there are many echoes of Milton and Blake in his work. His largest debt to this tradition is the notion of free will. The central thread of the story is the assertion of free will against faith. The young heroine Lyra Belacqua sets out to thwart the Magisterium - Pullman's metaphor for Christianity - because it aims to deprive humans of their ability to choose their own course in life, which she believes would destroy what is most human in them. But the idea of free will that informs liberal notions of personal autonomy is biblical in origin (think of the Genesis story). The belief that exercising free will is part of being human is a legacy of faith, and like most varieties of atheism today, Pullman's is a derivative of Christianity.
Zealous atheism renews some of the worst features of Christianity and Islam. Just as much as these religions, it is a project of universal conversion. Evangelical atheists never doubt that human life can be transformed if everyone accepts their view of things, and they are certain that one way of living - their own, suitably embellished - is right for everybody. To be sure, atheism need not be a missionary creed of this kind. It is entirely reasonable to have no religious beliefs, and yet be friendly to religion. It is a funny sort of humanism that condemns an impulse that is peculiarly human. Yet that is what evangelical atheists do when they demonise religion.
A curious feature of this kind of atheism is that some of its most fervent missionaries are philosophers. Daniel Dennett's Breaking the Spell: Religion as a Natural Phenomenon claims to sketch a general theory of religion. In fact, it is mostly a polemic against American Christianity. This parochial focus is reflected in Dennett's view of religion, which for him means the belief that some kind of supernatural agency (whose approval believers seek) is needed to explain the way things are in the world. For Dennett, religions are efforts at doing something science does better - they are rudimentary or abortive theories, or else nonsense. "The proposition that God exists," he writes severely, "is not even a theory." But religions do not consist of propositions struggling to become theories. The incomprehensibility of the divine is at the heart of Eastern Christianity, while in Orthodox Judaism practice tends to have priority over doctrine. Buddhism has always recognised that in spiritual matters truth is ineffable, as do Sufi traditions in Islam. Hinduism has never defined itself by anything as simplistic as a creed. It is only some western Christian traditions, under the influence of Greek philosophy, which have tried to turn religion into an explanatory theory.
The notion that religion is a primitive version of science was popularised in the late 19th century in JG Frazer's survey of the myths of primitive peoples, The Golden Bough: A Study in Magic and Religion. For Frazer, religion and magical thinking were closely linked. Rooted in fear and ignorance, they were vestiges of human infancy that would disappear with the advance of knowledge. Dennett's atheism is not much more than a revamped version of Frazer's positivism. The positivists believed that with the development of transport and communication - in their day, canals and the telegraph - irrational thinking would wither away, along with the religions of the past. Despite the history of the past century, Dennett believes much the same. In an interview that appears on the website of the Edge Foundation (edge.org) under the title "The Evaporation of the Powerful Mystique of Religion", he predicts that "in about 25 years almost all religions will have evolved into very different phenomena, so much so that in most quarters religion will no longer command the awe that it does today". He is confident that this will come about, he tells us, mainly because of "the worldwide spread of information technology (not just the internet, but cell phones and portable radios and television)". The philosopher has evidently not reflected on the ubiquity of mobile phones among the Taliban, or the emergence of a virtual al-Qaida on the web.
The growth of knowledge is a fact only postmodern relativists deny. Science is the best tool we have for forming reliable beliefs about the world, but it does not differ from religion by revealing a bare truth that religions veil in dreams. Both science and religion are systems of symbols that serve human needs - in the case of science, for prediction and control. Religions have served many purposes, but at bottom they answer to a need for meaning that is met by myth rather than explanation. A great deal of modern thought consists of secular myths - hollowed-out religious narratives translated into pseudo-science. Dennett's notion that new communications technologies will fundamentally alter the way human beings think is just such a myth.
In The God Delusion, Dawkins attempts to explain the appeal of religion in terms of the theory of memes, vaguely defined conceptual units that compete with one another in a parody of natural selection. He recognises that, because humans have a universal tendency to religious belief, it must have had some evolutionary advantage, but today, he argues, it is perpetuated mainly through bad education. From a Darwinian standpoint, the crucial role Dawkins gives to education is puzzling. Human biology has not changed greatly over recorded history, and if religion is hardwired in the species, it is difficult to see how a different kind of education could alter this. Yet Dawkins seems convinced that if it were not inculcated in schools and families, religion would die out. This is a view that has more in common with a certain type of fundamentalist theology than with Darwinian theory, and I cannot help being reminded of the evangelical Christian who assured me that children reared in a chaste environment would grow up without illicit sexual impulses.
Dawkins's "memetic theory of religion" is a classic example of the nonsense that is spawned when Darwinian thinking is applied outside its proper sphere. Along with Dennett, who also holds to a version of the theory, Dawkins maintains that religious ideas survive because they would be able to survive in any "meme pool", or else because they are part of a "memeplex" that includes similar memes, such as the idea that, if you die as a martyr, you will enjoy 72 virgins. Unfortunately, the theory of memes is science only in the sense that Intelligent Design is science. Strictly speaking, it is not even a theory. Talk of memes is just the latest in a succession of ill-judged Darwinian metaphors.
Dawkins compares religion to a virus: religious ideas are memes that infect vulnerable minds, especially those of children. Biological metaphors may have their uses - the minds of evangelical atheists seem particularly prone to infection by religious memes, for example. At the same time, analogies of this kind are fraught with peril. Dawkins makes much of the oppression perpetrated by religion, which is real enough. He gives less attention to the fact that some of the worst atrocities of modern times were committed by regimes that claimed scientific sanction for their crimes. Nazi "scientific racism" and Soviet "dialectical materialism" reduced the unfathomable complexity of human lives to the deadly simplicity of a scientific formula. In each case, the science was bogus, but it was accepted as genuine at the time, and not only in the regimes in question. Science is as liable to be used for inhumane purposes as any other human institution. Indeed, given the enormous authority science enjoys, the risk of it being used in this way is greater.
Contemporary opponents of religion display a marked lack of interest in the historical record of atheist regimes. In The End of Faith: Religion, Terror and the Future of Reason, the American writer Sam Harris argues that religion has been the chief source of violence and oppression in history. He recognises that secular despots such as Stalin and Mao inflicted terror on a grand scale, but maintains the oppression they practised had nothing to do with their ideology of "scientific atheism" - what was wrong with their regimes was that they were tyrannies. But might there not be a connection between the attempt to eradicate religion and the loss of freedom? It is unlikely that Mao, who launched his assault on the people and culture of Tibet with the slogan "Religion is poison", would have agreed that his atheist world-view had no bearing on his policies. It is true he was worshipped as a semi-divine figure - as Stalin was in the Soviet Union. But in developing these cults, communist Russia and China were not backsliding from atheism. They were demonstrating what happens when atheism becomes a political project. The invariable result is an ersatz religion that can only be maintained by tyrannical means.
Something like this occurred in Nazi Germany. Dawkins dismisses any suggestion that the crimes of the Nazis could be linked with atheism. "What matters," he declares in The God Delusion, "is not whether Hitler and Stalin were atheists, but whether atheism systematically influences people to do bad things. There is not the smallest evidence that it does." This is simple-minded reasoning. Always a tremendous booster of science, Hitler was much impressed by vulgarised Darwinism and by theories of eugenics that had developed from Enlightenment philosophies of materialism. He used Christian antisemitic demonology in his persecution of Jews, and the churches collaborated with him to a horrifying degree. But it was the Nazi belief in race as a scientific category that opened the way to a crime without parallel in history. Hitler's world-view was that of many semi-literate people in interwar Europe, a hotchpotch of counterfeit science and animus towards religion. There can be no reasonable doubt that this was a type of atheism, or that it helped make Nazi crimes possible.
Nowadays most atheists are avowed liberals. What they want - so they will tell you - is not an atheist regime, but a secular state in which religion has no role. They clearly believe that, in a state of this kind, religion will tend to decline. But America's secular constitution has not ensured a secular politics. Christian fundamentalism is more powerful in the US than in any other country, while it has very little influence in Britain, which has an established church. Contemporary critics of religion go much further than demanding disestablishment. It is clear that they want to eliminate all traces of religion from public institutions. Awkwardly, many of the concepts they deploy - including the idea of religion itself - have been shaped by monotheism. Lying behind secular fundamentalism is a conception of history that derives from religion.
AC Grayling provides an example of the persistence of religious categories in secular thinking in his Towards the Light: The Story of the Struggles for Liberty and Rights That Made the Modern West. As the title indicates, Grayling's book is a type of sermon. Its aim is to reaffirm what he calls "a Whig view of the history of the modern west", the core of which is that "the west displays progress". The Whigs were pious Christians, who believed divine providence arranged history to culminate in English institutions, and Grayling too believes history is "moving in the right direction". No doubt there have been setbacks - he mentions nazism and communism in passing, devoting a few sentences to them. But these disasters were peripheral. They do not reflect on the central tradition of the modern west, which has always been devoted to liberty, and which - Grayling asserts - is inherently antagonistic to religion. "The history of liberty," he writes, "is another chapter - and perhaps the most important of all - in the great quarrel between religion and secularism." The possibility that radical versions of secular thinking may have contributed to the development of nazism and communism is not mentioned. More even than the 18th-century Whigs, who were shaken by the French Terror, Grayling has no doubt as to the direction of history.
But the belief that history is a directional process is as faith-based as anything in the Christian catechism. Secular thinkers such as Grayling reject the idea of providence, but they continue to think humankind is moving towards a universal goal - a civilisation based on science that will eventually encompass the entire species. In pre-Christian Europe, human life was understood as a series of cycles; history was seen as tragic or comic rather than redemptive. With the arrival of Christianity, it came to be believed that history had a predetermined goal, which was human salvation. Though they suppress their religious content, secular humanists continue to cling to similar beliefs. One does not want to deny anyone the consolations of a faith, but it is obvious that the idea of progress in history is a myth created by the need for meaning.
The problem with the secular narrative is not that it assumes progress is inevitable (in many versions, it does not). It is the belief that the sort of advance that has been achieved in science can be reproduced in ethics and politics. In fact, while scientific knowledge increases cumulatively, nothing of the kind happens in society. Slavery was abolished in much of the world during the 19th century, but it returned on a vast scale in nazism and communism, and still exists today. Torture was prohibited in international conventions after the second world war, only to be adopted as an instrument of policy by the world's pre-eminent liberal regime at the beginning of the 21st century. Wealth has increased, but it has been repeatedly destroyed in wars and revolutions. People live longer and kill one another in larger numbers. Knowledge grows, but human beings remain much the same.
Belief in progress is a relic of the Christian view of history as a universal narrative, and an intellectually rigorous atheism would start by questioning it. This is what Nietzsche did when he developed his critique of Christianity in the late 19th century, but almost none of today's secular missionaries have followed his example. One need not be a great fan of Nietzsche to wonder why this is so. The reason, no doubt, is that he did not assume any connection between atheism and liberal values - on the contrary, he viewed liberal values as an offspring of Christianity and condemned them partly for that reason. In contrast, evangelical atheists have positioned themselves as defenders of liberal freedoms - rarely inquiring where these freedoms have come from, and never allowing that religion may have had a part in creating them.
Among contemporary anti-religious polemicists, only the French writer Michel Onfray has taken Nietzsche as his point of departure. In some ways, Onfray's In Defence of Atheism is superior to anything English-speaking writers have published on the subject. Refreshingly, Onfray recognises that evangelical atheism is an unwitting imitation of traditional religion: "Many militants of the secular cause look astonishingly like clergy. Worse: like caricatures of clergy." More clearly than his Anglo-Saxon counterparts, Onfray understands the formative influence of religion on secular thinking. Yet he seems not to notice that the liberal values he takes for granted were partly shaped by Christianity and Judaism. The key liberal theorists of toleration are John Locke, who defended religious freedom in explicitly Christian terms, and Benedict Spinoza, a Jewish rationalist who was also a mystic. Yet Onfray has nothing but contempt for the traditions from which these thinkers emerged - particularly Jewish monotheism: "We do not possess an official certificate of birth for worship of one God," he writes. "But the family line is clear: the Jews invented it to ensure the coherence, cohesion and existence of their small, threatened people." Here Onfray passes over an important distinction. It may be true that Jews first developed monotheism, but Judaism has never been a missionary faith. In seeking universal conversion, evangelical atheism belongs with Christianity and Islam.
In today's anxiety about religion, it has been forgotten that most of the faith-based violence of the past century was secular in nature. To some extent, this is also true of the current wave of terrorism. Islamism is a patchwork of movements, not all violently jihadist and some strongly opposed to al-Qaida, most of them partly fundamentalist and aiming to recover the lost purity of Islamic traditions, while at the same time taking some of their guiding ideas from radical secular ideology. There is a deal of fashionable talk of Islamo-fascism, and Islamist parties have some features in common with interwar fascist movements, including antisemitism. But Islamists owe as much, if not more, to the far left, and it would be more accurate to describe many of them as Islamo-Leninists. Islamist techniques of terror also have a pedigree in secular revolutionary movements. The executions of hostages in Iraq are copied in exact theatrical detail from European "revolutionary tribunals" in the 1970s, such as that staged by the Red Brigades when they murdered the former Italian prime minister Aldo Moro in 1978.
The influence of secular revolutionary movements on terrorism extends well beyond Islamists. In God Is Not Great, Christopher Hitchens notes that, long before Hizbullah and al-Qaida, the Tamil Tigers of Sri Lanka pioneered what he rightly calls "the disgusting tactic of suicide murder". He omits to mention that the Tigers are Marxist-Leninists who, while recruiting mainly from the island's Hindu population, reject religion in all its varieties. Tiger suicide bombers do not go to certain death in the belief that they will be rewarded in any postmortem paradise. Nor did the suicide bombers who drove American and French forces out of Lebanon in the 80s, most of whom belonged to organisations of the left such as the Lebanese communist party. These secular terrorists believed they were expediting a historical process from which will come a world better than any that has ever existed. It is a view of things more remote from human realities, and more reliably lethal in its consequences, than most religious myths.
It is not necessary to believe in any narrative of progress to think liberal societies are worth resolutely defending. No one can doubt that they are superior to the tyranny imposed by the Taliban on Afghanistan, for example. The issue is one of proportion. Ridden with conflicts and lacking the industrial base of communism and nazism, Islamism is nowhere near a danger of the magnitude of those that were faced down in the 20th century. A greater menace is posed by North Korea, which far surpasses any Islamist regime in its record of repression and clearly does possess some kind of nuclear capability. Evangelical atheists rarely mention it. Hitchens is an exception, but when he describes his visit to the country, it is only to conclude that the regime embodies "a debased yet refined form of Confucianism and ancestor worship". As in Russia and China, the noble humanist philosophy of Marxism-Leninism is innocent of any responsibility.
Writing of the Trotskyite-Luxemburgist sect to which he once belonged, Hitchens confesses sadly: "There are days when I miss my old convictions as if they were an amputated limb." He need not worry. His record on Iraq shows he has not lost the will to believe. The effect of the American-led invasion has been to deliver most of the country outside the Kurdish zone into the hands of an Islamist elective theocracy, in which women, gays and religious minorities are more oppressed than at any time in Iraq's history. The idea that Iraq could become a secular democracy - which Hitchens ardently promoted - was possible only as an act of faith.
In The Second Plane, Martin Amis writes: "Opposition to religion already occupies the high ground, intellectually and morally." Amis is sure religion is a bad thing, and that it has no future in the west. In the author of Koba the Dread: Laughter and the Twenty Million - a forensic examination of self-delusion in the pro-Soviet western intelligentsia - such confidence is surprising. The intellectuals whose folly Amis dissects turned to communism in some sense as a surrogate for religion, and ended up making excuses for Stalin. Are there really no comparable follies today? Some neocons - such as Tony Blair, who will soon be teaching religion and politics at Yale - combine their belligerent progressivism with religious belief, though of a kind Augustine and Pascal might find hard to recognise. Most are secular utopians, who justify pre-emptive war and excuse torture as leading to a radiant future in which democracy will be adopted universally. Even on the high ground of the west, messianic politics has not lost its dangerous appeal.
Religion has not gone away. Repressing it is like repressing sex, a self-defeating enterprise. In the 20th century, when it commanded powerful states and mass movements, it helped engender totalitarianism. Today, the result is a climate of hysteria. Not everything in religion is precious or deserving of reverence. There is an inheritance of anthropocentrism, the ugly fantasy that the Earth exists to serve humans, which most secular humanists share. There is the claim of religious authorities, also made by atheist regimes, to decide how people can express their sexuality, control their fertility and end their lives, which should be rejected categorically. Nobody should be allowed to curtail freedom in these ways, and no religion has the right to break the peace.
The attempt to eradicate religion, however, only leads to it reappearing in grotesque and degraded forms. A credulous belief in world revolution, universal democracy or the occult powers of mobile phones is more offensive to reason than the mysteries of religion, and less likely to survive in years to come. Victorian poet Matthew Arnold wrote of believers being left bereft as the tide of faith ebbs away. Today secular faith is ebbing, and it is the apostles of unbelief who are left stranded on the beach.
John Gray's Black Mass: Apocalyptic Religion and the Death of Utopia will be out in paperback in April (Penguin)
Tuesday, August 24, 2010
Why It Is Harmful to Have a Discussion Online
Firstly, we all know that online addiction is very common. A recent study at Stanford has shown that mental abilities deteriorate because of multitasking (link). Humans have an evolutionary trait of automatically scanning the environment for information, but as that information increases, our minds become overwhelmed, leaving little room for anything else.
The kind of information that is most dangerous has one unique trait: it appears to be random or, as some say, natural. Our brains are tuned to seek out things that are unique and don't fit a pattern. The brain quickly gets bored if it is fed the same information over and over again. The information needs to be new, no matter how irrelevant, and it should be unpredictable. Online newspapers, email, social networking and the like are therefore the most addictive, as they contribute the most to this category of information. The more random the time interval, the more addictive it is. The effects of such activity last for some time and debilitate one's abilities, since it contains a lot of stressors, and the accompanying stress lasts even longer.
I will now explain why online communication is harmful. Firstly, humans communicate very little through writing or language itself. Although words are useful, body language, that is, variations in tone, eye contact, posture, facial color and so on, contributes a great deal to a normal discussion. If two people are in love, you just know it from their body language. If a person is angry, that is also visible. Online communication is emotionally dangerous; online telephony or video may be slightly better. Without the full set of signals, it is difficult to know when you are getting into a fight. It is easier to say negative things about others online than to their faces; real discussions prevent a lot of negativity. People fighting online are a lot more vicious than in real life. In person, your thoughts are shielded by your body's instincts, which continuously predict the other person's response and provide you with feedback about what is appropriate. Online, there is no such feedback, and your thoughts go straight through your laptop's keyboard without any check. I prefer to stay anonymous online, simply because I cannot control the appropriateness of what I write and I fail to predict how other people will react to it.
My recommendation would be to have as few discussions online as possible. Blogging is appropriate because it is not discussion; it is an uninhibited expression of one's thoughts, which is what the internet is useful for. At the very least, use your voice to make your point in a discussion rather than simply hacking at your keyboard. Don't communicate on social networking sites, news websites and the like, because one day you'll end up in a fierce argument. And if you want to know what that is like, go and read the comments on any YouTube video. The most peaceful and soft-spoken people turn into violent serial murderers on the net.
The internet provides your brain with food for thought. Your brain can spend hours chewing over stuff that is stressful and useless. You can read about injustices on another continent and be stressed for weeks, forgetting the injustice that might have happened just a few blocks from where you live. Information sharing needs to be localized, not globalized. If you don't know what's going on near you, then you are not connected. Recently, there was a gruesome incident of public lynching in Sialkot, and the video was circulated around the world. Pakistanis all over the world became extremely stressed, but sadly that video had no real relevance to the little world they live in. People need to differentiate between what is their business and what is not. People need to clear their minds now; otherwise it will be difficult to focus on anything worthwhile.
Friday, August 20, 2010
"Reflections On Academic Success And Failure" by Gary T. Marx (MIT Emeritus Professor)
in B. Berger, ed., Authors of Their Own Lives, Univ. of California Press, 1990, pp. 260-284
The article below is by MIT Emeritus Professor Gary T. Marx. It is a very insightful piece describing the challenges of working as an academic. I have highlighted some important points in "red" which are intuitively appealing and universally applicable to any career.
Gary T. Marx
Academic work is publicly and correctly viewed as having a sacred quality involving the pursuit and transmission of truth. But it also involves a job or career carried out in a competitive milieu where the usual human virtues and vices are never far from the surface.2
I will try to shed some light on this secular side of the profession and to offer some practical advice. I first describe my experiences, then discuss seven characteristics of success and some practical conclusions I have drawn. Although the themes are universal, I have written with two groups in mind: persons beginning their career, and those at midcareer sorting it all out --the former because I wish someone had told me these things when I was starting out, and the latter because they may believe them.
Life Could Be a Dream
In 1970 there could not have been many sociologists just three years beyond the Ph.D. who were as professionally satisfied and optimistic as I was. The promise of the popular 1950s rhythm-and-blues song "Shboom" that "life could be a dream" had come true. Immigrants, gold miners, and aspiring actors might head West, but as an ambitious academic born on a farm in central California I had headed east to where I thought the real action was --Cambridge, Massachusetts.
I had a job at Harvard with a higher salary and a longer contract (negotiated under threat of deserting to another Ivy League school) than the other assistant professors in the Department of Social Relations. I taught only one course and had a mammoth corner office, where I was protected from intruders by my own secretary in an outer office.
My book Protest and Prejudice had sold fifteen thousand copies and had been translated into Japanese. Various chapters had been reprinted in more than twenty books. The major newspapers, magazines, and radio and television media gave good coverage to research I had done on the civil-rights movement, civil disorders, and community police patrols. From my experience in presenting papers at the annual meetings of the American Sociological Association I assumed that it was not unusual to receive more than 150 requests for preprints of a timely paper. 3
After receiving my Ph.D. from the University of California, I had barely settled into Cambridge and got over jet lag in September 1967 when I received an invitation to join the staff of the National Advisory Commission on Civil Disorders. Barely a year before, in beard and sandals, I had been sitting in smoke-filled cafes on Telegraph Avenue in Berkeley, listening to folk music and talking about the machinations of the power elite, plotting coups and bemoaning the sad role of co-opted American intellectuals. At Harvard I became a regular on the Boston-Washington shuttle and dressed in a three-piece suit. I eagerly rejected Thoreau's advice, "Beware of all enterprises that require new clothes." Ignoring the sarcasm, I chose instead to follow Bob Dylan's advice "Get dressed, get blessed. Try to be a success."
A student-published course evaluation booklet (The Harvard ConfiGuide), known for its biting critiques, praised my courses: "Marx ranks among the best lecturers in the University.... If you don't take the course, at least sit in on some of the lectures." I was fortunate to encounter an unusually bright, well-read, socially conscious group of graduate and undergraduate students, some of whom are now major figures in American sociology. We were on the same side of the generation gap and shared intellectual interests, a desire to see research aid social change, and a quest for professional status. Training students and involving them in research was deeply fulfilling. (It also allowed me to get more work done.)
I received several prestigious fellowships that enabled me to take leaves of absence. My name was added to the list of those under consideration to be invited for a year in residence at several think tanks. Consultation and research money was falling into my lap. CBS-TV needed a consultant for a series on urban areas. ABC-TV wanted a commentator on the Kerner Commission report. Encyclopedia Britannica wanted an article on riots. The Joint Center for Urban Studies of MIT and Harvard offered summer salaries. Unsolicited, funding sources such as the Urban Institute and Law Enforcement Assistance Administration offered me money for research; all they required from me was a letter of a few pages, and I would receive a grant.
At a relatively young age I was fortunate to have the chance to serve on the editorial boards of several major journals and was elected to the Council of the American Sociological Association, enjoying the company of senior colleagues old enough to be my parents and even grandparents. The mail routinely brought inquiries about positions elsewhere, along with requests to write books, articles, and reviews for both academic and popular publications, serve on editorial and other boards, participate in symposia, and give lectures and deliver papers at an array of academic meetings both in the United States and abroad. The invitations removed from me the anxiety and risk many of my peers experienced as they sought professional attention. I was not conducting research with only a hope that someday, somehow, the results would be published. Instead, I could adopt the more cost-effective and safe technique of filling orders on hand. Since invitations were usually general, I had the freedom to write on whatever I wanted.
It seemed to be a seller's market. In one of those nasty social principles wherein the rich got richer, each invited article or presentation triggered new invitations in an almost geometric expansion. Each article was an investment that earned interest. My problem was not having the goods rejected but finding it impossible to keep enough in stock. The certainty of publication probably encouraged me to produce more than I otherwise might have and perhaps to let it go to press earlier.
It also may have meant a freer, more interpretive writing and research style because I did not have to conform to the expectations of an editorial board or reviewers committed to a narrow notion of sociological research.4 Since esteemed members of my profession were offering these invitations, my self-confidence increased and I came to believe that I had important things to say. Perhaps a positive labeling effect was at work.
I brushed up against a busy world of movers and shakers, elites, and academic gatekeepers. Editors, reporters, lawyers, and heads of social research consulting firms asked me to dine at expensive restaurants and private clubs or tendered invitations to cocktail parties. Often they asked me for my opinion or help on topics I knew nothing about. I negotiated a contract to do a race-relations textbook with a colleague for what seemed in 1970 to be an unprecedented sum, far greater than my annual salary. I had lunch with Vice President Humphrey and dinner with several Cabinet secretaries. I attended briefing lunches and dinners with other real and aspiring political leaders. I was approached by a former (or so he claimed) CIA agent still working for the government but in some other capacity. He had read Protest and Prejudice and wanted to talk about the student movement. I eagerly responded to a request to join a group of academics helping Robert F. Kennedy's 1968 presidential campaign and drafted a position paper.
This bountiful professional harvest spilled over into private life. We lived in a university-owned apartment in the heart of Cambridge in a former botanical garden. We were invited to large, somewhat formal dinner parties attended by celebrated American intellectuals in eighteenth-century homes. Our son was the only nonconnected four-year-old accepted into Shady Lane, a wonderful Cambridge school founded by William James and John Dewey. We bought an expensive foreign car and land on Martha's Vineyard. Plans for the summer home were drawn up. I developed a taste for sherry and even pretended to enjoy playing squash.
I had moved from being an unknown graduate student at a state university in the outback to what seemed to be the core of American academic and political life.5 George Homans, Alex Inkeles, Seymour Martin Lipset, Talcott Parsons, and David Riesman were all down the hallway from my office. It was the same hallway that not long before had been graced by Pitirim Sorokin, Gordon Allport, and Clyde Kluckhohn, located in a building named after still another illustrious predecessor, William James. The periphery of the Kennedy circle of advisers from Harvard beckoned. One of my mentors, Daniel P. Moynihan, had moved on to a job in the White House.
I would eagerly return to my office (after an afternoon or day away) in the hope of finding several neatly written pink phone messages requesting that I return a New York or Washington call. Those little pink notes were lifelines, unobtrusive symbolic indicators bearing evidence of a career in motion. The higher reaches of sociology and perhaps even American intellectual life, public service, the mass media, and a patrician life-style all seemed to be beckoning. This was heady stuff for a person whose highest aspiration a decade before had been to write a master's thesis that would receive one scholarly citation and who kept the following lines from jazz-blues singer Mose Allison in his top drawer: 6
I made my entrance on the Greyhound bus
I don't intend to cause a fuss
If you like my style, that's fine with me
But if you don't, just let me be
I got some kids,
I got a wife
I'm just trying to swing my way through life
As a student of American society I knew all about blocked mobility aspirations. But my situation was the reverse (or so it seemed during those glorious years of ascent). I had not been denied anything I felt entitled to. Instead I sometimes felt I had received things I did not deserve. In three short years, from 1967 to 1970, I had already achieved far more than I ever intended or expected.
In the warm glow of solidarity offered by elites who validate each other's status through self-fulfilling effects, it was easy to believe that what I was doing was important and that my success was meaningful and appropriate and could only increase. True, I knew that the chances of someone who had not received at least one degree from Harvard getting tenure were very slim.7 But I was too busy to think much about tenure in those early years. Besides, there was always the exception, and wasn't I on the fast track (as the list of achievements I also kept tucked away in the top drawer of my desk indicated)? Clearly sociology offered a great career if you had the right stuff. Who knows where it might lead? --an endowed chair, a deanship, a presidential appointment, honorary degrees, plenary addresses, editorships, more foreign translations, directorship of a research center, perhaps a best-selling novel and even a movie career. Was life ever so sweet for a young academic? Could a surfer from California disguised in academic clothing find happiness in an eastern elite academic setting? Did the rising sun have to set?
My academic knowledge of stratification and fashion should have told me that the dream could not last. That realization was not as sudden as when my chance for all-city high-school track medals was dashed when I broke an ankle just before the big meet in the Los Angeles Coliseum. There was no single calamitous incident. But gradually the sweet smell of success turned slightly rancid. As traditional achievements became less satisfying and little failures accumulated, stalagmites of disillusionment, anger, and confusion built up over several years.8 What I had naively assumed to be the natural order of things turned out to be but a passing phase conditioned by historical factors and luck.
After the Fall
In 1972 someone even younger than me, and with (at the time) a less impressive teaching and publication record, was suddenly given tenure in sociology. I had to give up my big office as a result. My book went out of print. A race-relations reader I edited did not sell well enough to recoup the advance. The race-relations text was never written. A partially written introductory text done with several colleagues, and which was supposed to make us comfortable and even rich, was rejected by the publisher. A number of editors I knew lost or changed jobs. After more than a decade of receiving everything I applied for, a grant application was rejected, and then another.
The Republicans had taken over Washington. Whites writing about minority groups and favoring integration came under increased attack from segments of the left and the right. Liberal approaches to social issues became less fashionable.
Advertisements made up an increased proportion of my mail. The reporter stopped calling. The pink phone messages were mostly from the library about overdue books and reminders to bring home a quart of milk and some bananas.
When my two most supportive senior colleagues and mentors left Harvard for Stanford, I realized that it was time to look further afield for work. Yet by 1972 the job offers had become fewer. A long promised job in the University of California system turned out not to be there when I finally wanted it. A promised year at the Russell Sage Foundation suddenly fell through. I had several years left on my Harvard contract in 1973, but in an anticipatory version of you-can't-fire-me-I-quit, I left Harvard for an associate professorship at MIT. Although certainly a good move in a market that was starting to tighten up, it was not the move to full professor that I naturally assumed would be my right should I leave Harvard.
My son made some great ashtrays in his progressive private school, but my wife and I came to have doubts about its permissive learning environment. Leaving Harvard meant giving up our ideal Cambridge apartment in our ideal academic ghetto and moving to a faceless suburb with affordable housing and neighbors whose politics, life-styles, and landscaping were far from what we had become accustomed to. The engine block in our foreign car cracked. A forest fire burned our land on Martha's Vineyard and exposed its proximity to the Edgartown dump. We sold the land.
I now had to confront ghosts that had lain dormant during the past decade of continuous graduate school and professional success. My need for achievement had been well served in those early years. I was able to leverage the success I found against inner demons always ready to tell me that I was not worth much.
Of course the need to display occupational merit badges is part of the American achievement ethos. But I was also responding to childhood experiences with a father who, whatever his virtues, was difficult to please. His own needs were such that he made me feel very inferior.9 As a result I had a strong need to prove myself. Seeking the external symbols of success was a way to demonstrate to the world and myself that the inner doubts I harbored were mistaken. Like Max Weber's Puritans looking for a sign of redemption through their worldly striving, I looked for evidence of my competence through competitive efforts --in high school through athletics, speech contests, student government, and stylish conspicuous consumption, 10 and later in graduate school and beyond by applying for grants and submitting papers for publication. 11
My experience in those early years had supported a simple, adolescent, Nietzschean (and probably male) view in which the world could be neatly divided into winners and losers, leaders and led, those in the inner circle and those outside it. Of course, depending on the arena, one might be in or out. But many of my youthful memories revolve around a desperate need to be in that circle. Good taste required not openly acknowledging the intensity of the drive or that sweet, smug feeling that success made possible. But the quiet, invidious feelings achievement permitted were terribly important. Through grit, determination, hard work, and luck I had done a good job of showing the world where I stood --at least up to the early 1970s. 12
Then things changed. The appropriate tragic model was not the Greek hero destroyed by his own virtues, but the Biblical hero Job brought down by random external forces. I was the same person doing what I had always done (and probably even doing it better), and yet things were not working as they had before. I had jumped through what I thought were the appropriate burning hoops, but the cheers were now muffled. 13 I had worked very hard to reach the brass ring, but it was always just out of reach. I had constructed a positive self-image based on possessing a nice suit of clothes, but they now were in danger of becoming outmoded and even being repossessed. I was suddenly vulnerable in a way I had not been before. What is more, the achievements that had given me so much pleasure in the past seemed less fulfilling on repetition.
Not even old enough for a real mid-life crisis, I went through a period of reassessment and asked all the familiar questions: What did it all add up to? Was it worth it? Why keep playing the same old game if the connection between merit, hard work, and reward was not assured or if the reward was not all that great to begin with? What were my goals? Who was I, after losing some of the formal trappings of success? What was important? Was there life after Harvard and the bountiful harvest of my first decade in sociology? My answers were hardly original, but they worked for me.
I came to terms with both winning and losing and was better able, as Kipling advised, to "meet with Triumph and Disaster and treat these two imposters just the same." I developed a perspective that made both failure and success easier to understand and accept. A part of this perspective is awareness of a Woody Allen paradox wherein when we do not have what we want, we are unhappy, but should we get it, it turns out not to be enough. 14
Seven Characteristics of Success
While success is nice to have, it is not all it is cracked up to be:
1. It does not last. Mark Twain said, "One can live for two months on a good compliment." Depending on one's psyche two hours or two weeks might also apply. But as a character in a Neil Simon play observes, "Nothing recedes like success." With appalling regularity, there is always a later edition of a journal or newspaper telling someone else's story. Books go out of print and journal articles cease to be read. The pages rapidly yellow and are forgotten. People ask what you are doing now. Colleagues who know what you have done retire, and they are replaced by younger persons unaware of your contributions. To make matters worse, unlike the natural sciences, sociology is not very cumulative. Whatever social wheel you discover may be rediscovered a few years later by someone at another school or in another discipline unaware of what you have done (or at least not acknowledging it in a footnote). The Romans were wise to have servants march next to victorious generals in parades and whisper in their ears, "Fame is a passing phenomenon."
2. You can never be successful enough (at least in your own eyes). No matter how good you are, there is always someone better. Whatever you did, you could always have done it better and done more, or done it earlier. You never were as important or well known as you thought you were. Even the truly famous are not exempt. 15 What is worse, you never really get there. As Durkheim observed, in a rapidly transforming society you can never achieve enough success. When what is at stake is something as open-ended as reputation, productivity, impact, or accumulation, there is no clear limit. With each higher level of achievement the definition of success changes such that it is forever out of reach. By contrast, failure more often seems limited and finite: you know when you have hit the wall.
3. The more success you have, the harder it becomes to reach the next level of achievement. As one moves from getting accepted to graduate school, to getting a Ph.D., to getting a teaching job, to getting tenure and national awards and distinction, the competition gets stiffer, the number of slots declines, and the price of success increases. With each level of achievement the field is narrowed. Once a certain level is reached, there is little variation among participants. Everyone is qualified and hardworking, and there are fewer rewards.
4. There is a diminishing-returns effect. National Basketball Association star Larry Bird captured it in his comment on receiving the Most Valuable Player Award a second time: "It's funny because when you're a kid you can't wait to get those trophies. You get 'em home and shine 'em up. Now I forget all about 'em. I got one last year that I left in a friend's truck for a whole year before he reminded me of it." 16 The satisfaction from external rewards is not as great the second or third time around, whether it be delivering or publishing a paper, writing a book, or getting a grant. Part of the reason may be just the diminution of passion that comes with aging. But repetition does not have the same kick. The sense of curiosity and expectation that accompanies the initial pursuit of rewards weakens once they have been achieved. It has become clear to me that a meaningful life cannot be constructed out of repetitively doing things to please an impersonal public.
5. Success may have costly and unintended side effects (apart from the price initially paid to achieve it). There are the obvious dangers of hubris and taking yourself too seriously and the bottomless-pit (or perhaps ceilingless-roof) quality of success. Less obvious is the paradox that success brings less time to do the very thing for which you are now being recognized. In an academic setting increased achievement is associated with increased responsibility. Being well known brings good-citizenship requests to review articles and books, write letters of recommendation, and serve on committees. Although such invitations are symbolic of success and can be directly or indirectly marshaled to obtain still more success, they can seriously undermine productivity. A virtue of obscurity is greater control over your time and greater privacy.
Public visibility may bring requests for more information about your research, job offers, and speaking, consulting, and research invitations. But being quoted or reviewed in the print media or seen on television may also bring appeals from job seekers, salespersons, and charities and requests for free advice or help on topics you know little about. When your topic is controversial, as mine on race, civil disorders, secret police, and surveillance tended to be, you are also likely to get bizarre missives including hate mail and threats, incomprehensible letters from very crazy people, and be besieged by persons seeking to recruit you to propagate strange ideas and schemes.
6. The correlation between ability, or merit, and success is far from perfect. This is of course a central sociological message. Factors beyond merit that may bear on the distribution of rewards include the makeup of the selection committee, what it had done the previous year, timing, the characteristics of the applicant pool, and intellectual, ideological, or personal biases. Even when the selection process is fair, rejections are often more a comment on the scarcity of rewards than on the incompetence of applicants. The major factors here are surely organizational. But the structure and ambiguity of reward situations also make it possible to mask the role sometimes played by corruption. With age and experience you come to feel comfortable judging, and even sometimes doubting the judges. 17 There are enough questionable cases involving tenure and promotion, the awarding of grants, and the acceptance of materials for publication to make clear the role of nonachievement criteria in social reward. Cynical awareness of this state of affairs need not make you throw in the towel or become corrupt, but it may mean slowing down, putting less emphasis on outcomes, and becoming more philosophical about failure and success. This awareness can take some of the sting out of defeat. It also ought to take some of the pride out of victory. 18
7. There is no reason to expect that what you do next will be better, by your own standards, than what you have done in the past or will necessarily bring equivalent or greater recognition and reward. In graduate school and the early professional years this may not be true. You start with little, so each achievement is a milestone and more rewarding than the last. Yet this training effect is short-lived. Career satisfaction in academia and the quality and quantity of productivity are not linear, in spite of the rhetoric of cultural optimism and metaphors of growth. Academics are not like professional athletes, many of whom gradually peak over a period of three to six years and then fall off. For the minority of Ph.D.'s who continue to do research after receiving their degrees, the average pattern for both the quality of their work and the recognition it receives is probably jagged. 19 There may be periods of intense creativity and productivity, followed by periods of reading, pursuing unrelated interests, or laying the ground for the next period of activity.
Fallow periods, if that be the right term, are nothing to worry about (at least if you have tenure). As in agriculture, they may even be functional.
Practical Lessons
Three broad practical lessons follow from these perspectives on success and failure:
(1) value the process of your work as an end in itself;
(2) develop new professional goals;
(3) do not make your career your life.
Getting off this dirty bus
one thing I understood.
It has got to be the going
not the getting there that's good.
In graduate school I was impressed by Erich Fromm's argument to live life such that you did everything as an end in itself and not as a means. At the time I saw this directive in terms of interpersonal relations. It never occurred to me that the argument had local occupational application. But I now see that once you have tenure, if you do not enjoy the research or writing (apart from whatever payoff the finished product might bring), then it is not worth doing. I came to realize that I got pleasure from finding partial answers to questions I wondered about, turning a clever phrase, ordering a set of ideas, and seeing connections between apparently unrelated phenomena. In a competitive world of uncertain and perhaps unsatisfying reward there is much to be said for valuing the process of production as an end in itself. 20
The focus on process and becoming can mean less concern over the quantity of work produced and fewer comparisons to colleagues. It can protect against judging yourself by some quantitative standard wherein whatever you do next has to be more and better than what you did earlier and bring greater rewards. If for personal satisfaction what matters is enjoying your work, then it does not much matter how many publications that work eventually leads to, or how quickly, or even in which places it gets published. I am not particularly troubled that some of my work may never be published, or may be published a decade after its completion, or may bounce down the prestige hierarchy of journals before finding a resting place. This attitude contrasts markedly with the rational cost-benefit calculation and the intensity and snobbishness about publication I felt as a young academic. What matters most is a sense of engagement with your work and of movement. I do not deny that the need for social recognition can be congruent with, and even conducive to, the advancement of knowledge or that there is pleasure in seeing an article or book in print --producers need markets for validation and feedback. But that is not enough to sustain research activity, particularly after a professional reputation is established.
A second conclusion involves the need to develop new professional goals because of the diminishing-returns effect and the increasing difficulty of climbing ever higher. I broadened my professional and personal goals (described in the next section). In the case of the former, I expanded my intellectual repertoire. You are likely to discover early in your career that you quickly master contemporary sociological research knowledge regarding your topic (or if not, at least get bored with it). Occasionally there will be some highly informative, useful, or fresh empirical findings, concepts, theoretical approaches, or methods, but not often. Although by and large it is not true that sociology consists of "findings of the obvious by the devious" (as an Alison Lurie character suggests), there is not much new under the sun after you have been out in it for a while. 21
I sustained intellectual interest by developing new substantive areas of interest, turning to comparative research and to other disciplines, investigating new sources of data and methods, and taking up consulting. My initial interest was in race and ethnic relations, part of a more general interest in stratification. Partly as a result of being a white studying blacks in an age of black power, but more out of the fatigue I have described, I shifted from race-relations research to questions combining my interest in race and ethnic issues with an interest in collective behavior and, later, deviance and social control. I now see the latter giving way to an interest in questions concerning technology and society. Such moves are gradual and not very rational. You cannot predict your intellectual trajectory by what you are concerned with in graduate school. But I would venture that unless you change and expand, it is easy to get turned off to intellectual inquiry.
Variety can come from studying in some other country what you have studied here. It is fun, and there are solid intellectual grounds for doing it. I went to India to study race relations. I went to France and England to study police. I hope to go to Scandinavia to study computer systems. Beyond the new intellectual horizons travel presents, it offers a new set of colleagues and new bodies of literature and outlets for publication. Whatever knowledge may be gained, I get a strange pleasure from struggling to read the French journal I receive. Variety can also come from learning what other disciplines have to say about your topic. One consequence of having spent more than a decade in a planning department that is problem-centered rather than discipline-centered is a continual reminder of the variety of perspectives, methods, and data sources needed to understand a phenomenon. In this sense discipline-based professional education, with its insular, self-aggrandizing, and often imperialistic tendencies, does an intellectual disservice.
Although I always start with sociological questions, they are no longer enough. Over the years they have been supplemented by a series of questions from psychology, political science, economics, history, law, and ethics. What is more, for the research that touches on public issues I have added a broad normative question: given what I have learned from my research, where do I stand on a policy issue, and what would I recommend? In graduate school, still reeling from the conservatism of the 1950s and the thrust to make sociology a science, such issues were ignored or seen as disreputable. I have also broadened my definition of data and of what I feel comfortable working with. For both my M.A. thesis on Father Coughlin and my Ph.D. dissertation on the civil-rights movement I used standard survey research data. I continued to conduct survey research for several years after getting the Ph.D., but now rarely do. Instead, I have made increased use of observational, historical, and literary materials. My book Undercover has a historical chapter. In my work on forms of interdependence between rule breakers and rule enforcers I am analyzing novels and film. In my work on social movements I am investigating the role of art and songs in mobilizing people. My work on electronic surveillance methods for discovering violations deals directly with ethics. This broadening I advocate may not endear you to those with highly specialized disciplinary concerns who have their hands on the reward levers of your profession. But it is likely to enhance the quality of the intellectual product. The sense of growth and development it offers feels good and helps keep one fresh.
What I have described represents diversification rather than displacement. I have expanded the questions I am concerned with, the kinds of evidence I see as data, the places I look for them, and the methods I use. The movement between questions, data, methods, and location has not been linear. Instead it has, to a degree, been cyclical. I think that characteristic is another key to staying motivated. It is easy and fun to come back to a topic after having been away for a while. New materials will have appeared, and the experiences you have had in the interim may cause you to see what was once familiar in a new way. There is some salvation in moving back and forth between qualitative and quantitative, domestic and international, contemporary and historical, basic and applied questions and the various social-science disciplines.
This diversity also makes it easier to have a few irons always in the fire. If nothing more, it gives one a modest reason to go to work to check out the mail. Beyond statistically improving your chances of success, having submitted multiple articles, proposals, and grant applications can serve as a kind of safety net for the imagination. When a rejection comes, you have the hope that the other things still out will meet with a happier fate. Of course, there is the risk of a harder fall if they all end up being rejected. However, with enough nets and fishing lines out, that need never occur. The future has an open-ended quality that can be wonderfully conducive to optimism.
I also guard against demoralization from rejection by typing out two letters whenever I submit an article. The first is to the journal to which I am submitting the article, and the second (undated) is to the next place I will send the article if it is rejected. I would not deny, though, that there is also wisdom in knowing when to fold, as well as when to hold. Another professional goal that I actively pursued for a while (but am now ambivalent about) involved earning extra income through consulting and textbook writing. Earning money did not become an obsession, but I stopped seeing it as necessarily an unworthy goal. It was what I did to earn it, I thought, that merited moral evaluation, not the goal per se.
If making all the right academic moves did not ensure success or satisfaction, why not use the same skills and credentials to get rich? The payoff was likely to be more certain and immediate, and the standard required was less demanding.
Given disillusionment and fatigue with academic amateurism, it was easy to rationalize spending more time playing for pay instead of for honor, footnotes, and the acclaim of adolescents. 22 However, as will be noted, this emphasis is not without problems if you remain committed to academic values.
A reassessment of the bourgeois life began with my move from Berkeley to Cambridge. My senior colleagues were living well, and well beyond their academic salaries. Spacious, elegantly restored historic homes with cleaning services, travel to exotic places in the winter and vacation homes in the summer, camp and enriched education for children, gourmet foods and foreign sports cars were not available to persons who gave all their royalties to political causes (as I had originally planned to do) or who only did social research gratis on behalf of causes they believed in. This shift in emphasis began symbolically with my gradual acceptance of, and eventual belief in, the usefulness of an electric can opener. We received one as a wedding present in the 1960s, and it stayed in its unopened box for many years. For reasons I cannot clearly recall, at the time it seemed to epitomize all that was wrong with our society. Brick-and-board book shelves were replaced by real book shelves. A new sofa eliminated the need for a draped Mexican serape to disguise the sorry state of the sagging couch beneath it. We came to view paying someone to clean the house as salvation rather than exploitation.
While it was nice to have the extra income, earning outside money was not all that great either. It got boring, and I did not like the feeling of being a sociologist for sale: have ideas and methods, will travel. I was not comfortable with the salesmanship that pleasing and finding clients seemed to require. After all, I had chosen an academic life rather than the commercial life of my ancestors precisely to avoid the need to pander to customers. The pressures to meet deadlines were much greater than in the university. I felt the consulting reports I wrote were generally unappreciated and unread, except for the oversimplified and watered down "executive summaries" with which they had to begin. There were also role conflicts. The norms of scholarship sometimes conflicted with the interests of my employer. The substitution of market and political criteria for those of truth and intellectual rigor troubled me. It was alienating to be told what research to do and to have business persons and bureaucrats place conditions on intellectual inquiry. I did not like the lack of editorial and distributional control over what was produced.
I encountered bad faith on the part of employers. Thus, in an evaluation of a community-oriented criminal-justice project I pointed out how innovative and important the program was, while also honestly documenting problems and ways of overcoming them. Imagine my surprise when the research document was not used to improve the program but to kill it. It became clear that the hiring agency viewed research as a tool to pursue a course of action that had been decided before the research was undertaken.
In another example a well-established consulting firm hired me to write a proposal for a large grant and promised me a major role in it. The grant was funded, although all I received was an invitation to serve on the advisory panel of the study.
I felt uncomfortable with the pressures and temptations to dilute work, cut corners, treat issues superficially, and delegate tasks I was hired to do to much lower-paid graduate students. These could be rationalized since consulting standards were generally lower than those of academic peer review. The goal was to maximize income rather than obtain a high level of craftsmanship, which in most cases would not have been recognized or appreciated.
I emphasized earning extra income for about five years. I met with some modest financial success and learned some things about government programs, textbook writing, and social science as business. It was a nice break from my early years but clearly could not sustain me. I gradually moved back to a predominant focus on academic work and caught a second wind.
I still appreciate the benefits of doing sociology in applied and remunerative settings, and I have not given up such activities entirely --they can keep you fresh, involved, and informed and be a source of research data and a way to influence policy and shape debate. It is refreshing to meet people who actually do things rather than merely talk about what others do. Yet if you are fortunate enough to have a job in an academic setting, it seems foolish not to take advantage of the freedom for intellectual inquiry it offers.
The third practical conclusion I reached was that your career cannot (or should not) be your entire life. Not only did I question the payoff from occupational success beyond a certain point, but I also saw the price that excessive devotion to a career could extract from personal and family life. The prospect of being a narrow, one-dimensional person with a good chance of having family trouble and an early heart attack was unappealing, even if there had been greater certainty in the hard work-success-happiness connection.
In the initial years after moving from Harvard to MIT I left several projects undone for lack of funding and graduate students. A bit weary and cynical about the single-minded pursuit of academic achievement, I devoted more time to highly personal, noncompetitive activities over which I had more control. I spent more and better time with my family, rebuilt a dilapidated Victorian house, learned to play the guitar, read novels, kayaked wild rivers, and worked on a family history project. Watching "Sesame Street" with a young companion, plastering and painting walls, scrutinizing the 1840 Detroit census for information about a great-grandfather, struggling with an out-of-tune guitar, and catching up on a decade's worth of unread novels were far removed from the usual academic obsessions and compulsions.
The respite from an unrelenting focus on academic work gave me great pleasure. Concrete activities provided immediate rewards. Ascriptive rather than achievement criteria were present. There were no risks and no concern over whether distant judges would find me wanting. These activities belonged to me in some very basic sense. They could not be taken away or withheld by editorial or academic gatekeepers. My family history, for example, was simply waiting to be discovered. The work was intensely personal and involved no deadlines or evaluations.
Yet as with exclusively playing the monastic academic game or going commercial, focusing primarily on quality of life also has its limits. It is not much fun to paint the same room a second time. Small children quickly become adolescents who do not want to go on family outings with you. You can trace back family history only so far.
After five years of spending considerable time on other things, I returned to the conventional academic activities of applying for grants, writing journal articles, and presenting papers. I was fortunate to find and help develop a broad topic involving social control, deception, and technology that has sustained me for more than a decade. I find issues of surveillance and society and the revelation and concealment of information endlessly fascinating. The topic has implications for social theory and social change. It is of interest to academic, practitioner, and general audiences, and I have not had trouble obtaining resources to investigate it. Through working with congressional committees, federal agencies, public interest groups, and the media, the research has also had some modest impact on shaping national debate and on public policy. 23 But I have not pursued this project with the same single-mindedness or desire for professional success of the early years. My life has become more balanced.
There are some issues that I have not resolved. One concerns feelings of being underutilized and underappreciated, 24 which comes with being the only academic sociologist in an interdisciplinary department of urban studies and planning at a technology institute. 25 To be sure, in other ways my department and MIT have offered a superb home. There are advantages to being left alone in an environment where no one is like yourself. But it leaves a vague sense of loss. 26 The part of academic life that I have found most satisfying is mentoring and working with younger colleagues and students on research. I would have learned and published more and done less self-questioning had I had the steady flow of students and the day-to-day validation and chance to contribute that large graduate sociology programs offer. It does not feel right to offer a new class or hold office hours and have few or no students appear. What kind of a professor are you if no one seems interested in what you profess?
Another unresolved issue is what to do with the anger I still feel toward certain persons who have treated me unfairly or simply wounded my pride. These actions were in discretionary contexts where what I believe to be the ideological and personal motives could easily be masked. On any broad scale such events were minor and are now long gone. Intellectually I know that to dwell on the past is unproductive and I may even be wrong in attributing personal and political moves to some of the rejections, but the feelings remain. Life is too short to waste time on replaying the past, and the evidence indicating unfairness is rarely unequivocal.
But in general I have ceased being so self-reflective. The issues about work, life, and identity that had troubled me became less important. I realized I was caught in the paradoxes of achievement and its discontents. I became more accepting of dilemmas and tensions that had once consumed enormous amounts of emotional energy. Instead of viewing these as problems to be solved and choices to be made, I was better able to accept personal and professional contradictions and multiple motives as the order of things and, in Robert Merton's words, to appreciate the "functional value of the tension between polarities." 27 Sometimes I would be drawn to one end of a continuum and at other times to its opposite. Sometimes I would try to combine them in my writing or bridge them in my political work.
I also realized that I wanted a number of things that could not be had to the fullest extent or necessarily all at the same time. I compromised and settled for less of any one in order to have some of each. 28 Instead of worrying about what I "really" was and what I valued most, I saw that I was probably more marginal than most people. I came to value being something of an invisible person and social chameleon, able to fit into, and move in and out of, different worlds. This quality may be part of my intellectual interest in deception, passing, and infiltration.
I am both the intensely driven, hardworking, competitive, ambitious person (like those I encountered early in my career) and the laid-back bohemian surfer of my California days; the intellectual interested in ideas for their own sake and one of the progeny of Karl Marx and C. Wright Mills who wanted to see ideas linked to change (perhaps a committed spectator, as Raymond Aron termed it); the quantitative and systematic sociologist and the journalist seeking to describe in language that people could understand what Robert Park called the big story; the scholar and the handyman; the athletic, river-running, beer-drinking, former fraternity man who could admit to still having some neanderthal-like macho attitudes and feelings and the righteous carrier of a new gender morality; a Jew with German and Eastern European roots and a secular American at home on both coasts (and in northern as well as southern California); the pin-striped suiter who could easily pass among elites and yet announce when the emperor was scantily clad or naked --but always with civility and in the King's English. And, as Levi-Strauss notes, sociological inquiry can be enhanced by the skill of distantiation.
A cynic might suggest that the cautionary wisdom I have offered about success be viewed skeptically, as sour grapes. Are my new goals just compromises made out of necessity or, with appropriate professional socialization, is it possible to start a career with them? If my career trajectory had continued upward at its original pace, and had there been no fall, would I still have reached the same conclusions? 29 I certainly would not have thought as much about these issues, and the emphasis might be somewhat different. But since the fall I described was temporary, I am confident that my advice is sound and represents more than the idiosyncrasies of my personal situation. It is based on two decades of successes and failures, and not only those in the beginning.
Unlike the Doctorow character quoted in the epigraph to this essay, I came East rather than West as a young man, and my expectations did not really wear away. However, they did change, and I was able to put them in perspective. Human existence is dominated by vast contingent forces that we gamely try to channel and control. That we sometimes succeed should no more lull us into thinking we can continually pull it off than should failure lead us to stop trying.
It was once said of Willie Nelson that he wrote songs out of love but was not above accepting the money. Nor am I above accepting professional recognition should it come. Yet I have become more concerned with process and learned more about how to deal with outcomes, whatever they are. I have become less troubled by rejection and also less thrilled by success. I have sought a more balanced life.
The Greeks gave their Olympic champions laurel wreaths as an ironic reminder that victory could be hollow. In Greek mythology Apollo pursues the nymph Daphne. She flees, and he runs after her. Abhorring the thought of marriage she prays to her father to save her by changing the form that has so attracted Apollo. Just as Apollo is upon her she is changed into a laurel tree. Is it a sign of modernity and a cause of its malaise that we offer our Olympic heroes gold instead?
Notes
I am grateful to my wife, Phyllis Rakita Marx, who has patiently and lovingly helped me sort out these issues, and for further critical comments and suggestions I wish to thank Jerry Aumente, Judith Auerbach, Murray Davis, Rosabeth Kanter, John McCarthy, Nancy Reichman, Zick Rubin, Susan Silbey, Barry Stein, Mike Useem, John Van Maanen, Chuck Wexler, and Jim Wood.
1. Among other themes I would like to pursue at some point are the experience of being at Berkeley in the 1960s; the move from the West Coast to the East Coast; family life, parenting, and professional ambitions; teaching; the selection of research topics; the uses of sociology and the role of moral commitment in sustaining research; the method (and challenge) of writing critical yet scientifically grounded essays. I have dealt a bit with the first theme in "Role Models and Role Distance: A Remembrance of Erving Goffman," Theory and Society 13 (1984), and the last two in the introduction to Muckraking Sociology (New Brunswick, N.J.: Transaction Books, 1972).
2. This seems to be particularly true for a discipline such as sociology that specializes in the study of stratification and in which there is only limited consensus about what constitutes good work. One observer even suggests that academic fauna can be ordered according to the degree of concern shown toward the outward presentation of self. Variation is inversely related to a discipline's certainty of results: "Thus at one end of the spectrum occupied by sociologists and professors of literature, where there is uncertainty as to how to discover the facts, the nature of the facts to be discovered, and whether indeed there are any facts at all, all attention is focused on one's peers, whose regard is the sole criterion for professional success. Great pains are taken in the development of the impressive persona.... At the other end, where, as the mathematicians themselves are fond of pointing out, 'a proof is a proof,' no concern need be given to making oneself acceptable to others; and as a rule none whatsoever is given." Rebecca Goldstein, The Mind-Body Problem (New York: Norton, 1983), p. 202.
3. Only later, when I gave what I thought was an equally timely paper and received only a handful of requests, did I realize that on average 150 might be more appropriate as a lifetime total.
4. This more interpretive, discursive, sensitizing style inspired by authors such as David Riesman, Erving Goffman, Herbert Gans, and Howard Becker was later to get me into trouble when I had to take greater initiative in submitting articles and applying for grants. Ground rules different from the ones pertaining to the invited contribution were in force. In assessing my mounting collection of failures in the early 1970s, I learned that as a humble petitioner, rather than an invited guest, one had to conform more rigidly to the conventional academic rules. Moreover, at that time quantitative methods as ends in themselves were ascendant.
5. I am reporting the elitist views encountered at Harvard. The University of California, though not an Ivy League school, was certainly an institution of enormous distinction. The consequences of being around highly successful people who work very hard and see themselves as among the chosen are mixed. On the one hand, they become role models and you mimic them. You get more done than most people, and their sponsorship and advice help your career. On the other hand, you have doubts about whether you could ever do anything as impressive as they have done and (even if you could) whether you wish to pay the price that such success may require.
6. With success came ever greater aspirations. My modest goals as a young professional were closely linked to what I thought I could accomplish. This was no doubt a self-protective device. I had not yet learned to shoot for the moon with the hope that if you miss, you might still grab a few stars. I think the willingness to take risks and face failure is at least as important a determinant of academic success as native ability.
7. Even with a degree from Harvard, the odds were still against tenure, as the cases of prize-winning sociologists Theda Skocpol and Paul Starr indicate.
8. In retrospect I now see that this pattern was more a leveling off than a fall, but that was not how it felt at the time. What had been unusual (and more worthy of explanation) was the degree and consistency of the early success, not the far more common pattern of intermingled success and failure that followed.
There are of course variants of falls. Some are easier to deal with than others. However poignantly felt, mine was gradual and partial. I had lots of time for hedging bets, putting out safety nets, and devising alternatives. That kind of fall is easier to respond to than one that is swift, total, and unexpected. The latter is the case with the assistant professor who had planned a large celebration and whose oh-so-sure department head had sent him a case of champagne the night before the faculty voted to deny him tenure.
9. Two examples will suffice. An account I heard too many times was that when my mother would push me in the baby carriage accompanied by our handsome collie, people would stop her and say, "What a beautiful dog." A corresponding family tale stressed my father's resemblance to Rudolph Valentino.
10. To wit, a "real sharp," chopped and lowered 1949 Pontiac convertible with dual carburetors, chrome pipes, and dice hanging from the rearview mirror, and what used to be called "real cool threads" --a powder-blue one-button-roll zoot suit with enormous shoulder pads. The car did get attention, but to my chagrin it was never chosen by the school newspaper as "heap of the week."
11. The first three years of my undergraduate career were an exception to the pattern of success in high school and my first decade in sociology: I looked but did not find much. This lack of success was due partly to a demanding outside job, but also to the confusion and dissipation of youth in southern California. (In the surfing film "Big Wednesday" a girl from Chicago, recently moved to California, observes, "Back home, being young is something you do until you grow up. Here, well, it's everything.")
I was surprised when after a series of aptitude and vocational tests at UCLA in my senior year I was told by the psychologist that I could be a professor if I wanted to. An expert had passed on my qualifications and given me permission to go on and become a professor.
12. In high school I had an experience that should have taught me something about the pitfalls of narcissism and hubris. There is a Fats Domino song with the lines "I'm gonna be a wheel someday, I'm gonna be somebody." I can still recall the excitement I felt working as a box boy in the King Cole Market on Los Feliz Boulevard in Glendale, California, when I saw a vegetable box with the label "Big Wheel Produce" on it. It was the perfect thing for a self-fancied big wheel to hang on his bedroom wall. I deserted my assigned duties and proceeded to cut out the label. When the knife slipped and cut deep into my index finger, I knew there was a God and that he or she had caught me. Not only was I guilty of hubris, but on company time.
The scar is still there. As in Pinocchio, mutatis mutandis, it sometimes itches when I get too carried away by achievement fantasies.
13. Of course there is always ambiguity about, and a gap between, theory and practice with respect to the rules that govern the awarding of tenure, receipt of awards, or acceptance of an article for publication. See, for example, John Van Maanen's consideration of types of rules surrounding career games, "Career Games: Organizational Rules of Play," in Work, Family, Career, ed. C. Brooklyn Derr (New York: Praeger, 1980), pp. 111-143.
14. George Bernard Shaw observed in Man and Superman, "There are two tragedies in life. One is not to get your heart's desire. The other is to get it." In some ways our culture does a better job of preparing us to cope with failure than with success.
15. Paul Newman received the following letter complimenting him on his spaghetti sauce: "My girlfriend mentioned that you were a movie star, and I would be interested to know what you've made. If you act as well as you cook, your movies would be worth watching. Are any of your movies in VCR?" New York Times Magazine, Sept. 31, 1986.
An academic career is strewn with humbling little reminders that bring you back down to earth. For example, several times I have eagerly turned from a book's index to the pages where G. Marx was referenced only to find that the reference was to Groucho or discover that as a result of typographical errors I was given credit for Karl Marx's ideas. I well recall the smug feeling I had when I received a call from the president of a Midwestern school telling me I had been the unanimous choice of their Faculty to deliver a prestigious lecture. Since a recent publication was receiving considerable attention, it seemed only fitting. Yet it soon became apparent that the invitation was for my esteemed MIT colleague Leo Marx.
16. Boston Globe, June 4, 1985.
17. Beyond an occasional case of corruption, the questioning of judges' decisions is aided by the lack of consensus among sociologists about what quality is (beyond the extremes) and how quality in different areas (qualitative-quantitative, comparative-domestic, contemporary-historical, theoretical-empirical) ought to be weighed. Every way of seeing is also a way of not seeing. Among my collection of diametrically opposed responses for the same research proposals and articles I have submitted are the following: "This is the best article I have ever reviewed for this journal --an absolutely outstanding contribution" versus "This tiresome review of things everyone knows does not merit publication here"; "An extraordinarily important project ... absolutely indispensable. I urge strongly and without reservation that this request for support be approved" versus "This study offers little that would improve the infrastructure of science. Do not fund it."
18. It is comforting to think that when we fail the causes are structural and the system is unfair and when we succeed the causes are personal effort and the system fair. My naivete and ego needs in the early period of my career probably led me to overemphasize the latter. That some of my success had little to do with merit per se was something to which I gave little thought. To be sure, I had worked hard and done respectable work. But there were a lot of things going for me that I had no control over. As research in the last decade has made clear, there is a sense in which demography (and timing) is destiny. In the case of my first job at Harvard one of my Berkeley mentors was then teaching there, and another held in very high regard had left not long before my arrival. I thus had a strong push from the outside and pull from the inside. What is more, the year I went on the job market Harvard had three openings at the assistant-professor level. Berkeley as an institution for training sociologists was at its height, and its graduate students were then very competitive on the job market. I had done my thesis on the civil-rights movement and specialized in race relations, topics much in demand. The macro factors that aided my success in the 1960s ceased doing so in the politically more conservative period that followed.
19. As paragraph 6 suggests, these patterns can be independent. For quantity, a major pattern is flat: some people hit their stride early and stay with it, producing about the same amount of work each year of their career.
20. As a graduate student one of the most important things I learned from Erving Goffman was that you had to click with your topic and really care about it or else you were in the wrong business. He implied it would happen early --it either grabbed you, or it did not. Since that was a time of many job offers for each applicant, rather than the reverse, this advice needs to be qualified.
21. This partly explains the exhaustion with reading journals as one ages (though an additional factor is an expansion in the number of journals). Although I would not go as far as a colleague who said he could not think of a worse way to spend an afternoon than to read the American Sociological Review, the moral imperative I felt as a graduate student to read it from cover to cover is long gone. The imperative has been diluted to reading the table of contents and occasionally marking an article to read later. I took this step with some of the same trepidation my grandmother reported when she made the decision to ignore kosher restrictions regarding the mixing of meat and dairy dishes and waited for God to strike her down. In neither case did harm befall us.
22. To be sure, in my early years there had been extra income, but I had not actively sought it out. I also felt a little uncomfortable being paid for work I would have gladly done for free.
Although I did not neglect my students, I must admit to an increased curiosity about those teachers whose moral (or immoral) code permitted them to devote an absolute minimum of time to teaching. Examples include the professor who required students who wanted to see him to make an appointment by calling a phone number that was rarely answered; the professor who did not have his name on his office door; the professor whose lectures consisted of reading from someone else's book; the professor who always came late to the first class meeting, did not have a syllabus, and was vague about just what the course would comprise (other than a heavy load of exams and term papers); and the professor who offered political (anti-elitist) and pedagogic (students should learn from each other) justifications for never preparing for class and never lecturing.
23. The research is reported in Undercover: Police Surveillance in America (Berkeley: University of California Press, 1988) and Windows into the Soul: Surveillance and Society in an Age of High Technology (forthcoming).
24. More broadly, such feelings seem to characterize American social scientists and humanists relative to scientists and engineers in academic settings, and academics relative to persons in applied settings.
25. It appears that sociology is increasingly being practiced outside traditional departments, whether in various interdisciplinary-studies programs or in applied contexts in professional schools. This goes beyond seeking new audiences; it is a matter of economic survival. In most of these settings one sociologist is fine but two is too many.
26. Though to a degree this sense of loss is also my fault. I did not try to construct a more satisfying campus life or sell sociology. Instead I kept a low profile to maximize the time available for research.
27. Robert K. Merton, Sociological Ambivalence and Other Essays (New York: Free Press, 1977), p. 63.
28. William Butler Yeats ignored an alternative when he wrote, "The intellect of man is forced to choose. Perfection of the life, or of the work." One can opt for doing each as well as possible, but coming short of what might be accomplished by pursuing only one.
29. A related question is whether I could have reached these conclusions without experiencing the success with which I became disillusioned.
Labels: academics, passion, productivity, work