From neurons to knowledge: the inspiration from our brains that sparked the quest for Artificial Intelligence

The buzz on AI is bigger than ever in digital innovation land these days. We all tend to get excited when we hear of new devices and systems that are being developed, which are said to be “intelligent” and which mimic human behavior. Ranging from IBM’s Watson API to virtual agents to self-driving trucks to smart anti-personnel drones, we see an explosion of technology that was originally inspired by the workings of our own Natural Intelligence, which is a creation of our brains and the neural networks that underpin them.

An Artificial Intelligence is a goal-driven machine or system that derives semantics (i.e. meaning or relevance) from the environment it navigates, and thereby takes actions that increase its chances of achieving its goals.

So how does a goal-driven system become intelligent? Let us consider how intelligence arises in our brains through neural computation, which most AI systems also mimic in one form or another.

There are a few basic facts we must first know about our brains. Our brains consist of vast networks of interconnected neurons (a type of specialized cell). Here is a typical neuron.

A neuron is a vastly complex entity, and is almost a separate living organism in itself, but for the purposes of understanding the basics of how intelligence dawns on us, we can think of it as a sort of fancy Logic Gate (recall to mind your electronics lesson from GCE O-Level days). As you can see, there are several types of connectors protruding from the cell body. The Dendrites serve as Inputs to the cell, and the Axon serves as the one single Output. The Inputs and Output consist of electrical impulses of a variable frequency.

Apart from the Inputs and Output, a typical neuron also has two other important variables that determine its function. One is what is known as the Synaptic Strength, and the other is the Threshold. The Synaptic Strength (aka Weight) is a chemically controllable barrier in the connection between the Axon (Output) of one neuron and the Dendrite (Input) of another. It is like a sort of chemically controlled Variable Resistor connecting two neurons. The Threshold is an internal barrier potential that must be reached or exceeded for an Output to be triggered. Consider the following simplified functional model of a neuron.

How a neuron works is that if (Input1 X Weight1 + Input2 X Weight2 + Input3 X Weight3) equals or exceeds the Threshold, then the Output is 1. Else the Output is 0.
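
For readers who like to tinker, here is a minimal Python sketch of that functional model; the function name and the use of plain numbers in place of firing frequencies are simplifications of my own.

    def neuron_output(inputs, weights, threshold):
        """Simplified neuron: fire (return 1) if the weighted sum of the
        inputs equals or exceeds the threshold, otherwise stay silent (0)."""
        weighted_sum = sum(i * w for i, w in zip(inputs, weights))
        return 1 if weighted_sum >= threshold else 0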

Given this functional behavior, we can easily show how a neuron can be trained to behave as an AND Gate. Let us say that the Weights are 0.3, 0.3 and 0.4 respectively, and the Threshold is set at 1 Hz. If each input is triggered with a frequency of 1 Hz, then 1 X 0.3 + 1 X 0.3 + 1 X 0.4 = 1, which meets the Threshold, and therefore the Output becomes 1 Hz (i.e. True). If any one or more of the Inputs are 0, then the weighted sum falls below the Threshold and the Output becomes 0 Hz (False). This neuron is now behaving like a multi-input AND Gate.

Let us now consider a neuron that is trained to behave as an OR Gate. We can set all the Weights at 1, and leave the Threshold at 1 Hz. In such a circumstance, the Output will fire (become True) if any one (or more) of the Inputs are triggered at 1 Hz. You have your OR Gate.
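
Using the hypothetical neuron_output function sketched earlier (and dropping the Hz units for simplicity), both gates can be checked in a few lines:

    # AND-like neuron from above: Weights 0.3, 0.3, 0.4 and a Threshold of 1
    print(neuron_output([1, 1, 1], [0.3, 0.3, 0.4], threshold=1))  # 1 (True)
    print(neuron_output([1, 0, 1], [0.3, 0.3, 0.4], threshold=1))  # 0 (False)

    # OR-like neuron: all Weights 1, Threshold still 1
    print(neuron_output([0, 1, 0], [1, 1, 1], threshold=1))  # 1 (True)
    print(neuron_output([0, 0, 0], [1, 1, 1], threshold=1))  # 0 (False)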

So far so good. But how does a network of conditional switches grasp truths about the world? The beauty is that once we know that neurons can function as multi-input AND and OR Logic Gates, we no longer need to worry about details like Thresholds and Synaptic Weights. We can begin to see how a network of such multi-input Logic Gates could be made to recognize real-world objects and concepts with the aid of input sensors.

Let us consider a very simple real-world concept like a line. Let us say it’s important to recognize and act upon a line-like object, given a set of five linear point-like sensors. Let us also assume that in order for an object to be a line, it must be at least three sensors in length. Now take a close look at the below network of Logic Gates.

As you can see from the above diagram, a line-like object (in red) is pressing against three of the five sensors. Using common logic, we can see that the output of the gate network will be True. The beauty is that if we slide the line anywhere along the sensors, it will still return True, but any object less than three contiguous sensors in length will return False, no matter where you slide it. This is the key insight behind generating meaning through a network of neuron-mimicking logic gates.
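
We can sketch such a line detector in the same spirit. The wiring below (three 3-input AND gates feeding one OR gate) is my own reconstruction of the kind of network described here, and it reuses the hypothetical neuron_output function from the earlier sketch:

    def and_gate(inputs):
        # A neuron acting as an AND Gate: all weights 1, threshold = number of inputs.
        return neuron_output(inputs, [1] * len(inputs), threshold=len(inputs))

    def or_gate(inputs):
        # A neuron acting as an OR Gate: all weights 1, threshold 1.
        return neuron_output(inputs, [1] * len(inputs), threshold=1)

    def line_detector(sensors):
        """Fire if at least three contiguous sensors (out of five) are pressed."""
        windows = [sensors[i:i + 3] for i in range(len(sensors) - 2)]
        return or_gate([and_gate(w) for w in windows])

    print(line_detector([1, 1, 1, 0, 0]))  # 1: a three-sensor line at the left edge
    print(line_detector([0, 0, 1, 1, 1]))  # 1: the same line slid to the right
    print(line_detector([1, 1, 0, 1, 1]))  # 0: no three contiguous sensors pressed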

We can extend the above line-detector concept in our minds and see that we can wire a network in three dimensions, for a 5 X 5 square sensor panel, which will be able to detect a line of at least three sensors in length touching the sensor panel in any orientation.

Extending this concept much further, and with the aid of inhibition, feedback and other complex factors that mimic the actual function of neural networks in our brains, scientists are able to show that such networks can focus on edges, detect movement, and recognize shadows, shapes, words, and other important characteristics in our environment. A single “downwind” neuron (or equivalent logic gate) can be made the holder of such a complex characteristic as the face of one’s mother. A Jennifer Aniston neuron was actually discovered in the brain of a human subject, in a groundbreaking experiment conducted by a team from the University of Leicester, some years ago.

The science of Artificial Intelligence based on networks that perform neural computation – the most exciting and hopeful frontier of human advancement that one can foresee – has developed over the years to become an extensive and exceedingly complex one. Yet it is a truly exhilarating feeling one gets when one begins to understand the rudimentary concept that inspired this revolution in the first place.

Republished from my LinkedIn article: https://www.linkedin.com/pulse/from-neurons-knowledge-inspiration-our-brains-sparked-ruwan-rajapakse/?published=t


Is there even an alternative to agility?

I keep bumping into business owners, CIOs, and, astonishingly, sometimes even tech entrepreneurs who like to begin the conversation by saying “we don’t believe in agile development”. What’s more, I’ve seen a good many others who, whilst paying lip service to agility and the good practices of Scrum, Kanban and similar frameworks, still fail to adopt the fundamental frame shift from rigidity to agility. This has been particularly evident in how these folks managed their requirements, costs and relationships on the ground. I’d like a quick word with these agile holdouts, and anyone else who needs motivation to adopt the agile development mindset.

The question to ask here is fairly simple, and best exemplified with an analogy from Physics. Is Scrum a historical step forward from Waterfall, like the step General Relativity took, when it outperformed Newtonian Gravitation, yielding more accurate results under extraordinary circumstances? Or, is Scrum much more like the leap forward that Newtonian Mechanics made, when it entirely replaced Aristotelian Mechanics – which in hindsight turned out to be plain batty? I’m inclined to think the latter.

Why? Building a software solution is not quite like building a bridge. The reason for this lies, surprisingly, not so much in the details of the engineering process, but rather in our big-picture objective of building that software solution in the first place. Unlike in the case of bridge-building, software solution development involves extensive, ongoing learning from the environment. In this case, the “environment” is the collection of intentions and fancies of the human users who would interact with the solution. Rather like in Darwin’s view of nature, a rudimentary piece of software evolves through hindsight into a highly useful – and consequently complex – one. We couldn’t possibly fathom its complete purpose and usage at the outset.

So, the process for understanding user expectations and providing a digital solution for a given domain (such as dating or cryptocurrency) must necessarily be gradual, involving the production of an initial artefact that collides with end-users, and then changing it continually according to a steady stream of end-user feedback. Therefore, an MVP release ought never to be seen as a mere step or two away from the final deal.

Also, development teams should not be evaluated only for their peak performance, when they are smashing the ball around the park in the first month or two. The whole delivery framework must be geared towards gradual learning, and the development team evaluated for domain interest, stamina and openness to change.

As such, nurturing a motivated, collaborative development team, whose contribution can be scaled up or down based on the velocity of end-user demands, is absolutely essential in today’s ultra-competitive business environment. Business owners, CIOs and tech entrepreneurs should gravitate towards appropriate contractual and/or compensation models that foster such collaborative teams. Scrum and Kanban methodologies, and their associated compensation models for extended offsite teams such as the monthly retainer, have matured sufficiently to deliver software product increments that can chain-react with end-users and evolve very successfully.

It’s time to bury “waterfall” and similar low-collaboration delivery frameworks, and their associated adversarial mindsets, in the annals of history.

What’s in a Business Analyst these days?

A few weeks ago, a graduate from a reputed university, who had studied software engineering, stopped by for an interview. He had applied for a role in Business Analysis, and naturally the first thing I asked him was, “what do you think a Business Analyst (BA) does, in today’s agile development environment?” He answered that a BA’s role is to help developers understand product requirements. So I asked him if he could imagine himself working on a project, and describe in some detail what he’d do to elicit product requirements from a business owner.

He struggled with a clear answer. At first I thought it was just his nerves acting up at the interview, but the more I got him comfortable, by asking him smaller questions like “what diagrams or artifacts would you use?”, the more he seemed to grope in the dark about the overall job role. He managed to name some of the diagrams and techniques used in business analysis, like sequence diagrams, use cases and flowcharts. And he was even able to describe what purposes they served. But he was unable to string together the different activities and tools, and describe the job role in a cogent manner.

After some discussion, I realized that he actually was not a poor candidate, at least in terms of his intelligence. Rather, the big picture of how one would set about serving the role of a BA seemed not to have been discussed as a part of his academic training.

I think he is not alone. I recall to mind the relatively small number of BA interviews I’ve taken part in (I’ve participated in over a thousand developer interviews, and yet only a half-dozen or so BA ones), and it seems to me that all of them ran the same course. So I thought it might be helpful to software engineering students, if I could lay out a plausible big picture approach to Business Analysis, which is aligned with today’s agile app development zeitgeist.

A BA would ideally commence the job by brainstorming with the business owner or entrepreneur seeking to develop the technology solution, and defining the solution holistically by penning down the overall idea in a couple of plain English paragraphs. Let us consider a Company “Leave Management” Solution as a hypothetical example.

I’d like to build a Company Leave Management Solution, where company Employees can see their leave balances and apply for leave through a mobile app, and where their Line Managers can approve leave via this same app, and where Executives can see reports on leave entitlement, usage and trends. The HR Team would be able to update employee leave balances through a Web portal, for speed and convenience. I’d like to have employees and companies enroll for the solution as a readily available service, by downloading the app on smartphones and signing up. We’d like to deploy a minimal solution within 3 months at the onset, learn from our experiences with our client companies, and thereafter extend our solution to suit their deeper needs.

This type of big picture vision, stated in simple language, is so important for the success of any project in app development or digital transformation. It sets people in the right direction, in terms of the broad vision, functions and actors that the solution encompasses, as can be seen at the solution’s inception. I would even suggest producing a marketing brochure or Website describing the concept to its customers, prior to actually building the solution itself; that is how important understanding the big picture is. Once this big picture is understood, we can then move on towards defining the actual software solution in an agile, evolving manner.

We can then identify the actors in the solution, as they appear at this moment: for example, an HR Manager, an Executive, an Employee and a Line Manager. We’d then set up a brainstorming session, and list down the User Stories that describe the expectations that each actor would have of the solution. A User Story from the previous example of the Leave Management solution might be, “As an Employee, I want to open a Leave Application Form in my mobile app. I’d like to see my Leave Balances for each Leave Type, in this form. I’d want to pick the Leave Type I’m applying for, and schedule the dates through a Calendar Control. I’d then like to Submit my Leave Application, which would be routed automatically to my designated Line Manager.”

Once we’ve understood the solution’s User Stories, we can begin to prototype the User Experience (UX) in earnest. This would typically take several brainstorming sessions to conclude, depending on the size of the solution. We’d first whiteboard rough sketches of the user screens and their input/output controls, so as to deliver the input/output expectations we have in our User Stories. It might be helpful to initially list down the approximate screen names that correspond to the overall mental picture we have of how users would intuitively engage with the system. This UX development activity should always be done in discussion with the product owner(s), a UX/UI developer and other helpful strategists, in a requirements elaboration workshop.

Once a few screens are conceptualized and whiteboarded, the workshop should break, allowing for offline thinking time for the participating team, as well as time for the UX/UI developer to produce a low-fidelity mockup of the discussed user experiences. There would always be refinements voiced at the next workshop session about the previous session’s screens, whose mockups would be shared at the beginning of each new session.

After several sessions of UX whiteboarding, we’d end up with a complete first cut of the solution’s User Experience, and the mockups would now be fit for high-fidelity creative enhancement. It would be important at this stage to begin creating a Software Requirements Specification (SRS) document, in which the User Stories and mockup screens would be embedded. Business rules that apply to the user actions in each screen would be described, along with any input validation rules that apply for the screen.

Let us consider the example of the Leave Management solution. We would perhaps only allow values in multiples of 0.5 for the Leave Quantity field, and it should not exceed the Leave Balance for the Leave Type being applied for. This would be logged as an input validation rule. The error messaging for input validation violations would be jotted down, such as “You have applied beyond your available Casual Leave balance. Please try again”. Most importantly, the business rule that (say) the submitted leave will be immediately deducted from the Leave Balance for the chosen Leave Type, on pressing the Submit button, would be noted. We’d ideally also record any special data types for storing the information gathered from the user, in this SRS narrative.
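
To make the rule concrete, here is a small Python sketch of how such a validation might be expressed; the function name, parameters and return convention are illustrative assumptions, not part of any real solution.

    def validate_leave_quantity(quantity, balance, leave_type="Casual Leave"):
        """Validate a Leave Quantity entered on the Leave Application Form.

        Rules sketched in the SRS narrative above:
          - only multiples of 0.5 days are allowed;
          - the quantity must not exceed the balance for the chosen Leave Type.
        Returns (True, "") when valid, else (False, error_message).
        """
        if quantity <= 0 or (quantity * 2) % 1 != 0:
            return False, "Leave must be applied in multiples of half a day."
        if quantity > balance:
            return False, (f"You have applied beyond your available "
                           f"{leave_type} balance. Please try again.")
        return True, ""

    print(validate_leave_quantity(1.5, balance=2.0))   # (True, "")
    print(validate_leave_quantity(2.5, balance=2.0))   # fails: exceeds balance
    print(validate_leave_quantity(0.75, balance=2.0))  # fails: not a half-day multiple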

The above process is the most rudimentary one for eliciting and defining requirements for a piece of software to be developed. There may be several other important sections in an SRS document, for communicating non-functional requirements such as security and performance expectations for the solution. There may even be some software solutions that are so sophisticated in scope that they require complex mathematical models or other tools to help define them completely. But it is highly likely that the overwhelming majority of software app solutions would follow a BA process like the one we’ve described above.

In conclusion, it’s important to understand that this BA process is not a waterfall-like, irreversible process. The essential activities may be repeated from Release Cycle to Release Cycle, where previously defined requirements may be changed and the SRS updated as a living document.

–Republished from my LinkedIn Article: https://www.linkedin.com/pulse/whats-business-analyst-days-ruwan-rajapakse/

Measures of enlightenment: Do all prominent factors for a happier existence show signs of steady historical progress?

Steven Pinker, a Harvard College Professor and one of the world’s leading authorities on Language and The Mind, is due to publish a new book titled “Enlightenment Now”, demonstrating historical progress in the human condition. Or at the very least, he expects to show us that humanity’s approach towards progress has been a learning experience, and that we’ve steadily incorporated measures that foster better living conditions for ourselves. This piece of research is a sequel to his equally illuminating earlier thesis titled “The Better Angels of Our Nature”, in which he elegantly demonstrated how human violence has declined over the decades, centuries and millennia.

I keenly look forward to his new book. I would also like to raise a question that arises as a natural consequence of the book’s title, and the preview of its core thesis that Prof. Pinker has shared here. I agree that the steady decline of physical violence, the evidence of economic growth not being at cross purposes with environmental conservation, the improvement in living conditions through increasing adoption of technology etc, are all contributory factors to increasing wellbeing. Yet I still wonder if human attitudes towards each other have shown an equally steady improvement, leading to higher levels of inner satisfaction and external social equilibrium.

In simpler words, have we learned to live happier with each other and with ourselves? I think this is a justifiable, empirical question that can be posed objectively. The factors hinted at by Professor Pinker unquestionably impinge directly or indirectly on our happiness, such as iPhones bringing in easy conveniences and opportunities for relaxation, or the reduction in oil spills leading to a safer environment. However, these kinds of factors are the more easily measured, “material” outcomes of human progress, and whilst important to human happiness, are nonetheless incomplete measures of our progress towards enlightenment and the lessening of human suffering.

Daniel Kahneman tells us that we communicate our subjective wellbeing to others in two ways: through our experiencing self and our remembering self. In brief, our experiencing self is our moment-to-moment feelings, expressed through our speech, smiles, laughter, tears and such. Our remembering self is our memory of past experiences. Kahneman tells us that these two kinds of reports can sometimes contradict each other, for one and the same experience, such as whether we enjoyed a particular vacation. The important point though, is that our happiness in both cases can be measured at least to some degree of approximation.

Equally important is the concept of “social technology”. By “social technology” I mean the innovations throughout history in human attitudes that reshape (or claim to reshape) our character for the better, and lead to enhanced inner happiness as reported by their adopters and as experienced by those whom they come into contact with. There has been a vast range of such social technologies invented throughout history, some arguably obsolete, and some potentially of great value to this day.

Gods one can appeal to and seek solace from when in difficulty, The Golden Rule of the ancients, the dissolution of the self and the practice of self-detachment in the face of adversity (as advocated by the ancient Buddhists and some modern secular thinkers), the art of forgiveness of the early Christians, the stance of non-punitive, restorative justice, the precedence of evidence over speculation or hearsay, the abandonment of superstition and the application of pragmatic solutions to life’s problems, the scientific method, psychology and the science of counseling, the art of reasoning through differences of opinion, the limits of income towards increasing happiness, or even such mentifacts as Sam Harris’s base moral stance of not harming the “wellbeing of conscious creatures” are some diverse examples of social technology that emerged in our turbulent history.

These types of social technologies have been claimed to help us live together in greater harmony, and/or yield greater inner fulfillment. It may be possible to research and chart our progressive adoption of such “social technologies”, over history. Have we adopted the ones that yield results, to the same extent that we brush our teeth, and abandoned the ones that don’t, to the same extent that we don’t use charcoal anymore to brush our teeth?

Yet another angle to my question is to inquire into the progress of our spiritual development. I used the word “spiritual” advisedly here, in the same context as someone like Thomas Metzinger would, where spirituality is understood to be a deep sense of intellectual honesty. Are we more intellectually honest today than we were in the past?

There is of course the tricky problem of determining measurable outcomes that demonstrate the mental wellbeing and spiritual development of humanity. What tangible outcomes can we survey to assess the qualitative progress in human wellbeing, beyond the material substratum shared in Prof. Pinker’s preview? We are not talking about oil spills here. Also, we are unlikely to get a very meaningful result by simply asking people “are you happy” over a period of several decades. Instead, I propose a survey of the following factors.

  1. Do young adults communicate a greater sense of optimism now than in the past?
  2. Are we more prolific today in our hobbies and pursuits per capita, adjusted for age and other relevant demographics, than in 1950?
  3. Do we claim to feel less anxious or threatened on a moment-to-moment basis?
  4. Do we report feeling progressively less mental duress – at work for example – over history, and exhibit fewer stress-related maladies and social consequences?
  5. On a mundane basis, are we increasingly free to be ourselves, express our opinions uninhibited (within the limits of non-violent conduct), without experiencing harmful consequences?
  6. Have we become less judgmental and more objective over time, focusing on resolving ethical dilemmas rather than assuming stances and polarizing society?
  7. When transgressions happen, are we more forgiving, understanding the frailty of the human condition?
  8. Have we progressively given less credence to speculation and hearsay, and focused more and more on evidence, in the media for example?
  9. Are we less superstitious, and increasingly rational?
  10. Are we kinder to each other, and do we help each other oftener, volunteering to provide advice and guidance?
  11. Do we have more fulfilling social lives, and have more friends and soul mates than (say) 50 years ago (the kind whom we discuss issues with, not our friends list on Facebook)?
  12. Are we more satisfied with what we’ve done with our lives, and have fewer regrets?
  13. Is jealousy on the decline?
  14. Are parents being more like friends and educators to their children, and less like authoritarian figures?
  15. Is politics becoming increasingly about team performance, and less about clan identity?
  16. Do we smile more, and laugh more?
  17. Are sociopathic personalities increasingly filtered out for positions of authority and for public office?
  18. Is public intellectualism becoming increasingly and uniformly popular in both the East and West?
  19. Do our education systems distill the progressive social technologies in an objective, secular manner, and instill them in our children?
  20. Do we brainstorm more often as families and social units?

Obviously one would have to structure these rather loose and diverse factors into a more cogent scientific proposition. It would certainly make an interesting research project, and perhaps Prof. Pinker’s new book addresses the above. Research into some of the individual factors mentioned may already be available, and it would perhaps be a case of mining the data and looking for an overall trend in the context of happiness and wellbeing.

My own hypothesis is that human beings have become less authentic and more hemmed in by retrograde social technologies like extreme political correctness, or by downright fear of unforeseen negative reactions. A few decades ago, a person would at worst have been politely chided for an inarticulate, yet well-meaning remark, such as “Tamils are good people”, for which one would be ostracized today. I suspect we have become so finicky that we hardly voice our thoughts out aloud in our own peculiar way.

Furthermore, I conjecture that in the past, there have been at least two peaks in human mental wellbeing, with troughs in between. One peak was at the height of the flourishing of the hunter-gatherer (or “noble savage” as Yuval Harari would say), just prior to civilization and the establishment of extensive social hierarchies. Wellbeing took a nosedive after civilization, because one’s actions were now largely controlled by a system beyond one’s power to change, and because vast numbers were literally enslaved for the purposes of economic progress.

A more recent peak was reached shortly before the rapid acceleration of the information technology revolution, somewhere in the 1950s. This was the time when Western economies were growing, scientific optimism was high, population pressure was lower in the East, most occupations were not so specialized, and lucrative opportunities were widespread for those of average cognitive ability.

I theorize that at the present moment we are approaching some sort of a local trough in human wellbeing, where the rat race is so cranked up, where population pressure in the East is still very high, where lucrative opportunities are shifting to the right tail of the ability bell curve, and where earlier socialist or welfare state ideals are frowned upon to such an extent that the baby they encapsulated (aspirations of fair play and a more even distribution of comforts) has been thrown out with the bathwater (the hegemony of the former communist dictatorships).

One hopes to get more answers from Enlightenment Now, and to research this topic further, about whether we are progressively happier by our own estimation, and whether our attitudes towards our fellow beings have become steadily more enlightened.

In praise of Universal Basic Income

…and why objections based on so-called ethical principles are silly

In a recent address to his graduation class, Facebook founder Mark Zuckerberg called for the implementation of Universal Basic Income (UBI): a well-known and forward-looking concept in the social sciences, where the state provides everyone with an income sufficient to meet their basic survival needs such as food, shelter and clothing, irrespective of whether they are gainfully employed or not.

I am greatly encouraged by this resurgence of interest in UBI amongst Zuckerberg, Musk and other Silicon Valley bigwigs. I noticed, however, that Zuckerberg’s speech in support of UBI drew a hostile reaction of non-trivial proportions across social media, and thought it worthwhile saying a few words in support of his cause.

Zuckerberg put the case forward quite clearly; he provided at least three compelling reasons for embracing UBI.

  1. Advancing technology and increasing automation are leading to fewer jobs.
  2. The dire need for a financial cushion for people, so they could educate themselves as adults, or engage in quality parenting, or perform other productive activities at different stages of their lives, which don’t provide a direct cash return.
  3. The undeniable role that basic financial security plays in fostering entrepreneurship.

I would like to reinforce Zuckerberg’s case for UBI by expanding and extending his rationale. I am charitably assuming of course, that Zuckerberg’s interest in the matter goes deeper than a mere desire to absolve people of the need to work, so they could spend all day on Facebook – a rather sardonic yet common enough reaction to his address.

Let us for a moment explore the ethical underpinnings of the objections to UBI. The commonest objection raised against UBI is the objection to giving people “a free lunch”, and thus “spoiling” them. I saw this particular objection echoed time and again in the commentary surrounding Zuckerberg’s address. One commentator stated this objection with poetic elegance, citing the good book. “In the sweat of thy face shalt thou eat bread”, he chimed in.

I don’t buy this ethic. Iron age religions codified our inherited instincts to forage and hunt, which were perfectly natural, into a dogmatic ethical principle that one doesn’t deserve to eat, unless one has worked for it. There are two problems with this rather unfortunate paraphrasing of our natural instincts. The first is that nature doesn’t set a precedent to frown upon idle eaters who reach out effortlessly for an easy meal procured by someone else. A male lion, mooching about on the savanna whilst the rest of his pride sweats hard to bring down a kill, may simply saunter over and dig into the carcass, without causing any ill feeling.

Of course one sometimes has to “work” to obtain a meal in nature, but not everyone has to work for it all the time, and, more importantly, providing food to others is not something for which one need demand an explicit return. This is the second problem with the canonical viewpoint. Social animals such as lions, gorillas or meerkats instinctively understand that opportunity is the biggest success factor in nature, and the individual who “wins the bread” shares it without placing demands. An ancient Homo sapiens may have brought down a boar and dragged it over to his tribal dwelling, to be shared with his kinsmen with altruistic pleasure. Group cooperation, and adaptation for altruism amongst kin, are well-known Darwinian processes.

Civilization and the rise of religious ethics changed this protocol of feeding each other free of charge, by sub-optimally placing a mandatory barter value on a meal; it had to be obtained by working (and usually working for someone else). To put it plainly, we were told we have to toil for every f…ing meal. We were conditioned to feel squeamish if we had procured our lunch effortlessly, even if it harmed no one.

Another fallacy, which again I suspect has its roots in the folk psychology of religions, is the idea that poverty is the main driver of success. One particularly benevolent commentator on the Zuckerberg story had these words of wisdom to say: “poverty will be merely a step you take towards success”. Really?! Contrary to this rather masochistic view, the majority of those whom I’ve met who had lost jobs due to no fault of their own, will attest to the huge dependence of their subsequent success on how much financial support they got, when they were “down”.

Rather than being a driver of employability, the fear of starvation often rushes and muddles up the process of getting back on one’s feet. Your friends and relatives push you to get any kind of job, which often doesn’t match your skillset, causing further disruptions in your career and more psychological distress.

I quote from a conversation that the political scientist Charles Murray had with the philosopher Sam Harris, where Murray says that an income stream actually improves moral agency, contrary to popular belief. It’s much easier for society to demand more from someone whose basic survival needs are already met. “Don’t tell us you are helpless because you aren’t helpless; the question is whether you are going to do anything to further improve your lot” is something we can tell those who are unproductive, yet receiving a basic income from the state. In contrast, far too many homeless people without a predictable income are powerless to land a job interview, simply because they can’t afford to dress up tastefully. This fact reinforces Zuckerberg’s third point.

Let us bring in another perspective to Zuckerberg’s second point. Many young people sacrifice their best years helping others, at the expense of helping themselves to a comfortable salary. If one raised a child (or looked after an aged parent or grandparent), one has discharged an important practical responsibility towards maintaining a civilized and productive society, for which one ideally ought to receive some material benefit. However, when such a dutiful person looks about to make a living after a hiatus in paid employment, they often face a forbidding society that won’t employ them again because they have a “broken track record”, or are “too old”, or judged to be “overqualified” if they seek a “lesser” job than they last held.

To expand on Zuckerberg’s first point, it is more than mere automation that future employment seekers must worry about. The demography of working class society is changing, towards the upper end of the IQ and EQ bell curves. The rise in importance of IT is a classic example. Coming from this industry myself, I can say that not everyone is cut out to be a good software engineer, for example. In fact, very few people are. Successful lateral career moves into software engineering are an absolute rarity, and worse, the ratio of employability of graduates keeps dropping over the years. It is harder to become an expert software engineer in 2017 than it was to become a successful corporate executive in 1980, in real terms.

The eminent historian Yuval Noah Harari predicts that, barring other catastrophic possibilities like extreme climate change or nuclear war wiping us out, humanity is reliably on course towards freeing itself from the shackles of existential labor, and morphing into a species that spends most of its lifetime on pursuits of a recreational nature, either of the intellectual or physical kind. Hence the title of his latest book, “Homo Deus” – human gods. Work, including food production and delivery, will soon be seconded to technology, and humanity will be left to worry about doing things to please themselves, or please each other. This doesn’t sound like all too bad a predicament for us, particularly if one doesn’t subscribe to those silly Iron Age philosophies about the sanctity of laboring for one’s meal.

I’ve purposefully not discussed the economics of walking towards UBI and ultimately a labor-less, recreation-focused society. I’ll leave that discussion to the economists and experts. Suffice to say that a very promising trial is in progress in Finland.

However, I argue strongly against any moral objections to freeing ourselves from the need to labor for our basic needs. Social norms are evolving, and it’s time that we freed our minds of the ancient burden of mere survival, in order to move 100% into the more joyous space of innovation and recreation. Just as Homo erectus evolved towards freeing two of its four limbs to use tools and develop its mind, Homo deus ought to evolve towards freeing its mind of the worry of survival, and focus on developing its technology and the quality of its leisure time, at an accelerated pace.

Alternative Systems of Medicine – a vacuous, outdated and dangerous notion

Many years ago, my uncle – who was a doctor – once told me, “There are no systems of medicine, just a system [singular]”. What he meant was that the only effective “system of medicine” known to humankind is the one that discovers new ways to heal the sick through rational supposition (about a drug or a clinical method), and subsequent confirmation through controlled experiments.

Over 150 years have passed since Pasteur & Koch confirmed the germ theory of disease, and it’s been nearly 70 years since the basics of health science and modern medicine were introduced into middle & upper school curricula in our own country. Yet I find that this foundational truth about the empirical nature of medicine has not taken root in the ethos of Sri Lankans. I see a worryingly large number of compatriots who believe that there are several alternative “systems of medicine” available at our disposal, such as “Western Medicine”, “Ayurveda Medicine”, “Acupuncture”, “Homeopathic Medicine”, “Astral Medicine”, “Alternative Medicine” or “Indigenous Medicine”.

Furthermore, the fact that there is a functioning government Ministry for “Indigenous Medicine” shows how far and wide this retrogressive misconception is entrenched in Lankan society. I believe it’s high time movers, shakers and socially conscious individuals mustered their courage and the necessary resources to launch a massive campaign to educate the masses away from this harmful notion of the availability of alternative “medical systems”. There are many “disruptive” campaigns afoot in Lanka to raise the consciousness of society about problem areas like Gay Rights, Smoking, Drinking, Drug Abuse, Women’s Rights and Children’s Rights. The addition of PSEUDOMEDICINE to this list is long overdue.

If one were to properly survey the magnitude of the damage caused by so-called “alternative systems” of medicine, calculated in the form of loss of life, debilitation and needless discomfort caused by the maltreatment of diseases, and the amount of money frittered away on bogus therapies for chronic or incurable conditions, one might be stunned by its enormity. It may very well prove to overshadow the combined “cost” to the nation incurred by the aforementioned problem areas, such as Smoking, Drinking and Drug Abuse.

I confess I’ve forgotten most of the details about elementary medical science that I learned at school; and yet I was impressed enough by the subject matter to have etched in memory such useful principles as The Double-Blind Trial, The Hippocratic Oath (i.e. doctors swearing to first do no harm to the patient), The Germ Theory of Disease and How Infections Spread, the Theory of Immunity and How Vaccination Works, the Hereditary Nature of Some Illnesses, the Unreliability of Anecdotal Evidence, or The Difference Between a Virus and a Bacterium.

Let us recollect for a moment the concept of The Double-Blind Clinical Trial; if memory serves me, this is something we learn about in our GCE O/L class. Any new medicine is assigned a trial period to test its effectiveness, during which neither the researchers of the drug nor the drug’s potential beneficiaries actually know who gets the potent pills and who gets the dummy pills that are thrown into the bargain to eliminate subjective human bias. We learned that an impartial third-party adjudicator does a random assignment of patients to pills (potent or dummy), and that this same third party gathers the raw results, performs statistical analysis, and presents only the final outcome to the research team.
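
As a toy illustration of that principle (and nothing more), the adjudicator’s random, blinded assignment could be sketched in a few lines of Python; the patient IDs and labels here are purely hypothetical.

    import random

    def blind_assignment(patient_ids, seed=42):
        """Toy sketch: randomly assign each patient to the potent pill or the
        dummy pill, and keep the key with the impartial adjudicator only."""
        rng = random.Random(seed)
        secret_key = {pid: rng.choice(["potent", "dummy"]) for pid in patient_ids}
        # Researchers and patients see only identical-looking pills:
        blinded_view = {pid: "pill" for pid in patient_ids}
        return secret_key, blinded_view

    key, visible = blind_assignment(["P001", "P002", "P003", "P004"])
    print(visible)  # neither the researchers nor the patients can tell who got what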

Have such impartial clinical trials ever been conducted to test the effectiveness of these so-called alternative methods of treatment? I challenge readers to present a single credible experiment conducted on popular “alternative medicines” like the thailayas, guliyas and arishtas of Ayurveda, published as a case study in a peer-reviewed journal. At best these substances facilitate the placebo effect – where a patient’s psychology improves immediately because they think they are under treatment, perhaps causing some degree of physiological improvement in turn, due to reduction in stress. At worst though, some compounds (such as Alcohol, Heavy Metals) in these “medicines” can be toxic when ingested over long periods of time, aggravating the original condition or causing other illnesses to crop up.

I suspect however that the biggest problem is that there are countless unreported cases of patients having delayed receiving proper medical attention for their complaints, because they counted instead on an “alternative” therapy to do its work. When their condition gets acute, they are rushed to hospital, where oftentimes it’s too late. Septicemia or other complications set in, causing death.

The way so-called Western Medicine is administered in our country is far from perfect: abuses range from incompetence, to the indiscriminate prescription of antibiotics for colds (which are caused by viruses and thus unaffected by antibiotics, unless there is a secondary bacterial infection that needs treatment), to delays in the treatment of acute infections due to fear of accountability for their side effects, to the administering of drugs without informing patients of their side effects or taking precautions against them.

The naked truth though is that in spite of these common imperfections in its practice, the “Western” system of medicine remains the only effective and self-improving system of medicine available to us, and its benefits far outweigh its drawbacks. There is simply no comparison with “alternative medicines”; they are mere hocus pocus, and represent an early historical attempt at healing the sick. They were superseded by modern, evidence-based medicine around 150 years ago. We must move on.

What worries me most, and what I am trying to address in this appeal to Lankan society, is that the knowledge we are taught at school about health and medicine ought to shape our subconscious instincts about the world. Much in the same way that gravity makes us shy of heights, or the volatility of petrol makes us shy of lighting matches near open fuel tanks, one would expect the educated masses to shy away from pseudo medicines and quacks reflexively. It is this instinct, to know when we are stepping outside of the medical system into woo-woo land, which I feel we ought to inculcate in our children.

The movement to educate society about how to look after ourselves and our loved ones in times of ill health is worthy of being elevated to the level of a profound social campaign akin to human rights, anti-smoking or gay rights, where the consciousness of the masses is sensitized to this issue through direct action. Where are the NGOs promoting health awareness?

I am by no means advocating here that we must become completely mechanistic in our approach to helping sick people. All human existential problems in general, and illness in particular, must without doubt be approached with a touch of spirituality. I personally am an atheist, unless one considers a belief in a deistic order in nature that transcends parochial religion as being religious. Yet I certainly could empathize with a more religious-minded person who says a prayer for her loved ones to recover. Any compassionate human being ought to be able to relate to this need for an almighty’s help, when one feels utterly helpless. However, a loving, spiritual approach to patient care clearly doesn’t include allowing charlatans to deceive patients and aggravate illnesses, or holding off on more effective treatments due to one’s sheer ignorance. It is this ignorance that we must eradicate.

If you care about your loved ones, and want them to be able to get the best possible medical attention when they fall sick, then please join this campaign and echo this mantra.

When you fall sick

Let’s learn about our bodies,
and how we fall sick.
There is just one system of medicine,
that makes us well quick.

Or even if it doesn’t,
and it only eases the pain,
it’s far far better,
than suffering in vain.

Do say a prayer,
to heal your sister,
but don’t waste your time,
take her to a doctor.

When you fall sick, charlatans will rush forth,
they will play upon your vulnerability, and take you up the garden path.
It’s only your education, and your desire to know the truth,
that will save you and your family, from the devil’s hearth.

Letter to a Burkini wearer


Dear Burkini Wearer,

I feel that wearing Burkinis (and indeed Burkas) doesn’t make good dress sense at this point in history. I particularly dislike the Burkini fad because I believe it symbolizes an outdated and implicitly offensive view of normative human relations between the two genders. As civilized human beings, we have an obligation to conduct ourselves inoffensively in public places, if we can help it. Please allow me to explain myself.

I readily concede two possible handicaps, which may impair my judgment on this matter. I’m not a woman, and I’m not a Muslim. I’d be grateful to stand corrected, through rational discussion.

I believe that anyone has a human right to wear a Burkini. Any attempt to introduce a law banning Burkinis would violate so many fundamental human rights during the process of enforcing it, that such a ban would result in a moral travesty. Forcibly stripping the garment (and the dignity) of a woman is simply unthinkable to me.

Sadly, something of this sort happened in Nice last month. I am very disappointed with those French authorities that were responsible for this physical violence against Burkini wearers. I recoil from the notion that Muslim women must be “taught a lesson” physically, for revealing their religious identity through their clothing. If anyone wants to wage a “war” against what they feel is a highly offensive dress sense, then the proper thing to do would be to reach into the hearts and minds of the wearers.

I find nothing offensive in the mere physical appearance of the Burkini. Nor does it appear to be an impractical garment for the circumstances it was designed to be worn in. The Burkini is not quite like the Burka. Burkas were originally meant to be universal, commonplace clothing for women, yet they inhibit physical dexterity and limit the range of activities one can participate in, in today’s world, such as running for the bus, motorcycling, walking in the brush, exercising in the park or even driving a car.

The Burkini has no such shortcomings in my view, within its envisioned purpose. It is more or less like a loose, hooded wet suit, suitable for wading into the water, swimming (although a figure-hugging wet suit made of the proper material might be more streamlined), or even hanging about the beach while avoiding a suntan. Burkinis might also be useful for those who have skin conditions or hair loss, which they’d like to hide when taking a dip. They come in attractive colors, can complement a woman’s figure, and are pleasing to the eye.

I understand that the Burkini was designed with good intentions. Aheda Zanetti presumably developed it as a step forward in the emancipation of Islamic women, allowing them to swim or wade in public places without revealing their skin and hair, thereby helping them to conform to the Islamic tradition of “modesty” in women. Women who wouldn’t swim beforehand, for fear of raising eyebrows in Muslim society by wearing a “revealing” swimsuit, are able to swim now.

I can appreciate the fact that some women, who have followed certain wardrobe habits through tradition, might feel awkward changing them. Perhaps it is similar to the awkwardness I felt the very first time I jogged in the park in running shorts (I was a very shy teenager). I agree that you cannot be forced to wear something you feel awkward in, such as a swimsuit.

I don’t think however, that it’s a major leap of faith to change one’s dress sense. Islamic societies have been changing dress patterns rather rapidly at various points in history, in countries like Iran, Iraq, Syria or even in Sri Lanka, where I come from. Muslims have lived harmoniously in cosmopolitan Lankan society wearing both western and eastern (Sari) dress for centuries. It’s only within the past decade and a half that we Lankans have seen the Burka come into fashion amongst Muslim women. Their mothers didn’t wear them.

The Burkini and its “parent” garment the Burka cannot be isolated from the loud religious symbolism that underpins them. Anyone knows that only Muslim women may wear them, and that it would be an offence (in the eyes of a Muslim) for someone who doesn’t subscribe to the Islamic teachings to wear them. This is quite unlike other traditional garments such as Saris or Kurtas, which were originally worn by a particular culture, but with no exclusionist philosophy attached to them. Christian Lankans and Atheist Londoners have been seen wearing Saris and Kurtas for decades.

In a day and age where inter-cultural collaboration has led to better prospects for humanity, I feel it’s a little ostentatious to flaunt one’s inner religious beliefs as if they were the most important thing about oneself to announce to the rest of the world. I feel the same way about the garb of Nuns or Priests, although nuns and priests are, by definition, renunciants from society. They would presumably like to discourage interaction with other people, except for solicited religious discourse. For women of the world, working closely with men and women of other cultures and religious denominations, I wonder if this flaunting of one’s religion makes good sense. It’s sort of like warning people that you belong to some intolerant cult.

Although some people might want to characterize Islam as such, I’m hopeful it’s not.

I am put off by the gender-demeaning symbolism of Burkinis and Burkas. The integrity and self-respect of both genders are challenged by this symbolism. Just think about it. In the case of the full Burka, we often find a well-dressed and otherwise attractive woman covered in what can only be described as a black cloth bag, to hide the “shameful” body she was born with. What are we ashamed of here?

Long before the advent of Islam, different human races had strived towards an optimal balance in body covering, balancing protection (from weather and sexual aggression) with display (of one’s unique identity and attractiveness). As dress senses evolved, we saw common patterns emerge, where one’s vulnerable places were often tastefully covered, whilst the rest of the body like the head, arms, hair, midriff and feet were often exposed (and adorned) for dexterity, recognition and beauty. Sure there were variations in the extent of cover, mainly based on climate. Those residing in temperate countries covered more of themselves because it was cold, and those in the tropics covered less because it was warm. There was no concept of hiding one’s entire body as a shameful object, with either gender. The fur coat of the Eskimo and the Sari of the North Indian are examples of naturally evolved wardrobe.

Furthermore, the majority of societies around the world developed systems of ethics, and rules of law, that strictly forbade women being molested by men at sight, for their bodily attractiveness. If we take Western Europe as an example, lawmakers and leaders improved social conditions over centuries, to allow attractive, figure enhancing dress to be worn by women, without being in danger of coming in harm’s way. The incidence of rape or violent sexual harassment due to the wearing of so-called “revealing” clothing is statistically insignificant in Western Europe today.

The philosophy of encasing women in order to protect them from the marauding instincts of men sets rather a low standard for men, and for the beautiful affair of human courtship. Since the days of the enlightenment, Western social norms neither accept nor allow disrespectful sexual submission; instead they expect high standards of restraint when it comes to sexual conduct. Women are not raped because they chose to be sexually attractive; rather, women occasionally get raped because of the psychopathic or violent behavior of errant men. Society has trained itself to despise such men, and to protect the freedom of women (and men) to express their sexuality (i.e. capacity for sexual feelings and sexual orientation) openly, as a necessary part of friendly, nonviolent courtship.

Western traditions around courtship are fine-grained, such as reading the right body language before venturing into a kiss. Sex and courtship have evolved away from the coarse-grained affair described in the ancient religious texts, where women either covered themselves to look nondescript, or got plundered by sex-starved men. Courtship is about mutual attraction, love and consent today. Westerners, or even easterners like myself who happened to grow up in a liberal, evolved society, feel a tad uncomfortable being implicitly branded as potential molesters of women.

I have this intuition that to be wisely dressed involves finding some middle ground between nudity and complete encasement in a cloth bag or skinny. Do you not feel this instinctively? That we should look nice and confident to others, but at the same time not offend others? That we should change how we dress based on our activities, our desire for comfort, and the weather?

If you do, I urge you to dress not for isolation, but for the occasion. If your society forbids you to do so, fight it nonviolently.
