Tuesday, November 17, 2009

The Internet and driving

While witnessing a well-warranted and funny episode of road rage today, it suddenly occurred to me how similar surfing on the Internet is to driving.

Every intersection is akin to a forum or a chatroom: there's a chance someone will oblige you with a nicety, but more (or less, if you live among kinder folk) likely, someone won't let you in, will block you, will cut you off, etc. Visiting websites is like stopping at points of interest. Road-tripping is much like online collaboration, the likes of which will only increase thanks to Google Wave. In the big picture, the main similarity is this: just as people often forget that the people who post on forums and in chatrooms are actual people, drivers forget that the people in other vehicles are people too. People aren't perfect, and they make mistakes.

On the net, nothing's THAT important that we should forget that other people are actually sitting in front of computers on the other end of those conversations. If you acknowledge their humanity, you end up being less misanthropic and angry towards them, or so I see it.

Driving is a bit different because if someone makes a big enough mistake, it becomes a life-or-death situation. Does that truly justify our anger, or are we just pompous about how we drive? We can't possibly expect everyone to know we're in a rush. And if this kind of anger is justified, then what justifies our anger on the net, where the situation is so far from dire?

Everyone makes mistakes, right?

Thursday, October 8, 2009

Suspension of disbelief

While at a play recently, a friend of mine pointed out a few things which he said broke his suspension of disbelief. In most artistic media, suspension of disbelief is important: it either makes things feel real or makes them seem possible. It makes stories more compelling because we get more wrapped up in them; we learn to accept the stories' worlds as our own, which makes identifying with the characters easier.

Nowadays, however, it is very difficult to suspend disbelief. Many of us know so much trivial information that it ruins our acceptance of a story if it includes something that contradicts those facts. In addition, media are so focused on being as accurate and detailed as possible that we can't help but build up this mental database of trivia. And, for the most part, knowing a lot of different things isn't a bad thing. We value things that are depicted accurately, and rightly so. This clears up a lot of the biases people hold, which can be a great boon.

In the realm of artistic expression, however, this isn't entirely helpful. In order to make things believable, to earn the audience's suspension of disbelief, the artist really has to do his homework. I mean that he has to research in depth all aspects of his creation and reflect all of it in his work. That's not to say this is a bad thing, but putting so much work into a secondary aspect of a project really does detract from the primary purpose of one's creation.

My friends and I are the type of people to go into a sci-fi movie and whisper to each other about how certain things aren't physically possible, or about how things aren't accurately portrayed on-screen. I have gone to some historical-fiction films and made a list of things that were historically inaccurate. Then I've separated those inaccuracies based on whether or not they were essential to the plot. I may be a geek - and a proud one, I might add - but I know that I am nowhere near alone in noting these types of discrepancies, though I will admit I'm one of the few who actually jots them down. I feel like this is a product of our well-educated and information-ready society.

Think back, if you will, to about thirty-five years ago. A plethora of horrible horror movies existed which were much more effective than today's, despite today's flicks having much nicer computer-generated graphics and more realistic make-up. As great as the original Star Wars movies are, I still turn away from the horribly choreographed sword-fights. On the other hand, I just saw the second Transformers movie. It was a ridiculously bad movie, but the graphics were beautiful and worth seeing. Again, I know that I am not alone in this regard.

Regardless, the times, they are a-changin'! The damage is done, and so now, we can only hope to move forwards. I realize that not every movie or television series can be as accurate and intense as Eureka or Battlestar Galactica. But, hey, Lost tries, and despite its occasional fact-bending, it's not so bad overall, not to mention that it's a great series.

Yes, I may be to blame for my lack of tolerance for incorrect facts and inconsistencies, but I'm sure the number of people who agree with me is growing. And really, is it so hard to get things right? At the same time, I definitely think we need to learn to look past this kind of stuff a little more, so that we can appreciate something for its message while artistic media catch up with our demand for high quality. I can appreciate that getting the details right is tedious and annoying, and requires quite a lot of work, but it's worth the effort to have a much more convincing world in place. The more convincing the picture, the more likely we are to see the artist's message without getting distracted.

Unless, of course, the point is the distraction itself.

Saturday, October 3, 2009

Of languages, computer and human

I noticed an interesting difference between humans and computers the other day. I was conversing with a fellow linguist about language and technology and I sort of stumbled upon it.

The progress of language is the same as the progress of computers, but their sources are reversed.

In the days of the early computers, we had vacuum tubes and punch cards for data storage. To get work done, one used a terminal (one of many) connected to an always-running computer, and most everything was done in RAM. Then along came something wonderful: magnetic storage. Now you could store data much more efficiently, to be accessed and modified later. As with all things, however, there were a few obvious problems. One was capacity and the other was price. Price went down over time, and personal computers became more and more common.

Capacity was another issue. In order to save as much space as possible (i.e. maximize available capacity) and to keep the price down, interfaces didn't progress all that much for a while. Text-based command-line input did wonders, but it proved hard for many people to adapt to. There was a significant learning curve. However, these systems were so efficient that they still exist and are in use today. And the more predictable and formulaic commands became, the easier things got.

Then along came a concept called the GUI - the Graphical User Interface. As storage capacity increased and prices dropped further, it became much more feasible to run an interface that was easier to use at the expense of being less direct and less efficient. This trade-off really mattered because it opened computers up not just to those familiar with them, but to those who had never even touched one before. It allowed access to "outsiders," people who weren't members of the computing community. The learning curve dropped. Progress, however, became even more dependent on increases in storage capacity.

On a related note, as computers become dated, those who are more "tech-savvy" often return to more efficient operating systems to run on their older hardware. Linux is a favorite for many. The reason is that you can run new software on these older machines and still have them be usable by decreasing bloat. I, myself, turned an older computer into a server devoid of any GUI. Without the "bloat" of a GUI, it remains very useful and usable for many things. And I won't be as affected by software deprecation as I would be if I had left an unmaintained old operating system running.

You can see how the progress of personal computers was based on simplification of the user interface, so that others could use them more effectively even if they weren't great with the technology. Accessibility came at the cost of dependence on storage. As storage became available at higher and higher capacities, processing efficiency became less important to the average user (think today's average user, not the 1980s average user).

The progress of language also works towards increasing accessibility.

Long ago, language was a very difficult thing to grasp. This seems counter-intuitive because language is so fundamental, but it's easier to see when you look at the number of people who were bi-, tri-, or multi-lingual: much lower than now. There are many reasons for this, such as the fact that globalization wasn't as widespread as it is now. This can be seen in the biases of the Western world and of early Sanskritic society in India. Greeks took such pride in their language that they deemed anyone who could not speak it uncivilized. This definitely carries through time in the concept of "The White Man's Burden." Of course, that particular example is not based solely on language, but I've always found that language and culture tie together so intimately that they almost certainly go together. This is especially true when analyzing one's cultural identity.

When we look to the east, however, we find that multi-lingualism increases. This is in no small way based on the silk routes from the Middle East, through India, to China. The advantage of being multi-lingual is multi-faceted, especially regarding the business world. It was also important because the Middle East, India, and China all retained many subcultures, each with their own language. India today has over 20 official languages, not to mention the many "dialects" of China (most Westerners only know of Cantonese and Mandarin, but there are many more).

Back to the point: another important reason multi-lingualism was difficult is that languages had different characteristics than those of today. The largest spoken language family is the Indo-European branch. These languages were originally highly inflected, and word order mattered less. This means that words had many, many different forms based on their use in a sentence. Verbs had many more conjugations than we often see today, and their associated nominal usages also required lots of rearranging. Latin, Greek, and Sanskrit are primary examples here: lots of word forms and usages. The benefit was that it was easier (in many ways) to convey meaning. On the whole, though definitely not always, one could convey more precise information using fewer words, because the word endings carried the meaning.

[Important and notable exceptions to this rule are languages in the Sino-Tibetan family, such as the Chinese languages, and other such languages that were not directly or immediately related to or in contact with Indo-European languages.]

Now, growing up and speaking these languages was one thing, and learning them was quite another. Unless you grew up in a place that spoke more than one language, you wouldn't necessarily learn the other language until you were an adult, and learning languages becomes significantly harder after your teens. As a result, speaking more than one language fluently was rarer (moreso in the West, as I stated before) than it is today.

When you look at an inventory of words in these languages, you may notice that each verb root has many, many different variations, which may or may not be formulaic. You could consider these highly inflected languages to be more "storage"-based than Chinese languages or today's languages. As time went on, languages diversified, but began focusing less on nominal cases, simplifying verb tenses and conjugation-groups, and focusing more on word order. As a result, you learned more differentiated words and fewer eccentric morphological endings. If you learned fewer word forms on the whole, you could still convey meaning, albeit with more words in each sentence. Less efficient, but much easier to learn for those who weren't so great with languages. Multi-lingualism got a whole lot easier - not overnight, of course, but through slow, steady progress over years and years. Still, you can understand how and why things changed.

Orthography is important, but in my opinion, it didn't matter as much to the average person until the Arabic empire rose to its height. This was the era of copying and preserving, leading up to the invention of the printing press by Gutenberg. Prior to that, writing was important, but not so integral to the learning of language, especially if that language was a second, third, or fourth one.

Another benefit to more formulaic language is that it frees up time to think more abstractly. Language becomes less of a pure inventory, so we're free to remember more. It also requires less attention because we can always add more words to alleviate ambiguity later. This was always true, but is much more apparent now. At least, so I've noticed. Anyway, this way, we're free to multitask better.

So, here we can see a definite change towards accessibility at the (relatively slight) cost of efficiency. However, you may, as I have, noticed a few important differences between this and the progress of computers.

Computers moved from always-on, process-based centralized systems towards individual computers that were more easily accessible because of the presence and development of ever-increasing storage. Language moved away from pure "storage" towards more formulaic usage. It became more "process-friendly" in a way, and this is definitely true when we do consider the entrance of orthography to the mix. Knowledge can be stored and accessed later, but the process of learning (how to read especially) becomes elevated.

The thing about the older languages is that in their earliest forms (Mycenaean Greek, Vedic Sanskrit, and Old Latin), many of these inflections weren't so standardized. There were more exceptions to the rule, and these tended to decrease as time went on, much like the command-line systems whose predictability eventually became nigh-universal. Another similarity is context. As Latin, Greek, and Sanskrit became liturgical languages, they were standardized, much as Linux commands were alongside the rising use of Windows and Mac OS. Latin and Greek came to be used specifically for scientific naming as well, just as Unix and Linux are arguably the defaults for high-end servers.




As technology tries to break the limitation of today's magnetic storage abilities, we should take some time and think about our language. In light of today's post, I challenge you to take the time to read, write, and speak more efficiently, at the cost of speed and time. I guarantee that if you do this for a while, you will gain something from the simple act of moving a little more slowly, along the lines of "Ungeek to Live."

Friday, July 17, 2009

Funniest Buddhism joke ever.

I think I just found the funniest Buddhism joke ever. It's pretty dark, but if you're Buddhist, you should be used to at least a little bit of nihilism. And, you probably shouldn't take offense to it. =P

Saturday Morning Breakfast Cartoons - #769

Monday, July 13, 2009

A long-misplaced analogy

Operating systems have changed a lot from when computers first fell into the mainstream. Movies like "Hackers" and "The Pirates of Silicon Valley," despite their errors and inconsistencies, still popularize computer culture (as well as the computer-related counter-culture).

For those of you who are really familiar with computers, there are a variety of operating systems out there to make things more convenient for the end user. They also provide major headaches when things go wrong. As if you couldn't tell by the premise of this blog, religious paths can work the same way.


[Let's take a moment to understand that any copyrighted names of software or anything else doesn't belong to me, but to their actual owners, stated here or otherwise. It's the internet; I'm using names and very clearly not trying to steal them, so don't sue me.]


Let's take a look at Windows. Windows is, more or less, a de facto standard for both business and pleasure. Consequently, Windows also has a lot going for it in the way of software. This is especially true for people who can't find applications for their purposes; they end up programming for Windows because that's what they use. In the business world, it's fairly easy to see why Windows wins. Customer support is provided world-wide, and to date, I have not encountered more diverse, more easily accessible, or more integrated language support. You also have a MUCH wider variety of hardware components and software utilizing them.

On the downside, Windows gets to feeling really bloated after only a few months of use. Sometimes, with such a variety of hardware and software (not to mention parsing through drivers), you'll find stuff that acts up, stuff that doesn't work properly, and stuff that just disappears. Let's not forget that being a de facto standard means most malware is designed to exploit any and all vulnerabilities in the software; fixes only provide temporary relief. I won't even begin to discuss Windows' paltry CLI.

There's also Apple. Mac OS, especially from Tiger onwards, is really, really easy to learn and to use. Nearly all aspects of the OS are fully integrated with each other. iTunes, the iLife Suite, and other software solutions all work together very well. And, being based on FreeBSD means that those who like Unix can jump into the Darwin prompt and do things in the "old-fashioned" non-GUI way. There's also an almost non-existent threat of malware. Things in general run more smoothly because Apple decides what hardware is supported, and so there's less bloat in terms of drivers and such.

This approach has its drawbacks, however. There's MUCH less flexibility in the way of hardware configuration. This also means you're paying more for the hardware you're getting, and you can't configure a comparable system for less because the software is effectively tied to the hardware. There ARE those who have gone the route of "Hackintoshes," but the practice has yet to be deemed legal, and there's a whole slew of problems on that route as well. Things often don't work perfectly, and you really have to know what you're doing to fix those problems on your own. Other drawbacks of the Mac platform include sub-par language support, occasionally restrictive software choices, and being bound to specific companies based on hardware configuration. This last one is a big deal to some people, as Apple doesn't support ATI or AMD up-front (buying a card for your desktop would be a slightly different issue, in my opinion).

And then there's Linux, or Unix, or Solaris, or what have you. These are really interesting OSes because they have their roots in computing antiquity. While the other two platforms are almost completely GUI-oriented, with the *nix group, you have a choice in the matter. Underlying Linux is a command-line interface that allows you to interact with the OS directly, in a very different way from the GUI. If you choose, you can have the GUI instead, or you can have both. And the commands you learn, or the programs you learn to use via the CLI, can be used on other systems and other distributions of Linux. These systems are based on "old-school" ideals of freedom ('free' as in 'beer' as well as 'free' as in 'speech'). The community surrounding each distribution, and the platform on the whole, is amazing and very supportive. And if you have old hardware that you don't want going to waste, Linux can breathe new life into it.

The drawbacks here are numerous, though. The end-user at home doesn't get much in the way of technical support, at least not like you can with Windows or Mac OS. You often have to resort to the community for help, which isn't so much a bad thing as it is a waiting game. Sometimes you go through a lot before you find a solution. Sometimes you don't get proper answers. Sometimes the first suggestion you get works. It's not always consistent. Even Ubuntu - often hailed as the most newbie-friendly distro - doesn't always behave the same on every system. If you're buying or building and you plan on running this type of OS, you have to do your homework to know what will work easily, what needs work, and what won't work at all. There's also a steep learning curve, not so much because it's hard but because there's so much to learn, and it all depends on what you're running and trying to do. One consolation is that as you learn, things get much, much easier, to a point. Some of what you can do is truly amazing; you just have to figure out what you're doing. Eventually you can play around with things and get a feel for what they do. If you don't develop that type of approach, though, you're at a significant disadvantage.



Religious paths work similarly. You have those approaches which offer diversity and are easily accessible, or are very mainstream. However, they're not always "user-friendly." Sometimes you get a lot of contradictions. Sometimes they appear counter-intuitive.

The least outrageous comparison is that of the Linux/Unix type of OSes to mystical paths. On mystical paths, like Wicca, New Age paths, Neo-Paganism, Zen/Vipassana/Tibetan Buddhism, Yoga, or even the Hindu Monastic sects, the community is a very integral part of the learning process. The learning curve is not the easiest, and finding good sources for guidance is often a trial in and of itself. Still, you really get a feel for direct, intuitive interaction with the Divine. If you choose to have some visual or ritualistic approaches, then those are available as well. Mainstream aspects are often found as-is or adapted for mystical paths. And, sometimes things like meditation find their way into more mainstream paths as well.

The Windows of the religious traditions would most probably be faith-based paths. Easily accessible, nearly universal in its approach, and at this point, a de facto standard for the approach to Divinity. At the same time, it has its flaws. It's much harder to preach faith to a group when you have some "free-thinkers" in the bunch. I'm not saying it's not possible, but it definitely requires significant effort and time to reach out to them, as opposed to those who take to faith unquestioningly. On the other hand, usually you get a high-volume of people who take to your "platform." Religions that fall into this category are mostly mainstream, like most Christian denominations, Islam and Judaism to a large extent, some 'Hindu' sects, like Swaminarayanism, and many MahaayaaNa Buddhism schools, such as Pureland.

There aren't many "Macs" in the slew of religions in the world. I'd argue that mainstream Hinduism falls into this category, as does Sufiism, some Judaic traditions (the study of Kabbalah comes to mind), and Eastern Orthodox Christianity. Hinduism is very Mac-like because of the openness inherent in its paths. The Bhagavad Giitaa itself affirms the karma-yoga, jn~aana-yoga, raaja-yoga, and bhakti-yoga, so it has its easy-to-learn side, but also the tinker-under-the-hood side. Most schools generally agree that Sufiism is technically an offshoot of Islam, so as such, you are required to obey the five pillars of Islam. But, Sufiism also offers a mystical approach that's distinctly different from the Linux-like religions. Asceticism is kept to a minimum, and Sufis are encouraged to have families and marry, and to have jobs and contribute to the community. Instead of seeing those aspects as a hindrance to their path, they value the lessons learned from them and relish the added difficulty they provide. The Eastern Orthodoxy's approach is slightly different, with lay people encouraged to take spiritual, meditative retreats, and under proper guidance, use mystical means to add to their religious experiences. In contrast, though, you're definitely stuck in terms of "hardware." You'd be hard-pressed to dedicate yourself to most Sufi schools without converting to Islam. In order to properly understand the teachings of the Kabbalah, it's usually understood that not only should you be aware of the Judaic tradition, but also have Rabbinical training. Eastern Orthodoxy and Hinduism also have their baggage as well.


I'm not saying that one's better or worse. Ultimately, technology should work based on your needs. I feel that religion should, too. The trick is that while most of us can admit it to ourselves when we're indulging a little in a new computer, we don't always do the same with life as a whole. As such, we can't properly assess our needs.

It's a wise man who can differentiate between his needs and his wants, let alone choose to follow his need over his want.

Still, I think we should really try to think about religion as something that's meant to ultimately help us, even if we find ourselves restrained a little. It's also something that needs some sort of updating. We definitely face issues that we didn't face two-thousand-something years ago, and we also have knowledge we didn't have back then. Fixing bugs is important, and it also doesn't hurt to add new features.

Wednesday, April 29, 2009

King of the Hill and Buddhism

When I took Buddhist Philosophy at Rutgers, my professor played for us an episode of King of the Hill: Won't You Pimai Neighbor?

It's a great episode and a lot of interesting Buddhist philosophy can be found in it, but really it's the ending that does it. Bobby finds out that he's the reincarnation of a lama after a few monks come to visit his girlfriend Connie's parents. Things go well for a time while they wait for another monk to come and confirm the finding. That is, until Connie and Bobby find out that lamas are celibate monks and cannot have relationships. Bobby worries a lot and even considers trying to throw the test, but realizes he can't when Connie explains her feelings about her religion and tells him what would happen if he purposely failed.

It's a heartwarming episode. At the end, he's supposed to choose among the effects of the old lama. He chooses Connie, by pointing to her reflection in the mirror. After everyone leaves, one of the monks talks to the newly arrived head monk, pointing out:

MONK # 1: But that was Sanglug's mirror.
HEAD MONK: I know, but he didn't pick it.
MONK # 1: But he used it.
HEAD MONK: Tough call. But it's mine, and I made it.

One of the ideas behind this exchange is that Bobby passed the test, but did not choose to be a monk. His relationship with Connie in this life was more important to him, and to take him away from that would be unacceptable.

While this is not exclusively Buddhist in ideology, and while it's neither highly detailed nor completely (in)accurate, it is notable for the fact that it provides a tangible argument in an appealing way, and that it's set in modern times. It's a great episode and I highly recommend it. That kind of television is something else entirely.

Monday, April 27, 2009

Linguistic similarities in Arabic and Sanskrit vowels

I don't know if this belongs here, or in A Modern Hindu's Perspective, but since it's not directly religious in any way, and provides very interesting notes linguistically, and since linguistics has a major impact on modern technology (by way of voice recognition, sound analysis, and linguistic interpretation by machines), I figured it wouldn't hurt to throw it this way.

NOTE: Here, I use my slightly altered version of ITRANS for sanskrit (saMskRta) transliteration. I use the Buckwalter transliteration for arabic (Eraby). For the purposes of this post, my alterations to ITRANS are negligible. Also, I will describe the appearance of the ta$kyl so that people who are familiar with arabic phonetics and logography can follow along without worrying about the transcription. Also, while I've taken linguistic classes and studied both languages with native/polished speakers (one more than the other), I am in no way a linguist. Thus, I do my best to be as accurate as possible and give helpful links, but feel free to study both languages and compare yourself.

Also, if you have no idea what I just said, don't panic! Just read on and you can ignore that jazz.




Something interesting I noticed today was how arabic's vowels and sanskrit's vowels are similar. In arabic, you essentially have three tiers of vowels, dictated by length (traditionally, in terms of beats).

You have the long vowels (in order of strength): yA (/y/), waw (/w/), Alif (/A/). These are pronounced as "ee" in 'beet,' "oo" as in 'boom,' and "a" as in 'hat,' in standard American English. They are held for two beats.

You have the short vowels (same order as long vowels), the kasra, Dam~a, and fatHa. There is one phonetic difference here: the kasra is often pronounced like "i" as in 'bit,' but they do still correspond to the longer vowels. These short vowels are held for one beat.

Then, you have the hamzap (hamza). It represents the glottal stop, which most English speakers will recognize as the hyphen in 'uh-oh' - that short abruptness you cause when you close your throat. The hamzap in arabic has a vowel quality associated with it. In writing, when it appears at the beginning of a word, it is represented as an Alif with the hamzap under it (for the yA equivalent), an Alif with the hamzap and Dam~a over it (for the waw equivalent), or an Alif with the hamzap over it alone (for the Alif equivalent).

For English speakers, think of that teenage apathetic "I'm not interested" sounding "eh," except with the aforementioned vowel sounds and shorter.

These hamzap representations are held for a half of a beat in duration.

Now let's get to the sanskrit representation.

You have the long vowels (in a comparative, not traditional, order): /ii/, /uu/, and /aa/. /ii/ is pronounced just like arabic yA and /uu/ is pronounced just like arabic waw. /aa/ is NOT pronounced like Alif, however; Alif is more frontal (remember, "a" as in 'hat'), but /aa/ is a little farther back. Think "a" as in 'far,' or the first "o" in 'October.' The long vowels are also held for two beats.

You have the short vowels (same comparative order): /i/, /u/, and /a/. In sanskrit, /i/ and /u/ have the same quality as /ii/ and /uu/, but are just one beat in length. This changes in modern Indian languages, where the short versions end up sounding like "i" as in 'bit' and "u" as in 'put.' Also, /a/ has two schools of thought as to its pronunciation. In one, it's pronounced just like /aa/ but held for one beat. The second, more predominant school has /a/ pronounced like "u" as in 'bun' and "o" as in 'done.' Here, too, it is held for one beat.

Then, you have two semivowels, /ya/ and /va/. /ya/ is a palatal semivowel and is associated with /i/ and /ii/ in sanskrit's system of sandhi (which documents phonetic assimilation). In vedic sanskrit, /va/ was pronounced like an English "w," but came to be pronounced like the English "v." However, it still remains the labial semivowel, related to /u/ and /uu/.

Let's say you have a sanskrit word, /karmaNi/ "actions." Then you have another word after it, /eva/ "only." You put them together in a phrase and you get /karmaNi eva/. However, in sanskrit, you must apply sandhi, and the /i/ changes to the semivowel /ya/. You end up with one word, /karmaNyeva/, which still means "only actions."

It's a little bit easier to say and if you were to say the two words in casual speech (read: quickly and not in a metrically significant way), you'd end up with this anyway. To steal the wikipedia article's example, think of the phrase "don't be" in English. You say it casually and quickly, it comes out as "dome be." It happens in a great deal of languages, and instead of forcing uncomfortable articulation (when you speak "properly" or formally), sanskrit accepts and documents it, and then "forces" you to apply those changes (you apply them when you speak "properly" or formally).

/karmaNyeva/'s semivowel conversion illustrates that /ya/ - and the "y" in English, for that matter - is just a broadening and truncation of the vowel /i/, to which you can then attach another vowel. Sanskrit doesn't like consecutive vowels, unlike Greek and Latin (mostly Greek). This is how it deals with them. But, getting to the point, this makes the semivowels /ya/ and /va/ like very short, half-beat-length vowels in their own way.
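For fun, the vowel-to-semivowel rule above is simple enough to sketch in a few lines of code. This is purely a toy under assumptions: the function name `apply_yan_sandhi` and the simplified vowel set are my own inventions, and real sandhi involves many more rules and exceptions than this one.

```python
# Toy sketch of one sandhi rule: a word-final /i/ or /ii/ becomes the
# semivowel /y/ when the next word begins with a (dissimilar) vowel.
VOWELS = {"a", "i", "u", "e", "o"}

def apply_yan_sandhi(first: str, second: str) -> str:
    """Join two words, e.g. /karmaNi/ + /eva/ -> /karmaNyeva/."""
    if second and second[0].lower() in VOWELS:
        if first.endswith("ii"):
            return first[:-2] + "y" + second
        if first.endswith("i"):
            return first[:-1] + "y" + second
    # No rule applies: leave the words separate.
    return first + " " + second

print(apply_yan_sandhi("karmaNi", "eva"))  # karmaNyeva
```

A word followed by a consonant, like /karmaNi ca/, passes through unchanged, which is roughly how the real rule behaves too.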



It's fun to see similar structures.


Now, in Arabic, you have two diphthongs, /ay/ and /aw/. Each is a fatHa (one beat's length of Alif) followed by yA or waw, respectively.

Sanskrit has four diphthongs (merged from a few more): /e/ and /ai/, and /o/ and /au/. Of these, /ai/ and /au/ come to mind. Scholars think they may originally have been pronounced as /aa/+/ii/ and /aa/+/uu/, but in modern pronunciation (and perhaps as far back as classical Sanskrit), they are pronounced as /a/+/i/ and /a/+/u/. Each of these two diphthongs is two beats in length and is classified as a "long" vowel in Sanskrit. Nice parallel structure, eh?



I don't know how familiar you guys are with linguistics and such, but this is pretty interesting to me because Arabic is an Afro-Asiatic language, while Sanskrit is an Indo-European language. The two languages are pretty distant in terms of linguistic genealogy, and the features mentioned here are old in both respective languages, implying that a much later borrowing of structure and phonemes did not occur; it's more likely that each language retained these features independently. Also, while I used Sanskrit here, please note that these vowel structures are in use in modern Indian languages in general, though the use of sandhi has declined in favor of consecutive vowels.

You may also be interested in how Sanskrit and Avestan are related.

Sunday, April 19, 2009

YouTube's Audio Content Fingerprinting...

While this is predominantly a more spiritual blog, it also deals a great deal with technology. As such, it's my pleasure to give you all a link to an experiment conducted by a good friend of mine, Scott Smitelli:

Fun with YouTube's Audio Content ID System

Scott explains what YouTube's Audio Content ID System is and conducts a battery of experiments to see what limitations that system has**. He does a great job with his tests and describes things for the layman while also giving all of the settings he used and extensively describing his procedure. You'll even find a neat little table of his results and a concise "Conclusions" section toward the end.

If you ever find yourself wondering why something you uploaded is muted, it'll pay to know why!

**Please read the disclaimer at the bottom of his page. I, too, am not responsible for what you do with this information. There's something to be said about knowledge for knowledge's sake.

Enjoy!

Monday, April 13, 2009

From the (g)olden age to the information age

I try not to get too personal here, but some interesting things have happened that may make me cross that line a bit. This primarily concerns children of immigrants or those who have large elements of other cultures, but it's also food for thought for everyone. Bear with me, this is going someplace.

My grandmother lives with us. She's a dear and all, but she can be very annoying at times. She's my dad's mother, and so you always have that "mother-in-law" tension going on, but being here for 25+ years does a lot to ease that. When she's not criticizing in a "subtle" way, she's usually nitpicking at other things, but I feel that that's some leftover idiosyncrasy from her family life. Today, she was in a good mood, so I knew good conversation was coming forth. Normally, she talks about religious topics; folk stories and deeper religious subjects come naturally to her. Sort of ironically, she's the most religiously liberal person I know.

Today, however, she decided to talk to us (my brother was present, too) about her childhood and her experiences. From where she was and how she grew up to where I am and how I've grown up, there have been tons and tonnes of changes. My grandmother told us stories of how she grew up in a village, how she could walk between two or three villages easily, with no worries about safety, about how she moved to Mumbai and picked up on some Marathi, about how her parents were and what was important to them.

As first-generation American-born kids, we have a lot of responsibility. Inevitably, the culture that's handed down to us will dissipate and fade over a few generations. Language usually goes first, and food goes last (or so I've found, though for me it's been the opposite). Why is the preservation of this culture so important?

The way I see it, we learn very often through experience, but we also learn important lessons through the experiences of others. We learn not only how to act, think, and feel, but also how to be through what we learn from the experiences of others. If you have access to a wealth of experience from someone else in a drastically different place and time, why would you not use it? Of course, repeating "In my day, we had to..." as if it's some holy mantra doesn't really help the cause, but when you can show your kids, who are growing up with really nice plumbing and the internet, what it was like to have to pump your own water and walk for miles just to get to school, you can show them that there's nothing wrong with hard work. Being smart isn't the only thing in the world, and there are plenty of people who don't luck out and have to work hard to make ends meet. Most importantly, just because you happen to be smarter, or work harder, doesn't mean others don't deserve your respect.

There are important things worth keeping. Because of my background, I had an easier time identifying with and learning to work with certain ideas. Some of this was because of the remnants of clan culture that were passed down, but a lot of it was independent of what my parents and grandparents taught me. There are even a handful of ideas they consider completely rebellious. But I found a lot of value in what I've learned. I want my kids to share that.

Actually, no I don't. My kids, and everyone else's, will be their own people. I can only try to give them what I have and hope they can use it. They will have their own dreams and hopes, beliefs and rebellious ideas. All I can do is provide what I value, in hopes that they will use it if they need it. I really think this is something that people in general forget. You don't have kids to make them something; kids are there to become something of their own.

And, while we have an easy way to find what we need, language doesn't quite work the same way. In my grandmother's time, and up to my parents' time, you could easily tell where someone was from based on how they spoke. It's true of a lot of places today, too, but it's different when more than half of the active vocabulary in a language becomes borrowed English words. Gujarati is now degenerating rapidly, and indeed, so are many of the world's languages. It's not just English that's "taking over," either.

My grandmother's brand of Gujarati is very different from others'. Her dialect is one small way she identifies herself. We all have a need to identify ourselves and find belonging (consult Maslow if you're doubtful about that), and for many, familial/ancestral idiosyncrasies really help. More often than not, we mix and match those with our own ideas.

And this all will eventually fade. But, while we can, we're responsible for keeping it going. Not in some trivial way. I hated being criticized for not being "a real Indian" because I ate meat (Hindus have a long history of eating all sorts of meat) or because I spoke imperfectly (everyone actively makes mistakes in speech in all languages). Just because I don't value what they do, or they don't value what I do (you wouldn't believe how many of them don't know the first thing about actual Hinduism or Indian history), or I don't value it for the same reasons, I get criticized. The big picture is what's important here. What's important is that I find some sort of identity, and for my own purposes. We'd do well, all of us, to keep this kind of stuff in mind.

So, while I may be very annoyed at my grandmother for her nitpicky habits, I really love the times when we connect over important issues. I don't think I can properly explain what the internet is to her, or how the times have changed between mine and hers, but it's enough to know the difference and smile eagerly when she's telling me stories. Appreciation goes a long way. And, when she's stubbornly trying to be as self-sufficient as possible, I'll let her do her thing, but stand close by in case she does want my help. I hope that the future will see the past's desire to be self-sufficient, but will offer its help when necessary. It's the least we can do.

Sunday, March 15, 2009

A Technological Monk

Today, I'm going to elaborate on something I've discussed before: why it's important to take time to slow down.

I am a technological monk, a modern monk. I meditate, though not nearly as frequently as I want (or need) to. Truth be told, we have to make time for the things we enjoy, and one of these days I'll get around to working it into a routine or schedule. Until then, however, I make do by slowing down. I adapt the older techniques to a more modern way of life.

The fundamentals of Eastern meditation can be found from the Upanishads down to Patanjali (Deva: पतञ्जलि, pata~njali), who compiled the yoga sutras. Actually, the first line of the yoga sutras is as follows:

अथ योगानुशासनम् ||१||
atha yogaanushaasanam ..1..
Here (अथ) is the continuation (denoted by the prefix अनु-) of the teachings (शासनम्) of yoga. This indirectly (though not merely implicitly) shows that the study of yoga had been going on for some time before Patanjali's formal compilation of sutras. While I'm on the subject, here are the next few lines:

योगश्चित्तवृत्ति निरोधः ||२||
तदा द्रष्टुः स्वरूपेऽवस्थानम् ||३||
वृत्तिसारूप्यमितरत्र ||४||

yogashcittavRtti nirodhaH ..2..
tadaa draSTuH svaruupe.vasthaanam ..3..
vRttisaaruupyam itaratra ..4..

"Yoga is the cessation (nirodhaH) of the turnings (vRtti) of the mind (citta).
Then (tadaa), the seer (draSTuH) resides (avasthaanam) in its own true form (svaruupe).
In other cases (elsewise, etc.; itaratra), [the seer] takes on the form of (saaruupyam) the turnings (vRtti)."

What this essentially means is that:
  1. The process of "yoga" is when the mind (in actuality, citta is the amalgamation of three components of sense-related consciousness) stops turning or revolving. It stops creating movement.
  2. This is a very bold statement. Most people have never experienced this in a waking state, and so the third sutra serves to allay any fears of death.
  3. The "seer" (a metaphor for the true inner consciousness) resides in the knowledge of itself.
  4. In other cases, this inner consciousness identifies with movements in the mind. This identification is fallacious.
The idea here is that we have consciousness. It cannot be turned off while we are alive. This consciousness is usually focused "outwards," through the mind and its movements, through sensory perception, and out to the world. However, through careful and sustained practice, prayer, and/or raw discipline, one can turn off perception of these "outward" things, including one's thoughts. Since consciousness cannot be turned off, it instead reflects back on itself, and this "self-awareness" is the basis for yoga. Mystics find their liberation from the world through this, and though it's a horrid cliché that I hate, I will buckle and say that a Westerner can think of this as "enlightenment."

(Breakdown here is courtesy of my amazing former professor, Dr. Edwin Bryant, and his amazing topical study of religion course on the Yoga Sutras. My explanation and interpretation exist because of what I learned in his classes.)

Relax, I'm getting to the point.

Nowadays, we're brought up to multitask. Multitasking is useful, and it's a great skill. But overdeveloping that ability backfires. We learn to focus first, before we learn to split our attention amongst other things. When we learn to multitask, most of us continue to develop that ability without fully developing the ability to truly focus on one or two things. We don't have balanced attention.

Meditation works entirely on focus, especially with only one object. I'm not saying that multitasking has absolutely no place in meditation, but unless you're advanced, have another motive, or are a special case, it primarily hinders progress. That's why I don't buy the excuse that absolutely EVERYONE gives: "I just can't focus." Guess what? NO ONE can! It's not a difficulty unique to you. "Stopping" thought is not easy. You have to work at it, over a long period of time, and with discipline. Really, that statement is pretty much just a poor excuse; either they don't really care about it or they don't realize that they have to invest a significant amount of time. Instant gratification really doesn't apply, especially for things considered "ascetic" arts.

At any rate, the fact of the matter is that we're stuck with a better multitasking ability and we're left wanting in terms of singular focus. My good friend Adam pointed out to me recently that an average pack-a-day smoker gets anywhere from forty minutes to an hour and forty minutes of time that could be considered mild meditation. Adam, being ever the resourceful one, takes whatever opportunity he can to do what he refers to as "bullshit meditations." What a great idea! I, myself, do a lot of these b.s. meditations in my daily routines.

As I've said before, taking time to slow down can really have magical effects for some people. Taking time to focus on doing something in the not-so-efficient or not-so-resourceful way can serve a great many purposes, including building character-defining traits, forming idiosyncrasies that can enrich your life (for yourself), and de-stressing! These habits give you a chance to concentrate your focus on one or two things, which lets you regroup. Many people think that by constantly checking on problems or worrying "in the background" (essentially bringing things to the forefront of the mind from time to time), they're doing something good. Actually, it's a lot like flicking Alt+Tab; you're flipping through open programs, but just because you're not seeing some of the programs for more than five seconds at a time doesn't mean that they're magically "in the background." You have to let them sit, until they're tossed into the swap partition. This frees up your RAM to do something else, and when you do finally switch back to your other thoughts, they really are "refreshed." From personal experience, I can tell you this is really conducive to the Eureka Effect.

Understandably, modern life differs from ancient life. We can't all just up and leave our jobs and become ascetics or monks; devoting our lives to a method to free ourselves from life doesn't seem to fit the contemporary mood. On the whole, we don't care, and most of us haven't even thought about our own mortality in a truly life-altering way (aside from the fifteen minutes after somebody close to us passes away). However, why should that stop us from utilizing meditation as a quick tool to boost the quality of our lives? It can boost productivity, balance our moods, give us some greater perspective beyond the immediate here & now of our individual lives, and perhaps give us some spiritual insight in the process.

And why shouldn't we recruit the use of technology for this? As a personal example of how I sharpen my focus, I recently started learning the Linux command line. I've been learning some scripting so that I can do batch video conversions for my iPod. While in the future I'll be able to convert video really easily and without much thought, I spent two to three hours last night trying to get the script to work just right. That was good, solid focus. No multitasking; I wasn't checking torrents, downloading guides, or writing this blog post. I was taking things one step at a time and trying to get exactly one thing working. This is just one example of how I take time to work on laser- or flashlight-like focus, instead of lantern- or lightbulb-like focus (a modern take on a very old metaphor). Slowly but surely, I am learning some discipline.

Actually, I've read numerous articles on the web that highlight research in education techniques. Doing things for shorter periods of time with a more intense focus, and doing them daily, is generally much more effective than "brute-forcing" something into your head irregularly and for prolonged periods of time. From my varied sources, this is true of meditation as well. The misconception is that when you sit down to meditate, you sit down for hours at a time until you get it. Beginners hear this and it really turns them away from the idea after trying it. Actually, it is much more effective to meditate for maybe a half hour a day for a few weeks, and as it gets more comfortable/familiar/easier, to increase that time. Very similar to many doctors' recommendations for exercise...
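For a sense of what such a batch-conversion script can look like, here's a minimal Python sketch. Everything here is an assumption on my part, not my actual script: it presumes ffmpeg is installed, the directory names and extensions are illustrative, and the codec option (`-c:v libx264`) is just a generic transcode. The function only builds the commands, so you can inspect them before running anything.

```python
import subprocess
from pathlib import Path

def build_convert_commands(src_dir, out_dir):
    """Build one ffmpeg command per .avi file found in src_dir (sketch only)."""
    commands = []
    for video in sorted(Path(src_dir).glob("*.avi")):
        target = Path(out_dir) / (video.stem + ".mp4")
        # -i <input> names the source; the rest is a generic H.264 transcode.
        commands.append(["ffmpeg", "-i", str(video), "-c:v", "libx264", str(target)])
    return commands

# To actually run the conversions, uncomment:
# for cmd in build_convert_commands("raw_videos", "ipod_videos"):
#     subprocess.run(cmd, check=True)
```

Building the command list separately from executing it is exactly the kind of one-step-at-a-time discipline I'm talking about: you can stare at one piece until it's right.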

This is another junction where we can identify some of our issues by taking a look at our technological practices, and how some of our technological solutions can trickle back into other aspects of our everyday lives. As if I haven't said it enough already, there's no reason we can't still find ancient wisdom in our cutting-edge laptop or bleeding-edge software release. Similar ideas are at play now that were in effect thousands of years ago. And, at least for some things, that's not such a bad idea.

Thursday, March 12, 2009

Portability; a cue from technology.

Portability. Whether hardware or software, it applies.

Software portability, as you may or may not know, refers to a program's design and how easily its existing code can be reused in a new environment. This relies on an abstraction layer, which serves to negotiate between the system interface and the application. Essentially, when you want to make software that works on different systems without having to rewrite much (or any) of the code, you're focusing on portability. This reduces cost and effort in development while allowing a greater base from which to draw profit.
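As a toy illustration of that abstraction-layer idea (all names here are hypothetical, a minimal sketch rather than any real API): the application only ever talks to one interface, and each platform hides its details behind its own backend, so porting means writing one new class rather than rewriting the app.

```python
# Sketch of an abstraction layer: the app calls Notifier, never a
# platform API directly, so porting only means adding a new backend.
class Notifier:
    def notify(self, message: str) -> str:
        raise NotImplementedError

class LinuxNotifier(Notifier):
    def notify(self, message: str) -> str:
        return f"desktop notification: {message}"  # stand-in for a real system call

class WindowsNotifier(Notifier):
    def notify(self, message: str) -> str:
        return f"toast notification: {message}"    # stand-in for a real system call

def make_notifier(platform: str) -> Notifier:
    """The only platform-specific branch in the whole program lives here."""
    return LinuxNotifier() if platform.startswith("linux") else WindowsNotifier()

print(make_notifier("linux").notify("Build finished"))  # desktop notification: Build finished
```

Keeping the platform check in one factory function is the "negotiation" the abstraction layer does; everything above it stays identical across systems.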

Physical portability is also an important concept. Nowadays, we have PDA-phones which make calls, get on the internet, track our global position, and even test blood! Whether you're diabetic and testing your blood sugar, lost and need to find directions, comparing prices, or talking to your mother - or all four! - you can look to a single handheld device to do this kind of stuff for you, no matter where you are. Even if you're using your phone to test blood for disease, portability makes everything easier.

The key to our highly active and mobile lifestyle can also be a source for spiritual inspiration.

In the wake of this amazing article about a preacher whose religious exclusivity was shattered by a five-year-old girl's giggle, I've been thinking about how our beliefs influence much of what we do, but also how we can find ourselves backed into a corner when put in an environment that seems alien to us. As a Hindu who grew up in a largely Christian environment, I quickly learned to develop my beliefs and identify with my culture without having to constantly re-evaluate all of life when someone mentioned some holiday, hymn, or Biblical parable that I didn't know about. In retrospect, the surprising thing was that this made me more interested in the traditions of others.

It's a three-fold benefit, really. First and foremost, you find some suitable foundation upon which to base your beliefs. This keeps your belief system stable. Secondly, you remain open to forming bonds with others based on your beliefs, while simultaneously not feeling pressured when you're among people whose beliefs differ vastly from your own. You learn that beliefs belong to individuals, not large, generic groups of people. This way, you can still believe what you want without condemning others for their beliefs. This gives you a great foundation for diverse friendships.

Lastly, and most importantly, you get to learn about others' beliefs phenomenologically (i.e. from within the tradition itself, instead of as an outsider). You learn more accurately about what others believe, instead of taking it one piece at a time. You don't analyze each individual piece as to whether or not it fits in your own belief system, then take a stance, and judge the next piece. You take it as a whole, and understand how it comes together, and then, you can choose to take your personal stance on its parts or its whole as you see fit. This is really important because now, even if you don't agree with someone's perspective, you can still understand and respect theirs. It really enables you to connect with a much larger set of people than just the ones in your immediate, shared-belief community.

This has a few important side-effects as well. With a strong, portable spiritual foundation, you don't feel as though you're constantly pushed to convert to a different religious structure. When you're not under pressure, you're more open to new ideas. With a better understanding of those ideas, you're more likely to not throw the bucket out with the water, should you find some aspect appealing but not others. You better understand the ramifications of a line of thought, as it applies to someone else, before you take it on. And, if you find that you don't agree, you don't feel animosity toward others, as so many people somehow do. You don't take it personally, and you are also more careful to make sure others don't take it personally as well.

So what does this mean in the long run? You get less cataclysmic change and more gradual change in your spiritual perspective! This makes you less susceptible to what some people consider "huge pitfalls" on your path. Of course, there are no pitfalls, just more redirects, but if you can avoid being in that position in the first place, why not? Then, when you do have cataclysmic re-evaluations, you can also rearrange your belief structure more neatly and easily. When you find that you have to change your beliefs to accept more, it becomes easier to integrate that belief into your daily life. There's less time spent thinking, "But, if I believe this, then what does that mean about my other beliefs?" and more time actually acting and living. That sounds like some real low-cost development to me.




It looks like I haven't been able to make time and make up posts. At any rate, I'm posting, which is something.

Also, I'll be switching a few things around on the site, but since I haven't plugged it already, here's my other blog: A Modern Hindu's Perspective. It's more geared towards actual religious and philosophical thoughts, and does use some technical terminology, but it's just as much fair game as this blog is. Enjoy, and thanks for reading!

Tuesday, February 17, 2009

Making up for lost time

Sorry folks. First and foremost, my list of excuses.

January was hard for me to draw inspiration from. With friends home and then fanning out once again, I found January to be very spiritual (and also religious) for me. Don't lose me, alright? I'm sparing you readers the preaching. That being said, I will plug my new blog, "A Modern Hindu's Perspective," when I make a post that's not from my phone.

It also didn't help staring down the barrel of more than forty days of unread Engadget.com and Lifehacker.com feeds in Google Reader.

Now, while I don't think I can actually make up for twenty-one posts (plus seven weekend posts, plus a few more I'm forgetting), I will attempt to make up for some lost time. I'm aiming for one major post for each week missed, and a summary post encompassing each of the missed weekend posts.

Just letting you know I'm still around, and this blog's not dead yet. You can't outrun real life. Though, you can get pretty close.