
Sunday, September 14, 2008

Software to extract meaning out of life's trivia

Here's a very interesting article in The Washington Post that ties in, to some degree, with Clive Thompson's article on ambient awareness (discussed in my previous blog post):

Bytes of Life: For Every Move, Mood and Bodily Function, There's a Web Site to Help You Keep Track

It reflects a lot of my own thoughts about what part data plays in our lives, and how it could let us get much more out of life.

"In San Diego, statistics student David Horn [...] is working with his engineer girlfriend, Lisa Brewster, to develop an all-encompassing life tracker, under the working title of "I Did Stuff."


I would like to have these guys' job. They want to track and record everything -- everything that happens in their lives, down to (or especially) the most mundane events.

It's been known for a long time, and a recent study confirmed, that keeping a diary recording every bite they ate helped people lose weight. Therapists recommend that people who have trouble sleeping record what they ate, drank, and did before bed, to see if a trend emerges correlating certain foods or activities with insomnia. And keeping track of your time minute by minute -- writing down all activities, no matter how mundane -- may show you where all your time goes, if you feel you have no time for anything in your life. So there is a well-established practical use for navel-gazing that predates the internet. And the internet has made it infinitely easier to record your daily events, both the kind you do consciously (Brightkite for tracking your location, MyMileMarker.com for driving habits, Fitday.com to map food intake and calorie expenditure, Last.fm for listening habits, and even BedPost for your sex life) and the kind your body does autonomously (sites for tracking heart rate and blood glucose levels, or the self-explanatory MyMonthlyCycles.com :-)).
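
Even the food-and-insomnia analysis could be automated with a few lines of code. Here is a minimal sketch in Python; all the data and names in it are invented for illustration, not taken from any of the sites above:

```python
from collections import defaultdict

# Hypothetical diary: (set of pre-sleep activities, whether sleep was poor).
log = [
    ({"coffee", "tv"}, True),
    ({"herbal_tea", "reading"}, False),
    ({"coffee", "reading"}, True),
    ({"tv"}, False),
    ({"coffee"}, True),
]

bad_nights = defaultdict(int)
totals = defaultdict(int)
for activities, slept_badly in log:
    for activity in activities:
        totals[activity] += 1
        if slept_badly:
            bad_nights[activity] += 1

# A crude trend detector: the fraction of bad nights per activity.
for activity in sorted(totals):
    rate = bad_nights[activity] / totals[activity]
    print(f"{activity}: bad-sleep rate {rate:.0%} over {totals[activity]} night(s)")
```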

But these two researchers want to take it much further.

Tracking not just what you did, but what you got out of it



[...] David Horn already belongs to BrightKite, Last.fm and Wakoopa.com, which tracks his Internet usage. He's also experimented with Fitday.com to map food intake and calorie expenditure. It was satisfying for a while, but now he wants something bigger -- something simultaneously broader and more nitpicky -- to fill in the gaps that individual sites don't currently track.

Horn is working with his engineer girlfriend, Lisa Brewster, to develop an all-encompassing life tracker, under the working title of "I Did Stuff."

"I'd like to track the people I talk to," says Brewster, "and how inspired I am six hours later. And definitely location history -- where I am, what time -- "

"Correlated with weather history," interjects Horn. "And allergy data, pollen and mold in the air."

Plus, "Web sites I read and their effect," says Brewster. "If I spend a long time reading a blog, like TechCrunch, but I don't get noticeable output from it."


At first the author of this article is boggled by this level of self-indulgent navel-gazing, but then she seems to understand what it is about. The usefulness of tracking lies, of course, not in the raw data (who would have the time to re-read their life at the same pace as they are living it? :-)) but in extracting trends that help you correlate perceptions with facts.

Has it really been a month since you last had sex, or does it just feel like that? Did you really floss five times last week, or was it more like twice? Now that you realize that, are you a little less angry at your dentist for that painful last appointment?
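
A tracker could answer exactly those kinds of questions with a trivial query over the event log. A minimal sketch, with made-up dates and event names:

```python
from datetime import date

# Hypothetical event log: event name -> dates on which it was recorded.
events = {
    "flossed": [date(2008, 9, 2), date(2008, 9, 9), date(2008, 9, 11)],
    "date_night": [date(2008, 8, 15)],
}

today = date(2008, 9, 14)
for name, dates in events.items():
    days_since = (today - max(dates)).days
    last_week = sum((today - d).days <= 7 for d in dates)
    print(f"{name}: last occurred {days_since} days ago, "
          f"{last_week} time(s) in the past week")
```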


Analysis of mundane events reveals profound trends in one's life



Self-tracking [...] is partly about the recording, but also as much about the analysis that goes on after the recording.

The apparent meaninglessness of data recorded over time is actually what makes it profound.

The problem with diaries and blogs, trackers say, is that people use them to record the events they think are meaningful. What they forget is that meaningful events are often a result of months of insignificance, a cause and effect not readily visible to the human eye but easily detected with the help of a computer program.

"Things that happen over time can lead up to bigger events," says Horn. "They may seem small by themselves, but looking at them as a whole I can see how they lead to a bigger theme or idea."

"I was always a terrible self-journaler," says Messina. "Every once in a while I'd write in a journal, but it was always a major, momentous event. 'Got to college.' 'Broke up with girlfriend.' You lose a lot of the nuance that caused that situation to come about."

Tracking can "zoom out over my entire life," he says. It could, for example, help him better understand the aforementioned breakup. "When you've self-documented the course of an entire relationship, trivia that doesn't seem like much could, over time," help him understand exactly what went wrong, and when.

Maybe, to extrapolate on Messina's idea, your weekly date night had been Friday. And maybe you were always in a tetchy mood on Fridays because you'd just come from chem lab, which you hated. Maybe the whole relationship could have been saved by switching date night to Sunday, after your endorphin-boosting yoga class. Maybe you just didn't realize the pattern, because you weren't tracking it. All the answers could be right there, in your life data.
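
Surfacing that hypothetical pattern would take only a few lines of analysis. A sketch, assuming the tracker has collected (weekday, mood) pairs; the numbers are invented:

```python
from collections import defaultdict

# Hypothetical log of (weekday, mood on a 1-5 scale) after each date night.
mood_log = [
    ("Friday", 2), ("Friday", 1), ("Friday", 2),
    ("Sunday", 4), ("Sunday", 5),
]

by_day = defaultdict(list)
for day, mood in mood_log:
    by_day[day].append(mood)

# A consistently low Friday average would surface the chem-lab pattern.
for day, moods in by_day.items():
    print(f"{day}: average mood {sum(moods) / len(moods):.1f}")
```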


We can extrapolate even further. Perhaps the tracking software, if it were sophisticated enough, could notice the increasing frequency and viciousness of arguments between you and your significant other, the increasing frequency and length of time spent apart, and things like that. The software could flag these as warning signs that the relationship is in danger. Then you could take steps to get it back on track. You might say most people don't need software to tell them when their relationship is off track; however, I think people often ignore warning signs -- sometimes willfully, sometimes out of inertia. Inertia certainly plays a huge part in everything we do. We would rather keep a mental image of things as they were at their most comfortable, or downplay the significance of worrisome events, than acknowledge the truth that something is going astray. Life-tracking software could point out discrepancies between our partner's words and actions. It could force us to pay attention to those signs before it is too late.
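
To sketch how software might flag such a drift (with invented numbers and an arbitrary threshold), a simple least-squares slope over weekly argument counts is already enough to detect a worsening trend:

```python
# Hypothetical weekly argument counts over ten weeks.
weekly_arguments = [1, 1, 2, 1, 3, 3, 4, 5, 5, 6]

n = len(weekly_arguments)
mean_x = (n - 1) / 2
mean_y = sum(weekly_arguments) / n

# Least-squares slope: extra arguments per week.
num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(weekly_arguments))
den = sum((x - mean_x) ** 2 for x in range(n))
slope = num / den

if slope > 0.3:  # arbitrary threshold, purely for illustration
    print(f"Warning: arguments are trending up by {slope:.2f} per week")
```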

The software could also give us tools to defuse certain recurring arguments which, if unexamined, tend to pick up destructive strength like a hurricane crossing the Gulf of Mexico. :-) You could look at the software and say: "We've had this discussion before; here is what was said; here is the conclusion we reached. Do you have any new information that would give us a reason to revisit this issue?"

Of course, there are a lot of people -- most people, perhaps -- who would hate the idea of having their every word or phrase recorded, and of those records being resurrected as evidence (even by people they trust). I'm sure some people would feel it diminishes their relationship somehow. But how could truth diminish it? Anyway, that's a social engineering problem, and those are often harder than computer engineering ones. Among the latter, a major problem would be finding a way to structure the data so as to capture its essential qualities. For example, how would you compute the intensity of the four horsemen of the Apocalypse (made famous by John Gottman): Criticism, Contempt, Defensiveness and Stonewalling? How do you quantify formless, deeply subjective data? How do you even decide what to measure? It would be a tough task, but one I would gladly spend years working on, if I didn't have to worry about making a living. :-)
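
One naive way to bootstrap such a measurement -- a sketch, not a validated instrument, and certainly not Gottman's own method -- would be to have each partner rate every logged conflict on the four dimensions and watch the running averages:

```python
from dataclasses import dataclass

@dataclass
class ConflictRating:
    """Subjective 0-10 ratings entered after each argument; the scale,
    like every number here, is a made-up illustration."""
    criticism: int
    contempt: int
    defensiveness: int
    stonewalling: int

ratings = [
    ConflictRating(3, 0, 4, 1),
    ConflictRating(5, 2, 5, 3),
    ConflictRating(6, 4, 6, 5),
]

# Average intensity per dimension; a rising contempt average would be
# the flag worth paying attention to.
for dim in ("criticism", "contempt", "defensiveness", "stonewalling"):
    avg = sum(getattr(r, dim) for r in ratings) / len(ratings)
    print(f"{dim}: average intensity {avg:.1f}")
```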

In fact, if I had come of age in the era of Web 2.0, I would seriously consider going to grad school so that I could do this project as my thesis or dissertation. I would probably find a professor at some university who could get interested enough in this idea to serve as my advisor. (I've seen people in computer science departments doing stranger projects than that. And if not in computer science, then surely in interdisciplinary studies. :-))


Tuesday, September 09, 2008

Ambient awareness, digital ESP

There is a great article by Clive Thompson in The New York Times Magazine:

Brave New World of Digital Intimacy

that explains the appeal of Twitter. Like many people, when I first heard of Twitter, and even long after I signed up for it, I thought it was pretty useless. At the very least it seemed useless for verbose bloggers like me, who don't like to post mere facts or sound-bite opinions without context or analysis (and you can't provide much analysis in 140 characters). But, as Clive Thompson says, the constant stream of friends' tweets provides an "ambient awareness" of daily rhythms of friends' lives. To quote the article, "It is very much like being physically near someone and picking up on his mood through the little things he does -- body language, sighs, stray comments -- out of the corner of your eye."

Each day, Haley logged on to his account, and his friends' updates would appear as a long page of one- or two-line notes. He would check and recheck the account several times a day, or even several times an hour. The updates were indeed pretty banal. One friend would post about starting to feel sick; one posted random thoughts like "I really hate it when people clip their nails on the bus"; another Twittered whenever she made a sandwich -- and she made a sandwich every day. Each so-called tweet was so brief as to be virtually meaningless.

But as the days went by, something changed. Haley discovered that he was beginning to sense the rhythms of his friends' lives in a way he never had before. When one friend got sick with a virulent fever, he could tell by her Twitter updates when she was getting worse and the instant she finally turned the corner. He could see when friends were heading into hellish days at work or when they'd scored a big success. Even the daily catalog of sandwiches became oddly mesmerizing, a sort of metronomic click that he grew accustomed to seeing pop up in the middle of each day.

This is the paradox of ambient awareness. Each little update -- each individual bit of social information -- is insignificant on its own, even supremely mundane. But taken together, over time, the little snippets coalesce into a surprisingly sophisticated portrait of your friends' and family members' lives, like thousands of dots making a pointillist painting. This was never before possible, because in the real world, no friend would bother to call you up and detail the sandwiches she was eating. The ambient information becomes like "a type of E.S.P.," as Haley described it to me, an invisible dimension floating over everyday life.


I'm beginning to feel that way about it too. And it's good for those fleeting observations that are not meaty enough to warrant a blog post. I might even change my mind and allow that such observations reveal someone's personality better than well-thought-out blog posts do. (If anyone wonders, my Twitter ID is elze.)

Wednesday, July 16, 2008

If real-life people cared as much as netizens

From Washington Post:

The Impassive Bystander: Someone Is Hurt, in Need of Compassion. Is It Human Instinct to Do Nothing?

A woman fell on the floor, convulsed, and died half an hour later in a hospital waiting room while the staff walked by, watched, and did nothing. (Fellow patients didn't call for help either.)

A 78-year-old man tries to cross a street with a carton of milk. To quote the article, "He steps off the curb just as two cars that appear to be racing swerve on the wrong side of the street. The first car swerves around the man. The second car hits him and throws him into the air like a doll, then speeds away. What follows is even more chilling: People walk by. Nine vehicles pass him lying in the street. Some drivers slow down to look but drive away."

The article questions how it could have happened, and whether we are actually wired for indifference.

If no one else is moving, individuals have a tendency to mimic the unmoving crowd. Although we might think otherwise, most of us would not have behaved much differently from the people we see in these recent videos, experts say. Deep inside, we are herd animals, conformists. We care deeply what other people are doing and what they think of us. The classic story of conformist behavior can be found in the 1964 case of Kitty Genovese, the 28-year-old bar manager who was slain by a man who raped and stabbed her for about half an hour as neighbors in a New York neighborhood looked on. No one opened a door for her. No one ran into the street to intervene.


We seem to be more caring on the internet



At the same time, most of us in the blogosphere have probably heard of cases of people rallying around a virtual friend who's having a real life emergency -- e.g. calling police upon suspicion that a fellow blogger is about to commit suicide -- even if they have never met that person before. It can make you wonder: can we engineer the real world to be more like the internet in that respect? Would we be more likely to help a stranger on the street if we knew he was just a few friends away from being connected to us on Facebook?

Can we engineer real life to be more like the net?



Among humans, a negative example is apparently contagious. But perhaps our machines could set a positive example for us, if we programmed them to do so. Let's say a victim's mobile device notifies his or her Facebook friends, Twitter followers, or other social networking contacts; they in turn alert their friends on their mobile devices based on location; and so the bubble of alerts propagates to those who happen to be relatively close to the victim at the moment. (In a six-degrees-of-separation world this should not take long.) Then perhaps those people within the alert wave might be motivated to help, not least because they know there is an "audit trail" making them accountable to their friends. (Of course, there are a few technical problems here, but I'm talking about the concept.)
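
To make the concept a bit more concrete, here is a sketch of that propagation as a breadth-first traversal of a social graph. The graph, the distances, and the thresholds are all invented for illustration:

```python
from collections import deque

# Hypothetical social graph: person -> set of contacts.
graph = {
    "victim": {"alice", "bob"},
    "alice": {"victim", "carol"},
    "bob": {"victim", "dave"},
    "carol": {"alice"},
    "dave": {"bob"},
}

# Hypothetical current distances from the incident, in kilometers.
distance_km = {"alice": 0.4, "bob": 12.0, "carol": 0.9, "dave": 25.0}

def alert_nearby(source, max_km=2.0, max_hops=3):
    """Breadth-first propagation: contacts alert their own contacts in turn,
    but only people within max_km are asked to respond."""
    seen, responders = {source}, []
    queue = deque([(source, 0)])
    while queue:
        person, hops = queue.popleft()
        if hops == max_hops:
            continue
        for contact in graph.get(person, ()):
            if contact in seen:
                continue
            seen.add(contact)
            if distance_km.get(contact, float("inf")) <= max_km:
                responders.append(contact)
            queue.append((contact, hops + 1))
    return responders

print(alert_nearby("victim"))  # -> ['alice', 'carol']
```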

So in a sense the users would be Big-Brother'ing each other. But at the same time social networking applications could prompt us to do the right thing by making it appear as if somebody is doing the right thing. :-)

Sunday, July 13, 2008

I knew this first-hand for decades

... but it's always good to hear an authority say it. An authority on humor, no less. And British! Aren't the British supposed to be the foremost experts on humor? It is thus with a huge sense of validation that I present this news:

From this article on msnbc.com:

The president of the humor society, British sociologist Christie Davies, offered insights Tuesday on the state of humor in today's world.

Among other things, he said, jokes in eastern Europe were a lot better when the communists ran the show.

"Once you have a democracy with free speech, you have fewer jokes," said Davies, an emeritus professor at the University of Reading, in England. "Jokes, in many ways, are a way of getting around restrictions on what you can say. That was a very important factor in eastern Europe."

Saturday, July 12, 2008

From itch to Immanuel

The Itch: Its mysterious power may be a clue to a new theory about brains and bodies (New Yorker) is one freaky article. You might not think of an itch as a Job-like misfortune, but in the cases featured in the article, it is. Those cases are almost biblical in their bizarreness, too. :-) There's a woman whose head itched so persistently and maddeningly that one night, while asleep, she scratched right through her skull and into her brain. Another man died from an itch on his neck because he scratched into his carotid artery.

Does perception originate in nerve endings, or in the brain itself?



What's worse, the cause of the woman's itch was not any problem with her skin; in fact, the nerves in the itchy spot were 96% dead, so they could not possibly have conveyed itch signals to her brain. Rather, a neurologist thought that "the itch system in M's brain had gone haywire, running on a loop all its own."

[This speculation challenges] what neuroscientists call 'the naïve view,' and it is the view that most people, in or out of medicine, still have. We're inclined to think that people normally perceive things in the world directly. We believe that the hardness of a rock, the coldness of an ice cube, the itchiness of a sweater are picked up by our nerve endings, transmitted through the spinal cord like a message through a wire, and decoded by the brain.


But a theory that has emerged lately holds that sensory perceptions originate in the brain itself, which integrates, rather imperfectly, nerve signals coming from the outside world. So it is entirely possible for a brain to experience a phantom itch in a place where there's nothing to itch. This also explains the phenomenon of phantom limbs. The article goes on to describe fascinating therapies that treat pain and discomfort in phantom limbs by tricking the brain into accepting contradictory information about the missing limb.

Things in themselves (the unknowability of)



All in all a fascinating article, but what's most amazing about it is that it takes evidence from sciences that deal with tangible things, such as biology and neuroscience, and uses it to support a realm of inquiry that's commonly thought of as metaphysical.

"In a 1710 "Treatise Concerning the Principles of Human Knowledge," the Irish philosopher George Berkeley objected to [the naive] view. We do not know the world of objects, he argued; we know only our mental ideas of objects. "Light and colours, heat and cold, extension and figures -- in a word, the things we see and feel -- what are they but so many sensations, notions, ideas?" Indeed, he concluded, the objects of the world are likely just inventions of the mind, put in there by God."


All this reminds me of Kant, with his impossibility of knowing "things in themselves". It makes me regret not giving "Critique of Pure Reason" proper attention when it was mandatory reading for a philosophy course I took in college. As dry as it was, I vaguely remember being intrigued by its notion that what we think are experiences of real objects are actually just our own mental structures. My thought was, "there's got to be an idea for a science fiction story somewhere in there". :-) But the idea was too abstract to even try to turn into a story. Now, however, I have a feeling that neuroscience could provide scaffolding on which to build a story exploring the most abstract (if not to say metaphysical) aspects of our existence. That, to me, is fascinating.

Friday, July 04, 2008

Turns out, cavemen loved to sing, says an article headline on msnbc.com. How, one wonders, do modern scientists figure out such things? The article is interesting not just because it reveals specifics of life in prehistoric societies, but also because it shows how scientists reach conclusions about things as nebulous as the leisure preferences of long-gone civilizations. Namely,

Ancient hunters painted the sections of their cave dwellings where singing, humming and music sounded best, a new study suggests.

Analyzing the famous, ochre-splashed cave walls of France, scientists found that the most densely painted areas were also those with the best acoustics. Humming into some bends in the wall even produced sounds mimicking the animals painted there.


They did it not just for their amusement. Cave dwellers used echolocation to map out the properties of the caves, the article says.

With only dull light available from a torch, which couldn't be carried into very narrow passages, the ancient hunters had to use their voices like sonar to explore the nooks and crannies of a newfound cave.


As an aspiring science fiction writer, I'll certainly keep this article in mind when or if I try to invent an alien civilization for a story I'm writing. It provides enough inspiring details to fill out a picture of an ancient culture.

Saturday, June 14, 2008

A brief fantasy of a web application

that I might like to write as a hobby if my day were three times as long...

If your spouse thinks your child is poorly behaved, while you think your child's behavior is fairly good for her age, how do you decide where the truth lies? Is one of you too lax, or does the other have unrealistic expectations of a 3-year-old? The truth may lie in a large-scale study that would let you compare your child's behavior against that of thousands of other children of the same age. Where would one easily come across such data? Why, that's where the internet and social networks come into play.

There could be a web-based application that let a parent record a child's daily tantrums, as well as episodes of good behavior, and then drew various statistical conclusions from those numbers. At the very least it would let a parent discover where their child falls on the curve relative to his or her peers. This would, of course, require many parents' participation: they would need to track every instance of their offspring's good or bad behavior -- every tantrum, every "please" and "thank you," and so on.
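
The statistical core of such an application would be simple. A sketch of the percentile lookup, with invented numbers standing in for data aggregated from many participating parents:

```python
from bisect import bisect_left

# Hypothetical weekly tantrum counts reported for 3-year-olds by other parents.
peer_tantrums = sorted([2, 3, 3, 4, 5, 5, 6, 7, 8, 10, 12, 15])

def percentile(value, population):
    """Fraction of peers reporting fewer tantrums than this child."""
    return bisect_left(population, value) / len(population)

my_child = 5
print(f"Fewer tantrums than {percentile(my_child, peer_tantrums):.0%} of peers")
```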

Mind-numbingly tedious? You bet. But, if the latest explosion of social web applications is any indication, people like to do mind-numbingly tedious things, as long as they get to do them on the internet. :-) Well, a certain category of people do. Witness the popularity of Twitter. If people don't get tired of posting what they ate for lunch, some of the same people might become just as obsessive about posting their child's behavioral microupdates. And of course they don't have to be at a computer for that. A text messaging-enabled cell phone is enough.

The internet indeed has a way of converting tedious chores into games. That quality has already been leveraged by applications such as Chore Wars, where players get points for the chores they do. Once you get stoked about beating fellow players, you don't even notice that you've finished cleaning your kitchen! Indeed, humans (or a certain category of humans) are all about keeping score. So an application that exploits this urge has the potential to do well.

And let's not forget that parenting is a competitive sport anyway -- so if anyone is inclined to keep score, parents would be among those people! :-)