Thursday, July 31, 2008
Writing, a semi-failure
I finished my story for the Fencon writers' workshop. Alas, it's not something I can be proud of. I tried to write an original story, and as always, even when trimmed to the minimum, it was three times as long as the word quota (the magical 5000 words). So instead I dusted off one of my older stories that could be easily chopped down to 5000 words, and submitted it. This is so dispiriting. I wonder if I'll ever be able to write a story that fits the guidelines (well, obviously, I did that a few times in the past, but I'm afraid it's not reproducible). I'm now going to try to expand my latest story into a full-blown novel (or at least a novella), filling it in with more characterization (ha! who am I kidding?) and more worldbuilding, and then see if I can take just one episode and make a story out of it. The problem is that a story must stand on its own: it should not require additional background, and it should achieve resolution in the end -- and I'm not sure that's possible to do within the setting I'm writing about. What a conundrum.
Wednesday, July 16, 2008
If real-life people cared as much as netizens
From the Washington Post:
The Impassive Bystander: Someone Is Hurt, in Need of Compassion. Is It Human Instinct to Do Nothing?
A woman fell to the floor, convulsed, and died half an hour later in a hospital's waiting room while the staff walked by, watched, and did nothing. (Fellow patients didn't call for help either.)
A 78-year-old man tried to cross a street with a carton of milk. To quote the article, "He steps off the curb just as two cars that appear to be racing swerve on the wrong side of the street. The first car swerves around the man. The second car hits him and throws him into the air like a doll, then speeds away. What follows is even more chilling: People walk by. Nine vehicles pass him lying in the street. Some drivers slow down to look but drive away."
The article questions how it could have happened, and whether we are actually wired for indifference.
If no one else is moving, individuals have a tendency to mimic the unmoving crowd. Although we might think otherwise, most of us would not have behaved much differently from the people we see in these recent videos, experts say. Deep inside, we are herd animals, conformists. We care deeply what other people are doing and what they think of us. The classic story of conformist behavior can be found in the 1964 case of Kitty Genovese, the 28-year-old bar manager who was slain by a man who raped and stabbed her for about half an hour as neighbors in a New York neighborhood looked on. No one opened a door for her. No one ran into the street to intervene.
We seem to be more caring on the internet
At the same time, most of us in the blogosphere have probably heard of cases of people rallying around a virtual friend who's having a real-life emergency -- e.g. calling the police upon suspicion that a fellow blogger is about to commit suicide -- even if they have never met that person before. It can make you wonder: can we engineer the real world to be more like the internet in that respect? Would we be more likely to help a stranger on the street if we knew he was just a few friends away from being connected to us on Facebook?
Can we engineer real life to be more like the net?
Among humans, negative example is apparently contagious. But perhaps our machines could set a positive example for us, if we programmed them to do so. Let's say a victim's mobile device notifies his or her Facebook friends, Twitter followers, or other social networking contacts; they in turn alert their friends' mobile devices based on location; and so the bubble of alerts propagates to those who happen to be relatively close to the victim at the moment. (In a six-degrees-of-separation world this should not take long.) Then perhaps the people within the alert wave would be motivated to help, not least because they know there is an "audit trail" making them accountable to their friends. (Of course, there are a few technical problems here, but I'm talking about the concept.)
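Just to make the idea concrete, here's a toy sketch of how such an alert wave might ripple outward. Everything in it is made up for illustration -- the little social graph, the coordinates, the 2 km cutoff -- and a real system would of course need actual location services, privacy controls, and opt-in:

# Toy sketch of the "alert wave" concept. The graph, locations, and
# thresholds below are invented purely for illustration.
from collections import deque
from math import hypot

victim_location = (0.0, 0.0)

# hypothetical social graph: person -> contacts
contacts = {
    "victim": ["anna", "boris"],
    "anna": ["clara", "boris"],
    "boris": ["dmitri"],
    "clara": ["erik"],
    "dmitri": [],
    "erik": [],
}

# hypothetical last-known positions of everyone's mobile devices (km)
locations = {
    "anna": (0.5, 0.5), "boris": (10.0, 10.0),
    "clara": (1.0, 0.2), "dmitri": (0.3, 0.1), "erik": (50.0, 50.0),
}

def alert_wave(start, radius_km=2.0, max_hops=6):
    """Pass the alert hop by hop; notify only people near the victim."""
    alerted, seen = [], {start}
    queue = deque([(start, 0)])
    while queue:
        person, hops = queue.popleft()
        if hops >= max_hops:
            continue
        for friend in contacts.get(person, []):
            if friend in seen:
                continue
            seen.add(friend)
            x, y = locations[friend]
            if hypot(x - victim_location[0], y - victim_location[1]) <= radius_km:
                alerted.append(friend)        # this friend gets a notification
            queue.append((friend, hops + 1))  # everyone relays the alert onward
    return alerted

print(alert_wave("victim"))  # -> ['anna', 'clara', 'dmitri']

If the graph really has small-world structure, a handful of hops like this should be enough to reach a few people who are physically close by.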
So in a sense the users would be Big-Brother'ing each other. But at the same time social networking applications could prompt us to do the right thing by making it appear as if somebody is doing the right thing. :-)
Tuesday, July 15, 2008
I finally saw "Sex and the City"
It's fluff, as advertised, but really long fluff at that: almost three hours of a nearly content-free parade of ridiculous fashions. And goodness, are the fashions ridiculous. It boggles my mind that grown women would swoon over a bag in gaudy primary colors -- something my 3-year-old would get excited about -- so what if it's Louis Vuitton.
Well, there is a plot, of course. But the story is rather predictable -- so much so that at some point I was confused for a second about whether I had slept through "Sex and the City" and was now watching "Pride and Prejudice", which I was planning to catch on PBS Masterpiece Theater later that night. The story of Carrie and Mr. Big's circuitous courtship (if it can be called that) is so Jane Austen-esque. It's full of tropes that were launched into being by Jane Austen... or at least made popular by her. We already know that the hero and the heroine will get together in the end -- the only suspense is how they will get there, and what kind of scheming it will take for them to reconcile after a series of misunderstandings and hurt feelings. Just like Elizabeth Bennet and Darcy. One more similarity between Jane Austen's heroines and those of "Sex and the City": the fashions are dreadful in both cases; only "Pride and Prejudice" errs at the opposite extreme -- its heroines look like they live in their nightgowns day and night.
But the women's friendship, and how they rally around Carrie during the darkest time of her life, and nurse her back into sanity -- that was sweet.
Sunday, July 13, 2008
I knew this first-hand for decades
... but it's always good to hear an authority say it. An authority on humor, no less. And British! Aren't the British supposed to be the foremost experts on humor? It is thus with a huge sense of validation that I present this news:
From this article on msnbc.com:
The president of the humor society, British sociologist Christie Davies, offered insights Tuesday on the state of humor in today's world.
Among other things, he said, jokes in eastern Europe were a lot better when the communists ran the show.
"Once you have a democracy with free speech, you have fewer jokes," said Davies, an emeritus professor at the University of Reading, in England. "Jokes, in many ways, are a way of getting around restrictions on what you can say. That was a very important factor in eastern Europe."
Saturday, July 12, 2008
From itch to Immanuel
The Itch: Its mysterious power may be a clue to a new theory about brains and bodies (New Yorker) is one freaky article. You might not think of an itch as a Job-like misfortune, but in the cases featured in the article, it is. Those cases are almost biblical in their bizarreness, too. :-) There's a woman whose head itched so persistently and maddeningly that one night, while asleep, she scratched right through her skull and into her brain. Another guy died from an itch on his neck because he scratched into his carotid artery.
Does perception originate in nerve endings, or in the brain itself?
What's worse, the cause of the woman's itch was not any problem with her skin; in fact, the nerves in the itchy spot were 96% dead, so they could not have possibly conveyed itch signals to her brain. Rather, a neurologist thought that "the itch system in M's brain had gone haywire, running on a loop all its own."
[This speculation challenges] what neuroscientists call 'the naïve view,' and it is the view that most people, in or out of medicine, still have. We're inclined to think that people normally perceive things in the world directly. We believe that the hardness of a rock, the coldness of an ice cube, the itchiness of a sweater are picked up by our nerve endings, transmitted through the spinal cord like a message through a wire, and decoded by the brain.
But a theory that has emerged lately holds that sensory perceptions originate in the brain itself, which integrates, rather imperfectly, the nerve signals coming from the outside world. So it is entirely possible for a brain to experience a phantom itch in a place where there's nothing to itch. This also explains the phenomenon of the phantom limb. The article also describes fascinating therapies that treat pain and discomfort in phantom limbs by tricking the brain into accepting contradictory information about the missing limb.
Things in themselves (the unknowability of)
All in all a fascinating article, but what's most amazing about it is that it takes evidence from sciences that deal with tangible things, such as biology and neuroscience, and uses it to support a realm of inquiry that's commonly thought of as metaphysical.
"In a 1710 "Treatise Concerning the Principles of Human Knowledge," the Irish philosopher George Berkeley objected to [the naive] view. We do not know the world of objects, he argued; we know only our mental ideas of objects. "Light and colours, heat and cold, extension and figures -- in a word, the things we see and feel -- what are they but so many sensations, notions, ideas?" Indeed, he concluded, the objects of the world are likely just inventions of the mind, put in there by God."
All this reminds me of Kant, with his impossibility of knowing "things in themselves". It makes me want to regret not giving "Critique of Pure Reason" proper attention when it was mandatory reading for a philosophy course I took in college. As dry as it was, I vaguely remember being intrigued by its notion that what we think are experiences of real objects are actually just our own mental structures. My thought was, "there's got to be an idea for a science fiction story somewhere in there". :-) But the idea was too abstract to even try to turn it into a story. Now, however, I have a feeling that neuroscience could provide scaffolding on which to build a story exploring the most abstract (if not to say metaphysical) aspects of our existence. That, to me, is fascinating.
Friday, July 04, 2008
Turns out, cavemen loved to sing, says an article headline on msnbc.com. How, one wonders, do modern scientists figure out such things? This article is interesting not just because it reveals specifics of the life of prehistoric people, but also because it shows how scientists reach conclusions about things as nebulous as the leisure preferences of long-gone civilizations. Namely:
Ancient hunters painted the sections of their cave dwellings where singing, humming and music sounded best, a new study suggests.
Analyzing the famous, ochre-splashed cave walls of France, scientists found that the most densely painted areas were also those with the best acoustics. Humming into some bends in the wall even produced sounds mimicking the animals painted there.
They did it not just for their amusement. Cave dwellers used echolocation to map out the properties of the caves, the article says.
With only dull light available from a torch, which couldn't be carried into very narrow passages, the ancient hunters had to use their voices like sonar to explore the crooks and crannies of a newfound cave.
As an aspiring science fiction writer, I'll certainly keep this article in mind when or if I try to invent an alien civilization for a story I'm writing. It provides enough inspiring details to fill out a picture of an ancient culture.