Sunday, December 28, 2008

OLPC and iPhones

My son was given a One Laptop Per Child (OLPC) laptop this holiday season. It came in a neat little box with icons on each side. The neatness continued as we unpacked it, revealing a sleek white and green plastic lozenge with a handle. A bit of effort was needed to figure out how to open the thing, but we were pleased that it fired up immediately, not requiring a 12-hour charging cycle.

For those not familiar with the OLPC project, the idea is to create $100 laptops that can be given to children worldwide, bridging the technology divide and empowering young people through technology. The purchase of our OLPC was actually paired with the gift of a second OLPC to a Third World nation.

My son already has a laptop, it turns out, but I was bound and determined to set up and try the OLPC. My fun began with trying to get build 706 of the Linux OS to recognize our WiFi network. The problem is that I have a hidden (non-broadcast) SSID for security reasons; this is security through obscurity, which is not the best policy, but paired with strong encryption it reduces the threat of easy compromises. So I began the Linux hacks that I am far too familiar with, having five Linux servers in my stable.
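
On a garden-variety Linux box, the usual hack is to tell the wireless supplicant to probe actively for the non-broadcast SSID rather than wait for a beacon it will never hear. The sketch below illustrates that idea in Python, assuming wpa_supplicant manages the interface; the config path, SSID and passphrase are placeholders, and this is not a record of the exact steps I tried on the XO, which manages wireless its own way.

    # Illustrative sketch only: append a wpa_supplicant network block that
    # actively probes for a hidden (non-broadcast) SSID. Must be run with
    # enough privilege to write to /etc; SSID and passphrase are placeholders.
    WPA_CONF = "/etc/wpa_supplicant/wpa_supplicant.conf"

    network_block = (
        '\nnetwork={\n'
        '    ssid="MyHiddenNetwork"\n'        # placeholder SSID
        '    scan_ssid=1\n'                   # probe for the hidden network
        '    key_mgmt=WPA-PSK\n'
        '    psk="not-my-real-passphrase"\n'  # placeholder passphrase
        '}\n'
    )

    with open(WPA_CONF, "a") as conf:
        conf.write(network_block)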

I could never get the OLPC machine to see the network for more than a moment, unfortunately, so I eventually relented and just exposed my WiFi node. Then I hunted down the upgrade procedure from within a command-line shell and spent hours doing an upgrade. The upgrade radically rearranged the already cryptic iconic interface (I still can't figure out what a few of the buttons do!) but also brought some improvements to the WiFi connectivity UI.

Overall, though, my wife complained that I spent 6 hours hacking on a computer that was supposed to be useful to African tribesmen. I also have something like 15 years of experience with Linux, even running early versions on laptops with experimental X Window servers by 1994 as an alternative to purchasing a Sun "luggable" at the time. And I have spent innumerable hours poring over the Linux FAQs and wikis that are always out-of-date or pertain to earlier builds, trying to unravel the correct changes to make something work. I think OLPC still has some work to do.

I contrast the OLPC with my iPhone 3G, which cost only marginally more. If the claim holds true that Wal-Mart may start offering $99 iPhones (8 GB), the cost would be comparable to the OLPC while providing substantially greater storage capacity. Some of the educational aspects of the OLPC are missing (built-in graphical Python programming, for instance), but everything else is easier and more stable.

Thursday, December 18, 2008

Monophony and Complexity

I like George Will because he is slightly less loopy than, say, Jonah Goldberg at the National Review. I like his inventive language and use of baseball metaphors. But he has recently become monophonic and uninventive. Perhaps that is the fate of political conservatives who have a quiver filled with dozens of the same arrow labeled "smaller government might be better government."

Here he is in the most recent Newsweek, building rickety historical scaffolding to re-explain how an originalist interpretation of the Constitution (political originalism, to distinguish it from legal originalism) takes a dim view of government involvement in almost anything, how government involvement in almost anything inevitably leads to a request for still more government involvement in our affairs (repeating Hayek), and how regulation drives lobbying because regulation drives defensive politics by business entities.

The odd thing about Will's unending exposition of political and economic history is that it minimizes or ignores the complex problems that led to our modern state. Madison's America was a smaller, emptier and more racist place where notions like the tragedy of the commons were only intellectually interesting; the vastness of the wilderness was also a vastness of available resources for exploitation. It took a hundred more years for economic interactions to become complex enough that monopolistic inequities became obvious and came to be regarded as unfair. It took longer still for environmental pollutants to be understood, and to become pervasive enough, that regulation and environmental law took shape. It took almost as long for American law to begin to shed its racism and sexism and extend that originalist notion of equality to all people.

For American conservatism to move forward, there must be something more than raw originalism crossed with Hayek's fears of serfdom. There are too many counterexamples to each in the modern world for those arguments to carry much weight. Instead of fixed ideology, what is needed is targeted creativity. As an example, while environmental regulation is a thorn for conservatives like Will, a targeted improvement might be to streamline the regulatory approval process and use technological means to speed it up. Partisan efforts that broadly deny the value of such regulation can't possibly succeed.

The same is true of health care reform, which Will mentions in passing. He rings the topic with Grover Cleveland's assertion concerning originalism but doesn't penetrate to exactly what might be wrong with the specific programs that Obama hinted at in his platform. It's enough for him to proclaim it might be bad because the Constitution fails to mention it in succinct language (unlike, say, the existence of the Navy).

Fair enough. We are warned. Now I want actual policy that goes beyond a single note.

Wednesday, December 3, 2008

Thanksgiving and Poodles

So the Thanksgiving events wind down.

There was soggy architecture, rain-soaked and oversubscribed at the California Academy of Sciences. Two giant spheres sandwiched between a living roof and a subterranean aquarium, pushing the roof outward with a graceful femininity and seemingly mocking the functionally irrelevant twisted tower of the De Young Museum directly across from it.

There was a sunset dinner cruise on the Bay, with detailed discussions of the implications of new theories concerning IGF2, imprinting and the spectrum of mental disorders from autism through to schizophrenia as the swells poured in under the Golden Gate Bridge.

Then, finally, there was SFMOMA, Martin Puryear and participatory art projects. Katharina Fritsch's visually stunning Kind mit Pudeln (Child with Poodles: concentric circles of black poodles surrounding a baby) invoked the embrace of kitsch (yes, Fritsch invoked kitsch) by, say, Jeff Koons, while simultaneously reminding me that the singular requirement of art is its discovery by critics and museums. Advocacy is the requirement. It's not that Fritsch's piece is uninteresting, but it has the postmodern conceit of denying conceptual depth with a gentle tongue-in-cheek amusement. As observers we spend time interpreting it, but the interpretation is at best a reference to Faust, at worst a reference to Koons.

The social aspect of art was reinforced while browsing a volume on the roots of cyberart in the 90s (and before). The examples and the narratives were built around explorations in coastal metropolitan centers and major universities. The author spun a narrative history in a web of advocacy connecting their friends and acquaintances from that era. There was not, nor could there have been, an appreciation of anything other than what passed through the author's sphere.

Could it be otherwise? Disintermediation and long tails might hold a key, with the internet supporting better access for artists to art markets and observers, but they also promote an unfiltered view of content, without any sieving by professionals.

Thursday, November 6, 2008

Guanocholia and Politics

The most striking effect of the election of Barack Obama was how the extreme Right Wing legitimized outright insanity. I'll get back to that in a moment.

First, I want to speculate on the origins of the phrase "bat shit." There are many contenders, as this discussion points out. The most likely case is that the creative spatial metaphor "bats in the belfry" (spatial insofar as the belfry sits atop the building and the bats are like random thoughts colliding about "in the head"--our mental metaphors do tend to be spatial) was co-opted through "ape shit" and "batty" to become "bat shit."

So "bat shit" is an excellent description of some of the memes floating around in Right Wing circles during the run-up to the presidential election. Recursivity chronicles a few of them, including the remarkable claim that Obama was hypnotizing his audiences using some weird variation of neuro-linguistic programming. To blunt the obvious scatological quality of the phrase, however, I will coin "guanocholia" to describe a general susceptibility to, and belief in, bat shit ideas.

Extraordinary Popular Delusions and the Madness of Crowds demonstrates that guanocholia is far from a recent phenomenon, but what concerns me is that the internet now makes it remarkably easy to transmit and sustain these delusions. What once took years of word of mouth to spread can now spread in minutes from blogs, like so much fertilizer mined from a cave.

Is there any inoculation against guanocholia? Likely not, but the American public seems to be tired of erratic temperaments, and most guanocholics are, by definition, bat shit erratic.

Sunday, October 12, 2008

Collapse and Flat Technology

The backlash and analysis is beginning.

I should join in since the first of my September statements just arrived showing a 10% drop in a diverse basket of international growth, domestic plodders and domestic technology funds. I have a feeling that is just a pinprick compared with next month...

How could this have happened?

One line of toxic reasoning is that Fannie Mae and Freddie Mac were forced to take on subprime mortgages by a Congress desiring to spread homeownership. But that appears to have been only a small slice of the entire pie, since the two held only about 40% of subprime mortgages (and many of those bundles were acquired late in the game in an effort to shore up the broader market).

There is also the moral/economic dimension based on moral hazard theory: that the S&L bailouts of the late 80s, combined with the hedge fund debacles of recent years, gave a sense of cushion on the downside. But I don't think fund managers look much at the worst-case downside; upside is where the profit is, and moral hazard reasoning is meaningless when golden parachutes will automatically deploy via contractual severance packages.

Instead, the best available analysis (thanks, Ted) divides across two arguments: (1) risk was hidden (information loss and information asymmetry) due to the complexity of the security instruments, the lack of regulation and restraint, and short-term profit objectives; (2) quants and algorithms screwed up, resulting in (1).

The latter argument is summed up in the NY Times, whose article begins with a quote from Warren Buffett: "Beware of geeks bearing formulas." This is the same bugbear that attacked in the 80s with automated trading; deploying technology results in unexpected outcomes.

Perhaps. And perhaps the current election is unduly influenced by the flattening of information resources in the internet-driven world.

But there are inevitable corrections to extremes that result in people losing money or power, and the technology will continue to be pervasive while the users of the technology will get smarter about its impact. On the political front, factcheck.org demonstrates how there is already an evolution from pre-internet rumor-driven political feelings to partisan exploitation of technology channels, and then on to sensible corrections to those partisan swings.

Perhaps. I just hope my son's 529 plan recovers enough in the next 8 years to make it better than just a wash.

Friday, September 19, 2008

Startles and Moral Reasoning

I was startled awake today by the work at the University of Nebraska-Lincoln that showed a potential link between political affiliation and startle response. Partisan Republicans exhibit greater startle responses to threats than do partisan Democrats, seemingly supporting the penumbra of classic definitions of "liberal" like this fine Bertrand Russell offering:

The essence of the liberal outlook lies not in what opinions are held but in how they are held: instead of being held dogmatically, they are held tentatively, and with a consciousness that new evidence may at any moment lead to their abandonment. This is the way opinions are held in science, as opposed to the way in which they are held in theology.

There are other results that seemingly bear this out, including Jonathan Haidt's findings that political conservatives simply value tradition and fairness at different levels than liberals do. Preservation and stability trump flexibility and risk.

Other interesting findings this week include Psychiatric Times reporting that adult ADHD sufferers have lower educational and professional outcomes than non-ADHD individuals, even when IQ is held constant:

Adults with ADHD are not achieving the educational and occupational successes that they should be, noted researchers in the Journal of Clinical Psychiatry.

In a case control study, Dr. Joseph Biederman and colleagues looked at 222 adults with and 146 adults without ADHD to determine if educational and occupational functioning in ADHD represented low attainment or underattainment relative to expectations based on intellectual abilities.

In the control group, educational levels were significantly predicted by IQ scores, and, in turn, employment attainment was significantly predicted by educational levels. However, in the ADHD group, patients did not achieve successes as expected based on IQ and educational levels. In fact, only 50% of patients with ADHD were college graduates, yet based on IQs, 84% should have been. Similarly, only 50% achieved semi-professional or major professional levels, although 80% were expected to achieve such based on their education. Most importantly, the researchers noted, ADHD was associated with significantly decreased educational obtainment independent of IQ.

“These findings stress the critical importance of early identification and aggressive treatment of subjects with ADHD,” the researchers concluded. “Appropriate intervention could be highly beneficial in reducing the disparity between ability and attainment for individuals with ADHD.”

The take-away, to me, continues to support the notion that an evolutionarily effective brain, one that trades off risk aversion against creativity (and the kinds of transcendent and even randomizing cogitation that are essential to creativity), sits in a wide valley of contributory genetic and environmental inputs that are easy to get just slightly wrong, whether we are looking at dysfunctional and excessive behavior among artists, interference with educational success for ADHD sufferers, or enhanced mental capabilities among borderline autistic individuals. The continued maintenance of this diversity of types suggests that the diversity is, or was, more adaptively useful than its absence.

Finally, Marc Hauser works the landscape of moral decision making in a recent Newsweek article, once again describing how remarkably uniform a "moral grammar" we seem to share, regardless of ethnic background or political affiliation. In discussions among friends and family on this topic, I always come away with a more complicated picture of the moral dilemmas. How can you guarantee that dropping the fat man onto the railroad tracks will stop the train? How can you be certain additional help will not arrive before cutting the woman out of the cave mouth? How can you be certain that the death row inmate really committed the crime?

In every case, the trade-off is not between what is morally permissible and what is obligatory, but between the individual's level of certitude and the killing of one or many. I can therefore almost always answer the dilemmas with a refusal to act until the situation is so dire that action is required. Flipping the switch to divert the trolley car is the exceptional case that demonstrates pure utilitarian moral reasoning, but almost all the others require permissibility to be modified to something like "permissible only in a lose-lose situation where there is the strongest signal that other lives may be lost."

That kind of reasoning requires a low startle response, of course.

Thursday, September 4, 2008

Gray Laptops and Luminous Clouds

Last Friday my new laptop arrived, throwing me into a dark, confused fog that is only beginning to lift. I know, I know, it should have been a joyous and exciting time for me, but the time commitment needed to make it functional has been somewhere between distracting and onerous.

The machine is impressive enough: a Sony VAIO FW (no, not the recalled model) with 4 GB of RAM, 250 GB of disk, and a Blu-ray drive with HDMI output built in. Although large, it was the outcome of a round of deliberation about how I use computers. My old machine, a Gateway with a 13-inch screen and 1 GB of RAM running XP, was simply too slow and lacked sufficient screen real estate for effective software development. With 10 Firefox windows open and a running Eclipse instance, things would start to drag and switching became ponderous. Partway through my deliberations over what to get, I seriously considered a MacBook Air, which would not have met my requirements at all but was just so delicious I had to give it consideration. I thought briefly that the Air would work because I have five Linux servers hosted in a high-rise in San Jose, California and could use them remotely for my development needs. Almost--but not quite--due to networking speeds and the need to sometimes work offline. The larger Macs were also considered, although the price points to get serious bang were too steep. In the end, the VAIO was a good trade-off, with the Blu-ray add-on a concession I made to myself because I was not going for the high-end Mac.

And then the work began.

Luckily, I have cultivated a model of continuous holographic reflection of all work-related materials through the use of a source code control system called Subversion (SVN for short). In this model, every document, note, source file, image, etc. that I create is checked in to a repository hosted on one of my servers over HTTPS and WebDAV. Change logs are maintained on the server, and periodic backups are made to other machines in the cluster as well as to a portable USB drive and, soon enough, to Blu-ray writable media tossed in the trunk of a car.
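
To make that concrete, here is a rough sketch of what the daily check-in loop amounts to, assuming a Python 3 interpreter, the stock svn command-line client, and a working copy already checked out from the HTTPS/WebDAV-backed repository; the path and log message are placeholders rather than my actual layout.

    # Rough sketch of the check-in loop: schedule new files for addition, then
    # commit. Assumes the `svn` CLI is on the PATH and the working copy below
    # already exists (the path is a placeholder).
    import subprocess

    WORKING_COPY = "/home/me/work"

    def run(*args):
        """Run an svn subcommand inside the working copy and return its output."""
        result = subprocess.run(["svn", *args], cwd=WORKING_COPY, check=True,
                                capture_output=True, text=True)
        return result.stdout

    # Any line in `svn status` starting with '?' is an unversioned file.
    for line in run("status").splitlines():
        if line.startswith("?"):
            run("add", line[1:].strip())

    # The server keeps the change log; other machines mirror it for backup.
    run("commit", "-m", "periodic check-in of documents, notes, and source")

The real point is the discipline rather than the script: everything of value lives in the repository, so a fresh machine starts from nothing more than a checkout.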

So the first thing I did was install SVN on my new machine and check out everything to my local hard drive. Nice. But getting everything working took 4 days of software installations and configurations. I configured 10 different POP and IMAP accounts, PHP5, Apache HTTPD, MySQL, PHP5 plugins, Eclipse, Subclipse, iTunes, FabFORCE DB Designer, The GIMP, Microsoft Office, Microsoft Visual Studio 8, Adobe Flash CS3, Cygwin, Inkscape, Firefox, Propel, XEmacs, and many more. I updated everything to the latest versions and applied automated and manual patches. I rebooted many times (wasn't that supposed to be fixed back when I worked on Windows 2000?).

All the complexities were smoothed out gradually and incrementally, of course, and I am fairly happy with the screen real estate and performance of the new machine after I disabled most of the security features of Vista. I even picked up the Blu-ray edition of Blade Runner: The Final Cut and ran HDMI to our LCD TV to confirm it all worked (note: no start-up lag, unlike some BD console players; also, the HDMI output and the laptop's LCD can't run simultaneously due to HDMI-based DRM policies, which seems like ridiculous overkill).

But I wondered why I can't have a computing universe where the ease of SVN management of my own resources is replicated in the software installation world. There is a hint of that capability in recent Linux distributions, which can download and install software packages and their dependencies with a single, short command. Still, configuration and customization remain daunting and can even be exacerbated because the installation process doesn't communicate all the details about where resources go (and the destinations change with some regularity).
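
To illustrate what I mean by a single, short command (hedged accordingly: this assumes a Debian/Ubuntu-style package manager, which may or may not match any of my machines, and the package name is just an example), the whole dependency-resolving install is one call, wrapped in Python here only to match the other sketches:

    # Single-command dependency resolution on an apt-based system: apt-get
    # works out and installs everything the requested package needs.
    # Requires root; the package name is only an example.
    import subprocess

    subprocess.run(["apt-get", "install", "-y", "subversion"], check=True)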

Ideally we can imagine a computing cloud where apps are no longer installed locally but are simply web-based, and all of our configuration settings and password management are remote (and trustworthy) as well. Hints of that have been emerging with Java Web Start, Adobe AIR, Microsoft Silverlight and, to a lesser degree, Google Chrome. Each is an attempt to move web-based applications away from the limitations of a browser-based model and support more sophisticated interaction models. One company I work with has shown the model can work for specialized enterprise computing needs, so I think there is hope, but the evolution is nevertheless slow and may even require re-imagining the computing platform itself.

I'm guessing I have several dozen more gray laptops, virtual tablet devices, holographic mental interfaces and whatnot to go before everything becomes as neat and easy as I'd like, disconnected from individual platforms and universally available on-demand from some luminous computing cloud where replicants slave away maintaining and upgrading software in those tilt-up buildings in Silicon Valley.