Tuesday, February 2, 2010

Sharing the future data deluge



Really fun and provocative reading dealing with how to interpret our information-laden future: subjects covered include data management and sharing, online collaboration, the cognitive and physical possibilities of machines, and knowledge commons, among others.

Benkler:
The wisdom of crowds can be very helpful to an individual, especially when these individuals are part of an online network where it is possible to see an organized stream or thread of their communication. I always joke about "outsourcing" my brain to the Web when I feel overwhelmed with school or professional work. I often outsource my brain to unknown "friends" and helpful individuals on listservs or newsgroups when I have questions about a given subject. I shoot them a question and always get a few varying answers that help me with decision-making. I don't know them, but we are loosely affiliated. They are my personal search engine, humanized.

Benkler is right in stating that as citizens, consumers, and occasional producers of knowledge and information, we ought to value these activities and work towards a global (this is my understanding, correct me if I am wrong) knowledge commons that starts with economically and socially liberal societies. The altruistic motives of groups and individuals in the ecology of liberal information societies and economies are an important factor in how we might persuade policymakers to work towards creating an open online environment, where information can be shared and criticized, and where the democratic function of free expression can be protected and encouraged. Obviously we must acknowledge that the Internet and a networked information economy rest on a basic material infrastructure (cables, electricity, hardware production), that many of these exchanges have everything to do with the transfer of material goods (e-commerce, for example), and that there may be many complexities along the way to this global information economy involving very real problems of resource management and sustainability.

Kurzweil:
"drugs today are genetically engineered specifically for the individual's own DNA composition. Interestingly the manufacturing process that's used is based on the protein-folding work that was originally designed for the nanopatrols. In any event, drugs are individually tailored and tested in a host simulation before introducing any significant volume to the actual host's body. So adverse reactions on a meaningful scale are quite rare."

This statement tangentially reminded me of an iPhone app that I read about online, in which people are asked to participate in helping biologists fold proteins. I like how ordinary people who might never participate in this sort of scientific research are asked to contribute, not only because the tasks required of them might be entertaining and cognitively challenging in and of themselves, but also because the work extends to intrapersonal relations and health issues. In my mobile phone learning class, we discussed the use of "dead time", time when people might previously have been idle (such as waiting in line), as an opportune moment for establishing a learning environment and perhaps helping people engage in productive collaboration via their portable networked devices.

One scary thing about the Kurzweil reading concerns this question: if our health systems, diagnoses, and prescriptions become so personalized, what will we do about synthetic or other viruses that may be constructed the same way? Will viruses then be so personalized that we won't have generalizable tools to combat them? The scales of data collection and management we have to consider may be focused extremely specifically, on individuals, or may cast a wider net onto mass populations.

The Fourth Paradigm/Cognitive Load/Beating a Dead Horse:
One theory that I've heard over and over again since coming to TC is that of cognitive load: perhaps the reason we need to parse this information is simply that our minds can't handle it all. It has clearly been an issue, especially with the endless proliferation of data in real-time monitoring systems. What are the criteria used to analyze data? What tools should be used to do this, and how might they be made available? How might one design a data management system that is flexible and precise in its query targets? Do I get to help answer these questions in any way, or is it up to computer engineering whiz-kids at Google, or other places that jealously guard their algorithms (built on the backs of our searches)? Since when does Microsoft of all people care about the open engineering and filtering of information? Bah.

2 comments:

  1. Love your closing observation about microsoft's motives. My main concerns in several of these readings come down to money. I think Benkler is right in many ways, but do you think he's a bit naive about how many of these endeavors (at least the ones that will last) are funded? They're basically advertising platforms in many cases, as Lanier points out, no?

  2. yo sophie lam, I just lost my original comment. It was shockingly awesome. Now we must live with an echo of the original.

Yo, sophielam, how exactly do you roll with your listservs, etc.? Tweets, what? I'm always looking to improve my methods in this area. I tend to be a bit slow on the uptake. Why are some people better at this than others? I generally post on forums and then wait for RSS and email updates.

    Interesting question re: specificity of viruses.

Dead time---OMG please don't fill up my dead time. It is my sanity! Please, techworld, let me keep my Sabbath, if only for brief lifeindeath moments! I think this way of thinking arises from all those worksheets our teachers used to give us just to keep us busy. Anything to stop deep thought.

    M$. Did you see that Jim Gray was lost at sea directly after giving his speech? Clearly, BG put a hit out on him. Gray went over the line. Also, 9-11 was an inside job.
