Showing posts with label Technology. Show all posts

March 17, 2009

Thru You: The YouTube Mash-Up, ReMix, Megajam

Do yourself a favor while you're slacking off at work or procrastinating on that essay, book, project, Spring cleaning, yoga practice, feeding the kids, etc., and check this out.

Titled Thru You: Kutiman Mixes YouTube, this fascinating project seamlessly mixes strange little digital odds and ends into some decent music. Even more, it highlights what's best about collaborative digital environments: together, we can make some cool shit.

As explained by the man himself:

February 9, 2009

Cognitive Computing Project Aims to Reverse-Engineer the Mind

From Priya Ganapati's recent Wired Blog Network post:
In what could be one of the most ambitious computing projects ever, neuroscientists, computer engineers and psychologists are coming together in a bid to create an entirely new computing architecture that can simulate the brain's abilities for perception, interaction and cognition. All that, while being small enough to fit into a lunch box and consuming extremely small amounts of power.
As you might have guessed from the title, the team of neuroscientists et al. is going to come up with this architecture by "reverse-engineering" the brain.

Whoa. Like, remember when I was ranting about The Singularity?

I think it's time to start thinking about what kind of robot body you'd like.

I'm going for the Adrienne Barbeau-bot.

January 14, 2009

Surveillance, Property, and the Body

I just came across David Kravets's recently posted comments on a University of California study of the effectiveness of public surveillance as a method of reducing violent crime rates. He points out that, although property crime rates decrease in the presence of visible surveillance cameras, violent crimes don't.
"Violent incidents do not decline in areas near the cameras relative to areas further away," added the study, which noted the cameras helped police bring charges against six people accused of felony property crimes. "We observe no decline in violent crimes occurring in public places."
Of course, I find this study fascinating. For the record, I'm far more interested in the ways the widespread distribution of surveillance cameras can be liberating, or at the very least, I'm interested in how a proliferation of visual information affects postmodern subjectivity and narrative structures. It's interesting, though, that the mechanism meant to create the illusion of safety does little to actually secure it.

Image from aforementioned Threat Level blog post.

January 6, 2009

Web 2.0 and the Expansion of the Filmic Event

In my talk at the recent MLA conference, I made the following case:
I believe that both the virtual community’s relationship to film and the advancing development of entertainment technology, in and out of the virtual world, suggest that the filmic experience is telescopically, if not exponentially, expanding to an inevitable point: an organic and integrated experience that straddles and blurs the boundaries between the real world and the virtual, between forms and genres of visual entertainment, and between the spatio-temporally isolated experience of the cinema and television and the ongoing experience of everyday life.
and that this "inevitable point" means that:
From here on out, and for those who have the technology and access, the filmic experience will become something less identifiable as traditional film and more recognizable as a hybrid between traditional film and something like a television show (if, in no other way, only by virtue of length and accompanied advertisements). The viewer will be able to choose what she wants to watch and when and where she wants to watch it. Moreover, the experience itself may well be accompanied by a social, “Web 2.0” component wherein the viewer may evaluate a film, asynchronously communicate with the filmmaker, synchronously interact with a virtual community of fellow viewers, or take part in an ARG or fantasy community collaboratively constructed around a particular film or around a series of films.

I used The YouTube Screening Room - a venue I gushed about in December - as my paradigmatic example, but there are many more examples of both digital venues and real-world technologies (like Apple TV and TiVo).

What surprised me most about the audience's reaction was the vast difference between those who felt I was pointing out the obvious and those who couldn't accept that the cinematic experience can really happen for someone watching on a computer screen (or iPhone or other such device) as opposed to the old big-screen movie house.

It seems to me that "Web 2.0" - particularly Web 2.0 as social space - has become a dividing line between those who can accept that social activity can happen without a physical bodily referent (these are the folks who have no problem considering their Facebook friends closer than their classmates, co-workers, etc.) and those who can't. Unfortunately, this line too easily looks like one determined by age - typically, the folks who haven't grown up with the internet fall on one side, and those in the "Wired Generation" fall on the other.

Age aside, perhaps the key to my perspective - one that I'm still figuring out - is that I suggest that the cinematic experience will become (or already is) a hybrid one (and hybrid in a unique way for each individual): consider alternate reality games (ARGs) that have surrounded films and television (from the marketing campaign called "The Beast" for the Kubrick cum Spielberg film A.I. to the more recent "Lost Experience"), the popularity of video games based on films (like ones that let you explore the Star Wars galaxy or roam Tolkien's Middle-earth), or even just the popularity of rating film clips on YouTube... For those of us who want the filmic experience to go beyond the cinema screen, there is ample opportunity.

On the other side of this coin, so to speak, we might expect a cultural reaction comparable to the one regarding the digital distribution of music. Vinyl sales have been skyrocketing in the last few years, suggesting that the digitization of music has left us hungering for a physical artifact. What, then, might a similar reaction to digital film distribution be?

December 10, 2008

"The New Examined Life"? Or Dataveillance?

Thanks to Swade for this article from the Wall Street Journal:

The New Examined Life
Why more people are spilling the statistics of their lives on the Web


By JAMIN BROPHY-WARREN

In the first week of January, New York graphic designer Nicholas Felton will boil down everything he did in 2008 into charts, graphs, maps and lists.

The 2007 edition of his yearly retrospective notes that he received 13 postcards, lost six games of pool and read 4,736 book pages. He tracked every New York street he walked and sorted the 632 beers he consumed by country of origin.

Part experimentation, part self-help, such "personal informatics" projects, as they are known, are gathering steam thanks to people like Mr. Felton who find meaning in the mundane. At their disposal are a host of virtual tools to help them become their own forensic accountants, including Web sites such as Dopplr, which allows people to manage and share travel itineraries, and Mon.thly.Info, for tracking menstrual cycles. Parents can document infant feeding schedules with Trixie Tracker. And couples can go from between the sheets to spreadsheets with Bedpost, which helps users keep track of their amorous activities.

The objective for Mr. Felton and others is to seize data back from the statisticians and the scientists and incorporate it into our daily lives. Everyone creates data -- every smile, conversation and car ride is a potential data point. These quotidian aggregators believe that the compilation of our daily activities can reveal the secret patterns that govern the way we live. For students of personal informatics, the practice is liberating because it shows that our lives aren't random and are more orderly than some might expect.

Mr. Felton calls his compilation the Feltron Annual Report; the slight alteration of his name connotes the mechanical nature of his autobiographical cataloging effort, now entering its fourth year. He plans to continue his project over the next decade in what he hopes will result in a modern-day spin on James Boswell's famously detailed biography of Samuel Johnson. "I want to create connections where I didn't know that they existed," Mr. Felton says. "I'm a natural annotator."

The elegantly graphical reports, as much design projects as they are data compilations, are posted online by Mr. Felton. He also creates hard-copy limited editions, available free of charge. They have become so popular that he recently launched a Web site with his friend Ryan Case called Daytum, which helps fellow chroniclers track the details of their own experiences.

The culture of sharing information online has shifted in recent years, from a focus on blog ramblings to the ubiquitous micro-movements of posters' daily lives. Microblogging sites like Twitter have become commonplace. President-elect Barack Obama, for example, had his own Twitter account and used it to keep his supporters up to date on his campaign's daily comings and goings. (It's been silent since the election.) Facebook's News Feed feature initially drew criticism from members because it offered a running log of users' minute postings and updates, but has since become a core part of the Web site's community. Some sites collect data automatically for their users. Last.fm keeps a record of all of the songs users have listened to, and Netflix keeps track of members' movie-watching habits.

"It's a natural progression from people sharing things like movies, photos and videos," says Dennis Crowley, founder of Dodgeball, an early social-networking service for mobile phones which was sold to Google in 2005. "What's left to share? Basic data."

Yannick Assogba, a graduate student at MIT's Media Lab, created a site called Mycrocosm to help users compile and share the "minutiae of daily life" in the form of multicolored bar charts and pie charts. Mr. Assogba, for example, tracks his ping-pong winning streaks and what days he spends the most money. Created in August, Mycrocosm now has 1,300 registered users. "We're living in an era of data," Mr. Assogba says.

Today's info-chroniclers are just the latest in a long history of diarists and scientists who kept notes by hand. Nineteenth-century English inventor and statistician Francis Galton, who introduced statistical concepts such as regression to the mean, was an obsessive counter who created the first weather map and carried a homemade object called a "registrator" to, among other things, measure people's yawns and fidgets during his talks. (Mr. Galton's preoccupation with data, specifically with human hereditary traits, also yielded an unsavory by-product -- eugenics.)

In 1937, a social research organization called Mass Observation in London used about 2,000 volunteers to develop an "anthropology of ourselves." For more than a decade, participants recorded such things as their neighbor's bathroom habits and what end of their cigarettes they tapped before lighting up. Personal tracking also showed up in "Cheaper by the Dozen," a 1948 book about efficiency experts Frank Bunker Gilbreth and Lillian Moller Gilbreth and their attempts to track and optimize the daily routines of their 12 children (including when they brushed their teeth and made their beds).

Several technological shifts in the last decade have helped turn personal informatics into a mainstream pursuit. The iPhone, for example, has several applications such as Loopt that use the product's internal global positioning system to record a user's location and then share it with others. Low-cost products such as Wattson, an energy monitor that tracks real-time power consumption, make it easy to record otherwise nebulous data.

Some of the new data collectors hope to make better decisions about their activities and improve their quality of life. For the last four months, Alexandra Carmichael, the founder of a health research Web site called CureTogether in San Francisco, has been tracking more than 40 different categories of information about her health and personal habits. In addition to her daily caloric intake, her morning weight and the type and duration of exercise she performs, she also tracks her daily mood, noting descriptions such as "happiness" and "feeling fat."

From her initial readings, she concluded that her mood went up when she exercised and went down when she ate too much. "I realized my relationship with food is a distorted, unhealthy one," Ms. Carmichael says. She has concluded that she may have an eating disorder and has decided to seek counseling.

Andy Stanford-Clark, an inventor for IBM, began tracking the power usage of his 16th-century thatched cottage on the Isle of Wight in an unusual way. Everything in his house, from his phone to his doorbell, is hooked up to automated sensors. Each time water is used, or a light goes on or off, it's catalogued publicly on Twitter for all to see, along with the total household water and electricity consumption. Mr. Stanford-Clark says he now tries harder to conserve power. "I just couldn't believe how much money that was wasting," he says.

Keeping track of personal data online can yield unexpected consequences. "Initially, it sounds like a great idea, such as the social aspects," says Christopher Soghoian, a fellow at Harvard University's Berkman Center for Internet and Society. But "for most users, the costs outweigh the benefits," he says. Specifically, Mr. Soghoian points to the legal concept called the "third-party doctrine," which eliminates the right to privacy for users who voluntarily place their information on Web sites. "If you're cataloging every movement, that might come up if you get divorced," he says.

Private investigators and the federal government could also use such information in some circumstances. In the application for jobs with Mr. Obama's administration, applicants are asked to list all of the social networks that they are involved in and to supply any potentially problematic blog posts from their online past. "All this stuff is creating a huge digital paper trail that could come back and haunt you," says Mr. Soghoian.

Personal data collection can get in the way of living, some people admit. "It becomes an obsession," says Toli Galanis, an aspiring filmmaker in New York who tracks everything from his mercury levels to his vitamin D consumption. He says that he's had to forgo outings with friends when he's trying a new diet that requires scheduled mealtimes, and elicits strange looks from his parents when he measures his dinner food to the ounce.

Still, he adds, "Life and its goals are like a lab. Why not use it like a scientist? Then you'll really know what you want to. There's so much info that it'd be a shame not to track it."

I find this fascinating.

Felton's annual report reminds me of Hassan Elahi's ongoing Tracking Transience project. It's awesome. Elahi continually broadcasts his current location and regularly posts pics of his meals and mechanisms of travel. What, at first, looks like full disclosure turns out to be something much more subversive.

So, to get uber-theoretical: Does "data-blogging" represent a Foucauldian internalization of the ideology of surveillance? Does it subvert the Foucauldian model by introducing the idea of flooding the panoptic eye with an excess of self-circulated information? Neither? Both?

And, as Swade asked, is data-blogging a technologically-assisted practice in mindfulness? Or narcissism?

December 4, 2008

The Singularity

Last night, with a few good friends, I got sucked into the old "life, the universe, and everything" conversation (and not the one where you gush about how much you love Douglas Adams). Maybe it was the beer talking, but I'd just read an excellent article in Wired on Ray Kurzweil, so I ranted about "The Singularity" for a little while, and I've been thinking about it ever since.

If you know Moore's Law, then you know it refers to the exponential increase of the number of transistors that can be reasonably (in terms of cost and physical space) placed on an integrated circuit. Moore observed that the number doubled roughly every two years.
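For the mathematically inclined, that "doubling roughly every two years" is easy to play with. Here's a quick back-of-the-envelope sketch (the 1971 Intel 4004 figure of 2,300 transistors is just a commonly cited starting point, used purely for illustration):

```python
def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count assuming one doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# One decade = five two-year doublings, so the count multiplies by 2**5 = 32.
print(round(projected_transistors(2300, 1971, 1981)))  # 73600
```

That factor of 32 per decade is the whole trick of exponential growth: it looks tame at first and then runs away from you, which is exactly the intuition Kurzweil leans on below.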

Take Moore's law, apply it more broadly to evolutionary technological advances, and you get Raymond Kurzweil's brand of futurism, which essentially says this: Technology is evolving at an exponentially accelerating rate. First, it will reach a point at which technology can replace the body (or at least many bodily functions - Kurzweil asserts that nanotechnology will be able to replicate and replace the function of, for example, the human liver). Second, it will replicate consciousness, effectively pass the Turing Test, and give us true Artificial Intelligence. Third, humans, having at this point augmented their bodies and extended their lifespans through nanotechnology, and machines, having "woken up," will merge into a new species. Finally, the entire planet, and ultimately the entire universe, will become one huge supercomputer full of disembodied and eternal consciousnesses.

"The Singularity" happens when artificial intelligence and technological advances render humans obsolete. The posthuman - or, in Kurzweil's phrase "Human 3.0" - will essentially be immortal, but will not have an embodied life as we know it.

And all of this, according to Kurzweil, may very well happen in your lifetime. 2045.

Sound crazy? It might be, but then Kurzweil is inarguably one of the brightest minds alive today. If you're old enough to remember when, back in the 70s, some scientist dude built a revolutionary reading machine for Stevie Wonder, then you remember Kurzweil. That was just one of many inventions. Moreover, he's been awarded about one honorary PhD per year (wait... shouldn't that also increase exponentially?).

If he's right, you better get off your ass, dust off those joggin' shoes, and get yourself into shape. It'd be a real kick in the pants to die of poor health shortly before the first bridge to immortality is built. But, then again, there probably won't be hot wings and PBR around when you're broken into a million digital bits of info floating in the ether.

I wonder if they'll have good music all up in that b.

December 3, 2008

The YouTube Screening Room

I've been working on a paper entitled "Web 2.0 and the Expansion of the Filmic Event" for presentation at the MLA Conference in San Francisco later this month, and my "research" has involved watching short, independent films in the recently created YouTube Screening Room. I'm falling in love with this venue, which makes it hard to maintain enough cynical distance to write critically about it.

The YouTube folks describe the Screening Room as "a platform for top films from around the world to find the audiences they deserve." To quote the description in full:

Every other Friday, you’ll find four new films featured in the YouTube Screening Room.

These films always appear with the permission and involvement of the filmmakers, so be sure to rate, share and leave comments. This is your chance to not only watch great films from all corners of the globe, but also to converse with the filmmakers behind them.

While the majority of these films have played at international film festivals, occasionally you’ll find films that have never before screened for wide audiences.

All films playing in the YouTube Screening Room are displayed within our High Quality player to give you the best viewing experience possible.

Be a part of a new generation of filmmaking and distribution and help us connect films and audiences in the world’s largest theater!

The Screening Room arguably pushes the world of entertainment even closer to the inevitable: a multi-world, hybrid media that will allow us to choose what we want to watch and when we want to watch it. YouTube has recently signed deals with Lionsgate, Showtime, and CBS (yes, you can watch vintage episodes of Beverly Hills 90210 in full, any time you want), and its largest competitor, Hulu, currently partners with even more traditional television networks and film companies. YouTube and Hulu aside, you can use the old idiot box to get shows "on demand," you can download your favs on iTunes (and other digital venues), you can record programs if you have cable connectivity and the right software, and, of course, there's the modern version of the VCR - the DVR. The point is that technological advances are pushing traditional venues of entertainment toward greater integration and increased availability for the viewer.

Enough gushing. Below, check out The Danish Poet - one of the first films to air at The Screening Room. Reminiscent of the reconstruction of the protagonist's random and chaotic genetic history in Jeffrey Eugenides's Middlesex, The Danish Poet offers a lovely, animated take on the seemingly chaotic paths that lead people to love.

December 2, 2008

Digital meets Analog meets Radiohead

A friend of mine recently shared this incredible video (embedded below) by James Houston.

In Houston's words:

I've just graduated from the Glasgow School of Art's graphic design course. This was my final project.

Radiohead held an online contest to remix "Nude" from their album "In Rainbows." This was quite a difficult task for everybody who entered, as "Nude" is in 6/8 time at 63 bpm. Most music played in clubs is around 120 bpm and usually in 4/4. It's pretty difficult to seamlessly mix a waltz beat into a DJ set.

This resulted in lots of generic entries consisting of a typical 4/4 beat, but with arbitrary clips from "Nude" thrown in so that they qualified for the contest.

Thom Yorke joked at the ridiculousness of it in an interview for NPR radio, hinting that they set the competition to find out how people would approach such a challenging task.

I decided to take the piss a bit, as the contest seemed to be in that spirit.

Based on the lyric (and alternate title) "Big Ideas: Don't get any" I grouped together a collection of old redundant hardware, and placed them in a situation where they're trying their best to do something that they're not exactly designed to do, and not quite getting there.

It doesn't sound great, as it's not supposed to.

I missed the contest deadline, so I'm offering it here for you to enjoy.

Sinclair ZX Spectrum - Guitars (rhythm & lead)
Epson LX-81 Dot Matrix Printer - Drums
HP Scanjet 3c - Bass Guitar
Hard Drive array - Acts as a collection of bad speakers - Vocals & FX


Houston does much more than just "take the piss a bit" here. He aesthetically expresses one of the contemporary intersections of the digital and the analog. And, while the topic has already been extensively covered, it's worth mentioning that, for example, the resurgence of the popularity of vinyl - analog media - in an age where most music is purchased or otherwise acquired digitally suggests that the line separating the digital and the analog is easily traversed if not increasingly blurred (there's a reason most LPs come with a digital download of the album, and I don't think that it's just to compensate for the difficulty in ripping vinyl into iTunes).

If art expresses the cultural undercurrents of an age, then Houston is clearly onto something...

See for yourself:


Big Ideas (don't get any) from James Houston on Vimeo.