Thursday 11 December 2008

what's digital and why you should apply

Something I did for the folks over at AdGrads.


100 percent digital

It might be overstated (there's probably always going to be the 'bed, bog, bath' element) but Mr Billingsley's comment is almost certainly right: we're going to be digital advertisers because the world is now digital, and getting more so.

What does all this digital malarkey mean for people looking to get into the communications business? And, before we look at that, what does digital mean anyway?

One of the lovely insights of Dare’s grad video – where the parents of Dare folk gloriously fail to define what their children do – is that pinning it down is tricky.

Part of its slipperiness is that things just keep shifting. Facebook was born in 2004, YouTube in 2005, Twitter in 2006, the App Store in 2008 etc. The only constant is change.

The other thing about ‘digital’ is that it’s polysemous – it has multiple meanings. 

It's used to refer to electronic media (web, screens, mobiles, iPods, Nike+ shoes etc) but also, and more importantly, to the behaviours those media have unleashed and fed: interaction.

There’s an important difference there that Jeremy Bullmore expressed perfectly in Campaign when he said,

all about interactivity

There are two important things there for grads trying to get into the industry. The first is: "whose roof?"

Most of you will have been concentrating on the big above-the-line ones. That's a good bet for a digital future as long as that ATL agency gets digital, which means they aren't just talking about it, they're doing it (hmm, a black sheep has just popped into my head).

On the other hand, another good bet is the agencies whose best is yet to come: the digital ones, primed as they are to thrive in the coming digital ecosystem.

And now for the second important bit of Mr Bullmore’s quote: if you’re worried about applying to a digital agency because it’s got the word digital in it, don’t be: as he says, it’s not really about tech, it’s about interactivity.

And what's that? It's spreading the intelligence more evenly between people who make stuff and people who consume it. Sometimes it's only a little, sometimes it's a lot.

This interactivity lets you do a lot more than you can at your typical traditional ATL agency. Or to reunite that idea with its owner:

we are not an advertising agency

I think that's really exciting (and Mr Tait has 9 more great reasons why digital is better, for those interested). In digital you're unshackled from just doing TV, print and radio and free to do all sorts of exciting things: sites, applications, blogs, games, branded content, widgets, podcasts, social things and experimental stuff. And a lot of this (not all) is actually useful to people; it's additive rather than interruptive.

In my experience grads tend to think of digital as something on-the-sidey and techy. Maybe it once was. Now it ain’t. Technology is so ubiquitous, so ‘ready-to-hand’, that it’s becoming invisible and when that happens it gets socially interesting. In other words, technology and culture used to be separate, increasingly they are the same (look what you're doing now.)

It’s a brilliant time to get into an industry that’s only going to grow (even in these tough times) and that’s much more about interesting interactive ideas than it is about tech.

Go on, apply!

Obviously I am biased but this would be a good place to start...

(For those wanting more, I suggest you have a play in here, read this, canoe back up this and maybe watch this. That should be enough to be getting on with.)

Monday 10 November 2008

a good, good guide


GoodGuide is great. It helps you find healthy, safe and green products. And - most importantly - it's now available on the iPhone. Not that the iPhone itself is the important bit: it's that the guide can now be used at the point of purchase, which I'd imagine impinges much more on buying decisions than the memory of the site from home or work.

Saturday 8 November 2008

50,000

Hit 50,000 hits on Flickr for digitalbites today, and not much else happened, so I thought it was worth a post. Good times.

Friday 7 November 2008

getting creepy

New technologies can look like magic. That's Douglas Adams speaking.


But magic comes in two shades: black and white (which, if the last posts are anything to go by, seems to be a minor obsession at the moment).

The black stuff indicates some dark intention; the white stuff a benevolent effect.

New tech goes the same way. Phorm looks black. Genius looks white. But, in essence, they both do the same thing: use our data to sell more effectively.

And there's going to be a load more black technologies as the web breaks out and evolves into an Internet of Things.

Throw into the mix that data capture will get a whole lot smarter not only because of new ways of getting it (through GPS, RFID, accelerometers and the like [SPIME devices]) but because our increasing desire for personalisation means absolute transparency (that's Kelly), and you have some really quite creepy tech around the corner.

Stuff that knows about YOU. Where YOU are. What YOU like. Maybe even why YOU like it - and the past and future tenses of all of those. It's gonna get freaky.

The challenge is to tweak and present these technologies so they shift from having a perceived dark purpose (I don't really think Phorm has one) to being understood as benevolent. We need to fuzz them up.

art from code



from here

Friday 17 October 2008

clients paying agencies to advertise agencies

So the FT will shortly be running ads to warn against slashing ad budgets. Says Frances Brindle, FT's Global Marketing Director, "There is considerable evidence to suggest that companies that continue to invest in advertising in tough times emerge stronger than those that don't." 

It's all correct but there's just something lovely about clients paying an agency to advertise agencies.

turn left where the telephone box used to be


Thought it might be nice to record the Stage 1 IPA talks for selfish future referencing and for anyone else interested. They are a series of talks that aim to cover some of the essential truths of the communications industry. My posts will be pithy and I'll add bits in sometimes, esp. if there's a jump-off to psychology.

The first one was by industry legend Jeremy Bullmore. Here's his talk, triple-distilled:
  • A man asks for directions to a shop in a small town. The postman tells him to go up the road and turn left where the telephone box used to be.

    Why has the postman failed in his communication? Because he assumes that the listener knows what he knows. Or rather, he fails to appreciate that the listener's knowledge is not the same as his. English lacks a single word for this, but it's something like empathy. Psychologists, however, do have a term for the faculty: theory of mind. Using clever methods - like the Sally/Anne task - it is possible to see this mental trick coming online around the age of four in developmentally typical children; autistic children typically struggle with it. The point: communicators need a theory of mind - the ability to see events through the eyes of those they are communicating with - in order to be successful.

  • Passive audiences were never passive. Audiences have always actively understood communications, it's just that before digital they never had a way to express it; digital makes stuff that has always happened explicit.

  • There are no such things as messages. There are stimuli and responses. 

  • The best creativity elicits the best contribution from the receiver (the artist rules his subjects by turning them into accomplices)

  • There is no dichotomy between creativity and effectiveness in communications. Effectiveness is the end; creativity is the means.

  • Advertising creativity makes clients' money go further. Anything outside of that definition is not creativity.

  • Brand body language is what people read. When the body language doesn't match the communication, there's a problem. 

  • Good brands make you feel safe; they release you from anxiety (most likely because of problems with information in market economies).

digital britain

"Our ambition is to see Digital Britain as the leading major economy for innovation, investment and quality in the digital and communications industries. We will seek to bring forward a unified framework to help maximise the UK's competitive advantage and the benefits to society." 

Stephen Carter, UK Minister for Communications, Technology and Broadcasting

clarity crunch

"My favorite time to manage is during a bust. It brings more clarity about what your customers need and what your priorities should be."
Sergey Brin

Thursday 16 October 2008

i love data layering




...especially if it's in 3D.

Friday 10 October 2008

believing real

One of the first things I think when I see something online, and one of the first things that gets bandied about in comment threads, is whether or not something is genuine, rather than set-up, faked, CGI'd etc. If it's genuine, interest soars; if not, interest dwindles (usually).

With this in mind here's a variation on Kelly's idea:
When content is faked, it becomes emotionally worthless.
When content is faked, stuff which isn't fake becomes scarce and valuable.
When content is faked, you need to show people things which are not faked.
I'd add that some stuff that's great is faked (e.g. Cadbury's Gorilla; clearly that's not a real ape drumming away). It's when stuff is faked and needn't have been (or could have been done for real) that the emotional bottom drops out.

rebel selling

This is a screen grab from LastFM before they ruined their design. Putting that to one side for the moment, the interesting thing about it is that, after 'rock', 'alternative' came in as the second most popular tag. A screen grab from today shows much the same pattern.

[Aside: It's interesting how 'seen live' is a major tag too. As Kevin Kelly has said, when stuff gets superabundant it gets cheaper, to the point of being free. When this happens, things that can't be copied become more valued both by ordinary people, hence the tag's popularity, and by record companies, hence the money in music now being in touring.]


There is more than a strong whiff of irony about one of the most popular tags being 'alternative': 'popular' and 'alternative', at first blush, cannot occupy the same place in a Gaussian distribution.

Now, it could be something particular about LastFMers: they might be alternative sort of folk. There's that. It's part of the reason the 'alternative' tag is so big. But, there's a greater truth too: 'alternative' is the driver of capitalism and culture.

Bakunin, Nietzsche, Sombart and Schumpeter all saw capitalism not for the homogeneity it created but as a fundamentally creative (and thus destructive) system. Capitalism and culture can be stated as the effort to escape sameness.

But you get information problems here. There are too many alternatives. Just like brands are there to help solve this problem, so certain things become pin-ups for 'alternative' to avoid the crippling effects of having too many alternative things. These things are then popular for being alternative.
Here, then, 'alternative' and 'popular' can occupy the same place on a Gaussian distribution: everyone is trying not to be mainstream.

(Another irony here: counterculturalists believe they are rebelling against 'the system' when they are most probably contributing towards it, because rebellion is in the very spirit of capitalism. This could be problematic: "Not only does it distract energy and effort from the sort of initiatives that lead to concrete improvements in people's lives, but it encourages wholesale contempt for such incremental changes" [The Rebel Sell].)

Tuesday 7 October 2008

why psychology is so important

The mind has evolved to solve certain problems that kept cropping up in evolutionary history. Part of that is a general intelligence for working out stuff that history hasn't prepared it for (being flexible). However, there are lots of other faculties that are more specific and that muddy general intelligence.

When I say psychology is not the same as philosophy now but vital to it, what I mean is that without an understanding of the mind, we cannot get a good understanding of reality. We have to know the mud to remove it.

All knowledge gathering without an awareness of the mind's natural biases is the straight line in the diagram below. Knowing how the mind works allows you to travel along the bendy line, circumventing the mud, and obtaining unfettered knowledge. Close eyes, deep hum....

Wednesday 17 September 2008

buy-product

thomashawk/flickr

The short version of this post is:

Selling on social networks can only be a by-product (hence the title 'buy-product'. Geddit? Oh dear, I am sorry) of the activities occurring on those sites, never a primary activity.

The long version is this:

In market economies there are two big problems with information: it's (occasionally)* inadequate and it's superabundant.

Why is information inadequate? Because you only know how good something is after you have bought it, so how do you choose between alternatives?

Why is there too much information? Because everyone wants a slice of the pie making those alternatives and the market gets flooded with products or services and thus information about those. (How do you choose between one-hundred types of olive oil? You don't for the most part. It's crippling. You move on.)

Add in the fact that this competition strips profits to the bone and you have three pretty good reasons why brands should (and do so successfully) exist: to remove doubt about quality and ease the process of otherwise crippling choice for people who want to buy stuff, and to pump scarcity (and thus juicier profits) back into things for people who want to sell stuff.

However, in the spirit of several recent books, the problems of inadequate and superabundant information can also be solved by [dramatic pause] other people. And more effectively, because trust in other individuals is second only to personal experience itself.

Other people can literally 'test' products and services for you before you buy them yourself (by buying them themselves) and help narrow down the choice for you (by having had to narrow down the choice for themselves.)

People have probably been doing this since information in markets got to be doubly-dodgy. Nothing new in the behaviour. It even has a 5-syllable name: recommendation.

However, what might be new is that hitherto implicit recommendation could be made explicit and more useful with a dash of digital. Making stuff explicit seems to me to be the formula of the successful things on the recent web (Facebook makes social relationships explicit, blogs record thoughts that would otherwise only exist in conversation or the mind, LastFM records the wake of your audio, StumbleUpon ossifies your digital discoveries etc.)

How would this work? Social networks would allow self-expression at a much finer level of detail, allowing libraries of music, films, books, clothes**, etc. that people have honed (not harvested automatically, which was the problem with Beacon). (Eventually content could be brought within such networks, so they operate as hubs of digital content.)

This in itself could present recommendations based on content (like iTunes Genius) but also recommendations made by other people in your group of friends. These wouldn't really be recommendations so much as comments/ratings tagged onto things by people. 'This track is awesome' could be really useful coming from someone you (or your computer) know you share taste with.
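To make that a bit more concrete, here's a minimal sketch of how such a thing might work under the hood - written in Python, with the friend data, the Jaccard-style taste overlap and all the names invented purely for illustration (this isn't any network's actual method):

```python
# Hypothetical sketch: weight friends' ratings by shared taste.
# Data structures, names and the overlap measure are illustrative only.

def taste_overlap(mine, theirs):
    """Jaccard overlap between two sets of liked items (0..1)."""
    if not mine or not theirs:
        return 0.0
    return len(mine & theirs) / len(mine | theirs)

def recommend(my_library, friends, top_n=3):
    """Score items friends have rated, weighted by how much taste we share."""
    scores = {}
    for friend in friends:
        weight = taste_overlap(my_library, friend["library"])
        for item, rating in friend["ratings"].items():
            if item in my_library:
                continue  # don't recommend what I already have
            scores[item] = scores.get(item, 0.0) + weight * rating
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Toy usage
me = {"Track A", "Track B", "Track C"}
friends = [
    {"library": {"Track A", "Track B", "Track D"},   # shares a lot of my taste
     "ratings": {"Track D": 5, "Track E": 3}},
    {"library": {"Track Z"},                          # shares almost none
     "ratings": {"Track Y": 5}},
]
print(recommend(me, friends))  # Track D outranks Track Y
```

The point of the sketch is simply that the same 'this track is awesome' counts for more when it comes from someone whose library looks like yours.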

Picture time:

digital demographics


One of the nice things about such a system is that it's lovely for people, providing stuff they might be interested in. However, there is also an opportunity here for social networks.

Another picture first:

untarnished social networks

And now I am going to quote myself, which is probably a bit wanky but here we go from a few posts back:
"When people use Google, they're looking for information. When they use Amazon, they're buying (or researching). The ads are working here because people want information, it's welcome if its good enough.

When they use Facebook (or any other social media) they're expressing, communicating and interacting with others (being social not being cognitive). The same ads aren't working here for the same reason you'd be a bit miffed if someone marched into the pub, dropped a sausage in your pint, yaddered on about how delicious they are and, by the way, how they are half-price at the moment."
The point is that in the social space clumsy selling (ads) is not welcome: the communication gets tarnished by it, so it's ignored and disliked. But, that is not to say that selling as a by-product is out of the question. Links to things - and charging the producers a little for those links - suits everyone.

* I put 'occasionally' in brackets because as soon as information is partially inadequate it gets perceived as wholly inadequate.
** Self-generated recommendations will probably work for clothes; other-generated recommendations won't work as well. People like their clothes to be different (but not too different) from their peers.

Thursday 11 September 2008

negroponte's predictions

Prediction is usually a dubious business: things are way too uncertain and we just don't know what we're going to know in the future ('unknown unknowns' in NNT's terms). That's why Being Digital by Nicholas Negroponte, which came out in 1995, is all the more freakish in its prescience.

Here's a smattering of things I liked, with the occasional few words after each from the perspective of now.

moving intelligence through media
The tidiest way I have seen the 'receive -> interact' paradigm change articulated.

computing
Basically, Apple's strategy and success.

pulling bits
RSS, Google Reader...

touch, the dark horse
Creeping in more and more. He also talked about "the tiny hole or two in plastic or metal, through which your voices access a small microphone" (p.159). This is still proving difficult.

digital demographics
Things like Google Reader's Top Recommendations, Amazon's recommend emails and iTunes' Genius represent this one quite nicely. Although still some way to go here.

digital on-demand
Hulu, BBC iPlayer and all the underground antecedents to these.

the global social fabric
This idea - communication as well as information - is rephrased a lot by pundits. What's impressive about this is that it saw the value of social online before it was made explicit with, sorry, nasty phrase coming up, Web 2.0.

the peeling boundary
Blogging seems the best example.

the process
Radiohead is my favorite example of this at the mo. (Also see here for what bitcasting - another of Negroponte's babies - is all about and how Radiohead's House of Cards 'video' is likely to have been the first example of this).

laws for atoms
Very broadly gets to the nub of all the legal issues bouncing around online.

And a few others that didn't make it into digital bites:
"Clipping bits is very different from clipping atoms" p.59
"On the net each person can be an unlicensed TV station" (p.176)
One word. YouTube
"...bits that describe other bits...will proliferate in digital broadcasting. These will be added by humans aided by machines, at the time of release...or later (by viewers and commentators). The result will be a stream with so much header information that your computer really can help you deal with the massive amount of content" (p.179)
tags, labels etc
"automobiles will enjoy another very particular benefit of being digital: they will know where they are" (p.216)
SatNav.
"The important point is to recognise that the future of digital devices can include some very different shapes and sizes from those that might naturally leap to mind from our current frames (sic) of reference. Computer retailing of equipment and supplies may not be limited to Radio Shack and Staples, but include the likes of Saks and stores that sell products from Nike, Levis and Banana Republic."
Basically, the web breaking out from behind screens, which I have thought about here. Nike+ is the golden example of this right now. A continuation of this idea:
"When this happens in a tiny format, all "things" can be digitally active. For example, every teacup, article of clothing, and (yes) book in your house can say where it is. In the future, the concept of being lost will be as unlikely as being "out of print"
I like the nod to long tail stuff at the end there with "as unlikely as being out of print"

(Skepticism: the book, being widely read, could have prompted people to work on the things Negroponte predicted ('invented'), giving the impression that the book is farsighted when it may have been prescriptive to future-makers.)

Friday 22 August 2008

digital uses more neurons

rizzato/flickr

There is a lot of froth about how the Google Generation (which is actually a very misleading idea) is a bunch of cognitively myopic and depthless individuals just skimming from one digital distraction to another. And somewhere in that there is probably some truth for some people.

But the assumption that the endpoint of new web behaviours is neuromush is wrong. History teaches us that any new technology brings with it a grimly predictable cohort of detractors. It also teaches us that for every game-changing innovation - the alphabet, writing, printing - humans didn't end up mentally crippled but enriched, seriously enriched, in fact.

That's why - in the same spirit as Everything Bad is Good for You - it's nice amid all this gloom to know there are some historically alert digital optimists. Digital culture is, in many respects, a massive improvement on 'receive' culture*. As Don Tapscott and Anthony D. Williams put it in Wikinomics (p.47),
Rather than being passive recipients of mass consumer culture, the Net Gen spend time searching, reading, scrutinizing, authenticating, collaborating and organising.
So in that respect, Google is not really making us stupid. Perhaps, quite the opposite: all this new media may cause more cognitive sweat than the media before it.

Rewind ten years: people would read a book or an article and that was that. A few might make notes. Only a handful would write about it and publish, and typically on a professional basis. Back to today, and the same book or article generates way more thought than it would have done a decade earlier. Digital culture uses more neurons.

Added to that, what is also omitted from the view that the new represents a mental downgrade is that while we are outsourcing certain brain functions to silicon we are gaining literally superhuman abilities in the process. The critics focus on what is lost and ignore what is gained, like memory.

As Faris has neatly put it,
I think increasingly, our brains are less like databases and more like index servers
And it's better that way because actually I can store more, not less. My memory isn't atrophied by the internet, it's augmented. And that's the mark of some of the most transformational technologies: they extend our ability to keep information alive by outsourcing it.

*This is clumsy. Some digital stuff is obviously sit-back too.

aquajelly and airjelly

Thursday 21 August 2008

more fragments

blentley/flickr

Why is advertising in social media not really working out? Because of a silent assumption that got forgotten in the move from trad to new media.

You can deploy car and beer spots during football matches; ads for age-defying cream and Heat during How to Look Good Naked; stuff for DIY during Grand Designs...I could go on.

And this makes a lot of sense. By knowing what your audience is like you can be more selective and hope your ad is hitting a bigger group for whom it is more relevant, rather than just splattering it randomly across the schedule.

This model has been carried across to the Internet, with superb success for Google who worked out how to automate the process with AdSense.

However, one of the things the traditional model never had to worry about was what the audience were doing. It didn't have to worry because it knew: they were listening to something, or watching a show, or standing on the Tube, or whatever.

Because the model had this silent assumption when it was transferred to the new medium it got forgotten. The hidden fragment got left behind.

What are the audience doing online?

When people use Google, they're looking for information. When they use Amazon, they're buying (or researching). The ads are working here because people want information; it's welcome if it's good enough.

When they use Facebook (or any other social media) they're expressing, communicating and interacting with others (being social not being cognitive). The same ads aren't working here for the same reason you'd be a bit miffed if someone marched into the pub, dropped a sausage in your pint, yaddered on about how delicious they are and, by the way, how they are half-price at the moment.

So what people are doing online is probably as important for click-through rates as who they are.

I haven't thought about specific examples yet for social networks, but essentially companies selling in this space should assist with communication and expression, not clutter it.

The Internet isn't one medium, it's fragmented media - and not just by who, but by what people are doing.

Wednesday 20 August 2008

smarter reviews

Once mobile internet gets properly off the ground, lots of shopping in the real world will change. It will change because prices and reviews - things normally all the way back at home - will suddenly be at your fingertips in stores. The buying decision is going to get another brain contributing.

But reviews have their problems.

Say I am swotting up on a new book I have heard is rather tasty or investigating a new camera so I don't have to mashup (read, lazily appropriate) others' photos on Flickr. I read a load of book reviews on various sites. For the camera, I come up against some sites wanting my money for their opinions, others with more reviews than I can possibly read and perhaps a couple of blog posts from some real keenos.


Here's my beef with all of this.

'Old' media stuff is dense and long but trustworthy and rich. It's also one person's view normally.


Customer reviews are helpful because they are likely to tell it like it is; they have no reason not to. Except some people can't tell it like it is even if they want to, making a chunk of customer reviews unhelpful by being unreadable, like this beauty from the BBC's gleefully entertaining Have Your Say (distilled here).


And even when people can get their thoughts in order, how do you know that what they like you are going to like? So you look at quite a few of these and try to average across opinions. That's a bit time-consuming. One quick way to do this is to look at things like the 5 stars on Amazon.


This is beautifully quick but often rather unhelpful: it doesn't tell you all that much. Added to that, fans swarm in and leave, in their slaver, pages of universally positive reviews. In its most extreme form this sort of review takes binary form: thumbs up or down, cool or not, rotten or fresh. Essentially, the problem with taking lots of data and reducing them is that it can only provide a dirty average.


Basically, the problems with reviews are that we have to work hard to find them, and when we do there is too much information overall and a poor summary of it. We need something that combines the best bits. Basically, something that is quick but rich:


So all the power of collaboration is used. All the time spent reading and cogitating is stripped away. The in-depth, expert stuff is there if you want it. And, most importantly, the reviews become a whole lot more powerful by taking into account who has left them.
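As a back-of-an-envelope sketch of that last bit - weighting the score by who left the review - here's some Python with hypothetical reviewers, a crude tag-overlap similarity and made-up numbers (nobody's actual system, just the shape of the idea):

```python
# Hypothetical sketch: a star rating that counts reviewers like me more heavily.
# Reviewer data and the similarity measure are invented for illustration.

def similarity(my_tags, reviewer_tags):
    """Crude taste similarity: shared interests over combined interests (0..1)."""
    union = my_tags | reviewer_tags
    return len(my_tags & reviewer_tags) / len(union) if union else 0.0

def personalised_score(my_tags, reviews, floor=0.1):
    """Weighted average of star ratings; 'floor' keeps strangers from vanishing entirely."""
    weights = [max(similarity(my_tags, r["tags"]), floor) for r in reviews]
    total = sum(weights)
    return sum(w * r["stars"] for w, r in zip(weights, reviews)) / total

me = {"travel", "photography", "hiking"}
reviews = [
    {"stars": 2, "tags": {"photography", "hiking"}},  # similar to me, lukewarm
    {"stars": 5, "tags": {"gaming"}},                  # fan swarm, unlike me
    {"stars": 5, "tags": {"gaming", "gadgets"}},
]
print(round(personalised_score(me, reviews), 2))  # pulled towards the similar reviewer
```

A plain average of those three reviews would be 4 stars; once the fan swarm is down-weighted it drops to roughly 2.7, which is much closer to what someone with my tastes would actually experience.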

This has probably been thought up somewhere before. All the same, I don't see this kind of thing anywhere. And it is the perfect sort of review system for mobile: lightweight, powerful, visual, personalised and genuinely useful.

Wednesday 13 August 2008

transactive digital memories

kendrak/flickr
"Knowledge is of two kinds: we know a subject ourselves, or we know where we can find information upon it."
Samuel Johnson had the idea of transactive memory down before Wegner, Giuliano, and Hertel formally brought it to the table in 1985.

What it acknowledges is that there's memory in our heads, and memory that's elsewhere, usually in other people's heads. Transactive memory is the smearing of a reality across a number of different minds to lighten the load on any one of them.

Swap out the 'other people' in this arrangement and swap in the Internet. Feel familiar? As Clive Thompson said in Wired, "Almost without noticing it, we’ve outsourced important peripheral brain functions to the silicon around us."

Just as transactive memory in relationships or groups is only conspicuous by someone's absence, so it is when we are away from the Internet. Our memories are now neuronal and digital. Sounds rather cyborgish, doesn't it? But a hefty chunk of my memory - names, numbers, dates, addresses, quotations - is now outsourced.

Wegner and his mates should broaden their research to see what's going on with transactive digital memories.

early twitter

Sunday 10 August 2008

circlesquare - sub-reminisce


Circlesquare - Sub-Reminisce from Bienvenido Cruz on Vimeo.

facebook and fireworks


Facebook: the social equivalent of watching the fireworks in the park on TV.


anxiogenic tv

bbaltimore/flickr

There is a lot on how the Google Generation (which is actually a very misleading idea) is a bunch of cognitively myopic and depthless individuals just skimming from one digital distraction to another. And somewhere in that there is probably some truth for some people.

But the assumption that the endpoint of all these new web behaviours is neuromush for everyone is wrong. That's why it's nice when you hear something that says, actually all this new media causes more cognitive sweat than the media before it. It might even be making some of us a bit smarter.

It's the same with TV shows. They are getting more complicated and we are having to be better decoders to understand them. Even 'rubbish' like Big Brother forces us to track a large number of relationships, which have to be continually updated. If this example is fatuous, shows like The Sopranos, The West Wing, The Wire, Prison Break, 24, Lost and so on are most certainly feistier than their antecedents.

As usual, someone else has thought about this a lot more and got it down on paper. Everything Bad is Good for You by Steven Johnson argues "that popular culture has, on average, grown more complex and intellectually challenging over the past thirty years" (xv). I agree.

But there's something else that I have been feeling in the last five years when I watch television, which is increased anxiety. The reason for this is that I can no longer trust writers.

Somewhere things changed.

For me, this point came with Season 1 of 24. With everything I had watched up to that point it was a fairly good assumption that by the end of a film or season everything would be nicely tied up - the baddies dispensed, the objective achieved, the romance consummated. More than anything, writers brought their characters close to death but pulled them back again at the last moment.

Then audiences got bored of this; the assumption was so strong that it started to dilute the experience, because the outcome could be predicted. The reaction was for writers to start being ruthless and, to a certain extent, inconsistent when it came to dispensing with characters.

Now audiences were never really sure who was going to live or not because instead of pulling all major characters back from death they started popping them off left, right and centre.

I believe Kiefer Sutherland is the only major cast member remaining on the television show 24 after its 7 seasons. Prison Break in its third season is considerably thinner on characters than when it started.

This has the effect of pulling you deeper into shows, because you face the confusion and uncertainty the characters feel instead of being able to watch the show with a safety net below you.

As Johnson has noted, things like multiple-threading in TV shows have forced us to sharpen up cognitively. But, increasingly this and other devices are coupled with wanton disregard for nearly all of a show's major characters, which has sharpened our anxiety.

Friday 8 August 2008

solution to the prosumption dilemma


The solution is in the middle ground. Companies should try to build platforms that open the door to prosumers without knocking down their own walls.

I think Apple has got this right. When its lead users started hacking the iPod with Podzilla, PodQuest and Encyclopodia, and jailbroke the first version of the iPhone, it did a bit of commercial jujitsu, taking the force of the demand and turning it to its advantage with the SDK and the iPhone App Store.

Here is perhaps the best example of the prosumption dilemma solved: Apple keeps control, appears open and adds value to its product way beyond what it could achieve alone through vast collaboration.

Thursday 7 August 2008

mixin



Two things.

One. Vodafone should be doing something like this.

They could have a whole arsenal of apps to turn their positioning - make the most of now - into something valuable for people, and link it up in lovely ways to mobile and social platforms.

Two. It's so helpful when you come across a new site not to have to explain it in your blog but just embed the code for a video they provide, which does the job for you, thank you very much.

spencer higgins


There's just something very intriguing about this. You need to see it big.

Wednesday 6 August 2008

crime mashup

"It is simply unacceptable at this point in history that a citizen can use Web services to track the movies he is renting, the weather around his house, and the books he's recently purchased but cannot as easily monitor data regarding the quality of his drinking water, legislation, or regulations that will directly impact his work or personal life, what contracts are currently available to bid on for his state, or what crimes have recently occurred on his street."

James Willis, director of eGovernment for the Rhode Island Office, 2005
It is.

Or it was.

Now there's ZubediPI which tells me the West End is a bit dodgy when it comes to theft and violence amongst a whole host of other rather useful stuff.

i am

If an alien arrived on earth, one of the first things it would note about our species is how much time we spend with each other. 'Humans are social creatures' it might jot down (see Cartwright and Zander, 1953).

If it had been around for a while, it might have added 'but this has been decreasing in the last few decades' (Putnam, 2000). And, maybe if it had been paying special attention, it might scribble something like 'increasingly people have fewer others with whom to discuss their most intimate thoughts and feelings' (McPherson, Smith-Lovin, & Brashears, 2006).

As Ybarra et al (2008) sum up,
The success of the social networks is probably in no small part because they strengthened and made explicit these withering social connections we all naturally crave.

Orange's new campaign, which orbits around the strap line ‘I am’, mines this too.

What does 'I am' mean?

Simply, you are better off working together than you are by yourself, thus necessitating communication technology. You are the sum of the people you communicate with.

One of the nice things about this strategy is that it is actually true. It's not that staple of advertising - myth-making - but a statement of something fundamental about human interaction and that makes it fresh.

For one, you are smarter working together. Ybarra et al (2008) writing in the Personality and Social Psychology Bulletin found that social interaction (as little as 10 minutes) improved intellectual performance.

More social contact is correlated with well-being (Sinha & Verma, 1990; Triandis et al., 1986) and its absence is marked by depression (Gladstone, Parker, Malhi, & Wilhelm, 2007).

There is even some evidence to show that after controlling for level of health, fewer social connections are linked to an increased risk of death (House, Landis, & Umberson, 1988). Yikes!

Orange is in the relationship business, not the mobile phone business any more.

And not only the relationships between each other but the relationships between all our different digital identities.

'I am' Facebook, Flickr, Blogger, YouTube, Amazon, Vimeo, Google, Dopplr, Wikipedia, del.icio.us, StumbleUpon... the list goes on.

This unruly mass of services exists and yet there is nothing to tie them all together.

That is, until Orange's offering, My Social Place, hits the scene in the autumn, soldering all our online identities into one ball.

So 'I am' is a strategy that is timely in two ways. It fosters and facilitates being social, something we all need but aren't getting enough of. And it is digitally prescient, preparing for communicative life beyond simple mobile. More evidence of marketing and service dancing together so fast you can't tell who's who.

But there is a final bit of cleverness in here. The social thing and the digital thing, as well as being two separate but rather nifty uses of the same strategy for ordinary people, are also commercially adroit when holding hands. As Cory Doctorow, in a speech at Cambridge last month, explains,
"The thing that the Internet is even better at than providing universal access to all human knowledge is nuking collaboration costs, getting rid of the cost of getting people together to do stuff…[This is] what allows us to be literally superhuman. That is to say that if you and someone else can do something that transcends that which you could do alone, then you have done something that is more than one human can do and is superhuman." (around 19 mins in)
'I am' in this sense perhaps acknowledges that what's around the corner is really big collaboration online and on-phone. And Orange is going to be a company to help out with all of that in its services, the branding seed of which is being planted now.

Pity the execution isn't more exciting and less pretentious. Still, early days.

search for 'i am'


Been waiting to get a pic of the new Orange print campaign, but everywhere I seem to be at the moment the damned ad sneaks off somewhere.

Anyway.

I might be wrong - I'm probably wrong - but isn't this the first big campaign in the UK to jettison URLs in favour of a search term?

In case it has passed you by, the ad suggests you

search for 'I am'

in the spot normally reserved for a URL.

[Aside: 'I am' has to be the shortest sentence in English?]

This is a great improvement for the interested. Much better than having to remember "www.i-am-everyone.co.uk".

Of course, the Japanese have been doing this sort of thing for a bit, probably because their mobile market is so much more mature than ours.

But there doesn't appear to have been any SEO. Instead Google, Yahoo et al. seem to have profited rather nicely from this campaign (are any of the Fallon people currently dating people at the search engines?)

People search for everything, sometimes even if they have the site in their bookmarks. No one really types in a whole web address any more. It's good to see brands acknowledge this.

Monday 28 July 2008

ipint + the global spirit level



The iPint: the stonkingly popular download for the iPhone. Apparently "the best example to date of mobile advertising", according to Claire Beale. I'm inclined to agree, even though, according to Faris, the idea seems to be lifted from here:



Although fun, it is quite shallow. It needs more depth to win my vote.

The next best example of mobile advertising from Carling would be if you could add some sort of value beyond a neat trick - like actually having your pint waiting for you at the bar, or brought to you if you order on your iPhone, so you can avoid a ten-deep human barrier between you and your cool, refreshing beverage. It would probably get more Carling sold too, which would "stretch the definition of what advertising is" even further.

[It's the accelerometer in the iPhone that Carling has exploited. The accelerometer is the interesting bit. And, for no reason other than that it's possible, it would be cool to know the tilt of every iPhone user on the planet. Making the half-decent assumption that facing up and moving a bit equals activity and lying flat face down equals inactivity, it would provide an interesting peek into people's activity.]
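For fun, a toy sketch of that half-decent assumption in Python. The readings, thresholds and percentages are all made up for illustration; this isn't how the iPhone actually reports its accelerometer data:

```python
# Toy sketch of the aside above: classify phones as 'active' or 'idle'
# from tilt and jiggle. Readings and thresholds are invented for illustration.

def classify(reading, jiggle_threshold=0.15):
    """reading: (z, jiggle) where z is roughly -1 face-down/flat .. +1 face-up,
    and jiggle is recent movement magnitude (arbitrary units)."""
    z, jiggle = reading
    if z > 0 and jiggle > jiggle_threshold:
        return "active"      # facing up and moving a bit
    return "idle"            # flat, face down, or still

readings = [(0.9, 0.3), (0.8, 0.02), (-1.0, 0.0), (0.2, 0.4)]
states = [classify(r) for r in readings]
print(states)                                                   # ['active', 'idle', 'idle', 'active']
print(f"{states.count('active') / len(states):.0%} of phones 'active'")  # 50%
```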

pulse



According to the blurb, "pulse is a live visualisation of the recent emotional expressions written on the private weblogs of blogger.com". It's interesting because stuff online is being used to create art offline.

It's kind of a mashup of Julius Popp's intriguing Bit.Fall (below) and also the 'feel map' that I blogged about here.



Also reminds me a bit of the plant in E.T. which seems to respond to E.T.'s health.

I really, really want time and location to be factored into these things, especially as mining the emotions gets better. It would be such an interesting insight into people's expressed emotions as news stories ripple through a population, or, more generally, into what a particular population is feeling over a year.
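To show the sort of mining I mean, here's a crude sketch in the spirit of pulse: pull 'I feel ...' phrases out of post text and count them, with a time and place attached so the ripples could be mapped. The regex, the sample posts and the timestamps are all invented for illustration; pulse's actual method will be different.

```python
import re
from collections import Counter

# Crude, illustrative emotion mining: find "I feel <word>" / "I'm feeling <word>"
# phrases in post text. Sample posts, timestamps and locations are made up.

FEEL = re.compile(r"\bi(?: am|'m)? feel(?:ing)?\s+(\w+)", re.IGNORECASE)

posts = [
    {"when": "2008-10-10 09:00", "where": "London",  "text": "I feel hopeful about the new job."},
    {"when": "2008-10-10 09:05", "where": "Glasgow", "text": "Rain again. I'm feeling gloomy."},
    {"when": "2008-10-10 09:07", "where": "London",  "text": "Honestly I feel gloomy too."},
]

emotions = Counter()
for post in posts:
    for word in FEEL.findall(post["text"]):
        emotions[word.lower()] += 1   # could also bucket by post["when"] / post["where"]

print(emotions.most_common())  # [('gloomy', 2), ('hopeful', 1)]
```

Bucket those counts by the time and place fields and you have the beginnings of the map I'm after.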