Mad Men mergers (WPP + Interpublic)

So the latest rumor is that WPP is planning (or trying, or hoping) to buy Interpublic Group.

As mentioned in this earlier post, in a time of rapid change, especially in advertising/media/tech, shouldn’t companies try to become smaller and more nimble to deal with digital disruption? The continued mergers in this industry seem more aligned with old-school capitalism.

Photo: JRibaX (Own work), GFDL / CC BY-SA 3.0-2.5-2.0-1.0, via Wikimedia Commons

What experiments in urban development can teach us about society

Photo: JackDayton at en.wikipedia, CC BY 1.0, via Wikimedia Commons

Following on from this post about speeches that were prepared, but never delivered (except perhaps in parallel universes where all possible actions take place), it is really interesting to look back and think about the same thing in an urban setting, and what that could mean for urbanism going forward. This 99% Invisible podcast episode on Unbuilt structures looks back in history at structures in San Francisco and New York that were proposed but never built.

What is interesting is how tied to their time period these proposed ideas are. At some points in our history, we’ve been focused on Tayloristic efficiency and shaving seconds off time spent commuting; at other times, it’s been all about urban spaces that can generate happiness and increase quality of life.

Since cities are by now the dominant form of dwelling, already holding over half of the world’s population, what we value and design for in cities is really what we value in society as a whole. This becomes even more pronounced given how important cities are in the political process as well. Federal systems are paralyzed, and cities need to step up and shape their own futures. This, interestingly, is true both in the US, with a Congress in deadlock and some strong mayors taking action (see this earlier post on Bloomberg), and in developing countries. In some countries, such as Nigeria, the government might be very weak, but the cities develop their own structures and institutions, formal or informal. Lagos is an example of a city that has been said to be like a country in itself. The FT had an op-ed on the metropolitan revolution recently, and there was an interesting BBC Analysis podcast on whether the UK could learn from the US in this regard.

So should cities be like factories, aimed only at increasing output? Or should they be places for discussion and improved human knowledge? There have been some interesting recent experiments in urban redesign that I think provide us with some idea of what we seem to value most for our cities.

On the smaller level, we find successful “reclaimings” of derelict parts of cities, such as the High Line in New York, which has rejuvenated a large area through the creation of a park on an old elevated rail track, high above the usual city bustle. London has its Olympic village in East London, which will hopefully drive development in this long-ignored area.

In the category of more extreme urban redesign, which gets closer to full societal redesign, we see for example this city on wheels in Fast Company. This raises a lot of interesting questions about what a city should be. Another interesting concept is that of seasteading – creating fully functioning societies on the sea, outside the borders of existing countries. A bit of extreme libertarianism, perhaps, but quite a nice utopian vision.

Some of these redesigns are too expensive to be seriously considered. But perhaps we can take hope in how well small, temporary societies can be created. Here, for example, Good magazine described what the Burning Man festival can teach us about how a society can be quickly created and upgraded.

Photo: Aaron Logan, CC BY 2.0, via Wikimedia Commons

In new journalism, audience comes first (Vice and Buzzfeed grow up)


Audience comes first; serious, investigative journalism comes later. That the Internet turns the old newspaper model upside down seems to be one of the key lessons we can learn from recent new ventures in journalism.

Buzzfeed, long a purveyor d’excellence of cat videos, is now beefing up its staff and aims to become the main news source for the social generation. Its experience, and that of the Huffington Post, which just received its first Pulitzer, shows that if you build an audience on the Internet first, you’ll then get the funds to invest in building up your “serious journalism” efforts. Mark Glaser had a good Media Shift podcast on this last month.

Matthew Ingram also writes about this today, and he includes Vice in the equation. Vice seems to follow the model of doing gonzo journalism first, then becoming a bit more serious. After stunts like the Rodman North Korea trip, Vice is now valued at $1.4 billion, although it’s still running articles that are just flat-out weird, such as this one on subliminal messaging in Spotify.

Business Insider, one of my favorite Internet news sources, has long been derided for being just listicles and ALL-CAPS headlines such as “the ONE CHART that shows why the EUROZONE is DOOMED”, but it really is one of the smartest sources for “financial markets with a twist” news.

What this means is that journalism is changing. What is journalism now? It’s no longer just writing and interviewing; it’s now coding, data visualization, and social media distribution. The Atlantic wrote recently on how journalism schools should adapt. If they don’t, budding new-era journalists can just go straight to Quartz, where they can start using its open-source charting tool.

If it continues like this, maybe we’ll soon see Zero Hedge, one of the finest purveyors of gonzo financial journalism, receive its first Pulitzer for its in-depth reporting on the next financial crisis.

Photo credit: Key Foster

We need Apple to create the iHome in order for the Internet of Things to go mass market

Every major disruptive innovation we’ve had over the last decades has started as a walled garden.

The Web started with AOL and CompuServe building their own closed-off little corners of web pages. Reuters’ open-system financial terminals never stood a chance against the sleek Bloomberg terminals. Smartphones and tablets didn’t take off until Apple took out all the unnecessary fluff and designed a sleek user experience, heavily controlled in terms of content. Social media didn’t fully take off with MySpace’s messy, do-what-you-want interface, but rather with the heavily locked-down interface of Facebook.

It seems that it’s a necessary trade-off to make in the first stages of a new technology: sacrificing functionality for usability. Then, after the initial walled garden has existed for a few years, messier options can succeed.

Once users have gotten used to the gist of the service, and a need has been established, they start venturing out for more options. Now, for example, after six years with iOS, the messier, customizable Android has become the dominant driver of smartphone innovation.

Even for the Internet as a whole, it seems people now want their own “splinternets”. The much-faster, university-only Internet2 was hyped for a long time, and now of course, in the wake of Snowden, Brazil and other countries want their own Internets.

Now, it seems we need the same thing for the Internet of Things. Products run on a number of different standards, don’t talk to each other if they’re from different manufacturers, and setting everything up can get really complicated. A lot of functionality exists, but it’s often pointless, like the washing machine discussed the other day on the O’Reilly Radar podcast that is somehow Internet-enabled but doesn’t actually do anything useful with that connection.

Further, security is an issue. As Stacey Higginbotham discussed the other week on the GigaOm Internet of Things podcast, the way it is today, someone could hack into your coffeemaker and wreak every-day havoc.

We therefore need Apple to step up to the plate and play its role of chief technology simplifier and beautifier. Don’t let Nest beat you to it (or these guys). Imagine a sleek iHome console that cuts out unnecessary functionality and delivers a seamless experience. That would restore my faith in Apple.

Japan is Apple without (Steve) Jobs


I first visited Japan in 2002. It was at the time the most futuristic country I’d ever seen. Everything was electronic, automated and smoothly running like a machine. It seemed light years ahead of the rest of the world, and set to lead the world forward.
The second time I visited, in 2013, it was still a fantastically modern country, but everything looked fairly similar to the way it did in 2002. It no longer felt like they were ahead of the rest of the world. It was as if they’d had a lost decade not only on the stock market, but also in terms of city innovation.
As Shinzo Abe now tries his best to reinvent Japan, I’m struck by all the comments I read solidifying the view that Japan is still stuck in the past. It is a very modernistic past, but it is still the past. The FT ran an article yesterday about how Japanese high-tech companies, such as Sony, still make old-school, deeply commoditized products such as electric rice cookers, and are unable to cut their losses and move on. No matter how hard activist hedge fund manager Daniel Loeb tries to make Sony split itself up, they’re stuck in their old mindset.
Abe has shot the first two of his three arrows, and the world is amazed at the boldness of the Bank of Japan, but the third arrow, that of reform, seems to be still stuck, held back by corporate Japan.
It is like Japan is Apple without Steve Jobs – once very innovative, but now just repeating the same formula, churning out a country that looks modern, but feels old.
Apple still has a chance to reinvent itself, as 2014 rolls around hopefully bringing iWatches and iTVs, but judging from the latest news, they seem content to keep making incremental changes.
Similarly, Japan now has a chance again, given that Tokyo was just awarded the 2020 Olympics. Let’s hope they take it.
Photo credit: Wikipedia

The need for a more mathematical language

Image: Pieter Bruegel the Elder, The Tower of Babel (Vienna), via Google Art Project

Our language is multiplying and taking on new forms. Every day on Urban Dictionary and Word Spy, we see new words popping up as more and more people are connected and try to communicate with each other. The other day, a review in The New York Times of Mark Halperin and John Heilemann’s new book on the 2012 US election – Double Down – was more interested in the new words that the authors had created to describe the changing political landscape than the actual contents.

This shouldn’t have to be the case. The world might be getting more complicated, but what we need in order to communicate better is a simpler language, not a more complicated one. It is clearly time to bring back a simpler approach to language.

Now, a simpler approach doesn’t necessarily have to mean a dumbed-down language. Simon Kuper wrote a great piece on the rise of “Globish” a few years ago, but if we all went in that direction, we’d probably lose more than we’d gain.

One approach that we could perhaps learn from is Ordinary Language Philosophy (OLP), a school of philosophy that was popular in the UK between the 1930s and the 1970s, and which was the topic of a recent BBC In Our Time podcast. OLP argued that a lot of the problems we encounter could be solved by using words in the meanings they have in everyday use, instead of looking at their abstract meanings. Wittgenstein, for example, argued that “we must do away with all explanation, and description alone must take its place”. A more mathematical language like this, with fewer unnecessary abstractions, might be what we need.

A similar phenomenon we’re seeing today is the rise of computer programming languages, which are becoming part of our culture. In a great article in Fast Company, Cathy Davidson argued that web literacy should be the fourth area of literacy. It was also very encouraging to see that the UK has recently added computer coding as a subject from the first year of school.

Perhaps the best way to go is to follow the example of Daniel Dennett, who prescribes the use of linguistic tools (one part of what he calls intuition pumps in his latest book) as tools for thinking. We can use his examples of linguistic tools such as “sour grapes” or “loose cannons” as a way to simplify our language, and speak both more economically and more clearly.

Perhaps what we need in the end is a combination of the limited set of words of Globish with the exactness of OLP and Dennett, leaving out any room for ambiguity.

India dreaming

In his speech at Rice University in 1962, President Kennedy said: “We choose to go to the moon. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard.” It set off a dream of space exploration that spurred on American science and technology innovation for almost 50 years.

This week, India sent off its first mission to Mars, also the first Mars mission from a developing country. Like the efforts of the US in the middle of its Cold War with the Soviet Union, this is one of the starting shots in a space and technology race between India and China. But despite the political justifications, it is likely to have hugely beneficial effects for the country.

Many commentators, in and outside of India, said that this is money that could be better spent on India’s many poor. Even if India’s space program operates on a budget that would fund just one day of NASA’s operations, it can of course sometimes be hard to justify space exploration trips, lasting many years and with uncertain payoffs, when people are starving in the here and now.

But it is likely to be money well spent. In the West, the “dreaming” budgets, for space exploration and for large-scale science fiction-style projects, are being slashed and put on the back burner. Stian Westlake argued in the FT recently that we need Elon Musk’s Hyperloop, and large, dream-like projects like it, to spur our collective imagination. With a few exceptions, such as the Hyperloop and a few space Kickstarter and Indiegogo projects, the US has stopped dreaming, which has huge ramifications for the level of science and technology being developed. I wrote earlier here about how Google is one of the few institutions that has the imperative (and resources) to push for moonshot projects.

Judith Shulevitz, in The New Republic, argued recently that the classic American liberal arts education has driven many scientific advances. For example, the tablet computer was envisioned in Star Trek, and cyberspace was first suggested by William Gibson. That is another of the “dreaming” budgets being cut, as money is spent on educating engineers who will make only incremental innovations rather than revolutionary ones. She argues that the current decline of liberal arts education means we will face decades of less innovation.

There is clearly a case to be made for these kinds of dreaming projects. I am happy to see India take the step to dream, despite all of its day-to-day issues. As the Indian economist Amartya Sen said, development is freedom. And space exploration is a huge driver of development, by letting a generation of Indians dream.

Image from NASA Earth Observatory


Life after death in the digital era

Among the Toraja people of Indonesia, the dead stay with the living. Literally. Until the family can afford the extensive and expensive burial rites, which last several days at least, the dead person stays in the house. And once the burial has taken place, the dead person still stays with their former family, in the form of a wooden effigy. It all amounts to a very close relationship between the dead and the living. In many ways, it seems quite healthy, and, one would imagine, leads to less fear of death.

Now, with the rise of social media, Westerners are creating their own effigies. We heard this week the sad, but in a way uplifting, BBC story about parents who put pictures of their stillborn children on YouTube. On Facebook, the pages of the dead are turned into living shrines, with friends writing to them long after their death. Funeral companies are developing new products, for example death-anniversary events, where family and friends gather every year on the anniversary of the death to celebrate their loved one, or tombs with digital touches.

As much as one could see these new practices as weird and unusual from our Western perspective, perhaps we’re moving towards the closer relationship with death that many other cultures have long had. Unless these new digital-estate companies manage to convince us all to have our social media presences taken down after we die, we can all live on in perpetuity on social media.

For a Western culture obsessed with the fear of dying (even if it has led to fantastic works of art), that can only be a good thing.

Image: Sangita Pujara, under Creative Commons.


Film festivals in Burkina Faso, the legacy of colonialism and why there’s no one Africa

Image: Les Pionniers de la révolution, Burkina Faso

Listening to Monocle24’s excellent Aperitivo podcast this morning, I came across a segment on a film festival in Burkina Faso, Fespaco, which is the subject of a recent documentary by filmmaker Dave Calhoun. Fespaco is the biggest film festival in Africa, but remains largely unknown in the rest of the world.

The segment was interesting from a number of angles. First, we’re always hoping that cinema will become more global, and that films made outside Hollywood and from smaller countries will have a chance to make a splash on the global market. There have been numerous false starts. At times it has seemed that Asian cinema, e.g. Taiwanese, would get its global breakthrough, and even Scandinavian cinema has had a renaissance lately on the back of strong interest in its noir sensibilities, expressed in books and TV series. But African cinema remains largely unknown. To hear that Ouagadougou every two years becomes the capital of African cinema, showing more than 100 films from across Africa, is encouraging.

Perhaps global cinema is finally having its moment? All sorts of cultural artefacts that would have been deemed too local to make it a few years ago can now grab a global audience. Just ask Psy.

Looking at the media landscape, we are indeed seeing new media outlets that provide perspectives that are not only global, but try to be local, globally, i.e. highlight small local events in less-reported countries. Ethan Zuckerman’s Global Voices is a good example.

Another very interesting point from the interview with Calhoun in the podcast is the difference in cinema between the former French colonies and the former English colonies. Apparently, the former English colonies have a much smaller cinema scene, or take after Hollywood. Nigeria, with its Nollywood, has an output that almost rivals Bollywood’s, with a lot of low-budget, straight-to-DVD action movies. The former French colonies, on the other hand, produce more arthouse films (apparently Fespaco still requires movies to be shot on 35 mm film instead of digital!), supported by French studios and French government arts funding.

It’s fascinating to see how deep the colonial influences run. Simon Kuper was recently discussing how there’s no one Africa, but very distinct Francophone and Anglophone Africas. The British colonies might be better represented in the next set of African frontier markets (e.g. Nigeria, Ghana, Kenya), but the French clearly left other important legacies. The best French food I’ve ever had was in Luang Prabang, Laos. Not to mention the baguettes in Hanoi. However, it must kill the French that the African fashion capital ended up in Lagos rather than in a Francophone city.

Photo credit: Wikipedia

David Cameron to nation: Are you an anxious aspirational or an urban struggler?


Big data politics comes to the UK.

Heard this morning on The Times’ Did You Read podcast (also reported in the Telegraph) that David Cameron has discovered segmentation. With help from Obama’s campaign manager Jim Messina and Lynton Crosby, they have divided the nation into eight tribes. The tribes include interesting groupings such as “young urbanites”, “anxious aspirationals” as well as more traditional formations such as “steady Conservatives”.

It is clear that the Tories need to do something. With 18 months to go, Labour is in the lead and Ed Miliband has lately discovered something of an identity (even if it’s a very socialist one). The question is whether trying to segment UK voters based on big data and targeting them with different messages is the right thing to do.

Everyone was already surprised by Jim Messina coming on board. And now they’ve discovered segmentation and are applying it as if we were back in the 90s. Remember when 1-to-1 marketing was all the rage and the Internet was supposed to lead to personalized messages for all? With ever more data available, that should have become a better and better strategy in the age of Google. But it didn’t work out too well. It turned out that having a distinct message for each person is neither effective nor necessary. A Blue Ocean Strategy of finding a common need and applying it across separate groups is normally much more effective.

This is especially true in a smaller media landscape such as the UK’s. Being much smaller than that of the US, it is much harder to keep the lines from crossing between the various “tribes” and to avoid communicating the wrong messages.

Further, in order to constitute a tribe, a grouping should be need-based, as Seth Godin says. These eight groupings might not be based only on demographics, like old-style segmentation, but their shared needs seem fairly superficial.

A much more interesting Blue Ocean Strategy would be to go deeper and uncover more hidden needs that could be expressed in policy promises that resonate between people who previously seemed far apart. A good example would be the US, where promises to rein in the NSA are an area where right-wing libertarians meet left-wing softies. They have very different reasons behind it (freedom versus human rights), but the need is shared.

Photo credit: Wikipedia