YouTube vs Viacom
Many are framing the battle of YouTube vs Viacom as a classic new media vs old media fight. So the recent news that Viacom tried to buy YouTube prior to suing it for $1 billion seems to suggest that this is more about business, with copyright as the proxy:
YouTube, frankly, has moved the video market forward faster than any other player, and that high risk game is ultimately good for users. If it had been left to Viacom and its ilk to move forward with Internet TV, we’d still be watching everything in Windows Media or RealPlayer format, with no progress made over the past two years. These companies didn’t innovate, and suddenly found themselves contending with a young upstart that was driving more viewers to their content than they ever could. Fearing loss of control, particularly of the distribution channel, suing seemed like the best option for a company that’s anti-innovation. That said, YouTube was also so incredibly slow to roll out its copy protection, and only delivered a deal with Audible Magic when we expected an in-house solution, that it gave these lawsuits the opportunity to bubble up.
As more and more details of the case appear online, it also gives an interesting insight into how an older, larger corporation decides to approach a young upstart:
[A] July 2006 e-mail exchange between Jason Hirschhorn and Viacom General Counsel Michael Fricklas as Viacom deliberates about buying the video portal. Fricklas replies to an Ars Technica explainer Hirschhorn sent around on YouTube and copyright: “Mostly, YouTube behaves—and why not—user-generated content appears to be what’s driving it right now. Also the difference between YouTube’s behavior and Grokster’s is staggering. while the supreme court’s language IS broad; the precedent is not THAT broad.” Hirschhorn carefully responds: “I believe that more than 60% of youtube’s traffic is from copyrighted material.” … Viacom execs, who missed out on MySpace the year before, are beyond keen to get YouTube. Not surprisingly, Google highlights the most, um, exuberant parts: MTVN CEO Judy McGrath telling M&A execs: “Help us get YouTube. We cannot see it go to Fox/NBC” and “I want to own YouTube. I think it’s critical asnd if it goes to a competitior!!!!!!!!!!!!!!!!! Even if we have to buy it with a partner to keep it below the line.” Then-Viacom CEO Tom Freston: “If we get UTube… I wanna run it.” McGrath: “You’ll have to kill me to get to it first.”
Editing, not Text but Software Development
I was struck by this article from Jeff Jarvis:
“Almost everything you see in Twitter today was invented by our users,” its creator, Jack Dorsey, said in this video (found via his investor, Fred Wilson). RT, #, @, & $ were conventions created by users that were then—sometimes reluctantly—embraced by Twitter, to Twitter’s benefit. Dorsey said it is the role of a company to edit its users.
Edit. His word. I’m ashamed that I haven’t been using it in this context, being an editor myself and writing about the need for companies to collaborate with their customers.
I have told editors at newspapers that, as aggregators and curators, they will begin to edit not just the work of their staffs but the creations of their publics. But that goes only so far; it sees the creations of that public as contributions to what we, as journalists, do. And that speaks only to media organizations. Dorsey talks about any company as editor.
What interests me is that this casts the role of Twitter as a company akin to an evolutionary agent – casting its eye over the adaptations that the mass of users generate, and putting its resources and support behind the adaptations it judges beneficial. What is difficult from an evolutionary point of view is that evolution is a blind process – it can’t see forward; it only reacts to the now. When we act as evolutionary agents, we are trying to second-guess what will and won’t work – except that the number of variables at play is so huge, our guesses are just that: guesses.
However, Twitter has an advantage here in selecting software amendments – its users are a huge laboratory, testing out ideas to see what works and what does not. That is not to say they still can’t get it wrong (witness the debate over the RT), but you can at least make an informed guess based on the mass behaviour of your users.
This is real-time software development!
The Artificial Creation of Life, A Comment
The creation of the world’s first truly artificial life-form is currently generating lots of news and comment – as you might expect:
Last night, in a dramatic announcement that led some to accuse him of playing God, Venter said the dream had come true, saying he had created an organism with manmade DNA. The feat, hailed as an epochal scientific breakthrough by some but an alarming development by others, was achieved by scientists at the J Craig Venter Institute in Rockville, Maryland using little more than a computer, some common microbes, a DNA synthesizer and four bottles of chemicals.
However, this is not the spark of new life where there was nothing; there is still a strong element of mothering and support from existing lifeforms needed to make this new organism:
One could argue that it isn’t entirely artificial yet, since the artificial DNA is being placed in a cell of natural origin…but give it time. The turnover of lipids and proteins and such in the cytoplasm in the membrane means that within 30 generations all of the organism will have been effectively replaced, anyway.
That said, this is still really important stuff – not least because it firmly places the concept of DNA as information – just as we have developed technologies to manipulate information with ease, so too will the technology of DNA manipulation accelerate. It is hard to see how it could be anything else, given what we know about information and what we know about DNA…
“In biology the basic substance of life is now understood to be made of the information code of DNA. The analogy between life as coded information and software codes running the ‘body’ of cyberspace is too close for the conjectures of an imaginary to miss. The fantasy beneath the cyborgs of cyberspace is that everything, even or especially life, has become information…Here life is just information with no fundamental, ontological distinction between it and library databases. As with all other components of the imaginary this collapse of life into information also appears to be occurring before our very eyes, driving the urgency that many feel to understand this change.” (Jordan 1999:191)
Cloud-Sharing
There seem to be more and more applications looking to move p2p file-sharing into the cloud. For example Gygan – a proprietary application that allows users to upload their own files to Gygan’s servers for storage. So far, this sounds like existing applications that offer cloud storage; but Gygan also allows users to search through what other people are storing and take copies. While there is lots of debate as to whether Gygan will be inundated with copyright lawsuits, whether copyright holders could sue infringers on the service, and even who runs it and where it is based – what interests me here is the idea: mass storage and mass sharing, facilitated by the cloud.
It is interesting because it pulls down the walls between storage and sharing. As server and storage costs come down and down, the cost of setting up and running such services will also plummet. What will also be interesting is when this idea becomes truly p2p – when the cloud is composed of small parts of each user’s bandwidth and hard drive.
Talk Notes: Technology and Evolution
Update! I’ve now passed my PhD and have a doctorate! The full title is: Network Media Distribution Systems: Understanding the Media Ecology of Software Using an Evolutionary Framework.
Below are the slides from my presentations at eComm 2009, Virt3c@Hull 2010, Pervasive Media Lab (Bristol) and OKCon 2010 with notes below each explaining a bit more… (This research also formed the basis of a peer-reviewed article in the Journal of eLearning and Digital Media & an article on GamesIndustry.biz)
The talk is about my PhD research looking at technology and evolution. I focus on p2p technology but think the findings apply on a wider scale. The seed of my research came from working on a p2p client development project. What we noticed was that at a certain point in the project’s life, our version (which used Azureus as a base) was no longer compatible with the ongoing development of the main strand of Azureus (now called Vuze). This struck me as interesting – almost like the branching of two new species that can no longer inter-breed. I thought it worth looking at biological models, specifically evolution, to help understand what is going on…

First it is worth exploring what evolution is, as there are lots of misconceptions about this. The image above shows some of the famous finches from the Galapagos Islands that inspired Darwin to make the vital conceptual link to figure out what was happening in nature. Darwin only had the end-results of evolution – living things and fossils – to look at. He could not see the process and so had to reverse-engineer evolution from his observations.

The quote below shows what he found. He observed small differences within the same species and noted that they must somehow share a common root. The famous example of his reasoning is the variation in the finches he encountered on his voyage aboard HMS Beagle. Each separate species of finch had adapted to the conditions on its island and the food available there, e.g. short beaks are good for tearing at the base of a cactus and eating the pulp and insects found there, while long beaks are good for punching holes in prickly-pear fruit and eating the flesh.

Pulling these strands together, we can see that evolution has three basic components:
- Mendelian genetics: passing on your characteristics to the next generation.
- Mutation: copying errors in the generational process. This is a means of introducing change into the next generation – reproduction methods also do this; sexual reproduction is a good example.
- Natural selection: nature selecting which individuals will survive to procreate, thus selecting the ‘fittest’ genes for the next generation.
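The three components above can be sketched as a toy simulation. Everything here – the bit-string genomes, population size, mutation rate and fitness function – is an illustrative assumption, not a claim about real biology:

```python
import random

random.seed(42)

GENOME_LEN = 20       # each individual is a string of bits
POP_SIZE = 30
MUTATION_RATE = 0.02  # chance of a copying error per bit

def fitness(genome):
    # A stand-in for 'fit to the environment': the count of 1-bits.
    return sum(genome)

def reproduce(parent):
    # Inheritance: the child is a copy of the parent...
    # Mutation: ...except for occasional copying errors (bit flips).
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in parent]

def generation(population):
    # Over-production: each individual has two offspring...
    offspring = [reproduce(p) for p in population for _ in range(2)]
    # Natural selection: ...but only the fittest survive.
    offspring.sort(key=fitness, reverse=True)
    return offspring[:POP_SIZE]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(50):
    population = generation(population)

print(max(fitness(g) for g in population))  # fitness climbs towards 20
```

Nothing in the loop 'knows' what a good genome looks like; the combination of copying, error and selection is enough to drive fitness upwards – which is the whole point of the three components.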

He then combined this idea with the fact that when living things procreate, they tend to produce more offspring than are needed to continue the species. He noted that each of the offspring has tiny differences, and that many of them will not make it to adulthood to procreate in turn. The offspring with some kind of advantage (however small) are more likely to make it, and so pass that advantage on to future generations.

Other authors have looked at applying the concept of evolution to technology. Basalla (1988) noted that the concept of the noble inventor is a myth; great inventions are built on the back of lots of existing innovations. For example, Alexander Graham Bell’s invention of the telephone was built on the progress of other inventions: electricity, the telegraph and so on.

More recently, W. Brian Arthur (2009) suggested that the dominant force in technology is ‘combinatorial evolution’ – that is, technology evolves by the combination of existing forms of technology. He notes that this does happen in biology, but is rare; in technology, however, it is a common form of evolution.

We do see this in biology, but it is rare; for example, endosymbiotic theory suggests that mitochondria (the power plants of the cell) were once separate organisms that were somehow engulfed by a cell, formed a symbiotic relationship, and the rest is history (our history!).
For example – GPS – a new technology but one that consists of a number of pre-existing forms of technology; computers, satellites etc: “A GPS system is a combination of computer processors, satellites, atomic clocks, radio transmitters and receivers. Whoever invented that did not say, “I am going to combine existing technologies”; they’re basically saying, “What is it going to take to solve a problem here—the one of finding a point’s location on the earth?” And that combination resulted. So technology evolves by combination, and once the technology’s in place, then the Darwinian mechanisms of variation and selection set in.”

However, Arthur does not seem to account for the smaller incremental improvements we see in a technology. As an example, look at the technology changes in modems:
- 1980s – Used frequency-shift keying as a method of turning digital data into sound and back again.
- Early 1990s – Based on the same idea, but added 90-degree phase variations, so adding an extra bit per cycle: phase-shift keying (1200 bps).
- 1992 – Added error-checking routines to the existing technology to combat echo problems, so modems got faster (9600 bps).
- 1995 – Added trellis modulation – where the data to be processed is placed into 2D structures in a logical manner, allowing mass error-checking – and speeds went to 28K and then 56K…
Then cable arrives and changes the environment. We can see a steady evolution of speed to meet the demands of users (selection), with each new advance building upon (retaining a legacy of) the old rather than replacing it – until the environment changes; some technologies adapt, some go extinct.
Indeed this is a co-evolutionary process as the creation of more content heavy websites pushes demand for faster connections, so content creators make use of the faster connections to push more content and so on.
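The throughput gains in that list come down to one piece of arithmetic: a modem's bit rate is its symbol (baud) rate multiplied by the bits carried per symbol, and each advance packed more bits into each symbol. A sketch – the baud rates and bit counts here are illustrative round numbers, not exact historical specifications:

```python
def bit_rate(baud, bits_per_symbol):
    """Bits per second = symbols per second x bits per symbol."""
    return baud * bits_per_symbol

# Frequency-shift keying: one bit per symbol.
print(bit_rate(300, 1))   # 300 bps
# Add 90-degree phase shifts: two bits per symbol.
print(bit_rate(600, 2))   # 1200 bps
# Denser signal constellations carry more bits per symbol,
# made reliable by error-checking and trellis modulation.
print(bit_rate(2400, 4))  # 9600 bps
```

The incremental improvements, in other words, mostly attacked the second factor – squeezing more bits into each symbol over the same phone line – rather than replacing the underlying idea.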

All of the examples discussed so far relate to the phenotype – what we can see. It was all Darwin had – but it is not all we have today. Biologists can delve into the DNA – the genotype – to discover much more about the interrelationships between species: phylogenetics.
The image below shows the phylogenetic tree of fungi:

In my research, I am looking at software – specifically p2p clients. This gives a distinct advantage over other approaches to technology because much of the development is open source, so we can peer inside the technology and see how it was built. We can start to look at its genotype.
Source code is not an exact match to DNA – it has similarities and differences – but the similarities are vital because they allow us to do what biologists have done and track generational change. So I have been looking at building proto-phylogenetic trees of p2p families.
Key ideas:
- Based on source-code relationships (akin to the genes in biology).
- Not based on the idea (the meme), though that is important too.
- Shows change over time – each generation (release) of the software is an offspring.

So I looked to develop a method to find the family trees in p2p clients. In a spreadsheet, I created an entry for each type of torrent-client software – the full method and more information is here, and a summary is below:

This graph (above) is the result of that method. It shows releases of p2p clients over time…. (bigger image):

Next we add in the code/genetic links between the p2p clients (i.e. where one software project is based on the source code of a past project).
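Those code links can be captured as simple parent→child records, and the family trees recovered by walking them. A minimal sketch of the idea – the client names and fork relationships below are illustrative placeholders, not entries from my dataset:

```python
# Each record: (child_project, parent_project), meaning the child's
# source code is based on the parent's - a yes/no 'genetic' link,
# not a link by degree of similarity.
FORKS = [
    ("ClientB", "ClientA"),
    ("ClientC", "ClientA"),
    ("ClientD", "ClientB"),
]

def build_tree(forks):
    """Group child projects under each parent project."""
    children = {}
    for child, parent in forks:
        children.setdefault(parent, []).append(child)
    return children

def print_family(children, root, depth=0):
    """Print one family tree: root first, descendants indented."""
    print("  " * depth + root)
    for child in children.get(root, []):
        print_family(children, child, depth + 1)

tree = build_tree(FORKS)
print_family(tree, "ClientA")
```

Running this prints the ClientA family with its forks indented beneath it – the same yes/no phylogenetic structure as the Azureus, BitTorrent and LibTorrent trees below, just without the release dates.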

Then, by removing the background legends of the graph, we can pull out the family trees, showing a basic (yes/no, not by degree) phylogenetic relationship. The one below is for Azureus:

And BitTorrent:

And LibTorrent:

Now if we plot the changes of new releases (new offspring) over time from the last data-set into a new graph, we also see something very interesting. Initially there is a small amount of activity. Then there is a growth of different clients and versions of clients in the middle. There is a huge rise in the population (“… many more individuals of each species are born than can possibly survive”) and finally a drop-off of activity as the best p2p clients dominate the space:
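That population curve is produced by simply counting new releases per time period. A sketch with made-up release dates standing in for the real dataset:

```python
from collections import Counter

# (client, release_year) pairs - illustrative stand-ins for the
# spreadsheet of real p2p client releases.
releases = [
    ("ClientA", 2002),
    ("ClientA", 2003), ("ClientB", 2003),
    ("ClientA", 2004), ("ClientB", 2004), ("ClientC", 2004),
    ("ClientB", 2005), ("ClientC", 2005),
    ("ClientC", 2006),
]

# New releases (new 'offspring') per year: a rise, a peak,
# then a drop-off as the fittest clients dominate.
per_year = Counter(year for _, year in releases)
for year in sorted(per_year):
    print(year, "#" * per_year[year])
```

With real data the same count gives the population curve described above: slow start, explosive middle, then contraction.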

So what does this mean? This means that the lessons of biology can (sort of) be applied:
- Killing off individual sub-species, unless you get 100% of them (and the meme too!), far from stopping the spread of the species as a whole, may actually strengthen it. e.g. see a BitTorrent client as a sub-species of the p2p (torrent) species: if you try to remove that sub-species and at least one project survives, the survivor passes the adaptations that make it fitter on to the following versions. It’s the same principle as to why antibiotics begin to fail.
- Variation is inevitable; the ease of development and branching creates more variation – which is why Open Source is so variable, and thus its species are very strong (overall) against attack.
- Prediction is almost impossible. What will be next? Given the number of ideas, platforms and technologies – who knows? Evolution is blind; it only adapts to the now. Users act as a component of ‘selection’; the inconsistencies in their choices act – for all intents and purposes – as natural selection. Clay Shirky had a great Twitter message that puts the point into perspective nicely: “Why I ignore all “5 year plans”: 5 years ago, YouTube and Twitter didn’t exist, and Facebook was only for college kids” – Twitter itself came as a result of a flexible approach to technology from a company that was doing the same sort of thing as everybody else and realised it needed to change.
- Adaptive technologies will win the evolutionary arms race. The technologies that produce new generations and variation will eventually win.
Thanks! You can follow my research via Twitter (@TomasRawlings).
The Evolution of Torrents – Going Private
As laws come in that allow the identification of p2p users, one would expect p2p users (and developers) to move towards systems that make users anonymous and/or enforce their privacy, rather than to stop using p2p. Well, that happened in France, and now it is happening in the UK…
After the UK Government passed its new and improved anti-piracy bill we decided to ask some of the people that run anonymizing BitTorrent services if they’ve seen a rise in users. One of the well established services in this niche is BTGuard, an easy to configure proxy service that hides the IP-addresses of its users from the public. Services like these are a thorn in the side of anti-piracy outfits because they make it impossible to track down infringing users. As expected, the demand for BTGuard from UK citizens has grown significantly in recent months. One of BTGuard’s founders told us that their service now has almost as many UK users as US users, and also detailed some major changes that have been implemented since we covered the launch of the service two years ago.
So I would now expect to see the integration/development of such ideas grow and grow as such laws come online…
Cost of Piracy, Cost of Copyright
It is hard to estimate the cost of the possible (if any) losses that companies suffer from piracy. The industry argument goes that each copy pirated is a lost sale; yet this seems an over-estimate, as there must be people pirating media artefacts who would not consume them if they had to purchase them. A counter-argument goes that the pirates were never going to pay, so nothing is lost – which seems an argument too far in the other direction. Clearly there is no single ‘type’ of pirate; people do it for many reasons. Some studies have found that pirates are avid consumers, whereas studies from industry lobby groups often distort the findings for political reasons.
So it is little surprise that the US Government’s internal watchdog’s study into the issue can’t arrive at a conclusion on how much piracy costs.
What is also interesting in this area is a spreadsheet put online showing the various numbers of sales/streams that a musician would need per year to make the minimum wage…
– 54,588,235 streams of Spotify.
– 148,789 MP3 sales in iTunes.
– 92,986 Commercial CD sales (low end of royalties).
– 1,720 self-pressed CDs.
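Figures like those above boil down to a single division: units needed = annual minimum wage / royalty per unit. A sketch with illustrative numbers – the wage figure and per-unit royalty rates here are my own assumptions, not the rates behind the spreadsheet:

```python
import math

MIN_WAGE_YEAR = 15_080  # e.g. US federal minimum wage x 40h/week x 52 weeks

# Assumed net income to the artist per unit sold/streamed.
# Placeholders for illustration; real rates vary by contract.
ROYALTY = {
    "Spotify stream":  0.0003,
    "iTunes MP3 sale": 0.10,
    "Label CD sale":   0.16,
    "Self-pressed CD": 8.00,
}

for unit, rate in ROYALTY.items():
    needed = math.ceil(MIN_WAGE_YEAR / rate)
    print(f"{unit}: {needed:,} per year")
```

The ordering, not the exact numbers, is the point: the smaller the per-unit royalty, the more absurd the volume required – which is why the DIY route comes out so far ahead.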
So what is clear is that the best income stream for musicians is the DIY approach – if you can sell ’em. However, this does not account for live gigging and other sales opportunities: t-shirts and the like. Still, it is interesting…
P2P Traffic Stats Show Interesting Changes in Use
I have noted before the difficulty of giving reliable measurements of p2p traffic, so it is with interest (and a pinch of salt) that we come to new figures again showing a decline in p2p traffic. This time the report is from Arbor Networks and shows a decline of over 71% between 2007 and 2009. That sounds significant. So what does it mean for users’ behaviour on the internet?
Well, first off, demand for video grew by a huge 67% – which is what I’d have expected; the proliferation of good-quality streaming sites means that users can watch the video of their choice without having to download it – so they will. (Interestingly, it also seems to be legal in the EU to stream copyrighted material.)
The second point to note is that there has been a rise in the use of remote-hosting sites (such as Megaupload) for sharing files rather than p2p – again, I suspect, for reasons of user ease and convenience.
The third point to note is that there might be methodology issues, as the graph reports traffic on ‘well known p2p ports’, and of late there has been a rise in the use of randomised ports by p2p software. While such traffic is not impossible to detect, it is more work. (I’ve not read the original paper, so this might have been accounted for…)
Hat-tip to P2P-Blog.
Here comes… Insect Media!
I am excited to read about the forthcoming (fall 2010) book from Finnish media theorist Jussi Parikka, Insect Media: An Archaeology of Animals and Technology. Here’s the info he posted on his blog about the book:
Since the early nineteenth-century, when entomologists first popularized the unique biological and behavioral characteristics of insects, technological innovators and theorists have proposed the use of insects as templates for a wide range of technologies. In Insect Media, Jussi Parikka analyzes how insect forms of social organization—swarms, hives, webs, and distributed intelligence—have been used to structure modern media technologies and the network society, providing a radical new perspective on the interconnection of biology and technology. … Parikka develops an “insect theory of media,” one that conceptualizes modern media as more than the products of individual human actors, social interests, or technological determinants. They are, rather, profoundly nonhuman phenomena that both draw on and mimic the alien life-worlds of insects.
I am really interested in the ‘insect theory of media’ – especially as my own work looks at media and evolution. I enjoyed Parikka’s previous work, Digital Contagions, which – a not unrelated idea – looks at media through the lens of viruses, both real and virtual. I am also drawn to the subject matter as I was very influenced by another book looking at human/non-human boundaries – John Gray‘s excellent Straw Dogs (2002), which declares that there is no difference between a human city and an ant city. In an ant city we see storage depots, graveyards and security posts. In a human city we see storage depots, graveyards and security posts. If one is natural, so must the other be; humanity and its creations cannot escape nature to exist on a pedestal, because they are nature.
There is a considerable amount of internal competition in biology. By this I mean that an organism can compete within itself to produce the best ‘goods’. For example, some plants will abort the growth of fruit that does not have enough seeds – a form of internal competition between possible fruits, so that only the fittest gets the resources to grow to full term. I was reading an interesting account of the internal competition at Microsoft:
Internal competition is common at great companies. It can be wisely encouraged to force ideas to compete. The problem comes when the competition becomes uncontrolled and destructive. At Microsoft, it has created a dysfunctional corporate culture in which the big established groups are allowed to prey upon emerging teams, belittle their efforts, compete unfairly against them for resources, and over time hector them out of existence. It’s not an accident that almost all the executives in charge of Microsoft’s music, e-books, phone, online, search and tablet efforts over the past decade have left.
So where is their internal competition going wrong? Power corruption: the conditions set for the survival of projects are not about being the fittest (in the biological sense of the word) for the market – they are instead mapped to an internal fitness landscape within the company. Put simply, by selecting for internal rather than external conditions, Microsoft is producing ideas that work fine within the Microsoft offices but fail in practice.