NFL Testing New Formations

March 26, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

The NFL seems to be in a test pattern. On Monday, the league announced that it will make next season’s match-up between the Buffalo Bills and the Jacksonville Jaguars available exclusively via the internet outside the teams’ home markets, rather than on national television. That was followed by an announcement that the league will suspend its local TV blackout rule for the entire 2015 season, allowing games to be shown in their local markets even if they are not sell-outs.

The league described both moves as tests, although what exactly is being tested in each case was left a bit vague.

The Bills-Jaguars game is a one-off, and a low-risk one at that. The game was set to be broadcast on the NFL’s own NFL Network, so there were no pre-existing rights deals to renegotiate, and it involves two struggling teams with little national following, in a game to be played in London and shown in the U.S. at 9:30 a.m. Eastern Time. Even if the test is a disaster, the damage will be limited.

Read More

From Apple Pay to Apple TV, Leveraging a Lack of Knowledge

March 19, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

“We are not in the business of collecting your data,” Apple senior VP Eddy Cue declared in announcing the Apple Pay mobile payment system. “When you go to a physical location and use Apple Pay, Apple doesn’t know what you bought, where you bought it, or how much you paid for it.”

The line was clearly meant as a swipe at Google and other competitors in the mobile payments space, which do collect purchase data and use it in ways that can implicate users’ privacy. But Apple’s studied indifference to the details of purchase transactions is also central to Apple’s strategy in launching Apple Pay.

When an iPhone user adds a credit card to her Apple Pay account, the card information is encrypted by the device and sent to Apple’s servers, where it is decrypted to identify the issuing bank, and then forwarded to the bank in re-encrypted form.
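
That three-hop handoff can be sketched in code. The following is a schematic illustration only: a toy XOR cipher stands in for real payment-grade cryptography, and the BIN-to-bank table, key values and function names are all invented for the example; nothing here reflects Apple’s actual protocol.

```python
import base64
import json

# Toy stand-in for encryption: XOR each byte with a single-int "key".
# Real payment provisioning uses proper asymmetric crypto; this exists only
# to make the three hops of the described flow visible.
def _toy_cipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

def device_encrypt(card_number: str, apple_key: int) -> str:
    """Step 1: the device encrypts card details before they leave the phone."""
    raw = json.dumps({"pan": card_number}).encode()
    return base64.b64encode(_toy_cipher(raw, apple_key)).decode()

def apple_identify_bank(blob: str, apple_key: int) -> tuple[str, dict]:
    """Step 2: Apple's server decrypts just enough to route to the issuer."""
    card = json.loads(_toy_cipher(base64.b64decode(blob), apple_key))
    issuers = {"411111": "ExampleBank"}  # invented BIN-to-issuer table
    return issuers.get(card["pan"][:6], "UnknownBank"), card

def forward_to_bank(card: dict, bank_key: int) -> str:
    """Step 3: re-encrypt under the issuing bank's key and forward."""
    return base64.b64encode(_toy_cipher(json.dumps(card).encode(), bank_key)).decode()

blob = device_encrypt("4111111111111111", apple_key=42)
issuer, card = apple_identify_bank(blob, apple_key=42)
bank_blob = forward_to_bank(card, bank_key=7)
print(issuer)  # prints "ExampleBank"
```

Note what the sketch makes visible: the middle hop touches the card only long enough to route it onward, and purchase details never enter the flow at all, which is the point Cue was selling.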

Continue Reading

Meerkat and Dawn of Sender-Side VOD

March 17, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

There are plenty of live-streaming platforms out there for anyone who wants to set up their own broadcast on the cheap. But few have caught on as quickly or generated as much buzz as Meerkat, the barely month-old streaming app that rides atop Twitter.

Or at least it did until Friday, when Twitter abruptly cut off Meerkat’s ability to easily access users’ lists of followers to automatically alert them when a new “Meerkast” is in progress. The move was neither unprecedented for Twitter, which has never been overly developer-friendly, nor particularly surprising, insofar as Twitter announced its acquisition of Periscope, a competing live-streaming app, reportedly for $100 million, on the very day it shut the door on Meerkat.

So much for platform neutrality.

It’s not hard to see why Twitter would want to reserve the opportunity represented by Meerkat for itself, however. It has the potential to become a very powerful platform in its own right.

Live video streaming is not a new technology. But the Meerkat app got a lot of things about it right. The app is launched and streams are initiated from a smartphone (so far iOS-only, though an Android version is in the works) and, like Snapchat photos, the streams are ephemeral. There is no pausing, rewinding or sharing during a Meerkast (although the originator can download a video of the stream).

Continue Reading

FCC Unloads, Releases 313-page Report and Order on Net Neutrality

March 12, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

The full text of the FCC’s open internet order has now been released, along with 305 additional pages of exegetical elaboration and 79 pages of formal dissents from the two Republican commissioners.

From an OTT perspective, there isn’t much in the full text that wasn’t already known from what the FCC released last month when it voted to approve the rules: The order’s “bright-line” rules against blocking, throttling and paid prioritization do not apply to commercial interconnection arrangements. However, the FCC will consider complaints regarding those arrangements and will take (unspecified) enforcement action if an ISP’s behavior is determined to violate the order’s “general conduct standard,” prohibiting actions that “unreasonably” interfere with or damage consumers or edge providers.

Continue Reading

HBO Leaves in the Middle Man

March 10, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

HBO just can’t quit the bundle. With HBO Now, its new over-the-top streaming service, the network for the first time is making its content available to stream without a pay-TV subscription. But HBO still hopes to sell it as part of a bundle. The only differences are the other components of the bundle and the identity of the bundlers.

At launch, HBO Now will be sold exclusively by Apple and available on Apple devices only. According to HBO’s FAQ, “you can subscribe to HBO NOW℠ using your iTunes account. Customers can access HBO NOW℠ by going to, through AppleTV® or by downloading the HBO NOW℠ app in the Apple App Store®.” Apple and HBO will then share customer support duties.

After a three-month Apple exclusive, HBO will make the service available to other digital distributors, such as Amazon and Roku, presumably on terms similar to Apple’s, with the distributor doing most of the heavy sales lifting. But the network is also very much hoping to persuade its current cable-operator affiliates to bundle HBO Now with their broadband-only offering, so far with little success.

Continue Reading

Supply-Side Content Discovery

March 6, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

Nearly a decade after Netflix went over-the-top, at least a full decade after the launch of YouTube, and more than two decades since Bruce Springsteen first sang of having “57 channels and nothin’ on,” the video industry, which we used to call the TV industry, is still wrestling with the problem of content discovery.

If anything, the problem is getting worse, not better, as the volume of programming and the number of program sources are both growing rapidly thanks to the new digital platforms.

Heroic efforts have been made over the years to tame the flood, using search technology, algorithmic recommendation engines and various other big-data strategies.

Rovi’s Fan TV, for instance, which the company acquired late last year and reintroduced at CES in January, uses voice-activated semantic search and leverages Rovi’s vast trove of video metadata to generate recommendations or locate specific titles in response to natural-language queries.

Continue Reading

Finger-pointing Over Interconnection

March 3, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

When a consumer’s OTT video stream starts rebuffering, or suffers packet losses resulting in degraded quality, it’s often hard to know where to direct blame. The problem is typically caused by congestion somewhere between the content’s originating server and the consumer’s receiving device.

But exactly where in the chain of transit that congestion is occurring, and more importantly who is responsible and why, can be difficult even for engineers — and virtually impossible for consumers — to ascertain.

Back when it appeared the FCC was poised to classify interconnection arrangements between last-mile ISPs and third-party transit and content providers as a new, distinct type of Title II service, the question of liability for congestion in the chain of transit suddenly became urgent for those involved in wholesale traffic exchanges.

Fearing the new classification would leave them at a disadvantage in negotiating interconnection agreements with content delivery networks (CDNs) and other transit providers and worried they’d be blamed for problems occurring elsewhere in the transit chain, ISPs rushed to the FCC to insist that any new rules regarding traffic exchanges cover both parties to the exchange.

Continue Reading

For OTT Providers, ‘Strong’ Net Neutrality May Be Losing Its Strength

February 25, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

Don’t look now, OTT fans, but the net neutrality rules expected to be enacted Thursday by the FCC may turn out to be less OTT-friendly than they originally appeared.

When FCC chairman Tom Wheeler unveiled his “fact sheet” on the upcoming rules on Feb. 4, it looked as if the commission was poised to adopt the “strong” version of net neutrality pushed by Netflix and others. According to the fact sheet, the rules would treat interconnection arrangements between ISPs and third-party edge providers as a Title II service subject to the same “just and reasonable” standard that will apply to ISPs’ management of their last-mile networks.

Since then, however, as noted in a previous post here, even some net neutrality advocates have raised questions about the legal and statutory grounds for extending Title II to interconnection arrangements. In a letter to the commission dated Feb. 11, Free Press policy director Matthew Wood warned that the interconnection arrangements were unlikely to qualify as Title II services as defined by the Communications Act, creating an opening for a legal challenge to the new rules.

Democratic FCC commissioner Mignon Clyburn is reportedly also having doubts about applying Title II to interconnection. According to a report Tuesday by the Capitol Hill newspaper The Hill, Clyburn is seeking eleventh-hour changes to the proposed rules, including dropping plans to classify interconnection as a distinct Title II service.

Continue Reading

The Next OTT Battleground: Zero-Rating

February 23, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

The FCC this week is expected to approve on a party-line vote chairman Tom Wheeler’s long-gestating plan to impose new net neutrality rules by reclassifying internet access as a telecommunications service under Title II of the Communications Act, setting in motion a process by which the world will finally get to see the full text of the 308-page Memorandum and Order and begin fighting — almost certainly in court — over its particulars.

One thing that apparently will not be in the order, however, is any bright-line rule banning so-called “zero-rated” data plans offered by wireless operators and ISPs under which particular applications are not counted toward a user’s monthly data cap.

“We do not take a position on zero-rating,” the FCC’s special counsel for external affairs Gigi Sohn confirmed last week on the C-Span program The Communicators. Instead, she said, the agency would review complaints about zero-rated services on a “case-by-case basis” to determine whether they harmed consumers.

That has many OTT providers, start-ups and VCs worried that wireless carriers and ISPs will rush to embrace zero-rated data plans, producing the same sort of anti-competitive and market-distorting effects as paid prioritization, which the new rules do explicitly ban.

Continue Reading

Confessions of an Academy Screener Sneak

February 20, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Martin Porter

Dear Academy:

I don’t know who else to write to… so considering that this Sunday is your big day of the year and ultimately your show is the marketing force behind my story… you’re it. I have a confession to make because I have sinned.

You better than anyone know that it is screener season and we all know what that means. There are discs of all those great movies everyone has been meaning to see circulating at parties and among friends, creating a virtual industry underground among those who should know better but simply can’t resist watching one of your Academy Awards contenders for free.

My recent failing involved the Sony Pictures Classics picture “Whiplash,” which appealed to my childhood obsession with jazz drummer Buddy Rich. It was also one of the many movies still on my pre-Academy Awards broadcast must-see list. I actually paid to view the movie in my hotel room during a recent vacation, but the viewing was cut short when my car service to the airport arrived too soon. I never saw it to the end, and I was obsessed with seeing it through (take note, UltraViolet: there’s an opportunity here).

Unfortunately, despite a tease on VUDU that it was coming soon, the movie was nowhere to be found legally on the web. (The fact that I never considered checking out Fandango to see it in the theater is as much a reflection of my travel schedule as it is the state of theatrical affairs). I’m at least ethical enough to steer clear of the bootleg sites.

But then, by happenstance, the screener surfaced during one of those all-too-common industry chats that were taking place over the past few months among those somehow connected (albeit by 6 degrees) to an Academy-voting member.

Continue Reading

Net Neutrality Disconnection?

February 18, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

As ISPs, both large and small, gear up to sue the FCC over its forthcoming net neutrality order, even strong supporters of net neutrality have begun pointing to potential legal problems with the proposal outlined by FCC chairman Tom Wheeler earlier this month. One of the biggest potential problems, as far as OTT providers are concerned, was flagged by Free Press policy director Matthew Wood.

As described in the fact sheet distributed by the FCC, the order will treat the “service” ISPs provide to OTT services and other edge providers as a Title II service, just as it does the internet access services ISPs provide to subscribers, giving the commission the authority to review interconnection agreements between OTT services and ISPs and potentially declare them not to be “just and reasonable” as required by Title II:

Continue Reading

The Daily Show With Jon Stewart’s Legacy

February 12, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

It’s hard to remember now, but Stewart took over anchoring duties at the Daily Show 16 years ago, six years before YouTube launched. Yet they seemed made for each other. The show’s easily chunkable format was ideal for the atomized milieu of YouTube, especially in the early days when YouTube uploads were tightly restricted by length, and the website quickly became the Daily Show’s second time slot — for better or worse.

Even today, after an epic legal battle between Comedy Central’s parent company, Viacom, and YouTube, the online platform remains a critical outlet for the Daily Show. As Peter Kafka noted on Re/Code, the Daily Show draws about a million viewers in its initial airing. But millions more see it on YouTube the next day on their laptops and smartphones, or at least the bits their friends alert them to via Twitter, Facebook and other social media channels.

Viacom’s nearly decade-long litigation against YouTube for copyright infringement, in fact, was in large measure about the Daily Show, along with the Colbert Report, South Park and a few other properties. It was the unchecked, unauthorized uploading of clips from the Daily Show and the Colbert Report, as much as anything else, that spurred Viacom to launch its $1 billion lawsuit against YouTube (and its parent, Google) in 2007.

Continue Reading

Music Streaming Business Gets Funky

February 9, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

Music subscription service Spotify last week hired Goldman Sachs to help it raise around $500 million at a valuation in the neighborhood of $7 billion. Private market analysts currently value the company at around $6 billion.

The new fundraising round likely pushes back any plans the company had for an IPO, no doubt disappointing some investors. But it buys the company some time before it has to focus on IPO prep as it gets ready to face its first real competition. According to a report by the usually well-sourced 9to5Mac, Apple is gearing up to relaunch a Beats-branded music streaming service this summer.

Rather than simply dropping a Beats app onto Apple devices, the report says Apple has been working on a deep integration of Beats technology and functionality into iOS, iTunes and Apple TV.

Continue Reading

Different OTT Strokes for Different Folks

January 30, 2015 · Posted in Featured Blog, M&E Daily · Comments Off 

By Paul Sweeting

We’re just at the dawn of the virtual MVPD era and we’re already seeing signs of more market segmentation and product differentiation than with the current, facilities-based service provider model.

On the heels of Dish’s breakthrough launch this week of its Sling TV service, Sony has begun to pull the curtain back a bit on its own virtual pay-TV service, PlayStation Vue, which is expected to launch by the end of the first quarter. GigaOM’s Janko Roettgers got a sneak peek courtesy of a beta tester, including some screen shots of the UI, and it’s clear the Sony service is a very different animal from Sling TV.

Unlike Sling TV’s low-priced, slimmed-down bundle of a dozen channels built around ESPN, PlayStation Vue includes a nearly full load of broadcast and pay-TV networks — over 70 according to the list provided to GigaOM — along with catch-up VOD and cloud-based DVR functionality, and is likely to cost $60 to $80 a month — roughly the same as traditional cable or satellite service.

The difference in the bundles reflects the very different audience segments Dish and Sony are targeting as well as their different strategic goals. Sling TV is targeted at the 10 million or so U.S. households, many of them counted among the Millennials, who currently have broadband service but do not subscribe to pay-TV.

Continue Reading

Media Big Data: Getting to Value

November 25, 2014 · Posted in Featured Blog, M&E Daily · Comments Off 

By Don Terry

Big Data…Hadoop…Data Lakes? Everywhere you turn there is industry buzz about the value of “Big Data” and the potential of this exciting new technology.

Big Data may indeed be a buzzword, but if so it’s a buzzword that can have a measurable and incredible impact on a company’s top and bottom lines.

At its core, the concept of Big Data is that of supporting executive decision-making with the most accurate, current, comprehensive and comprehensible presentation of all information available regarding a business. Unstructured data is doubling every year, per IDC, driven by mobile devices, gaming consoles, social media, the Internet of Things, second screen and digital “non-linear” television viewing. But why does this matter? The promise was that Big Data was going to cure cancer, make our lives easier and change our lives forever.

Read More

The Final Barrier to Media Workflows in the Cloud

October 9, 2014 · Posted in Featured Blog, Featured Blogger · Comments Off 

By Chuck Parker

It seems everywhere you look these days there is something about “the Cloud” in front of you. Twitter, LinkedIn, the tech press, and seemingly every press release you read has the various players in the Media and Entertainment industry describing what they can do for you in the cloud.

The promise of the cloud is BIG. At its most basic level, there is the opportunity for a company to turn its fixed CAPEX investments into variable (or “burstable”) OPEX spend when required. For smaller companies and companies without legacy infrastructure, this is potentially the best way forward, so that their costs are directly tied to their revenue stream, whether those requirements are storage, transcoding or rendering for post-production and visual effects workflows.

For larger and more established companies, it is the opportunity to exceed current infrastructure capacity to take on that surprise project. It is also the opportunity to set an investment level threshold where companies build to the “trough” rather than the “peak” for the inherent variability in the media industry business season.

But this isn’t a new promise in the IT world. Back in the early 2000s, this promise was held out to the largest companies in the guise of “outsourcing”. What’s changed now? First, CEOs and CFOs in the M&E industry are well educated now about “cloud” and understand enough to know that their businesses should be at least experimenting with workflows in the cloud.

Additionally, the “burstable” nature of the cloud means that businesses can actually “test drive” new capabilities in their workflows without significant investment or risk to their business. These two structural changes have resulted in a proliferation of “back office” workflows in the cloud across industries. SaaS (the ultimate cloud approach, where the application and infrastructure are “by the drink”) has been a driving force here, allowing companies to put their expense systems, HR systems, and even sales and CRM systems into the cloud, with great success for the companies deploying them.

But putting production systems into the cloud has been elusive. These applications are both complex and customized to the point where SaaS is not really an option. Even when two companies are using the same rendering application for their workflows, they are often managing their compute and storage in entirely different ways. So the industry has coined a new term to attempt to educate the CxO suite on how to approach this landscape – IaaS – or “Infrastructure as a Service.”

At its most basic level, when you retain control of the application but lean on the cloud for storage or compute resources, the term IaaS describes your approach to leveraging the cloud. But while this approach has better economics and risk models than the “outsourced” approach of 10 years ago, it isn’t exploding at the rate you would expect given its promise of “on demand” and “less investment.”

So what is holding the M&E industry back from investing in the cloud for its primary workflows?

Two things: security and connectivity.

Security. While every major cloud provider goes to some length to describe the security protocols protecting customers’ data on its servers, we still hear horror stories every day about large companies being hacked for their valuable resources (Target and Home Depot are the most recent infamous incidents). In our specialized industry, all of us know that a single breach of pre-release materials can be the death of a company, and no amount of promised encryption, whether from established or emerging cloud platforms, can alleviate those fears.

Further, if the project you are working on isn’t your IP, you are likely already bound by contract to use certain security measures that preclude using “public” cloud infrastructure for your workflows. The ability to audit security processes and posture is important to trusting partners in the service chain and remains a requirement for the most important content workflows.

Connectivity. The challenge of delivering the promise of “burstable,” “on demand” storage and compute power to these resource-intensive applications comes down to the internet’s age-old axiom: sustainable bandwidth. While there are plenty of companies that can drop a multi-gig connection to a cloud provider, few have the expertise to connect your application to the cloud resources it needs and to integrate with your existing network and workflow, taking account of aspects like low-latency requirements.

Even then, just finding a “large pipe” for your data doesn’t complete the business model—if you cannot get your bandwidth to be as “burstable” as your storage and compute power, the investment model for cloud falls apart.
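
To make the arithmetic concrete, here is a back-of-the-envelope sketch in which every figure is an invented assumption, not real vendor pricing: if the network link is a flat monthly fee while compute is billed by the hour, a short rendering burst is dominated by the cost of the idle pipe.

```python
# All numbers below are illustrative assumptions, not actual pricing.
fixed_link_monthly = 8000.0   # dedicated multi-gig connection, flat monthly fee
burst_link_hourly = 40.0      # hypothetical usage-priced link, billed per hour
burst_compute_hourly = 25.0   # cloud compute, paid only while rendering

def project_cost(render_hours: float) -> tuple[float, float]:
    """Cost of one rendering burst with a fixed link vs a burstable link."""
    compute = render_hours * burst_compute_hourly
    with_fixed_link = compute + fixed_link_monthly            # pay for the pipe all month
    with_burst_link = compute + render_hours * burst_link_hourly
    return with_fixed_link, with_burst_link

# A short 60-hour burst: the flat-rate pipe dominates the bill.
fixed, burst = project_cost(60)
print(fixed, burst)  # 9500.0 3900.0
```

Under these assumed numbers a 60-hour burst costs $9,500 with the flat-rate link but $3,900 when the link is billed by the hour too, which is the sense in which non-burstable bandwidth makes the cloud investment model fall apart.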

At Sohonet, we believe the key to unlocking the M&E industry’s “cloud potential” is the ability to offer studios, post production houses and visual effects companies options for getting their applications connected to public and private
cloud infrastructure in a manner that meets their low latency and security requirements while still meeting the “on demand” business model to support their ability to “burst” into the cloud for production.

We believe that the M&E industry will embrace a mix of three approaches to meet its production workflow and business model requirements. Inherent to all three approaches are unlimited or inexpensive egress (essential for the unpredictable production process), an improved security posture, and (most critically) access to 24/7 support resources that understand the industry’s unique workflow requirements.

  • Low-cost access to generic compute and storage resources (public cloud) coupled with sustained low-latency bandwidth that includes unlimited egress.
  • Access to application-specific low-latency and/or industry standard security approaches (private cloud) for storage and compute resources coupled with sustained low-latency bandwidth that includes unlimited egress.
  • High-speed, burstable connectivity to major cloud providers where the support for the application and security are already “in-house” and the only missing component is the “burstable” bandwidth directly into their resource center that provides the lowest possible latency and inexpensive egress while still improving the security posture.

We believe that access to cloud resources is critical to the industry’s progression along ever-increasing data storage and compute requirements, as 4K workflows begin their progression to 8K and High Dynamic Range workflows and as 4K consumption becomes mainstream in consumer homes. As the trusted communications partner for the M&E industry, Sohonet is committed to providing the same Fast, Flexible and Phenomenal customer service that has built our brand and reputation over the past 15 years. We believe that delivering on the promise of “Connected Cloud Services” is critical to our customers’ future.

For more information please visit or contact him directly at


It’s All About The Showrunners

October 2, 2014 · Posted in Featured Blog, M&E Daily · Comments Off 

By Alan Wolk

For all the debate around who should be in charge of second screen and social TV efforts, one thing is becoming very clear: the key to success rests with the showrunners.

That’s because when the showrunner is involved, along with the actors and the writing staff, the second screen experience seems like an actual part of the show, not some sort of bolted-on afterthought. In fact, a recent study from Twitter, Fox and the Advertising Research Foundation revealed that 40% of viewers preferred to see tweets from cast members versus 18% who wanted to see tweets from the official show handle.

This stands to reason on many levels: the type of viewer who is fan enough to want to tweet about a show is the type of viewer who’s likely formed some sort of connection with the actors and wants to read their tweets…

Continue Reading

SiriusXM Hits Fast Lane Using Integrated Marketing & Analytics

September 8, 2014 · Posted in Colleen Quinn's Blog · Comments Off 

As my kids get older, I’ve had to cede control of the car radio. One result? Thanks to the forced waning of my NPR habit, I’m much less interesting at cocktail parties than I care to be. Another consequence? Trying to smoothly navigate radio programming that doesn’t meet my parental, shall we say, scrutiny – nor assuages the pop-cultural tastes of the wannabe teenager.

Satellite radio – long dominated by SiriusXM Radio – brought peace of mind to parents everywhere by offering up more programming options in the car than we ever thought possible. But – like the living room before it – the car is becoming the latest field on which today’s digital media game plays out.

New streaming music providers offered multi-platform music experiences that were highly personalized, mobile, and which threatened to make in-car satellite services too niche. (Register here for the Sept. 23 webinar to learn how SiriusXM navigated new channel demands with analytics and explore how SiriusXM is leveraging customer data integration, behavioral analytics, and real-time interaction to build deeper relationships with their subscribers.)

With more than 25 million subscribers, SiriusXM didn’t seem to be in a bad position. Still, with increasing competition from streaming music entrants like Pandora and Spotify, SiriusXM Radio needed a strategy to future-proof its business and meet subscribers’ demands for content anywhere, at any time.

Like many companies that don’t fully realize the treasure that is customer and marketing data, SiriusXM had outsourced its marketing technology. When the time came to quickly respond to the shifting music market, a key first step was to bring its most valuable asset back in house. This step marked the first in a journey for SiriusXM Radio to develop and deploy next-generation marketing analytics that would allow it to reinvent its relationships with subscribers and prospects, and take ownership of its data.

Across media & entertainment and digital media, the expectations of consumers continue to shift. More than ever, audiences expect deeply personal messages on their platform of choice, at the right time. Yet, many companies don’t leverage their most valuable asset – detailed insight into audience behavior. And, in some cases, that data is left to third parties to manage.

If your business is ready to take control of marketing data to drive a more personalized, engaging customer experience, here’s your chance to learn from the top. SiriusXM Chief Information Officer, Bill Pratt, is sharing his experience about his company’s analytics journey in a live, one-hour webinar on September 23rd.

Sports and Second Screen – The Winning Combination

April 21, 2014 · Posted in Featured Blog · Comments Off 

Sports are big for TV. To be convinced of this, just look at the amount BT paid to broadcast the football games of the European Champions League: £900m ($1.5bn/€1.1bn). When BT won the football rights, BSkyB, the losing bidder, saw its share price drop 11%[1], and £1.3bn ($2.1bn/€1.6bn) was wiped off its market capitalisation in one day.

Clearly, the loss of the football rights was seen as a major risk to its future profitability[2]. According to the Telegraph, BSkyB even lobbied Champions League officials for three days to reopen the bid after it was excluded from the auctions.

Sports are big for TV because they can draw a huge number of fans who are ready to pay for content. Together with Hollywood blockbusters, sports form the basis of pay TV. Unlike films, sports are also very big for the second screen. Nielsen compiled the statistics below[3], which show that sports are the main driver for tweets about TV shows. Forget the “X-Factor” and the US presidential race; this is all about the Super Bowl: 50% of tweets about TV are about sports. Why do sports events drive so much social TV activity? It comes down to the nature of fans and of sports events themselves. Sports fans are highly engaged and very emotional about their teams and players. Sports events are broadcast live and drive immediate reactions.

Dell Helps Studios Maximize Compute Power and Minimize Energy Costs

April 18, 2014 · Posted in Featured Blog · Comments Off 

In this M&E Journal Digital Exclusive, Rakesh Nair of Dell discusses how Dell is powering the compute-intensive needs of the Media & Entertainment industry with its products and technology for its creative partners. For example, the 45-second tracking shot of Paris that kicks off the Academy Award-winning animated film “Hugo” entailed some serious animation rendering.

Pixomondo, the visual effects company tasked with rendering the film, needed high-powered computing to support this complicated composite, which for just that 45-second tracking shot, cost tens of thousands of dollars in power alone. It is not hard to imagine then that things like hardware performance, technology footprint and heating and power efficiency play a major role in the ability of Pixomondo and similar companies to turn a profit.

Click here to continue reading or click here to view and download a PDF version of the entire article.

Q1 2014 Quarterly Second Screen Update: Impacting the 1st Screen

April 15, 2014 · Posted in Chuck Parker's Blog · Comments Off 
2nd Screen has continued to reveal evidence of progress in both monetization and engagement.  In addition to our focused research on monetization in Q1, we have completed a 30-page research report on Sports on behalf of our society members to help them and their primary stakeholders (investors, customers, management) cut through the hype and the disillusionment and focus on clear examples of what is working…

Second Screen by the Numbers, Q1 2014 – more growth, more engagement, more monetization

April 10, 2014 · Posted in Chuck Parker's Blog · Comments Off 

2nd Screen Viewing Experiences: 73% of TV Everywhere views are on a 2nd Screen.  ReelSeo.  Feb 6th. 89% of video views on the BBC’s iPlayer are VOD vs. Live. Click link to view Infographic.


Facebook Didn’t Need To Go Sci Fi To Own Hollywood

April 3, 2014 · Posted in Colleen Quinn's Blog, Featured Blogger · Comments Off 

If Facebook’s new acquisition – Oculus Rift – sounds like something out of a science fiction movie, your gut isn’t far off. The virtual reality headset maker – snatched up for an astonishing $2 BILLION in a surprise move – is the ultimate in geek chic. The device – which can create rich, immersive virtual reality experiences for gamers and beyond – already has a devoted if pocket-protector-wearing fan base.

But what does it mean for Facebook?  Aside from broad-brush comparisons to Apple and Google – companies which squarely straddle the software and hardware divide – there’s a buzz in Hollywood that this puts Facebook squarely in the movie business. Huh?  Some media analysts say that we should think of Oculus this way: it’s just another screen.

While that may be true in the long view, I’d argue that Facebook was ALREADY in the movie business, even without trying to out-Google Google Glass.  And the current Facebook movie business doesn’t demand another screen.  Facebook – and its cousins Twitter and Pinterest – are the collective mouthpiece for audiences to share what they think about anything Hollywood puts on any screen. I’m talking about what you or I or our mothers (Yes, it’s true. Your mother!) have to say about movies and television on social media.

Those comments or “likes” are untapped gold in Hollywood. They reveal your audience’s interests in ways that box-office numbers still can’t. With the right analytics tools and insight, you can mine this data to learn EXACTLY what members of your audience think. What they think about your movie. Your talent. Your Second Screen App. Your recommendations. Your levels of customer service, if you’re in the subscriber business.  I’d argue Facebook, Twitter and their social media cousins have more to teach Hollywood than Hollywood has to teach them. The trick is: is Hollywood ready and able to listen?

The ability to analyze social media and behavioral data – and, most importantly, to loop it back into bigger marketing and planning operations – is essential to making and monetizing content in the new M&E Ecosystem.  But the fact is, few studios and distributors are doing this well.  Read this article to get a better sense of how social media analytics need to play into your content and analytics strategy.

Speaking of Hollywood – the industry converges in a matter of days at NAB.  Check back here for your NAB wrap-up, insights and ideas.  Until then, Sci-Fi friends and believers, may the force be with you.

The Oscars, Roku and the BBC – Revealing the future of TV

March 8, 2014 · Posted in 2nd Screen Blog, Chuck Parker's Blog · Comments Off 

It’s been another fast-paced week in the digital video and second screen industries.  While the OTT video world is still reeling from the previous week’s announced Disney Movies Anywhere service (a serious threat to UltraViolet) and Marvel’s announcement of an exclusive output deal with Netflix (continuing to threaten HBO), second screen took a shot in the arm from the Oscars, and Roku mounted an attack on Chromecast.  At a glance:

  • “Watch ABC” did Second Screen for the Oscars “right”
  • Ellen broke Twitter
  • Roku announced its “streaming stick” device
  • Dish struck a deal with Disney to delay commercial skipping
  • FreeWheel was acquired by Comcast
  • The BBC announced the death of analog for its Channel 3 service
  • Aereo lost its court battles in Salt Lake City and Denver
Despite January’s negative press about second screen and social TV, the Oscars continued to propel both industries forward (as they have every year), with a shining example of how second screen apps should be built AND the most engaging social activity for a first screen experience ever.  ABC helped to solidify the “convergence trend” in second screen – viewing and companion experiences coming together – by killing off the separate app that was required in 2012 and 2013 and integrating the enhanced viewing experience into its “Watch ABC” viewing app.  Upon opening the Watch ABC app on Sunday, you were presented with an option to go “Backstage” (sponsored by Samsung Galaxy), where a number of companion experiences were available, from the ability to share on Facebook and Twitter, to photos and video clips of the arriving stars, to entire live camera feeds of different locations along the red carpet.  And yes, of course, the ability just to watch the broadcast feed itself (which unfortunately broke just as the awards show got underway).
Ellen took the social side of second screen to new heights by live tweeting throughout the show, creating a new world record with the now famous selfie of stars centered around Meryl Streep (taken by Bradley Cooper).  If anyone is still questioning the power of social media and its ability to engage TV audiences in real time and in the days that follow, they only need to check out the retweet traffic from that event–which gained momentum on Monday as those who missed the live showing were suddenly engaged from their social networks.  More than 3 million retweets in the first 24 hours.  And so much traffic in the first few minutes after the selfie that Twitter itself “broke” for a few minutes as its API servers buckled under the deluge of real-time SocialTV.
Roku took a page from Chromecast and the DIAL protocol and announced its next-generation device (another HDMI stick), but addressed some of the shortcomings of Chromecast by adding a remote and making all of its services available in an interface on the main screen.  For second screen, this is further evidence that Control and Discovery are not only inextricably linked to the 2nd Screen itself, but that the DIAL protocol is the most likely candidate for driving both adoption (through the ease of use it can present via Control) and the ease of development integration it can offer Enhanced second screen experience providers from their native viewing apps.  Stick with me here for a second: the announced device claims it will “cast” Netflix and YouTube out of the box.  Keep in mind that, similar to Chromecast, those hooks are “built in” to Netflix and YouTube via DIAL.  So the other apps will quickly follow (as they did with Chromecast), creating another viewing ecosystem that allows the fast adaptation of converged second screen companion experiences.  Watch this space closely and expect great user experiences from the apps consumers already use every day (my guess is MLB and ESPN will be first on the scene with something dazzling–only because it’s too late for March Madness to harness it this year).
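Since DIAL keeps coming up, it’s worth seeing how thin the protocol actually is.  Here is a rough Python sketch of the discovery half, based on the published DIAL spec (an SSDP multicast search, followed by per-app REST resources); the helper names and the sample Application-URL in the usage example are ours for illustration, not from any vendor SDK.

```python
import socket

# DIAL discovery rides on SSDP: the second-screen app multicasts an
# M-SEARCH for the DIAL service type, and first-screen devices reply
# with a LOCATION header pointing at their device description.  The
# REST calls that follow (GET to check an app's state, POST to launch
# it) go to the Application-URL returned alongside that description.

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
DIAL_ST = "urn:dial-multiscreen-org:service:dial:1"

def build_msearch(mx: int = 2) -> bytes:
    """Build the SSDP M-SEARCH datagram that asks DIAL devices to respond."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",            # devices reply within MX seconds
        f"ST: {DIAL_ST}",       # search target: DIAL devices only
        "", "",
    ]
    return "\r\n".join(lines).encode("utf-8")

def app_url(application_url: str, app_name: str) -> str:
    """Per-app DIAL resource: GET it for state, POST to it to launch."""
    return application_url.rstrip("/") + "/" + app_name

def discover(timeout: float = 3.0) -> list[str]:
    """Multicast the search and collect raw responses (needs a live network)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch(), (SSDP_ADDR, SSDP_PORT))
    responses = []
    try:
        while True:
            data, _ = sock.recvfrom(4096)
            responses.append(data.decode("utf-8", "replace"))
    except socket.timeout:
        pass
    return responses

# Hypothetical usage: once a device has answered and handed back its
# Application-URL, launching the registered "YouTube" app is one POST to
# app_url("http://192.168.1.10:8060/dial/", "YouTube").
```

This is why “those hooks are built in”: the TV-side app just has to answer GET/POST on its registered name, and any second-screen app that speaks this handful of HTTP verbs can discover and launch it.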
In a very interesting move, Dish struck a deal with Disney, agreeing to “delay” the commercial skipping capability until after Nielsen C3 ratings expire on their shows, in exchange for the rights to stream ABC channels direct to consumers (not TV Everywhere, but Video Anywhere, to quote Ran Harnevo from AOL).  As Dish is by far the industry leader among Pay TV operators when it comes to second screen (led by Jimshade Chaudhari), I would expect a number of converged experiences to be delivered directly out of their flagship DISH Explorer app as soon as they are able to access those streams.
FreeWheel was acquired by Comcast this week.  If you don’t know them, they are an ad decision server that has been helping a large majority of Tier-1 TV networks dynamically deliver ads to their streamed online video experiences (which means highly targeted ads resulting in higher CPMs).  Comcast already owns thePlatform, and the combined capabilities could be a powerful force in the TV Everywhere and ad-supported video markets.  This is a big deal for second screen because monetization largely hinges on the app publisher’s ability to deliver lucrative in-line video ads to the consumer during converged viewing/companion experiences.  Worried about the potential powerhouse Comcast already is?  Not to worry, as Google’s DFP Premium product continues to gain momentum in the industry, combining the ability to manage companion ads (display) and in-line video ads in the same campaign tool.
The BBC announced the end of analog broadcast for Channel 3, but more importantly the birth of its online linear channel experience.  Not dissimilar to Disney’s efforts in Germany, the BBC plans to continue offering the programming in a linear IP feed with 7-day catch-up VOD in the iPlayer experience.  Now this is truly the future of TV happening right in front of us.  Some day, major brands will launch “virtual” channels around holidays and events that offer attractive programming both as a lean-back experience (as we know and love our TV) and as a more selective VOD experience.  Did I lose you?  Imagine that during the Christmas holiday season, Nickelodeon or Amazon launches a linear IP channel (a new title in its app) that streams Christmas movies aimed at certain ages or themes from the middle of November until the 2nd of January, complete with those same titles being available individually for VOD through some appropriately designed UI alongside the linear channel itself.  Or that FIFA creates country-themed channels about all of the major players and teams in the World Cup, streaming to fans as the competition builds toward the championship.  Accessing those virtual channels on your smartphone or tablet will be the norm, with individual programs available on demand (and promoted on Twitter in its Video Card feature).
Finally, Aereo took a stiff arm from the courts this week as it lost two battles over its services, in Denver and Salt Lake City.  As the industry watches the looming April 22nd Supreme Court date approach, the debate about potential outcomes will intensify.  If Aereo is successful, it will very likely mean the beginning of the end for the $4B retransmission fee market in the U.S., which could very likely result in the major TV networks going directly after those Aereo consumers with ad-supported products of their own (have you noticed you can’t easily find Fox, NBC, CBS or ABC on your TV without some sort of authentication through your Pay TV provider?). In effect, “Video Anywhere” will be the resulting industry approach, with MSOs participating in the value chain through tune-in incentives rather than the captured-authenticated TV Everywhere approach (where they prop up their subscription fees).  If Aereo loses, expect retransmission fees to increase and authenticated TV Everywhere to be the winning approach, requiring TV networks to give Pay TV providers a bigger cut of streamed shows’ ad revenue as their strength as gatekeepers grows.  So an Aereo win = Video Anywhere, while an Aereo loss = TV Everywhere.
A lot of change in a week.
Come join the conversation with the industry’s leaders in Las Vegas at NAB’s 2nd Screen Sunday on the afternoon of April 6th or find us on Twitter @S32Day or @ChuckParkerTech.


Netflix Revolutionizes Media & Entertainment Through a BILLION Events Per Day

March 6, 2014 · Posted in Colleen Quinn's Blog · Comments Off 

It’s hard to imagine a technology company with more Media & Entertainment clout. Or are they an entertainment company with massive technology chops? Either way, Netflix’s invention and dominance of the OTT market has revolutionized the way content is consumed. And now, they’ve even successfully re-engineered the way that content is created. A true sign of the Netflix zeitgeist? They’ve inspired a new lexicon for how audiences watch content, with language like “cord-cutting” and “binge watching.”

As if you need proof that the language of Netflix is real, here are some hard facts. By the end of the first weekend following its Valentine’s Day release, the second season of House of Cards was streamed in its ENTIRETY by more than 2% of Netflix subscribers. With more than 40 million subscribers worldwide, that means one million people binge-watched an entire season in a matter of days.
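The back-of-the-envelope math holds up; a quick sanity check using the article’s own figures (40M+ subscribers, one million full-season viewers):

```python
# Figures from the paragraph above: 40M+ subscribers worldwide, and
# one million of them streamed the entire season in the first weekend.
subscribers = 40_000_000
binge_watchers = 1_000_000

share = binge_watchers / subscribers
print(f"{share:.1%} of subscribers")  # prints "2.5% of subscribers"
```

2.5% on a 40-million base is indeed “more than 2%”, and the share only shrinks as the subscriber count grows past 40 million.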

Is that kind of massive success a surprise for a show that wasn’t even subjected to a real pilot process? Not for Netflix. Very little about how, when and where their audiences watch content is a mystery. That’s because Netflix uses sophisticated analytics to evaluate a billion transactional events per day. Every nuance of audience interaction is mined to drive their business forward, from securing the best content at the best prices to developing meaningful and targeted recommendations that keep their subscribers watching and wanting more. Netflix’s analytic muscle is so strong that they attribute 75% of their streaming activity to recommendations.

And, that’s just the beginning of how Netflix uses data to drive their business. In a rare, live webinar, Netflix analytic thought-leader Kurt Brown will share how this Media & Entertainment pioneer is using analytics in the cloud to drive its business. To learn more or register for the March 18th webinar, click here.

Quarterly Second Screen Market Trend Update

February 18, 2014 · Posted in Featured Blog · Comments Off 

2nd Screen had a tumultuous run-up to CES 2014, with the press continuing to be split between hype and disillusion.  While we normally would have written and presented this update at CES, we decided instead to focus on releasing our research on monetization on behalf of our society members, to help them and their primary stakeholders (investors, customers, management) cut through the hype and the disillusionment and focus on clear examples of what is working.  Ironically, the additional insight gained in the first few weeks of January has been invaluable with regard to both consolidation (Yahoo closing IntoNow) and M&A (Viggle buying Dijit, TiVo buying Digitalsmiths).

Read More

What happened to Second Screen this week?

February 2, 2014 · Posted in 2nd Screen Blog, Featured Blog · Comments Off 

What a crazy week.  As if it wasn’t enough for NATPE to be taking place in Miami (with some great research and stats published about second screen), there was a ton of consolidation activity in our industry (Dijit/Viggle, IntoNow from Yahoo) and some rebranding by GetGlue.  At the same time, the 2nd Screen Society (S3) published a teaser on its new research about monetizing the second screen, and then Gigaom and TechCrunch wrote some pretty disparaging views, with Gigaom reverting to the salacious headline “Social TV is Dead“.

It’s no surprise that I received several requests from S3 members and from kindred spirits in the Twitter-sphere (thanks @Gip89) to attempt to help make some sense of all the news.  My head is spinning, too.
Let’s try to pull apart this ball of string by looking back at the fundamentals of the space and what each of the news events might mean, keeping in mind that while this week might have seemed like a “revolution” in terms of the number of momentous events taking place, it’s been a relatively progressive “evolution,” with each of these events building up over time in a manner similar to what we have been forecasting at conferences and in our research.  This is a bit like an eagle hatching from an egg: to those watching on the outside, it is a very momentous occasion, while for the bird itself, it is a milestone along a life journey that has been taking place inside the egg for some time.
First, the fundamentals.  As we emphasized during our “2nd Screen by the Numbers” session at CES, this marketplace continues to grow at a rapid pace not because of the development of great user experiences in various apps on smartphones or tablets, but purely because of the massive proliferation of those smartphones and tablets and the ever-present consumer behavior of turning to them in any lull of activity during entertainment (TV or otherwise).  The opportunity presented by the second screen phenomenon is to capture this behavior with an engaging consumer experience related to the video experience (companion or viewing), rather than letting email or Angry Birds seize it instead.  We spend a decent amount of energy across 30 pages of research on how successful business models are developing to monetize this change in consumer behavior, but regardless of our ability to capture the opportunity in terms of consumer engagement or monetization, that behavior will continue in larger and larger numbers.
Now, the momentous news events:
  • NATPE.  Chris Tribbey wrote up a pretty decent summary of the content creators’ panel at NATPE discussing the insights from the CEA/NATPE research, presenting some GREAT stats about second screen usage and, more importantly, a strong view from content creators (“Show Creators See Second Screen as Permanent”).
  • Yahoo’s IntoNow.  Yahoo made a decision to shut down IntoNow, the synchronous enhanced viewing experience app it acquired only three years ago.  I think two developments led to this decision: 1) it wasn’t a very engaging experience (too broad and shallow) and was likely not attracting a ton of consumer engagement, and 2) Yahoo’s Screen app is taking off and has cemented the view that focusing on engaging the consumer around the viewing experience was a more attractive monetization play.  Let’s face it, Adam Cahan founded Auditude and IntoNow and is now Marissa’s right-hand man for all things mobile video at Yahoo–he didn’t do this without thinking it through.
  • Digitalsmiths acquired by TiVo.  What a great validation of how important Discovery is in the second screen ecosystem.  Led by Ben Weinberger, Digitalsmiths has been quietly winning most of the MVPD operators in the U.S. with its personalization and recommendation platform.  The cash purchase for $135M by TiVo is certainly validation that the space is valuable for investors, but more directly indicates that TiVo is going to keep moving in the direction of creating great viewing and companion experiences for the living room (its current experience on a smartphone and tablet is already amazing and getting better all the time, led by Tara Maitra and Evan Young).
  • Dijit acquired by Viggle.  Viggle is perhaps one of the most successful at monetizing the second screen companion experience (a short section in our research sheds light on their success).  Dijit’s Nextguide is perhaps the most engaging consumer Discovery experience (yes, better than Fan), with significant broadcast partnerships on their tune-in “reminder button” feature.  I am convinced that with Jeremy Toeman leading the UI/UX and Greg Consiglio leading the monetization, this marriage will be a happy one for their shareholders, their customers (brands and TV networks) and consumers.
  • The Grammys.  Why is this important?  I am sure a ton of stats will come out this week about how many tweets, etc., were pushed during the broadcast.  But did you see that Chromecast commercial?  Somehow Google managed to create the ultimate Discovery- and Control-powered second screen experience, stealing that opportunity from Apple, Netflix and Samsung.  They have not only created a pervasive and passive experience that seamlessly allows consumers to “cast” their viewing experience from their second screen to the first, but by using the DIAL protocol, the second screen is then freed up for a companion experience (or synchronous advertising–see our research).  The commercial during the Grammys means they are SERIOUS about being successful with their $35 dongle.  Apple, Sony, Samsung and Roku should take heed.
  • GetGlue.  What does that mean to you?  It launched many, many moons ago as an attempt to create a social network around your viewing (and reading and wine drinking) habits, letting consumers check in to a show and share with their Facebook or Twitter friends.  i.TV bought them last fall and, no surprise, has decided to re-brand them into something that speaks to the opportunity Shazam is busy uncovering–tvtag.  Despite Gigaom’s view on this, I think this is positive in that it means the i.TV management recognizes the opportunity (engagement with the consumer at specific points of the viewing experience) and the threat (Twitter is chasing this, and so is Facebook).  Will they be successful?  Who knows, but consumer behavior will continue regardless.
  • Shazam.  Perhaps more interestingly, Shazam is making a big push during the Super Bowl this year to see if its momentous growth in active users can move the needle on advertising and consumer engagement during the world’s largest live viewing event.  While I had been skeptical over the previous 18 months, their new CEO Rich Riley (who joined in April 2013) seems to have turned the ship in the right direction, racing towards a UX that both engages the consumer and provides a monetization platform.  And a partnership with Facebook isn’t a bad idea either.  Watch this space (and that Jaguar commercial).
  • SocialTV is dead.  Hmm.  Did you read the article?  First of all, yes, I agree that the gimmicky concept of social (badges, check-ins) is challenged, but keep a few things in mind: 1) second screen experiences can typically be broken into 5 segments, only 1 of which is the sharing or social aspect.  2) All of the hype from Gigaom and TechCrunch in the last 3 months has been that the Social TV battle is down to Facebook vs. Twitter–neither of which is going anywhere or walking away from TV.  3) Read the last 3 paragraphs and you will see both confirmation that they still believe in point 2 AND that the right engaging experience still needs to be developed–confirmation of the fundamentals above (that the consumer behavior will continue despite the poor UX).  Conclusion: a salacious headline that certainly made MANY people read the article.

Reviewing the facts with the lens of the industry fundamentals sheds a different light on each of these events, and in summary they all point in one direction: it’s ALL second screen!  The proliferation of the devices, the changing millennial behaviors (preferring to view on their 2nd screen rather than the big screen), the early success in monetization of some companion and viewing business models, and the consolidation of the space all mean more marketing and development weight behind the strategies that lead to successful consumer engagement (and monetization).

You Don’t Have to Make A Deal With The Devil To Go Viral

January 29, 2014 · Posted in Colleen Quinn's Blog, Featured Blog · Comments Off 

It’s a sight not all that unfamiliar to new parents: an ashen, red-eyed baby, shrieking uncontrollably and spewing bile in its path.  In fact, come on over to the Quinn household, and you can witness the excitement first hand.

But, if you’ve been on any social network lately – and you haven’t been hiding under a rock – you’ve seen the infamous “devil baby”.  This little hellion has amassed more than 36 million views on YouTube since taking the internet by storm.  And, while the antics may not surprise the average new mother, this baby is no ordinary kid.

Devil Baby was an inside Hollywood prank – the marketing brainchild of 20th Century Fox in anticipation of their new horror flick, Devil’s Due.  But more than being just a trick, Devil Baby is a digital marketing phenomenon, revealing new data-driven marketing table stakes for today’s Media & Entertainment market.

Here’s what you can learn from Devil Baby:

  • To Go Big, You Need To Go Viral – There is no surefire recipe for viral success. But one thing is certain: big data analytics can increase your odds.  Take the dominant video network, Machinima, for example.  Their business is ensuring the rapid and massive uptake of content across a wide swath of users – and they do that not only by creating awesome content, but by using behavioral analytics to identify key networks, influencers and consumers ripe for that experience.  (You can also check out our exclusive Machinima White Paper here!)


  • Share of Voice is Great; Share of Wallet is Better – You know a meme has hit the mainstream when my mother makes a comment about it.  But buzz isn’t enough in today’s competitive content landscape.  It’s about the bottom line. Content creators, studios and distributors need to be able to use that buzz to predict performance and drive revenue. Being able to tap into, analyze and act on the ocean of big data – including social sentiment and network analysis – is a key factor in the age of the Connected Consumer.


Data-driven marketing – and execution through an end-to-end integrated marketing strategy – doesn’t need to be as painful as an exorcism.  But it will take the right tools and know-how being driven deep into our marketing organizations.  Devil Baby is one of many examples we’ll see of today’s marketing revolution being driven by big data.

Speaking of red-eyed hellions, nap time is over… and my second shift is calling.


Monetizing the 2nd Screen–business models that work

January 25, 2014 · Posted in 2nd Screen Blog, Featured Blog · Comments Off 

Second screen, social media and companion applications are all high on the agenda of executives in the media and technology industries. Reflecting this, all the major TV and technology conferences in 2013 – CES, NAB, IBC, and MIP – had several sessions dedicated to second screen. But second screen, while proven as a reality of consumer behavior, is not yet widely seen as a revenue driver. Indeed the reality of the second screen phenomenon is accepted, as is proven by the continuous flow of statistics showing that viewers use another screen in front of their TV (one of the latest being Nielsen saying that 75% of smartphone and tablet users are engaging with second-screen content more than once a month as they watch TV[1]). Another proof of the generalization of second screen is the multiplication of companion screen applications: over the course of 2013 they became widespread in new geographies including the Middle East, Eastern Europe and Latin America, where they had little presence only 12 months before. Comparing the space with 2012, it is clear that no TV player can ignore it. Even more striking, the players behind some of the most successful apps are large and well established: Peel now has 40m+ downloads, mostly through a global partnership with Samsung; Apple bought Matcha in August 2012; zeebox grew its partnerships with Sky, Comcast (NBCU) and Foxtel, while DirecTV acquired a share of i.TV (which bought GetGlue at the end of 2013); Viggle has a longstanding partnership with DirecTV; Comcast has launched “SEE iT” with Twitter[2]; and the Xbox SmartGlass app has been downloaded more than 17m times.  Despite this popularity and the presence of the largest players, few industry executives dare to speak openly about monetization of second screen applications, and only a small percentage of 3rd-party app providers have made their progress public.
There may be good reason for the industry stalwarts to keep their progress private with commercial competition so tough, but the sceptics of course believe that is because no one has actually experienced much monetization success.  So while many in the industry are wondering where the money is in second screen, next to nobody is ready to “show [you] the money”.

The purpose of our research paper is to do exactly that: “show you the money”. We review the various monetization strategies used by second screen companion and viewing applications and evaluate how these strategies work and which ones drive the most value.   We also provide an evaluation of the second screen market size and review its main drivers. Finally, we review how Twitter, Microsoft, Samsung and other players not directly in the second screen ecosystem are planning to use the second screen to increase their revenue.

More importantly perhaps, we have taken the time to update our market sizing from last year in an effort to demonstrate where the large opportunities for monetization lie for players in the ecosystem.

Interested in learning more?

Feel free to explore our research and infographics on our website, engage us on Twitter (@ChuckParkerTech, @S32Day), or meet us in person at Mobile World Congress (Feb 26th in Barcelona) or at NAB (April 6th in Las Vegas).


Looking for Evidence of Monetization and Engagement after the Hype

January 6, 2014 · Posted in Chuck Parker's Blog, Featured Blog · Comments Off 
By Chuck Parker
When I look at the 2nd Screen industry trends today, I can't help but think back to what we were focused on only 12 months ago as we prepared to come together at CES in Las Vegas.  We spent a lot of time talking about Social TV, the consolidation of the industry, ACR, and whether or not consumers were actually using their 2nd Screen devices to engage with their video content—or just to play Angry Birds.
During the last year, we tracked the important figures that defined each quarter:
  • Q4 2012.  35 million tablets sold in the U.S. alone during the Christmas rush, and significant social TV and 2nd Screen engagement growth in all scenarios.
  • Q1 2013.  Clear evidence of "t-commerce" from 2nd Screens and the continued growth of active zeebox and Viggle subscribers—bellwethers for the industry on consumer engagement in 2nd Screen.
  • Q2 2013.  Hyper growth in mobile video viewing, especially in ad-supported video—a key trend to observe for the potential of 2nd Screen monetization in converged experiences.
  • Q3 2013.  Continued viewing growth on mobile, strong revenue growth from enhanced 2nd Screen engagement apps, and the launch of Chromecast—an opportunity for any 2nd Screen app developer to add Discovery and 1st Screen Control capabilities to their video viewing experience.

We started last year by identifying 10 potential trends to watch for in the industry, some of which quickly became self-evident and some of which did not materialize.  Of course, some unanticipated trends revealed themselves along the way.  We think the most impactful trends for our industry right now are:
·      An "ecosystem," not an app.  Success in consumer engagement continues to be stronger where there are pervasive yet passive opportunities for engagement with the consumer.  Microsoft's Xbox SmartGlass and Google's Chromecast are the best current examples in this space.  For SmartGlass, there is one app which reveals all 2nd Screen companion experiences for video, music and games in their content ecosystem.  For Chromecast, they have found a way to give app developers the ability, through the DIAL protocol, to leverage control of the 1st screen, combining Discovery and Control capabilities across the app ecosystem and freeing up the device for Enhanced Viewing and Social experiences.  Expect Netflix and Apple to do something here quickly.

·      Convergence of companion and viewing experiences.  This trend will continue to develop as millennials choose mobile devices over the living room, and because it carries by far the leading monetization opportunity: inline video advertising.  As the ability to click through ads continues to grow, this will become a more and more valuable opportunity for all members of the ecosystem.
·      Ad-supported video on mobile devices.  Today, while the total spend in this space is somewhere between 1 and 3% of total TV advertising spend (roughly $6 billion in 2013), it continues to grow at a breathtaking pace (25%+ CAGR), and TV networks are beginning to find higher CPM pricing for tablets and smartphones than for the living room TV (digital or analog).  This is attributable both to the highly targetable nature of a 2nd Screen (often not shared with others) and to its easy interactive capabilities (the ability to click through to more information during a video ad).
·      Monetization.  You don't have to look very far to find revenue success in this space: 3rd-party Enhanced Viewing app Viggle is a publicly traded company, reporting $4.6 million in revenue for its most recently published quarter—growth of 289% over the prior 12 months.  Assuming they continue to grow even at a modest pace, they will be able to deliver more than $20 million in revenue in 2014.
·      The rise of HTML 5.  In the past 12 months, a significant trend has developed in mobile, where 66% of consumers now choose to engage with their entertainment content through their mobile browser rather than through an app.  This means developing responsive HTML 5 apps is more important than ever for both in-app and "mobile web" 2nd Screen experiences for the consumer.
·      The rise of ad blockers.  Once a desktop-only problem, there are now companies focused on building feature sets that let consumers circumvent the most successful monetization approach in this space by blocking display and video ads.  Tracking this trend, and developing approaches to thwart its impact on 2nd Screen companion and viewing experiences, is paramount to the success of the ecosystem.
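The Viggle projection in the monetization bullet above can be sanity-checked with quick arithmetic. Only the $4.6 million quarter and the 289% annual growth figure come from the post; the quarter-over-quarter growth rate below is an illustrative assumption.

```python
# Rough sanity check of the Viggle revenue projection above.
# Known from the post: $4.6M in the most recent quarter, up 289% year over year.
latest_quarter = 4.6  # $M

# Even four flat quarters (zero further growth) already approach $20M:
flat_year = 4 * latest_quarter

# With a modest 5% quarter-over-quarter growth (an assumption, not a figure
# from the post), the next four quarters comfortably clear $20M:
g = 1.05
next_year = sum(latest_quarter * g**q for q in range(1, 5))

print(round(flat_year, 1), round(next_year, 1))  # → 18.4 20.8
```

So "more than $20 million in 2014" requires only single-digit quarterly growth, far below the trailing 289% pace.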
However, despite the steady stream of positive data points revealed along the way, the industry finds itself in a precarious place—somewhere between the "Peak of Inflated Expectations" and the "Trough of Disillusionment" (to use Gartner's Hype Cycle terminology).  Analysts and trade press are now keying in on failed experiences nearly as often as successes, all archived in our twice-weekly 2nd Screen 2Day newsletter.  Combine that with the natural consolidation of the startups in the industry (which includes failed ventures), and it is easy to find yourself more disillusioned than hyped.
Our responsibility as an industry association is to help our members focus on the key elements of success that can drive their business forward.  So as we prepare for the 2014 CES 2nd Screen Summit, we will drive our efforts towards the three primary business drivers that matter to every member of our ecosystem:
  1. Increased consumer engagement in the content.  The majority of the investment in 2nd Screen companion and viewing experiences is coming from the content creators and distributors (primarily the TV networks).  Creating a lift in engagement (i.e., viewing time) translates to increased revenue regardless of their monetization model.
  2. Increased consumer engagement with the advertising brands.  The vast majority of the content ecosystem focused on 2nd Screen monetizes its content through advertising in some form.  As major brands place bets in this space, they are focused on metrics like "Cost per Touch" instead of impressions delivered (CPM).  The brands crave interactivity and engagement, working to determine which consumers are interested enough to move forward in their purchase cycle.
  3. Monetization itself.  While major TV networks and brand advertisers can get comfortable with metrics that have a strong correlation to monetization, many of the 1st- and 3rd-party engagement developers depend on revenue coming in the door to support their investments—"where the rubber meets the road," so to speak, as actual payments for advertising, t-commerce and engagement come together in 2nd Screen companion and viewing experiences.

To support the 2nd Screen Society members in this journey up the "Slope of Enlightenment," we are going to focus both our research and our conference engagement topics on these three major business drivers.  First, we are going to make our research more readily available to members, developing industry case studies that reveal tangible evidence of success in these three areas.  Second, as we identify, track and engage in industry trends, we are going to do so in a manner that reveals the impact on these three important business drivers.
I am looking forward to watching the industry grow and develop towards the great potential we all identified many, many months ago. 
See you in Vegas!

Warner Bros’ D2C Puts Pedal to the Metal with Eye Opening Results

October 4, 2013 · Posted in Colleen Quinn's Blog, Featured Blog · Comments Off 

By Colleen Quinn, Teradata Corporation

As Hollywood shifts into high gear around direct-to-consumer engagement, content creators and distributors are working fast to develop the know-how and analytic capabilities to execute. There’s a lot to consider, especially for organizations that are new to the D2C fray.

Cut to Warner Bros., who is leading the charge among Hollywood Studios in developing rich direct-to-consumer offerings, and the CRM efforts that make those offers successful. Michele Edelman, Warner Bros. Vice President, Direct-to-Consumer, opened the curtains on Warner Bros.’ pioneering work in D2C at a recent Teradata webinar.

The virtual-standing-room-only crowd had a front-row seat, as Edelman described the evolution of CRM and D2C at the studio.  Warner Bros.’ capabilities have expertly woven together best-in-class integrated marketing, with a big data strategy that gives them a detailed understanding of each member of their audience.

Launched in 2009, Warner Bros.' CRM strategy boasts massive success where it counts: in the numbers.  Any savvy digital marketer knows that benchmarks are critical; without them, there's no real way to measure your success.  So, imagine Warner Bros.' excitement when they saw the powerful results driven by their new direct-to-consumer CRM programs – rapid, exponential improvements across all key marketing KPIs, including:

  • 25% Email Open Rates
  • 13% Click-through Rates
  • Decreases in unsubscribe rates

Take a listen to the webinar replay to hear how Warner Bros. launched, refined and mastered their Direct-to-Consumer CRM and analytics strategy, featuring an extensive audience-driven Q&A.

And the big data conversation for Media & Entertainment didn't stop there! Industry thought leaders in advertising, cable, broadcasting and more convened in the Big Apple this week, as Teradata and UCLA Anderson reprised Create, Captivate and Engage, a big data analytics event with M&E in mind.


A New Big Bang: Battles Over Streaming Rights Pave Way for Direct-to-Consumer

September 19, 2013 · Posted in Colleen Quinn's Blog, Featured Blog · Comments Off 

By Colleen Quinn, Teradata Corporation

It was the pen-stroke heard around Hollywood. CBS and Time Warner Cable had (finally) reached agreement on retransmission fees. Viewers from coast to coast exhaled a collective sigh of relief, and switched on pro football.

One term at issue? The big per-subscriber fee hike that CBS demanded, aiming to double their carriage fees over the 5-year term. While a huge boost in revenue is always worthwhile, CBS’ negotiations hinged on a term that is much more interesting. They wanted to retain streaming rights.

Here’s why. Increasingly, streaming rights are the gateway to commanding your future. With them, content owners can seek the best opportunities to fully monetize content across every channel. But, more importantly, streaming rights often pave the way for content owners into the direct-to-consumer fray.

Going D2C means more than just having content rights, though. For studios and distributors, it means developing a keen understanding of each member of your audience. It’s about having the capabilities to deliver the right content, right message and right impact.

Lots of content creators are talking about this seismic shift in the business – but only a brave few are putting their collective money where their mouths are. There are trailblazers, though, and Warner Bros. Entertainment is one of them.  Among the first to build out a robust, start-up-like technical operations organization, Warner was also among the first to take the lead with UltraViolet.

Now, Warner Bros. Vice President of Marketing for Digital Distribution, Michele Edelman, offers a rare opportunity to listen via live webinar as she shares the studio’s vision and insights for launching and leading industry-changing, direct-to-consumer capabilities.

It's rare that Hollywood insiders can learn from inside Hollywood – but, once in a great while, it happens.  Don't miss your chance to join in!

A Review of Google’s Chromecast – Leveraging the Discovery and Control Powers of Second Screen

August 19, 2013 · Posted in 2nd Screen Blog, Chuck Parker's Blog · Comments Off 

We have often discussed in this blog the four major feature sets of second screen (Control, Discover, Enhance and Share – relevant research linked here and here).  We have also reviewed Netflix's experiments with leveraging the 2nd Screen as a discovery and control device via DIAL (try opening Netflix on your iPhone while it is also running on your PS3; find the blog here).  Finally, we have predicted what a DIAL-enabled world might look like, with its major backers (Netflix and YouTube) driving the protocol's acceptance into every new device launch since early 2013 (DIAL blog here, 10 predictions here).

Well, Chromecast is the incarnation of all of those opportunities, and at the same time evidence of where the industry will head with rapid adoption.  While we have tried to tell the Smart TV industry that the best implementation for their platform is to be the launch pad for the stream, Chromecast demonstrates that use case outright.

Similar to an Apple experience, the packaging of the device is simple and clean.  The small dongle comes with a power cord, a USB cord (an alternative for power) and an adapter in case your HDMI port is in a tight spot.

Setup was easy, though not as straightforward as it could have been.  But once established, the device and its Control and Discovery functionality are powerful.  The same YouTube or Netflix app you already have now shows a new icon when the DIAL protocol detects the device on the same Wi-Fi network.  And then, similar to AirPlay, you choose the device to send the stream to.  However, unlike Apple, the Chromecast dongle then takes over the stream, freeing your tablet or smartphone to be the discovery and control device instead of leaving it shackled as a streaming terminal.
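The detection step described above is worth a closer look: DIAL finds devices on the local network using UPnP's SSDP discovery, with the DIAL-specific search target `urn:dial-multiscreen-org:service:dial:1`. A minimal Python sketch of that discovery handshake (error handling and the follow-up device-description fetch omitted) might look like this:

```python
import socket

# SSDP multicast address and the M-SEARCH request defined by the DIAL spec.
SSDP_ADDR = ("239.255.255.250", 1900)
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:dial-multiscreen-org:service:dial:1",
    "",
    "",
])

def discover_dial_devices(timeout=3.0):
    """Broadcast an M-SEARCH and collect LOCATION headers from responders.

    Each LOCATION URL points at a device description; fetching it yields the
    Application-URL header used to launch apps (e.g. YouTube) on the device.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(MSEARCH.encode("ascii"), SSDP_ADDR)
    locations = []
    try:
        while True:
            data, _ = sock.recvfrom(1024)
            for line in data.decode("ascii", "ignore").splitlines():
                if line.lower().startswith("location:"):
                    locations.append(line.split(":", 1)[1].strip())
    except socket.timeout:
        pass
    return locations
```

This is only the discovery half of DIAL; the launch half is a simple HTTP POST to the device's Application-URL, which is how the mobile app hands the stream off to the dongle.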
While I encountered the occasional error, for the most part the two apps delivered their functionality robustly.
This is what a DIAL-enabled living room will look like.
This is where 2nd Screen is powerful–control and discovery.
There are of course additional use cases around enhanced companion experiences and social engagement, but this is a powerful step.
The real winning use case, of course, is to combine the power of NextGuide or BuddyTV with this device (or any DIAL-enabled device).  Rather than live through the "Easter egg hunt" of checking multiple apps for your desired content, you would use a Discovery app and then launch your content to the appropriate device in your living room.
Welcome to the second screen-powered living room–control and discovery like it was envisioned in the beginning.  That creates consumer utility–a reason for the consumer to pick up the 2nd screen and use that app again and again, creating the opportunity for engagement in both social and enhanced use cases–which provides the opportunity for monetization.

M&E Journal: Time-Based Metadata and the Emerging Video Landscape

August 8, 2013 · Posted in Featured Blog · Comments Off 

By Zane Vella, Founder and Chief Product Officer, Watchwith

Over the last decade a familiar battle cry of the digital media executive was “Anytime, Anywhere,” meaning the promise of digital for the consumer was to watch “what you want, when you want it.” And as we look around today, much of that future has arrived in the form of HBO Go, Netflix, Xbox, Xfinity – all popular on-demand services now available on tablets, phones, computers, game-consoles, Blu-ray Disc players, and connected TVs. So what’s next?

As the MESA readership is well aware, much of the traditional entertainment distribution business is an analytic and strategic exercise in windowing and differentiation. In short, this means extracting greater return through enforced scarcity or by delivering added value through one distribution channel or partner versus another. This article examines how and why time-based metadata is becoming a critical strategic asset for content owners, and how it enables new forms of windowing and differentiation across the digital distribution landscape.

A New Vocabulary
First, some definitions: "time-based metadata," a.k.a. "related content metadata," is descriptive information related to a particular scene, shot, or moment of a film or TV episode. Unlike traditional program metadata, which defines general information applicable to an entire program, time-based metadata follows the heartbeat of the program content itself and includes a time-code or time-reference relative to the media asset. Fundamental examples of time-based metadata include which actors are currently onscreen, what music is playing, what locations are in the scene, and what featured products are on screen at any particular moment.

Within the realm of time-based metadata, there is also an important concept of “event types.” For example, “actor,” “music,” “quiz,” “poll,” “behind the scenes video” and “production still” are all types of events or related content (which can also be thought of as layers) that are associated with particular moments in a program. Event types can be anything a content owner or producer desires that either adds value to a program or is related to the program.

One of the defining characteristics of time-based metadata is that it is information related to content which is abstracted from any particular visual presentation or consumer experience. This primarily means information in the form of text, images, and links to other Internet-based content or services.[1]
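As footnote [1] notes, time-based metadata typically travels as JSON or XML. A minimal sketch of what such event records might look like, with hypothetical field names (this is an illustration, not a published schema):

```python
import json

# Hypothetical time-based metadata events for one episode. Each event carries
# a type (the "event type" layer), a time range in seconds relative to the
# asset, and a payload. Field names are illustrative assumptions.
events = [
    {"type": "actor", "start": 125.0, "end": 140.5,
     "data": {"name": "Jane Doe", "character": "Det. Smith"}},
    {"type": "music", "start": 130.2, "end": 155.0,
     "data": {"title": "Main Theme", "artist": "J. Composer"}},
    {"type": "poll", "start": 600.0, "end": 615.0,
     "data": {"question": "Who is the killer?"}},
]

def events_at(timecode, events):
    """Return every event active at the given time (seconds into the asset)."""
    return [e for e in events if e["start"] <= timecode <= e["end"]]

# At 2:12 into the episode, both the actor and music layers are active:
print(json.dumps([e["type"] for e in events_at(132.0, events)]))
# → ["actor", "music"]
```

The key property shown here is that the metadata is abstracted from presentation: a companion app, a connected TV, or a commerce partner can each render these same records however suits their experience.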

Lastly, another key concept is "metadata syndication," or more simply, fine-grained control of which event types or layers of time-based metadata are made available to certain business partners, based on business rules such as time-window or geographic location. Technically speaking, metadata syndication is implemented via access credentials (a.k.a. API keys) that are provided to a distribution partner or to each application that consumes time-based metadata made available by a content owner.
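The syndication idea can be sketched as a lookup from a partner's API key to the layers its license allows, applied as a filter before delivery. The keys, layer names, and rights table below are all hypothetical:

```python
# Sketch of metadata syndication: each distribution partner's API key maps to
# the event-type layers its license permits. All names here are illustrative.
PARTNER_RIGHTS = {
    "est-partner-key":    {"actor", "music", "quiz", "poll", "behind_the_scenes"},
    "rental-partner-key": {"actor", "music"},  # a more limited subset
}

def syndicate(api_key, events):
    """Return only the event layers this partner is licensed to receive."""
    allowed = PARTNER_RIGHTS.get(api_key, set())
    return [e for e in events if e["type"] in allowed]

events = [{"type": "actor"}, {"type": "quiz"}, {"type": "poll"}]
print([e["type"] for e in syndicate("rental-partner-key", events)])  # → ['actor']
print([e["type"] for e in syndicate("est-partner-key", events)])
# → ['actor', 'quiz', 'poll']
```

In a real deployment the rights table would also encode time-windows and geography, as the article describes; the filter itself stays just as simple.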

Foundational Digital Trends
Before returning to the discussion of windowing and differentiation, it is important to identify two broad technical trends that are both transforming our industry and providing the foundation that time-based metadata strategies are built upon: first, the dominance of digital file-based workflows, and second, the increased importance of more traditional program-level metadata in digital distribution operations (as opposed to time-based data).

Until very recently, program assets were delivered to distribution partners via a broad range of technical means. Broadcast and pay TV exploitation relied primarily on satellite uplink, theatrical exploitation relied on physical delivery of 35mm prints, home entertainment (DVD and Blu-ray) exploitation was via tape formats (DLT) delivered to manufacturing facilities, and digital exploitation (iTunes, Xbox, PS3) was via file transfer. Within just the last few years, the economics, practicality and operational benefits of digital video workflows have elevated digital file transfer to the primary means of asset delivery across all distribution channels.

Once operating within such a digital file ecosystem, program-level metadata associated with those files becomes critical for inventory management, merchandising, fulfillment, pricing, royalty tracking and interoperability across various systems.  These requirements have driven a great deal of innovation, and over the past several years, an enormous amount of ingenuity, intelligence, and dedication has gone into solving industry challenges around program-level metadata. While challenges and opportunities for efficiency remain, great progress is being made, particularly by industry organizations such as ISAN and EIDR.[2]

Together, these two trends are an important indicator of the direction that the industry overall is heading, and form the basis of some logical conclusions: If digital file delivery persists or increases, content owners will increasingly need to provide their distribution partners with metadata around their assets, and different types of metadata will be required for different means and channels of exploitation. Metadata will increasingly become the means of delivering information to business partners throughout the entertainment production and distribution ecosystem.

Anytime, Anywhere, But Now What?
Thanks in large part to standardization of digital file formats and the hard work of many digital distribution operations teams, most large entertainment companies are now able to reliably deliver their audio and video assets across a wide range of distribution partners. There is at the same time, however, a definite and glaring absence of any unified or efficient way to enhance the consumer experience around that video or any standardized means to deliver value-added related content.

This means that while the industry has been successful with the first critical step of delivering program content to the consumer, there is a distinct lack of business tools or "levers" for distribution executives to efficiently create consumer demand for their digital assets.  Unlike with DVD and Blu-ray, each distribution partner, such as iTunes or Xbox, has its own unique set of requirements for delivering value-added content (if at all), and promotional opportunities also require unique one-off asset production and expense.

This lack of a unified platform for creation and delivery of added-value content may be a significant contributor to decreased consumer interest in sell-through and ownership models.

Turning the Tables for Everyone’s Benefit
Digital distribution executives not only lack a unified means of enhancement and promotion for program assets, they also operate in a highly fragmented landscape. Traditional cable, satellite and telco distribution partners have increasingly complex delivery requirements to fulfill their own evolving customer viewing habits. In addition to these MVPDs, a new wave of mobile and tablet applications, web video distribution, and OTT partners bring additional delivery requirements and new valuable ways to connect with the audience. No matter how well resourced a media or entertainment company might be, it is near impossible to keep up with every new digital distribution opportunity, and equally impossible to differentiate your program content from one distribution outlet to another.

The solution is to turn the tables, and for the content owner to offer each distribution partner a variable package of time-based related content metadata associated with each licensed program. This related content metadata becomes the key ingredient for each distribution partner to deliver a differentiated, value-added consumer experience to their end-user or consumer.

For example, electronic sell-through partners and ultimately the consumers that purchase movie and TV programs through them, can have access to extensive layers of value-added content, while rental partners and their consumers can be restricted to a more limited subset of metadata, and fewer layers of value-added content, if any.

In practice, this means that the consumer who purchases a film or TV program can enjoy a different, presumably higher value experience, than one who rents that same video asset. By extension, this also means that a subscription model could potentially emerge in which the content owner would provide the consumer with ongoing or evolving enhancements (active layers of engagement) with their favorite films or TV programs.

The Power of Metadata Syndication
This approach is extremely powerful for the content owner because it allows them to function more similarly to how they have traditionally operated. It becomes the content owners’ responsibility to create the highest-value master asset possible, but now that asset is a combination of audio, video and time-based metadata. Individual distribution platform idiosyncrasies and presentation layer requirements become the responsibility of the distribution partner, and the content owner can focus their attention and resources on delivering value to the consumer and marketing those benefits.

This approach also opens the door for content owners to focus on the ongoing interactive social and commerce services that may be connected to any scene or moment of their content, and with the right technology platform at their disposal, enable the content owner to benefit from these additional layers of monetization in cooperation with their downstream distribution partners.

"Turning the tables" through metadata syndication is also powerful because it challenges distributors to innovate and compete with each other to deliver the best consumer experience, as opposed to expecting content owners' limited marketing budgets to stretch across all the distribution platforms they currently have to reach. In many cases, particularly in television, this approach also solves a major timing problem. Only the content owner or network programmer has access to first-run television episodes before their first airing, so metadata syndication allows them to make related content such as quizzes, trivia and behind-the-scenes images available in a way that no distributor would be able to match.

The Time-Based Metadata Ecosystem
Creation, production and distribution are all part of the time-based metadata ecosystem. From a creation perspective, an enormous amount of valuable time-based related content exists from the earliest stage of pre-production. Similar to popular DVD, Blu-ray and synchronized "Second Screen" experiences[3], examples of this type of material include early storyboards, location scouting photos, and production design sketches. These are all valuable related content that can be set to time in a film or TV episode, and are good examples of how to extract value from existing production artifacts. Additional examples of existing information that can be quickly set to time are music cue-sheets, branded entertainment product lists, and on-set photography.

Applications of time-based metadata also open up new creative opportunities for writers, producers, and multimedia storytellers. Instead of leaving related content creation to marketing and programming teams, writers are increasingly taking responsibility for various types of related content metadata as an integral part of the creative process. For example, Fourth Wall Studios is an LA-based entertainment company creating new forms of storytelling where, for example, the on-screen characters call the viewer's cell phone at designated moments in the story timeline.[4]

Another example of creative time-based metadata creation and production comes from USA Network where Twitter “hashtags,” originally intended to drive social media activity during first-run viewing, are being stored as persistent time-based metadata with particular episodes and scenes, so that they can be leveraged by applications and users in later syndication and VOD.

Time-based metadata also has important implications for ecommerce, enabling new transactional revenue opportunities for both content owners and distribution partners. In 2012, eBay introduced Watch With eBay, a stand-alone iPad application that surfaces current auctions and "Buy it Now" items related to a particular program. eBay has also demonstrated a version of the application that uses time-based metadata to surface items related to a particular scene.

Time-based Metadata, Windowing & Personalization
One of the greatest opportunities for content owners and distributors alike is to leverage time-based metadata to proactively drive consumer activity in new viewing windows, and with new viewing patterns. Through metadata syndication, the same digital file can offer the consumer a new experience with each view, and that experience can be influenced by whether it is being experienced in parallel with the first-run viewing, within the Nielsen C3 window[5], or in a VOD session.

Technically speaking, windowing relative to time-based metadata means that based upon the specific time-window at which a viewer engages with a piece of content, a corresponding package of related content layers can be made available. These time-windows can be relative to first-run or premiere of the content, or personalized to a specific viewer and corresponding with successive views.
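The windowing rule just described can be sketched as a function that maps the viewer's position relative to premiere onto a package of layers. The window lengths and layer names below are illustrative assumptions, not figures from the article:

```python
from datetime import datetime, timedelta

# Hypothetical windowing rule: which metadata layers a viewer receives
# depends on when they watch relative to the premiere.
def layers_for_view(premiere, view_time):
    age = view_time - premiere
    if age < timedelta(hours=3):
        # Live / first-run companion window: interactive layers active.
        return {"poll", "quiz", "actor", "music"}
    elif age < timedelta(days=3):
        # Roughly the Nielsen C3 window: polls retired, quizzes remain.
        return {"quiz", "actor", "music"}
    else:
        # Later VOD / syndication: descriptive layers only.
        return {"actor", "music"}

premiere = datetime(2013, 9, 22, 21, 0)
print(sorted(layers_for_view(premiere, premiere + timedelta(days=10))))
# → ['actor', 'music']
```

Per-viewer personalization, as the article suggests, would simply key these windows off a viewer's own first or successive views rather than the premiere date.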

Time-based Metadata and the Future of TV
Over two decades of video product development, time-based metadata has emerged as one of the most important components of a successful digital video distribution strategy. This descriptive information about what is happening at any moment is critical to differentiation in a multiscreen world, and will play an increasingly important role in differentiation across distribution partners. As smartphones, tablets, and smart TVs proliferate, there will be increased demand for rich and valuable time-based metadata delivered as part of the master asset. Increasingly, time-based metadata will unlock the context of film and television, and will power the new user experiences and new revenue streams that are only possible on emerging two-way digital platforms.

Just one decade into the twenty-first century, we are starting to see indicators of a vibrant metadata ecosystem growing within the folds of the traditional film and TV production and distribution industries. Writers and producers will increasingly create time-based metadata as an inherent part of their creative storytelling process, and production companies will increasingly package, license, and sell that critical enabling meta-layer to their programmer and distributor customers. Programmers and distributors will in turn increasingly deliver a time-based metadata layer to their cable, satellite, telco, web, mobile and OTT licensees, so that those consumer-facing services can unlock the context of every moment of film and TV for their audiences.

Zane Vella is the Founder and Chief Product Officer at Watchwith, a software platform to create and distribute time-based related content around films, TV and commercials. He has 20 years experience at the intersection of TV, Internet, and software product strategy and has led the development of interactive products and platforms for media and entertainment companies including Apple, Disney, NBCU, Netflix, Viacom, and Warner Bros.


[1] Time-based metadata is typically provided as a JSON or XML formatted message so that a product developer or programmer can choose from available time-based information and use it as they see fit in a consumer experience.

[2] ISAN is the International Standard Audiovisual Number, a voluntary numbering system and metadata schema enabling the unique and persistent identification of any audiovisual work and versions thereof, including films, shorts, documentaries, television programs, sports events, advertising, etc. EIDR is a universal unique identifier for movie and television assets.


[3] Walt Disney Studios Distribution has been a leading innovator in synchronized consumer experiences on tablets associated with a film. More info available at

[4] Founded in 2007, the Culver City-based company develops new properties delivered via Internet browsers, smartphones, game consoles, TVs, movie screens and in the physical world.

[5] Nielsen C3 is a metric launched in 2007 which refers to the ratings for average commercial minutes in live programming plus three days of digital video recorder playback.

300GB Optical Discs – Harbinger for Ultra HD?

August 6, 2013 · Posted in Featured Blog · Comments Off 

By Geoff Tulley

Sony and Panasonic recently announced an agreement to jointly develop, by the end of 2015, standards for a next-generation optical disc with the capacity to hold more than 300 gigabytes of data (six times the capacity of current Blu-ray Discs). According to the two companies, the 300 GB discs are geared toward the archival storage market. Is this project the next generation of Blu-ray? Or is it the consumer electronics industry’s answer to getting ahead of the 4K curve? See below for an analysis.

In the joint release issued by both companies, each included reference to the other’s cartridge-based storage solutions that are currently in the market (Panasonic’s Data Archiver LB-DM9 series and Sony’s Optical Disc Archive system). These systems employ multiple recordable optical discs encased in a protective cartridge (beyond this similarity, however, the systems and their media are completely different).

As an associate of mine commented: “(These cartridge-based systems) may have a fairly tough time in the enterprise market though, as it seems to be more of a packaging trick than anything really new — proprietary cartridges and the like can be a tough sell.”

The companies make the point pretty clearly that this announcement is about a single disc solution that ups the capacity of recordable optical discs. It will be interesting to see what mix of layers, lasers and the like will be required to make that magic. Since multiple layers at Blu-ray Disc wavelengths are already in current specifications, the implication is that this new format will be a departure from BD as we know it.

As TV Technology reported, “Both companies pointed to the expanding needs for archiving in video production as well as from cloud data centers as the reasons behind their work in advancing the format.”

I did notice another Web site, however, that took the same announcement and (IMHO) leapt off the deep end:

“But while streaming content seems like a good idea, some consumers (especially videophiles) are clamouring for a physical solution to the problem,” the article stated.

It would be interesting to see the data behind the “consumers are clamoring” bit. The Blu-ray Disc Association might want to examine it.

According to the article, “Though neither company has admitted as much, it’s clear that the partnership is an effort to resolve the 4K media question once and for all. The two Japanese firms are teaming up to create what will essentially become the successor to the Blu-ray Disc. Their ambitious plan is to create a higher capacity optical disc that’s ready for consumer use before the end of 2015.”

I don’t think that it is at all clear that this is about a consumer format; certainly not about one that is aimed at 2015. 2015 is the stated target for the commercial, data archiving implementation. I think it is safe to assume that the quest for Ultra HD consumer distribution will not be waiting for this new format to emerge, so one has to wonder what feats of marketing may be required to re-introduce the world to new forms/formats of physical media two years from now (or after).

One also has to wonder if this writer appreciates the significant differences between recordable optical discs and the replicated media (such as DVD and Blu-ray) used for movie distribution; not to mention the investment needed to create the replication infrastructure required to mass-produce “affordable” home movie discs.

The Blu-ray Disc Association did announce at CES 2013 that it has a task force studying the issues around incorporating Ultra HD content into the specification. I expect that effort will generate lots of discussion and ultimately product development; I just don’t see this 300GB announcement as a harbinger of a consumer solution.

In any case, this discussion does provide lots of interesting food for thought.


M&E Journal: The Future of Movie Sell-Through

July 24, 2013 · Posted in Featured Blog · Comments Off 

By Tony Knight, Senior Product Manager, Rovi Corporation

The other day, I began to realize how much of the physical media that my generation took for granted would be completely absent from the lives of our children.  My four-year-old daughter Izzy, who was born the year the iPhone was first introduced, already has far different expectations of how content is created, transmitted and consumed.  For her, you never have to put anything in a machine to get something you want to come out on a screen.  For decades, the act of taking a picture, listening to music, or watching a film required the movement of something physical into the apparatus of something mechanical.  In the space of just a few short years, the relentless march of technology has separated content from the spinning gears it was previously bound to.

What’s more, technology has rapidly increased the rate of change in the home entertainment business, and this change can now be measured in months rather than years.  Consider how long it took older formats, such as VHS and cassette tapes, to be succeeded by new standards, like DVD and CD.  Compare that against the plethora of new content delivery methods available today on such a variety of new devices and you will begin to realize the challenges in store for the home entertainment industry.  Six years ago the most common way consumers got access to premium content was in the form of a DVD disk.  It was a universal standard and consumers gravitated towards it.  This greatly simplified the home entertainment business model for content holders and the businesses that supported them.  Move ahead a few years, and it’s not hard to recognize that consumers have many more home entertainment choices, ranging from subscription VOD and kiosk rentals to a variety of over-the-top delivery channels.

While the physical disk still accounts for the single biggest piece of home entertainment revenue, it is increasingly besieged by a number of other options vying for consumers’ attention.  A few years ago, an entertainment-hungry consumer might have purchased a DVD for $15 to $20 because it represented the best value for money among a smaller choice of consumption modes.  Today, that same buyer has many more choices, including free or low-cost access.  This competition for consumer attention has forced those of us who make our living in entertainment technology to rethink consumer value, or risk losing the premiums that were once the mainstay of the physical media home entertainment business.  In fact, the future of the home entertainment business may hinge on the very question of whether or not consumers want to ‘own’ movies anymore.

The commercially successful concept of ‘owning’ a retail movie has always taken some physical form.  VHS tapes had a measure of success in the retail market, but it wasn’t until DVDs were introduced that people bought and collected them in droves.  Today, DVD and BD sales still constitute the lion’s share of home entertainment revenue, but that revenue is declining 5-15% worldwide, year over year.  Electronic sell-through, the digital equivalent of owning a movie on physical media, has been available for many years, but it has yet to garner anything close to the same level of commercial success as DVD disks.  The key question for many in our industry is a stark one:  Are consumers willing to pay to own movies, or are they content to rent on occasion?

Several years ago, I spoke at an industry event, and I was asked when electronic sell-through was going to be successful.  My answer was short and sweet: When consumers view EST as being as valuable as DVDs.  In the intervening years, the mass market has yet to adopt EST, and physical disk sales have continued to decline.  Consumer behavior is changing, and not in ways that promote the traditional home entertainment business model.  To put it another way, five years ago the home entertainment revenue pie was cut up in ways that benefited certain actors.  Today, that pie is in the process of being recut.  Those that were used to getting a healthy slice in the past may be alarmed to be getting either a smaller piece, or none at all.  Others that didn’t have a slice in the past are now sitting at the table.  The question of consumer ownership of content is central to how big the pie is, and how it is to be sliced.

Unless the industry acts (and acts decisively), in a few short years, margins in the home entertainment business could shrink sharply as consumers shift from movie ownership to a much less lucrative over-the-top rental business.  In fact, I think the entire industry is in need of something akin to the Marshall Plan.  To that end, here is my 3-point plan to save movie ownership, which can be treated as additions to UltraViolet:

1) Clearer Differentiation from the Rental Experience

Today, when you consider buying a movie from an over-the-top service, you are confronted with two buttons: Buy or Rent. Clicking the ‘buy’ button leaves many customers feeling shortchanged.  There are generally no menus, extras, special features or other perks that make them feel like the ownership experience has been conferred on them.  Charging four to five times more for a purchase that delivers the same user experience as a rental just makes it feel like you are prepaying for your next four rentals.  The industry needs to find a way to drive more value into the electronic sell-through format, and this means adding features that customers are used to getting from physical disks today.  Remember, most consumers only watch a movie they like one time.  They watch movies they love many times, and they want the extras that connect them to the film’s back-story.

2) Get Aggressive with Disk-to-Digital

My shelves at home have about 400 movies on them.  The key to getting consumers like me to own new movies digitally is to help me move my library towards the new paradigm.  How successful do you think Apple would have been with the iPod and iTunes if they hadn’t expressly enabled you to bring your existing library of CDs into the same interface as the music you purchased from them electronically?  Not very, I think.  There is an effort by a certain large retailer to move existing DVD and BD disks to UltraViolet (UV).  This is a great start, but the initial reviews have been mixed.  In my own experience, only two-thirds of the sample I brought in was available for conversion, and none of the extras that were available on those disks are part of my new UV rights.  Would I now spend $800 or so to move just the movies (without extras) over to a new standard if that new standard makes me feel like I am prepaying for over-the-top rental?  Not likely, I’m afraid.

I think disk-to-digital is a great idea, and some consumers will undoubtedly adopt a scheme where they move their libraries over on a per-disk basis for a fee.  That said, this approach presents a barrier that I believe will prevent it from becoming mainstream.

Here is a different approach. Charge little or nothing to convert my existing library to UltraViolet once a retailer has confirmed that my library consists of legitimate retail disks, and marked each out of circulation once the digital right has been conferred.  If some of the movies are not available, record my right anyway, and bring it to my locker once it is available.  Now I can feel the totality of the UV experience with content I’ve spent the past 15 years collecting.  It didn’t involve a big bet on my part, and if I like it, the chances are very good that my next purchases will be UV ones.
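
That flow can be sketched in a few lines of Python. This is a hedged illustration of the idea only, not any real UltraViolet API; every name here (`Locker`, `convert_library`, the disc fields) is hypothetical.

```python
# Sketch of the proposed disc-to-digital flow; all names are hypothetical.

class Locker:
    """A consumer's digital rights locker (in the spirit of an UltraViolet account)."""
    def __init__(self):
        self.rights = []    # titles the consumer can watch now
        self.pending = []   # rights recorded, fulfilled once the title is available

def convert_library(discs, locker, catalog):
    """Convert verified retail discs into locker rights.

    `catalog` is the set of titles currently available for digital fulfillment.
    Each converted disc is marked out of circulation once the right is conferred.
    """
    for disc in discs:
        if not disc["is_retail"]:            # only legitimate retail discs qualify
            continue
        disc["in_circulation"] = False       # retire the physical copy
        if disc["title"] in catalog:
            locker.rights.append(disc["title"])
        else:
            locker.pending.append(disc["title"])  # deliver once available
    return locker

shelf = [
    {"title": "Film A", "is_retail": True,  "in_circulation": True},
    {"title": "Film B", "is_retail": True,  "in_circulation": True},   # not yet digital
    {"title": "Film C", "is_retail": False, "in_circulation": True},   # bootleg: skipped
]
locker = convert_library(shelf, Locker(), catalog={"Film A"})
print(locker.rights, locker.pending)  # → ['Film A'] ['Film B']
```

The key point of the sketch is the `pending` list: unavailable titles are recorded anyway and fulfilled later, so the consumer never has to run the conversion twice.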

3) Go Crazy with Metadata

My shelves at home used to be great for impressing guests and those with lesser collections.  That moment has passed.  Here is what my shelf can’t do well: recommend a good movie for me, or tell me where there are gaps in my collection.  My shelf can’t sort my movies in ways that help me consume more content.  In fact, after Izzy figured out how to reach that shelf, it isn’t even particularly well organized.  Once you’ve helped move my entire library over to a digital locker of some kind, don’t make it the digital equivalent of my shelf.  Use rich metadata and some excellent user interfaces to help me visualize my library in new and interesting ways. How many Stanley Kubrick movies do I own?  Am I missing some Fellini movies?  If you tell me, I’m probably a willing buyer.  The best recommendation engine is one that takes my own library as input.  Put my existing movies into a snazzy interface, empower it with some intelligent metadata smarts, and I am much more likely to consume.  I promise.
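
As a sketch of what those metadata-powered library queries could look like, here is a toy example in Python. The catalog contents, titles, and function names are all hypothetical; a real service would draw on a full commercial metadata catalog rather than a hand-built dictionary.

```python
# Toy metadata-driven library queries; catalog and titles are illustrative only.

my_library = {"2001: A Space Odyssey", "The Shining", "8 1/2"}

# A tiny (deliberately incomplete) director-to-filmography catalog.
catalog = {
    "Stanley Kubrick": {"2001: A Space Odyssey", "The Shining", "Barry Lyndon"},
    "Federico Fellini": {"8 1/2", "La Dolce Vita"},
}

def owned_by(director):
    """How many of this director's films do I own?"""
    return catalog[director] & my_library

def gaps_for(director):
    """Which of this director's films am I missing? (A ready-made storefront.)"""
    return catalog[director] - my_library

print(len(owned_by("Stanley Kubrick")))   # → 2
print(gaps_for("Federico Fellini"))       # → {'La Dolce Vita'}
```

The gap query is the commercially interesting one: the recommendation engine's input is the consumer's own library, and every gap it finds is a purchase prompt.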

There are a lot of people thinking about and working on solutions to promote the continuation of the ownership model in home entertainment.  That said, I’m sticking firmly with my beliefs from many years ago.  Customers pay for what they value, and digital distribution of content must delight consumers if they are going to own it at the same rate they did with DVD.  Izzy is almost 5 now.  Is her first movie-related transaction going to be a rental, a buy, or a subscription?  Much of that will depend on what the industry does over the next two years.

M&E Journal: From Silicon Valley to Hollywood, Mobile Revolutionizes the Way We Work

July 18, 2013 · Posted in Featured Blog · Comments Off 

By Robin Daniels, Head of Enterprise Product Marketing, Box

Although Silicon Valley and Hollywood are close geographically and are both working toward bringing transformative experiences (content and apps) to market, they are vastly different digitally and philosophically. Hollywood’s success relies far more on the connections between individuals than market and technology strategies, rendering decision making complex – and the Valley’s direct-to-consumer, “execute without asking for permission” model doesn’t quite match with Hollywood’s “collaborate with everyone” mentality. However, one technology that has taken hold and shaken up both Silicon Valley and Hollywood is the use of mobile devices in business.

Half of all devices sold this year will be non-Windows based. Apple alone has sold more than 172 million iPhones and iPads in the last year and its iPhone business alone generated more revenue than all of Microsoft. More computing power and connectivity is in more hands, and in more ways, than ever before.

Sure, workers everywhere have been using smartphones to be more productive for years, but advances in mobile devices paired with cloud applications have simply changed what’s possible. Not only can workers use their mobile devices to share instantly in their personal lives; business systems are evolving just as rapidly to make the impossible possible. From a creative executive accessing crucial production documents from an iPad to give real-time feedback, to a marketing team tracking campaign results while on the road – we are all working with new mobile and cloud technologies to stay competitive and to easily create, access and share content from anywhere across multiple devices.

Technology Demands

The technology in our personal lives is certainly influencing and changing expectations in our professional lives. It isn’t necessarily the convergence of the tools we use in these two worlds, but rather the consistency of ideals.

Employees are demanding the ability to choose the devices they use for work and are becoming less productive if all of their data is sequestered on different devices or locked down to specific systems. Along with mobile devices, consumers are also bringing different expectations for technology to work with them. The reason mobile devices like smartphones and tablets, or social media like Facebook and Twitter, are so popular is that they are radically simple and intuitive.

While incredibly empowering for end users, this fragmentation of platforms in the workplace means that any organization that is embracing mobility also has to embrace device diversity. And IT departments not only need to support all these new devices – they also need to ensure that the content and tools employees need to get work done are both accessible and secure.

This new paradigm poses a major challenge for today’s businesses: how can IT let new technology run rampant through an organization, technology that is fundamentally improving business outcomes, while still maintaining some semblance of a coherent IT strategy?

Enter The Enterprise Cloud

Everyone wins when workers have the mobile devices and software they want to use, rather than what they have to use, and IT departments have the oversight and visibility they require – and this can be achieved with a next generation enterprise cloud solution.

A new generation of cloud-based business solutions is beginning to make this duality possible. Intuitive services like Ubic, Signiant and Box give employees at media and entertainment companies the flexibility and mobility they require, while also providing enterprise-grade security and visibility for IT professionals.

For new devices to be fully corporate-ready they need elements like security, device tracking and management, and powerful cross company collaboration – areas that can help continually and sustainably innovate. The cloud rewrites the rules here, enabling new handsets and tablets to connect to the “grid” like any other computer and become a tool that enables employees to work together securely across multiple platforms and from any location – finally making the mobile workplace a reality.

And with a truly mobile workforce, completely new computing cases are emerging.  As a large number of Hollywood studios and labels move their information and collaboration to the cloud, our customers share stories of sales teams showcasing their latest project – from the scripts to the trailers – while on-site with just an iPad in hand; of a marketing team delivering and tracking exclusive content straight from their mobile devices; and of a creative executive creating a centralized library in the cloud on an iPad for all media assets related to a movie launch. Mobile devices are becoming a catalyst for completely new enterprise applications, and vice versa.  The marriage of the two is so uniquely powerful that businesses will experience a wave of productivity transformation over the next few years.

Mobile + Cloud Revolutionizes the Way We Work, Together

With mobile and cloud technologies, employees now have the ability to store information once, and then easily extend it across all the applications, devices and people they are working with. People don’t work in a siloed world anymore. It’s about using solutions that work together, and powerful platforms that connect and become enhanced through integration: a cloud-delivered application like Salesforce that runs your sales organization will connect to your business information on Box, and HR information in Workday or NetSuite will plug into your social software from Jive or Yammer. The mixing and matching of services that’s common in our personal lives is now extending to the workplace, and in turn driving vastly more open solutions that are changing the business landscape and how we interact with each other.

Mobile and cloud adoption in business has led to dramatic changes in productivity, speed of execution, and overall sentiment towards technology. People are able to work much more quickly, access more information than ever before, and make decisions in real-time that are backed by data – all leading to a more open, connected and collaborative work environment. With the right solutions, both the end user and IT professionals are happy – employees are using products they love and IT is finally able to get ahead of the game instead of having to fight fires, solve problems, and answer to unhappy users. We’ve seen more progress made in moving towards a more collaborative and mobile IT strategy in the last year than in the previous ten years, and this revolution will continue to gain momentum – and attention.

Robin Daniels, Head of Enterprise Product Marketing, Box

Robin is head of enterprise product marketing at Box. Robin is a prominent advocate and expert on Enterprise Cloud Computing and how it is transforming enterprises and the software industry. Having worked in the tech industry for over 15 years for leading companies such as Salesforce, Veritas and Vignette, Robin has extensive knowledge in the areas of cloud computing, enterprise software, collaboration technologies, and marketing innovation.

Even a Day Late and a Billion Dollars Short, Hulu Still Matters

July 11, 2013 · Posted in Colleen Quinn's Blog, Featured Blog · Comments Off 

By Colleen Quinn, Teradata Corporation 

In most cases, 70+ job openings at a sexy company which, along with a few others, helped redefine the Media and Entertainment landscape would be an indicator of growth. Not so in the case of Hulu.  I’d consider this more an indication of rudderlessness.  Visionary CEO Jason Kilar – my serious tech crush – has left the helm, along with many (if not most) of his right-hands. Hulu has been on-and-off the auction block for as long as anyone can remember.

Now, with a front-running bid on the table from DirecTV (along with a few other contenders) – and looming rumors that a deal will be done soon – it looks like Hulu’s days as a platform for Disney, Fox and NBC are numbered.

Ah. What a difference a couple of years make – $1 billion to be exact.  That’s the difference between the one-time price tag wooers were offering the OTT darling then and the rumored price on the table now.

While there’s no telling how the shift to any acquirer might unfold for the service, the news that the acquirer might be DirecTV is telling.  Bringing Hulu into the family could allow the satellite TV behemoth to close a significant gap in its service by providing a meaningful online offering. And, with well-established relationships with Hollywood and solid licensing agreements, DirecTV has the oomph to make sure Hulu’s content remains relevant, if no longer exclusive. The same is true for any cable, telco or satellite buyer, though – buying Hulu makes you look like you’re ready for the future.

But, there’s one angle the tech trades seem to be missing. And, maybe it takes the keen eye of an analytics powerhouse to notice. You see, DirecTV has long been an industry leader in applying analytics to gain customer insight  – using analytics to maintain and grow their subscriber base through meaningful offers. The promise of an online and OTT channel brings with it the opportunity to capture and analyze exactly how consumers are engaging with content across multiple channels, in ways that few companies are doing today.

I call that capability content analytics. Just imagine the power that cross-channel, real-time, behavioral and engagement data can wield.

Well, you don’t even need to imagine – because some analytics powerhouses are already using content analytics to drive business and insight. One key example? Netflix, which readily admits that 75% of its audience watches content because Netflix recommends it – and those recommendations, along with just about everything else at Netflix, are driven by analytic muscle.

So, while we all wait with bated breath for the final word on what happens to Hulu, there’s no question that it’s as relevant as ever.  Maybe no longer as a pioneer – but as a rocket to take its new owner into a future landscape where the living room isn’t the end-all be-all.

M&E Journal: Streamlining Processes – Server Solutions Power 3D Animation

June 14, 2013 · Posted in Featured Blog · Comments Off 
By Janet Bartleson, Director, Dell

Abstract: This case study outlines the partnership between Dell and ToonBox Entertainment, a Toronto-based 3D animation studio. ToonBox deployed Dell’s PowerEdge C6100 rack-mounted servers to render its stereoscopic animation. This article will detail how the partnership benefited both companies.

To captivate their audience, mischievous Surly Squirrel and his rat friend Buddy need to be rendered in eye-popping detail. ToonBox Entertainment deploys Dell™ PowerEdge™ servers in its render farm to help deliver world-class 3D animation.

Toronto-based ToonBox Entertainment hit the ground running — the company’s first original TV production “Bolts & Blip” is one of the world’s first 3D stereoscopic animated television series. But stereoscopic animation places extremely heavy demands on workstations and servers in the render farm. When ToonBox prepared to start stereoscopic animation for its film The Nut Job, the company sought new studio space and a hardware vendor to furnish it. “It was critical to select the right vendor up front, because we were looking for a long-term solution,” says Ria Westaway, vice president of production.

“Dell treated us as their first priority. That commitment to our needs helped us make a decision relatively quickly.” For rendering, ToonBox selected Dell PowerEdge C6100 rack-mounted servers powered by the Intel® Xeon® processor 5600 series. “For every animation we produce, we’re rendering twice as many frames as we would in 2D,” says computer graphics (CG) supervisor Andrew McPhillips. “Each shot in our rich and highly detailed film is comprised of dozens—sometimes even hundreds—of layers. Feature-length animated films typically have more than 1,000 shots. Because The Nut Job is 3D, we are rendering each of those shots twice, once for each eye. In this environment, the Dell PowerEdge C6100 servers have been fantastic. The PowerEdge C6100 makes a great render farm machine because it’s fast, highly configurable, and incredibly robust.”
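
The scale McPhillips describes is worth putting into numbers. The sketch below is back-of-envelope arithmetic only: the quote gives more than 1,000 shots, dozens to hundreds of layers, and two render passes per shot (one per eye), but the runtime, frame rate, and layer count used here are assumed figures, not from the case study.

```python
# Back-of-envelope render load for a stereoscopic feature.
# Runtime, fps, and layers-per-frame are assumptions; only "two eyes" and
# the rough layer range come from the quote above.

runtime_min = 90         # assumed feature length
fps = 24                 # standard theatrical frame rate
layers_per_frame = 50    # assumed, within the quoted "dozens to hundreds"
eyes = 2                 # stereoscopic: left + right

frames = runtime_min * 60 * fps
layer_renders = frames * layers_per_frame * eyes
print(f"{frames:,} frames -> {layer_renders:,} layer renders")
# → 129,600 frames -> 12,960,000 layer renders
```

Even under these conservative assumptions, the farm must produce on the order of ten million layer renders, which is why node-level scalability mattered so much to ToonBox.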

Streamlining Management

As the company grows, the hot-plug serviceability of each server node facilitates the rapid expansion of the render farm. “These servers enable us to scale up and down very easily,” says Aaron Pearce, systems administrator. “Adding a Dell PowerEdge C6100 server is basically plug-and-play. We receive a server, drop it into our infrastructure, install software, and that’s it.”

Furthermore, the servers’ built-in management controllers help simplify administration. For example, instead of spending 50 hours each week manually installing various operating systems for testing on individual computers, IT staff hooks the open-source tool Extreme Cloud Administration Toolkit (xCAT) into the PowerEdge server’s baseboard management controller (BMC) to automatically deploy preconfigured operating systems and software. “It deploys a new operating system across the entire server farm within minutes and takes almost no staff time,” says Pearce.

Ten months after deployment, the ToonBox render farm still has 100 percent availability. “Everything in the PowerEdge C6100 servers is redundant,” Pearce explains. “If we have a failure, we’re just going to remove the failed component, fix it in-house, or call Dell ProSupport for extended support. A motherboard in one of our servers had a small issue reading a piece of memory and because we have Dell ProSupport on the machine, the motherboard was received and replaced within an hour and a half of failure. That turnaround by Dell ProSupport was absolutely fantastic.”

Creating Stellar Animation

ToonBox artists who work mostly with Autodesk SketchBook Pro, Adobe® Photoshop®, or Adobe Premiere® Pro software received Dell Precision™ T3500 workstations. For artists who work primarily in Autodesk Maya, eyeon Fusion, or Pixologic ZBrush, ToonBox provided Dell Precision T5500 workstations. “They are very powerful machines that facilitate the type of work our artists are doing,” says Pearce.

Most of the company’s back-office functions run in a virtual environment enabled by VMware® virtualization software. More than 30 virtual servers run on two Dell PowerEdge R710 hosts and one PowerEdge R510 host. “Intel Virtualization Technology for Directed I/O (Intel VT-d) enables the processor to split up resources for different virtual machines managed by the VMware layer,” says Pearce. “It works fantastically. Our Dell and Intel hardware is enabling us to make excellent use of the resources we have without flooding our server room with excess equipment, power consumption, and heat.”

To see that it selected the right hardware partner, ToonBox looked no further than its high-definition, stereoscopic animations. “In our teaser for The Nut Job, the image quality was so high that people couldn’t believe we did it in the time frame we did with the resources we had,” says McPhillips. “That validates our decisions, because a high image quality is the top of the pyramid. To get to that level, you need great people, great technology, and fantastic hardware. Dell computers give us one level of the pyramid.” Furthermore, ToonBox’s state-of-the-art equipment has helped the company recruit animators. “When you’re on the cutting edge of what can be done in animation, you need  a solid backbone,” says McPhillips. “Selecting Dell as our hardware partner was one of the best decisions ToonBox has made. It has been a fantastic relationship.”


M&E Journal: Integration of Linear & Non-Linear Platforms – A Blueprint for Consideration

May 29, 2013 · Posted in Featured Blog · Comments Off 

By Subhankar Bhattacharya, Global Practice Head Media & Entertainment, HCL Technologies


Abstract: In the television network industry, the linear and non-linear businesses have evolved independently over several decades. There are many reasons for this separate path of evolution. First and foremost, the non-linear business targeted end-consumer interaction, which was different from the distribution model of the linear business. Coupled with a small revenue base and a different advertising model, the non-linear business was not considered mainstream. Over the years more and more brands have established a significant presence on the web, raising the cost of managing the non-linear channels. On the other hand, consumers are increasingly looking for a seamless experience across devices and platforms. Given this situation, integration of linear and non-linear workflows has become imperative from both a cost and a customer perspective.  However, the prospect of this integration is fraught with disparate advertising models, disparate systems, disparate metadata and the absence of a single view of the customer. This article provides an approach to this integration that could provide the best possible ROI and the least possible transition pain for the TV networks.


Executive Summary

Talk of an integrated approach to linear (television/on-air) and non-linear (online/broadband) content value chains has been making the rounds in the television network industry for several years.  At the back-end (supply chain) of the content value chain, progress has been limited to cost-driven initiatives such as a shared digital asset management system or a shared infrastructure platform. However, organizations today have become more ambitious and experimental at the front-end (consumer experience) of the content value chain through several new revenue-driven initiatives such as TV Everywhere, live broadband streaming, etc. With the digital upfronts (NewFronts) in full swing and consumer expectations of a seamless multi-screen experience on the rise, the process of linear and non-linear integration may get a much-needed boost from business/brand owners within the industry. This paper outlines an approach toward linear and non-linear technology platform integration within the context of this evolving environment.

Business Context

Video content is the primary asset of the television network industry. While total advertising spend on non-linear (online) video was only USD 1.42 Bn in 2010, the forecasted compound annual growth rate (CAGR) for this channel is a whopping 31 percent. Over the same period, the forecasted growth rate for television advertising spend is only 3 percent. By 2016, that could take non-linear video advertising revenue to 13 percent of linear (TV) advertising revenue, up from its current share of 3 percent. (See Chart 1: Video advertisement spend forecast 2010-2016 (Linear & Non-Linear) for details.)
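The share projection above is simple compound growth. A back-of-envelope sketch (illustrative only; it assumes six annual compounding periods from the 2010 base and that the TV base is implied by the 3 percent current share) reproduces the roughly 13 percent figure:

```python
def project(base, cagr, years):
    """Compound a base value at a constant annual growth rate."""
    return base * (1 + cagr) ** years

online_2010 = 1.42             # USD billions, non-linear video ad spend (2010)
tv_2010 = online_2010 / 0.03   # implied TV base from the 3 percent current share

online_2016 = project(online_2010, 0.31, 6)  # ~7.2 Bn at 31% CAGR
tv_2016 = project(tv_2010, 0.03, 6)          # ~56.5 Bn at 3% growth

print(f"2016 non-linear share of TV spend: {online_2016 / tv_2016:.1%}")
# prints ~12.7%, i.e. the roughly 13 percent cited above
```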

This is more than a meaningful number to sit up and take note of. How well the television networks capitalize on the non-linear video advertising market compared to new media businesses like Google, Yahoo, Facebook, or Vevo will depend on their capability to provide seamless content access across the various mediums on which it is consumed. As content owners, the television networks will no doubt have their own share of revenue if they leverage the likes of Facebook and Vevo as syndication platforms, but in that case they have to shell out a much larger share of revenue to those platforms. There is the additional risk of non-linear video cannibalizing traditional TV advertising revenue, a hypothesis that could be hard to ignore.

Leveraging Cross-Industry Experiences

We may not be able to find a perfect example of the linear/non-linear integration model in other industries. However, analysis of similar business environments in other industries reveals four key factors that have helped organizations emerge victorious from such challenging environments: (a) leadership mindset, (b) business involvement in IT/technology decision making, (c) maturity of the industry as a whole for driving such initiatives and (d) the ability of an organization to pull in capital for investment in the right place at the right time. (See Chart 2 for an illustrative representation of these four factors.)


The recently launched UltraViolet project allows consumers who bought content on physical media to watch that content on any personal device at any time. This is a bold move that challenges the likes of Apple TV, Google TV and other legal online streaming services, as well as content pirates who thrive on consumer dissatisfaction with the accessibility of content on personal devices. The success of this project stems from collaboration among studios, retail chains and software service providers, and from the determination of a few who believed in the philosophy of ubiquitous access to content. The project has given a tremendous boost to the concepts of a universal content ID and universal content metadata. Industry associations (DECE, DEG, EIDR, HITS) are also playing a key role in shaping the project.

The publishing industry ran separate processes for e-book and physical book production for a long time. As a result, the cost of e-book production ran high and simultaneous launch of an e-book with its physical counterpart was difficult. Most leading book publishers, however, have invested heavily in integrating the print and e-book production processes. This integration has not only cut the cost of e-book production by 70-90 percent; it also allows simultaneous launch of print and digital editions.

Multi-channel integration in the retail industry, which has been ongoing for nearly a decade now, is a mature process. Product master data management, customer 360° and multi-channel fulfillment are some of the key initiatives within the retail multi-channel process. The process transformation there tackled many of the complexities the media industry faces today. For example, the retail industry had problems with global identifiers but worked hard with trade associations to adopt the EAN and UPC standards. Simultaneously, it addressed every possible use case in multi-channel fulfillment to provide unique customer experiences. In addition, many of the multi-channel integration programs were run from the CEO's office, giving them the necessary support, budget and focus.

In the early days of the internet, it became clear that future telecom revenue would be driven more by data than by voice. The ability of packet-switched networks to handle data traffic better than circuit-switched networks forced many telecom providers to invest billions in changing their network infrastructure.

On a similar note, if we believe that a seamless consumer experience across devices will define the future of content, linear and non-linear integration must be treated as a multi-year transformation program with board-level oversight, and capital must be allocated with a view to long-term return on investment.

Uniqueness of the Television Network Industry and How it Impacts the Prospect of Linear, Non-Linear Integration


While learning from other industries is relevant, the television network industry faces many unique and complex challenges in the process of linear, non-linear integration.

Consumer Experience

The consumer experience of content continues to differ across platforms based on the nature of the device itself. The pattern of consumer content consumption is an evolving area and also differs vastly among demographics. While the book publishing and retail industries have similar consumer experience problems, theirs are not as varied or as fast-evolving as video consumption across devices.



While content for the linear channels is fairly standard, consumption of content on non-linear channels is undergoing significant change. A 2011 survey by TV Guide suggests that 15 percent of the population consumes more than six hours of online video per week; in 2010, the number of such viewers was just 4 percent. This growth has been driven primarily by massive growth in the type and quality of online content. Diving deeper into content consumption by demographic will be important when attempting a linear and non-linear integration.



The linear medium is a mass advertising medium with a finite advertisement inventory (spots). In contrast, the non-linear medium is a targeted advertising medium and hence has nearly infinite advertisement inventory. This is a significant technical challenge for the industry to overcome. Since content consumption patterns by demographic may not match across linear and non-linear channels, an integrated advertising strategy could be extremely hard to deploy.


A large portion of the industry's intellectual property is third-party content governed by complex contractual clauses. These contracts are far more complex than publishing industry intellectual property rights and in some cases more complex than music industry contracts. Additional complexity comes from exclusive and restrictive distribution contracts with carriers and syndication partners, and from international contracts. While television is a geographically contained medium, mixing TV with a global internet platform may cause significant rights enforcement issues.


Creating a Business Case

Just like any other solution, linear and non-linear integration must start with clarity of purpose. There could be revenue or cost considerations for this integration, and each could result in a different course of action.

On the revenue side, a solution for TV Everywhere may require building a consumer authentication and partner entitlement system, whereas an integrated multi-channel (broadband & TV) C3 solution might entail re-engineering the entire advertising sales, traffic and program planning systems.

On the cost side, the shared digital asset management system could require more focus and effort in terms of migration, whereas a shared search solution might require an integrated master data management solution. (Chart 3 provides a visual representation of the business case drivers for a linear, non-linear integration)


Detailing Out the Problem Areas

Once the business case has been established, it is essential to look at the costs and challenges of execution and implementation. This would typically consist of rationalizing the portfolio and enhancing services to execute the business case. There could be numerous technical, process and ownership (organization structure) related problems in execution. Some of them are listed below:

  • Inventory pricing for TV and non-linear channels differs and typically uses different tools. For example, television ratings forecasts are based on historical Nielsen ratings, while online forecasts are not.
  • The sales systems for TV and non-linear channels are different. Although the processes are fairly similar, an integrated strategy cannot easily be created because the technology environments are disparate.
  • Inventory management platforms for non-linear channels and television are usually different and not in sync, so the sales systems for them cannot work in unison.
  • Program lineups for TV are usually not in sync with content distribution for non-linear channels, which impacts the multi-screen strategy.
  • The deal/contract management processes for TV and non-linear are different and hence cannot be managed from a centralized process center.
  • An in-house traffic management process may not exist for non-linear channels, so integrating them with the TV/broadcast traffic management system may be difficult.
  • The invoicing and reconciliation processes with the agency use different sets of people and processes for linear and non-linear channels.

The first step in addressing these process and technology disparities is to create a unified reference architecture that can support the linear and non-linear businesses in totality.

Creating a Future-Ready Reference Architecture

For the purposes of this article, the reference architecture is built from four key components: experience, services, content and data. The main objective of defining this reference architecture is to clearly identify the process owners and systems that embody its components.


Experience Layer

This layer of the architecture deals with the experience of all parties across the linear and non-linear value chain, including partners, customers, end users and employees. Typically the non-linear business has a highly evolved end-consumer experience technology platform, whereas the linear business has invested in a more evolved internal consumer or partner experience platform. (Chart 4 below details some of the key aspects of the “Experience Layer” of this reference architecture.)


Services Layer

Services constitute a large part of the reference architecture. Traditionally, content ingestion and transformation services are robust in the linear business, whereas search and distribution platforms are more evolved in the non-linear model. (Chart 5 below details some of the key aspects of the “Services Layer” of this reference architecture.)

While shared infrastructure services between linear and non-linear systems are very common, shared services are yet to be explored in the areas of quality control and analytics. This is because quality control in the linear workflow is typically far more stringent, and if the same process were applied to non-linear, the cost of production might rise significantly. In analytics, linear and non-linear processes deal with different sets of source data; in the absence of universal content ID and universal metadata standards, shared services remain a challenge in this space.
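To make the metadata problem concrete, here is a hypothetical sketch (all IDs, values and the registry itself are invented for illustration; an EIDR-style DOI prefix is used only as an example) of how a universal content ID would let linear ratings and online view counts be joined for shared analytics:

```python
# Hypothetical: the same episode keyed differently by two systems.
linear_ratings = {"HOUSE_ID_0412": 2.1}      # Nielsen-style rating, keyed by house ID
online_views = {"cms-ep-88311": 1_450_000}   # view count, keyed by web CMS ID

# An EIDR-like registry resolving both local keys to one universal ID
# (the mapping below is invented for the example).
universal_id = {
    "HOUSE_ID_0412": "10.5240/XXXX-EP42",
    "cms-ep-88311": "10.5240/XXXX-EP42",
}

merged = {}
for source in (linear_ratings, online_views):
    for local_id, value in source.items():
        merged.setdefault(universal_id[local_id], []).append(value)

print(merged)  # both metrics now hang off one content identifier
```

Without the registry, the join simply cannot be made, which is why universal content ID standards are a precondition for shared analytics services.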


Content Layer

In most networks the linear function owns the master content. Typically, advertising and other promotional content follow the business that runs them. Potentially, there could be a single process owner for all forms of content. (Chart 6 below details some of the key aspects of the “Content Layer” of this reference architecture.)


Data Layer

Data is one of the most significant parts of the architecture. Data not only drives control but also defines business strategy, and hence there can be conflicts over data ownership within an organization. To achieve a successful linear and non-linear integration, acquisition and distribution rights data must be centralized as a single shared service. Other data sets could follow a federated model in which they are incrementally enriched. (Chart 7 below details some of the key aspects of the “Data Layer” of this reference architecture.)






Portfolio Analysis

Once the reference architecture is agreed upon by all stakeholders, and the responsibilities for managing its components are defined, the next step is to carry out a portfolio analysis of systems and processes across the linear and non-linear value chains.

This could include a maturity-model analysis of the systems and processes from a functional as well as an architectural standpoint. A subsequent analysis of the systems and processes in the context of the original business case must then be carried out to establish the best re-engineering strategy. (Chart 8 provides a sample illustration of the approach to this portfolio analysis.)


Advertising Age published a very interesting report in May 2012 showing that YouTube video viewing had dropped by 28 percent since December 2011 (based on comScore data). Does this mean TV networks should rejoice? Not really. The same report indicates that the average length of a video view on YouTube grew by 33 percent over the last year, to four minutes. YouTube, by design, is going for quality over quantity, as longer views dramatically increase advertising opportunities. This could be viewed as good news for the networks as well, since every channel capitalizes on YouTube as a syndication platform. As a result of this strategy, however, a significant part of revenue will move to YouTube even though the content is owned by the networks. If television networks want a large chunk of the forecasted $9.3 billion in potential online video advertising revenue for 2016, and more control over their content, they have to bring television and online together for the consumer.

About the Author

Subhankar currently leads HCL's effort in developing its practice within the studios, television networks, music, advertising and digital publishing space. His areas of expertise include digital strategy, digital and physical supply chain, social analytics, piracy control, rights and royalties, customer relationship management, pricing, revenue management and sales systems. At HCL, Subhankar has spent the last several years consulting with existing clients on defining and implementing various facets of their digital architecture, building HCL's own repertoire of solutions, augmenting relationships with partners and associations, and helping the business win large multi-year contracts. Prior to joining HCL, Subhankar worked as a Principal Media Consultant with Infosys Technologies. Subhankar has more than 17 years of consulting and industry experience. He holds a master's degree in management from the Indian Institute of Management (Ahmedabad), widely considered the best business school in India. Subhankar can be reached at

To download a PDF version of this article, click here: M&E HCL_5.29.13



The Next Episode: Dr. Dre Meets Big Data

May 28, 2013 · Posted in Colleen Quinn's Blog, Featured Blog · Comments Off 

By Colleen Quinn, Teradata Corporation 

Fight on! As a USC alum, it’s odd that I cringe when other well-meaning Trojans shriek the school’s battle cry. But, last week, instead of hearing the bombast of a marching band in my head at the thought of ‘SC’s fight song, I was feelin’ a little more hip-hop. I had a kindred spirit in Dr. Dre. That’s cuz (as Dre would say) famed hip-hop star Dr. Dre and music mogul Jimmy Iovine announced a $70 million donation to create a new academy for music, focusing on the intersection of art, technology, business and innovation.

The curriculum includes computer science. Entrepreneurship. Art. Marketing. Couldn’t we all stand to find these intersections a bit more clearly?

This is the challenge for the media and entertainment industry of today – the need to find that intersection of “gut” and “insight.”  I’m sure this is the bane of any long-standing creative industry in today’s data driven climate. That’s because it’s hard to dispute the collective wisdom of creative powerhouses who’ve been at their trades for decades. But, no one is arguing that there needs to be a wholesale switch. Rather, just some more appreciation for the intersection.

In the handful of years I’ve been working in analytics – which were preceded by MANY handfuls of years working in production, post-production, and digital media – I’ve seen a real ramp-up in the role of analytics at traditional and digital media companies alike.  But, the truth is, there are still factions.  Whether I’m talking to a content creator, distributor, publisher or MSO, there are often camps: the we-don’t-need-no-stinkin’-analytics camp vs. the analytics-are-our-future camp.  Those two camps are starting to meet in the middle – and it’s about time.

At the risk of sounding overly prophetic, there is beauty in the intersection of art and science. And, that, I think, is the promise of big data analytics across the content value chain. When creative companies can integrate what they know about their audiences, their content, their channels and their marketing, they can unleash the value of the intersection of art and science. Any successful analytics framework demands a detailed understanding of the art and the science alike.

So, for all of you aspiring data artisans out there, take heart! Dr. Dre has got your back on this one. Fight on!

Big Data Boondoggles: NAB and UCLA Shine the Spotlight on TV’s Platinum Era

April 11, 2013 · Posted in Colleen Quinn's Blog · Comments Off 

By Colleen Quinn

It’s that time of year! Nearly 100,000 of the industry’s best and brightest flock to Sin City for meetings, demos, and debauchery. It’s NAB! I’m a little wistful writing this from a cold, hard desk in Los Angeles, as pictures of productive-days-turned-long-raucous-nights start to flood into my Inbox.

“Wish you were here!” they scream. Alas, me too. Not so much for the endless hands of blackjack (where I lose every penny to my name), the 36-ounce rib-eyes (“Hey! Who has the most liberal expense account?!”), or the dreaded booth-baby T-shirt. No. Those things are nice – and essential to the NAB boondoggle – but mostly, I’d like to be there because I think this year, really, NAB is important.

My mantra for the last five years has been “the landscape for media and entertainment is changing.” I hear others talk about “the battle for the living room.” Well, I’ve got news for all of us. The landscape isn’t changing. It’s changed. And that battle is over. Now we’re talking about a war – for the consumer. This month’s WIRED magazine has dubbed this the Platinum era of television. They’re right. (See my recent blog post – was it a premonition of Wired’s new issue?)

NAB – really the community of creators, technologists, and post-production innovators that drives the show – is at the nexus of this platinum age. The show is still well underway, but already, major themes are resounding: hyper-social content, deeper engagement, precision personalization.

This new age demands that content creators and distributors be able to understand and know an audience-of-one in ever more sophisticated ways. While the applications and services – from TV Everywhere to the Second Screen – may vary, they all share a common, critical foundation: analytics.  Without most of our peers realizing it, the entertainment industry’s biggest currency has become data. Big data.

It’s time for a deeper conversation about the role of analytics in our industry – and many of the most forward-thinking studios and distributors are already starting that dialog. Industry thought-leaders will be represented at an upcoming industry roundtable on May 23rd, “Using Analytics to Create, Captivate and Engage” – hosted by the UCLA Center for MEMES and Teradata – to drive that discussion. If this is the Platinum Age of television, then data is the alchemy that can create it.

Oh. There will also be a cocktail hour. Westwood isn’t quite Vegas, but at least they won’t force you to wear a baby-T.

Viewers Need a Better Way to Find and Watch TV Content in a Multi-Platform World

April 4, 2013 · Posted in Featured Blog · Comments Off 

By Bart Myers, Vice President of Consumer Web Properties, Rovi

Despite the rising number of entertainment options, it’s clear that Americans are watching as much TV as ever. They’re just watching it differently, discovering new ways to enjoy TV on their own terms and on whatever device is available to them.

They are streaming free and premium content to their tablets and smartphones. They are tuning into online services like Hulu and Netflix to watch their favorite shows. They are even paying a few bucks to watch single episodes via services like iTunes and Amazon Prime. And, yes, they still rely on cable and satellite subscriptions.

Still, many people in the television industry are freaking out. They’re obsessed with the same question: “Who’s cutting the cord?” After all, Comcast Corp., the largest U.S. cable provider, said it lost 117,000 video customers in the third quarter of 2012. After decades of steady increase, the number of U.S. households subscribing to pay-TV service is now on the decline, according to the Nielsen Company.

But cord-cutting isn’t the real issue. Instead, content producers should be asking, “What cords are viewers using—and how can we maximize the value of those cords for everyone involved?”

Nielsen recently began tracking the viewing habits of millions of Americans who now connect to entertainment content via the Internet rather than through a cable or satellite service. Nielsen found that more than two-thirds of these “Zero TV” homes get their content from a broad range of devices including personal computers, Internet-connected TVs, smartphones and tablets.

For the content provider industry, there are lessons to be learned from these “Zero TV” homes. The Nielsen study found that nearly half of the “Zero TV” homes now watch shows through online subscription services. Yet, many are finding that the content they love is not readily available in all formats and platforms, which leads them back to a siloed approach to getting their TV content.

The three main use cases in TV watching are:

  • The viewer is at home, trying to decide what’s on TV to watch right now.
  • The viewer is anywhere, trying to plan what to watch later.
  • The viewer is on a connected device and wants to watch whatever is available online.

The problem for content owners, however, is maintaining the relationship with the show’s fan base across multiple siloed platforms.  Content providers should be able to connect to the user where they are watching TV, but the current ecosystem of devices and content rights makes that relationship extremely hard to manage.

Increasingly, viewers need an advanced set of tools and services that can guide them across all the different ways they consume televised entertainment today. For content owners, this means that they need to keep an eye on rapidly evolving TV consumption habits.  To keep up with the changing consumer landscape, content providers need to think differently about how people are planning for and enjoying content.

Consumers, meanwhile, are hungering for televised entertainment content online, but it’s simply too hard to find. They don’t understand why they can’t access their favorite content anytime, anywhere, from any device—especially since the technology exists to make it happen. The maze of blind alleys the viewer must navigate to find that content can be excruciating.

If content providers want to maintain the relationship with viewers in this future world of entertainment on every medium, content discovery needs to be much easier.

As things stand today, there is too much friction in the marketplace. We shouldn’t be cutting consumers off from their favorite shows. Instead, we should be finding new ways to better connect viewers with entertainment content, and helping them understand the appropriate options for enjoying that content—wherever they are and on whatever platform they choose.

About the Author
Bart Myers is Vice President of Consumer Web Properties at digital entertainment innovator, Rovi Corporation. He co-founded in 2006, which was acquired by Rovi in 2011. is a website that helps people watch and track their favorite TV shows online and was one of the early sites to embrace the then-emerging trend of cord cutters. For more information, visit or follow Bart on Twitter, @bartolah.

M&E Journal: The Transformative Effect of User Research On the Consumer Experience

March 27, 2013 · Posted in Featured Blog · Comments Off 

By Seth Hallen, CEO, Testronic Labs

Editorial Contributions By: Graham McAllister, Ph.D., VP of User Research, Testronic Labs

In the home entertainment industry, never before have we seen such rapid and dramatic changes in the way consumers access and view content. Broadcast television was the first effective format available to the masses for watching content in the home. It took almost 40 years for the next significant innovation in content delivery to arrive on the scene: videotape, which allowed a consumer to choose when and where to watch a movie or TV show. More than 20 years later, DVD and then Blu-ray improved the quality of in-home content, and DVRs made time-shifting possible. Now, less than 10 years after the introduction of the Blu-ray Disc, we are experiencing a colossal shift toward online digital video consumption.

According to Futuresource Consulting, there were over 485 billion legitimate (not pirated) online video views in the US last year! That is up from 266 billion in 2009. As impressive as that number is, only 1 percent of those views were purchased. Clearly the public is hungry for online access to content. The challenge is offering a value proposition the consumer is willing to pay for.

If core consumption has changed so dramatically, it is only logical to ask how behaviors regarding searching, accessing, cataloging and interacting with content will change. There are already various options available, from simple web interfaces to 2nd screen apps to voice-command interfaces. Which of these, if any, are most compelling to the consumer? How can you determine exactly how consumers like to access and interact with their content? How can you best forecast the trends so that developers can create UIs that will truly engage consumers so that they recognize the value in purchasing content they view online and through all their various devices? How can you ensure an experience that home video consumers love so much that it will yield DVD-caliber success?

Creating a world-class experience that consumers love is the goal of many in the digital content space, but it is achieved by remarkably few. Attempting to understand not only what consumers enjoy, but more importantly why they enjoy it, is the key to success, and in recent years a growing field has developed which focuses on this very issue.

User Research is a discipline that combines elements of psychology, design, computer science and many other fields, with the elemental goal of understanding people. The idea being: if we better understand people, we can better design products and services that they will enjoy using. It sounds obvious in retrospect, but only in recent years has this become an important and specific developmental focus.

Creating Enjoyable Experiences

In the past, DVDs, software, and websites were judged on the features they offered. Early consumer devices and websites were often used by technically literate power-users, for whom ease of use and aesthetics were not the focus. However, the shift in emphasis from features to the user experience has been clear in recent years, with two prominent examples standing out.

In 2001, Apple’s iPod was released. It was not the first MP3 player on the market, nor did it offer as many features as its rivals. However, it quickly became the best-selling MP3 player on the market, a market it still dominates more than 10 years later.

In late 2006, Nintendo released the Wii games console. The new Nintendo console went on to outsell its competitors by approximately 50 percent.

Why did these two products achieve such success against very stiff competition? In both cases, an obvious differentiator was ease of use. They both utilized a simplified interface that made the devices accessible not only to a narrow target market of technophiles and gamers, but also to a broad age and gender range who were not attracted to their competitors’ products. But, was that really the deciding factor?

Change in Focus

At a developer event, an Apple representative said that the App Store changed where developers should focus their efforts. He noted that before the App Store, developers were probably putting about 90 percent of their effort into technical features and approximately 10 percent into design and user experience (if even that); after the App Store launched, however, users’ quality expectations of apps rose very quickly. He advised developers that if they were not putting more than 50 percent of their effort into the user experience, they should not expect their product to do well. The message was becoming clear: it’s not about the technology, it’s about how people use and experience the technology.

Defining the User Experience

I’ve used the term ‘user experience’ throughout this article, but what does it really mean? It is often used as an umbrella term for two main areas: usability and the actual user experience. Usability is concerned with the user’s ability to complete the task the product or website is designed for, and how many steps it takes to get it done. This is very functional, and it can be quantified in measures such as time taken or clicks required. User experience (UX) concerns itself with whether the user enjoyed doing the task. UX is the more important of the two, since usability is essentially a yes/no proposition: users expect features to work, but the precise details of how they work are what turn them into long-term customers. UX is how user loyalty is generated.
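As an illustration, the two usability measures just mentioned can be computed from a session log; the event schema below is invented for the example:

```python
# Hypothetical session log for one task attempt (schema invented for
# illustration): timestamps in seconds plus an event type.
events = [
    {"t": 0.0, "type": "task_start"},
    {"t": 2.4, "type": "click"},
    {"t": 5.1, "type": "click"},
    {"t": 6.8, "type": "task_complete"},
]

clicks = sum(1 for e in events if e["type"] == "click")  # clicks required
duration = events[-1]["t"] - events[0]["t"]              # time taken (s)
print(f"{clicks} clicks, {duration:.1f}s to complete")   # 2 clicks, 6.8s
```

Measuring enjoyment (the UX half) has no equally simple formula, which is exactly the point of the distinction.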

Industry Comparisons, Usability and UX

The web and video game industries have understood for a while now that building a website or game that technically works is not good enough; the experience must also be smooth, clean and enjoyable. There is a lot of competition vying for a consumer’s business, and your product has to be better than the next guy’s.

Usability has proven to be extremely important in the development of websites. Many usability firms exist, and they evaluate how users interact with websites with the aim of refining the experience. Even seemingly trivial changes can bring massive increases in revenue. One high-profile case is that of the $300 million button. A large online retailer had a sign-up form that asked for a customer’s e-mail address and password before entering the site. This information was also requested at checkout, so it was not essential at the start. Asking users to sign up before buying created enough initial resistance that some users went elsewhere. Once mandatory sign-up was made optional, the number of customers making purchases increased by 45 percent, leading to an extra $15 million in revenue in the first month and an extra $300 million over the course of the year. The online retailer was not even aware there was an issue; it had simply not put enough effort into understanding its users to realize that a simple tweak to its website could offer users a better experience and dramatically increase profits.
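The arithmetic implied by those figures can be checked directly (a hypothetical simplification: it assumes revenue scales one-for-one with the number of purchasing customers):

```python
# Rough check of the $300 million button figures cited above.
uplift = 0.45               # 45% more customers completed purchases
extra_first_month = 15e6    # USD, extra revenue in the first month

# Implied pre-change monthly revenue under the simplification above:
baseline_monthly = extra_first_month / uplift
print(f"Implied baseline: ${baseline_monthly / 1e6:.1f}M per month")  # ~$33.3M

# Note: 12 x $15M is only $180M, so the quoted $300M annual gain implies
# the monthly benefit grew over the year rather than staying flat.
```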

For video games, however, the focus is more on UX. In gaming it is not about how easy it is to complete a task; often, in fact, the more difficult the better. It is all about the enjoyment the player experiences along the journey.

The New World of UX Testing

So why did the iPod and the Wii outsell their competitors? Usability was certainly an important factor, but UX was likely the key.

To understand how someone feels while using a product, when they are enjoying it and when they are not, a new crop of high-end services built around biometric testing is becoming available. Biometric testing uses psycho-physiological sensors that measure signals such as galvanic skin response (GSR), heart rate, and skin temperature to ‘read’ the player experience on a second-by-second basis. Eye tracking can also show where the user is focusing at any given moment. These approaches complement traditional testing and development methods and offer unique insight into what users are really experiencing, rather than what they tell you they are experiencing.
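To make the idea of second-by-second analysis concrete, here is a minimal Python sketch that flags GSR spikes in a play session and matches them to logged gameplay events. The data, threshold, and event log are all synthetic and hypothetical, not Testronic’s actual method:

```python
# Illustrative sketch: find moments where galvanic skin response (GSR)
# rises more than one standard deviation above the session mean, then
# look up what was happening in the game at that second.
from statistics import mean, stdev

# One synthetic GSR sample per second (microsiemens)
gsr = [2.1, 2.0, 2.2, 2.1, 2.0, 3.4, 3.6, 2.3, 2.1, 2.2, 2.1, 3.8, 2.2]
events = {5: "boss appears", 11: "player dies"}  # second -> logged event

mu, sigma = mean(gsr), stdev(gsr)
spikes = [t for t, v in enumerate(gsr) if (v - mu) / sigma > 1.0]

for t in spikes:
    print(f"t={t}s: GSR spike ({gsr[t]} uS), event: {events.get(t, 'none logged')}")
```

Real services fuse several signal streams and use far more sophisticated models, but the core idea is the same: align physiological peaks with the moments in the experience that caused them.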


How does the home entertainment industry create the next iPod or Wii? With online digital stores becoming the norm for searching, accessing, and interacting with content, both usability and UX will have key roles to play. Emerging products seeking to establish a foothold in a market with free access points and significant, established competition must be better, smoother and more enjoyable to use. Experience shows that even when content is available for free, a paid product can rise above it as long as it offers a superior experience. By employing usability and UX testing to understand what users want, experiences that consumers will love can be designed around those expectations, allowing developers to build loyal, long-term relationships. At Testronic Labs, we believe that investing effort in understanding users in this way will benefit everyone and help ensure the healthy proliferation of emerging home entertainment platforms and products well into the future.

About the Author

As Chief Executive Officer of Testronic Labs, a global third-party Quality Assurance and Testing Services company with worldwide facilities, Hallen oversees global operations and the execution of Testronic Labs’ strategy in emerging markets. Prior to joining Testronic, he was VP of North American Operations at Lightwork, and oversaw business development for Digital Media Services and DVD Authoring at Lightning Media. Hallen currently serves as a board member of the Hollywood Post Alliance (HPA) and as an Advisory Board Member of MESA.

M&E Journal: Using Automatic Content Recognition to Benefit the Fragmented Media Landscape

March 18, 2013 · Posted in Featured Blog · Comments Off 

By Emmanuel Josserand

We are all living in an increasingly fragmented world, and the media world is no different. Consumers are demanding that their viewing experience expand from the primary screen to smartphones and tablets, while the primary screen itself is becoming more complex with the proliferation of smart TVs. Content distributors therefore need to make their content compatible with myriad formats. The conundrum, of course, is that consumers expect more at the same price, so content creators and distributors must spend more to generate the same revenue.

Fig. 1: 2nd screen sync using audio watermarking

Automatic Content Recognition (ACR) helps bridge this disconnect at both ends of the broadcasting experience, offering consumers deeper immersion and interaction with television programming and advertising, while providing rights holders and broadcasters a heightened level of business intelligence through highly granular tracking of how viewers interact with content. ACR, powered by either watermarking or fingerprinting, allows dynamic and seamless interlinking of devices, viewers, content and applications. It fuses the viewing experience across multiple screens for the viewer, while closing the delivery-feedback loop for the content owner and distributor, making the whole process more efficient.

In the multi-screen environment, ACR is a tool that gives a smart device – such as a smartphone or tablet – the ability to become “content-aware.” This awareness allows the smart device to recognize what is being watched on the primary TV screen without the need for direct input from the user. This automatic recognition can then be employed to trigger content on the 2nd screen device that is complementary to that of the primary screen. Television programs, films, advertisements, and other types of main-screen content can therefore extend to the viewer’s 2nd screen, creating an immersive multi-screen viewing experience without the need for the user to manually enter Web site addresses, or search for the relevant information on those sites.
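A toy sketch can make the fingerprinting flavor of ACR concrete. This is an illustrative simplification, not Civolution’s actual algorithm: real systems fingerprint acoustic features that are robust to microphone noise and compression, rather than hashing raw sample values, and match against indexes of millions of hours of content:

```python
# Toy fingerprint-based ACR: hash short windows of an audio stream and
# look them up in a reference index to identify the program, so the
# 2nd-screen app can trigger matching companion content.
import hashlib

def fingerprint(samples, window=4):
    """Hash each non-overlapping window of samples into a short fingerprint."""
    fps = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = ",".join(str(s) for s in samples[i:i + window])
        fps.append(hashlib.md5(chunk.encode()).hexdigest()[:8])
    return fps

# Reference index built offline from known broadcasts: fingerprint -> (program, offset)
reference_audio = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
index = {fp: ("Cooking Show", t * 4) for t, fp in enumerate(fingerprint(reference_audio))}

# The 2nd-screen app captures a short snippet via the device microphone:
captured = [5, 9, 2, 6]
match = index.get(fingerprint(captured)[0])
if match:
    program, offset = match
    print(f"Recognized '{program}' at offset {offset}; triggering companion content")
```

The watermarking approach works differently, embedding an imperceptible identifier directly in the audio or video at distribution time, but the application layer above it (recognize, then trigger) is the same.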

In the single-screen environment, ACR solutions can also be integrated into the chipsets of connected/Smart TVs and smart set-top boxes themselves to enable real-time content identification, and the triggering of events at the device level. As opposed to the multi-screen application described above, this single-screen enhancement enables the Smart TV or smart set-top box itself to become “content-aware,” and therefore offer a host of value-added features for the consumer directly on the primary screen of the TV itself.

Fig. 2: ACR integrated in SmartTVs

For content owners and distributors, along with the ever-growing number of companies involved in the development, delivery and monetization of content, ACR acts as a multi-faceted toolkit that can add a rich variety of new, commercially vital functions and features to these companies’ core operations. Advertisement triggering to the 2nd screen based on live TV content that is being broadcast is a key example. By automatically notifying application providers in real time of what content is airing on which channel, the service allows for the synchronization of value-added functionality such as content-specific background information, hyperlinks, and synchronized social newsfeeds, all within the developer’s 2nd screen or smart TV applications. The application provider can therefore offer users a more powerful and engaging TV-synchronized experience. In addition, such services enable application providers to work in close partnership with advertising agencies and brands to further monetize their application platforms.

With ACR-powered content-aware devices continually monitoring in real time what is being watched, broadcasters and content owners are able to track highly granular viewing habits, and identify detailed information as to where, when, for how long, and on what device content is being consumed. The implications of these detailed analytics are enormous and can provide a comprehensive range of benefits to both protect and enhance the business models and revenues of content owners and distributors.

While much of this content identification technology has until now been focused on enforcing copyright—or ensuring that a video asset appears when and where it is supposed to—in the longer term, ACR provides a vital strategic and tactical tool that addresses the multi-screen environment in which today’s viewers consume content, while offering substantial benefits to everyone in the content value chain. Content-aware devices—be they the primary or secondary screen—with the ability to subtly and automatically drive viewer interactivity, provide an infinitely flexible springboard from which developers, content providers, brands and broadcasters can construct an ecosystem offering entirely new creative dimensions in which the viewer can be engaged, and the content owner and distributor informed.

Fig. 3: Content triggering using video fingerprinting from a broadcast monitoring network

About the Author
Emmanuel leads global marketing and communication activities for Civolution. Prior to Civolution, he was part of Teletrax, which in 2008 became the Media Intelligence arm of Civolution. He was previously Business Manager at digital imaging software company Arcsoft, where he helped set up its European offices. He has more than 15 years’ experience in various marketing, sales and business development roles.
