By Paul Sweeting
The NFL seems to be in a test pattern. On Monday, the league announced that it will make next season's match-up between the Buffalo Bills and the Jacksonville Jaguars available exclusively via the internet outside of the teams' home markets, rather than on national television. That was followed by an announcement that the league will suspend its local TV blackout rule for the entire 2015 season, allowing games to be shown in their local markets even if they are not sell-outs.
The league described both moves as tests, although what exactly is being tested in each case was left a bit vague.
The Bills-Jaguars game is a one-off, and a low-risk one at that. The game was set to be broadcast by the NFL's own NFL Network, so there were no pre-existing rights deals to renegotiate, and it involves two struggling teams with little national following in a game to be played in London and shown in the U.S. at 9:30 a.m. Eastern Time. Even if the test is a disaster, the damage will be limited.
By Paul Sweeting
"We are not in the business of collecting your data," Apple senior VP Eddy Cue declared in announcing the Apple Pay mobile payment system. "When you go to a physical location and use Apple Pay, Apple doesn't know what you bought, where you bought it, or how much you paid for it."
The line was clearly meant as a swipe at Google and other competitors in the mobile payments space, who do collect purchase data and use it in ways that can implicate users' privacy. But Apple's studied indifference to the details of purchase transactions is also central to Apple's strategy in launching Apple Pay.
When an iPhone user adds a credit card to her Apple Pay account, the card information is encrypted by the device and sent to Apple's servers, where it is decrypted to identify the issuing bank, and then forwarded to the bank in re-encrypted form.
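That enrollment handoff can be sketched in a few lines. This is an illustrative toy, not Apple's actual protocol: a simple XOR cipher stands in for real public-key cryptography, and the key names, BIN lookup and function names are all hypothetical.

```python
# Toy sketch of the enrollment flow: encrypt on the device, decrypt on
# Apple's server only long enough to identify the issuing bank, then
# re-encrypt for that bank. XOR stands in for real asymmetric crypto.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is symmetric: applying it twice with the same key round-trips.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

APPLE_KEY = b"apple-server-key"         # hypothetical device/server key
BANK_KEYS = {"411111": b"bank-of-foo"}  # issuer key, looked up by BIN prefix

def device_enroll(card_number: str) -> bytes:
    """On-device step: encrypt the card before it leaves the phone."""
    return xor_cipher(card_number.encode(), APPLE_KEY)

def apple_server_forward(blob: bytes) -> tuple[str, bytes]:
    """Server step: decrypt, read the BIN digits to find the issuer,
    re-encrypt with that bank's key, forward. Nothing is stored."""
    card = xor_cipher(blob, APPLE_KEY).decode()
    bin_prefix = card[:6]  # issuer-identifying digits
    return bin_prefix, xor_cipher(card.encode(), BANK_KEYS[bin_prefix])

blob = device_enroll("4111111111111111")
issuer, forwarded = apple_server_forward(blob)
# Only the bank, holding its own key, can recover the card number:
assert xor_cipher(forwarded, BANK_KEYS[issuer]).decode() == "4111111111111111"
```

The point of the structure, as the quote above suggests, is that the plaintext card number exists on Apple's server only transiently, never in a purchase record.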
By Paul Sweeting
There are plenty of live-streaming platforms out there for anyone who wants to set up their own broadcast on the cheap. But few have caught on as quickly or generated as much buzz as Meerkat, the barely month-old streaming app that rides atop Twitter.
Or at least it did until Friday, when Twitter abruptly cut off Meerkat's ability to easily access users' lists of followers to automatically alert them when a new "Meerkast" is in progress. The move was neither unprecedented for Twitter, which has never been overly developer-friendly, nor particularly surprising insofar as Twitter announced its acquisition of Periscope, a competing live-streaming app, reportedly for $100 million, on the very day it shut the door on Meerkat.
So much for platform neutrality.
It's not hard to see why Twitter would want to reserve the opportunity represented by Meerkat for itself, however. It has the potential to become a very powerful platform in its own right.
Live video streaming is not a new technology. But the Meerkat app got a lot of things about it right. The app is launched and streams are initiated from a smartphone (so far iOS-only, though an Android version is in the works) and, like Snapchat photos, the streams are ephemeral. There is no pausing, rewinding or sharing during a Meerkast (although the originator can download a video of the stream).
By Paul Sweeting
The full text of the FCC's open internet order has now been released, along with 305 additional pages of exegetical elaboration and 79 pages of formal dissents from the two Republican commissioners.
From an OTT perspective, there isn't much in the full text that wasn't already known from what the FCC released last month when it voted to approve the rules: The order's "bright-line" rules against blocking, throttling and paid prioritization do not apply to commercial interconnection arrangements. However, the FCC will consider complaints regarding those arrangements and will take (unspecified) enforcement action if an ISP's behavior is determined to violate the order's "general conduct standard," prohibiting actions that "unreasonably" interfere with or damage consumers or edge providers.
By Paul Sweeting
HBO just can't quit the bundle. With HBO Now, its new over-the-top streaming service, the network for the first time is making its content available to stream without a pay-TV subscription. But HBO still hopes to sell it as part of a bundle. The only differences are the other components of the bundle and the identity of the bundlers.
At launch, HBO Now will be sold exclusively by Apple and available on Apple devices only. According to HBO's FAQ, "you can subscribe to HBO NOW℠ using your iTunes account. Customers can access HBO NOW℠ by going to HBONOW.com, through Apple TV® or by downloading the HBO NOW℠ app in the Apple App Store®." Apple and HBO will then share customer support duties.
After a three-month Apple exclusive, HBO will make the service available to other digital distributors, such as Amazon and Roku, presumably on terms similar to Apple's, with the distributor doing most of the heavy sales lifting. But the network is also very much hoping to persuade its current cable-operator affiliates to bundle HBO Now with their broadband-only offerings, so far with little success.
By Paul Sweeting
Nearly a decade after Netflix went over-the-top, at least a full decade after the launch of YouTube, and more than two decades since Bruce Springsteen first sang of having "57 channels and nothin' on," the video industry, which we used to call the TV industry, is still wrestling with the problem of content discovery.
If anything, the problem is getting worse, not better, as the volume of programming and the number of program sources are both growing rapidly thanks to the new digital platforms.
Heroic efforts have been made over the years to tame the flood, using search technology, algorithmic recommendation engines and various other big-data strategies.
Rovi's Fan TV, for instance, acquired late last year and reintroduced at CES in January, uses voice-activated semantic search and leverages Rovi's vast trove of video metadata to generate recommendations or locate specific titles in response to natural-language queries.
By Paul Sweeting
When a consumer's OTT video stream starts rebuffering, or suffers packet losses resulting in degraded quality, it's often hard to know where to direct blame. The problem is typically caused by congestion somewhere between the content's originating server and the consumer's receiving device.
But exactly where in the chain of transit that congestion is occurring, and more importantly who is responsible and why, can be difficult even for engineers, and virtually impossible for consumers, to ascertain.
Back when it appeared the FCC was poised to classify interconnection arrangements between last-mile ISPs and third-party transit and content providers as a new, distinct type of Title II service, the question of liability for congestion in the chain of transit suddenly became urgent for those involved in wholesale traffic exchanges.
Fearing the new classification would leave them at a disadvantage in negotiating interconnection agreements with content delivery networks (CDNs) and other transit providers, and worried they'd be blamed for problems occurring elsewhere in the transit chain, ISPs rushed to the FCC to insist that any new rules regarding traffic exchanges cover both parties to the exchange.
By Paul Sweeting
Don't look now, OTT fans, but the net neutrality rules expected to be enacted Thursday by the FCC may turn out to be not as OTT-friendly as they originally appeared.
When FCC chairman Tom Wheeler unveiled his "fact sheet" on the upcoming rules on Feb. 4, it looked as if the commission was poised to adopt the "strong" version of net neutrality pushed by Netflix and others. According to the fact sheet, the rules would treat interconnection arrangements between ISPs and third-party edge providers as a Title II service subject to the same "just and reasonable" standard that will apply to ISPs' management of their last-mile networks.
Since then, however, as noted in a previous post here, even some net neutrality advocates have raised questions about the legal and statutory grounds for extending Title II to interconnection arrangements. In a letter to the commission dated Feb. 11, Free Press policy director Matthew Wood warned that the interconnection arrangements were unlikely to qualify as Title II services as defined by the Communications Act, creating an opening for a legal challenge to the new rules.
Democratic FCC commissioner Mignon Clyburn is reportedly also having doubts about applying Title II to interconnection. According to a report Tuesday by the Capitol Hill newspaper The Hill, Clyburn is seeking eleventh-hour changes to the proposed rules, including dropping plans to classify interconnection as a distinct Title II service.
By Paul Sweeting
The FCC this week is expected to approve, on a party-line vote, chairman Tom Wheeler's long-gestating plan to impose new net neutrality rules by reclassifying internet access as a telecommunications service under Title II of the Communications Act, setting in motion a process by which the world will finally get to see the full text of the 308-page Memorandum and Order and begin fighting, almost certainly in court, over its particulars.
One thing that apparently will not be in the order, however, is any bright-line rule banning so-called "zero-rated" data plans offered by wireless operators and ISPs under which particular applications are not counted toward a user's monthly data cap.
"We do not take a position on zero-rating," the FCC's special counsel for external affairs, Gigi Sohn, confirmed last week on the C-SPAN program The Communicators. Instead, she said, the agency would review complaints about zero-rated services on a "case-by-case basis" to determine whether they harmed consumers.
That has many OTT providers, start-ups and VCs worried that wireless carriers and ISPs will rush to embrace zero-rated data plans, producing the same sort of anti-competitive and market-distorting effects as paid prioritization, which the new rules do explicitly ban.
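Mechanically, zero-rating is just selective metering: traffic from a sponsored app is simply excluded when usage is tallied against the cap. A toy sketch of that accounting (the app names, cap and numbers are all hypothetical):

```python
# Toy illustration of zero-rated data-cap accounting: traffic from the
# carrier's sponsored app is not counted toward the monthly cap, while
# competing services burn through the subscriber's allowance.

CAP_GB = 5.0
ZERO_RATED = {"carrier_video"}  # hypothetical sponsored app

def metered_usage(sessions):
    """sessions: list of (app_name, gigabytes) tuples.
    Returns only the traffic that counts against the cap."""
    return sum(gb for app, gb in sessions if app not in ZERO_RATED)

usage = [("carrier_video", 3.2), ("netflix", 2.1), ("web", 0.9)]
counted = metered_usage(usage)
print(counted)  # 3 GB counted, despite 6.2 GB actually transferred
```

The competitive worry described above falls out of the arithmetic: the sponsored service is effectively free to watch, while its rivals consume the cap.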
By Martin Porter
I don't know who else to write to… so considering that this Sunday is your big day of the year and ultimately your show is the marketing force behind my story… you're it. I have a confession to make because I have sinned.
You, better than anyone, know that it is screener season, and we all know what that means. There are discs of all those great movies everyone has been meaning to see circulating at parties and among friends, creating a virtual industry underground among those who should know better but simply can't resist watching one of your Academy Awards contenders for free.
My recent failing involved the Sony Pictures Classics picture "Whiplash," which appealed to my childhood obsession with jazz drummer Buddy Rich. It was also one of the many movies still on my pre-Academy Awards broadcast must-see list. I actually paid to view the movie in my hotel room during a recent vacation, which was cut short when my car service to the airport arrived too soon. I never saw it to the end, and I was obsessed with seeing it through (take note of an opportunity here, UltraViolet).
Unfortunately, despite a tease on VUDU that it was coming soon, the movie was nowhere to be found legally on the web. (The fact that I never considered checking out Fandango to see it in the theater is as much a reflection of my travel schedule as it is the state of theatrical affairs.) I'm at least ethical enough to steer clear of the bootleg sites.
But then, by happenstance, the screener surfaced during one of those all-too-common industry chats that have been taking place over the past few months among those somehow connected (albeit by six degrees) to an Academy-voting member.
By Paul Sweeting
As ISPs, both large and small, gear up to sue the FCC over its forthcoming net neutrality order, even strong supporters of net neutrality have begun pointing to potential legal problems with the proposal outlined by FCC chairman Tom Wheeler earlier this month. One of the biggest potential problems, as far as OTT providers are concerned, was flagged by Free Press policy director Matthew Wood.
As described in the fact sheet distributed by the FCC, the order will treat the "service" ISPs provide to OTT services and other edge providers as a Title II service, just as it does the internet access services ISPs provide to subscribers, giving the commission the authority to review interconnection agreements between OTT services and ISPs and potentially declare them not to be "just and reasonable" as required by Title II:
By Paul Sweeting
It's hard to remember now, but Stewart took over anchoring duties at the Daily Show nearly 17 years ago, more than six years before YouTube was founded. Yet they seemed made for each other. The show's easily chunkable format was ideal for the atomized milieu of YouTube, especially in the early days when YouTube uploads were tightly restricted by length, and the website quickly became the Daily Show's second time slot, for better or worse.
Even today, after an epic legal battle between Comedy Central's parent company, Viacom, and YouTube, the online platform remains a critical outlet for the Daily Show. As Peter Kafka noted on Re/Code, the Daily Show draws about a million viewers in its initial airing. But millions more see it on YouTube the next day on their laptops and smartphones, or at least the bits their friends alert them to via Twitter, Facebook and other social media channels.
Viacom's nearly decade-long litigation against YouTube for copyright infringement, in fact, was in large measure about the Daily Show, along with the Colbert Report, South Park and a few other properties. It was the unchecked, unauthorized uploading of clips from the Daily Show and the Colbert Report, as much as anything else, that spurred Viacom to launch its $1 billion lawsuit against YouTube (and its new parent, Google) in 2007.
By Paul Sweeting
Music subscription service Spotify last week hired Goldman Sachs to help it raise around $500 million at a valuation in the neighborhood of $7 billion. Private market analysts currently value the company at around $6 billion.
The new fundraising round likely pushes back any plans the company had for an IPO, no doubt disappointing some investors. But it buys the company some time before it has to focus on IPO prep as it gets ready to face its first real competition. According to a report by the usually well-sourced 9to5Mac, Apple is gearing up to relaunch a Beats-branded music streaming service this summer.
Rather than simply dropping a Beats app onto Apple devices, the report says Apple has been working on a deep integration of Beats technology and functionality into iOS, iTunes and Apple TV.
By Paul Sweeting
We're just at the dawn of the virtual MVPD era and we're already seeing signs of more market segmentation and product differentiation than with the current, facilities-based service provider model.
On the heels of Dish's breakthrough launch this week of its Sling TV service, Sony has begun to pull the curtain back a bit on its own virtual pay-TV service, PlayStation Vue, which is expected to launch by the end of the first quarter. GigaOM's Janko Roettgers got a sneak peek courtesy of a beta tester, including some screen shots of the UI, and it's clear the Sony service is a very different animal from Sling TV.
Unlike Sling TV's low-priced, slimmed-down bundle of a dozen channels built around ESPN, PlayStation Vue includes a nearly full load of broadcast and pay-TV networks (over 70, according to the list provided to GigaOM) along with catch-up VOD and cloud-based DVR functionality, and is likely to cost $60 to $80 a month, roughly the same as traditional cable or satellite service.
The difference in the bundles reflects the very different audience segments Dish and Sony are targeting as well as their different strategic goals. Sling TV is targeted at the 10 million or so U.S. households, many of them counted among the Millennials, who currently have broadband service but do not subscribe to pay-TV.
By Don Terry
Big Data… Hadoop… Data Lakes? Everywhere you turn there is a lot of industry buzz in the news about the value of "Big Data," and the potential for this exciting new technology.
Big Data may indeed be a buzzword, but if so it's a buzzword that can have a measurable and significant impact on a company's top and bottom lines.
At its core, the concept of Big Data is that of supporting executive decision-making with the most accurate, current, comprehensive and comprehensible presentation of all available information about a business. Unstructured data is doubling every year, per IDC, driven by mobile devices, gaming consoles, social media, the Internet of Things, second screens and digital "non-linear" television viewing. But why does this matter? The promise was that Big Data was going to cure cancer, make our lives easier and change our lives forever.
By Chuck Parker
It seems everywhere you look these days there is something about "the Cloud" in front of you. Twitter, LinkedIn, the tech press, and seemingly every press release you read has the various players in the Media and Entertainment industry describing what they can do for you in the cloud.
The promise of the cloud is BIG. At its most basic level, there is the opportunity for a company to turn its fixed CAPEX investments into variable (or "burstable") OPEX spend as required. For smaller companies and companies without legacy infrastructure, this is potentially the best way forward, so that their costs are tied directly to their revenue stream, whether those requirements are storage, transcoding or rendering for post-production and visual effects workflows.
For larger and more established companies, it is the opportunity to exceed current infrastructure capacity to take on that surprise project. It is also the opportunity to set an investment threshold where companies build to the "trough" rather than the "peak" of the inherent variability in the media industry's business season.
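The "build to the trough" argument is easy to see with back-of-the-envelope arithmetic. The sketch below uses entirely hypothetical numbers (node costs and monthly demand) to compare owning enough render-farm capacity for the busiest month against owning only trough capacity and renting the overflow from the cloud:

```python
# Hypothetical cost model for "build to the trough, burst to the cloud":
# owned nodes are paid for every month whether used or not (CAPEX),
# while burst cloud nodes are paid only for the months they are needed
# (OPEX), at a higher per-node rate.

monthly_demand = [40, 45, 60, 120, 180, 90, 50, 40, 55, 150, 70, 45]
owned_cost_per_node = 100  # assumed monthly cost of an owned node
cloud_cost_per_node = 180  # assumed monthly cost of a burst cloud node

def build_to_peak(demand):
    # Own enough nodes to cover the busiest month; pay for them all year.
    return max(demand) * owned_cost_per_node * len(demand)

def build_to_trough(demand, owned):
    # Own only `owned` nodes; rent the overflow from the cloud as needed.
    fixed = owned * owned_cost_per_node * len(demand)
    burst = sum(max(0, d - owned) * cloud_cost_per_node for d in demand)
    return fixed + burst

peak_cost = build_to_peak(monthly_demand)
trough_cost = build_to_trough(monthly_demand, owned=min(monthly_demand))
print(peak_cost, trough_cost)  # 216000 vs. 131700 under these assumptions
```

Even though the cloud node is assumed to cost nearly twice as much per month, bursting wins because peak capacity sits idle most of the year; the crossover point depends entirely on how spiky the demand curve is.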
But this isn't a new promise in the IT world. Back in the early 2000s, this promise was held out to the largest companies in the guise of "outsourcing." What's changed now? First, CEOs and CFOs in the M&E industry are now well educated about the "cloud" and understand enough to know that their businesses should at least be experimenting with workflows in the cloud.
Additionally, the "burstable" nature of the cloud means that businesses can actually "test drive" new capabilities in their workflows without significant investment or risk to their business. These two structural changes have resulted in a proliferation of "back office" workflows moving to the cloud across industries. SaaS (the ultimate cloud approach, where both the application and the infrastructure are "by the drink") has been a driving force here, allowing companies to put their expense systems, HR systems, and even sales and CRM systems into the cloud (think SalesForce.com) with great success for the companies deploying them.
But putting production systems into the cloud has been elusive. These applications are both complex and customized to the point where SaaS is not really an option. Even when two companies are using the same rendering application for their workflows, they are often managing their compute and storage in entirely different ways. So the industry has coined a new term to attempt to educate the CxO suite on how to approach this landscape: IaaS, or "Infrastructure as a Service."
At its most basic level, when you retain control of the application but lean on the cloud for storage or compute resources, the IaaS term describes your approach to leveraging the cloud. But while this approach has better economics and risk models than the "outsourced" approach of 10 years ago, it isn't exploding at the rate you would expect given its promise of "on demand" and "less investment."
So what is holding the M&E industry back from investing in the cloud for its primary workflows?
Two things: security and connectivity.
Security. While every major cloud provider goes to some length to describe the security protocols that protect customer data on its servers, we still hear horror stories every day about large companies being hacked for their valuable resources (Target and Home Depot are the most recent infamous incidents). In our specialized industry, all of us know that a single breach of pre-release materials can be the death of a company, and no amount of promised encryption, whether from established or emerging cloud platforms, can alleviate those fears.
Further, if the project you are working on isn't your IP, you are likely already bound by contract to use certain security measures that preclude using "public" cloud infrastructure for your workflows. The ability to audit security processes and posture is important to trusting partners in the service chain and remains a requirement for the most important content workflows.
Connectivity. The challenge of delivering the promise of "burstable," "on demand" storage and compute power to these resource-intensive applications comes down to the internet's age-old axiom: sustainable bandwidth. While there are plenty of companies that can drop a multi-gig connection to a cloud provider, few have the expertise to connect your application to the cloud resources right where they need to be and integrate with your existing network and workflow, taking account of aspects like low-latency requirements.
Even then, just finding a "large pipe" for your data doesn't complete the business model: if you cannot get your bandwidth to be as "burstable" as your storage and compute power, the investment model for the cloud falls apart.
At Sohonet, we believe the key to unlocking the M&E industry's "cloud potential" is the ability to offer studios, post-production houses and visual effects companies options for getting their applications connected to public and private cloud infrastructure in a manner that meets their low-latency and security requirements while still supporting the "on demand" business model and their ability to "burst" into the cloud for production.
We believe that the M&E industry will embrace a mix of three approaches to meet its production workflow and business model requirements. Inherent to all three approaches are unlimited or inexpensive egress (essential for the unpredictable production process), improved security posture, and, most critically, access to 24/7 support resources that understand the industry's unique workflow requirements.
- Low-cost access to generic compute and storage resources (public cloud) coupled with sustained low-latency bandwidth that includes unlimited egress.
- Access to application-specific low-latency and/or industry standard security approaches (private cloud) for storage and compute resources coupled with sustained low-latency bandwidth that includes unlimited egress.
- High-speed, burstable connectivity to major cloud providers where support for the application and security is already "in-house" and the only missing component is the "burstable" bandwidth directly into the resource center, providing the lowest possible latency and inexpensive egress while still improving the security posture.
We believe that access to cloud resources is critical to the industry's progression along ever-increasing data storage and compute requirements, as 4K workflows begin their progression toward 8K and high-dynamic-range workflows and as 4K consumption becomes mainstream in consumers' homes. As the trusted communications partner for the M&E industry, Sohonet is committed to providing the same Fast, Flexible and Phenomenal customer service that has built our brand and reputation over the past 15 years, and we believe that delivering on the promise of "Connected Cloud Services" is critical to our customers' future.
By Alan Wolk
For all the debate around who should be in charge of second screen and social TV efforts, one thing is becoming very clear: the key to success rests with the showrunners.
That's because when the showrunner is involved, along with the actors and the writing staff, the second screen experience seems like an actual part of the show, not some sort of bolted-on afterthought. In fact, a recent study from Twitter, Fox and the Advertising Research Foundation revealed that 40% of viewers preferred to see tweets from cast members, versus 18% who wanted to see tweets from the official show handle.
This stands to reason on many levels: the type of viewer who is fan enough to want to tweet about a show is the type of viewer who's likely formed some sort of connection with the actors and wants to read their tweets…
As my kids get older, I've had to cede control of the car radio. One result? Thanks to the forced waning of my NPR habit, I'm much less interesting at cocktail parties than I care to be. Another consequence? Trying to smoothly navigate radio programming that doesn't pass my parental, shall we say, scrutiny, nor assuage the pop-cultural tastes of the wannabe teenager.
Satellite radio, long dominated by SiriusXM Radio, brought peace of mind to parents everywhere by offering up more programming options in the car than we ever thought possible. But, like the living room before it, the car is becoming the latest field on which today's digital media game plays out.
New streaming music providers offered multi-platform music experiences that were highly personalized and mobile, and that threatened to make in-car satellite services niche. (Register here for the Sept. 23 webinar to learn how SiriusXM navigated new channel demands with analytics, and explore how SiriusXM is leveraging customer data integration, behavioral analytics, and real-time interaction to build deeper relationships with its subscribers.)
With more than 25 million subscribers, SiriusXM didn't seem to be in a bad position. Still, with increasing competition from streaming music entrants like Pandora and Spotify, SiriusXM Radio needed a strategy to future-proof its business and meet subscribers' demands for content anywhere, at any time.
Like many companies that don't fully realize the treasure that is customer and marketing data, SiriusXM had outsourced its marketing technology. When the time came to respond quickly to the shifting music market, a key first step was to bring its most valuable asset back in house. This marked the first step in a journey for SiriusXM Radio to develop and deploy next-generation marketing analytics that would allow it to reinvent its relationships with subscribers and prospects, and take ownership of its data.
Across media & entertainment and digital media, the expectations of consumers continue to shift. More than ever, audiences expect deeply personal messages on their platform of choice, at the right time. Yet many companies don't leverage their most valuable asset: detailed insight into audience behavior. And, in some cases, that data is left to third parties to manage.
If your business is ready to take control of marketing data to drive a more personalized, engaging customer experience, here's your chance to learn from the top. SiriusXM Chief Information Officer Bill Pratt is sharing his company's analytics journey in a live, one-hour webinar on September 23rd.
In this M&E Journal Digital Exclusive, Rakesh Nair of Dell discusses how the company is powering the compute-intensive needs of the Media & Entertainment industry with its products and technology for its creative partners. For example, the 45-second tracking shot of Paris that kicks off the Academy Award-winning animated film "Hugo" entailed some serious animation rendering.
Pixomondo, the visual effects company tasked with rendering the film, needed high-powered computing to support this complicated composite, which, for just that 45-second tracking shot, cost tens of thousands of dollars in power alone. It is not hard to imagine, then, that things like hardware performance, technology footprint, and heating and power efficiency play a major role in the ability of Pixomondo and similar companies to turn a profit.
2nd Screen Viewing Experiences: 73% of TV Everywhere views are on a second screen (ReelSeo, Feb. 6). 89% of video views on the BBC's iPlayer are VOD vs. live. Click the link to view the infographic.
If Facebook's new acquisition, Oculus Rift, sounds like something out of a science fiction movie, your gut isn't that far off. The virtual reality headset maker, snatched up for an astonishing $2 BILLION in a surprise move, is the ultimate in geek chic. The device, which can create rich, immersive virtual reality experiences for gamers and beyond, already has a devoted if pocket-protector-wearing fan base.
But what does it mean for Facebook? Aside from broad-brush comparisons to Apple and Google, companies which squarely straddle the software and hardware divide, there's a buzz in Hollywood that this puts Facebook squarely in the movie business. Huh? Some media analysts say we should think of Oculus this way: it's just another screen.
While that may be true in the long view, I'd argue that Facebook was ALREADY in the movie business, even without trying to out-Google Google Glass. And the current Facebook movie business doesn't demand another screen. Facebook, and its cousins Twitter and Pinterest, are the collective mouthpiece for audiences to share what they think about anything Hollywood puts on any screen. I'm talking about what you or I or our mothers (yes, it's true, your mother!) have to say about movies and television on social media.
Those comments or "likes" are untapped gold in Hollywood. They reveal your audience's interests in ways that box-office numbers still can't. With the right analytics tools and insight, you can mine this data to learn EXACTLY what members of your audience think. What they think about your movie. Your talent. Your second screen app. Your recommendations. Your levels of customer service, if you're in the subscriber business. I'd argue Facebook, Twitter and their social media cousins have more to teach Hollywood than Hollywood has to teach Hollywood. The trick is: is Hollywood ready and able to listen?
The ability to analyze social media and behavioral data, and, most importantly, to loop it back into bigger marketing and planning operations, is essential to making and monetizing content in the new M&E ecosystem. But the fact is, few studios and distributors are doing this well. Read this article to get a better sense of how social media analytics need to play into your content and analytics strategy.
Speaking of Hollywood â the industry converges in a matter of days at NAB.Â Check back here for your NAB wrap-up, insights and ideas.Â Until then, Sci-Fi friends and believers, may the force be with you.
It’s been another fast-paced week in the digital video and second screen industries. While the OTT video world is still reeling from the previous week’s announced Disney Movies Anywhere service (a serious threat to UltraViolet) and Marvel’s announcement of an exclusive output deal with Netflix (continuing to threaten HBO), second screen took a shot in the arm from the Oscars, and Roku mounted an attack on Chromecast. At a glance:
- “Watch ABC” did Second Screen for the Oscars “right”
- Ellen broke Twitter
- Roku announced their “streaming stick” device
- Dish struck a deal with Disney to delay commercial skips
- FreeWheel was acquired by Comcast
- The BBC announced the death of analog for its Channel 3 service
- Aereo lost its court battle in Salt Lake City and Denver
It's hard to imagine a technology company with more Media & Entertainment clout. Or are they an entertainment company with massive technology chops? Either way, Netflix's invention and dominance of the OTT market has revolutionized the way content is consumed. And now they've even successfully re-engineered the way that content is created. A true sign of the Netflix zeitgeist? They've inspired a new lexicon for how audiences watch content, with new language like "cord-cutting" and "binge watching."
As if you need proof that the language of Netflix is real, here are some hard facts. By the end of the first weekend following its Valentine's Day release, the second season of House of Cards was streamed in its ENTIRETY by more than 2% of Netflix subscribers. With more than 40 million subscribers worldwide, that means one million people binge-watched an entire season in a matter of days.
Is that kind of massive success a surprise for a show that wasn't even subjected to a real pilot process? Not for Netflix. Very little about how, when and where their audiences watch content is a mystery. That's because Netflix uses sophisticated analytics to evaluate a billion transactional events per day. Every nuance of audience interaction is mined to drive their business forward, from securing the best content at the best prices, to developing meaningful and targeted recommendations that keep their subscribers watching and wanting more. Netflix's analytic muscle is so strong that they attribute 75% of their streaming activity to recommendations.
And that's just the beginning of how Netflix uses data to drive their business. In a rare live webinar, Netflix analytics thought leader Kurt Brown will share how this Media & Entertainment pioneer is using analytics in the cloud to drive its business. To learn more or register for the March 18th webinar, click here.
2nd Screen had a tumultuous run-up to CES 2014, with the press continuing to be split between hype and disillusion. While we normally would have written and presented this update at CES, we decided to focus on releasing our research on monetization on behalf of our society members, to help them and their primary stakeholders (investors, customers, management) cut through the hype and the disillusionment and focus on clear examples of what is working. Ironically, the additional insight gained in the first few weeks of January has been invaluable with regard to both consolidation (Yahoo closing IntoNow) and M&A (Viggle buying Dijit, TiVo buying Digitalsmiths).
What a crazy week. As if it wasn't enough for NATPE to be taking place in Miami (with some great research and stats published about second screen), there was a ton of consolidation activity in our industry (Viggle acquiring Dijit, Yahoo shutting down IntoNow) and some rebranding by GetGlue. At the same time the 2nd Screen Society (S3) published a teaser on its new research about monetizing the second screen, and then Gigaom and TechCrunch wrote some pretty disparaging views, with Gigaom reverting to the salacious headline of "Social TV is Dead".
- NATPE. Chris Tribbey wrote up a pretty decent summary of the content creators’ panel during NATPE discussing the insights from the CEA/NATPE research, presenting some GREAT stats about second screen usage and more importantly, a strong view from content creators (“Show Creators See Second Screen as Permanent”).
- Yahoo’s IntoNow. Yahoo made a decision to shut down IntoNow, the synchronous enhanced viewing experience app it acquired only three years ago. I think two developments led to this decision by Yahoo: 1) it wasn’t a very engaging experience (too broad and shallow) and was likely not attracting a ton of consumer engagement, and 2) Yahoo’s Screen app is taking off and has cemented their view that focusing on engaging the consumer around the viewing experience was a more attractive monetization play. Let’s face it, Adam Cahan founded Auditude and IntoNow and is now Marissa’s right-hand man for all things mobile video at Yahoo; he didn’t do this without thinking it through.
- Digitalsmiths acquired by TiVo. What a great validation of how important Discovery is in the second screen ecosystem. Led by Ben Weinberger, Digitalsmiths has been quietly winning most of the MVPD operators in the U.S. with their personalization and recommendation platform. The cash purchase for $135M by TiVo is certainly validation that the space is valuable for investors, but more directly indicates that TiVo is going to keep moving in the direction of creating great viewing and companion experiences for the living room (their current experience on a smartphone and tablet is already amazing and getting better all the time, led by Tara Maitra and Evan Young).
- Dijit acquired by Viggle. Viggle is perhaps one of the most successful at monetizing the second screen companion experience (a short section in our research sheds light on their success). Dijit’s Nextguide is perhaps the most engaging consumer Discovery experience (yes, better than Fan), with significant broadcast partnerships on their tune-in “reminder button” feature. I am convinced that with Jeremy Toeman leading the UI/UX and Greg Consiglio leading the monetization, this marriage will be a happy one for their shareholders, their customers (brands and TV networks) and consumers.
- The Grammys. Why is this important? I am sure a ton of stats will come out this week about how many tweets, etc., were pushed during the broadcast. But did you see that Chromecast commercial? Somehow Google managed to create the ultimate Discovery and Control powered second screen experience, stealing that opportunity from Apple, Netflix and Samsung. They have not only created a pervasive and passive experience that seamlessly allows consumers to “cast” their viewing experience from their second screen to the first, but by using the DIAL protocol, the second screen is then freed up for a companion experience (or synchronous advertising – see our research). The evidence of the commercial during the Grammys means they are SERIOUS about being successful with their $35 dongle. Apple, Sony, Samsung and Roku should take heed.
- GetGlue. What does that mean to you? It launched many, many moons ago as an attempt to create a social network around your viewing (and reading and wine drinking) habits, letting consumers check in to a show and share with their Facebook or Twitter friends. i.TV bought them last fall and, no surprise, has decided to re-brand them into something that speaks to the opportunity Shazam is busy uncovering: tvtag. Despite Gigaom’s view on this, I think this is positive in that it means the i.TV management recognizes the opportunity (engagement with the consumer at specific points of the viewing experience) and the threat (Twitter is chasing this and so is Facebook). Will they be successful? Who knows, but consumer behavior will continue regardless.
- Shazam. Perhaps more interestingly, Shazam is making a big push during the Super Bowl this year to see if its enormous growth in active users can move the needle on advertising and consumer engagement during the world’s largest live viewing event. While I had been skeptical over the previous 18 months, their new CEO Rich Riley (who joined in April 2013) seems to have turned the ship in the right direction, racing towards a UX that both engages the consumer and provides a monetization platform. And a partnership with Facebook isn’t a bad idea either. Watch this space (and that Jaguar commercial).
- Social TV is dead. Hmm. Did you read the article? First of all, yes, I agree that the gimmicky concept of social (badges, check-ins) is challenged, but keep a few things in mind: 1) second screen experiences can typically be broken into 5 segments, only one of which is the sharing or social aspect. 2) All of the hype from Gigaom and TechCrunch in the last 3 months has been that the Social TV battle is down to Facebook vs. Twitter, neither of which is going anywhere or walking away from TV. 3) Read the last 3 paragraphs and you will see both confirmation that they still believe in point 2 AND that the right engaging experience still needs to be developed, confirmation of the fundamentals above (that the consumer behavior will continue despite the poor UX). Conclusion: a salacious headline that certainly made MANY people read the article.
It's a sight not all that unfamiliar to new parents: an ashen, red-eyed baby, shrieking uncontrollably and spewing bile in its path. In fact, come on over to the Quinn household, and you can witness the excitement first hand.
But if you've been on any social network lately, and you haven't been hiding under a rock, you've seen the infamous "devil baby". This little hellion has amassed more than 36 million views on YouTube since taking the internet by storm. And while the antics may not surprise the average new mother, this baby is no ordinary kid.
Devil Baby was an inside Hollywood prank, the marketing brainchild of 20th Century Fox in anticipation of their new horror flick called Devil's Due. But more than being just a trick, Devil Baby is a digital marketing phenomenon, revealing new data-driven marketing table stakes for today's Media & Entertainment market.
Here's what you can learn from Devil Baby:
- To Go Big, You Need To Go Viral – There is no surefire recipe for viral success. But one thing is certain: big data analytics can increase your odds. Take the dominant video network, Machinima, for example. Their business is ensuring the rapid and massive uptake of content across a wide swath of users, and they do that not only by creating awesome content, but by using behavioral analytics to identify key networks, influencers and consumers ripe for that experience. (You can also check out our exclusive Machinima White Paper here!)
- Share of Voice is Great; Share of Wallet is Better – You know a meme has hit the mainstream when my mother makes a comment about it. But buzz isn't enough in today's competitive content landscape. It's about the bottom line. Content creators, studios and distributors need to be able to use that buzz to predict performance and drive revenue. Being able to tap into, analyze and act on the ocean of big data, including social sentiment and network analysis, is a key factor in the age of the Connected Consumer.
Data-driven marketing, and execution through an end-to-end integrated marketing strategy, doesn't need to be as painful as an exorcism. But it will take the right tools and know-how, driven deep into our marketing organizations. Devil Baby is one of many examples we'll see of today's marketing revolution being driven by big data.
Speaking of red-eyed hellions, nap time is over… and my second shift is calling.
Second screen, social media and companion applications are all high on the agenda of executives in the media and technology industries. Reflecting this, all major TV and technology conferences in 2013 (CES, NAB, IBC and MIP) had several sessions dedicated to second screen. But second screen, while proven as a reality of consumer behavior, is not yet widely seen as a revenue driver. Indeed, the reality of the second screen phenomenon is accepted, as proven by the continuous flow of statistics showing that viewers use another screen in front of their TV (one of the latest being Nielsen's finding that 75% of smartphone and tablet users engage with second-screen content more than once a month as they watch TV). Another proof of the generalization of second screen is the multiplication of companion screen applications: over the course of 2013 they became widespread in new geographies including the Middle East, Eastern Europe and Latin America, where they had little presence only 12 months before. Comparing the space with 2012, it is clear that no TV player can ignore it. Even more striking, the players behind some of the most successful apps are large and well established: Peel now has 40m+ downloads, mostly through a global partnership with Samsung; Apple bought Matcha in August 2012; zeebox grew their partnerships with Sky, Comcast (NBCU) and Foxtel, while DirecTV acquired a share of i.TV (which bought GetGlue at the end of 2013); Viggle has a longstanding partnership with DirecTV; Comcast has launched "SEE iT" with Twitter; and the Xbox SmartGlass app was downloaded more than 17m times. Despite this popularity and the presence of the largest players, few industry executives dare to speak openly about monetization of second screen applications, and only a small percentage of 3rd-party app providers have made their progress public.
There may be good reason for the industry stalwarts to keep their progress private with commercial competition so tough, but the sceptics of course believe that is because no one has actually experienced much monetization success. So while many in the industry are wondering where the money is in second screen, next to nobody is ready to "show [you] the money".
The purpose of our research paper is to do exactly that: "show you the money". We review the various monetization strategies used by second screen companion and viewing applications and evaluate how these strategies work and which ones drive the most value. We also provide an evaluation of the second screen market size and review its main drivers. Finally, we review how Twitter, Microsoft, Samsung and other players not directly in the second screen ecosystem are planning to use the second screen to increase their revenue.
More importantly perhaps, we have taken the time to update our market sizing from last year in an effort to demonstrate where the large opportunities for monetization lie for players in the ecosystem.
Feel free to explore our research and infographics on our website, engage us on Twitter (@ChuckParkerTech, @S32Day), or meet us in person at Mobile World Congress (Feb 26th in Barcelona) or at NAB (April 6th in Las Vegas).
- Q4 2012. 35 million tablets sold in the U.S. alone during the Christmas rush, and significant social TV and 2nd Screen engagement growth in all scenarios.
- Q1 2013. Clear evidence of "t-commerce" from 2nd Screens and the continued growth of active zeebox and Viggle subscribers, bellwethers for the industry on consumer engagement in 2nd Screen.
- Q2 2013. Hyper-growth in mobile video viewing, especially in ad-supported, a key trend to observe for the potential of 2nd Screen monetization in converged experiences.
- Q3 2013. Continued viewing growth on mobile, strong revenue growth from enhanced 2nd Screen engagement apps, and the launch of Chromecast, an opportunity for any 2nd Screen app developer to give Discovery and 1st screen Control capabilities to their video viewing experience.
continues to be stronger where there are pervasive yet passive opportunities for engagement with the consumer.
- 1. Increased consumer engagement in the content. The majority of the investment in 2nd Screen companion and viewing experiences is coming from the content creators and distributors (primarily the TV networks). Creating a lift in engagement (i.e. viewing time) translates to increased revenue regardless of their monetization model.
- 2. Increased consumer engagement with the advertising brands. The vast majority of the content ecosystem focused on 2nd Screen monetizes its content through advertising in some form. As major brands place bets in this space, they are focused on metrics like "Cost per Touch" instead of impressions delivered (CPM). The brands crave interactivity and engagement, working to determine which consumers are interested enough to move forward in their purchase cycle.
- 3. Monetization itself. While major TV networks and brand advertisers can get comfortable with metrics that have a strong correlation to monetization, many of the 1st and 3rd party engagement developers depend on revenue coming in the door to support their investments, "where the rubber meets the road" so to speak, as actual payments for advertising, t-commerce and engagement come together in the 2nd Screen companion and viewing experiences.
By Colleen Quinn, Teradata Corporation
As Hollywood shifts into high gear around direct-to-consumer engagement, content creators and distributors are working fast to develop the know-how and analytic capabilities to execute. There's a lot to consider, especially for organizations that are new to the D2C fray.
Cut to Warner Bros., which is leading the charge among Hollywood studios in developing rich direct-to-consumer offerings, and the CRM efforts that make those offers successful. Michele Edelman, Warner Bros. Vice President, Direct-to-Consumer, opened the curtains on Warner Bros.' pioneering work in D2C at a recent Teradata webinar.
The virtual-standing-room-only crowd had a front-row seat as Edelman described the evolution of CRM and D2C at the studio. Warner Bros.' capabilities have expertly woven together best-in-class integrated marketing with a big data strategy that gives them a detailed understanding of each member of their audience.
Launched in 2009, the Warner Bros. CRM strategy boasts massive success where it counts: in the numbers. Any savvy digital marketer knows that benchmarks are critical. Without them, there's no real way to measure your success. So imagine Warner Bros.' excitement when they saw the powerful results driven by their new direct-to-consumer CRM programs: rapid, exponential improvements across all key marketing KPIs, including:
- 25% Email Open Rates
- 13% Click-through Rates
- Decreases in unsubscribe rates
Take a listen to the webinar replay to hear how Warner Bros. launched, refined and mastered their Direct-to-Consumer CRM and analytics strategy, featuring an extensive audience-driven Q&A.
And the big data conversation for Media & Entertainment didn't stop there! Industry thought leaders in advertising, cable, broadcasting and more convened in the Big Apple this week, as Teradata and UCLA Anderson reprised Create, Captivate and Engage – a big data analytics event with M&E in mind.
By Colleen Quinn, Teradata Corporation
It was the pen-stroke heard around Hollywood. CBS and Time Warner had (finally) reached agreement about retransmission fees. Viewers from coast to coast exhaled a collective sigh of relief and switched on pro football.
One term at issue? The big per-subscriber fee hike that CBS demanded, aiming to double their carriage fees over the 5-year term. While a huge boost in revenue is always welcome, CBS' negotiations hinged on a term that is much more interesting. They wanted to retain streaming rights.
Here's why. Increasingly, streaming rights are the gateway to commanding your future. With them, content owners can seek the best opportunities to fully monetize content across every channel. But, more importantly, streaming rights often pave the way for content owners into the direct-to-consumer fray.
Going D2C means more than just having content rights, though. For studios and distributors, it means developing a keen understanding of each member of your audience. It's about having the capabilities to deliver the right content, right message and right impact.
Lots of content creators are talking about this seismic shift in the business, but only a brave few are putting their collective money where their mouths are. There are trailblazers. Warner Bros. Entertainment is one of them. Among the first to build out a robust, start-up-like technical operations organization, Warner was also among the first to take the lead with UltraViolet.
Now, Warner Bros. Vice President of Marketing for Digital Distribution, Michele Edelman, offers a rare opportunity to listen via live webinar as she shares the studio's vision and insights for launching and leading industry-changing, direct-to-consumer capabilities.
It's rare that inside Hollywood can learn from inside Hollywood, but once in a great while it happens. Don't miss your chance to join in!
We have often discussed in this blog the 4 major feature sets of second screen (to Control, to Discover, to Enhance, to Share – relevant research linked here and here). We have also reviewed what Netflix was experimenting with for leveraging the 2nd Screen as a discovery and control device via DIAL (try opening Netflix on your iPhone while it is also running on your PS3; find the blog here). Finally, we have predicted what a DIAL-enabled world might look like, with its major backers (Netflix and YouTube) driving the protocol's acceptance into every new device launch since early 2013 (DIAL blog here, 10 predictions here).
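For readers curious about the mechanics, the discovery half of DIAL is lightweight: a client multicasts an SSDP search for the DIAL service type and collects responses from devices on the local network. Below is a minimal Python sketch of that step; the follow-up HTTP calls to a device's application endpoint, retries and response parsing are omitted for brevity.

```python
import socket

# SSDP M-SEARCH request for DIAL-capable devices (Chromecast, smart TVs, etc.).
# 239.255.255.250:1900 is the standard SSDP multicast address and port.
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:dial-multiscreen-org:service:dial:1",
    "", "",
])

def discover_dial_devices(timeout=3.0):
    """Broadcast an SSDP search and return (ip, raw_response) pairs."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))
    responses = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)
            responses.append((addr[0], data.decode(errors="ignore")))
    except socket.timeout:
        pass  # no more replies within the window
    finally:
        sock.close()
    return responses
```

Each response carries a LOCATION header pointing at the device's description; from there a second-screen app can launch or join a first-screen app via the DIAL REST service.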
Well, Chromecast is the incarnation of all those opportunities and, at the same time, evidence of where the industry will head with rapid adoption. While we have tried to tell the Smart TV industry that the best implementation for their platform is to be the launch pad for the stream, Chromecast demonstrates that use case outright.
Similar to an Apple experience, the packaging of the device is simple and clean. The small dongle comes with a power cord, a USB cable (an alternative for power) and an adapter in case your HDMI port is in a tight spot.
By Zane Vella, Founder and Chief Product Officer, Watchwith
Over the last decade a familiar battle cry of the digital media executive was "Anytime, Anywhere," meaning the promise of digital for the consumer was to watch "what you want, when you want it." And as we look around today, much of that future has arrived in the form of HBO Go, Netflix, Xbox and Xfinity, all popular on-demand services now available on tablets, phones, computers, game consoles, Blu-ray Disc players, and connected TVs. So what's next?
As the MESA readership is well aware, much of the traditional entertainment distribution business is an analytic and strategic exercise in windowing and differentiation. In short, this means extracting greater return through enforced scarcity or by delivering added value through one distribution channel or partner versus another. This article examines how and why time-based metadata is becoming a critical strategic asset for content owners, and how it enables new forms of windowing and differentiation across the digital distribution landscape.
A New Vocabulary
First, some definitions: "time-based metadata," a.k.a. "related content metadata," is descriptive information related to a particular scene, shot, or moment of a film or TV episode. Unlike traditional program metadata that defines general information applicable to an entire program, time-based metadata follows the heartbeat of the program content itself and includes a steady time-code or time-reference that refers to a relative time within the media asset. Fundamental examples of time-based metadata include which actors are currently onscreen, what music is playing, what locations are in the scene, and what featured products are on screen at any particular moment.
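As a concrete illustration, a single time-based metadata event can be modeled as a record carrying a time range, an event type, and a free-form payload. The field names below are illustrative only, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class MetadataEvent:
    start: float      # seconds from the start of the program
    end: float        # seconds from the start of the program
    event_type: str   # e.g. "actor", "music", "location", "product"
    payload: dict     # descriptive text, image URLs, links

# A tiny illustrative timeline for one program (names are made up).
events = [
    MetadataEvent(125.0, 140.5, "actor", {"name": "Jane Doe"}),
    MetadataEvent(130.0, 155.0, "music", {"title": "Main Theme"}),
]

def events_at(events, t):
    """Return every metadata event active at playback time t."""
    return [e for e in events if e.start <= t <= e.end]
```

A player or companion app would call `events_at` against the current playback position to know, at any moment, which actors are on screen or which song is playing.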
Within the realm of time-based metadata, there is also an important concept of "event types." For example, "actor," "music," "quiz," "poll," "behind the scenes video" and "production still" are all types of events or related content (which can also be thought of as layers) that are associated with particular moments in a program. Event types can be anything a content owner or producer desires that either adds value to a program or is related to the program.
One of the defining characteristics of time-based metadata is that it is information related to content which is abstracted from any particular visual presentation or consumer experience. This primarily means information in the form of text, images, and links to other Internet-based content or services.
Lastly, another key concept is "metadata syndication," or more simply, fine-grained control over which event types or layers of time-based metadata are made available to certain business partners, based on business rules such as time-window or geographic location. Technically speaking, metadata syndication is implemented via access credentials (a.k.a. API keys) that are provided to a distribution partner or to each application that consumes time-based metadata made available by a content owner.
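A rough sketch of how that might look in practice, with hypothetical API keys standing in for partner credentials (a real system would also enforce time-window and geographic rules):

```python
# Each partner's API key maps to the event-type layers it is licensed for.
# Key names and layer sets here are purely illustrative.
PARTNER_ENTITLEMENTS = {
    "key-sellthrough-partner": {"actor", "music", "quiz", "behind_the_scenes"},
    "key-rental-partner": {"actor"},
}

def syndicate(api_key, events):
    """Return only the metadata events this API key is entitled to consume."""
    allowed = PARTNER_ENTITLEMENTS.get(api_key, set())
    return [e for e in events if e["event_type"] in allowed]

# Sample time-based metadata events for one program.
events = [
    {"time": 125.0, "event_type": "actor", "name": "Jane Doe"},
    {"time": 130.0, "event_type": "quiz", "question": "Who directed this scene?"},
]
```

With this structure, the same master metadata asset can be delivered to every partner, while each partner's key determines how many layers of value-added content its consumers actually see.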
Foundational Digital Trends
Before returning to the discussion of windowing and differentiation, it is important to identify two broad overarching technical trends which are both transforming our industry and providing the foundation that time-based metadata strategies are built upon; first, the dominance of digital file based workflows, and second, the increased importance of more traditional program-level metadata in digital distribution operations (as opposed to the time-based data).
Until very recently, program assets were delivered to distribution partners via a broad range of technical means. Broadcast and pay TV exploitation relied primarily on satellite uplink, theatrical exploitation relied on physical delivery of 35mm prints, home entertainment (DVD and Blu-ray) exploitation was via tape formats (DLT) to manufacturing facilities, and digital exploitation (iTunes, Xbox, PS3) via file transfer. Within just the last few years, the economics, practicality and operational benefits of digital video workflows have elevated digital file transfer as the primary means of asset delivery across all distribution channels.
Once operating within such a digital file ecosystem, program-level metadata associated with those files becomes critical for inventory management, merchandising, fulfillment, pricing, royalty tracking and interoperability across various systems. These requirements have driven a great deal of innovation, and over the past several years, an enormous amount of ingenuity, intelligence, and dedication has gone into solving industry challenges around program-level metadata. While challenges and opportunities for efficiency remain, great progress is being made, particularly by industry organizations such as ISAN and EIDR.
Together, these two trends are an important indicator of the direction that the industry overall is heading, and form the basis of some logical conclusions: If digital file delivery persists or increases, content owners will increasingly need to provide their distribution partners with metadata around their assets, and different types of metadata will be required for different means and channels of exploitation. Metadata will increasingly become the means of delivering information to business partners throughout the entertainment production and distribution ecosystem.
Anytime, Anywhere, But Now What?
Thanks in large part to standardization of digital file formats and the hard work of many digital distribution operations teams, most large entertainment companies are now able to reliably deliver their audio and video assets across a wide range of distribution partners. There is at the same time, however, a definite and glaring absence of any unified or efficient way to enhance the consumer experience around that video or any standardized means to deliver value-added related content.
This means that while the industry has been successful with the first critical step of delivering program content to the consumer, there is a distinct lack of business tools or "levers" for distribution executives to efficiently create consumer demand for their digital assets. Unlike in DVD and Blu-ray, each distribution partner, such as iTunes or Xbox, has its own unique set of requirements for delivering value-added content (if at all), and promotional opportunities also require unique one-off asset production and expense.
This lack of a unified platform for creation and delivery of added-value content may be a significant contributor to decreased consumer interest in sell-through and ownership models.
Turning the Tables for Everyoneâs Benefit
Digital distribution executives not only lack a unified means of enhancement and promotion for program assets, they also operate in a highly fragmented landscape. Traditional cable, satellite and telco distribution partners have increasingly complex delivery requirements to fulfill their own evolving customer viewing habits. In addition to these MVPDs, a new wave of mobile and tablet applications, web video distribution, and OTT partners bring additional delivery requirements and new valuable ways to connect with the audience. No matter how well resourced a media or entertainment company might be, it is near impossible to keep up with every new digital distribution opportunity, and equally impossible to differentiate your program content from one distribution outlet to another.
The solution is to turn the tables, and for the content owner to offer each distribution partner a variable package of time-based related content metadata associated with each licensed program. This related content metadata becomes the key ingredient for each distribution partner to deliver a differentiated, value-added consumer experience to their end-user or consumer.
For example, electronic sell-through partners and ultimately the consumers that purchase movie and TV programs through them, can have access to extensive layers of value-added content, while rental partners and their consumers can be restricted to a more limited subset of metadata, and fewer layers of value-added content, if any.
In practice, this means that the consumer who purchases a film or TV program can enjoy a different, presumably higher value experience, than one who rents that same video asset. By extension, this also means that a subscription model could potentially emerge in which the content owner would provide the consumer with ongoing or evolving enhancements (active layers of engagement) with their favorite films or TV programs.
The Power of Metadata Syndication
This approach is extremely powerful for the content owner because it allows them to function much as they have traditionally operated. It becomes the content owner's responsibility to create the highest-value master asset possible, but now that asset is a combination of audio, video and time-based metadata. Individual distribution platform idiosyncrasies and presentation-layer requirements become the responsibility of the distribution partner, and the content owner can focus their attention and resources on delivering value to the consumer and marketing those benefits.
This approach also opens the door for content owners to focus on the ongoing interactive social and commerce services that may be connected to any scene or moment of their content and, with the right technology platform at their disposal, enables them to benefit from these additional layers of monetization in cooperation with their downstream distribution partners.
"Turning the tables" through metadata syndication is also powerful because it challenges distributors to innovate and compete with each other to deliver the best consumer experience, rather than expecting content owners' limited marketing budgets to stretch across every distribution platform they currently have to reach. In many cases, particularly in television, this approach also solves a major timing problem. Only the content owner or network programmer has access to first-run television episodes before their first airing, so metadata syndication allows them to make related content such as quizzes, trivia and behind-the-scenes images available in a way no distributor would be able to.
The Time-Based Metadata Ecosystem
Creation, production and distribution are all part of the time-based metadata ecosystem. From a creation perspective, an enormous amount of valuable time-based related content exists from the earliest stages of pre-production. Similar to popular DVD, Blu-ray and synchronized "Second Screen" experiences, examples of this type of material include early storyboards, location scouting photos, and production design sketches. These are all valuable pieces of related content that can be set to time in a film or TV episode, and are good examples of how to extract value from existing production artifacts. Additional examples of existing information that can be quickly set to time are music cue sheets, branded-entertainment product lists, and on-set photography.
Applications of time-based metadata also open up new creative opportunities for writers, producers, and multimedia storytellers. Instead of leaving related content creation to marketing and programming teams, writers are increasingly taking responsibility for various types of related content metadata as an integral part of the creative process. For example, Fourth Wall Studios is an LA-based entertainment company creating new forms of storytelling in which on-screen characters call the viewer's cell phone at designated moments in the story timeline.
Another example of creative time-based metadata creation and production comes from USA Network, where Twitter "hashtags," originally intended to drive social media activity during first-run viewing, are being stored as persistent time-based metadata with particular episodes and scenes, so that they can be leveraged by applications and users in later syndication and VOD.
Time-based metadata also has important implications for ecommerce, enabling new transactional revenue opportunities for both content owners and distribution partners. In 2012, eBay introduced Watch With eBay, a stand-alone iPad application that surfaces current auctions and "Buy It Now" items related to a particular program. eBay has also demonstrated a version of the application that uses time-based metadata to surface items related to a particular scene.
Time-based Metadata, Windowing & Personalization
One of the greatest opportunities for content owners and distributors alike is to leverage time-based metadata to proactively drive consumer activity in new viewing windows, and with new viewing patterns. Through metadata syndication, the same digital file can offer the consumer a new experience with each view, and that experience can be influenced by whether it is being experienced in parallel with the first-run viewing, within the Nielsen C3 window, or in a VOD session.
Technically speaking, windowing relative to time-based metadata means that, based upon the specific time-window in which a viewer engages with a piece of content, a corresponding package of related content layers can be made available. These time-windows can be relative to the first run or premiere of the content, or personalized to a specific viewer, corresponding to successive views.
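As a rough illustration, the window logic described above might be sketched as follows. The thresholds and layer names are hypothetical assumptions, not any distributor's actual policy.

```python
from datetime import date

# Hypothetical windowing rules: each entry is (days since premiere, layers).
# Thresholds and layer names are invented for illustration.
WINDOWS = [
    (0,  ["live_chat", "hashtags"]),   # first-run / live viewing
    (3,  ["hashtags", "trivia"]),      # within the Nielsen C3 window
    (90, ["trivia", "recap"]),         # early VOD
]
DEFAULT = ["trivia"]                   # long-tail catalog viewing

def layers_for(premiere, viewed):
    """Pick the related-content layers for a view based on its time-window."""
    days = (viewed - premiere).days
    for limit, layers in WINDOWS:
        if days <= limit:
            return layers
    return DEFAULT
```

The same logic could key off a per-viewer view count instead of the premiere date to personalize successive views.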
Time-based Metadata and the Future of TV
Over two decades of video product development, time-based metadata has emerged as one of the most important components of a successful digital video distribution strategy. This descriptive information about what is happening at any moment is critical to differentiation in a multiscreen world, and will play an increasingly important role in differentiation across distribution partners. As smartphones, tablets, and smart TVs proliferate, there will be increased demand for rich and valuable time-based metadata delivered as part of the master asset. Increasingly, time-based metadata will unlock the context of film and television, and will power the new user experiences and new revenue streams that are only possible on emerging two-way digital platforms.
Just one decade into the twenty-first century, we are starting to see indicators of a vibrant metadata ecosystem growing within the folds of the traditional film and TV production and distribution industries. Writers and producers will increasingly create time-based metadata as an inherent part of their creative storytelling process, and production companies will increasingly package, license, and sell that critical enabling meta-layer to their programmer and distributor customers. Programmers and distributors will in turn increasingly deliver a time-based metadata layer to their cable, satellite, telco, web, mobile and OTT licensees, so that those consumer-facing services can unlock the context of every moment of film and TV for their audiences.
Zane Vella is the Founder and Chief Product Officer at Watchwith, a software platform to create and distribute time-based related content around films, TV and commercials. He has 20 years' experience at the intersection of TV, Internet, and software product strategy and has led the development of interactive products and platforms for media and entertainment companies including Apple, Disney, NBCU, Netflix, Viacom, and Warner Bros.
Time-based metadata is typically provided as a JSON- or XML-formatted message so that a product developer or programmer can choose from the available time-based information and use it as they see fit in a consumer experience.
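To illustrate, here is a sketch of what such a JSON message and a simple playback-time lookup might look like. The field names and values are invented for illustration and do not reflect any published schema.

```python
import json

# A hypothetical time-based metadata message; all field names and values
# are illustrative assumptions, not from any real specification.
message = {
    "asset_id": "example-asset-001",
    "events": [
        {"start": 125.0, "end": 140.5, "type": "trivia",
         "payload": {"text": "This scene was shot on location."}},
        {"start": 312.2, "end": 318.0, "type": "product",
         "payload": {"name": "Example Watch", "sku": "EX-123"}},
    ],
}

def events_at(msg, t):
    """Return the metadata events active at playback time t (in seconds)."""
    return [e for e in msg["events"] if e["start"] <= t <= e["end"]]

encoded = json.dumps(message)   # what would travel over the wire
decoded = json.loads(encoded)   # what a player application would consume
```

A player polling `events_at` against the current playhead position is one simple way a consumer experience could surface the right layer at the right moment.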
ISAN is the International Standard Audiovisual Number, a voluntary numbering system and metadata schema enabling the unique and persistent identification of any audiovisual work and versions thereof, including films, shorts, documentaries, television programs, sports events, advertising, etc. http://www.isan.org. EIDR is a universal unique identifier for movie and television assets. http://eidr.org/
 Walt Disney Studios Distribution has been a leading innovator of synchronized consumer experience on a tablet associated with a film. More info available at http://disneysecondscreen.go.com/
 Founded in 2007, the Culver City-based company develops new properties delivered via Internet browsers, smartphones, game consoles, TVs, movie screens and in the physical world. http://fourthwallstudios.com
By Geoff Tulley
Sony and Panasonic recently announced an agreement to jointly develop standards for a next-generation optical disc that has the capacity to hold more than 300 gigabytes of data (six times the capacity of current Blu-ray Discs) by the end of 2015. According to the two companies, the 300 GB discs are geared toward the archival storage market. Is this project the next generation of Blu-ray? Or is it the consumer electronics industry's answer to getting ahead of the 4K curve? See below for an analysis.
In the joint release issued by both companies, each included reference to the other's cartridge-based storage solutions that are currently in the market (Panasonic's Data Archiver LB-DM9 series and Sony's Optical Disc Archive system). These systems employ multiple recordable optical discs encased in a protective cartridge (beyond this similarity, however, the systems and their media are completely different).
As an associate of mine commented: "(These cartridge-based systems) may have a fairly tough time in the enterprise market though, as it seems to be more of a packaging trick than anything really new; proprietary cartridges and the like can be a tough sell."
The companies make the point pretty clearly that this announcement is about a single-disc solution that ups the capacity of recordable optical discs. It will be interesting to see what mix of layers, lasers and the like will be required to make that magic. Since multiple layers at Blu-ray Disc wavelengths are already in current specifications, the implication is that this new format will be a departure from BD as we know it.
As TV Technology reported, "Both companies pointed to the expanding needs for archiving in video production as well as from cloud data centers as the reasons behind their work in advancing the format."
I did notice another Web site, however, that took the same announcement and (IMHO) leapt off the deep end:
"But while streaming content seems like a good idea, some consumers (especially videophiles) are clamouring for a physical solution to the problem," the article stated.
It would be interesting to see the data behind the "consumers are clamoring" bit. The Blu-ray Disc Association might want to examine it.
According to the article, "Though neither company has admitted as much, it's clear that the partnership is an effort to resolve the 4K media question once and for all. The two Japanese firms are teaming up to create what will essentially become the successor to the Blu-ray Disc. Their ambitious plan is to create a higher capacity optical disc that's ready for consumer use before the end of 2015."
I don’t think that it is at all clear that this is about a consumer format; certainly not about one that is aimed at 2015. 2015 is the stated target for the commercial, data archiving implementation. I think it is safe to assume that theÂ quest for Ultra HD consumer distribution will not be waiting for this new format toÂ emerge, so one has to wonder what feats of marketing may be required toÂ re-introduce the world to new forms/formats of physical mediaÂ two years from now (or after).
One also has to wonder if this writer appreciates the significant differences between recordable optical discs and the replicated media (such as DVD and Blu-ray) used for movie distribution, not to mention the investment required to create the replication infrastructure needed to mass-produce "affordable" home movie discs.
The Blu-ray Disc Association did announce at CES 2013 that it has a task force studying the issues around incorporating Ultra HD content into the specification. I expect that effort will generate lots of discussion and ultimately product development; I just don't see this 300GB announcement as a harbinger of a consumer solution.
In any case, this discussion does provide lots of interesting food for thought.
By Tony Knight, Senior Product Manager, Rovi Corporation
The other day, I began to realize how much of the physical media that my generation took for granted would be completely absent from the lives of our children. My four-year-old daughter Izzy, who was born the year the iPhone was first introduced, already has far different expectations of how content is created, transmitted and consumed. For her, you never have to put anything in a machine to get something you want to come out on a screen. For decades, the act of taking a picture, listening to music, or watching a film required the movement of something physical into the apparatus of something mechanical. In the space of just a few short years, the relentless march of technology has separated content from the spinning gears it was previously bound to.
What's more, technology has rapidly increased the rate of change in the home entertainment business, a metric now measured in months rather than years. Consider how long it took older formats, such as VHS and cassette tapes, to be succeeded by new standards like DVD and CD. Compare that against the plethora of new content delivery methods available today on such a variety of new devices and you will begin to realize the challenges in store for the home entertainment industry. Six years ago, the most common way consumers got access to premium content was in the form of a DVD disk. It was a universal standard, and consumers gravitated toward it. This greatly simplified the home entertainment business model for content holders and the businesses that supported them. Move ahead a few years, and it's not hard to recognize that consumers have many more home entertainment choices, ranging from subscription VOD and kiosk rentals to a variety of over-the-top delivery channels.
While the physical disk still accounts for the single biggest piece of home entertainment revenue, it is besieged by a number of other options vying for consumers' attention. A few years ago, an entertainment-hungry consumer might have purchased a DVD for $15 to $20 because it represented the best value for money among a smaller set of consumption modes. Today, that same buyer has many more choices, including free or low-cost access. This competition for consumer attention has forced those of us who make our living in entertainment technology to rethink consumer value, or risk losing the premiums that were once the mainstay of the physical media home entertainment business. In fact, the future of the home entertainment business may hinge on the very question of whether or not consumers want to "own" movies anymore.
The commercially successful concept of "owning" a retail movie has always taken some physical form. VHS tapes had a measure of success in the retail market, but it wasn't until DVDs were introduced that people bought and collected them in droves. Today, DVD and BD sales still constitute the lion's share of home entertainment revenue, but that revenue is declining 5-15% worldwide, year over year. Electronic sell-through, the digital equivalent of owning a movie on physical media, has been available for many years, but it has yet to garner anything close to the same level of commercial success as DVD disks. The key question for many in our industry is striking: Are consumers willing to pay to own movies, or are they content to rent on occasion?
Several years ago, I spoke at an industry event and was asked when electronic sell-through was going to be successful. My answer was short and sweet: when consumers view EST as being as valuable as DVDs. In the intervening years, the mass market has yet to adopt EST, and physical disk sales have continued to decline. Consumer behavior is changing, and not in ways that promote the traditional home entertainment business model. To put it another way, five years ago the home entertainment revenue pie was cut up in ways that benefited certain actors. Today, that pie is in the process of being recut. Those who were used to getting a healthy slice in the past may be alarmed to be getting either a smaller piece, or none at all. Others who didn't have a slice in the past are now sitting at the table. The question of consumer ownership of content is central to how big the pie is, and how it is to be sliced.
Unless the industry acts (and acts decisively), in a few short years margins in the home entertainment business could shrink sharply as consumers shift from movie ownership to a much less lucrative over-the-top rental business. In fact, I think the entire industry is in need of something akin to the Marshall Plan. To that end, here is my three-point plan to save movie ownership, which can be treated as additions to UltraViolet:
1) Clearer Differentiation from the Rental Experience
Today, when you consider buying a movie from an over-the-top service, you are confronted with two buttons: Buy or Rent. Clicking the "buy" button leaves many customers feeling shortchanged. There are generally no menus, extras, special features or other perks that make them feel the ownership experience has been conferred on them. Charging four to five times more to buy a movie that offers the same user experience as a rental starts to feel like prepaying for your next four rentals. The industry needs to find a way to drive more value into the electronic sell-through format, and this means adding features that customers are used to getting from physical disks today. Remember, most consumers only watch a movie they like one time. They watch movies they love many times, and they want the extras that connect them to the film's back-story.
2) Get Aggressive with Disk-to-Digital
My shelves at home hold about 400 movies. The key to getting consumers like me to own new movies digitally is to help me move my library toward the new paradigm. How successful do you think Apple would have been with the iPod and iTunes if they hadn't expressly enabled you to bring your existing library of CDs into the same interface as the music you purchased from them electronically? Not very, I think. There is an effort by a certain large retailer to move existing DVD and BD disks to UltraViolet (UV). This is a great start, but the initial reviews have been mixed. In my own experience, only two-thirds of the sample I brought in was available for conversion, and none of the extras that were available on those disks are part of my new UV rights. Would I now spend $800 or so to move just the movies (without extras) over to a new standard if that new standard makes me feel like I am prepaying for over-the-top rental? Not likely, I'm afraid.
I think disk-to-digital is a great idea, and some consumers will undoubtedly adopt a scheme where they move their libraries over on a per-disk basis for a fee. That said, this approach presents a barrier that I believe will prevent it from becoming mainstream.
Here is a different approach. Charge little or nothing to convert my existing library to UltraViolet once a retailer has confirmed that my library consists of legitimate retail disks, and has marked each one out of circulation once the digital right has been conferred. If some of the movies are not available, record my right anyway, and bring it to my locker once it is available. Now I can feel the totality of the UV experience with content I've spent the past 15 years collecting. It didn't involve a big bet on my part, and if I like it, the chances are very good that I will make my next purchases as UV ones.
3) Go Crazy with Metadata
My shelves at home used to be great for impressing guests and those with lesser collections. That moment has passed. Here is what my shelf can't do well: recommend a good movie for me, or tell me where there are gaps in my collection. My shelf can't sort my movies in ways that help me consume more content. In fact, after Izzy figured out how to reach that shelf, it isn't even particularly well organized. Once you've helped move my entire library over to a digital locker of some kind, don't make it the digital equivalent of my shelf. Use rich metadata and some excellent user interfaces to help me visualize my library in new and interesting ways. How many Stanley Kubrick movies do I own? Am I missing some Fellini movies? If you tell me, I'm probably a willing buyer. The best recommendation engine is one that takes my own library as input. Put my existing movies into a snazzy interface, empower it with some intelligent metadata smarts, and I am much more likely to consume. I promise.
There are a lot of people thinking about and working on solutions to promote the continuation of the ownership model in home entertainment. That said, I'm sticking firmly with my beliefs from many years ago. Customers pay for what they value, and digital distribution of content must delight consumers if they are going to own it at the same rate they did with DVD. Izzy is almost 5 now. Is her first movie-related transaction going to be a rental, a buy, or a subscription? Much of that will depend on what the industry does over the next two years.
By Robin Daniels, Head of Enterprise Product Marketing, Box
Although Silicon Valley and Hollywood are close geographically and are both working toward bringing transformative experiences (content and apps) to market, they are vastly different digitally and philosophically. Hollywood's success relies far more on the connections between individuals than on market and technology strategies, rendering decision making complex, and the Valley's direct-to-consumer, "execute without asking for permission" model doesn't quite match Hollywood's "collaborate with everyone" mentality. However, one technology that has taken hold and shaken up both Silicon Valley and Hollywood is the use of mobile devices in business.
Half of all devices sold this year will be non-Windows based. Apple alone has sold more than 172 million iPhones and iPads in the last year and its iPhone business alone generated more revenue than all of Microsoft. More computing power and connectivity is in more hands, and in more ways, than ever before.
Sure, workers everywhere have been using smartphones to be more productive for years, but advances in mobile devices paired with cloud applications have simply changed what's possible. Not only can workers use their mobile devices to share instantly in their personal lives, business systems are evolving just as rapidly to make the impossible possible. From a creative executive accessing crucial production documents from an iPad to give real-time feedback, to a marketing team tracking campaign results while on the road, we are all working with new mobile and cloud technologies to stay competitive and to easily create, access and share content from anywhere across multiple devices.
The technology in our personal lives is certainly influencing and changing expectations in our professional lives. It isn't necessarily the convergence of the tools we use in these two worlds, but rather the consistency of ideals.
Employees are demanding the ability to choose the devices they use for work and are becoming less productive if all of their data is sequestered on different devices or locked down to specific systems. Along with mobile devices, consumers are also bringing different expectations for technology to work with them. The reason mobile devices like smartphones and tablets, or social media like Facebook and Twitter, are so popular is that they are radically simple and intuitive.
While incredibly empowering for end users, this fragmentation of platforms in the workplace means that any organization that is embracing mobility also has to embrace device diversity. And IT departments not only need to support all these new devices â they also need to ensure that the content and tools employees need to get work done are both accessible and secure.
This new paradigm poses a major challenge for today’s businesses: how can IT let new technology run rampant through an organization, technology that is fundamentally improving business outcomes, while still maintaining some semblance of a coherent IT strategy?
Enter The Enterprise Cloud
Everyone wins when workers have the mobile devices and software they want to use, rather than what they have to use, and IT departments have the oversight and visibility they require; this can be achieved with a next-generation enterprise cloud solution.
A new generation of cloud-based business solutions is beginning to make this duality possible. Intuitive services like Ubic, Signiant and Box give employees at media and entertainment companies the flexibility and mobility they require, while also providing enterprise-grade security and visibility for IT professionals.
For new devices to be fully corporate-ready, they need elements like security, device tracking and management, and powerful cross-company collaboration - areas that help organizations innovate continually and sustainably. The cloud rewrites the rules here, enabling new handsets and tablets to connect to the "grid" like any other computer and become tools that let employees work together securely across multiple platforms and from any location, finally making the mobile workplace a reality.
And with a truly mobile workforce, completely new computing cases are emerging. As a large number of Hollywood studios and labels move their information and collaboration to the cloud, our customers share stories of sales teams showcasing their latest project, from the scripts to the trailers, while on-site with just an iPad in hand; a marketing team delivering and tracking exclusive content straight from their mobile devices; and a creative executive creating a centralized library in the cloud on an iPad for all media assets related to a movie launch. Mobile devices are becoming a catalyst for completely new enterprise applications, and vice versa. The marriage of the two is so uniquely powerful that businesses will experience a wave of productivity transformation over the next few years.
Mobile + Cloud Revolutionizes the Way We Work, Together
With mobile and cloud technologies, employees now have the ability to store information once, and then easily extend it across all the applications, devices and people they are working with. People don't work in a siloed world anymore. It's about using solutions that work together, and powerful platforms that connect and become enhanced through integration: cloud-delivered applications like Salesforce that run your sales organization will connect to your business information on Box, and HR information in Workday and NetSuite will plug into your social software from Jive or Yammer. The mixing and matching of services that's common in our personal lives is now extending to the workplace, in turn driving vastly more open solutions that are changing the business landscape and how we interact with each other.
Mobile and cloud adoption in business has led to dramatic changes in productivity, speed of execution, and overall sentiment toward technology. People are able to work much more quickly, access more information than ever before, and make decisions in real time that are backed by data, all leading to a more open, connected and collaborative work environment. With the right solutions, both end users and IT professionals are happy: employees are using products they love, and IT is finally able to get ahead of the game instead of having to fight fires, solve problems, and answer to unhappy users. We've seen more progress in moving toward a more collaborative and mobile IT strategy in the last year than in the previous ten, and this revolution will continue to gain momentum, and attention.
Robin Daniels, Head of Enterprise Product Marketing, Box
Robin is head of enterprise product marketing at Box. Robin is a prominent advocate and expert on Enterprise Cloud Computing and how it is transforming enterprises and the software industry. Having worked in the tech industry for over 15 years for leading companies such as Salesforce, Veritas and Vignette, Robin has extensive knowledge in the areas of cloud computing, enterprise software, collaboration technologies, and marketing innovation.
In most cases, 70+ job openings at a sexy company that, with a few others, helped redefine the media and entertainment landscape would be an indicator of growth. Not so in the case of Hulu. I'd consider this more an indication of rudderlessness. Visionary CEO Jason Kilar, my serious tech crush, has left the helm, along with many (if not most) of his right hands. Hulu has been on and off the auction block for as long as anyone can remember.
Now, with a front-running bid on the table from DirecTV (along with a few other contenders) and looming rumors that a deal will be done soon, it looks like Hulu's days as a platform for Disney, Fox and NBC are numbered.
Ah, what a difference a couple of years make: $1 billion, to be exact. That's the difference between the one-time price tag wooers were offering the OTT darling then and the rumored price on the table now.
While there's no telling how the shift to any acquirer might unfold for the service, the news that the acquirer might be DirecTV is telling. Bringing Hulu into the family could allow the satellite TV behemoth to close a significant gap in its service by providing a meaningful online offering. And, with well-established relationships with Hollywood and solid licensing agreements, DirecTV has the oomph to make sure Hulu's content remains relevant, if no longer exclusive. The same is true for any cable, telco or satellite buyer, though: buying Hulu makes you look like you're ready for the future.
But, there's one angle the tech trades seem to be missing. And, maybe it takes the keen eye of an analytics powerhouse to notice. You see, DirecTV has long been an industry leader in applying analytics to gain customer insight, using analytics to maintain and grow its subscriber base through meaningful offers. The promise of an online and OTT channel brings with it the opportunity to capture and analyze exactly how consumers are engaging with content across multiple channels, in ways that few companies are doing today.
Well, you don't even need to imagine, because some analytics powerhouses are already using content analytics to drive business and insight. One key example? Netflix, which readily admits that 75% of its audience watches content because Netflix recommends it, and those recommendations, along with just about everything else at Netflix, are driven by analytic muscle.
So, while we all wait with bated breath for the final word on what happens to Hulu, there's no question that it's as relevant as ever. Maybe no longer as a pioneer, but as a rocket to take its new owner into a future landscape where the living room isn't the end-all, be-all.
Abstract: This case study outlines the partnership between Dell and ToonBox Entertainment, a Toronto-based 3D animation studio. ToonBox deployed Dell's PowerEdge C6100 rack-mounted servers to render its stereoscopic animation. This article details how the partnership benefited both companies.
To captivate their audience, mischievous Surly Squirrel and his rat friend Buddy need to be rendered in eye-popping detail. ToonBox Entertainment deploys Dell™ PowerEdge™ servers in its render farm to help deliver world-class 3D animation.
Toronto-based ToonBox Entertainment hit the ground running: the company's first original TV production, "Bolts & Blip," is one of the world's first 3D stereoscopic animated television series. But stereoscopic animation places extremely heavy demands on workstations and servers in the render farm. When ToonBox prepared to start stereoscopic animation for its film The Nut Job, the company sought new studio space and a hardware vendor to furnish it. "It was critical to select the right vendor up front, because we were looking for a long-term solution," says Ria Westaway, vice president of production.
"Dell treated us as their first priority. That commitment to our needs helped us make a decision relatively quickly." For rendering, ToonBox selected Dell PowerEdge C6100 rack-mounted servers powered by the Intel® Xeon® processor 5600 series. "For every animation we produce, we're rendering twice as many frames as we would in 2D," says computer graphics (CG) supervisor Andrew McPhillips. "Each shot in our rich and highly detailed film is comprised of dozens, sometimes even hundreds, of layers. Feature-length animated films typically have more than 1,000 shots. Because The Nut Job is 3D, we are rendering each of those shots twice, once for each eye. In this environment, the Dell PowerEdge C6100 servers have been fantastic. The PowerEdge C6100 makes a great render farm machine because it's fast, highly configurable, and incredibly robust."
As the company grows, the hot-plug serviceability of each server node facilitates the rapid expansion of the render farm. "These servers enable us to scale up and down very easily," says Aaron Pearce, systems administrator. "Adding a Dell PowerEdge C6100 server is basically plug-and-play. We receive a server, drop it into our infrastructure, install software, and that's it."
Furthermore, the servers' built-in management controllers help simplify administration. For example, instead of spending 50 hours each week manually installing various operating systems for testing on individual computers, IT staff hooks the open-source tool Extreme Cloud Administration Toolkit (xCAT) into the PowerEdge server's baseboard management controller (BMC) to automatically deploy preconfigured operating systems and software. "It deploys a new operating system across the entire server farm within minutes and takes almost no staff time," says Pearce.
Ten months after deployment, the ToonBox render farm still has 100 percent availability. "Everything in the PowerEdge C6100 servers is redundant," Pearce explains. "If we have a failure, we're just going to remove the failed component, fix it in-house, or call Dell ProSupport for extended support. A motherboard in one of our servers had a small issue reading a piece of memory, and because we have Dell ProSupport on the machine, the motherboard was received and replaced within an hour and a half of failure. That turnaround by Dell ProSupport was absolutely fantastic."
Creating stellar animation

ToonBox artists who work mostly with Autodesk SketchBook Pro, Adobe® Photoshop®, or Adobe Premiere® Pro software received Dell Precision™ T3500 workstations. For artists who work primarily in Autodesk Maya, eyeon Fusion, or Pixologic ZBrush, ToonBox provided Dell Precision T5500 workstations. "They are very powerful machines that facilitate the type of work our artists are doing," says Pearce.
Most of the company's back-office functions run in a virtual environment enabled by VMware® virtualization software. More than 30 virtual servers run on two Dell PowerEdge R710 hosts and one PowerEdge R510 host. "Intel Virtualization Technology for Directed I/O (Intel VT-d) enables the processor to split up resources for different virtual machines managed by the VMware layer," says Pearce. "It works fantastically. Our Dell and Intel hardware is enabling us to make excellent use of the resources we have without flooding our server room with excess equipment, power consumption, and heat."
To confirm that it selected the right hardware partner, ToonBox looked no further than its high-definition, stereoscopic animations. "In our teaser for The Nut Job, the image quality was so high that people couldn't believe we did it in the time frame we did with the resources we had," says McPhillips. "That validates our decisions, because high image quality is the top of the pyramid. To get to that level, you need great people, great technology, and fantastic hardware. Dell computers give us one level of the pyramid." Furthermore, ToonBox's state-of-the-art equipment has helped the company recruit animators. "When you're on the cutting edge of what can be done in animation, you need a solid backbone," says McPhillips. "Selecting Dell as our hardware partner was one of the best decisions ToonBox has made. It has been a fantastic relationship."
By Subhankar Bhattacharya, Global Practice Head Media & Entertainment, HCL Technologies
Abstract: In the television network industry, the linear and non-linear businesses have evolved independently over several decades. There are many reasons for this separate path of evolution. First and foremost, the non-linear business targeted end-consumer interaction, which differed from the distribution model of the linear business. Coupled with a small revenue base and a different advertising model, the non-linear business was not considered mainstream. Over the years, more and more brands have established a significant presence on the web, raising the cost of managing the non-linear channels. On the other hand, consumers are increasingly looking for a seamless experience across devices and platforms. Given this situation, integration of linear and non-linear workflows has become imperative from both a cost and a customer perspective. However, the prospect of this integration is fraught with disparate advertising models, disparate systems, disparate metadata and the absence of a single view of the customer. This article provides an approach to this integration that could deliver the best possible ROI and the least possible transition pain for TV networks.
Talk of an integrated approach to the linear (television/on-air) and non-linear (online/broadband) content value chains has been making the rounds in the television network industry for several years. At the back end (supply chain) of the content value chain, progress has been limited to cost-driven initiatives such as a shared digital asset management system or a shared infrastructure platform. However, organizations today have become more ambitious and experimental at the front end (consumer experience) of the content value chain through several new revenue-driven initiatives such as TV Everywhere and live broadband streaming. With the digital upfront (NewFront) in full swing and consumer expectations about a seamless multi-screen experience on the rise, the process of linear and non-linear integration may get a much-needed boost from business/brand owners within the industry. This paper outlines an approach to linear and non-linear technology platform integration within the context of this evolving environment.
Video content is the primary asset of the television network industry. While the total advertising spend on non-linear (online) video was only USD 1.42 Bn in 2010, the forecasted compound annual growth rate (CAGR) for this channel is a whopping 31 percent. Over the same period, the forecasted growth rate for television advertising spend is only 3 percent. By 2016, that could take the share of non-linear video advertisement revenue to 13 percent of linear (TV) advertising revenue, up from its current share of 3 percent. (See Chart 1: Video advertisement spend forecast 2010-2016 (Linear & Non-Linear) for details.)
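The mechanics of that projection follow the standard CAGR formula, value_n = value_0 × (1 + r)^n. The sketch below is illustrative arithmetic only: it takes the USD 1.42 Bn base and 31 percent rate from the paragraph above, and the exact 2016 figure it produces depends entirely on the base year and rate assumed, so it is not a reproduction of Chart 1.

```python
# Project a value forward under a constant compound annual growth rate (CAGR).

def project_cagr(base: float, rate: float, years: int) -> float:
    """value_n = base * (1 + rate) ** years"""
    return base * (1.0 + rate) ** years

# Figures from the text: USD 1.42 Bn in 2010, growing at a 31% CAGR.
for year in range(2010, 2017):
    value = project_cagr(1.42, 0.31, year - 2010)
    print(f"{year}: ${value:.2f} Bn")
```

At a 31 percent CAGR the online figure roughly quintuples over six years, while the 3 percent TV growth rate compounds to under 20 percent over the same span, which is what drives the widening share described above.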
This is a number worth sitting up and taking note of. How well the television networks capitalize on the non-linear video advertising market compared to new media businesses like Google, Yahoo, Facebook, or Vevo will depend on their ability to provide seamless content access across various consumption mediums. As content owners, the television networks will no doubt get their share of revenue while leveraging the likes of Facebook and Vevo as syndication platforms, but in that case they have to shell out a much larger share of revenue to those platforms. There is the additional risk of non-linear video cannibalizing traditional TV advertising revenue, a hypothesis that is hard to ignore.
Leveraging Cross-Industry Experiences
We may not find a perfect example of the linear/non-linear integration model in other industries. However, analysis of similar business environments in other industries reveals four key factors that have helped organizations emerge victorious from such challenging environments: (a) leadership mindset, (b) business involvement in IT/technology decision making, (c) maturity of the industry as a whole for driving such initiatives, and (d) the ability of an organization to pull in capital for investment in the right place at the right time. (See Chart 2 for an illustrative representation of these four factors.)
The recently launched UltraViolet project enables consumers who bought content on physical media to watch that content on any personal device at any time. This is a bold move that challenges the likes of Apple TV, Google TV, and other legal online streaming services, as well as content pirates who thrive on consumer dissatisfaction with the accessibility of content across personal devices. The success of this project stems from collaboration among studios, retail chains and software service providers, and from the determination of a few who believed in the philosophy of ubiquitous access to content. The project has given a tremendous boost to the concept of a universal content ID and universal content metadata. The industry associations (DECE, DEG, EIDR, HITS) are also playing a key role in shaping this project.
The publishing industry ran separate processes for e-book and physical book production for a long time. As a result, the cost of e-book production ran high, and the simultaneous launch of an e-book with the physical book was difficult. Most leading book publishers, however, have invested heavily to integrate the print and e-book production processes. This integration has not only cut the cost of e-book production by 70-90 percent; the integrated process is also allowing the simultaneous launch of print and digital books.
The multi-channel integration in the retail industry, which has been ongoing for nearly a decade now, is a mature process. Product master data management, customer 360° and multi-channel fulfillment are some of the key initiatives within the retail multi-channel process. The process transformation here tackled many of the complexities the media industry faces today. For example, the retail industry had problems with global identifiers but worked hard with trade associations to adopt the EAN and UPC standards. Simultaneously, it addressed every possible use case in multi-channel fulfillment to provide unique customer experiences. In addition, many of the multi-channel integration programs were run from the CEO's office, giving them the necessary support, budget and focus.
In the early days of the internet, it became clear that the future of telecom revenue would be more data-driven than voice-driven. The ability of packet-switched networks to handle data traffic better than circuit-switched networks forced many telecom providers to invest billions in changing their network infrastructure.
On a similar note, if we believe that a seamless consumer experience across devices will define the future of content, then linear and non-linear integration must be treated as a multi-year transformation project with board-level oversight, and capital must be allocated with a view to long-term return on investment.
Uniqueness of the Television Network Industry and How It Impacts the Prospect of Linear, Non-Linear Integration
While learning from other industries is relevant, the television network industry faces many unique and complex challenges in the process of linear, non-linear integration.
The consumer experience for content across various platforms continues to differ based on the nature of the device itself. The pattern of consumer content consumption is an evolving area and differs vastly across demographics as well. While the book publishing and retail industries have similar consumer experience problems, theirs are not as varied or as fast-evolving as video consumption across devices.
While the content for linear channels is fairly standard, the consumption of content on non-linear channels is undergoing significant changes. A 2011 survey by TV Guide suggests that 15 percent of the population consumes more than six hours of online video per week. In 2010, the share of such viewers was just 4 percent. This growth has been driven primarily by massive growth in the type and quality of online content. Diving deeper into content consumption by demographic will be significant when attempting a linear and non-linear integration.
The linear medium is a mass advertising medium with finite advertisement inventory (spots). In contrast, the non-linear medium is a targeted advertising medium and hence could have nearly infinite advertisement inventory. This is a significant technical challenge for the industry to overcome. Since content consumption patterns by demographic may not match across linear and non-linear channels, an integrated advertising strategy could be extremely hard to deploy.
A large portion of the industry's intellectual property is third-party content with complex contractual clauses. These contracts are far more complex than publishing industry intellectual-property contracts and, in some cases, more complex than music industry contracts. Additional complexity comes from exclusive and restrictive distribution contracts with carriers and syndication partners, and from international contracts. While television is a geographically contained medium, mixing TV with a global internet platform may cause significant rights enforcement issues.
Creating a Business Case
Like any other initiative, linear and non-linear integration must start with clarity of purpose. There could be revenue or cost considerations for this integration, and each could result in a different course of action.
On the revenue side, a solution for TV Everywhere may require building a consumer authentication and partner entitlement system, whereas an integrated multi-channel (broadband and TV) C3 solution might entail re-engineering the entire advertising sales, traffic and program planning system.
On the cost side, the shared digital asset management system could require more focus and effort in terms of migration, whereas a shared search solution might require an integrated master data management solution. (Chart 3 provides a visual representation of the business case drivers for a linear, non-linear integration)
Detailing the Problem Areas
Once the business case has been established, it is essential to look at the cost and challenges of execution and implementation. This typically consists of rationalizing the portfolio and enhancing services to execute the business case. There could be numerous technical, process and ownership (organization structure) related problems in execution. Some of them are listed below:
- Inventory pricing for TV and non-linear channels is different and typically uses different tools. For example, television ratings forecasts are based on historical Nielsen ratings, while online forecasts are not.
- The sales systems for TV and non-linear channels are different. Although the processes are fairly similar, an integrated strategy can't be created easily because the technology environments are disparate.
- Inventory management platforms for non-linear channels and television are usually different and not in sync. Thus the sales systems for these cannot work in unison.
- The program lineups for TV are usually not in sync with content distribution for non-linear channels, thereby impacting the multi-screen strategy.
- The deal/contract management process for TV and non-linear are different and hence cannot be managed from a centralized process center.
- An in-house traffic management process may not exist for non-linear channels; hence integration with the TV/broadcast traffic management system used for linear channels may be difficult.
- The invoicing and reconciliation processes with the agency use different sets of people and processes for linear and non-linear channels.
The first step in addressing these issues of process and technology disparity is to create a unified reference architecture that can support the linear and non-linear businesses in their totality.
Creating a Future-Ready Reference Architecture
For the purpose of this article, the reference architecture is built from four key components: experience, service, content and data. The main objective of defining this reference architecture is to clearly identify the process owners and systems that embody its components.
Experience Layer

This layer of the architecture deals with the experience of all parties involved across the linear and non-linear value chain. This could include partners, customers, end users and employees. Typically, the non-linear business has a highly evolved end-consumer experience technology platform, whereas the linear business has invested in more evolved internal consumer or partner experience platforms. (Chart 4 below details some of the key aspects of the "Experience Layer" of this reference architecture.)
Services Layer

Services constitute a large part of the reference architecture. Traditionally, content ingestion and transformation services are robust in the linear business, whereas search and distribution platforms are more evolved in the non-linear model. (Chart 5 below details some of the key aspects of the "Services Layer" of this reference architecture.)
While shared infrastructure services among linear and non-linear systems are very common, shared services are yet to be explored in the areas of quality control and analytics. This is because quality control in the linear workflow is typically far more stringent, and if the same process were applied to non-linear, the cost of production might go up significantly. In the area of analytics, linear and non-linear processes deal with different sets of source data. In the absence of universal content ID and universal metadata standards, shared services continue to be a challenge in the analytics space.
Content Layer

In most networks, the linear function owns the master content. Typically, advertising and other promotional content follow the business that runs them. Potentially, there could be a single process owner for all forms of content. (Chart 6 below details some of the key aspects of the "Content Layer" of this reference architecture.)
Data Layer

Data is one of the most significant parts of the architecture. Data not only drives control but also defines business strategy, and hence there can be conflicts of ownership over data within an organization. In order to achieve a successful linear and non-linear integration, acquisition and distribution rights must be centralized as a single shared service. Other data sets could follow a federated model in which they are incrementally enriched. (Chart 7 below details some of the key aspects of the "Data Layer" of this reference architecture.)
Once the reference architecture is agreed upon by all the stakeholders, and the responsibilities for managing its components are defined, the next necessary step is a portfolio analysis of systems and processes across the linear and non-linear value chains.
This could include a maturity-model analysis of the systems and processes from a functional as well as an architectural standpoint. A subsequent analysis of the systems and processes in the context of the original business case must then be carried out to establish the best re-engineering strategy. (Chart 8 provides a sample illustration of an approach to this portfolio analysis.)
Advertising Age published a very interesting report in May 2012 showing that YouTube video viewing had dropped by 28 percent since December 2011 (based on comScore data). Does this mean TV networks should rejoice? Not really. The same report indicates that the average length of a video view on YouTube grew by 33 percent, to four minutes, over the last year. YouTube, by design, is going for quality over quantity, as longer views dramatically increase advertising opportunities. This could be viewed as good news for the networks as well, since every channel capitalizes on YouTube as a syndication platform. However, as a result of this strategy, a significant part of the revenue will move to YouTube even though the content is owned by the network. If television networks want a large chunk of the forecasted USD 9.3 Bn in potential online video advertising revenue for 2016, and more control over their content, they have to bring television and online together for the consumer.
About the Author
Subhankar currently leads HCL's effort in developing its practice within the Studios, Television Networks, Music, Advertising and Digital Publishing space. His areas of expertise include digital strategy, digital and physical supply chain, social analytics, piracy control, rights and royalties, customer relationship management, pricing, revenue management and sales systems. At HCL, Subhankar has spent the last several years consulting existing clients on defining and implementing various facets of their digital architecture, building HCL's own repertoire of solutions, strengthening relationships with partners and associations, and helping the business win large multi-year contracts. Prior to joining HCL, Subhankar worked as a Principal Media Consultant with Infosys Technologies. Subhankar has more than 17 years of consulting and industry experience. He holds a master's degree in management from the Indian Institute of Management (Ahmedabad), widely considered the best business school in India. Subhankar can be reached at firstname.lastname@example.org.
To download a PDF version of this article, click here: M&E HCL_5.29.13
Fight on! As a USC alum, it's odd that I cringe when other well-meaning Trojans shriek the school's battle cry. But last week, instead of hearing the bombast of a marching band in my head at the thought of 'SC's fight song, I was feelin' a little more hip-hop. I had a kindred spirit in Dr. Dre. That's cuz (as Dre would say) the famed hip-hop star and music mogul Jimmy Iovine announced a $70 million donation to create a new academy for music, focusing on the intersection of art, technology, business and innovation.
The curriculum includes computer science. Entrepreneurship. Art. Marketing. Couldn't we all stand to see these intersections a bit more clearly?
This is the challenge for the media and entertainment industry of today: the need to find that intersection of "gut" and "insight." I'm sure this is the bane of any long-standing creative industry in today's data-driven climate. That's because it's hard to dispute the collective wisdom of creative powerhouses who've been at their trades for decades. But no one is arguing for a wholesale switch. Rather, just some more appreciation for the intersection.
In the handful of years I've been working in analytics, which were preceded by MANY handfuls of years working in production, post-production, and digital media, I've seen a real ramp-up in the role of analytics at traditional and digital media companies alike. But the truth is, there are still factions. Whether I'm talking to a content creator, distributor, publisher or MSO, there are often camps: the we-don't-need-no-stinkin'-analytics camp vs. the analytics-are-our-future camp. Those two camps are starting to meet in the middle, and it's about time.
At the risk of sounding overly prophetic, there is beauty in the intersection of art and science. And that, I think, is the promise of big data analytics across the content value chain (http://www.teradata.com/industry-expertise/media-and-entertainment/#tabbable=1&tab1=0&tab2=0&tab3=0). When creative companies can integrate what they know about their audiences, their content, their channels and their marketing, they can unleash the value of the intersection of art and science. Any successful analytics framework demands a detailed understanding of the art of both.
So, for all of you aspiring data artisans (http://bitsbytesandbricks.blogspot.com/2012/11/i-like-term-data-artisans-george-mathew.html) out there, take heart! Dr. Dre has got your back on this one. Fight on!
By Colleen Quinn
It's that time of year! Nearly 100,000 of the industry's best and brightest flock to Sin City for meetings, demos, and debauchery. It's NAB! I'm a little wistful writing this from a cold, hard desk in Los Angeles, as pictures of productive-days-turned-long-raucous-nights start to flood into my inbox.
"Wish you were here!" they scream. Alas, me too. Not so much for the endless hands of blackjack (where I lose every penny to my name), the 36-ounce rib-eyes ("Hey! Who has the most liberal expense account?!"), or the dreaded booth-baby T-shirt. No. Those things are nice, and essential to the NAB boondoggle, but mostly I'd like to be there because I think this year, really, NAB is important.
My mantra for the last five years has been "the landscape for media and entertainment is changing." I hear others talk about "the battle for the living room." Well, I've got news for all of us. The landscape isn't changing. It's changed. And that battle is over. Now we're talking about a war, for the consumer. This month's WIRED magazine has dubbed this the Platinum era of television. They're right. (See my recent blog post; was it a premonition of Wired's new issue?)
NAB, really the community of creators, technologists, and post-production innovators that drive the show, is at the nexus of this platinum age. The show is still well underway, but already major themes are resounding: hyper-social content, deeper engagement, precision personalization.
This new age demands that content creators and distributors be able to understand and know an audience of one in ever more sophisticated ways. While the applications and services, from TV Everywhere to the second screen, may vary, they all share a common, critical foundation: analytics. Without most of our peers realizing it, the entertainment industry's biggest currency has become data. Big data.
It's time for a deeper conversation about the role of analytics in our industry, and many of the most forward-thinking studios and distributors are already starting that dialog. Industry thought leaders will be represented at an upcoming roundtable on May 23rd, "Using Analytics to Create, Captivate and Engage," hosted by the UCLA Center for MEMES and Teradata, to drive that discussion. If this is the Platinum Age of television, then data is the alchemy that can create it.
Oh. There will also be a cocktail hour. Westwood isn't quite Vegas, but at least they won't force you to wear a baby-T.
By Bart Myers, Vice President of Consumer Web Properties, Rovi
Despite the rising number of entertainment options, it's clear that Americans are watching as much TV as ever. They're just watching it differently, discovering new ways to enjoy TV on their own terms and on whatever device is available to them.
They are streaming free and premium content to their tablets and smartphones. They are tuning into online services like Hulu and Netflix to watch their favorite shows. They are even paying a few bucks to watch single episodes via services like iTunes and Amazon Prime. And, yes, they still rely on cable and satellite subscriptions.
Still, many people in the television industry are freaking out. They're obsessed with the same question: "Who's cutting the cord?" After all, Comcast Corp., the largest U.S. cable provider, said it lost 117,000 video customers in the third quarter of 2012. After decades of steady increase, the number of U.S. households subscribing to pay-TV service is now on the decline, according to the Nielsen Company.
But cord-cutting isn't the real issue. Instead, content producers should be asking, "What cords are viewers using, and how can we maximize the value of those cords for everyone involved?"
Nielsen recently began tracking the viewing habits of the millions of Americans who now connect to entertainment content via the internet rather than through a cable or satellite service. Nielsen found that more than two-thirds of these "Zero TV" homes get their content from a broad range of devices, including personal computers, internet-connected TVs, smartphones and tablets.
For the content provider industry, there are lessons to be learned from these "Zero TV" homes. The Nielsen study found that nearly half of the "Zero TV" homes now watch shows through online subscription services. Yet many are finding that the content they love is not readily available in all formats and platforms, which leads them back to a siloed approach to getting their TV content.
The three main use cases in TV watching are:
- The viewer is at home, trying to decide whatâs on TV to watch right now.
- The viewer is anywhere, trying to plan what to watch later.
- The viewer is on a connected device and wants to watch whatever is available online.
The problem for content owners, however, is maintaining the relationship with a show's fan base across multiple siloed platforms. Content providers should be able to connect with users wherever they are watching TV, but the current ecosystem of devices and content rights makes that relationship extremely hard to manage.
Increasingly, viewers need an advanced set of tools and services that can guide them across all the different ways they consume televised entertainment today. For content owners, this means keeping an eye on rapidly evolving TV consumption habits. To keep up with the changing consumer landscape, content providers need to think differently about how people plan for and enjoy content.
Consumers, meanwhile, are hungry for televised entertainment content online, but it's simply too hard to find. They don't understand why they can't access their favorite content anytime, anywhere, from any device, especially since the technology exists to make it happen. The maze of blind alleys a viewer must navigate to find that content can be excruciating.
If content providers want to maintain the relationship with viewers in this future world of entertainment on every medium, content discovery needs to be much easier.
As things stand today, there is too much friction in the marketplace. We shouldn't be cutting consumers off from their favorite shows. Instead, we should be finding new ways to better connect viewers with entertainment content, and helping them understand the appropriate options for enjoying that content, wherever they are and on whatever platform they choose.
About the Author
Bart Myers is Vice President of Consumer Web Properties at digital entertainment innovator Rovi Corporation. He co-founded SideReel.com in 2006; the site was acquired by Rovi in 2011. SideReel.com helps people watch and track their favorite TV shows online and was one of the first sites to embrace the then-emerging trend of cord-cutting. For more information, visit www.rovicorp.com or www.sidereel.com. Follow Bart on Twitter, @bartolah.
By Seth Hallen, CEO, Testronic Labs
Editorial Contributions By: Graham McAllister, Ph.D., VP of User Research, Testronic Labs
In the home entertainment industry, never before have we seen such rapid and dramatic changes in the way consumers access and view content. Broadcast television was the first format that effectively let the masses watch content at home. It took almost 40 years for the next significant innovation in content delivery to arrive on the scene: videotape, which allowed a consumer to choose when and where to watch a movie or TV show. More than 20 years later, DVD and then Blu-ray improved the quality of in-home content, and DVRs made time-shifting possible. Now, less than 10 years after the introduction of the Blu-ray Disc, we are experiencing a colossal shift toward online digital video consumption.
According to Futuresource Consulting, there were over 485 billion legitimate (not pirated) online video views in the US last year! That is up from 266 billion in 2009. As impressive as that number is, only 1 percent of those views were purchased. Clearly the public is hungry for online access to content. The challenge is offering a value proposition the consumer is willing to pay for.
If core consumption has changed so dramatically, it is only logical to ask how behaviors around searching, accessing, cataloging and interacting with content will change. There are already various options available, from simple web interfaces to second-screen apps to voice-command interfaces. Which of these, if any, are most compelling to the consumer? How can you determine exactly how consumers like to access and interact with their content? How can you best forecast trends so that developers can create UIs that truly engage consumers, so they recognize the value in purchasing the content they view online and across all their devices? How can you ensure an experience that home video consumers love so much that it yields DVD-caliber success?
Creating a world-class experience that consumers love is the goal of many in the digital content space, but it is achieved by remarkably few. Attempting to understand not only what consumers enjoy, but more importantly why they enjoy it, is the key to success, and in recent years a growing field has developed which focuses on this very issue.
User Research is a discipline that combines elements of psychology, design, computer science and many other fields, with the fundamental goal of understanding people. The idea is simple: if we better understand people, we can better design products and services that they will enjoy using. It sounds obvious in retrospect, but only in recent years has this become an important and specific focus of development.
Creating Enjoyable Experiences
In the past, DVDs, software, and websites were judged on what features they offered. Early consumer devices and websites were often used by technically literate power users, for whom ease of use and aesthetics were not the focus. In recent years, however, the shift in emphasis from features to the user experience has been clear, with two prominent examples standing out.
In 2001, Apple released the iPod. It was not the first MP3 player on the market, nor did it offer as many features as its rivals. Yet it quickly became the best-selling MP3 player on the market, a market it still dominates more than 10 years later.
In late 2006, Nintendo released the Wii games console, which went on to outsell its competitors by approximately 50 percent.
Why did these two products achieve such success against very stiff competition? In both cases, an obvious differentiator was ease of use. They both utilized a simplified interface that made the devices accessible not only to a narrow target market of technophiles and gamers, but also to a broad age and gender range who were not attracted to their competitors' products. But was that really the deciding factor?
Change in Focus
At an event in 2006, an Apple representative said that the App Store changed where developers should focus their efforts. Before the App Store, he noted, developers were probably putting about 90 percent of their effort into technical features and perhaps 10 percent into design and user experience (if even that); after the App Store launched, however, users' quality expectations of apps rose very quickly. He advised developers that if they were not putting more than 50 percent of their effort into the user experience, they should not expect their product to do well. The message was becoming clear: it's not about the technology, it's about how people use and experience the technology.
Defining the User Experience
I've used the term "user experience" throughout this article, but what does it really mean? It is often used as an umbrella term for two main areas: usability and the actual user experience. Usability concerns the user's ability to complete the task the product or website is designed for, and how many steps it takes to get it done. This is purely functional, and it can be quantified in measures such as time taken or clicks required. User experience (or UX) concerns whether or not the user enjoyed doing the task. UX is the more important of the two, since usability is essentially a yes/no proposition. Users expect features to work, but the precise details of how they work are what turn them into long-term customers. UX is how user loyalty is generated.
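Those usability measures can be made concrete. Here is a minimal sketch of how such metrics might be computed from logged test sessions; the field names and structure are illustrative assumptions, not any particular lab's schema:

```python
# Quantifying usability from session logs: completion rate plus
# median time-on-task and median click count among successful sessions.
from statistics import median

def usability_metrics(sessions):
    """Each session is a dict: {"completed": bool, "seconds": float, "clicks": int}."""
    done = [s for s in sessions if s["completed"]]
    return {
        "completion_rate": len(done) / len(sessions),
        "median_seconds": median(s["seconds"] for s in done),
        "median_clicks": median(s["clicks"] for s in done),
    }
```

Tracking these numbers across design iterations is what lets a team claim, with evidence, that a redesign made a task measurably easier.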
Industry Comparisons, Usability and UX
The web and video game industries have understood for a while now that building a website or game that technically works is not good enough; the experience must also be smooth, clean and enjoyable. There is a lot of competition vying for a consumer's business, and your product has to be better than the next guy's.
Usability has proven to be extremely important in the development of websites. Many usability firms exist, and they evaluate how users interact with websites with the aim of refining the experience. Even seemingly trivial changes can bring massive increases in revenue. One high-profile case is that of the "$300 million button." A large online retailer had a sign-up form that simply asked for a customer's e-mail address and password before entering the site. This information was also asked for at checkout, so it was not essential at the start. Asking users to sign up before buying created enough initial resistance that some users went elsewhere. Once the mandatory sign-up was replaced with a simple "Continue" button that let shoppers proceed without registering, the number of customers making purchases increased by 45 percent, leading to an extra $15 million in revenue in the first month and $300 million extra over the course of the year. The online retailer had not even been aware there was an issue. They had simply not put enough effort into understanding their users to realize that a simple tweak to their website could offer users a better experience that would dramatically increase profits.
For video games, however, the focus is more on UX. In gaming it is not about how easy it is to complete a task; in fact, often the more difficult, the better. It is all about the enjoyment the player experiences along the journey.
The New World of UX Testing
So why did the iPod and the Wii outsell their competitors? Usability was certainly an important factor, but UX was likely the key.
To understand how someone feels using a product, when they are enjoying it and when they are not, a new crop of high-end services, such as Biometric Testing, is becoming available. Biometric Testing involves the use of psycho-physiological sensors, such as galvanic skin response (GSR), heart rate, and skin temperature, to "read" the user's experience on a second-by-second basis. It is also possible to track eye movements to see where the user is focusing at any given moment. Such approaches to understanding the user experience complement traditional testing and development methods and offer unique insights into what users are really experiencing, rather than what they tell you they are experiencing.
How does the home entertainment industry create the next iPod or Wii? With online digital stores becoming the norm for searching, accessing, and interacting with content, both usability and UX will have key roles to play. Emerging products seeking to establish a foothold in a market with free access points and significant, established competition must be better, smoother and more enjoyable to use. Experience shows that even when content is available for free, a paid product can rise above it as long as it offers a superior experience. By employing usability and UX testing to understand what users want, experiences that consumers will love can be designed around those expectations, allowing developers to build loyal, long-term relationships. At Testronic Labs, we believe that investing effort in understanding users in this way will bring benefits to all and will help to ensure the healthy proliferation of emerging home entertainment platforms and products well into the future.
About the Author
As Chief Executive Officer of Testronic Labs, a global third-party Quality Assurance and Testing Services company with worldwide facilities, Hallen oversees global operations and the execution of Testronic Labs' strategy in emerging markets. Prior to joining Testronic, he was VP of North American Operations at Lightwork and oversaw business development for Digital Media Services and DVD Authoring at Lightning Media. Hallen currently serves as a board member of the Hollywood Post Alliance (HPA), as well as an advisory board member of MESA.
By Emmanuel Josserand
We are all living in an increasingly fragmented world, and the media world is no different. Consumers are demanding that their viewing experience extend beyond the primary screen to the smartphone and tablet, while the primary screen itself is becoming much more complex with the proliferation of smart TVs. Content distributors therefore need to make their content compatible for distribution in myriad formats. The conundrum, of course, is that the consumer expects more at the same price, so content creators and distributors must spend more to generate the same revenue.
Automatic Content Recognition (ACR) helps bridge this disconnect on both ends of the broadcasting experience, offering consumers deeper immersion and interaction with television programming and advertising, while providing rights holders and broadcasters a heightened level of business intelligence through highly granular tracking of how viewers interact with content. ACR, powered by either watermarking or fingerprinting, allows dynamic and seamless interlinking of devices, viewers, content and applications. It fuses the viewing experience across multiple screens for the viewer while closing the delivery-feedback loop for the content owner and distributor, making the whole process more efficient.
In the multi-screen environment, ACR is a tool that gives a smart device – such as a smartphone or tablet – the ability to become "content-aware." This awareness allows the smart device to recognize what is being watched on the primary TV screen without the need for direct input from the user. This automatic recognition can then be employed to trigger content on the 2nd screen device that is complementary to that of the primary screen. Television programs, films, advertisements, and other types of main-screen content can therefore extend to the viewer's 2nd screen, creating an immersive multi-screen viewing experience without the need for the user to manually enter website addresses, or search for the relevant information on those sites.
In the single-screen environment, ACR solutions can also be integrated into the chipsets of connected/Smart TVs and smart set-top boxes themselves to enable real-time content identification, and the triggering of events at the device level. As opposed to the multi-screen application described above, this single-screen enhancement enables the Smart TV or smart set-top box itself to become "content-aware," and therefore offer a host of value-added features for the consumer directly on the primary screen of the TV itself.
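The recognition step described above can be sketched in miniature. The toy below is purely illustrative of fingerprint-based matching in general, not of Civolution's actual technology: it hashes the peak position in each short window of a signal as a crude fingerprint, then identifies a captured snippet by counting hash collisions against a reference database. All names are hypothetical.

```python
# Toy fingerprint-based content identification, in the spirit of ACR.
from collections import Counter

def fingerprint(samples, window=4):
    """Hash the location and rounded height of the peak in each window --
    a crude stand-in for the spectral-peak hashing real systems use."""
    prints = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        peak = max(range(window), key=lambda j: chunk[j])
        prints.append(hash((peak, round(chunk[peak], 1))))
    return prints

def identify(query, reference_db):
    """Match a short captured clip against known content by counting
    fingerprint collisions; return the best-scoring title, or None."""
    query_prints = fingerprint(query)
    scores = Counter()
    for title, ref_prints in reference_db.items():
        ref_set = set(ref_prints)
        scores[title] = sum(1 for p in query_prints if p in ref_set)
    title, score = scores.most_common(1)[0]
    return title if score > 0 else None
```

A real deployment would fingerprint noisy microphone audio with robust spectral features, tolerate imperfect matches, and use the identified title plus an in-content timestamp to trigger synchronized 2nd screen events.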
For content owners and distributors, along with the ever-growing number of companies involved in the development, delivery and monetization of content, ACR acts as a multi-faceted toolkit that can add a rich variety of new, commercially vital functions and features to these companies' core operations. Advertisement triggering to the 2nd screen based on live TV content that is being broadcast is a key example. By automatically notifying application providers in real time of what content is airing on which channel, the service allows for the synchronization of value-added functionality such as content-specific background information, hyperlinks, and synchronized social newsfeeds, all within the developer's 2nd screen or smart TV applications. The application provider can therefore offer users a more powerful and engaging TV-synchronized experience. In addition, such services enable application providers to work in close partnership with advertising agencies and brands to further monetize their application platforms.
With ACR-powered content-aware devices continually monitoring in real time what is being watched, broadcasters and content owners are able to track highly granular viewing habits, and identify detailed information as to where, when, for how long, and on what device content is being consumed. The implications of these detailed analytics are enormous and can provide a comprehensive range of benefits to both protect and enhance the business models and revenues of content owners and distributors.
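As a small illustration of how such granular detections might be rolled up into the analytics described above (the event fields are assumptions, not any vendor's schema):

```python
# Aggregate raw ACR view events into total watch time
# per title and per device type.
from collections import defaultdict

def aggregate(events):
    """Each event is a dict with 'title', 'device' and 'seconds' keys,
    e.g. one record per recognized viewing interval."""
    by_title = defaultdict(int)
    by_device = defaultdict(int)
    for e in events:
        by_title[e["title"]] += e["seconds"]
        by_device[e["device"]] += e["seconds"]
    return dict(by_title), dict(by_device)
```

The same events, keyed additionally by timestamp and location, would support the where-and-when breakdowns that make these analytics valuable to rights holders.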
While much of this content identification technology has until now been focused on enforcing copyright, or ensuring that a video asset appears when and where it is supposed to, in the longer term ACR provides a vital strategic and tactical tool that addresses the multi-screen environment in which today's viewers consume content, while offering substantial benefits to everyone in the content value chain. Content-aware devices, be they the primary or secondary screen, with their ability to subtly and automatically drive viewer interactivity, provide a flexible springboard from which developers, content providers, brands and broadcasters can construct an ecosystem offering entirely new creative dimensions in which the viewer can be engaged, and the content owner and distributor informed.
About the Author
Emmanuel leads the global marketing and communication activities for Civolution. Prior to Civolution, Emmanuel was part of Teletrax, which in 2008 became the Media Intelligence arm of Civolution. He was previously Business Manager at digital imaging software company Arcsoft, where he helped set up its European offices. He has more than 15 years' experience in various marketing, sales and business development roles.