By Paul Sweeting
Pay-per-view operators in the U.S. had trouble handling the last-minute rush of signups for the "Fight of the Century" on Saturday, forcing promoters to delay the start of the welterweight championship bout between Floyd Mayweather and Manny Pacquiao by 45 minutes as operators scrambled to process the late orders and maximize the take.
In contrast, the live-streaming apps Periscope and Meerkat worked flawlessly, so much so that it was possible to watch the entire fight for free as thousands of "Meerkasters" and "Periscopers" turned their phone cameras to their TV sets and rebroadcast the official HBO and Showtime feeds. There were so many streams available that Twitter users were able to catch every round simply by jumping from one stream to the next, even as Periscope and Meerkat scrambled to respond to DMCA takedown requests.
There were also, of course, any number of free live streams of the fight available online for those willing to search for them, just as there are for any such big-ticket event, many of higher quality than anything on Periscope or Meerkat. Boxing promoters, in fact, have been battling pay-per-view piracy since the days of illegal "black box" decoders in the 1980s and '90s.
What Meerkat and Periscope bring to the mix is the power to make such piracy viral, social and mobile, as easy as taking a selfie, potentially amplifying its impact. Many of the streams from the MayPac fight were poor quality, and none were better than second-generation rips created by pointing a cellphone camera at the TV screen. But that didn't stop individual streams from registering 10,000 or more viewers at a time.
Truth be told, many if not most of those viewers were likely casual fight fans, attracted as much by the novelty of the still-nascent mobile live-streaming phenomenon and the lure of the big event as by a genuine interest in the fight.
By Paul Sweeting
No U.S. television network is more invested in, or has benefited more from, the dynamics of the bundle than ESPN. The combination of must-have programming for a key segment of the pay-TV audience and the must-carry leverage of its sister broadcast network ABC has given the Disney-owned sports network the power to command the highest per-subscriber carriage fees in the industry, ensure placement on basic tiers, and compel carriage of ancillary networks like ESPN Classic and ESPN Deportes.
For those pay-TV subscribers not in the ESPN demographic, however, that leverage has acted like a tax, imposing higher costs for networks and programming they don't watch and yielding what amount to windfall rents for ESPN.
Those windfall rents, in turn, have given ESPN the wherewithal to pay the skyrocketing rights fees for live sports. Those inflated rights fees, in turn, have become the primary economic engine of most professional and big-time amateur sports while acting as a formidable barrier to entry for would-be competitors to ESPN, yielding a virtuous cycle that reinforces ESPN's dominant position within the pay-TV ecosystem.
By Paul Sweeting
The collapse of the Comcast-Time Warner Cable merger comes just as the TV industry is embarking on what is likely to be a long and contentious renegotiation of the size and cost of the bundle, and of the terms of distribution generally. In just the past few months we've seen the launch of Sling TV, Dish's greatly slimmed-down bundle of channels delivered over-the-top; the launches of CBS All Access and HBO Now outside the bundle; and Verizon touching off a brawl with ESPN, Fox and (irony alert) NBC with its unilateral rollout of FiOS Custom TV, offering subscribers mix-and-match packages of channels at reduced prices.
Up to now, thanks in large part to legacy FCC rules like must-carry and retransmission consent, the media companies, especially those tied to broadcasters, have held the upper hand over distributors in setting the price and scope of the bundle, leaving cable operators like Comcast squeezed between ever-growing carriage fees and increasing consumer resistance to rate hikes. But Comcast, along with Verizon and other major ISPs with significant pay-TV interests, has made it very clear, in its dealings with Netflix for instance, that it would like to see a very different dynamic, and very different terms, emerge for over-the-top distribution.
For all its claims of consumer benefits that would have flowed from being allowed to merge with TWC, Comcast's main goal in pursuing the acquisition was to gain scale and leverage for the negotiations to come over the terms of over-the-top distribution.
By Paul Sweeting
For all the disruptive innovation Apple has unleashed on the markets for devices and software, it has not been particularly disruptive to the content markets it has entered. Often just the opposite.
By the time Apple introduced the iTunes Music Store, the record business was already reeling from the impact of Napster and its progeny. Rather than disrupt the business further, Apple's entry created a new market for paid downloads. Though the record companies later came to rue the terms of the deals they initially struck with Apple, the iTunes Store helped restore legitimate commerce to digital music platforms and, on balance, has been a net positive for the incumbent rights owners.
Apple is now trying to do the same thing in music streaming, relaunching a paid-only Beats Music service as the record companies try to marginalize free streaming platforms.
By Paul Sweeting
According to a report by Recode's Peter Kafka, Apple is asking the TV networks to provide their own streaming infrastructure and handle their own video delivery as part of Apple's planned subscription OTT service.
The two leading theories for why Apple is looking to take such a hands-off approach are a) to avoid the costs involved in building out its own streaming infrastructure, and/or b) Apple thinks cable-based ISPs would be less likely to engage in f@ckery against the service if the networks are delivering the streams.
Neither theory is entirely persuasive.
By John Libby, President, MediaMax Online
Does this sound familiar? A time critical digital asset like an image, ad or video needs to be quickly created and delivered. The process requires many steps for review, approval, edits, final approval, mastering, distribution, reporting and inventory management.
Marketing, sales, executives and even finance are awaiting the success of the asset. In the case of most companies, this process gets repeated frequently in many time zones, territories, languages and variations along with timing and security requirements. Technology and personnel are clearly the linchpins to the successful workflow that can and will occur at any time.
Enter the "cloud." Cloud is a marvelous marketing term for rebranding "off-premises" business functions enabled by Internet connectivity. Thankfully, the cloud is much more than hype, as evidenced by the considerable business improvements, cost efficiencies and seemingly endless entrepreneurial creativity exploding from the Internet. This realized delivery on the cloud promise fuels the continued adoption and expansion of cloud-based technology, services and staffing.
Traditional cloud-hosted data center services have provided reliable and scalable computing platforms for storage, servers, bandwidth, hosting, streaming and application services with cost-effective precision. Companies benefit from reducing headcount and avoiding the pitfalls of constant technology refreshes.
A 2013 study by Gartner demonstrates acceleration in cloud adoption, predicting that global spending on public cloud services will grow at a compounded annual growth rate (CAGR) of 17.7 percent from 2011 to 2016. The Infrastructure-as-a-Service segment leads with the fastest predicted growth of 41.3 percent through 2016.
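Those compound-growth figures are easy to translate into total growth over the forecast window. A quick sketch (only the two CAGR percentages come from the Gartner study above; the implied spending multipliers are simple arithmetic, not additional data from the report):

```python
def cagr_multiplier(cagr: float, years: int) -> float:
    """Total growth factor implied by a compound annual growth rate."""
    return (1 + cagr) ** years

# Gartner's rates from the text; absolute spending levels are not given,
# so we compute only the implied growth multipliers over 2011-2016.
overall = cagr_multiplier(0.177, 5)   # public cloud services overall
iaas = cagr_multiplier(0.413, 5)      # Infrastructure-as-a-Service

print(f"Overall spending multiplier: {overall:.2f}x")  # ~2.26x
print(f"IaaS spending multiplier: {iaas:.2f}x")        # ~5.63x
```

In other words, 17.7 percent compounded over five years slightly more than doubles spending, while the IaaS rate multiplies it more than fivefold.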
Outsourcing and staffing have evolved into a cloud delivery system as well by leveraging the Internet and software tools residing "on-premises" or "off-premises" to effectively manage a wide range of company functions, such as finance, accounting, information technology, procurement, legal, human resources and marketing operations.
Just as with cloud computing, adoption is accelerating, with the entertainment, media and publishing industries expected to increase outsourcing by 80 percent, according to HfS Research.
Furthermore, cloud staffing is becoming an established staffing model as entrepreneurial staffing firms rebrand themselves around business functions provided by virtual staff located off premises.
Digital Asset Management (DAM), like every other software category, has clearly adopted cloud offerings, with traditional "on-premises" software vendors offering hosted options. In addition, new DAM cloud offerings continue to provide media solutions for enterprise DAM, encoding, media services, workflow, project collaboration and file sharing. DAM-related cloud service segments are predicted to grow, according to Gartner, at an annual rate of 25.9 percent for enterprise content management and 36.4 percent for storage through 2016.
With the continued adoption and acceleration of cloud computing, cloud staffing, cloud media services and cloud DAM, it seems logical that enterprise media management would benefit from the convergence of these trends.
Outsourcing provides a means for companies to focus on their core competencies and business competitiveness. Leveraging their assets and content is a core competency but the technical workflow can be left to cloud sourcing options. More and more companies are accepting the practice of trusting asset management and librarian functions to cost-effective experts.
Expanding managed DAM in the cloud
As a developing area, cloud sourcing options for managing an enterprise or departmental DAM can be found in relatively limited vendor offerings, such as post-production or library service companies. Post-production companies have long coupled end-to-end media services to accommodate media workflow and now include options for DAM along with its management and maintenance.
Post-production companies have facilitated the monetization of content libraries with the creation, ongoing management and support of DAM-driven sites for the likes of Johnny Carson's television episodes and Paul McCartney's music library.
Post companies have created specialized business units to manage client media workflow and DAM management, such as DVS InteleStream for the distribution of marketing and publicity materials for film, television and music. LAC Group uniquely provides Library as a Service (LAAS). There is an array of DAM, archiving, research, preservation, curation and library outsourcing services, as well as experts in DAM platforms such as North Plains, OpenText, Xytech, etc.
Per Robert Corrao of LAC Group, options for the digitization and management of your assets include:
1. Outsourcing the recovery of archives and of assets lost in a crisis, including storage repository recovery and expansion, or the digitization and organization of specific assets for a specific project.
2. Outsourcing the cataloging, meta-tagging, and organization of your digital and physical assets, followed by training of your internal information management staff for the ongoing administration of your library from in-house.
3. Outsourcing the digitization of your assets and retaining a contracted digital asset manager as part of your IT team for the ongoing administration of your asset library.
4. Adding an experienced digital asset manager to complement your in-house employees for the digitization and/or permanent administration of your library.
Cloud hosted and managed DAM reliably handles the technology burdens of support, training, performance, storage, business continuity, capacity planning, technology refreshes and application development. Business unit leaders prefer to focus on their core business practices and not engage IT in continuing projects to upgrade on-premises software systems and related infrastructure.
Technical and end-user training concerns are minimized as the managed DAM vendor typically runs the system and facilitates training for only client oversight, as needed.
A managed DAM adoption clearly alleviates many technology burdens, but clients also benefit from continued software enhancements released into the platform.
As managed DAM clients essentially partner with their provider, their collective ideas materialize into software features and even industry improvements.
The New York Times in 2009 reported a predicted tripling in the need for DAM professionals this decade. Companies that implement DAM quickly learn the cost dynamics of staffing the operation, which often come as a surprise.
Experts like Jeff Lawrence at Celerity and industry implementation guides like North Plains' "The 13 Cost Areas for a Digital Asset Management System" routinely cite the overlooked cost areas of DAM manager, business support and technical support. Most traditional DAM or cloud DAM implementations will require hiring additional staff to support and maintain the new DAM system.
According to the DAM Foundation's salary survey, the mean reported salary for Digital Asset Managers is $82,198. DAM software makers have used Total Cost of Ownership (TCO) models to demonstrate organizational value in software solutions. Recent studies performed by Gistics with OpenText on outsourced or cloud-based DAM services solutions show a remarkable improvement in TCO for an enterprise-size implementation with 5,000 global consumers.
Three-year operations costs with startup were approximately 71 percent lower for an outsourced DAM ($1.57 million versus $5.34 million for an internal deployment). The cost reductions come from typical areas like avoiding hardware purchases and internal implementation staffing. (However, the outsourced DAM's three-year cost still includes 52 percent internal staffing for library services and end-user training, a monthly cost of nearly $23,000.)
The Gistics study, although obviously just a hypothetical case, certainly highlights the internal staffing costs that cannot be overlooked in a company DAM operation. A managed DAM cost improvement would be expected given the nature of the pay-as-you-go model and the leveraged resources of a cloud-based provider across many clients.
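The study's headline numbers hold up to a quick consistency check. A minimal sketch (the dollar figures, the 52 percent staffing share and the roughly $23,000/month figure all come from the Gistics study as cited above; the script only verifies they agree with one another):

```python
# Figures from the Gistics/OpenText study cited in the text.
outsourced_3yr = 1.57e6   # outsourced DAM, three-year cost with startup
internal_3yr = 5.34e6     # internal deployment, three-year cost

# 1 - 1.57/5.34 comes out to roughly the 71 percent reduction cited.
savings = 1 - outsourced_3yr / internal_3yr
print(f"Outsourced cost reduction: {savings:.0%}")

# 52% of the outsourced figure is still internal staffing; spread over
# 36 months it matches the "nearly $23,000" monthly cost in the study.
monthly_staffing = outsourced_3yr * 0.52 / 36
print(f"Monthly internal staffing: ${monthly_staffing:,.0f}")
```

The arithmetic underscores the article's point: even in the outsourced scenario, internal staffing remains the single cost area that cannot be outsourced away entirely.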
A managed DAM avoids the direct and indirect costs of traditional staffing, such as turnover, hiring, training, management and budgetary headcount. Most importantly, a professional service resource like a managed DAM provides the accountability and performance demanded by leading companies.
And what happened to the time-critical digital asset? With the managed DAM solution, the finished asset is handed off to a managed DAM vendor's client representative, who shepherds the asset, its variations, communications, ingestion, tagging, delivery, archiving and reporting while the client focuses on the ultimate promotion and success of its business.
The managed DAM provides end-to-end service that is reliable, timely and aligned with the client's goals.
John Libby, President of MediaMax Online and MESA Board Member, is a 25-year veteran of entertainment technology and marketing. MediaMax Online specializes in hosted media management, monitoring solutions, media analysis and software development, including EPK.TV, MMD.TV, PSAmedia.org and Daily Buzz.
By Paul Sweeting
YouTube is not confirming, but not exactly denying, a report by the Daily Dot on Wednesday claiming the video site is getting ready to relaunch its live-streaming platform with a new emphasis on games and e-sports. An announcement could come as soon as June, during the E3 game expo in Los Angeles, the report said.
Asked for comment, YouTube provided the website with a link to a GIF with no further explanation. Asked in a follow-up inquiry whether the GIF was meant as a joke, YouTube replied that no, "the GIF really was [its] official response." Make of it what you will. But for YouTube's sake I really hope the original report is correct, because Google really needs to do something big in live streaming, and soon.
By Paul Sweeting
The NFL seems to be in a test pattern. On Monday, the league announced that it will make next season's match-up between the Buffalo Bills and the Jacksonville Jaguars available exclusively via the internet outside of the teams' home markets, rather than on national television. That was followed by an announcement that the league will suspend its local TV blackout rule for the entire 2015 season, allowing games to be shown in their local markets even if they are not sold out.
The league described both moves as tests, although what exactly is being tested in each case was left a bit vague.
The Bills-Jaguars game is a one-off, and a low-risk one at that. The game was set to be broadcast by the league's own NFL Network, so there were no pre-existing rights deals to renegotiate, and it involves two struggling teams with little national following in a game to be played in London and shown in the U.S. at 9:30 a.m. Eastern Time. Even if the test is a disaster, the damage will be limited.
By Paul Sweeting
"We are not in the business of collecting your data," Apple senior VP Eddy Cue declared in announcing the Apple Pay mobile payment system. "When you go to a physical location and use Apple Pay, Apple doesn't know what you bought, where you bought it, or how much you paid for it."
The line was clearly meant as a swipe at Google and other competitors in the mobile payments space, who do collect purchase data and use it in ways that can implicate users' privacy. But Apple's studied indifference to the details of purchase transactions is also central to its strategy in launching Apple Pay.
When an iPhone user adds a credit card to her Apple Pay account, the card information is encrypted by the device and sent to Apple's servers, where it is decrypted to identify the issuing bank, and then forwarded to the bank in re-encrypted form.
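That hand-off can be sketched in miniature. The toy XOR cipher below is emphatically not Apple's actual cryptography, and every function, key and bank name here is hypothetical; the sketch only illustrates the encrypt, decrypt-to-route, re-encrypt sequence described above:

```python
import base64

def toy_encrypt(data: str, key: str) -> str:
    # Trivial XOR stand-in for a real cipher; illustration only.
    xored = bytes(b ^ k for b, k in zip(data.encode(), key.encode() * len(data)))
    return base64.b64encode(xored).decode()

def toy_decrypt(blob: str, key: str) -> str:
    raw = base64.b64decode(blob)
    return bytes(b ^ k for b, k in zip(raw, key.encode() * len(raw))).decode()

# 1. The device encrypts the card number before it leaves the phone.
card = "4111111111111111"
to_apple = toy_encrypt(card, "device-to-apple-key")

# 2. The server decrypts only long enough to identify the issuing bank
#    (here, a toy lookup on the first digit of the card number)...
card_at_apple = toy_decrypt(to_apple, "device-to-apple-key")
issuer = "ExampleBank" if card_at_apple.startswith("4") else "OtherBank"

# 3. ...then re-encrypts under the bank's key and forwards it on.
to_bank = toy_encrypt(card_at_apple, "apple-to-bank-key")
print(issuer, toy_decrypt(to_bank, "apple-to-bank-key") == card)
```

The design point is that the party in the middle decrypts only to route the card to its issuer rather than to retain the details, which is consistent with Cue's "not in the business of collecting your data" framing.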
By Andy Hurt, Senior VP, Global Product Management and Marketing, Front Porch Digital
Private-cloud solutions offer more than disaster recovery, which is what people often think of when they consider cloud implementations. Disaster recovery is certainly an important use case, but a private cloud can do so much more. Once the content is in the cloud, a private cloud can be configured to accommodate a variety of workflows, in many cases automating repetitive processes while scaling massively.
CSM as a foundation
The ideal private cloud for any M&E organization starts with a content storage management (CSM) system, which automatically retrieves broadcast-quality content from any storage infrastructure (disk, data-tape library with the aid of a robot, optical archive, etc.) and delivers it to an edit station, a playout device, or wherever else it might be needed. CSM systems were invented for media companies.
Unlike IT-based storage systems, which are designed to handle documents, numerical data, and the like, CSM systems are primed for the big-data requirements of video files, with built-in "video-aware" characteristics such as file-based quality assurance, timecode-based partial restore, and in-path content transcoding.
CSM systems are also designed to automatically handle difficulties arising from file compatibilities, essence types, wrappers, etc., in a highly active media-storage environment. These and other advanced features allow media organizations not only to store limitless amounts of high-resolution video assets, but to share those assets seamlessly throughout their organizations.
In short, CSM systems help content owners cope with what would otherwise be an overwhelming volume of content, address the video-specific complexity of that content, and integrate smoothly with video operations.
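To make one of those video-aware features concrete, here is a minimal sketch of what timecode-based partial restore amounts to: translating an in/out timecode pair into a byte range so that only a segment, not the whole master, is pulled from the archive. It assumes a constant-bitrate file for simplicity (real CSM systems work from the container format), and all names and figures are illustrative:

```python
def tc_to_seconds(tc: str, fps: int = 25) -> float:
    """Convert an HH:MM:SS:FF timecode string to seconds."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return h * 3600 + m * 60 + s + f / fps

def partial_restore_range(tc_in: str, tc_out: str,
                          bitrate_bps: int, fps: int = 25) -> tuple:
    """Byte offsets to request from the archive for a timecode window,
    assuming a constant-bitrate essence (illustrative simplification)."""
    start = int(tc_to_seconds(tc_in, fps) * bitrate_bps / 8)
    end = int(tc_to_seconds(tc_out, fps) * bitrate_bps / 8)
    return start, end

# Restore just 10 seconds of a 50 Mbit/s master instead of the whole file.
start, end = partial_restore_range("01:00:00:00", "01:00:10:00", 50_000_000)
print(f"Fetch bytes {start}..{end} ({(end - start) / 1e6:.1f} MB)")
```

Fetching tens of megabytes instead of a multi-gigabyte master is the kind of saving that makes partial restore a staple CSM feature.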
Deploying CSM in the cloud yields a feature-rich system built for media, without the infrastructure investment and overhead costs that can be a barrier for many organizations, but with all the advantages of the cloud's unlimited storage space and computing power.
Active-archive cloud solutions exist as a counterpoint to the public-cloud scenario. They offer cost-effective big-data storage and media-centric features in a pay-per-use, "CSM as a Service" package. They are also built to meet all SLAs, however stringent, that an organization defines.
Creating a purpose-built private cloud with these cloud-based CSM services lets media organizations overcome financial barriers and address the challenges associated with cloud-based video management. With CSM in place in a private cloud, that cloud can become far more than a storage center.
Disaster recovery (DR) is perhaps the most obvious use for a private cloud. In one implementation, a well-known global entertainment conglomerate uses a massive CSM-based private cloud as the backup and DR archive for its entire organization. As depicted in Figure 1, the private cloud, located in another part of the country, is configured to ingest archived content from the company's West Coast operations to an off-site datacenter.
Unlike a DR implementation, where the restore function happens reactively, this private cloud is configured to be an active archive, where content is simultaneously and continuously being ingested and restored every day. The same conglomerate that uses a private cloud for DR also built 14 separate private clouds for some of its distribution networks.
This implementation takes a proactive approach to disaster by ingesting a continuous 14-day playout schedule, with elements of the programming coming from two separate locations on the West Coast into one datacenter in a third location. From there, the private cloud continuously restores that content to a redundant playout center on the East Coast.
If ever there is an event that affects the primary playout center on the West Coast, the East Coast center can, with virtually the flip of a switch, ensure that playout on the 14 channels continues uninterrupted.
A private cloud can also be configured for active use, such as in editing and post-production workflows. In this way, the cloud becomes a universal, collaborative workspace that is especially useful for multisite organizations.
In countries where laws require that a subset of the data not leave the country, companies can split the data accordingly, storing some in a private cloud within the country while sending content intended for longer-term storage and archiving to a private cloud in a datacenter in another country.
In another scenario, a sports team built a private cloud to connect directly to its on-premises archive. It also created an iPad app to be used internally, so that coaches can share video clips with players. Selected clips are sent from the local archive to the cloud, and from there a cloud-based distribution workflow publishes the video to the app. This example illustrates how, once content reaches the cloud, it can be repurposed and published to any platform.
No matter how it is used, a purpose-built private cloud using CSM as a service has economic benefits. As discussed, cloud-based CSM solutions were created with video in mind, so they provide video-centric service for a fraction of the cost of a traditional cloud service. Building cloud-based workflows that take advantage of the video-aware features of CSM can not only eliminate the need for budget-intensive local equipment and processes but also automate parts of the workflow to make them more efficient. In addition to DR and business continuity, the cloud can handle other mission-critical but expensive services such as technology refresh and migration to new tape technologies, eliminating significant capital expenditures.
Andy Hurt has more than 13 years of experience leading product development, management, strategy, and operations in multiple global technology organizations. He previously worked at Level (3) Communications, First Data and DISH Network. He is certified as a New Product Development Professional from the Product Development and Management Association.
By Paul Sweeting
There are plenty of live-streaming platforms out there for anyone who wants to set up their own broadcast on the cheap. But few have caught on as quickly or generated as much buzz as Meerkat, the barely month-old streaming app that rides atop Twitter.
Or at least it did until Friday, when Twitter abruptly cut off Meerkat's ability to easily access users' lists of followers to automatically alert them when a new "Meerkast" is in progress. The move was neither unprecedented for Twitter, which has never been overly developer-friendly, nor particularly surprising insofar as Twitter announced its acquisition of Periscope, a competing live-streaming app, reportedly for $100 million, on the very day it shut the door on Meerkat.
So much for platform neutrality.
It's not hard to see why Twitter would want to reserve the opportunity represented by Meerkat for itself, however. Live mobile streaming has the potential to become a very powerful platform in its own right.
Live video streaming is not a new technology. But the Meerkat app got a lot of things about it right. The app is launched and streams are initiated from a smartphone (so far iOS-only, but an Android version is in the works) and, like Snapchat photos, the streams are ephemeral. There is no pausing, rewinding or sharing during a Meerkast (although the originator can download a video of the stream).
By Paul Sweeting
The full text of the FCC's open internet order has now been released, along with 305 additional pages of exegetical elaboration and 79 pages of formal dissents from the two Republican commissioners.
From an OTT perspective, there isn't much in the full text that wasn't already known from what the FCC released last month when it voted to approve the rules: The order's "bright-line" rules against blocking, throttling and paid prioritization do not apply to commercial interconnection arrangements. However, the FCC will consider complaints regarding those arrangements and will take (unspecified) enforcement action if an ISP's behavior is determined to violate the order's "general conduct standard," which prohibits actions that "unreasonably" interfere with or damage consumers or edge providers.
As the media and entertainment industry continues its rapid digital transformation, post-production houses, and those that work tangentially to the post house, are experiencing a bit of trepidation about what the future may hold. Even the term "post house" itself, which implies a physical structure, has become something of a misnomer as a new generation of SaaS-based companies enters the post-production space, offering cloud-based, end-to-end platforms for production, post-production, distribution and delivery with very limited physical infrastructure. The challenges and opportunities facing post houses are significant as studios and filmmakers seek out more collaborative, connected, cost-effective and scalable platforms to manage their content.
The Future of the Post House
It is widely accepted that the media management role of the post house, along with key staff positions such as the digital imaging technician (DIT), remains exceptionally important in the modern film industry. The management of the media itself and the need for efficient quality control have also helped enhance the role of the post house.
However, current trends in the industry have reduced the need for a traditional post house as it was once understood. For example, one of the most outwardly visible signs of Hollywood's conversion from film to digital is the successive closure of the town's once-great film processing labs. In May, the Deluxe Hollywood lab, which was built on the Fox Hollywood film lot in 1919, finally closed its doors. Technicolor has shuttered both its Glendale lab and its iconic boxy black building on the Universal lot (now a cutting-edge NBC Universal media center). And this past year, the Academy of Motion Picture Arts and Sciences offered the ultimate posthumous tribute to the film-processing business when it presented an honorary Oscar to the men and women who operated the labs, with Christopher Nolan giving a stirring eulogy for their "more than a century of service to the motion picture business."
These lab closings have helped spark an intense debate over the direction of post-production in the digital age. One of the key questions is whether a centralized brick-and-mortar post facility still makes sense at a time when so many film companies are turning to a cloud-based post-production model, in which work is globally dispersed and subcontracted to a wider range of companies.
On the one hand you have giants of the business like Technicolor, Deluxe and ModernVideoFilm, which have made significant investments in physical infrastructure. All these major players are transforming themselves and adding new digital services and cloud-based offerings to meet the changing needs of their clients. On the other hand you have companies like Hulu Post and platforms like Amazon Web Services and the Google Cloud Platform, which allow for a virtual workflow and don't necessarily require a physical infrastructure. Both sides are searching for a business model adaptable to an industry that requires ever more streamlined and connected solutions to manage the digital production and post-production value chain.
I've worked on the creative, production and technology sides of the film industry for more than two decades and have seen countless business models come and go. When we started FilmTrack more than 14 years ago, the relationship between content creators and post houses had been largely unchanged for decades. Film companies would simply hand over their materials to a single post house with instructions for distribution and delivery, and the post house would handle the entire back-end servicing.
Today FilmTrack manages the content and data for close to 200 companies and works with some of the leading post houses to define what the next-generation post house will look like. I've seen the complexities of this process through our employees' involvement in organizations like the Hollywood Post Alliance, EIDR and SMPTE. The forces driving these changes to the post-production landscape are manifold: vast improvements in technology and processing power; huge increases in digital, file-based content that is cheap to produce; and the vast amounts of data and metadata that must now be cataloged and stored for every film and television series.
The evidence is right before our eyes: Studios and filmmakers are now thinking in terms of end-to-end solutions for their artistic and business workflow, from ingestion to vault/storage, transformation, QC, delivery and commerce. More and more they're relying on cloud-based platforms, which have the potential to provide content owners, distributors and their customers with a safe, secure and sophisticated model for the long-term life-cycle management of their content. In conjunction, the rate card for services found within the traditional post house is changing as functions like encoding and transcoding can be done with out-of-the-box software.
However, when you look at final color correction, visual effects and sound mixing, these processes are more complex than ever, and no less time consuming than 10 years ago. According to Bob Pfannkuch, an industry pioneer who founded Rank Video Services, now a division of Deluxe: "The post house of the future may be called a 'finishing house,' not a 'post house.' It will be known for taking content that is 90 percent done and finishing it."
Such changes have fueled consolidation among the industry's bigger players, with traditional rivals like Technicolor and Deluxe working together to offer complementary services. This evolution has also created opportunities for new SaaS-based companies, which provide a whole new level of flexibility and collaboration to meet evolving industry needs.
They're also fundamentally transforming the way in which studios do business with post-houses. As James Staten, a Vice President at Forrester who blogs about cloud computing and next-generation business intelligence, points out: "Disney's Frozen required 50,000 CPU cores crunching simultaneously to process its 3D effects and meet its opening date. The next Frozen, shot in 4K, will up the effects complexity 10-12x, according to visual effects experts." Staten also observes that "on-premise workflow systems are hitting the limits both in ability to on-board and manage a federation of identities and support the collective editing of the growing video files. As such, nearly all the major workflow tools makers now offer SaaS-based workflow systems that are either used purely in the cloud or in a hybrid mode with some workflows on-premise and others delivered from the cloud."
This is forcing industry professionals and technologists to start thinking in terms of file access rather than file transfers. Right now the major emphasis is on speed and bandwidth: how fast you can transfer files. But we want to get to a place where we're giving access to files, not transferring files around. On a consumer level, we're already doing this with music, pictures and email that are stored in the cloud and accessed with different devices. We're going to get there on the B2B level too: eventually the file will exist in one place in the cloud. That will bring cost savings and reduce concerns about asynchronicity, since everyone will be working on the same file at the same time.
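The access-not-transfer idea can be sketched in a few lines of Python (a toy model, not any vendor's actual API): collaborators hold references to a single cloud object rather than local copies, so an update is visible to everyone at once and there are no stale copies to re-sync.

```python
# Toy illustration of "file access" vs. "file transfer": one master
# object lives in the cloud store, and every facility holds a
# reference (a key), not a copy.

class CloudStore:
    """Single authoritative home for an asset (hypothetical)."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]

    def update(self, key, data):
        # Everyone holding the key sees the new version immediately.
        self._objects[key] = data

store = CloudStore()
store.put("masters/ep101.mov", b"v1")

# Two facilities hold references, not copies.
editorial_ref = "masters/ep101.mov"
color_ref = "masters/ep101.mov"

store.update(editorial_ref, b"v2")
assert store.get(color_ref) == b"v2"  # both see the same, current file
```

In the transfer model, the color house would still hold its own `b"v1"` copy after editorial's update; the single-home model removes that asynchronicity by construction.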
These changes cut across the whole production, post-production and distribution cycle. Dailies captured on set are now routinely managed through virtual platforms that can be run for directors and producers in far-flung locations and allow for transcoding and color correction on premise. The dailies business has quickly morphed into another role managed by the DIT and others.
Furthermore, the management and delivery of global marketing assets has been radically transformed by the advent of cloud-based DAM services. In other words, a lot of things that used to be done with hardware processing are now being done with software updates and SaaS solutions. However, creative services for quality films with extensive digital effects will still be performed by the post house.
Bob Pfannkuch points out that "regardless of whether you have a big centralized lab or remote workers utilizing the cloud, the common and necessary thing for a post-production house is to tie information together: the necessity to keep track of who's doing what and where everything is."
Moshe Barkat, CEO of Modern VideoFilm, agrees: "Regardless of whether you have a big lab or people at home with storage in the cloud, you need a data infrastructure so that everyone can collaborate."
That's been a core mission for FilmTrack, as we team with partners across the industry to help them collaborate and connect the dots across the entire life cycle of their IP. At FilmTrack, we understand there is no one-size-fits-all solution. That's why we've made all metadata fields user-definable and configurable, allowing clients to develop standards that fit their unique business needs.
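As an illustration of what user-definable metadata fields might look like in practice (a hypothetical sketch, not FilmTrack's actual data model), each client can declare its own fields and types at runtime and validate records against that schema:

```python
# Hypothetical sketch of user-definable metadata: clients declare
# their own fields at runtime, and records are validated against
# that client-specific schema.

class MetadataSchema:
    def __init__(self):
        self.fields = {}  # field name -> expected Python type

    def define_field(self, name, ftype):
        self.fields[name] = ftype

    def validate(self, record):
        for name, ftype in self.fields.items():
            if name not in record:
                raise ValueError(f"missing field: {name}")
            if not isinstance(record[name], ftype):
                raise TypeError(f"{name} must be {ftype.__name__}")
        return True

# One client might track territories and license windows; another
# client could define entirely different fields.
schema = MetadataSchema()
schema.define_field("title", str)
schema.define_field("territories", list)
schema.define_field("license_years", int)

assert schema.validate(
    {"title": "Example Feature",
     "territories": ["US", "CA"],
     "license_years": 7}
)
```

The point of the design is that the schema is data, not code, so each client's "standard" can differ without changing the platform.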
As we consider the future of post-production, one thing is clear: the business of content is expanding, not shrinking. Emerging distribution platforms like Netflix, Hulu and Amazon are fueling this transformation, as is the rapid proliferation of devices on which content is being viewed, from iPhones and Android devices to smart TVs, HD sets and 4K sets, all of which have different constraints. Furthermore, the consumer is becoming more and more demanding, expecting more personalization and interactive options. The critical question is whether cities like Los Angeles, or even the U.S. itself, will continue to serve as the hub for post-production.
By Paul Sweeting
HBO just can't quit the bundle. With HBO Now, its new over-the-top streaming service, the network for the first time is making its content available to stream without a pay-TV subscription. But HBO still hopes to sell it as part of a bundle. The only differences are the other components of the bundle and the identity of the bundlers.
At launch, HBO Now will be sold exclusively by Apple and available on Apple devices only. According to HBO's FAQ, "you can subscribe to HBO NOW using your iTunes account. Customers can access HBO NOW by going to HBONOW.com, through Apple TV or by downloading the HBO NOW app in the Apple App Store." Apple and HBO will then share customer support duties.
After a three-month Apple exclusive, HBO will make the service available to other digital distributors, such as Amazon and Roku, presumably on terms similar to Appleâs, with the distributor doing most of the heavy sales lifting. But the network is also very much hoping to persuade its current cable-operator affiliates to bundle HBO Now with their broadband-only offering, so far with little success.
By Paul Sweeting
Nearly a decade after Netflix went over-the-top, at least a full decade after the launch of YouTube, and more than two decades since Bruce Springsteen first sang of having "57 channels and nothin' on," the video industry, which we used to call the TV industry, is still wrestling with the problem of content discovery.
If anything, the problem is getting worse, not better, as the volume of programming and the number of program sources are both growing rapidly thanks to the new digital platforms.
Heroic efforts have been made over the years to tame the flood, using search technology, algorithmic recommendation engines and various other big-data strategies.
Rovi's Fan TV, for instance, which the company acquired late last year and reintroduced at CES in January, uses voice-activated semantic search and leverages Rovi's vast trove of video metadata to generate recommendations or locate specific titles in response to natural-language queries.
By Paul Sweeting
When a consumer's OTT video stream starts rebuffering, or suffers packet losses resulting in degraded quality, it's often hard to know where to direct blame. The problem is typically caused by congestion somewhere between the content's originating server and the consumer's receiving device.
But exactly where in the chain of transit that congestion is occurring, and more importantly who is responsible and why, can be difficult even for engineers (and virtually impossible for consumers) to ascertain.
Back when it appeared the FCC was poised to classify interconnection arrangements between last-mile ISPs and third-party transit and content providers as a new, distinct type of Title II service, the question of liability for congestion in the chain of transit suddenly became urgent for those involved in wholesale traffic exchanges.
Fearing the new classification would leave them at a disadvantage in negotiating interconnection agreements with content delivery networks (CDNs) and other transit providers, and worried they'd be blamed for problems occurring elsewhere in the transit chain, ISPs rushed to the FCC to insist that any new rules regarding traffic exchanges cover both parties to the exchange.
By Paul Sweeting
Don't look now, OTT fans, but the net neutrality rules expected to be enacted Thursday by the FCC may turn out to be not as OTT-friendly as it originally appeared they would be.
When FCC chairman Tom Wheeler unveiled his "fact sheet" on the upcoming rules on Feb. 4, it looked as if the commission was poised to adopt the "strong" version of net neutrality pushed by Netflix and others. According to the fact sheet, the rules would treat interconnection arrangements between ISPs and third-party edge providers as a Title II service subject to the same "just and reasonable" standard that will apply to ISPs' management of their last-mile networks.
Since then, however, as noted in a previous post here, even some net neutrality advocates have raised questions about the legal and statutory grounds for extending Title II to interconnection arrangements. In a letter to the commission dated Feb. 11, Free Press policy director Matthew Wood warned that the interconnection arrangements were unlikely to qualify as Title II services as defined by the Communications Act, creating an opening for a legal challenge to the new rules.
Democratic FCC commissioner Mignon Clyburn is reportedly also having doubts about applying Title II to interconnection. According to a report Tuesday by the Capitol Hill newspaper The Hill, Clyburn is seeking eleventh-hour changes to the proposed rules, including dropping plans to classify interconnection as a distinct Title II service.
By Paul Sweeting
The FCC this week is expected to approve on a party-line vote chairman Tom Wheeler's long-gestating plan to impose new net neutrality rules by reclassifying internet access as a telecommunications service under Title II of the Communications Act, setting in motion a process by which the world will finally get to see the full text of the 308-page Memorandum and Order and begin fighting, almost certainly in court, over its particulars.
One thing that apparently will not be in the order, however, is any bright-line rule banning so-called "zero-rated" data plans offered by wireless operators and ISPs, under which particular applications are not counted toward a user's monthly data cap.
"We do not take a position on zero-rating," the FCC's special counsel for external affairs Gigi Sohn confirmed last week on the C-Span program The Communicators. Instead, she said, the agency would review complaints about zero-rated services on a "case-by-case basis" to determine whether they harmed consumers.
That has many OTT providers, start-ups and VCs worried that wireless carriers and ISPs will rush to embrace zero-rated data plans, producing the same sort of anti-competitive and market-distorting effects as paid prioritization, which the new rules do explicitly ban.
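The mechanics behind that worry are easy to sketch (the app names and figures below are hypothetical): under zero-rating, only traffic from non-exempt apps counts against the cap, so a carrier's own video service can be consumed freely while a rival OTT service eats into the subscriber's allowance.

```python
# Minimal sketch of how zero-rating interacts with a data cap:
# traffic from exempt apps simply isn't counted. All names and
# numbers are invented for illustration.

CAP_GB = 10
ZERO_RATED = {"carrier_video"}  # apps the carrier exempts from the cap

def billable_usage(sessions):
    """Sum only the traffic that counts toward the cap."""
    return sum(gb for app, gb in sessions if app not in ZERO_RATED)

sessions = [
    ("carrier_video", 8.0),   # heavy use, but exempt
    ("rival_ott", 3.0),       # counts toward the cap
    ("web", 1.5),             # counts toward the cap
]

used = billable_usage(sessions)
assert used == 4.5        # 12.5 GB consumed, only 4.5 GB billed
assert used < CAP_GB
```

The competitive asymmetry is visible in the numbers: the exempt app's 8 GB costs the subscriber nothing against the cap, while the rival's 3 GB does, which is why critics compare the effect to paid prioritization.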
By Martin Porter
I don't know who else to write to... so considering that this Sunday is your big day of the year and ultimately your show is the marketing force behind my story... you're it. I have a confession to make because I have sinned.
You, better than anyone, know that it is screener season, and we all know what that means. There are discs of all those great movies everyone has been meaning to see circulating at parties and among friends, creating a virtual industry underground among those who should know better but simply can't resist watching one of your Academy Awards contenders for free.
My recent failing involved the Sony Pictures Classics picture "Whiplash," which appealed to my childhood obsession with jazz drummer Buddy Rich. It was also one of the many movies still on my pre-Academy Awards broadcast must-see list. I actually paid to view the movie in my hotel room during a recent vacation, but my viewing was cut short when my car service to the airport arrived too soon. I never saw it to the end, and I was obsessed with seeing it through (take note of an opportunity here, UltraViolet).
Unfortunately, despite a tease on VUDU that it was coming soon, the movie was nowhere to be found legally on the web. (The fact that I never considered checking out Fandango to see it in the theater is as much a reflection of my travel schedule as it is of the state of theatrical affairs.) I'm at least ethical enough to steer clear of the bootleg sites.
But then, by happenstance, the screener surfaced during one of those all-too-common industry chats that were taking place over the past few months among those somehow connected (albeit by 6 degrees) to an Academy-voting member.
By Paul Sweeting
As ISPs, both large and small, gear up to sue the FCC over its forthcoming net neutrality order, even strong supporters of net neutrality have begun pointing to potential legal problems with the proposal outlined by FCC chairman Tom Wheeler earlier this month. One of the biggest potential problems, as far as OTT providers are concerned, was flagged by Free Press policy director Matthew Wood.
As described in the fact sheet distributed by the FCC, the order will treat the "service" ISPs provide to OTT services and other edge providers as a Title II service, just as it does the internet access service ISPs provide to subscribers, giving the commission the authority to review interconnection agreements between OTT services and ISPs and potentially declare them not to be "just and reasonable" as required by Title II:
By Paul Sweeting
It's hard to remember now, but Stewart took over anchoring duties at the Daily Show nearly 17 years ago, more than six years before YouTube was founded. Yet they seemed made for each other. The show's easily chunkable format was ideal for the atomized milieu of YouTube, especially in the early days when YouTube uploads were tightly restricted by length, and the website quickly became the Daily Show's second time slot, for better or worse.
Even today, after an epic legal battle between Comedy Central's parent company, Viacom, and YouTube, the online platform remains a critical outlet for the Daily Show. As Peter Kafka noted on Re/Code, the Daily Show draws about a million viewers in its initial airing. But millions more see it on YouTube the next day on their laptops and smartphones, or at least the bits their friends alert them to via Twitter, Facebook and other social media channels.
Viacom's nearly decade-long litigation against YouTube for copyright infringement, in fact, was in large measure about the Daily Show, along with the Colbert Report, South Park and a few other properties. It was the unchecked, unauthorized uploading of clips from the Daily Show and the Colbert Report, as much as anything else, that spurred Viacom to launch its $1 billion lawsuit against YouTube (and later Google, following its acquisition of the site) in 2007.
By Paul Sweeting
Music subscription service Spotify last week hired Goldman Sachs to help it raise around $500 million at a valuation in the neighborhood of $7 billion. Private market analysts currently value the company at around $6 billion.
The new fundraising round likely pushes back any plans the company had for an IPO, no doubt disappointing some investors. But it buys the company some time before it has to focus on IPO prep as it gets ready to face its first real competition. According to a report by the usually well-sourced 9to5Mac, Apple is gearing up to relaunch a Beats-branded music streaming service this summer.
Rather than simply dropping a Beats app onto Apple devices, the report says Apple has been working on a deep integration of Beats technology and functionality into iOS, iTunes and Apple TV.
By Paul Sweeting
We're just at the dawn of the virtual MVPD era, and we're already seeing signs of more market segmentation and product differentiation than with the current, facilities-based service provider model.
On the heels of Dish's breakthrough launch this week of its Sling TV service, Sony has begun to pull the curtain back a bit on its own virtual pay-TV service, PlayStation Vue, which is expected to launch by the end of the first quarter. GigaOM's Janko Roettgers got a sneak peek courtesy of a beta tester, including some screen shots of the UI, and it's clear the Sony service is a very different animal from Sling TV.
Unlike Sling TV's low-priced, slimmed-down bundle of a dozen channels built around ESPN, PlayStation Vue includes a nearly full load of broadcast and pay-TV networks (over 70, according to the list provided to GigaOM), along with catch-up VOD and cloud-based DVR functionality, and is likely to cost $60 to $80 a month, roughly the same as traditional cable or satellite service.
The difference in the bundles reflects the very different audience segments Dish and Sony are targeting as well as their different strategic goals. Sling TV is targeted at the 10 million or so U.S. households, many of them counted among the Millennials, who currently have broadband service but do not subscribe to pay-TV.
By Don Terry
Big Data... Hadoop... data lakes? Everywhere you turn there is a lot of industry buzz in the news about the value of "Big Data" and the potential for this exciting new technology.
Big Data may indeed be a buzzword, but if so, it's a buzzword that can have a measurable and incredible impact on a company's top and bottom lines.
At its core, the concept of Big Data is that of supporting executive decision-making with the most accurate, current, comprehensive and comprehensible presentation of all information available regarding a business. Unstructured data is doubling every year, per IDC, driven by mobile devices, gaming consoles, social media, the Internet of Things, second screen and digital "non-linear" television viewing. But why does this matter? The promise was that Big Data was going to cure cancer, make our lives easier and change our lives forever.
By Chuck Parker
It seems everywhere you look these days there is something about "the Cloud" in front of you. Twitter, LinkedIn, the tech press, and seemingly every press release you read has the various players in the Media and Entertainment industry describing what they can do for you in the cloud.
The promise of the cloud is BIG. At its most basic level, there is the opportunity for a company to turn its fixed CAPEX investments into variable (or "burstable") OPEX spend when required. For smaller companies and companies without legacy infrastructure, this is potentially the best way forward, so that their costs are directly tied to their revenue stream, whether those requirements are storage, transcoding or rendering for post-production and visual effects workflows.
For larger and more established companies, it is the opportunity to exceed current infrastructure capacity to take on that surprise project. It is also the opportunity to set an investment-level threshold where companies build to the "trough" rather than the "peak" for the inherent variability in the media industry business season.
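A back-of-the-envelope model makes the "build to the trough" argument concrete (all numbers below are invented for illustration): own capacity for the baseline, rent burstable cloud capacity only in peak months, and compare that against owning peak capacity year-round.

```python
# Hypothetical cost comparison: build-to-peak CAPEX vs. build-to-trough
# CAPEX plus burstable cloud OPEX. Units and prices are invented.

PEAK_UNITS = 100          # capacity needed in the busiest month
TROUGH_UNITS = 40         # baseline capacity needed every month
CAPEX_PER_UNIT = 120      # annualized cost of owning one unit for a year
OPEX_PER_UNIT_MONTH = 15  # cost of renting one unit for one month

# Hypothetical demand curve: a busy season, then a long trough.
monthly_demand = [100, 90, 60, 40, 40, 40, 40, 40, 40, 50, 70, 100]

# Option 1: own enough capacity for the peak, idle the rest of the year.
build_to_peak = PEAK_UNITS * CAPEX_PER_UNIT

# Option 2: own the trough, burst into the cloud above it.
burst_unit_months = sum(max(0, d - TROUGH_UNITS) for d in monthly_demand)
build_to_trough = (TROUGH_UNITS * CAPEX_PER_UNIT
                   + burst_unit_months * OPEX_PER_UNIT_MONTH)

assert build_to_peak == 12000
assert build_to_trough == 8250   # cheaper, despite pricier per-unit OPEX
assert build_to_trough < build_to_peak
```

The crossover depends entirely on how spiky demand is and on the OPEX premium, which is why the cloud case is strongest for businesses with a pronounced seasonal peak.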
But this isn't a new promise in the IT world. Back in the early 2000s, this promise was held out to the largest companies in the guise of "outsourcing." What's changed now? First, CEOs and CFOs in the M&E industry are well educated now about the "cloud" and understand enough to know that their businesses should be at least experimenting with workflows in the cloud.
Additionally, the "burstable" nature of the cloud means that businesses can actually "test drive" new capabilities in their workflows without significant investment or risk to their business. These two structural changes have resulted in a proliferation of "back office" workflows moving to the cloud across industries. SaaS (the ultimate cloud approach, where the application and infrastructure are paid for "by the drink") has been a driving force here, allowing companies to put their expense systems, HR systems, and even sales and CRM systems into the cloud (think SalesForce.com) with great success for the companies deploying them.
But putting production systems into the cloud has been elusive. These applications are both complex and customized to the point where SaaS is not really an option. Even when two companies are using the same rendering application for their workflows, they are often managing their compute and storage in entirely different ways. So the industry has coined a new term to attempt to educate the CxO suite on how to approach this landscape: IaaS, or "Infrastructure as a Service."
At its most basic level, when you retain control of the application but lean on the cloud for storage or compute resources, IaaS describes your approach to leveraging the cloud. But while this approach has better economics and risk models than the "outsourced" approach of 10 years ago, it isn't exploding at the rate you would expect given its promise of "on demand" and "less investment."
So what is holding the M&E industry back from investing in the cloud for its primary workflows?
Two things: security and connectivity.
Security. While every major cloud provider goes to some length to describe the security protocols that protect customers' data on its servers, we still hear horror stories every day about large companies being hacked for their valuable resources (Target and Home Depot are the most recent infamous incidents). In our specialized industry, all of us know that a single breach of pre-release materials can be the death of a company, and no amount of promised encryption, whether from established or emerging cloud platforms, can alleviate those fears.
Further, if the project you are working on isn't your IP, you are likely already bound by contract to use certain security measures that preclude using "public" cloud infrastructure for your workflows. The ability to audit security processes and posture is important to trusting partners in the service chain and remains a requirement for the most important content workflows.
Connectivity. The challenge of delivering the promise of "burstable," "on demand" storage and compute power to these resource-intensive applications comes down to the internet's age-old axiom: sustainable bandwidth. While there are plenty of companies that can drop a multi-gig connection to a cloud provider, there are few with the expertise to connect your application to the cloud resources right where they need to be and integrate with your existing network and workflow, taking account of aspects like low-latency requirements.
Even then, just finding a "large pipe" for your data doesn't complete the business model: if you cannot get your bandwidth to be as "burstable" as your storage and compute power, the investment model for cloud falls apart.
At Sohonet, we believe the key to unlocking the M&E industry's "cloud potential" is the ability to offer studios, post-production houses and visual effects companies options for getting their applications connected to public and private cloud infrastructure in a manner that meets their low-latency and security requirements while still meeting the "on demand" business model to support their ability to "burst" into the cloud for production.
We believe that the M&E industry will embrace a mix of three approaches to meet its production workflow and business model requirements. Inherent to all three approaches are unlimited or inexpensive egress (essential for the unpredictable production process), improved security posture, and (most critically) access to 24/7 support resources that understand the industry's unique workflow requirements.
- Low-cost access to generic compute and storage resources (public cloud) coupled with sustained low-latency bandwidth that includes unlimited egress.
- Access to application-specific low-latency and/or industry standard security approaches (private cloud) for storage and compute resources coupled with sustained low-latency bandwidth that includes unlimited egress.
- High-speed, burstable connectivity to major cloud providers where support for the application and security are already "in-house" and the only missing component is the "burstable" bandwidth directly into their resource center that provides the lowest possible latency and inexpensive egress while still improving the security posture.
We believe that access to cloud resources is critical to the industry's progression along ever-increasing data storage and compute requirements, as 4K workflows begin their progression to 8K and High Dynamic Range workflows and as 4K consumption becomes mainstream in consumer homes. As the trusted communications partner for the M&E industry, Sohonet is committed to providing the same Fast, Flexible and Phenomenal customer service that has built our brand and reputation over the past 15 years, and we believe that delivering on the promise of "Connected Cloud Services" is critical to our customers' future.
By Alan Wolk
For all the debate around who should be in charge of second screen and social TV efforts, one thing is becoming very clear: the key to success rests with the showrunners.
That's because when the showrunner is involved, along with the actors and the writing staff, the second screen experience feels like an actual part of the show, not some sort of bolted-on afterthought. In fact, a recent study from Twitter, Fox and the Advertising Research Foundation revealed that 40% of viewers prefer to see tweets from cast members, versus 18% who wanted to see tweets from the official show handle.
This stands to reason on many levels: the type of viewer who is fan enough to want to tweet about a show is the type of viewer who's likely formed some sort of connection with the actors and wants to read their tweets...
As my kids get older, I've had to cede control of the car radio. One result? Thanks to the forced waning of my NPR habit, I'm much less interesting at cocktail parties than I care to be. Another consequence? Trying to smoothly navigate radio programming that doesn't meet my parental, shall we say, scrutiny, nor assuage the pop-cultural tastes of the wannabe teenager.
Satellite radio, long dominated by SiriusXM Radio, brought peace of mind to parents everywhere by offering up more programming options in the car than we ever thought possible. But, like the living room before it, the car is becoming the latest field on which today's digital media game plays out.
New streaming music providers offered multi-platform music experiences that were highly personalized, mobile, and which threatened to make in-car satellite services too niche. (Register here for the Sept. 23 webinar to learn how SiriusXM navigated new channel demands with analytics and explore how SiriusXM is leveraging customer data integration, behavioral analytics, and real-time interaction to build deeper relationships with their subscribers.)
With more than 25 million subscribers, SiriusXM didn't seem to be in a bad position. Still, with increasing competition from streaming music entrants like Pandora and Spotify, SiriusXM Radio needed a strategy to future-proof its business and meet subscribers' demands for content anywhere, at any time.
Like many companies that don't fully realize the treasure that is customer and marketing data, SiriusXM had outsourced its marketing technology. When the time came to quickly respond to the shifting music market, a key first step was to bring their most valuable asset back in house. This step marked the first in a journey for SiriusXM Radio to develop and deploy next-generation marketing analytics that would allow them to reinvent their relationships with subscribers and prospects, and take ownership of their data.
Across media & entertainment and digital media, the expectations of consumers continue to shift. More than ever, audiences expect deeply personal messages on their platform of choice, at the right time. Yet many companies don't leverage their most valuable asset: detailed insight into audience behavior. And, in some cases, that data is left to third parties to manage.
If your business is ready to take control of marketing data to drive a more personalized, engaging customer experience, here's your chance to learn from the top. SiriusXM Chief Information Officer Bill Pratt is sharing his experience of his company's analytics journey in a live, one-hour webinar on September 23rd.
In this M&E Journal Digital Exclusive, Rakesh Nair of Dell discusses how the company is powering the compute-intensive needs of the Media & Entertainment industry with its products and technology for its creative partners. For example, the 45-second tracking shot of Paris that kicks off the Academy Award-winning animated film "Hugo" entailed some serious animation rendering.
Pixomondo, the visual effects company tasked with rendering the film, needed high-powered computing to support this complicated composite, which, for just that 45-second tracking shot, cost tens of thousands of dollars in power alone. It is not hard to imagine, then, that things like hardware performance, technology footprint and heating and power efficiency play a major role in the ability of Pixomondo and similar companies to turn a profit.
2nd Screen Viewing Experiences: 73% of TV Everywhere views are on a 2nd Screen (ReelSeo, Feb. 6). 89% of video views on the BBC's iPlayer are VOD vs. live.
If Facebook's new acquisition, Oculus Rift, sounds like something out of a science fiction movie, your gut isn't that far off. The virtual reality headset maker, snatched up for an astonishing $2 BILLION in a surprise move, is the ultimate in geek chic. The device, which can create rich, immersive virtual reality experiences for gamers and beyond, already has a devoted if pocket-protector-wearing fan base. But what does it mean for Facebook?
Aside from broad-brush comparisons to Apple and Google, companies which squarely straddle the software and hardware divide, there's a buzz in Hollywood that this puts Facebook squarely in the movie business. Huh? Some media analysts say that we should think of Oculus this way: it's just another screen.
While that may be true in the long view, I'd argue that Facebook was ALREADY in the movie business, even without trying to out-Google Google Glass. And the current Facebook movie business doesn't demand another screen. Facebook, and its cousins Twitter and Pinterest, are the collective mouthpiece for audiences to share what they think about anything Hollywood puts on any screen. I'm talking about what you or I or our mothers (Yes, it's true. Your mother!) have to say about movies and television on social media.
Those comments or "likes" are untapped gold in Hollywood. They reveal your audience's interests in ways that box-office numbers still can't. With the right analytics tools and insight, you can mine this data to learn EXACTLY what members of your audience think. What they think about your movie. Your talent. Your second screen app. Your recommendations. Your levels of customer service, if you're in the subscriber business. I'd argue Facebook and Twitter and their social media cousins have more to teach Hollywood than Hollywood has to teach Hollywood. The trick is, is Hollywood ready and able to listen?
The ability to analyze social media and behavioral data, and, most importantly, to do so in a way that can be looped back into bigger marketing and planning operations, is essential to making and monetizing content in the new M&E ecosystem. But the fact is, few studios and distributors are doing this well. Read this article to get a better sense of how social media analytics need to play into your content and analytics strategy.
Speaking of Hollywood: the industry converges in a matter of days at NAB. Check back here for your NAB wrap-up, insights and ideas. Until then, sci-fi friends and believers, may the force be with you.
It’s been another fast-paced week in the digital video and second screen industries. While the OTT video world is still reeling from the previous week’s announced Disney Movies Anywhere service (a serious threat to UltraViolet) and Marvel’s announcement of an exclusive output deal with Netflix (continuing to threaten HBO), second screen took a shot in the arm from the Oscars, and Roku mounted an attack on Chromecast. At a glance:
- “Watch ABC” did Second Screen for the Oscars “right”
- Ellen broke Twitter
- Roku announced their “streaming stick” device
- Dish struck a deal with Disney to delay commercial skips
- FreeWheel was acquired by Comcast
- The BBC announced the death of analog for its Channel 3 service
- Aereo lost its court battles in Salt Lake City and Denver
It's hard to imagine a technology company with more Media & Entertainment clout. Or are they an entertainment company with massive technology chops? Either way, Netflix's literal invention and dominance of the OTT market has revolutionized the way content is consumed. And now, they've even successfully re-engineered the way that content is created. A true sign of the Netflix zeitgeist? They've inspired a new lexicon for how audiences watch content, with new language like "cord-cutting" and "binge watching."
As if you need proof that the language of Netflix is real, here are some hard facts. By the end of the first weekend following its Valentine's Day release, the second season of House of Cards was streamed in its ENTIRETY by more than 2% of Netflix subscribers. With more than 40 million subscribers worldwide, that means roughly 800,000 people binge-watched an entire season in a matter of days.
Is that kind of massive success a surprise for a show that wasn't even subjected to a real pilot process? Not for Netflix. Very little about how, when and where their audiences watch content is a mystery. That's because Netflix uses sophisticated analytics to evaluate a billion transactional events per day. Every nuance of audience interaction is mined to drive their business forward, from securing the best content at the best prices to developing meaningful and targeted recommendations that keep their subscribers watching and wanting more. Netflix's analytic muscle is so strong that they attribute 75% of their streaming activity to recommendations.
And that's just the beginning of how Netflix uses data to drive their business. In a rare live webinar, Netflix analytics thought leader Kurt Brown will share how this Media & Entertainment pioneer is using analytics in the cloud to drive its business. To learn more or register for the March 18th webinar, click here.
2nd Screen had a tumultuous run-up to CES 2014, with the press continuing to be split between hype and disillusion. While we normally would have written and presented this update at CES, we decided to focus on releasing our research on monetization on behalf of our society members to help them and their primary stakeholders (investors, customers, management) cut through the hype and the disillusionment and focus on clear examples of what is working. Ironically, the additional insight gained in the first few weeks of January has been invaluable with regards to both consolidation (Yahoo closing IntoNow) and M&A (Viggle buying Dijit, TiVo buying Digitalsmiths).
What a crazy week. As if it wasn't enough for NATPE to be taking place in Miami (with some great research and stats published about second screen), there was a ton of consolidation activity in our industry (Viggle buying Dijit, Yahoo shutting down IntoNow) and some rebranding by GetGlue. At the same time the 2nd Screen Society (S3) published a teaser on its new research about monetizing the second screen, and then Gigaom and TechCrunch wrote some pretty disparaging views, with Gigaom reverting to the salacious headline of "Social TV is Dead".
- NATPE. Chris Tribbey wrote up a pretty decent summary of the content creators’ panel during NATPE discussing the insights from the CEA/NATPE research, presenting some GREAT stats about second screen usage and more importantly, a strong view from content creators (“Show Creators See Second Screen as Permanent”).
- Yahoo’s IntoNow. Yahoo made a decision to shut down IntoNow, the synchronous enhanced viewing experience app they acquired only three years ago. I think two developments led to this decision by Yahoo: 1) it wasn’t a very engaging experience (too broad and shallow) and was likely not attracting a ton of consumer engagement, and 2) Yahoo’s Screen app is taking off and has cemented their view that focusing on engaging the consumer around the viewing experience was a more attractive monetization play. Let’s face it, Adam Cahan founded Auditude and IntoNow and is now Marissa’s right-hand man for all things mobile video at Yahoo – he didn’t do this without thinking it through.
- Digitalsmiths acquired by TiVo. What a great validation of how important Discovery is in the second screen ecosystem. Led by Ben Weinberger, Digitalsmiths has been quietly winning most of the MVPD operators in the U.S. with their personalization and recommendation platform. The cash purchase for $135M by TiVo is certainly validation that the space is valuable for investors, but more directly indicates that TiVo is going to keep moving in the direction of creating great viewing and companion experiences for the living room (their current experience on a smartphone and tablet is already amazing and getting better all the time, led by Tara Maitra and Evan Young).
- Dijit acquired by Viggle. Viggle is perhaps one of the most successful at monetizing the second screen companion experience (a short section in our research sheds light on their success). Dijit’s Nextguide is perhaps the most engaging consumer Discovery experience (yes, better than Fan), with significant broadcast partnerships on their tune-in “reminder button” feature. I am convinced that with Jeremy Toeman leading the UI/UX and Greg Consiglio leading the monetization, this marriage will be a happy one for their shareholders, their customers (brands and TV networks) and consumers.
- The Grammys. Why is this important? I am sure a ton of stats will come out this week about how many tweets, etc., were pushed during the broadcast. But did you see that Chromecast commercial? Somehow Google managed to create the ultimate Discovery and Control powered second screen experience, stealing that opportunity from Apple, Netflix and Samsung. They have not only created a pervasive and passive experience that seamlessly allows consumers to "cast" their viewing experience from their second screen to the first, but by using the DIAL protocol, the second screen is then freed up for a companion experience (or synchronous advertising – see our research). The evidence of the commercial during the Grammys means they are SERIOUS about being successful with their $35 dongle. Apple, Sony, Samsung and Roku should take heed.
- GetGlue. What does that mean to you? It launched many, many moons ago as an attempt to create a social network around your viewing (and reading and wine drinking) habits, letting consumers check in to a show and share with their Facebook or Twitter friends. i.TV bought them last fall and, no surprise, has decided to re-brand them into something that speaks to the opportunity Shazam is busy uncovering – tvtag. Despite Gigaom's view on this, I think this is positive in that it means the i.TV management recognizes the opportunity (engagement with the consumer at specific points of the viewing experience) and the threat (Twitter is chasing this and so is Facebook). Will they be successful? Who knows, but consumer behavior will continue regardless.
- Shazam. Perhaps more interestingly, Shazam is making a big push during the Super Bowl this year to see if their tremendous growth in active users can move the needle on advertising and consumer engagement during the world's largest live viewing event. While I had been skeptical over the previous 18 months, their new CEO Rich Riley (who joined in April 2013) seems to have turned the ship in the right direction, racing towards a UX that both engages the consumer and provides a monetization platform. And a partnership with Facebook isn't a bad idea either. Watch this space (and that Jaguar commercial).
- Social TV is dead. Hmm. Did you read the article? First of all, yes, I agree that the gimmicky concept of social (badges, check-ins) is challenged, but keep a few things in mind: 1) second screen experiences can typically be broken into 5 segments, only 1 of which is the sharing or social aspect. 2) All of the hype from Gigaom and TechCrunch in the last 3 months has been that the Social TV battle is down to Facebook vs. Twitter – neither of which is going anywhere or walking away from TV. 3) Read the last 3 paragraphs and you will see both confirmation that they still believe in point 2 AND that the right engaging experience still needs to be developed – confirmation of the fundamentals above (that the consumer behavior will continue despite the poor UX). Conclusion: a salacious headline that certainly made MANY people read the article.
It's a sight not all that unfamiliar to new parents: an ashen, red-eyed baby, shrieking uncontrollably and spewing bile in its path. In fact, come on over to the Quinn household, and you can witness the excitement first hand.
But, if you've been on any social network lately – and you haven't been hiding under a rock – you've seen the infamous "devil baby". This little hellion has amassed more than 36 million views on YouTube since taking the internet by storm. And, while the antics may not surprise the average new mother, this baby is no ordinary kid.
Devil Baby was an inside-Hollywood prank – the marketing brainchild of 20th Century Fox in anticipation of their new horror flick called Devil's Due. But, more than being just a trick, Devil Baby is a digital marketing phenomenon, revealing new data-driven marketing table stakes for today's Media & Entertainment market.
Here's what you can learn from Devil Baby:
- To Go Big, You Need To Go Viral – There is no surefire recipe for viral success. But one thing is certain: big data analytics can increase your odds. Take the dominant video network, Machinima, for example. Their business is ensuring the rapid and massive uptake of content across a wide swath of users – and they do that by not only creating awesome content, but by using behavioral analytics to identify key networks, influencers and consumers ripe for that experience. (You can also check out our exclusive Machinima White Paper here!)
- Share of Voice is Great; Share of Wallet is Better – You know a meme has hit the mainstream when my mother makes a comment about it. But buzz isn't enough in today's competitive content landscape. It's about the bottom line. Content creators, studios and distributors need to be able to use that buzz to predict performance and drive revenue. Being able to tap into, analyze and act on the ocean of big data – including social sentiment and network analysis – is a key factor in the age of the Connected Consumer.
Data-driven marketing – and execution through an end-to-end integrated marketing strategy – doesn't need to be as painful as an exorcism. But it will take the right tools and know-how being driven deep into our marketing organizations. Devil Baby is one of many examples we'll see of today's marketing revolution being driven by big data.
Speaking of red-eyed hellions, nap time is over… and my second shift is calling.
Second screen, social media and companion applications are all high on the agenda of executives in the media and technology industries. As a reflection of this, all major TV and technology conferences in 2013 – CES, NAB, IBC, and MIP – had several sessions dedicated to second screen. But second screen, while proven as a reality of consumer behavior, is not yet widely seen as a revenue driver. Indeed the reality of the second screen phenomenon is accepted, as is proven by the continuous flow of statistics showing that viewers use another screen in front of their TV (one of the latest being Nielsen saying that 75% of smartphone and tablet users are engaging with second-screen content more than once a month as they watch TV). Another proof of the generalization of second screen is the multiplication of companion screen applications: over the course of 2013 they became widespread in new geographies including the Middle East, Eastern Europe and Latin America, where they had little presence only 12 months before. Comparing the space with 2012, it is clear that no TV player can ignore it. Even more striking, the players behind some of the most successful apps are large and well established: Peel now has 40m+ downloads, mostly through a global partnership with Samsung; Apple bought Matcha in August 2012; zeebox grew their partnerships with Sky, Comcast (NBCU) and Foxtel, while DirecTV acquired a share of i.TV (which bought GetGlue at the end of 2013); Viggle has a longstanding partnership with DirecTV; Comcast has launched "SEE iT" with Twitter; and the Xbox SmartGlass app was downloaded more than 17m times. Despite this popularity and the presence of the largest players, few industry executives dare to speak openly about monetization of second screen applications, and only a small percentage of 3rd-party app providers have made their progress public.
There may be good reason for the industry stalwarts to keep their progress private with commercial competition so tough, but the sceptics of course believe that is because no one has actually experienced much monetization success. So while many in the industry are wondering where the money is in second screen, next to nobody is ready to "show [you] the money".
The purpose of our research paper is to do exactly that: "show you the money". We review the various monetization strategies used by second screen companion and viewing applications and evaluate how these strategies work and which ones drive the most value. We also provide an evaluation of the second screen market size and review its main drivers. Finally, we review how Twitter, Microsoft, Samsung and other players not directly in the second screen ecosystem are planning to use the second screen to increase their revenue.
More importantly perhaps, we have taken the time to update our market sizing from last year in an effort to demonstrate where the large opportunities for monetization lie for players in the ecosystem.
Feel free to explore our research and infographics on our website, engage us on Twitter (@ChuckParkerTech, @S32Day), or meet us in person at Mobile World Congress (Feb 26th in Barcelona) or at NAB (April 6th in Las Vegas).
- Q4 2012. 35 million tablets sold in the U.S. alone during the Christmas rush, and significant social TV and 2nd Screen engagement growth in all scenarios.
- Q1 2013. Clear evidence of "t-commerce" from 2nd Screens and the continued growth of active zeebox and Viggle subscribers – bellwethers for the industry on consumer engagement in 2nd Screen.
- Q2 2013. Hyper growth in mobile video viewing, especially ad-supported – a key trend to observe for the potential of 2nd Screen monetization in converged experiences.
- Q3 2013. Continued viewing growth on mobile, strong revenue growth from enhanced 2nd Screen engagement apps and the launch of Chromecast – an opportunity for any 2nd Screen app developer to give Discovery and 1st screen Control capabilities to their video viewing experience.
continues to be stronger where there are pervasive yet passive opportunities for engagement with the consumer.
- 1. Increased consumer engagement in the content. The majority of the investment in 2nd Screen companion and viewing experiences is coming from the content creators and distributors (primarily the TV networks). Creating a lift in engagement (i.e. viewing time) translates to increased revenue regardless of their monetization model.
- 2. Increased consumer engagement with the advertising brands. The vast majority of the content ecosystem focused on 2nd Screen monetizes its content through advertising in some form. As major brands place bets in this space, they are focused on metrics like "Cost per Touch" instead of impressions delivered (CPM). The brands crave interactivity and engagement, working to determine which consumers are interested enough to move forward in their purchase cycle.
- 3. Monetization itself. While major TV networks and brand advertisers can get comfortable with metrics that have a strong correlation to monetization, many of the 1st and 3rd party engagement developers depend on revenue coming in the door to support their investments – "where the rubber meets the road," so to speak, as actual payments for advertising, t-commerce and engagement come together in the 2nd Screen companion and viewing experiences.
By Colleen Quinn, Teradata Corporation
As Hollywood shifts into high gear around direct-to-consumer engagement, content creators and distributors are working fast to develop the know-how and analytic capabilities to execute. There's a lot to consider, especially for organizations that are new to the D2C fray.
Cut to Warner Bros., which is leading the charge among Hollywood studios in developing rich direct-to-consumer offerings, and the CRM efforts that make those offers successful. Michele Edelman, Warner Bros. Vice President, Direct-to-Consumer, opened the curtains on Warner Bros.' pioneering work in D2C at a recent Teradata webinar.
The virtual standing-room-only crowd had a front-row seat as Edelman described the evolution of CRM and D2C at the studio. Warner Bros.' capabilities have expertly woven together best-in-class integrated marketing with a big data strategy that gives them a detailed understanding of each member of their audience.
Launched in 2009, the Warner Bros. CRM strategy boasts massive success where it counts: in the numbers. Any savvy digital marketer knows that benchmarks are critical. Without them, there's no real way to measure your success. So, imagine Warner Bros.' excitement when they saw the powerful results driven by their new direct-to-consumer CRM programs – rapid, exponential improvements across all key marketing KPIs, including:
- 25% Email Open Rates
- 13% Click-through Rates
- Decreases in unsubscribe rates
Take a listen to the webinar replay to hear how Warner Bros. launched, refined and mastered their Direct-to-Consumer CRM and analytics strategy, featuring an extensive audience-driven Q&A.
And the big data conversation for Media & Entertainment didn't stop there! Industry thought leaders in advertising, cable, broadcasting and more convened in the Big Apple this week, as Teradata and UCLA Anderson reprised Create, Captivate and Engage – a big data analytics event with M&E in mind.
By Colleen Quinn, Teradata Corporation
It was the pen-stroke heard around Hollywood. CBS and Time Warner had (finally) reached agreement about retransmission fees. Viewers from coast-to-coast exhaled a collective sigh of relief, and switched on Pro Football.
One term at issue? The big per-subscriber fee hike that CBS demanded, aiming to double their carriage fees over the 5-year term. While a huge boost in revenue is always worthwhile, CBS's negotiations hinged on a term that is much more interesting: they wanted to retain streaming rights.
Here's why. Increasingly, streaming rights are the gateway to commanding your future. With them, content owners can seek the best opportunities to fully monetize content across every channel. But, more importantly, streaming rights often pave the way for content owners into the direct-to-consumer fray.
Going D2C means more than just having content rights, though. For studios and distributors, it means developing a keen understanding of each member of your audience. It's about having the capabilities to deliver the right content, right message and right impact.
Lots of content creators are talking about this seismic shift in the business – but only a brave few are putting their collective money where their mouths are. There are trailblazers. Warner Bros. Entertainment is one of them. Among the first to build out a robust, start-up-like technical operations organization, Warner was also among the first to take the lead with UltraViolet.
Now, Warner Bros. Vice President of Marketing for Digital Distribution, Michele Edelman, offers a rare opportunity to listen via live webinar as she shares the studio's vision and insights for launching and leading industry-changing, direct-to-consumer capabilities.
It's rare that inside Hollywood can learn from inside Hollywood – but, once in a great while, it happens. Don't miss your chance to join in!
We have often discussed in this blog the four major feature sets of second screen (to Control, to Discover, to Enhance, to Share – relevant research linked here and here). We have also reviewed what Netflix was experimenting with for leveraging the 2nd Screen as a discovery and control device via DIAL (try opening Netflix on your iPhone while it is also running on your PS3; find the blog here). Finally, we have predicted what a DIAL-enabled world might look like, with its major backers (Netflix and YouTube) driving the protocol's acceptance into every new device launch since early 2013 (DIAL blog here, 10 predictions here).
Well, Chromecast is the incarnation of all those opportunities and, at the same time, evidence of where the industry will head with rapid adoption. While we have tried to tell the Smart TV industry that the best implementation for their platform is to be the launch pad for the stream, Chromecast demonstrates that use case outright.
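For readers curious about the plumbing, DIAL discovery rides on SSDP: the second screen multicasts an M-SEARCH for the DIAL service type and reads the LOCATION header (the device description URL) from any reply. A minimal sketch of the message handling follows – the actual multicast send/receive is omitted, and this is an illustration of the protocol's conventions rather than any shipping implementation:

```python
# SSDP multicast endpoint and the DIAL service search target.
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
DIAL_ST = "urn:dial-multiscreen-org:service:dial:1"

def build_msearch(search_target=DIAL_ST):
    """Build the SSDP M-SEARCH request a DIAL client multicasts."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
        'MAN: "ssdp:discover"\r\n'
        "MX: 2\r\n"
        f"ST: {search_target}\r\n"
        "\r\n"
    ).encode("utf-8")

def parse_location(response: bytes):
    """Pull the LOCATION header (device description URL) from a reply."""
    for line in response.decode("utf-8", "replace").split("\r\n"):
        if line.lower().startswith("location:"):
            return line.split(":", 1)[1].strip()
    return None
```

A second screen app would send the M-SEARCH over UDP, collect replies for a couple of seconds, then fetch each LOCATION URL to learn which apps (Netflix, YouTube, and so on) the first screen device can launch.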
Similar to an Apple experience, the packaging of the device is simple and clean. The small dongle comes with a power cord, a USB cord (an alternative for power) and an adapter in case your HDMI port is in a tight spot.
By Zane Vella, Founder and Chief Product Officer, Watchwith
Over the last decade a familiar battle cry of the digital media executive was "Anytime, Anywhere," meaning the promise of digital for the consumer was to watch "what you want, when you want it." And as we look around today, much of that future has arrived in the form of HBO Go, Netflix, Xbox, Xfinity – all popular on-demand services now available on tablets, phones, computers, game consoles, Blu-ray Disc players, and connected TVs. So what's next?
As the MESA readership is well aware, much of the traditional entertainment distribution business is an analytic and strategic exercise in windowing and differentiation. In short, this means extracting greater return through enforced scarcity or by delivering added value through one distribution channel or partner over another. This article examines how and why time-based metadata is becoming a critical strategic asset for content owners, and how it enables new forms of windowing and differentiation across the digital distribution landscape.
A New Vocabulary
First, some definitions: "time-based metadata," a.k.a. "related content metadata," is descriptive information related to a particular scene, shot, or moment of a film or TV episode. Unlike traditional program metadata that defines general information applicable to an entire program, time-based metadata follows the heartbeat of the program content itself and includes a steady time-code or time-reference that refers to a relative time within the media asset. Fundamental examples of time-based metadata include which actors are currently onscreen, what music is playing, what locations are in the scene, and what featured products are on screen at any particular moment.
Within the realm of time-based metadata, there is also an important concept of "event types." For example, "actor," "music," "quiz," "poll," "behind the scenes video" and "production still" are all types of events or related content (which can also be thought of as layers) that are associated with particular moments in a program. Event types can be anything a content owner or producer desires that either adds value to a program or is related to the program.
One of the defining characteristics of time-based metadata is that it is information related to content which is abstracted from any particular visual presentation or consumer experience. This primarily means information in the form of text, images, and links to other Internet-based content or services.
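To make the abstraction concrete, here is a minimal sketch of how time-based metadata events might be represented and queried. The field names and event values are illustrative only, not any particular vendor's or standard's schema:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEvent:
    event_type: str   # layer name, e.g. "actor", "music", "quiz"
    start: float      # seconds from the start of the program
    end: float        # seconds from the start of the program
    payload: dict = field(default_factory=dict)  # text, image URLs, links

def events_at(events, t):
    """Return every event layer active at time t (in seconds)."""
    return [e for e in events if e.start <= t < e.end]

# Two illustrative layers tied to the program's own timeline.
timeline = [
    MetadataEvent("actor", 0.0, 120.0, {"name": "Lead Actor"}),
    MetadataEvent("music", 45.0, 90.0, {"track": "Opening Theme"}),
]

print([e.event_type for e in events_at(timeline, 60.0)])   # both layers active
print([e.event_type for e in events_at(timeline, 100.0)])  # the music cue has ended
```

Because each event is just timecoded text, images, and links, any presentation layer – a tablet app, a connected TV overlay, a web player – can render the same data however it chooses.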
Lastly, another key concept is "metadata syndication," or more simply, fine-grained control of which event types or layers of time-based metadata are made available to certain business partners, based on business rules such as time-window or geographic location. Technically speaking, metadata syndication is implemented via access credentials (a.k.a. API keys) that are provided to a distribution partner or to each application that consumes time-based metadata made available by a content owner.
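In code, metadata syndication amounts to filtering the event layers by the business rules tied to each partner's credentials. A hedged sketch – the key names, tiers, and rule set below are hypothetical, and a real system would also apply time-window and geographic rules:

```python
# Hypothetical mapping from a partner's API key to the event-type
# layers its license unlocks.
SYNDICATION_RULES = {
    "est-partner-key":    {"actor", "music", "quiz", "poll", "bts_video"},
    "rental-partner-key": {"actor", "music"},
}

def syndicate(events, api_key):
    """Return only the event layers this partner is entitled to."""
    allowed = SYNDICATION_RULES.get(api_key, set())
    return [e for e in events if e["event_type"] in allowed]

program_events = [
    {"event_type": "actor", "start": 0.0,  "payload": {"name": "Lead Actor"}},
    {"event_type": "quiz",  "start": 30.0, "payload": {"question": "..."}},
]

# The sell-through partner receives both layers; the rental partner one.
print(len(syndicate(program_events, "est-partner-key")))
print(len(syndicate(program_events, "rental-partner-key")))
```

The same mechanism supports the windowing discussed later: swapping the set behind a key changes what a partner's consumers can see without re-delivering any audio or video assets.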
Foundational Digital Trends
Before returning to the discussion of windowing and differentiation, it is important to identify two broad overarching technical trends which are both transforming our industry and providing the foundation that time-based metadata strategies are built upon: first, the dominance of digital file-based workflows, and second, the increased importance of more traditional program-level metadata in digital distribution operations (as opposed to the time-based data).
Until very recently, program assets were delivered to distribution partners via a broad range of technical means. Broadcast and pay TV exploitation relied primarily on satellite uplink, theatrical exploitation relied on physical delivery of 35mm prints, home entertainment (DVD and Blu-ray) exploitation was via tape formats (DLT) to manufacturing facilities, and digital exploitation (iTunes, Xbox, PS3) was via file transfer. Within just the last few years, the economics, practicality and operational benefits of digital video workflows have elevated digital file transfer to the primary means of asset delivery across all distribution channels.
Once operating within such a digital file ecosystem, program-level metadata associated with those files becomes critical for inventory management, merchandising, fulfillment, pricing, royalty tracking and interoperability across various systems. These requirements have driven a great deal of innovation, and over the past several years, an enormous amount of ingenuity, intelligence, and dedication has gone into solving industry challenges around program-level metadata. While challenges and opportunities for efficiency remain, great progress is being made, particularly by industry organizations such as ISAN and EIDR.
Together, these two trends are an important indicator of the direction that the industry overall is heading, and form the basis of some logical conclusions: If digital file delivery persists or increases, content owners will increasingly need to provide their distribution partners with metadata around their assets, and different types of metadata will be required for different means and channels of exploitation. Metadata will increasingly become the means of delivering information to business partners throughout the entertainment production and distribution ecosystem.
Anytime, Anywhere, But Now What?
Thanks in large part to standardization of digital file formats and the hard work of many digital distribution operations teams, most large entertainment companies are now able to reliably deliver their audio and video assets across a wide range of distribution partners. There is at the same time, however, a definite and glaring absence of any unified or efficient way to enhance the consumer experience around that video or any standardized means to deliver value-added related content.
This means that while the industry has been successful with the first critical step of delivering program content to the consumer, there is a distinct lack of business tools or "levers" for distribution executives to efficiently create consumer demand for their digital assets. Unlike in DVD and Blu-ray, each distribution partner, such as iTunes or Xbox, has its own unique set of requirements for delivering value-added content (if at all), and promotional opportunities also require unique one-off asset production and expense.
This lack of a unified platform for creation and delivery of added-value content may be a significant contributor to decreased consumer interest in sell-through and ownership models.
Turning the Tables for Everyoneâs Benefit
Digital distribution executives not only lack a unified means of enhancement and promotion for program assets, they also operate in a highly fragmented landscape. Traditional cable, satellite and telco distribution partners have increasingly complex delivery requirements to fulfill their own evolving customer viewing habits. In addition to these MVPDs, a new wave of mobile and tablet applications, web video distribution, and OTT partners bring additional delivery requirements and new valuable ways to connect with the audience. No matter how well resourced a media or entertainment company might be, it is near impossible to keep up with every new digital distribution opportunity, and equally impossible to differentiate your program content from one distribution outlet to another.
The solution is to turn the tables, and for the content owner to offer each distribution partner a variable package of time-based related content metadata associated with each licensed program. This related content metadata becomes the key ingredient for each distribution partner to deliver a differentiated, value-added consumer experience to their end-user or consumer.
For example, electronic sell-through partners and ultimately the consumers that purchase movie and TV programs through them, can have access to extensive layers of value-added content, while rental partners and their consumers can be restricted to a more limited subset of metadata, and fewer layers of value-added content, if any.
In practice, this means that the consumer who purchases a film or TV program can enjoy a different, presumably higher value experience, than one who rents that same video asset. By extension, this also means that a subscription model could potentially emerge in which the content owner would provide the consumer with ongoing or evolving enhancements (active layers of engagement) with their favorite films or TV programs.
The Power of Metadata Syndication
This approach is extremely powerful for the content owner because it allows them to function more similarly to how they have traditionally operated. It becomes the content owner's responsibility to create the highest-value master asset possible, but now that asset is a combination of audio, video and time-based metadata. Individual distribution platform idiosyncrasies and presentation-layer requirements become the responsibility of the distribution partner, and the content owner can focus their attention and resources on delivering value to the consumer and marketing those benefits.
This approach also opens the door for content owners to focus on the ongoing interactive social and commerce services that may be connected to any scene or moment of their content and, with the right technology platform at their disposal, to benefit from these additional layers of monetization in cooperation with their downstream distribution partners.
"Turning the tables" through metadata syndication is also powerful because it challenges distributors to innovate and compete with each other to deliver the best consumer experience, rather than expecting content owners' limited marketing budgets to stretch across every distribution platform they have to reach. In many cases, particularly in television, this approach also solves a major timing problem. Only the content owner or network programmer has access to first-run television episodes before their first airing, so metadata syndication allows them to make related content such as quizzes, trivia and behind-the-scenes images available in a way that no distributor would be able to.
The Time-Based Metadata Ecosystem
Creation, production and distribution are all part of the time-based metadata ecosystem. From a creation perspective, an enormous amount of valuable time-based related content exists from the earliest stages of pre-production. As with popular DVD, Blu-ray and synchronized "Second Screen" experiences, examples of this type of material include early storyboards, location scouting photos, and production design sketches. These are all valuable related content that can be set to time in a film or TV episode, and good examples of how to extract value from existing production artifacts. Additional examples of existing information that can be quickly set to time are music cue-sheets, branded entertainment product lists, and on-set photography.
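As a concrete illustration of setting such material to time, here is a minimal, hypothetical JSON message together with a few lines of Python that look up the events active at a given playback moment; the field names and values are invented for the example and do not follow any published schema:

```python
import json

# A hypothetical time-based metadata message: a music cue and a
# branded-entertainment item, each anchored to a span of playback time.
raw = '''
{
  "asset_id": "example-episode-101",
  "events": [
    {"start": 754.0, "end": 762.5, "type": "music_cue",
     "data": {"title": "Opening Theme"}},
    {"start": 1312.0, "end": 1320.0, "type": "product_placement",
     "data": {"item": "sample-product"}}
  ]
}
'''

meta = json.loads(raw)

def events_at(t, events):
    """Return the metadata events active at playback time t (in seconds)."""
    return [e for e in events if e["start"] <= t < e["end"]]

active = events_at(758.0, meta["events"])
```

A second-screen application receiving such a feed could poll `events_at` against the current playhead position to surface the right cue-sheet entry or product at the right moment.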
Applications of time-based metadata also open up new creative opportunities for writers, producers, and multimedia storytellers. Instead of leaving related content creation to marketing and programming teams, writers are increasingly taking responsibility for various types of related content metadata as an integral part of the creative process. Fourth Wall Studios, an LA-based entertainment company, is creating new forms of storytelling in which, for example, on-screen characters call the viewer's cell phone at designated moments in the story timeline.
Another example of creative time-based metadata creation and production comes from USA Network, where Twitter "hashtags," originally intended to drive social media activity during first-run viewing, are being stored as persistent time-based metadata with particular episodes and scenes, so that they can be leveraged by applications and users in later syndication and VOD.
Time-based metadata also has important implications for ecommerce, enabling new transactional revenue opportunities for both content owners and distribution partners. In 2012, eBay introduced Watch With eBay, a stand-alone iPad application that surfaces current auctions and "Buy it Now" items related to a particular program. eBay has also demonstrated a version of the application that uses time-based metadata to surface items related to a particular scene.
Time-based Metadata, Windowing & Personalization
One of the greatest opportunities for content owners and distributors alike is to leverage time-based metadata to proactively drive consumer activity in new viewing windows, and with new viewing patterns. Through metadata syndication, the same digital file can offer the consumer a new experience with each view, and that experience can vary depending on whether the program is viewed in parallel with the first-run airing, within the Nielsen C3 window, or in a VOD session.
Technically speaking, windowing relative to time-based metadata means that, based upon the specific time-window in which a viewer engages with a piece of content, a corresponding package of related content layers can be made available. These time-windows can be relative to the first run or premiere of the content, or personalized to a specific viewer and corresponding to successive views.
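A minimal sketch of this windowing logic, with assumed window boundaries and layer names (the three-day C3 cutoff and the layer labels are illustrative, not drawn from any specification):

```python
from datetime import date

def layers_for_view(premiere, view_date, view_number):
    """Pick a related-content package based on viewing window and view count."""
    days = (view_date - premiere).days
    if days == 0:
        window = "first_run"
        layers = ["live_hashtags", "polls"]
    elif days <= 3:                      # assumed Nielsen C3 window
        window = "c3"
        layers = ["trivia", "cast_info"]
    else:
        window = "vod"
        layers = ["trivia", "cast_info", "behind_the_scenes"]
    if view_number > 1:                  # personalize successive views
        layers.append("deep_dive_commentary")
    return window, layers

# A first view two days after premiere falls in the C3 window.
window, layers = layers_for_view(date(2013, 9, 22), date(2013, 9, 24), 1)
```

The same file thus yields a different experience on a premiere-night view, a catch-up view, and a third rewatch months later, which is the substance of the windowing argument above.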
Time-based Metadata and the Future of TV
Over two decades of video product development, time-based metadata has emerged as one of the most important components of a successful digital video distribution strategy. This descriptive information about what is happening at any moment is critical in a multiscreen world, and will play an increasingly important role in differentiation across distribution partners. As smartphones, tablets, and smart TVs proliferate, there will be increased demand for rich and valuable time-based metadata delivered as part of the master asset. Increasingly, time-based metadata will unlock the context of film and television, and will power the new user experiences and new revenue streams that are only possible on emerging two-way digital platforms.
Just one decade into the twenty-first century, we are starting to see indicators of a vibrant metadata ecosystem growing within the folds of the traditional film and TV production and distribution industries. Writers and producers will increasingly create time-based metadata as an inherent part of their creative storytelling process, and production companies will increasingly package, license, and sell that critical enabling meta-layer to their programmer and distributor customers. Programmers and distributors will in turn increasingly deliver a time-based metadata layer to their cable, satellite, telco, web, mobile and OTT licensees, so that those consumer-facing services can unlock the context of every moment of film and TV for their audiences.
Zane Vella is the Founder and Chief Product Officer at Watchwith, a software platform to create and distribute time-based related content around films, TV and commercials. He has 20 years' experience at the intersection of TV, Internet, and software product strategy and has led the development of interactive products and platforms for media and entertainment companies including Apple, Disney, NBCU, Netflix, Viacom, and Warner Bros.
 Time-based metadata is typically provided as a JSON or XML formatted message so that a product developer or programmer can choose from available time-based information and use it as they see fit in a consumer experience.
 ISAN is the International Standard Audiovisual Number, a voluntary numbering system and metadata schema enabling the unique and persistent identification of audiovisual works and versions thereof, including films, shorts, documentaries, television programs, sports events, advertising, etc. http://www.isan.org. EIDR is a universal unique identifier for movie and television assets. http://eidr.org/
 Walt Disney Studios Distribution has been a leading innovator of synchronized consumer experiences on a tablet associated with a film. More info is available at http://disneysecondscreen.go.com/
 Founded in 2007, the Culver City-based company develops new properties delivered via Internet browsers, smartphones, game consoles, TVs, movie screens and in the physical world. http://fourthwallstudios.com
By Geoff Tulley
Sony and Panasonic recently announced an agreement to jointly develop standards for a next-generation optical disc that can hold more than 300 gigabytes of data (six times the capacity of current Blu-ray Discs) by the end of 2015. According to the two companies, the 300 GB discs are geared toward the archival storage market. Is this project the next generation of Blu-ray? Or is it the consumer electronics industry's answer to getting ahead of the 4K curve? See below for an analysis.
In the joint release issued by both companies, each included a reference to the other's cartridge-based storage solutions currently in the market (Panasonic's Data Archiver LB-DM9 series and Sony's Optical Disc Archive system). These systems employ multiple recordable optical discs encased in a protective cartridge (beyond this similarity, however, the systems and their media are completely different).
As an associate of mine commented: "(These cartridge-based systems) may have a fairly tough time in the enterprise market though, as it seems to be more of a packaging trick than anything really new; proprietary cartridges and the like can be a tough sell."
The companies make the point pretty clearly that this announcement is about a single-disc solution that ups the capacity of recordable optical discs. It will be interesting to see what mix of layers, lasers and the like will be required to make that magic. Since multiple layers at Blu-ray Disc wavelengths are already in current specifications, the implication is that this new format will be a departure from BD as we know it.
As TV Technology reported, "Both companies pointed to the expanding needs for archiving in video production as well as from cloud data centers as the reasons behind their work in advancing the format."
I did notice another Web site, however, that took the same announcement and (IMHO) leapt off the deep end:
"But while streaming content seems like a good idea, some consumers (especially videophiles) are clamouring for a physical solution to the problem," the article stated.
It would be interesting to see the data behind the "consumers are clamoring" bit. The Blu-ray Disc Association might want to examine it.
According to the article, "Though neither company has admitted as much, it's clear that the partnership is an effort to resolve the 4K media question once and for all. The two Japanese firms are teaming up to create what will essentially become the successor to the Blu-ray Disc. Their ambitious plan is to create a higher capacity optical disc that's ready for consumer use before the end of 2015."
I don’t think that it is at all clear that this is about a consumer format; certainly not about one that is aimed at 2015. 2015 is the stated target for the commercial, data archiving implementation. I think it is safe to assume that theÂ quest for Ultra HD consumer distribution will not be waiting for this new format toÂ emerge, so one has to wonder what feats of marketing may be required toÂ re-introduce the world to new forms/formats of physical mediaÂ two years from now (or after).
One also has to wonder whether this writer appreciates the significant differences between recordable optical discs and the replicated media (such as DVD and Blu-ray) used for movie distribution, not to mention the investment required to create the replication infrastructure needed to mass produce "affordable" home movie discs.
The Blu-ray Disc Association did announce at CES 2013 that it has a task force studying the issues around incorporating Ultra HD content into the specification. I expect that effort will generate lots of discussion and ultimately product development; I just don't see this 300 GB announcement as a harbinger of a consumer solution.
In any case, this discussion does provide lots of interesting food for thought.
By Tony Knight, Senior Product Manager, Rovi Corporation
The other day, I began to realize how much of the physical media that my generation took for granted will be completely absent from the lives of our children. My four-year-old daughter Izzy, who was born the year the iPhone was first introduced, already has far different expectations of how content is created, transmitted and consumed. For her, you never have to put anything in a machine to get something you want to come out on a screen. For decades, the act of taking a picture, listening to music, or watching a film required the movement of something physical into the apparatus of something mechanical. In the space of just a few short years, the relentless march of technology has separated content from the spinning gears it was previously bound to.
What's more, technology has rapidly increased the rate of change in the home entertainment business, and that change can now be measured in months rather than years. Consider how long it took older formats, such as VHS and cassette tapes, to be succeeded by new standards like DVD and CD. Compare that against the plethora of new content delivery methods available today on such a variety of new devices, and you will begin to realize the challenges in store for the home entertainment industry. Six years ago, the most common way for consumers to get access to premium content was in the form of a DVD disc. It was a universal standard, and consumers gravitated toward it. This greatly simplified the home entertainment business model for content holders and the businesses that supported them. Move ahead a few years, and it's not hard to recognize that consumers have many more home entertainment choices, ranging from subscription VOD and kiosk rentals to a variety of over-the-top delivery channels.
While the physical disc still accounts for the single biggest piece of home entertainment revenue, it is besieged by a number of other options vying for consumers' attention. A few years ago, an entertainment-hungry consumer might have purchased a DVD for $15 to $20 because it represented the best value for money among a smaller set of consumption options. Today, that same buyer has many more choices, including free or low-cost access. This competition for consumer attention has forced those of us who make our living in entertainment technology to rethink consumer value, or risk losing the premiums that were once the mainstay of the physical media home entertainment business. In fact, the future of the home entertainment business may hinge on the very question of whether or not consumers want to "own" movies anymore.
The commercially successful concept of "owning" a retail movie has always taken some physical form. VHS tapes had a measure of success in the retail market, but it wasn't until DVDs were introduced that people bought and collected movies in droves. Today, DVD and BD sales still constitute the lion's share of home entertainment revenue, but that revenue is declining 5-15% worldwide, year over year. Electronic sell-through, the digital equivalent of owning a movie on physical media, has been available for many years, but it has yet to garner anything close to the same level of commercial success as DVD discs. The key question for many in our industry is a stark one: Are consumers willing to pay to own movies, or are they content to rent on occasion?
Several years ago, I spoke at an industry event and was asked when electronic sell-through was going to be successful. My answer was short and sweet: when consumers view EST as being as valuable as DVDs. In the intervening years, the mass market has yet to adopt EST, and physical disc sales have continued to decline. Consumer behavior is changing, and not in ways that promote the traditional home entertainment business model. To put it another way, five years ago the home entertainment revenue pie was cut up in ways that benefited certain actors. Today, that pie is in the process of being recut. Those who were used to getting a healthy slice in the past may be alarmed to be getting either a smaller piece, or none at all. Others that didn't have a slice in the past are now sitting at the table. The question of consumer ownership of content is central to how big the pie is, and how it is to be sliced.
Unless the industry acts (and acts decisively), in a few short years margins in the home entertainment business could shrink sharply as consumers shift from movie ownership to a much less lucrative over-the-top rental business. In fact, I think the entire industry is in need of something akin to the Marshall Plan. To that end, here is my three-point plan to save movie ownership, which can be treated as additions to UltraViolet:
1) Clearer Differentiation from the Rental Experience
Today, when you consider buying a movie from an over-the-top service, you are confronted with two buttons: Buy or Rent. Clicking the "buy" button leaves many customers feeling shortchanged. There are generally no menus, extras, special features or other perks that make them feel the ownership experience has been conferred on them. Charging four to five times more for a purchase that delivers the same user experience as a rental starts to feel like prepaying for your next four rentals. The industry needs to find a way to drive more value into the electronic sell-through format, and this means adding features that customers are used to getting from physical discs today. Remember, most consumers watch a movie they like only once. They watch movies they love many times, and they want the extras that connect them to the film's back-story.
2) Get Aggressive with Disc-to-Digital
My shelves at home hold about 400 movies. The key to getting consumers like me to own new movies digitally is to help me move my library toward the new paradigm. How successful do you think Apple would have been with the iPod and iTunes if they hadn't expressly enabled you to bring your existing library of CDs into the same interface as the music you purchased from them electronically? Not very, I think. There is an effort by a certain large retailer to move existing DVD and BD discs to UltraViolet (UV). This is a great start, but the initial reviews have been mixed. In my own experience, only two-thirds of the sample I brought in was available for conversion, and none of the extras that were available on those discs are part of my new UV rights. Would I now spend $800 or so to move just the movies (without extras) over to a new standard if that new standard makes me feel like I am prepaying for over-the-top rental? Not likely, I'm afraid.
I think disc-to-digital is a great idea, and some consumers will undoubtedly adopt a scheme where they move their libraries over on a per-disc basis for a fee. That said, this approach presents a barrier that I believe will prevent it from becoming mainstream.
Here is a different approach: charge little or nothing to convert my existing library to UltraViolet once a retailer has confirmed that my library consists of legitimate retail discs, and mark each disc out of circulation once the digital right has been conferred. If some of the movies are not available, record my right anyway, and bring it to my locker once it is available. Now I can feel the totality of the UV experience with content I've spent the past 15 years collecting. It didn't involve a big bet on my part, and if I like it, the chances are very good that I will make my next purchases as UV ones.
3) Go Crazy with Metadata
My shelves at home used to be great for impressing guests and those with lesser collections. That moment has passed. Here is what my shelf can't do well: recommend a good movie for me, or tell me where there are gaps in my collection. My shelf can't sort my movies in ways that help me consume more content. In fact, after Izzy figured out how to reach that shelf, it isn't even particularly well organized. Once you've helped move my entire library over to a digital locker of some kind, don't make it the digital equivalent of my shelf. Use rich metadata and some excellent user interfaces to help me visualize my library in new and interesting ways. How many Stanley Kubrick movies do I own? Am I missing some Fellini movies? If you tell me, I'm probably a willing buyer. The best recommendation engine is one that takes my own library as input. Put my existing movies into a snazzy interface, empower it with some intelligent metadata smarts, and I am much more likely to consume. I promise.
There are a lot of people thinking about and working on solutions to promote the continuation of the ownership model in home entertainment. That said, I'm sticking firmly with my beliefs from many years ago. Customers pay for what they value, and digital distribution of content must delight consumers if they are going to own it at the same rate they did with DVD. Izzy is almost 5 now. Is her first movie-related transaction going to be a rental, a buy, or a subscription? Much of that will depend on what the industry does over the next two years.