Tuesday, December 28, 2010

2010 Best Workflows

It took a bit longer in the making than anyone would like, but it's full of good information to set a baseline for live encoding and file-based transcoding workflows. We're happy to present the 2010 Best Workflows report, another in the Transitions in Technology white paper series.

Within the next few days, we also hope to be able to reveal details from a few participants. We'll keep you posted.

[Update: Elemental Technologies chose to publish their findings, which can be found at this link as the 2011 Q1 Best Workflows report; it contains the same information as the 2010 Best Workflows report, plus additional information specific to the Elemental outcomes. Enjoy!]

Saturday, December 4, 2010

Has Fujitsu Abandoned the Macintosh Platform? ScanSnap Snow Leopard Issues

Fujitsu had a good thing going, with the ScanSnap for Mac offering one of the best scan-to-PDF options.

But like all good things, this one looks like it's coming to an end, as ScanSnap on the Snow Leopard platform is walking around gingerly on only two paws.

Consider this:

1. The S1500M is the only current Mac-based ScanSnap scanner; all the others (S300M, S500M, S510M) are on Fujitsu's discontinued list.

2. CardIris 3.6, which ships with the S1500M, is not compatible with Snow Leopard (OS X 10.6). CardIris 4.0 is compatible, but you don't receive that version when buying an S1500M.

3. Adobe Acrobat is now at version 10 with Acrobat X. Which version ships with the S1500M? Acrobat 8. Yes, that's right, the only Fujitsu Mac ScanSnap has a version of Acrobat Pro that's TWO versions old.

When I asked Fujitsu about upgrading to Acrobat 9, six months after it launched, I was told that what you buy is what you get: anyone buying an S1500M today still gets Acrobat 8.

Fujitsu is clearly choosing to abandon the Mac platform; the company hasn't released a Mac-compatible ScanSnap unit since 2008.

How's that for progress?

Tuesday, November 30, 2010

PDF viewing in Apple's Preview looking fuzzy? Welcome to Acrobat X on Snow Leopard

I'm working through a few workflow scenarios with Acrobat X's scanning and optical character recognition (OCR) for an upcoming comparison study. Along the way, though, I've gained two quick insights into Acrobat X.

First, it does a great job of shrinking the file size automatically after text recognition runs. In many cases, I don't even have to run the "reduce file size" feature to bring multi-page documents down to a manageable size.

Second, the issue that cropped up under Apple's Preview with Acrobat 9, after I'd applied "reduce file size" to shrink a document, remains. The image looks blocky and jagged, as if a great deal of information has been thrown away.

I initially chalked the issue up to an error in the file reduction algorithms, but the advent of Acrobat X—with its better-than-average standard compression—brings the issue back to the forefront. Even without running the "reduce file size" feature in Acrobat X, any file that's been OCRed will appear jagged and blocky within Apple's Preview application.

Researching the topic online didn't turn up any clues, probably because Acrobat X is so new. Since I couldn't ignore the issue, as it affects all the scans I was applying OCR to, I turned to my contact at Adobe's PR agency to get insight into it.

It turns out that Acrobat X's new compression engine borrows a few pages from Acrobat 9's file-size-reduction playbook, and in doing so presents a potential rendering issue in Apple's Preview. According to my contact:

Acrobat X's new scan compression technology divides the image into 3 layers - Background (BG), Foreground (FG) & Mask. BG, FG images are highly down sampled while mask is kept at a higher resolution (to maintain text readability). 


The layering makes sense, as Acrobat has always offered the choice of Image-Text (where the image overlays the underlying OCR text) or Text-Image (where the text attempts to lay out in a pattern closely resembling the image, but the text is the top layer).

I've always opted for Image-Text, as it allows the human looking at the document to read what's actually on the page, should they find the OCR text copied from the PDF a bit, well, lacking.

My contact went on to provide some reasoning behind the mismatch between Preview and Acrobat Reader X, the latter of which displays the images at much higher quality:

Here is our hypothesis on the reason of low quality rendering by Preview. In order to render a page, it down samples mask to the resolution of FG (or maybe BG), so it loses on text crispness or quality that a high resolution mask provides. Adobe Reader/Acrobat, on the other hand, up samples images to the highest of the resolution of BG, FG and mask to get the rendered bitmap.


Is this the reality? I guess we'll have to see whether Apple issues an update to Preview in the near term. If not, I'm stuck either suggesting that every client upgrade to Acrobat Reader X, or choosing to completely forgo the workflow of using QuickLook to view text-heavy OCRed documents.
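
To picture the hypothesis, here's a minimal sketch of MRC-style layered rendering using Pillow. The layer sizes, the composite() helper, and the resampling choices are all illustrative assumptions on my part, not Adobe's actual pipeline.

    # A toy model of the three-layer scan described above: low-res BG/FG
    # plus a high-res text mask, composited at two different resolutions.
    from PIL import Image

    def composite(bg, fg, mask, target_size):
        """Scale all three layers to target_size, then paste FG over BG through the mask."""
        bg = bg.resize(target_size, Image.BICUBIC)
        fg = fg.resize(target_size, Image.BICUBIC)
        mask = mask.resize(target_size, Image.BICUBIC).convert("L")
        out = bg.copy()
        out.paste(fg, (0, 0), mask)
        return out

    # Hypothetical layer resolutions: BG/FG heavily downsampled, mask kept sharp.
    bg = Image.new("RGB", (100, 130), "white")  # stand-in for a low-res background layer
    fg = Image.new("RGB", (100, 130), "black")  # stand-in for a low-res foreground layer
    mask = Image.new("L", (600, 780), 0)        # stand-in for a high-res text mask

    # "Preview-style": render at the low BG/FG resolution, discarding mask detail.
    low = composite(bg, fg, mask, bg.size)

    # "Acrobat-style": render at the mask's resolution, upsampling BG/FG instead.
    high = composite(bg, fg, mask, mask.size)

    print(low.size, high.size)  # (100, 130) vs (600, 780): same page, very different text edges

The takeaway: compositing at the mask's resolution preserves crisp text edges, while compositing at the background's resolution throws them away, which squares with what I'm seeing in Preview.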

Monday, September 27, 2010

Transitions In Technology Series: Juniper Media Flow

Over the past few months, we've been hard at work on a workflow article covering Juniper's Media Flow technology, which the company has been integrating since its acquisition of Ankeena.

The paper is called "Refining Media Delivery: Intelligent Caching Comes of Age" and covers a variety of aspects of intelligent media caching. It is the first paper in the Transitions in Technology white paper series, a series of commissioned documents that leverage the Transitions team's experience in the marketplace to tell a compelling story.

In this case, it's a story that we've followed for several years, since Ankeena announced Media Flow Controller. We think the acquisition bodes well for Juniper's move into the scalable media marketplace.

Here's the link to the white paper. Enjoy!

Tuesday, September 21, 2010

Does Adobe Elements 9 Play Well With All?

Adobe today announced its Elements 9 consumer image- and video-manipulation products (editing sounds so 1990s) in the form of Photoshop Elements 9 and Premiere Elements 9.

Over the past few releases, Elements has begun to move toward parity on the Macintosh and Windows platforms, but the Mac version has always lagged behind.

It appears the days of second-class citizenry for Mac users may be over, though, as Adobe lists an equal feature set between the Mac and Windows versions of both Photoshop Elements and Premiere Elements.

Premiere Elements, in particular, now has sharing and access to the Plus service, which offers up to 20 GB of storage for approximately $50.00 per year.

In addition, Premiere Elements 9 adds support for tapeless workflows—from Flip cameras to consumer and pro D-SLRs—a feature that was only recently added to Adobe's flagship editing tool, Adobe Premiere Pro, as part of the much more expensive Creative Suite 5 software bundle.

Speaking of pricing, that has also dropped for Elements 9 bundles, with Amazon reporting the combo pack—Photoshop Elements and Premiere Elements—pictured below clocking in at $149.00, with a rebate that drops the price to $119.00 (after waiting several weeks for the mail-in rebate check, of course).


We look forward to testing the two products on both platforms to compare the feature sets before the product release on November 1, 2010. [Update: Adobe's PR reps say the product is available now, even though Amazon still lists availability as November 1.]

Saturday, September 18, 2010

SNAFU - Toast 10 Titanium Pro's Addition Problem

For Macintosh-based content creators, there aren't many options for Blu-Ray disc creation: Toast 10 Titanium Pro is the least expensive alternative offering Blu-Ray conversion and burning. It clocks in around $110 on Amazon, compared to several hundred dollars for Adobe Encore (and a downloadable version of Toast 10 Titanium Pro offered by Amazon is even less expensive).

Yet, for all the Toast goodness we've seen from Roxio over the years, this Blu-Ray version of Toast 10 Titanium is a dud: it can't calculate storage space in any meaningful way.

I found this out the hard way, having chosen the "Blu-Ray Video" setting and then spending 8+ hours digitizing standard-definition (SD) miniDV content into Toast 10 Titanium Pro.


Once my content was all digitized, I let the system choose its own automatic encoding, which yielded a pleasantly spacious 777.1 MB of free space on the theoretical Blu-Ray disc:



Except that's Toast Math, which means it bears no resemblance to reality. While the Toast calculation tells the user that only 22.55 GB will be placed on the disc, the reality is that this number already exceeds the amount of space on a single-layer Blu-Ray disc!

How's that? Don't Blu-Ray discs have 25 GB of unformatted space (or 23.31 GB of formatted space)? Yes, gentle reader, you are correct. But in the world of Toast Math, 22.55 GB is greater than 23.31 GB.
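
For readers double-checking that parenthetical, the 23.31 figure is simply the same disc measured in binary units; here's the quick arithmetic, using the published single-layer capacity:

    # Where 23.31 comes from: a single-layer Blu-Ray holds 25,025,314,816 bytes,
    # which marketing rounds to "25 GB" (decimal) but most software reports in binary GiB.
    BD_R_SL_BYTES = 25_025_314_816

    decimal_gb = BD_R_SL_BYTES / 1e9    # ~25.03 "marketing" gigabytes
    binary_gib = BD_R_SL_BYTES / 2**30  # ~23.31 GiB, what the OS typically shows

    print(f"{decimal_gb:.2f} GB decimal, {binary_gib:.2f} GiB binary")
    # -> 25.03 GB decimal, 23.31 GiB binary

Whichever unit Toast uses internally, 22.55 of it should still fit comfortably inside 23.31 of the same unit.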

To fully understand the equation, you must allow your computer to crunch data—lots of data—for more than 36 hours. At the end of this time, your computer will then generate the following error:


Cross-check the Toast Math: Toast now calculates that same 22.55 GB out to be 23.76 GB. The latter is the amount Toast says it needs to burn the very disc that started with 22.55 GB on it (and 777.1 MB of free space!).

I thought this was an anomaly, so, silly me, I re-ran the same scenario for another 36 hours, this time using a different Blu-Ray (BD-RE) writer.


Guess what? Same exact issue.

Then, foolishly, I tried making a stand-alone Toast image (ending in .toast) and burning it via the "Image File" option under Copy in Toast 10 Titanium Pro. The .toast file ended up at almost 25 GB, nearly 2.5 GB higher than the original Toast calculation of 22.55 GB.

Yet, when I chose "Image File" under Copy, guess what happened? You are right, gentle reader: I received an error telling me that 23.76 GB of disc space was required on the Blu-Ray disc, the exact same amount that Toast 10 Titanium Pro had calculated in error during its initial bad-math day (we're now over 100 hours into testing Toast Math).

What about just removing one of the miniDV files from the Toast layout? That seemed like the natural next step, but due to Toast's automatic encoding, it's not so simple: rather than simply subtracting the roughly 3.1 GB per miniDV tape, Toast 10 Titanium Pro does another round of Toast Math and comes up with less free space on the Blu-Ray disc than before.

Perhaps I don't understand new math, or Toast Math, but apparently the recalculation is based on Toast 10 Titanium Pro changing the compression rates to "fit" the content onto the Blu-Ray disc.

In theory this makes sense, since fewer minutes of video compressed onto the Blu-Ray disc should yield a higher bitrate per minute. But not in Toast Math.
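
To illustrate the theory with my own back-of-envelope numbers (not Roxio's): under automatic encoding, the disc budget gets spread across whatever content remains, so removing a tape raises everyone else's bitrate rather than simply freeing up that tape's old footprint.

    # Rough per-disc budget math: if the encoder spreads the full disc across
    # the remaining content, fewer minutes means a higher average bitrate.
    DISC_BYTES = 23.31 * 2**30  # usable single-layer capacity in bytes

    def avg_bitrate_mbps(minutes_of_video: float) -> float:
        """Average video bitrate if the whole disc is allocated to the content."""
        return DISC_BYTES * 8 / (minutes_of_video * 60) / 1e6

    print(f"{avg_bitrate_mbps(420):.1f} Mbps for 7 hours of video")  # ~7.9 Mbps
    print(f"{avg_bitrate_mbps(360):.1f} Mbps for 6 hours of video")  # ~9.3 Mbps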

In the world of Toast Math, 1+1 doesn't equal 2, 3 or any standard calculation.  It equals Toast Math.

Please, Roxio, teach your program to add before sending it out into the world!

Monday, June 7, 2010

Systems Integration for the rest of us

Preparing to attend a show for systems integrators in Las Vegas this week, I spent part of today covering Steve Jobs' WWDC keynote address for StreamingMedia.com.

Read the details on iPhone 4 and my take on Apple's attempt to outgoogle Google with iAd and HTML5 lip service on the StreamingMedia.com site, but pause first to reflect on these excerpted comments that Jobs made. . . .


"Apple is not just a technology company," said Jobs. "It's the marriage of technology and the humanities that distinguishes Apple."

Echoing a sentiment I've heard from Sling CTO Bhupen Shah, who reminded me years ago that ease of use in compelling products is all about the hardware and software working seamlessly together, Jobs drove the point home using the example of the new iPhone's second camera.

"On iPhone 4, it's not just a front facing camera: it's a front facing camera and 18 months' worth of work to come up with software that you'll never even notice when you want to place a video call," Jobs said.

"It's a complete solution so all of us don't have to be system integrators."



iPhoned

Friday, May 21, 2010

Why Partner? Edgeware and MediaMelon

As I've been working through partnerships between various professional transcoding and live encoding manufacturers (pairings that let two companies provide a joint solution for both areas of our comparative testing), I've been struck by the continued uptick in joint solutions.

One such joint solution, which I wrote about today for StreamingMedia.com, is a combination of edge caching devices and deterministic middleware, with the two partners being Edgeware AB and MediaMelon, Inc., respectively.

You can read the technical descriptions of the joint solution at StreamingMedia.com, but here's a brief overview of the two companies:

Edgeware was formed about four years ago by networking and video professionals who wanted to solve the challenge of addressing next-wave internet traffic. They felt much of the traffic going forward would be video, and they chose to design an appliance with caching ability. Since this first-generation 1U server used solid-state memory and had an FPGA for programming, IPTV was a natural target market. Edgeware's appliance runs a Linux derivative with a home-grown UDP stack and file systems, and Nokia, Alcatel, and Siemens all sell the appliance to large telecoms that want to offer walled-garden IPTV services.

MediaMelon is backed by the founder and chairman of Macrovision, with its main office in San Francisco and primary R&D in India. The company's deterministic algorithms, backed by significant viewer-experience data, allow MediaMelon to offer its customers - many of which overlap the Edgeware customer base - a set of differentiated services, including quality of service for both UDP and TCP. Whether through MediaMelonDirect, for media customers, or its MediaCloud SaaS/hosted solution for telecoms, ISPs, and MSOs, MediaMelon leverages real-time viewing-quality information and then serves up chunks of streams from a variety of locations across a CDN.


So when Edgeware wanted to move beyond walled-garden IPTV appliances and into the realm of TCP delivery of "web TV" on a second-generation appliance, using RTSP and chunks of adaptive bitrate content, the two companies had complementary technologies.

Yet why did the Edgeware-MediaMelon joint solution make sense for customers and integrators, especially since each company had a slightly different approach to the joint solution?


"From Edgeware's perspective," said Jon Haley, Edgeware's VP of  business development for WebTV and Over the top (OTT)," our VARs and system integrators are seeing huge demand for building out CDNs in local service provider markets and then adding in OTT services. The MediaMelon federated model really is unique, based on the quality metrics from a user's perspective (QoE), and matches nicely with our quality philosophy of pushing content further out into the network via our boxes."

In other words, for markets like Europe, which are highly fragmented due to language and/or incumbent state-run service providers, the joint selling of a federated model couldn't be accomplished by Edgeware alone.

"It really is a natural partnership," said Kumar Subramanian, founder and CEO of Media Melon, "making it easier to jointly provide value to service providers who can quickly deploy a full solution of Edgeware boxes along with our SaaS hosted middleware. The value proposition of offering a large footprint due to the federated model is a significant solution to a significant problem."

Tuesday, May 4, 2010

Test Criteria - Professional Transcoding and Live Encoding comparison testing

As mentioned yesterday, two additional companies expressed interest in the comparative testing for professional transcoding and live encoding workflow systems: Sorenson and a new player in the space, Octasic (or 8 ASIC).

While neither company has enough of a total solution to compete in this year's testing, both may have solutions in the near- to mid-term that would fill the gaps and allow them to compete during our second annual test, to be held in 2011. That still leaves 13 companies, including Ateme, Envivio, Harmonic, Inlet, Media Excel, and Telestream, that are weighing their solutions against the test criteria.

What are the testing criteria? Here's a generic overview of the way Transitions will test the various hardware-software solutions:

1. Transcoding: these tests come first, as a way to establish a baseline of quality, with speed secondary but almost equally important.

Here we'll look at things such as failover, including job completion and workflow steps, and the ability to transcode a single file to multiple adaptive bitrates.

On the speed side, we'll look at the speed of transcoding files among M2T (MPEG-2 TS), H.264, and a third format, back and forth to each of the other formats; whether the solution is capable of simultaneous transcodes; and the speed of transcoding multiple files to multiple adaptive bitrates.

Resolutions to transcode to will be 1080p, 1080i, 720p, and several web and mobile rates.

Finally, we'll see how many concurrent file-based transcodes can occur, based on three workflows: Real-Time, Faster Than Real Time, and Real-Time xN (see the sketch after this list).

2. Live Encoding: after transcoding, we'll turn the testing criteria on their head and look at what each solution can do within a set period of time - specifically, within the confines of live encoding testing.
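
As promised above, here's a toy illustration of the three file-transcode workflow classes; the classify() helper and its thresholds are my own shorthand, not the formal Transitions methodology.

    # Classify a transcode run relative to real time. source_minutes is the
    # aggregate duration of all source files handled during the run.
    def classify(source_minutes: float, wall_clock_minutes: float, jobs: int = 1) -> str:
        speed = source_minutes / wall_clock_minutes  # >1.0 means faster than real time
        if jobs > 1 and speed >= jobs:
            return f"Real-Time x{jobs}"              # N simultaneous jobs, each keeping pace
        if speed > 1.0:
            return "Faster Than Real Time"
        if speed >= 1.0:
            return "Real-Time"
        return "Slower than real time"

    print(classify(60, 45))      # one 60-minute file done in 45 minutes -> Faster Than Real Time
    print(classify(60, 60))      # -> Real-Time
    print(classify(240, 60, 4))  # four 60-minute files in 60 minutes -> Real-Time x4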


Both tests will also assess performance, throughput, and ease of use in setting up a variety of workflows, taking a holistic approach to pro transcoding and live encoding workflows. For clustering, for instance, we'll look at initial setup (3 total units plus a management node, and 5 total units plus a management node) and replacement of individual nodes or the management node.

Part of the testing will take place in our lab environment, to allow for a common set of test criteria in a typical workflow. In another part of the testing, we'll jointly assess management, clustering, and failover response in your lab or quality control location – recommended, unless you want to ship a hefty rack of gear.

To keep the tests objective while still covering the testing time, a sponsorship model is being used. We've found with large groups of test candidates that a sponsorship with equal amounts paid by each sponsor - so that no one company dominates the report - works best.

That's the model we'll use for these tests; more details on the approach, as well as examples of previous multi-party tests and test methodologies, have been made available on request to all invited companies.

Monday, May 3, 2010

It continues

Two more companies have expressed interest in being included in the transcoding and live encoding comparison testing: Sorenson and a brand-new company in the space. More on that tomorrow.

Friday, April 30, 2010

Invitations sent out for professional transcoding and live encoding comparison

As mentioned in yesterday's blog post, a series of email invitations was slated to go out today, inviting companies to a comparative test bed for professional transcoding and live encoding solutions.

The first wave of invitations went out at 2pm Eastern. Response has been solid and immediate, in keeping with the conversations at NAB: two companies have already signed on, there's a verbal commitment from a third, and there's strong interest from the two others that have responded.

Invited companies, in alphabetical order, include Ateme, Elemental, Envivio, Inlet Technologies, Media Excel, Optibase, Ripcode, Telestream, and Viewcast. Many of these companies are well-known to workflowed.com readers, but a few have new product offerings that should make testing rather competitive.

The deadline for commitment to testing is May 6, so we'll let you know which companies choose to brave the comparison workflows.

Thursday, April 29, 2010

It's Begun - Comparative Test Bed for Live Encoding and Transcoding

Earlier this week, I wrote an article for StreamingMedia.com entitled "Back to Basics: Hardware Acceleration," in which I discussed the four often-misunderstood areas where hardware acceleration is used for streaming media content: ingest, content creation, output (compression), and playback.

In the article, I mentioned:

Since ingest is the realm of specialized appliances such as Envivio's C4, Inlet's Spinnaker, or Elemental's Live appliance, I'm not going to spend a lot of time on it other than to say I'll be digging into this topic in more depth with the Transitions consulting team - and a few key colleagues - over the next few weeks for an in-depth comparison of select hardware-accelerated live ingest and transcoding boxes.

That time has come.

More than a year ago, I realized that the streaming market, especially the market for professional appliances that do pro-level transcoding and live encoding, was ripe for a comparative study. Through Transitions, the consulting firm I co-founded almost a decade ago, I'd been involved in similar test bed comparisons for videoconferencing. While that project, like this one, was a private venture, it wound up getting a significant amount of press coverage, including in NetworkWorld.

While my personal code of writing ethics won't let me craft an article for Streaming Media or any other paying publication about a report Transitions generates, I still thought a comparative study of transcoding and live encoding solutions had merit. So I approached several companies last August about a methodical test bed approach, measuring a combination of speed, quality, and ease of use to compare major transcoding and live encoding products against one another.

All were open to the idea, but the timing wasn't right for a variety of reasons. I talked to a few of these same companies - plus a few new ones - at the National Association of Broadcasters (NAB) show in Las Vegas last week, and all expressed increased interest.

On Monday, I put together a presentation capturing the key thoughts; over the past two days, I've shown it to a few companies, asking for feedback and for suggestions on others who should be invited to join one or both of the tests - the first being transcoding, to establish a quality baseline, and the second being live encoding, to look at speed's effect on overall quality and timeliness of delivery.

What I didn't expect, when I tossed out the ideas this week on CPU, DSP, GPU and other processor types to test, was that I'd get the first two signed contracts before even sending the official invitation out to participant companies.

The invitation is set to be sent out on April 30, and on May 7, the day after invitation commitments are due back to Transitions, we'll be doing the first on-site analysis of a transcoding solution in sunny Portland.

Apparently there's more interest in this than I anticipated. . . .

I'll post the generic test criteria PDF tomorrow, along with the invite list for each test. [Update: the initial invite list has been posted, with scrubbed test criteria to follow, after invited companies have a chance to review the versions sent to each.]

Friday, April 9, 2010

iPad - Dead End for Flash?

[Update: It's getting ugly, as an Adobe rep responds.]

Between the launch of the iPad, on April 3, and the introduction of Apple's iPhone software version 4, on April 8, a significant amount of buzz was generated about CS5 Flash Professional's role in the iPad ecosystem.

The ability of Flash Pro to generate iPhone "packages," which allow some Flash content to play on the iPhone, was first highlighted in October at Adobe MAX 2009 in Los Angeles.

I wrote about this "workaround" at the time, but interest in this topic was fairly low - until the iPad was announced without Flash support.

Between the sans-Flash iPad and the ongoing HTML5 video tag discussion, Apple continued to blow on the embers of its HTML5-CSS-JavaScript preference for iPad and iPhone delivery as the iPad launch date approached.

Based on the hype surrounding this "working around the web" from a Flash-iPad integration standpoint, I wrote another article, positing how Adobe could make Flash Pro relevant to the larger HTML5 development audience.

This week, the embers being fanned by Apple burst into a full-blown firefight. Apple's iPhone software version 4 has a modified licensing agreement that states, in part:

"Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs."

In other words, Apple is saying it's not just how an iPad or iPhone package is compiled but also how it's written. Apple adds an example in its licensing agreement, directly following the portion noted above:

"(e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited)."

John Gruber at Daring Fireball picked up on this with a post titled "iPhone Agreement Bans Flash Compiler" that lists a few integrated development environments (IDEs) that may cross the line Apple has drawn in the sand, but he saves the bulk of his post for the implications for Adobe:

"Wonder what Adobe does now? CS5 is this close to release and the iPhone compiler is the flagship feature in this version of Flash."

It's going to be an interesting week at NAB, with the roll-out of CS5, discussion of the iPad's personal media consumption coup, and the clash of the tech titans that is now beyond its flashpoint.

Update: AppleInsider notes that the issue here may be pre-emptive, since it only relates to iPhone software version 4.0, which will ship later this year, and that it may also be intended to address multitasking (although combining "pre-emptive" and "multitasking" in this case would be a misnomer).