Timeline: Digital Edition – by John Buck
A new history of editing, editors and the machines they used.
Timeline: Digital Edition contains never-before-seen photographs, video material, original brochures and animated patents, as well as audio clips and exclusive interviews that document the lives of editors and the craft of editing from the very beginning.
From scissors and cement in the hands of Griffith and Méliès, through editing Ben-Hur on the Moviola, to Scorsese and Schoonmaker creating Woodstock on a KEM flatbed, and on to digital nonlinear editing with the CMX 600.
This digital version of Buck’s highly regarded original Timeline is designed specifically for the iPad and gives the reader an exhaustive and compelling read. The layered content gives an enriched view of the evolution of editing, and includes an instant glossary to provide additional material even when reading offline.
Timeline is a must for anyone studying film, working as an editor or with an interest in a behind-the-scenes look at the industry’s native craft.
This is the best history of film/video editing development I’ve ever seen in one place.
Ralph Guggenheim, ex Lucasfilm Editdroid
It’s like taking a time machine back into the pioneering days of non-linear.
Stuart Bass, editor of Scrubs, Pushing Daisies, Arrested Development
Now available on iTunes
Published by Enriched Books and Tablo
Apple Computer was due to hold its annual developers’ conference in San Jose, and trumpet what it believed would drive sales and engage third-party developers. The hot topics would be object-oriented programming, CD-ROM authoring, HyperCard and a new Finder. However, Apple CEO, and now head of R&D, John Sculley had decided to announce something extra.
Tyler Peppel was managing Apple’s new product development, including concepts such as a sports wristwatch, a desktop phone with a touchscreen and a joint-venture portable electronic book with Toshiba. He had convinced Sculley to make a move into multimedia, and then managed to secure backing and some resources for Project Warhol.
In the weeks leading up to WWDC, Apple’s VP of Networking and Multimedia Donald (Don) Casey directed the marketing department to create a profile for Project Warhol. What would become one of Apple’s great success stories needed an official, and distinctive name. Duncan Kennedy, an Apple product manager and early company evangelist, recalls:
Tyler Peppel had the attorneys at Apple checking out different names with the Quick prefix, and the one that we liked was QuickTime because this really was about time-managed events but there was a problem with that.
The U.S. company Tektronix, one of the world’s largest makers of test and measurement instruments such as multimeters, analyzers and signal generators, had already registered the name QuickTime. Tyler Peppel recalls:
Tektronix owned the name QuickTime, and we began to negotiate with them to see if they would relinquish it.
In the meantime the lawyers came back to us with an alternative, QuickStream. Tyler just rolled his eyes and said “That sounds like (urine)”, but it was called QuickStream for a few days until Don Casey finalised the agreement to buy the name QuickTime.
A few weeks later WWDC began, and attendees were told about Apple’s plans for the coming year. All went as predicted by the trade press until Don Casey took the stage, and introduced a new product called QuickTime. He told a surprised audience that Apple had created a new multimedia document architecture:
…a system wide time coding to allow synchronization of sound, animation and other time-critical processes.
QuickTime would provide developers with a common interface for controlling media devices and ways to produce media-data streams, and Casey hoped the new architecture would be delivered to developers by the end of the year.
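The “system-wide time coding” at the heart of QuickTime rests on a simple representation: a media position is an integer time value counted in units of a time scale (units per second), so tracks with different rates can be synchronized exactly. A minimal Python sketch of that idea, with illustrative function names that are not QuickTime API calls:

```python
from fractions import Fraction

def to_seconds(time_value: int, time_scale: int) -> Fraction:
    """Convert a (value, scale) pair to seconds as an exact fraction."""
    return Fraction(time_value, time_scale)

def convert(time_value: int, from_scale: int, to_scale: int) -> int:
    """Rescale a time value from one time base to another (rounded)."""
    return round(time_value * to_scale / from_scale)

# A video track at 30 fps (scale 30) and audio at 44100 Hz (scale 44100):
frame_90 = 90                               # 3 seconds of video
audio_pos = convert(frame_90, 30, 44100)    # 132300 samples
```

Keeping every track in its own natural units and converting only at the point of comparison is what lets a single clock line up sound, animation and “other time-critical processes”.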
Casey announced that QuickTime would allow the Macintosh to be the premier platform for digital media, and in doing so pre-empt Microsoft’s release of multimedia extensions to Windows 3.0.
In his own summary at the conference, John Sculley promised:
…the next generation of breakthrough applications will be on the Mac.
Sculley did not mention that work on QuickTime had not even started.
Eric Hoffert was a founding member of Project Warhol and now QuickTime. He became the project leader and, in turn, the patent holder for many software-based image compression algorithms. He recalls:
I do remember after WWDC when Don pre-announced QuickTime that many of us were surprised and also asking each other, ‘So what is it exactly that we need to deliver?’
“Videotape recording is very useful in television broadcasting but editing it is not as easy as editing motion picture film.”
So began the 1967 SMPTE paper of a team from Japanese broadcaster NHK that included engineers Yasushi Fujimura, S. Iwamura, Aogu Matsumae, Tsuguo Ohtani, and K. Matsuoka.
It described ‘an automatic video tape editing splicing system’ that had successfully been used to cut 84 programs. Over a period of two years, NHK had developed a system that used a process computer, two 4-head VTRs and a dedicated control panel.
The output of the studio cameras, together with the associated recording address signals, was dual-recorded onto a 2” tape and a second, helical-scan VTR. An engineer edited with the helical deck because it offered still-frame, fast-spooling and reverse operation, giving greater control than the quad master. The editor played back the rushes only, and pushed a ‘cut-in’ and ‘cut-out’ button at the desired ‘in’ and ‘out’ points using a single monitor.
The address pulse for each edit was automatically picked up and stored in the memory of the computer; however, the editor was unable to watch the edited sequence immediately.
The computer used the decisions on its drum memory to transfer the required scenes from the original 2” rushes tape to a new blank master tape, shot by shot. If changes were required, the entire sequence was erased, and the process had to be started from scratch.
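The NHK workflow, with stored cut-in and cut-out addresses replayed by a computer to copy scenes onto a fresh master, is essentially an early edit decision list. A minimal Python sketch of the idea (the names are illustrative, not NHK’s):

```python
from dataclasses import dataclass

@dataclass
class Edit:
    cut_in: int   # address pulse stored at the 'in' point (frame number)
    cut_out: int  # address pulse stored at the 'out' point

def assemble(rushes, edits):
    """Build a new master by copying each chosen scene, shot by shot.

    Mirrors the NHK constraint: the master is written in one pass, so any
    change to the decisions means discarding it and assembling again.
    """
    master = []
    for e in edits:
        master.extend(rushes[e.cut_in:e.cut_out])
    return master

rushes = list(range(100))                 # stand-in for frames on the 2" tape
decisions = [Edit(10, 15), Edit(40, 43)]
master = assemble(rushes, decisions)      # frames 10-14, then 40-42
```

The decisions, not the footage, are the editor’s working material here, which is the same separation later systems would exploit to make changes cheap.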
The NHK submission to SMPTE contended:
“Nevertheless the pilot study proved that this was not critical. However, it is a little inconvenient.”
The Lucasfilm Editing Division team continued to experiment. In doing so they were, by their own admission, very much followers of the thinking espoused by Fred Brooks in The Mythical Man-Month: Essays on Software Engineering. Brooks believed that adding personnel to a software project that was late in its development cycle made it later. Brooks’ observations were based on his experiences at IBM, where, to speed development, he had mistakenly attempted to add more workers to a project falling behind schedule. Brooks also contended in his book that prototyping is a crucial element of the design process and that product designers should be prepared to implement new or difficult concepts, and then to throw them away:
…the question is not whether to build a pilot system and throw it away. You will do that. The only question is whether to plan in advance to build a throwaway, or to promise to deliver the throwaway to customers… Hence plan to throw one away; you will, anyhow.
Ralph Guggenheim and his boss Ed Catmull knew that they needed to create many iterations of the edit system and then throw them out to ensure they got it right. Guggenheim recalls:
Ed (Catmull) and I agreed, we didn’t know how to make one of these things, so we needed to get at least two or three iterations of the design and the code done before we could understand it or grapple with the problems. Let alone before we could show it to professional editors.
The editing project team started to investigate how they could prove the concept. Guggenheim defined the scope of development:
…what are the basic elements that people need to use when they’re editing? And sure enough it wasn’t necessarily the same feature set that the CMX-style systems gave you. It was actually really basic things we needed to deliver: being able to split an edit, being able to extend a shot at the head or the tail instantly, and being able to preview an edit without delay and without committing to it.
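Guggenheim’s three basics (splitting an edit, extending a shot at the head or tail, previewing without committing) map naturally onto a nondestructive clip list, where each operation returns a new sequence and the source footage is never touched. A hedged Python sketch of that idea, not Lucasfilm’s actual design:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Clip:
    source: str  # reel or file the frames come from
    head: int    # first source frame used
    tail: int    # one past the last source frame used

def split(seq, i, frame):
    """Split clip i at a source frame, yielding two abutting clips."""
    c = seq[i]
    return seq[:i] + [replace(c, tail=frame), replace(c, head=frame)] + seq[i + 1:]

def extend_tail(seq, i, frames):
    """Lengthen clip i at the tail by pulling more frames from the source."""
    c = seq[i]
    return seq[:i] + [replace(c, tail=c.tail + frames)] + seq[i + 1:]

# Because every operation builds a new list, a 'preview' is just playing a
# candidate sequence; nothing is committed until it replaces the current cut.
cut = [Clip("reel-A", 0, 100)]
trial = extend_tail(split(cut, 0, 40), 1, 12)
```

Instant head/tail extension is possible precisely because a clip only references its source; the frames beyond the current tail are still there to be claimed.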
Just as Adrian Ettlinger had discovered a decade before, the path to acceptance by editors and therefore the film industry at large led back to the traditional film editing systems by Steenbeck, KEM and Moviola. Lucasfilm needed to not only mimic their editing workflow but also their ease of use. The opinion from some film editors wasn’t always positive. Ralph Guggenheim recalls:
Some detested the idea of a computer-based replacement for their film machines. They liked their flat beds. One guy even explained to me that he would always get a better sense of the rhythm of his editing by hearing the splices clicking through the gate of his Moviola than he ever would from what we were proposing.
One of the key developers of the Avid Studio App for iPad has spoken about the product’s plans at Corel, following its recent acquisition from Avid.
Jim Sugg told the Timeline Group discussion:
Our dev team is now at Corel, and Studio App for iPad development continues, along with our other efforts…
After announcing Media 100 to the press in January, John Molinari and Gary Godin from Data Translation demonstrated the new system to 1992 NAB attendees. Molinari recalls:
We spent a good two years (1990-1992) proving it couldn’t be done. It was obviously technically difficult and very challenging. Then we launched the ‘proof of concept’ in 1992 at NAB. But it was a bittersweet moment.
Editor John Delmont remembers:
It’s kind of interesting how you can track a company’s progress by the size of their trade show booth. In the beginning, they were so small they didn’t even dare to go on the NAB floor. It’s such a huge investment in money and personnel and they didn’t even have a product yet. The first year they showed up they had a hotel suite.
It was in the Hilton right next to the trade show, but it was definitely an off-Broadway affair. The card was shown to prospective resellers in the living room part of the suite.
They said, “Here’s the prototype card. Do you want to hold it?”.
John Molinari recalls:
We had something that worked as a demo but it was not a real product. The mechanics, with all due respect to the engineers who worked so hard on it, were unsound.
Tony Molinari recalls the problems with Media 100’s early release:
They tried to get the editing product to work, internally. It just didn’t work. Of course the hardest thing to make work was the interaction of software and hardware, to make it do the things it needed to do. Moving around video in real time at that quality level, displaying the video on a computer screen and a video monitor, wasn’t easy. Especially with an open-system approach, using off-the-shelf hardware from Apple and hard drive manufacturers.
John Molinari recalls:
We knew after NAB ‘92 that we had the right idea but we would have to go back, and start again to realize it. What we had was a completely failed technical implementation and it was never going to work right.
After NAB, John Molinari was asked to report to Data Translation’s Board of Directors. As General Manager of the MultiMedia group, he knew that the Media 100 editing product was far from being a shipping product. He recalls:
I was sure, after that, the project would be cancelled. I thought I was going to be fired, and being the boss’ son wasn’t going to save me.
Bill Warner flicked through the Yellow Pages and rang around to find a company in Boston that had online editing suites, and asked for their pricing for computerised editing. After creating a projected cost, Warner convinced his boss to pay for the GM video to be edited at Video Troupe. After shooting the video, Warner took his collection of ¾” rushes, VHS tapes and slides down to the post facility. He had assumed that ‘computerised’ meant the system stored video digitally. Having been involved in the computer industry and having cut some internal videos, Warner thought that he would be able to ‘run the session’ and learn the professional edit system as he worked.
I asked them if I could edit my own project. And they tilted their heads funny and went “mmm”. I didn’t really make too much of that and they walked me down to the edit room and sat me down at the system and said
“Well ok, P is play R is rewind and space bar is stop”.
And I went, “What do you mean, rewind?”
They replied, “Well, that rewinds the tape decks.”
I said, “What do you mean, tape decks? This is a computerized system, right?”
They said, “Yes.”
I said, “Why do you have tape decks? Don’t you have video stored in a digital form?”
And the guy just paused and said, “What are you talking about?”
I said, “What are you talking about? This is a computerized editor, right?”
He said, “This is the top-of-the-line CMX computer editor. It’s the best there is.”
So I asked him, “If it’s the best, what’s so great about it?”
And he said proudly, “It’s frame accurate.”
And I couldn’t believe it.
I said, “My Panasonic back at Apollo is accurate to plus or minus one or two frames.”
He replied, “Yes, but this is frame accurate and you can rebuild your program from an edit decision list.”
I was dumbfounded, and then they asked me:
“When do you need your 20-minute video completed by?”
I said, “Tomorrow.”
They were shocked and replied, “You’d better call your people and tell them there’s no way you can be ready by then, forget it.”
And I said, “I’m not forgetting it”
Warner went back to Apollo and worked all night to create an offline edit. The next day he returned to the postproduction house, and created an online master using a CMX system. Still troubled by the previous day’s experience, he asked one more time.
“Are you sure this is the best system there is?”
And they said, “This is it. Get used to it.”
I was bewildered. I just thought something like an Avid existed somewhere and I didn’t know about it.
Then I just figured that any day now someone would do this, a digital editing system. Surely it’s just a matter of minutes and I’ll wait. Meanwhile the rest of 1984 went by, 1985 went by…
Unknown to Warner, something was happening.
Nick Schlott was porting code to enable Adobe Premiere to run with Microsoft’s Video for Windows (VFW).
Premiere for Windows was to be based upon VFW and of course that wasn’t released yet, so we were under Microsoft NDAs and we would travel up there periodically to see what they were up to. They had started after QuickTime and therefore were trailing behind, and of course QuickTime was no good to me on the PC, so I wrote my own file format for playing back video. I had to make it work at 1.5mb/s.
And back then I was young and worked long long hours and I could program as fast as anyone I knew, some was good code, some was not so good but I needed to be quick early on to get my head around the task. We had to create a huge amount of infrastructure for the Windows version that we took for granted on the Mac, like the Macintosh’s QuickDraw API. That had to be written almost from scratch.
Schlott hired consultants who had worked on SuperMac’s VideoSpigot to replicate the Mac toolbox.
I wrote as much of that performance-enhancing code (Premiere/Win 1.0) as I could before VFW was complete, and then we had to shoehorn the VFW stuff in once it was available from Microsoft.
Despite Adobe’s growing size with products like Photoshop, the Premiere team was small. Schlott recalls the differences in programming then and now:
Of course there are people within a company like Adobe whose job it is to get the final version of an application like Premiere and ship it and store it and so forth but on a day to day basis, if you were to ask anyone where the latest build of Premiere was? The answer would be “On Randy’s computer”. That simple. Of course it’s different now but back then it was…different. Teams of people on Photoshop and Postscript, and two of us on Premiere.
One programmer on the Mac, and one on Windows. Every now and then I would come across something in Randy’s code and go ask him how he had done it and it would be a very Mac type of solution he had engineered and I would go away and try to come up with something similar in the Windows world.
Pinnacle Systems was emulating the early days of Silicon Valley pioneer, Ampex.
It had experienced tremendous growth, then nearly closed, after which it created an outstanding new product. Finally, it had become overly reliant on one customer: sales of its Alladin technology to Avid accounted for more than 40% of its revenue.
Company co-founder Ajay Chopra recalls:
Of course Pinnacle was very successful due to the Alladin deal with Avid but it was something problematical. A significant percentage of our revenue was coming from one client. How do you expand your portfolio without starting to compete in the professional market against your OEM partners? You have this great core technology at the heart of your company, where else can you sell it?
VP of New Business Development Bill Loesch had an idea. Market research from Sony estimated that 60% of America’s 25 million camcorder households owned personal computers.
Loesch wanted Pinnacle to create an editing product for under $500, as CEO Mark Sanders recalls:
Bill nagged and nagged and nagged me about it. To which I would reply that ‘the consumer market is too different, we don’t have the distribution channels, we don’t have volume manufacturing or knowledge of that market, it would be a big stretch.’ Bill just smiled and told me, ‘We can do it.’ He convinced me.
Pinnacle founder Ajay Chopra recalls:
We looked at the low-end consumer market since our OEM customers such as Avid and Media 100 weren’t in this market. It seemed obvious that there was a real opportunity there.
I saw things very differently to Mark but one of his strengths is not to dismiss ideas he doesn’t agree with. I remember saying to him, “tape-based editing is dead, it’s history, we should make something new”, and he got so mad and yelled at me, “What the hell are you talking about? Tape editing isn’t dead!” In the end we were both right; tape-based editing was going to be around for a while but the writing was definitely on the wall.
Sanders gave Loesch the green light to create a Consumer Products division, with its first product to be a non-professional video-editing package code-named Alibaba.
Over at Digital GraphiX Inc, Ivan Maltz was keen for a change. Bill Loesch recalls:
We had looked at buying the Deko group and of course Ivan Maltz was the lead engineer there so we got to know each other. We exchanged ideas with Ivan and Keith Thomson and persuaded them to leave the east coast and Deko to come to Pinnacle. It was the right time for them and for us.
Over several discussions Bill showed us the business plan for Alibaba; I think I still have it in a box somewhere at home! It was a very exciting idea and on the strength of it we left New Jersey for California to help start a new group. I had always been on the edge of video and editing, always around it, from my days at Dubner and then Grass Valley with the ImMIX guys, but now I was going to be very much part of it.
We got to Pinnacle, and we had a clean sheet of paper!
We looked around at what was available and there weren’t many consumer products. There was MGI’s VideoWave and a few others and although I had no background in editing and didn’t know what an AVI file was, I knew about video and I knew graphics and I knew how to make applications.
Bill Loesch recalls:
As soon as they arrived we started writing a consumer non-linear editor. I think it’s fair to say that Mark (Sanders) hedged his bets here. He figured that if the consumer play failed, Keith and Ivan were two top-notch professional video engineers who could fit into our professional organization.
Ivan Maltz worked on the systems software, and Keith Thomson the user interface. Maltz continues:
We actually set ourselves the goal of getting an ‘Editor’s Choice’ review with PC Magazine but that was going to be difficult because at the time they didn’t even have a video editing category!
Loesch is candid in his assessment:
There were a lot of people on the board who thought Mark Sanders was crazy.
While Adobe and Macromedia were making plans for Macintosh video products, Apple seemed unfocused. It had relied on Bill Atkinson’s QuickDraw to deliver graphics that set it apart from other personal computer operating systems but had since lost its lead to Wintel.
Apple had then released Atkinson’s HyperCard at the 1987 Macworld Expo, where it was billed as a hypermedia system that allowed users with little programming knowledge to create custom applications in minutes. HyperCard combined database capabilities with a graphical, flexible, user-modifiable interface. Schools used it to create interactive learning materials, while industrial companies like Renault used it to build inventory databases. Atkinson himself was surprised at what HyperCard was being used for:
It’s as if you gave somebody a crescent wrench and asked him what he was going to use it for. Well it turns out it’s a lot of things, including a hammer.
While a few key players within Apple believed that the company should look beyond HyperCard for its next operating system update, System 7, there was little support for a new method of delivering media elements like video and audio. Duncan Kennedy recalls:
For about three to five years there was a major effort within Apple to develop hardware-based video projects. There were projects trying to leverage HyperCard, but no one was really doing software-based video.
Atkinson, himself one of the original Macintosh team, had only created HyperCard after a plan to build a handheld tablet computer with a full-page display, called Magic Slate, was shelved.
Needless to say, Magic Slate wasn’t the kind of thing that Apple could make in a couple of years. And back then Apple wasn’t into long-term research.
Another Apple scientist, Stephen (Steve) Perlman, had been working on a hardware solution to desktop video for some time, and he went public at SIGGRAPH with a dedicated hardware box called QuickScan that was powered by a custom, high-density chip.
QuickScan was able to deliver symmetrical decompression of full-color video in real time, and could display the results on a standard Mac II screen. Perlman showed viewers computer windows that contained 24-bit color video running in real time, at 30 frames per second while he moved the windows around the screen.
What we were showing was the Mac handling multiple video windows with live, full National Television System Committee (NTSC) bandwidth video. We showed the ability to deal with dynamic objects in much the same way the system deals with static objects.
The demonstration amazed all who saw it, as Perlman later told journalist Jim Carlton:
It knocked people’s socks off.
Eric Hoffert recalls QuickScan:
It was a very sexy demonstration. QuickScan was a display engine that drove a black box capable of showing multiple windows of live video and compositing.
Perlman had previously designed a parallel-processing graphics system at Atari, and then a massively parallel 3D animation chip at Coleco before joining Apple. He had achieved QuickScan by finding a new way to manipulate graphics.
Rather than adding a lot of horsepower to do the graphics acceleration, we’ve figured out a new way of looking at graphics so that motion doesn’t have to cost you processor cycles.
Apple Products President Jean-Louis Gassée was abrupt but fair with his forecast:
The biggest technical hurdle is the development of symmetrical, layered, real-time video compression and decompression. That’s a combination of mathematics and silicon that the industry hasn’t licked yet.
Eric Hoffert adds:
The prevailing wisdom was to handle video with MPEG, processed by a chipset on the motherboard, but I knew we could achieve some form of video without hardware. Over the summer we decided to figure it out and make it work.