
The History of Non-Linear Editing


We may all use NLEs on our computers to edit video, but how did we get here from reel-to-reel?


Imagine finally wrapping on a shoot, sitting down at your edit bay to get started on your rough cut, and staring at a reel-to-reel editor with a pair of scissors in hand and only a tiny monitor for reviewing your footage. No bins of footage, no timelines - just one linear strip of film that you have to literally “cut” down yourself. Sounds pretty exhausting, doesn’t it?


Thankfully, we’ve got the help of non-linear video editing to make our lives way, way easier. But how did we get here? When did the industry shift from cutting film strips to moving clips around a timeline? Let’s take a journey through the history of non-linear video editing software and how it became a staple tool of the filmmaking industry.


In The Beginning…


All the way back in 1971, a joint venture between broadcasting giant CBS and Memorex (a magnetic computer tape company) birthed the CMX-600. This behemoth of an editing machine consisted of two monitors - one for playback and one for cutting. You edited with a light pen, which the monitors registered as input (a detail that is still pretty fascinating to this day).


The drawbacks of this machine made a lengthy list, though. You could only store up to 30 minutes of footage at a time, on disk-pack drives the size of washing machines. Footage was displayed only in grainy black and white, and the system could be used only for offline editing. And even if you did want to use this technology, it wasn’t very accessible to filmmakers, with a price tag reaching just under a million dollars.


After the CMX-600 paved the way for non-linear editing technology, a company famous for pushing the technology envelope came into play: Lucasfilm. Yes, George Lucas of “Star Wars” Lucasfilm. His company was looking for a way to access banks of footage at a moment’s notice. Unfortunately, hard drives at the time cost almost $200 per megabyte of storage. So the team at Lucasfilm crafted a machine called the EditDroid - a non-linear editing machine that ran off a bank of LaserDiscs, allowing random access to the clips stored on those discs. The technology premiered at NAB in 1984, showing the world that video editing was headed for a dramatic leap forward. EditDroid laid the foundation for features we find in NLEs today, such as clip bins and a functioning timeline.


The Rise of the Digital NLE

After products such as the Montage Picture Processor came along with non-linear editing technology that used Betamax tapes as its random-access storage, and popular movies such as “Full Metal Jacket” and “The Godfather Part III” put them into practice, the industry was looking for something that harnessed the rising power of modern computers.


Thus came the first fully digital non-linear editing system, from the Editing Machines Corporation: the EMC2. This powerful invention allowed editors to pull their footage onto a specialized optical disk, then edit it using the software. The technology of the time didn’t allow the EMC2 to edit the original footage, or even display footage above 240p, but it could produce a timecode-based edit that could then be applied to the raw footage. Now editors had the ability to create full edits entirely on a computer.


Just weeks following the introduction of the EMC2 at the 1989 NAB conference, a familiar name came to the forefront: Avid Technology. 


Avid Media Composer 1 was a revolutionary platform, and very close to what NLEs look like today. It ran on Apple’s Macintosh II, taking advantage of that machine’s processing power. You could import your video, drag it into a timeline, and even splice in clips and effects without having to burn them into your footage. It could only export M-JPEG video, at a quality similar to VHS - which, along with the system’s steep price tag, turned many editors off from transitioning to Avid. The modern NLE was born, but yet to be adopted.


Due to the limitations of the Macintosh II, you could only access 50GB of storage at a time. This limited NLEs to small commercial projects and other short-form media. It wasn’t until 1993 that an R&D team at Disney found a workaround using external hard drives, which allowed Avid Media Composer to access more than 7TB of footage instantly. Finally, the NLE had the ability to edit long-form content.


The only thing holding back the NLE at this stage was export capability. Only low-resolution projects could come out of the software, hindering editors’ ability to fully edit and export a project on a computer. This led to a race among video software companies to create codecs that would allow full-resolution exporting.


Familiar Faces Appear


In 1991, Adobe released Premiere on the Macintosh to compete with Avid’s Media Composer. The software had a slew of new features, such as color editing, transitions, and support for the QuickTime media format.


After Premiere had sliced off a bit of market share of its own, Adobe began shifting its focus to bringing Premiere to the PC. This led a team of developers to jump ship and begin creating their own platform to rival Premiere and Avid Media Composer: a piece of software called KeyGrip. Unfortunately, due to licensing issues, KeyGrip could never be released commercially; the team was only able to unveil it at the 1998 NAB conference. Apple executives recognized that editing software was starting to shift off Apple computers and onto PC platforms, so they purchased the software in hopes of selling it to an Apple third-party developer. With no buyer in sight, they decided to use it to create their own platform: Final Cut Pro.


At this point, NLEs had caught up with the times. They could edit full-length films, export in high quality, and had started fully replacing the old guard of reel-to-reel editing. New competitors came to the forefront, such as Sony Vegas, HitFilm, and eventually Blackmagic’s DaVinci Resolve. Editors like me, who grew up with the original versions of these platforms, have never even been involved in an analog editing process. Video editing had become more accessible than ever, allowing a new generation of editors to cut their own projects from their living rooms.


Want To Experience the Next Generation of Video Editing? Try Simon Says.

Though NLEs have adapted to new technologies over the years, they still have a way to go. Editing a rough cut is the same as it was 20 years ago: combing through hours of footage manually to get the cuts you need. This is where Simon Says comes in. Plug your footage into our platform, and we’ll transcribe your project into a full transcript of the audio that you can read through. Then hop into our Assemble platform, where you can build a rough cut by simply highlighting selections of your transcript and dragging them onto the timeline. Finally, export an XML file that integrates seamlessly into your NLE.


Give it a shot with a free trial and come join us in the future of video editing.