Last June, Apple’s release of FCPX shook the digital content creation industry and sparked a lot of speculation among editors about how much Apple understands, or cares about, its professional users.
I recently explored some of Apple’s history in content creation and offered a little analysis about what that might mean for the future in a two-part series for CreativeCOW.
I’ve found myself having the “mind like water” discussion over and over with colleagues and clients in the last couple weeks. It’s a notion I borrowed from David Allen, which he borrowed from his martial arts training.
Picture a still body of water. Throw a pebble into it, and you get little ripples. Drop a boulder in, and you get huge waves. Water always reacts appropriately; you never get huge waves cascading from a little pebble. Water never overreacts, and always returns to calm.
As we get stressed, we tend to overreact and fail to return to calm. We give small details undue attention, and avoid larger issues that deserve more of it. The “mind like water” challenge is to keep our responses proportional to our inputs.
Happy anniversary, Mac.
25 years ago, Apple introduced the Macintosh and changed the way we thought about computers. For anyone outside a research lab, a computer with a graphical user interface was a revolutionary change in the way we worked.
The Mac was a huge innovation, and its influence is still felt today, but our interactions with computers since its introduction have been largely evolutionary. Seeing multi-touch and speech recognition technologies taking hold in our daily lives today suggests we are on the cusp of revolutionary change.
It’s very exciting to think that we are starting to design our computers to interact with us, instead of forcing ourselves to adapt to our computers. The implications for communications are huge. Most of our presentations today are static; technological limitations leave viewers passively receiving our messaging. Now, we are developing the tools to convert our viewers into participants, letting them interact with our messaging dynamically—even physically—with sight, sound, and touch.
Here’s to the next 25 years.
With the Super Bowl just a couple of weeks away, here’s a video that breaks down the technology behind the yellow first-down line on TV broadcasts of American football.
It’s a live compositing system that reads position data from each camera on the field and draws the line over the video feed, using color filtering to mask out the players. I was struck by the simplicity of the engineering: a spare audio channel from each camera carries its position data back to the truck.
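The masking idea is simple enough to sketch in a few lines of code. This is a toy illustration of the color-keyed compositing step only, not the broadcaster's actual system: the turf color, tolerance, and line geometry below are all assumptions I've made up for the demo. The key point it shows is that drawing the line only over turf-colored pixels makes the players occlude it automatically.

```python
import numpy as np

# Assumed values for illustration -- not the real broadcast parameters.
FIELD_GREEN = np.array([40, 120, 50])   # hypothetical turf color (RGB)
TOLERANCE = 60                          # hypothetical per-channel match tolerance
LINE_COLOR = np.array([255, 220, 0])    # broadcast-style yellow

def composite_first_down_line(frame, line_mask):
    """Overlay the line wherever line_mask is set AND the pixel looks like turf."""
    # A pixel counts as turf if every channel is close to FIELD_GREEN.
    turf = np.all(np.abs(frame.astype(int) - FIELD_GREEN) < TOLERANCE, axis=-1)
    draw = line_mask & turf             # non-turf pixels (players) mask out the line
    out = frame.copy()
    out[draw] = LINE_COLOR
    return out

# Tiny demo: a 4x4 "field" with one red "player" pixel standing on the line.
frame = np.tile(FIELD_GREEN, (4, 4, 1)).astype(np.uint8)
frame[1, 2] = [200, 30, 30]             # a player's red jersey
line_mask = np.zeros((4, 4), dtype=bool)
line_mask[1, :] = True                  # the first-down line runs along row 1

result = composite_first_down_line(frame, line_mask)
# Turf pixels on the line turn yellow; the player's pixel is left untouched.
```

The real system adds the hard part this sketch skips: using each camera's pan/tilt/zoom data to figure out where in the frame the line should be drawn in the first place.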