LUMIERA.clone/doc/user/intro/intro.txt

Lumiera (as seen) from Outer Space
==================================
:Author: Lumiera_Core_Developers
:Date: Summer 2010
[abstract]
******************************************************************************
The Lumiera community is in the process of creating a non-linear video editing
and compositing FOSS application for Linux/Unix/POSIX operating systems. The
application is geared towards professional, high-quality work, but is equally
suitable for low-end users thanks to its in-design scalability.
Lumiera builds on common open source video, sound and GUI toolkits and
libraries. It is highly flexible and configurable -- giving the user control
over a broad spectrum of parameters -- with smooth workflows that scale well
from small projects to larger, more intricate ones.
This document outlines the design from a general perspective, providing
potential users with sufficient insight into the tools and technology behind
Lumiera to start working with it quickly.
******************************************************************************
// all things starting with '//' are asciidoc comments and drafts/notes while
// working on this document
.About this Document
// It contains many hyper-links to explanations which are denoted by an arrow ->.
Lumiera is still under active development. Here we describe planned features
without explicitly tagging them as such; some points still have to be worked
out in detail. Although this document is heavily cross-linked, we try to start
with a broad overview and develop the details towards the end.
Vision
------
// objective and goals of the project
Lumiera strives to be a _professional non-linear video editor_. To start with,
we should point out that ``professional'' does not necessarily mean
``commercial'' or ``industrial''. It is more an attitude or frame of mind --
doing work seriously, in the service of some wider goal, demand, or purpose.
In film editing, that purpose may be artistic in nature: a narrative or
meaning to convey, a political message, or something to portray to an
audience.
With this perspective in mind, we can identify a number of key properties
required of professional film production tools:
Reliability::
	Your work must be protected at all costs against software glitches and
	incompatibilities. Ideally, Lumiera should be reliable, very stable, and
	never crash. In practice, even crashes or power outages should not
	result in loss of data or work.
Quality::
	The demands placed on high-quality, cinema-grade digital video material
	require crisp quality without any compromise throughout the entire
	workflow, up to the final product. All rendering has to be reproducible
	down to the last digit.
Performance and Productivity::
	Professionals want to get things done, on schedule and on target, yet
	ideally with control over every detail. Finding the right balance
	between these goals is central to workflow design and usability.
Scalability and Adaptability::
	Projects and budgets differ, and hardware advances. Lumiera must scale
	in different dimensions and use the available resources as best it
	can -- from small laptops to multi-core workstations and render farms.
Durability::
	Software and hardware advance at a fast pace. We must not lock into the
	current state of technology, but must be flexible enough to extend the
	system without breaking compatibility. Projects you create with Lumiera
	today should remain usable for the foreseeable future; at the very
	least, there needs to be a guaranteed upgrade path.
Fundamental Forces
------------------
// the basic ideas which drive the Lumiera design
The Lumiera design is guided by a small number of basic principles. Keeping
these in mind will help in understanding how more interesting things are
built up on this foundation.
Open ended combining of Building Blocks
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Lumiera is not so much defined in terms of _features_ -- rather, it allows
basic _building blocks_ to be combined. These basic modules, entities or
objects each have a distinct _type_, which explicitly limits the possible
connections. Within these limits, any conceivable combination shall be
supported, without further hidden limitations.
Lumiera is thus neither a set of Lego bricks, nor a business application
driven by a finite set of usage stories.
Medium level Abstraction and Project specific Conventions
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
These building blocks give Lumiera a moderate level of abstraction; a user
may, if desired, directly manipulate through the GUI the clips, individual
effects, masks, and even the placements xref:placement[->] used to stitch the
objects together, which is comparatively low-level. On the other hand, these
abstractions shield the user from technical details like format conversions
and the accessing of individual channels.
To complement this approach, Lumiera does _not_ rely on hard-wired, global
conventions -- rather, it allows building up project-specific conventions and
rules xref:rules[->] to fit the given requirements and preferred working
style. To help you get started, Lumiera will ship with a fairly conventional
project template and default configuration.
[[graphs]]
Rendering is Graph Processing
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Processing of video (and audio) data can be generalised as graph processing
(more precisely, processing of ``directed acyclic graphs''). Data flows along
the edges of these graphs and is processed in the nodes.
image:{imgd}/lumiera_big_graph.png[Example for a graph]
When we look at this model, we discover that we only need to _build_
xref:builder[->] such graphs; the nodes themselves can be treated as black
boxes and will be implemented by plugins xref:plugins[->]. Moreover, one can
preconfigure subgraphs and handle them as a single entity xref:pluginstack[->].
In Lumiera, everything is translated into such a graph. Your footage is
demultiplexed xref:demultiplexer[->] at a first node, and everything down to
the encoder xref:encoder[->] and multiplexer xref:multiplexer[->], which
assemble the final video, is part of the graph.
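The graph model above can be sketched in a few lines of Python. This is
purely illustrative -- the names `Node` and `topological_order` are invented
for this example and are not part of the Lumiera code base:

```python
# Minimal sketch of a processing DAG: black-box nodes with inputs.
# (Hypothetical names; not the actual Lumiera node API.)

class Node:
    """A black-box processing step with zero or more input nodes."""
    def __init__(self, name, *inputs):
        self.name = name
        self.inputs = list(inputs)

def topological_order(output):
    """List every node feeding into `output`, dependencies first."""
    order, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for src in node.inputs:
            visit(src)
        order.append(node)
    visit(output)
    return order

# footage -> demux -> effect -> encode -> mux
footage = Node("footage")
demux   = Node("demux", footage)
blur    = Node("blur", demux)
encode  = Node("encode", blur)
mux     = Node("mux", encode)

print([n.name for n in topological_order(mux)])
# -> ['footage', 'demux', 'blur', 'encode', 'mux']
```

Because every node only knows its inputs, a preconfigured subgraph can itself
be wrapped up and used like a single node, which is exactly the composability
the text describes.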
Pulling not Pushing
~~~~~~~~~~~~~~~~~~~
At first glance, it looks fairly natural to set up the graphs xref:graphs[->]
as described above and then push data into the system through the input nodes,
with the final result eventually appearing at the output node. Several
multimedia frameworks use this approach. However, this scheme exhibits a number
of shortcomings which make it inappropriate for non-linear video editing.
Lumiera instead pulls data through the pipe, i.e., a request starts at the
output node and makes its way up to the inputs. This has certain advantages
xref:pull[->], which will be explained later.
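The pull principle can be illustrated with a toy sketch (invented names, not
the actual Lumiera engine): a request for one output frame recursively pulls
exactly the input frames needed to produce it.

```python
# Pull-driven rendering, minimally: each node's pull(t) asks its
# upstream node for the data it needs to produce frame t.

class Source:
    def __init__(self, name):
        self.name = name
    def pull(self, t):
        return f"{self.name}[{t}]"          # pretend to decode frame t

class Effect:
    def __init__(self, name, upstream):
        self.name, self.upstream = name, upstream
    def pull(self, t):
        frame = self.upstream.pull(t)       # request propagates upward
        return f"{self.name}({frame})"

out = Effect("grade", Effect("blur", Source("clipA")))
print(out.pull(42))
# -> grade(blur(clipA[42]))
```

One advantage is already visible here: frames that are never requested at the
output are never decoded or processed at all.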
Don't waste work
~~~~~~~~~~~~~~~~
Rendering A/V data can be quite CPU intensive. To ensure that we do not waste
any CPU power by rendering things twice -- or, worse still, having to throw
away results because they could not be rendered in time -- Lumiera uses
sophisticated caching xref:caching[->] and profiling xref:profiling[->].
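The basic effect of a frame cache can be demonstrated with Python's
`functools.lru_cache`. This is a stand-in only; Lumiera's actual cache is far
more elaborate and driven by profiling data:

```python
# Memoising a costly render function so each frame is computed at
# most once. (Illustration only, not Lumiera's cache implementation.)

import functools

calls = 0

@functools.lru_cache(maxsize=None)
def render_frame(t):
    global calls
    calls += 1                 # stand-in for an expensive render
    return f"frame[{t}]"

render_frame(7)
render_frame(7)                # second request is served from the cache
print(calls)
# -> 1
```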
The visible Universe
--------------------
// coarse overview whats seen on the gui, details later
Now it's time to take a look at the preliminary Lumiera GUI:
image:{l}/images/lumiera_gui_small.png[Current Lumiera GUI Screenshot]
The GUI is itself a plugin and only one way to work with Lumiera. It will
become possible to create special-purpose GUIs or to control Lumiera in
different ways, for instance as a headless rendernode xref:rendernode[->] or
frameserver xref:frameserver[->]. Completely script-driven interfaces for
automated processing are also planned.
The GUI screenshot above shows roughly the default you get when starting
Lumiera for the first time (the plan is to add a second viewer to the default
configuration), while we also support a much more sophisticated screen concept
xref:screenconcept[->] to adapt to different workplaces and workflows.
Viewer
~~~~~~
The viewer is an area where material can be displayed, i.e., ``played back'';
it also supports audio playback connections. As there are many sources that
can be displayed, a viewer is attached to a source via the viewer switch
board. Timelines, probe points, wiretaps and individual clips are examples of
sources that can be attached to a viewer. The number of viewers open at any
one time is limited only by the hardware, and each viewer can be collapsed,
or hooked up to a beamer or an external monitor.
Transport Controls
~~~~~~~~~~~~~~~~~~
The layout in the current GUI is rather preliminary -- it is not finally
decided where the transport controls will be integrated; possibly they will
become a GUI element of their own.
Transport controls are devices operated either through widgets or through
some external input device (MIDI, control surface, etc.), so their GUI may
look different in each case. Either way, they connect to a Play Controller
xref.. in the core, which manages playback and cursor positioning. Thus there
will be some GUI facility for attaching transport controls to Play
Controllers; transport controls are ganged when they attach to the same Play
Controller.
Just playing some footage for preview creates a simple internal timeline --
no magic here.
// TODO: bit unrelated, think about how ganging controls in general should
// work, also for faders, masks and so on
// Note by Ichthyo: the connection to a fader is handled through the placements,
// which allows to inherit such a control connection. IMHO together with the
// tree-like tracks this removes 80% of the need to gang faders.
Timeline View
~~~~~~~~~~~~~
A timeline is a container that provides a time axis and an output. The output
can be derived from various sources and can have different configurations;
for each output configuration, there will be one timeline. A timeline does
not itself arrange material in time -- that is done by a sequence, which can
be snapped to a timeline.
A typical film will define many sequences, but only a few timelines. A
sequence contains a number of tracks, which are ordered in a hierarchy. Tracks
do not have any format associated with them, and more or less anything can be
put into a track. Consequently, audio and video material can equally be
assigned to a track; there is no discrimination between audio and video in
the Lumiera concept of a track.
A timeline must be assigned to a viewer if playback viewing is desired.
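The relationship between timeline, sequence and tracks can be sketched as
follows. All class names here are invented for illustration and do not mirror
Lumiera's real data model:

```python
# Timeline provides time axis + output; Sequence arranges material
# via a track tree; tracks are format-agnostic containers.

class Track:
    """Format-agnostic: may hold clips of any media kind, plus sub-tracks."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

class Sequence:
    """Arranges material in time through a hierarchy of tracks."""
    def __init__(self, root_track):
        self.root = root_track

class Timeline:
    """One time axis and one output configuration."""
    def __init__(self, output_config, sequence):
        self.output_config = output_config
        self.sequence = sequence

# the same sequence could be snapped to several timelines,
# one per output configuration
scene  = Sequence(Track("root", [Track("video"), Track("sound")]))
master = Timeline("stereo+HD", scene)

print([t.name for t in master.sequence.root.children])
# -> ['video', 'sound']
```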
//Format Independent Timeline, one can put anything on the timeline.
//the busses constrain what kind of data is pulled out and in turn the
//builder creates a processing graph which does the necessary conversions and
//stuff.
//
// Q: how to handle interaction, for example when some conversion can only be
// done in a lossy way and some conversion node may or may not be inserted
// (i mean gui wise)?
// A: this usually is detected at build time, which means the incriminating
// object and exit node is just in scope when the problem is detected.
// My intention was to have a problem flag with accompanying information
// attached to this object, so the GUI can highlight the problem location
// and give a general alert.
// TBD: Cursors .. discussion, handling, gui representation
// Note by Ichthyo: we shouldn't focus on cursors, but rather on selections.
// IMHO a playhead or edit marker or similar cursor is just
// a special case of a selection.
Busses
~~~~~~
The GUI provides a separate _bus view_, showing the master busses (subgroups)
in a manner similar to an audio mixing desk. A bus is just a means of
collecting and adding (i.e., overlaying) the output of various kinds of media
(video, audio, any number of channels), produced by various processing
elements and by other busses. These global busses can be conceived of as
being part of the timeline.
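The collecting-and-adding role of a bus can be reduced to a toy example
(numbers stand in for media data; real busses overlay video and mix
multi-channel audio, and none of these names exist in Lumiera):

```python
# A bus sums the contributions of everything connected to it,
# including other busses.

class Const:
    """Trivial element producing a fixed value for every frame."""
    def __init__(self, v):
        self.v = v
    def pull(self, t):
        return self.v

class Bus:
    def __init__(self, *inputs):
        self.inputs = list(inputs)
    def pull(self, t):
        # collect and add the output of every connected element
        return sum(src.pull(t) for src in self.inputs)

master = Bus(Const(1), Const(2), Bus(Const(3)))
print(master.pull(0))
# -> 6
```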
Asset View
~~~~~~~~~~
We can conceive of the Asset View as the timeline's bookkeeper: it manages
the various constituents of the timeline. In addition to timeline
constituents, raw material, clips and bins (folders) are managed by the Asset
View, i.e., the typical management operations -- deleting, adding, naming,
tagging, grouping into bins, etc. -- all occur here. Plugins are also managed
in the Asset View.
The Asset View manages all assets available in one project:
* source media/footage/soundfiles
* all available effects and transitions
* internal artefacts like sequences and automation data sets
// First this will be simply implemented showing data loaded into the session
// and all available plugins/effects
// The user may build custom effect collections ("effect palette")
// (much) Later it is planed to make this a database driven interface, where
// the tabs showing things are basically just database queries. Then it
// becomes possible to create/extend this by customized queries and augment
// assets with all kinds of metadata which can be queried
// Actually, the same underlying data structure is used to implement the
// asset view with folders, clip bins and effect palettes, and the timeline
// view with tracks, clips and attached effects. Technically, there is no
// difference between a track or a clip bin -- just the presentation varies.
// Timeline contents can be viewed like assets for bookkeeping purposes, and
// the contents of a clip bin can be played like a storyboard
''''''''
Dark Matter
-----------
The material in this section provides a cursory view of features not required
by the typical user, but of more importance to people looking under the hood,
i.e., programmers and the like.
Session storage
~~~~~~~~~~~~~~~
[red]#to be written#
//databank with logging, no data loss.
// not generateable data
// its the timeline mostly
// session storage
// benefits, unlimited undo, selective undo
// export/import plugins
// everything is stored in the session
[[placement]]
Placements
~~~~~~~~~~
Placements are the generic mechanism for stitching together media objects.
Any placement may contain a list of conditions specifying how to locate the
placed object -- examples being absolute or relative time, placement relative
to another object, or placement relative to some specific source media frame.
All of the session model's contents are attached by placement, forming a
large tree. Placements have to be _resolved_ to find out the actual position,
output and further locational properties of an object. Missing placement
information is _inherited_ from parent placements in the session tree. As a
result, many relational and locational properties are inherited from more
global settings unless defined locally at a given object: time reference
point, output destination, layering, fade control, audio pan, ...
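The inheritance of placement properties can be sketched like this (class and
property names are hypothetical, chosen only to mirror the description above):

```python
# Placement resolution: a property missing on an object's placement
# is looked up in the placements of its ancestors in the session tree.

class Placement:
    def __init__(self, parent=None, **props):
        self.parent = parent
        self.props = props

    def resolve(self, key):
        """Walk towards the session root until the property is found."""
        if key in self.props:
            return self.props[key]
        if self.parent is not None:
            return self.parent.resolve(key)
        raise KeyError(key)

session_root = Placement(output="master-bus", pan=0.0)
track = Placement(session_root, pan=-0.5)
clip  = Placement(track)               # defines nothing locally

print(clip.resolve("pan"))             # inherited from the track
print(clip.resolve("output"))          # inherited from the session root
# -> -0.5
# -> master-bus
```

Defining `pan` locally on the clip's placement would simply override the
inherited value, without touching the more global settings.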
Rendering Engine
~~~~~~~~~~~~~~~~
[red]#to be written#
// rendering...
[[builder]]
Builder
~~~~~~~
[red]#to be written#
[[rules]]
Rules System
~~~~~~~~~~~~
[red]#to be written#
I/O Subsystem
~~~~~~~~~~~~~
[red]#to be written#
// file handling
// vault, work, cache
// repositories
// explain details later
Configuration
~~~~~~~~~~~~~
[red]#to be written#
// configuration system
// serves defaults, actual data are stored in the session
Plugins/Interfaces
~~~~~~~~~~~~~~~~~~
What are Plugins?
^^^^^^^^^^^^^^^^^
A plug-in is a kind of generalisation of a library.
All applications use libraries, to varying degrees. A programmer will not
reinvent the wheel each time they sit down to write an application; instead,
they will borrow and use features and functionality from other programmers --
or even from themselves, code written long ago. Such features are collected
together in libraries.
A library is used in an application by _linking_ the library into the
application. (There are other things to be done, but we'll call these
'details', which won't concern us here.) There are two different ways to
_link_ a library into an application: static linking and dynamic linking.
_Static linking_ is done while the application is being built, or compiled.
It is performed by the linker. The linker can perform some checks (mostly
on symbols and their usage) and warn the user that some particular feature
is being used incorrectly. The user can then correct the offending code
and recompile.
There are a number of disadvantages associated with static linking. Features
and libraries are constantly being improved; if the application wants to use
new features, it has to be recompiled against the new library which provides
them.
_Dynamic linking_ removes the need to recompile. If a new, improved library
becomes available, all the user has to do is install the new library into the
operating system and restart the application; the new features can then be
used by the application. The features provided by a dynamic library are
loaded when the application starts.
However, both methods exhibit a number of shortcomings. Wouldn't it be better
if features could be loaded only when needed? They could then also be
unloaded when no longer required, saving memory and possibly increasing
performance. This scheme of making features available to an application is
known as run-time linking, a.k.a. plug-ins.
Plug-ins offer other benefits: the application can continue to use both old
and new features together, side by side, by means of the version number
associated with each plug-in. This saves the application from the
considerable headaches -- the havoc of library version incompatibilities --
associated with the other linking methods.
Most modern applications use plug-ins, some are heavily dependent on plug-ins
and only provide limited functionality without any plug-ins.
Lumiera will not reinvent the wheel. One major goal is to provide considerable
functionality via well-designed, external code supplied to Lumiera by plug-ins.
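The idea of run-time linking can be demonstrated compactly in Python; this is
an illustration of the concept only, as Lumiera itself implements its plug-in
loading in C/C++ via its own interface system:

```python
# A feature is located and loaded only at the moment it is needed,
# instead of being linked in at build time.

import importlib

def load_plugin(module_name):
    """Load a module on demand -- run-time linking in miniature."""
    return importlib.import_module(module_name)

# the stdlib 'json' module stands in for an encoder plug-in
codec = load_plugin("json")
print(codec.dumps({"clip": 1}))
# -> {"clip": 1}
```

In a compiled application, the equivalent mechanism is `dlopen`/`dlsym` on
POSIX systems: the shared object is opened at run time and individual entry
points are looked up by name.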
How are Plugins Implemented?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[red]#to be written#
Rendering Video
---------------
[red]#to be written#
// describe the flow of data to render a frame
// viewer
// final
Pulling a Frame
~~~~~~~~~~~~~~~
[red]#to be written#
// special cases,
// case studies,
// constrain viewer
// proxy
// viewer circuit
// render circuit
//Example Plugins
//---------------
// show some case-studies that someone gets a feel how plugins work
[red]#TODO# Consider integrating the following things into the document above
* plugins
* timeline
* pull
* bus defines rendering format
* caching
* frameserver
* screenconcept / perspectives
* automation
* 3 layered model
Glossary
--------
The above outline of the design uses many special terms, as well as common
terms used with a specific meaning within Lumiera. To ease understanding,
we've collected a link:Glossary.html[Glossary of common terms].