DOC: Wiki / UML clean-up

Remove some orphaned diagrams and PNG images not actually used
in the TiddlyWiki. Add a page with some hints regarding Bouml

See also #960 -- Bouml has been discontinued and is closed source now;
not sure how to proceed with this
This commit is contained in:
Fischlurch 2015-01-05 15:44:17 +01:00
parent 55b2c79aad
commit 1a5e5eaa10
16 changed files with 77 additions and 90 deletions

Binary files not shown (11 deleted images, between 78 B and 34 KiB).

uml/.gitignore vendored (new file)

@@ -0,0 +1 @@
+README_UML.html

uml/README_UML.txt (new file)

@@ -0,0 +1,51 @@
Some Hints regarding UML usage
==============================
This directory contains some UML modelling done with the software *'bouml'*.
http://www.bouml.fr/[Bouml] was written by Bruno Pages (bouml@free.fr).
It used to be open source, released under the GPL up to
http://www.bouml.fr/historic_old.html[Version 4.21].
Some years ago, there was a somewhat confusing move by the original author,
who blamed ``Wikipedia editors and copyright violation'' for destroying his work.
Seemingly the point of contention was the licensing of Bouml logo images on
Wikimedia. See the http://en.wikipedia.org/wiki/Talk:BOUML[Wikipedia page]
for some indirect hints. Judging from further indirect mentions, there must
have been a flame war somewhere. Anyway, the author went closed source.
As a consequence, Bouml was dropped from Debian, since it relies on qt3.

* bouml uses a custom, text-based session format for its UML-``Projects''
* we track all these bouml session files in our Git tree
* but note: the actual format depends slightly on the bouml version in use footnote:[as of
  1/2015, we still use version *`4.21`* of bouml, which is the version found in Debian/Squeeze. It runs
  without modification on Debian/Wheezy]
* and the layout of the rendered diagrams is unfortunately _highly dependent on the installed fonts_.
* for that reason, we also check in any _relevant_ diagram images into Git. See `doc/devel/uml`.
  Here, ``relevant'' means all diagram images which are linked into the website or the TiddlyWiki
* these images can be regenerated by producing an ``HTML Report'' from within bouml. But we don't
  upgrade the versions in Git _unless really necessary_ (due to the dependency on the installed fonts).
  And we do not check in any other generated artefacts. Thus, after re-running this export from bouml,
  just add to Git what you _really_ need for linking in the documentation, and remove all other
  artefacts afterwards.
Housekeeping
------------
Some practical hints:

- to find out which images are used in the TiddlyWiki, grep for the PNG links:

    egrep 'fig.+\.png' renderengine.html

- to find the numbers and titles of the diagrams, grep over the bouml project files:

    for D in *.diagram; do D=${D%.diagram}; echo ===$D===; egrep "diagram $D" *; done

- by a variation of this technique, you can find obsoleted diagrams left behind by bouml:

    for D in *.diagram; do D=${D%.diagram}; if ! egrep -q "diagram $D" *; then echo "Orphaned diagram $D"; fi; done
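The orphan check can be demonstrated self-contained. The diagram numbers and the session file below are invented for the example; the grep pattern is the same as in the hint above:

```shell
# Toy bouml project: diagram 128005 is referenced from a session file,
# diagram 142000 is orphaned (the numbers are made up for this sketch).
set -e
proj=$(mktemp -d); cd "$proj"
: > 128005.diagram
: > 142000.diagram
echo 'classdiagram 128005 "Session backbone"' > session_file
orphans=$(for D in *.diagram; do
            D=${D%.diagram}
            grep -q "diagram $D" * || echo "$D"
          done)
echo "Orphaned: $orphans"
```

The match works because references like `classdiagram 128005` contain the substring `diagram 128005`, while an orphaned number appears in no project file.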

@@ -1,63 +0,0 @@
format 58
classinstancecanvas 128005 classinstance_ref 134661 //
xyz 233 116 2000
end
classinstancecanvas 128133 classinstance_ref 134789 //
xyz 297 53 2000
end
classinstancecanvas 128261 classinstance_ref 134917 //
xyz 335 112 2000
end
classinstancecanvas 128389 classinstance_ref 135045 //
xyz 335 144 2000
end
classinstancecanvas 128517 classinstance_ref 135173 //
xyz 335 175 2000
end
fragment 128773 ""
xyzwh 311 90 1994 83 139
end
classinstancecanvas 129157 classinstance_ref 135301 //
xyz 65 116 2000
end
classinstancecanvas 129285 classinstance_ref 135429 //
xyz 95 176 2000
end
classinstancecanvas 129413 classinstance_ref 135557 //
xyz 95 208 2000
end
classinstancecanvas 129541 classinstance_ref 135685 //
xyz 95 241 2000
end
fragment 129669 ""
xyzwh 70 153 1994 93 136
end
classinstancecanvas 129925 classinstance_ref 135813 //
xyz 75 33 2000
end
fragment 130181 "EDL"
xyzwh 12 12 2000 167 305
end
fragment 130437 "asset management"
xyzwh 221 12 1989 184 236
end
objectlinkcanvas 128645 norel
from ref 128133 z 1999 to ref 128261
no_role_a no_role_b
objectlinkcanvas 128901 norel
geometry VH
from ref 128005 z 1999 to point 256 61
line 129029 z 1999 to ref 128133
no_role_a no_role_b
objectlinkcanvas 129797 norel
from ref 129157 z 1999 to ref 129285
no_role_a no_role_b
objectlinkcanvas 130053 norel
from ref 129925 z 1999 to ref 129157
no_role_a no_role_b
objectlinkcanvas 130309 norel
from ref 129157 z 1999 to ref 128005
no_role_a no_role_b
preferred_whz 435 373 1
end

@@ -1,37 +1,27 @@
-window_sizes 1619 1028 270 1339 881 71
+window_sizes 1619 1028 270 1339 883 71
 diagrams
 classdiagram_ref 136453 // Session backbone
 631 352 100 4 0 0
-objectdiagram_ref 138885 // ModelAssetRelations
+active objectdiagram_ref 138885 // ModelAssetRelations
 730 488 100 4 0 0
-classdiagram_ref 143877 // Player Entities
-663 654 100 4 0 0
-objectdiagram_ref 144005 // Play Process Structure
-562 424 100 4 0 0
-sequencediagram_ref 145157 // output data exchange
-586 416 100 4 0 0
-active classdiagram_ref 151685 // Player Output
-643 590 92 4 0 0
 end
 show_stereotypes
 selected
 package_ref 129 // lumiera
 open
 package_ref 128005 // design
 classview_ref 128389 // Controller Workings
-class_ref 185221 // Allocation
-class_ref 178565 // DataSink
-class_ref 185349 // Connection
-class_ref 185477 // OutputSlotImpl
-class_ref 178693 // BufferProvider
+package_ref 133637 // Play
 package_ref 132229 // Session
 class_ref 153733 // QueryFocusStack
-classview_ref 128261 // Builder Workings
 usecaseview_ref 128261 // config examples
 package_ref 128389 // RenderEngine
 package_ref 129157 // BackendLayer
 end
 end

@@ -3162,7 +3162,7 @@ __Note__: nothing within the PlacementIndex requires the root object to be of a
 </pre>
 </div>
-<div title="MultichannelMedia" modifier="Ichthyostega" created="200709200255" modified="201202112338" tags="Model design img">
+<div title="MultichannelMedia" modifier="Ichthyostega" created="200709200255" modified="201501051317" tags="Model design img" changecount="1">
 <pre>Based on practical experiences, Ichthyo tends to consider Multichannel Media as the base case, while counting media files providing just one single media stream as exotic corner cases. This may seem counter intuitive at first sight; you should think of it as an attempt to avoid right from start some of the common shortcomings found in many video editors, especially
 * having to deal with keeping a &quot;link&quot; between audio and video clips
 * silly limitations on the supported audio setups (e.g. &quot;sound is mono, stereo or Dolby-5.1&quot;)
@@ -3170,8 +3170,8 @@ __Note__: nothing within the PlacementIndex requires the root object to be of a
 * inability to edit stereoscopic (3D) video in a natural fashion
 !Compound Media
-[&gt;img[Outline of the Build Process|uml/fig131333.png]]
 Basically, each [[media asset|MediaAsset]] is considered to be a compound of several elementary media (tracks), possibly of various different media kinds. Adding support for placeholders (''proxy clips'') at some point in future will add still more complexity (because then there will be even dependencies between some of these elementary media). To handle, edit and render compound media, we need to impose some structural limitations. But anyhow, we try to configure as much as possible already at the &quot;asset level&quot; and make the rest of the proc layer behave just according to the configuration given with each asset.
+{{red{Note 1/2015}}}: various details regarding the model representation of multichannel media aren't fully settled yet. There is a placeholder in the source, which can be considered more or less obsolete
 !!Handling within the Model
 * from a Media asset, we can get a [[Processing Pattern (ProcPatt)|ProcPatt]] describing how to build a render pipeline for this media
@@ -3444,7 +3444,7 @@ While actually data frames are //pulled,// on a conceptual level data is assumed
 As both of these specifications are given by [[Pipe]]-~IDs, the actual designation information may be reduced. Much can be infered from the circumstances, because any pipe includes a StreamType, and an output designation for an incompatible stream type is irrelevant. (e.g. and audio output when the pipe currently in question deals with video)
 </pre>
 </div>
-<div title="OutputManagement" modifier="Ichthyostega" created="201007090155" modified="201112222226" tags="Model Rendering Player spec draft">
+<div title="OutputManagement" modifier="Ichthyostega" created="201007090155" modified="201501051334" tags="Model Rendering Player spec img draft" changecount="5">
 <pre>//writing down some thoughts//
 * ruled out the system outputs as OutputDesignation.
@@ -3465,6 +3465,9 @@ Initially, [[Output designations|OutputDesignation]] are typically just local or
 We should note that in both cases this [[mapping operation|OutputMapping]] is controlled and driven and constrained by the output side of the connection: A viewer has fixed output capabilities, and rendering targets a specific container format -- again with fixed and pre-settled channel configuration ({{red{TODO 9/11}}} when configurting a render process, it might be necessary to pre-compute the //possible kinds of output streams,// so to provide a sensible pre-selection of possible output container formats for the user to select from). Thus, as a starting point, we'll create a default configured mapping, assigning channels in order. This mapping then should be exposed to modification and tweaking by the user. For rendering, this is part of the render options dialog, while in case of a viwer connection, a switch board is created to allow modifying the default mapping.
+[&gt;img[Output Management and Playback|uml/fig143877.png]]
 !Connection to external outputs
 External output destinations are never addressed directly from within the model. This is an design decision. Rather, model parts connect to an OutputDesignation, and these in turn may be [[connected to a viewer element|ViewerPlayConnection]]. At this point, related to the viewer element, there is a mapping to external destination(s): for images, a viewer typically has an implicit, natural destination (read: actually there is a corresponding viewer window or widget), while for sound we use an mapping rule, which could be overridden locally in the viewer.
@@ -3472,6 +3475,7 @@ Any external output sink is managed as a [[slot|DisplayerSlot]] in the ~OutputMa
 &amp;rarr; the OutputManager interface describes handling this mapping association
 &amp;rarr; see also the PlayService
 !the global output manager
 Within the model routing is done mostly just by referring to an OutputDesignation -- but at some point finally we need to map these abstract designations to real output capabilities. This happens at the //output managing elements.// This interface, OutputManager, exposes these mappings of logical to real outputs and allows to manage and control them. Several elements within the application, most notably the [[viewers|ViewerAsset]], provide an implementation of this interface -- yet there is one primary implementation, the ''global output manager'', known as OutputDirector. It can be accessed through the {{{Output}}} façade interface and is the final authority when it comes to allocating and mapping of real output possibilities. The OutputDirector tracks all the OutputSlot elements currently installed and available for output.
@@ -4550,7 +4554,7 @@ We need a way of addressing existing [[pipes|Pipe]]. Besides, as the Pipes and T
 &lt;&lt;tasksum end&gt;&gt;
 </pre>
 </div>
-<div title="PlayProcess" modifier="Ichthyostega" created="201012181714" modified="201306022259" tags="def spec Player img" changecount="1">
+<div title="PlayProcess" modifier="Ichthyostega" created="201012181714" modified="201501051334" tags="def spec Player img" changecount="2">
 <pre>With //play process//&amp;nbsp; we denote an ongoing effort to calculate a stream of frames for playback or rendering.
 The play process is an conceptual entity linking together several activities in the [[Backend]] and the RenderEngine. Creating a play process is the central service provided by the [[player subsystem|Player]]: it maintains a registration entry for the process to keep track of associated entities, resources allocated and calls [[planned|FrameDispatcher]] and [[invoked|RenderJob]] as a consequence, and it wires and exposes a PlayController to serve as an interface and information hub.
@@ -4564,6 +4568,8 @@ The play process is an conceptual entity linking together several activities in
 The Controller is exposed to the client and acts as frontend handle, while the play process body groups and manages all the various parts cooperating to generate output. For each of the participating global pipes we get a [[feed|Feed]] to drive that pipeline to deliver media of a specific kind.
-Right within the play process, there is a separation into two realms, relying on different programming paradigms. Obviously the play controller is a state machine, and similarily the body object (play process) has a distinct operation state. Moreover, the current collection of individual objects hooked up at any given instance is a stateful variable. To the contrary, when we enter the realm of actual processing, operations are carried out in parallel, relying on stateless descriptor objects, wired into individual calculation jobs, to be scheduled as non-blocking units of operation. For each series of consecutive frames to be calculated, there is a descriptor object, the CalcStream, which also links to a specificaly tailored dispatcher table, allowing to schedule the individual frame jobs. Whenever the controller determines a change in the playback plan (speed change, skip, scrubbing, looping, ...), a new CalcStream is created, while the existing one is just used to mark any not-yet processed job as superseded.
+Right within the play process, there is a separation into two realms, relying on different programming paradigms. Obviously the play controller is a state machine, and similarily the body object (play process) has a distinct operation state. Moreover, the current collection of individual objects hooked up at any given instance is a stateful variable. To the contrary, when we enter the realm of actual processing, operations are carried out in parallel, relying on stateless descriptor objects, wired into individual calculation jobs, to be scheduled as non-blocking units of operation. For each series of consecutive frames to be calculated, there is a descriptor object, the CalcStream, which also links to a specificaly tailored dispatcher table, allowing to schedule the individual frame jobs.
+Whenever the controller determines a change in the playback plan (speed change, skip, scrubbing, looping, ...), a new CalcStream is created, while the existing one is just used to mark any not-yet processed job as superseded.
+&amp;rarr; for overview see also OutputManagement
 </pre>
 </div>
 <div title="PlayService" modifier="Ichthyostega" created="201105221900" modified="201202010348" tags="Player spec draft">
@@ -5196,7 +5202,7 @@ config.macros.rssFeedUpdate = {
 //}}}
 </pre>
 </div>
-<div title="RelationClipAsset" modifier="Ichthyostega" created="200710191541" modified="201112222247" tags="design decision img">
+<div title="RelationClipAsset" modifier="Ichthyostega" created="200710191541" modified="201501051315" tags="design decision img" changecount="1">
 <pre>What is the Role of the asset::Clip and how exactly are Assets and (Clip)-MObjects related?
 First of all: ~MObjects are the dynamic/editing/manipulation view, while Assets are the static/bookkeeping/searching/information view of the same entities. Thus, the asset::Clip contains the general configuration, the ref to the media and descriptive properties, while all parameters being &quot;manipulated&quot; belong to the session::Clip (MObject). Besides that, the practical purpose of asset::Clip is that you can save and remember some selection as a Clip (Asset), maybe even attach some information or markup to it, and later be able to (re)create a editable representation in the Session (the GUI could implement this by allowing to drag from the asset::Clip GUI representation to the timeline window)
@@ -5212,9 +5218,11 @@ In either case, we have to solve the ''problem of clip asset proliferation''
 !!multiplicity and const-ness
 The link between ~MObject and Asset should be {{{const}}}, so the clip can't change the media parameters. Because of separation of concerns, it would be desirable that the Asset can't //edit// the clip either (meaning {{{const}}} in the opposite direction as well). But unfortunately the asset::Clip is in power to delete the clip-MO and, moreover, handles out a smart ptr ([[Placement]]) referring to the clip-MO, which can (and should) be used to place the clip-MO within the session and to manipulate it consequently...
+[&gt;img[Outline of the Build Process|uml/fig131333.png]]
 At first sight the link between asset and clip-MO is a simple logical relation between entities, but it is not strictly 1:1 because typical media are [[multichannel|MultichannelMedia]]. Even if the media is compound, there is //only one asset::Clip//, because in the logical view we have only one &quot;clip-thing&quot;. On the other hand, in the session, we have a compound clip ~MObject comprised of several elementary clip objects, each of which will refer to its own sub-media (channel) within the compound media (and don't forget, this structure can be tree-like)
-{{red{open question:}}} do the clip-MO's of the individual channels refer directly to asset::Media? does this mean the relation is different from the top level, where we have a relation to a asset::Clip??</pre>
+{{red{open question:}}} do the clip-MO's of the individual channels refer directly to asset::Media? does this mean the relation is different from the top level, where we have a relation to a asset::Clip??
+{{red{Note 1/2015}}} several aspects regarding the relation of clips and single/multichannel media are not yet settled. There is a preliminary implementation in the code base, but it is not sure yet how multichnnel media will actually be modelled. Currently, we tend to treat the channel multiplicity rather as a property of the involved media, i.e we have //one// clip object.</pre>
 </div>
 <div title="RenderEngine" modifier="Ichthyostega" created="200802031820" modified="201202112356" tags="def">
 <pre>Conceptually, the Render Engine is the core of the application. But &amp;mdash; surprisingly &amp;mdash; we don't even have a distinct »~RenderEngine« component in our design. Rather, the engine is formed by the cooperation of several components spread out over two layers (Backend and Proc-Layer): The [[Builder]] creates a network of [[render nodes|ProcNode]], the [[Scheduler]] triggers individual [[calculation jobs|RenderJob]], which in turn pull data from the render nodes, thereby relying on the [[Backend's services|Backend]] for data access and using plug-ins for the actual media calculations.