...which uncovers further deeply nested problems,
especially when referring to non-copyable types.
Thus we need to construct a common type that can be used
both to refer to the source elements and to the expanded elements,
use this common type as the result type, and also attempt to
produce better diagnostic messages on type mismatch....
To complete the mock setup, the next step would be to extend the GenNode-based spec language
to allow defining prerequisite Mock-JobTickets. Setting this up seems rather straightforward --
however, defining a simple testcase to cover this extension runs into surprisingly tricky problems...
- for one, the singleValIterator from Itertools has serious difficulties handling references
- but even more surprisingly, it seems impossible to make the "prerequisites iterator"
fit into the Tree-Explorer framework (which I intend to use as replacement
for the monadic approach)
after some extended analysis of generic types and template instances,
it seems that TreeExplorer as such is not the primary problem; rather,
there is a conceptual mismatch somewhere deep down in Itertools or Iter-Adapter
By reasoning and analysis I conclude that the differentiation into
multiple channels is likely misplaced in JobTicket; it belongs rather
into the Segment and should provide a suitable JobTicket for each ModelPort
Handling of prerequisites also needs to be reshaped entirely after
switching to a pipeline builder for the Job-planning pipeline; as
a preliminary access point, just add an iterator over the immediate
prerequisites, thereby shifting the exploration mechanism entirely
out of the JobTicket implementation
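Hypothetical shape of that access point (names purely illustrative, not the actual API):

    #include <vector>

    class JobTicket
      {
        std::vector<JobTicket*> prereqs_;    // only the immediate prerequisites

      public:
        auto const&
        getPrerequisites()  const
          {
            return prereqs_;   // the caller iterates; JobTicket itself
          }                    // no longer drives any recursive exploration
      };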
Testcase: A simple Segmentation with a single, bounded Segment
As an aside, figured out how to unpack an iterator so as to
tie a fixed number of references through a structured binding:
auto const& [s1,s2,s3] = seqTuple<3> (mockSegs.eachSeg());
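A minimal sketch how such a helper could be implemented (assuming the iterator yields
references to elements living elsewhere; `pickNext` and `Repeat` are made-up helpers,
not the actual library code):

    #include <tuple>
    #include <utility>
    #include <cstddef>

    template<std::size_t, class T>
    using Repeat = T;                   // expand one type N times into a pack

    template<class IT>
    auto&
    pickNext (IT& iter)                 // yield current element, then advance
    {
        auto& elm = *iter;
        ++iter;
        return elm;
    }

    template<class IT, std::size_t...idx>
    auto
    buildTuple (IT& iter, std::index_sequence<idx...>)
    {
        using Elm = decltype(pickNext(iter));
        return std::tuple<Repeat<idx,Elm>...> {pickNext(iter)...};
    }

    template<std::size_t N, class IT>
    auto
    seqTuple (IT&& iter)
    {
        return buildTuple (iter, std::make_index_sequence<N>{});
    }

The crucial point is the *braced* initialisation of the tuple, which guarantees
left-to-right evaluation of the `pickNext` calls; with parentheses, the order
would be unspecified.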
...now able to build a mock segmentation which issues dummy jobs,
and is wired so as to verify that the right job is invoked for each segment.
And this allows us to build and verify the Dispatcher,
without being able to invoke actual render jobs yet.
There are 12 distinct cases regarding the relative orientation of two intervals.
The Segmentation::splitSplice() operation shall insert a new Segment
and adjust / truncate / expand / split / delete existing segments
so as to retain the *Invariant* (a seamless segmentation covering
the complete time axis)
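To illustrate the intended semantics, a deliberately simplified sketch (plain integer
times, a std::list of segments kept ordered and gapless; the actual implementation differs):

    #include <list>
    #include <cstdint>

    using Time = int64_t;                  // stand-in for lib::time values

    struct Segment { Time start, end; };   // half-open interval [start,end)

    void
    splitSplice (std::list<Segment>& segs, Time b1, Time b2)
    {
        auto pos = segs.begin();
        while (pos != segs.end())
          {
            if (pos->end <= b1)   { ++pos; continue; }  // entirely before: keep
            if (pos->start >= b2) break;                // entirely after: done
            if (pos->start < b1 and pos->end > b2)
              {                                         // new one lies inside:
                segs.insert (pos, Segment{pos->start, b1});  // split off front part
                pos->start = b2;                             // rest becomes back part
                break;
              }
            if (pos->start < b1) { pos->end = b1; ++pos; continue; } // truncate at end
            if (pos->end > b2)   { pos->start = b2; break; }         // truncate at start
            pos = segs.erase (pos);                     // fully covered: delete
          }
        segs.insert (pos, Segment{b1, b2});             // splice in the new segment
    }   // invariant restored: ordered, seamless, covering the whole axis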
- how to pass in a specification given as GenNode
- now this might be translated into a MockJobTicket allocated in the MockSegmentation
Unimplemented: actually build the Segment with suitable start/end time
right now we're lacking a complete working implementation of render node invocation,
and thus the Dispatcher implementation can only be verified with the help
of mocked jobs. However, at least a preliminary implementation of tagging the
invocation instance is available, and thus we're able to verify that
a given job instance indeed belongs to and is "backed" by a specific JobTicket.
This is a prerequisite for building up a (likewise mocked) Fixture data structure,
and this in turn was meant to form the basis for attacking an actual Scheduler
implementation, followed by a real render node invocation.
- can now create a Job from JobTicket::NIL
- on invocation this Job will do nothing
Only when the first real output backend is implemented
can we decide if this simplistic implementation is enough,
or if an empty output must be explicitly generated...
* using a simplified preliminary implementation of hash chaining (see #1293)
* simplistic implementation of hashing for time values (half-rotation)
* for now just hashing the time into the upper part of the LUID
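A sketch of that "half-rotation", taking the term literally (the actual code for #1293
may differ in detail):

    #include <cstdint>

    // rotate the 64-bit micro-tick count by half its width, so the rapidly
    // changing low-order bits of the time end up in the upper half word --
    // which can then be planted into the upper part of the LUID
    inline uint64_t
    hashTime (int64_t microTicks)
    {
        uint64_t t = uint64_t(microTicks);
        return (t << 32) | (t >> 32);
    }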
Maybe we can even live with that implementation for some time,
depending on how important uniform distribution of hash values is
for proper usage of the frame cache.
Needless to say, various further fine points need more consideration,
especially questions of portability (32bit anyone?). Moreover, since
frame times are typically quantised, the search space for the hashed
time values is drastically reduced; conceivably we should rather
research and implement a good hash function for 128bit and then combine
all information into a single hash key....
...using the MockJobTicket setup as a point of reference,
since the actual invocation of render nodes will only be drafted
later in this "Vertical Slice" integration effort...
- introduce a JobTicket::NOP (null-object pattern)
- assuming that the function splitSplice() will always retain complete coverage
Remark:
`Fixture::getPlaylistForRender()` is a leftover from the very early implementation drafts.
This function was more or less based on the way Cinelerra works; it is clear by now
that Lumiera cannot possibly work this way, given that we'll build a low-level model
and dispatch precompiled render jobs....
The Fixture and the low-level model backbone deserve a distinct namespace on their own.
Since it's built by the Builder from the Session contents, and also used by the frame dispatch,
we can expect dependence on some types from Steam-Layer, and thus this namespace
rather needs to reside in Steam-Layer, while the actual low-level Model
might become part of Vault-Layer, creating a hierarchy of data structures.
(Remark: likely also the session related namespaces will need a reorganisation)
The idea is to escape a "design deadlock" by using a test-driven prototype
implementation of the data structure, to back further development
of the Dispatcher and Scheduler implementation, which can then be used
to gradually elaborate and switch over to an actual implementation
data structure.
...requires a first attempt towards defining a `JobTicket`.
This turns out quite tricky, due to using those `LinkedElements`
(an intrusive singly linked list), which requires all added records
to actually live elsewhere. Since we want to use a custom allocator
later (the `AllocationCluster`), this boils down to allocating those
records only when about to construct the `JobTicket` itself.
What makes matters even worse: at the moment we use a separate spec
per Media channel (maybe these specs can be collapsed later on).
And thus we need to pass a collection -- or better, an iterator
of raw specs, which in turn must reveal yet another nested
sequence for the prerequisite `JobTickets`.
Anyhow, now we're able at least to create an empty `JobTicket`,
backed by a dummy `JobFunctor`....
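Schematically, the constraint looks like this (names illustrative, not the actual
LinkedElements API):

    // an intrusive node carries its own `next` pointer; the list itself never
    // allocates, so each record must already live elsewhere -- e.g. within an
    // AllocationCluster -- before it can be hooked into the JobTicket
    struct ChannelSpec
      {
        ChannelSpec* next = nullptr;   // intrusive link managed by the list
        // ...channel data, nested sequence of prerequisite tickets...
      };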
Looks like we'll actually retain and use this low-level solution
in cases where we just cannot afford heap allocations but need
to keep polymorphic objects close to one another in memory.
Since singly linked lists are filled by prepending, it is rather
common to need the reversed order of elements for traversal,
which can be achieved in linear time.
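The familiar in-place re-linking, as a sketch:

    // one linear pass re-links a singly linked chain into reversed order
    template<class N>
    N*
    reverse (N* head)
    {
        N* result = nullptr;
        while (head)
          {
            N* rest = head->next;
            head->next = result;    // prepend current node to the result chain
            result = head;
            head = rest;
          }
        return result;
    }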
And while we're here, we can modernise the templated emplacement functions
- build the reworked Job-planning pipeline more or less from scratch
- back that with mocked `Dispatcher` and `JobTicket`
- then transfer this into a `RenderDrive`, which can be tested as well
- could continue then to a `CalcStream` integration test....
- decision: the Monad-style iteration framework will be abandoned
- the job-planning will be recast in terms of the iter-tree-explorer
- job-planning and frame dispatch will be disentangled
- the Scheduler will deliberately offer a high-level interface
- on this high-level, Scheduler will support dependency management
- the low-level implementation of the Scheduler will be based on Activity verbs
The APIs for time quantisation were drafted in an early stage of the project
and then never followed up. Especially Grid::gridAlign has no
real-world usage yet, and is only exercised in some tests.
When looking at QuantiserBasics_test, I was puzzled and led astray,
since this function suggests that it materialises a continuous time into
a quantised time -- which it doesn't (there is another dedicated
function Quantiser::materialise() to that end); so, without engaging
in the discussion whether this function is of any use, I'll hereby
choose a name better reflecting what it does.
...in an attempt to clarify why numerous cross links are not generated.
In the end, this attempt was not very successful, yet I could find some breadcrumbs...
- file comments generally seem to have a problem with auto link generation;
only fully qualified names seem to work reliably
- cross links to entities within a namespace do not work,
if the corresponding namespace is not documented in Doxygen
- documentation for entities within anonymous namespaces
  must be explicitly enabled. Of course this only makes sense
for detailed documentation (but we do generate detailed
documentation here, including implementation notes)
- and the notorious problem: each file needs a valid @file comment
- the hierarchy of Markdown headings must be consistent within each
  documentation section. This also applies to individual documented
  entities. Basically, there must be a level-one heading (prefix "#"),
  otherwise all headings will just disappear...
- sometimes the doc/devel/doxygen-warnings.txt gives further clues
the reason for the failure, as it turned out,
is that 'noexcept' is part of the function signature since C++17.
And since typically a STL container has const and non-const variants
of the begin() and end() function, the match to a member function pointer
became ambiguous when probing with a signature without 'noexcept'.
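The core of the problem, shown in isolation (minimal example, not the actual detector code):

    #include <type_traits>

    struct S
      {
        int get();
        int getNX() noexcept;
      };

    // since C++17, noexcept participates in the (member) function type,
    // so a probe pinned to the non-noexcept signature no longer matches
    static_assert (not std::is_same_v<decltype(&S::get), decltype(&S::getNX)>,
                   "noexcept is part of the function signature since C++17");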
However, we deliberately want to support "any STL-container-like" types,
and this IMHO should include types with a possibly throwing iterator.
The rationale is that sometimes we want to expose some element *generator*
behind a container-like interface.
At this point I investigated whether we can emulate something
in the way of a Concept -- i.e. rather than checking for the presence
of some functions on the interface, better try to cover the necessary
behaviour, like in a type class.
Unfortunately, while doable, this turns out to become quite technical;
and this highlights why the C++20 concepts are such an important addition
to the language.
So for the time being, we'll amend the existing solution
and look ahead to C++20
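For comparison, the intended check written as a C++20 concept (hypothetical sketch,
obviously not part of the current C++17 codebase):

    #include <iterator>

    template<class CON>
    concept ContainerLike = requires (CON elms)
      {
        elms.begin();
        elms.end();
        requires std::input_or_output_iterator<decltype(elms.begin())>;
      };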
as it turns out, "almost" the whole codebase compiles in C++17 mode,
with the exception of two metaprogramming-related problems:
- our "duck detector" for STL containers does not trigger anymore
- the Metafunction to dissect function signatures (meta::_Fun) flounders
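The second problem boils down to the same C++17 change; a sketch of the kind
of fix required (simplified -- the real meta::_Fun is more elaborate):

    template<typename SIG>
    struct _Fun;                        // dissect a function signature

    template<typename RET, typename...ARGS>
    struct _Fun<RET(ARGS...)>
      {
        using Ret = RET;
      };

    // since C++17, `RET(ARGS...) noexcept` is a distinct type and thus
    // needs its own partial specialisation, otherwise it fails to match
    template<typename RET, typename...ARGS>
    struct _Fun<RET(ARGS...) noexcept>
      : _Fun<RET(ARGS...)>
      { };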
When drafting the time handling framework some years ago,
I foresaw the possible danger of mixing up numbers relating
to fractional seconds with other plain numbers intended as
frame counts or as micro ticks. Thus I deliberately picked
an incompatible integer type for FSecs = boost::rational<long>
However, using long is problematic in itself, since its actual
bit length is not fixed, and especially on 32bit platforms long
is quite surprisingly defined to be the same as int.
Meanwhile, using the new C++ features, I have blocked
pretty much any possible implicit conversion path, requiring
explicit conversions in the relevant ctor invocations. So,
after weighing the alternatives, FSecs is now defined
as boost::rational<int64_t>.
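The gist of the change:

    #include <boost/rational.hpp>
    #include <cstdint>

    // `long` is only guaranteed 32 bits (and on 32-bit platforms it is
    // just that), while int64_t nails down the representation portably
    using FSecs = boost::rational<int64_t>;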
This was prompted by a test failing under Boost-1.65 (--> see #294)
When reviewed now, the whole idea of testing Steam-Layer Commands for
equivalence feels a bit sketchy.
Comparing just the command ''identity'' alone seems sufficient,
i.e. testing whether a command-ID is associated with the same backend-handle
and thus the same functor binding.
- we got occasional hangups when waiting for disabled state
- the builder was not triggered properly, sometimes redundant, sometimes without timeout
As it turned out, the loop control logic is more like a state machine,
and the state variables need to be separated from the externally influenced variables.
As a consequence, the inChange_ variable was not calculated properly when disabled in a race,
and then the loop went into an infinite wait state, without propagating this to
the externally waiting client, which caused the deadlock.
basically we can pick just any convention here, and so we should pick the convention in a way
that makes most sense informally, for a *human reader*. But what we previously did was to pick
the condition so as to make it simple for the programmer in some situations....
With the predictable result: even with the disappointingly small number of usages we have up to now,
we got that condition backwards several times.
OK, so from now on!!!
Time::NEVER == Time::MAX, because "never" is as far as possible into the future
- most notably the NOBUG logging flags have been renamed now
- but for the configuration, I'll stick to "GUI" for now,
since "Stage" would be bewildering for an occasional user
- in a similar vein, most documentation continues to refer to the GUI
...it should have been explicit from the start, since there is no point
in converting an EntryID into a plain flat string without further notice
this became evident, when the compiler picked the string overload on
MakeRec().genNode(specialID)
...which is in compliance with the rules, since string is a direct match,
while BareEntryID would be a (slicing) upcast. However, obviously we
want the BareEntryID here, and not an implicit string conversion,
thereby discarding the special hash value hidden within the ID
As it turns out, using the functional-notation form of conversion
with *parentheses* will fall back on a C-style (wild, reinterpreting) cast
when the target type is *not* a class -- as is the case in question here, where
it is a const& to a class. By contrast, using *curly braces* will always
attempt to go through a constructor, and thus fail as expected when there is
no conversion path available.
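Demonstrated in isolation (a sketch; the alias stands in for BareEntryID const&):

    struct Base { long hash; };
    struct Unrelated { double x; };

    using BaseRef = Base const&;

    void
    demo (Unrelated const& u)
    {
        BaseRef r1 = BaseRef(u);   // compiles: with parentheses this degenerates
                                   // into a C-style cast -- a reinterpret_cast here
    //  BaseRef r2 = BaseRef{u};   // ERROR: no conversion path -- fails as expected
        (void)r1;
    }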
I wasn't aware of that pitfall. I noticed it because the recently introduced
class TimelineGui lacked a conversion operator to BareEntryID const&, and the code just
happily used the TimelineGui object itself, silently performing a reinterpret_cast into BareEntryID.
I am fully aware this change has some far-reaching ramifications.
Effectively I am hereby abandoning the goal of a highly modularised Lumiera,
where every major component is mapped over the Interface-System. This was
always a goal I accepted only reluctantly, and by now years of experience
confirm my reservations: it would cost us a lot of effort just for the
sake of being "sexy".