hugin-devs team mailing list archive
Message #03385
Re: [Bug 927509] Re: enfuse feature proposal: weighting 'technical' qualities
On 28.02.2012 14:30, Christoph Spiel wrote:
>
> Adding all your technical-merit parameters would put a lot of baggage
> into Enblend and Enfuse. The next day someone has another new idea
> what to put into his/her weighting functions and we need even more
> code. This seems unmaintainable over the long haul -- in particular
> as we have almost no "developer resources".
I agree. So let's discuss how we can achieve these goals - and more - in
an efficient, transparent way which is easy to implement.
> I have two possible (non-mutually exclusive) solutions in mind. By
> the way, each of them would lend itself to an interesting GSoC
> project.
I'm in two minds about GSoC projects. I feel there's a good possibility
they produce monolithic, underdocumented code which later on can't be
maintained because the person doing the project has long moved on to do
something different. Just a prejudice, probably...
> 1. Implement a dynamic-load interface for Enblend and Enfuse.
> 1.1 Make Enblend/Enfuse extensible with the help of the dlopen(3)
> interface to the dynamic linking loader.
I am a Python person myself, so this is where I get my ideas about
handling dynamic loading. My first idea for dealing with the situation
was to make enfuse/enblend Python modules. Implementing further Python
modules that operate on the data, once they're in Python space, is a
well-established process - no need to reinvent the wheel.
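Just to make that concrete: with such a (purely hypothetical) binding, a
user-defined weighting function could be nothing more than a plain Python
function over numpy arrays. The module name and the fuse() call in the
comment below are invented for illustration; the weighting shown is just
the usual Gaussian well-exposedness measure around mid-grey.

    import numpy as np

    def exposure_weight(rgb, metadata):
        """What a user-supplied weighting function for a hypothetical
        enfuse Python binding might look like: one input image as a numpy
        array plus a metadata dict in, a per-pixel weight map out."""
        # metadata is unused here, but a binding could pass EXIF data etc.
        luminance = rgb.astype(np.float64).mean(axis=-1) / 255.0
        # the usual Gaussian well-exposedness measure around mid-grey
        return np.exp(-((luminance - 0.5) ** 2) / (2 * 0.2 ** 2))

    # A hypothetical binding would then be driven roughly like
    #   fused = enfuse.fuse(images, weight=exposure_weight)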
> 1.2 Insert one or more interface classes into Enblend/Enfuse through
> which dynamically loaded extensions can access _all_ data
> (including e.g. meta data and ICC profiles) of _any_ input image
> and the output image.
This access is easily implemented via a Python module, as I've
demonstrated with the Python interface I've written for hugin. In fact,
I didn't actually have to code anything; I merely had to tell SWIG to
process the relevant C++ headers. If the C/C++ code already has these
data, letting Python have them isn't hard. Of course, a deliberately
coded set of interface classes is more beautiful than the brute-force
approach of simply wrapping all the extant objects, which usually
results in a bit of bloat because there's more stuff than is needed.
I wonder, though, whether the infrastructure to pass around these data
isn't already there in other Python modules, so that it would just be a
matter of using existing code for the purpose.
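For the pixel data and at least part of the metadata, that infrastructure
does exist already; a few lines of PIL and numpy get at most of what a
weighting function would want to see. This is just an illustration of
what is on the shelf, not anything tied to enfuse, and it assumes one of
the usual TIFF inputs is lying around:

    import numpy as np
    from PIL import Image

    im = Image.open("input_0001.tif")     # any of the input images
    pixels = np.asarray(im)               # the pixel data as a numpy array
    metadata = dict(im.info)              # whatever metadata PIL exposes
    icc = im.info.get("icc_profile")      # raw ICC profile bytes, if present
    print(pixels.shape, pixels.dtype, sorted(metadata))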
> 1.3 Allow the dynamic extensions to register call-back functions.
> 1.3.1 Enfuse: the user-defined extensions are called for each pixel
> in each input file.
Hmmm... a first step, but I fear it's too atomistic. If I'm not
mistaken, function calls are reasonably expensive, so having to execute
one per pixel would slow things down. At least the inner loop has to
move into the called function: then, with a w x h image, you only have,
say, h function calls instead of w x h. It would be a terrible waste to
pass in all the intensity (or whatever) values by value and return a
value; much better to pass in pointers. That also allows for different
data types: you can just pass in a char*, a stride and a type tag, and
dispatch to the relevant C routines. If you pass around values, you have
to use variants for every possible data type.
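To show the shape of such an interface, here is a minimal sketch
(sticking with Python, via ctypes) of what a row-wise callback prototype
could look like. All the names and type tags are made up for
illustration; nothing of the sort exists in enblend/enfuse today.

    import ctypes
    import numpy as np

    # hypothetical type tags describing the sample type of a row
    TYPE_UINT8, TYPE_UINT16, TYPE_FLOAT32 = 0, 1, 2
    DTYPE_OF_TAG = {TYPE_UINT8: np.uint8,
                    TYPE_UINT16: np.uint16,
                    TYPE_FLOAT32: np.float32}

    # one call per image row instead of one per pixel: the host passes raw
    # pointers plus a stride and a type tag, and the extension dispatches
    # on the tag; no per-pixel call overhead, no variant type needed
    ROW_WEIGHT_CALLBACK = ctypes.CFUNCTYPE(
        None,              # nothing returned; weights go out through out_ptr
        ctypes.c_void_p,   # in_ptr: first sample of the input row
        ctypes.c_void_p,   # out_ptr: first weight of the output row
        ctypes.c_size_t,   # width: number of pixels in the row
        ctypes.c_size_t,   # stride: bytes between successive samples
        ctypes.c_int,      # type_tag: key into DTYPE_OF_TAG / a C dispatch table
    )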
And what of the functions which access neighbourhoods of pixels, like
the contrast and entropy weighting? I suppose you'd calculate them for
the whole image up front so the per-pixel values are handy when you need
them.
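If so, the precomputation itself is cheap. As an illustration (and
emphatically not enfuse's actual contrast measure), a per-pixel local
standard deviation for the whole image can be had in a couple of
filtered passes with scipy:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_stddev(image, size=5):
        """Standard deviation over a size x size window, for every pixel.

        Computed once for the whole image, so a weighting callback can
        simply look the value up instead of walking a neighbourhood per
        pixel. Illustration only."""
        img = np.asarray(image, dtype=np.float64)
        mean = uniform_filter(img, size)
        mean_sq = uniform_filter(img * img, size)
        return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))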
> 1.3.2 Enblend: the user-defined extensions are called for the
> lightness levels (only 2^8 calls necessary), contrast levels,
> etc.
I've not looked into enblend's code, so I'm not really sure how it does
what it does ;-)
2^8 sounds more reasonable than a few million, though.
> 1.3.3 Supply the user with a script that compiles and links simple
> extensions in one pass without detailed knowledge of the
> underlying build process.
Now there's a template for this sort of behaviour, and it's mathmap. An
excellent concept, which was implemented nicely and since then seems to
have been trundling along at half impulse power, which is really a
shame. I've always wondered whether it isn't worth looking into for
salvageable parts, but the Python approach has always been more
attractive to me.
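That said, the helper script from 1.3.3 could itself stay very small. A
minimal sketch, assuming g++ on a Unix-ish box (the real flags would of
course have to match the way enblend/enfuse themselves were built):

    #!/usr/bin/env python
    """Hypothetical one-pass compile-and-link helper for user extensions."""
    import subprocess
    import sys

    def build_extension(source, output=None):
        output = output or source.rsplit(".", 1)[0] + ".so"
        cmd = ["g++", "-O2", "-fPIC", "-shared", source, "-o", output]
        subprocess.check_call(cmd)   # fail loudly if the compile breaks
        return output

    if __name__ == "__main__":
        print(build_extension(sys.argv[1]))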
> 1.4 Prospects
> 1.4.1 Fast, because the user code gets compiled.
This is the way to go. If it's slow, no one will use it. You may have
one in a thousand users who actually implement code because they want or
need it; the 999 others just want to use prefabricated stuff, and they
won't use it if it's slow.
> 1.4.2 The user must compile her extensions with a compiler (and
> compiler settings) that are compatible to the binaries.
> However, 1.3.3. helps here.
> 1.4.3 Supplying new command-line options with a dynalinked extension
> could turn out "interesting" to implement correctly.
"interesting" indeed :)
I suppose all of this can be done in C/C++, but it's unnecessarily
complicated. Dynamically supplying new command-line options in Python,
on the other hand, is a snap.
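For instance, with argparse a plugin only needs to be handed the parser
(or a group on it) to register whatever options it wants. Everything
below is a hypothetical sketch, not existing enfuse/enblend code:

    import argparse

    def register_options(parser):
        """Would be called by the (hypothetical) host once per loaded plugin."""
        group = parser.add_argument_group("focal-length weighting")
        group.add_argument("--focal-length-exponent", type=float, default=1.0,
                           help="raise the focal-length weight to this power")

    parser = argparse.ArgumentParser(prog="enfuse-py")
    register_options(parser)   # the host would do this for every plugin
    args = parser.parse_args(["--focal-length-exponent", "2.0"])
    print(args.focal_length_exponent)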
> 2. Integrate a Guile interpreter into Enblend and Enfuse.
> 2.1 Use a Guile interpreter to dynamically extend Enblend and Enfuse.
> 2.2 Like 1.2.
> 2.3. Supply "enough" hooks for Scheme user functions to allow the user
> to implement her own weighting strategies and much more.
> 2.4 Prospects
> 2.4.1 Guile has been developed for exactly this use case: extending
> existing applications.
> 2.4.2 Easy to use; Scheme is simple to learn.
(giggle) Oh noooo... please, not a Lisp dialect. Lisp gives me a
headache. I can think of a few projects using Lisp or Scheme for
plugins, but I'd say the common approach is to implement a Python
interface as well, because the general public just won't go for Scheme,
'easy to learn' as it may be. Best to use something people know already.
> 2.4.3 No extra tools required.
> 2.4.4 Many internals of Enblend and Enfuse can be made
> configurable with user-accessible Guile functions.
> 2.4.5 Probably slow; certainly slower than 1.
Indeed. It would be fine for an academic pet project which you run a
few times on a supercomputer, but this should really be usable by the
general public on a day-to-day basis.
> 2.4.6 Some interpreted user functions could be called from parallel
> regions of the C++-code. This certainly raises "interesting"
> problems with races, deadlocks, and of course performance of an
> interpreted language in such an environment.
I can't comment here, because I'm as yet blissfully ignorant of the
inner workings of enXXX.
Kay
--
https://bugs.launchpad.net/bugs/927509
Title:
enfuse feature proposal: weighting 'technical' qualities
Status in Enblend:
Triaged
Bug description:
Hi all!
Enfuse assesses the intensity values of corresponding pixels in the
set of source images, or values derived from intensity values. Yet at
times, it would be desirable to look at qualities which have nothing
to do with intensity-related values. You may ask what these qualities
might sensibly be, so let me propose a few:
- focal length of the source image
when stacking images done with different lenses, if priority is given
to an image taken with a longer lens, forcing patches with higher
resolution into a lower-res image becomes simple. Often only a section
of the target image is covered with the longer lens - like a horizon
sweep with a standard wide angle lens which is to be layered on top of
a fisheye set. In this situation, actual blending of the low-res and
high-res content is undesirable; with a steep weighting function or a
hard mask, only the high-res content would be used where it is
available. This would make the process of layering higher-res
content simple and much more convenient than having to deal with masks
and layers in an image processor further down the line, yet may
provide perfectly adequate results.
- other photographic parameters
analogous to focal length, other parameters spring to mind which might
be used for prioritization, such as exposure time and aperture
- additional band information
weighting might have already been derived from some external
mechanism. This weight might have been stored in a band of the image -
be it an additional band specifically added for this purpose, or, for
example, the alpha channel coerced into this function.
- sequence in the list of input images
currently, weighting is independent of the order of the input images.
Yet at times it might be desirable to give more weight to some images
than to others based on considerations which needn't concern enfuse.
Giving weight to argument order would be a simple way of allowing
this. Alternatively one might use
- explicitly modified weights
currently, weighting is 'egalitarian' insofar as a pixel which has
'more' of a certain quality will score 'better', and only global
statistical parameters can be used to modify this behaviour. At times
one might wish to 'cheat' and simply prefer some images over others.
Passing explicit weight-modifying factors on the command line could
provide for such a feature.
- distance from nearest transparent pixel
when blending in patches, this parameter could be used to effect a
smooth transition into the content of the patch, like a feathering (see
the sketch after this list).
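A sketch of how such a feathering weight could be computed from the
alpha channel, assuming numpy/scipy and that alpha == 0 marks the
transparent pixels:

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def feather_weight(alpha, radius=50.0):
        """Weight in [0, 1] that rises with the distance to the nearest
        transparent pixel, giving a smooth fade at the patch border."""
        distance = distance_transform_edt(alpha > 0)
        return np.clip(distance / radius, 0.0, 1.0)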
So much for my bit of brainstorming. I'm certain that the concept of
'technical' weighting can open up new possibilities for enfuse, making
it an even more versatile tool. I'd be curious to see more ideas
following this template, as I'm sure there must be more.
You may have noticed that some of my proposals would conventionally be
seen to belong to the blending domain rather than to the fusing
domain. In fact both might benefit from relaxing the boundaries.
Enblend would do well to offer parameters to prioritize certain
content. Then again, the area where I have most missed such
functionality is the insertion of higher-res patches, which is
cumbersome and circuitous in my current workflow.
Kay