
hugin-devs team mailing list archive

[Bug 883208] Re: cpfind mode for gigapixel panoramas.

 

Attached is the patch that implements matrix mode (shooting left-to-right rows, bottom-up (*)).
It also implements snake mode, where left-to-right sweeps alternate with right-to-left ones.

These modes simply cause a second "neighbor" to be checked for matches.
The neighbor is predicted using the user-supplied row pitch.
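For reference, a minimal sketch of that neighbor prediction (function names are mine, not the patch's; "pitch" is the images-per-row count, images numbered from 0):

```c
/* Matrix mode: every row runs the same direction, so the image one
 * row up from image i is simply i + pitch. */
int predict_neighbor_matrix(int i, int pitch)
{
    return i + pitch;
}

/* Snake mode: alternate rows are reversed, so image i pairs with the
 * mirrored position in the next sweep. */
int predict_neighbor_snake(int i, int pitch)
{
    int row = i / pitch;            /* which sweep we are in         */
    int col = i % pitch;            /* position within the sweep     */
    int mirrored = pitch - 1 - col; /* next sweep runs the other way */
    return (row + 1) * pitch + mirrored;
}
```

With a pitch of 30, the snake prediction pairs image 29 (end of the first sweep) with image 30 (start of the reversed second sweep), matching the constant-sum behavior described in the bug report below.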

Testing: 
I now get a good number of matches in 10 to 15 minutes on a panorama of just over 200 images. Simple linear mode would check about 200 image pairs for a match; the current snake mode tests about 400 pairs, whereas the "all pairs" mode would have had to check about 20,000 pairs. As the pairing is the most time-consuming step, that would have taken around 12 hours.
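The arithmetic behind those (rounded) pair counts is simply:

```c
/* Back-of-the-envelope pair counts for n images (illustration only,
 * not code from the patch). */
long pairs_linear(long n) { return n - 1; }           /* each image vs its predecessor   */
long pairs_snake(long n)  { return 2 * (n - 1); }     /* roughly: linear + one vertical
                                                         neighbor per image              */
long pairs_all(long n)    { return n * (n - 1) / 2; } /* every unordered pair            */
```

For n = 200 this gives 199, 398, and 19,900 pairs respectively.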

Am I correct in thinking that the older "multirow" mode runs a
preliminary layout step on the panorama and uses those results to
determine which pairs overlap?

In my case the Hugin optimize step takes at least an hour on this pano. (Just in: with the extra control points offered by snake mode it only takes 40 minutes!) Not something you easily wait around for.
Moreover, due to the long sweeps, small alignment errors accumulate: the strips of matching images end up showing a lot of overlap where there is none, and fail to overlap where they should.

Todo: 
- implement checking a "bigger number of images surrounding the predicted image", just like "linear" mode does.
- implement automatic detection of the "pitch" in matrix or snake mode.
- implement dynamic adjustment of the pitch in matrix or snake mode.


The user interface has changed: there is now a matchmode parameter that accepts "all", "linear", "multirow", "matrix" or "snake".



(*) Left-to-right can of course be right-to-left and bottom-up can be
top-down. It doesn't matter for the matcher. Similarly, you can shoot
columns and then move left or right.

** Patch added: "rewcpfind.patch"
   https://bugs.launchpad.net/hugin/+bug/883208/+attachment/2581664/+files/rewcpfind.patch

-- 
You received this bug notification because you are a member of Hugin
Developers, which is subscribed to Hugin.
https://bugs.launchpad.net/bugs/883208

Title:
  cpfind mode for gigapixel panoramas.

Status in Hugin - Panorama Tools GUI:
  Triaged

Bug description:
  I have built my own "gigapan" hardware. I made a "test shot" with only
  200 images.

  The camera swept across the FOV snake-mode left-to-right, up one row
  and then right-to-left.

  CPfind found matches between most adjacent images. Whenever the
  camera went UP, the control points are mostly in the upper half of
  the lower-numbered image and in the lower half of the higher-numbered
  image.

  Once this happens, the sum of the two image numbers indicates that,
  for up to thirty images further on (I have horizontal sweeps of 30
  images), pairs with that same sum of image numbers should be tested.

  For example, when image 30 and 31 match vertically, then the sum is 61
  and we should try to match 32-29, 33-28, 34-27 .... 60-1. Next we'll
  get a vertical match again between 60 and 61, the sum becomes 121, and
  we will have to match 62-59, 63-58 ... 90-31.
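The constant-sum pairing in that example can be sketched like this (my own illustration, not code from cpfind):

```c
/* Once images a and a+1 match vertically, every pair whose indices
 * add up to sum = 2*a + 1 is a candidate, until the lower index runs
 * out of the previous sweep. Writes the candidate pairs into
 * hi[]/lo[] and returns their count; rowlen is the number of images
 * per horizontal sweep. */
int expand_sum_rule(int a, int rowlen, int *hi, int *lo)
{
    int sum = 2 * a + 1; /* a matched a+1, so the constant sum is a + (a+1) */
    int n = 0;
    for (int h = a + 2; h <= a + rowlen && sum - h >= 1; h++) {
        hi[n] = h;
        lo[n] = sum - h;
        n++;
    }
    return n;
}
```

For a = 30 and rowlen = 30 this yields exactly the pairs 32-29, 33-28, ..., 60-1; for a = 60 it yields 62-59 through 90-31.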

  I am now stuck with a choice between "fast", which only finds (part
  of) the "serpent" control points (some images are too dark and
  featureless), and a quadratically expensive "all pairs" mode which I
  expect to run for at least days...

  The adjacent-pair loop is currently:

  for (i = 1; i < nimages; i++)
      process_pair(i, i - 1);

  while I think it should be:

  for (i = 0; i < nrules; i++) {
      for (j = 0; j < rules[i].len; j++) {
          process_pair(j + rules[i].offset,
                       rules[i].mode == OFFSET ? j + rules[i].number
                                               : rules[i].number - j);
      }
  }
  The first rule would then be initialized to: len = nimages - 1, offset = 0, number = 1, mode = OFFSET.
  My case would then additionally have the rule: len = 30, offset = 0, number = 1, mode = DIFFERENCE.

  A multirow panorama where each row is taken left-to-right would then
  only have rules of the OFFSET type (in fact, one rule would suffice).
  A further enhancement would be to detect the rules automatically:
  with rule 0 already present, "if matches are perpendicular to the
  normal match direction, add a sum rule with the current total of the
  two image numbers" is almost free in terms of computation.

  To detect the multirow left-to-right pano, you would find that there
  is, e.g., no match between images 30 and 31, none between 60 and 61,
  and so on. However, with some images almost featureless, I would
  expect this to be difficult to detect automatically.

  That would leave us with an O(N) detection algorithm:
    choose a sample image (nimages/2), one with matches on both sides;
    try to match this image against all other images.

  Whenever you get a match, add the rule for that same offset.
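A sketch of that detection loop (have_match() stands in for actually running the matcher on one image pair; fake_match() below is a made-up example of such a matcher):

```c
/* O(N) offset detection: match a single probe image against every
 * later image and record each offset that produced a match as a rule.
 * Returns the number of offsets found. */
int detect_offsets(int nimages, int (*have_match)(int, int),
                   int *offsets, int max_offsets)
{
    int probe = nimages / 2; /* sample image with neighbors on both sides */
    int count = 0;
    for (int j = probe + 1; j < nimages && count < max_offsets; j++)
        if (have_match(probe, j))
            offsets[count++] = j - probe;
    return count;
}

/* Made-up matcher: pretend matches exist at offsets 1 (next image in
 * the sweep) and 29, 30, 31 (the row below, with 50% linear overlap
 * and a row length of 30). */
int fake_match(int a, int b)
{
    int d = b - a;
    return d == 1 || d == 29 || d == 30 || d == 31;
}
```

With 200 images and that fake matcher, the loop reports the offsets 1, 29, 30 and 31, i.e. the linesize-1, linesize and linesize+1 rules described in the next paragraph.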

  I currently take shots with a linear 50% overlap. So I'd get offsets
  linesize-1, linesize, linesize+1 for the images below-left, below,
  below-right.

  I can't think of a "cheap" autodetect mode.

To manage notifications about this bug go to:
https://bugs.launchpad.net/hugin/+bug/883208/+subscriptions

