# Testing calibration

@calibration
Feature: calibration

  Scenario Outline: calibration
    Given I use calibration_<ptv>.json for calibration
    When I launch videostitch-cmd for calibration with calibration/scenes/<ptv>/<ptv>.ptv and " -d 0 -v 3 "
    Then I expect the command to succeed
    When I analyze score of calibration/scenes/<ptv>/output_scoring.ptv
    Then I expect the score to be more than 0.75
    When I analyze uncovered_ratio of calibration/scenes/<ptv>/output_scoring.ptv
    Then I expect the full coverage to be <full_coverage>
    And The calibration cost of output "calibration/scenes/<ptv>/output_calibration.ptv" is consistent with "calibration_<ptv>_ref.ptv"

    Examples:
      | ptv       | full_coverage |
      | paramotor | true          |
      | louvre    | true          |

  @slow
  Scenario Outline: calibration
    Given I use calibration_<ptv>.json for calibration
    When I launch videostitch-cmd for calibration with calibration/scenes/<ptv>/<ptv>.ptv and " -d 0 -v 3 "
    Then I expect the command to succeed
    When I analyze score of calibration/scenes/<ptv>/output_scoring.ptv
    Then I expect the score to be more than 0.75
    When I analyze uncovered_ratio of calibration/scenes/<ptv>/output_scoring.ptv
    Then I expect the full coverage to be <full_coverage>
    And The calibration cost of output "calibration/scenes/<ptv>/output_calibration.ptv" is consistent with "calibration_<ptv>_ref.ptv"

    Examples:
      | ptv                 | full_coverage |
      | louvre_incremental  | true          |
      | factory             | false         |
      | factory_incremental | true          |
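
# The steps above are backed by behave step definitions in Python. Below is a minimal
# sketch (kept in comments so this file remains valid Gherkin) of how the
# "analyze score" / "score to be more than" steps might be implemented. The JSON
# layout of output_scoring.ptv (a top-level "score" key) and the context attribute
# names are assumptions for illustration, not the project's actual implementation.
#
#   import json
#   from behave import when, then
#
#   @when('I analyze score of {scoring_ptv}')
#   def step_analyze_score(context, scoring_ptv):
#       # Parse the scoring output written by videostitch-cmd (assumed to be JSON-formatted).
#       with open(scoring_ptv) as f:
#           context.score = json.load(f)["score"]
#
#   @then('I expect the score to be more than {threshold:g}')
#   def step_check_score(context, threshold):
#       # Fail the scenario if the calibration score does not exceed the threshold.
#       assert context.score > threshold, \
#           "score {} is not above {}".format(context.score, threshold)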