The associated paper and data can be found at http://amyfire.com/projects/inferringcausality/.
The required data can be obtained from the project page:

- `CVPR2012_fluent_result` (fluent source data by Bruce)
- `CVPR2012_*` (action source data (`.txt`) by Ping, parsed results (`.mat`) by Mingtian, parsed results (`.py`) by ...?)
- `CVPR2012_humanTestAnnotation.txt` -- defines how the clips were broken up for human annotation
- `amy_cvpr2012.db` -- a sqlite file containing the raw human annotations as they progressed through the clips
- `cvpr_db_results.csv` (humans, source data, causalgrammar all merged) -- generated by `dealWithDBResults.py`, which requires `CVPR2012_humanTestAnnotation.txt`, `CVPR2012_fluent_result` (fluent detections), and `CVPR2012_*/*` (one directory of action detections)
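If you just want to inspect the raw annotations, the sqlite file can be opened directly. This is a generic sketch that assumes nothing about the table layout (table names are discovered at runtime, not taken from the project):

```python
# Minimal sketch: list the tables inside amy_cvpr2012.db and their row counts,
# without assuming anything about the schema.
import sqlite3

conn = sqlite3.connect("amy_cvpr2012.db")
for (name,) in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    count = conn.execute("SELECT COUNT(*) FROM {}".format(name)).fetchone()[0]
    print(name, count)
conn.close()
```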
`dealWithDBResults.py` is run as:

`python dealWithDBResults.py (upload|download|upanddown)`

You can specify a non-default action directory with the `-a` parameter. (The default will not run out of the box, since it was essentially a symlink that pointed at one of two different directories, depending on the run.) Those directories are in the "minimal" dataset -- see the minimal dataset readme for details.
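For example, `python dealWithDBResults.py upanddown -a CVPR2012_reverse_slidingwindow_action_detection_logspace` targets the default action-detection directory explicitly (assuming that directory is present locally).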
Runs the following methods (Python):

- `causal_grammar.import_summerdata` -- imports from `CVPR2012_reverse_slidingwindow_action_detection_logspace/*` by default; in the "minimal" dataset this was changed to two different choices -- see the minimal dataset readme for details
- `munge_parses_to_xml(fluent_parses, temporal_parses)` --> `orig_xml` results
- `causal_grammar.process_events_and_fluents` --> `fluent_and_action_xml` results
- `orig_xml` results are inserted into the db as 'origdata' and 'origsmrt'
- `fluent_and_action_xml` results are inserted into the db as 'causalgrammar' and 'causalsmrt' via `uploadComputerResponseToDB`; if `source.endswith('smrt')`, `buildDictForFluentBetweenFramesIntoResults` is called, which does some very basic fixing of local inconsistencies (versus `buildDictForDumbFluentBetweenFramesIntoResults`, which does no fixing of local inconsistencies)
- creates unified `results/cvpr_db_results/*.csv` for each user (human or algorithm)
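The list above corresponds roughly to the sketch below. This is a hypothetical illustration only: the function names are the ones documented here, but the argument lists (other than `munge_parses_to_xml`'s) and the example name are assumptions, not the actual signatures used in `dealWithDBResults.py`.

```python
# Hypothetical sketch of the per-example flow; argument lists other than
# munge_parses_to_xml's are assumed, and "some_example" is a placeholder.
import causal_grammar

action_dir = "CVPR2012_reverse_slidingwindow_action_detection_logspace"

# 1. Read fluent and action detections for one example (assumed signature).
fluent_parses, temporal_parses = causal_grammar.import_summerdata("some_example", action_dir)

# 2. Raw detections -> orig_xml, later uploaded as 'origdata' / 'origsmrt'.
orig_xml = causal_grammar.munge_parses_to_xml(fluent_parses, temporal_parses)

# 3. Causal-grammar inference -> fluent_and_action_xml, later uploaded as
#    'causalgrammar' / 'causalsmrt' via uploadComputerResponseToDB (assumed signature).
fluent_and_action_xml = causal_grammar.process_events_and_fluents(fluent_parses, temporal_parses)
```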
`analyze_results.R` converts the per-user `cvpr_db_results/*.csv` files into a single `cvpr_db_results.csv`. Run it with `R --vanilla < analyze_results.R`.
`plotAllResults.sh` loops through each table/element in summerdata and generates timeline heatmaps for every agent (human, computer, source, ...).
The `upanddown` action simply calls upload and then download.
```
usage: dealWithDBResults.py [-h]
                            [-d | -o EXAMPLES_ONLY | -x EXAMPLES_EXCLUDE | -g EXAMPLES_GREP]
                            [-s] [-i] [-a ACTIONFOLDER] [-n] [--debug]
                            [--database {mysql,sqlite}]
                            {upload,download,upanddown,list}

positional arguments:
  {upload,download,upanddown,list}

optional arguments:
  -h, --help            show this help message and exit
  -d, --dry-run         Do not actually upload data to the db or save
                        downloaded data; only valid for "upload" or
                        "download", does not make sense for "upanddown" or
                        "list"
  -o EXAMPLES_ONLY, --only EXAMPLES_ONLY
                        specific examples to run, versus all found examples
  -x EXAMPLES_EXCLUDE, --exclude EXAMPLES_EXCLUDE
                        specific examples to exclude, out of all found
                        examples
  -g EXAMPLES_GREP, --grep EXAMPLES_GREP
                        class of examples to include
  -s, --simplify        simplify the summerdata grammar to only include
                        fluents that start with the example name[s]
  -i, --ignoreoverlaps  skip the "without overlaps" code
  -a ACTIONFOLDER, --actionfolder ACTIONFOLDER
                        specify the action folder to run against
  -n, --inconsistentok  don't require consistency in parse building
  --debug               Spit out a lot more context information during
                        processing
  --database {mysql,sqlite}
```
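For example, a dry run of the upload step restricted to a single example (the example name here is a placeholder) would look like `python dealWithDBResults.py upload --only some_example --dry-run`.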
- `hitrate.py`, formerly `analyzeData-nearesthuman-hitrate.py` -- "hitrate for CVPR 2015 submission" (Nov 2015)
- `pr.py` calculates average precision, recall, and F1 (see the sketch after this list)
- Obsolete: `plotPR.R` generates a set of precision/recall graphs from the output of `analyzeData-nearesthuman-pr.py`. (`analyzeData-nearesthuman-pr.py` has been removed; it last existed at git hash 3032f8.)
- Obsolete: `analyze_results.go` by Mingtian reads `cvpr_db_results.csv`; `analyzeData.py` refers to the results of the go code in `findDistanceGivenLineOfGoOutput`, calling it obsolete. (`analyzeData.py` has been removed; it last existed at git hash 3032f8.)
- Obsolete: `plotPR.sh` calls `analyzeData-nearesthuman-pr.py` and then `plotPR.R` to generate a set of precision/recall graphs. (`analyzeData-nearesthuman-pr.py` has been removed; it last existed at git hash 3032f8.)
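For reference, the metrics `pr.py` reports reduce to the standard definitions below. This generic snippet only illustrates those formulas; it is not `pr.py`'s actual code.

```python
# Generic precision / recall / F1 from true-positive, false-positive, and
# false-negative counts (illustrative only; not taken from pr.py).
def precision_recall_f1(tp, fp, fn):
    precision = tp / float(tp + fp) if (tp + fp) else 0.0
    recall = tp / float(tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

print(precision_recall_f1(tp=8, fp=2, fn=4))  # -> (0.8, ~0.667, ~0.727)
```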
Other helpers:

- `getFluentChangesForFluent` (operates on `causal_grammar.process_events_and_fluents` xml)
- `getFluentChangesForFluentBetweenFrames` (operates on `causal_grammar.process_events_and_fluents` xml)
The data covers 108 videos -- cut up from "maybe 10" original recordings -- across six scenes (including 9406, 9404, 8145, lounge) -- and 2 grammars.