Activities in Extended Videos Prize Challenge
sponsored by IARPA
Nov 08, 2018 - March 29, 2019
Prize Challenge Leader
RANK | SUBMISSION_ID | EVALUATION_NAME | TRACK_NAME | TEAM_NAME | SYSTEM_NAME | W_PMISS@0.15RFA | PRIZE_ELIGIBLE |
---|---|---|---|---|---|---|---|
1 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00191_MUDSML_20190327-115431-4980.sr-20190327-115431-5488 | ActEV-2018 | AD | MUDSML | MUDSML_SECONDARY | 0.60473 | no |
2 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00224_MUDSML_20190401-061952-6662.sr-20190401-061952-7224 | ActEV-2018 | AD | MUDSML | MMVG-AlibabaAIC | 0.60473 | no |
3 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00188_MUDSML_20190412-103103-6528.sr-20190412-103103-7181 | ActEV-2018 | AD | MUDSML | MUDSML_MAIN | 0.64286 | no |
4 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00185_INF_20190413-223023-5926.sr-20190413-223023-6427 | ActEV-2018 | AD | INF | tst | 0.67599 | no |
5 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00216_BUPT-MCPRL_20190412-224234-5507.sr-20190412-224234-6508 | ActEV-2018 | AD | BUPT-MCPRL | MCPRL_PC | 0.69327 | |
6 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00221_BUPT-MCPRL_20190324-082801-6204.sr-20190324-082801-6550 | ActEV-2018 | AD | BUPT-MCPRL | MCPRL_C2 | 0.69371 | |
7 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00220_BUPT-MCPRL_20190403-100702-5648.sr-20190403-100702-6024 | ActEV-2018 | AD | BUPT-MCPRL | MCPRL_C1 | 0.69404 | |
8 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00214_BUPT-MCPRL_20190324-040537-0726.sr-20190324-040537-1127 | ActEV-2018 | AD | BUPT-MCPRL | MCPRL_BUPT | 0.69467 | |
9 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00261_INF_20190430-024948-8188.sr-20190430-024948-8860 | ActEV-2018 | AD | INF | clean | 0.70772 | no |
10 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00209_MUDSML_20190430-022449-0672.sr-20190430-022449-1455 | ActEV-2018 | AD | MUDSML | MUDSML_ADD | 0.71175 | no |
11 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00227_vireoJD-MM_20190424-103901-3123.sr-20190424-103901-3557 | ActEV-2018 | AD | vireoJD-MM | Pipeline-V3 | 0.73317 | |
12 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00218_UNSW-InsData-PC_20190321-152215-6408.sr-20190321-152215-7334 | ActEV-2018 | AD | UNSW_InsData_PC | UNSW_InsData | 0.74234 | |
13 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00237_vireoJD-MM_20190419-120757-3612.sr-20190419-120757-4173 | ActEV-2018 | AD | vireoJD-MM | Pipeline-V4 | 0.74516 | |
14 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00162_UCF_20190416-110758-3352.sr-20190416-110758-3742 | ActEV-2018 | AD | UCF | UCF-PC | 0.74997 | no |
15 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00201_UMD_20190207-125352-3189.sr-20190207-125352-3536 | ActEV-2018 | AD | UMD | UMD | 0.75031 | no |
16 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00189_IBM-MIT-Purdue_20190312-153824-9633.sr-20190312-153825-0992 | ActEV-2018 | AD | IBM-MIT-Purdue | k1 | 0.75658 | no |
17 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00189_STARK_20181221-134529-1523.sr-20181221-134529-2194 | ActEV-2018 | AD | STARK | k1 | 0.75780 | no |
18 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00198_STR-DIVA-Team_20190320-180832-6691.sr-20190320-180832-7046 | ActEV-2018 | AD | STR-DIVA Team | STR | 0.76192 | no |
19 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00229_STR-DIVA-Team_20190324-160109-5400.sr-20190324-160109-5739 | ActEV-2018 | AD | STR-DIVA Team | STR2 | 0.76192 | no |
20 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00204_UMD_20190221-211220-5543.sr-20190221-211220-5868 | ActEV-2018 | AD | UMD | UMD3 | 0.76432 | no |
21 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00202_UMD_20190321-132838-6826.sr-20190321-132838-7142 | ActEV-2018 | AD | UMD | UMD2 | 0.76441 | no |
22 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00223_vireoJD-MM_20190329-055409-2177.sr-20190329-055409-2581 | ActEV-2018 | AD | vireoJD-MM | Pipeline-V2 | 0.76539 | |
23 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00196_vireoJD-MM_20190321-155052-3583.sr-20190321-155052-3896 | ActEV-2018 | AD | vireoJD-MM | Pipeline-V1 | 0.76826 | |
24 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00186_INF_20190201-135322-9654.sr-20190201-135323-0281 | ActEV-2018 | AD | INF | tst2 | 0.77348 | no |
25 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00199_JHUDIVATeam_20190322-195555-4851.sr-20190322-195555-5913 | ActEV-2018 | AD | JHUDIVATeam | Simulation_Synthesis | 0.79336 | no |
26 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00168_JHUDIVATeam_20190201-010657-0332.sr-20190201-010657-1986 | ActEV-2018 | AD | JHUDIVATeam | Structured_Genralization_AD | 0.79420 | no |
27 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00228_NtechLab_20190329-052339-1867.sr-20190329-052339-2168 | ActEV-2018 | AD | NtechLab | Combined s1 and s2 | 0.79968 | |
28 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00161_SRI_20190206-141136-4690.sr-20190206-141136-7075 | ActEV-2018 | AD | SRI | sri_pc_test1 | 0.80460 | no |
29 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00171_SRI_20190204-075310-0761.sr-20190204-075310-5138 | ActEV-2018 | AD | SRI | TardisVS | 0.80518 | no |
30 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00200_NtechLab_20190321-092010-7359.sr-20190321-092010-7736 | ActEV-2018 | AD | NtechLab | Detect&Track | 0.80620 | |
31 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00197_SRI_20190208-175020-5929.sr-20190208-175021-0000 | ActEV-2018 | AD | SRI | TardisVP_pruning | 0.81229 | no |
32 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00203_NtechLab_20190321-161831-5586.sr-20190321-161831-5894 | ActEV-2018 | AD | NtechLab | ActionRCNN | 0.82827 | |
33 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00170_SRI_20181217-094621-1223.sr-20181217-094621-1972 | ActEV-2018 | AD | SRI | TardisVP | 0.83609 | no |
34 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00217_Shandong-Normal-University_20190314-084457-6128.sr-20190314-084457-6690 | ActEV-2018 | AD | Shandong Normal University | sdnu-resnet | 0.85823 | |
35 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00169_JHUDIVATeam_20181214-181827-2859.sr-20181214-181827-3922 | ActEV-2018 | AD | JHUDIVATeam | CLI_Indepedent_Eval | 0.87234 | no |
36 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00219_USF-Bulls_20190326-112622-4715.sr-20190326-112622-5039 | ActEV-2018 | AD | USF Bulls | PC2 | 0.88561 | |
37 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00193_USF-Bulls_20190129-003000-8660.sr-20190129-003000-9033 | ActEV-2018 | AD | USF Bulls | PC1 | 0.88816 | |
38 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00166_DIVA-TE-Baseline_20181213-083956-4174.sr-20181213-083956-4501 | ActEV-2018 | AD | DIVA TE Baseline | baselineACT_1_AD | 0.90723 | no |
39 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00133_DIVA-TE-Baseline_20181212-084002-2087.sr-20181212-084002-2555 | ActEV-2018 | AD | DIVA TE Baseline | baselineRC3D_1_AD | 0.91297 | no |
40 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00225_IVP_20190321-095107-2888.sr-20190321-095107-3306 | ActEV-2018 | AD | IVP | IVP-baseline_1_AD | 0.93736 | |
41 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00226_IVP_20190321-101007-2212.sr-20190321-101007-2509 | ActEV-2018 | AD | IVP | IVP_2_AD | 0.93736 | |
42 | ActEV-2018_AD_Prize-Challenge-Leader_SYS-00215_XXR_20190314-004302-2142.sr-20190314-004302-2628 | ActEV-2018 | AD | XXR | XXR | 0.97173 | |
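The ranking metric, W_PMISS@0.15RFA, is the weighted probability of missed detections measured at an operating point of 0.15 false alarms per minute of video, as defined in the Evaluation Plan. Purely as an illustration (this is not NIST's scoring code), reading a P_miss value off a DET-style curve at a fixed rate of false alarms can be sketched like this, using made-up curve points:

```python
def p_miss_at_rfa(det_curve, target_rfa=0.15):
    """Linearly interpolate P_miss at target_rfa from a DET-style curve.

    det_curve: list of (rfa, p_miss) points; rfa is the rate of false
    alarms per minute, p_miss the probability of missed detection.
    """
    pts = sorted(det_curve)
    # Clamp to the measured range at either end.
    if target_rfa <= pts[0][0]:
        return pts[0][1]
    if target_rfa >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= target_rfa <= x1:
            t = (target_rfa - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

curve = [(0.05, 0.90), (0.10, 0.80), (0.20, 0.70), (0.50, 0.60)]
p15 = p_miss_at_rfa(curve, 0.15)   # roughly 0.75 for this made-up curve
```

Lower values are better, which is why the leaderboard above is sorted in ascending order of the metric.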
During the Prize Challenge, updates were posted here:
- Nov 30, 2018: General website revisions.
- Nov 8, 2018: ActEV-PC Evaluation Plan added
- Nov 8, 2018: VIRAT data license pdf added
- Feb 11, 2019: updated the ActEV-PC schedule
- Feb 11, 2019: The top 6 participants proceeded to phase 2, instead of the top 8 participants
- Feb 11, 2019: Updated ActEV-PC Evaluation Plan added
- Feb 25, 2019: The "ALGORITHM SUBMISSION" tab added to the ActEV-PC website
- March 23, 2019: The winners were announced:
- 1st Place ($25,000) BUPT-MCPRL (Beijing University of Posts and Telecommunications)
- 2nd Place ($15,000) NtechLab
- 3rd Place ($10,000) vireoJD-MM (City University of Hong Kong and JD AI Research)
The top 6 Phase 1 ActEV Challenge participants delivered their algorithms in a form compatible with the ActEV Command Line Interface (ActEV CLI) to NIST for Phase 2 testing. The command line interface implementation that you provided formalizes the entire process of evaluating a system, including: (1) downloading and installing your software via a single URL, (2) verifying the delivery works and produces output that is “consistent” with the output you produce, and (3) enabling a user to process a large collection of video in a fault-tolerant, parallelizable manner.
To complete this task, you needed the following items, described in detail below:
- FAQ - Validation Phase Processing
- CLI Description
- The Abstract CLI Git Repository
- The CLI Implementation Primer
- The Validation Data Set
- Example CLI-Compliant Implementations
- NIST Hardware and Initial Operating System
FAQ - Validation Phase Processing
The Phase 2 - Validation Phase Processing FAQ
CLI Description
The ActEV CLI is described in the ActEV-CLI-2019025 document.
The Abstract CLI Git Repository
The abstract CLI is located at https://gitlab.kitware.com/alexandreB/diva_evaluation_cli . The repo contains documentation.
The CLI Implementation Primer
There are six steps to adapt your code to the CLI. The ActEV Evaluation CLI Programming Primer describes how to clone the Abstract CLI and begin adapting the code to your implementation.
The Validation Data Set
As mentioned above, the CLI is used to verify that the downloaded software is correctly installed and produces the same output at NIST as you produce locally. To support this, we provided a small validation data set (ActEV-Eval-CLI-Validation-Set1) on the Data Sets page that was processed both at your site and at NIST. Please use this data in Step 3 of the Primer described above.
Example CLI-Compliant Implementations
Here are links to example CLI implementations used for the baseline algorithms on the leaderboard:
- The RC3D baseline system with CLI implementation by Alexandre Boyer
- DIVA-baseline RC3D by Ameya Shringi
- DIVA-baseline ACT by Ameya Shringi
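One way to think about the "same output at NIST as you produce locally" check from Step 3 of the Primer is a tolerant comparison of the two decoded JSON results, allowing small floating-point drift in confidence scores. The helper below is a hypothetical sketch, not part of the actual CLI:

```python
def outputs_consistent(local, nist, tol=1e-4):
    """Recursively compare two decoded JSON values.

    Numbers may differ by up to tol (e.g. nondeterministic GPU math);
    everything else must match exactly.
    """
    if isinstance(local, dict) and isinstance(nist, dict):
        return (local.keys() == nist.keys() and
                all(outputs_consistent(local[k], nist[k], tol) for k in local))
    if isinstance(local, list) and isinstance(nist, list):
        return (len(local) == len(nist) and
                all(outputs_consistent(x, y, tol) for x, y in zip(local, nist)))
    if isinstance(local, (int, float)) and isinstance(nist, (int, float)):
        return abs(local - nist) <= tol
    return local == nist
```

The tolerance value is an assumption for illustration; whatever consistency criterion the evaluation actually applies is defined by NIST.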
NIST Hardware and Initial Operating System
NIST installed your system from a fresh Ubuntu 16.04 cloud image, available from https://cloud-images.ubuntu.com/releases/16.04/release/ , on the following hardware:
- 16 cores (32 hyperthreaded)
- 128GB of memory
- 4 x Nvidia GTX1080Ti
- 40GB root disk space
- 256GB of ephemeral SSD disk space mounted to /mnt/
An ActEV activity is defined by the following components:
- Activity Name - A mnemonic handle for the activity
- Activity Description - Textual description of the activity
- Begin time rule definition - The specification of what determines the beginning time of the activity
- End time rule definition - The specification of what determines the ending time of the activity
- Required object type list - The list of objects systems are expected to identify for the activity. Note: this aspect of an activity is not addressed by ActEV-PC.
For example:
- Description: A person closing the door to a vehicle or facility.
- Start: The event begins 1 s before the door starts to move.
- End: The event ends after the door stops moving. A person closing a car door from within the car is performing a closing event if the person is still visible inside the car. If the person is not visible once they are in the car, the closing should not be annotated as an event.
- Objects associated with the activity : Person; and Door or Vehicle
- Description: A vehicle turning left or right is determined from the POV of the driver of the vehicle. The vehicle may not stop for more than 10 s during the turn.
- Start: Annotation begins 1 s before the vehicle has noticeably changed direction.
- End: Annotation ends 1 s after the vehicle is no longer changing direction and linear motion has resumed. Note: This event is determined after a reasonable interpretation of the video.
- Objects associated with the activity: Vehicle
- Description: An object moving from person to vehicle.
- Start: The event begins 2 s before the cargo to be loaded is extended toward the vehicle (i.e., before a person’s posture changes from one of “carrying” to one of “loading”).
- End: The event ends after the cargo is placed into the vehicle and the person-cargo contact is lost. In the event of occlusion, it ends when the loss of contact is visible.
- Objects associated with the activity: Person; and Vehicle
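The activity-definition components above (name, description, begin/end time rules, required objects) could be encoded, purely for illustration, as a simple data structure; the class and field names here are ours, not ActEV's:

```python
from dataclasses import dataclass, field

@dataclass
class ActivityDefinition:
    name: str                      # mnemonic handle for the activity
    description: str               # textual description
    begin_rule: str                # what determines the begin time
    end_rule: str                  # what determines the end time
    required_objects: list = field(default_factory=list)

# The "Closing" example from above, captured as data.
closing = ActivityDefinition(
    name="Closing",
    description="A person closing the door to a vehicle or facility.",
    begin_rule="1 s before the door starts to move",
    end_rule="after the door stops moving",
    required_objects=["Person", "Door or Vehicle"],
)
```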
List of Activities
- Closing
- Closing_trunk
- Entering
- Exiting
- Loading
- Open_Trunk
- Opening
- Transport_HeavyCarry
- Unloading
- Vehicle_turning_left
- Vehicle_turning_right
- Vehicle_u_turn
- Pull
- Riding
- Talking
- activity_carrying
- specialized_talking_phone
- specialized_texting_phone
The Activity Detection (AD) task
Given a target activity, a system automatically detects and temporally localizes all instances of the activity. For a system-identified activity instance to be evaluated as correct, the type of activity must be correct and the temporal overlap must meet the minimal requirement described in the Evaluation Plan.
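For illustration only, a temporal-overlap test between a system instance and a reference instance might use temporal intersection-over-union. The actual alignment procedure and threshold are those given in the Evaluation Plan; the 0.5 default below is an assumption, not the official value.

```python
def temporal_iou(sys_span, ref_span):
    """Intersection-over-union of two (start, end) time spans in seconds."""
    (s0, s1), (r0, r1) = sys_span, ref_span
    inter = max(0.0, min(s1, r1) - max(s0, r0))
    union = (s1 - s0) + (r1 - r0) - inter
    return inter / union if union > 0 else 0.0

def overlaps_enough(sys_span, ref_span, min_iou=0.5):
    """True when the detection's temporal overlap clears the threshold."""
    return temporal_iou(sys_span, ref_span) >= min_iou
```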
ActEV-PC Open Leaderboard Evaluation (Phase 1)
For the ActEV-PC Open Leaderboard evaluation, challenge participants ran their activity detection software on their own compute hardware and submitted system output, as defined by the Evaluation Plan, to the NIST ActEV Scoring Server. This phase served as a qualifying stage: the top 6 participants proceeded to Phase 2.
ActEV-PC Independent Evaluation (Phase 2)
For the ActEV-PC Independent evaluation, invited challenge participants submitted their runnable activity detection software to NIST using the Evaluation Command Line Interface Submission Instructions. NIST then evaluated system performance on sequestered data using NIST hardware.
- First Place Prize – $25,000
- Second Place Prize - $15,000
- Third Place Prize - $10,000
To receive the full award payment, each winning team had to send at least one representative to the CVPR'19 ActivityNet workshop for the challenge results announcement and to present the winning team's results at the workshop. The challenge was one of the tasks of the CVPR'19 ActivityNet workshop.
See the complete eligibility requirements in the Evaluation and Rules tab.
NIST sent you information on how to download the data. After signing in (using the link above) with your registered account, you uploaded your JSON-formatted results (see Section 5 of the Evaluation Plan for format details) to the ActEV Scoring Server using the "Instructions" link (available on this web site after Sign In). The challenge then proceeded with the selection of the top 6 participants, who delivered their activity detection algorithms to NIST.
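A sketch of assembling a JSON results file might look like the following. The field names and values below are illustrative assumptions only; Section 5 of the Evaluation Plan is the authoritative specification of the format.

```python
import json

# Hypothetical system output: one detected "Closing" instance,
# localized by frame numbers within one VIRAT clip.
results = {
    "filesProcessed": ["VIRAT_S_000000.mp4"],
    "activities": [
        {
            "activity": "Closing",
            "presenceConf": 0.87,           # system confidence score
            "localization": {               # per-file temporal localization
                "VIRAT_S_000000.mp4": {"1230": 1, "1290": 0}
            },
        }
    ],
}
payload = json.dumps(results, indent=2)
```

Serializing with `json.dumps` before upload also catches structural mistakes (non-serializable values, mismatched braces) early, on your side rather than at the Scoring Server.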
Phase 1 - ActEV-PC Open Leaderboard evaluation:
Phase 2 - ActEV-PC Independent evaluation:
- Create a login account by registering (using the link above) for the ActEV Prize Challenge.
- During account registration, you were required to:
  - acknowledge that you had read and accepted the VIRAT data license, and
  - agree to the rules of the Prize Challenge Evaluation Plan.
- You then received a username and password for the data site.
- That username and password gave you access to the data.
- Must have completed and submitted a registration form on https://actev.nist.gov/prizechallenge
- Must meet any other account creation or registration requirements, such as creating an account on challenge.gov, etc.;
- Must have complied with all the requirements stated in these Rules and Terms/Conditions;
- Must agree to abide by the decisions of ODNI, NIST, and/or the individual judges, which shall be final and binding in all respects;
- Must agree to follow all applicable local, state, federal and country of residence laws and regulations.
- Must be (1) an individual or team of individuals each of whom are 18 years of age and over, or (2) a for-profit or non-profit entity organized or incorporated under law;
- May not be a federal entity or federal employee acting within the scope of their employment;
- Shall not be deemed ineligible because the individual or entity used Federal facilities or consulted with federal employees during a competition if the facilities and employees are made available to all individuals and entities participating in the competition on an equitable basis;
- In the case of federal grantees may not use federal funds to develop challenge applications unless consistent with the purpose of their grant award;
- In the case of federal contractors, may not use federal funds from a contract to develop challenge applications or to fund efforts in support of a challenge submission;
- May not be employees of NIST or ODNI, or any other individual or entity associated with the development, evaluation, or administration of the competition; members of such persons’ immediate families (spouses, children, siblings, parents) and persons living in the same household as such persons, whether or not related, are likewise not eligible to participate in the competition;
- May not be prime contractors or subcontractors and their employees of the IARPA DIVA program, due to its similarity to the ActEV prize challenge (however, they may still register under the “forgoing prizes” option, in order to compete and have their solutions posted on the leaderboard without the possibility of a cash prize); and
- Must not be currently on the Excluded Parties List (https://www.epls.gov/).
- is your own original work, or is submitted by permission with full and proper credit given within your Entry;
- does not contain confidential information or trade secrets (yours or anyone else’s);
- does not knowingly, after due inquiry (including, by way of example only and without limitation, reviewing the records of the United States Patent and Trademark Office and inquiring of any employees and other professionals retained with respect to such matters), violate or infringe upon the patent rights, industrial design rights, copyrights, trademarks, rights in technical data, rights of privacy, publicity or other intellectual property or other rights of any person or entity;
- does not contain malicious code, such as viruses, malware, timebombs, cancelbots, worms, Trojan horses or other potentially harmful programs or other material or information;
- did not violate any applicable law, statute, ordinance, rule or regulation, including, without limitation, United States export laws and regulations, including, but not limited to, the International Traffic in Arms Regulations and the Department of Commerce Export Regulations; and
- does not trigger any reporting or royalty or other obligation to any third party.
- Any incorrect or inaccurate information, whether caused by a Participant, printing errors, or by any of the equipment or programming associated with or used in the Competition;
- unauthorized human intervention in any part of the Entry process for the Competition;
- technical or human error that may occur in the administration of the Competition or the processing of Entries; or
- any injury or damage to persons or property that may be caused, directly or indirectly, in whole or in part, from a Participant’s participation in the Competition or receipt or use or misuse of an Award.
If for any reason an Entry is confirmed to have been deleted erroneously, lost, or otherwise destroyed or corrupted, the Participant’s sole remedy is to submit another Entry in the Competition.
Each entrant retains full ownership and title in and to their submission and expressly reserves all intellectual property rights not expressly granted under the challenge agreement. By participating in the challenge, each entrant hereby irrevocably grants to the Office of the Director of National Intelligence (ODNI), of which IARPA is a component, and NIST a limited, non-exclusive, royalty-free, worldwide license and right to reproduce, publicly perform, publicly display, and use the submission for internal ODNI and NIST business and to the extent necessary to administer the challenge. By submitting an Entry, you grant a non-exclusive right and license to ODNI and NIST to use your name, likeness, biographical information, image, any other personal data submitted with your Entry, and the contents of your Entry (including any created works, such as YouTube® videos, but not including any App software submitted with or as part of your Entry), in connection with the Competition. You also agree that this license is perpetual and irrevocable.
You agree that nothing in this Notice grants you a right or license to use any names or logos of ODNI, IARPA, or NIST or any other intellectual property or proprietary rights of ODNI, IARPA, and NIST. You grant to ODNI and NIST the right to include your company or institution name and logo (if your Entry is from a company or institution) as a Participant on the Event Web site and in materials from IARPA announcing winners of or Participants in the Competition. Other than these uses or as otherwise set forth herein, you are not granting ODNI or NIST any rights to your trademarks.
ODNI and NIST would retain and use the system for a period of no longer than two years after submission or when the Participant ceases to participate in ActEV evaluations, whichever is later.
Any data or documentation that qualifies as business proprietary information, as defined by the Freedom of Information Act (5 USC Section 552), and is properly marked as such would be treated as confidential and would only be used for the purposes of the ActEV test.