Activities in Extended Videos Prize Challenge
sponsored by IARPA
Nov 08, 2018 - March 29, 2019


ActEV-PC Leaderboard for Phase 1

Prize Challenge Leader

Updated: 2019-04-30 02:53:55 -0400
RANK SUBMISSION_ID EVALUATION_NAME TRACK_NAME TEAM_NAME SYSTEM_NAME W_PMISS@0.15RFA PRIZE_ELIGIBLE
1 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00191_MUDSML_20190327-115431-4980.sr-20190327-115431-5488 ActEV-2018 AD MUDSML MUDSML_SECONDARY 0.60473 no
2 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00224_MUDSML_20190401-061952-6662.sr-20190401-061952-7224 ActEV-2018 AD MUDSML MMVG-AlibabaAIC 0.60473 no
3 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00188_MUDSML_20190412-103103-6528.sr-20190412-103103-7181 ActEV-2018 AD MUDSML MUDSML_MAIN 0.64286 no
4 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00185_INF_20190413-223023-5926.sr-20190413-223023-6427 ActEV-2018 AD INF tst 0.67599 no
5 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00216_BUPT-MCPRL_20190412-224234-5507.sr-20190412-224234-6508 ActEV-2018 AD BUPT-MCPRL MCPRL_PC 0.69327
6 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00221_BUPT-MCPRL_20190324-082801-6204.sr-20190324-082801-6550 ActEV-2018 AD BUPT-MCPRL MCPRL_C2 0.69371
7 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00220_BUPT-MCPRL_20190403-100702-5648.sr-20190403-100702-6024 ActEV-2018 AD BUPT-MCPRL MCPRL_C1 0.69404
8 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00214_BUPT-MCPRL_20190324-040537-0726.sr-20190324-040537-1127 ActEV-2018 AD BUPT-MCPRL MCPRL_BUPT 0.69467
9 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00261_INF_20190430-024948-8188.sr-20190430-024948-8860 ActEV-2018 AD INF clean 0.70772 no
10 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00209_MUDSML_20190430-022449-0672.sr-20190430-022449-1455 ActEV-2018 AD MUDSML MUDSML_ADD 0.71175 no
11 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00227_vireoJD-MM_20190424-103901-3123.sr-20190424-103901-3557 ActEV-2018 AD vireoJD-MM Pipeline-V3 0.73317
12 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00218_UNSW-InsData-PC_20190321-152215-6408.sr-20190321-152215-7334 ActEV-2018 AD UNSW_InsData_PC UNSW_InsData 0.74234
13 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00237_vireoJD-MM_20190419-120757-3612.sr-20190419-120757-4173 ActEV-2018 AD vireoJD-MM Pipeline-V4 0.74516
14 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00162_UCF_20190416-110758-3352.sr-20190416-110758-3742 ActEV-2018 AD UCF UCF-PC 0.74997 no
15 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00201_UMD_20190207-125352-3189.sr-20190207-125352-3536 ActEV-2018 AD UMD UMD 0.75031 no
16 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00189_IBM-MIT-Purdue_20190312-153824-9633.sr-20190312-153825-0992 ActEV-2018 AD IBM-MIT-Purdue k1 0.75658 no
17 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00189_STARK_20181221-134529-1523.sr-20181221-134529-2194 ActEV-2018 AD STARK k1 0.75780 no
18 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00198_STR-DIVA-Team_20190320-180832-6691.sr-20190320-180832-7046 ActEV-2018 AD STR-DIVA Team STR 0.76192 no
19 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00229_STR-DIVA-Team_20190324-160109-5400.sr-20190324-160109-5739 ActEV-2018 AD STR-DIVA Team STR2 0.76192 no
20 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00204_UMD_20190221-211220-5543.sr-20190221-211220-5868 ActEV-2018 AD UMD UMD3 0.76432 no
21 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00202_UMD_20190321-132838-6826.sr-20190321-132838-7142 ActEV-2018 AD UMD UMD2 0.76441 no
22 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00223_vireoJD-MM_20190329-055409-2177.sr-20190329-055409-2581 ActEV-2018 AD vireoJD-MM Pipeline-V2 0.76539
23 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00196_vireoJD-MM_20190321-155052-3583.sr-20190321-155052-3896 ActEV-2018 AD vireoJD-MM Pipeline-V1 0.76826
24 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00186_INF_20190201-135322-9654.sr-20190201-135323-0281 ActEV-2018 AD INF tst2 0.77348 no
25 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00199_JHUDIVATeam_20190322-195555-4851.sr-20190322-195555-5913 ActEV-2018 AD JHUDIVATeam Simulation_Synthesis 0.79336 no
26 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00168_JHUDIVATeam_20190201-010657-0332.sr-20190201-010657-1986 ActEV-2018 AD JHUDIVATeam Structured_Genralization_AD 0.79420 no
27 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00228_NtechLab_20190329-052339-1867.sr-20190329-052339-2168 ActEV-2018 AD NtechLab Combined s1 and s2 0.79968
28 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00161_SRI_20190206-141136-4690.sr-20190206-141136-7075 ActEV-2018 AD SRI sri_pc_test1 0.80460 no
29 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00171_SRI_20190204-075310-0761.sr-20190204-075310-5138 ActEV-2018 AD SRI TardisVS 0.80518 no
30 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00200_NtechLab_20190321-092010-7359.sr-20190321-092010-7736 ActEV-2018 AD NtechLab Detect&Track 0.80620
31 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00197_SRI_20190208-175020-5929.sr-20190208-175021-0000 ActEV-2018 AD SRI TardisVP_pruning 0.81229 no
32 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00203_NtechLab_20190321-161831-5586.sr-20190321-161831-5894 ActEV-2018 AD NtechLab ActionRCNN 0.82827
33 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00170_SRI_20181217-094621-1223.sr-20181217-094621-1972 ActEV-2018 AD SRI TardisVP 0.83609 no
34 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00217_Shandong-Normal-University_20190314-084457-6128.sr-20190314-084457-6690 ActEV-2018 AD Shandong Normal University sdnu-resnet 0.85823
35 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00169_JHUDIVATeam_20181214-181827-2859.sr-20181214-181827-3922 ActEV-2018 AD JHUDIVATeam CLI_Indepedent_Eval 0.87234 no
36 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00219_USF-Bulls_20190326-112622-4715.sr-20190326-112622-5039 ActEV-2018 AD USF Bulls PC2 0.88561
37 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00193_USF-Bulls_20190129-003000-8660.sr-20190129-003000-9033 ActEV-2018 AD USF Bulls PC1 0.88816
38 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00166_DIVA-TE-Baseline_20181213-083956-4174.sr-20181213-083956-4501 ActEV-2018 AD DIVA TE Baseline baselineACT_1_AD 0.90723 no
39 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00133_DIVA-TE-Baseline_20181212-084002-2087.sr-20181212-084002-2555 ActEV-2018 AD DIVA TE Baseline baselineRC3D_1_AD 0.91297 no
40 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00225_IVP_20190321-095107-2888.sr-20190321-095107-3306 ActEV-2018 AD IVP IVP-baseline_1_AD 0.93736
41 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00226_IVP_20190321-101007-2212.sr-20190321-101007-2509 ActEV-2018 AD IVP IVP_2_AD 0.93736
42 ActEV-2018_AD_Prize-Challenge-Leader_SYS-00215_XXR_20190314-004302-2142.sr-20190314-004302-2628 ActEV-2018 AD XXR XXR 0.97173
ActEV-PC Updates

During the Prize Challenge, updates were posted here.
June 03, 2019: Of the 7 teams invited to take part in the ActEV-PC Independent Evaluation, the winners were:
NIST and the winners presented at the ActivityNet Workshop at CVPR 2019 on June 17.
Algorithm Delivery for Phase 2 Selectees

The top 6 Phase 1 ActEV Challenge participants delivered their algorithms to NIST for Phase 2 testing in a form compatible with the ActEV Command Line Interface (ActEV CLI). The command line interface implementation that you provided formalizes the entire process of evaluating a system, including: (1) downloading and installing your software via a single URL, (2) verifying that the delivery works and produces output “consistent” with the output you produce locally, and (3) enabling a user to process a large collection of video in a fault-tolerant, parallelizable manner.
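
As a concrete illustration of point (3), the sketch below shows one way a driver script could fan chunk-level work out across parallel workers with per-chunk retries. This is a hypothetical sketch only: the actev-process-chunk command name and its argument are invented placeholders, and the real command set is defined in the CLI materials listed below.

    # Hypothetical sketch of fault-tolerant, parallel chunk processing.
    # "actev-process-chunk" and its argument are invented placeholders;
    # the real command set is defined in the Abstract CLI repository.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor, as_completed

    CHUNKS = [f"chunk_{i:03d}" for i in range(12)]  # IDs from a chunk-design step
    MAX_RETRIES = 2

    def process_chunk(chunk_id: str) -> str:
        """Run one chunk, retrying so a single crash does not kill the run."""
        for _ in range(1 + MAX_RETRIES):
            result = subprocess.run(["actev-process-chunk", chunk_id])
            if result.returncode == 0:
                return chunk_id
        raise RuntimeError(f"{chunk_id} failed after {1 + MAX_RETRIES} attempts")

    # Four workers, roughly matching the four GPUs on the NIST hardware.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(process_chunk, c) for c in CHUNKS]
        for fut in as_completed(futures):
            print("done:", fut.result())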

To complete this task, you needed the following resources, each described in detail below:

  1. FAQ - Validation Phase Processing
  2. CLI Description
  3. The Abstract CLI Git Repository
  4. The CLI Implementation Primer
  5. The Validation Data Set
  6. Example CLI-Compliant Implementations
  7. NIST Hardware and Initial Operating System

FAQ - Validation Phase Processing

The Phase 2 - Validation Phase Processing FAQ

CLI Description

The ActEV CLI is described in the ActEV-CLI-2019025 document.

The Abstract CLI Git Repository

The abstract CLI is located at the URL https://gitlab.kitware.com/alexandreB/diva_evaluation_cli . The repo contains documentation.

The CLI Implementation Primer

There are 6 steps to adapt your code to the CLI. The ActEV Evaluation CLI Programming Primer describes the steps to clone the Abstract CLI and begin adapting the code to your implementation.

The Validation Data Set

As mentioned above, the CLI is used to verify the downloaded software is correctly installed and produces the same output at NIST as you produce locally. In order to do so, we have provided a small validation data set (ActEV-Eval-CLI-Validation-Set1) on the Data Sets page that was processed both at your site and at NIST. Please use this data in Step 3 of the Primer described above.
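
As a rough sketch of what such a consistency check can look like, the function below compares two system-output JSON files field by field, with a small tolerance on confidence scores. The file names, the tolerance, and the field names (which assume the Section 5 JSON layout with filesProcessed and activities) are illustrative; this is not NIST's actual verification procedure.

    # Illustrative consistency check between a locally produced output
    # file and the one produced at NIST. Field names assume the
    # Evaluation Plan's JSON layout; file names and tolerance are invented.
    import json

    def outputs_consistent(local_path, nist_path, tol=1e-6):
        with open(local_path) as f:
            local = json.load(f)
        with open(nist_path) as f:
            nist = json.load(f)
        if local.get("filesProcessed") != nist.get("filesProcessed"):
            return False
        acts_l = local.get("activities", [])
        acts_n = nist.get("activities", [])
        if len(acts_l) != len(acts_n):
            return False
        for a, b in zip(acts_l, acts_n):
            if a["activity"] != b["activity"] or a["localization"] != b["localization"]:
                return False
            if abs(a["presenceConf"] - b["presenceConf"]) > tol:  # float jitter
                return False
        return True

    print(outputs_consistent("local_output.json", "nist_output.json"))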

Example CLI-Compliant Implementations

Here are links to example CLI implementations used for the baseline algorithms on the leaderboard:

NIST Hardware and Initial Operating System

NIST installed your system from a fresh Ubuntu 16.04 cloud image available from https://cloud-images.ubuntu.com/releases/16.04/release/ on the following hardware.

  • 16 cores (32 hyperthreaded)
  • 128GB of memory
  • 4 x Nvidia GTX1080Ti
  • 40GB root disk space
  • 256GB of ephemeral SSD disk space mounted to /mnt/


Activities and the Activity Detection Task

An ActEV activity is defined to be “one or more people performing a specified movement or interacting with an object or group of objects”. Activities are annotated by humans using a set of annotation guidelines that specify how to perform the annotation and the criteria to determine if the activity occurred. Each activity is formally defined by five elements:

  • Activity Name - A mnemonic handle for the activity
  • Activity Description - Textual description of the activity
  • Begin time rule definition - The specification of what determines the beginning time of the activity
  • End time rule definition - The specification of what determines the ending time of the activity
  • Required object type list - The list of objects systems are expected to identify for the activity. Note: this aspect of an activity is not addressed by ActEV-PC.

For example:


Closing
  • Description: A person closing the door to a vehicle or facility.
  • Start: The event begins 1 s before the door starts to move.
  • End: The event ends after the door stops moving. A person closing a car door from within the car is performing a closing event as long as the person is still visible inside the car; if the person is not visible once they are in the car, the closing should not be annotated as an event.
  • Objects associated with the activity: Person; and Door or Vehicle


Vehicle_turning_left
  • Description: A vehicle turning left; left vs. right is determined from the point of view (POV) of the driver of the vehicle. The vehicle may not stop for more than 10 s during the turn.
  • Start: Annotation begins 1 s before the vehicle has noticeably changed direction.
  • End: Annotation ends 1 s after the vehicle is no longer changing direction and linear motion has resumed. Note: This event is determined after a reasonable interpretation of the video.
  • Objects associated with the activity: Vehicle


Loading
  • Description: An object moving from person to vehicle.
  • Start: The event begins 2 s before the cargo to be loaded is extended toward the vehicle (i.e., before a person’s posture changes from one of “carrying” to one of “loading”).
  • End: The event ends after the cargo is placed into the vehicle and the person-cargo contact is lost. In the event of occlusion, it ends when the loss of contact is visible.
  • Objects associated with the activity: Person; and Vehicle
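
For readers who prefer a structural view, the five defining elements map naturally onto a simple record type. The sketch below is a hypothetical illustration only (not a data format used by the evaluation), instantiated for the Closing activity above.

    # Hypothetical record type for the five elements of an activity
    # definition; not an official ActEV data format.
    from dataclasses import dataclass, field

    @dataclass
    class ActivityDefinition:
        name: str         # mnemonic handle for the activity
        description: str  # textual description
        begin_rule: str   # what determines the beginning time
        end_rule: str     # what determines the ending time
        required_objects: list = field(default_factory=list)  # not scored in ActEV-PC

    closing = ActivityDefinition(
        name="Closing",
        description="A person closing the door to a vehicle or facility.",
        begin_rule="The event begins 1 s before the door starts to move.",
        end_rule="The event ends after the door stops moving.",
        required_objects=["Person", "Door or Vehicle"],
    )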

List of Activities

The target activities are listed in the table below. The detailed definitions are in the Evaluation Plan.
Closing
Closing_trunk
Entering
Exiting
Loading
Open_Trunk
Opening
Transport_HeavyCarry
Unloading
Vehicle_turning_left
Vehicle_turning_right
Vehicle_u_turn
Pull
Riding
Talking
activity_carrying
specialized_talking_phone
specialized_texting_phone

The Activity Detection (AD) Task

Given a target activity, a system automatically detects and temporally localizes all instances of the activity. For a system-identified activity instance to be evaluated as correct, the type of activity must be correct and the temporal overlap with the reference instance must meet a minimal requirement, as described in the Evaluation Plan.
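
The official alignment and congruence rules are defined in the Evaluation Plan. Purely to illustrate the idea, a minimal check might compare the activity label and the one-dimensional temporal intersection-over-union of the two instances; the IoU measure and the 0.2 threshold below are invented placeholders, not the official rule.

    # Illustrative correctness check for one (system, reference) instance
    # pair. Spans are (begin_frame, end_frame); the 1-D IoU and the 0.2
    # threshold are placeholders, not the Evaluation Plan's actual rule.

    def temporal_iou(sys_span, ref_span):
        s0, s1 = sys_span
        r0, r1 = ref_span
        inter = max(0, min(s1, r1) - max(s0, r0))
        union = max(s1, r1) - min(s0, r0)
        return inter / union if union > 0 else 0.0

    def is_correct(sys_act, sys_span, ref_act, ref_span, min_iou=0.2):
        # Both the activity label and the temporal overlap must match.
        return sys_act == ref_act and temporal_iou(sys_span, ref_span) >= min_iou

    print(is_correct("Closing", (100, 250), "Closing", (120, 260)))  # True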

Activities in Extended Videos Prize Challenge (ActEV-PC)



Activity Annotated Video (4x speed and bounding boxes added for clarity).
Summary
The Activities in Extended Videos Prize Challenge (ActEV-PC) sought to encourage the development of robust automatic activity detection algorithms for extended videos. ActEV-PC is operated by NIST and sponsored by IARPA. Challenge participants developed algorithms that address the "Activity Detection Task", which requires systems to detect and temporally localize instances of 18 target activities in extended videos. Extended videos contain significant spans without any activities and intervals with potentially multiple concurrent activities. ActEV-PC had two phases – the ActEV-PC Open Leaderboard evaluation (Phase 1) and the ActEV-PC Independent evaluation (Phase 2).

ActEV-PC Open Leaderboard Evaluation (Phase 1)
For the ActEV-PC Open Leaderboard evaluation, challenge participants ran their activity detection software on their own compute hardware and submitted system output, defined by the Evaluation Plan, to the NIST ActEV Scoring Server. This phase served as a qualifying stage: the top 6 participants proceeded to Phase 2.

ActEV-PC Independent Evaluation (Phase 2)
For the ActEV-PC Independent evaluation, invited challenge participants submitted their runnable activity detection software to NIST using the Evaluation Command Line Interface Submission Instructions. NIST then evaluated system performance on sequestered data using NIST hardware.

Prizes
The developers of the most accurate activity detection algorithms were eligible to win one of three cash prizes from a total prize purse of $50,000 (USD).
  • First Place Prize – $25,000
  • Second Place Prize – $15,000
  • Third Place Prize – $10,000

To receive the full award payment, each winning team was required to send at least one representative to the CVPR'19 ActivityNet workshop for the challenge results announcement and to present the winning team's results at the workshop. The challenge was one of the tasks of the CVPR'19 ActivityNet workshop.

See the complete eligibility requirements in the Evaluation and Rules tab.

What Activities?
For ActEV, an activity is defined to be “one or more people performing a specified movement or interacting with an object or group of objects”. The activities that systems were required to detect in this challenge are defined on the Activities tab and in Section 2.5 of the Evaluation Plan.
How to Participate
To take part in the ActEV-PC evaluation, you needed to click Register above and then acknowledge that you had read and accepted the terms and conditions of the ActEV-PC Evaluation Plan and the VIRAT data license PDF, by clicking the two checkboxes to agree.

NIST then sent you information on how to download the data. After signing in (using the link above) with your registered account, you uploaded your JSON-formatted results (see Section 5 of the Evaluation Plan for format details) to the ActEV Scoring Server using the "Instructions" link (available on this web site after Sign In). The challenge then proceeded with the selection of 6 participants, who delivered their activity detection algorithms to NIST.
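
As a rough sketch of the shape of such a submission (field names follow the examples distributed with the public ActEV scoring software, but Section 5 of the Evaluation Plan is the authoritative reference; the file name, frame numbers, and scores below are invented):

    # Sketch of a system-output JSON in the general shape the scoring
    # server expects; consult Section 5 of the Evaluation Plan for the
    # authoritative specification. All values below are invented.
    import json

    output = {
        "filesProcessed": ["VIRAT_S_000000.mp4"],
        "activities": [
            {
                "activity": "Closing",
                "activityID": 1,
                "presenceConf": 0.89,  # system's detection confidence
                "localization": {
                    # keys are frame numbers: 1 marks where the instance
                    # begins, 0 marks where it ends
                    "VIRAT_S_000000.mp4": {"1230": 1, "1456": 0},
                },
            }
        ],
    }

    with open("system_output.json", "w") as f:
        json.dump(output, f, indent=2)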
Judging Criteria and Metrics
The main scoring metric was the weighted probability of missed detection at a rate of 0.15 false alarms per minute, averaged over activity types. A complete description of the metrics is provided in the Evaluation Plan. NIST also provides the scoring software for ActEV-PC on GitHub.
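
As a back-of-the-envelope illustration of that operating point (not the official scorer, which defines its own weighting, alignment, and interpolation): sort detections by confidence, lower the threshold until the false-alarm rate would exceed 0.15 per minute, read off the miss probability there, and average over activities.

    # Illustrative Pmiss @ 0.15 RFA for one activity, then a simple
    # unweighted mean over activities. The official ActEV scorer on
    # GitHub defines the exact weighting, alignment, and interpolation.

    def pmiss_at_rfa(detections, n_ref, total_minutes, target_rfa=0.15):
        """detections: (confidence, is_true_positive) pairs; n_ref is the
        number of reference instances of the activity in the test set."""
        tp = fa = 0
        pmiss = 1.0  # accepting no detections misses every instance
        for conf, is_tp in sorted(detections, key=lambda d: -d[0]):
            tp += is_tp
            fa += not is_tp
            if fa / total_minutes > target_rfa:
                break  # threshold now past the allowed false-alarm rate
            pmiss = (n_ref - tp) / n_ref
        return pmiss

    per_activity = {
        "Closing": pmiss_at_rfa([(0.9, True), (0.8, False), (0.7, True)],
                                n_ref=4, total_minutes=60),
        "Loading": pmiss_at_rfa([(0.95, True), (0.5, False)],
                                n_ref=3, total_minutes=60),
    }
    print(sum(per_activity.values()) / len(per_activity))  # mean Pmiss @ 0.15 RFA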
Task coordinator
ActEV NIST team (actev-nist@nist.gov)
Activities in Extended Videos Prize Challenge (ActEV-PC) Schedule

Phase 1 - ActEV-PC Open Leaderboard evaluation:

Nov 08, 2018: Account registration opens, evaluation plan released, and encrypted evaluation data available for download [evaluation over]
Nov 12, 2018: Evaluation data unlocked (decryption key published) [evaluation over]
Dec 12, 2018: Leaderboard open for submissions [evaluation over]
March 21, 2019 - 4:00 pm EST: Top six teams on the Leaderboard selected for participation in Phase 2 [evaluation over]

Phase 2 - ActEV-PC Independent evaluation:

March 29, 2019 (Friday): Challenge participants selected for Phase 2 deliver software conforming to the ActEV Evaluation CLI [evaluation over]
March 29, 2019 (Friday): NIST evaluates challenge participants’ code performance on sequestered data [evaluation over]
May 22, 2019: NIST reports results to IARPA [evaluation over]
May 30, 2019: Challenge winners announced [evaluation over]
June 03, 2019: Deadline for the reports by winners [evaluation over]
June 17, 2019: Challenge Workshop/Presentation of winning results (at the CVPR'19 ActivityNet workshop) [workshop over]
Data Licensing for the ActEV Prize Challenge Evaluation
New ActEV Teams:
  • Create a login account by registering (using the link above) for the ActEV Prize Challenge.
  • During account registration, you received the username and password for the data site.
Data Website URL: (requires a NIST-provided username and password)

Existing ActEV Participants:
  • The previously provided username and password gave you access to the data.

ActEV-PC Evaluation Plan and Participation Rules

Evaluation Plan
The Updated Evaluation Plan describes the evaluation task, metrics, scoring protocols, and system inputs and outputs. The Original Evaluation Plan is also available.
Who is Eligible to Participate?
To be eligible to win a prize under this competition, an individual or entity:
  1. Must have completed and submitted a registration form on https://actev.nist.gov/prizechallenge;
  2. Must meet any other account creation or registration requirements, such as creating an account on challenge.gov;
  3. Must have complied with all the requirements stated in these Rules and Terms/Conditions;
  4. Must agree to abide by the decisions of ODNI, NIST, and/or the individual judges, which shall be final and binding in all respects;
  5. Must agree to follow all applicable local, state, federal, and country-of-residence laws and regulations;
  6. Must be (1) an individual or team of individuals, each of whom is 18 years of age or over, or (2) a for-profit or non-profit entity organized or incorporated under law;
  7. May not be a federal entity or federal employee acting within the scope of their employment;
  8. Shall not be deemed ineligible because the individual or entity used Federal facilities or consulted with federal employees during a competition, if the facilities and employees are made available to all individuals and entities participating in the competition on an equitable basis;
  9. In the case of federal grantees, may not use federal funds to develop challenge applications unless consistent with the purpose of their grant award;
  10. In the case of federal contractors, may not use federal funds from a contract to develop challenge applications or to fund efforts in support of a challenge submission;
  11. May not be an employee of NIST or ODNI, any other individual or entity associated with the development, evaluation, or administration of the competition, a member of such persons’ immediate families (spouses, children, siblings, parents), or a person living in the same household as such persons, whether or not related;
  12. May not be a prime contractor or subcontractor (or their employees) of the IARPA DIVA program, due to its similarity to the ActEV prize challenge (however, they may still register under the “forgoing prizes” option, in order to compete and have their solutions posted on the leaderboard without the possibility of a cash prize); and
  13. Must not be currently on the Excluded Parties List (https://www.epls.gov/).
Warranties
By submitting an Entry, you represent and warrant that all information you submit is true and complete to the best of your knowledge, that you have the right and authority to submit the Entry on your own behalf or on behalf of the persons and entities that you specify within the Entry, and that your Entry (both the information and software submitted in the Entry and the underlying technologies or concepts described in the Entry):
  1. Is your own original work, or is submitted by permission with full and proper credit given within your Entry;
  2. Does not contain confidential information or trade secrets (yours or anyone else’s);
  3. Does not knowingly, after due inquiry (including, by way of example only and without limitation, reviewing the records of the United States Patent and Trademark Office and inquiring of any employees and other professionals retained with respect to such matters), violate or infringe upon the patent rights, industrial design rights, copyrights, trademarks, rights in technical data, rights of privacy, publicity, or other intellectual property or other rights of any person or entity;
  4. Does not contain malicious code, such as viruses, malware, timebombs, cancelbots, worms, Trojan horses, or other potentially harmful programs or other material or information;
  5. Does not violate any applicable law, statute, ordinance, rule, or regulation, including, without limitation, United States export laws and regulations, including, but not limited to, the International Traffic in Arms Regulations and the Department of Commerce Export Regulations; and
  6. Does not trigger any reporting or royalty or other obligation to any third party.
If the Submission includes any third-party works (such as third-party content, equipment, or open source code), the entrant must be able to provide, upon the request of ODNI or NIST, documentation of all appropriate licenses and releases for such third-party works. If the entrant cannot provide documentation of all required licenses and releases, ODNI and NIST reserve the right to disqualify the applicable Submission, or to seek to secure the licenses and releases for the benefit of ODNI and NIST and allow the applicable Submission to remain in the Competition. ODNI and NIST also reserve all rights with respect to claims based on any damages caused by a participant’s failure to obtain such licenses and releases.
Limitation of Liability
By participating in the Competition, you agree to assume any and all risks and to release, indemnify, and hold harmless ODNI, NIST, and Engility Corporation, each of the Judges, and the Subject Matter Experts from and against any injuries, losses, damages, claims, actions, and any liability of any kind (including attorneys’ fees) resulting from or arising out of your participation in, association with, or submission to the Competition (including any claims alleging that your Entry infringes, misappropriates, or violates any third party’s intellectual property rights). Participants agree that they will not seek compensation for any equipment, materials, supplies, information, travel, labor, and/or other Participant-provided services. In addition, you agree to waive claims against the Federal Government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from your participation in this Competition, whether the injury, death, damage, or loss arises through negligence or otherwise. Entrants are not required to obtain liability insurance or demonstrate financial responsibility in order to participate in the competition. ODNI, NIST, and Engility Corporation are not responsible for any miscommunications, such as technical failures related to computer, telephone, or cable connections; unavailable network or server connections; other technical failures related to hardware, software, or viruses; or incomplete or late Entries. ODNI and NIST are not responsible for:
  1. Any incorrect or inaccurate information, whether caused by a Participant, printing errors, or any of the equipment or programming associated with or used in the Competition;
  2. Unauthorized human intervention in any part of the Entry process for the Competition;
  3. Technical or human error that may occur in the administration of the Competition or the processing of Entries; or
  4. Any injury or damage to persons or property that may be caused, directly or indirectly, in whole or in part, by a Participant’s participation in the Competition or the receipt, use, or misuse of an Award.

If for any reason an Entry is confirmed to have been deleted erroneously, lost, or otherwise destroyed or corrupted, the Participant’s sole remedy is to submit another Entry in the Competition.
Additional Information
These rules cannot be modified except by ODNI or NIST. All decisions by ODNI or NIST regarding adherence to these rules are final. The invalidity or unenforceability of any provision of these rules shall not affect the validity or enforceability of any other provision. In the event that any provision is determined to be invalid or otherwise unenforceable or illegal, these rules shall otherwise remain in effect and shall be construed in accordance with their terms as if the invalid or illegal provision were not contained herein. ODNI and NIST reserve the right in their sole discretion to amend these rules throughout the duration of the contest should extenuating circumstances arise, to extend or modify the dates of the Competition, and to change the terms set forth herein governing any phases taking place after the effective date of any such change.

Payment Terms
Prize payments were provided by Engility Corporation. Prize winners needed to submit a W-9 tax form or a W-8BEN form in order to receive payment. Participants are responsible for all taxes incurred from the acceptance of prize funds. No payments were made to participants where US sanctions precluded it.
Cancellation
IARPA reserves the right to cancel the ActEV prize challenge due to low participation or indications from early results that insufficient progress is being made toward challenge goals. It was anticipated that at least five participants would be necessary to award any prize challenge payment.

Intellectual Property

Each entrant retains full ownership and title in and to their submission and expressly reserves all intellectual property rights not expressly granted under the challenge agreement. By participating in the challenge, each entrant hereby irrevocably grants to the Office of the Director of National Intelligence (ODNI), of which IARPA is a component, and to NIST a limited, non-exclusive, royalty-free, worldwide license and right to reproduce, publicly perform, publicly display, and use the submission for internal ODNI and NIST business and to the extent necessary to administer the challenge. By submitting an Entry, you grant a non-exclusive right and license to ODNI and NIST to use your name, likeness, biographical information, image, any other personal data submitted with your Entry, and the contents of your Entry (including any created works, such as YouTube® videos, but not including any App software submitted with or as part of your Entry) in connection with the Competition. You also agree that this license is perpetual and irrevocable.

You agree that nothing in this Notice grants you a right or license to use any names or logos of ODNI, IARPA, or NIST or any other intellectual property or proprietary rights of ODNI, IARPA, and NIST. You grant to ODNI and NIST the right to include your company or institution name and logo (if your Entry is from a company or institution) as a Participant on the Event Web site and in materials from IARPA announcing winners of or Participants in the Competition. Other than these uses or as otherwise set forth herein, you are not granting ODNI or NIST any rights to your trademarks.

ODNI and NIST will retain and use the system for a period of no longer than two years after submission or after the Participant ceases to participate in ActEV evaluations, whichever is later.

Any data or documentation that qualifies as business proprietary information, as defined by the Freedom of Information Act (5 U.S.C. § 552), and is properly marked as such will be treated as confidential and will be used only for the purposes of the ActEV test.




If you have any questions, please send an email to actev-nist@nist.gov.