ActEV Sequestered Data Leaderboard


ActEV Sequestered Data Leaderboard (SDL) Evaluation

Updates
  • March 01, 2020: The ActEV 2020 SDL opens with the expanded MEVA Test3 dataset
  • March 01, 2020: ActEV 2020 SDL is a guest task under ActivityNet CVPR'20 ActivityNet workshop
  • May 17, 2020 (extended from May 10, 2020) at 12:00 noon EST: New deadline for CLI submissions to be included in the ActEV SDL ActivityNet rankings. The top two submissions will be announced on June 1, 2020.
  • June 14, 2020: CVPR'20 ActivityNet workshop ActEV SDL guest task presentations
  • April 28th, 2020: CLI submissions deadline extended to May 17th, 2020 at 12:00 noon EST
Summary
The ActEV (Activities in Extended Video) Sequestered Data Leaderboard is an ongoing ranking of software systems that watch lengthy videos and detect activities of interest. Anyone can submit their system to NIST, which will then run the system on sequestered data, score the results and post the score to the leaderboard.
The sequestered data is from the MEVA dataset, which contains hours of video spanning indoor and outdoor scenes, night and day, crowds and individuals, captured by both EO (Electro-Optical) and IR (Infrared) sensors. Hours can go by with no activities, and then multiple activities happen simultaneously. The data is multi-camera: multiple cameras may be pointed at the same scene at the same time. Separate leaderboards are maintained for EO and IR videos.
What
Build and submit a software system that watches videos and detects if and when an activity of interest occurs. System runtime must be at most 1x the video length (real time) on the designated evaluation hardware.
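As a back-of-the-envelope check of that constraint (names here are illustrative; the leaderboard reports the same ratio in its "relative processing time" column):

```python
def relative_processing_time(runtime_seconds, video_seconds):
    """System runtime divided by total video duration.

    The SDL requires this ratio to be at most 1.0 (real time) on the
    designated hardware. Illustrative sketch only; the leaderboard
    reports the same ratio as "relative processing time".
    """
    return runtime_seconds / video_seconds

# 8 hours of processing for 10 hours of video is 0.8x real time:
print(relative_processing_time(8 * 3600, 10 * 3600))  # 0.8
```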
Who
Everyone. Anyone who registers can submit to the evaluation server.
How
Register here and then follow the instructions on the Algorithm Submission tab above. Systems must follow a NIST-defined command line interface and automatically run on NIST’s servers, both of which are described in the instructions.
Evaluation Task
Detect if and when an activity occurs. Given a target activity type and a set of videos, submitted systems must automatically detect all instances of the activity in the videos. While different activity instances have different durations, a submitted system is considered to have detected an activity if it correctly identifies at least 1 second of the activity.
Data
The data is from the Multiview Extended Video with Activities (MEVA) dataset, and the videos come from both EO (Electro-Optical) and IR (Infrared) sensors. The data used for SDL evaluation from March 01, 2020 is the expanded MEVA Test3 dataset. The public MEVA dataset includes hundreds of hours of data from the same cameras at the same facility, which can be used for training. You can download the public MEVA dataset for free at mevadata.org, and more information about the datasets is on the data tab. We also provide annotations for 20 hours of MEVA data; instructions on how to make and share activity annotations are at mevadata.org.
Metrics
Submitted activity detection systems must give a confidence score for each activity instance they detect. Detected activities are then thresholded on this confidence score. Varying the threshold trades off being sensitive enough to identify true activity instances (low threshold) against not raising false alarms when no activity is present (high threshold). Submitted systems are scored on both error types, measured by Probability of Missed Detection (Pmiss) and Time-based False Alarm (TFA). Pmiss is the proportion of activity instances for which the system did not detect the activity for at least 1 second. TFA is the proportion of time during which the system detected an activity when in fact there was none. Systems are scored for Pmiss and TFA at multiple thresholds, producing a detection error tradeoff (DET) curve. The leaderboard ranking of a system is based on a summary of its DET curve: the area under the DET curve across the TFA range from 0% to 20%, divided by 0.2 to normalize the value to [0, 1]. Lower numbers are better, as they reflect fewer errors. See the SDL evaluation plan for details, or check out the ActEV Scoring Software GitHub repo.
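As an illustration, the normalized partial AUDC can be sketched as follows, assuming the DET curve is supplied as (TFA, Pmiss) operating points. This is a minimal sketch, not the official scorer from the ActEV Scoring Software repo:

```python
import numpy as np

def partial_audc(tfa, pmiss, tfa_max=0.2):
    """Normalized partial area under the DET curve (illustrative
    sketch, not the official NIST scorer).

    tfa, pmiss: operating points (one per confidence threshold),
    sorted by increasing TFA. The Pmiss curve is integrated over
    TFA in [0, tfa_max] and divided by tfa_max, mapping the score
    to [0, 1], where lower is better.
    """
    tfa = np.asarray(tfa, dtype=float)
    pmiss = np.asarray(pmiss, dtype=float)
    # Resample Pmiss onto a fine grid over [0, tfa_max]; values
    # outside the measured TFA range are clamped to the endpoints.
    grid = np.linspace(0.0, tfa_max, 201)
    p = np.interp(grid, tfa, pmiss)
    # Trapezoidal integration, then normalization by the TFA range.
    area = np.sum((p[1:] + p[:-1]) * np.diff(grid)) / 2.0
    return area / tfa_max

# A perfect system (Pmiss = 0 everywhere) scores 0.0;
# a system that misses everything (Pmiss = 1) scores 1.0.
print(partial_audc([0.0, 0.1, 0.2], [0.0, 0.0, 0.0]))  # 0.0
print(partial_audc([0.0, 0.1, 0.2], [1.0, 1.0, 1.0]))  # 1.0
```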
Evaluation Plan
Task coordinator
ActEV NIST team (ActEV-nist@nist.gov)
News
March 01: ActEV 2020 SDL starts with expanded MEVA Test3 dataset
May 17: New deadline to submit for ActivityNet task

ActEV SDL Evaluation Schedule
March 01, 2020: ActEV 2020 SDL opens with MEVA Test3.

May 17, 2020 (extended from May 10, 2020) at 12:00 noon EST: New deadline for CLI submissions to be included in the ActEV guest task under the CVPR'20 ActivityNet workshop.

June 01, 2020: We will invite the top two teams on the ActEV 2020 SDL leaderboard, as of the CLI submission deadline (May 17, 2020, extended from May 10), to give ActEV guest task oral presentations at the CVPR'20 ActivityNet workshop.

June 14, 2020: CVPR'20 ActivityNet workshop ActEV SDL guest task presentations.

Leaderboard Remains Open: The leaderboard will remain open so that participants can show continued progress on this challenging problem.
ActEV SDL Dataset

Multiview Extended Video with Activities (MEVA)

The ActEV SDL evaluation is based on the Multiview Extended Video with Activities (MEVA) dataset (mevadata.org) collected at the Muscatatuck Urban Training Center with a team of over 100 actors performing in various scenarios. The data was built by the Intelligence Advanced Research Projects Activity (IARPA) Deep Intermodal Video Analytics (DIVA) program to support activity detection in multi-camera environments for both DIVA performers and the broader research community.
The MEVA dataset has two parts: the public training and development data and sequestered evaluation data used only by NIST to test systems. The data is accompanied by activity annotations.

Public Training and Development Data
The Multiview Extended Video with Activities (MEVA) dataset website, mevadata.org, hosts the public MEVA video dataset and its annotations. The public MEVA dataset comprises 333 hours of ground-camera and UAV video, of which 28 hours are annotated. ActEV participants are encouraged to annotate the MEVA KF1 dataset for the 37 activities as described at mevadata.org.


The MEVA data Git repo (https://gitlab.kitware.com/meva/meva-data-repo) is the data distribution mechanism for MEVA-related annotations and documentation. The repo presently contains schemas for the activity annotations.

The ActEV data Git repo (https://gitlab.kitware.com/actev/actev-data-repo) is the data distribution mechanism for the ActEV evaluation. The repo presently contains a collection of corpora and partition definition files to be used for the evaluations.

Sequestered Evaluation Data
As of March 2020, NIST is using a 140-hour collection of annotated MEVA data for sequestered data evaluations. The data set includes both EO and IR cameras, among them public cameras (examples of which appear in the public data set). The leaderboard presents results on the full 140-hour collection, reported separately for EO and IR data. Developers receive additional per-activity scores for the EO_subset1 and IR_subset1; both subsets consist of data from public cameras.
ActEV SDL Leaderboard

SDL20-scoring-EO

Updated: 2020-05-25 01:17:01 -0400
RANK SCORING_SUBMISSION_ID SCORING REQUEST NAME TEAM NAME SUBMISSION ID SUBMISSION DATE SYSTEM NAME SYSTEM ID SCORING PROTOCOL PARTIAL AUDC* TIME LIMITED PARTIAL AUDC* RELATIVE PROCESSING TIME DETECTED ACTIVITY TYPES† PROCESSED FILES‡ MEAN-P_MISS@0.04TFA TIME LIMITED MEAN-P_MISS@0.04TFA
1 16601 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00328_VUS_20200511-132633-4864.sr-20200511-132633-7646 VUS submission_description/16259|16259 2020-05-08 VUS-V1 328 ActEV_SDL_V1 0.41576 0.50368 1.344 100% 100% 0.47767 0.54197
2 17473 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00326_UMD_20200520-155344-4251.sr-20200520-155344-7345 UMD submission_description/16984|16984 2020-05-15 UMD+UCF 326 ActEV_SDL_V1 0.42339 0.51434 1.226 97% 100% 0.49598 0.56691
3 16256 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00326_UMD_20200508-110531-4667.sr-20200508-110531-7901 UMD submission_description/15747|15747 2020-04-30 UMD+UCF 326 ActEV_SDL_V1 0.42166 0.51806 1.253 100% 100% 0.50700 0.57633
4 17319 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00282_Team-Vision_20200519-012627-9123.sr-20200519-012628-2507 Team_Vision submission_description/16275|16275 2020-05-08 STARK 282 ActEV_SDL_V1 0.53722 0.53735 1.002 100% 100% 0.62601 0.62568
5 16200 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00282_Team-Vision_20200507-153946-0409.sr-20200507-153946-4286 Team_Vision submission_description/16060|16060 2020-05-05 STARK 282 ActEV_SDL_V1 0.53722 0.53735 1.003 100% 100% 0.62601 0.62567
6 17325 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00282_Team-Vision_20200519-030728-0090.sr-20200519-030728-3599 Team_Vision submission_description/16705|16705 2020-05-12 STARK 282 ActEV_SDL_V1 0.63015 0.64861 1.096 172% 100% 0.71891 0.73204
7 15026 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200320-131358-4621.sr-20200320-131358-6150 UCF submission_description/13949|13949 2020-03-11 UCF-P 270 ActEV_SDL_V1 0.47410 0.333 97% 100% 0.54346
8 14819 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00281_UMD_20200319-091913-4675.sr-20200319-091913-7008 UMD submission_description/14296|14296 2020-03-13 UMD 281 ActEV_SDL_V1 0.47371 0.703 100% 100% 0.56423
9 14577 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00283_INF_20200317-124405-8174.sr-20200317-124406-4388 INF submission_description/14480|14480 2020-03-16 INF_MEVA1 283 ActEV_SDL_V1 0.43371 0.527 97% 98% 0.50320
10 14669 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200318-053708-4851.sr-20200318-053708-6369 UCF submission_description/14578|14578 2020-03-17 UCF-P 270 ActEV_SDL_V1 0.47410 0.296 97% 100% 0.54346
11 14857 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200319-153022-2982.sr-20200319-153022-5094 UCF submission_description/14760|14760 2020-03-18 UCF-P 270 ActEV_SDL_V1 0.44214 0.349 100% 100% 0.53583
12 14963 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200320-041131-0946.sr-20200320-041131-2570 UCF submission_description/14858|14858 2020-03-19 UCF-P 270 ActEV_SDL_V1 0.44261 0.346 100% 100% 0.53297
13 15224 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00312_IBM-MIT-Purdue_20200415-114050-1365.sr-20200415-114050-6403 IBM-MIT-Purdue submission_description/15129|15129 2020-04-10 Purdue 312 ActEV_SDL_V1 0.64832 0.148 100% 100% 0.73288
14 15414 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00307_INF_20200421-115525-8081.sr-20200421-115526-4730 INF submission_description/15132|15132 2020-04-13 INF_MEVA_IOD 307 ActEV_SDL_V1 0.42021 0.848 97% 98% 0.49086
15 15413 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00312_IBM-MIT-Purdue_20200421-105807-5524.sr-20200421-105807-8721 IBM-MIT-Purdue submission_description/15320|15320 2020-04-18 Purdue 312 ActEV_SDL_V1 0.74381 0.047 100% 100% 0.85580
16 15986 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200504-164235-0111.sr-20200504-164235-3032 UCF submission_description/15544|15544 2020-04-27 UCF-P 270 ActEV_SDL_V1 0.44552 0.356 97% 100% 0.50857
17 16614 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00312_IBM-MIT-Purdue_20200511-173821-3000.sr-20200511-173821-6176 IBM-MIT-Purdue submission_description/15562|15562 2020-04-28 Purdue 312 ActEV_SDL_V1 0.74647 0.047 100% 100% 0.85376
18 16135 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00329_vireoJD-MM_20200506-100639-0175.sr-20200506-100639-4400 vireoJD-MM submission_description/15735|15735 2020-04-30 Vireo 329 ActEV_SDL_V1 0.54823 0.160 100% 96% 0.67862
19 16051 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00288_INF_20200505-093715-5641.sr-20200505-093716-2346 INF submission_description/15846|15846 2020-05-01 meva_inf2 288 ActEV_SDL_V1 0.41775 0.927 97% 100% 0.48830
20 16257 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200508-110705-2778.sr-20200508-110705-5402 UCF submission_description/15949|15949 2020-05-03 UCF-P 270 ActEV_SDL_V1 0.46991 0.480 100% 100% 0.57272
21 16557 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00290_INF_20200510-225720-2108.sr-20200510-225720-7177 INF submission_description/16142|16142 2020-05-06 set3 290 ActEV_SDL_V1 0.48045 0.559 97% 100% 0.54064
22 16558 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200510-225902-1029.sr-20200510-225902-4001 UCF submission_description/16212|16212 2020-05-07 UCF-P 270 ActEV_SDL_V1 0.44625 0.317 94% 100% 0.51932
23 17057 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00298_INF_20200516-013223-2420.sr-20200516-013223-9916 INF submission_description/16309|16309 2020-05-08 speed1x 298 ActEV_SDL_V1 0.39595 0.498 97% 100% 0.47110
24 17346 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200519-111655-5218.sr-20200519-111655-8039 UCF submission_description/16386|16386 2020-05-09 UCF-P 270 ActEV_SDL_V1 0.48704 0.305 94% 100% 0.57566
25 17008 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00331_vireoJD-MM_20200515-163029-9498.sr-20200515-163030-4115 vireoJD-MM submission_description/16445|16445 2020-05-09 vireo2 331 ActEV_SDL_V1 0.56415 0.163 100% 96% 0.69369
26 17151 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00270_UCF_20200516-215958-2007.sr-20200516-215958-5118 UCF submission_description/16743|16743 2020-05-12 UCF-P 270 ActEV_SDL_V1 0.38123 0.581 94% 100% 0.43576
27 17090 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00312_IBM-MIT-Purdue_20200516-094544-8275.sr-20200516-094545-1616 IBM-MIT-Purdue submission_description/16902|16902 2020-05-14 Purdue 312 ActEV_SDL_V1 0.54174 0.080 97% 100% 0.67248
28 17326 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00334_VUS_20200519-030816-7908.sr-20200519-030817-0670 VUS submission_description/16908|16908 2020-05-14 VUS-V1-FAST 334 ActEV_SDL_V1 0.42462 0.829 100% 100% 0.49021
29 17387 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00312_IBM-MIT-Purdue_20200519-233350-8538.sr-20200519-233351-2525 IBM-MIT-Purdue submission_description/16909|16909 2020-05-14 Purdue 312 ActEV_SDL_V1 0.57399 0.080 97% 100% 0.70339
30 15300 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00263_NIST-TEST_20200416-142124-7336.sr-20200416-142125-0703 NIST-TEST submission_description/13021|13021 2020-03-05 NIST Test 263 ActEV_SDL_V1 0.47278 0.822 97% 100% 0.55971
31 17477 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00337_vireoJD-MM_20200520-200345-0804.sr-20200520-200345-4326 vireoJD-MM submission_description/17093|17093 2020-05-16 vireo3 337 ActEV_SDL_V1 0.54630 0.149 100% 96% 0.67944
32 14009 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00282_Team-Vision_20200312-162922-5840.sr-20200312-162922-7715 Team_Vision submission_description/13117|13117 2020-03-08 STARK 282 ActEV_SDL_V1 0.70461 0.685 97% 100% 0.77328
33 14202 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00298_INF_20200313-132531-6559.sr-20200313-132532-1240 INF submission_description/13315|13315 2020-03-10 speed1x 298 ActEV_SDL_V1 0.47186 0.615 97% 100% 0.55191
34 15027 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00282_Team-Vision_20200320-133708-7523.sr-20200320-133708-9190 Team_Vision submission_description/13533|13533 2020-03-10 STARK 282 ActEV_SDL_V1 0.70641 0.681 97% 98% 0.77357
35 14856 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00281_UMD_20200319-133322-4135.sr-20200319-133322-6300 UMD submission_description/13739|13739 2020-03-11 UMD 281 ActEV_SDL_V1 0.47258 0.684 97% 100% 0.55999
36 14483 ActEV-2018_AD_ActEV19-SDL-Scoring_SYS-00298_INF_20200316-112137-9583.sr-20200316-112140-4511 INF submission_description/13770|13770 2020-03-11 speed1x 298 ActEV_SDL_V1 0.47186 0.613 97% 100% 0.55191


SDL19-scoring-EO

Updated: 2020-04-30 16:10:26 -0400
RANK TEAM NAME SUBMISSION ID SUBMISSION DATE SYSTEM NAME PARTIAL AUDC* TIME LIMITED PARTIAL AUDC* RELATIVE PROCESSING TIME DETECTED ACTIVITY TYPES† PROCESSED FILES‡ MEAN-P_MISS@0.04TFA TIME LIMITED MEAN-P_MISS@0.04TFA
1 UMD submission_description/11045|11045 2020-01-24 UMD+UCF 0.41585 0.65517 1.154 100% 100% 0.49360 0.67902
2 UCF submission_description/10640|10640 2020-01-21 UCF-P 0.43810 0.362 100% 100% 0.52356
3 UCF submission_description/12650|12650 2020-02-25 UCF-P 0.44047 0.364 97% 100% 0.54081
4 UCF submission_description/9105|9105 2020-01-15 UCF-P 0.44056 0.424 97% 100% 0.52513
5 UCF submission_description/12222|12222 2020-02-11 UCF-P 0.44198 0.395 100% 100% 0.53514
6 UCF submission_description/8328|8328 2020-01-10 UCF-P 0.44360 0.403 100% 100% 0.52972
7 UCF submission_description/10821|10821 2020-01-22 UCF-P 0.44473 0.407 100% 100% 0.54228
8 UCF submission_description/11849|11849 2020-02-02 UCF-P 0.45165 0.378 100% 100% 0.51752
9 UMD submission_description/5595|5595 2019-12-06 UMD 0.47596 0.725 97% 100% 0.54479
10 UMD submission_description/12468|12468 2020-02-18 UMD 0.47611 0.849 97% 100% 0.55945
11 UMD submission_description/8222|8222 2020-01-10 UMD 0.47668 0.68909 1.260 100% 100% 0.54492 0.71689
12 UMD submission_description/8198|8198 2020-01-09 UMD 0.48037 0.68601 1.134 97% 100% 0.54825 0.71585
13 UCF submission_description/7728|7728 2020-01-04 UCF-P 0.48290 0.438 94% 100% 0.51235
14 UMD submission_description/7992|7992 2020-01-08 UMD 0.48333 0.68298 1.098 97% 100% 0.55028 0.71410
15 INF submission_description/11911|11911 2020-02-06 set3 0.48978 0.646 97% 100% 0.55928
16 UMD submission_description/10000|10000 2020-01-19 UMD 0.49061 0.803 100% 100% 0.56996
17 UMD submission_description/7472|7472 2020-01-02 UMD 0.49689 0.61793 1.042 97% 93% 0.55515 0.65471
18 UCF submission_description/7235|7235 2019-12-30 UCF-P 0.51676 0.347 97% 100% 0.55164
19 UCF submission_description/6499|6499 2019-12-18 UCF-P 0.52068 0.333 97% 100% 0.57885
20 UMD submission_description/4809|4809 2019-11-15 UMD 0.53144 0.965 100% 98% 0.60011
21 UMD submission_description/3666|3666 2019-10-10 UMD 0.53147 0.970 100% 98% 0.60101
22 UMD submission_description/6285|6285 2019-12-12 UMD 0.53459 0.723 100% 98% 0.59596
23 INF submission_description/12068|12068 2020-02-09 speed1x 0.53654 0.642 97% 94% 0.59701
24 INF submission_description/11850|11850 2020-02-03 meva_inf2 0.59044 0.960 94% 97% 0.64252
25 UCF submission_description/5851|5851 2019-12-09 UCF-P 0.60431 0.293 97% 100% 0.64157
26 Edge-Intelligence submission_description/8003|8003 2020-01-08 Edge-Intelligence 0.62842 0.939 97% 100% 0.75488
27 IBM-MIT-Purdue submission_description/6513|6513 2019-12-18 Purdue 0.64182 0.272 100% 100% 0.73321
28 IBM-MIT-Purdue submission_description/11346|11346 2020-01-26 Purdue 0.64182 0.128 100% 100% 0.73320
29 IBM-MIT-Purdue submission_description/11345|11345 2020-01-26 Purdue 0.64182 0.125 100% 100% 0.73320
30 Edge-Intelligence submission_description/7845|7845 2020-01-06 Edge_Intelligence 0.64737 0.887 94% 100% 0.76268
31 UMD submission_description/2074|2074 2019-08-16 UMD 0.65325 0.744 97% 98% 0.73860
32 IBM-MIT-Purdue submission_description/6815|6815 2019-12-20 Purdue 0.65351 0.315 100% 100% 0.73730
33 IBM-MIT-Purdue submission_description/5004|5004 2019-11-25 Purdue 0.65503 0.124 100% 98% 0.73745
34 IBM-MIT-Purdue submission_description/5005|5005 2019-11-25 Purdue 0.65913 0.124 100% 98% 0.73971
35 IBM-MIT-Purdue submission_description/6501|6501 2019-12-18 Purdue 0.66055 0.442 100% 100% 0.74490
36 IBM-MIT-Purdue submission_description/4660|4660 2019-11-11 Purdue 0.66383 0.087 100% 98% 0.74381
37 IBM-MIT-Purdue submission_description/11343|11343 2020-01-26 Purdue 0.66394 0.038 100% 98% 0.74397
38 IBM-MIT-Purdue submission_description/11344|11344 2020-01-26 Purdue 0.66394 0.035 100% 98% 0.74397
39 IBM-MIT-Purdue submission_description/11268|11268 2020-01-26 Purdue 0.66496 0.124 100% 100% 0.74147
40 IBM-MIT-Purdue submission_description/6303|6303 2019-12-12 Purdue 0.66900 0.096 100% 100% 0.75098
41 IBM-MIT-Purdue submission_description/6547|6547 2019-12-18 Purdue 0.67114 0.099 100% 100% 0.75100
42 UMD submission_description/1907|1907 2019-08-13 UMD 0.67831 0.727 0.74917
43 IBM-MIT-Purdue submission_description/4999|4999 2019-11-25 Purdue 0.67912 0.098 100% 98% 0.76349
44 INF submission_description/11670|11670 2020-01-28 INF_MEVA1 0.67983 0.928 100% 100% 0.78032
45 IBM-MIT-Purdue submission_description/5001|5001 2019-11-25 Purdue 0.68547 0.099 100% 98% 0.76606
46 IBM-MIT-Purdue submission_description/11188|11188 2020-01-25 Purdue 0.69501 0.108 100% 98% 0.75741
47 UCF submission_description/5596|5596 2019-12-06 UCF-P 0.69848 0.304 86% 100% 0.73165
48 IBM-MIT-Purdue submission_description/6443|6443 2019-12-17 Purdue 0.70117 0.82690 1.072 100% 100% 0.77819 0.87029
49 IBM-MIT-Purdue submission_description/4876|4876 2019-11-19 Purdue 0.70382 0.088 97% 98% 0.76211
50 Team_Vision submission_description/6865|6865 2019-12-20 STARK 0.71736 0.793 97% 100% 0.77673
51 Team_Vision submission_description/4434|4434 2019-11-05 STARK 0.71911 0.774 97% 100% 0.77884
52 UCF submission_description/5439|5439 2019-12-04 UCF-P 0.72830 0.304 86% 100% 0.75572
53 IBM-MIT-Purdue submission_description/8154|8154 2020-01-09 Purdue 0.74090 0.380 100% 100% 0.82467
54 IBM-MIT-Purdue submission_description/9978|9978 2020-01-18 Purdue 0.74091 0.145 100% 100% 0.82468
55 UCF submission_description/5077|5077 2019-11-27 UCF-P 0.74243 0.312 86% 100% 0.77197
56 IBM-MIT-Purdue submission_description/4228|4228 2019-10-31 Purdue 0.75410 0.093 97% 98% 0.82355
57 IBM-MIT-Purdue submission_description/4520|4520 2019-11-08 Purdue 0.75509 0.087 100% 98% 0.82359
58 IBM-MIT-Purdue submission_description/10037|10037 2020-01-19 Purdue 0.79701 0.151 86% 100% 0.84131
59 IBM-MIT-Purdue submission_description/4179|4179 2019-10-30 Purdue 0.81573 0.093 97% 98% 0.86015
60 IBM-MIT-Purdue submission_description/10041|10041 2020-01-19 Purdue 0.82020 0.143 100% 100% 0.88273
61 IBM-MIT-Purdue submission_description/3830|3830 2019-10-21 Purdue 0.82025 0.246 100% 97% 0.86425
62 Team_Vision submission_description/4071|4071 2019-10-29 STARK 0.82121 0.757 75% 100% 0.85712
63 Team_Vision submission_description/8506|8506 2020-01-12 STARK 0.82442 0.739 94% 58% 0.85486
64 Team_Vision submission_description/3557|3557 2019-10-08 STARK 0.83722 0.723 78% 100% 0.88187
65 IBM-MIT-Purdue submission_description/10038|10038 2020-01-19 Purdue 0.84098 0.144 81% 100% 0.87514
66 Team_Vision submission_description/3382|3382 2019-09-23 STARK 0.84258 0.726 81% 100% 0.88996
67 UCF submission_description/3472|3472 2019-09-26 UCF-P 0.84269 0.292 100% 100% 0.89750
68 INF submission_description/8367|8367 2020-01-11 set3 0.84990 0.88917 1.332 89% 85% 0.89547 0.92076
69 IBM-MIT-Purdue submission_description/8290|8290 2020-01-10 Purdue 0.85131 0.301 100% 100% 0.88623
70 INF submission_description/7212|7212 2019-12-29 INF_MEVA1 0.85864 0.922 89% 100% 0.87950
71 INF submission_description/6967|6967 2019-12-23 set3 0.86506 0.911 89% 100% 0.88947
72 IBM-MIT-Purdue submission_description/10973|10973 2020-01-24 Purdue 0.86613 0.151 72% 100% 0.88674
73 IBM-MIT-Purdue submission_description/10972|10972 2020-01-24 Purdue 0.86613 0.152 72% 100% 0.88674
74 IBM-MIT-Purdue submission_description/10896|10896 2020-01-23 Purdue 0.86614 0.143 72% 100% 0.88673
75 IBM-MIT-Purdue submission_description/9866|9866 2020-01-17 Purdue 0.86887 0.142 72% 100% 0.88733
76 UCF submission_description/3104|3104 2019-09-10 UCF-P 0.87068 0.291 97% 100% 0.91260
77 INF submission_description/6965|6965 2019-12-23 INF_MEVA1 0.87216 0.906 89% 100% 0.89303
78 IBM-MIT-Purdue submission_description/10040|10040 2020-01-19 Purdue 0.87422 0.147 70% 100% 0.89421
79 Team_Vision submission_description/2694|2694 2019-08-27 STARK 0.87562 0.91641 1.795 86% 80% 0.90833 0.92705
80 IBM-MIT-Purdue submission_description/9873|9873 2020-01-17 Purdue 0.87606 0.139 70% 100% 0.89165
81 UCF submission_description/4510|4510 2019-11-06 UCF-P 0.87657 0.293 97% 100% 0.91490
82 INF submission_description/7152|7152 2019-12-26 speed1x 0.88292 0.989 89% 86% 0.89853
83 Team_Vision submission_description/12208|12208 2020-02-11 STARK 0.89021 0.783 94% 36% 0.89837
84 UCF submission_description/2014|2014 2019-08-15 UCF-P 0.89321 0.236 91% 81% 0.92557
85 Team_Vision submission_description/12319|12319 2020-02-12 STARK 0.89465 0.841 97% 26% 0.89748
86 UCF submission_description/4060|4060 2019-10-25 UCF-P 0.89623 0.273 97% 100% 0.92032
87 Edge-Intelligence submission_description/7427|7427 2020-01-02 Edge_Intelligence 0.92170 0.736 13% 96% 0.95539
88 INF submission_description/3276|3276 2019-09-13 INF_MEVA_IOD 0.93009 0.588 94% 91% 0.94479
89 INF submission_description/8273|8273 2020-01-10 INF_MEVA1 0.93512 0.95304 1.054 59% 100% 0.93680 0.95338
90 INF submission_description/2928|2928 2019-08-29 INF_MEVA1 0.97344 0.788 89% 98% 0.98175
91 IBM-MIT-Purdue submission_description/8278|8278 2020-01-10 Purdue 0.97569 0.361 29% 100% 0.98028
92 INF submission_description/7151|7151 2019-12-26 meva_inf2 0.97762 0.931 83% 100% 0.97655
93 IBM-MIT-Purdue submission_description/8287|8287 2020-01-10 Purdue 0.97778 0.361 24% 100% 0.98142
94 IBM-MIT-Purdue submission_description/8284|8284 2020-01-10 Purdue 0.97778 0.374 24% 100% 0.98142
95 IBM-MIT-Purdue submission_description/8288|8288 2020-01-10 Purdue 0.97778 0.359 24% 100% 0.98142
96 IBM-MIT-Purdue submission_description/8286|8286 2020-01-10 Purdue 0.97778 0.357 24% 100% 0.98142
97 IBM-MIT-Purdue submission_description/8289|8289 2020-01-10 Purdue 0.97778 0.370 24% 100% 0.98142
98 INF submission_description/4959|4959 2019-11-21 meva_inf2 0.98157 0.98830 1.703 83% 89% 0.98886 0.99250
99 DIVA TE Baseline submission_description/1786|1786 2019-08-02 RC3D 0.98216 0.959 29% 100% 0.98722
100 INF submission_description/3163|3163 2019-09-12 speed1x 0.98315 0.860 78% 94% 0.98496
101 Team_Vision submission_description/8299|8299 2020-01-10 STARK 0.99165 0.061 83% 86% 0.99143
102 Team_Vision submission_description/8295|8295 2020-01-10 STARK 0.99211 0.060 72% 96% 0.99190
103 DIVA TE Baseline submission_description/7911|7911 2020-01-07 RC3D 0.99255 0.99552 2.791 27% 75% 0.99302 0.99521
104 DIVA TE Baseline submission_description/7928|7928 2020-01-07 RC3D-WHEEL 0.99608 0.99608 2.715 27% 37% 0.99580 0.99580
105 INF submission_description/2682|2682 2019-08-27 INF_MEVA1 1.00000 0.000 0% 100% 1.00000
106 Team_Vision submission_description/8313|8313 2020-01-10 STARK 1.00000 0.049 0% 79% 1.00000
107 IBM-MIT-Purdue submission_description/6056|6056 2019-12-11 Purdue 1.00000 0.037 0% 100% 1.00000
108 IBM-MIT-Purdue submission_description/6055|6055 2019-12-11 Purdue 1.00000 0.037 0% 98% 1.00000
109 IBM-MIT-Purdue submission_description/6000|6000 2019-12-10 Purdue 1.00000 0.037 0% 98% 1.00000
110 Jay Chou submission_description/5904|5904 2019-12-10 Jack 1.00000 0.179 0% 0% 1.00000
111 IBM-MIT-Purdue submission_description/5818|5818 2019-12-09 Purdue 1.00000 0.037 0% 98% 1.00000
112 WeiLai submission_description/5817|5817 2019-12-09 Tryyitry 1.00000 0.177 0% 0% 1.00000
113 WeiLai submission_description/5651|5651 2019-12-07 Tryyitry 1.00000 0.178 0% 0% 1.00000
114 WeiLai submission_description/5576|5576 2019-12-05 Tryyitry 1.00000 0.178 0% 0% 1.00000
115 IBM-MIT-Purdue submission_description/5506|5506 2019-12-04 Purdue 1.00000 0.039 0% 98% 1.00000
116 IBM-MIT-Purdue submission_description/11642|11642 2020-01-27 Purdue 1.00000 0.016 0% 100% 1.00000
117 WeiLai submission_description/5452|5452 2019-12-04 Tryyitry 1.00000 0.178 0% 0% 1.00000
118 IBM-MIT-Purdue submission_description/5384|5384 2019-12-02 Purdue 1.00000 0.039 0% 98% 1.00000
119 INF submission_description/4513|4513 2019-11-08 INF_MEVA1 1.00000 0.039 0% 98% 1.00000


SDL19-scoring-IR

Updated: 2020-05-25 01:16:57 -0400
RANK SCORING_SUBMISSION_ID SCORING REQUEST NAME TEAM NAME SUBMISSION ID SUBMISSION DATE SYSTEM NAME SYSTEM ID SCORING PROTOCOL PARTIAL AUDC* TIME LIMITED PARTIAL AUDC* RELATIVE PROCESSING TIME DETECTED ACTIVITY TYPES† PROCESSED FILES‡ MEAN-P_MISS@0.04TFA TIME LIMITED MEAN-P_MISS@0.04TFA

Activities for the ActEV Sequestered Data Leaderboard

We have updated the activity names for the SDL and intend for these names to be used for the duration of the DIVA program. The list below shows the "ActEV 2020 SDL Activity Name" and the deprecated "ActEV 2019 SDL Activity Name" for each of the 37 activities to be detected in the ActEV SDL evaluation. Detailed activity definitions are in the ActEV Annotation Definitions for MEVA Data document. Note that current SDL scoring is based exclusively on activity detection; object detection is not considered. The name mapping is also available as activity-name-mapping.csv.

ActEV SDL Activities


ActEV 2020 SDL Activity Name ActEV 2019 SDL Activity Name (DEPRECATED)
person_abandons_package abandon_package
person_closes_facility_door person_closes_facility_door
person_closes_trunk Closing_Trunk
person_closes_vehicle_door person_closes_vehicle_door
person_embraces_person person_person_embrace
person_enters_scene_through_structure person_enters_through_structure
person_enters_vehicle person_enters_vehicle
person_exits_scene_through_structure person_exits_through_structure
person_exits_vehicle person_exits_vehicle
hand_interacts_with_person hand_interaction
person_carries_heavy_object Transport_HeavyCarry
person_interacts_with_laptop person_laptop_interaction
person_loads_vehicle person_loads_vehicle
person_transfers_object object_transfer
person_opens_facility_door person_opens_facility_door
person_opens_trunk Open_Trunk
person_opens_vehicle_door person_opens_vehicle_door
person_talks_to_person Talking
person_picks_up_object person_picks_up_object
person_purchases person_purchasing
person_reads_document person_reading_document
person_rides_bicycle Riding
person_puts_down_object person_sets_down_object
person_sits_down person_sitting_down
person_stands_up person_standing_up
person_talks_on_phone specialized_talking_phone
person_texts_on_phone specialized_texting_phone
person_steals_object theft
person_unloads_vehicle Unloading
vehicle_drops_off_person vehicle_drops_off_person
vehicle_picks_up_person vehicle_picks_up_person
vehicle_reverses vehicle_reversing
vehicle_starts vehicle_starting
vehicle_stops vehicle_stopping
vehicle_turns_left vehicle_turning_left
vehicle_turns_right vehicle_turning_right
vehicle_makes_u_turn vehicle_u_turn
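For teams migrating 2019-era annotations, a mapping like the table above can be loaded from the CSV. This is a hedged sketch: it assumes a two-column file (2020 name, then 2019 name) with one header row, which may differ from the actual layout of activity-name-mapping.csv:

```python
import csv

def load_activity_name_map(path):
    """Map deprecated ActEV 2019 SDL activity names to 2020 names.

    Assumes a two-column CSV (2020 name, 2019 name) mirroring the
    table above, with a single header row; the real
    activity-name-mapping.csv layout may differ.
    """
    mapping = {}
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for new_name, old_name in reader:
            mapping[old_name.strip()] = new_name.strip()
    return mapping

# Example: rewrite a 2019-era label to its 2020 name.
# mapping = load_activity_name_map("activity-name-mapping.csv")
# mapping["Closing_Trunk"]  ->  "person_closes_trunk"
```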
Task for the ActEV Sequestered Data Leaderboard

In the SDL evaluation, there is one Activity Detection (AD) task for detecting and temporally localizing activities.


Activity Detection (AD)
For the Activity Detection task, given a target activity, a system automatically detects and temporally localizes all instances of the activity. For a system-identified activity instance to be evaluated as correct, the activity type must be correct and the detection must temporally overlap the true activity instance by at least one second. Additional details may be found in the SDL Evaluation Plan.
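The one-second rule can be sketched as follows (an illustrative check only; the official instance alignment is implemented in the ActEV scoring software):

```python
def is_correct_detection(ref, hyp, min_overlap=1.0):
    """Return True if a system-detected instance matches a reference one.

    ref and hyp are (activity_type, start_sec, end_sec) tuples.
    The rule sketched here: the activity type must match, and the two
    intervals must overlap for at least `min_overlap` seconds
    (one second, per the evaluation plan). Illustrative only; the
    official alignment logic lives in the ActEV scoring software.
    """
    ref_type, ref_start, ref_end = ref
    hyp_type, hyp_start, hyp_end = hyp
    if ref_type != hyp_type:
        return False
    overlap = min(ref_end, hyp_end) - max(ref_start, hyp_start)
    return overlap >= min_overlap

# A 3-second detection inside a 10-second activity counts as correct:
print(is_correct_detection(
    ("person_opens_trunk", 100.0, 110.0),
    ("person_opens_trunk", 104.0, 107.0)))  # True
```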
Algorithm Delivery for the SDL participants

Systems delivered to the leaderboard must implement the ActEV Command Line Interface (ActEV CLI) and be submitted to NIST for testing. The CLI implementation you provide formalizes the entire process of evaluating a system by giving the evaluation team a means to: (1) download and install your software via a single URL, (2) verify that the delivery works and produces output consistent with the output you produce locally, and (3) process a large collection of video in a fault-tolerant, parallelizable manner.

To complete this task you will need the following items described in detail below:

  1. FAQ - Validation Phase Processing
  2. CLI Description
  3. The Abstract CLI Git Repository
  4. The CLI Implementation Primer
  5. The Validation Data Set
  6. Example CLI-Compliant Implementation
  7. NIST Independent Evaluation Infrastructure Specification
  8. SDL Submission Processing Pipeline

1. FAQ - Validation Phase Processing

The ActEV SDL - Validation Phase Processing

FAQ

2. CLI Description

The ActEV CLI description

3. The Abstract CLI Git Repository

The Abstract CLI Git repository contains the documentation for the interface.

4. The CLI Implementation Primer

There are six steps to adapt your code to the CLI. The ActEV Evaluation CLI Programming Primer describes how to clone the Abstract CLI and begin adapting the code to your implementation.

5. The Validation Data Set

As mentioned above, the CLI is used to verify that the downloaded software is correctly installed and produces the same output at NIST as you produce locally. To that end, we provide a small validation data set (ActEV-Eval-CLI-Validation-Set3) as part of the ActEV SDL Dataset, which is processed both at your site and at NIST. Please use this data in Step 3 of the Primer described above.

6. Example CLI-Compliant Implementation

The links below provide two example CLI implementations for the leaderboard baseline algorithm:

7. NIST Independent Evaluation Infrastructure Specification

NIST will begin installing your system from a fresh Ubuntu 18.04 cloud image available from https://cloud-images.ubuntu.com/releases/18.04/release/ on the following hardware.

  • Chassis: Asus ESC4000 G4S
  • CPU: 2 x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz (12 cores/CPU)
  • Motherboard: Asus Intel® C621 PCH chipset
  • HDD/SSD: 2x 1.92TB Intel SSD DC S4500
  • RAM: 12x 16GB DDR4-2400 ECC RDIMM
  • GPU: Four PNY RTX2080Ti blower style
  • OS: Ubuntu 18.04
  • Storage volume: 1TB (variable)
  • Supplied object store (read only) for source video

8. SDL Submission Processing Pipeline

There are three stages for the submission processing pipeline. They are:

  • Validation: during this stage we run through each of the ActEV CLI commands to install your system, run it on the validation set, compare the produced output to the output you supplied for the validation set, and finally take a snapshot of your system to re-use during execution.
  • Execution: during this stage we use the snapshot to process the sequestered data. Presently, we either divide the sequestered data into 1-hour sub-parts (each nominally twelve 5-minute files) or process the whole dataset; only MEVA data is processed through your system. There is a runtime limit per part, and if a part fails to process, we retry it once.
  • Scoring: after all the parts have been processed, the outputs are merged and scored.
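The three stages above can be sketched as follows; every name here is a hypothetical stand-in, not an actual ActEV CLI command or NIST tool:

```python
def run_sdl_submission(system, validation_set, sequestered_parts,
                       merge, score):
    """Hypothetical sketch of the three-stage SDL pipeline.

    `system`, `merge`, and `score` are illustrative stand-ins, not
    the actual ActEV CLI commands or NIST scoring tools.
    """
    # Stage 1: Validation -- install the system, run it on the
    # validation set, check that its output is consistent with the
    # output the team produced locally, then snapshot the system.
    system.install()
    output = system.run(validation_set["videos"])
    if output != validation_set["expected_output"]:
        raise RuntimeError("validation output mismatch")
    snapshot = system.snapshot()

    # Stage 2: Execution -- process each ~1-hour part (nominally
    # twelve 5-minute files) with the snapshot; a failed part is
    # retried once.
    part_outputs = []
    for part in sequestered_parts:
        for attempt in range(2):
            try:
                part_outputs.append(snapshot.run(part))
                break
            except Exception:
                if attempt == 1:
                    raise

    # Stage 3: Scoring -- merge the per-part outputs and score.
    return score(merge(part_outputs))
```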

Datasets

Framework

The DIVA Framework is a software framework designed to provide an architecture and a set of software modules that facilitate the development of activity recognition analytics. The Framework is developed as a fully open-source project on GitHub.

The DIVA Framework is based on KWIVER, an open-source framework for building complex computer vision systems. The following links will help you learn more about KWIVER:
  • KWIVER GitHub Repository: the main KWIVER site; all development of the framework happens here.
  • KWIVER Issue Tracker: submit bug reports or feature requests for KWIVER here. If there is any question about whether your issue belongs in the KWIVER or the DIVA Framework issue tracker, submit it to the DIVA tracker and we'll sort it out.
  • KWIVER Main Documentation Page: the source for the KWIVER documentation is maintained in the GitHub repository using Sphinx, and a built version is maintained on ReadTheDocs at this link. After reading the Introduction, good places to start are the Arrows and Sprokit sections, both of which are used by the DIVA Framework.
A framework-based R-C3D baseline algorithm implementation with the CLI is also available; see the Baseline Algorithms section below for details.

Baseline Algorithms

Kitware has adapted two "baseline" activity recognition algorithms to work within the DIVA Framework.

Visualization Tools

Annotation Tools

Contact Us

For ActEV Evaluation information (data, evaluation code, etc.) please email: actev-nist@nist.gov

For ActEV evaluation discussion, please visit our Google Group.