6. Validation Procedure for the Web Interface

In this section, we outline a scripted validation procedure for presentations and tutorials of the BEAT platform. This procedure should be executed ahead of any presentation. Automating it, e.g. with Selenium (http://www.seleniumhq.org/), would be a plus.
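As a starting point, the checks in this section can be scripted with Selenium’s Python bindings. Below is a minimal sketch of a reusable harness; the base URL is the production platform quoted later in this section, while the helper name and the choice of Firefox are arbitrary:

    # Minimal Selenium harness for the checks in this section. Assumes
    # "pip install selenium" and a working Firefox/geckodriver setup;
    # any other WebDriver can be substituted.
    from contextlib import contextmanager

    from selenium import webdriver

    PLATFORM = "https://www.beat-eu.org/platform/"

    @contextmanager
    def browser():
        """Yield a WebDriver pointed at the platform front page, then quit it."""
        driver = webdriver.Firefox()
        driver.implicitly_wait(10)  # give elements up to 10s to appear
        try:
            driver.get(PLATFORM)
            yield driver
        finally:
            driver.quit()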

6.1. Front page

  • Hit the front page; all components should load, including the video tutorial and the right-hand bar with statistics (see the sketch after this list)

  • A yellow banner draws the user’s attention to our terms of service

  • The video tutorial should be playable

  • The user should be able to browse all public resources on the platform from this page, selecting the link from the top-bar button

  • The user should be able to search all public experiments on the platform from this page, using the search box

  • The user should be able to log in or sign up using the buttons on this page - in particular, access to the user “tutorial” should be possible through the login page

  • The user should be able to click on the “User Guide” button and a user guide should open
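A sketch of how the front-page checks above could be automated with the harness from the introduction. All CSS selectors and link texts below are assumptions about the page markup and may need adjusting:

    from selenium.webdriver.common.by import By

    def check_front_page():
        """Spot-check the front-page items listed above."""
        with browser() as driver:
            # Terms-of-service banner (selector is an assumption)
            assert driver.find_elements(By.CSS_SELECTOR, ".alert"), \
                "terms-of-service banner not found"
            # The video tutorial should be embedded (as <video> or an iframe)
            assert (driver.find_elements(By.TAG_NAME, "video")
                    or driver.find_elements(By.TAG_NAME, "iframe")), \
                "video tutorial not found"
            # Search box for public experiments
            assert driver.find_elements(By.CSS_SELECTOR, "input[type=search]"), \
                "search box not found"
            # "User Guide" button, as named in the list above
            assert driver.find_elements(By.PARTIAL_LINK_TEXT, "User Guide"), \
                "User Guide link not found"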

6.2. User page

  • After login, clicking on the “Home” button on the front page takes the user to the “User page”

  • The page should display 3 columns: shared material on the left, monitored searches in the centre and own contributions on the right (see the sketch after this list)

  • The values for own contributions should make sense w.r.t. the material developed by that user

  • Clicking on one of the leaderboards should take the user to the related search.
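A sketch of the login and layout checks above, again using the harness from the introduction. The form field names, button selector and column selector are assumptions, and a dedicated test account is presumed to exist:

    from selenium.webdriver.common.by import By

    def check_user_page(username, password):
        """Log in and verify the three-column layout of the user page."""
        with browser() as driver:
            driver.find_element(By.LINK_TEXT, "Login").click()
            driver.find_element(By.NAME, "username").send_keys(username)
            driver.find_element(By.NAME, "password").send_keys(password)
            driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
            driver.find_element(By.LINK_TEXT, "Home").click()
            # Left: shared material; centre: monitored searches;
            # right: own contributions (column selector is an assumption)
            columns = driver.find_elements(By.CSS_SELECTOR, ".user-page .column")
            assert len(columns) == 3, "expected 3 columns on the user page"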

6.3. Experiments

  • It should be possible to fork an existing experiment (I typically use this one: https://www.beat-eu.org/platform/experiments/tutorial/tutorial/eigenface/1/atnt-eigenfaces-5-comp/), adjust some parameter and the experiment name, and run it by clicking on the “Run” button

  • It should be possible to go to the toolchain tab on the experiment configurator page and view the associated toolchain

  • That experiment should run in a reasonable amount of time (less than 5 minutes)

  • Once the experiment finishes, it should take the user to the experiment “done” page

  • The user should receive an e-mail notification that the experiment finished

  • It should be possible to go from tab to tab in the experiment page

  • It should be possible to click on the “Search similar” button on that page and search for similar experiments. When this happens, there should be a list of experiments to which the recently run experiment can be compared

  • The last point implies that updates to the environments, databases or related components should be followed by a test execution of at least the following experiments: tutorial/tutorial/eigenface/1/atnt-eigenfaces-5-comp/ and its variants with 6, 7, 8, 9 and 10 components (scripted in the sketch after this list)

  • It should be possible to parallelize blocks in experiments

  • It should be possible to change the software environment either globally or on a per-block basis

  • There must be more than one environment available for selection (typically, a couple of Python-based environments and a C++-based one)

  • It should be possible to change the processing queue either globally or locally in each block

  • It should be possible to configure block details on a per-block basis, besides queue and software environment (e.g. parameters)

  • Once an experiment is “Done”, it should be possible to change its privacy settings

  • Changing the privacy settings of an experiment affects all underlying components correctly

  • It should be possible to share an experiment with a group or with individual users, or to make it public

  • It should be possible to attest (certify) an experiment

  • It should be possible to browse to the attestation of an attested experiment by clicking on the icon on the left of the experiment name (on the top of the page)

  • If you re-run an experiment that was just executed, the new experiment should hit the cache and complete immediately

  • During tutorials, it should be possible to run multiple experiments at once, since multiple users will try to run them concurrently

  • Here is a list of experiments that should be checked whenever underlying changes are made to the platform:
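The sketch below automates two checks from this list: the 5-minute runtime budget and the regression series of eigenface experiments. The 5-component path is quoted above; the naming pattern for the 6- to 10-component variants and the status selector are assumptions:

    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait

    # Experiments to re-run after changes to environments, databases, etc.
    # (the 6-10 component names assume the same pattern as the 5-component one)
    REFERENCE_EXPERIMENTS = [
        "tutorial/tutorial/eigenface/1/atnt-eigenfaces-%d-comp" % n
        for n in (5, 6, 7, 8, 9, 10)
    ]

    def wait_until_done(driver, timeout=300):
        """Wait until the open experiment page reports 'Done' (5-minute budget)."""
        # The status element selector is an assumption about the page markup
        WebDriverWait(driver, timeout).until(
            lambda d: "Done" in d.find_element(By.CSS_SELECTOR, ".status").text
        )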

6.4. Toolchains

6.5. Algorithms

6.6. Databases

6.7. Attestations

  • All assets related to a certified experiment should be either public or “executable” (for algorithms)

  • It should be possible to “unlock” an attestation

  • It should be possible to delete a “locked” attestation

  • The perma-link to a locked attestation should be accessible to an anonymous party (see the sketch below)
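The anonymous-access check is easy to automate with a plain HTTP request, outside the browser. The perma-link URL pattern below is an assumption and should be adjusted to the platform’s actual scheme:

    import requests

    # Assumed perma-link pattern for attestations
    ATTESTATION_URL = "https://www.beat-eu.org/platform/attestations/%d/"

    def check_attestation_anonymous(number):
        """A locked attestation's perma-link must load for an anonymous party."""
        # requests.get() carries no cookies, i.e. it acts as an anonymous visitor
        response = requests.get(ATTESTATION_URL % number)
        assert response.status_code == 200, \
            "attestation %d not anonymously accessible (HTTP %d)" \
            % (number, response.status_code)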

6.8. Reports

(Assuming the features on staging have been moved to production)

  • It should be possible to create a new report and to populate it from the experiment list

  • It should be possible to fill in documentation

  • It should be possible to create groups and associate experiments to these groups with aliases

  • It should be possible to add documentation cells arbitrarily in each group

  • It should be possible to add tables and figures to each group

  • It should be possible to lock a report and ready it for publication

  • It should be possible to unlock a report and make it return to editable mode

  • It should be possible to make a report public and lock it permanently

  • Reports we normally mention in tutorials (they must always be working):

  • Make sure that an MR DOES NOT touch the API in a way that breaks the well-tested logic

  • Check that a locked report DOES NOT show the full experiment names, as reviewers will be going through it at some point. Exposing them completely breaks the “blind” review, since it would give a reviewer hints to accept or reject a report based on a possible friendship (or lack thereof) with the person who created it (see the sketch after this list)

  • On a locked report, people shouldn’t be able to click on the experiment names at all; only the alias should be visible

  • When creating a table, verify that only the experiment alias is visible for locked reports

  • When going to the experiments page and adding experiments to a report from there, a pop-up asks if we wish to go to the report, with a “View Report” button. When clicking on that button, check that you are sent to the page https://www.beat-eu.org/platform/reports/<username>/<report>/
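A sketch of the blind-review check above: fetch the locked report anonymously and make sure that no full experiment name (which embeds the owner’s username) leaks into the page. The function and parameter names are illustrative:

    import requests

    def check_blind_review(report_url, full_experiment_names):
        """A locked report must expose aliases only, never full experiment names."""
        # An anonymous GET stands in for a reviewer without a platform session
        response = requests.get(report_url)
        assert response.status_code == 200, "report page did not load"
        for name in full_experiment_names:
            # Full experiment names embed the owner's username, which would
            # identify the author and break the blind review
            assert name not in response.text, "full experiment name leaks: " + name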