Competition protocols

The protocols used were the same as in the EvAAL 2012 edition. These are the protocols used by both competitors (C) and the evaluation committee (EC) during the competition:

  • Track 1 on Location for AAL (you can download the protocol in pdf here)
  • Track 2 on Activity Recognition for AAL (you can download the protocol in pdf here)

Track 1 on Location for AAL (you can download the protocol in pdf here)

1. Welcome and briefing

For each of the following, it is sufficient that one person from the EC carries out the task (this allows other EC members to carry out tasks in parallel, related to a previous competitor):

  1. EC: Welcome the competitor on arrival.
  2. EC: Provide the competitor with a copy of this protocol, explain it, and answer any questions about it.
  3. EC: Explain that one person from the EC will be responsible for monitoring that the protocol is followed properly, and recording that each step has been executed.
  4. EC: Appoint the person to be responsible for checking that this protocol is followed, and supply that person with a paper copy of this protocol, to use as a checklist on which completion of each point will be ticked off.
  5. EC: Inform the competitor of any inaccuracies in information provided beforehand (e.g. measurement data for coordinates in Living Lab).
  6. EC: Ask the competitor if it is acceptable to take photographs and/or video during the competition.

2. Infrastructure Preparation

  1. EC: Make sure that the server in the Living Lab is running, and receiving events according to specification.  Fix any problems that are noted.
  2. C: Inform the EC of any adjustments needed in the Living Lab itself before the timed installation can begin; carry out any such adjustments that are feasible and approved by the EC.
  3. EC: Clear the entire Living Lab of people and/or other obstacles that are not part of the normal Living Lab set-up.
  4. C: Signal when ready to start installation.
  5. EC: Start timing of installation process.
  6. C: Carry out installation, and give signal when complete.
  7. EC: Stop timing.
  8. EC: If the competitor has installed fixed devices (e.g. sensors), it is desirable to know the exact location of each.  If the competitor already has this data readily available in a suitable format, ask the competitor to provide this information.   Otherwise, measure and record the location of each device.
  9. EC: Take some photographs.

3. Evaluation of Installation Process

Note 1: If the competitor needs to take a break during installation for any reason, they should inform the EC, who will pause the timer.

Note 2: The installation time includes any measurements, software configuration etc. that the competitor may need to carry out, in addition to the physical installation itself.

Note 3: The recording of device locations (after timing has stopped) is not part of the competition itself and does not influence the score for the competitor. The data is collected for scientific interest, and documentation afterwards.

4. Integration of EvAAL and Competitor’s Software

EvAAL provides a software infrastructure for gathering data generated by the competitor’s software, comparing this with benchmark data on the user’s location, and providing the basis for scores.  It is necessary to be sure that the systems communicate properly and have synchronized clocks before the location evaluation can begin.

  1. C: Launch an NTP client (for clock synchronization), and connect to the NTP server made available as part of the EvAAL infrastructure.
  2. EC: Ensure that the clocks are properly synchronized.
  3. EC: Check that lighting events are generated and received properly.
  4. EC: Switch off all lights in the Living Lab.
  5. EC: Check that bicycle events are generated and received properly.
  6. EC: Check that the EvAAL infrastructure software properly receives the information generated by the competitor’s system.
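
As an illustration of the clock-synchronization check in steps 1 and 2 above, the following is a minimal sketch in Python. It assumes the ntplib package; pool.ntp.org is the server named in the Track 2 protocol, and the tolerance threshold is invented for illustration, so the actual EvAAL tooling may work differently.

    # Minimal sketch of a clock-synchronization check against an NTP server.
    import ntplib

    MAX_OFFSET_S = 0.1  # hypothetical tolerance; the real threshold is set by the EC

    def clock_is_synchronized(server="pool.ntp.org"):
        """Return True if the local clock offset against the NTP server is small."""
        client = ntplib.NTPClient()
        response = client.request(server, version=3)
        print("offset against %s: %.3f s" % (server, response.offset))
        return abs(response.offset) <= MAX_OFFSET_S

    if __name__ == "__main__":
        print("synchronized" if clock_is_synchronized() else "NOT synchronized")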

5. Preparation for benchmark tests

  1. EC: Install markers on the floor to show the precise points making up the paths through the Living Lab to be used in Benchmark Testing.
  2. EC: Clear the entire Living Lab of people and/or other obstacles that are not part of the normal Living Lab set-up.
  3. EC: Assign someone to take photographs (provided the competitor agrees).
  4. EC: Start video recording (provided the competitor agrees).
  5. EC: Remind the competitor to make sure that generation of logs and other kinds of metadata in their software is enabled, in preparation for each phase of the benchmark testing.

6. Benchmark Testing

Benchmark testing takes place in three phases, and is based on three pre-defined paths through the Living Lab (referred to as P1, P2 and P3 below).

The phases are:

1. Accuracy Test: one person.

  • One person (“the actor”) walks pre-defined paths P1 and P2 around the Living Lab, following precisely defined locations and adhering to sound signals providing precise timing indications.
  • Some parts of the paths also include brief pauses at certain locations.
  • When the actor passes “context” points (e.g. light switch, exercise bike), the actor operates the device.  The EvAAL software generates software events (made available to competitors) when the actor performs such operations.
  • The time and location data generated by the competitor’s system is gathered by the EvAAL infrastructure software, and compared with the benchmark location and time data.  Accuracy is determined by checking the difference between the two values (see the sketch after this list).
  • No other person is present in the Living Lab during this test.
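
As an illustration of how the accuracy comparison mentioned in the list above might be computed, here is a minimal sketch in Python. The coordinate format, the timestamp matching rule and the tolerance value are assumptions made for illustration; this is not the actual EvAAL scoring code.

    # Minimal sketch: per-sample localization error between the competitor's
    # estimates and the benchmark path. Timestamps in milliseconds and
    # positions in metres are assumptions for illustration only.
    from math import hypot

    def localization_errors(estimates, benchmark, max_dt_ms=500):
        """For each benchmark sample (t, x, y), find the competitor estimate
        closest in time (within max_dt_ms) and return the Euclidean errors."""
        errors = []
        for t_ref, x_ref, y_ref in benchmark:
            nearest = min(estimates, key=lambda e: abs(e[0] - t_ref))
            if abs(nearest[0] - t_ref) <= max_dt_ms:
                errors.append(hypot(nearest[1] - x_ref, nearest[2] - y_ref))
        return errors

    # Example with made-up samples:
    benchmark = [(0, 1.0, 2.0), (1000, 1.5, 2.0)]
    estimates = [(20, 1.2, 2.1), (980, 1.4, 2.2)]
    print(localization_errors(estimates, benchmark))  # roughly [0.22, 0.22]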

2. Accuracy Test: two people.  This is the same as the accuracy test for one person, except that:

  • A second person from the EC is present in the Living Lab, and moving around.  (The location of this second person is not to be monitored).
  • The paths followed are P2 and P3.

3. Area-of-Interest (AOI) Test. This is the same as the accuracy test for one person, except that:

  • The paths followed are:  P3 (reverse order of points), P1 and P2 (reverse order of points).
  • The basis for the score is the information provided by the competitor’s system about the AOI or AOIs in which the actor is located at each moment (a minimal sketch of such an AOI check follows this list).
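
As an illustration of such an AOI check, the following is a minimal sketch in Python. The AOI names and rectangles are invented for illustration and are not taken from the actual Living Lab layout.

    # Minimal sketch: classify a position into an Area of Interest (AOI).
    # The AOI names and rectangles below are invented for illustration only.
    AOIS = {
        "kitchen":     (0.0, 0.0, 3.0, 4.0),   # (x_min, y_min, x_max, y_max) in metres
        "living_room": (3.0, 0.0, 8.0, 5.0),
        "bedroom":     (0.0, 4.0, 3.0, 8.0),
    }

    def aoi_of(x, y):
        """Return the name of the first AOI containing (x, y), or None."""
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    print(aoi_of(1.5, 2.0))  # kitchen
    print(aoi_of(9.0, 9.0))  # None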

EC+C: Carry out tests twice, for phase 1.

EC+C: Carry out tests twice, for phase 2.

EC: Ask the competitor to check whether logging of internal data etc. has been successful. If not, fix the problem and repeat the tests for which logging was missing.

EC+C: Carry out tests twice, for phase 3.

EC: Collect all log data from the competitor. If some is missing, repeat the affected tests.

EC: Stop video recording.

EC: Provided that all log data has been successfully collected, inform the competitor that it is OK to start dismantling equipment.

C: Start dismantling and packing equipment.  (If the competitor has several staff, this may proceed in parallel with the de-briefing described below; otherwise it can be started here, then completed after de-briefing).

EC: Inform the competitor of the score attained for each test, as calculated by the EvAAL infrastructure software.  Make it clear that the score may be adjusted afterwards, if a detailed analysis of all information stored in logs indicates that there may be a reason to do so.

Note that the recorded score for each phase will be based on whichever of the two tests provides the most favourable score.

7. De-briefing and wrap-up

The main purpose of this phase is to make some simple conclusions about key aspects of the competitor’s system, both in its current form and by judging its future potential.  A “checklist” of questions with simple yes/no/not applicable answers is used for this purpose.

For each of the tasks below related to the checklist of questions, there should be at least 3 members of the EC present, to discuss points where there may be diverging views.  The EC member responsible for data management should not be included in the minimum set of 3 members, because the duty of the data manager at this time is to make sure that metadata etc. is collected.  Other EC members are free to carry out tasks in preparation for the following competitor.

  1. EC: For each of the questions on the checklist, suggest to the competitor (based on observations made during tests, and on basic knowledge of the competitor’s system) what the appropriate answer would be.
  2. C: For each question, confirm the correctness (or otherwise) of the answer proposed by the EC.  In the event of disagreement, details will be discussed, and the EC will (if necessary) vote on what should be recorded as an accurate answer.
  3. EC: Record all answers on a scoring sheet.
  4. EC: Ask competitor to complete a feedback questionnaire on the process of the evaluation itself.
  5. C: Complete questionnaire (optional).
  6. EC: Collect any extra information (intermediate data, logs) generated by the competitor’s system that the competitor is willing to provide.  This can be used for later analysis.
  7. EC: Gather all paper documents related to the competitor in one folder.
  8. EC: Store all logs, photos, videos, benchmark data in a common repository, and provide competitor with copies of any requested files.
  9. EC: Remove markers on the floor that were used to show the precise points making up the paths through the Living Lab used in Benchmark Testing.
  10. EC: Thank the competitor for participation, and award a small gift as a mark of gratitude.
  11. EC: Politely ask the competitor to remove equipment from the Living Lab, and to leave the Living Lab, in order to make room for the next competitor.  It may be interesting and useful for some EC staff to continue technical discussions with the competitor, but this should take place away from the Living Lab area itself.

Track 2 on Activity Recognition for AAL (you can download the protocol in pdf here)

1. Welcome and preparation:

When the competitor arrives, the EC welcomes them warmly and explains the protocol.

One person will be responsible for checking this document twice so that nothing is missed.

2. Installation and configuration/calibration:

The team has 60 minutes (this corresponds to the X value in the Technical Annex) to install and calibrate the system. Calibration includes any training that the actor must do with the system.

When installation starts, no one apart from the competitor's team may be present in the Living Lab.

One EC member measures the installation time.

Installation ends when the competitor says so. Using the calculation explained in the Technical Annex, we can obtain the result for the Installation Complexity evaluation criterion.

Two people measure the devices (for metadata) and one person takes some pictures (if the competitor agrees).

3. Evaluation system set up:

The competitor is invited to launch their system and to check that the server correctly receives the information sent:

  • The IP address used will be 192.168.233.12 and port 7777 (please prepare your software so that the IP, port and/or NTP server can easily be changed in case there is any problem).

One EC member checks that the competitor's clock and the EvAAL system clocks are properly synchronized (the competitor should indicate which NTP server they are using; the EC uses pool.ntp.org).

One EC member checks the integration of the competitor's system with the EvAAL system.

If a problem occurs, a maximum of 20 minutes (this corresponds to the Y value in the Technical Annex) is given to fix the integration; otherwise the log backup solution will be used. The log backup is a txt file created by the competitor's system containing the same information required through the sockets (milliseconds since 1970, integer activity code), e.g.:

1340107198026,6

1340107208500,2

1340107214386,0

1340107233972,3 …
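
As an illustration of the two reporting paths described above (socket and log backup), here is a minimal sketch in Python. The use of a plain TCP socket with newline-terminated "timestamp,activity" messages and the log file name are assumptions made for illustration; the authoritative message format and transport are defined in the Technical Annex.

    # Minimal sketch: report a recognized activity both over the competition
    # socket (IP and port from the protocol above) and to the txt log backup.
    # The newline-terminated "timestamp_ms,activity_id" message is an assumption.
    import socket
    import time

    SERVER = ("192.168.233.12", 7777)
    LOG_BACKUP = "activity_log.txt"   # hypothetical file name

    def report_activity(sock, activity_id):
        timestamp_ms = int(time.time() * 1000)        # milliseconds since 1970
        line = "%d,%d" % (timestamp_ms, activity_id)
        sock.sendall((line + "\n").encode("ascii"))   # send to the EvAAL server
        with open(LOG_BACKUP, "a") as log:            # and keep the log backup
            log.write(line + "\n")

    if __name__ == "__main__":
        with socket.create_connection(SERVER) as sock:
            report_activity(sock, 6)   # e.g. activity code 6, as in the sample above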

Any problem related to the integration is logged in a note (for feedback to universAAL).

4. Actor performance:

Only the actor and one EC member (responsible for annotating ground-truth activities) will be inside the Living Lab during the competition; the competitor as well as the rest of the evaluation committee will be outside.

One EC member is responsible for recording all the video (if the competitor agrees).

The evaluator starts the mp3 file that will guide the actor.

The actor moves and performs activities using the mp3 file. The evaluator uses a client to mark activities and transitions.

Once the mp3 finishes, the actor will continue performing the last activity for 1 minute, in order to capture delayed events recognized by the competitor's system. If any competitor's system has a delay greater than 1 minute, this must be indicated to the EC before the performance begins.

After that time, the evaluation software will run in order to obtain the performance and delay criteria. The software will select the recognition delay that maximizes the performance criterion (a minimal sketch of this search follows).
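
As an illustration of this delay search, a minimal sketch in Python follows. The per-sample agreement measure, the candidate delay range (0 to 60 s) and the step size are assumptions made for illustration; this is not the actual EvAAL evaluation software.

    # Minimal sketch: pick the recognition delay that maximizes agreement
    # between the competitor's recognized activities and the ground truth.
    # Both inputs are time-ordered lists of (timestamp_ms, activity_id).

    def activity_at(events, t_ms):
        """Activity in force at time t_ms (last event at or before t_ms)."""
        current = None
        for ts, act in events:
            if ts <= t_ms:
                current = act
            else:
                break
        return current

    def agreement(recognized, truth, delay_ms):
        """Fraction of ground-truth samples matched after shifting the
        recognized events earlier by delay_ms."""
        shifted = [(ts - delay_ms, act) for ts, act in recognized]
        hits = sum(1 for ts, act in truth if activity_at(shifted, ts) == act)
        return hits / len(truth) if truth else 0.0

    def best_delay(recognized, truth, max_delay_ms=60000, step_ms=500):
        """Candidate delay (ms) with the highest agreement score."""
        candidates = range(0, max_delay_ms + 1, step_ms)
        return max(candidates, key=lambda d: agreement(recognized, truth, d))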

The performance will be repeated to obtain a second mark for the performance and delay criteria, and the best performance will be selected.

5. Interview:

The competitor will be interviewed about the Interoperability and User Acceptance criteria. To avoid subjective values for user acceptance and interoperability, the yes/no questions in Annexes I and II will be used. However, the EC can ask any other questions.

The interview answers will be recorded on a scoring sheet.

If any vote must be taken by the EC, it is taken AFTER the competitor has left the room.

6. Finalization:

The competitor is invited to answer a feedback questionnaire.

The competitor is asked for the intermediate data produced during the competition; one person from the EC is responsible for collecting the data.

The competitor is asked to provide the logs of the EvAAL system.

All the paper documents are stapled together, and ALL files, including photos, logs, etc., are stored together.

The competitor is asked to remove the installation.

The competitor is thanked warmly and given a small present.
