
International Conference on
Embedded Wireless Systems and Networks

February 25-27, 2019
Beijing, China


EWSN 2019 Dependability Competition - Detailed Information

This page provides a more detailed description of this year's competition scenario, categories, and rules.
Corresponding posters: 2nd floor open space, FIT building, Tsinghua University

Dependability Competition Session

During the dependability competition session, the winners of the following two categories will be awarded:
  • Category 1: Data collection for condition monitoring.
  • Category 2: Dissemination of actuation commands.
The winning team(s) will then give a short presentation about their solution.
Right after the competition session, during the poster & demo session, all competing teams will present a poster about their solution.

Competition Scenario

The dependability competition scenario emulates the operation of a multi-hop wireless sensor and actuator network in an industrial setting where several co-existing devices are crowding the RF spectrum. Unlike past editions of the competition, which focused on the transmission and reception of binary events, this year raw sensor values of varying length need to be transferred across the multi-hop network.
Furthermore, this year's competition aims to seamlessly benchmark the competing solutions using different input parameters, such as the addresses of source and destination nodes, the amount of data to be transmitted, and the traffic load. This makes the competition's results more general than in previous years, where the competing solutions were tested only for a very specific set of input parameters.
Type of nodes. The entire wireless sensor and actuator network consists of up to 70 nodes deployed over several floors in an area of approximately 1440 m². Nodes act as either source, forwarder, or destination: their role is known beforehand and does not change over time. Contestants can assume that at most eight nodes may act as sources.
Sending and receiving data. Each source node is connected via software I2C to an EEPROM that contains the raw sensor values to be transmitted to one or multiple (at most eight) destination nodes. The benchmarking infrastructure (again based on D-Cube this year) signals the availability of new data to be sent in the EEPROM by toggling a pre-defined GPIO pin. Data should be forwarded to the intended destination node(s) over a multi-hop network as quickly and efficiently as possible. As soon as a destination node receives new sensor data, it needs to write this data to the EEPROM connected via software I2C and raise a pre-defined GPIO pin accordingly. The benchmarking infrastructure will then verify the correctness of the information written to the EEPROM and use the GPIO rising-edge time-stamp to compute the end-to-end latency of communications.
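To make this handshake concrete, below is a minimal sketch of the destination-side behaviour in C. The function and macro names (eeprom_write, signal_gpio_raise, EEPROM_DATA_ADDR) are hypothetical placeholders: the actual pin assignments, EEPROM layout, and driver code are part of the logistic information provided by the organizers.

#include <stdint.h>

#define EEPROM_DATA_ADDR  0x0000u   /* hypothetical EEPROM target address */
#define MAX_PAYLOAD_LEN   64u       /* example upper bound on message size */

/* Hypothetical low-level drivers: a software-I2C EEPROM write and access to the
 * pre-defined signalling GPIO. On the real node these would wrap MSP430 register
 * accesses; empty stubs are used here so the sketch compiles standalone. */
static void eeprom_write(uint16_t addr, const uint8_t *buf, uint8_t len)
{
  (void)addr; (void)buf; (void)len;
}
static void signal_gpio_raise(void)
{
  /* e.g., PxOUT |= BITy on the MSP430; the exact pin is defined by the organizers */
}

/* Called by the networking stack once a complete sensor value has been
 * received (and, if needed, reassembled) at a destination node. */
void on_sensor_data_received(const uint8_t *payload, uint8_t len)
{
  if (payload == 0 || len == 0 || len > MAX_PAYLOAD_LEN) {
    return;                         /* ignore malformed deliveries */
  }

  /* 1. Persist the received value to the EEPROM connected via software I2C. */
  eeprom_write(EEPROM_DATA_ADDR, payload, len);

  /* 2. Raise the pre-defined GPIO pin: D-Cube time-stamps the rising edge
   *    and uses it to compute the end-to-end latency. */
  signal_gpio_raise();
}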
Sensor data. Sensor values are unique and random for each source node, i.e., there is no correlation between consecutive sensor readings. This allows the benchmarking infrastructure to unequivocally determine whether a packet was correctly received; it also allows contestants to transmit and receive information out of order (i.e., there is no penalty if a destination node reports data out of order).
Interaction with the benchmarking infrastructure. The performance of the competing solutions will be seamlessly benchmarked for different input parameters, such as the addresses of source and destination nodes, the amount of data to be transmitted, and the traffic load. To this end, the competition infrastructure will directly inject these input parameters into the firmware under test by applying patches to the provided ihex binary. This is achieved by having contestants embed a well-known data structure in their firmware, as explained in this paper.
Input parameters. The competition infrastructure will inject into the contestants' ihex binaries the amount of data to be transmitted in bytes (where the data is to be read from and written to the EEPROM), the addresses of source and destination nodes, as well as an indication of the traffic load. The latter can be periodic or aperiodic. An integer value indicating the traffic load will be injected into the benchmarked firmware using D-Cube's binary patching capability: this integer specifies the period in milliseconds at which data is generated, or indicates aperiodic traffic when the value is zero. Note that a well-known set of values for all input parameters will be announced beforehand (e.g., period ∈ {0, 1000, 5000, 30000} ms, or amount of data to be transmitted ∈ {1, 4, 32, 64} bytes). The same set of input parameters will be used for both the preparation and the evaluation phase. Note also that, in case of aperiodic traffic, the minimum and maximum inter-arrival times will also be announced beforehand and injected into the benchmarked firmware using D-Cube's binary patching capability.
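As an illustration of how such a patchable data structure might look in C, the sketch below embeds a configuration struct in a dedicated linker section so that the infrastructure can locate and overwrite its fields in the ihex image. All field names, sizes, and the section name are assumptions made here for illustration only; the exact format will be announced with the logistic information.

#include <stdint.h>

/* Configuration block whose fields are overwritten by the benchmarking
 * infrastructure via binary patching of the ihex file. All field names and
 * sizes below are illustrative assumptions. */
typedef struct {
  uint8_t  num_sources;          /* number of active source nodes (at most 8) */
  uint8_t  source_ids[8];        /* node IDs acting as sources */
  uint8_t  destination_ids[8];   /* node IDs acting as destinations */
  uint8_t  payload_len;          /* bytes per message, e.g., 1, 4, 32, or 64 */
  uint32_t traffic_period_ms;    /* generation period; 0 indicates aperiodic traffic */
  uint32_t aperiodic_min_ms;     /* minimum inter-arrival time (aperiodic case) */
  uint32_t aperiodic_max_ms;     /* maximum inter-arrival time (aperiodic case) */
} benchmark_config_t;

/* Placing the structure in a dedicated, named section gives it a stable,
 * easily locatable address in the ihex image (GCC/msp430-gcc syntax). */
volatile benchmark_config_t benchmark_config
    __attribute__((section(".benchmark_config"))) = {
  /* Default values for local testing; replaced by the infrastructure's patches. */
  .num_sources       = 1,
  .source_ids        = { 1 },
  .destination_ids   = { 2 },
  .payload_len       = 8,
  .traffic_period_ms = 5000,
  .aperiodic_min_ms  = 0,
  .aperiodic_max_ms  = 0,
};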
Further details about the interaction between the firmware under test and the benchmarking infrastructure, as well as the format of the input parameters, will be published as part of the logistics information in the coming weeks.
RF interference. Competing solutions will be tested both in the absence and in the presence of RF interference. RF interference will be generated across the whole 2.4 GHz ISM band in a repeatable fashion using D-Cube's observer nodes. The generated RF interference will resemble the patterns of common Wi-Fi appliances and will vary over time. Contestants cannot assume that some IEEE 802.15.4 channel(s) will be constantly interference-free.
Hardware. All nodes in the network are off-the-shelf Maxfor MTM-5000-MSP sensor nodes (TelosB replicas using the TI CC2420 radio). This choice allows contestants to better prepare for the competition by testing their protocols on most public WSN and IoT testbeds.
Allowed frequencies. The use of frequencies below 2400 MHz and above 2483.5 MHz is strictly forbidden, whilst there is no limitation on the use of frequencies between 2400 and 2483.5 MHz. Note that, as the TI CC2420 radio can also send and receive packets outside the 2.4 GHz band (roughly between 2230 MHz and 2730 MHz), frequency usage will be monitored during the evaluation phase and any detected violation will lead to disqualification.
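As a quick sanity check against this rule: the standard IEEE 802.15.4 channels 11-26 have centre frequencies of 2405 + 5*(k - 11) MHz and therefore all lie within the allowed band. The small host-side helper below (illustrative only) verifies this; any non-standard tuning of the CC2420 should be validated against the 2400-2483.5 MHz limits in the same way.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* IEEE 802.15.4 channels 11..26 have centre frequencies of 2405 + 5*(k - 11) MHz. */
static uint16_t channel_to_freq_mhz(uint8_t channel)
{
  return (uint16_t)(2405 + 5 * (channel - 11));
}

/* Only 2400-2483.5 MHz may be used. This checks the centre frequency only,
 * which suffices for the standard channels (each roughly 2 MHz wide). */
static bool freq_is_allowed(uint16_t freq_mhz)
{
  return freq_mhz >= 2400 && freq_mhz <= 2483;
}

int main(void)
{
  for (uint8_t ch = 11; ch <= 26; ch++) {
    uint16_t f = channel_to_freq_mhz(ch);
    printf("channel %2u -> %u MHz (%s)\n",
           (unsigned)ch, (unsigned)f, freq_is_allowed(f) ? "allowed" : "forbidden");
  }
  return 0;
}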
Node failures. Whilst contestants can assume source and destination nodes to be highly reliable, forwarding nodes may suffer a power failure at any point in time (repeatably triggered by the competition infrastructure), which may cause a node to become unavailable or to reboot after a random amount of time.
Software upload. For each competition category, contestants will be asked to provide a single ihex binary for all nodes. The binary will be uploaded to all nodes in the network using a common MSP430 Bootstrap Loader.

Categories

Using the aforementioned scenario, this year's competition features the following two categories:
  • Category 1: Data collection for condition monitoring. In this category, a fixed number of source nodes communicate to a single destination node over a multi-hop network (i.e., multipoint-to-point traffic). The destination node collects the data transmitted by all source nodes and forwards it to a control center for further processing.
  • Category 2: Dissemination of actuation commands. In this category, a fixed number of source nodes needs to disseminate actuation commands to a specific set of destination nodes in the network (i.e., point-to-multipoint traffic). Each source node is associated with a specific set of destinations (at most eight), which will be injected as an input parameter into the firmware under test by the competition infrastructure. Contestants can assume that a destination receives messages from only a single source node, and that it does not act as a source at the same time.
Note that a single ihex binary firmware is expected per category. This binary should support the different input parameters mentioned above: this allows the benchmarking infrastructure to automatically test the firmware using different sets of input parameters. Although this implies that a single firmware should support different periods or payload lengths, those values are patched by the competition infrastructure during upload and will not change at runtime (i.e., they remain the same throughout a run). Also note that, unlike in the previous edition of the competition, the provided firmware does not need to support multiple types of traffic patterns at once.

Evaluation Procedure

Evaluation metrics. Solutions will be evaluated based on three criteria: (i) the reliability of transmissions, i.e., the number of messages correctly reported to each intended destination, (ii) the average end-to-end latency in communicating each message to its intended destination(s), and (iii) the energy-efficiency of the solution. The latter will be continuously measured at each source, forwarder, and destination node using D-Cube's observer nodes.
Winner selection. The teams that perform best across all three evaluation metrics and over the entire set of input parameters will be selected as winners in each category. Note that relative differences between solutions will be considered, and that reliability will carry a higher weight than end-to-end latency and energy efficiency. Competing teams may receive honorable mentions if their solution performs exceptionally well for a specific set of parameters.

Access to the Competition's Testbed Infrastructure

In order to get access to the competition's testbed infrastructure, at least one member of each team needs to register for the EWSN conference. After this step is completed, each team needs to accept the terms and conditions for the use of the competition's testbed facility and send a scanned copy of this document (together with the registration confirmation) to the competition organizers via e-mail. Once these steps are completed, the credentials to access the competition's testbed facility will be provided to all team members.
Link to the competition's testbed: https://iti-testbed.tugraz.at/.
Link to the blog of the 2019 edition: https://iti-testbed.tugraz.at/blog/tag/ewsn2019/

List of Final Contestants

  • Team 01: Using DeCoT+ to Collect Data under Interference; Xiaoyuan Ma (Shanghai Advanced Research Institute, Chinese Academy of Sciences, China; University of Chinese Academy of Sciences, China), Peilin Zhang (Carl von Ossietzky University of Oldenburg, Germany), Ye Liu (Nanjing Agricultural University, China), Xin Li (Shanghai Advanced Research Institute, Chinese Academy of Sciences, China; ShanghaiTech University, School of Information Science & Technology, China), Weisheng Tang (Shanghai Advanced Research Institute, Chinese Academy of Sciences, China; University of Chinese Academy of Sciences, China), Pei Tian (Shanghai Advanced Research Institute, Chinese Academy of Sciences, China), Jianming Wei (Shanghai Advanced Research Institute, Chinese Academy of Sciences, China), Lei Shu (Nanjing Agricultural University, China), and Oliver Theel (Carl von Ossietzky University of Oldenburg, Germany).
  • Team 02: Low-Power Wireless Bus Baseline; Fabian Mager (TU Dresden, Germany), Romain Jacob (ETH Zurich, Switzerland), Reto Da Forno (ETH Zurich, Switzerland), and Marco Zimmerling (TU Dresden, Germany).
  • Team 03: Keep it Simple, let Flooding Shine; Jan Mueller (ETH Zurich, Switzerland), Anna-Brit Schaper (ETH Zurich, Switzerland), Romain Jacob (ETH Zurich, Switzerland), and Reto Da Forno (ETH Zurich, Switzerland).
  • Team 05: Alternating Multicast with Aggregated Data Collection in Wireless Sensor Networks; Ayesha Naureen (University of Manchester, United Kingdom) and Ning Zhang (University of Manchester, United Kingdom).
  • Team 06: Adaptive Software Defined Scheduling of Low Power Wireless Networks; Michael Baddeley (Toshiba Research Europe Ltd., United Kingdom), Aleksandar Stanoev (Toshiba Research Europe Ltd., United Kingdom), Usman Raza (Toshiba Research Europe Ltd., United Kingdom), Yichao Jin (Toshiba Research Europe Ltd., United Kingdom), and Mahesh Sooriyabandara (Toshiba Research Europe Ltd., United Kingdom).
  • Team 07: Centrally Scheduled Low-Power Wireless Networking for Dependable Data Collection; Oliver Harms (Chalmers University of Technology, Sweden) and Olaf Landsiedel (Chalmers University of Technology, Sweden).
  • Team 08: Actuating Network with Multi-Channel Codecast; Ebram Kamal William (National University of Singapore, Singapore), Paramasiven Appavoo (National University of Singapore, Singapore), Mun Choon Chan (National University of Singapore, Singapore), and Mobashir Mohammad (Ackcio Pte Ltd., Singapore).
  • Team 10: RedNodeBus, Stretching Out the Preamble; Antonio Escobar-Molero (Infineon Technologies AG, Germany), Javier Garcia-Jimenez (RedNodeLabs, Germany), Jirka Klaue (RedNodeLabs, Germany), Fernando Moreno-Cruz (Infineon Technologies AG, Germany), Borja Saez (Infineon Technologies AG, Germany), Francisco J. Cruz (eesy-innovations GmbH, Germany), Unai Ruiz (eesy-innovations GmbH, Germany), and Angel Corona (Bernitz Electronics GmbH, Germany).
  • Team 11: CRYSTAL; Matteo Trobinger (University of Trento, Italy), Timofei Istomin (University of Trento, Italy), Amy L. Murphy (Bruno Kessler Foundation, Italy), and Gian Pietro Picco (University of Trento, Italy).
  • Team 12: OpenWSN, a Development Environment for 6TiSCH; Tengfei Chang (INRIA, France), Thomas Watteyne (INRIA, France), and Xavi Vilajosana (Universitat Oberta de Catalunya, Spain).

We look forward to your contribution to EWSN 2019 in Beijing, China!

