Abstracts

CPS-IoTBench will feature a mix of invited talks and paper presentations to inspire stimulating discussions.

See the program page for the schedule details.

Invited Talks

Keynote

Dare to Share: Risks and Rewards of Artifact Sharing in Computer Science

Christian Collberg, Professor of Computer Science, University of Arizona

     Presentation (coming soon)

Abstract

We report on our experiences with artifact sharing (mostly code) in computer science research: we recount some personal anecdotes, report on outcomes from a deception study, and show initial data from a longitudinal study of sharing rates from our artifact indexing site FindResearch.org. We then discuss the steps that we need to take, as academic departments, publishers, funding agencies, and individual educators and researchers, to improve the sharing of artifacts in computer science.

Short Bio

I'm a Professor in the Department of Computer Science at the University of Arizona. Prior to arriving in Tucson I worked at the University of Auckland, New Zealand, and before that I got my Ph.D. from Lund University, Sweden. I have also held a visiting position at the Chinese Academy of Sciences in Beijing, China, and taught courses at universities in Russia and Belarus.
My main research interest is computer security, in particular the so-called Man-At-The-End attack, which occurs in settings where an adversary has physical access to a device and compromises it by tampering with its hardware or software. With Jasvir Nagra, I am the author of Surreptitious Software: Obfuscation, Watermarking, and Tamperproofing for Software Protection, published in Addison-Wesley's computer security series. The book has also been translated into Portuguese and Chinese.

Invited Speaker

Taming Performance Variability

Aleksander Maricq, University of Utah

     Presentation

Abstract

The performance of compute hardware varies: software run repeatedly on the same server (or a different server with supposedly identical parts) can produce performance results that differ with each execution. This variation has important effects on the reproducibility of systems research and on the ability to quantitatively compare the performance of different systems. It also has implications for commercial computing, where agreements are often made conditioned on meeting specific performance targets. Over a period of 10 months, we conducted a large-scale study capturing nearly 900,000 data points from 835 servers. We examine this data from two perspectives: that of a service provider wishing to offer a consistent environment, and that of a systems researcher who must understand how variability impacts experimental results. From this examination, we draw a number of lessons about the types and magnitudes of performance variability and the effects on confidence in experiment results. We also create a statistical model that can be used to understand how representative an individual server is of the general population. The full dataset and our analysis tools are publicly available, and we have built a system to interactively explore the data and make recommendations for experiment parameters based on statistical analysis of historical data. Finally, we remark on preliminary developments in our collection of data since the initial 10-month period.
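The run-to-run variation the abstract describes is often summarized with a coefficient of variation (CoV). As a minimal sketch using hypothetical runtime samples (this is illustrative only, not code or data from the study):

```python
import statistics

def coefficient_of_variation(samples):
    """Relative dispersion of repeated measurements: sample stdev / mean."""
    return statistics.stdev(samples) / statistics.mean(samples)

# Hypothetical runtimes (seconds) of one benchmark repeated on one server;
# the numbers are made up for illustration.
runs = [10.2, 10.4, 9.9, 10.1, 10.8, 10.0, 10.3, 10.2]
cov = coefficient_of_variation(runs)
print(f"CoV = {cov:.1%}")
```

A CoV of even a few percent means a single run cannot distinguish systems whose true performance differs by less than that margin, which is one reason repeated measurements need statistical treatment.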

Short Bio

I'm a Research Associate who has been working with the Flux Research Group at the University of Utah since 2016, after receiving my Master's in Computer Science from UC San Diego. In addition to my work on maintaining the CloudLab and Emulab projects, I have been primarily responsible for the large-scale collection of various performance metrics from the servers in those projects. My research involves analyzing the collected data with my colleagues to characterize the variability of workloads on compute hardware, find new analysis and statistical methods for our data, and improve and expand the collection process.

Invited Speaker

Planes, Trains, Apples, and Oranges - Reproducible Results and Fair Comparisons in Localization Research

Pat Pannuto, University of California, Berkeley

     Presentation

Abstract

Localization remains a continually captivating research problem, as so much of human interaction and reasoning relies on location-based contexts. However, decades of work have revealed that there is no one-size-fits-all solution to the bevy of applications that require location information. If every application sets its own requirements, how then do we measure progress in localization technology when each technology sets its own benchmarks? How can we tease apart improvements to underlying physical-layer estimates from enhancements to the algorithms that process these data points, and measure these contributions against the development of systems that exploit fusion to realize better gains than either physical or algorithmic advancements alone? Offering more questions than answers, this talk aims to lay out the state of the field as it is today and to invite discussion on what defines fair comparisons and how to enhance the creation of reproducible artifacts, experimental configurations, and location traces and datasets.

Short Bio

Pat Pannuto is currently completing his PhD in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He received his MSE in Computer Science and BSE in Computer Engineering from the University of Michigan. Pat's research is in the broad area of networked embedded systems, with contributions to computer architecture, wireless communications, mobile computing, operating systems, and development engineering. Pat's work has been recognized as a Top Pick in Computer Architecture and selected as a Best Paper Finalist at IPSN, and has been awarded NSF, NDSEG, and Qualcomm Innovation fellowships. Pat has also received teaching awards from the Computer Science Department, the College of Engineering, and the Rackham Graduate School at the University of Michigan.

Invited Speaker

Cause & Effect Analysis for Low-Power Wireless Network

Hyung-Sin Kim, University of California, Berkeley

     Presentation (coming soon)

Abstract

In this talk, I will share my research experience on low-power wireless networks, specifically the various factors affecting network performance: protocol parameters, routing topology, channel fading, interference, and hardware. After running a network protocol implemented on an embedded platform, we get some numerical results “somehow,” which should be analyzed carefully. I will present several cases of analyzing why a certain result is observed. Some of these cases became research problems, while others ended up as engineering efforts. Overall, sharing these experiences shows why the “IoTBench initiative” is needed.

Short Bio

Hyung-Sin Kim is a postdoctoral scholar in EECS at the University of California, Berkeley, working in the Building Energy Transportation Systems (BETS) group led by Prof. David E. Culler. He received his B.S. degree in Electrical Engineering from Seoul National University (SNU) in 2009. He received his M.S. degree in 2011 and his Ph.D. degree in 2016, both in Electrical Engineering and Computer Science (EECS) from SNU, and both with outstanding thesis awards. From March 2016 to August 2016, he was a postdoctoral researcher at the Network Laboratory (NETLAB) at SNU, led by Prof. Saewoong Bahk. He received the Qualcomm Korea Fellowship in 2011, and the National Research Foundation (NRF) Global Ph.D. Fellowship and Postdoctoral Fellowship in 2011 and 2016, respectively. His research mainly focuses on networked embedded systems for the Internet of Things, while broadly spanning neighboring areas such as cellular communication, mobile networks, augmented reality, security, and fog/edge computing, resulting in 45+ academic publications and several awards.

Invited Speaker

IoTBench - Past, present, and future of a community-driven benchmarking initiative

Romain Jacob, ETH Zurich
Markus Schuss, Graz University of Technology

     Presentation

Abstract

Since the first discussions in 2016, the IoTBench initiative has come a long way. The community has sketched a roadmap to improve the reproducibility and comparability of low-power wireless networking experiments. We have now started the journey and the gaps are being filled.

In this talk, we will present the IoTBench initiative's vision, its end goal, and our proposed strategy to get there. We will review what has been done, what is ongoing, and most importantly, what is ahead. The talk will be directly followed by an open discussion with all workshop participants, because ultimately, everyone needs to be involved if we want our community's practices to change and improve.

Short Bio - Romain Jacob

Romain Jacob is a final-year doctoral student at ETH Zürich, Switzerland. His doctorate focuses on low-power wireless communication protocols based on Synchronous Transmissions. He has been heavily involved in the IoTBench initiative, which aims to improve the standards for evaluation, comparison, and ultimately benchmarking in the low-power networking community. He is also an active supporter of the migration towards Open Science models and of the development of novel means of evaluating research outputs, as proposed in DORA. Before he joined ETH in 2015, Romain received his Master's degree from the École Normale Supérieure Paris-Saclay (France) and spent a year as a Visiting Scholar at UC Berkeley (USA).

Short Bio - Markus Schuss

Markus Schuß is a third-year PhD student at the Institute for Technical Informatics of TU Graz, Austria. He received his M.Sc. degree in Information and Computer Engineering from TU Graz in 2016. His research interests encompass the development of testbeds and benchmarking infrastructures, as well as the evaluation of the performance of low-power wireless communication protocols used in the Internet of Things and in industrial automation. Among others, he has developed (i) D-Cube, an open-source tool to create large-scale testbeds used to evaluate the dependability of state-of-the-art IoT protocols, and (ii) JamLab-NG, an open-source tool to generate repeatable Wi-Fi interference. Both tools have been used to run the EWSN Dependability Competition series (quantitatively comparing the performance of state-of-the-art IoT protocols in harsh RF environments), for which he served as a general co-chair.

Accepted Papers

GenEE: A Benchmark Generator for Static Analysis Tools of Energy-Constrained Cyber-Physical Systems
Christian Eichler, Peter Wägemann, Wolfgang Schröder-Preikschat
Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany
   Paper       Presentation   

The Impact of Decreasing Transmit Power Levels on FlockLab To Achieve a Sparse Network
Matthew Bradbury [1], Arshad Jhumka [2], Carsten Maple [1]
[1] WMG, University of Warwick, UK
[2] Department of Computer Science, University of Warwick, UK
   Paper       Presentation   

HATBED: A Distributed Hardware Assisted Testbed for Non-invasive Profiling of IoT Devices
Li Yi, Junyan Ma, Te Zhang
School of Information Engineering, Chang’an University, Xi’an, China
   Paper       Presentation   

Towards a Methodology for Experimental Evaluation in Low-Power Wireless Networking
Romain Jacob [1], Carlo Alberto Boano [2], Usman Raza [3], Marco Zimmerling [4], Lothar Thiele [1]
[1] ETH Zurich, Switzerland
[2] Graz University of Technology, Austria
[3] Toshiba Research Europe Limited, UK
[4] TU Dresden, Germany
   Paper       Presentation       Video

Lessons Learned and Challenges on Benchmarking Publish-Subscribe IoT Platforms
Ana Aguiar [1], Ricardo Morla [2]
[1] University of Porto, Instituto de Telecomunicações, Porto, Portugal
[2] University of Porto, INESC TEC, Porto, Portugal
   Paper       Presentation