IEEE ISCC 2020: IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS 2020
PROGRAM FOR TUESDAY, JULY 7TH

09:00-12:30 Session 1A: 2nd International Workshop on Social (Media) Sensing (SMS)

Billions of users interact daily with social media platforms such as Facebook, Instagram, Pinterest, and Twitter. Checking one's preferred social media platform is becoming the first thing people do on waking up, and the last thing they do before falling asleep.

Without a doubt, social media have revolutionized the behavior of millions of people and the functioning of entire ecosystems (e.g., news, entertainment, and business, to name a few). Indeed, within social media, people communicate, collaborate, and share information about people, brands, products, services, personal preferences, issues, and opinions.

It is therefore not surprising that social media sensing is being used to understand people's opinions and behaviors. For instance, politicians analyze social media to gauge the public mood and to improve their political decisions, enterprise managers analyze social media conversations to increase customer engagement by tracking what people think about products and services, city administrators use social media to gain insights into citizens' opinions with the goal of enhancing the quality of life in the city, and advertisers use advanced analytics tools to analyze what people think of a brand and to produce tailored commercial messages.

However, the use of social media sensing to gain insights about people and society is still in its infancy; it is far from trivial and involves many different disciplines: from computer science to social engineering, from psychology to linguistics, from semiotics to economics, from sociology to philosophy.

09:00
What influences sentiment analysis on social networks: a case study

ABSTRACT. Sentiment analysis, social network analysis, and social media sensing are becoming important tools in different contexts, ranging from social interactions and touristic activities to shopping and e-commerce. In particular, these days they are showing their potential as a way to let people communicate and keep in touch during the COVID-19 quarantine, and to monitor and understand people's moods and feelings. On the one hand, privacy issues and international laws and acts (e.g., the GDPR) drive such analyses, with the aim of protecting people's privacy and security. On the other hand, they can somewhat limit such activities. Hence, the strategies to adopt should be identified precisely and accurately, with the aim of balancing privacy concerns and sentiment analysis activities. Taking into account the requirements of an Urban Innovation Action project, which is based on the active involvement of citizens, this work aims to describe the limitations and potential of social network monitoring and analysis for understanding users' moods about the project actions adopted in the city of Ravenna (Italy) to improve specific areas.

09:15
COVID-19 Outbreak through Tweeters’ Words: Monitoring Italian Social Media Communication about COVID-19 with Text Mining and Word Embeddings

ABSTRACT. In this paper we analyze Italian social media communication about COVID-19 through a Twitter dataset collected over two months. The text corpus was studied in terms of its sensitivity to the social changes affecting people's lives in this crisis. In addition, the results of a sentiment analysis performed with two lexicons were compared, and word embedding vectors were created from the available plain texts. We then tested the informative effectiveness of the word embeddings and compared them to a bag-of-words approach in terms of text classification accuracy. First results showed a certain potential of these textual data for describing the different phases of the outbreak. However, a different strategy is needed for a more reliable sentiment labeling, as the results proposed by the two lexicons were discordant. Finally, although presenting interesting results in terms of semantic similarity, word embeddings did not show a predictive ability higher than the term frequency vectors.
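
The bag-of-words baseline mentioned in the abstract can be sketched in a few lines. The snippet below is an illustrative stand-in, not the authors' pipeline: it classifies a text by cosine similarity between its term-frequency vector and per-class centroids (the training texts and labels are invented for the example).

```python
from collections import Counter
import math

def bow_vector(text):
    """Bag-of-words term-frequency vector as a Counter."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def nearest_centroid(train, text):
    """train: {label: [texts]}. Assign the label whose summed
    bag-of-words centroid is most similar to the input text."""
    vec = bow_vector(text)
    best_label, best_sim = None, -1.0
    for label, texts in train.items():
        centroid = Counter()
        for t in texts:
            centroid.update(bow_vector(t))
        sim = cosine(vec, centroid)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label
```

A word-embedding alternative would replace `bow_vector` with an average of per-word vectors; the comparison in the paper is between exactly these two representations.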

09:30
A data visualization interactive exploration of human mobility data during the COVID-19 outbreak: a case study

ABSTRACT. In this paper, we present a real-world study in which a community-based tracking infrastructure has been put to good use for understanding human mobility during the COVID-19 outbreak, in order to predict/counter its diffusion. In particular, the infrastructure, deployed in the Madeira Archipelago (Portugal), is able to collect a massive amount of spatio-temporal data, which can be enriched with potentially independent data sources of additional value (such as the official number of people affected by the coronavirus disease). These enriched hyper-local data can be exploited to visualize the correlation between COVID-19 diffusion and the mobility of people. We here present the deployed infrastructure and the interactive data visualization system built to make sense of human mobility data during the COVID-19 outbreak.

09:45
Air Quality Control through Bike Sharing Fleets

ABSTRACT. Air quality and the presence of fine particulate matter are crucial factors in human health, especially in urban scenarios. In this context, smart mobility coupled with low-cost sensors can create a distributed and sustainable platform for social sensing, able to provide pervasive data to citizens and public administrations. Sustainable and eco-aware decisions can then be supported by empirical evidence, resulting in improved quality of life and city administration. In this paper, we present ArduECO, a simple Arduino-based wireless device capable of collecting air quality data. Without loss of generality, we have designed our device as a box that can be installed on a bike; in this way, beyond private bikes, municipalities could exploit their bike sharing fleets as pervasive sensing systems.

10:00
Untangling between fake-news and truth in social media to understand the Covid-19 Coronavirus

ABSTRACT. “Covid-19 is a virus developed to rule the world” is just one of the many pieces of fake news published on the Web. In this pandemic period, the Web is flooded with news, allegedly true or blatantly false. To understand how fake news is affecting the perception of Covid-19, we selected 40 news items (either true or fake) related to the origin, diffusion, treatment, and effects of Covid-19, and we asked 293 volunteers to express their opinion on the truthfulness of each item. We then propose an Awareness index to compute the degree of knowledge of the volunteers. The results highlight widespread ignorance of medical news, ignorance that goes beyond educational background. The study highlights the need for health institutions to enter social media platforms in order to clearly explain what is true and what is false about Covid-19.
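
The abstract does not give the exact definition of the Awareness index; a minimal plausible version, the fraction of the surveyed items a volunteer judges correctly, can be sketched as follows (the function name and input format are assumptions for illustration).

```python
def awareness_index(judgements, truth):
    """judgements: {news_id: believed-true?}; truth: {news_id: actually-true?}.
    Returns the fraction of items the volunteer judged correctly, in [0, 1]."""
    correct = sum(judgements[i] == truth[i] for i in truth)
    return correct / len(truth)
```

Averaging this score over the 293 volunteers, possibly per topic (origin, diffusion, treatment, effects), would yield per-group knowledge degrees of the kind the study reports.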

10:15
On the Efficiency of Decentralized File Storage for Personal Information Management Systems
PRESENTER: Mirko Zichichi

ABSTRACT. This paper presents an architecture, based on Distributed Ledger Technologies (DLTs) and Decentralized File Storage (DFS) systems, to support the use of Personal Information Management Systems (PIMS). DLTs and DFS are used to manage data sensed by mobile users equipped with devices with sensing capability. DLTs guarantee the immutability, traceability, and verifiability of references to personal data, which are stored in the DFS. In fact, the inclusion of data digests in the DLT makes it possible to obtain an unalterable reference and a tamper-proof log, while remaining compliant with regulations on personal data, i.e., the GDPR. We provide an experimental evaluation of the feasibility of using a DFS. Three different scenarios have been studied: i) a proprietary IPFS approach with a dedicated node interfacing with the data producers, ii) a public IPFS service, and iii) Sia Skynet. Results show that, through proper configuration of the system infrastructure, it is viable to build a decentralized Personal Data Storage (PDS).
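
The digest-anchoring idea described above boils down to hashing the off-chain content and storing only the hash on the ledger. A minimal sketch (assuming SHA-256 as the digest function, which the abstract does not specify):

```python
import hashlib

def digest(data: bytes) -> str:
    """Content digest to be anchored on the DLT; the raw data stays in the DFS."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, anchored: str) -> bool:
    """Tamper check: recompute the digest of the DFS copy and
    compare it with the reference stored on the ledger."""
    return digest(data) == anchored
```

Because only the digest goes on-chain, the personal data itself can later be deleted from the DFS, which is how such designs stay compatible with GDPR erasure rights.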

09:00-12:30 Session 1B: 8th Workshop on Communications in Critical Embedded Systems (WoCCES)

Currently, there is a growing demand for embedded systems in several areas, e.g., agriculture, industry, civil engineering, transportation, and residential applications. The fast growth of the computing power of microprocessors and microcontrollers, together with higher connectivity, has led to a whole new class of applications, such as autonomous vehicle systems (air, ground, underwater, etc.) and mobile robots. Moreover, depending on the application, these systems are not only embedded systems but critical embedded systems, as they deal with risks to lives and high-value assets.

Due to the high connectivity provided by the Internet of Things and the flexibility of Cloud Computing-based systems, a large number of devices can currently be deployed in areas of interest that are likely to be considered critical. Meeting very specific requirements on communications and control is mandatory to guarantee that sensitive tasks are accurately performed in the target environments.

WoCCES aims at bringing together researchers working in the fields of communications and control in critical embedded systems, including Connected Vehicles, Internet of Things, Smart Cities, Cloud and Fog Computing, Security, Cryptography, and Privacy.

09:00
STUART: ReSilient archiTecture to dynamically manage Unmanned aeriAl vehicle networks undeR atTack

ABSTRACT. The growing demand for Unmanned Aerial Vehicles (UAVs) has the potential to increase productivity and economic activity in industry, due to their use in various fields, such as health, security, aerial photography, surveillance, military missions, agriculture, etc. The production and use of UAVs have increased lately, and there is a demand for improved decision-making, security, safety, and knowledge about the relevant technologies. Thus, these vehicles must continually adapt to complex missions where they face unpredictable issues. In this context, the aim of this paper is to advance the state of the art through the definition and development of a resilient architecture for UAVs that dynamically manages the network, even when subjected to an attack during a mission, integrating security and safety methods. The architecture will be composed of three modules: (1) a decision-making module, (2) a diagnosis module, and (3) a resilience module. This work also investigates the incorporation of safety and security as a unified concept in the development of UAVs.

09:15
Efficient File Collections for Embedded Devices

ABSTRACT. This paper studies methods for efficiently transferring and storing collections of related files on embedded devices and in other environments with limitations on storage, network, and energy use. Files in a collection grouped by purpose (e.g., system configurations) or other aspects often exhibit substantial inter-file similarities. These similarities may be used to achieve significant reductions in the network resources required for transferring or updating the collection, as well as in the storage resources required on the embedded devices on which it is stored.

09:30
Data Mining applied to the navigation task in autonomous robots

ABSTRACT. This paper reports the results of classification algorithms used in data mining for the creation of a navigation model applied to an autonomous robot performing the wall-following task. The goal is to make the robotic vehicle capable of moving, steering, and making decisions autonomously, avoiding collisions while using the smallest possible set of resources. A simulator was used to develop a test environment and the robotic vehicle, and to simulate the proposed navigation system. The results suggest that the J48 (C4.5) classification algorithm obtained the most satisfactory results, generating a model with a high degree of confidence in decision making.

09:45
Cognitive-LoRa: adaptation-aware of the physical layer in LoRa-based networks
PRESENTER: Lucas Figueiredo

ABSTRACT. Network technologies for large areas based on sub-GHz bands have emerged as a way to provide long-range communication with low cost and complexity. Among the various existing solutions, LoRa is arguably the most widely adopted and promising in this context. Its main application has been to enable ubiquitous IoT connectivity with a simple network and management structure. Some factors must be taken into account when deploying a LoRa network. The type of application directly affects LoRa network communication through parameters such as the center frequency, the spreading factor, the bandwidth, and the coding rate chosen by each node. In this work we study the characteristics of the LoRa physical layer and its automatic configuration based on the perceived signal-to-noise ratio (SNR). On this basis, we propose an adaptive protocol for LoRa networks with low overhead and complexity. Results obtained in a real scenario show that configuration changes are made in only 23% of the observation time, with an average SNR gain of 4.68%.
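
The SNR-driven adaptation the abstract describes can be illustrated with a simple rule: pick the smallest spreading factor (fastest data rate) whose demodulation limit is still met. The per-SF SNR limits below are the commonly cited LoRa values; the paper's actual protocol and margin are not specified in the abstract, so both are assumptions here.

```python
# Approximate per-SF demodulation SNR limits (dB), commonly cited for LoRa.
SF_SNR_LIMIT = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def pick_spreading_factor(snr_db, margin_db=3.0):
    """Choose the smallest spreading factor (fastest data rate) whose
    demodulation limit, plus a safety margin, the observed SNR satisfies."""
    for sf in sorted(SF_SNR_LIMIT):
        if snr_db >= SF_SNR_LIMIT[sf] + margin_db:
            return sf
    return 12  # worst link: fall back to the most robust setting
```

A node would re-run this choice only when the measured SNR crosses a threshold, which is consistent with the paper's observation that reconfiguration is needed in only a small fraction of the observation time.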

10:00
A fog architecture for privacy-preserving data provenance using blockchains

ABSTRACT. Data provenance tracks the origin of information with the goal of improving trust among interested parties. Data provenance is an important requirement for a range of applications such as food safety, supply chains, and tracking of epidemic outbreaks. Many of these applications are inherently distributed and require high levels of privacy and trust.

Fog computing and blockchains are recent technological solutions born from advances in cloud and distributed computing. Fog computing focuses on bringing the cloud closer to the edge user, while blockchain provides transparency without the need for a trusted centralized entity. The two can be complementary: fog distributes data and computation, while the blockchain can keep them consistent and trustworthy. These technologies can be used to improve several aspects of a data provenance context.

In this paper we describe an architecture that allows the tracking of data provenance in a wide-area distributed fog. While we employ blockchains to provide transparency, localized Fogs have control over what is made public on the cloud. The architecture proposed in this paper enables fast and reliable data provenance for clients executing in the Fog using software services that keep the information consistent across all interested parties in the cloud. Any information in the system is associated with a proof of authenticity, but authors have control over the eventual publication of the information.

Our proposal was built upon the well-established W3C PROV provenance model, which simplifies adoption of the framework.

We developed an application, consisting of a client and web services, that is able to store and share provenance information using open standards in a blockchain. The related work, architecture, and performance tests for the proposal are presented here.

10:15
Comparison of secure communication with AES between embedded system and general purpose computer

ABSTRACT. Embedded systems are the combination of hardware and software designed to perform specific functions. An embedded system is generally part of a larger system that uses a computer network to communicate with other devices. For this reason, secure communication is of great importance to guarantee the confidentiality of the transmitted information, which can be provided by using encryption algorithms. Security is usually not a requirement when designing an embedded system, so few techniques are implemented to secure these devices. This paper presents an evaluation of the run-time performance of AES using RELIC, a cryptography library developed for embedded systems, in communication between two processes on a general-purpose computer, between a general-purpose computer and an embedded system, and between two embedded systems. We consider the key sizes supported by the algorithm, the size of the transmitted data, and the architecture used, also measuring the ratio of the time spent on encryption and decryption to the total time of the secure communication. The results demonstrate that the architecture has the greatest impact on encryption and decryption times, along with the message sizes. However, even for communication between embedded systems, the share of encryption and decryption time in the entire secure communication is smaller than 14%.
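
The "share of crypto time in the whole secure communication" metric can be sketched with a simple timing harness. Since AES is not in the Python standard library, the snippet uses a toy XOR stream as a stand-in cipher; in the paper's setup, RELIC's AES calls and a measured transmission time would take these places.

```python
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for AES: same call shape, NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def crypto_share(message: bytes, key: bytes, transfer_s: float) -> float:
    """Fraction of the total 'secure communication' time spent in
    encrypt+decrypt, given transfer_s, a measured (or simulated)
    transmission time in seconds."""
    t0 = time.perf_counter()
    ciphertext = xor_cipher(message, key)
    plaintext = xor_cipher(ciphertext, key)  # XOR is its own inverse
    crypto_s = time.perf_counter() - t0
    assert plaintext == message
    return crypto_s / (crypto_s + transfer_s)
```

The paper's headline result is precisely this ratio staying below 14% even on embedded hardware, i.e., network transfer, not encryption, dominates the total time.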

10:30
Context-Aware Operation for Unmanned Systems with HAMSTER

ABSTRACT. Unmanned Vehicles (UVs) can benefit from contextual information to improve their operation and security. In fact, a node in any network might assume different levels of criticality depending on several factors, such as the state of its inner components, data relevance, provided services, and contextual information. Being aware of an individual node's criticality level helps determine more consistent approaches to communication and security/safety implementations. In this paper, the integration of security contextual information into a UV communication architecture is demonstrated, in order to increase its safety, overall security, and survivability.

10:45
Exploratory Analysis of Public Transportation Data of Curitiba, Brazil

ABSTRACT. The dynamics between technologies and everyday life are increasingly volatile, pushing for greater convenience and ease in both trivial and complex processes. A consequence of the wide deployment of Internet of Things-based devices in smart city applications clearly reflects these dynamics in the huge amount of data to be handled. To extract useful information from such data, a sound approach to data analysis is an interdisciplinary one, i.e., engineers and computer scientists working as close partners of the problem domain experts. This paper addresses IoT data from a Public Transportation System (PTS), aiming to contribute to the data analysis process with new insights about bus-based PTS applications. Open data from the Curitiba PTS were explored, looking for approaches to handle and analyze this kind of data and aiming to find information of value to users, managers, and planners of the public transport network in Curitiba.

09:00-12:30 Session 1C: 12th Workshop on Performance Evaluation of Communications in Distributed Systems and Web based Service Architectures (PEDISWESA)

This is the 12th Workshop on Performance Evaluation of Communications in Distributed Systems and Web based Service Architectures (PEDISWESA) 2020, organized in conjunction with ISCC 2020.

ISCC takes place annually, usually in the Mediterranean region, in locations such as Egypt, Greece, France, Tunisia, Portugal, Spain, Morocco, Italy, and Brazil. In 2020, the Symposium will take place in Rennes, France.

PEDISWESA sessions will include presentations on new research results. Papers describing original work are invited in any of the computer and communications areas that the Call for Papers considers. Accepted papers will be included in the ISCC 2020 Conference Proceedings, which will be subject to independent peer-review procedures for quality, and may be eligible for inclusion in the IEEE Xplore® Digital Library. Merit, relevance, and originality will guide paper acceptance. Best Paper Award and Best Student Paper Awards will be presented.

09:00
Obstacle detection based on Cooperative Intelligent Transport Systems data

ABSTRACT. Cooperative Intelligent Transport Systems (C-ITS) are growing, and the data they produce is increasing exponentially. This amount of data will soon be large enough to fall within the big data paradigm. We propose to exploit these data as a data stream. We aim to detect anomalies on the road using concept drift detection methods over the data stream. To this purpose, we created a simulator to generate the data used in our study. We use two scenarios, a stopped car and a growing pothole, in our simulation. We focus our study on the vehicle heading data, to which we apply the Page-Hinkley and ADWIN methods. We show that the use of C-ITS data could improve the automatic detection of incidents.
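
Of the two drift detectors named above, Page-Hinkley is compact enough to sketch directly. The version below is a minimal self-contained implementation of the standard test for an increase in the stream mean (the default `delta` and `threshold` values are illustrative, not the paper's settings).

```python
class PageHinkley:
    """Minimal Page-Hinkley drift detector for increases in the mean."""
    def __init__(self, delta=0.005, threshold=5.0):
        self.delta, self.threshold = delta, threshold
        self.mean, self.n = 0.0, 0
        self.cum, self.cum_min = 0.0, 0.0

    def update(self, x):
        # Running mean of the stream so far.
        self.n += 1
        self.mean += (x - self.mean) / self.n
        # Cumulative deviation above the mean, minus a tolerance delta.
        self.cum += x - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        # Alarm when the cumulative sum rises far above its historical minimum.
        return self.cum - self.cum_min > self.threshold
```

Feeding vehicle heading values through `update` one by one, a sustained shift in heading (e.g., cars steering around a pothole) pushes the cumulative statistic past the threshold and raises an alarm.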

09:15
Intelligent Malicious URL Detection with Feature Analysis

ABSTRACT. Website security is an important issue that must be pursued to protect Internet service users. Traditionally, blacklists of malicious websites are maintained, but they do not help in the detection of new malicious websites. This work proposes a machine learning architecture for intelligently detecting malicious URLs. Forty-one features of malicious URLs are extracted from the data types and processes of the domain, Alexa, and obfuscation. The ANOVA (ANalysis Of Variance) test and the XGBoost (eXtreme Gradient Boosting) algorithm are used to extract the 17 most important features for analyzing malicious URLs. A dataset including 13,027 benign URLs and 13,027 malicious URLs is used to build the XGBoost-based malicious URL detector, which has a detection accuracy of more than 99%.
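
Feature extraction of the kind described above can be illustrated with a few lexical URL features. These are common examples from the malicious-URL literature, not a reproduction of the paper's 41 (or selected 17) features.

```python
import re
from urllib.parse import urlparse

def url_features(url: str) -> dict:
    """A handful of lexical features of the kind fed to a
    malicious-URL classifier (illustrative only)."""
    host = urlparse(url).netloc
    return {
        "length": len(url),                          # long URLs are suspicious
        "digits": sum(c.isdigit() for c in url),     # digit density
        "specials": len(re.findall(r"[@\-_%?=&]", url)),  # obfuscation chars
        "host_is_ip": bool(re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", host)),
        "depth": url.count("/"),                     # path depth
    }
```

In the paper's pipeline, vectors like these would be ranked by an ANOVA test and fed to an XGBoost classifier.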

09:30
EdgeKV: Distributed Key-Value Store for the Network Edge
PRESENTER: Karim Sonbol

ABSTRACT. We are witnessing emerging storage and computation technologies that make the cloud no longer the only, or best, source of data storage and computation. With improvements in computation and storage resources, data access through the network becomes the bottleneck for several cloud applications. Even with high-speed networks, the high latency of the cloud makes it unfeasible or unfavourable for latency-sensitive applications such as autonomous driving, smart factories, and video streaming. Utilizing the resources of the network edge provides a solution, but it can be quite challenging to maintain stability, fault tolerance, and efficiency in a large-scale system. In this paper, we introduce and present the design of EdgeKV, a novel general-purpose distributed key-value store for the network edge. We show how the decentralized design of EdgeKV achieves high efficiency and scalability while providing flexibility, ease of use, and data privacy. We evaluated our prototype on the Grid'5000 testbed with multiple realistic Yahoo! Cloud Serving Benchmark (YCSB) workloads. Our initial results show that EdgeKV achieves 72% higher throughput and 47% lower latency on average than a centralized cloud storage system for read-dominated workloads.
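
Decentralized key placement is the core problem any edge key-value store must solve; a standard building block is consistent hashing, sketched below. This is a generic illustration of the technique, not EdgeKV's actual placement scheme, and the node names are invented.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Map each key to the first node clockwise on a hash ring.
    Virtual nodes (vnodes) smooth out the load distribution."""
    def __init__(self, nodes, vnodes=64):
        self.ring = sorted(
            (self._h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes)
        )
        self.keys = [h for h, _ in self.ring]

    @staticmethod
    def _h(s):
        return int(hashlib.sha256(s.encode()).hexdigest(), 16)

    def locate(self, key):
        """Node responsible for key: first ring position >= hash(key)."""
        i = bisect.bisect(self.keys, self._h(key)) % len(self.ring)
        return self.ring[i][1]
```

The appeal for the edge setting is that adding or removing an edge node remaps only the keys adjacent to it on the ring, so most data stays in place.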

09:45
Joulin: Blockchain-based P2P Energy Trading Using Smart Contracts

ABSTRACT. As a decentralized immutable ledger on which several trustless peers can reach consensus without the need for any trusted third party, blockchain technology fits perfectly with the peer-to-peer (P2P) energy trading paradigm. In this paper, we propose, design, and analyze a marketplace for energy trading based on smart contracts on the blockchain. The proposed system, named Joulin, serves as a competitive and efficient marketplace where peers can produce, buy, and sell energy depending on their needs. As a proof of concept, we developed a prototype of the Joulin system using the Ethereum blockchain. Our results, in terms of usability, flexibility, and resiliency, demonstrate the potential to achieve an easily extendable and reliable system with low transaction costs. Low gas costs and quick response times demonstrate usability. Our smart contracts have also been tested with security tools to ensure that they are not vulnerable to outside manipulation.

10:00
A Proactive-Restoration Technique for SDNs

ABSTRACT. Failure incidents temporarily prevent the network from delivering services properly. Such a deterioration in services is called service unavailability. The traditional fault management techniques, i.e., protection and restoration, are inevitably affected by service unavailability due to the convergence time required to achieve recovery when a failure occurs. However, with the global view provided by software-defined networking, failure prediction is becoming attainable, which in turn reduces the service interruptions caused by failures. In this paper, we propose a proactive restoration technique that reconfigures the vulnerable routes likely to be affected if a predicted failure indeed occurs. The proposed approach allocates alternative routes based on the probability of failure. Experimental evaluation on real-world and synthetic topologies demonstrates that the proposed technique can improve network service availability to reach up to 97%. Based on the obtained results, further directions are suggested towards achieving further advances in this research area.
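
Allocating routes by failure probability, as described above, reduces to a shortest-path problem: a path's survival probability is the product of its links' survival probabilities, so maximizing it is equivalent to minimizing the sum of `-log(1 - p_fail)` weights. The sketch below is a generic illustration of that reduction, not the paper's algorithm.

```python
import heapq
import math

def most_reliable_path(graph, src, dst):
    """graph: {u: {v: p_fail}} with per-link failure probabilities.
    Dijkstra under -log(1 - p_fail) weights maximizes path survival."""
    dist, prev = {src: 0.0}, {}
    pq, seen = [(0.0, src)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, p in graph.get(u, {}).items():
            nd = d - math.log(1.0 - p)  # additive cost of a risky link
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path from dst back to src.
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]
```

An SDN controller with predicted per-link failure probabilities could pre-install such routes for the flows crossing the links most likely to fail.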

10:15
Survey on the Cloud-IoT paradigms: Taxonomy and architectures

ABSTRACT. In the last decade, exponential growth in the number of smart objects has been observed. These devices, designed to enhance human well-being, are able to communicate and cooperate with each other to form the IoT (Internet of Things). However, the IoT faces many challenging issues on its way to becoming a widespread technology, such as limited battery lifetime and poor computing and storage capacities. To cope with this, the Cloud model has been rethought to bring more flexibility to the way resources are managed in the IoT. Such a solution provides potentially unlimited computing and storage capabilities, as well as many other services. In the literature, numerous architectures have been proposed to combine these two technologies by introducing emerging computing paradigms such as Fog computing, Edge computing, and Mobile Cloud computing. In this paper, we survey the existing IoT-Cloud architectures and classify them by proposing a new taxonomy based on appropriate criteria, whose relevance we discuss.

10:30
On Securing IoT from Deep Learning Perspective
PRESENTER: Yazan Otoum

ABSTRACT. The extensive growth of the Internet of Things (IoT) has been realized in diverse applications such as smart homes, smart cities, Intelligent Transport Systems (ITS), smart factories, and so on. IoT integrates billions of smart devices (predicted to jump from 27 billion in 2017 to 125 billion by 2030) and establishes communication among them. However, this huge connectivity introduces a further need for analysis from the perspective of security. The involvement of millions of things and users brings increasing vulnerability to the IoT environment. On the other hand, Deep Learning (DL) approaches, which come from the family of machine learning (ML), have shown their efficiency in many research fields. Current studies have shown the effectiveness of DL approaches in IoT security applications. In this paper, we first present a detailed analysis of IoT with its security requirements and challenges. Subsequently, we elaborate on the role of DL approaches in IoT security. We survey the state-of-the-art research on securing IoT environments using DL approaches. A brief comparative analysis of DL algorithms, such as RNNs, LSTMs, CNNs, DBNs, AEs, etc., is also provided. Finally, we highlight research problems in current studies and outline future directions for DL algorithms in the IoT security domain.

10:45
Energy Efficiency in SDDC: Considering Server and Network Utilities

ABSTRACT. Software Defined Networking (SDN) has eased the management and control of networks through the separation of the control and data planes. Software-defined data centers (SDDCs) automate the management of end systems, i.e., physical machines and virtual machines. In data centers, although there is a vast body of work on minimizing the power consumption of physical machines and on virtual machine migration performance, the energy efficiency of the network components has received little attention. In this paper, a software-based energy efficiency framework that jointly minimizes the power consumption of end systems and network components in an SDDC is proposed. Moreover, a novel metric based on physical server utility intervals, namely the Ratio for Energy Saving of Physical Machines (RESPM), which measures how energy efficient the physical servers are with respect to the virtual machines residing within them, is proposed. To jointly maximize network energy efficiency and RESPM values, an Integer Programming (IP) formulation has been introduced. Experiments conducted on real-world virtual migration traces show that the proposed framework jointly reduces the power consumption of end systems and network components. The system has shown an improvement of 9% in RESPM, 35% energy saving in RESDN, and more than 50% in link savings.

11:00
Data Driven Network Performance Inference From Within The Browser
PRESENTER: Imane Taibi

ABSTRACT. The ability to monitor web and network performance is becoming crucial to understanding the reasons behind any service degradation. Such monitoring is also helpful for understanding the relationship between end users' quality of experience and the underlying network state. Many troubleshooting tools have been proposed recently. They mainly consist of conducting active network measurements from within the browser. However, most of these tools either lack accuracy or perform measurements against a limited set of servers. They are also known to introduce non-negligible overhead on the network. The objective of this paper is to propose a new approach based on passive measurements freely available from within the web browser, coupling them with deep learning models to estimate the latency and bandwidth of the underlying network without injecting any additional measurement traffic. We develop and implement our approach and compare its estimation accuracy with the best-known web-based network measurement techniques available today. We follow a controlled experimental approach to derive our inference models. The results of our study show that our approach achieves very good accuracy compared to the others; its accuracy is even higher than that of most standard techniques, and very close to the rest.

09:00-12:30 Session 1D: 5th edition of the IEEE workshop on ICT Solutions for eHealth (ICTS4eHealth) : Session I

e-Health is one of the major research topics that have been attracting cross-disciplinary research groups. The deployment of new emerging ICT technologies for Health, especially based on Cloud computing, Internet of Things (IoT), and Computational Intelligence, is attracting the interest of many researchers. ICTS4eHealth 2020 is the 5th edition of the International IEEE Workshop dedicated to ICT solutions for e-Health, especially based on Cloud computing, Internet of Things (IoT), and Computational Intelligence. The workshop will bring together researchers from academia, industry, government, and medical centers in order to present the state of the art in the emerging area of the use of cloud systems in connected health infrastructure and applications, and the use of IoT and Computational Intelligence techniques in the area of eHealth.

09:00
A network performance view of a biobanking system for diagnostic images

ABSTRACT. A significant contribution of ICT to healthcare is constituted by systems automating and enhancing the management of research and clinical data. More specifically, PACS (Picture Archiving and Communication Systems) have improved the efficiency of managing diagnostic images and clinical data. Their evolution, image biobanks, is now enabling new collaborations and analysis possibilities similar to, and beyond, those of biobanks (their biological-sample analogue and complement). In this work we describe and evaluate the network performance of a biobanking system for diagnostic images, based on the XNAT open-source platform, as implemented and operated by Bio Check Up Srl. The point of view of the user is adopted in assessing the performance in three setups: local (virtual machines communicating within a single host), LAN (organization-local access), and VPN (remote secure access through the Internet). Both upload and download use cases are considered, with both a medium-sized and a large set of diagnostic images. Several metrics extracted from traffic traces captured in the experimental campaign are discussed. Results show that the current setup is well provisioned for satisfying the planned number of concurrent users, and point to further experimental campaigns.

09:15
A Web-based Information System for the Management of ICU Beds During the Coronavirus Outbreak

ABSTRACT. With the worldwide pandemic caused by Covid-19, many places in the world are unable to rapidly measure the number of intensive care unit (ICU) beds existing and available in a city, state, or country. Knowing the number of ICU beds precisely and in real time is very important to anticipate health-system collapse and to create strategies for the government to provide new ICU beds for patients. In Rio Grande do Norte, a state in Brazil, the State Health Department therefore asked us to rapidly develop a technology to integrate ICU bed and patient data specifically related to Covid-19 from the 58 health units (hospitals and clinics) in the state. This article presents the methodology and strategies used for the development and implementation of a web information system, called Leitos, for the management of ICU and semi-ICU beds assigned to Covid-19 patients. Today, more than 200 government agents and clinical unit staff use this system, which presents the real-time situation of the ICU beds in the state.

09:30
A mHealth solution for contact-less self-monitoring of blood oxygen saturation

ABSTRACT. Mobile health (mHealth) technologies play a fundamental role in epidemiological situations such as the ongoing outbreak of COVID-19, because they allow citizens to self-monitor their health status while staying at home, remaining in constant remote contact with physicians despite the quarantine. Special care should be given to self-monitoring vital parameters such as blood oxygen saturation (SpO2), whose abnormal values are a warning sign of potential infection by COVID-19. SpO2 is commonly measured with a pulse oximeter, which requires skin contact and hence could be a vehicle for spreading contagious infections. For this reason, contact-less solutions for self-monitoring of SpO2 would be beneficial. In this paper we present an mHealth approach to self-monitoring of SpO2 that does not require any contact device, since it is based on video processing. Video frames of the patient's face acquired by a camera are processed in real time to extract the remote photoplethysmography (rPPG) signal, from which an estimate of SpO2 is derived. Preliminary experimental results show that the SpO2 values obtained by our contact-less solution are consistent with the measurements of a commercial pulse oximeter used as the reference device.
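A common way to turn an rPPG signal into an SpO2 estimate is the ratio-of-ratios method applied to two color channels. The abstract does not disclose the exact estimator used, so the following sketch is only a generic illustration; the calibration coefficients a and b are placeholders that a real system would fit against a reference oximeter.

```python
import math

def spo2_ratio_of_ratios(red_means, blue_means, a=100.0, b=5.0):
    """Estimate SpO2 from per-frame mean intensities of the red and blue
    channels of a face video (ratio-of-ratios method). a and b are
    illustrative placeholder calibration coefficients."""
    def ac_dc(samples):
        dc = sum(samples) / len(samples)  # slowly varying baseline
        ac = math.sqrt(sum((s - dc) ** 2 for s in samples) / len(samples))  # pulsatile RMS
        return ac, dc
    ac_r, dc_r = ac_dc(red_means)
    ac_b, dc_b = ac_dc(blue_means)
    r = (ac_r / dc_r) / (ac_b / dc_b)     # ratio of ratios
    return a - b * r

# Synthetic 30 fps signals: a 1.2 Hz pulse riding on a constant baseline.
t = [i / 30.0 for i in range(300)]
red = [150 + 2.0 * math.sin(2 * math.pi * 1.2 * x) for x in t]
blue = [100 + 1.5 * math.sin(2 * math.pi * 1.2 * x) for x in t]
print(round(spo2_ratio_of_ratios(red, blue), 1))  # → 95.6
```

In practice the per-frame channel means would come from a face region tracked across the video, and the signal would be band-pass filtered around plausible heart rates before computing the AC component.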

09:45
Towards an Ambient Estimation of Stool Types to Support Nutrition Counseling for People affected by the Geriatric Frailty Syndrome

ABSTRACT. A person’s stool can tell a lot about his or her state of health. Among other things, diarrhea and constipation lead to reduced digestive efficiency. For social reasons, many people regard their own stool as a shameful topic. However, how effectively food is digested has a direct influence on the recommendations for patients undergoing nutritional therapy. This paper outlines a prototypical system for the automatic and ambient classification of stool forms into three classes based on the Bristol Stool Scale: thin, normal, and hard stool. The stool is recorded in transit, after exiting the anus and until it reaches the toilet floor, to avoid the problems of conventional procedures. Corresponding data were generated under laboratory conditions, and various machine learning and deep learning procedures were applied to them. The evaluation results show that two out of five algorithms achieve classification rates of 100%.

10:00
Diagnosis of COVID-19 in CT image using CNN and XGBoost

ABSTRACT. Coronavirus disease 2019 (COVID-19) has infected more than 3.6 million people worldwide and is responsible for more than 250,000 deaths. A major problem in the diagnosis of COVID-19 is the inefficiency and scarcity of medical tests. The use of computed tomography (CT) has shown promise for the evaluation of patients with suspected COVID-19 infection. CT analysis is complex and requires specialist effort, which can lead to diagnostic errors. CAD systems can minimize the problems generated by specialists' analysis of CT scans. Recently, convolutional neural networks (CNNs), a class of deep learning models, have been employed in the development of CAD systems. This article presents a methodology for diagnosing COVID-19 that uses a CNN for feature extraction from CT images and XGBoost for classification. The CNN extracts features from 708 CT scans, 312 with COVID-19 and 396 without, and the extracted features are then classified with XGBoost. The results show an accuracy of 95.07%, a recall of 95.09%, a precision of 94.99%, an F-score of 95%, an AUC of 95, and a kappa index of 90. These results show that the proposed methodology can be used by specialists as a diagnostic aid system.
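All of the reported figures can be derived from a single binary confusion matrix. The sketch below recomputes accuracy, recall, precision, F-score, and Cohen's kappa from hypothetical counts chosen only to match the 708-scan class split (312 COVID, 396 non-COVID); the paper's actual confusion matrix is not given here.

```python
def binary_metrics(tp, fp, fn, tn):
    """Accuracy, recall, precision, F-score and Cohen's kappa for a
    binary confusion matrix (positive class = COVID-19)."""
    n = tp + fp + fn + tn
    accuracy = (tp + tn) / n
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f_score = 2 * precision * recall / (precision + recall)
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    p_e = p_yes + p_no
    kappa = (accuracy - p_e) / (1 - p_e)
    return accuracy, recall, precision, f_score, kappa

# Hypothetical counts for the 708-scan mix (312 COVID, 396 non-COVID)
acc, rec, prec, f1, k = binary_metrics(tp=296, fp=19, fn=16, tn=377)
print(f"acc={acc:.2%} rec={rec:.2%} prec={prec:.2%} f1={f1:.2%} kappa={k:.3f}")
```

With these invented counts the metrics land near the reported values, which illustrates why kappa (around 0.90 here) is lower than raw accuracy: it discounts the agreement expected by chance given the class priors.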

10:15
Gamification, mHealth and user adherence

ABSTRACT. MyDiabetes is a mobile application targeted at the management of type I diabetes. A considerable challenge faced by diabetes-management applications is user adherence and motivation. In this article we describe the different gamification techniques implemented in the MyDiabetes mobile application to tackle these challenges and increase the number of daily user records. To evaluate the success of the implemented gamification techniques, a survey was conducted at the endocrinology service of S. João's Hospital, with twenty-three type I diabetic participants. Different types of user motivation resulted in contrasting opinions about particular gamification elements. Nonetheless, the implemented gamification was, in general, well accepted. The gamification elements with the lowest classification were streaks and social comparison of points; even so, these elements obtained an acceptance rate of 83%. There was a common preference for gamification elements that promote better glycaemic management. Classic gamification elements that promote points, levels, and objective completion were not disregarded by the surveyed patients, but were given a secondary role in usage motivation. Although promising, the long-term effectiveness of the implemented gamification could not be determined. Further trials are planned to verify the impact of gamification on long-term user adherence.

10:30
Parkinson’s Disease Diagnosis: Towards Grammar-based Explainable Artificial Intelligence

ABSTRACT. The basic technology that reinvents machines to personalize human experiences is Machine Learning (ML), a branch of Artificial Intelligence (AI) and a strong buzzword in today’s digital world. Despite its success, the most significant limitation of ML is the lack of transparency behind its behavior, which leaves users with a poor understanding of how it makes decisions, as is the case for deep learning models. If end users do not trust a model, they will not use it. This is especially true in medical diagnostic practice: physicians cannot simply use the predictions of a model, but must trust the results it provides. This work focuses on the automatic early detection of Parkinson's disease (PD), whose impact on both the individual's quality of life and social well-being is constantly increasing with the aging of the population. To this end, we propose an explainable approach based on Genetic Programming, called Grammatical Evolution (GE). This technique uses a context-free grammar to describe the language of the programs to be generated and evolved; in this case, the generated programs are explicit classification rules for the diagnosis of the subjects. The results of the experiments on the publicly available HandPD data set show GE's high expressive power and performance comparable to that of several ML models proposed in the literature.

10:45
A Capsule Network-based Approach for the Identification of Glaucoma in Retinal Images

ABSTRACT. Glaucoma is an eye disease that is the second most common cause of blindness in the world. The need to detect this disease in its early stages is evident, considering that late treatment can cause loss of vision. In this context, computational methods are being developed to assist specialists in analyzing ocular images, lending greater precision to the diagnosis. In this paper, we present a methodology for the automatic classification of glaucoma using the Capsule Network (CapsNet), a recent deep learning model that analyzes the hierarchical spatial relationships between characteristics to represent images, so that it requires fewer training samples than traditional CNNs to achieve efficient classification. Before running CapsNet, we applied a pre-processing step to the images in order to highlight their characteristics. Our results were promising: 90.90% accuracy, 86.88% recall, 94.64% precision, 90.59% F1-score, 0.904 AUC, and 0.801 kappa index. The main contribution of our method is that we achieved promising results without the need for data augmentation or segmentation of the optic disc region. Our study thus shows the potential of capsules to identify the relationships between characteristics, even with a reduced training set.

11:00
A Computational Model and Algorithm to Identify Social Isolation in Elderly Population

ABSTRACT. Research has shown that social isolation is a serious health risk which not only has undeniably negative impacts on individuals' well-being and quality of life, but is also harmful to healthy human development and a high risk for public health. In recent years, with the rapid growth of geriatric populations around the world, the problem has gained growing attention as a public health priority, as seniors are among the most vulnerable to social isolation. In this work, we propose a novel algorithm and model to identify socially isolated individuals in a community using social network analysis and optimization techniques. We also define and model the concept of social isolation in computational form and introduce new quality functions to measure it in the community. The performance and efficiency of our proposed model are evaluated using various synthetic social graphs.
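The paper's quality functions are not reproduced here, but the core idea of scoring community members by how weakly connected they are can be sketched in a few lines; the 1/(1+degree) score and the threshold below are illustrative choices only, not the proposed model.

```python
from collections import defaultdict

def isolation_scores(edges):
    """Toy isolation score: 1 / (1 + degree). The paper's actual quality
    functions are richer; this only illustrates ranking community
    members by how weakly connected they are."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return {node: 1.0 / (1 + len(neigh)) for node, neigh in adj.items()}

def socially_isolated(edges, nodes, threshold=0.5):
    """Nodes scoring at or above the threshold are flagged as isolated;
    zero-degree nodes never appear in the edge list, so they default
    to the maximum score of 1.0."""
    scores = isolation_scores(edges)
    return sorted(n for n in nodes if scores.get(n, 1.0) >= threshold)

nodes = ["ann", "bob", "cat", "dan", "eve"]
edges = [("ann", "bob"), ("ann", "cat"), ("bob", "cat"), ("dan", "ann")]
print(socially_isolated(edges, nodes))  # → ['dan', 'eve']
```

A realistic quality function would also weigh interaction frequency and embeddedness in tightly knit subgroups, which is where the optimization techniques mentioned in the abstract come in.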

11:15
A cloud-based REST platform for real-time health resources availability registering, discovering and matching in pandemic crisis conditions
PRESENTER: Evangelos Boutas

ABSTRACT. Resource shortages during a time of crisis are a typical but also dangerous situation that health systems need to mitigate. When a health crisis reaches the global level, this shortage becomes more critical and spans all related resource categories and types (materials, equipment, personnel, etc.). The aim of this work is to present an online flexible platform, based on a combination of REST API and messaging technologies, that offers a mechanism for the swift registration of resource types across affected entities, government or private institutions, individual or corporate donors, and volunteers. A full range of functionalities is presented, including dynamic submission of resource availability or demand, matchmaking based on criteria, investigation of supply or demand status, asynchronous alerting, a general overview of the situation, and analytics capabilities. Thus, centralized government coordination as well as individual corporate or volunteer efforts can be redirected to optimize the distribution and usage of resources.
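As an illustration of criteria-based matchmaking, the sketch below greedily allocates offered resources to demands of the same type. The record fields and the greedy policy are assumptions made for the example, not the platform's actual REST schema or matching algorithm.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    donor: str
    resource: str   # e.g. "ventilator", "mask"
    quantity: int

def match(demands, offers):
    """Greedy matchmaking: satisfy each (hospital, resource, needed)
    demand from offers of the same resource type, earliest offer first.
    Returns (hospital, donor, resource, allocated) tuples."""
    allocations = []
    for hospital, resource, needed in demands:
        for offer in offers:
            if needed == 0:
                break
            if offer.resource == resource and offer.quantity > 0:
                take = min(offer.quantity, needed)
                offer.quantity -= take
                needed -= take
                allocations.append((hospital, offer.donor, resource, take))
    return allocations

offers = [Offer("acme", "mask", 500), Offer("vol1", "mask", 300)]
demands = [("hospA", "mask", 600), ("hospB", "mask", 250)]
print(match(demands, offers))
```

In the platform described above, such a matching step would sit behind a REST endpoint, with asynchronous alerts notifying donors and institutions when their submissions are matched.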

11:30
Improving Machine Learning Algorithm Processing Time in Tele-Rehabilitation Through a NoSQL Graph Database Approach: A Preliminary Study

ABSTRACT. Recent advancements in ICT have sped up the development of new services in healthcare. In this context, remote patient monitoring and rehabilitation activities can take place either in satellite hospital centers or directly in patients' homes. Specifically, using a combination of Cloud/Edge computing, Internet of Things (IoT), and Machine Learning (ML) technologies, patients with motor disabilities can be remotely assisted, avoiding stressful waiting times and overcoming geographical barriers. This is made possible by applying the Tele-Rehabilitation as a Service (TRaaS) concept. The objective of this paper is twofold: i) to study how Machine Learning can improve TRaaS, and ii) to demonstrate how a NoSQL graph database approach can enhance performance, since it works directly at the database layer instead of at the application layer. In particular, the K-NN algorithm is studied to identify the best therapy, i.e., rehabilitation training, for a new remote patient with motor impairment. Experiments compare two system prototypes, based respectively on Python and Neo4j, showing that the latter achieves better processing times while guaranteeing the same accuracy.
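The K-NN step can be sketched in plain Python; the feature names and therapy labels below are invented for illustration. In the Neo4j prototype the same neighbour search would be expressed at the database layer (e.g., as a Cypher query over patient nodes) rather than in application code, which is where the reported speedup comes from.

```python
from math import dist
from collections import Counter

def knn_recommend(patients, new_patient, k=3):
    """Minimal K-NN: recommend the therapy most common among the k
    patients whose feature vectors are closest (Euclidean distance)
    to the new patient's vector."""
    neighbours = sorted(patients, key=lambda p: dist(p["features"], new_patient))[:k]
    votes = Counter(p["therapy"] for p in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical motor-assessment features (range-of-motion, tremor, strength)
patients = [
    {"features": (0.9, 0.1, 0.8), "therapy": "arm-training-A"},
    {"features": (0.8, 0.2, 0.7), "therapy": "arm-training-A"},
    {"features": (0.2, 0.9, 0.3), "therapy": "gait-training-B"},
    {"features": (0.3, 0.8, 0.2), "therapy": "gait-training-B"},
]
print(knn_recommend(patients, (0.85, 0.15, 0.75)))  # → arm-training-A
```

Pushing this computation into the graph database avoids shipping every patient vector to the application for each query, which matters as the patient population grows.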

11:45
Covid-19: A Digital Transformation Approach to a Public Primary Healthcare Environment

ABSTRACT. Digital transformation in e-health is a challenging problem, examined by several studies along several dimensions. At the same time, new technologies can make a real difference in scenarios such as public primary healthcare environments. Yet, in the present worldwide fight against Covid-19, references to the use of these existing technologies are uncommon. In this paper, we present a proposal that can be understood as a digital transformation approach to a public primary healthcare environment. The proposed environment adopts smart bands worn by groups of different types of volunteers, from which digital data are gathered; a recommendation system is employed, and an environment simulator helps to illustrate predictable scenarios. Initial results indicate a differentiated approach to tackling large challenges, similar to those created by the current Covid-19 pandemic. In addition, our experiments show that the adoption of these computational technologies requires changes in current behaviour, from both governments and people, to succeed in protecting individuals inside public primary healthcare environments.

12:00
Dicoogle Framework for Medical Imaging Teaching and Research
PRESENTER: Rui Lebre

ABSTRACT. One of the most noticeable trends in healthcare over the last years is the continuous growth of the volume of data produced and its heterogeneity. In the medical imaging field, the evolution of digital systems is supported by the PACS concept and the DICOM standard. These technologies are deeply grounded in medical laboratories, supporting production and providing healthcare practitioners with the ability to set up collaborative work environments with researchers and academia to study and improve healthcare practice. However, the complexity of those systems and protocols makes it difficult and time-consuming to prototype new ideas or develop applied research, even for skilled users trained in those environments. Dicoogle emerges as a reference tool to achieve those objectives through a set of resources aggregated in the form of a learning pack. It is an open-source PACS archive that, on the one hand, provides a comprehensive view of the PACS and DICOM technologies and, on the other hand, provides the user with tools to easily extend its core functionalities. This paper describes the Dicoogle framework, with particular emphasis on its Learning Pack package, the resources available, and the impact of the platform on research and academia. It starts by presenting an overview of its architectural concept, the most recent research backed by Dicoogle, some remarks obtained from its use in teaching, and worldwide usage statistics of the software. Moreover, a comparison between the Dicoogle platform and the most popular open-source PACS on the market is presented.

14:00-17:30 Session 2A: 1st Workshop on Blockchain theoRy and ApplicatIoNs (BRAIN)

A blockchain protocol is employed to implement a tamper-proof distributed ledger, which stores transactions created by the nodes of a P2P network and agreed upon through a distributed consensus algorithm, avoiding the need for a central authority.

Blockchain technology has great potential to radically change our socio-economic systems by guaranteeing secure transactions between untrusted entities, reducing their cost, and simplifying many processes. The technology is being exploited in many different areas such as IoT, social networking, health care, and electronic voting. Even though several applications already leverage its disruptive potential, further research and development is required on several challenging aspects, such as scalability, efficiency, privacy, and support for complex queries, from both a theoretical and an applied point of view.

This workshop aims to provide a venue for researchers from both academia and industry to present and discuss important topics in blockchain technology. The workshop’s goal is to present results on both theoretical and more applied open challenges, as well as to showcase the current state of existing proposals.

14:00
The Bisq DAO: On the Privacy Cost of Participation

ABSTRACT. The Bisq DAO is a core component of Bisq, a decentralized cryptocurrency exchange. The purpose of the Bisq DAO is to decentralize the governance and finance functions of the exchange. However, by interacting with the Bisq DAO, participants necessarily publish data to the Bitcoin blockchain and broadcast additional data to the Bisq peer-to-peer network. We examine the privacy cost to participants in sharing this data. Specifically, we use a novel address clustering heuristic to construct the one-to-many mappings from participants to addresses on the Bitcoin blockchain and augment the address clusters with data stored within the Bisq peer-to-peer network. We show that this technique aggregates activity performed by each participant: trading, voting, transfers, etc. We identify instances where participants are operating under multiple aliases, some of which are real-world names. We identify the dominant transactors and their role in a two-sided market. We conclude with suggestions to better protect the privacy of participants in the future.
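A minimal form of address clustering is the common-input-ownership heuristic: all input addresses of one transaction are assumed to be controlled by the same participant. The union-find sketch below shows only that classic base heuristic; the novel heuristic described above refines it and augments the clusters with Bisq peer-to-peer data.

```python
def cluster_addresses(transactions):
    """Common-input-ownership clustering via union-find. Each
    transaction is given as its list of input addresses; addresses
    that ever co-spend end up in the same cluster."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for inputs in transactions:
        for addr in inputs:
            find(addr)                     # register every address
        for addr in inputs[1:]:
            union(inputs[0], addr)         # co-spent inputs share an owner

    clusters = {}
    for addr in parent:
        clusters.setdefault(find(addr), set()).add(addr)
    return list(clusters.values())

txs = [["addr1", "addr2"], ["addr2", "addr3"], ["addr4"]]
print(cluster_addresses(txs))  # addr1-addr3 merge; addr4 stays alone
```

Note that this heuristic is known to over-merge in the presence of CoinJoin-style transactions, which is one reason refined variants are needed for a system like Bisq.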

14:15
Performance Overhead of Atomic Crosschain Transactions

ABSTRACT. Atomic Crosschain Transaction technology allows composable programming across permissioned Ethereum blockchains. It allows inter-contract and inter-blockchain function calls that are both synchronous and atomic: if one part fails, the whole call graph of function calls is rolled back. This paper analyses the processing overhead of this technique compared to using multiple standard non-atomic single-blockchain transactions. The additional processing is analysed for three scenarios involving multiple blockchains: the Hotel-Train problem, Supply Chain with Provenance, and an Oracle. For the Hotel-Train scenario, the technology is shown to reduce the performance of Hyperledger Besu from 375 tps to 39.5 tps if all transactions are instigated on one node, or to approximately 65.2 tps if the transactions are instigated on a variety of nodes.

14:30
Binding of Endpoints to Identifiers by On-Chain Proofs

ABSTRACT. In many applications, identity management (IdM) is used to associate a subject's public key with an endpoint at which the subject can be contacted (telephone number, email, etc.). In decentralized applications based on distributed ledger technologies (DLTs), it is desirable for the IdM to be decentralized as well. Currently, endpoints are either verified by whoever needs them, which is impractical in DLT-based applications, or by a centralized authority, which contrasts with the spirit of DLTs.

In this paper, we show two DLT-based protocols to prove the association between a subject and an endpoint in a decentralized manner, contributing to filling the gap in current IdM approaches with respect to decentralization. Our protocols are compatible with a wide variety of endpoints. We analyze the security of our protocols and evaluate their performance and cost against common approaches.

14:45
Security and Performance Evaluation of Master Node Protocol in the Bitcoin Peer-to-Peer Network

ABSTRACT. This paper proposes a proximity-aware extension to the current Bitcoin protocol, named Master Node Based Clustering (MNBC). The ultimate purpose of the proposed protocol is to evaluate the security and performance of grouping nodes based on physical proximity. In the MNBC protocol, physical Internet connectivity increases and the number of hops between nodes decreases, as nodes are assigned responsibility for propagation based on physical Internet proximity.

15:00
A Decentralized System for Fair Token Distribution and Seamless Users Onboarding

ABSTRACT. Tokens are digital, transferable, and programmable assets, and one of the most promising tools offered by blockchains. They could enable a wide range of applications, from down-to-earth to futuristic. One of the main issues in achieving wide adoption of tokens is onboarding: the main platforms require users to deal with specific tools such as wallets, transaction fees, and key generation and storage. The most common solution for offering a familiar experience to novice users is custodial intermediaries, which have the important drawback of centralizing the process and keeping users away from the intrinsic advantages of blockchains. In this paper we present a process for the distribution of digital tokens to end users that exploits “physical” objects for initial distribution. The process aims to make onboarding maximally easy, in a secure and decentralized way, for users not yet accustomed to blockchain tools and concepts.

15:15
RenewLedger: Renewable energy management powered by Hyperledger Fabric

ABSTRACT. Trading and storage of renewable energy offer prosumers a way to extract value from the surplus energy they produce, while also mitigating energy shortfalls. Power companies can enlist prosumers in demand-response strategies for grid stability and cost savings. We present RenewLedger, a blockchain-based framework for renewable energy trading, storage management, and direct-to-consumer demand-response incentivization and gamification for peak shaving. We design and implement this system using Hyperledger Fabric and report on performance benchmarking experiments conducted using Hyperledger Caliper.

15:30
Tokenization and Blockchain Tokens Classification: a morphological framework
PRESENTER: Pierluigi Freni

ABSTRACT. The work presented here starts from the acknowledgement that, even though blockchain technology has been around for more than ten years, knowledge about its economic and business implications remains fragmented and heterogeneous. First, the shift from economics to tokenomics and the central role of the token within blockchain-based ecosystems are analysed. Subsequently, a generalized definition of the token is proposed. Delving into the requirements for a comprehensive description of tokens that takes their wide variety into account, a comparative assessment of the token classification frameworks available in the literature is performed. This analysis is leveraged to propose a new and comprehensive token classification framework based on a morphological analysis representation. The proposed framework will be further refined with an empirical and iterative approach in future work.

15:45
Decentralized Social Media Applications as a Service: a Car-Sharing Perspective

ABSTRACT. Social media applications are essential for next generation connectivity. Today, social media are centralized platforms with a single proprietary organization controlling the network and posing critical trust and governance issues over the created and propagated content. The ARTICONF project funded by the European Union's Horizon 2020 program researches a decentralized social media platform based on a novel set of trustworthy, resilient and globally sustainable tools to fulfil the privacy, robustness and autonomy-related promises that proprietary social media platforms have failed to deliver so far. This paper presents the ARTICONF approach to a car-sharing use case application, as a new collaborative peer-to-peer model providing an alternative solution to private car ownership. We describe a prototype implementation of the car-sharing social media application and illustrate through real snapshots how the different ARTICONF tools support it in a simulated scenario.

14:00-17:30 Session 2B: 10th Workshop on Management of Cloud and Smart City System (MOCS)

The last years have witnessed a permanent change in the vision of cloud systems. Nowadays, the most important stakeholders, such as private companies, public agencies, research communities, and citizens, rely on the cloud for a number of purposes, ranging from sharing hardware infrastructures to sharing software, data, and sensing services.

The services designed for complex scenarios like Multi-Access Edge Computing (MEC), smart cities, and the upcoming Industry 4.0 pave the way for a new era of the cloud. The complexity of human dynamics in a city can be better analyzed by decentralizing the infrastructure, integrating and opening the data, and sharing the services. The MEC paradigm, standardized by ETSI, is a key enabling technology for upcoming 5G networks, whereby applications and network functions are hosted in edge cloud data centers. By being closer to the end user, MEC systems not only better support low-latency applications but are also a candidate architecture for such a decentralized, context-aware infrastructure. Despite this rapid (re-)evolution of cloud systems, it remains unclear whether current solutions are able to support these emergent application scenarios. Through sensing-as-a-service processes, crowd-sensed data is made available to the cloud stakeholders. Some of them, like citizens, become data contributors, customers, and service consumers at the same time. This data exchange calls for secure and reliable data-trading transactions between the counterparts. Blockchain technology provides the stakeholders with a transparent, unalterable, ordered ledger by enabling a decentralized and secure environment. However, solutions for its integration into the cloud for smart-city services are yet to appear.

14:00
Edge Virtualization in Multihomed Vehicular Networks

ABSTRACT. Vehicular Networks (VANETs) are a critical component of a Smart City environment. They extend the connectivity plane with support for a wide range of applications, from safety to entertainment. Such services, when deployed outside the vehicular network, may incur additional delay, which can be critical. In addition, these services become inaccessible whenever the vehicles lose contact with the infrastructure.

This paper proposes a practical solution that aims to minimize the impact of the services' location and of their inaccessibility in a VANET. The solution uses Network Function Virtualization technologies to deploy services at the edge of a mobility-enabled multihomed VANET, thus allowing the services to remain accessible in intermittent connectivity situations and enabling lower delays for critical services. The results obtained show that the solution is capable of deploying services at the edge of the VANET with low delay and with fast recovery in handover and mobility scenarios.

14:15
The Service Node Placement Problem in Software-Defined Fog Networks

ABSTRACT. Nowadays, cloud computing has become a key paradigm in distributed applications, thanks to low-power Internet-connected devices becoming commonplace. However, stringent Quality of Service (QoS) requirements are difficult to achieve with a pure cloud computing paradigm, due to the physical distance between end devices and cloud servers. This motivated the appearance of fog computing, a paradigm that adds computation and storage resources, named fog nodes, closer to the end devices in order to reduce response time and latency. However, the placement of fog nodes, as well as the relative placement of the end devices each fog node serves, can affect the QoS obtained. This can be crucial for services with stringent QoS requirements. In this work, we analyze the effects that different placements of fog nodes have on QoS and present the problem of placing fog nodes to obtain optimal QoS, with a focus on the Industrial Internet of Things (IIoT) domain because of its strict QoS requirements. We conclude that an optimized placement of the fog nodes can minimize latency to support the QoS requirements of IIoT applications.

14:30
A proximity-based indoor navigation system tackling the COVID-19 social distancing measures

ABSTRACT. The emergency we are experiencing due to the coronavirus infection is changing the role of technology in our daily life. In particular, the movement of people needs to be monitored or guided to avoid gatherings, especially in small environments. In this paper, we present an efficient and cost-effective indoor navigation system for guiding people inside large smart buildings. Our solution takes advantage of an emerging short-range wireless communication technology, IoT-based Bluetooth Low Energy (BLE), and exploits BLE Beacons across the environment to give mobile users equipped with a smartphone hints on how to reach their destination. The main scientific contribution of our work is a new proximity-based navigation system that identifies the user's position from information sent by Beacons, computes the best indoor path at the edge computing infrastructure, and delivers it to the user through the smartphone. We provide experimental results testing the communication system in terms of both the Received Signal Strength Indicator (RSSI) and the Mean Opinion Score (MOS).
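Proximity from BLE beacons is typically estimated with a log-distance path-loss model over the RSSI. The calibration values below (RSSI at 1 m and the path-loss exponent) are illustrative defaults only, not the parameters of the deployed system.

```python
def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: estimated metres from a beacon.
    tx_power is the calibrated RSSI at 1 m; n is the path-loss
    exponent (~2 in free space, higher indoors). Both are
    illustrative placeholder values."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def nearest_beacon(readings):
    """Proximity positioning: place the user at the beacon with the
    smallest estimated distance."""
    return min(readings, key=lambda r: rssi_to_distance(r[1]))[0]

readings = [("hall-A", -72), ("room-12", -61), ("stairs", -80)]
print(nearest_beacon(readings), round(rssi_to_distance(-61), 2))  # → room-12 1.26
```

Because indoor RSSI fluctuates heavily, a practical system would smooth readings (e.g., a moving average or Kalman filter) before feeding them to the proximity decision, which is consistent with the paper evaluating RSSI behaviour experimentally.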

14:45
Real-Time Automatic Air Pollution Services from IoT Data Network

ABSTRACT. In recent years, attention to air quality and derived services has been increasing. In most cases, the main objective is to provide services independently of the number of sensors and of the users' positions. Since sensors are deployed at a limited number of positions, a dense grid is computed to satisfy the needs of services such as conditional routing, alerting on data values for personal usage, and heatmap production for control-room dashboards and for web and mobile applications for city users. The paper formalizes the development process and describes how Data Analytics can be automatically integrated into real-time data flow processes, called in the paper IoT Applications. Two interpolation methods have been compared and validated in order to assess their accuracy, and interpolation error trends have been used to detect sensor malfunctions. The specific case presented in this paper refers to the data and the Snap4City solution for Helsinki. Snap4City (https://www.snap4city.org) has been developed as a part of the Select4Cities PCP of the European Commission, and it is presently used in a number of cities and areas in Europe.
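Inverse Distance Weighting (IDW) is one standard way to compute such a dense grid from sparse sensor readings. The abstract does not name the two interpolation methods compared, so the IDW sketch below is illustrative only.

```python
def idw(known, x, y, power=2.0):
    """Inverse Distance Weighting: interpolate a pollutant value at
    (x, y) from sensor readings given as (xi, yi, value) triples.
    Weights decay with distance^power."""
    num = den = 0.0
    for xi, yi, value in known:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return value                  # query point is exactly on a sensor
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# PM2.5 sensors at three sites; interpolate the grid point (1, 1)
sensors = [(0, 0, 12.0), (2, 0, 18.0), (0, 2, 15.0)]
print(round(idw(sensors, 1, 1), 2))  # → 15.0
```

Evaluating such an interpolator with leave-one-out residuals at each sensor is also what makes the error-trend-based malfunction detection mentioned above possible: a sensor whose readings persistently diverge from its interpolated value is a malfunction candidate.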

15:00
A Continuous Data Imputation Mechanism based on Streams Correlation

ABSTRACT. The increased adoption of the Internet of Things (IoT) for the delivery of intelligent applications over huge volumes of data opens new opportunities to draw conclusions from data and support efficient decision making. For this reason, many applications have been developed for data collection and processing, a large part of them aligned with the requirements of the vast IoT infrastructure. However, one of the biggest problems in real-time applications is that they are prone to missing values. Missing values can negatively affect the outcomes of any processing activity and thus limit the performance of IoT applications. In this paper, we depart from the relevant literature and propose a data imputation model based on the correlation of data reported by different IoT devices. Our aim is to support data imputation using the `knowledge' of the team of IoT devices about the phenomena they report on. Our scheme adopts a continuous correlation-detection methodology applied to the real-time reports of the involved devices; hence, any missing value can be replaced by the aggregated outcome of data reported by correlated devices. We describe our approach and evaluate it through a large number of simulations under various experimental scenarios.

15:15
A Stackelberg Game Approach for Incentive V2V Caching in Software-Defined 5G-enabled VANET
PRESENTER: Ahmed Alioua

ABSTRACT. Software-defined networking (SDN) is considered one of the main enabling technologies of 5G and is expected to propel the penetration of vehicular networks. The rapid development of wireless technology has generated an avalanche of demand for bandwidth-intensive applications (e.g., video-on-demand, streaming video, etc.), causing an exponential increase in mobile data traffic. Moreover, edge caching enhances network resource utilization and reduces backhaul traffic, and many incentive mechanisms have been developed to encourage caching actors to enhance the caching process. In this paper, we propose an SDN-based incentive caching mechanism for a 5G-enabled vehicular network. In our caching strategy, a small base station (SBS) encourages mobile vehicles equipped with on-board caches to store and share its popular content using vehicle-to-vehicle (V2V) communication. The SBS aims to offload the cellular core links and reduce traffic congestion, while cache-enabled vehicles compete to earn more SBS reward. The interaction between the SBS and the cache-enabled vehicles is formulated as a Stackelberg game with a non-cooperative sub-game modelling the conflict between cache-enabled vehicles. The SBS acts first as leader by announcing the number of popular contents it wants cached; the cache-enabled vehicles then respond with the optimal number of contents they accept to cache and the corresponding caching price. Two optimization problems are investigated and the Stackelberg equilibrium is derived. The simulation results demonstrate the efficiency of our game-theoretic incentive V2V caching strategy.

15:30
Speed Based Distributed Congestion Control Scheme for Vehicular Networks

ABSTRACT. The Internet of Vehicles (IoV) and vehicular clouds use sensor data and information from vehicles to implement an intelligent transportation system for future smart cities. The tremendous amount of data generated in IoV can lead to channel congestion, packet loss and delay of time-sensitive messages. This can have a serious impact on the performance of applications and services provided by a vehicular network, particularly safety applications that are time critical. As such, network congestion control is an important topic in vehicular networks, and various methods of controlling the message transmission rate and power have been explored to date. In this paper, we propose a new distributed congestion control algorithm that adjusts the transmission power based on a density estimate derived from the vehicle's driving speed. The results indicate that the proposed approach is effective in reducing packet loss and improving the relevance of the received messages.
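The abstract does not give the exact speed-to-density mapping; one hedged sketch uses the classic Greenshields model, in which density falls linearly with speed, combined with a linear power back-off (all thresholds below are invented for illustration, not the paper's):

```python
def estimate_density(speed_kmh, jam_density=120.0, free_flow_speed=100.0):
    """Greenshields-style model: lower driving speed implies
    higher estimated local vehicle density (vehicles/km)."""
    speed = min(speed_kmh, free_flow_speed)
    return jam_density * (1.0 - speed / free_flow_speed)

def tx_power_dbm(speed_kmh, p_min=10.0, p_max=23.0, jam_density=120.0):
    """Back off transmission power linearly as estimated density rises,
    reducing channel load in congested stretches of road."""
    density = estimate_density(speed_kmh, jam_density=jam_density)
    return p_max - (p_max - p_min) * (density / jam_density)

print(tx_power_dbm(100.0))  # free flow: full power
print(tx_power_dbm(0.0))    # traffic jam: minimum power
```

Because each vehicle derives the estimate from its own speed, the scheme needs no message exchange to coordinate, which is what makes it distributed.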

15:45
Secure Identity Management Framework for Vehicular Ad-hoc Network using Blockchain

ABSTRACT. Vehicular Ad Hoc Network (VANET) is a mobile network formed by vehicles, roadside units, and other infrastructure that enables communication between the nodes to improve road safety and traffic control. While this technology promises great benefits to drivers, there are many security and privacy concerns that must be addressed before it can be fully adopted. It is essential to ensure that vehicles participating in the network are authenticated and held accountable in case of misbehaviour. On the other hand, there should be adequate mechanisms for preserving the privacy of vehicles and drivers, so they are protected against unauthorized tracking and release of private information. Many current VANET technologies also depend on a central trusted authority that becomes a single point of failure for the network. In this paper, we propose a new blockchain-based decentralized authentication approach for VANET. In this scheme, vehicles maintain conditional anonymity in the network and their real identities can only be revealed to authorized entities. Using blockchain technology, we create a distributed framework and maintain an immutable record of the data, strengthening the integrity of the system. We use Hyperledger Fabric, a permissioned blockchain platform, to implement our approach and compare its performance to the traditional PKI-based method for VANET authentication.

16:00
Smart Contract Designs on Blockchain Applications

ABSTRACT. The rapid pace of the world's urbanization has been improving citizens' quality of living. Combining new technologies of smart government, smart healthcare, smart transportation and other services under a smart city framework minimizes urbanization challenges. However, these services demand large-scale data technology to support the infrastructure of smart cities. Using blockchain technology as a framework to integrate multiple smart city technologies, such as the Internet of Things, big data platforms and smart transportation, is a major benefit, enhancing the automation, security and decentralization of smart city services. However, querying the blockchain to retrieve a transaction record is one of the major limitations of blockchain systems: the operation requires scanning the ledger for the result. In this paper, we utilize different smart contract designs to support indexing and querying the blockchain for ride-sharing data. We evaluate the complexity, measured by the gas consumption of mined transactions, for two smart contract designs, the Catalog and Sparse smart contracts. Our experiments evaluate retrieving data from the blockchain for the two designs.

16:15
On the Applicability of Secret Share Algorithms for Osmotic Computing

ABSTRACT. Osmotic Computing (OC) is an innovative computation paradigm that runs services on Cloud, Edge, and Internet of Things (IoT) resources based on the workload. Services are encapsulated in containers stored in a central repository on the Cloud. OC suffers from privacy and security issues; for example, hackers could attack the repository and download all images. A possible solution is to employ secret sharing techniques to split the images of services into chunks and distribute them among Edge devices. This work tests the applicability of these techniques to OC, employing the Redundant Residue Number System (RRNS) to split and store Micro-Elements (MELs). We analysed different Osmotic Computing scenarios composed of 10, 100 and 1000 nodes running 1000 MELs each, considering degrees of redundancy from 0 to 7. The experimental analyses show that the reliability of the system increases with increasing redundancy, while the security decreases.
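A toy illustration of RRNS-style splitting, with small illustrative moduli (a real deployment would use far larger, carefully chosen moduli): each chunk is the secret's residue modulo one modulus, and the Chinese Remainder Theorem recombines any subset whose moduli product exceeds the secret, which is where the redundancy/security trade-off the paper measures comes from.

```python
from math import prod

MODULI = [251, 253, 255, 256]  # pairwise coprime moduli (illustrative choice)

def split(secret):
    """Encode a secret as one residue chunk per modulus."""
    return [secret % m for m in MODULI]

def reconstruct(residues, moduli):
    """Chinese Remainder Theorem reconstruction from any chunk subset
    whose moduli product exceeds the secret."""
    M = prod(moduli)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, moduli)) % M

secret = 1234567
chunks = split(secret)
print(reconstruct(chunks, MODULI))
# redundancy: even with the last chunk lost, the secret is recoverable
print(reconstruct(chunks[:3], MODULI[:3]))
```

Extra moduli beyond the minimum add redundancy (lost chunks are tolerated) but also mean an attacker needs fewer of the remaining chunks, matching the reliability-versus-security trend reported above.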

16:30
HS-AUTOFIT: a highly scalable AUTOFIT application for Cloud and HPC environments

ABSTRACT. Technological progress is leading to an increase of instrument sensitivity in the field of rotational spectroscopy. A direct consequence of such progress is the increasing amount of data produced by instruments, for which the currently available analysis software is becoming limited and inadequate. In order to improve data analysis performance, parallel computing techniques and distributed computing technologies like Cloud or High Performance Computing (HPC) can be exploited. Despite the availability of computing resources, neither Cloud computing nor HPC has been fully investigated for identifying unknown target spectra in a rotational spectrum. This paper proposes the design and implementation of Highly Scalable AUTOFIT (HS-AUTOFIT), an enhanced version of a fitting tool for broadband rotational spectra that is capable of exploiting the resources offered by multiple computing nodes. With respect to the old version of the program, the new one scales across multiple computing nodes, guaranteeing higher accuracy of the fit function and a consistent reduction in execution time. The results of tests conducted in real Cloud and HPC environments demonstrate that HS-AUTOFIT is a viable solution for the analysis of huge amounts of data in the addressed scientific field.

14:00-17:30 Session 2C: 5th edition of the IEEE workshop on ICT Solutions for eHealth (ICTS4eHealth) : Session II

e-Health is one of the major research topics that have been attracting cross-disciplinary research groups. The deployment of new emerging ICT technologies for Health, especially based on Cloud computing, Internet of Things (IoT), and Computational Intelligence, is attracting the interest of many researchers. ICTS4eHealth 2020 is the 5th edition of the International IEEE Workshop dedicated to ICT solutions for e-Health, especially based on Cloud computing, Internet of Things (IoT), and Computational Intelligence. The workshop will bring together researchers from academia, industry, government, and medical centers in order to present the state of the art in the emerging area of the use of cloud systems in connected health infrastructure and applications, and the use of IoT and Computational Intelligence techniques in the area of eHealth.

14:00
Sensitive Data Exchange Protocol Suite for Healthcare
PRESENTER: Thibaud Ecarot

ABSTRACT. The Learning Healthcare System is an increasingly deployed approach in health to improve patient care. For the successful implementation of this approach, communications must become cross-cutting between research and primary care. To meet this need, standardized protocols for health data exchange, such as Fast Healthcare Interoperability Resources from the Health Level Seven organization, are massively used in healthcare organizations. However, these protocols do not meet new security needs, and they do not natively integrate anonymization mechanisms for data sources and patients while maintaining individuation. In this paper, a new protocol suite is proposed for sensitive health data exchange. First, an architecture is presented: it integrates proxies and anonymizers for the extraction and transmission phases of sensitive data. Then, requirements on several new protocols are detailed to meet the exchange needs between the learning health system entities. Finally, a comparison of security properties and a vulnerability analysis are carried out between the Fast Healthcare Interoperability Resources protocol and the proposed protocol suite. These analyses show that the protocol suite integrates most of the defenses against common protocol attacks and that the anonymization, confidentiality, authentication and logging requirements are met.

14:15
Evaluating the autonomy of children with autism spectrum disorder in washing hands: a deep-learning approach

ABSTRACT. Monitoring children with Autism Spectrum Disorder (ASD) during the execution of the Applied Behaviour Analysis (ABA) program is crucial to assess their progress while performing actions. Despite its importance, this monitoring procedure still relies on ABA operators' visual observation and manual annotation of the significant events. In this work, a deep learning (DL) based approach is proposed to evaluate the autonomy of children with ASD while performing the hand-washing task. The goal of the algorithm is the automatic detection of RGB frames in which the child with ASD washes his/her hands autonomously (no-aid frames) or is supported by the operator (aid frames). The proposed approach relies on a pre-trained VGG16 convolutional neural network (CNN) modified to fulfil the binary classification task. When tested, the fine-tuned VGG16 achieved a recall of 0.92 and 0.89 for the no-aid and aid classes, respectively. These results suggest the possibility of translating the presented methodology into actual monitoring practice, as a valuable tool to support ABA operators during therapy sessions.

14:30
A Predictive Machine Learning Model to Determine Alcohol Use Disorder

ABSTRACT. Prediction of alcohol use disorder (AUD) may reduce the number of deaths caused by alcohol-related diseases. However, prediction of AUD based on patients' historical clinical data is still an open research objective. This study proposes a method to predict AUD from electronic health record (EHR) data through supervised machine learning. The study combines EHR data with patient-reported data from 2,571 patients in the Region of Southern Denmark, labelled into two categories, AUD positive (457) and AUD negative (2,114). These unique datasets are used to validate the proposed method for predicting AUD with machine learning models based on historical clinical data from EHRs.

14:45
Reducing Sparse Motion Artifacts in MR-Thermometry Using Robust Principal Component Analysis
PRESENTER: Alaleh Alivar

ABSTRACT. Magnetic Resonance Imaging is one of the most prevalent, reliable, and comprehensive thermal monitoring methodologies for measuring tissue temperature changes during thermal therapies. The proton resonance frequency shift (PRFS) technique is a widely used MRI thermal imaging method. However, the PRFS method is sensitive to inter-frame motion, which may result in incorrect temperature change profiles. Considering each MRI temperature image as a superposition of a temporally correlated background and a sparse matrix representing motion artifacts, we aim to recover the low-rank matrix from corrupted observations using robust principal component analysis. This problem is solved using iterative soft thresholding of the singular values of both the low-rank and sparse matrices and singular value thresholding techniques. We apply this method to MRI observations with artificially introduced motion artifacts, and the results indicate that the proposed approach is effective in recovering clean temperature profiles from noisy observations during heating procedures, with an average 81% decrease in RMSE.
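The shrinkage (soft-thresholding) operator at the core of such iterative schemes can be sketched in a few lines; applying it to the singular values of an iterate, rather than to its entries, gives the singular value thresholding step the abstract mentions:

```python
def soft_threshold(x, tau):
    """Shrinkage operator S_tau(x) = sign(x) * max(|x| - tau, 0).
    Entries within [-tau, tau] collapse to zero, promoting sparsity."""
    if x > tau:
        return x - tau
    if x < -tau:
        return x + tau
    return 0.0

# applied elementwise to a matrix row, small artifacts vanish entirely;
# applied to singular values instead, the same operator promotes low rank
row = [3.5, -0.2, -1.7, 0.9]
print([soft_threshold(v, 1.0) for v in row])
```

In the RPCA iteration, shrinking singular values drives the background estimate toward low rank, while shrinking matrix entries drives the artifact estimate toward sparsity.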

15:00
Priority Based Traffic Pre-emption System for Medical Emergency Vehicles in Smart Cities

ABSTRACT. Over the years, traffic lights have been used to manage traffic at intersections of major roads. Though relatively effective, the vehicular queues that build up at each intersection managed by a traffic light, and the subsequent delays, can have adverse effects on medical emergency vehicles (EVs). Reports have shown that queues at traffic intersections can increase the travel times of medical EVs by an average of 20%, which in many instances could mean the difference between life and death. Prioritizing EVs is a potential solution to this challenge. In this paper, an Internet of Things based priority pre-emption model for EVs in smart cities is proposed. It leverages sensors to dynamically track the EV's location and speed, and adaptively adjusts the timing sequence of all traffic lights on the EV's path. This ensures that the EV experiences little or no delay to and from its destination. Experimental results show that the proposed model has the potential to reduce the travel delays experienced by medical EVs by up to 35%.

15:15
Improving the Recognition of Sign Language from Acquired Data by Wireless Body Area Network
PRESENTER: Aymen Shaafi

ABSTRACT. Accurate and fast recognition of sign language would greatly improve communication between deaf and hearing people using hand-held devices. We used the Myo armband as our wireless measurement device, a wearable equipped with an on-board 3D accelerometer, 3D gyroscope and 8-channel electromyogram acquisition system. The main objective of this paper is to provide a lightweight approach to American Sign Language recognition by reducing the dimensionality of the inputs using a novel method. Data from each sensor are aggregated into one dimension to reduce the time required for data processing, as well as the amount of memory required for storage. Afterwards, we extract features from the aggregated data and proceed to classification using a Support Vector Machine (SVM). We compare the performance of the SVM with and without our aggregation approach. Our experimental results show that the proposed approach improves the speed of model derivation (four times faster than existing methods) and reduces the size of the input data while maintaining the same accuracy.
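The abstract leaves the aggregation method unspecified; one plausible reading, shown purely as an illustration, is collapsing each multi-channel sensor to its Euclidean magnitude, turning the armband's 14 raw channels into 3 scalar features per time step:

```python
from math import sqrt

def magnitude(channels):
    """Collapse one multi-channel reading to its Euclidean norm."""
    return sqrt(sum(c * c for c in channels))

def aggregate(acc, gyro, emg):
    """Reduce 3 accelerometer axes, 3 gyroscope axes and 8 EMG channels
    to one scalar per sensor: 14 values in, 3 features out."""
    return magnitude(acc), magnitude(gyro), magnitude(emg)

# one hypothetical time step of Myo armband data
features = aggregate((3.0, 4.0, 0.0), (1.0, 2.0, 2.0), (1.0,) * 8)
print(features)
```

The reduced feature vectors are then what the SVM is trained on, which is where the reported speed and memory savings would come from.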

15:30
Predictive Models for Mitigating Covid-19 Outbreak

ABSTRACT. Starting in the Wuhan province of China at the end of 2019, coronavirus disease 2019 (Covid-19) is a pandemic that has hit many countries worldwide, including Iran, Italy, Spain, and more recently the USA, while affecting the African continent with lower caseloads. As of May 2020, South Africa has been the most affected country on the African continent, with Cape Town the epicentre of the pandemic in South Africa. It is widely recognised that pre-empting the pandemic rather than attempting to cure infected patients is crucial, especially on the African continent, given its poorer healthcare system compared to the more developed countries of the Western world, where the pandemic has caused far more casualties despite more advanced healthcare systems. This paper proposes two predictive analytic models that can be used in mitigating the pandemic by i) validating the proposed protective measures through simulation modelling and ii) pre-empting the evolution of the pandemic through data analytics. The simulation model builds on the classic SIR model to mimic the main protective measures suggested by the World Health Organisation (WHO) and implemented by affected countries. The data analytics model is built around a multiple linear regression machine learning model used to predict future confirmed cases based on the data collected so far. The two models were implemented using real Covid-19 data from the city of Cape Town. The results reveal the accuracy of the models and the relevance of combining simulation modelling and data analytics as tools in the fight against the pandemic.
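The classic SIR model underlying the simulation component can be sketched with a simple Euler integration; the contact-rate parameter beta is the natural knob for mimicking protective measures such as lockdowns (the parameter values below are illustrative, not the paper's fitted values):

```python
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """One Euler step of the classic SIR model (s, i, r are fractions)."""
    new_inf = beta * s * i * dt   # susceptible -> infected
    new_rec = gamma * i * dt      # infected -> recovered
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def peak_infected(days, beta, gamma, i0=1e-4):
    """Simulate the epidemic and return the peak infected fraction."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak

# a lockdown-style measure is mimicked by halving the contact rate beta
print(peak_infected(300, beta=0.30, gamma=0.10))
print(peak_infected(300, beta=0.15, gamma=0.10))
```

Lowering beta flattens the curve, which is exactly the kind of what-if validation of protective measures the first model performs.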

15:45
Exploit Multilingual Language Model at Scale for ICD-10 Clinical Text Classification

ABSTRACT. The automatic ICD-10 classification of medical documents remains an unresolved issue, despite its crucial importance. The need for machine learning approaches devoted to this task contrasts with the lack of annotated resources, especially for languages other than English. Recent Transformer-based multilingual neural language models at scale have provided an innovative approach to dealing with cross-lingual Natural Language Processing tasks. In this paper, we present a preliminary evaluation of the Cross-lingual Language Model (XLM) architecture, a recent multilingual Transformer-based model from the literature, tested on the cross-lingual ICD-10 multilabel classification of short medical notes. In detail, we analysed the performance obtained by fine-tuning the XLM model on English-language training data and using it to predict the ICD-10 codes of an Italian test set. The results show that the novel XLM multilingual neural language architecture is very promising and can be very useful for low-resource languages.

16:00
A Decision Support System for Therapy Prescription in a Hospital Centre

ABSTRACT. Several cases are reported every year in which the prescribed therapy proves incompatible with the patient's medical history, leading to a worsening of the clinical condition or to death. Some technologies and processes to prevent such errors already exist, but a concrete solution is not yet available in hospitals. This paper presents a Decision Support System (DSS) that can be easily integrated into a typical hospital health workflow and provides feedback on the possible prescription of drugs to a patient with specific diseases. The DSS is based on a Big Data analysis algorithm able to check drug-disease relationships and detect possible failures. We developed a prototype of the proposed solution, implementing the DSS and setting up the necessary Big Data management tools for its effective adoption. We performed evaluations to assess the efficacy and the response time of the DSS algorithm.
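A minimal sketch of the drug-disease compatibility check at the heart of such a DSS, with an entirely hypothetical contraindication table (a real system would query a curated, Big Data backed source, as the paper describes):

```python
# hypothetical contraindication table; drug and condition names are invented
CONTRAINDICATIONS = {
    "ibuprofen": {"peptic ulcer", "chronic kidney disease"},
    "metformin": {"severe renal impairment"},
}

def check_prescription(drug, patient_history):
    """Return the conditions in the patient's history that conflict with
    the prescribed drug; an empty set means no known conflict."""
    return CONTRAINDICATIONS.get(drug, set()) & set(patient_history)

history = ["hypertension", "peptic ulcer"]
print(check_prescription("ibuprofen", history))   # conflict found
print(check_prescription("metformin", history))   # no conflict
```

The DSS would raise this kind of feedback at prescription time, before the therapy reaches the patient.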

16:15
Contactless Walking Recognition based on mmWave RADAR

ABSTRACT. Analysis of a person's movement provides important information about his or her health status. This analysis can be performed with wearable devices or with non-contact technologies. The latter are of particular interest, since the subject is free to move and the analysis of the movement is realistic. Despite being designed for other purposes, automotive mmWave radars represent a powerful low-cost technology for detecting people's movements without contact, and they find interesting applications as a support for home monitoring of health conditions. This paper shows how to exploit commercial radars to distinguish with high precision a subject's way of walking and the position of his or her hands during the activity carried out. The application of Principal Component Analysis (PCA) for feature extraction from raw data is considered, together with supervised machine learning algorithms for the actual classification of the various activities carried out during the experiments.

16:30
Evaluation of data balancing techniques in 3D CNNs for the classification of pulmonary nodules in CT images

ABSTRACT. Lung cancer is the second most common cancer in Brazil, and the early detection of pulmonary nodules is essential for patient survival. In this work, we propose an algorithm based on a 3D Convolutional Neural Network (CNN) to classify pulmonary nodules as benign or malignant in computed tomography images. Three 3D CNN architectures are proposed, with different input sizes and numbers of convolutional layers. In addition, we investigated data augmentation techniques and modifications to the network training cost function to address the problem of imbalanced data. The best result was achieved with an input size of 32×32×32, 2 blocks of convolutional layers and 2 pooling layers. The modification of the cost function also achieved promising results, with an accuracy of 0.9188, kappa of 0.8019, sensitivity of 0.8481, specificity of 0.9479 and AUC of 0.8980 on the test set for malignant nodule detection.

16:45
High-Resolution Physiological Stress Prediction Models based on Ensemble Learning and Recurrent Neural Networks

ABSTRACT. High-resolution stress detection is an essential requirement for designing time- and event-based stress monitoring systems as a building block for mobile and e-health systems aimed at supporting personalised treatments, in both clinical and remote settings. However, most of the existing solutions focus on binary or few-class stress detection, thus providing limited feedback and reducing their utility and applicability in real-world scenarios. In this paper, we present an alternative approach that goes beyond the standard formulation of stress detection as a supervised classification problem, by using ensemble learners and recurrent neural networks (RNNs) as the most relevant models for solving time series regression tasks. We trained and tested models using WESAD, a public multimodal wearable dataset for stress and affect detection, and we defined and computed stress scores based on various validated questionnaires stored in the dataset. A Leave-One-Subject-Out (LOSO) cross-validation scheme has been applied to test the generalisation capabilities of each model in predicting individual stress scores. Results show that the Nonlinear AutoRegressive network with eXogenous inputs (NARX), Random Forest (RF), and Least-Squares Gradient Boosting (LSBoost) provide high-resolution personalised stress predictions for the majority of the analysed subjects. The proposed predictive models may be integrated into a Decision Support System (DSS) for online stress monitoring, with the main goal of designing personalised stress management and alleviation strategies related to the inferred stress severity.
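The Leave-One-Subject-Out scheme mentioned above is easy to make concrete: each subject's samples in turn form the test fold and all other subjects' samples form the training fold, so every model is evaluated only on people it has never seen (the subject identifiers below are illustrative, not WESAD's):

```python
def loso_splits(subject_ids):
    """Leave-One-Subject-Out cross-validation: yield, for each subject,
    the training and test sample indices with that subject held out."""
    for held_out in sorted(set(subject_ids)):
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test

# six samples from three hypothetical subjects
ids = ["S2", "S3", "S2", "S4", "S3", "S2"]
for subj, train, test in loso_splits(ids):
    print(subj, train, test)
```

Because no subject ever appears in both folds, the resulting scores estimate generalisation to new individuals, which is the relevant setting for personalised stress monitoring.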

17:00
Server-Based Secure Key Management for the IEEE 802.15.6 standard
PRESENTER: Benmansour Tariq

ABSTRACT. Wireless Body Area Networks (WBANs) connect a variety of sensor nodes that operate in close vicinity to, on, or inside a human body. Recently, IEEE Task Group 6 established the first international WBAN standard, called IEEE 802.15.6. Since some communications can carry sensitive information, the standard provides for strong security via a security association procedure that identifies WBAN nodes and the Body Network Coordinator (BNC) to each other. However, several security vulnerabilities have been noticed in this procedure, especially Key Compromise Impersonation (KCI) and impersonation attacks. In this paper, we design a secure key management and node authentication scheme, called Server-Based Secure Key Management for the IEEE 802.15.6 standard (SBSKM). The objective of the proposed scheme is to improve IEEE 802.15.6 security by encrypting all communications, starting from the beginning of the security association procedure. To that end, we extend the security architecture of the standard with a trusted server responsible for the creation, initialization, and distribution of encryption keys, as well as for guaranteeing the identity of the sensor nodes joining the network. Extensive simulations have been performed with the Castalia simulator based on OMNeT++, and the results show the robustness of our solution, which, on the one hand, overcomes the lack of authentication and confidentiality in the standard's security scheme and, on the other hand, does not reduce the standard's performance.