FRCCS 2021: FRENCH REGIONAL CONFERENCE ON COMPLEX SYSTEMS
PROGRAM FOR WEDNESDAY, MAY 26TH

09:00-09:45 Session Speaker S1: Guillaume Deffuant - Do interactions among unequal agents undermine average self-esteem?
09:00
Do interactions among unequal agents undermine average self-esteem?

ABSTRACT. We consider a recently published model in which agents hold opinions about each other and influence each other’s opinions during random pair interactions. When initial opinions are close to each other, the interactions tend to increase the opinions over time. On the contrary, when the initial opinions are very diverse, the opinions tend to decrease on average, especially if there is gossip. We derive an analytical model of the evolution of opinions averaged over noise and interactions, which shows the existence of a positive bias on self-opinions and a negative bias on opinions about others, and we relate the observed patterns to these biases.
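
As a rough illustration of this class of dynamics (a minimal sketch with hypothetical parameters MU and NOISE, not the authors' actual equations), the following simulates agents holding noisy opinions about one another through random pair interactions and compares the average self-opinion with the average opinion of others:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50          # number of agents
STEPS = 20000   # random pair interactions
MU = 0.3        # influence strength (hypothetical parameter)
NOISE = 0.1     # perception noise (hypothetical parameter)

# a[i, j]: opinion agent i holds about agent j (a[i, i] is the self-opinion)
a = rng.uniform(-1, 1, size=(N, N))

for _ in range(STEPS):
    i, j = rng.choice(N, size=2, replace=False)
    # i shifts its opinions toward what j expresses, perceived with noise
    perceived = a[j] + rng.normal(0, NOISE, size=N)
    a[i] += MU * (perceived - a[i])
    np.clip(a, -1, 1, out=a)

print("mean self-opinion      :", a.diagonal().mean())
print("mean opinion of others :", a[~np.eye(N, dtype=bool)].mean())
```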

09:45-10:00 Session Poster P1 A: Epidemics & Biological Networks
09:45
Characterizing the divergence between two different models for fitting and forecasting the COVID-19 pandemic
PRESENTER: Tian Gan

ABSTRACT. Since the novel coronavirus disease (COVID-19) was declared a global pandemic, researchers from different disciplines have attempted to describe and forecast the spread of COVID-19. Some recent studies try to predict the future trend of the pandemic with deep learning, e.g., long short-term memory (LSTM) networks, but most works focus on curve fitting and forecasting based on compartmental epidemic models. The susceptible-infected-removed (SIR) model and the susceptible-exposed-infected-removed (SEIR) model are the two most commonly used compartmental models. The question is to what extent the choice of epidemic model affects fitting and long-term forecast performance. In this work, we compare the fitting and prediction performance obtained when considering and when ignoring the exposed state, to characterize the divergence between these two models.
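
The comparison in question can be pictured with the two standard compartmental systems; the sketch below (illustrative parameter values only, not fitted to any real COVID-19 data) integrates both and measures how far their infected trajectories diverge:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values, not fitted to any real epidemic data
BETA, GAMMA, SIGMA, N = 0.3, 0.1, 0.2, 1e6

def sir(t, y):
    s, i, r = y
    return [-BETA * s * i / N, BETA * s * i / N - GAMMA * i, GAMMA * i]

def seir(t, y):
    s, e, i, r = y
    return [-BETA * s * i / N,
            BETA * s * i / N - SIGMA * e,
            SIGMA * e - GAMMA * i,
            GAMMA * i]

t = np.linspace(0, 300, 301)
sol_sir  = solve_ivp(sir,  (0, 300), [N - 1, 1, 0],    t_eval=t)
sol_seir = solve_ivp(seir, (0, 300), [N - 1, 0, 1, 0], t_eval=t)

# Divergence between the two model trajectories for the infected compartment
print("max |I_SIR - I_SEIR| =", np.max(np.abs(sol_sir.y[1] - sol_seir.y[2])))
```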

09:45
Gaussian Mixture Model and Linear-Discriminant-Analysis for clustering complex biological data
PRESENTER: Samuel Diop

ABSTRACT. Machine learning algorithms are routinely used in biology for standardized analysis of multidimensional datasets generated, for example, by next-generation sequencing (NGS) or high-resolution imaging. Despite this continuous progress, a problem remains for the treatment of medium-scale experimental datasets extracted manually by researchers, for which specific in-house analytic tools are needed to convey, formalize and visualize biological information in a manner that excludes possible interpretation biases. In experimental hematology, clonal differentiation assays are routinely performed for the characterization of diverse stem/progenitor cell subsets. In this setting, phenotypically defined cell subsets are cultured under conditions that permit expansion and differentiation along multiple lymphoid or myeloid lineages. In the lab, we have developed a new clonal assay that allows for improved discrimination between hematopoietic stem cells and lineage-biased multipotent, oligopotent, and unipotent lympho-myeloid progenitors. At the methodological level, stem/progenitor cells are isolated from human bone marrow or umbilical cord blood and cultured for two weeks before characterization by flow cytometry of their mature or semi-mature progeny. Dataset analyses rely on pre-defined quantitative (absolute cell numbers) and qualitative (population composition) metrics. In order to classify the resulting clones and establish a developmental hierarchy between their cells of origin, we have developed a mathematical approach that combines a Gaussian Mixture Model for clustering the clonal datasets with a Linear Discriminant Analysis dimension-reduction algorithm for data visualization.
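
A minimal sketch of such a two-step pipeline (with synthetic stand-in data in place of the clonal readouts, and scikit-learn defaults rather than the lab's tuned settings) could look like this:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Stand-in for clonal readouts: rows = clones, columns = per-lineage cell counts
X = np.vstack([rng.normal(loc, 1.0, size=(100, 5))
               for loc in (0.0, 3.0, 6.0)])

# 1) Cluster clones with a Gaussian mixture
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gmm.predict(X)

# 2) Project onto discriminant axes for a 2-D visualisation of the clusters
lda = LinearDiscriminantAnalysis(n_components=2)
X_2d = lda.fit_transform(X, labels)
print(X_2d.shape)  # (300, 2)
```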

10:00-11:15 Session Oral O1: Social Complexity
10:00
Evidence of a common pattern in social explosions
PRESENTER: Yerali Gandica

ABSTRACT. in pdf

10:15
Mapping pro-migrant discourse on Facebook: The Belgian migration activist scene and its (re)configuration from 2014 to 2018
PRESENTER: Alexandre Leroux

ABSTRACT. In this contribution, we investigate the online space of pro-migration mobilization in Belgium. First, we identify actors of the mobilization on Facebook, then leverage their discursive practices to uncover cognitive and structural patterns and map them onto a multi-dimensional space of mobilization.

In reaction to the refugee reception crisis of 2015, citizen movements sprouted up across Europe in solidarity with the struggles of migrants. These new movements worked alongside traditional actors such as trade unions, associations, and universities. However, in Belgium, citizen activist discourses and actions reproduced, and even reinforced, the institutional distinction between 'asylum seekers' and 'economic' migrants.

How do these new actors blend into the Belgian activist scene? And how have traditional actors dealt with divergent philosophies and mobilisation repertoires? The present contribution addresses the reconfiguration of the French-speaking Belgian activist scene by analysing a network of 116 migrant-support Facebook pages and their publications between 2014 and 2018 (38,000 posts, 2.4 million words).

Analysing such a large amount of data requires a multidisciplinary approach that efficiently combines quantitative and qualitative methods. For this purpose, our methodology relies on three levels of analysis of varying granularity: a network analysis, a computational linguistics model, and a discourse analysis. First, we construct a network of publications shared between pages to identify community structure and actors' prominence. Second, by means of a probabilistic topic model, we analyse the whole corpus and its thematic trends to extract discursive similarity between actors and its evolution over time. Third, we rely on discourse analysis to focus on nouns used to designate human groups (such as "movement") and their members (such as "refugees"). This approach emphasises the way actors express relationships and their network in terms of belonging, identity, and affiliation as well as self-representation. These three parallel methods contrast the relationships between the different actors as expressed in their Facebook publications with those that exist within the Facebook network itself (posts and shares).
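
For the second level of analysis, a bare-bones topic-model pipeline of this kind (a four-sentence stand-in corpus instead of the 38,000-post dataset, and scikit-learn's LDA rather than the authors' specific model) would yield per-page topic mixtures whose similarity can then be compared across actors:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Stand-in corpus: each "document" would be the concatenated posts of one page
docs = ["refugees welcome solidarity march",
        "asylum seekers reception centre volunteers",
        "border policy deportation protest",
        "economic migrants work permits rights"]

X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Per-page topic mixtures: compare these vectors to measure discursive similarity
print(lda.transform(X))
```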

10:30
Multi-body Interactions and Non-Linear Consensus Dynamics
PRESENTER: Renaud Lambiotte

ABSTRACT. We introduce and analyse a three-body consensus model (3CM) for non-linear consensus dynamics on hypergraphs. Our model incorporates reinforcing group effects, which can cause shifts in the average state of the system even if the underlying graph is complete (corresponding to a mean-field interaction), a phenomenon that may be interpreted as a type of peer pressure. We further demonstrate that for systems with two clustered groups, even a small asymmetry in our dynamics can lead to the opinion of one group becoming clearly dominant. We show that the nonlinearity in the model is the essential ingredient for such group dynamics to appear, and demonstrate how our system can otherwise be written as a linear, pairwise interaction system on a rescaled network.
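
A toy nonlinear three-body update in this spirit (random hyperedges and a hypothetical reinforcement weight exp(-LAM*|x_j - x_k|), not the paper's exact 3CM equations) can be simulated as follows:

```python
import numpy as np

rng = np.random.default_rng(2)

N, M, STEPS = 100, 300, 5000
EPS, LAM = 0.05, 4.0   # step size and nonlinearity strength (hypothetical)

x = rng.uniform(-1, 1, N)                                    # node states
triangles = np.array([rng.choice(N, size=3, replace=False)   # random 3-node
                      for _ in range(M)])                    # hyperedges

for _ in range(STEPS):
    i, j, k = triangles[rng.integers(M)]
    # agreement of the pair (j, k) reinforces its pull on node i
    weight = np.exp(-LAM * abs(x[j] - x[k]))
    x[i] += EPS * weight * ((x[j] + x[k]) / 2 - x[i])

print("mean state:", x.mean())
```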

10:45
Detection of antagonism and polarization on social media through community boundaries
PRESENTER: Alexis Guyot

ABSTRACT. Social network data are increasingly mined for value in domains such as marketing, politics or sociology. These data can be represented by graphs that model the interactions between individuals through directed and weighted links. The detection and study of communities in online social networks are important tasks for understanding the behaviour of users. However, for a detailed interpretation of a phenomenon, it is also necessary to study the interactions between communities and to detect and evaluate their polarization. We propose a method to evaluate the antagonism between communities and to identify their boundaries in weighted, directed networks. An open-access implementation is available. We experimentally validate our proposal by studying conspiracy theories within tweets related to COVID-19 vaccines.
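
As a simplified illustration of the general idea (standard Louvain communities on a toy retweet-like graph and a naive boundary-weight ratio, not the authors' antagonism measure or implementation), boundary edges between detected communities can be extracted like this:

```python
import networkx as nx

# Toy directed, weighted interaction graph (e.g. retweets between users)
G = nx.DiGraph()
G.add_weighted_edges_from([(0, 1, 5), (1, 2, 4), (2, 0, 3),   # group A
                           (3, 4, 5), (4, 5, 4), (5, 3, 3),   # group B
                           (0, 3, 1), (4, 1, 1)])             # cross-group links

# Detect communities on the undirected projection (requires networkx >= 2.8)
communities = nx.community.louvain_communities(G.to_undirected(), seed=0)
membership = {n: c for c, nodes in enumerate(communities) for n in nodes}

# Boundary edges and their total weight relative to the internal weight
boundary = [(u, v, d["weight"]) for u, v, d in G.edges(data=True)
            if membership[u] != membership[v]]
internal_w = sum(d["weight"] for u, v, d in G.edges(data=True)
                 if membership[u] == membership[v])
boundary_w = sum(w for *_, w in boundary)
print("boundary edges:", boundary)
print("boundary/internal weight ratio:", boundary_w / internal_w)
```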

11:00
In-depth study of contacts in non-confined crowds and the associated risks of viral spread

ABSTRACT. The broad patterns of contacts in a population may not be sufficient to inform detailed epidemiological models, in particular if these aim to be applied to a specific situation. Instead of such contact rates, we show that detailed field data about pedestrian interactions in given settings can be collected empirically and combined with spatially resolved models of viral transmission via respiratory droplets to infer the risks of new infections arising in each situation. Redesign strategies to mitigate viral spread can then be assessed.

11:30-12:15 Session Speaker S2: Marten Düring - In-between complexities: On the exploration of datafied historical sources
11:30
In-between complexities: On the exploration of datafied historical sources

ABSTRACT. Traditionally, historical research is associated with the contemplation of a small number of mostly written historical documents. The digital age has by now deeply transformed such practices through mass digitisation of historical sources and secondary literature, more or less available in online repositories. Today, historians somewhat paradoxically face overwhelming amounts of information whilst permanently being limited by the fragmentary nature of digitisation and of the surviving historical record itself. Nevertheless, computer scientists and historians have rightfully pointed to the immense value of such digitisation for research. Following digitisation, historical sources are increasingly also datafied, i.e. automatically enriched so as to open up new opportunities for their exploration and analysis, using for example named entity recognition and linking, topic modeling or text reuse detection for written documents, and object detection for images. But behind these visions for the analysis of the resulting data stand two distinct understandings of complexity: historians commonly use the term “historical complexity” to stress the multitude of relevant relationships which link past events and processes through time, observed through the cross-consultation of fragmentary and ambivalent, sometimes contradictory historical sources. Data, especially when automatically extracted from such sources, does not adequately represent such complexity in the historical sense, nor does it in most cases lend itself to answering relevant historical questions through straightforward quantitative analysis. In this sense it fails to meet the expectation of radically new insights driven by (big) data analysis and the ensuing methodological toolkits and concepts. In-between these complexities, however, we find the biggest value such data has for the majority of historians: as a means to shift between close and distant reading perspectives on historical sources so as to critically assess their epistemic value and guide exploratory research based on discovery and source criticism. This talk discusses three research projects (histograph, BLIZAAR and impresso) and their efforts to facilitate such an exploration of datafied historical sources.

12:15-13:30 Lunch Break
13:30-15:15 Session Oral O2: Economics & Finance
13:30
Time dependence of stock price studied by a statistical physics approach
PRESENTER: Hung T. Diep

ABSTRACT. In the present work, we study by Monte Carlo simulations the time evolution of the price in a commodity market by examining the effects of several parameters: the majority of the neighbors, the market atmosphere, the variation of the price, and a specific measure applied at a given time. Each agent is represented by a spin having a number of discrete states $q$ or continuous states, describing the tendency of the agent to buy or sell. The market atmosphere is represented by a parameter $T$ which plays the role of the temperature in physics: low $T$ corresponds to a calm market, high $T$ to a turbulent one. We show that there is a critical value of $T$, say $T_c$, where strong fluctuations between individual states lead to a disordered situation in which there is no majority: the numbers of sellers and buyers are equal, namely the market clears. We show in particular that a specific measure taken by the government or an economic organization during a short lapse of time to boost or to lower the market price can have a long-lasting effect. Mean-field theory is also used to study the time dependence of the stock price.
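
A stripped-down Metropolis simulation in this spirit (mean-field neighbourhood, a hypothetical quadratic alignment energy, and arbitrary values for the number of states and the market temperature T, not the authors' Hamiltonian) illustrates how a temperature-like parameter controls the balance between buyers and sellers:

```python
import numpy as np

rng = np.random.default_rng(3)

N, Q, T, SWEEPS = 400, 5, 1.0, 200   # agents, states, market "temperature", sweeps

state = rng.integers(Q, size=N)      # each agent's buy/sell tendency in {0..Q-1}

def local_field(i):
    """Tendency of the rest of the market (mean-field neighbourhood)."""
    return (state.sum() - state[i]) / (N - 1)

prices = []
for _ in range(SWEEPS):
    for _ in range(N):
        i = rng.integers(N)
        new = rng.integers(Q)
        # Toy energy that favours aligning with the majority tendency
        dE = (new - local_field(i)) ** 2 - (state[i] - local_field(i)) ** 2
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            state[i] = new
    prices.append(state.mean())      # proxy for the market price

print("final mean tendency (price proxy):", prices[-1])
```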

13:45
The anatomy of government bond yields synchronization in the Eurozone
PRESENTER: Mauro Napoletano

ABSTRACT. We investigate the synchronization of Eurozone government bond yields at different maturities. For this purpose, we combine principal component analysis with random matrix theory. We find that synchronization depends upon yield maturity. Short-term yields are not synchronized. Medium- and long-term yields, instead, were highly synchronized early after the introduction of the Euro. Synchronization then decreased significantly during the Great Recession and the European Debt Crisis, to partially recover after 2015. We show the existence of a duality between our empirical results and portfolio theory, and we point to divergence trades and flight-to-quality effects as a source of the self-sustained asynchronous yield dynamics. Our results envisage synchronization as a requirement for the smooth transmission of conventional monetary policy in the Eurozone.
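
The PCA/random-matrix combination can be sketched as follows (synthetic stand-in yield changes and the standard Marchenko-Pastur bound; the paper's actual filtering procedure may differ): eigenvalues of the correlation matrix above the random-matrix edge signal genuine synchronization.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in for daily changes of N yields over T days (real data would replace this)
T, N = 1000, 10
returns = rng.normal(size=(T, N))
returns[:, :6] += 0.5 * rng.normal(size=(T, 1))   # common factor -> synchronization

# Correlation matrix and its eigenvalues
C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur upper edge for a purely random correlation matrix
q = N / T
lambda_max = (1 + np.sqrt(q)) ** 2

informative = eigvals[eigvals > lambda_max]
print("eigenvalues above the random-matrix bound:", informative)
```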

14:00
Optimal investment and consumption strategies on Lévy financial markets with jumps under transaction costs
PRESENTER: Sergei Egorov

ABSTRACT. We investigate a portfolio optimization problem for financial markets described by exponential Lévy processes with jumps. For power utility functions we find the optimal strategy in explicit form. Moreover, using this strategy and the Leland approach, we develop asymptotically optimal investment and consumption methods for financial markets with proportional transaction costs.
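
For orientation, the classical frictionless pure-diffusion benchmark that such results generalize is the Merton problem, whose explicit power-utility solution reads as follows (this is the textbook reference point only, not the paper's Lévy-jump or transaction-cost formulas):

```latex
% Classical Merton benchmark (no jumps, no transaction costs), power utility
% U(x) = x^{\gamma}/\gamma with 0 < \gamma < 1; shown as the standard reference
% point only, NOT the result of the paper.
\[
  \theta^{*} \;=\; \frac{\mu - r}{(1-\gamma)\,\sigma^{2}},
  \qquad
  c^{*}_t \;=\; \kappa\, X_t ,
\]
% \theta^{*}: constant fraction of wealth held in the risky asset
% c^{*}_t: optimal consumption rate, X_t: wealth process
% \kappa: constant determined by \mu, r, \sigma, \gamma and the horizon
```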

14:15
Country Centrality in the International Trade Network and the COVID-19 Pandemic
PRESENTER: Giorgio Rizzini

ABSTRACT. International trade is based on a set of complex relationships between different countries that can be modelled as an extremely dense network of interconnected agents. On the one hand, this network might favour the economic growth of countries, but on the other, it can also favour the diffusion of diseases, like COVID-19. In this paper, we study whether, and to what extent, the topology of the trade network can explain the rate of COVID-19 diffusion and mortality across countries. We compute the countries’ centrality measures and we apply the community detection methodology recently proposed by Bartesaghi et al. (2020). Then, we evaluate the countries’ centrality in each community, using these measures as focal regressors in a linear regression framework. In doing so, we also compare the effect of different measures of centrality computed through different methodologies.
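
Schematically, the centrality-as-regressor idea looks like the following (a toy trade graph, eigenvector centrality as one possible measure, and made-up outcome values standing in for the COVID-19 diffusion rates; the paper uses the Bartesaghi et al. community-based measures instead):

```python
import networkx as nx
import numpy as np

# Toy weighted trade network (real data: bilateral trade flows between countries)
G = nx.Graph()
G.add_weighted_edges_from([("CHN", "USA", 5), ("CHN", "DEU", 3),
                           ("DEU", "ITA", 2), ("ITA", "FRA", 2),
                           ("FRA", "USA", 1), ("DEU", "FRA", 3)])

centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
countries = sorted(G.nodes)
x = np.array([centrality[c] for c in countries])

# Made-up outcome values (e.g. a diffusion rate); real values come from the data
y = np.array([0.9, 0.7, 0.4, 0.5, 0.8])

# Simple OLS: outcome regressed on centrality
slope, intercept = np.polyfit(x, y, 1)
print("estimated effect of centrality:", slope)
```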

14:30
The Rise of China in the Global Production Network: What can Autocatalytic Sets teach us ?
PRESENTER: Arnaud Persenda

ABSTRACT. We investigate the emergence of China as a dominant player in the international trade network by using the innovative concept of autocatalytic set (ACS) introduced by Jain and Krishna (2001). We start by building a World Input-Output Network (WION) from the second release of the World Input-Output Database (WIOD), which covers the period 2000-2014. We empirically identify ACSs in the WION and explore both their scaling properties and time patterns. Our analysis shows the evolution of Chinese industries from peripheral to core positions of autocatalytic structures in both local and global production systems.
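
In the Jain-Krishna framework, an ACS exists whenever the largest eigenvalue of the catalysis adjacency matrix is at least 1, and the dominant ACS can be read off the Perron eigenvector; a toy version of that identification step (on a hand-made four-industry matrix, not the WION) is:

```python
import numpy as np

# Toy catalysis matrix, Jain-Krishna convention: C[i, j] = 1 if there is a link
# from industry j to industry i (j supplies/catalyses i)
C = np.array([[0, 0, 1, 0],
              [1, 0, 0, 0],
              [0, 1, 0, 1],
              [0, 0, 0, 0]])   # industries 0-1-2 form a cycle; 3 only feeds in

eigvals, eigvecs = np.linalg.eig(C)
k = np.argmax(eigvals.real)
lambda1 = eigvals.real[k]
perron = np.abs(eigvecs[:, k].real)

# An ACS exists iff lambda1 >= 1; nodes with non-zero Perron weight form the
# dominant ACS (here the closed cycle {0, 1, 2}, not the peripheral node 3)
acs_nodes = np.where(perron > 1e-10)[0]
print("lambda_1 =", round(lambda1, 3), "-> dominant ACS:", acs_nodes)
```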

14:45
The Occupation Space: network structure and centrality in the French labor market
PRESENTER: Charlie Joyez

ABSTRACT. We develop an innovative approach to labor mobility by mapping an “Occupation Space” that reveals the relatedness of skill requirements across occupations. From this weighted and directed network, we compute node centrality as an index of occupation-specific potential for outward mobility. Fine-grained employer-employee data confirm that workers in central occupations earn a wage premium, and that men are over-represented in such occupations. Our approach also improves the understanding of employment dynamics, as we show that centrality reduces the probability and the length of unemployment. Finally, we describe the evolution of the community structures in the network, revealing the changes at stake in the French labor market.

15:00
Robustness of Global Medical Equipment Supply Chains
PRESENTER: Zachary Boyd

ABSTRACT. Several contemporary events, including the Suez Canal blockage, COVID-19, cyber-attacks, and terrorism, have highlighted the fragility of global supply chains. Despite its critical importance for global security and human wellbeing, study of global supply chain networks has long been hindered by the fact that companies are protective of the identity of their suppliers and customers. We have constructed the largest known empirical supply chain network, consisting of 44,927 firms connected by 115,118 edges, centered on medical device supply firms, whose importance was illustrated in the early COVID ventilator shortage. We probe the robustness of this network under random and targeted attacks at the firm, country, and industry scales. Additionally, we consider attacks involving international trade disruptions. Using custom reachability metrics, we show that there is only modest redundancy in the medical supply chain, such that failure of a small number of firms (either random or targeted) can hinder medical device production. We also show that all 10 tiers of supply chain linkage in our data are necessary to understand this phenomenon, which casts doubt on the validity of some previous studies that used much less data.
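
The attack-and-reachability logic can be mocked up as follows (a random 200-node supplier network, hub removal by out-degree, and a naive "can still reach a final assembler" metric; the paper's network and custom reachability metrics are far richer):

```python
import networkx as nx

# Toy directed supplier -> customer network (the real one has ~45k firms)
G = nx.gnr_graph(200, p=0.2, seed=0).reverse()   # tree-like supply chains

final_assemblers = [n for n in G.nodes if G.out_degree(n) == 0]

def reachable_share(graph):
    """Share of firms that can still route output to some final assembler."""
    ok = set()
    for f in final_assemblers:
        if f in graph:
            ok |= nx.ancestors(graph, f) | {f}
    return len(ok) / G.number_of_nodes()

# Targeted attack: remove firms by decreasing out-degree (most customers first)
H = G.copy()
targets = sorted(H.nodes, key=H.out_degree, reverse=True)[:10]
H.remove_nodes_from(targets)
print("reachable share after removing 10 hub firms:",
      round(reachable_share(H), 3))
```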

15:15-15:30 Session Poster P2 A: Infrastructure, Planning, and Environment
15:15
Predicting and Modulating On-street Parking in Cities
PRESENTER: Nilankur Dutta

ABSTRACT. Finding a parking slot is a serious issue in contemporary urban mobility. It is estimated that the average driver in the U.S., U.K. and Germany wastes 17, 44 and 41 hours a year respectively searching for parking, at an estimated annual cost of 72.7 billion dollars, 23.3 billion pounds and 40.4 billion euros in these countries. Despite the importance of the topic (30% of cars might be cruising for parking in many large cities) and the central role given to parking policies, surprisingly little is known about the basic laws governing the search time.

In this work, we present a set of analytical and computational approaches to investigate the role of the drivers' perception of the 'attractiveness' of parking spots, in determining the occupancy of on-street parking spots in busy downtown districts. Under this concept of attractiveness, we subsume the various factors governing the selection of a place to park, including its distance to the destination, cost, and intrinsic characteristics. We implement this idea in a stochastic agent-based model and simulate it numerically to investigate the cruising phenomenon in the central district of Lyon.

We also demonstrate the effect of modulating spot attractiveness on on-street parking spot occupancies and the time spent cruising for parking. In fact, the occupancy in such a model is exactly solvable, and we develop an analytic formula for it. We verify the accuracy of our results by comparing them with occupancies generated through in-silico experiments.
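
A toy agent-based version of attractiveness-weighted spot choice (made-up attractiveness values, departure rate and driver behaviour, nothing calibrated to the Lyon data) shows how attractiveness shapes occupancy and cruising time:

```python
import numpy as np

rng = np.random.default_rng(5)

N_SPOTS, N_CARS = 50, 200
attractiveness = rng.exponential(1.0, N_SPOTS)   # hypothetical per-spot appeal

occupied = np.zeros(N_SPOTS, dtype=bool)
cruise_times = []

for _ in range(N_CARS):
    t = 0
    while True:
        # Driver samples candidate spots proportionally to their attractiveness
        p = attractiveness / attractiveness.sum()
        spot = rng.choice(N_SPOTS, p=p)
        t += 1
        if not occupied[spot]:
            occupied[spot] = True
            break
        if occupied.all() or t > 1000:
            break                                 # give up: no free spot found
    cruise_times.append(t)
    # Random departures free up spots at a fixed rate
    occupied &= rng.random(N_SPOTS) > 0.2

print("mean occupancy:", occupied.mean())
print("mean attempts before parking:", np.mean(cruise_times))
```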

15:15
Application of Network Science in detection of hospital communities
PRESENTER: Belfin R

ABSTRACT. Healthcare is one of the most important aspects of a nation. Planning and executing healthcare for the people of a nation needs a lot of brainstorming. Developing nations iteratively improve their healthcare systems by studying their people and the environment around them. In this work, we study Indian healthcare and a way to plan and improve it. The approach can be used for other developing nations too, with minor changes reflecting their environment and people. India has the world's second-largest, rapidly growing population. Population explosion is one problem that hinders the planning and execution of good schemes for its citizens. Health care in India faces many challenges, such as inequality, shortage of resources, lack of research, and low health outlay. This article focuses on the health care problems in the Indian context and proposes an alternative health care planning strategy using network science. This work clusters small and nearby hospitals in rural places to form virtually connected hospitals that can support each other and share patients' health records. These connected hospitals can be of different specialities to treat the rural community with the help of Non-Governmental Organisations, Corporate Social Responsibility funds, and Government-announced health care schemes. Indian hospital data from the 'Indian government data repository' has been used for the experiment. The virtual multispecialty hospital uses the Google Maps API to plan healthcare for unreached rural hamlets. The work also suggests creating a government regulatory body to connect the Non-Governmental Organisations, Corporate Social Responsibility funds, and virtual multispecialty hospitals. This model is apt for a pandemic scenario like COVID-19 because virtual multispecialty hospitals can split and keep track of all citizen records under small hospital clusters with the support of the Government, the NGOs, and the CSRs.

15:30-16:15 Session Speaker S3: Alain Barrat - Social contagion and norm emergence on simplicial complexes and hypergraphs
15:30
Social contagion and norm emergence on simplicial complexes and hypergraphs

ABSTRACT. Complex networks have been successfully used to describe the spread of diseases in populations of interacting individuals. However, pairwise interactions are often not enough to characterize social contagion processes such as opinion formation or the adoption of novelties, where complex mechanisms of influence and reinforcement are at work. In this talk, I will present two models of social contagion and emergence of norms in a population that take into account group interactions by considering higher-order structures such as hypergraphs. Generalizing the interaction and contagion mechanisms to these structures, I will highlight the emergence of novel phenomena with respect to pairwise contagion processes. For social contagion on simplicial complexes, I will show how higher-order effects can make the phase transition between the "healthy" and "endemic" states become discontinuous, with the appearance of a bistable region. In this region, a critical mass phenomenon appears, where the final state depends on the initial condition. We will then explore the issue of critical masses from a different angle, namely the minimal size needed for a committed minority to overturn social conventions. While many studies have proposed that such a critical size could be as high as 10-40%, several observations suggest that much smaller minorities may be sufficient to bring a system to its tipping point. By introducing resistance to social influence within a standard model for social conventions, we provide theoretical support for these observations: the critical mass necessary to trigger behaviour change is dramatically reduced (in some cases down to 0.3%) if individuals are less prone to change their views. Moreover, we generalize the model to group interactions and unveil their complex role: the ability of the committed minority to overturn existing norms is non-monotonic in the group size. These findings explain the observation of rapid changes in social convention triggered by very small committed minorities. They shed new light on the phenomenon of norm change and highlight the importance of the emerging field of higher-order networks, beyond pairwise interactions.
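
In the spirit of simplicial contagion models (toy rates, an Erdős-Rényi substrate and synchronous updates, rather than the speaker's exact models), a minimal higher-order SIS sketch adds a triangle channel in which two infected corners can jointly infect the third:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
triangles = [c for c in nx.enumerate_all_cliques(G) if len(c) == 3]

BETA, BETA_D, MU, STEPS = 0.02, 0.2, 0.05, 400
infected = rng.random(len(G)) < 0.1              # 10% initial seeds

for _ in range(STEPS):
    new = infected.copy()
    # Pairwise (link) contagion
    for u, v in G.edges:
        if infected[u] != infected[v] and rng.random() < BETA:
            new[u] = new[v] = True
    # Higher-order (triangle) contagion: two infected corners push the third
    for a, b, c in triangles:
        if infected[[a, b, c]].sum() == 2 and rng.random() < BETA_D:
            new[[a, b, c]] = True
    # Recovery
    new[infected & (rng.random(len(G)) < MU)] = False
    infected = new

print("final infected fraction:", infected.mean())
```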

16:30-17:45 Session Oral O3: Structure & Dynamics
16:30
Modularity affects the robustness of scale-free model networks under betweenness and degree-based node attack
PRESENTER: Quang Nguyen

ABSTRACT. We build Barabasi-Albert model networks with different modularity, parameterized by a rewiring probability, and investigate the efficacy of node attack (removal) strategies based on node degree (ID) and node betweenness (IB). We find that when the model networks present absent or low modular structure, ID is more effective than IB at decreasing the largest connected component (LCC). Conversely, when the model network presents higher modularity, the IB strategy clearly becomes the most effective at fragmenting the LCC. The percolation transitions of different order (first and second) are also found to depend on the network modularity.
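
The core comparison can be reproduced in a few lines (here on a plain Barabasi-Albert graph without the rewiring step that tunes modularity, and with initial-degree and initial-betweenness rankings only, no recalculation after removals):

```python
import networkx as nx

def lcc_after_attack(G, ranking, n_remove=50):
    """Size of the largest connected component after removing n_remove nodes."""
    H = G.copy()
    H.remove_nodes_from(ranking[:n_remove])
    return max(len(c) for c in nx.connected_components(H))

G = nx.barabasi_albert_graph(500, 3, seed=0)

# Initial-degree (ID) and initial-betweenness (IB) attack orders
id_rank = sorted(G, key=G.degree, reverse=True)
bc = nx.betweenness_centrality(G)
ib_rank = sorted(G, key=bc.get, reverse=True)

print("LCC after ID attack:", lcc_after_attack(G, id_rank))
print("LCC after IB attack:", lcc_after_attack(G, ib_rank))
```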

16:45
Unraveling the role of node metadata in network robustness: the feature-based percolation model
PRESENTER: Oriol Artime

ABSTRACT. Percolation is an emblematic model used to understand the robustness of interconnected systems, in which the nodes of a network are removed and different topological properties of the resulting structure are analyzed. Despite being a model broadly studied in statistical physics and mathematics -- with insightful applications ranging from biological and neural systems to large-scale communication and transportation networks -- from a theoretical perspective this process is usually investigated in relatively simple scenarios, such as the removal of the system's units in random order -- simulating unpredictable site failures -- or in an order set by specific topological descriptors -- simulating targeted attacks -- the simplest descriptor being the number of node connections. However, in the vast majority of empirical applications, the network must be dismantled following more sophisticated protocols than the aforementioned ones, for instance based on more convoluted topological properties or even on non-topological node metadata obtained from the application domain.

In this work we propose a novel mathematical framework to fill this gap. The nodes of a network are assigned, in addition to their degree, a set of features, whose nature varies from problem to problem, such as the age of the agents in social networks, the physical distance between the power grid stations and their closest city or the probability of being infected in the steady state of an epidemic dynamical model running on top of the network, to name but a few examples. Therefore we have a multidimensional, correlated joint degree-feature distribution P(k,F) for which we derive several percolation quantities, such as the critical point, critical exponents and the size of the giant connected component. We focus, theoretically and numerically, on scenarios where nodes are removed according to their importance in the feature space. We are able to provide an excellent match between the analytical results and the simulations when the network is intervened following feature-based protocols.

Several examples are given to show the broad applicability of the theory. In particular, we apply our framework to degree-feature relations of different nature. We start from ad hoc degree-feature distributions that capture the main characteristics of correlations observed in empirical systems, such as power-law positive and negative degree-feature correlations, moving to features that arise naturally in the process of network growth, and ending with the case in which features are coupled to dynamical processes running on top of the network, such as epidemics or biochemical dynamics, among others. Both synthetic and real-world networks of different origin are considered in the analysis. Moreover, we show the potential of our model by employing state-of-the-art Bayesian probability techniques that are able to give the most plausible closed-form expression for the degree-feature distribution when it cannot be computed analytically. By feeding these most plausible expressions into the equations of our model, we study feature-based percolation in systems for which only the feature and the degree of the individual nodes are known, instead of the entire joint degree-feature probability distribution. This considerably broadens the applicability of the theory and bridges our theory, grounded on statistical physics, with Bayesian machine learning techniques suitable for knowledge discovery.

To sum up, we propose a natural generalization of percolation, flexible enough to include, for the first time, node removal protocols based on the combination of the degree and non-topological node metadata (features) of different nature. We discuss in depth the methodological consequences of this generalization and its applicability by means of a multitude of examples, synthetic and empirical. The inclusion of features in the robustness assessment of networks has impact at both a fundamental and an applied level. At the fundamental level, interesting new phenomenology emerges from the inclusion of features. At the applied level, insightful lessons can be learnt to better protect, or dismantle, real systems.

The article has been recently accepted in Nature Communications.
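
A bare-bones numerical version of feature-based percolation (a synthetic Barabasi-Albert graph, an invented degree-correlated feature, and removal in decreasing feature order; none of the paper's analytical machinery) looks like this:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)

G = nx.barabasi_albert_graph(1000, 3, seed=2)
# Hypothetical non-topological feature, positively correlated with degree
feature = {n: G.degree(n) ** 0.5 + rng.normal(0, 0.5) for n in G}

def giant_fraction(graph):
    """Fraction of the original nodes in the largest connected component."""
    if graph.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(graph)) / len(G)

# Remove nodes in decreasing order of feature value and track the giant component
order = sorted(G, key=feature.get, reverse=True)
H = G.copy()
curve = []
for n in order[:500]:
    H.remove_node(n)
    curve.append(giant_fraction(H))

print("giant component after removing 25% highest-feature nodes:", curve[249])
```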

17:00
Markov Chain Aggregation in the Schelling Model

ABSTRACT. Schelling's 1971 multi-agent model revealed the emergence of macroscopic segregation in a system of interacting agents even when individual preferences do not favour homogeneous neighbourhoods over mixed neighbourhoods. The behaviour of the original Schelling model, and of its abundant variants in the literature, has been studied extensively through numerical simulations, leading to phase diagrams showing which stationary macrostate (mixed or segregated) is reached depending on the values of a handful of parameters. One of these parameters is the so-called "tolerance" T: the maximum level of heterogeneity for which an agent is still satisfied with her neighbourhood - above this level, she will move to another site in the system. Starting from Schelling's observations, phase transitions have been shown to occur in the simulations when T is increased from 0 to 1.

We present here an analytical approach to the Schelling model, using Markov chain aggregation techniques. The initial space of microscopic states is reduced to a smaller state space by aggregating micro-states that lead to the same value of a macroscopic variable - in this case the interface density. We derive the transition matrix of the corresponding aggregated Markov chain, and we determine a critical value of T that characterizes the phase transition between segregated and mixed stationary states.
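
The macroscopic variable used for the aggregation, the interface density, is easy to track in a toy simulation (a closed two-type Schelling variant with swap moves on a torus and an arbitrary tolerance T; the talk's aggregation is carried out analytically on the micro-state space, not by simulation):

```python
import numpy as np

rng = np.random.default_rng(8)

L, T, STEPS = 30, 0.4, 20000          # grid side, tolerance, update attempts
grid = rng.integers(2, size=(L, L))   # two agent types, fully occupied torus

def unlike_fraction(i, j):
    """Fraction of the four nearest neighbours of unlike type."""
    neigh = [grid[(i + di) % L, (j + dj) % L]
             for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    return np.mean([n != grid[i, j] for n in neigh])

def interface_density():
    """Macro variable: fraction of neighbouring pairs whose types differ."""
    return 0.5 * (np.mean(grid != np.roll(grid, 1, axis=0)) +
                  np.mean(grid != np.roll(grid, 1, axis=1)))

for _ in range(STEPS):
    # Pick two random sites; two unsatisfied agents swap locations
    (i1, j1), (i2, j2) = rng.integers(L, size=(2, 2))
    if unlike_fraction(i1, j1) > T and unlike_fraction(i2, j2) > T:
        grid[i1, j1], grid[i2, j2] = grid[i2, j2], grid[i1, j1]

print("interface density:", round(interface_density(), 3))
```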

17:15
Scalable Learning of Independent Cascade Dynamics from Partial Observations
PRESENTER: Mateusz Wilinski

ABSTRACT. Spreading processes play an increasingly important role in the modeling of diffusion in networks, information propagation, marketing and opinion setting. We address the problem of learning a spreading model such that the predictions generated from it are accurate and can subsequently be used for the optimization and control of diffusion dynamics. Unfortunately, full observations of the dynamics are rarely available. As a result, standard approaches such as maximum likelihood quickly become intractable for large network instances. We introduce a computationally efficient algorithm, based on a scalable dynamic message-passing approach, which is able to learn the parameters of the effective spreading model given only limited information on the activation times of nodes in the network. We show that tractable inference from the learned model generates a better prediction of marginal probabilities compared to the original model. We develop a systematic procedure for learning a mixture of models which further improves the prediction quality of the model.
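
The data the learning algorithm works from can be pictured with a simulation of the independent cascade model under partial observation (random graph, invented transmission probabilities, and a 60% observation rate; the dynamic message-passing learning step itself is not reproduced here):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(9)

G = nx.erdos_renyi_graph(100, 0.05, seed=3, directed=True)
alpha = {e: rng.uniform(0.05, 0.3) for e in G.edges}   # true transmission probs

def independent_cascade(seeds):
    """Return activation times; np.inf means never activated."""
    t = {n: np.inf for n in G}
    for s in seeds:
        t[s] = 0
    frontier, step = list(seeds), 0
    while frontier:
        step += 1
        new = []
        for u in frontier:
            for v in G.successors(u):
                if t[v] == np.inf and rng.random() < alpha[(u, v)]:
                    t[v] = step
                    new.append(v)
        frontier = new
    return t

times = independent_cascade(seeds=[0, 1])
# Partial observation: keep activation times for only 60% of the nodes
observed = {n: tt for n, tt in times.items() if rng.random() < 0.6}
print("activated:", sum(tt < np.inf for tt in times.values()),
      "| observed nodes:", len(observed))
```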

17:30
A metric on directed graph nodes based on hitting probabilities
PRESENTER: Zachary Boyd

ABSTRACT. We introduce a distance function on directed graphs using the hitting probability of an ordered pair of nodes, which is the probability that a random walker starting at the first node will reach the second before returning to the first. Our metric uncovers direction-based structure, such as looping and dynamical trapping, that is invisible to other directed graph metrics and symmetrizations. Example applications, such as structure discovery, weak community detection in dense graphs, multiscale analysis, and partitioning of Markov chains, will be discussed. The full details are available at arXiv:2006.14482 and in a forthcoming SIAM J. Math. Data Sci. article.
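
The basic quantity behind the metric can be computed by solving a small linear system on the walk's transition matrix; the sketch below does this for a hand-made strongly connected digraph (how the pair of asymmetric probabilities is then turned into the actual metric follows the paper, not this sketch):

```python
import numpy as np
import networkx as nx

def hit_before_return(P, i, j):
    """Probability a walker from i reaches j before returning to i."""
    n = P.shape[0]
    others = [k for k in range(n) if k not in (i, j)]
    # h[x] = P(hit j before i | start at x), with h[i] = 0 and h[j] = 1
    A = np.eye(len(others)) - P[np.ix_(others, others)]
    b = P[others, j]
    h = np.zeros(n)
    h[j] = 1.0
    h[others] = np.linalg.solve(A, b)
    return P[i] @ h          # one step out of i, then hit j before returning

# Hand-made strongly connected digraph: a 6-cycle plus a few shortcuts
G = nx.DiGraph()
G.add_edges_from([(i, (i + 1) % 6) for i in range(6)])
G.add_edges_from([(0, 3), (3, 0), (2, 5)])

adj = nx.to_numpy_array(G)
P = adj / adj.sum(axis=1, keepdims=True)   # row-stochastic random walk

q_ij, q_ji = hit_before_return(P, 0, 3), hit_before_return(P, 3, 0)
print("hitting probabilities:", round(q_ij, 3), round(q_ji, 3))
```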