Workshop 1 : Stability and Stabilization of Time-Delay Systems: An Operator-Theoretic Development


  Speaker: Professor Jie Chen

  Affiliation: City University of Hong Kong



  Speaker: Professor Dan Ma

  Affiliation: Northeastern University, Shenyang, China



  Speaker: Dr. Tian Qi

  Affiliation: South China University of Technology, Guangzhou, China



  Speaker: Dr. Jing Zhu

  Affiliation: Nanjing University of Aeronautics and Astronautics, Nanjing, China

Abstract: Time delays arise in the transport of energy, mass, and information, and are omnipresent in natural and engineered systems. Modern interconnected networks are especially prone, and indeed vulnerable, to long and variable delays; systems and networks in this category are many, ranging from communication networks, sensor networks, and cyber-physical systems to biological systems. Except in rare instances, time delays are likely to result in degraded performance, poor robustness, and even instability, and consequently pose significant challenges to the analysis and design of control systems under delayed feedback.

While stability has been a recurring subject of study, the last two decades or so have seen particularly notable advances in the stability analysis of time-delay systems, thanks to the development of analysis methods drawing upon robust control theory and of computational methods for solving linear matrix inequality (LMI) problems. An extraordinary volume of literature exists on stability problems, and various time- and frequency-domain stability criteria have been developed. Among these developments, while an overwhelming majority of the available results are obtained using time-domain Lyapunov-Krasovskii methods and require the solution of LMIs, frequency-domain conditions in the spirit of the small-gain theorem have also been sought. Generally, time-domain stability conditions are applicable to both constant and time-varying delays, but are known to suffer from varying degrees of conservatism. In contrast, frequency-domain tests are largely restricted to constant delays, though they often provide tight conditions and appear more amenable to feedback synthesis.

Despite the considerable advances in stability analysis, control design problems for time-delay systems prove far more challenging. Feedback stabilization of time-delay systems poses a difficult problem and has remained a somewhat underdeveloped research area. Fundamental robustness issues have likewise seldom been investigated. Furthermore, recent advances in broad fields of science and engineering have brought new issues and problems to the area of time-delay systems; delays arising from interconnected systems and networks present challenges unexplored in the past and are increasingly seen to have far graver effects, for which existing theories do not seem well equipped.

In the workshop we intend to discuss a wide variety of subjects on the stability and stabilization of time-delay systems. We ask such questions as: When will a delay system be stable or unstable, and for what values of delay? When can an unstable delay system be stabilized? What range of delay can a feedback system tolerate while maintaining stability? Fundamental questions of this kind have long eluded engineers and mathematicians alike, yet ceaselessly invite new thoughts and solutions. We shall present tools and techniques that answer these questions, seeking to provide exact and efficient computational solutions to stability and stabilization problems of time-delay systems. In particular, we shall develop in full an operator-theoretic approach that departs from both the classical algebraic and the contemporary LMI solution approaches, notable for both its conceptual appeal and its computational efficiency. Extensions to networked control and multi-agent systems will also be addressed.
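As a toy illustration of the first question — not of the workshop's operator-theoretic machinery — consider the scalar delay system x'(t) = -x(t - tau), which is known to be stable exactly for tau < pi/2. A crude forward-Euler simulation (all step sizes and horizons below are arbitrary choices for the sketch) can check either side of this boundary:

```python
import numpy as np

def simulate(tau, k=1.0, T=60.0, dt=1e-3):
    """Forward-Euler simulation of x'(t) = -k x(t - tau), unit history."""
    n = int(T / dt)
    d = max(1, int(round(tau / dt)))
    x = np.ones(n + d)                    # history x(t) = 1 for t <= 0
    for i in range(d, n + d - 1):
        x[i + 1] = x[i] - dt * k * x[i - d]
    return x

# Amplitude over the final second of simulation
tail = lambda tau: np.max(np.abs(simulate(tau)[-1000:]))

print(tail(1.2))   # tau below pi/2 ~ 1.5708: the response decays
print(tail(2.0))   # tau above pi/2: the response grows without bound
```

For k = 1 the characteristic equation s + e^(-s tau) = 0 first crosses the imaginary axis at omega = 1, tau = pi/2, which is exactly the delay-margin boundary the simulation exhibits.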

Topics include:

We propose a 3–4 hour, half-day pre-conference workshop addressing the following subjects, all unified under an operator-theoretic, small-gain theorem approach:

Classical stability tests for time-delay systems.

Eigenvalue perturbation theory.

Eigenvalue series for stability analysis of time-delay systems.

Small gain stability conditions for time-delay systems.

Robust stability of delay systems.

Stabilization of delay systems: The delay margin problem.

Fundamental bounds on delay margin.

Delay margin achievable by PID controllers.

Delay effects on networked feedback stabilization.

Delay effects on multi-agent consensus.


Workshop 2 : Economic Model Predictive Control


  Speaker: Dr. Jinfeng Liu

  Affiliation: University of Alberta

Abstract: Model predictive control (MPC) has been an important and successful advanced control technology in the process industries, mainly due to its ability to handle complex systems with hard control constraints effectively. MPC provides a very flexible optimal control framework that can address a wide range of industrial issues while incorporating state or output feedback. Traditionally, MPC with quadratic cost functions has dominated the focus of MPC research. Technological advances in recent decades have enabled us to look beyond traditional MPC and have brought new challenges and opportunities to MPC research. One important example of this technology-driven development is economic MPC.

Economic MPC removes the separation between optimization and control found in traditional hierarchical real-time optimization systems and addresses both optimization and control in a single layer. Economic MPC optimizes a general economic cost function, which in general is not quadratic. This workshop is intended to introduce researchers to (i) the theory and design of economic MPC systems, (ii) numerical implementation of economic MPC, and (iii) applications of economic MPC to different systems.
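As a minimal sketch of the receding-horizon idea with a non-quadratic economic stage cost — using a hypothetical scalar plant x+ = 0.9x + u whose numbers and cost are invented for illustration, not taken from the workshop — each step optimizes over a finite horizon and applies only the first input:

```python
import numpy as np
from scipy.optimize import minimize

A, B, N = 0.9, 1.0, 10           # toy plant x+ = A x + B u, horizon N

def econ_cost(u_seq, x0):
    """Non-quadratic economic stage cost: quartic deviation + input effort."""
    x, J = x0, 0.0
    for u in u_seq:
        J += (x - 1.0) ** 4 + 0.01 * u ** 2
        x = A * x + B * u        # single-shooting state propagation
    return J

def mpc_step(x0):
    """Solve the horizon problem; apply only the first input (receding horizon)."""
    res = minimize(econ_cost, np.zeros(N), args=(x0,),
                   bounds=[(-1.0, 1.0)] * N, method="L-BFGS-B")
    return res.x[0]

x = 3.0                          # closed-loop simulation from x(0) = 3
for _ in range(15):
    x = A * x + B * mpc_step(x)
print(round(x, 2))               # settles near the economically optimal state
```

The single-layer character shows in the loop: the same optimization that determines the economically best operating point also produces the control action, with no separate setpoint-tracking layer.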

Workshop 3 : Visioning Safe and Smart Cities with Situational Awareness and Computational Intelligence


 Speaker: Professor Vincenzo Loia

 Affiliation: University of Salerno, Italy

Abstract: It’s not uncommon for large-scale enterprises to manage hundreds of thousands of computers, networks with thousands of devices, and petabytes of data. What makes problems at this scale particularly difficult is that often there must be a human in the loop. Security is no exception. Security analysts struggle to automate problem-solving duties because it’s inherently difficult to capture the tacit knowledge and procedures that they use to arrive at a decision; in addition, integrating with the myriad tools and sources of information an analyst uses to make a decision and react is cost-prohibitive.

Humans introduce significant delays in the time to mitigate a threat. Yet security analysts are typically flooded with far more alerts than they can possibly handle, and how these alerts should be prioritized is poorly understood. Humans make mistakes, so organizations typically establish processes and best practices for their labor force to ensure that problems are dealt with systematically and predictably. Although this doesn’t guarantee that errors won’t happen, these techniques can help keep errors at a tolerable level. But policies and best practices are designed to address well-understood threats and often don’t adequately address emerging threats, especially in highly dynamic and changing environments.

Situation Awareness is usually defined in terms of what information is important for a particular job or goal.

Most problems with Situation Awareness occur at the “Perception” and “Comprehension” levels because of missing information, information overload, information perceived incorrectly (e.g., noise), or information not pertinent to the specific goal. Thus the current situation must generally be identified under conditions of uncertainty and within complex, critical environments. In such cases, an effective hybridization of the human component with the technological (automatic) component is needed to succeed in tasks related to Situation Awareness.

Situation Awareness oriented systems have to organize information around goals and provide a proper level of abstraction of meaningful information. To address these issues, we propose a Cognitive Architecture for defining Situation Awareness oriented systems, built by starting from the well-known Endsley model and integrating a set of Computational Intelligence techniques (e.g., Fuzzy Cognitive Maps and Formal Concept Analysis) to support the three main processes of the model (perception, comprehension and projection). One of these techniques is Granular Computing, which makes information observable at different levels of granularity and approximation, allowing humans to focus on specific details, on the overall picture, or on any other level according to their specific goals, constraints, roles, characteristics and so on.
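By way of illustration, a Fuzzy Cognitive Map encodes concepts as nodes and causal influences as a signed weight matrix, and infers a situation by iterating a squashed state update until it settles. The concepts and weights below are invented for this sketch, not drawn from the proposed architecture:

```python
import numpy as np

# Hypothetical concepts (names assumed): 0: congestion, 1: incident alert,
# 2: emergency response. W[i, j] is the causal weight from concept i to j.
W = np.array([[ 0.0, 0.7, 0.0],    # congestion raises the incident alert
              [ 0.0, 0.0, 0.8],    # the alert triggers emergency response
              [-0.6, 0.0, 0.0]])   # the response lowers congestion

def fcm_step(x, W, lam=2.0):
    """One Fuzzy Cognitive Map inference step with sigmoid squashing."""
    return 1.0 / (1.0 + np.exp(-lam * (x + x @ W)))

x = np.array([0.9, 0.1, 0.0])      # initial activation: heavy congestion
for _ in range(100):               # iterate toward a fixed point
    x = fcm_step(x, W)
print(np.round(x, 2))              # settled activation of each concept
```

The settled activation vector is what a comprehension-level component would read off as the assessed situation; richer maps simply enlarge W and the concept list.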

Furthermore, the proposed Cognitive Architecture incorporates enabling technologies such as multi-agent systems and semantic modeling in order to cope with the complexity and heterogeneity of the monitored environment and to represent, in a machine-understandable way, procedural, factual and other kinds of knowledge, along with all the memory facilities that may be required.

Practical experiences deriving from the realization of complex systems in the domain of Smart and Safe Cities will be presented during the talk.

Workshop 4 : System Modeling and Control: At the Junction with Data


 Speaker: Professor Witold Pedrycz

 Affiliation: University of Alberta, Canada

Abstract: The apparent challenges encountered in system modeling are inherently associated with large volumes of data, data variability, and an evident quest for transparency and interpretability of the established constructs and obtained results. Along with the emergence and increasing visibility and importance of data analytics, we are witnessing a paradigm shift in which several dominant tendencies become apparent: (i) reliance on data and the building of structure-free, versatile models spanned over selected representatives of experimental data, (ii) the emergence of models at various levels of abstraction, and (iii) the building of collections of individual local models together with their efficient aggregation.

We advocate that information granules play a pivotal role in the realization of this paradigm shift. We demonstrate that the framework of Granular Computing, along with the diversity of its formal settings, offers a critically needed conceptual and algorithmic environment. Information granules and information granularity are synonyms of levels of abstraction. A suitable perspective built with the aid of information granules is advantageous in realizing an appropriate level of abstraction and becomes instrumental in forming sound, practical, problem-oriented tradeoffs among precision of results, their ease of interpretation, value, and stability (as lucidly articulated in the principle of incompatibility coined by Zadeh). All these aspects emphasize the importance of the actionability and interestingness of the produced findings, whether for purposes of control or decision-making. Granular models built on the basis of available numeric models deliver a comprehensive view of real-world systems. More specifically, granular spaces, viz. spaces of granular parameters of the models and granular input and output spaces, play a pivotal role in making the original numeric models more realistic.

The data-oriented models tend to depart from analytical descriptions (in the form of nonlinear mappings) but directly exploit subsets of meaningful/representative data over which such models are developed. A representative class of models with this regard concerns associative memories, which realize both one-directional and bidirectional recall (mapping).  We carefully revisit and augment the concept of associative memories by proposing some new design directions. We focus on the essence of structural dependencies in the data and make the corresponding associative mappings spanned over a related collection of landmarks (representatives OD data). We show that a construction of such landmarks is supported by mechanisms of collaborative fuzzy clustering. In the sequel, structural generalizations of the discussed architectures to multisource and multi-directional memories involving associative mappings among various data spaces are proposed and their design is discussed.