Invited Speakers

 

Dr Sebastian E. AHNERT (UK)

Self-assembly, modularity, and physical complexity.


Self-assembly is ubiquitous in physics, chemistry and biology, and has many applications in materials science and engineering. Here we present a general approach for finding the simplest set of building blocks which will assemble into a given physical structure. Our procedure can be adapted to any given geometry, and thus to any given type of physical system. The amount of information required to describe the simplest set of building blocks which self-assembles into a given structure provides a quantitative measure of the structure's physical complexity. This measure is capable of detecting any symmetry or modularity in the underlying structure. We also introduce the notions of joint, mutual and conditional complexity for self-assembling structures. We illustrate our approach using self-assembling polyominoes, and demonstrate the breadth of its potential applications by using it to quantify the physical complexity of molecules and crystals, as well as protein complexes.
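
For readers who want to experiment, here is a minimal sketch of lattice self-assembly from edge-labelled square tiles, in the spirit of the talk; the tile encoding, the binding rule and the crude bit-count at the end are illustrative assumptions, not the authors' published procedure.

```python
# Minimal sketch: deterministic self-assembly of a polyomino from square
# tiles with coloured edges (N, E, S, W).  Colour 0 is inert; colour c
# binds its complement comp(c).  All conventions here are illustrative.
import math

def comp(c):
    # Hypothetical pairing rule: colours form complementary pairs (1,2), (3,4), ...
    return c + 1 if c % 2 == 1 else c - 1

def assemble(tiles, max_steps=100):
    """Greedily grow a structure on the square lattice from a seed tile."""
    N, E, S, W = range(4)
    opposite = {N: S, E: W, S: N, W: E}
    offset = {N: (0, 1), E: (1, 0), S: (0, -1), W: (-1, 0)}
    grid = {(0, 0): tiles[0]}                     # seed tile at the origin
    for _ in range(max_steps):
        grown = False
        for (x, y), tile in list(grid.items()):
            for side, (dx, dy) in offset.items():
                colour, pos = tile[side], (x + dx, y + dy)
                if colour == 0 or pos in grid:
                    continue
                for t in tiles:                   # first tile that binds wins
                    if t[opposite[side]] == comp(colour):
                        grid[pos] = t
                        grown = True
                        break
        if not grown:
            break
    return grid

# Two tiles that assemble into a 2x1 domino: edge colour 1 binds colour 2.
tiles = [(0, 1, 0, 0), (0, 0, 0, 2)]
print(sorted(assemble(tiles)))                    # -> [(0, 0), (1, 0)]

# Crude information content of the tile set: bits needed to write down
# every edge colour -- a stand-in for the complexity measure of the talk.
n_colours = 3
print(f"{len(tiles) * 4 * math.log2(n_colours):.1f} bits")
```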


Pr Gustav BERNROIDER (Austria)

Common Grounds: The Role of Perception in Science.


Science is made by scientists, and scientists are consciously perceiving agents, experiencing, observing and analysing the world. From this simple and rather indisputable fact it is easy to see that we cannot discuss the emerging picture of our world without paying some attention to the way we perceive and experience the world. Physical realism suggests that all results of our observations depend on pre-existing properties of nature and are ‘local’, i.e. not dependent on space-like separated events. This is how we perceive the world. However, such views are incompatible with, and generally violated by, quantum-mechanical predictions (see e.g. Gröblacher et al., Nature, 446, 871, 2007 for a review). So nature can be at variance with physics. This is a rather unpleasant situation and exposes the limits of our science. In this presentation I will search for possible solutions to this rather radical and persistent problem. I will advocate the view that some scale-invariant phenomenal structure providing the primitives of our experience must be an indispensable ingredient of our science if we are to overcome these problems. From a physical perspective, the concept proposed here could be considered a type of contextual hidden-variable (CHV) model, similar to the fractal invariant sets recently proposed by T. N. Palmer. Whereas Palmer's invariant sets are motivated by gravitational systems, the present concept is founded within the phenomenology of perception and experience. Contextuality is seen as a local change in the similarity dimension within a partition of some fractal phenomenal dynamics. I argue that incorporating this view into physics can help to resolve the problem behind physical realism and the ‘tuning problem’ between life and cosmology.


Pr Luciano BOI (France)

Seeing the world from inside: internal (geometrical) and external (physical) representations of dynamical and living systems.


There are many different definitions of complex systems. They may range from classical algorithmic complexity (Kolmogorov, Chaitin) to more recent and sophisticated definitions: chemical, statistical-physics, topological-dynamical, and biological ones. It is clear that no single definition (especially a mathematical one) can capture all the meanings we associate with the word complexity. One interesting definition rests on the basic idea that the more complex a system is, the more can be said about it. In my presentation I will exclude the factual description of the system, which may be very long, and refer only to its global characteristics. In the physical sciences, and more particularly in the biological ones, complexity is generally related to the existence of different levels of description. For example, one can describe an Escherichia coli at the molecular level, at the biochemical level, and at the functional level. If we move towards a mathematical definition, we must realize that the concept of complexity, like entropy, is often probabilistic in nature, and it can be defined more precisely if we try to define the complexity of ensembles of objects of the same category. This is related to the notion of classification. The meaning of a complex classification is intuitively quite clear: a classification is very complex if there are many levels (i.e. orders, families, genera) and many elements and relationships between them at each level. In this lecture we will introduce the notion of external and internal representations (or modelling) of complex systems in order to understand how the above-mentioned mathematical classification of the different levels of organization of a physical or living system can be captured.


Pr Alfred BRUCKSTEIN (Israel)

From Ants to A(ge)nts: Modelling and Analyzing Multi-Agent Systems.


An ant colony is a marvel of cooperation and coordinated, purposeful work carried out by simple a(ge)nts with very limited capabilities. Ants do not have GPS systems, compasses or odometers, do not use laser range finders, nor do they have good memories or extraordinary computational resources, and they employ no sophisticated long-range sensing or communication equipment. Yet they are ruling the earth, by numbers and by resilience, and by evolution-developed local response algorithms that rely on pheromone-mediated myopic interactions. The environment becomes a huge, shared resource covered with "chemical memory" signals.


The paradigm of swarming agents/bots is an attempt to mimic this phenomenal success of nature. In attempting to analyze the capabilities of colonies of small and limited agents to perform a variety of tasks, one encounters formidable mathematical difficulties. The direct problem of analyzing the emergent global behavior that results from a set of rules of local interaction is tractable in a few interesting cases, for example in gathering and in region covering or patrolling missions. The inverse problem of deriving local rules of behavior, based on the ant-like agents' limited sensing and communication capabilities, is far less approachable. Several examples illustrating the mathematical tools available for analyzing the behavior of swarms of myopic agents will be discussed.
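
As a flavour of the direct problem, here is a minimal sketch of a pheromone-mediated patrolling rule in the vertex-ant-walk spirit: each myopic agent moves to the neighbouring cell with the oldest mark. The grid size, number of agents and tie-breaking rule are illustrative assumptions, not the speaker's models.

```python
# Minimal sketch: patrolling on a grid.  Each agent timestamps its cell
# ("pheromone") and steps to the neighbouring cell with the oldest
# timestamp; this myopic local rule alone yields repeated coverage.
import random

W, H, STEPS = 8, 8, 400
mark = [[0] * W for _ in range(H)]      # shared "chemical memory"
ants = [(0, 0), (7, 7), (0, 7)]         # three myopic agents
clock = 0

def neighbours(x, y):
    cells = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(a, b) for a, b in cells if 0 <= a < W and 0 <= b < H]

random.seed(0)
for _ in range(STEPS):
    for i, (x, y) in enumerate(ants):
        clock += 1
        mark[y][x] = clock              # deposit pheromone = time of visit
        options = neighbours(x, y)
        random.shuffle(options)         # break ties arbitrarily
        ants[i] = min(options, key=lambda p: mark[p[1]][p[0]])

visited = sum(1 for row in mark for m in row if m)
print(f"cells visited: {visited}/{W * H}")   # reaches 64/64 for modest STEPS
```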


Pr Dalia CHAKRABARTY (England)

Phase portrait of dynamical systems: the governing rule of evolution.


The idea of smooth parametric phase space densities in galaxies does not do justice to the complexity manifest in the dynamics of these systems. Analyses of stellar velocities in the solar neighbourhood have indicated a highly multi-modal Milky Way phase space, at least on local scales. Dynamics and evolution in such a phase space are found to be markedly non-linear, with a preponderance of chaos. The Bayesian nonparametric techniques that I am currently using to study the dynamics of distant galaxies, as well as of the Milky Way, will be discussed, along with already published statistical methodologies and results. The non-linear nature of phase space in an example external galaxy will also be discussed, in the context of the possible bistability of the system potential. The consequent risk involved in estimating dark matter in galaxies on the basis of tracer kinematics will also be discussed.


Pr Oliver DORN (UK)

Data to Images: a shape-based approach


When solving inverse problems we typically have physical data given and are looking for an image (which might be 2D or 3D) that reproduces these data in some specified sense. Typically these images represent coefficients of a partial differential equation, called the forward model, and a simulator is used to verify the data fidelity of any given image which is proposed as a possible solution.

Classical inverse problems theory tells us that inverse problems are ill-posed: a well-defined unique image which fits the data either does not exist, or its calculation from the data is unstable, or (typically) both. Therefore, in most deterministic approaches for calculating candidate images, regularization schemes are employed which provide images from a certain class, in most cases smooth images. Certainly, natural images are not necessarily smooth, but might contain discontinuities which provide the image with structure.

This structure is of high importance in many applications. It might represent different lithologies in geophysical applications, different materials in non-destructive testing, or different organs or tissue types in medical imaging. Structure can be imposed by brute force on already reconstructed images by applying off-the-shelf image segmentation techniques to them. However, this approach has various severe drawbacks in many applications, and might simply not work at all. We propose in this talk a novel level-set based approach for finding such structured images (i.e. images containing sharp interfaces between different characteristic regions) directly from the given data, without making this detour via image post-processing techniques. Several examples from medical imaging, petroleum engineering and non-destructive testing applications are presented which show the performance of this novel technique in realistic situations.
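
A minimal sketch of the level-set idea follows, under strong simplifying assumptions (not the speaker's algorithm): the forward model is taken to be the identity, so the "data" are just a noisy image; the image is binary-valued; the interface delta function is relaxed to one; and a Laplacian term stands in for proper regularization. Real applications replace the identity with a PDE-based simulator.

```python
# Minimal sketch: level-set recovery of a binary shape from noisy data.
import numpy as np

rng = np.random.default_rng(0)
n, c_in, c_out = 64, 1.0, 0.0

Y, X = np.mgrid[0:n, 0:n]
truth = ((X - 32) ** 2 + (Y - 32) ** 2 < 15 ** 2).astype(float)
data = truth + 0.3 * rng.standard_normal((n, n))   # "measured" image

phi = np.ones((n, n))                   # level-set function; phi < 0 = inside
phi[24:40, 24:40] = -1.0                # small square seed

for _ in range(300):
    image = np.where(phi < 0, c_in, c_out)
    residual = image - data
    lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
           np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi)
    # Gradient descent on 0.5*||image - data||^2 (delta(phi) relaxed to 1),
    # plus a diffusion term that smooths the level-set function.
    phi += 0.1 * ((c_in - c_out) * residual + 2.0 * lap)

recovered = phi < 0
print(f"misclassified pixels: {np.mean(recovered != truth.astype(bool)):.3%}")
```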


Pr Jean-Claude HEUDIN (France)

Reality, Models and Representations: the cases of Galaxies, Intelligence and Avatars.


This talk discusses relationships between reality, models and representations in the study of complex systems. We begin by proposing a definition of these concepts. We then outline the relationships between them and with an observer. We show that visualization is a necessary representation of a model, which is itself a necessary representation of a phenomenon. However, using a thought experiment based on a generalized Turing test, we conclude that there is no possible equivalence between them. We illustrate the consequences of these results on three different case studies: (1) the modelling of colliding galaxies, (2) the creation of believable human-like virtual agents, and (3) their graphical 3D representations.


Pr Hans LILJENSTRÖM (Sweden)

Evolving Complexity, Cognition, and Consciousness


All through the history of the universe there is an apparent tendency towards increasing complexity, with the organization of matter into ever more elaborate and interactive systems. The living world in general, and the human brain in particular, provide the highest complexity known. It seems obvious that all of this complexity must be the result of physical, chemical and biological evolution, but it was only with Darwin that we began to get a scientific understanding of biological evolution. Darwinian principles guide our understanding of such complex systems as the nervous system, but also of the evolution of human society and technology.

Living organisms have to survive in a complex and changing environment. This implies response and adaptation to environmental events and changes at several time scales. The interaction with the environment depends on the present state of the organism, as well as on previous experiences stored in its molecular and cellular structures. At a longer time scale, organisms can adapt to slow environmental changes by storing information in the genetic material carried over from generation to generation. This phylogenetic learning is complemented by ontogenetic learning, which is adaptation at a shorter time scale, occurring in non-genetic structures.

The evolution of a nervous system is a major transition in biological evolution, allowing for an increasing capacity for information storage and processing, and thereby increasing chances of survival. Such neural knowledge processing, cognition, shows the same principal features as non-neural adaptive processes. Similarly, consciousness might appear, to different degrees, at different stages in evolution. Both cognition and consciousness depend critically on the organization and complexity of the organism. In this presentation, I will briefly discuss general principles for the evolution of complexity, focussing on the evolution of the nervous system, which provides organisms with an ever increasing capacity for complex behaviour, cognition and consciousness. I will also discuss some computational approaches as tools for understanding the relations between structure, dynamics and function of the nervous system.


Pr Giuseppe LONGO (Italy) and Pr S. George DJORGOVSKI (USA)

The emerging virtual reality technologies for professional research in astronomy.


Like any other field of modern human endeavor, science, scholarship, and education are being transformed by the advances in computation and information technology, and astronomy is no exception. Much of the scholarly work is now moving to virtual environments, typically accessible through the Web. The exponential growth of data volumes, and the simultaneous increase in data complexity, have prompted the rise of virtual scientific organizations: distributed research environments connecting data archives and the web-based tools for their federation and analysis. The Virtual Observatory (VO) framework is an excellent example of this trend. However, the uptake of these new methodologies for scientific research in the 21st century has been relatively slow in the academic community, and much of the VO effort has been focused on establishing an effective global data grid for astronomy. Thus, we now see a rise in “X-Informatics”, where X = (astro, bio, geo…), as many sciences are incorporating modern computational tools to enable discovery in their domain; an example is the emerging field of Astroinformatics. Equally transformative changes have occurred in scientific publishing and knowledge dissemination through electronic media. Scientists now spend much of their time interacting with the data, theoretical models, published literature, and their colleagues through various Internet-based mechanisms. Meanwhile, information technology also enables and empowers education and public outreach, activities whose importance is fundamental and growing. In fact, the current and upcoming generations of “digital natives” expect their education to be delivered through electronic media. Any technology that enables better communication between scientists, scholars, and students, and also between scientists and their data, models, and literature, is important. One example is the rise of immersive virtual reality environments and virtual worlds, which may become the next-generation interface to the informational content now available through the Web, as well as human-friendly interaction environments for scientific collaboration and education.


Pr Rafael MOLINA (Spain)

Super-resolution applied to satellite imagery


Many satellites on different Earth observation platforms provide multispectral images together with a panchromatic image. Pansharpening is a pixel-level fusion technique used to increase the spatial resolution of the multispectral image while simultaneously preserving its spectral information. We provide a review of the pansharpening methods proposed in the literature, give a clear classification of them, and describe their main characteristics. We analyze how the quality of the pansharpened images can be assessed both visually and quantitatively, and examine the different quality measures proposed in the literature. Finally, we analyze the effects of pansharpening methods on the accuracy provided by classification algorithms.
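
As a concrete example from the component-substitution family of methods reviewed in the talk, here is a minimal sketch of the classical Brovey transform; the array shapes, band count and toy data are illustrative assumptions.

```python
# Minimal sketch: Brovey-transform pansharpening.  Each upsampled
# multispectral band is rescaled so that the band sum tracks the
# panchromatic intensity.
import numpy as np

def brovey_pansharpen(ms_up, pan, eps=1e-6):
    """ms_up: (bands, H, W) multispectral image already upsampled to the
    pan grid; pan: (H, W) high-resolution panchromatic image."""
    intensity = ms_up.sum(axis=0) + eps
    return ms_up * (pan / intensity)[None, :, :]

rng = np.random.default_rng(1)
ms = rng.random((3, 16, 16))                     # coarse 3-band image
ms_up = np.kron(ms, np.ones((4, 4)))             # nearest-neighbour 4x upsampling
pan = np.kron(ms.mean(axis=0), np.ones((4, 4)))  # synthetic pan image ...
pan += 0.05 * rng.standard_normal(pan.shape)     # ... carrying extra detail

sharp = brovey_pansharpen(ms_up, pan)
print(sharp.shape)                               # (3, 64, 64)
```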


Pr Fionn MURTAGH (Ireland)

Hierarchical Clustering in Large, High-Dimensional Datasets.


Our focus is on hierarchical clustering, drawing on a range of applications including astronomy, chemistry, and textual analysis. Hierarchical clustering offers much scope for finding patterns based on symmetries in data. We look at recent results that go beyond the well-established quadratic-time agglomerative hierarchical clustering algorithms. These include a linear-time hierarchy embedding that uses the Baire, or longest common prefix, (ultra)metric. We also look at model-based clustering in extremely high-dimensional spaces. Finally, we discuss the hierarchical clustering of multivariate observations sequenced in time or otherwise.
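
A minimal sketch of the Baire (longest common prefix) idea: binning values by successive digit prefixes induces a hierarchy in one linear pass per level. The decimal encoding and the assumption that data are scaled to [0, 1) are illustrative choices, not the published algorithm's details.

```python
# Minimal sketch: the Baire (longest common prefix) hierarchy.  Values in
# [0, 1) written to fixed precision are binned by successive digit
# prefixes; each pass over the data is linear in the number of items.
from collections import defaultdict

def baire_hierarchy(values, depth=3):
    """Group item indices by their first k digits, for k = 1..depth."""
    digits = [f"{v:.{depth}f}"[2:] for v in values]  # digits after "0."
    levels = []
    for k in range(1, depth + 1):
        groups = defaultdict(list)
        for i, d in enumerate(digits):
            groups[d[:k]].append(i)
        levels.append(dict(groups))
    return levels

data = [0.123, 0.129, 0.157, 0.842, 0.845]
for k, level in enumerate(baire_hierarchy(data), start=1):
    print(f"prefix length {k}: {level}")
# Items 0-2 share digit '1' and split further below; items 3-4 share '84'.
```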


Pr Vladimir NEKORKIN (Russia)

Non-linear dynamics and biological systems


The role of the dynamical approach in the investigation of spatio-temporal patterns of neural activity generated by various structures of the nervous system is analyzed. Dynamic models of individual neurons, as well as of large and small neuronal networks, are considered. Dynamic mechanisms of the formation of complex patterns, including chaotic ones, are elucidated. It is shown that such structures are associated with the existence, in a system supporting traveling waves, of diverse heteroclinic contours, and that they are formed due to the simultaneous instability of a large number of autowaves in the form of pulses and wave fronts. It was found that stable localized patterns of activity possessing autowave properties may exist in a two-dimensional bistable network of FitzHugh-Nagumo neurons with a complex threshold. In spite of the bistability, such structures do not “switch” the medium over from one equilibrium state to another; instead, they change the shape of the spatially localized formations propagating with constant velocity.
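
A minimal sketch of the FitzHugh-Nagumo model named in the abstract: a ring of diffusively coupled units integrated with explicit Euler, in which a local kick launches pulses that propagate through the coupling. The parameter values and ring topology are illustrative assumptions, not those of the speaker's networks.

```python
# Minimal sketch: a ring of diffusively coupled FitzHugh-Nagumo units.
import numpy as np

N, T, dt = 100, 4000, 0.05
a, b, eps, D = 0.7, 0.8, 0.08, 1.0
v, w = np.zeros(N), np.zeros(N)
v[45:55] = 2.0                          # local super-threshold kick
fired = np.zeros(N, dtype=bool)

for _ in range(T):
    lap = np.roll(v, 1) + np.roll(v, -1) - 2 * v  # nearest-neighbour coupling
    v_new = v + dt * (v - v ** 3 / 3 - w + D * lap)
    w += dt * eps * (v + a - b * w)
    v = v_new
    fired |= v > 1.0                    # record cells visited by a pulse

print(f"cells visited by a pulse: {fired.sum()}/{N}")
```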


Pr Sankar PAL (India)

F-granulation, Generalized Rough Sets and Entropy: Uncertainty analysis in pattern recognition with applications


The role of rough sets in uncertainty handling and granular computing is described. The significance of their integration with other soft computing tools, and the relevance of rough-fuzzy computing as a stronger paradigm for uncertainty handling, are explained. Different applications of rough granules and certain important issues in their implementation are stated. Three tasks, namely class-dependent rough-fuzzy granulation for classification, rough-fuzzy clustering, and the definition of generalized rough sets for image ambiguity measures and analysis, are then addressed in this regard, explaining the nature and characteristics of the granules used therein. The merits of class-dependent granulation together with neighborhood rough sets for feature selection are demonstrated in terms of different classification indices. The significance of a new measure, called "dispersion" of classification performance, which focuses on confused classes for higher-level analysis, is explained in this regard. The superiority of rough-fuzzy clustering is illustrated for determining bio-bases (c-medoids) in encoding protein sequences for analysis. Generalized rough sets using the concept of fuzziness in granules and sets are defined both for equivalence and tolerance relations. These are followed by the definitions of entropy and different image ambiguities. Image ambiguity measures, which take into account both the fuzziness in boundary regions and the rough resemblance among nearby gray levels and nearby pixels, have been found to be useful for various image analysis operations.
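
For orientation, here is a minimal sketch of the classical rough-set lower and upper approximations on which the generalized, fuzzy-granule versions in the talk build; the toy attribute table and target concept are invented for illustration.

```python
# Minimal sketch: classical rough-set approximations.  Granules are
# equivalence classes of objects with identical attribute vectors.
from collections import defaultdict

table = {1: ("red", "round"), 2: ("red", "round"), 3: ("red", "square"),
         4: ("blue", "round"), 5: ("blue", "square")}
target = {1, 3, 4}                      # the concept to approximate

granules = defaultdict(set)
for obj, attrs in table.items():
    granules[attrs].add(obj)            # indiscernible objects share a granule

lower = {o for g in granules.values() if g <= target for o in g}
upper = {o for g in granules.values() if g & target for o in g}

print("lower approximation:", lower)    # granules entirely inside the target
print("upper approximation:", upper)    # granules that touch the target
print("boundary (rough) region:", upper - lower)
```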


Pr Tim PALMER (UK)

The Butterfly and the Photon: New Perspectives on Unpredictability in Classical and Quantum Physics


The three revolutions of 20th-century physics, quantum theory, relativity theory and chaos theory, sit uncomfortably together. The difficulties of unifying quantum theory and general relativity theory are well known. Moreover, the basic paradigm of chaos theory, the butterfly effect (the exponentially growing deviation of neighbouring phase space trajectories), is not itself a relativistically invariant notion. And finally, the unpredictability of chaos theory is generally not believed to have any relevance to the unpredictability of quantum measurement; for one thing the Schrödinger equation is precisely linear, and for another Bell’s theorem appears to disallow any underpinning of the quantum state by chaotic determinism, at least within a relativistically invariant framework.
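
For reference, the butterfly effect mentioned above is the standard statement that the separation of two neighbouring trajectories grows exponentially at the rate of the largest Lyapunov exponent (a textbook formula, not taken from the talk):

```latex
% Butterfly effect: neighbouring trajectories separated by \delta(0)
% diverge exponentially at the rate of the largest Lyapunov exponent.
\delta(t) \approx \delta(0)\, e^{\lambda t},
\qquad
\lambda \;=\; \lim_{t \to \infty} \frac{1}{t}
\ln \frac{\lVert \delta(t) \rVert}{\lVert \delta(0) \rVert}.
```

Both t and the phase-space norm here are tied to a particular choice of coordinates, which is one way of seeing why this notion is not relativistically invariant.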

However, it seems unsatisfactory that these divisions exist. We therefore seek a new approach to try to unify these 20th-century revolutions, and claim that a unification may be found by taking more seriously than hitherto one of Einstein’s key insights, the geometrisation of the laws of physics, and one of Einstein’s overarching concerns, the nature of physical reality. Combining this key insight with this overarching concern, our programme is to construct a geometric framework to describe the notion of physical reality at a primitive level, and to show that quantum theory is a coarse-grained description of this geometry. Specifically, we postulate that states of physical reality belong precisely to a dynamically invariant fractal geometry embedded in the Euclidean phase space of the universe as a whole. Some physical justification for this ontological postulate is given.

By focussing on the geometrical properties of this invariant set (e.g. the fractal dimension), a relativistically invariant description of chaos is easily given. More importantly, it is shown that such a geometric description of some underlying causal deterministic description of the evolution of reality is not constrained by Bell’s theorem. In other words, using this geometric approach to the notion of physical reality we uncover a truly massive, hitherto unexplored loophole in Bell’s theorem, and the Schrödinger equation becomes a Liouville equation on this non-computable geometry. Finally, it is suggested that this invariant set postulate offers some quite new ideas towards a final unification of gravity with quantum physics.


Pr Fabio PASIAN (Italy)

The International Virtual Observatory: tools implemented in the framework of astronomical images and data mining


In the past ten years, the concept of the Virtual Observatory (VO) has increasingly gained importance in the domain of astrophysics, as a way of seamlessly accessing data in different wavelength domains stored in digital archives. There are many reasons why the VO is useful for the development of science: to monitor the time variability of phenomena, to compare phenomena in different bands, to increase the return on investment (by fostering data re-use for scientific, educational and outreach purposes), and to perform statistical analysis and mining on large quantities of data.

The International Virtual Observatory Alliance (IVOA) has paved the way for the VO to become a really useful tool for the scientific community, by promoting standards, by defining data interoperability methods, and by fostering the needed coordination among data providers.

But the VO is more than just archives and standards: it is also infrastructure, basic software tools, advanced applications, evolution of methods and techniques, cross-fertilisation with other communities. Discovering information in wide-field images and mining large archives are key items towards the use of the VO as a tool for developing science.


Pr Carlos PEREIRA (Brazil)

Information in Statistics


We explore the concept of information in statistics: information about unknown quantities of interest, the parameters. We are not concerned with the old and important concept of information of a (prior) distribution.

First, we discuss an intuitive idea of what information in statistics should be. Operationally, information can reside in the observed data or, as expected information, in a future experiment. Clearly, one must have an information measure for each of these entities: data and future experiment. Hence, we present and discuss particular measures based on an intuitive concept. The definition of a trivial experiment, the likelihood principle, the conditionality principle and Blackwell sufficiency are discussed in the context of information. This discussion leads us to present interesting examples comparing experiments (sample points). The objective is to decide which experiment (sample point) is more informative.
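
As one concrete (and purely illustrative) way of comparing experiments, the sketch below contrasts the expected Fisher information about a Bernoulli parameter under two designs, binomial and negative binomial sampling; this is a standard textbook measure, not necessarily the one presented in the talk.

```python
# Minimal sketch: which experiment is expected to be more informative
# about a Bernoulli success probability theta?
# E1: observe a fixed number n of trials (binomial sampling).
# E2: sample until r successes occur (negative binomial sampling).
def info_binomial(theta, n=10):
    # Fisher information: I(theta) = n / (theta * (1 - theta))
    return n / (theta * (1 - theta))

def info_negbinomial(theta, r=3):
    # Fisher information: I(theta) = r / (theta^2 * (1 - theta))
    return r / (theta ** 2 * (1 - theta))

for theta in (0.2, 0.5, 0.8):
    i1, i2 = info_binomial(theta), info_negbinomial(theta)
    better = "E1" if i1 > i2 else "E2"
    print(f"theta={theta}: I1={i1:6.1f}  I2={i2:6.1f}  more informative: {better}")
# Neither design dominates: E2 wins for small theta (many trials are
# needed to reach r successes), E1 wins for moderate and large theta.
```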


Pr Sisir ROY (India)

The role of noise in living systems.


Noise plays a fundamental role in all living organisms, from the earliest prokaryotes to advanced mammalian forms such as humans. In the context of living organisms, the term noise usually refers to the variance among measurements obtained under repeated identical experimental conditions, or in the output signals of these systems, which are universally characterized by background fluctuations. In non-biological systems, in the realm of basic electronics or in the communication sciences, the aim is to send error-free messages, and hence noise is generally regarded as a problem. The discovery of Stochastic Resonance (SR) in non-linear dynamics brought a shift in that perception: noise, rather than representing a problem, is actually fundamental to system function, especially in biology. The question now is to what extent biological function depends on random noise. And a corollary to that question: is noise that depends on intrinsic system properties more meaningful than noise brought in from the environment? From this perspective it is already known, from both theoretical and experimental investigations of biological systems, that the addition of input noise improves the detectability and transduction of signals in non-linear systems. This effect is popularly known as Stochastic Resonance. SR has been found to be an established phenomenon in sensory biology, but it is not presently known unambiguously at what level SR is embedded in the system. Indeed, it is not clear whether SR can occur at the level of single ion channels in cell membranes or whether it is mostly an ensemble property of channel aggregates. Presently both forms of SR seem to be present, as it occurs based on external noise as well as in its absence, as exemplified by neuronal multimodal aperiodic spike firing in the absence of external input.
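
A minimal simulation sketch of stochastic resonance in its textbook form, an overdamped particle in a double well driven by a subthreshold periodic signal plus noise; all parameter values are illustrative assumptions, not a model from the talk.

```python
# Minimal sketch: stochastic resonance in a driven double well,
# dx = (x - x^3 + A sin(2 pi f t)) dt + sqrt(2 D dt) * N(0, 1).
# A = 0.3 is subthreshold (static tipping requires A ~ 0.385), so only
# noise can carry the particle across the barrier.
import numpy as np

def response_power(D, A=0.3, f=0.01, dt=0.05, steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(steps) * dt
    xi = np.sqrt(2 * D * dt) * rng.standard_normal(steps)
    x = np.empty(steps)
    x[0] = -1.0                                   # start in the left well
    for i in range(steps - 1):
        drift = x[i] - x[i] ** 3 + A * np.sin(2 * np.pi * f * t[i])
        x[i + 1] = x[i] + dt * drift + xi[i]
    # Power of the Fourier component of x at the driving frequency.
    return abs(np.mean(x * np.exp(-2j * np.pi * f * t))) ** 2

for D in (0.02, 0.1, 0.3, 1.0):
    print(f"noise D={D:4.2f}  power at drive frequency = {response_power(D):.4f}")
# Weak noise: almost no hopping; intermediate noise: hops synchronize with
# the drive (the resonance); strong noise: the signal is washed out again.
```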

Both experimental findings and computer simulations for central neurons indicate that SR is enhanced when noise and signals are simultaneously presented to neurons. It is also known that spontaneous synchronization occurs if noise is added to coupled arrays of neurons. Indeed, coherence resonance has been observed in hippocampal neuron arrays. In fact, it is evident that Verveen and Derksen (1966) changed the prevalent view of noise in biology from a bothersome phenomenon to an essential component of biological systems. Recent developments in neuroscience further point out that the Central Nervous System (CNS) can utilize noise (which carries no signal information) to enhance the detection of signals through SR, thus emphasizing the fundamental role of noise in brain information processing. Moreover, the intrinsic noise associated with neuronal assemblies is known to produce synchronous oscillations utilizing the ISR or CR mechanism. More fundamentally, unicellular entities such as diatoms show Brownian-like noise even after death, in contrast to the directed transport motion observed during their living state. Indeed, the ion channels of a diatom remain active for a certain period of time following “death”. In its living state, the ion channels of a diatom may produce a kind of synchrony, but they become asynchronous after its death, with the Brownian-like motion present until the functioning of the channels stops. However, a careful analysis of the noise is needed to understand this kind of phenomenology and its physical/biological significance. The analysis of the role of noise in diatoms may shed new light on the distinction between the living and the dead state. The main intention of this talk is to provide an overview of the role of noise in biological systems at various levels of their functioning. This lecture is based on joint work of the speaker with Prof. Rodolfo Llinás, New York University School of Medicine, NY, USA.


Pr Antonio SAGGION (Italy)

Causation and Extensive Quantities


We propose a general account of causation based on the exchange of both conserved and non-conserved Extensive Quantities. We think that the proposed explication has very old roots and, to some extent, might be traced back to Newton's Substances' Dynamics. The formalism of Thermodynamics is most suited to giving a sound formal basis to such a representation of system interactions.

Presentations