# Introduction

Science, put simply, can be understood as working on three levels:

- i.) analyzing the nature of the object being considered/observed,
- ii.) developing the formal representation of the object's features and its dynamics/interactions,
- iii.) devising methods for the empirical validation of the formal representations.

To be precise, level i.) lies more within the realm of philosophy (e.g., epistemology) and metaphysics (i.e., ontology), as notions of origin, existence, and reality appear to transcend the objective and rational capabilities of thought. The main problem is: *"Why is there something rather than nothing? For nothingness is simpler and easier than anything."* [1].

In the history of science, this three-level formulation has made it possible to understand at least three different levels of reality:

- a.) the fundamental level of the natural world,
- b.) inherently random phenomena,
- c.) complex systems.

While level a.) deals mainly with the quantum realm and cosmological structures, levels b.) and c.) consist mostly of biological, social, and economic systems.

# Examples

### a.) Fundamental

Many natural sciences focus on a.i.) fundamental, isolated objects and interactions, use a.ii.) mathematical models, which are a.iii.) verified (or falsified) in experiments that check the predictions of the model - with great success: *"The enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious. There is no rational explanation for it."* [2].

### b.) Random

Often the nature of the object b.i.) being analyzed is in principle unknown. Only statistical evaluations of sets of outcomes of single observations/experiments can be used to estimate b.ii.) the underlying model and b.iii.) test it against more empirical data. This is often the approach taken in the social sciences, medicine, and business.
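The three steps of level b.) can be sketched in a minimal, purely illustrative simulation: a process with a hidden bias plays the role of the unknown object, its frequency of outcomes estimates the model, and a standard z-test checks the estimate against the null hypothesis of a fair process. All names and parameter values here are assumptions chosen for the sketch, not part of the text above.

```python
import math
import random

random.seed(42)

# b.i.) the object: a process with a bias p that is hidden from the "observer"
TRUE_P = 0.6  # hypothetical value, only used to generate the data
outcomes = [1 if random.random() < TRUE_P else 0 for _ in range(10_000)]

# b.ii.) estimate the underlying model from the statistics of the outcomes
n = len(outcomes)
p_hat = sum(outcomes) / n

# b.iii.) test the estimate against the null hypothesis of a fair process (p = 0.5)
se = math.sqrt(0.25 / n)   # standard error of the frequency under the null
z = (p_hat - 0.5) / se     # z-score of the observed frequency
print(f"p_hat = {p_hat:.3f}, z = {z:.1f}")  # |z| >> 2 rejects the fair-process model
```

Note that the observer never sees `TRUE_P`; everything is inferred from the set of outcomes, which is exactly the epistemic situation of level b.).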

### c.) Complex

Moving to c.i.) complex, dynamical systems, and c.ii.) employing computer simulations as a template for the dynamical process, unlocks a new level of reality: namely, the complex and interacting world we experience at our macroscopic length scales in the universe. Here two new paradigms emerge:

- the shift from mathematical (analytical) models to algorithmic computations and simulations performed in computers,
- simple rules giving rise to complex behavior:
*"And I realized that I had seen a sign of a quite remarkable and unexpected phenomenon: that even from very simple programs behavior of great complexity could emerge."* [3].
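The canonical illustration of this second paradigm is an elementary cellular automaton such as rule 30, the example Wolfram himself studied: the entire "law of nature" fits in one byte (the eight-entry rule table), yet the evolution from a single seed cell is aperiodic and statistically random-looking. A minimal sketch (grid width, step count, and wraparound boundary are arbitrary choices for the demonstration):

```python
# Elementary cellular automaton, rule 30: eight local rules (one byte)
# produce a complex, seemingly random global pattern from a single seed.
RULE = 30
WIDTH, STEPS = 64, 16

row = [0] * WIDTH
row[WIDTH // 2] = 1  # single seed cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # each new cell depends only on its three-cell neighborhood;
    # the neighborhood (0..7) indexes a bit of RULE
    row = [
        (RULE >> (row[(i - 1) % WIDTH] << 2 | row[i] << 1 | row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```

The point of the sketch is exactly the paradigm shift named above: there is no closed-form analytical model of the pattern; the algorithmic computation *is* the model.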

However, things are not as clear anymore. What is the exact methodology, how does it relate to the underlying concepts of ontology and epistemology, and what is the nature of these computations *per se*? Or, within the formulation given above, i.e., c.iii.), what is the "reality" of these models: what do the local rules determining the dynamics in the simulation have to say about the reality of the system c.i.) they are trying to emulate?

# Outlook

There are many coincidences that enabled the structured reality we experience on this planet to evolve: the exact values of fundamental constants (initial conditions), emerging structure-forming and self-organizing processes, the capacity of (organic) matter to store information (after being synthesized in supernovae!), the right conditions on Earth for harboring life, the emergent possibility of neural networks to establish consciousness and sentience above a certain threshold, ...

Interestingly, there are also many circumstances that allow the observable world to be understood by the human mind:

- the mystery allowing formal thought systems to map to patterns in the real world,
- the development of the technology allowing for the design and realization of microprocessors,
- the bottom-up approach to complexity identifying a micro level of simple interactions of system elements.

So it appears that the human mind is intimately interwoven with the fabric of reality that produced it.

But where is all this leading? There exists a natural extension to science which fuses the notions from levels a.) to c.), namely

- information and information processing,
- formal mathematical models,
- statistics and randomness.

Notably, it comes from an engineering point of view, deals with quantum computers, and comes full circle back to level i.), the question about the nature of reality: *"[It can be shown] that quantum computers can simulate any system that obeys the known laws of physics in a straightforward and efficient way. In fact, the universe is indistinguishable from a quantum computer."* [4].

At first blush the idea of substituting a computed simulation for reality appears rather *ad hoc*, but in fact it does have potentially falsifiable notions:

- the discreteness of reality, i.e., the notion that continuity and infinity are not physical,
- the reality of the quantum realm should be contemplated from the point of view of information, i.e., the only relevant reality subatomic quanta manifest is that they register one bit of information:
*"Information is physical."* [5].
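Landauer's dictum is not merely a slogan: it has a quantitative consequence, the Landauer limit, which states that erasing one bit of information dissipates at least k_B T ln 2 of energy. A minimal back-of-the-envelope sketch (the choice of room temperature is illustrative):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2.
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # room temperature in kelvin (illustrative choice)

e_bit = K_B * T * math.log(2)  # minimum energy per erased bit, in joules
print(f"Landauer limit at {T:.0f} K: {e_bit:.3e} J per bit")
```

At room temperature this comes to roughly 3e-21 J per bit, which is how "information is physical" becomes an empirically testable statement rather than a metaphor.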

# References and Further Reading

[1] Leibniz, G. W., "Principes de la nature et de la grâce", 1714

[2] Wigner, E. P., "Symmetries and Reflections", MIT Press, Cambridge, 1967

[3] Wolfram, S., "A New Kind of Science", Wolfram Media, pg. 19, 2002

[4] Lloyd, S., "Programming the Universe", Random House, pgs. 53 - 54, 2006

[5] Landauer, R., Nature, 335, 779-784, 1988

See also: "The Mathematical Universe" by M. Tegmark.
