Research note: „Bücher der neuen Anthropologie und Biologie“, edited by Hans André




Pictures 1 and 2: Leaflet found inside the book whose title page is shown in picture 3, bought from an online used-book shop. The book belongs to the series “Bücher der neuen Anthropologie und Biologie” (“Books of the New Anthropology and Biology”), edited by the biologist and philosopher Hans André from 1925 to 1934. The leaflet advertises the first seven books of this series; the book whose title page is shown is the eighth. Through online library catalog research, I could find three more titles in this series:

Volume 9: Karl Beurlen u. Hans André: “Das Gesetz der Überwindbarkeit des Todes in der Biologie” (1933)

Volume 10: Armin Müller: “Ganzheitsbiologie und Ethik” (1933)

Volume 11: Hedwig Conrad-Martius: “Die ‘Seele’ der Pflanze” (1934)

A few notes on the authors (to be extended):

André, Hans, Dr.

Biologist, botanist, philosopher. * 24 March 1891 in Kaiserslautern; † 26 July 1966 in Bonn.

Letters by Hans André in archives catalogued on Kalliope:

Record in the database of the Deutsche Nationalbibliothek about him:

Literature by Hans André (including books edited by him and cooperations) (catalogue of Deutsche Nationalbibliothek):

Secondary literature:

[Andrés Philosophie des Lebens]
La philosophie de la vie de Hans André
Siewerth, Gustav. – Paris : Desclée de Brouwer, [2016]

He is also mentioned in: Joachim Fischer, “Philosophische Anthropologie. Eine Denkrichtung des 20. Jahrhunderts” (excerpts available on Google Books).

According to Fischer, he seems to have exchanged letters with Adolf Portmann. André seems to have been part of the circle around Scheler in Cologne. He was influenced by Helmuth Plessner, who, according to Georg Toepfer’s “Historisches Wörterbuch der Biologie” (article “Vitalismus”), can also be viewed as belonging to the complex of vitalism. (Toepfer does not mention André.)

André seems to come out of an Aristotelian/Neo-Thomist Catholic direction.

Stölzle, Remigius, Prof. Dr.

* 23 November 1856 in Ob, Baden; † 23 July 1921 in Würzburg. Philosopher; professor in Würzburg.

Also seems to come out of an Aristotelian/Neo-Thomist Catholic direction.

Kranichfeld, Hermann

More research required. The second publication listed by the Deutsche Nationalbibliothek might be by a different author. He seems to have been a biologist.

F. J. J. Buytendijk

1887–1974. Professor of philosophy in Groningen (according to the leaflet). According to Fischer, André translated volume 6 into German (so probably the other book, volume 5, was also translated by André).

Wasmann, Erich

Biologist and Catholic theologian. 1859–1931 (or 1932).

Albert Wigand

Biographical data not certain, more research required. According to the leaflet he was a botanist. The entry in the Deutsche Nationalbibliothek seems to confuse him with an artist of the same name.

Armin Müller

1888 – 1967.

Belongs to the vicinity of the Spannkreis, the circle around Othmar Spann.

von Brandenstein, Béla, Dr.

1901–1989. German-Hungarian philosopher. He was a professor in Hungary, fled Hungary in 1944, and became professor of philosophy in Saarbrücken, Germany.

His ideas are vitalistic, but he does not share the Aristotelian/neo-Thomist approach of André. In the preface to the book (volume 8 of the series), André somewhat distances himself from von Brandenstein, whose views he did not share but found worth publishing in his series.

Karl Beurlen

Paleontologist. 1901 – 1985

Proponent of orthogenesis.

Conrad-Martius, Hedwig

1888 – 1966. Philosopher, professor in Munich.

The “Philosophisches Wörterbuch“ (Schischkoff, Georgi (ed.), 14th edition, Stuttgart, 1957) writes about her (rough translation by myself): she “tries, starting from phenomenology, to build an ontology of reality (“Realontologie”) of nature, using Aristotelian-scholastic ontology, especially the concept pairs “potency” and “act” or “matter” and “form”. For the problem of the special laws of the living (“Selbstgesetzlichkeit des Lebens”) she proposes a ‘species logos’, an entelechy of essence which individuates itself in the organic substance and operates as (immanent) entelechy of formation (“Bildungsentelechie”).”

Georg Wunderle

Catholic theologian, see

A few thoughts on the Aristotelian/neo-Thomist thinkers in this group (at least André, Stölzle, Wasmann, Conrad-Martius). Based on the scholastic tradition of thought, they develop ideas that are essentially vitalistic. They are anti-mechanistic and anti-Darwinist. However, although coming from a Christian, specifically Catholic point of view, their ideas seem to be vastly different from the American evangelical brand of Bible-literalist creationism. This evangelical element is largely absent from Germany. Since both the Catholic and the Lutheran churches were traditionally coupled to the state in Germany, other groups (from the evangelical spectrum) left Germany, mostly for the United States. So while the anti-Darwinists in the USA are mainly of the creationist brand, this type of creationism did not and does not play any major role in Germany. Lutheran theology largely developed into more secular directions (Bultmann etc.). Catholic (and especially Jesuit) theology, on the other hand, was able to tap into the tradition of Scholasticism, with its relatively rich (compared to the evangelical movements) and varied body of philosophical thought that had incorporated both Aristotelian and, earlier, (Neo-)Platonic thought. The description of Stölzle’s first book in the series says, for example: “In the last chapter, Stölzle develops […] the idea of ‘indirect, potential or secondary creation’, in accordance with the scholastic view of the eductio formarum e potentia materiae [the generation of forms from the potential of matter], according to which the principles of life of plants and animals are not created from nothing but are generated from the disposition (the ‘potential’ of matter).” So the existence of a richer philosophical tradition inside the Catholic church led to the development of ideas that can be viewed as part of the vitalistic spectrum, while American-style creationism did not play a role.

Even outside this Catholic context, the absence of evangelical Christians meant that anti-mechanistic or anti-Darwinist thought (e.g. von Brandenstein) appeared in the form of different brands of vitalism, not as creationism. This vitalistic direction in biology, which seems to have been dominant in Germany from the late 1800s up to the middle of the 20th century (at least up to the early 1930s), has been neglected in American or Anglo-Saxon works on the history of science, where generally only the Darwinism vs. creationism debates are treated.


Are we living inside a computer simulation? – Part 3


Some people, including some philosophers (see for an example), have recently put forward the idea that we might inhabit a giant computer simulation. For example, one idea is that some civilizations might develop to a state where they start building such simulations in order to simulate possible histories of themselves.

This is the third and last part of a series on this topic. The previous parts can be found here and here.

The “perfect crime” problem

The perfect crime is one that goes unrecognized. The criminal must cover all traces in such a way that one cannot even see that a crime has happened. All the data must be consistent.

The super civilization simulating us must do something similar. The fact that we are living inside a simulation must be invisible to us. It is obviously impossible to set up a detailed simulation of the whole universe. The information content of the universe is larger than anything that can become accessible even to the most advanced civilization possible. So the simulation must have a spatial border, or its granularity must decrease toward the outside. Moreover, it must start at a certain point in history (let’s say, 10,000 years ago, or last week Thursday). Here, the initial state of the simulation must be set up in a consistent way. It would have to start with faked data, but this data must not contain any contradictions that we, the inhabitants of the simulation, would be able to spot.

Now, assuming we are not living in a simulation, the information stored in our environment at any time cannot simply be derived from the laws of physics. It is the result of the history of the universe and our galaxy, the history of the solar system, the geological history of Earth, the history of life on Earth and the history of humans, as well as our individual biographies. It appears unlikely that the simulating civilization can set up this information at the start of the simulation in such a way that all the correlations contained in it are present and that it does not contain any contradictions. In fact, the only way to calculate this starting state might be to start the simulation earlier, e.g. a million years ago, but that only shifts the problem; you might have to go back to the primordial Earth. But running the simulation takes time, and it is unlikely that you can become much faster than reality if your resolution is fine enough to simulate the emergence of life and civilization and the rich world we are perceiving. The simulation might then have to run for millions or even billions of years. This appears implausible.

If, however, the super civilization has a way to calculate this starting state of the simulation without running the simulation (and I don’t think there is a way to do so, but let’s just assume there is one – they are a super-civilization, aren’t they?), then they don’t even need any simulation. They can calculate how the world looked last week Thursday and how it looks today or at any time they might find interesting. In that case, the simulation becomes dispensable.

Another approach would be the “Men in Black” method. You allow inconsistencies in the world, but whenever somebody spots one, you tamper with their memory (or with history). But such a “simulation”, in which you would have to introduce corrections all the time, would have no value for research: what happens is what you put in, so what is the point of simulating?

So since scientists are not observing inconsistencies in the past or at some distance from Earth, it appears highly implausible that we are inside a simulation. (Or have the men in black just been modifying our memories and records?)


Are we living inside a computer simulation? – Part 2

File:NS binary merger simulation 330.tif

Some people, including some philosophers (see for an example), have recently put forward the idea that we might inhabit a giant computer simulation. For example, one idea is that some civilizations might develop to a state where they start building such simulations in order to simulate possible histories of themselves.

This is part 2 of a series on this topic. Part 1 is here.

The resolution/resource problem

When scientists simulate a physical system, e.g. a star, the simulation has a limited temporal and spatial granularity. For example, you would divide the star into cubes of a certain size and describe each such voxel by a number of parameters, like pressure, temperature, chemical composition, magnetic field properties etc. You also divide time into discrete steps. Obviously, such a simulation can only be an approximation of reality. You may then squeeze a development that in reality takes millions of years into just a couple of days of computing time. However, there is no guarantee that the results match what is going on in reality. To increase the accuracy of the simulation, you may increase the resolution, making the cells smaller and the time steps shorter. Obviously, this increases the amount of information to be stored and the amount of calculation to be done.
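To make this concrete, here is a small back-of-the-envelope sketch in Python. All the numbers (box size, voxel sizes, parameters per voxel) are hypothetical, chosen only for illustration: halving the edge length of the voxels in a 3D grid multiplies the storage for a single snapshot by eight, before we even account for the shorter time steps a finer grid requires.

```python
# Toy illustration of the resolution/resource trade-off in a grid simulation:
# halving the cell size of a cubic 3D grid multiplies the number of voxels,
# and hence the memory per snapshot, by 8.

def resource_estimate(box_size_m, cell_size_m, params_per_cell=5, bytes_per_param=8):
    """Memory (in bytes) needed to store one snapshot of a cubic volume."""
    cells_per_axis = int(box_size_m / cell_size_m)
    n_cells = cells_per_axis ** 3
    return n_cells * params_per_cell * bytes_per_param

box = 1.0e9  # a hypothetical stellar-scale box, 1 million km across
for cell in (1.0e6, 5.0e5, 2.5e5):  # progressively finer voxels
    mem = resource_estimate(box, cell)
    print(f"cell {cell:9.0f} m -> {mem / 1e9:10.1f} GB per snapshot")
```

The finer time stepping that a finer grid needs makes the total computational cost grow even faster than this storage estimate.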

Now the problem is that storing information and performing calculations requires physical resources. There are limits to how much information can be squeezed into a certain amount of energy and space. The amount of information a system can store depends on two things: how much matter/energy is contained in the system and how large the system is. When you leave the energy content constant but increase the system’s radius and hence its volume, there are more possible configurations available (or, looking at it in terms of wave mechanics, longer additional wavelengths fit into the system), so it can store more information. If you add matter or energy, there are more possible objects (particles) inside whose properties or configurations can be employed to store information (or, in the wave mechanics view, you can add additional high frequencies, increasing the bandwidth again). So keeping the volume constant, you may squeeze in more information by adding energy, but there is a limit, because too much energy in too small a volume leads to gravitational collapse. Of course, the real limits will be reached long before this point, no matter what kind of advanced technology a super civilization might have.

You can put in more information by making the system larger and less dense, but this would reduce processing speed (because the speed of light is the limit for information exchange).

So whatever technology might be possible, you would need some physical resources to store all the information. Moreover, performing computations takes physical resources as well. You can miniaturize the computers to some extent, but there are limits here because of the granularity of matter.
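The kind of limit sketched above can be made quantitative. A standard result in this direction (my addition, not part of the argument as originally stated) is the Bekenstein bound, which caps the number of bits a sphere of radius R containing total energy E can hold, regardless of how the storage device is built:

```python
import math

# Bekenstein bound: I <= 2*pi*E*R / (hbar * c * ln 2) bits for a sphere of
# radius R (meters) containing total energy E (joules).

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(mass_kg, radius_m):
    energy = mass_kg * C**2  # rest energy E = m*c^2
    return 2 * math.pi * energy * radius_m / (HBAR * C * math.log(2))

# A 1 kg device of 1 m radius can hold at most ~2.6e43 bits, however it is built.
print(f"{bekenstein_bits(1.0, 1.0):.3e} bits")
```

The bound is astronomically generous compared to real hardware, but it is finite, which is all the argument above needs.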

Now, as we increase the spatial and temporal resolution of our simulation, we will reach a point where the amount of resources needed for the simulation exceeds the amount of resources required to build the actual object. For example, there is a simulation of a simple bacterium. This simulation, which models the chemical processes inside the cell (in such a way that the simulated cell can actually divide), requires a big computer. The mass and energy consumption of this computer are far larger than the mass and energy consumption of the simulated bacterium. And it should be noted that the description of the bacterial cell is not physically complete in any way: the researchers had to enter the rates of the different chemical reactions going on in the cell by hand. The model cannot predict what happens when you introduce a certain mutation. That would require modelling the molecules on an atomic level to predict their properties. The amount of data and computation needed to do so would be far greater still.

For any physical system, there should be a degree of spatial and temporal resolution where the size (in terms of matter, energy and/or space) of the simulator becomes larger than the simulated system. There must be such a point because a simulator running an accurate, physically complete simulation of a system would have to contain more information than the simulated system. The physical system develops according to a set of laws (think of a system of equations). In a physical system, these laws do not contribute to the information content of the system. In a simulation, they have to be represented as information. Moreover, the mathematical knowledge for doing the calculations (the simulation software) must also be represented as information (even if it is hardwired into some hardware). Since information needs physical resources to be represented, a completely accurate simulation of a physical system would need more resources than the system itself. If the system contains a lot of redundancy, i.e. repetitive or ordered structures, the simulator might take advantage of information compression to reduce the storage requirements, but this would make the computations more difficult and time-consuming.

The computing machinery used to simulate stars (in order to study different theories of how stars explode) is much smaller than an actual star, but to create the detail of the world we are observing, a simulation would have to employ physical resources that are probably much larger than the actual systems simulated. So instead of setting up giant computing machinery, it would be much cheaper for a super civilization just to set up a real planet and let it develop. This would also remove the problem of hardware reliability you would otherwise be facing (and automatically give you a consistent starting state; more on that topic in the next part of this series). It is therefore highly unlikely that we are living in a simulation. The assumption, made in the article referenced above, that most instances of consciousness would be simulated ones is very likely wrong, because the amount of resources needed to simulate a human being and its environment is very likely much higher than the amount of resources taken up by a real, physical human being. (See a related satirical article about the “Frnx-Theorem” and the “Frnx-Limit” here.)

(The picture, showing a frame of a simulation of the merger of two neutron stars, is from the file named above.)

Are we living inside a computer simulation? – Part 1

File:New NASA 3D Animation Shows Seven Days of Simulated Earth Weather (14906862243).jpg

Some people, including some philosophers (see for an example), have recently put forward the idea that we might inhabit a giant computer simulation. For example, one idea is that some civilizations might develop to a state where they start building such simulations in order to simulate possible histories of themselves. This is part 1 of a short series of (not so short) articles on this topic.

The problem of computational complexity and of computability

This new twist of metaphysics is based on the assumption that such a simulation is possible. If it is not possible for us, it might be possible for a “super-civilization”. This argument is used to sweep the question of feasibility under the carpet: the super-civilization is super, so it has all kinds of super-technology, and as a result there is nothing it cannot do. But the problem is: there may be limits to what technologies are possible at all.

In a branch of theoretical computer science called complexity theory, computer scientists have studied the question of how many resources are needed to solve certain problems. It turns out that there are some tasks for which any possible algorithm quickly exceeds any bound on resources as the problem instances get larger. Except for the most trivial cases, the resources required soon exceed the resources available in the whole universe, i.e. such problems are computationally intractable, no matter what technology you have.
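As an illustration of this kind of growth, here is a minimal brute-force search for the subset-sum problem, a classic NP-complete task (my own toy sketch, not taken from the text): in the worst case it must inspect all 2^n subsets of an n-element set, and at n = 300 that number already exceeds the roughly 10^80 atoms in the observable universe.

```python
from itertools import combinations

# Brute-force subset sum: find a subset of `numbers` summing to `target`.
# Worst case: all 2^n subsets must be inspected.

def subset_sum_bruteforce(numbers, target):
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum_bruteforce([3, 9, 8, 4, 5, 7], 15))  # small instances are easy
print(f"subsets to inspect for n=300: 2**300 ~ {2**300:.2e}")
```

Better algorithms than this naive search exist, but (unless P = NP) all of them still blow up exponentially on hard instances.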

What does computational complexity theory have to do with computer simulation metaphysics? The answer is that some physical processes may have a very high computational complexity. For example, it looks like the folding of proteins, a process happening all the time, in myriad instances, in the cells of our bodies, is such a problem. New protein molecules are produced as long chains of amino acids. A newly generated protein molecule will then quickly fold into its final configuration, which is, in most cases, a state of lowest energy. It has been argued that calculating this folded structure is practically impossible; see Aviezri S. Fraenkel: “Complexity of protein folding”, Bulletin of Mathematical Biology 55 (1993), p. 1199, doi:10.1007/BF02460704, whose abstract reads:

“It is believed that the native folded three-dimensional conformation of a protein is its lowest free energy state, or one of its lowest. It is shown here that both a two- and three-dimensional mathematical model describing the folding process as a free energy minimization problems is NP-hard. This means that the problem belongs to a large set of computational problems, assumed to be very hard (“conditionally intractable”). Some of the possible ramifications of this results are speculated upon.”

(The term “NP-hard” is a technical term from complexity theory.) The folding of the protein in reality takes only the resources contained in the molecule (and the surrounding molecules and ions, water etc.): a very small amount of matter and a very tiny amount of energy. Yet the mathematical analysis shows that a system capable of calculating (simulating) this process in every detail would take computational resources larger than the amount of matter and energy in the observable universe. This means that there are processes in nature which happen according to laws that cannot practically be calculated by any kind of computer that can practically be built, by any civilization whatsoever. This implies that the computer simulation metaphysics must be doubted. Physics does not work by some underlying system simulating or emulating the physical processes in terms of a calculation; the physical process is not the process of something performing a calculation. The laws of a system might be describable in terms of a system of equations, and the system then evolves in a way described by these equations, but that does not require that the set of equations can be solved by any efficient algorithm.
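Fraenkel’s proof concerns simplified lattice models of folding. A toy sketch (my own illustration, not his construction) already shows why exhaustive search is hopeless: the number of possible conformations of a chain of n monomers, modelled as self-avoiding walks on a 2D lattice, grows exponentially with n.

```python
# Count the conformations of an n-monomer chain modelled as self-avoiding
# walks on a 2D square lattice, starting at the origin.  The count grows
# exponentially with n, which is why exhaustively searching for the
# minimum-energy fold quickly becomes hopeless.

def count_conformations(n, path=((0, 0),)):
    if len(path) == n:
        return 1
    x, y = path[-1]
    total = 0
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if (nx, ny) not in path:  # self-avoiding: no site visited twice
            total += count_conformations(n, path + ((nx, ny),))
    return total

for n in (5, 10, 13):
    print(n, count_conformations(n))
```

Real proteins have hundreds of amino acids and fold in three dimensions, where the blow-up is even steeper; nature nevertheless “finds” the folded state in milliseconds.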

Many scientists seem tacitly to assume that physical processes can be thought of as calculations. But what if this idea is wrong? If a system can behave according to a set of equations that is not practically computable (because of its computational complexity), i.e. if the physical process itself is not a process of computation, then there is no reason to assume that all physical laws have to be computable at all. Nature might contain systems that behave according to systems of equations for which no algorithms exist to solve them. This would not just mean that we do not know such algorithms yet, but that they would simply not be possible. The physical system would act according to those laws, but we could not calculate the process (except in special cases). In other words, the mathematical description of such a system would involve functions that are not Turing-computable. The physical theory describing such a system would be computationally incomplete.

There is no a priori reason why physics should be restricted to the computable. And it looks like physicists and engineers are quite familiar with systems of this kind: there are systems where they don’t know how to solve the equations except in some special cases. Scientists and engineers use methods of approximation in such cases (“numerical methods”), i.e. methods that can give an approximate result. However, if the equations are not linear, small errors might be amplified and the results of approximate methods might be totally wrong (as is often the case, for example, in weather predictions).
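A minimal illustration of this error amplification (my own example) is the logistic map, a standard textbook nonlinear system: two trajectories that start a mere 10^-10 apart become completely uncorrelated within a few dozen iterations, which is the same effect that limits the range of weather forecasts.

```python
# Error amplification in a nonlinear system: iterate the logistic map
# x -> r*x*(1-x) from two almost identical starting points and watch the
# tiny initial difference grow until the trajectories are unrelated.

def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)
for step in (0, 20, 40, 60):
    print(f"step {step:2d}: difference {abs(a[step] - b[step]):.3e}")
```

In the chaotic regime (r = 4) the difference roughly doubles per step, so after about 35 steps the initial 10^-10 error has reached the size of the values themselves.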

So we should not think that when a physical process is happening, something is doing a calculation. Physical systems evolve according to the laws describing them, but they do not perform calculations. Such calculations would be constrained by the mathematical constraints of complexity theory and of computability theory, but such constraints do not seem to exist in nature.

If we were living inside a computer simulation, we should not be able to find processes in nature that have a high computational complexity, and we should not be able to find processes in nature that involve non-computable functions. These questions can be investigated empirically, by looking for such processes. Before starting to speculate about whether we live inside a simulation, we should first investigate the computability of the world we inhabit. If we find processes that are not computable in principle or in practice (or, to use a terminology I have introduced elsewhere, if our world is a proteon), we would thereby prove that our world is not a simulated one. These restrictions are of a mathematical nature and apply to any possible technology (and any possible physics in any possible universe).

(The picture, showing a frame of a weather simulation of Earth, is from the file named above.)

Research Note: The “Philosophisches Wörterbuch”

I am currently evaluating several editions of the “Philosophisches Wörterbuch” from Kröner Verlag. The current edition is, I think, the 23rd. The edition I am mainly using here is the 14th edition of 1957, edited by Georgi Schischkoff. It contains a lot of hints on the philosophical schools and currents I am interested in here.

Most of this material has been removed in later editions, so the current edition, while containing lots of material on current developments in philosophy, does not show this chapter of the history of philosophy I am currently researching. What remains (I think there is still an article on Oswald Spengler in the current edition) appears as isolated fragments whose context is missing.

I am planning to write a series of short research notes based on articles in that dictionary. I am going to add material from other sources as well. In doing so, I am going to use this blog as a kind of public research notebook. At the moment, I don’t expect much interest in these articles but I prefer to make my results public, however preliminary they might be. As far as my time allows, the materials collected in these research notes might then form the basis for some extended articles or essays.

It is interesting to compare subsequent editions of the dictionary. The dictionary was established by Heinrich Schmidt, the director of the Ernst-Haeckel-Archiv in Jena. He edited the dictionary from the first to the 9th edition. The earliest edition I currently have is the 8th, from 1931; this is the last edition Schmidt did alone. A lot of the material I am interested in here is not yet contained in it. The 9th edition (1934) was done with the help of one Dr. Friedrich Blaschke. In the preface, Schmidt seems to distance himself slightly from Blaschke: he writes that Blaschke was helpful “with the articles about the newest philosophy and the history of ideas […] A certain contrariness of our views forced me again and again to deepen and broaden the epistemic foundations and to present things more concisely.” It is possible that Blaschke was forced upon Schmidt by the Nazi authorities. Influences of Nazi ideology start to appear in the 9th edition. Schmidt died in 1935.

The 10th edition was prepared, in 1943, by Werner Schingnitz and Joachim Schondorff. A lot of material was removed, a lot was added and a lot was totally rewritten. This 10th edition is completely steeped in Nazi ideology.

I do not currently have the 11th to 13th editions, edited by Justus Streller. I am going to try to obtain these editions, as well as editions earlier than the 8th.

The 14th edition I have seems to be based on the editions prepared by Streller, which in turn are based on the 9th edition. Most of the ideological material has been removed, although some things have obviously slipped through unintentionally. For example, there is an article “Degeneration” that just points to an article “Entartung” which, however, no longer exists in the 14th edition. In the 9th edition that article is still present, with racist content; it also exists in the 8th edition. (It must be noted here that Schmidt was a follower of Haeckel, who was not only a Darwinist but also a racist.) There are some articles interesting for my current research that were not yet there in the 9th edition, so this edition is a particularly rich source. However, I do not know if this material was added by Schischkoff or by Streller.

It would be interesting to find out when this material (like, for example, the articles about Breysig or Frobenius) was removed from the dictionary. There seems to have been a paradigm shift, perhaps at the end of the 1960s or in the early 1970s; I have not yet found out when these articles were removed. Using a current edition would give a rather impoverished or depleted view of the history of 20th-century philosophy. One could also say: a cleaned view. But I think removing this material from such a dictionary is a mistake. While it might be irrelevant for current philosophy, it should be there as part of the history of philosophy, especially since a lot of these thoughts, although they have been removed from the academic environments of philosophy, history, the social sciences and cultural anthropology, are still out there in the public and are strongly resurfacing in recent right-wing movements.

I think this is the current edition:

Research Note: Kurt Breysig

According to (Schischkoff 1957), Kurt Breysig is a representative of culture morphology.

The article on Breysig in that dictionary reads (translation by me):

Breysig, Kurt: Historian, sociologist and philosopher of history [Geschichtsphilosoph]. * July 5th, 1866, Posen; † June 16th, 1940, Rehbrücke near Potsdam. Professor in Berlin from 1896 to 1934; created a theory of history that, starting from the established facts [vom Boden der erforschten Tatsachen], seeks to rise to ever higher and more comprehensive overall observations, views the history of culture, in the manner of culture morphology, as a history of the soul, and views the development of humanity as a development from natural processes [Naturgeschehen] to processes of spirit [Geistesgeschehen]. Main works: Die Geschichte der Seele [The history of the soul], 1931. Naturgeschichte und Menschheitsgeschichte [Natural history and history of humanity], 1933. Der Werdegang der Menschheit vom Naturgeschehen zum Geistesgeschehen [The development of humanity from natural processes to processes of spirit], 1934.

[Secondary literature:] E. Hering: Das Werden als Geschichte. Kurt Breysig in seinem Werk, 1939.


The (German) Wikipedia article is in a mediocre condition and its content has to be taken with reservations, but it looks interesting.


(Schischkoff 1957): Philosophisches Wörterbuch. 14th edition, ed. by Georgi Schischkoff, est. by Heinrich Schmidt. Alfred Kröner Verlag, Stuttgart 1957.

(The 8th edition of that dictionary (1931) did not yet contain an article on Breysig. It appears for the first time in the 9th edition (1934), again edited by Heinrich Schmidt, with the assistance of one Dr. Friedrich Blaschke. This first version of the article might have been written by Blaschke. In the preface, Schmidt seems to distance himself slightly from Blaschke: he writes that Blaschke was helpful “with the articles about the newest philosophy and the history of ideas […] A certain contrariness of our views forced me again and again to deepen and broaden the epistemic foundations and to present things more concisely.” In this ninth edition, the first signs of Nazi ideology enter the dictionary, in the form of articles missing from the previous version; e.g. there is an article “völkisch”, missing from the 8th edition. Schischkoff, or his predecessor as editor Justus Streller, who edited the 11th to 13th editions, then obviously shortened the article on Breysig and added the hint at Kulturmorphologie.)

Research Note: Karl Friedrich Vollgraff

Karl (or Carl) Friedrich Vollgraff (1794 – 1863) seems to have had ideas that are similar to those of Kulturmorphologie. It remains to be seen if there is a connection or just a parallel.


The following title especially seems interesting in this respect:

  • Erster Versuch einer Begründung der allgemeinen Ethnologie durch die Anthropologie und der Staats- und Rechtsphilosophie durch die Ethnologie oder Nationalität der Völker, 4 Bde. (1851-1855)

The current (German) Wikipedia article about him is in a very unsatisfactory condition (e.g. violating the Wikipedia principle of a neutral point of view), but hints at interesting content.